ZooKeeper and Kafka

 

Preparation before installing Kafka:

1. Prepare three nodes (adjust the count to your own needs)

2. Install ZooKeeper on all three nodes (or use the ZooKeeper bundled with Kafka)

3. Disable the firewall:

chkconfig iptables off

1. Download the package

Download a release from the Kafka website: http://kafka.apache.org/downloads.html

Find "Binary downloads" (pre-built binaries), download, and extract.

Mac:

$ brew cask install homebrew/cask-versions/adoptopenjdk8

$ brew install kafka

(Note: on newer Homebrew versions, "brew cask install" has been replaced by "brew install --cask".)

Install locations:
  /usr/local/Cellar/zookeeper
  /usr/local/Cellar/kafka

Config file locations:
  /usr/local/etc/kafka/zookeeper.properties
  /usr/local/etc/kafka/server.properties

Note: all subsequent commands assume you are in the /usr/local/Cellar/kafka/xxxx/bin directory.

 

Start ZooKeeper:

zookeeper-server-start /usr/local/etc/kafka/zookeeper.properties 
 

Start the Kafka server:

kafka-server-start /usr/local/etc/kafka/server.properties
 

Or, using Homebrew services:

$ brew services start zookeeper

$ brew services start kafka

 

Create a topic:

kafka-topics --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test1

List the topics that have been created:

kafka-topics --list --zookeeper localhost:2181  

Produce messages:

kafka-console-producer --broker-list localhost:9092 --topic test1

Consume messages:

kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic test1 --from-beginning

Note: --from-beginning makes the consumer start from the earliest message rather than only receiving new ones.
 

 

Using Kafka from Go


Kafka is a high-throughput distributed publish/subscribe messaging system. It can handle all the activity-stream data of a large consumer-facing website, and it offers high performance, persistence, multi-replica redundancy, and horizontal scalability. This section shows how to send and receive Kafka messages from Go.

sarama

Go programs connect to Kafka via the third-party library github.com/Shopify/sarama.

Download and install:

$ go get github.com/Shopify/sarama

Caveats

sarama versions after v1.20 added the zstd compression algorithm, which depends on cgo; building on Windows will then fail with an error like:

# github.com/DataDog/zstd
exec: "gcc": executable file not found in %PATH%

So on Windows, use sarama v1.19.
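One way to stay on v1.19 is to pin the dependency in go.mod (a sketch assuming Go modules are in use; the module path example.com/kafkademo is hypothetical):

```
module example.com/kafkademo

go 1.16

require github.com/Shopify/sarama v1.19.0
```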

Connect to Kafka and send a message

package main

import (
	"fmt"

	"github.com/Shopify/sarama"
)

// A Kafka producer client built on the sarama library

func main() {
	config := sarama.NewConfig()
	config.Producer.RequiredAcks = sarama.WaitForAll          // wait for the leader and all ISR followers to acknowledge
	config.Producer.Partitioner = sarama.NewRandomPartitioner // pick a random partition for each message
	config.Producer.Return.Successes = true                   // successfully delivered messages are returned on the Successes channel

	// Build a message
	msg := &sarama.ProducerMessage{}
	msg.Topic = "web_log"
	msg.Value = sarama.StringEncoder("this is a test log")
	// Connect to Kafka
	client, err := sarama.NewSyncProducer([]string{"192.168.1.7:9092"}, config)
	if err != nil {
		fmt.Println("producer closed, err:", err)
		return
	}
	defer client.Close()
	// Send the message
	pid, offset, err := client.SendMessage(msg)
	if err != nil {
		fmt.Println("send msg failed, err:", err)
		return
	}
	fmt.Printf("pid:%v offset:%v\n", pid, offset)
}
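The partitioner configured above decides which partition each message lands in. The following stdlib-only sketch illustrates the idea (a simplified illustration, not sarama's actual implementation): a random partitioner spreads messages evenly, while a hash partitioner (sarama.NewHashPartitioner) keeps all messages with the same key on the same partition.

```go
package main

import (
	"fmt"
	"hash/fnv"
	"math/rand"
)

// pickRandom illustrates the idea behind sarama.NewRandomPartitioner:
// any partition may receive the message.
func pickRandom(numPartitions int32) int32 {
	return rand.Int31n(numPartitions)
}

// pickByKey illustrates the idea behind sarama.NewHashPartitioner:
// the same key always maps to the same partition.
func pickByKey(key string, numPartitions int32) int32 {
	h := fnv.New32a()
	h.Write([]byte(key))
	return int32(h.Sum32() % uint32(numPartitions))
}

func main() {
	fmt.Println("random:", pickRandom(4))
	fmt.Println("keyed:", pickByKey("web_log", 4))
	fmt.Println("keyed again:", pickByKey("web_log", 4)) // same value as the line above
}
```

Key-based partitioning matters when message ordering per key is required, since Kafka only guarantees ordering within a single partition.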

Connect to Kafka and consume messages

package main

import (
	"fmt"
	"sync"

	"github.com/Shopify/sarama"
)

// kafka consumer

func main() {
	consumer, err := sarama.NewConsumer([]string{"127.0.0.1:9092"}, nil)
	if err != nil {
		fmt.Printf("fail to start consumer, err:%v\n", err)
		return
	}
	partitionList, err := consumer.Partitions("web_log") // get all partition IDs for the topic
	if err != nil {
		fmt.Printf("fail to get list of partitions, err:%v\n", err)
		return
	}
	fmt.Println(partitionList)
	var wg sync.WaitGroup
	for _, partition := range partitionList { // Partitions returns the IDs themselves, so range over the values, not the indexes
		// create a partition consumer for each partition
		pc, err := consumer.ConsumePartition("web_log", partition, sarama.OffsetNewest)
		if err != nil {
			fmt.Printf("failed to start consumer for partition %d, err:%v\n", partition, err)
			return
		}
		defer pc.AsyncClose()
		// consume messages from each partition asynchronously
		wg.Add(1)
		go func(pc sarama.PartitionConsumer) {
			defer wg.Done()
			for msg := range pc.Messages() {
				fmt.Printf("Partition:%d Offset:%d Key:%s Value:%s\n", msg.Partition, msg.Offset, msg.Key, msg.Value)
			}
		}(pc)
	}
	wg.Wait() // without this, main would return immediately and kill the consumer goroutines
}
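The loop above is a standard Go fan-out pattern: one goroutine per partition channel, joined with a sync.WaitGroup so main does not exit early. A stdlib-only sketch of the same pattern, where plain channels stand in for pc.Messages():

```go
package main

import (
	"fmt"
	"sync"
)

// drainAll consumes every channel in parts concurrently, one goroutine
// per channel, and returns the total number of messages received.
func drainAll(parts []<-chan string) int {
	var (
		wg    sync.WaitGroup
		mu    sync.Mutex
		total int
	)
	for _, ch := range parts {
		wg.Add(1)
		go func(ch <-chan string) {
			defer wg.Done()
			for msg := range ch {
				mu.Lock()
				total++
				mu.Unlock()
				fmt.Println(msg)
			}
		}(ch)
	}
	wg.Wait() // block until every channel is closed and drained
	return total
}

func main() {
	// Three buffered channels stand in for three partition message streams.
	parts := make([]<-chan string, 3)
	for i := range parts {
		ch := make(chan string, 1)
		ch <- fmt.Sprintf("msg from partition %d", i)
		close(ch)
		parts[i] = ch
	}
	fmt.Println("total:", drainAll(parts))
}
```

Passing the channel (or the partition consumer) as a goroutine argument, rather than capturing the loop variable in the closure, avoids the classic loop-variable capture bug in Go versions before 1.22.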

 
