ClickHouse cannot parse JSON when consuming from Kafka
For example:
DB::Exception: Cannot parse input: expected { before: 555: (at row 1)
https://www.cqmaple.com/201907/clickhouse-kafka-engine.html
CREATE TABLE queue213 (
timestamp UInt64,
level String,
message String
) ENGINE = Kafka SETTINGS kafka_broker_list = '192.168.202.136:9092',
kafka_topic_list = 'log_test',
kafka_group_name = 'group1',
kafka_format = 'JSONEachRow',
kafka_row_delimiter = '\n',
kafka_num_consumers = 1;
SELECT count(*) FROM queue213;
The problem lies in the engine version: I was using 19.3.4. Version 19.1 does not have this issue, and 19.5.2.6 fixed it, so only the intermediate releases are affected.
Cause: the delimiter between records in the message was not specified, so the input could not be parsed.
Fix: add kafka_row_delimiter = '\n', i.e. the kafka_row_delimiter line in the DDL above.
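To see why the delimiter matters, recall that JSONEachRow expects one JSON object per line, with records separated by '\n'. A minimal Python sketch of a well-formed payload (the sample rows below are made up to match the queue213 schema):

```python
import json

# Hypothetical sample rows matching the queue213 columns (timestamp, level, message)
rows = [
    {"timestamp": 1562054400, "level": "INFO", "message": "service started"},
    {"timestamp": 1562054401, "level": "ERROR", "message": "connection lost"},
]

# JSONEachRow: one JSON object per line, records separated by '\n' --
# the same delimiter that the kafka_row_delimiter setting declares.
payload = "\n".join(json.dumps(r) for r in rows) + "\n"
print(payload)
```

A message like `555` (a bare value with no leading `{`) is exactly what triggers the "expected { before: 555" error above.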
Issue 2:
Kafka runs as a cluster, so kafka_broker_list should contain the full cluster configuration rather than a single node; otherwise consumption is unstable.
CREATE TABLE yang_mysql1.queue213 (
`timestamp` UInt64,
`level` String,
`message` String
) ENGINE = Kafka SETTINGS kafka_broker_list = '192.168.202.135:9092,192.168.202.136:9092,192.168.202.185:9092',
kafka_topic_list = 'log_test',
kafka_group_name = 'group1',
kafka_format = 'JSONEachRow',
kafka_row_delimiter = '\n',
kafka_num_consumers = 1;

SELECT * FROM yang_mysql1.queue213;
Reference: https://github.com/yandex/ClickHouse/issues/4442
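The broker list is a single comma-separated string of host:port pairs, so when the node list lives elsewhere (inventory, config file) it can be assembled like this; a minimal sketch using the three node addresses from the DDL above:

```python
# All three Kafka nodes from the example cluster -- listing only one of them
# is what made consumption unstable in the first attempt.
brokers = [
    "192.168.202.135:9092",
    "192.168.202.136:9092",
    "192.168.202.185:9092",
]

# kafka_broker_list expects one comma-separated string of host:port pairs.
kafka_broker_list = ",".join(brokers)
print(kafka_broker_list)
```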