Flume-ng: Inserting Data into HBase 0.96.0
First, edit the flume-node.conf file in the conf directory of the flume folder on the node (see the earlier article for the original configuration), modifying it as follows:
```
agent.sinks = k1
agent.sinks.k1.type = hbase
agent.sinks.k1.table = hello
agent.sinks.k1.columnFamily = cf
agent.sinks.k1.column = col1
agent.sinks.k1.serializer = org.apache.flume.sink.hbase.SimpleHbaseEventSerializer
agent.sinks.k1.channel = memoryChannel
```
Unlike the earlier setup, however, getting this to work is not so simple, because of dependency version conflicts. Replace the protobuf jar in flume's lib folder with the 2.5.0 version shipped with Hadoop 2.2.0, and likewise replace the guava jar in flume's lib folder with the one from hadoop-2.2.0, deleting the original jar files. Restart the agent and the change takes effect.
The SimpleHbaseEventSerializer bundled with flume-ng provides only the most basic insertion of data into HBase. For anything beyond that, you have to write your own HbaseEventSerializer class: define it under apache-flume-1.4.0-src/flume-ng-sinks/flume-ng-hbase-sink/src/main/java and implement Flume's HbaseEventSerializer interface. A simple example:
```java
import java.util.LinkedList;
import java.util.List;
import java.util.Map;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.conf.ComponentConfiguration;
import org.apache.flume.sink.hbase.HbaseEventSerializer;
import org.apache.hadoop.hbase.client.Increment;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Row;
import org.apache.hadoop.hbase.util.Bytes;

import com.google.common.collect.Maps;

public class MyHBaseSerializer implements HbaseEventSerializer {

    private static final String[] COLUMNS = "column1,column2".split(",");
    private static final String[] PARAMS  = "col1,col2".split(",");

    private byte[] columnFamily = "cf".getBytes();
    private byte[] content;

    @Override
    public void configure(Context context) {
    }

    @Override
    public void configure(ComponentConfiguration conf) {
    }

    @Override
    public void initialize(Event event, byte[] columnFamily) {
        this.content = event.getBody();
        this.columnFamily = columnFamily;
    }

    @Override
    public List<Row> getActions() {
        // Split the event body into two halves, one per column.
        String string = Bytes.toString(content);
        String value1 = string.substring(0, string.length() / 2);
        String value2 = string.substring(string.length() / 2, string.length());
        Map<String, String> map = Maps.newHashMap();
        map.put(PARAMS[0], value1);
        map.put(PARAMS[1], value2);

        List<Row> actions = new LinkedList<Row>();
        // Use the current timestamp as the row key.
        String rowKey = String.valueOf(System.currentTimeMillis());
        Put put = new Put(Bytes.toBytes(rowKey));
        for (int i = 0; i < COLUMNS.length; i++) {
            String value = map.get(PARAMS[i]);
            if (value == null) {
                value = "";
            }
            put.add(columnFamily, Bytes.toBytes(COLUMNS[i]), Bytes.toBytes(value));
        }
        actions.add(put);
        return actions;
    }

    @Override
    public List<Increment> getIncrements() {
        // This serializer performs no counter increments.
        return new LinkedList<Increment>();
    }

    @Override
    public void close() {
    }
}
```
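The two-way split performed inside getActions can be checked in isolation with plain Java, without any Flume or HBase dependencies. This is just an illustrative sketch; the class name SplitDemo and the sample string are not part of the original code:

```java
public class SplitDemo {
    public static void main(String[] args) {
        // Stand-in for the event body received by the serializer.
        String body = "HelloFlume";
        // Same split as in getActions(): first half and second half.
        String value1 = body.substring(0, body.length() / 2);
        String value2 = body.substring(body.length() / 2);
        System.out.println("column1=" + value1);
        System.out.println("column2=" + value2);
    }
}
```

For an odd-length body, the first column receives the shorter half, since integer division rounds down.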
This class splits each line of the file's content into two halves and inserts them into the two columns named column1 and column2, using the current time as the rowKey. Once done, recompile and repackage the flume-ng code, then replace the corresponding jar file in flume-ng's lib folder. Finally, change the agent.sinks.k1.serializer value from the configuration above to test.MyHBaseSerializer, where test is the package name.
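Assuming the package name test as stated above, the only configuration line that changes is the serializer entry; the rest of the sink configuration stays the same:

```
agent.sinks.k1.serializer = test.MyHBaseSerializer
```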