A previous article covered using Flume to write data into HDFS and into a plain directory (http://www.linuxidc.com/Linux/2014-01/95798.htm); this article continues with flume-ng writing data into HBase 0.96.0. First, edit the flume-node.conf file in the conf directory of the flume folder on the node (see the previous article for the original configuration) and change it as follows:
agent.sinks = k1
agent.sinks.k1.type = hbase
agent.sinks.k1.table = hello
agent.sinks.k1.columnFamily = cf
agent.sinks.k1.column = col1
agent.sinks.k1.serializer = org.apache.flume.sink.hbase.SimpleHbaseEventSerializer
agent.sinks.k1.channel = memoryChannel
Unlike last time, however, getting a working result is not so simple, because of dependency version conflicts. You need to replace the protobuf jar in Flume's lib folder with the 2.5.0 version shipped with Hadoop 2.2.0, and likewise replace Flume's guava jar with the one from Hadoop 2.2.0, deleting the original jar files in both cases. The change takes effect once Flume is restarted. The SimpleHbaseEventSerializer bundled with flume-ng only provides the most basic way of inserting data into HBase; for anything beyond that you have to write your own HbaseEventSerializer class: define a class under apache-flume-1.4.0-src/flume-ng-sinks/flume-ng-hbase-sink/src/main/java that implements Flume's HbaseEventSerializer interface. A simple example follows:
import java.util.LinkedList;
import java.util.List;
import java.util.Map;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.conf.ComponentConfiguration;
import org.apache.flume.sink.hbase.HbaseEventSerializer;
import org.apache.hadoop.hbase.client.Increment;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Row;
import org.apache.hadoop.hbase.util.Bytes;

import com.google.common.collect.Maps;

public class MyHBaseSerializer implements HbaseEventSerializer {
    private static final String[] COLUMNS = "column1,column2".split(",");
    private static final String[] PARAMS = "col1,col2".split(",");
    private byte[] columnFamily = "cf".getBytes();
    private byte[] content;

    @Override
    public void configure(Context context) {
    }

    @Override
    public void configure(ComponentConfiguration conf) {
    }

    @Override
    public void initialize(Event event, byte[] columnFamily) {
        this.content = event.getBody();
        this.columnFamily = columnFamily;
    }

    @Override
    public List<Row> getActions() {
        // Split the event body in half: the first half goes to column1,
        // the second half to column2.
        String string = Bytes.toString(content);
        String value1 = string.substring(0, string.length() / 2);
        String value2 = string.substring(string.length() / 2, string.length());
        Map<String, String> map = Maps.newHashMap();
        map.put(PARAMS[0], value1);
        map.put(PARAMS[1], value2);
        List<Row> actions = new LinkedList<Row>();
        // Use the current timestamp as the row key.
        String rowKey = String.valueOf(System.currentTimeMillis());
        Put put = new Put(Bytes.toBytes(rowKey));
        for (int i = 0; i < COLUMNS.length; i++) {
            String value = map.get(PARAMS[i]);
            if (value == null)
                value = "";
            put.add(columnFamily, Bytes.toBytes(COLUMNS[i]), Bytes.toBytes(value));
        }
        actions.add(put);
        return actions;
    }

    @Override
    public List<Increment> getIncrements() {
        // This serializer performs no counter increments.
        return new LinkedList<Increment>();
    }

    @Override
    public void close() {
    }
}
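The splitting logic inside getActions() can be checked in isolation. Below is a minimal sketch, with a hypothetical helper class SplitDemo (not part of Flume), that mirrors how the event body is cut into the two column values, with no Flume or HBase dependencies:

```java
import java.nio.charset.StandardCharsets;

public class SplitDemo {
    // Mirrors MyHBaseSerializer.getActions(): the event body is cut in
    // half, each half becoming one column value.
    static String[] split(byte[] body) {
        String s = new String(body, StandardCharsets.UTF_8);
        int mid = s.length() / 2;
        return new String[] { s.substring(0, mid), s.substring(mid) };
    }

    public static void main(String[] args) {
        String[] parts = split("HelloWorld".getBytes(StandardCharsets.UTF_8));
        System.out.println(parts[0] + " | " + parts[1]); // Hello | World
    }
}
```

For an odd-length body, the first half gets the shorter piece, since integer division rounds down.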
This class splits each event body into two halves and inserts them into the two columns named column1 and column2, using the current time as the rowKey. Once done, recompile and repackage the flume-ng source, replace the corresponding jar file under flume-ng's lib folder, and then change the agent.sinks.k1.serializer value in the configuration above to test.MyHBaseSerializer, where test is the package name.
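Putting it together, the sink section of flume-node.conf would then read as follows (assuming, as above, that the class was compiled under a package named test; only the serializer line differs from the earlier configuration):

```properties
agent.sinks = k1
agent.sinks.k1.type = hbase
agent.sinks.k1.table = hello
agent.sinks.k1.columnFamily = cf
agent.sinks.k1.serializer = test.MyHBaseSerializer
agent.sinks.k1.channel = memoryChannel
```

The column property is omitted here since MyHBaseSerializer hard-codes its own column names rather than reading them from the configuration.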