In the HBase shell, I created my table with:
create 'pig_table','cf'
In Pig, this is the result of the alias that I want to store into pig_table:
DUMP B;
which produces tuples with 6 fields:
(D1|30|2014-01-01 13:00,D1,30,7.0,2014-01-01 13:00,DEF)
(D1|30|2014-01-01 22:00,D1,30,1.0,2014-01-01 22:00,JKL)
(D10|20|2014-01-01 11:00,D10,20,4.0,2014-01-01 11:00,PQR)
...The first field is the concatenation of the 2nd, 3rd, and 5th fields, and will be used as the HBase row key.
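(For context, the composite row key described above could be built in Pig roughly like this. This is a minimal sketch, not the asker's actual script: the upstream alias A and the field names device_id, cost, hours, start_time, code are assumptions inferred from the column names in the STORE statement, and it assumes a Pig version whose CONCAT accepts more than two arguments, otherwise the calls must be nested.)

-- hypothetical upstream alias A; field names assumed from the cf:* column list
B = FOREACH A GENERATE
    CONCAT(device_id, '|', cost, '|', start_time) AS rowkey,
    device_id, cost, hours, start_time, code;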
However,
STORE B INTO 'hbase://pig_table' USING org.apache.pig.backend.hadoop.hbase.HBaseStorage ( 'cf:device_id,cf:cost,cf:hours,cf:start_time,cf:code')
results in:
Failed to produce result in "hbase:pig_table"
The log gives me:
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.pig.data.DataByteArray
at org.apache.pig.backend.hadoop.hbase.HBaseStorage.objToBytes(HBaseStorage.java:924)
at org.apache.pig.backend.hadoop.hbase.HBaseStorage.putNext(HBaseStorage.java:875)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:139)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:98)
at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:551)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:85)
at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.write(WrappedReducer.java:99)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce$Reduce.runPipeline(PigGenericMapReduce.java:468)
... 11 more
What is wrong with my syntax?
Posted on 2014-01-31 23:38:13
It appears that HBaseStorage does not automatically convert a tuple's data fields to chararray, which is necessary before they can be stored in HBase. I simply added:
C = FOREACH B GENERATE
    (chararray)$0,
    (chararray)$1,
    (chararray)$2,
    (chararray)$3,
    (chararray)$4,
    (chararray)$5;
STORE C INTO 'hbase://pig_table' USING org.apache.pig.backend.hadoop.hbase.HBaseStorage ('cf:device_id,cf:cost,cf:hours,cf:start_time,cf:code');
https://stackoverflow.com/questions/21469714