Error while bulk loading in HBase

Stack Overflow user
Asked on 2012-06-14 22:54:58
2 answers · 2.4K views · 0 followers · 3 votes

I am trying HBase bulkLoad through a Java MapReduce program, which I run from Eclipse.

But I am getting the following error:

12/06/14 20:04:28 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
12/06/14 20:04:28 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
12/06/14 20:04:29 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
12/06/14 20:04:29 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
12/06/14 20:04:29 INFO input.FileInputFormat: Total input paths to process : 1
12/06/14 20:04:29 WARN snappy.LoadSnappy: Snappy native library not loaded
12/06/14 20:04:29 INFO mapred.JobClient: Running job: job_local_0001
12/06/14 20:04:29 INFO mapred.MapTask: io.sort.mb = 100
12/06/14 20:04:29 INFO mapred.MapTask: data buffer = 79691776/99614720
12/06/14 20:04:29 INFO mapred.MapTask: record buffer = 262144/327680
12/06/14 20:04:29 WARN mapred.LocalJobRunner: job_local_0001
java.lang.IllegalArgumentException: Can't read partitions file
    at org.apache.hadoop.hbase.mapreduce.hadoopbackport.TotalOrderPartitioner.setConf(TotalOrderPartitioner.java:111)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:560)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:639)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:210)
Caused by: java.io.FileNotFoundException: File _partition.lst does not exist.
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:383)
    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:251)
    at org.apache.hadoop.fs.FileSystem.getLength(FileSystem.java:776)
    at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1424)
    at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1419)
    at org.apache.hadoop.hbase.mapreduce.hadoopbackport.TotalOrderPartitioner.readPartitions(TotalOrderPartitioner.java:296)
    at org.apache.hadoop.hbase.mapreduce.hadoopbackport.TotalOrderPartitioner.setConf(TotalOrderPartitioner.java:82)
    ... 6 more
12/06/14 20:04:30 INFO mapred.JobClient:  map 0% reduce 0%
12/06/14 20:04:30 INFO mapred.JobClient: Job complete: job_local_0001
12/06/14 20:04:30 INFO mapred.JobClient: Counters: 0

I googled a lot but could not find any solution.

I tried to run the same program from the console, but got the following error:

 hadoop jar /home/user/hbase-0.90.4-cdh3u2/lib/zookeeper-3.3.3-cdh3u2.jar /home/user/hadoop-0.20.2-cdh3u2/Test.jar BulkLoadHBase_1 /bulkLoad.txt /out
Exception in thread "main" java.lang.NoSuchMethodException: org.apache.zookeeper.server.quorum.QuorumPeer.main([Ljava.lang.String;)
    at java.lang.Class.getMethod(Class.java:1605)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:180)
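
For reference (not part of the original question): hadoop jar takes the job jar as its first argument, and when that jar's manifest declares a main class, all remaining arguments are passed to it as program arguments. Passing the ZooKeeper jar first is therefore why Hadoop tried to invoke QuorumPeer instead of BulkLoadHBase_1. A hedged sketch of the intended invocation, assuming the dependency jars are put on the client classpath instead of being passed as the job jar (paths are the ones from the question; the HBase jar file name is a guess):

# make the HBase/ZooKeeper classes visible to the hadoop client
export HADOOP_CLASSPATH=/home/user/hbase-0.90.4-cdh3u2/hbase-0.90.4-cdh3u2.jar:/home/user/hbase-0.90.4-cdh3u2/lib/zookeeper-3.3.3-cdh3u2.jar

# run the job jar with the driver class and its input/output arguments
hadoop jar /home/user/hadoop-0.20.2-cdh3u2/Test.jar BulkLoadHBase_1 /bulkLoad.txt /out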

My code:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat;
import org.apache.hadoop.hbase.mapreduce.PutSortReducer;
import org.apache.hadoop.hbase.mapreduce.hadoopbackport.TotalOrderPartitioner;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;

public class BulkLoadHBase_1 {

    public static class BulkLoadHBase_1Mapper 
            extends Mapper<Text, Text, ImmutableBytesWritable, Put>{

        public void map(Text key, Text value, Context context
                        ) throws IOException, InterruptedException {

            System.out.println("KEY  "+key.toString());
            System.out.println("VALUES : "+value);
            System.out.println("Context : "+context);

            ImmutableBytesWritable ibw =
                    new ImmutableBytesWritable(Bytes.toBytes(key.toString()));

            String val = value.toString();
            byte[] b = Bytes.toBytes(val);
            Put p = new Put(Bytes.toBytes(key.toString()));

            p.add(Bytes.toBytes("cf"),Bytes.toBytes("c"),Bytes.toBytes(val));

            context.write(ibw, p);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        Job job = new Job(conf, "bulk-load");

        job.setJarByClass(BulkLoadHBase_1.class);
        job.setMapperClass(BulkLoadHBase_1Mapper.class);

        job.setReducerClass(PutSortReducer.class);
        job.setOutputKeyClass(ImmutableBytesWritable.class);
        job.setOutputValueClass(Put.class);
        job.setPartitionerClass(TotalOrderPartitioner.class);
        job.setInputFormatClass(KeyValueTextInputFormat.class);

        FileInputFormat.addInputPath(job,
                     new Path("/home/user/Desktop/bulkLoad.txt"));
        HFileOutputFormat.setOutputPath(job,
                     new Path("/home/user/Desktop/HBASE_BulkOutput/"));     

       System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
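
For reference (this note is not part of the original post): the "Can't read partitions file" exception comes from TotalOrderPartitioner looking for a partitions file (_partition.lst by default) that nothing in the driver above ever writes. In the usual HBase bulk-load flow that file is produced by HFileOutputFormat.configureIncrementalLoad, which derives the partition boundaries from the target table's regions. A minimal sketch of a driver using it, assuming a pre-created target table (the name "mytable" is a placeholder) and an HBase configuration reachable from the classpath:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class BulkLoadDriverSketch {

    public static void main(String[] args) throws Exception {
        // Pick up hbase-site.xml / core-site.xml from the classpath.
        Configuration conf = HBaseConfiguration.create();
        Job job = new Job(conf, "bulk-load");

        job.setJarByClass(BulkLoadDriverSketch.class);
        job.setMapperClass(BulkLoadHBase_1.BulkLoadHBase_1Mapper.class);
        job.setMapOutputKeyClass(ImmutableBytesWritable.class);
        job.setMapOutputValueClass(Put.class);
        job.setInputFormatClass(KeyValueTextInputFormat.class);

        // "mytable" is a placeholder; the table must already exist, because its
        // region boundaries are what configureIncrementalLoad turns into the
        // partitions file read by TotalOrderPartitioner.
        HTable table = new HTable(conf, "mytable");

        // Sets the output format, the PutSortReducer, the TotalOrderPartitioner,
        // and writes the partitions file that was missing in the run above.
        HFileOutputFormat.configureIncrementalLoad(job, table);

        FileInputFormat.addInputPath(job, new Path(args[0]));   // input on HDFS
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // HFile output dir

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

After such a job finishes, the generated HFiles are typically moved into the table with the completebulkload tool (LoadIncrementalHFiles).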

2 Answers

Stack Overflow user

Accepted answer

Answered on 2012-10-15 18:27:55

The problem was: this program needs to be run in distributed mode, and the required JARs should be shipped with the job...
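
As an aside (not the answerer's words): beyond job.setJarByClass, one common way to ship the HBase dependencies with the job is the helper below, assuming the HBase version in use provides it; job here is the Job built in the driver.

// Adds the HBase jars (and their dependencies) found on the client classpath
// to the job's distributed cache so that map/reduce tasks can load them.
org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addDependencyJars(job);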

Votes: 0

Stack Overflow user

Answered on 2012-06-21 16:25:05

Did you start HBase in distributed mode?! If so, the following line:

org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:210)

in the stack trace shows that the map-reduce job is running in local mode, not in distributed mode.

Also note that if you want to run the command from the console, your input file must reside on the Hadoop file system (HDFS), not on a regular (e.g. NTFS or EXT3) file system.
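
For reference, copying the local input file onto HDFS (using the paths that appear in the question; the commands are illustrative) might look like:

hadoop fs -put /home/user/Desktop/bulkLoad.txt /bulkLoad.txt
hadoop fs -ls /bulkLoad.txt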

Regards

Votes: 4
Original content from Stack Overflow: https://stackoverflow.com/questions/11035798
