
How can a Sqoop data import avoid throwing the error org.apache.sqoop.hive.HiveConfig?

Stack Overflow user
Asked on 2016-10-31 04:06:47
1 answer · 1.7K views · 0 followers · 1 vote
  1. I installed Hue 3.10 on Ambari HDP 2.5.0.
  2. hue.ini is fully configured (see the sketch below).
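For context, the parts of hue.ini that matter when Hue submits Oozie/Sqoop jobs look roughly like the following. This is a hedged sketch with placeholder host names, not the asker's actual configuration:

    [hadoop]
      [[hdfs_clusters]]
        [[[default]]]
          # HDFS and WebHDFS endpoints (placeholder hosts)
          fs_defaultfs=hdfs://namenode-host:8020
          webhdfs_url=http://namenode-host:50070/webhdfs/v1

    [liboozie]
      # Oozie server that Hue submits workflows to (placeholder host)
      oozie_url=http://oozie-host:11000/oozie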

My problem is that when Sqoop syncs data from MySQL to Hive, it throws an exception:

[main] ERROR org.apache.sqoop.hive.HiveConfig - Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.

[main] ERROR org.apache.sqoop.hive.HiveConfig - Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.

[main] ERROR org.apache.sqoop.tool.ImportTool – Encountered IOException running import job: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf

    at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
    at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:397)
    at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:384)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:342)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:246)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:524)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
    at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:202)
    at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:182)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:51)
    at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:48)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:242)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)

However, if I run the same Sqoop script from the command line, it works!
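For reference, a command-line invocation of that kind of Hive import looks roughly like this. It is a minimal sketch; the JDBC URL, credentials, and table names are placeholders, not the asker's actual job:

    # Import a MySQL table into Hive (placeholder connection details)
    sqoop import \
      --connect jdbc:mysql://mysql-host:3306/mydb \
      --username myuser -P \
      --table mytable \
      --hive-import \
      --hive-table mytable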

I added the environment variable HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/hdp/current/hive-client/lib to /etc/profile, but it still does not work. I have tried several times to solve this on my own and failed.
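Two details are worth checking here, stated as likely causes rather than confirmed ones. First, a Java classpath entry that names a bare directory picks up class files only, not the jars inside it, so the entry needs a trailing /*. Second, the error message itself asks for HIVE_CONF_DIR. Also note that an Oozie launcher runs inside a YARN container, which never sources /etc/profile, so exports placed there do not reach the job. A sketch of the corrected exports:

    # Trailing /* is required: a bare directory on a Java classpath adds
    # classes, not the jars inside it.
    export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:/usr/hdp/current/hive-client/lib/*
    # The error message also asks for HIVE_CONF_DIR (standard HDP path assumed)
    export HIVE_CONF_DIR=/etc/hive/conf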

The wrapper script is /usr/hdp/2.5.0.0-1245/hive/bin/hive. It looks like ${HADOOP_CLASSPATH} ends up pointing at /usr/hdp/2.5.0.0-1245/atlas/hook/hive/*?

    #!/bin/bash

    if [ -d "/usr/hdp/2.5.0.0-1245/atlas/hook/hive" ]; then
      if [ -z "${HADOOP_CLASSPATH}" ]; then
        export HADOOP_CLASSPATH=/usr/hdp/2.5.0.0-1245/atlas/hook/hive/*
      else
        export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:/usr/hdp/2.5.0.0-1245/atlas/hook/hive/*
      fi
    fi

    BIGTOP_DEFAULTS_DIR=${BIGTOP_DEFAULTS_DIR-/etc/default}
    [ -n "${BIGTOP_DEFAULTS_DIR}" -a -r ${BIGTOP_DEFAULTS_DIR}/hbase ] && . ${BIGTOP_DEFAULTS_DIR}/hbase

    export HIVE_HOME=${HIVE_HOME:-/usr/hdp/2.5.0.0-1245/hive}
    export HADOOP_HOME=${HADOOP_HOME:-/usr/hdp/2.5.0.0-1245/hadoop}
    export ATLAS_HOME=${ATLAS_HOME:-/usr/hdp/2.5.0.0-1245/atlas}

    HCATALOG_JAR_PATH=/usr/hdp/2.5.0.0-1245/hive-hcatalog/share/hcatalog/hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:/usr/hdp/2.5.0.0-1245/hive-hcatalog/share/hcatalog/hive-hcatalog-server-extensions-1.2.1000.2.5.0.0-1245.jar:/usr/hdp/2.5.0.0-1245/hive-hcatalog/share/webhcat/java-client/hive-webhcat-java-client-1.2.1000.2.5.0.0-1245.jar

    if [ -z "${HADOOP_CLASSPATH}" ]; then
      export HADOOP_CLASSPATH=${HCATALOG_JAR_PATH}
    else
      export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:${HCATALOG_JAR_PATH}
    fi

    exec "${HIVE_HOME}/bin/hive.distro" "$@"

How can I fix this?


1 Answer

Stack Overflow user
Answered on 2017-10-04 13:22:32 · 0 votes

For me, this problem showed up in the Ambari Workflow Editor. To fix it, on every Sqoop client node create a symlink to hive-exec.jar, pointing at the Hive client's lib directory where the jar lives. Then put hive-exec.jar into the Oozie share-lib folder on HDFS:

    # Become root and link hive-exec.jar into the Sqoop client directory
    su root
    cd /usr/hdp/current/sqoop-client/
    ln -s /usr/hdp/current/hive-client/lib/hive-exec.jar hive-exec.jar

    # Also copy the jar into the Sqoop client's lib directory
    cp hive-exec.jar lib/

    # As the hdfs user, upload the jar into the Oozie sharelib on HDFS
    su -l hdfs
    hdfs dfs -put hive-exec.jar /user/oozie/share/lib/sqoop
    hdfs dfs -put hive-exec.jar /user/oozie/share/lib/lib_20161117191926/sqoop
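Once the jar is in the sharelib, the Oozie server usually has to be told to pick it up. A hedged follow-up, where the Oozie URL is a placeholder:

    # Verify the jar actually landed in the sharelib
    hdfs dfs -ls /user/oozie/share/lib/lib_20161117191926/sqoop | grep hive-exec
    # Ask the running Oozie server to rescan the sharelib (placeholder URL)
    oozie admin -oozie http://oozie-host:11000/oozie -sharelibupdate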
The original content of this page was provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/40336448