I tried the following:
>> ./spark-shell --jars /home/my_path/my_jar.jar
In the shell, I then tried to import the package:
scala> import com.vertica.spark._
<console>:23: error: object vertica is not a member of package com
import com.vertica.spark._
It doesn't work. I also tried removing the leading slash (/) from the jar path:
>> ./spark-shell --jars home/my_path/my_jar.jar
It is still the same. There is also a warning:
20/04/21 22:34:40 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://ubuntu:4040
Spark context available as 'sc' (master = local[*], app id = local-1587488711233).
Spark session available as 'spark'.
Welcome to
But on the other hand, if I go into the shell and add the jar with :require using the same path, it imports successfully:
scala> :require /home/my_path/my_jar.jar
Added '/home/my_path/my_jar.jar' to classpath.
scala> import com.vertica.spark._
import com.vertica.spark._
What am I missing when adding jars via spark-shell itself?
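Before digging into environment issues, it can help to check from inside the shell what Spark actually registered. This is a minimal diagnostic sketch against the standard SparkContext API (the "my_jar" filter is just the jar name from the question). One detail worth knowing: :require patches the REPL's own classpath, while --jars primarily registers the jar with the SparkContext for shipping to executors, so the two can behave differently:

scala> // jars registered with the SparkContext (what --jars should populate)
scala> sc.listJars()

scala> // jars configured to be shipped to executors
scala> sc.jars

scala> // entries on the driver JVM's own classpath that mention the jar
scala> System.getProperty("java.class.path").split(java.io.File.pathSeparator).filter(_.contains("my_jar"))

If the jar shows up in sc.listJars() but not on java.class.path, the jar was registered but never reached the classpath the REPL resolves imports against, which would match the behavior described above.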
Posted on 2020-04-26 12:16:26
This problem may be caused by a Hadoop native-library issue. Try adding the two exports below to your .bashrc and sourcing it, and you should be fine:
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
and
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH
https://stackoverflow.com/questions/61349306
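If you apply this and want to confirm from spark-shell that the native Hadoop library is now being picked up (i.e. that the NativeCodeLoader warning should go away), Hadoop's NativeCodeLoader class exposes a static check. A small diagnostic sketch; the Boolean result shown is just the expected output when the native library loads:

scala> // true if Hadoop found and loaded libhadoop from the native library path
scala> org.apache.hadoop.util.NativeCodeLoader.isNativeCodeLoaded
res0: Boolean = true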