After upgrading DSE 4.6 to 4.7, my spark-jobserver 0.5.0 broke. If I run server_start.sh I get the error: "Failed to find Spark assembly in /usr/share/dse/spark/assembly/target/scala-2.10. You need to build Spark before running this program."
I found the code that raises the error in /usr/share/dse/spark/bin/compute-classpath.sh:
for f in ${assembly_folder}/spark-assembly*hadoop*.jar; do
  if [[ ! -e "$f" ]]; then
    echo "Failed to find Spark assembly in $assembly_folder" 1>&2
    echo "You need to build Spark before running this program." 1>&2
    exit 1
  fi
  ASSEMBLY_JAR="$f"
  num_jars=$((num_jars+1))
done

If I run /usr/share/dse/spark/bin/spark-submit, I get the same error.
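As far as I can tell, the check fails because when the glob matches no files, bash leaves the pattern unexpanded, so $f holds the literal pattern and the -e test fails. A minimal sketch of that behavior (the empty directory here is hypothetical):

assembly_folder=$(mktemp -d)  # hypothetical empty directory, no jars inside
for f in ${assembly_folder}/spark-assembly*hadoop*.jar; do
  # With no match (and nullglob unset), $f stays the literal glob pattern,
  # so the -e test fails and the error branch runs.
  [[ -e "$f" ]] || echo "no match, \$f is: $f"
done

So the real question is why no spark-assembly*hadoop*.jar exists in that folder after the upgrade.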
Answered 2015-05-27 16:22:18
If you are on DSE, you should most likely be launching the job server with dse spark-submit rather than hitting compute-classpath.sh. You can try modifying the startup script to use dse spark-submit, as in the following example.
# job server jar needs to appear first so its deps take higher priority
# need to explicitly include app dir in classpath so logging configs can be found
#CLASSPATH="$appdir:$appdir/spark-job-server.jar:$($SPARK_HOME/bin/compute-classpath.sh)"
#exec java -cp $CLASSPATH $GC_OPTS $JAVA_OPTS $LOGGING_OPTS $CONFIG_OVERRIDES $MAIN $conffile 2>&1 &
dse spark-submit --class $MAIN $appdir/spark-job-server.jar --driver-java-options "$GC_OPTS $JAVA_OPTS $LOGGING_OPTS" $conffile 2>&1 &
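As a quick sanity check before restarting, you can confirm the dse launcher is reachable and then relaunch the job server (a minimal sketch; whether your DSE version passes --help through the wrapper to spark-submit is an assumption):

which dse                  # confirm the DSE launcher is on the PATH
dse spark-submit --help    # should print spark-submit usage, not the assembly error
./server_start.sh          # relaunch the job server with the modified script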
Source: https://stackoverflow.com/questions/30478045