I am trying to run spark-jobserver with Spark 2.0. I cloned the spark-2.0-preview branch from the GitHub repository and followed the deployment guide, but when I try to deploy the server with bin/server_deploy.sh, I get compilation errors:
Error:
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestLoaderJob.java:4: cannot find symbol
[error] symbol: class DataFrame
[error] location: package org.apache.spark.sql
[error] import org.apache.spark.sql.DataFrame;
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java:13: java.lang.Object cannot be converted to org.apache.spark.sql.Row[]
[error] return sc.sql(data.getString("sql")).collect();
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestLoaderJob.java:25: cannot find symbol
[error] symbol: class DataFrame
[error] location: class spark.jobserver.JHiveTestLoaderJob
[error] final DataFrame addrRdd = sc.sql("SELECT * FROM default.test_addresses");
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JSqlTestJob.java:13: array required, but java.lang.Object found
[error] Row row = sc.sql("select 1+1").take(1)[0];
[info] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java: Some input files use or override a deprecated API.
[info] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java: Recompile with -Xlint:deprecation for details.
[error] (job-server-extras/compile:compileIncremental) javac returned nonzero exit code
Did I forget to add some dependencies?
Posted on 2017-03-15 14:40:42
I had a similar problem. I found that it fails because the API changed between Spark 1.x and 2.x. You can find the open issue on GitHub: https://github.com/spark-jobserver/spark-jobserver/issues/760
I made a quick fix that resolved the problem and let me deploy jobserver. I submitted a pull request: https://github.com/spark-jobserver/spark-jobserver/pull/762
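For context (this is not the actual diff from the pull request, just a sketch of the kind of change involved): in Spark 2.x the Java `DataFrame` class was removed in favor of `Dataset<Row>`, and `collect()` on a Dataset no longer returns `Row[]`, which is exactly what the compile errors above complain about. A minimal illustration, assuming the Spark 2.x `spark-sql` dependency is on the classpath:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import java.util.List;

public class Spark2MigrationSketch {
    public static void main(String[] args) {
        // Spark 2.x entry point; replaces SQLContext/HiveContext from 1.x.
        SparkSession spark = SparkSession.builder()
                .appName("spark2-migration-sketch")
                .master("local[*]")
                .getOrCreate();

        // Spark 1.x:  DataFrame result = sqlContext.sql("select 1+1");
        // Spark 2.x:  DataFrame no longer exists in the Java API; use Dataset<Row>.
        Dataset<Row> result = spark.sql("select 1+1");

        // Spark 1.x:  Row[] rows = result.collect();
        // Spark 2.x:  collect() returns Object from Java's perspective;
        //             use collectAsList() to get a typed List<Row>.
        List<Row> rows = result.collectAsList();
        Row first = rows.get(0);
        System.out.println(first.get(0));

        spark.stop();
    }
}
```

The same substitution (`DataFrame` → `Dataset<Row>`, `collect()` → `collectAsList()`) applies to the failing files in job-server-extras such as JHiveTestLoaderJob.java and JSqlTestJob.java.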
https://stackoverflow.com/questions/42812775