I'm trying to launch almond Jupyter via Docker, or online from https://almond.sh. In the spark.ipynb notebook, the line with NotebookSparkSession produces an error.
import $ivy.`org.apache.spark::spark-sql:2.4.0`
import $ivy.`sh.almond::almond-spark:0.3.0`
import org.apache.log4j.{Level, Logger}
Logger.getLogger("org").setLevel(Level.OFF)
import org.apache.spark.sql._
val spark = {
NotebookSparkSession.builder()
.master("local[*]")
.getOrCreate()
}

With Docker, I get this exception:
java.lang.NoSuchMethodError: coursier.package$Resolution$.apply$default$13()Lscala/collection/immutable/Map;
org.apache.spark.sql.ammonitesparkinternals.SparkDependencies$.sparkJars(SparkDependencies.scala:134)
org.apache.spark.sql.ammonitesparkinternals.AmmoniteSparkSessionBuilder.getOrCreate(AmmoniteSparkSessionBuilder.scala:234)
org.apache.spark.sql.almondinternals.NotebookSparkSessionBuilder.getOrCreate(NotebookSparkSessionBuilder.scala:62)

I tried the online version with the same spark.ipynb, but I get a different exception:
java.lang.AssertionError: assertion failed:
NotebookSparkSession.builder()
while compiling: cmd3.sc
during phase: superaccessors
library version: version 2.12.8
compiler version: version 2.12.8
reconstructed args: -nowarn -Yresolve-term-conflict:object
last tree to typer: This(class cmd3)
tree position: line 19 of cmd3.sc
tree tpe: cmd3.this.type
symbol: final class cmd3 in package $sess
symbol definition: final class cmd3 extends Serializable (a ClassSymbol)
symbol package: ammonite.$sess
symbol owners: class cmd3
call site: class Helper in class cmd3 in package $sess

Posted on 2019-08-25 15:01:37
The Scala version (2.12.8) not matching Spark 2.4.0 is a clear indication of the problem. See the Spark 2.4 release notes:

it has experimental support to 2.12 scala version

Alternatively, I think you need to use Spark 2.4.3 to get support for Scala 2.12.8, as the documentation says.
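As a sketch of what that change looks like in the notebook, the dependency imports in the first cell would bump the Spark version; the almond-spark version below is an assumption and should be matched to your almond installation per the almond release notes:

```scala
// Pull in a Spark build that properly supports Scala 2.12
// (2.4.3 instead of 2.4.0; 2.4.0's Scala 2.12 support was experimental).
import $ivy.`org.apache.spark::spark-sql:2.4.3`
// almond-spark version is an assumption here: pick the release
// that corresponds to the almond version you launched the kernel with.
import $ivy.`sh.almond::almond-spark:0.6.0`
```

The `::` in these Ammonite `$ivy` imports appends the kernel's Scala binary version suffix (here `_2.12`) to the artifact name, which is why the Spark artifact and the notebook's Scala version have to agree.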
https://stackoverflow.com/questions/57640297