My CDH cluster runs Spark 2.1 with Scala version 2.11.8.
I created a UDF inside one of my methods.
When I execute the method directly in the Spark shell it works fine, but the same UDF fails to register when the method is invoked through spark-submit.
Here is the content of build.sbt:
name := "newtest"
version := "0.0.2"
scalaVersion := "2.10.5"
sbtPlugin := true
val sparkVersion = "2.1.0"
mainClass in (Compile, run) := Some("com.testPackage.sq.newsparktest.Test")
assemblyJarName in assembly := "newtest.jar"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
  "com.databricks" %% "spark-avro" % "3.2.0",
  "org.apache.spark" %% "spark-hive" % "1.5.0" % "provided",
  "com.amazonaws" % "aws-java-sdk" % "1.0.002"
)
libraryDependencies +=
  "log4j" % "log4j" % "1.2.15" excludeAll(
    ExclusionRule(organization = "com.sun.jdmk"),
    ExclusionRule(organization = "com.sun.jmx"),
    ExclusionRule(organization = "javax.jms")
  )
resolvers += "SparkPackages" at "https://dl.bintray.com/spark-packages/maven/"
resolvers += Resolver.url("bintray-sbt-plugins", url("http://dl.bintray.com/sbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
Here is plugins.sbt:
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "5.0.1")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
Here is the code snippet that triggers the problem:
val timestamp_diff = (endTime: Timestamp, startTime: Timestamp) => {
  (endTime.getTime() - startTime.getTime())
}
logger.info("Function Created") // --------> **This works fine**
spark.udf.register("timestamp_diff", timestamp_diff)
// -----> **The line above causes the error below**
Here is the error:
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
at com.testPackage.sq.newsparktest$.main(newsparktest.scala:49)
at com.testPackage.sq.newsparktest.main(newsparktest.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Note: as mentioned above, the failing command runs perfectly well when executed directly in the REPL, just not through spark-submit.
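For what it's worth, the lambda itself is not the problem; the `NoSuchMethodError` on `scala.reflect` points to a binary mismatch at runtime, not a logic bug. As a sanity check, the same logic can be run outside Spark entirely (the object name `TimestampDiffCheck` and the sample timestamps here are ours, for illustration only):

```scala
import java.sql.Timestamp

// Standalone copy of the UDF body from the question:
// difference between two timestamps in milliseconds.
object TimestampDiffCheck {
  val timestamp_diff = (endTime: Timestamp, startTime: Timestamp) => {
    endTime.getTime() - startTime.getTime()
  }

  def main(args: Array[String]): Unit = {
    val start = new Timestamp(1000L) // 1 second after the epoch
    val end = new Timestamp(4500L)   // 4.5 seconds after the epoch
    println(timestamp_diff(end, start)) // prints 3500
  }
}
```

If this compiles and runs under the same Scala version as the assembled jar, the failure is almost certainly in the build configuration rather than the function.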
Posted on 2017-10-09 02:36:24
This looks like a Scala version problem. From the Spark 2.1 documentation:
Spark runs on Java 7+, Python 2.6+/3.4+ and R 3.1+. For the Scala API, Spark 2.1.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).
So you probably want to use a newer Scala version:
scalaVersion := "2.11.11"
Note that Scala 2.12.x is not yet supported.
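Two other things in the posted build.sbt are worth a look, though the question does not confirm either is the trigger: `sbtPlugin := true` marks the project as an sbt plugin (sbt 0.13 plugins are built against Scala 2.10, which may be how 2.10.5 ended up in the build), and `spark-hive` is pinned to `1.5.0` while the other Spark modules use `2.1.0`. A minimal corrected fragment might look like this (only the changed settings shown; the rest of the file stays as posted):

```scala
// build.sbt -- version alignment only, other settings unchanged
name := "newtest"
version := "0.0.2"
scalaVersion := "2.11.11"  // match the Scala line Spark 2.1.0 is built against
// sbtPlugin := true removed: this is an application, not an sbt plugin

val sparkVersion = "2.1.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided",
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"  // was "1.5.0"
)
```

Because the `%%` operator appends the Scala binary version to each artifact name, aligning `scalaVersion` with Spark's build is what makes sbt resolve the `_2.11` artifacts instead of `_2.10`.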
https://stackoverflow.com/questions/46605016