I am learning Scala programming on Windows 7 with the latest Spark release, 2.2.0, and I am writing a word count driver program with Apache Spark. When I execute the program I get the error shown below.
How can I fix it and get the result?
SBT
name := "..."
version := "0.1"
scalaVersion := "2.12.3"

val sparkVersion = "2.2.0"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % sparkVersion,
  "org.apache.spark" % "spark-sql_2.11" % sparkVersion,
  "org.apache.spark" % "spark-streaming_2.11" % sparkVersion
)
Driver program
package com.demo.file
import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.spark.sql.SparkSession
object Reader {
def main(args: Array[String]): Unit = {
println("Welcome to Reader.")
val filePath = "C:\\notes.txt"
val spark = SparkSession.builder.appName("Simple app").config("spark.master", "local").getOrCreate();
val fileData = spark.read.textFile(filePath).cache()
val count_a = fileData.filter(line => line.contains("a")).count()
val count_b = fileData.filter(line => line.contains("b")).count()
println(s" count of A $count_a and count of B $count_b")
spark.stop()
}
}

Error

Welcome to Reader.
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
    at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:723)
    at org.apache.spark.SparkConf$.<init>(SparkConf.scala:571)
    at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
    at org.apache.spark.SparkConf.set(SparkConf.scala:92)
    at org.apache.spark.SparkConf.set(SparkConf.scala:81)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6$$anonfun$apply$1.apply(SparkSession.scala:905)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6$$anonfun$apply$1.apply(SparkSession.scala:905)
    at scala.collection.mutable.HashMap.$anonfun$foreach$1(HashMap.scala:138)
    at scala.collection.mutable.HashTable.foreachEntry(HashTable.scala:236)
    at scala.collection.mutable.HashTable.foreachEntry$(HashTable.scala:229)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:138)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:905)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
    at com.demo.file.Reader$.main(Reader.scala:11)
    at com.demo.file.Reader.main(Reader.scala)
Caused by: java.lang.ClassNotFoundException: scala.Product$class
    at java.net.URLClassLoader.findClass(URLClassLoader.java)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 18 more
Posted on 2017-09-24 05:23:08
By default, Spark 2.2.0 is built and distributed to work with Scala 2.11. To write applications in Scala you need to use a compatible Scala version (e.g. 2.11.x). Your Scala version is 2.12.x, which is why it throws the exception.
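As a minimal sketch, assuming you keep Spark 2.2.0 and simply move the project to Scala 2.11, the build.sbt could look like this (using %% so sbt appends the matching Scala binary suffix to the artifact names automatically):

// build.sbt: Scala version compatible with the prebuilt Spark 2.2.0 artifacts
scalaVersion := "2.11.11"

val sparkVersion = "2.2.0"

libraryDependencies ++= Seq(
  // %% resolves to spark-core_2.11, spark-sql_2.11, spark-streaming_2.11
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion
)

After changing the Scala version, run sbt clean before recompiling so no classes built against 2.12 remain on the classpath.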
https://stackoverflow.com/questions/46386812