Hi all,
I am trying to combine Cassandra and Spark Streaming. Here is my sbt file:
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
"org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided",
"org.apache.spark" %% "spark-sql" % "1.6.1",
"com.datastax.spark" %% "spark-cassandra-connector" % "1.6.2",
"com.datastax.cassandra" % "cassandra-driver-core" % "3.0.0",
("org.apache.spark" %% "spark-streaming-kafka" % "1.6.0").
exclude("org.spark-project.spark", "unused")
)
I added the following lines for the Cassandra integration (the line that causes the error is noted below):
val lines = KafkaUtils.createDirectStream[
String, String, StringDecoder, StringDecoder](
ssc, kafkaParams, topics)
// Errors appear once I add the line below
lines.saveToCassandra("test", "test", SomeColumns("key", "value"))
lines.print()
After adding the lines above, I see the following error in the IDE:

If I try to package the project from the command prompt, I see a similar error:

FYR, I am using the following versions:
scala - 2.11
kafka - kafka_2.11-0.8.2.1
java - 8
cassandra - datastax-community-64bit_2.2.8
Please help me resolve this issue.
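(Side note for anyone seeing a "value saveToCassandra is not a member of ... DStream" style error: `saveToCassandra` is attached to a `DStream` through the connector's implicit conversions, so the streaming import must be in scope in addition to the dependency versions matching. A minimal sketch of the imports the code above relies on:)

```
// SomeColumns and RDD save support come from the base connector package
import com.datastax.spark.connector._
// The implicit that adds saveToCassandra to DStreams lives in the
// streaming subpackage; without this import the method does not resolve
import com.datastax.spark.connector.streaming._
```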
Posted on 2017-02-24 22:25:53
As expected, the dependency issue was resolved by updating the sbt file as follows:
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
"org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided",
"org.apache.spark" %% "spark-sql" % "2.0.0",
"com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0-RC1",
"com.datastax.cassandra" % "cassandra-driver-core" % "3.0.0",
("org.apache.spark" %% "spark-streaming-kafka" % "1.6.0").
exclude("org.spark-project.spark", "unused")
)
https://stackoverflow.com/questions/42446513
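With the aligned dependencies, the streaming job from the question can be sketched end to end as follows. This is a minimal sketch, not the asker's full program: the Cassandra host, Kafka broker address, and topic name are assumed placeholders, and it presumes a `test.test` table with `key` and `value` columns as in the question.

```
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils
import kafka.serializer.StringDecoder

import com.datastax.spark.connector.SomeColumns
// Implicit conversion that adds saveToCassandra to DStreams
import com.datastax.spark.connector.streaming._

object KafkaToCassandra {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("KafkaToCassandra")
      .set("spark.cassandra.connection.host", "127.0.0.1") // assumed local Cassandra
    val ssc = new StreamingContext(conf, Seconds(5))

    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092") // assumed broker
    val topics = Set("test") // assumed topic name

    // DStream of (key, value) pairs, as in the question (Kafka 0.8 direct API)
    val lines = KafkaUtils.createDirectStream[
      String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topics)

    // Each (key, value) tuple maps onto the key and value columns of test.test
    lines.saveToCassandra("test", "test", SomeColumns("key", "value"))
    lines.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Keeping every Spark artifact (core, streaming, sql, connector) on binary-compatible versions is the essential point of the fix; mixing 2.0.0 jars with the 1.6.x spark-sql and connector is what produced the original errors.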