The following error occurs when running the Kafka consumer:
ERROR receiver.BlockGenerator: Error in block pushing thread
java.io.NotSerializableException: org.jnetpcap.packet.PcapPacket
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1183)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1547)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1508)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:42)
at org.apache.spark.serializer.SerializationStream$class.writeAll(Serializer.scala:102)
at org.apache.spark.serializer.JavaSerializationStream.writeAll(JavaSerializer.scala:30)
at org.apache.spark.storage.BlockManager.dataSerializeStream(BlockManager.scala:996)
at org.apache.spark.storage.BlockManager.dataSerialize(BlockManager.scala:1005)
at org.apache.spark.storage.MemoryStore.putValues(MemoryStore.scala:79)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:663)
at org.apache.spark.storage.BlockManager.put(BlockManager.scala:574)

build.sbt file:
name := "testpacket"
version := "1.0"
scalaVersion := "2.10.3"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.0.2"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.0.2"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.0.2"
libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

What could be the cause of this error?
Posted on 2014-12-28 23:58:30
I have run into this problem in two situations before, so without seeing your code I can't say for certain what the problem is.

1. Including a PcapPacket as part of an RDD

My guess is that you are hitting #1 and are including PcapPacket as part of an RDD. If that is the case, you will need to create a serializable version of PcapPacket, which should not be very difficult since a PcapPacket is backed by an underlying byte array.
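A minimal sketch of that idea: wrap only the packet's raw bytes in a class that implements `java.io.Serializable`, so Spark's Java serializer can ship it between threads and executors. The `SerializablePacket` name is hypothetical, and the jNetPcap calls shown in the comments (`getTotalSize()`, `transferStateAndDataTo(byte[])`) are an assumption to verify against your jnetpcap version:

```java
import java.io.*;

// A serializable stand-in for PcapPacket that stores only the raw bytes.
// (Hypothetical helper; not part of jNetPcap or Spark.)
public class SerializablePacket implements Serializable {
    private static final long serialVersionUID = 1L;
    private final byte[] data;

    public SerializablePacket(byte[] data) {
        this.data = data;
    }

    // In the real receiver you would build this from a PcapPacket, e.g.
    // (assumed jNetPcap API, check your version's javadoc):
    //   byte[] buf = new byte[packet.getTotalSize()];
    //   packet.transferStateAndDataTo(buf);
    //   SerializablePacket sp = new SerializablePacket(buf);

    public byte[] getData() {
        return data;
    }

    public static void main(String[] args) throws Exception {
        // Demonstrate that the wrapper survives a Java serialization
        // round trip -- the step that PcapPacket itself fails.
        SerializablePacket original = new SerializablePacket(new byte[]{1, 2, 3});
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new ObjectOutputStream(bos).writeObject(original);
        ObjectInputStream in =
            new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
        SerializablePacket restored = (SerializablePacket) in.readObject();
        System.out.println(restored.getData().length); // prints 3
    }
}
```

On the executor side you can rebuild a full packet from the stored bytes when you actually need to decode it, keeping only the cheap byte-array form inside the RDD.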
https://stackoverflow.com/questions/27670609