I'm programming with Spark Streaming but am having some trouble with Scala. I'm trying to use the function StreamingContext.fileStream.

The definition of this function is:

def fileStream[K, V, F <: InputFormat[K, V]](directory: String)(implicit arg0: ClassManifest[K], arg1: ClassManifest[V], arg2: ClassManifest[F]): DStream[(K, V)]

Create an input stream that monitors a Hadoop-compatible filesystem for new files and reads them using the given key-value types and input format. File names starting with . are ignored.
K — key type for reading HDFS files
V — value type for reading HDFS files
F — input format for reading HDFS files
directory — HDFS directory to monitor for new files

I don't know how to pass the types for Key and Value. My Spark Streaming code:
val ssc = new StreamingContext(args(0), "StreamingReceiver", Seconds(1),
System.getenv("SPARK_HOME"), Seq("/home/mesos/StreamingReceiver.jar"))
// Create a NetworkInputDStream on target ip:port and count the
val lines = ssc.fileStream("/home/sequenceFile")

The Java code that writes the Hadoop file:
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class MyDriver {

    private static final String[] DATA = { "One, two, buckle my shoe",
            "Three, four, shut the door", "Five, six, pick up sticks",
            "Seven, eight, lay them straight", "Nine, ten, a big fat hen" };

    public static void main(String[] args) throws IOException {
        String uri = args[0];
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        Path path = new Path(uri);
        IntWritable key = new IntWritable();
        Text value = new Text();
        SequenceFile.Writer writer = null;
        try {
            writer = SequenceFile.createWriter(fs, conf, path, key.getClass(),
                    value.getClass());
            for (int i = 0; i < 100; i++) {
                key.set(100 - i);
                value.set(DATA[i % DATA.length]);
                System.out.printf("[%s]\t%s\t%s\n", writer.getLength(), key,
                        value);
                writer.append(key, value);
            }
        } finally {
            IOUtils.closeStream(writer);
        }
    }
}
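Since the Java driver above writes a SequenceFile with IntWritable keys and Text values, reading it back with fileStream would need those same types as type parameters. A minimal sketch, assuming the same /home/sequenceFile directory and an already-constructed StreamingContext named ssc:

```scala
import org.apache.hadoop.io.{IntWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat

// The three type parameters must match what the Java writer produced:
// IntWritable keys, Text values, read through SequenceFileInputFormat.
val pairs = ssc.fileStream[IntWritable, Text,
  SequenceFileInputFormat[IntWritable, Text]]("/home/sequenceFile")

// Unwrap the Writables into plain Scala types before further processing.
val lines = pairs.map { case (k, v) => (k.get, v.toString) }
```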
Posted on 2013-05-15 20:23:23
If you want to use fileStream, you have to supply all 3 type parameters when calling it, so you need to know your Key, Value, and InputFormat types before making the call. If your types were LongWritable, Text, and TextInputFormat, you would call fileStream like this:

val lines = ssc.fileStream[LongWritable, Text, TextInputFormat]("/home/sequenceFile")

If those 3 types do happen to be your types, then you might want to use textFileStream instead, since it takes no type parameters and delegates to fileStream using the 3 types I mentioned. Using it would look like this:

val lines = ssc.textFileStream("/home/sequenceFile")

Posted on 2016-11-01 03:00:45
val filterF = (path: Path) => {
  // Keep only files whose trailing "_<epochMillis>" name suffix is in the past
  path.toString.split("/").last.split("_").last.toLong < System.currentTimeMillis
}

val streamed_rdd = ssc
  .fileStream[LongWritable, Text, TextInputFormat]("/user/hdpprod/temp/spark_streaming_input", filterF, false)
  .map(_._2.toString)
  .map(u => u.split('\t'))

https://stackoverflow.com/questions/16560833
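The timestamp check inside that filter can be pulled out as a plain function on the path string, which makes it testable without Hadoop on the classpath. A small sketch; the helper name and the `_<epochMillis>` file-name suffix convention are assumptions carried over from the answer above:

```scala
// Hypothetical helper: keep a file only if the numeric suffix after the
// last '_' in its name (read as epoch milliseconds) is before the cutoff.
// This mirrors the filter passed to fileStream above.
def isBeforeCutoff(pathStr: String, cutoffMillis: Long): Boolean = {
  val fileName = pathStr.split("/").last       // e.g. "batch_1478000000000"
  fileName.split("_").last.toLong < cutoffMillis
}
```

With Hadoop available, the actual filter is then just `(p: Path) => isBeforeCutoff(p.toString, System.currentTimeMillis)`.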