I've started looking into Apache Flink's CEP library using Scala. When I try to create a PatternStream by calling CEP.pattern(input, pattern) as shown in the tutorial at https://ci.apache.org/projects/flink/flink-docs-stable/dev/libs/cep.html, the IDE flags the pattern method with "Cannot resolve overloaded method". Given how I create the input and the pattern — readTextFile and Pattern[String].begin('line').where(_.length == 10) respectively — there shouldn't be any problem with the method's arguments or generic types.
Here is the code I wrote. I know it's incomplete, but ever since this problem appeared I haven't been able to finish it anyway.
package FlinkCEPClasses
import java.util.Properties
import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.cep.CEP
import org.apache.flink.cep.scala.pattern.Pattern
import org.apache.flink.core.fs.FileSystem
import org.apache.flink.streaming.api.TimeCharacteristic
import org.apache.flink.streaming.api.datastream.DataStream
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
class FlinkCEPPipeline {
var props : Properties = new Properties()
var env : StreamExecutionEnvironment = StreamExecutionEnvironment.getExecutionEnvironment
env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)
env.setParallelism(1)
var input : DataStream[String] = env.readTextFile("/home/luca/Desktop/lines")
var patt : Pattern[String,String] = Pattern.begin[String]("igual").where(_.length == 10)
// Problem appears at the following line. A red subscript appears at the pattern method,
// saying the following: "Cannot resolve overloaded method"
var CEPstream = CEP.pattern(input,patt)
input.writeAsText("/home/luca/Desktop/flinkcepout",FileSystem.WriteMode.OVERWRITE)
env.execute()
}

Here are the contents of my build.sbt file:
name := "FlinkCEP"
version := "0.1"
scalaVersion := "2.12.10"
// https://mvnrepository.com/artifact/org.apache.flink/flink-cep-scala
libraryDependencies += "org.apache.flink" %% "flink-cep-scala" % "1.9.0"
// https://mvnrepository.com/artifact/org.apache.flink/flink-scala
libraryDependencies += "org.apache.flink" %% "flink-scala" % "1.9.0"
// https://mvnrepository.com/artifact/org.apache.flink/flink-streaming-scala
libraryDependencies += "org.apache.flink" %% "flink-streaming-scala" % "1.9.0"
// https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka
libraryDependencies += "org.apache.flink" %% "flink-connector-kafka" % "1.9.0"
libraryDependencies += "log4j" % "log4j" % "1.2.17"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.3" % Runtime
libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.6.2" % Test

My only goal with this code is to see it run a simple "where" condition; beyond that it isn't meant to do anything more. I'm using IntelliJ as my IDE. Also, I'm not sure whether the Scala CEP library is actually usable. If anyone can shed some light on this, I'd appreciate it.
Posted on 2020-01-23 03:50:41
After looking at @DavidAnderson's GitHub example, I finally solved the problem. Since I made many changes, I can't be sure my solution will work for you, but the key change was switching from import org.apache.flink.streaming.api.datastream.DataStream to import org.apache.flink.streaming.api.scala.{StreamExecutionEnvironment, DataStream, _}. Watch out for ambiguous imports, and make sure you import the classes you actually need.
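The underlying mechanism behind the "Cannot resolve overloaded method" error is that the Java-side CEP entry point only accepts Java-API argument types, while the original imports mixed a Java DataStream with a Scala Pattern, so no overload matched. A minimal pure-Scala sketch of that mechanism (the classes and the pattern method here are hypothetical stand-ins, not Flink's real signatures):

```scala
// Stand-in sketch (hypothetical types, not Flink's) of why mixing the Java and
// Scala APIs breaks overload resolution.
object OverloadSketch {
  class JavaDataStream[T]   // stands in for ...streaming.api.datastream.DataStream
  class JavaPattern[T]      // stands in for org.apache.flink.cep.pattern.Pattern
  class ScalaPattern[T]     // stands in for org.apache.flink.cep.scala.pattern.Pattern

  // Mimics a Java-API entry point: both arguments must be Java-API types.
  def pattern[T](input: JavaDataStream[T], patt: JavaPattern[T]): String = "resolved"

  def main(args: Array[String]): Unit = {
    val input = new JavaDataStream[String]
    // Compiles: both argument types match the (only) overload.
    println(pattern(input, new JavaPattern[String]))
    // pattern(input, new ScalaPattern[String])  // would NOT compile: no overload
    // accepts a ScalaPattern -- this is the "cannot resolve" the question hit.
  }
}
```

Importing everything consistently from the `scala` packages (as above) keeps all arguments on the Scala side, so the Scala `CEP.pattern` overload matches.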
I'll list all my imports and my build.sbt file, so you have full access to my configuration.
Imports
import java.util.Properties
import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.cep.scala.pattern.Pattern
import org.apache.flink.core.fs.FileSystem
import org.apache.flink.streaming.api.TimeCharacteristic
import org.apache.flink.cep.scala.PatternStream
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
import org.apache.flink.cep.scala.{CEP, PatternStream}
import org.apache.flink.streaming.api.scala.{StreamExecutionEnvironment, DataStream, _}

Build.sbt
name := "FlinkCEP"
version := "0.1"
scalaVersion := "2.12.10"
// https://mvnrepository.com/artifact/org.apache.flink/flink-cep-scala
//libraryDependencies += "org.apache.flink" %% "flink-cep-scala" % "1.9.0"
// https://mvnrepository.com/artifact/org.apache.flink/flink-cep
libraryDependencies += "org.apache.flink" %% "flink-cep" % "1.9.0"
// https://mvnrepository.com/artifact/org.apache.flink/flink-cep-scala
libraryDependencies += "org.apache.flink" %% "flink-cep-scala" % "1.9.0"
// https://mvnrepository.com/artifact/org.apache.flink/flink-runtime
libraryDependencies += "org.apache.flink" %% "flink-runtime" % "1.9.0" % Test
// https://mvnrepository.com/artifact/org.apache.flink/flink-scala
libraryDependencies += "org.apache.flink" %% "flink-scala" % "1.9.0"
// https://mvnrepository.com/artifact/org.apache.flink/flink-streaming-scala
libraryDependencies += "org.apache.flink" %% "flink-streaming-scala" % "1.9.0"
// https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka
libraryDependencies += "org.apache.flink" %% "flink-connector-kafka" % "1.9.0"
libraryDependencies += "log4j" % "log4j" % "1.2.17"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.3" % Runtime
libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.6.2" % Test

Posted on 2020-01-14 16:41:29
Try this:
import org.apache.flink.cep.scala.PatternStream
...
val CEPstream: PatternStream[String] = CEP.pattern[String](input, patt)

For a simple example of using CEP with Scala, see the example on GitHub.
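Once CEP.pattern resolves, matches are typically consumed through PatternStream's select, whose Scala callback is an ordinary function from Map[String, Iterable[T]] to a result. The callback itself is plain Scala and can be sketched standalone (the pattern name "igual" comes from the question; the sample line and the `;` separator are illustrative assumptions):

```scala
object SelectCallbackSketch {
  // Shape of the Scala CEP select callback: the map's keys are the pattern's
  // stage names (here only "igual"); the values are the events matched there.
  val selectFn: Map[String, Iterable[String]] => String =
    matched => matched("igual").mkString(";")

  def main(args: Array[String]): Unit = {
    // A hand-built match, shaped like what CEP would deliver for a 10-char line:
    val fakeMatch = Map("igual" -> Iterable("abcdefghij"))
    println(selectFn(fakeMatch))  // prints "abcdefghij"
  }
}
```

In the real pipeline this function would be passed as `CEPstream.select(selectFn)`, yielding a DataStream of the callback's results.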
https://stackoverflow.com/questions/59720416