In my Spark application I am trying to use fluentd-scala-logger, which requires an extra dependency in build.sbt. These are the two lines I added to build.sbt:
resolvers += "Apache Maven Central Repository" at "https://repo.maven.apache.org/maven2/"
"org.fluentd" %% "fluent-logger-scala" % "0.7.0"

My final build.sbt looks like this:
name := "sample"
version := "1.4"
scalaVersion := "2.11.8"
resolvers += "Apache Maven Central Repository" at "https://repo.maven.apache.org/maven2/"
libraryDependencies ++= Seq(
  "org.elasticsearch" %% "elasticsearch-spark" % "2.1.2",
  "org.apache.spark" %% "spark-sql" % "2.1.2",
  "org.apache.kafka" % "kafka-clients" % "2.4.1",
  "org.fluentd" %% "fluent-logger-scala" % "0.7.0"
)

When I bundle my Spark application into a jar with sbt package, I get the following error:
object tools is not a member of package scala
[error] import scala.tools.nsc.io.File

I did not have this problem when my previous build.sbt looked like this (without the fluentd dependency):
name := "sample"
version := "1.4"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.elasticsearch" %% "elasticsearch-spark" % "2.1.2",
  "org.apache.spark" %% "spark-sql" % "2.1.2",
  "org.apache.kafka" % "kafka-clients" % "2.4.1"
)

Is the resolver line causing this problem, or am I missing something else entirely? I am using sbt version 1.4.5 and Scala version 2.11.8.
Posted on 2020-12-22 01:53:26
I am not sure what causes this, but I found a workaround. Try adding the dependency like this:
libraryDependencies += "org.fluentd" %% "fluent-logger-scala" % "0.7.0" intransitive()

This imports the dependency without pulling in its transitive dependencies.
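Since `intransitive()` helps, one of the transitive dependencies is likely disturbing the classpath. A related sketch, under the assumption that the missing `scala.tools.nsc` symbol comes from the scala-compiler artifact (which is where that package normally lives): declaring scala-compiler explicitly keeps it on the compile classpath regardless of what other dependencies bring in. This is an assumption, not part of the original answer:

```scala
// build.sbt — a sketch, assuming scala.tools.nsc is needed by your own code.
// scala.tools.nsc ships in the scala-compiler artifact, not scala-library,
// so code importing it needs the compiler jar as an explicit dependency.
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value
```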
That said, I looked at this library's dependencies and tried excluding all of them one by one:
libraryDependencies += "org.fluentd" %% "fluent-logger-scala" % "0.7.0" excludeAll(
ExclusionRule("org.msgpack", "msgpack"),
ExclusionRule("org.slf4j", "slf4j-api"),
ExclusionRule("ch.qos.logback", "logback-classic"),
ExclusionRule("junit", "junit"),
)

But that did not work, so I really cannot explain the behavior.
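To investigate further, sbt's built-in dependency reports can show exactly which transitive artifacts fluent-logger-scala adds and which versions get evicted. A minimal sketch, assuming sbt 1.4+ (the asker is on 1.4.5), which bundles a dependency-tree plugin that just needs enabling:

```scala
// project/plugins.sbt — enables the dependency-tree plugin bundled with sbt 1.4+
addDependencyTreePlugin
```

Then `sbt "Compile/dependencyTree"` prints the resolved dependency graph, and the built-in `sbt evicted` task lists version conflicts that Ivy resolved by eviction, which is a common cause of symbols disappearing after adding a dependency.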
https://stackoverflow.com/questions/65390463