
How to resolve the exceptions in Scala using IntelliJ?

Asked by a Stack Overflow user on 2018-09-10 15:02:36 · 1 answer · 1.5K views · 0 followers · score 1

I am trying to run this project. I have added the dependencies to my sbt file, which looks like this:

name := "HelloScala"
version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.1"
resolvers += Resolver.bintrayRepo("salesforce", "maven")
libraryDependencies += "com.salesforce.transmogrifai" %% "transmogrifai-core" % "0.3.4"

Then I copied the Helloworld folder from their repository, but there are a lot of problems:

Information:10/09/18, 12:01 PM - Compilation completed with 88 errors and 0 warnings in 15 s 624 ms
Error:scalac: missing or invalid dependency detected while loading class file 'package.class'.
Could not access type Vector in value org.apache.spark.ml.linalg,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'package.class' was compiled against an incompatible version of org.apache.spark.ml.linalg.
Error:scalac: missing or invalid dependency detected while loading class file 'OPVector.class'.
Could not access type Vector in value org.apache.spark.ml.linalg,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OPVector.class' was compiled against an incompatible version of org.apache.spark.ml.linalg.
Error:scalac: missing or invalid dependency detected while loading class file 'OpEvaluatorBase.class'.
Could not access type Evaluator in value org.apache.spark.ml.evaluation,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpEvaluatorBase.class' was compiled against an incompatible version of org.apache.spark.ml.evaluation.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasLabelCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasLabelCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasPredictionCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasPredictionCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasFullPredictionCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasFullPredictionCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasRawPredictionCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasRawPredictionCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasProbabilityCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasProbabilityCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'ClassificationModelSelector.class'.
Could not access type Estimator in package org.apache.spark.ml,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'ClassificationModelSelector.class' was compiled against an incompatible version of org.apache.spark.ml.
Error:scalac: missing or invalid dependency detected while loading class file 'InputParams.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'InputParams.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpPipelineStageBase.class'.
Could not access type MLWritable in value org.apache.spark.ml.util,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpPipelineStageBase.class' was compiled against an incompatible version of org.apache.spark.ml.util.
Error:scalac: missing or invalid dependency detected while loading class file 'HasLogisticRegression.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasLogisticRegression.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasRandomForestBase.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasRandomForestBase.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasDecisionTreeBase.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasDecisionTreeBase.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasNaiveBayes.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasNaiveBayes.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'DataReaders.class'.
Could not access type Encoder in package org.apache.spark.sql,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'DataReaders.class' was compiled against an incompatible version of org.apache.spark.sql.
Error:scalac: missing or invalid dependency detected while loading class file 'OpWorkflow.class'.
Could not access type SparkSession in package org.apache.spark.sql,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpWorkflow.class' was compiled against an incompatible version of org.apache.spark.sql.
Error:scalac: missing or invalid dependency detected while loading class file 'SplitterParams.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'SplitterParams.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'ModelSelectorBase.class'.
Could not access type Estimator in package org.apache.spark.ml,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'ModelSelectorBase.class' was compiled against an incompatible version of org.apache.spark.ml.
Error:scalac: missing or invalid dependency detected while loading class file 'HasLinearRegression.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasLinearRegression.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasGradientBoostedTreeBase.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasGradientBoostedTreeBase.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasRandomForestBase.class'.
Could not access type Estimator in package org.apache.spark.ml,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasRandomForestBase.class' was compiled against an incompatible version of org.apache.spark.ml.
Error:scalac: missing or invalid dependency detected while loading class file 'DataCutterParams.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'DataCutterParams.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasDecisionTreeBase.class'.
Could not access type Estimator in package org.apache.spark.ml,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasDecisionTreeBase.class' was compiled against an incompatible version of org.apache.spark.ml.
Error:scalac: missing or invalid dependency detected while loading class file 'FeatureBuilder.class'.
Could not access term package in package org.apache.spark.sql,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'FeatureBuilder.class' was compiled against an incompatible version of org.apache.spark.sql.
Error:scalac: missing or invalid dependency detected while loading class file 'FeatureBuilder.class'.
Could not access type DataFrame in value org.apache.spark.sql.package,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'FeatureBuilder.class' was compiled against an incompatible version of org.apache.spark.sql.package.
Error:scalac: missing or invalid dependency detected while loading class file 'OpWorkflowCore.class'.
Could not access type Dataset in package org.apache.spark.sql,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpWorkflowCore.class' was compiled against an incompatible version of org.apache.spark.sql.
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/OpTitanicSimple.scala
Error:(42, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Error:(95, 26) not found: value SparkSession
    implicit val spark = SparkSession.builder.config(conf).getOrCreate()
Error:(143, 8) overloaded method value setLabelCol with alternatives:
  (value: com.salesforce.op.features.FeatureLike[T])OpHasLabelCol.this.type <and>
  (value: String)OpHasLabelCol.this.type
 cannot be applied to (com.salesforce.op.features.Feature[com.salesforce.op.features.types.RealNN])
      .setLabelCol(survived)
Error:(154, 64) could not find implicit value for evidence parameter of type org.apache.spark.sql.Encoder[com.salesforce.hw.Passenger]
    val trainDataReader = DataReaders.Simple.csvCase[Passenger](
Error:(166, 40) could not find implicit value for parameter spark: org.apache.spark.sql.SparkSession
    val fittedWorkflow = workflow.train()
Error:(174, 15) value columns is not a member of Any
    dataframe.columns.foreach(println)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/boston/OpBoston.scala
Error:(41, 8) object Dataset is not a member of package org.apache.spark.sql
import org.apache.spark.sql.{Dataset, SparkSession}
Error:(41, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.{Dataset, SparkSession}
Error:(56, 47) not found: type SparkSession
  def customRead(path: Option[String], spark: SparkSession): RDD[BostonHouse] = {
Error:(69, 90) not found: type Dataset
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[BostonHouse], Dataset[BostonHouse]] = {
Error:(69, 50) not found: type SparkSession
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[BostonHouse], Dataset[BostonHouse]] = {
Error:(77, 90) not found: type Dataset
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[BostonHouse], Dataset[BostonHouse]] = {
Error:(77, 50) not found: type SparkSession
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[BostonHouse], Dataset[BostonHouse]] = {
Error:(94, 6) value setGradientBoostedTreeSeed is not a member of com.salesforce.op.stages.impl.selector.HasRandomForestBase[E,MS]
possible cause: maybe a semicolon is missing before `value setGradientBoostedTreeSeed'?
    .setGradientBoostedTreeSeed(randomSeed)
Error:(100, 43) overloaded method value setLabelCol with alternatives:
  (value: com.salesforce.op.features.FeatureLike[T])OpHasLabelCol.this.type <and>
  (value: String)OpHasLabelCol.this.type
 cannot be applied to (com.salesforce.op.features.Feature[com.salesforce.op.features.types.RealNN])
  val evaluator = Evaluators.Regression().setLabelCol(medv).setPredictionCol(prediction)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/dataprep/ConditionalAggregation.scala
Error:(40, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Error:(69, 26) not found: value SparkSession
    implicit val spark = SparkSession.builder.config(conf).getOrCreate()
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/dataprep/JoinsAndAggregates.scala
Error:(40, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Error:(74, 26) not found: value SparkSession
    implicit val spark = SparkSession.builder.config(conf).getOrCreate()
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/iris/IrisFeatures.scala
Error:(38, 36) not found: type Iris
  val id = FeatureBuilder.Integral[Iris].extract(_.getID.toIntegral).asPredictor
Error:(39, 41) not found: type Iris
  val sepalLength = FeatureBuilder.Real[Iris].extract(_.getSepalLength.toReal).asPredictor
Error:(40, 40) not found: type Iris
  val sepalWidth = FeatureBuilder.Real[Iris].extract(_.getSepalWidth.toReal).asPredictor
Error:(41, 41) not found: type Iris
  val petalLength = FeatureBuilder.Real[Iris].extract(_.getPetalLength.toReal).asPredictor
Error:(42, 40) not found: type Iris
  val petalWidth = FeatureBuilder.Real[Iris].extract(_.getPetalWidth.toReal).asPredictor
Error:(43, 39) not found: type Iris
  val irisClass = FeatureBuilder.Text[Iris].extract(_.getClass$.toText).asResponse
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/iris/IrisKryoRegistrator.scala
Error:(40, 47) type Iris is not a member of package com.salesforce.hw.iris
    doAvroRegistration[com.salesforce.hw.iris.Iris](kryo)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/iris/OpIris.scala
Error:(41, 8) object Dataset is not a member of package org.apache.spark.sql
import org.apache.spark.sql.{Dataset, SparkSession}
Error:(41, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.{Dataset, SparkSession}
Error:(56, 37) not found: type Iris
  val irisReader = new CustomReader[Iris](key = _.getID.toString){
Error:(57, 76) not found: type Iris
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[Iris], Dataset[Iris]] = {
Error:(57, 83) not found: type Dataset
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[Iris], Dataset[Iris]] = {
Error:(57, 50) not found: type SparkSession
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[Iris], Dataset[Iris]] = {
Error:(79, 6) value setInput is not a member of com.salesforce.op.stages.impl.selector.HasDecisionTreeBase[E,MS]
possible cause: maybe a semicolon is missing before `value setInput'?
    .setInput(labels, features).getOutput()
Error:(87, 53) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw, prob, labels)
Error:(87, 59) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw, prob, labels)
Error:(87, 64) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw, prob, labels)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/titanic/OpTitanic.scala
Error:(54, 45) not found: type Passenger
  val simpleReader = DataReaders.Simple.csv[Passenger](
Error:(55, 5) not found: value schema
    schema = Passenger.getClassSchema.toString, key = _.getPassengerId.toString
Error:(55, 49) not found: value key
    schema = Passenger.getClassSchema.toString, key = _.getPassengerId.toString
Error:(79, 6) value setModelsToTry is not a member of com.salesforce.op.stages.impl.selector.HasRandomForestBase[E,MS]
possible cause: maybe a semicolon is missing before `value setModelsToTry'?
    .setModelsToTry(LogisticRegression, RandomForest)
Error:(83, 53) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw)
Error:(83, 59) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/titanic/TitanicFeatures.scala
Error:(41, 40) not found: type Passenger
  val pClass = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getPclass).map(_.toString).toPickList).asPredictor // scalastyle:off
Error:(43, 34) not found: type Passenger
  val name = FeatureBuilder.Text[Passenger].extract(d => Option(d.getName).toText).asPredictor
Error:(45, 37) not found: type Passenger
  val sex = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getSex).toPickList).asPredictor
Error:(47, 33) not found: type Passenger
  val age = FeatureBuilder.Real[Passenger].extract(d => Option(Double.unbox(d.getAge)).toReal).asPredictor
Error:(49, 39) not found: type Passenger
  val sibSp = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getSibSp).map(_.toString).toPickList).asPredictor
Error:(51, 39) not found: type Passenger
  val parch = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getParch).map(_.toString).toPickList).asPredictor
Error:(53, 40) not found: type Passenger
  val ticket = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getTicket).toPickList).asPredictor
Error:(57, 39) not found: type Passenger
  val cabin = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getCabin).toPickList).asPredictor
Error:(59, 42) not found: type Passenger
  val embarked = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getEmbarked).toPickList).asPredictor
Error:(39, 40) not found: type Passenger
  val survived = FeatureBuilder.RealNN[Passenger].extract(_.getSurvived.toDouble.toRealNN).asResponse
Error:(55, 34) not found: type Passenger
  val fare = FeatureBuilder.Real[Passenger].extract(d => Option(Double.unbox(d.getFare)).toReal).asPredictor
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/titanic/OpTitanicMini.scala
Error:(40, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Error:(66, 26) not found: value SparkSession
    implicit val spark = SparkSession.builder.config(new SparkConf()).getOrCreate()
Error:(75, 34) value transmogrify is not a member of Any
    val featureVector = features.transmogrify()
Error:(78, 36) value sanityCheck is not a member of Any
    val checkedFeatures = survived.sanityCheck(featureVector, checkSample = 1.0, removeBadFeatures = true)
Error:(78, 63) not found: value checkSample
    val checkedFeatures = survived.sanityCheck(featureVector, checkSample = 1.0, removeBadFeatures = true)
Error:(78, 82) not found: value removeBadFeatures
    val checkedFeatures = survived.sanityCheck(featureVector, checkSample = 1.0, removeBadFeatures = true)
Error:(81, 73) too many arguments for method setInput: (features: (com.salesforce.op.features.FeatureLike[com.salesforce.op.features.types.RealNN], com.salesforce.op.features.FeatureLike[com.salesforce.op.features.types.OPVector]))com.salesforce.op.stages.impl.classification.BinaryClassificationModelSelector
    val (pred, raw, prob) = BinaryClassificationModelSelector().setInput(survived, checkedFeatures).getOutput()
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/titanic/TitanicKryoRegistrator.scala
Error:(41, 50) type Passenger is not a member of package com.salesforce.hw.titanic
    doAvroRegistration[com.salesforce.hw.titanic.Passenger](kryo)

I searched for these problems and found that it might be a version issue, but if it is a version issue, I don't know which versions I should use. However, if I try to run it from the command line, it works:

cd helloworld
./gradlew compileTestScala installDist
./gradlew -q sparkSubmit -Dmain=com.salesforce.hw.OpTitanicSimple -Dargs="\
`pwd`/src/main/resources/TitanicDataset/TitanicPassengersTrainData.csv"
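(For comparison, a roughly equivalent run from the sbt project could be sketched like this; the main class and CSV path are taken from the Gradle command above, and `runMain` is a standard sbt task, but whether this works depends on the sbt build compiling in the first place:)

```shell
# Hypothetical sbt counterpart of the Gradle sparkSubmit invocation above;
# assumes the sbt build compiles and the dataset sits at the same relative path.
sbt "runMain com.salesforce.hw.OpTitanicSimple $(pwd)/src/main/resources/TitanicDataset/TitanicPassengersTrainData.csv"
```

Unlike the `sparkSubmit` Gradle task, `runMain` launches the class in a plain local JVM, which is only suitable here if the example configures a local Spark master itself.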

It does not work in IntelliJ. How can I solve this problem?


1 Answer

Answered by a Stack Overflow user (accepted answer), posted 2018-09-10 15:17:58

Two dependencies are missing from build.sbt: spark-mllib and spark-sql:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.3.1",
  "org.apache.spark" %% "spark-sql" % "2.3.1",
  "com.salesforce.transmogrifai" %% "transmogrifai-core" % "0.3.4"
)

This will get rid of the first block of errors.
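As a small refinement (my own suggestion, not part of the original answer): factoring the Spark version into a shared `val` keeps the three Spark modules from drifting apart across upgrades, which is exactly the kind of binary mismatch the scalac "missing or invalid dependency" errors point at:

```scala
// build.sbt sketch (hypothetical refactor): one shared version for all Spark modules
val sparkVersion = "2.3.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % sparkVersion,
  "org.apache.spark" %% "spark-mllib" % sparkVersion,
  "org.apache.spark" %% "spark-sql"   % sparkVersion,
  "com.salesforce.transmogrifai" %% "transmogrifai-core" % "0.3.4"
)
```

After editing build.sbt, reload the project in IntelliJ (the sbt/SBT tool window's refresh action) so the IDE picks up the new classpath; otherwise the editor may keep showing the old errors even though the build is fixed.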

Score 3
Original content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/52252621