Class not found error - Scala

Stack Overflow user
Asked on 2016-03-04 11:46:16
2 answers · 3.3K views · 0 followers · score 1

I am running a Scala program, shown below. I use Maven for the build, the dependencies are set up correctly, and `mvn install` succeeds. But when I run the jar file, I get a java.lang.NoClassDefFoundError.

Program:

package RasterDataIngest.RasterDataIngestIntoHadoop

import geotrellis.spark._
import geotrellis.spark.ingest._
import geotrellis.spark.io.hadoop._
import geotrellis.spark.io.index._
import geotrellis.spark.tiling._
import geotrellis.spark.utils.SparkUtils
import geotrellis.vector._
import org.apache.hadoop.fs.Path
import org.apache.spark._
import com.quantifind.sumac.ArgMain
import com.quantifind.sumac.validation.Required

class HadoopIngestArgs extends IngestArgs {
  @Required var catalog: String = _
  def catalogPath = new Path(catalog)
}

object HadoopIngest extends ArgMain[HadoopIngestArgs] with Logging {
  def main(args: HadoopIngestArgs): Unit = {
    System.setProperty("com.sun.media.jai.disableMediaLib", "true")

    implicit val sparkContext = SparkUtils.createSparkContext("Ingest")
    val conf = sparkContext.hadoopConfiguration
    conf.set("io.map.index.interval", "1")

    val catalog = HadoopRasterCatalog(args.catalogPath)
    val source = sparkContext.hadoopGeoTiffRDD(args.inPath)
    val layoutScheme = ZoomedLayoutScheme()

    Ingest[ProjectedExtent, SpatialKey](source, args.destCrs, layoutScheme, args.pyramid){ (rdd, level) => 
      catalog
        .writer[SpatialKey](RowMajorKeyIndexMethod, args.clobber)
        .write(LayerId(args.layerName, level.zoom), rdd)
    }
  }
}

pom.xml:

<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>${scala.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.5.2</version>
</dependency>
<dependency>
  <groupId>com.azavea.geotrellis</groupId>
  <artifactId>geotrellis-spark_2.10</artifactId> <!-- this is the one -->
  <version>0.10.0-M1</version>
</dependency>
<dependency>
  <groupId>org.scalaz.stream</groupId>
  <artifactId>scalaz-stream_2.10</artifactId>
  <version>0.7.2a</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>0.20.2</version>
</dependency>
<dependency>
  <groupId>com.quantifind</groupId>
  <artifactId>sumac_2.10</artifactId>
  <version>0.3.0</version>
</dependency>
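
A side note, not part of the original question: when the assembled jar is launched with spark-submit, Spark's own classes are already on the driver and executor classpath, so spark-core is commonly given `provided` scope to keep it out of the fat jar and avoid version clashes. A minimal sketch of that change:

```xml
<!-- Hypothetical tweak: exclude Spark from the assembled jar,
     since spark-submit supplies these classes at runtime. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.5.2</version>
  <scope>provided</scope>
</dependency>
```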

Error:

Exception in thread "main" java.lang.NoClassDefFoundError: geotrellis/spark/ingest/IngestArgs
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2625)
        at java.lang.Class.getMethod0(Class.java:2866)
        at java.lang.Class.getMethod(Class.java:1676)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:670)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: geotrellis.spark.ingest.IngestArgs
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)

Please tell me where I went wrong. Thanks in advance!

2 Answers

Stack Overflow user

Accepted answer

Posted on 2016-03-04 12:31:07

It sounds like you are not building a jar that actually contains your dependencies. If that is your problem, perhaps this answer will help:

How can I create an executable JAR with dependencies using Maven?
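
As an illustration only (the answer above links out rather than including code), one common approach from that linked question is the maven-shade-plugin, which bundles all dependencies into a single "fat" jar. A minimal sketch, with the main class name taken from the question's source file:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Set Main-Class in the manifest so `java -jar` works;
               the class name comes from the question's code. -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>RasterDataIngest.RasterDataIngestIntoHadoop.HadoopIngest</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```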

Score 0

Stack Overflow user

Posted on 2018-04-01 12:11:40

I ran into a similar problem, and adding this to my pom.xml fixed it for me:

<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>

This plugin helps build a JAR that includes all of the dependencies. Source: building.html#building

Score 0
Original page content provided by Stack Overflow; translation supported by Tencent Cloud's engine.
Original link:

https://stackoverflow.com/questions/35795255
