
Maven local cache error

Stack Overflow user
Asked on 2015-08-07 05:20:15
1 answer · 1.4K views · 0 followers · 0 votes

I'm trying to run some Spark applications. The problem is that Maven seems to find some packages in its local cache repository, but when it tries to load them it can't. In fact, some of the packages appear to exist but are incomplete, in the sense that they don't contain any jar.

The log is as follows:

com.databricks#spark-avro_2.10 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
    confs: [default]
    found com.databricks#spark-avro_2.10;1.0.0 in local-m2-cache
    found org.apache.avro#avro;1.7.6 in local-m2-cache
    found org.codehaus.jackson#jackson-core-asl;1.9.13 in local-m2-cache
    found org.codehaus.jackson#jackson-mapper-asl;1.9.13 in local-m2-cache
    found com.thoughtworks.paranamer#paranamer;2.3 in local-m2-cache
    found org.xerial.snappy#snappy-java;1.0.5 in local-m2-cache
    found org.apache.commons#commons-compress;1.4.1 in local-m2-cache
    found org.tukaani#xz;1.0 in local-m2-cache
    found org.slf4j#slf4j-api;1.6.4 in local-m2-cache
:: resolution report :: resolve 484ms :: artifacts dl 22ms
    :: modules in use:
    com.databricks#spark-avro_2.10;1.0.0 from local-m2-cache in [default]
    com.thoughtworks.paranamer#paranamer;2.3 from local-m2-cache in [default]
    org.apache.avro#avro;1.7.6 from local-m2-cache in [default]
    org.apache.commons#commons-compress;1.4.1 from local-m2-cache in [default]
    org.codehaus.jackson#jackson-core-asl;1.9.13 from local-m2-cache in [default]
    org.codehaus.jackson#jackson-mapper-asl;1.9.13 from local-m2-cache in [default]
    org.slf4j#slf4j-api;1.6.4 from local-m2-cache in [default]
    org.tukaani#xz;1.0 from local-m2-cache in [default]
    org.xerial.snappy#snappy-java;1.0.5 from local-m2-cache in [default]
    ---------------------------------------------------------------------
    |                  |            modules            ||   artifacts   |
    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
    ---------------------------------------------------------------------
    |      default     |   9   |   0   |   0   |   0   ||   9   |   0   |
    ---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
        [NOT FOUND  ] org.xerial.snappy#snappy-java;1.0.5!snappy-java.jar(bundle) (1ms)

    ==== local-m2-cache: tried

      file:/Users/someuser/.m2/repository/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar

        [NOT FOUND  ] org.slf4j#slf4j-api;1.6.4!slf4j-api.jar (1ms)

    ==== local-m2-cache: tried

      file:/Users/someuser/.m2/repository/org/slf4j/slf4j-api/1.6.4/slf4j-api-1.6.4.jar

        ::::::::::::::::::::::::::::::::::::::::::::::

        ::              FAILED DOWNLOADS            ::

        :: ^ see resolution messages for details  ^ ::

        ::::::::::::::::::::::::::::::::::::::::::::::

        :: org.xerial.snappy#snappy-java;1.0.5!snappy-java.jar(bundle)

        :: org.slf4j#slf4j-api;1.6.4!slf4j-api.jar

        ::::::::::::::::::::::::::::::::::::::::::::::



:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [download failed: org.xerial.snappy#snappy-java;1.0.5!snappy-java.jar(bundle), download failed: org.slf4j#slf4j-api;1.6.4!slf4j-api.jar]
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:995)
    at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:263)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:145)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I tried deleting the .m2/repository folder and then running Maven again, but nothing changed.

Edit: here is my pom file:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>spark-ontology</groupId>
    <artifactId>spark-filter</artifactId>
    <version>1.0-SNAPSHOT</version>
    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.4.1</version>
        </dependency>
        <dependency>
            <groupId>com.databricks</groupId>
            <artifactId>spark-avro_2.10</artifactId>
            <version>1.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.7.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-yarn-common</artifactId>
            <version>2.7.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-yarn-api</artifactId>
            <version>2.7.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-common</artifactId>
            <version>2.7.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
            <version>2.7.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.avro</groupId>
            <artifactId>avro</artifactId>
            <version>1.7.7</version>
        </dependency>
        <dependency>
            <groupId>org.apache.avro</groupId>
            <artifactId>avro-mapred</artifactId>
            <version>1.7.7</version>
        </dependency>
        <dependency>
            <groupId>org.apache.zookeeper</groupId>
            <artifactId>zookeeper</artifactId>
            <version>3.4.5</version>
        </dependency>
    </dependencies>
    <properties>
        <java.version>1.7</java.version>
    </properties>
    <build>
        <pluginManagement>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-compiler-plugin</artifactId>
                    <version>3.1</version>
                    <configuration>
                        <source>${java.version}</source>
                        <target>${java.version}</target>
                    </configuration>
                </plugin>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-shade-plugin</artifactId>
                    <version>2.4.1</version>
                    <configuration>
                        <filters>
                            <filter>
                                <artifact>*:*</artifact>
                                <excludes>
                                    <exclude>META-INF/*.SF</exclude>
                                    <exclude>META-INF/*.DSA</exclude>
                                    <exclude>META-INF/*.RSA</exclude>
                                </excludes>
                            </filter>
                        </filters>
                    </configuration>
                    <executions>
                        <execution>
                            <phase>package</phase>
                            <goals>
                                <goal>shade</goal>
                            </goals>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </pluginManagement>
    </build>

</project>

1 Answer

Stack Overflow user

Posted on 2015-08-07 05:39:51

Try adding the following dependency:

<dependency>
    <groupId>org.xerial.snappy</groupId>
    <artifactId>snappy-java</artifactId>
    <version>1.0.5</version>
</dependency>

Maven fetches all required dependencies when it prepares a build. If a dependency cannot be found it throws an error, but it cannot resolve dependencies unless it is explicitly told which ones it needs.

In this case, it looks like one of the named dependencies requires snappy-java 1.0.5 from org.xerial, so the build will not compile unless it is supplied as an explicitly named dependency.

You will need to run "mvn clean package" to force the dependencies to be re-resolved. Before running "clean package", you may also need to delete the 1.0.5 directory from your .m2 repository. Sometimes Maven sees the directory (whether or not there is a jar in it) and does not attempt to resolve the dependency from the central repository.
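Not part of the original answer, but that check can be sketched as a small shell helper: for the two paths flagged NOT FOUND in the log, it reports whether the cached directory actually contains a jar. The function name is made up for illustration, and the default repository path assumes the standard ~/.m2 location.

```shell
# Sketch: report cached artifact directories that exist but hold no jar.
# A jar-less directory is exactly the "incomplete" cache state described
# above; delete it, then re-run "mvn clean package" to re-resolve.
check_cached_jars() {
  repo="$1"
  status=0
  for d in org/xerial/snappy/snappy-java/1.0.5 org/slf4j/slf4j-api/1.6.4; do
    if ls "$repo/$d"/*.jar >/dev/null 2>&1; then
      echo "ok: $d"
    else
      echo "missing: $d (delete $repo/$d, then re-run mvn clean package)"
      status=1
    fi
  done
  return $status
}

# Typical usage against the default local repository:
check_cached_jars "$HOME/.m2/repository" || true
```

Deleting only the affected directories, rather than the whole .m2/repository, keeps the rest of the cache intact.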

It's been a while since I used Maven, but I do remember a few occasions when the network connection failed while Maven was pulling a jar from a repository, leaving .m2 in an inconsistent state.

Votes: 1
Original page content provided by Stack Overflow.
Original link:

https://stackoverflow.com/questions/31866009
