
NoClassDefFoundError when submitting an Apache Flink program packaged with the maven-shade-plugin

Stack Overflow user
Asked on 2021-12-13 09:02:53
1 answer · 575 views · 0 followers · 1 vote

I need to submit a Flink SQL program to run on YARN.

I packaged it following the Flink documentation: https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/overview/

It works when I add the dependency jars to /lib under the Flink directory, but I want the dependencies loaded from my jar package rather than from /lib.

In that case, java.lang.NoClassDefFoundError occurs when I execute flink run.

Here is my pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  

    <modelVersion>4.0.0</modelVersion>
    <groupId>org.example</groupId>
    <artifactId>FlinkTest</artifactId>
    <version>1.0-SNAPSHOT</version>
    <properties>
        <flink.version>1.14.0</flink.version>
        <hive.version>3.1.1</hive.version>
        <scala.version>2.11.12</scala.version>
        <scala.binary.version>2.11</scala.binary.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
            <scope>provided</scope>
            <version>${flink.version}</version>
        </dependency>
        <!--Hive Dependency-->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.1.1</version>
            <!--            <scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala-bridge_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <version>${hive.version}</version>
            <scope>provided</scope>
        </dependency>
        <!--        <exclusions>-->
        <!--            <exclusion>-->
        <!--                <groupId>org.apache.hive</groupId>-->
        <!--                <artifactId>hive-common</artifactId>-->
        <!--            </exclusion>-->
        <!--        </exclusions>-->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-hive_2.11</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <!--        <dependency>-->
        <!--            <groupId>org.apache.flink</groupId>-->
        <!--            <artifactId>flink-table-api-java-bridge_${scala.version}</artifactId>-->
        <!--            <version>${flink.version}</version>-->
        <!--        </dependency>-->
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>fastjson</artifactId>
            <version>1.2.76</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>2.6.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-csv</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <!--        <dependency>-->
        <!--            <groupId>org.apache.hadoop</groupId>-->
        <!--            <artifactId>hadoop-common</artifactId>-->
        <!--            <version>3.1.1</version>-->
        <!--            <scope>provided</scope>-->
        <!--        </dependency>-->
        <!--        <dependency>-->
        <!--            <groupId>org.apache.flink</groupId>-->
        <!--            <artifactId>flink-table-planner-blink_2.12</artifactId>-->
        <!--            <version>${flink.version}</version>-->
        <!--            <scope>provided</scope>-->
        <!--        </dependency>-->

        <!--        <dependency>-->
        <!--            <groupId>org.apache.hadoop</groupId>-->
        <!--            <artifactId>hadoop-hdfs-client</artifactId>-->
        <!--            <version>3.1.1</version>-->
        <!--        </dependency>-->
        <!--        <dependency>-->
        <!--            <groupId>org.apache.hadoop</groupId>-->
        <!--            <artifactId>hadoop-client-api</artifactId>-->
        <!--            <version>3.1.1</version>-->
        <!--        </dependency>-->
        <dependency>
            <groupId>jdk.tools</groupId>
            <artifactId>jdk.tools</artifactId>
            <version>1.8</version>
            <scope>system</scope>
            <systemPath>/Library/Java/JavaVirtualMachines/jdk1.8.0_311.jdk/Contents/Home/lib/tools.jar</systemPath>
        </dependency>
        <!--        <dependency>-->
        <!--            <groupId>org.apache.flink</groupId>-->
        <!--            <artifactId>flink-clients_${scala.version}</artifactId>-->
        <!--            <version>${flink.version}</version>-->
        <!--            <scope>provided</scope>-->
        <!--        </dependency>-->
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>4.4.0</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                            <scalaCompatVersion>${scala.binary.version}</scalaCompatVersion>
                            <scalaVersion>${scala.version}</scalaVersion>
                            <args>
                                <arg>-dependencyfile</arg>
                                <arg>${project.build.directory}/.scala_dependencies</arg>
                            </args>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.2.0</version>
                <executions>
                    <execution>
                        <id>shade</id>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <transformers combine.children="append">
                                <transformer
                                        implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                                <transformer
                                        implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>com.ghost.executable.hive.FlinkPlaySubmit</mainClass>
                                </transformer>
                            </transformers>
                            <filters>
                                <filter>
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                            <artifactSet>
                                <excludes>
                                    <exclude>org.apache.flink:force-shading</exclude>
                                    <exclude>com.google.code.findbugs:jsr305</exclude>
                                    <exclude>org.slf4j:*</exclude>
                                    <exclude>log4j:*</exclude>
                                </excludes>
                            </artifactSet>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>

I ran mvn clean and mvn package to build the jar packages, e.g. FlinkTest-1.0-SNAPSHOT.jar and original-FlinkTest-1.0-SNAPSHOT.jar, then executed

flink run -t yarn-per-job -D yarn.application.name=flink_test -detached -c com.ghost.executable.hive.FlinkPlaySubmit FlinkTest-1.0-SNAPSHOT.jar

The following exception occurred:

java.lang.NoClassDefFoundError: org/apache/kafka/clients/consumer/OffsetResetStrategy

I can see this dependency inside the jar package, but the Flink program cannot find it.

What other steps do I need to package and run this?

Thanks!

Here is my flink/lib:

-rw-r--r--  1 ghost  staff     167761 12  5 21:43 antlr-runtime-3.5.2.jar
-rw-r--r--  1 ghost  staff    7685322 12  5 21:29 flink-connector-hive_2.11-1.14.0.jar
-rw-r--r--  1 ghost  staff     388181 12  5 23:01 flink-connector-kafka_2.11-1.14.0.jar
-rw-r--r--  1 ghost  staff      85588  9 22 21:37 flink-csv-1.14.0.jar
-rw-r--r--  1 ghost  staff  143645853  9 22 21:40 flink-dist_2.11-1.14.0.jar
-rw-r--r--  1 ghost  staff     153148  9 22 21:36 flink-json-1.14.0.jar
-rw-rw-r--  1 ghost  staff    7709731  9  1 18:31 flink-shaded-zookeeper-3.4.14.jar
-rw-r--r--  1 ghost  staff   42286825  9 22 21:39 flink-table_2.11-1.14.0.jar
-rw-r--r--  1 ghost  staff    1654887 12 10 18:31 hadoop-mapreduce-client-core-3.1.1.jar.bak
-rw-r--r--  1 ghost  staff   40605995 12  5 21:43 hive-exec-3.1.1.jar
-rw-r--r--  1 ghost  staff    3535156 12 10 16:22 kafka-clients-2.6.3.jar.bak
-rw-r--r--  1 ghost  staff     313702 12  5 21:43 libfb303-0.9.3.jar
-rw-rw-r--  1 ghost  staff     206756  9  1 18:28 log4j-1.2-api-2.14.1.jar
-rw-rw-r--  1 ghost  staff     300365  9  1 18:28 log4j-api-2.14.1.jar
-rw-rw-r--  1 ghost  staff    1745700  9  1 18:28 log4j-core-2.14.1.jar
-rw-rw-r--  1 ghost  staff      23625  9  1 18:28 log4j-slf4j-impl-2.14.1.jar

Apart from kafka-clients and hadoop-mapreduce-client-core, which are the two dependencies I want loaded from the jar package, I marked everything as provided. Those two are not marked provided in the pom, yet they don't seem to be loaded at runtime.


1 Answer

Stack Overflow user

Answered on 2021-12-13 18:44:47

In your pom, the <scope> is set to provided for the flink-connector-kafka_${scala.binary.version} artifact. As a result, the Maven shade plugin doesn't think it needs to include that jar (and its unique transitive dependencies) in the uber jar. So you need to either make sure that jar (and its dependencies) is installed on the YARN cluster, or remove the provided scope.
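Following that suggestion, the packaging-side fix would be to drop the provided scope from the connector so the shade plugin bundles it, together with its transitive Kafka classes, into the uber jar. A sketch of the changed dependency, with versions taken from the question's pom:

```xml
<!-- flink-connector-kafka with the default (compile) scope, so the
     maven-shade-plugin includes it and its transitive dependencies
     in the shaded jar. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
    <!-- <scope>provided</scope> removed -->
</dependency>
```

The alternative from the answer is to keep the provided scope and instead ship the connector jar and its dependencies in flink/lib on the YARN cluster, as the question's directory listing already does for some jars.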

Votes: 1
Original page content provided by Stack Overflow.
Link: https://stackoverflow.com/questions/70332105