I use the aspectj-maven-plugin to weave my aspects into the project's existing dependencies (in my case the org.apache.spark jars). I then use the maven-assembly-plugin to generate a standalone jar containing all dependencies.
The AspectJ plugin appears to weave the aspects into the external jars correctly. However, when I run the jar-with-dependencies created by the assembly plugin, the aspects are not triggered for the external jars (while they work perfectly for pointcuts targeting my own code).
I suspect that the assembly plugin does not use the woven jars when building the jar-with-dependencies. But I do not know where the AspectJ plugin stores the woven jars, nor how to make the assembly use them instead of the original ones.
Here is my Maven configuration:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>be.example.aspectj</groupId>
<artifactId>minimal</artifactId>
<version>1.0-SNAPSHOT</version>
<name>minimal</name>
<!-- FIXME change it to the project's website -->
<url>http://www.example.com</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.aspectj/aspectjrt -->
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>1.9.7</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjweaver</artifactId>
<version>1.9.7</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.12</artifactId>
<version>3.1.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.12</artifactId>
<version>3.1.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/mysql/mysql-connector-java -->
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>8.0.26</version>
</dependency>
</dependencies>
<build>
<!-- lock down plugins versions to avoid using Maven defaults (may be moved to parent pom) -->
<plugins>
<!-- Maven assembly plugin -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>3.3.0</version>
<configuration>
<!-- get all project dependencies -->
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
<!-- MainClass in manifest makes an executable jar -->
<archive>
<manifest>
<mainClass>be.example.aspectj.App</mainClass>
</manifest>
</archive>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
<!-- Deactivating the default maven compiler plugin -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<executions>
<execution>
<id>default-compile</id>
<phase>none</phase>
</execution>
</executions>
</plugin>
<!-- Aspectj configuration -->
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.14.0</version>
<configuration>
<complianceLevel>${maven.compiler.source}</complianceLevel>
<source>${maven.compiler.source}</source>
<target>${maven.compiler.source}</target>
<showWeaveInfo>true</showWeaveInfo>
<verbose>true</verbose>
<Xlint>ignore</Xlint>
<encoding>UTF-8 </encoding>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<classpathPrefix>lib/</classpathPrefix>
<mainClass>be.example.aspectj.App</mainClass>
</manifest>
</archive>
<!-- Weaved dependencies -->
<weaveDependencies>
<weaveDependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.12</artifactId>
</weaveDependency>
<weaveDependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.12</artifactId>
</weaveDependency>
</weaveDependencies>
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
Below is the output of the aspectj-maven-plugin. It seems to show that the weaving was performed correctly, but the warnings appear to indicate conflicts between multiple versions of the same jar:
[INFO] --- aspectj-maven-plugin:1.14.0:compile (default) @ minimal ---
[INFO] Showing AJC message detail for messages of types: [error, warning, fail]
[INFO] Join point 'method-call(java.sql.Connection java.sql.DriverManager.getConnection(java.lang.String, java.lang.String, java.lang.String))' in Type 'org.sparkproject.jetty.security.JDBCLoginService' (JDBCLoginService.java:183) advised by before advice from 'DriverManagerAspect' (DriverManagerAspect.aj:8)
[INFO] Join point 'method-call(java.sql.Connection java.sql.DriverManager.getConnection(java.lang.String))' in Type 'org.sparkproject.jetty.server.session.DatabaseAdaptor' (DatabaseAdaptor.java:305) advised by before advice from 'DriverManagerAspect' (DriverManagerAspect.aj:8)
[INFO] Join point 'method-call(boolean be.example.aspectj.Account.withdraw(int))' in Type 'be.example.aspectj.App' (App.java:19) advised by before advice from 'AccountAspect' (AccountAspect.aj:9)
[INFO] Join point 'method-call(java.sql.Connection java.sql.DriverManager.getConnection(java.lang.String))' in Type 'be.example.aspectj.App' (App.java:36) advised by before advice from 'DriverManagerAspect' (DriverManagerAspect.aj:8)
[WARNING] duplicate resource: 'META-INF/NOTICE'
/home/jeromefink/.m2/repository/org/apache/spark/spark-core_2.12/3.1.2/spark-core_2.12-3.1.2.jar:0
[WARNING] duplicate resource: 'META-INF/LICENSE'
/home/jeromefink/.m2/repository/org/apache/spark/spark-core_2.12/3.1.2/spark-core_2.12-3.1.2.jar:0
[WARNING] duplicate resource: 'META-INF/DEPENDENCIES'
/home/jeromefink/.m2/repository/org/apache/spark/spark-core_2.12/3.1.2/spark-core_2.12-3.1.2.jar:0
[WARNING] duplicate resource: 'META-INF/services/org.apache.spark.deploy.history.EventFilterBuilder'
/home/jeromefink/.m2/repository/org/apache/spark/spark-core_2.12/3.1.2/spark-core_2.12-3.1.2.jar:0
[WARNING] duplicate resource: 'META-INF/maven/org.spark-project.spark/unused/pom.xml'
/home/jeromefink/.m2/repository/org/apache/spark/spark-core_2.12/3.1.2/spark-core_2.12-3.1.2.jar:0
[WARNING] duplicate resource: 'META-INF/maven/org.spark-project.spark/unused/pom.properties'
/home/jeromefink/.m2/repository/org/apache/spark/spark-core_2.12/3.1.2/spark-core_2.12-3.1.2.jar:0
My complete minimal example is available in this GitHub repository. Do you have any idea how to create a jar-with-dependencies that contains the dependency jars woven by AspectJ? Or do you know where the aspectj-maven-plugin stores the jars it has woven?
Thank you for reading my question.
Posted on 2021-09-04 13:20:21
Preface
Your MCVE on GitHub is helpful, thank you. Others should follow your example in this regard, especially when they have complex problems like yours.
AspectJ binary weaving
Binary weaving in AspectJ works as follows: all weave dependencies (i.e. everything on ajc's inpath) are written to the compiler's output directory. There is no option to filter or exclude anything. That is, the woven Java classes end up there, but so do the unwoven classes and all resource files.
In your specific case, you have two weave dependencies which happen to contain resource files with identical names, hence the compiler warnings. The easiest way to avoid this here is to weave only spark-core_2.12, because according to the compiler output none of the classes in spark-sql_2.12 is a weaving target. This only works in your particular situation, though; if both weave dependencies contained classes to be woven, you could not simply drop one of them.
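Rendered as plain configuration rather than a diff (this is what the patch further below does to your POM), the weave section shrinks to a single dependency:

```xml
<!-- aspectj-maven-plugin configuration, trimmed to the weaving part -->
<weaveDependencies>
  <!-- spark-sql_2.12 is deliberately omitted: it contains no weaving
       targets, and leaving it out avoids the duplicate-resource warnings -->
  <weaveDependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
  </weaveDependency>
</weaveDependencies>
```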
The EventFilterBuilder service descriptor files have conflicting content; I checked by comparing them manually. In that situation, one file is overwritten by the other in the target/classes directory. Whether ignoring this works for you depends largely on your use case.
Original vs. woven files in the Maven uber JAR
The second problem is that when the assembly is created, your woven files are overwritten by the original dependencies. This happens because you have duplicates on the classpath: the originals and the woven copies in your own target directory. This can also be fixed by adjusting the assembly descriptor to ignore the originals. That way you make sure the woven classes are the ones that end up in the uber JAR.
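In descriptor form, the relevant part is an exclude on the original artifact inside the dependencySet (a trimmed sketch of the full descriptor shown in the diff below):

```xml
<dependencySet>
  <outputDirectory>/</outputDirectory>
  <!-- the project artifact already contains the woven spark-core classes
       from target/classes, so the original jar must be skipped -->
  <useProjectArtifact>true</useProjectArtifact>
  <unpack>true</unpack>
  <scope>runtime</scope>
  <excludes>
    <exclude>org.apache.spark:spark-core_2.12:*</exclude>
  </excludes>
</dependencySet>
```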
Simple workaround
Here is how I changed your project in a minimal way in order to
- remove the invalid archive section from the aspectj-maven-plugin configuration,
- drop the spark-sql_2.12 weave dependency in order to avoid the conflict warnings, and
- exclude the original spark-core_2.12 files from the assembly.
diff --git a/minimal/src/assembly/executable-jar.xml b/minimal/src/assembly/executable-jar.xml
new file mode 100644
--- /dev/null (revision Staged)
+++ b/minimal/src/assembly/executable-jar.xml (revision Staged)
@@ -0,0 +1,27 @@
+<assembly xmlns="http://maven.apache.org/ASSEMBLY/2.1.0"
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/ASSEMBLY/2.1.0 http://maven.apache.org/xsd/assembly-2.1.0.xsd">
+ <id>jar-with-dependencies</id>
+ <formats>
+ <format>jar</format>
+ </formats>
+ <includeBaseDirectory>false</includeBaseDirectory>
+ <dependencySets>
+ <dependencySet>
+ <outputDirectory>/</outputDirectory>
+ <useProjectArtifact>true</useProjectArtifact>
+ <unpack>true</unpack>
+ <scope>runtime</scope>
+ <excludes>
+<!--
+ <exclude>
+ org.apache.spark:spark-sql_2.12:*
+ </exclude>
+-->
+ <exclude>
+ org.apache.spark:spark-core_2.12:*
+ </exclude>
+ </excludes>
+ </dependencySet>
+ </dependencySets>
+</assembly>
diff --git a/minimal/pom.xml b/minimal/pom.xml
--- a/minimal/pom.xml (revision HEAD)
+++ b/minimal/pom.xml (revision Staged)
@@ -73,10 +73,9 @@
<artifactId>maven-assembly-plugin</artifactId>
<version>3.3.0</version>
<configuration>
- <!-- get all project dependencies -->
- <descriptorRefs>
- <descriptorRef>jar-with-dependencies</descriptorRef>
- </descriptorRefs>
+ <descriptors>
+ <descriptor>src/assembly/executable-jar.xml</descriptor>
+ </descriptors>
<!-- MainClass in manifest makes an executable jar -->
<archive>
@@ -121,22 +120,10 @@
<showWeaveInfo>true</showWeaveInfo>
<verbose>true</verbose>
<Xlint>ignore</Xlint>
- <encoding>UTF-8 </encoding>
- <archive>
- <manifest>
- <addClasspath>true</addClasspath>
- <classpathPrefix>lib/</classpathPrefix>
- <mainClass>be.example.aspectj.App</mainClass>
- </manifest>
- </archive>
+ <encoding>UTF-8</encoding>
<!-- Weaved dependencies -->
<weaveDependencies>
- <weaveDependency>
- <groupId>org.apache.spark</groupId>
- <artifactId>spark-sql_2.12</artifactId>
- </weaveDependency>
-
<weaveDependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.12</artifactId>
A more generic solution
If you want a more generic solution, you can …
https://stackoverflow.com/questions/69044381