I have a native library (Freeling) that I compiled with cmake and make and installed through a cluster initialization action (so it should be present on the master and on every worker).
Even so, when I call System.loadLibrary I get this error:
Exception in thread "main" java.lang.UnsatisfiedLinkError: no Jfreeling in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1122)
I tried to let the program find the library (the call is made in a static block) with the following properties:
"properties": {
"spark.driver.extraClassPath": "/usr/local/share/freeling/APIs/java/Jfreeling.jar:/usr/local/lib/libfreeling.so",
"spark.executor.extraClassPath": "/usr/local/share/freeling/APIs/java/Jfreeling.jar:/usr/local/lib/libfreeling.so",
"spark.executor.extraLibraryPath": "/usr/local/lib/libfreeling.so",
"spark.driver.extraLibraryPath": "/usr/local/lib/libfreeling.so",
"spark.executorEnv.LD_PRELOAD": "/usr/local/lib/libfreeling.so",
"spark.yarn.dist.files": "/usr/local/lib/libfreeling.so",
"spark.yarn.appMasterEnv.LD_PRELOAD": "libfreeling.so",
"spark.files": "/usr/local/lib/libfreeling.so",
"spark.executorEnv.LD_LIBRARY_PATH": "libfreeling.so"
},
"jarFileUris": [
"file:///usr/local/share/freeling/APIs/java/Jfreeling.jar",
"file:///usr/local/lib/libfreeling.so"
],
Posted on 2019-07-25 06:05:37
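A quick way to see why the JVM cannot find the library is to print what it is actually searching: `System.loadLibrary("Jfreeling")` maps the name to a platform file name (libJfreeling.so on Linux) and looks for it in each entry of `java.library.path`. This is a minimal diagnostic sketch (the `LibraryPathCheck` class name is an assumption, not from the question):

```java
import java.io.File;
import java.util.Arrays;
import java.util.List;

public class LibraryPathCheck {
    // The directories the JVM will search when System.loadLibrary("Jfreeling")
    // tries to resolve the native library.
    static List<String> libraryPathEntries() {
        String path = System.getProperty("java.library.path", "");
        return Arrays.asList(path.split(File.pathSeparator));
    }

    public static void main(String[] args) {
        // On Linux this prints "libJfreeling.so" - the file name the JVM looks for.
        System.out.println("Mapped name: " + System.mapLibraryName("Jfreeling"));
        for (String dir : libraryPathEntries()) {
            System.out.println(dir);
        }
    }
}
```

Running this inside the Spark job (on both driver and executors) shows whether the directory containing libJfreeling.so ever made it into `java.library.path`; if it did not, the `UnsatisfiedLinkError` above is expected.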
Could you try putting your library under /usr/lib/hadoop/lib/native/? In /etc/spark/conf/spark-env.sh it has
# Spark got rid of SPARK_LIBRARY_PATH in 1.0
# It has properties for extraLibraryPaths, but this is more extensible
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:${HADOOP_HOME}/lib/native
Posted on 2019-08-15 16:06:37
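An alternative that sidesteps the search-path problem entirely: `System.load` takes an absolute file path and ignores `java.library.path`, so it works even when neither `LD_LIBRARY_PATH` nor the Spark extraLibraryPath settings reached the executors. A sketch, using the install path from the question (the `FreelingLoader` class name and the error handling are illustrative assumptions):

```java
public class FreelingLoader {
    static {
        try {
            // System.load takes an absolute path and bypasses java.library.path,
            // so it works regardless of LD_LIBRARY_PATH on the executors.
            // /usr/local/lib/libfreeling.so is the path given in the question.
            System.load("/usr/local/lib/libfreeling.so");
        } catch (UnsatisfiedLinkError e) {
            // On a machine without Freeling installed this reports the missing file
            // instead of killing class initialization.
            System.err.println("Could not load Freeling: " + e.getMessage());
        }
    }

    public static void main(String[] args) {
        System.out.println("class initialized");
    }
}
```

Note that this only helps if the .so really exists at that path on every node, which the init action should guarantee.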
You should add /usr/local/share/freeling/APIs/java/Jfreeling.jar to your classpath and /usr/local/share/freeling/APIs/java/libJfreeling.so to your LD_LIBRARY_PATH.
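Applied to the job properties from the question, this suggests two fixes: the classpath entries should contain only the jar (a .so does not belong on a Java classpath), and the library-path settings should name the directory containing the .so, since `spark.driver.extraLibraryPath`, `spark.executor.extraLibraryPath`, and `LD_LIBRARY_PATH` are all searched as lists of directories, not file paths. A sketch of the corrected fragment, assuming the paths from this answer:

```json
"properties": {
  "spark.driver.extraClassPath": "/usr/local/share/freeling/APIs/java/Jfreeling.jar",
  "spark.executor.extraClassPath": "/usr/local/share/freeling/APIs/java/Jfreeling.jar",
  "spark.driver.extraLibraryPath": "/usr/local/share/freeling/APIs/java",
  "spark.executor.extraLibraryPath": "/usr/local/share/freeling/APIs/java"
}
```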
https://stackoverflow.com/questions/57149660