
Loading a YarnCluster via dask-yarn results in a Java error

Stack Overflow user
Asked on 2019-05-28 16:00:36
1 answer · 297 views · 0 following · 0 votes

I am trying to set up and run Dask as described on this page: https://yarn.dask.org/en/latest/quickstart.html#usage

I used conda-pack to package my conda environment into an environment.tar.gz file, then tried to run the following in Python (from the same folder):

>>> from dask_yarn import YarnCluster
>>> cluster = YarnCluster(environment='environment.tar.gz')

This results in the Java error pasted below.

19/05/28 15:45:39 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/05/28 15:45:40 ERROR skein.Driver: Error running Driver
java.lang.UnsupportedClassVersionError: com/google/cloud/hadoop/fs/gcs/GoogleHadoopFileSystem : Unsupported major.minor version 52.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:278)
        at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:363)
        at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
        at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2750)
        at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2777)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2794)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2830)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2812)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:390)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:179)
        at com.anaconda.skein.Driver.getFs(Driver.java:304)
        at com.anaconda.skein.Driver.run(Driver.java:279)
        at com.anaconda.skein.Driver.main(Driver.java:174)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/.conda/envs/dask_yarn/lib/python3.7/site-packages/dask_yarn/core.py", line 295, in __init__
    self._start_cluster(spec, skein_client)
  File "/home/.conda/envs/dask_yarn/lib/python3.7/site-packages/dask_yarn/core.py", line 339, in _start_cluster
    skein_client = _get_skein_client(skein_client)
  File "/home/.conda/envs/dask_yarn/lib/python3.7/site-packages/dask_yarn/core.py", line 46, in _get_skein_client
    return skein.Client(security=security)
  File "/home/.conda/envs/dask_yarn/lib/python3.7/site-packages/skein/core.py", line 353, in __init__
    java_options=java_options)
  File "/home/.conda/envs/dask_yarn/lib/python3.7/site-packages/skein/core.py", line 266, in _start_driver
    raise DriverError("Failed to start java process")
skein.exceptions.DriverError: Failed to start java process

Some searching suggests the error is caused by a Java version mismatch between compile time and runtime. I tried setting the environment variables shown below, but that did not work either. Is there another way to resolve this error?

export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk.x86_64
export JRE_HOME=$JAVA_HOME/jre
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
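The "Unsupported major.minor version 52.0" line in the log can be decoded directly: it names the class-file format version the failing class was compiled for. A minimal sketch of that lookup (the table below is the standard class-file major-version mapping, not anything taken from this error output):

```python
# Each Java release writes a fixed class-file major version, so the
# number in an UnsupportedClassVersionError identifies the JDK that
# compiled the class.
CLASS_FILE_MAJOR = {
    49: "Java 5",
    50: "Java 6",
    51: "Java 7",
    52: "Java 8",
    53: "Java 9",
}

def required_java(major):
    """Return the Java release that emits this class-file major version."""
    return CLASS_FILE_MAJOR.get(major, "unknown")

# The GoogleHadoopFileSystem class was compiled for class-file version 52,
# but a Java 7 JVM can only load up to version 51.
print(required_java(52))  # -> Java 8
```

So the stack trace says the GCS connector jar needs at least Java 8, while the JAVA_HOME above points at a Java 7 JRE.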

1 Answer

Stack Overflow user

Accepted answer

Answered on 2019-05-31 20:54:06

The solution to this problem is to use Java 1.8 instead of Java 1.7: class-file version 52.0 corresponds to Java 8, so the class cannot be loaded by a Java 7 JVM. For an example, see this
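A quick way to confirm which Java the skein driver will pick up is to check the `java` on PATH before creating the cluster. A minimal sketch, assuming `java -version` output in the usual OpenJDK format (the `shutil.which` guard makes it a no-op on machines with no `java` installed):

```python
import re
import shutil
import subprocess

def java_major_version(version_line):
    """Parse a line like 'openjdk version "1.8.0_292"' or
    'openjdk version "11.0.2"' and return the major version as an int."""
    m = re.search(r'version "(\d+)(?:\.(\d+))?', version_line)
    if not m:
        raise ValueError("unrecognized version line: %r" % version_line)
    major = int(m.group(1))
    minor = int(m.group(2) or 0)
    # Pre-Java-9 releases report themselves as 1.x; normalize to x.
    return minor if major == 1 else major

if shutil.which("java"):
    # Note: 'java -version' writes to stderr, not stdout.
    line = subprocess.run(["java", "-version"], capture_output=True,
                          text=True).stderr.splitlines()[0]
    if java_major_version(line) < 8:
        raise RuntimeError("skein's driver needs Java 8+, found: " + line)
```

Running this check before `YarnCluster(...)` turns the opaque driver failure into an explicit version error.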

Votes: 0
Original page content provided by Stack Overflow; translation supported by Tencent Cloud's IT-domain engine.
Original link:

https://stackoverflow.com/questions/56346115
