
Unable to connect spark-cloudant

Asked by a Stack Overflow user on 2016-12-02 05:27:42
1 answer · 133 views · 0 following · 0 votes

I am trying to fetch data from Cloudant using Java code and I am getting an error.

I have tried the following Spark and spark-cloudant versions:

Spark 2.0, Spark 2.0.1, Spark 2.0.2

All versions produce the same error, posted below.

If I add a Scala dependency to resolve the error, it conflicts with the Spark libraries.

Below is my Java code:

package spark.cloudant.connecter;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.SQLContext;
import com.cloudant.spark.*;

public class cloudantconnecter {
    public static void main(String[] args) throws Exception {

        try {
            SparkConf sparkConf = new SparkConf().setAppName("spark cloudant connecter").setMaster("local[*]");
            sparkConf.set("spark.streaming.concurrentJobs", "30");

            JavaSparkContext sc = new JavaSparkContext(sparkConf);

            SQLContext sqlContext = new SQLContext(sc);
            System.out.print("initialization successfully");


            Dataset<org.apache.spark.sql.Row> st = sqlContext.read().format("com.cloudant.spark")
                    .option("cloudant.host", "HOSTNAME").option("cloudant.username", "USERNAME")
                    .option("cloudant.password", "PASSWORD").load("DATABASENAME");

            st.printSchema();


        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

Maven dependencies:

<dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>2.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.10</artifactId>
            <version>2.0.0</version>
        </dependency>
        <dependency>
            <groupId>cloudant-labs</groupId>
            <artifactId>spark-cloudant</artifactId>
            <version>2.0.0-s_2.11</version>
        </dependency>
    </dependencies>

Detailed error output:

Exception in thread "main" java.lang.NoSuchMethodError: scala/Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object; (loaded from file:/C:/Users/Administrator/.m2/repository/org/scala-lang/scala-library/2.10.6/scala-library-2.10.6.jar by sun.misc.Launcher$AppClassLoader@9f916f97) called from class scalaj.http.HttpConstants$ (loaded from file:/C:/Users/Administrator/.m2/repository/org/scalaj/scalaj-http_2.11/2.3.0/scalaj-http_2.11-2.3.0.jar by sun.misc.Launcher$AppClassLoader@9f916f97).
    at scalaj.http.HttpConstants$.liftedTree1$1(Http.scala:637)
    at scalaj.http.HttpConstants$.<init>(Http.scala:636)
    at scalaj.http.HttpConstants$.<clinit>(Http.scala)
    at scalaj.http.BaseHttp$.$lessinit$greater$default$2(Http.scala:754)
    at scalaj.http.Http$.<init>(Http.scala:738)
    at scalaj.http.Http$.<clinit>(Http.scala)
    at com.cloudant.spark.common.JsonStoreDataAccess.getQueryResult(JsonStoreDataAccess.scala:152)
    at com.cloudant.spark.common.JsonStoreDataAccess.getTotalRows(JsonStoreDataAccess.scala:99)
    at com.cloudant.spark.common.JsonStoreRDD.totalRows$lzycompute(JsonStoreRDD.scala:56)
    at com.cloudant.spark.common.JsonStoreRDD.totalRows(JsonStoreRDD.scala:55)
    at com.cloudant.spark.common.JsonStoreRDD.totalPartition$lzycompute(JsonStoreRDD.scala:59)
    at com.cloudant.spark.common.JsonStoreRDD.totalPartition(JsonStoreRDD.scala:58)
    at com.cloudant.spark.common.JsonStoreRDD.getPartitions(JsonStoreRDD.scala:81)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1934)
    at org.apache.spark.rdd.RDD$$anonfun$fold$1.apply(RDD.scala:1046)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
    at org.apache.spark.rdd.RDD.fold(RDD.scala:1040)
    at org.apache.spark.sql.execution.datasources.json.InferSchema$.infer(InferSchema.scala:68)
    at org.apache.spark.sql.DataFrameReader$$anonfun$3.apply(DataFrameReader.scala:317)
    at org.apache.spark.sql.DataFrameReader$$anonfun$3.apply(DataFrameReader.scala:317)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:316)
    at com.cloudant.spark.DefaultSource.create(DefaultSource.scala:127)
    at com.cloudant.spark.DefaultSource.createRelation(DefaultSource.scala:105)
    at com.cloudant.spark.DefaultSource.createRelation(DefaultSource.scala:100)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:315)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:132)
    at spark.cloudant.connecter.cloudantconnecter.main(cloudantconnecter.java:24)

1 Answer

Stack Overflow user

Accepted answer

Answered on 2016-12-02 12:17:10

The error occurs because the Spark libraries in your pom are built against Scala 2.10, while the spark-cloudant package is built against Scala 2.11.

Therefore, change the dependency spark-core_2.10 to spark-core_2.11 (and likewise spark-mllib_2.10 to spark-mllib_2.11).
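To confirm which Scala binary versions actually land on the classpath, one way (a sketch, not part of the original answer) is to inspect Maven's resolved dependency tree and look at the `_2.10`/`_2.11` artifact suffixes:

```shell
# Print the resolved dependency tree and highlight Scala-versioned artifacts.
# Seeing both _2.10 and _2.11 suffixes in the output means the build is
# mixing incompatible Scala binary versions.
mvn dependency:tree | grep "_2\.1"
```

All Scala-based artifacts (spark-core, spark-mllib, spark-cloudant, and their transitive dependencies such as scalaj-http) must share the same suffix.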

So the dependencies are now:

<dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.0.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.11</artifactId>
            <version>2.0.1</version>
        </dependency>   
        <dependency>
            <groupId>cloudant-labs</groupId>
            <artifactId>spark-cloudant</artifactId>
            <version>2.0.0-s_2.11</version>
        </dependency>
Votes: 1
Original page content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/40925387
