
Unable to register RDD as TempTable

Asked by a Stack Overflow user on 2018-07-16 00:46:52
0 answers · 672 views · 0 followers · 0 votes

I am using IntelliJ and trying to fetch data from a MySQL DB and then write it to a Hive table. However, I am not able to register my RDD as a temp table. The error is "Cannot Resolve Symbol registerTempTable".

I know the problem is due to some missing import, but I cannot figure out which one.

I have been stuck on this for quite a while and have tried all the options/answers available on Stack Overflow.

Below is my code:

import java.sql.Driver
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.JdbcRDD
import java.sql.{Connection, DriverManager, ResultSet}
import org.apache.spark.sql.hive.HiveContext


object JdbcRddExample {

  def main(args: Array[String]): Unit = {
    val url = "jdbc:mysql://localhost:3306/retail_db"
    val username = "retail_dba"
    val password = "cloudera"

    // Build the SparkContext first, then the SQL and Hive contexts that wrap it.
    val conf = new SparkConf().setAppName("JDBC RDD").setMaster("local[2]").set("spark.executor.memory", "1g")
    val sc = new SparkContext(conf)

    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    val hiveContext = new HiveContext(sc)
    import hiveContext.implicits._

    // Register the MySQL JDBC driver.
    Class.forName("com.mysql.jdbc.Driver").newInstance


    val myRDD = new JdbcRDD( sc, () => DriverManager.getConnection(url,username,password) ,
      "select department_id,department_name from departments limit ?,?",
      0,999999999,1,  r => r.getString("department_id") + ", " + r.getString("department_name"))

    myRDD.registerTempTable("My_Table") // error: Not able to resolve registerTempTable



    sqlContext.sql("use my_db")
    sqlContext.sql("Create table my_db.depts (department_id INT, department_name String")

My SBT (I believe I have already imported all the artifacts):

name := "JdbcRddExample"

version := "0.1"

scalaVersion := "2.11.12"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.1"

// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.3.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.3.1"

// https://mvnrepository.com/artifact/org.apache.spark/spark-hive
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.3.1" % "provided"

// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.3.1" % "provided"

// https://mvnrepository.com/artifact/com.typesafe.scala-logging/scala-logging
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.7.1"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.1"

libraryDependencies += "org.apache.logging.log4j" % "log4j-api" % "2.11.0"
libraryDependencies += "org.apache.logging.log4j" % "log4j-core" % "2.11.0"

// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.1",
  "org.apache.spark" %% "spark-sql" % "2.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.3.1",
  "mysql" % "mysql-connector-java" % "5.1.12"
)
// https://mvnrepository.com/artifact/org.apache.spark/spark-hive
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.3.1" % "provided"

// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.1"

Please point me to the exact import that I am missing, or suggest another way of doing this. As I mentioned before, I have tried all the available solutions and nothing has worked so far.
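
One possible alternative, sketched here under the assumption that Spark's built-in JDBC data source is acceptable: read the MySQL table directly into a DataFrame with spark.read.jdbc and write it to Hive, skipping JdbcRDD entirely. The SparkSession, view and table names below are illustrative.

import java.util.Properties
import org.apache.spark.sql.SparkSession

// Self-contained sketch, not the asker's code.
val spark = SparkSession.builder()
  .appName("JdbcRddExample")
  .master("local[2]")
  .enableHiveSupport()
  .getOrCreate()

val props = new Properties()
props.setProperty("user", "retail_dba")
props.setProperty("password", "cloudera")
props.setProperty("driver", "com.mysql.jdbc.Driver")

// Read the MySQL table straight into a DataFrame.
val deptDF = spark.read.jdbc("jdbc:mysql://localhost:3306/retail_db", "departments", props)

// Query it via SQL and persist it as a Hive table.
deptDF.createOrReplaceTempView("departments_tmp")
spark.sql("create database if not exists my_db")
deptDF.write.mode("overwrite").saveAsTable("my_db.depts")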


Answers

The original content of this page was provided by Stack Overflow. Source: https://stackoverflow.com/questions/51350272
