I am trying to learn how to write a Scala-Spark JDBC program in IntelliJ IDEA. To that end, I created a Scala SBT project.

Before writing the JDBC connection parameters into the class, I first tried to load a properties file that holds all the connection properties, and to print the values to check whether they load correctly, as shown below.

Contents of connection.properties:
devUserName=username
devPassword=password
gpDriverClass=org.postgresql.Driver
gpDevUrl=jdbc:url

Code:
package com.yearpartition.obj

import java.io.FileInputStream
import java.util.Properties

import org.apache.spark.sql.SparkSession
import org.apache.log4j.{Level, LogManager, Logger}
import org.apache.spark.SparkConf

object PartitionRetrieval {
  var conf = new SparkConf().setAppName("Spark-JDBC")

  val properties = new Properties()
  properties.load(new FileInputStream("connection.properties"))

  val connectionUrl = properties.getProperty("gpDevUrl")
  val devUserName  = properties.getProperty("devUserName")
  val devPassword  = properties.getProperty("devPassword")
  val gpDriverClass = properties.getProperty("gpDriverClass")

  println("connectionUrl: " + connectionUrl)

  Class.forName(gpDriverClass).newInstance()

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().enableHiveSupport().config(conf).master("local[2]").getOrCreate()
    println("connectionUrl: " + connectionUrl)
  }
}
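As far as I understand, a relative path like "connection.properties" is resolved against the JVM's current working directory (the directory spark-submit is launched from), not against the jar. A throwaway sketch to check where such a path actually resolves (PathCheck is a hypothetical name, for illustration only):

import java.io.File

object PathCheck {
  def main(args: Array[String]): Unit = {
    // A relative path resolves against the current working directory,
    // i.e. wherever the JVM (here: spark-submit) was started from.
    val f = new File("connection.properties")
    println("Resolves to: " + f.getAbsolutePath)
    println("Exists: " + f.exists())
  }
}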
name := "YearPartition"
version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies ++= {
val sparkCoreVer = "2.2.0"
val sparkSqlVer = "2.2.0"
Seq(
"org.apache.spark" %% "spark-core" % sparkCoreVer % "provided" withSources(),
"org.apache.spark" %% "spark-sql" % sparkSqlVer % "provided" withSources(),
"org.json4s" %% "json4s-jackson" % "3.2.11" % "provided",
"org.apache.httpcomponents" % "httpclient" % "4.5.3"
)
}由于我没有将数据写入或保存到任何文件中,并试图显示属性文件的值,因此我使用以下代码执行代码:
SPARK_MAJOR_VERSION=2 spark-submit --class com.yearpartition.obj.PartitionRetrieval yearpartition_2.11-0.1.jar

But I get a FileNotFoundException, as shown below:
Caused by: java.io.FileNotFoundException: connection.properties (No such file or directory)

I tried to fix it, but in vain. Could anyone tell me what mistake I am making here, and how I can correct it?
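For reference, a minimal sketch of loading the file from the classpath instead of a filesystem path (this assumes connection.properties is moved under src/main/resources so that sbt packages it into the jar; ClasspathLoad is a hypothetical name):

import java.util.Properties

object ClasspathLoad {
  def main(args: Array[String]): Unit = {
    // Assumes connection.properties sits in src/main/resources, so sbt
    // bundles it into the jar and it is visible on the classpath.
    val in = getClass.getResourceAsStream("/connection.properties")
    require(in != null, "connection.properties not found on the classpath")
    val properties = new Properties()
    properties.load(in)
    in.close()
    println("connectionUrl: " + properties.getProperty("gpDevUrl"))
  }
}

Alternatively, the file could stay outside the jar and be shipped with spark-submit --files connection.properties, after which org.apache.spark.SparkFiles.get("connection.properties") should return a local path to it, though I have not verified this here.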
https://stackoverflow.com/questions/51464239