
Unable to connect locally from IntelliJ to a kerberized HDFS cluster

Stack Overflow user
Asked on 2018-06-20 15:17:23
1 answer · 2.6K views · 0 followers · 1 vote

I am trying to connect locally to a kerberized HDFS cluster from IntelliJ installed on my laptop. The connection to HDFS goes through an edge node. I generated a keytab for the edge node and configured it in the code below. I can now log in to the edge node. However, when I try to access the HDFS data that sits on the namenode, it throws an error. Below is the Scala code that attempts to connect to HDFS:

import org.apache.spark.sql.SparkSession
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.security.UserGroupInformation
import java.security.PrivilegedExceptionAction
import java.io.PrintWriter

object DataframeEx {
  def main(args: Array[String]) {
    // $example on:init_session$
    val spark = SparkSession
      .builder()
      .master("local")
      .appName("Spark SQL basic example")
      .config("spark.some.config.option", "some-value")
      .getOrCreate()

    runHdfsConnect(spark)

    spark.stop()
  }

   def runHdfsConnect(spark: SparkSession): Unit = {

    System.setProperty("HADOOP_USER_NAME", "m12345")
    val path = new Path("/data/interim/modeled/abcdef")
    val conf = new Configuration()
    conf.set("fs.defaultFS", "hdfs://namenodename.hugh.com:8020")
    conf.set("hadoop.security.authentication", "kerberos")
    conf.set("dfs.namenode.kerberos.principal.pattern","hdfs/_HOST@HUGH.COM")

    UserGroupInformation.setConfiguration(conf);
    val ugi=UserGroupInformation.loginUserFromKeytabAndReturnUGI("m12345@HUGH.COM","C:\\Users\\m12345\\Downloads\\m12345.keytab");

    println(UserGroupInformation.isSecurityEnabled())
     ugi.doAs(new PrivilegedExceptionAction[String] {
       override def run(): String = {
         val fs= FileSystem.get(conf)
         val output = fs.create(path)
         val writer = new PrintWriter(output)
         try {
           writer.write("this is a test")
           writer.write("\n")
         }
         finally {
           writer.close()
           println("Closed!")
         }
          "done"
       }
     })
  }
}
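Incidentally, when running code like the above from a machine outside the cluster, the JVM also needs to be able to locate a Kerberos configuration. A minimal sketch, assuming a local copy of the cluster's krb5.conf (the path here is hypothetical), is to set the standard `java.security.krb5.conf` JVM property before any `UserGroupInformation` call:

```scala
object Krb5Setup {
  def main(args: Array[String]): Unit = {
    // Point the JVM at a local copy of the cluster's krb5.conf
    // (this path is hypothetical) before touching UserGroupInformation.
    System.setProperty("java.security.krb5.conf", "C:\\krb5\\krb5.conf")
    println(System.getProperty("java.security.krb5.conf"))
  }
}
```

The same property can alternatively be passed as `-Djava.security.krb5.conf=...` in the IntelliJ run configuration.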

I can log in to the edge node. However, when I try to write to HDFS (inside the doAs method), it throws the following error:

WARN Client: Exception encountered while connecting to the server : java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM
18/06/11 12:12:01 ERROR UserGroupInformation: PriviledgedActionException m12345@HUGH.COM (auth:KERBEROS) cause:java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM
18/06/11 12:12:01 ERROR UserGroupInformation: PriviledgedActionException as:m12345@HUGH.COM (auth:KERBEROS) cause:java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM; Host Details : local host is: "INMBP-m12345/172.29.155.52"; destination host is: "namenodename.hugh.com":8020; 
Exception in thread "main" java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM; Host Details : local host is: "INMBP-m12345/172.29.155.52"; destination host is: "namenodename.hugh.com":8020
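For context, before connecting, the client expands the `_HOST` placeholder in the configured namenode principal to the server's hostname and then validates the principal the server presents against it (Hadoop does this via `SecurityUtil.getServerPrincipal`). The helper below is a simplified illustration of that substitution, not the actual Hadoop code:

```scala
object PrincipalCheck {
  // Simplified stand-in for Hadoop's _HOST substitution: expand the
  // placeholder in "service/_HOST@REALM" to the actual server hostname.
  def expandPrincipal(pattern: String, host: String): String =
    pattern.replace("_HOST", host)

  def main(args: Array[String]): Unit = {
    val expanded = expandPrincipal("hdfs/_HOST@HUGH.COM", "namenodename.hugh.com")
    println(expanded) // hdfs/namenodename.hugh.com@HUGH.COM
  }
}
```

The error above means this validation failed: the principal `hdfs/namenodename.hugh.com@HUGH.COM` presented by the namenode did not satisfy the client-side configuration.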

If I log in to the edge node, do a kinit, and then access HDFS, it works fine. So why can't I access the HDFS namenode when I am able to log in to the edge node?

Let me know if any more details are needed from my side.


1 Answer

Stack Overflow user

Accepted answer

Posted on 2018-06-25 11:33:59

The configuration object was set up incorrectly. Here is what worked for me:

val conf = new Configuration()
conf.set("fs.defaultFS", "hdfs://namenodename.hugh.com:8020")
conf.set("hadoop.security.authentication", "kerberos")
conf.set("hadoop.rpc.protection", "privacy") // was missing this parameter
conf.set("dfs.namenode.kerberos.principal", "hdfs/_HOST@HUGH.COM") // this was initially wrongly set as dfs.namenode.kerberos.principal.pattern
1 vote
Original content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/50951656