I followed the Apache documentation and made all the required configuration changes, but when I submit I get the following error:
Exception encountered while connecting to the server : java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name
I would appreciate any help.
Here is my configuration:
## Begin Kerberos configuration in spark-defaults.conf
spark.history.kerberos.enabled true
spark.history.kerberos.principal spark/${HOSTNAME}@<REALM>
spark.history.kerberos.keytab ${KEYTAB_HOME}/spark_svc_principal.keytab
spark.yarn.keytab ${KEYTAB_HOME}/hadoop.keytab
spark.yarn.principal hadoop/${HOSTNAME}@<REALM>
spark.yarn.kerberos.relogin.period 1m
## End Kerberos configuration in spark-defaults.conf
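To rule out a keytab/principal mismatch, the keytabs referenced above can be inspected with klist before going further; this is only a diagnostic sketch, assuming ${KEYTAB_HOME} is set in the environment and the paths match the spark-defaults.conf fragment above.

```shell
# List the principals stored in each keytab and compare them against the
# spark.history.kerberos.principal / spark.yarn.principal values configured above.
klist -kt ${KEYTAB_HOME}/spark_svc_principal.keytab
klist -kt ${KEYTAB_HOME}/hadoop.keytab
```

If the principal names printed here do not match the configured values exactly (including host and realm), Kerberos authentication will fail before any Spark-level configuration matters.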
## Begin Kerberos configuration in hive-site.xml
<property>
<name>hive.metastore.kerberos.principal</name>
<value>hadoop/_HOST@REALM</value>
</property>
<property>
<name>hive.metastore.sasl.enabled</name>
<value>true</value>
</property>
<property>
<name>hive.metastore.kerberos.keytab.file</name>
<value>${KEYTAB_HOME}/hadoop.keytab</value>
</property>
<property>
<name>hive.server2.authentication</name>
<value>KERBEROS</value>
</property>
<property>
<name>hive.server2.authentication.kerberos.principal</name>
<value>hadoop/_HOST@REALM</value>
</property>
<property>
<name>hive.server2.authentication.kerberos.keytab</name>
<value>${KEYTAB_HOME}/hadoop.keytab</value>
</property>
<property>
<name>hive.server2.enable.doAs</name>
<value>true</value>
</property>
## End Kerberos configuration in hive-site.xml
I am not sure if I got this right. Here are the arguments I am launching with:
SPARK_SUBMIT_OPTS='-Xmx4g' \
${SPARK_HOME}/sbin/start-thriftserver.sh \
--executor-memory 5g \
--driver-cores 4 \
--num-executors 15
Posted on 2020-07-30 08:21:27
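One variant worth trying (a sketch, not a verified fix) is to pass the principal and keytab explicitly on the command line; start-thriftserver.sh forwards its options to spark-submit, which accepts --principal and --keytab. The path and realm below are placeholders taken from the configuration in the question.

```shell
# Same launch as above, but with the Kerberos identity given explicitly
# instead of relying on spark.yarn.principal / spark.yarn.keytab in
# spark-defaults.conf. Replace <REALM> and the keytab path with real values.
SPARK_SUBMIT_OPTS='-Xmx4g' \
${SPARK_HOME}/sbin/start-thriftserver.sh \
  --principal hadoop/${HOSTNAME}@<REALM> \
  --keytab ${KEYTAB_HOME}/hadoop.keytab \
  --executor-memory 5g \
  --driver-cores 4 \
  --num-executors 15
```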
AFAIK, the following approach should solve your problem, where
KERBEROS_KEYTAB_PATH=/home/user/user.keytab and KERBEROS_PRINCIPAL=user@NAME.COM.
Method 1: using the kinit command
Step 1: Run kinit, then proceed with spark-submit:
kinit -kt ${KERBEROS_KEYTAB_PATH} ${KERBEROS_PRINCIPAL}
Step 2: Run klist and verify that a Kerberos ticket was obtained for the logged-in user:
Ticket cache: FILE:/tmp/krb5cc_XXXXXXXXX_XXXXXX
Default principal: user@NAME.COM
Valid starting Expires Service principal
07/30/2020 15:52:28 07/31/2020 01:52:28 krbtgt/NAME.COM@NAME.COM
renew until 08/06/2020 15:52:28
Step 3: In your Spark code, create the SparkSession with Hive support enabled:
val sparkSession = SparkSession.builder().config(sparkConf).appName("TEST1").enableHiveSupport().getOrCreate()
Step 4: Run spark-submit as follows:
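Expanded into a minimal self-contained sketch, the one-line builder above might look as follows; the class name matches the --class argument used below, and the job logic itself is left out. On a Kerberized cluster the authentication comes from the --principal/--keytab options passed to spark-submit, not from this code.

```scala
package com.test.load

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object Data {
  def main(args: Array[String]): Unit = {
    // Build a SparkConf and a Hive-enabled SparkSession; enableHiveSupport()
    // makes Spark read hive-site.xml and talk to the (Kerberized) metastore.
    val sparkConf = new SparkConf()
    val sparkSession = SparkSession.builder()
      .config(sparkConf)
      .appName("TEST1")
      .enableHiveSupport()
      .getOrCreate()

    // ... job logic ...

    sparkSession.stop()
  }
}
```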
$SPARK_HOME/bin/spark-submit --class com.test.load.Data \
--master yarn \
--deploy-mode cluster \
--driver-memory 2g \
--executor-memory 2g \
--executor-cores 2 --num-executors 2 \
--conf "spark.driver.cores=2" \
--name "TEST1" \
--principal ${KERBEROS_PRINCIPAL} \
--keytab ${KERBEROS_KEYTAB_PATH} \
--conf spark.files=$SPARK_HOME/conf/hive-site.xml \
/home/user/sparkproject/Test-jar-1.0.jar 2> /home/user/logs/test1.log
Edit 1:
<property>
<name>hive.server2.authentication</name>
<value>KERBEROS</value>
</property>
<property>
<name>hive.server2.authentication.kerberos.keytab</name>
<value>/home/user/user.keytab</value>
</property>
<property>
<name>hive.server2.authentication.kerberos.principal</name>
<value>hive/_HOST@NAME.COM</value>
</property>
<property>
<name>hive.server2.authentication.ldap.Domain</name>
<value>NAME.COM</value>
</property>
<property>
<name>hive.server2.enable.doAs</name>
<value>true</value>
</property>
<property>
<name>hive.server2.keystore.password</name>
<value>password</value>
</property>
<property>
<name>hive.server2.keystore.path</name>
<value>/home/user/hive-jks/hive.jks</value>
</property>
https://stackoverflow.com/questions/63162408
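With HiveServer2 running under the Kerberos settings from Edit 1, a client authenticates by putting the server principal in the JDBC URL. This is a sketch: the host and port are placeholders, the principal is the one from the hive-site.xml above, and a valid ticket (via kinit) must already be in the client's cache.

```shell
# Connect to the Kerberized HiveServer2; "principal=" names the SERVER's
# principal, while the client identity comes from the local ticket cache.
beeline -u "jdbc:hive2://<hs2-host>:10000/default;principal=hive/_HOST@NAME.COM"
```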