
HADOOP - Permission denied executing start-all.sh

Stack Overflow user
Asked on 2018-09-10 13:20:07
3 answers · 4.1K views · 0 following · Score 2

I installed Hadoop on my laptop, following this guide: https://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/

When I try to run start-all.sh, I get this:

vava@vava-ThinkPad:/usr/local/hadoop-3.1.1/sbin$ bash start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as vava in 10 seconds.

WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
pdsh@vava-ThinkPad: localhost: rcmd: socket: Permission denied
Starting datanodes
pdsh@vava-ThinkPad: localhost: rcmd: socket: Permission denied
Starting secondary namenodes [vava-ThinkPad]
pdsh@vava-ThinkPad: vava-ThinkPad: rcmd: socket: Permission denied
Starting resourcemanager
resourcemanager is running as process 3748.  Stop it first.
Starting nodemanagers
pdsh@vava-ThinkPad: localhost: rcmd: socket: Permission denied

I tried the answers from these questions, but nothing changed:

starting hadoop process using start-all.sh runs into issues

Hadoop permission issue

EDIT: After trying all the options, the only one that seems to work is export PDSH_RCMD_TYPE=ssh. Now the problem is with the namenode and the datanode: they do not start correctly:

vava@vava-ThinkPad:/usr/local/hadoop-3.1.1$ sbin/start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as vava in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
pdsh@vava-ThinkPad: localhost: ssh exited with exit code 1
Starting datanodes
localhost: ERROR: Cannot set priority of datanode process 10937
pdsh@vava-ThinkPad: localhost: ssh exited with exit code 1
Starting secondary namenodes [vava-ThinkPad]
Starting resourcemanager
Starting nodemanagers
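When the daemons fail like this, the real error is usually in the per-daemon log files rather than in the start-all.sh output. A minimal first step (the log path is an assumption based on the install prefix and username shown above) is:

```shell
# "Cannot set priority of datanode process" is a generic wrapper
# message; the underlying cause (bad JAVA_HOME, unwritable data
# directory, port already in use, ...) is reported a few lines
# earlier in the daemon's own log file.
tail -n 50 /usr/local/hadoop-3.1.1/logs/hadoop-vava-datanode-*.log
```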

3 Answers

Stack Overflow user

Answered on 2018-09-10 14:10:39

I would check:

  • export PDSH_RCMD_TYPE=ssh in the terminal
  • Local firewall settings
  • Running the command as root: sudo bash /usr/local/hadoop-3.1.1/sbin/start-all.sh
  • chmod -R 755 /usr/local/hadoop-3.1.1

Regarding your follow-up question:

  • Set JAVA_HOME in hadoop-env.sh and make sure all the other options in that file are correct
  • Change your user: "vava" in "Attempting to start all Apache Hadoop daemons as vava in 10 seconds." is wrong; try su -l hdfs and then run the script
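The PDSH_RCMD_TYPE and JAVA_HOME suggestions above can be sketched as follows; the JDK path below is an assumption (check your own with readlink -f "$(which java)"):

```shell
# Make the pdsh transport setting survive new shells; without it,
# pdsh may try rsh, which fails with "rcmd: socket: Permission denied".
echo 'export PDSH_RCMD_TYPE=ssh' >> ~/.bashrc

# hadoop-env.sh needs an explicit JAVA_HOME; the path below is an
# assumption -- substitute your machine's actual JDK location.
echo 'export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64' \
    >> /usr/local/hadoop-3.1.1/etc/hadoop/hadoop-env.sh
```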
Score 0

Stack Overflow user

Answered on 2018-09-12 02:54:41

Create a new file:

/etc/pdsh/rcmd_default

Write "ssh" into it, then save and exit. Make sure you press Enter so the file ends with a newline; otherwise you will keep getting ssh exiting with code 1.

echo "ssh" > /etc/pdsh/rcmd_default
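To verify that pdsh actually picked up the new default, -q makes pdsh print the option values it would use and exit (the host after -w is just a placeholder target):

```shell
# Should report "ssh" as the rcmd type once rcmd_default is in place.
pdsh -q -w localhost | grep -i rcmd
```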
Score 0

Stack Overflow user

Answered on 2019-03-07 16:39:03

In my case, I needed to make sure the RSA public key was copied to the current localhost.

ssh-copy-id -i /home/hadoop/.ssh/id_rsa.pub hadoop@localhost

This assumes you are logged in to the master node as "hadoop".
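If no key pair exists yet, the full sequence looks roughly like this (the "hadoop" user is the answer's assumption; use your own login otherwise):

```shell
# Generate a passphrase-less RSA key pair, as in the single-node
# tutorial; skip this step if ~/.ssh/id_rsa already exists.
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
# Install the public key for password-less login to localhost.
ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@localhost
# This should now succeed without a password prompt.
ssh hadoop@localhost exit
```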

Score 0
Original page content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/52258740
