Permission Denied error while running start-dfs.sh


Solution 1

I ran into the same thing. After digging around, I found that my pdsh default rcmd was rsh, not ssh. The two use different remote-login authentication: when installing Hadoop I had configured passwordless ssh login to localhost, but passwordless rsh login was not set up.

So, try the following:

1. Check pdsh's default rcmd type

pdsh -q -w localhost

The output shows which rcmd type pdsh uses by default.

2. Modify pdsh's default rcmd to ssh

export PDSH_RCMD_TYPE=ssh

You can add this line to ~/.bashrc and then run source ~/.bashrc so that it persists across sessions (a combined sketch follows step 3).

3. Run sbin/start-dfs.sh again
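
Putting the three steps together, a minimal sketch might look like this (it assumes a Linux shell, that your pdsh honours the PDSH_RCMD_TYPE environment variable, and that HADOOP_HOME points at your Hadoop installation):

pdsh -q -w localhost | grep -i rcmd        # check which rcmd module pdsh currently reports
echo 'export PDSH_RCMD_TYPE=ssh' >> ~/.bashrc
source ~/.bashrc
"$HADOOP_HOME"/sbin/start-dfs.sh

If the first command still reports rsh after setting the variable, make sure the export is in effect in the shell that launches start-dfs.sh.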

Solution 2

Uninstalling pdsh solved this problem for me. I am working with Hadoop 3.2.1 on Ubuntu 18.04.4 LTS.

I tested start-dfs.sh on several freshly installed virtual machines and one old VM. The command failed only on the old VM. I tried the highly voted answer and found that only the old VM had pdsh installed, so I uninstalled it, and after that the command executed successfully.

So, if you did not install pdsh for a specific purpose, you can try uninstalling it.
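
If you are not sure whether pdsh is even present, you can check before deciding to remove it; Hadoop 3's start scripts fall back to plain ssh when pdsh is not installed. A quick check (the dpkg query assumes a Debian/Ubuntu system):

which pdsh
dpkg -s pdsh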

Solution 3

Try uninstalling pdsh

sudo apt-get remove pdsh

and then restart Hadoop with:

sudo start-dfs.sh

This is what worked for me.
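
After the restart, you can confirm that the HDFS daemons actually came up by listing the running Java processes with jps (shipped with the JDK); on a single-node setup you would expect to see entries such as NameNode, DataNode, and SecondaryNameNode:

jps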

Solution 4

A sudo user can also change pdsh's default rcmd type to ssh with the command:

echo "ssh" | sudo tee /etc/pdsh/rcmd_default


Updated on July 09, 2022

Comments

  • Gaurav A Dubey, almost 2 years ago

    I am getting this error while running start-dfs.sh:

    Starting namenodes on [localhost]
    pdsh@Gaurav: localhost: rcmd: socket: Permission denied
    Starting datanodes
    pdsh@Gaurav: localhost: rcmd: socket: Permission denied
    Starting secondary namenodes [Gaurav]
    pdsh@Gaurav: Gaurav: rcmd: socket: Permission denied
    2017-03-13 09:39:29,559 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    

    I am using the Hadoop 3.0 alpha 2 version.

    Any help is appreciated.

  • Gaurav A Dubey, about 7 years ago
    Hi! Thanks for the help. The problem was occurring with the 3.0 alpha 2 version; when I repeated the process with Hadoop 2.7, it solved the problem.
  • Adi Krishnan, over 3 years ago
    This worked for me too. pdsh is not an absolute requirement. From the docs: "Additionally, it is recommended that pdsh also be installed for better ssh resource management."
  • Erol Erdogan, over 3 years ago
    You're great. It definitely worked for me. People probably set it up over ssh, but this setting was rsh. This is the correct solution.
  • Lawhatre, over 2 years ago
    After removing pdsh I got "ERROR: JAVA_HOME is not set and could not be found." I don't think it was meant to be removed.
  • Gary Wang, over 2 years ago
    @Lawhatre JAVA_HOME can be set explicitly in $HADOOP_HOME/etc/hadoop/hadoop-env.sh, $HOME/.bashrc, or /etc/profile.d/hadoop.sh. Make sure a JDK/JRE is installed on your system.
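
Following up on the JAVA_HOME comment above, one way to set it is to append an export line to hadoop-env.sh. This is only a sketch: the JDK path below is an illustration and depends on your distribution and Java version (readlink can help you find the real path on your machine):

readlink -f "$(which java)"
echo 'export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64' >> "$HADOOP_HOME"/etc/hadoop/hadoop-env.sh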