Find port number where HDFS is listening

46,059

Solution 1

The command below is available in Apache Hadoop 2.7.0 onwards and can be used to retrieve the values of Hadoop configuration properties. fs.default.name was deprecated in Hadoop 2.0; fs.defaultFS is the replacement. I am not sure whether this will work in the case of maprfs.

hdfs getconf -confKey fs.defaultFS  # ( new property ) 

or

hdfs getconf -confKey fs.default.name    # ( old property ) 

I am not sure whether any command-line utility is available for retrieving configuration property values in MapR or in Hadoop 0.20. In that situation, you can retrieve the value of a configuration property from Java instead:

Configuration conf = new Configuration();
System.out.println(conf.get("fs.default.name"));
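Once you have the fs.defaultFS value (from the Java API or from hdfs getconf), you still need to split it into host and port. A minimal sketch using only java.net.URI, assuming the value looks like hdfs://machine-name:8020 (the hostname and the 8020 fallback below are example values, not settings read from any cluster):

```java
import java.net.URI;

public class DefaultFsParser {
    // Returns the port from a fs.defaultFS-style URI, falling back to
    // 8020 (the historical HDFS namenode default) when none is given.
    static int port(String defaultFs) {
        int p = URI.create(defaultFs).getPort();
        return p == -1 ? 8020 : p;
    }

    // Returns the host part of the URI.
    static String host(String defaultFs) {
        return URI.create(defaultFs).getHost();
    }

    public static void main(String[] args) {
        String fs = "hdfs://machine-name:8020";  // example value only
        System.out.println(host(fs) + ":" + port(fs));  // machine-name:8020
    }
}
```

Note that URI.getPort() returns -1 when the URI omits an explicit port, which is why the fallback is needed for values like hdfs://machine-name.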

Solution 2

fs.default.name is deprecated.

Use: hdfs getconf -confKey fs.defaultFS

Solution 3

I came across this answer while looking for the HDFS URI, which is generally a URL pointing to the namenode. While hdfs getconf -confKey fs.defaultFS gave me the name of the nameservice, that alone did not help me build the HDFS URI.

I tried the command below to get a list of the namenodes instead:

 hdfs getconf -namenodes

This gave me a list of all the namenodes, primary first, followed by the secondary. After that, constructing the HDFS URI was simple:

hdfs://<primarynamenode>/
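The steps above can be sketched in Java: take the space-separated output of hdfs getconf -namenodes, pick the first entry (the primary), and form the URI. The hostnames below are illustrative, not real namenodes:

```java
public class HdfsUriFromNamenodes {
    // Builds an HDFS URI from the raw output of `hdfs getconf -namenodes`,
    // which lists namenode hostnames separated by whitespace, primary first.
    static String buildUri(String namenodesOutput) {
        String primary = namenodesOutput.trim().split("\\s+")[0];
        return "hdfs://" + primary + "/";
    }

    public static void main(String[] args) {
        // Example output only; run `hdfs getconf -namenodes` on a real cluster.
        String output = "nn1.example.com nn2.example.com";
        System.out.println(buildUri(output));  // hdfs://nn1.example.com/
    }
}
```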
Author: ernesto (developer)

Updated on March 20, 2020

Comments

  • ernesto about 4 years

    I want to access hdfs with fully qualified names such as :

    hadoop fs -ls hdfs://machine-name:8020/user
    

    I could also simply access hdfs with

    hadoop fs -ls /user
    

    However, I am writing test cases that should work on different distributions (HDP, Cloudera, MapR, etc.), which involves accessing hdfs files with qualified names.

    I understand that hdfs://machine-name:8020 is defined in core-site.xml as fs.default.name. But this seems to differ between distributions. For example, hdfs is maprfs on MapR. IBM BigInsights doesn't even have a core-site.xml in $HADOOP_HOME/conf.

    There doesn't seem to be a way for hadoop to tell me what's defined in fs.default.name via its command-line options.

    How can I reliably get the value defined in fs.default.name from the command line?

    The test will always be running on the namenode, so the machine name is easy. But getting the port number (8020) is a bit difficult. I tried lsof and netstat, but still couldn't find a reliable way.

  • ernesto over 9 years
    Is the command line utility "hdfs" not available in earlier hadoop versions?
  • SachinJose over 9 years
    The hdfs command was introduced in Hadoop 2.x; it is not there in 1.x. You may try the second option in that case.
  • Nabeel Moidu about 9 years
    Use hadoop conf -key fs.default.name for pre 2.x versions.
  • SachinJose over 6 years
    @eiram_mahera The hdfs getconf subcommand was added only in Hadoop 2.7.0; it is not available in older Hadoop versions. Which Hadoop version are you using?
  • eiram_mahera over 6 years
    @sachin Oh! I am using Hadoop 2.6.5. Is there a way to find the namenode and port number in this version?
  • SachinJose over 6 years
    I updated the Hadoop version in the answer. You need to use the Java API in that case.
  • Ermolai almost 2 years
    22/06/23 12:24:57 INFO Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS