Hadoop "Unable to load native-hadoop library for your platform" warning


Solution 1

I assume you're running Hadoop on 64-bit CentOS. The reason you see that warning is that the native Hadoop library $HADOOP_HOME/lib/native/libhadoop.so.1.0.0 was actually compiled for 32-bit.

Anyway, it's just a warning and won't impact Hadoop's functionality.

If you do want to eliminate this warning, download the Hadoop source code, recompile libhadoop.so.1.0.0 on a 64-bit system, and replace the 32-bit one.
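Before recompiling, it's worth checking what the shipped binary actually is. A quick sketch using file(1); the path is an assumption based on a typical tarball install under $HADOOP_HOME, so adjust it to your layout:

```shell
# Inspect the shipped native library; "ELF 32-bit" vs "ELF 64-bit"
# in the output tells you whether a rebuild is needed.
LIB="${HADOOP_HOME:-/usr/local/hadoop}/lib/native/libhadoop.so.1.0.0"
if [ -e "$LIB" ]; then
  file "$LIB"    # e.g. "ELF 64-bit LSB shared object, x86-64, ..."
else
  echo "library not found at $LIB"
fi
```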

Steps on how to recompile the source code are included here for Ubuntu:

Good luck.

Solution 2

Just append the word native to your HADOOP_OPTS like this:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
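After setting the path, you can ask Hadoop itself which native libraries it managed to load. The hadoop checknative subcommand ships with Hadoop 2.x; the guard below just makes the snippet safe to run on machines where hadoop isn't on the PATH:

```shell
# Report which native components (hadoop, zlib, snappy, ...) were loaded.
if command -v hadoop >/dev/null 2>&1; then
  hadoop checknative -a
else
  echo "hadoop not on PATH; skipping check"
fi
```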

PS: Thanks to Searene

Solution 3

The answer depends... I just installed Hadoop 2.6 from the tarball on 64-bit CentOS 6.6. The Hadoop install did indeed come with a prebuilt 64-bit native library. For my install, it is here:

/opt/hadoop/lib/native/libhadoop.so.1.0.0

And I know it is 64-bit:

[hadoop@VMWHADTEST01 native]$ ldd libhadoop.so.1.0.0
./libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)
linux-vdso.so.1 =>  (0x00007fff43510000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007f9be553a000)
libc.so.6 => /lib64/libc.so.6 (0x00007f9be51a5000)
/lib64/ld-linux-x86-64.so.2 (0x00007f9be5966000)

Unfortunately, I stupidly overlooked the answer staring me right in the face as I was focused on, "Is this library 32 or 64 bit?":

`GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)

So, lesson learned. Anyway, the rest at least led me to being able to suppress the warning. I continued and did everything recommended in the other answers to provide the library path via the HADOOP_OPTS environment variable, to no avail. So I looked at the source code. The module that generates the error gives you the hint (util.NativeCodeLoader):

15/06/18 18:59:23 WARN util.NativeCodeLoader: Unable to load native-hadoop    library for your platform... using builtin-java classes where applicable

So, off to here to see what it does:

http://grepcode.com/file/repo1.maven.org/maven2/com.ning/metrics.action/0.2.6/org/apache/hadoop/util/NativeCodeLoader.java/

Ah, there is some debug-level logging - let's turn that on and see if we get some additional help. This is done by adding the following line to the $HADOOP_CONF_DIR/log4j.properties file:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG

Then I ran a command that generates the original warning, like stop-dfs.sh, and got this goodie:

15/06/18 19:05:19 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /opt/hadoop/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)

And the answer is revealed in this snippet of the debug message (the same thing that the previous ldd command 'tried' to tell me):

`GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)

What version of GLIBC do I have? Here's a simple trick to find out:

[hadoop@VMWHADTEST01 hadoop]$ ldd --version
ldd (GNU libc) 2.12
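ldd --version works because ldd ships with glibc; on glibc systems, getconf gives the same answer. Both assume a GNU/Linux host:

```shell
# Two ways to read the glibc version on a GNU/Linux host.
ldd --version | head -n1        # e.g. "ldd (GNU libc) 2.12"
getconf GNU_LIBC_VERSION        # e.g. "glibc 2.12"
```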

So, I can't update my OS's glibc to 2.14. The only options are to build the native libraries from source on my OS, or to suppress the warning and just ignore it for now. I opted to suppress the annoying warning for now (but do plan to build from source in the future) by using the same logging option we used to get the debug message, except now at ERROR level:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR

I hope this helps others see that a big benefit of open source software is that you can figure this stuff out if you take some simple logical steps.

Solution 4

I had the same issue. It was solved by adding the following lines to .bashrc:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

Solution 5

In my case, after I built Hadoop on my 64-bit Linux Mint OS, I replaced the native library in hadoop/lib. The problem still persisted. Then I figured out that Hadoop was pointing to hadoop/lib, not to hadoop/lib/native. So I just moved all the content from the native library to its parent, and the warning was gone.
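What Solution 5 describes can be sketched like this. A scratch directory stands in for a real $HADOOP_HOME so the snippet is safe to try; on a real install, back up lib/ first:

```shell
# Scratch layout standing in for a real install (hypothetical paths).
DEMO_HOME=$(mktemp -d)
mkdir -p "$DEMO_HOME/lib/native"
touch "$DEMO_HOME/lib/native/libhadoop.so.1.0.0"

# The fix: copy everything from lib/native up into lib/ so a loader
# that only searches lib/ can find the library.
cp "$DEMO_HOME/lib/native/"* "$DEMO_HOME/lib/"

ls "$DEMO_HOME/lib/libhadoop.so.1.0.0"   # now present in the parent dir
```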



Author: Olshansk

Updated on July 08, 2022

Comments

  • Olshansk
    Olshansk almost 2 years

    I'm currently configuring hadoop on a server running CentOs. When I run start-dfs.sh or stop-dfs.sh, I get the following error:

    WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

    I'm running Hadoop 2.2.0.

    Doing a search online brought up this link: http://balanceandbreath.blogspot.ca/2013/01/utilnativecodeloader-unable-to-load.html

    However, the contents of /native/ directory on hadoop 2.x appear to be different so I am not sure what to do.

    I've also added these two environment variables in hadoop-env.sh:

    export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/"

    export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native/"

    Any ideas?

    • Greg Dubicki
      Greg Dubicki almost 9 years
      For searchability: this problem also applies at least to Hadoop 2.4.0, Hadoop 2.4.1 and probably other versions.
    • James Moore
      James Moore about 7 years
      Documentation for how to use native libraries is at hadoop.apache.org/docs/current/hadoop-project-dist/…
  • Akshay Hazari
    Akshay Hazari about 10 years
    Doesn't Work for me. Gives me the same Unable to load native-hadoop library for your platform error.
  • Akshay Hazari
    Akshay Hazari about 10 years
I just happened to have tried everything on the net. I got tired and just emptied all the files in the lib folder itself, i.e. the ones compiled using the links provided in the above answer. Finally, I don't know why, despite the downvotes you've got, I tried your suggestion and it worked after a tremendous struggle I put up for a day behind all this. It didn't matter whether I changed the native library location in .bashrc or hadoop-env.sh. Thanks a tonne.
  • Akshay Hazari
    Akshay Hazari about 10 years
    I got tired and just emptied all the native folder files in the lib folder itself i.e the ones compiled using the links provided in the above answer (native folder in the new hadoop-2.4.0-src.tar.gz.)
  • WattsInABox
    WattsInABox almost 10 years
    Even if this doesn't exactly work, it's still helpful. So will this impact performance, at all?
  • KunBetter
    KunBetter almost 10 years
Yes, you should recompile the 64-bit lib/native from the Hadoop source.
  • sandip divekar
    sandip divekar over 9 years
    I am using same hadoop 2.5.0 tar on Centos 7 and Centos 6.5. Both are 64 bit OS. There is no such warning on Centos7 but Centos 6.5 gives me this warning, why ?
  • Abbas Gadhia
    Abbas Gadhia about 9 years
    Cloudera distributions are many a time behind the current versions available for many of the packages. if you want "latest and greatest", Apache Hadoop is the way to go
  • Kaushik Lele
    Kaushik Lele almost 9 years
Thanks. I did not realize that it is a warning. It actually says "starting namenode" and the last sentence is "Unable to load native-hadoop ..", which caused fear.
  • Greg Dubicki
    Greg Dubicki almost 9 years
    Note that you actually don't have to compile whole Hadoop, as the instructions suggest - hadoop-common-project/hadoop-common and hadoop-hdfs-project/hadoop-hdfs is enough.
  • Ala' Alnajjar
    Ala' Alnajjar over 8 years
    I had to add "/native" to HADOOP_OPTS value
  • borice
    borice over 8 years
    In my case I needed both: export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH and export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH
  • ParagFlume
    ParagFlume over 8 years
I am using version hadoop-2.6.0 on my local system. I was also facing the same issue. Then I downloaded the hadoop-2.7.1-src, built the binary and native libraries, and replaced the hadoop-2.6.0 native libraries with the newly built natives. But I was still getting the same errors. Then I export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH and it worked for me.
  • pelazem
    pelazem over 8 years
    This did it for me also. On Ubuntu with Hadoop 2.6, the path was /home/user/hadoop-2.6.0/lib/native
  • arcee123
    arcee123 almost 8 years
    Thanks Philip. This solution worked perfect. In my case, All I needed was the option Djava.library.path. That was exactly what I was looking for. Thanks!!!
  • Searene
    Searene almost 8 years
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
  • dogwynn
    dogwynn almost 8 years
    Thank you sir for this beautifully detailed answer. I got my answer and learned something valuable (a few somethings) in the process.
  • ggorantl
    ggorantl about 7 years
    Thanks a lot.I have bzip2: false , openssl: false build does not support openssl. The others have path showing up. Any suggestions.
  • Shrikant Prabhu
    Shrikant Prabhu over 6 years
    for me it was just glitch in terminal when it had resumed the session after a mac restart. Quit the terminal app and restarted again ..error was gone and was able to start spark-shell.
  • Hoai-Thu Vuong
    Hoai-Thu Vuong almost 6 years
I think the two solutions are the same. According to the docs, java.library.path is a list of paths to search when loading libraries. So you can either export LD_LIBRARY_PATH or use the -D option on the java command line, where -D<property>=value sets a system property.
  • ldmtwo
    ldmtwo almost 5 years
    The link is broken
  • ldmtwo
    ldmtwo almost 5 years
    The file might be 64bit, which is the standard now. To find out: stackoverflow.com/questions/19444904/…
  • Eric
    Eric almost 5 years
Thanks. If you override LD_LIBRARY_PATH in order to use Tomcat APR, just append the hadoop native path as `export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/lib/hadoop/lib/native`.
  • sailfish009
    sailfish009 over 4 years
this is the only solution that works for me (tried all the other answers).
  • Erol Erdogan
    Erol Erdogan almost 4 years
    this is the correct solution for me. It fixed the warning
  • blkpingu
    blkpingu about 3 years
    doesn't work for me. added /native to HADOOP_OPTS in .zshrc and sourced it, no dice