UnsatisfiedLinkError (NativeIO$Windows.access0) when submitting mapreduce job to hadoop 2.2 from windows to ubuntu
Solution 1
- Get hadoop.dll (or libhadoop.so on *nix). Make sure its bitness (32- vs. 64-bit) matches your JVM's.
- Make sure it is available via PATH or java.library.path.

Note that setting java.library.path overrides PATH. If you set java.library.path, make sure it is correct and contains the Hadoop library.
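A quick way to see what the JVM actually resolves is a tiny probe (a minimal sketch; the class name is made up): it prints the effective java.library.path and tries to load the Hadoop native library by its base name, which maps to hadoop.dll on Windows and libhadoop.so on Linux.

```java
public class NativeLibCheck {
    // Try to load the Hadoop native library from java.library.path.
    // Returns "FOUND" if it resolves, "NOT FOUND" otherwise.
    static String probe() {
        try {
            // Base name "hadoop" resolves to hadoop.dll on Windows,
            // libhadoop.so on Linux.
            System.loadLibrary("hadoop");
            return "FOUND";
        } catch (UnsatisfiedLinkError e) {
            return "NOT FOUND";
        }
    }

    public static void main(String[] args) {
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        System.out.println("hadoop native library: " + probe());
    }
}
```

If this prints NOT FOUND, the UnsatisfiedLinkError at job submission is expected: fix the path before touching the job code.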
Solution 2
This error generally occurs due to a mismatch of the binary files in your %HADOOP_HOME%\bin folder. Get hadoop.dll and winutils.exe built specifically for your Hadoop version and copy them to %HADOOP_HOME%\bin.
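A small sanity check can confirm the two Windows helper binaries are actually where Hadoop will look for them (a sketch; the class name is made up, and the directory layout assumed is the standard %HADOOP_HOME%\bin):

```java
import java.io.File;

public class WinutilsCheck {
    // Report whether winutils.exe and hadoop.dll exist under <home>\bin.
    // hadoopHome may be null when the environment variable is unset.
    static String check(String hadoopHome) {
        if (hadoopHome == null) {
            return "HADOOP_HOME is not set";
        }
        StringBuilder sb = new StringBuilder();
        for (String name : new String[] {"winutils.exe", "hadoop.dll"}) {
            File f = new File(hadoopHome + File.separator + "bin", name);
            sb.append(name).append(": ")
              .append(f.isFile() ? "present" : "MISSING").append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(check(System.getenv("HADOOP_HOME")));
    }
}
```

Note that "present" does not prove the files match your Hadoop version or bitness; it only rules out the simplest case of them being missing entirely.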
padmalcom
Updated on June 05, 2022

Comments
-
padmalcom almost 2 years
I submit my MapReduce jobs from a Java application running on Windows to a Hadoop 2.2 cluster running on Ubuntu. In Hadoop 1.x this worked as expected, but on Hadoop 2.2 I get a strange error:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
I compiled the necessary Windows libraries (hadoop.dll and winutils.exe) and can access HDFS via code and read the cluster information using the Hadoop API. Only the job submission does not work.
Any help is appreciated.
Solution: I found it out myself: the path where the Windows Hadoop binaries are located has to be added to the PATH variable on Windows.
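A related fix can also be applied programmatically (a sketch; JobSubmitSetup and the install path are placeholders): Hadoop's Shell utilities honor the hadoop.home.dir system property as an alternative to the HADOOP_HOME environment variable, while the folder containing hadoop.dll still has to be on PATH (or java.library.path) before the JVM starts.

```java
public class JobSubmitSetup {
    // Point Hadoop's Shell utilities at the Windows binaries before any
    // Hadoop class initializes: Shell checks the hadoop.home.dir system
    // property first, then HADOOP_HOME, and expects winutils.exe in
    // <home>\bin.
    static void setHadoopHome(String dir) {
        System.setProperty("hadoop.home.dir", dir);
    }

    public static void main(String[] args) {
        // "C:\\hadoop-2.2.0" is a placeholder; use your actual install path.
        setHadoopHome("C:\\hadoop-2.2.0");
        // Caveat: java.library.path is read once at JVM startup, so the
        // folder containing hadoop.dll must already be on PATH (or be
        // supplied via -Djava.library.path=...) when the JVM is launched.
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```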
-
ǨÅVËĔŊ RĀǞĴĄŅ almost 9 years
Hi, add the msvcr100.dll file to the '${HADOOP_HOME}\bin' path; I faced the same problem.
-
centic over 8 years
I think the answer at stackoverflow.com/a/23959201/411846 might help you here; it shows how you can check whether some MSVC system libraries are missing on your box.
-
centic over 8 years
Possible duplicate of Running Apache Hadoop 2.1.0 on Windows