hadoop fs -ls: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException
Solution 1
The NameNode RPC service is not listening on port 50075 (that is the DataNode web UI port). To find the port HDFS is actually using, run the following command on Linux:
hdfs getconf -confKey fs.default.name
You will get output something like:
hdfs://hmaster:54310
Correct your URL accordingly.
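As a sketch of putting that together (assuming a working Hadoop client; the hard-coded hmaster:54310 value simply stands in for the example output above), you can feed the discovered URI straight back into the listing command:

```shell
# Stand-in for what `hdfs getconf -confKey fs.default.name` would print
# on this example cluster; on a real cluster use the commented lookup.
NN_URI="hdfs://hmaster:54310"
# NN_URI=$(hdfs getconf -confKey fs.default.name)

# Strip the scheme if you only need host:port:
echo "${NN_URI#hdfs://}"    # prints: hmaster:54310

# List the root directory against the correct RPC endpoint:
# hadoop fs -ls "$NN_URI"/
```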
Solution 2
In your Cloudera Manager, check the NameNode configuration for the item "NameNode Service RPC Port" (dfs.namenode.servicerpc-address). Use that port number in the URL, and it should work fine.
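If you are not using Cloudera Manager, the same property can be found in hdfs-site.xml; a minimal sketch, where the hostname and port 8022 are illustrative assumptions (check your own cluster for the real values):

```xml
<!-- hdfs-site.xml: illustrative values only, not from the answer above -->
<property>
  <name>dfs.namenode.servicerpc-address</name>
  <value>hmaster:8022</value>
</property>
```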
Solution 3
Is your NameNode actually running on port 50075? You don't have to specify a full URI if you just want to list the directories. Simply use hadoop fs -ls / to list everything under your root directory.
Solution 4
In /usr/local/hadoop/etc/hadoop/core-site.xml, use 0.0.0.0 in place of localhost, i.e. change
<value>hdfs://localhost:50075</value>
to
<value>hdfs://0.0.0.0:50075</value>
This solved the problem for me.
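For context, that value normally lives inside a property element in core-site.xml, so the full entry would look roughly like this; the property name fs.default.name (fs.defaultFS on newer releases) is assumed from a standard Hadoop setup and is not part of the answer itself:

```xml
<!-- /usr/local/hadoop/etc/hadoop/core-site.xml -->
<!-- Property name assumed; only the <value> change is from the answer. -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://0.0.0.0:50075</value>
</property>
```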
Tampa
Updated on July 05, 2022
Comments
-
Tampa, almost 2 years ago
I am trying to use the below to list my dirs in hdfs:
ubuntu@ubuntu:~$ hadoop fs -ls hdfs://127.0.0.1:50075/
ls: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is: "ubuntu/127.0.0.1"; destination host is: "ubuntu":50075;
Here is my /etc/hosts file
127.0.0.1 ubuntu localhost
#127.0.1.1 ubuntu
# The following lines are desirable for IPv6 capable hosts
::1 ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
How do I properly use hdfs:// to list my dirs?
I am using Cloudera 4.3 on Ubuntu 12.04.
-
Open Food Broker, over 6 years ago
INFO Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS