To connect to Hadoop using Java


Solution 1

It depends on what you mean by Hadoop. Hadoop can store data in many ways: it can be a plain file in HDFS (the Hadoop Distributed File System), or a table in Hive or HBase (a minimal Hive sketch follows the Maven dependencies below). Here is the simplest code to read a file from HDFS:

import org.apache.commons.io.IOUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import java.net.URISyntaxException;

public class HdfsFileReader {

    private static final String NAME_NODE = "hdfs://nameNodeHost:8020"; // nameNodeHost = localhost if you run Hadoop in local mode

    public static void main(String[] args) throws URISyntaxException, IOException {
        String fileInHdfs = args[0];
        FileSystem fs = FileSystem.get(new URI(NAME_NODE), new Configuration());
        // open the file and read its whole content as a UTF-8 string,
        // closing the stream when done
        try (InputStream in = fs.open(new Path(fileInHdfs))) {
            String fileContent = IOUtils.toString(in, "UTF-8");
            System.out.println("File content - " + fileContent);
        }
    }

}
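
Compile it against the dependencies below and run it with the HDFS path of the file as the only argument; the jar name here is just an example:

Sample: java -jar hdfs-file-reader.jar /user/cloudera/some_file.txt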

Maven dependencies you need:

<dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
    <version>2.4</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.0</version>
</dependency>
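
If you need a table in Hive instead, one option is the Hive JDBC driver (the org.apache.hive:hive-jdbc artifact). A minimal sketch, assuming a HiveServer2 instance at hiveServerHost:10000 and a table named my_table (both placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveTableReader {

    // hiveServerHost and my_table are placeholders - adjust to your cluster
    private static final String HIVE_URL = "jdbc:hive2://hiveServerHost:10000/default";

    public static void main(String[] args) throws Exception {
        // register the Hive JDBC driver (older driver versions need this explicitly)
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(HIVE_URL, "hive", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM my_table LIMIT 10")) {
            while (rs.next()) {
                System.out.println(rs.getString(1)); // first column of each row
            }
        }
    }
}

HBase has its own Java client API (org.apache.hadoop.hbase.client), which works along similar lines.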

Solution 2

This code targets the Cloudera QuickStart Docker image. It pushes files from the local file system to HDFS and pulls them back. It needs to be exported as a JAR file and run on the command line.

Sample: java -jar connect_hdfs.jar /local_file.txt push /hdfs_dir_location/

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    import java.io.IOException;
    import java.net.URISyntaxException;

    public class Main {
        private static final String NAME_NODE = "hdfs://quickstart.cloudera:8020";

        public static void main(String[] args) throws URISyntaxException, IOException {
            if (args.length != 3) {
                throw new IllegalArgumentException("Must include inputs: source file location, action "
                        + "(push or pull), and target file location");
            }

            String sourceLocation = args[0];
            String action = args[1];
            String targetLocation = args[2];

            Configuration configuration = new Configuration();
            configuration.set("fs.defaultFS", NAME_NODE);

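            // Setting these implementations explicitly is a common workaround for
            // "No FileSystem for scheme: hdfs" errors that appear when everything
            // is packed into a single jar and the META-INF/services files collide.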
            configuration.set("fs.hdfs.impl", 
                    org.apache.hadoop.hdfs.DistributedFileSystem.class.getName()
                );
            configuration.set("fs.file.impl",
                    org.apache.hadoop.fs.LocalFileSystem.class.getName()
                );

            FileSystem hdfsFileSystem = FileSystem.get(configuration);

            if (action.equals("push")) {
                // local file system -> HDFS
                hdfsFileSystem.copyFromLocalFile(new Path(sourceLocation), new Path(targetLocation));
            } else if (action.equals("pull")) {
                // HDFS -> local file system (keep the source, use the raw local FS)
                hdfsFileSystem.copyToLocalFile(false, new Path(sourceLocation), new Path(targetLocation), true);
            } else {
                throw new IllegalArgumentException("Action must be either push or pull");
            }
        }
    }
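
The pull direction works the same way, with the HDFS file as the source (paths here are illustrative, like the push sample above):

Sample: java -jar connect_hdfs.jar /hdfs_dir_location/local_file.txt pull /local_target_dir/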

pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>connect_hdfs</groupId>
  <artifactId>connect_hdfs</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging> 
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.7.0</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>2.7.0</version>
    </dependency>
    <dependency>
      <groupId>commons-io</groupId>
      <artifactId>commons-io</artifactId>
      <version>2.6</version>
    </dependency>
    <dependency>
      <groupId>jdk.tools</groupId>
      <artifactId>jdk.tools</artifactId>
      <version>jdk1.7.0_67</version>
      <scope>system</scope>
      <systemPath>C:/Program Files/Java/jdk1.7.0_67/lib/tools.jar</systemPath>
    </dependency>
  </dependencies>
  <build>
    <sourceDirectory>src</sourceDirectory>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.5.1</version>
        <configuration>
          <source>1.7</source>
          <target>1.7</target>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
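
Note that the jdk.tools dependency is resolved with system scope from a hardcoded Windows path; the systemPath has to be adjusted to the tools.jar of the JDK installed on your own machine.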

Comments

  • PrajD almost 2 years

    How do I connect to Hadoop in a Java program? Here are a few details: I am taking input from the user in an HTML form and using JSP to process the form data. I want to connect to Hadoop to fetch some data based on the form inputs. How can I connect to Hadoop using Java in this case?