java.lang.NoClassDefFoundError in Hadoop Basics' MapReduce Program


Solution 1

Please note that the exception is NoClassDefFoundError, not ClassNotFoundException.

Note: NoClassDefFoundError is thrown when a class was visible at compile time but is not visible at run time. This typically happens when distributing or deploying JAR files without all of the required class files included.

To fix it: check for differences between your build-time and runtime classpaths.

NoClassDefFoundError and ClassNotFoundException are different: one is an Error and the other is an Exception.

NoClassDefFoundError arises when the JVM cannot find a class it expected to find. A program that compiled successfully fails to run because a class file cannot be located at run time.

ClassNotFoundException indicates that a class was not found on the classpath, i.e. we tried to load the class definition explicitly, and the class (or the JAR containing it) does not exist on the classpath.
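To make the distinction concrete, here is a minimal sketch. The class name `com.example.DoesNotExist` is a made-up placeholder: explicitly loading a missing class with `Class.forName` raises the checked ClassNotFoundException, which you can catch and handle.

```java
public class LoadDemo {
    public static void main(String[] args) {
        try {
            // Explicit, reflective loading: a missing class surfaces as
            // the checked exception ClassNotFoundException.
            Class.forName("com.example.DoesNotExist");
        } catch (ClassNotFoundException e) {
            System.out.println("ClassNotFoundException: " + e.getMessage());
        }
        // NoClassDefFoundError, by contrast, is thrown implicitly by the JVM
        // when a class that was present at compile time cannot be found at
        // run time; there is no API call to trigger it from portable code.
    }
}
```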

Solution 2

NoClassDefFoundError occurs when the named class is successfully located on the classpath but for some reason cannot be loaded and verified. Most often, another class needed to verify the named class is either missing or is the wrong version.

Generally speaking, this error means "double-check that you have all the right JAR files (of the right versions) on your classpath".
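One quick way to do that double-check, as a sketch: print the runtime classpath from inside the program and compare it against the JARs you compiled against. `ClasspathCheck` is a hypothetical helper class name.

```java
import java.io.File;

public class ClasspathCheck {
    public static void main(String[] args) {
        // The runtime classpath the JVM actually resolved classes from;
        // verify the JARs (and versions) you built against appear here.
        String cp = System.getProperty("java.class.path");
        for (String entry : cp.split(File.pathSeparator)) {
            System.out.println(entry);
        }
    }
}
```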

Solution 3

This is a very common error when you run a Hadoop MapReduce program from a local IDE (Eclipse).

You have presumably already added hadoop-core.jar to your build path, so no compile error is detected in your program. But you get the error at run time, because hadoop-core depends on commons-logging.jar (as well as some other JARs). You may need to add the JARs under /lib to your build path.

I recommend using Maven or another dependency-management tool to manage the dependencies.
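As a sketch of the Maven route: a single dependency declaration pulls in hadoop-core together with its transitive dependencies, commons-logging among them, so nothing under /lib has to be added by hand. The version shown is only an example and should match your cluster.

```xml
<!-- pom.xml fragment; the version is an assumption, match it to your cluster -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
</dependency>
```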

Author: Shirish Herwade

Android Developer, Pune India

Updated on February 16, 2020

Comments

  • Shirish Herwade about 4 years

    I'm trying the basic Hadoop MapReduce program from the tutorial at http://java.dzone.com/articles/hadoop-basics-creating

    The full code of the class is (the code is also available at the above URL):

    import java.io.IOException;
    import java.util.StringTokenizer;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
    import org.apache.hadoop.util.GenericOptionsParser;
    
    public class Dictionary {
        public static class WordMapper extends Mapper<Text, Text, Text, Text> {
            private Text word = new Text();

            public void map(Text key, Text value, Context context) throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString(), ",");
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(key, word);
                }
            }
        }

        public static class AllTranslationsReducer extends Reducer<Text, Text, Text, Text> {
            private Text result = new Text();

            public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException {
                String translations = "";
                for (Text val : values) {
                    translations += "|" + val.toString();
                }
                result.set(translations);
                context.write(key, result);
            }
        }

        public static void main(String[] args) throws Exception {
            System.out.println("welcome to Java 1");
            Configuration conf = new Configuration();
            System.out.println("welcome to Java 2");
            Job job = new Job(conf, "dictionary");
            job.setJarByClass(Dictionary.class);
            job.setMapperClass(WordMapper.class);
            job.setReducerClass(AllTranslationsReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);
            job.setInputFormatClass(KeyValueTextInputFormat.class);
            FileInputFormat.addInputPath(job, new Path("/tmp/hadoop-cscarioni/dfs/name/file"));
            FileOutputFormat.setOutputPath(job, new Path("output"));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }
    

    But when I run it in Eclipse, I get this error:

    welcome to Java 1
    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
    at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:73)
    at Dictionary.main(Dictionary.java:43)
    Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    ... 2 more