Handling Error: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected


The answer in the link I posted in the question above suggested compiling against the Hadoop 2.0 libraries. Incidentally, post Hadoop 1.0, instead of a single Hadoop Core jar, two (or possibly more) different jars are needed for compilation.

I used the following jars to compile my code, after which it ran fine without the aforementioned error:

  • hadoop-common-2.0.2-alpha.jar
  • hadoop-mapreduce-client-core-2.0.2-alpha.jar
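If you build with Maven, the same fix can be sketched as a dependency fragment. This is a hypothetical pom.xml snippet, not from the original post; the group/artifact coordinates are the standard Apache Hadoop ones, and the version matches the jars named above (adjust it to whatever your CDH4 distribution ships).

```xml
<!-- Hypothetical pom.xml fragment: depend on the split Hadoop 2.x jars
     instead of the monolithic hadoop-core-1.x jar. -->
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.0.2-alpha</version>
    <!-- provided: the cluster supplies these jars at run time -->
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.0.2-alpha</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
```

The `provided` scope keeps the Hadoop classes off your job jar's classpath, so the versions actually installed on the cluster are the ones used at run time.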

Author: aa8y

Updated on June 04, 2022

Comments

  • aa8y, almost 2 years ago:

    I am using CDH4 and have written a MapReduce application using the new mapreduce API. I have compiled it against hadoop-core-1.0.3.jar and when I run it on my Hadoop cluster I get the error:

    Error: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected

    I referred to this StackOverflow question, which seems to be about the same problem. The answer suggests that we compile our code against a hadoop-core-2.X.jar file, but I am unable to find anything like that.

    So how do I compile it so that it runs flawlessly on CDH4?