Details
- Type: Bug
- Status: Closed
- Priority: Minor
- Resolution: Fixed
- Affects Version/s: 0.9.0
- Fix Version/s: None
- Component/s: None
- Labels: None
Description
I have a job jar that is Nutch with additions. I can usually launch this job jar on a pure Hadoop platform without issue, and Nutch jobs – update db, invert links, etc. – run fine. Recently I tried to do the same with SegmentMerger, only it failed with a ClassNotFoundException:
2006-07-28 20:43:54,371 WARN org.apache.hadoop.mapred.JobTracker: job init failed
java.io.IOException: java.lang.ClassNotFoundException: org.apache.nutch.segment.SegmentMerger$ObjectInputFormat
at org.apache.hadoop.mapred.JobInProgress.initTasks(JobInProgress.java:130)
at org.apache.hadoop.mapred.JobTracker$JobInitThread.run(JobTracker.java:310)
at java.lang.Thread.run(Thread.java:595)
java.io.IOException: Job failed!
After some digging and a chat with Stefan today, it turns out the SegmentMerger and SegmentReader classes are not like the others. The other jobs create their JobConf during job setup by doing 'new NutchJob', whereas the Segment* classes do 'new JobConf'. Sure enough, after making that change, everything works.
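The change can be sketched as a one-line diff against SegmentMerger's job setup (the exact surrounding code and variable names here are illustrative, not copied from the source tree; SegmentReader would get the same treatment):

```
-    JobConf job = new JobConf(getConf());
+    JobConf job = new NutchJob(getConf());
```

Since NutchJob extends JobConf, the rest of the job setup is unaffected; the only behavioral difference is that the containing jar gets recorded in the configuration.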
NutchJob triggers the setting of the job jar in the configuration (JobConf.findContainingJar is run); this doesn't happen for 'new JobConf', so the TaskTracker never ships the jar and the job's classes can't be found at init time.
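For illustration, the jar-locating step can be sketched along the following lines. This is a simplified, self-contained approximation of what JobConf.findContainingJar does, not the exact Hadoop source; the class name FindJar is hypothetical. It scans the classloader's resources for the class file and, if the class was loaded from a jar: URL, returns the jar's filesystem path.

```java
import java.io.IOException;
import java.net.URL;
import java.util.Enumeration;

public class FindJar {
    // Sketch of jar-locating logic similar in spirit to JobConf.findContainingJar:
    // look up the .class resource for the given class and, if it lives inside a
    // jar, return the path to that jar.
    static String findContainingJar(Class<?> clazz) throws IOException {
        ClassLoader loader = clazz.getClassLoader();
        if (loader == null) {
            // Bootstrap classes (e.g. java.lang.*) report a null loader.
            loader = ClassLoader.getSystemClassLoader();
        }
        String classFile = clazz.getName().replace('.', '/') + ".class";
        for (Enumeration<URL> e = loader.getResources(classFile); e.hasMoreElements();) {
            URL url = e.nextElement();
            if ("jar".equals(url.getProtocol())) {
                // A jar: URL looks like jar:file:/path/to/my.jar!/pkg/Cls.class
                String path = url.getPath();
                if (path.startsWith("file:")) {
                    path = path.substring("file:".length());
                }
                return path.replaceAll("!.*$", "");  // strip "!/pkg/Cls.class"
            }
        }
        return null;  // class was not loaded from a jar (e.g. a classes/ directory)
    }

    public static void main(String[] args) throws IOException {
        System.out.println(findContainingJar(FindJar.class));
    }
}
```

When the resulting path is stored in the job configuration, the framework knows which jar to ship to the tasks; with a plain 'new JobConf' that step is simply skipped, which matches the ClassNotFoundException seen above.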