Details
- Type: Bug
- Status: Closed
- Priority: Blocker
- Resolution: Fixed
- Affects Version/s: 2.0.4-alpha
- Hadoop Flags: Reviewed
Description
I am attaching a modified wordcount job that clearly demonstrates the problem we've encountered in running Sqoop2 on YARN (BIGTOP-949).
Here's what running it produces:
$ hadoop fs -mkdir in
$ hadoop fs -put /etc/passwd in
$ hadoop jar ./bug.jar org.myorg.LostCreds
13/05/12 03:13:46 WARN mapred.JobConf: The variable mapred.child.ulimit is no longer used.
numberOfSecretKeys: 1
numberOfTokens: 0
..............
..............
..............
13/05/12 03:05:35 INFO mapreduce.Job: Job job_1368318686284_0013 failed with state FAILED due to: Job commit failed: java.io.IOException: numberOfSecretKeys: 0
numberOfTokens: 0
    at org.myorg.LostCreds$DestroyerFileOutputCommitter.commitJob(LostCreds.java:43)
    at org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler$EventProcessor.handleJobCommit(CommitterEventHandler.java:249)
    at org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler$EventProcessor.run(CommitterEventHandler.java:212)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:619)
As you can see, even though we clearly initialized the credentials at submission time via:
job.getCredentials().addSecretKey(new Text("mykey"), "mysecret".getBytes());
the secret key no longer seems to be present by the time the job commits.
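For reference, here is a minimal sketch of what the attached reproduction might look like. The class and committer names (org.myorg.LostCreds, DestroyerFileOutputCommitter, the check that throws at commit time) come from the stack trace above; everything else (the DestroyerOutputFormat wrapper, an identity map with zero reducers) is a hypothetical reconstruction, not the actual contents of bug.jar:

package org.myorg;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.OutputCommitter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.security.Credentials;

public class LostCreds {

  // Committer that fails the job commit if the credentials added at
  // submission time are no longer visible in the application master.
  public static class DestroyerFileOutputCommitter extends FileOutputCommitter {
    public DestroyerFileOutputCommitter(Path outputPath, TaskAttemptContext context)
        throws IOException {
      super(outputPath, context);
    }

    @Override
    public void commitJob(JobContext context) throws IOException {
      Credentials creds = context.getCredentials();
      if (creds.numberOfSecretKeys() == 0) {
        // This is the IOException seen in the job output above.
        throw new IOException("numberOfSecretKeys: " + creds.numberOfSecretKeys()
            + "\nnumberOfTokens: " + creds.numberOfTokens());
      }
      super.commitJob(context);
    }
  }

  // Hypothetical output format that plugs the committer in.
  public static class DestroyerOutputFormat extends TextOutputFormat<LongWritable, Text> {
    @Override
    public OutputCommitter getOutputCommitter(TaskAttemptContext context) throws IOException {
      return new DestroyerFileOutputCommitter(getOutputPath(context), context);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "lostcreds");
    job.setJarByClass(LostCreds.class);

    // Credentials are added here, before submission...
    job.getCredentials().addSecretKey(new Text("mykey"), "mysecret".getBytes());
    System.out.println("numberOfSecretKeys: " + job.getCredentials().numberOfSecretKeys());
    System.out.println("numberOfTokens: " + job.getCredentials().numberOfTokens());

    // Identity map, no reduce: the only step of interest is the job commit.
    job.setMapperClass(Mapper.class);
    job.setNumReduceTasks(0);
    job.setOutputKeyClass(LongWritable.class);
    job.setOutputValueClass(Text.class);
    job.setOutputFormatClass(DestroyerOutputFormat.class);

    FileInputFormat.addInputPath(job, new Path("in"));
    FileOutputFormat.setOutputPath(job, new Path("out"));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

The key point is that commitJob() runs inside the MapReduce application master, so the secret key added at submission time has to survive the hand-off to the AM; on 2.0.4-alpha it evidently does not.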
This is a pretty critical issue for Sqoop 2, since it appears to be DOA on YARN in Hadoop 2.0.4-alpha.
Attachments
Issue Links
- blocks BIGTOP-949 Add Sqoop tests (Closed)