Details
- Type: Bug
- Status: Resolved
- Priority: Blocker
- Resolution: Invalid
- Affects Version/s: None
- Fix Version/s: None
- Component/s: None
- Labels: None
- Environment: Hadoop 2.3.0, Windows, development in Eclipse, Lenovo laptop
Description
I'm an application developer. We recently moved from CDH 4.7 to CDH 5.1, which took the underlying Hadoop version from 1.x to 2.x. To allow development in Eclipse on Windows, the following class was created:
import java.io.IOException;
import org.apache.hadoop.fs.LocalFileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class WindowsLocalFileSystem extends LocalFileSystem {
    public WindowsLocalFileSystem() { super(); }

    @Override
    public boolean mkdirs(Path f, FsPermission permission) throws IOException {
        // Create the directory, then apply the requested permissions.
        final boolean result = super.mkdirs(f);
        this.setPermission(f, permission);
        return result;
    }

    @Override
    public void setPermission(Path p, FsPermission permission) throws IOException {
        try {
            super.setPermission(p, permission);
        } catch (final IOException e) {
            System.err.println("Can't help it, hence ignoring IOException setting permission for path \"" + p + "\": " + e.getMessage());
        }
    }
}
This class was used in the MapReduce job as follows:

if (RUN_LOCAL) {
    conf.set("fs.default.name", "file:///");
    conf.set("mapred.job.tracker", "local");
    conf.set("fs.file.impl", "org.scif.bdp.mrjobs.WindowsLocalFileSystem");
    conf.set("io.serializations",
            "org.apache.hadoop.io.serializer.JavaSerialization,"
            + "org.apache.hadoop.io.serializer.WritableSerialization");
}

This worked fine on CDH 4.7. The same code compiles on CDH 5.1, but when I try to execute it, it throws the following stack trace:
Exception in thread "main" java.lang.NullPointerException
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:451)
at org.apache.hadoop.util.Shell.run(Shell.java:424)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:656)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:745)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:728)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:633)
at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:467)
at com.scif.bdp.common.WindowsLocalFileSystem.setPermission(WindowsLocalFileSystem.java:26)
at com.scif.bdp.common.WindowsLocalFileSystem.mkdirs(WindowsLocalFileSystem.java:17)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:348)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
at com.scif.bdp.mrjobs.DeDup.run(DeDup.java:55)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at com.scif.bdp.mrjobs.DeDup.main(DeDup.java:59)
(Note: DeDup is my MapReduce class that removes duplicates.)
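For context, the call path in the trace (DeDup.main -> ToolRunner.run -> DeDup.run -> Job.waitForCompletion) matches a standard Tool-based driver. Purely as a hypothetical sketch (the actual DeDup source is not included in this report), such a driver has this shape:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class DeDup extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        // The local-mode settings from the snippet above would be applied to conf here.
        Job job = Job.getInstance(conf, "dedup");
        job.setJarByClass(DeDup.class);
        // Mapper/reducer and input/output configuration omitted.
        return job.waitForCompletion(true) ? 0 : 1; // Job.waitForCompletion in the trace
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new DeDup(), args)); // ToolRunner.run -> DeDup.run
    }
}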
Upon investigation, the only change I could see was in the setPermission() method: it now invokes NativeIO.POSIX.chmod() as against NativeIO.chmod().
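One detail worth flagging when reading the trace: the NullPointerException is a RuntimeException, so the catch (IOException) clause in the workaround class above never fires, and the exception propagates out of mkdirs(). A minimal sketch of a broadened override, offered only as an assumption and not as a verified fix, would be:

@Override
public void setPermission(Path p, FsPermission permission) throws IOException {
    try {
        super.setPermission(p, permission);
    } catch (final IOException | RuntimeException e) {
        // On Hadoop 2.x the Windows shell path can surface a NullPointerException
        // (not an IOException), which an IOException-only catch clause misses.
        System.err.println("Ignoring failure setting permission for \"" + p + "\": " + e);
    }
}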
Attachments
Issue Links
- relates to: SPARK-6961 Cannot save data to parquet files when executing from Windows from a Maven Project (Resolved)