Hadoop Common / HADOOP-8996

Error in Hadoop installation


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: None
    • Labels: None
    • Environment: fedora 15

    Description

      I am trying to install `Hadoop` on a Fedora machine by following the quickstart guide at "http://hadoop.apache.org/docs/r0.15.2/quickstart.html".

      1. Installed Java (and verified it is present with `java -version`).
      2. ssh was already installed (since it is Linux).
      3. Downloaded the latest version, `hadoop 1.0.4`, from "http://apache.techartifact.com/mirror/hadoop/common/hadoop-1.0.4/".

      I have followed the process shown in the installation tutorial (link given above), as below:

      $ mkdir input
      $ cp conf/*.xml input
      $ bin/hadoop jar hadoop-examples-1.0.4.jar grep input output 'dfs[a-z.]+'

      Then I got the following error, which I am unable to understand:

      sh-4.2$ bin/hadoop jar hadoop-examples-1.0.4.jar grep input output 'dfs[a-z.]+'
      12/10/31 16:14:35 INFO util.NativeCodeLoader: Loaded the native-hadoop library
      12/10/31 16:14:35 WARN snappy.LoadSnappy: Snappy native library not loaded
      12/10/31 16:14:35 INFO mapred.FileInputFormat: Total input paths to process : 8
      12/10/31 16:14:35 INFO mapred.JobClient: Cleaning up the staging area file:/tmp/hadoop-thomas/mapred/staging/thomas-857393825/.staging/job_local_0001
      12/10/31 16:14:35 ERROR security.UserGroupInformation: PriviledgedActionException as:thomas cause:java.io.IOException: Not a file: file:/home/local/thomas/Hadoop/hadoop-1.0.4/input/conf
      java.io.IOException: Not a file: file:/home/local/thomas/Hadoop/hadoop-1.0.4/input/conf
      at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:215)
      at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:989)
      at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:981)
      at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
      at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:897)
      at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:416)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
      at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
      at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
      at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1261)
      at org.apache.hadoop.examples.Grep.run(Grep.java:69)
      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
      at org.apache.hadoop.examples.Grep.main(Grep.java:93)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:616)
      at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
      at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
      at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:616)
      at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
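
      The "Not a file" message points at `input/conf`, i.e. a directory sitting inside the input path. `FileInputFormat` in Hadoop 1.x does not recurse into subdirectories when computing splits, so every entry under `input/` must be a regular file. A minimal sketch of that diagnosis follows; the `input/conf` layout is assumed from the error message, not confirmed by the reporter:

      ```shell
      # Hypothetical reproduction of the layout the error message implies: the
      # input/ directory contains a subdirectory named conf, which Hadoop 1.x's
      # FileInputFormat rejects with "Not a file".
      mkdir -p input/conf          # stray directory inside the input path
      touch input/part1.xml        # a regular file, which is fine

      # Listing directories under input/ reveals the offending entry:
      find input -mindepth 1 -type d

      # Removing it (keeping only regular files) should avoid the error before
      # re-running the example job:
      rm -r input/conf
      find input -mindepth 1 -type d   # prints nothing once input/ holds only files
      ```

      If the directory got there via something like `cp -r conf input` instead of `cp conf/*.xml input`, recreating `input/` from only the XML files should let the grep example run.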

      Attachments

      Activity

      People

        Assignee: Unassigned
        Reporter: shiva krishna
        Votes: 0
        Watchers: 2

      Dates

        Created:
        Updated:
        Resolved: