Hadoop Common / HADOOP-201

hadoop dfs -report throws exception


Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 0.2.0
    • Fix Version/s: 0.2.1
    • Component/s: None
    • Labels: None
    • Environment: linux, jdk 1.5

    Description

      Running hadoop dfs -report throws the lovely exception below.
      Changing org.apache.hadoop.dfs.DatanodeInfo back to being a public class solves the problem.

      ~/hadoop$ bin/hadoop dfs -report
      060508 104801 parsing file:/home/hadoop/hadoop/conf/hadoop-default.xml
      060508 104801 parsing file:/home/hadoop/hadoop/conf/hadoop-site.xml
      060508 104801 No FS indicated, using default:xxx:9000
      060508 104801 Client connection to 10.0.0.12:9000: starting
      Total raw bytes: 2763338170368 (2573.55 Gb)
      Used raw bytes: 1548564473694 (1442.21 Gb)
      % used: 56.03%

      Total effective bytes: 145953744375 (135.93 Gb)
      Effective replication multiplier: 10.609967427182013
      -------------------------------------------------
      060508 104801 Client connection to 10.0.0.12:9000 caught: java.lang.RuntimeException: java.lang.IllegalAccessException: Class org.apache.hadoop.io.WritableFactories can not access a member of class org.apache.hadoop.dfs.DatanodeInfo with modifiers "public"
      java.lang.RuntimeException: java.lang.IllegalAccessException: Class org.apache.hadoop.io.WritableFactories can not access a member of class org.apache.hadoop.dfs.DatanodeInfo with modifiers "public"
      at org.apache.hadoop.io.WritableFactories.newInstance(WritableFactories.java:49)
      at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:226)
      at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:163)
      at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:211)
      at org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:60)
      at org.apache.hadoop.ipc.Client$Connection.run(Client.java:170)
      Caused by: java.lang.IllegalAccessException: Class org.apache.hadoop.io.WritableFactories can not access a member of class org.apache.hadoop.dfs.DatanodeInfo with modifiers "public"
      at sun.reflect.Reflection.ensureMemberAccess(Reflection.java:65)
      at java.lang.Class.newInstance0(Class.java:344)
      at java.lang.Class.newInstance(Class.java:303)
      at org.apache.hadoop.io.WritableFactories.newInstance(WritableFactories.java:45)
      ... 5 more
      060508 104801 Client connection to 10.0.0.12:9000: closing
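      The trace above shows a Java reflection rule at work: WritableFactories instantiates Writable classes reflectively from another package, so Reflection.ensureMemberAccess rejects any constructor the caller cannot see — even a public constructor on a non-public class, hence the error's odd phrasing, "with modifiers \"public\"". A minimal sketch of the failure mode (names are hypothetical, not Hadoop's code; a private constructor stands in for the package-private class, since one file cannot span two packages, but both trip the same access check):

      ```java
      import java.lang.reflect.Constructor;

      // Stand-in for org.apache.hadoop.dfs.DatanodeInfo after it lost its
      // `public` modifier: the caller can see the Class object but cannot
      // invoke the constructor.
      class DatanodeInfoStandIn {
          private DatanodeInfoStandIn() {}
      }

      public class ReflectiveAccessDemo {

          // Mimics the shape of WritableFactories.newInstance: create an
          // instance given only the Class object. (The 2006 code path used
          // Class.newInstance, visible as Class.newInstance0 in the trace;
          // the modern Constructor-based equivalent is shown here.)
          static String tryInstantiate(Class<?> c) {
              try {
                  Constructor<?> ctor = c.getDeclaredConstructor();
                  ctor.newInstance();
                  return "ok";
              } catch (IllegalAccessException e) {
                  // The check that fires in the bug report.
                  return "IllegalAccessException";
              } catch (ReflectiveOperationException e) {
                  return e.getClass().getSimpleName();
              }
          }

          public static void main(String[] args) {
              // Fails: constructor not accessible to this caller.
              System.out.println(tryInstantiate(DatanodeInfoStandIn.class));
              // Succeeds for a public class with a public no-arg
              // constructor — the fix applied to DatanodeInfo.
              System.out.println(tryInstantiate(String.class));
          }
      }
      ```

      This is why restoring the `public` modifier on DatanodeInfo (and keeping a public no-arg constructor) is sufficient to fix the report command.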

      Attachments

        Activity


          People

            Assignee: cutting (Doug Cutting)
            Reporter: johanoskarsson (Johan Oskarsson)
            Votes: 0
            Watchers: 0
