
[HDFS-2323] start-dfs.sh script fails for tarball install

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.23.0
    • Component/s: None
    • Labels: None
    • Hadoop Flags: Reviewed

      Description

      I built Common and HDFS tarballs from trunk, then tried to start a cluster with start-dfs.sh, but got the following error:

      Starting namenodes on [localhost ]
      sbin/start-dfs.sh: line 55: /Users/tom/tmp/hadoop/libexec/../bin/hadoop-daemons.sh: No such file or directory
      sbin/start-dfs.sh: line 68: /Users/tom/tmp/hadoop/libexec/../bin/hadoop-daemons.sh: No such file or directory
      Starting secondary namenodes [0.0.0.0 ]
      sbin/start-dfs.sh: line 88: /Users/tom/tmp/hadoop/libexec/../bin/hadoop-daemons.sh: No such file or directory
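
      For context, here is a minimal sketch of the kind of path resolution start-dfs.sh needs under the tarball layout, assuming hadoop-daemons.sh is shipped in sbin/ alongside start-dfs.sh; the namenode invocation is simplified and this is not the committed patch:

      # sbin/start-dfs.sh (sketch): locate helper scripts relative to this
      # script's own directory instead of libexec/../bin, which the tarball
      # layout does not contain.
      bin=$(cd -P -- "$(dirname -- "$0")" >/dev/null && pwd -P)

      # hadoop-daemons.sh sits in the same sbin/ directory as start-dfs.sh
      "$bin/hadoop-daemons.sh" --config "$HADOOP_CONF_DIR" start namenode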
      


            People

            • Assignee: Tom White (tomwhite)
            • Reporter: Tom White (tomwhite)
            • Votes: 0
            • Watchers: 2
