Hadoop HDFS
HDFS-1288

start-all.sh / stop-all.sh does not seem to work with HDFS

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Cannot Reproduce
    • Affects Version/s: 0.21.0
    • Fix Version/s: 0.21.0
    • Component/s: scripts
    • Labels: None

      Description

      The start-all.sh / stop-all.sh scripts shipped with the "combined" hadoop-0.21.0-rc1 do not start or stop the DFS daemons unless $HADOOP_HDFS_HOME is explicitly set.
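
      A workaround reported in the comments below, shown here as a minimal sketch; the install path is illustrative and the hdfs subdirectory comes from the reporter's follow-up, so it may differ depending on how the combined tarball is laid out:

        export HADOOP_HOME=/path/to/hadoop-0.21.0    # illustrative install path
        export HADOOP_HDFS_HOME=$HADOOP_HOME/hdfs    # workaround reported in the comments below
        $HADOOP_HOME/bin/start-all.sh                # DFS daemons now start as expected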

        Activity

        Aaron Kimball added a comment -

        If I explicitly set $HADOOP_HDFS_HOME=$HADOOP_HOME/hdfs then it works fine. But what is curious is that I do not need to explicitly set $HADOOP_MAPRED_HOME.

        So there's some asymmetry in how these scripts work with HDFS and mapred. At the very least, they should print a warning that they couldn't do the dfs-side work if they can't find the scripts?
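
        A minimal sketch of the kind of guard being suggested; this is not the shipped script, and the start-dfs.sh location under $HADOOP_HDFS_HOME is an assumption about the layout:

          # Sketch only: warn instead of silently skipping the DFS side when
          # the HDFS scripts cannot be located.
          if [ -x "${HADOOP_HDFS_HOME}/bin/start-dfs.sh" ]; then
            "${HADOOP_HDFS_HOME}/bin/start-dfs.sh"
          else
            echo "WARNING: ${HADOOP_HDFS_HOME}/bin/start-dfs.sh not found;" \
                 "HDFS daemons will not be started. Set HADOOP_HDFS_HOME." >&2
          fi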

        Allen Wittenauer added a comment -

        This is a regression and should be a blocker.

        Tom White added a comment -

        I haven't been able to reproduce this. I successfully ran the following with RC0 (HADOOP_HDFS_HOME was not set):

        export HADOOP_HOME=...

        # Format the namenode and bring up all daemons with the stock scripts
        $HADOOP_HOME/bin/hadoop namenode -format
        $HADOOP_HOME/bin/start-all.sh
        # Wait until HDFS leaves safe mode before using it
        $HADOOP_HOME/bin/hdfs dfsadmin -safemode wait
        sleep 60
        # Run the grep example against a file copied into HDFS
        $HADOOP_HOME/bin/hadoop fs -mkdir input
        $HADOOP_HOME/bin/hadoop fs -put $HADOOP_HOME/LICENSE.txt input
        $HADOOP_HOME/bin/hadoop jar $HADOOP_HOME/hadoop-*-examples-*.jar grep \
          input output Apache
        $HADOOP_HOME/bin/hadoop fs -cat 'output/part-r-00000' | grep Apache
        # Shut everything down again
        $HADOOP_HOME/bin/stop-all.sh
        

        Aaron, what did you run to see this problem?

        Aaron Kimball added a comment -

        Strange, now I can't reproduce it. I might have had some other configuration state in my environment which was conflicting with this. Feel free to resolve as invalid if nobody else can reproduce this.
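
        One quick way to check for the kind of leftover environment state mentioned here, as an illustrative sketch (any stale HADOOP_* variable from a previous install could change which scripts start-all.sh dispatches to):

          # List Hadoop-related variables currently set in the shell, then clear
          # the ones that might shadow the combined release before retesting.
          env | grep '^HADOOP_'
          unset HADOOP_HDFS_HOME HADOOP_MAPRED_HOME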


          People

          • Assignee: Unassigned
          • Reporter: Aaron Kimball
          • Votes: 0
          • Watchers: 2
