Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Won't Fix
- Affects Version/s: 1.0.3, 1.1.0, 2.0.2-alpha
- Fix Version/s: None
- Environment: Fedora 17 3.3.4-5.fc17.x86_64, java version "1.7.0_06-icedtea", Rackspace Cloud (NextGen)
- Release Note: Fixed a bug in the hadoop-setup-conf.sh script, which did not accept any change to the JAVA_HOME directory.
- Labels: config
Description
The JAVA_HOME directory remains unchanged no matter what you enter when running hadoop-setup-conf.sh to generate the Hadoop configuration. See the example below:
*********************************************************
[root@hadoop-slave ~]# /sbin/hadoop-setup-conf.sh
Setup Hadoop Configuration
Where would you like to put config directory? (/etc/hadoop)
Where would you like to put log directory? (/var/log/hadoop)
Where would you like to put pid directory? (/var/run/hadoop)
What is the host of the namenode? (hadoop-slave)
Where would you like to put namenode data directory? (/var/lib/hadoop/hdfs/namenode)
Where would you like to put datanode data directory? (/var/lib/hadoop/hdfs/datanode)
What is the host of the jobtracker? (hadoop-slave)
Where would you like to put jobtracker/tasktracker data directory? (/var/lib/hadoop/mapred)
Where is JAVA_HOME directory? (/usr/java/default) /usr/lib/jvm/jre
Would you like to create directories/copy conf files to localhost? (Y/n)
Review your choices:
Config directory : /etc/hadoop
Log directory : /var/log/hadoop
PID directory : /var/run/hadoop
Namenode host : hadoop-slave
Namenode directory : /var/lib/hadoop/hdfs/namenode
Datanode directory : /var/lib/hadoop/hdfs/datanode
Jobtracker host : hadoop-slave
Mapreduce directory : /var/lib/hadoop/mapred
Task scheduler : org.apache.hadoop.mapred.JobQueueTaskScheduler
JAVA_HOME directory : /usr/java/default
Create dirs/copy conf files : y
Proceed with generate configuration? (y/N) n
User aborted setup, exiting...
*********************************************************
Resolution:
Amending line 509 of /sbin/hadoop-setup-conf.sh
from:
JAVA_HOME=${USER_USER_JAVA_HOME:-$JAVA_HOME}
to:
JAVA_HOME=${USER_JAVA_HOME:-$JAVA_HOME}
resolves the issue: the misspelled variable USER_USER_JAVA_HOME is never set anywhere in the script, so the ${...:-...} expansion always falls back to the existing $JAVA_HOME default and the user's answer at the prompt is silently discarded.
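The effect of the typo can be reproduced in isolation with shell parameter expansion. This is a minimal sketch, not the actual script; the two example paths are taken from the transcript above, and the RESULT variable names are hypothetical:

```shell
#!/bin/sh
# Reproduce the bug: ${parameter:-word} expands to $parameter only if it is
# set and non-empty, otherwise to the fallback word.
JAVA_HOME=/usr/java/default          # pre-existing default in the script
USER_JAVA_HOME=/usr/lib/jvm/jre      # value the user typed at the prompt

# Buggy line 509: USER_USER_JAVA_HOME is never assigned, so the fallback
# branch is always taken and the user's input is dropped.
BUGGY_RESULT=${USER_USER_JAVA_HOME:-$JAVA_HOME}
echo "buggy: $BUGGY_RESULT"   # prints /usr/java/default

# Corrected line: the variable that actually holds the user's answer is read.
FIXED_RESULT=${USER_JAVA_HOME:-$JAVA_HOME}
echo "fixed: $FIXED_RESULT"   # prints /usr/lib/jvm/jre
```

Because the shell silently treats an unset variable name as empty rather than raising an error, the typo produces no warning; running the script under `set -u` (or `sh -u`) would not catch it either, since the `:-` form explicitly permits unset parameters.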