Details
- Type: Bug
- Status: Closed
- Priority: Blocker
- Resolution: Fixed
- Fix Version/s: 1.3.2, 1.4.0
- Labels: None
Description
In config.sh we do this:
# Check if deprecated HADOOP_HOME is set.
if [ -n "$HADOOP_HOME" ]; then
    # HADOOP_HOME is set. Check if it's a Hadoop 1.x or 2.x HADOOP_HOME path
    if [ -d "$HADOOP_HOME/conf" ]; then
        # It's a Hadoop 1.x layout
        HADOOP_CONF_DIR="$HADOOP_CONF_DIR:$HADOOP_HOME/conf"
    fi
    if [ -d "$HADOOP_HOME/etc/hadoop" ]; then
        # It's Hadoop 2.2+
        HADOOP_CONF_DIR="$HADOOP_CONF_DIR:$HADOOP_HOME/etc/hadoop"
    fi
fi
while our HadoopFileSystem actually treats this value as a single path, not as a colon-separated list of paths: https://github.com/apache/flink/blob/854b05376a459a6197e41e141bb28a9befe481ad/flink-runtime/src/main/java/org/apache/flink/runtime/fs/hdfs/HadoopFileSystem.java#L236
I also think that other tools do not expect multiple colon-separated paths in HADOOP_CONF_DIR, and at least one user has already run into this problem on their setup.
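The mismatch can be reproduced in isolation. When both directory layouts exist and HADOOP_CONF_DIR starts out empty, the concatenation in config.sh yields a value with a leading colon and two entries, which a consumer expecting a single directory path cannot resolve. A minimal sketch (the paths under /tmp are only for illustration, not part of the actual setup):

```shell
# Simulate a HADOOP_HOME that has both the 1.x and the 2.x layout.
HADOOP_HOME=/tmp/demo-hadoop
mkdir -p "$HADOOP_HOME/conf" "$HADOOP_HOME/etc/hadoop"

# Same concatenation logic as in config.sh, starting from an empty value.
HADOOP_CONF_DIR=""
if [ -n "$HADOOP_HOME" ]; then
    if [ -d "$HADOOP_HOME/conf" ]; then
        HADOOP_CONF_DIR="$HADOOP_CONF_DIR:$HADOOP_HOME/conf"
    fi
    if [ -d "$HADOOP_HOME/etc/hadoop" ]; then
        HADOOP_CONF_DIR="$HADOOP_CONF_DIR:$HADOOP_HOME/etc/hadoop"
    fi
fi

# The result is not a single directory path:
echo "$HADOOP_CONF_DIR"
# -> :/tmp/demo-hadoop/conf:/tmp/demo-hadoop/etc/hadoop

# ...and treating it as one, as the Java side does, fails:
[ -d "$HADOOP_CONF_DIR" ] || echo "not a directory"
```

Either the script should export only one directory, or the consumers need to split the value on ':'; this sketch only demonstrates the inconsistency, not the fix that was eventually applied.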
Attachments
Issue Links
- links to