Description
It might be better to use SPARK_DIST_CLASSPATH instead of CLASSPATH for including the Hadoop libs, e.g.:
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
Refs. Using Spark's "Hadoop Free" Build, https://spark.apache.org/docs/2.1.1/hadoop-provided.html
Current spark-env.sh:
# Let's make sure that all needed hadoop libs are added properly
export CLASSPATH="$CLASSPATH:$HADOOP_HOME/*:$HADOOP_HDFS_HOME/*:$HADOOP_YARN_HOME/*:$HADOOP_MAPRED_HOME/*"
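A possible replacement for the snippet above (a sketch, assuming `hadoop` is on the PATH of the user running Spark; `SPARK_DIST_CLASSPATH` is the variable the "Hadoop free" builds read for distribution classpath entries):

```shell
# spark-env.sh
# Let Spark pick up the Hadoop jars via SPARK_DIST_CLASSPATH
# instead of polluting the generic CLASSPATH.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
```

Unlike the wildcard CLASSPATH approach, `hadoop classpath` emits the exact jar list for the installed Hadoop version, so it also covers layouts where HADOOP_HDFS_HOME etc. are not set.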