Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Incomplete
- Affects Versions: 1.0.0, 1.0.2, 1.1.0
- Fix Versions: None
Description
In the current compute-classpath scripts, the Hadoop conf directory may appear before Spark's conf directory in the computed classpath. This leads to Hadoop's log4j.properties being used instead of Spark's, preventing users from easily changing Spark's logging settings.
To fix this, we should add a new classpath entry for Spark's log4j.properties file.
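The ordering problem can be sketched as follows. This is an illustrative shell snippet only, not the actual contents of compute-classpath.sh; the directory paths are assumptions. The point is simply that whichever conf directory appears first on the classpath supplies the log4j.properties that log4j loads:

```shell
#!/usr/bin/env bash
# Sketch of the proposed ordering fix. Paths below are hypothetical examples,
# not what the real compute-classpath scripts use.
SPARK_CONF_DIR="${SPARK_CONF_DIR:-/opt/spark/conf}"
HADOOP_CONF_DIR="${HADOOP_CONF_DIR:-/etc/hadoop/conf}"

# Put Spark's conf directory ahead of Hadoop's, so Spark's log4j.properties
# shadows the copy shipped in Hadoop's conf directory.
CLASSPATH="$SPARK_CONF_DIR"
CLASSPATH="$CLASSPATH:$HADOOP_CONF_DIR"

echo "$CLASSPATH"
```

Equivalently, the fix proposed above adds a dedicated entry for Spark's log4j.properties so it wins regardless of where the conf directories land.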
Issue Links
- is duplicated by
  - SPARK-2027 spark-ec2 puts Hadoop's log4j ahead of Spark's in classpath (Resolved)
  - SPARK-4997 Check if Spark's conf needs to be put ahead of Hadoop's (for log4j purposes) (Closed)
- is related to
  - SPARK-2858 Default log4j configuration no longer seems to work (Resolved)