Details
- Type: Improvement
- Priority: Minor
- Status: Resolved
- Resolution: Auto Closed
Description
Some places use SparkHadoopUtil.get.conf, while others create a new Hadoop Configuration directly. Prefer SparkHadoopUtil so that spark.hadoop.* properties set on the SparkConf are pulled in.
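The difference can be sketched as follows. This is a minimal illustration, assuming a running Spark application so that SparkHadoopUtil.get can see the active SparkConf; the variable names are illustrative only.

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.spark.deploy.SparkHadoopUtil

// Anti-pattern: a freshly constructed Configuration only loads the
// Hadoop default resources (core-site.xml, etc.). Any property the user
// supplied as spark.hadoop.* on the SparkConf is silently missing.
val bare: Configuration = new Configuration()

// Preferred: SparkHadoopUtil builds the Configuration from the active
// SparkConf, copying every spark.hadoop.* entry into it, so user-set
// Hadoop properties take effect consistently across the codebase.
val conf: Configuration = SparkHadoopUtil.get.conf
```

For example, a user who sets spark.hadoop.fs.s3a.endpoint on the SparkConf would see it in `conf` but not in `bare`, which is the inconsistency this issue describes.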
Issue Links
- is related to:
  - SPARK-4629 Spark SQL uses Hadoop Configuration in a thread-unsafe manner when writing Parquet files (Resolved)
- links to:
  1. Consistent hadoop config for streaming (Resolved, Cody Koeninger)
  2. Consistent hadoop config for SQL (Resolved, Unassigned)
  3. Consistent hadoop config for core (Resolved, Unassigned)
  4. Consistent hadoop config for external/* (Resolved, Unassigned)