Details
- Type: Sub-task
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Fix Version: 4.0.0
Description
The SharedSparkContext test suite mixin currently instantiates its SparkConf with `new SparkConf(false)`. Because the `false` flag disables loading defaults from system properties, the resulting conf misses the test-default configurations that are specified via test runner options in SBT or Maven; see https://github.com/apache/spark/blob/08a26bb56cfb48f27c68a79be1e15bc4c9e466e0/core/src/test/scala/org/apache/spark/SharedSparkContext.scala#L30 and https://github.com/apache/spark/blob/08a26bb56cfb48f27c68a79be1e15bc4c9e466e0/project/SparkBuild.scala#L1616-L1633 .
As a consequence, a number of Spark's test suites are missing test-only defaults, which can hurt the performance of those tests and increase their flakiness.
I think that we should update SharedSparkContext to do `new SparkConf(true)` (equivalent to `new SparkConf()`) to address this problem. Note that most of our other test suites and mixins already do this, including SharedSparkSession in Spark SQL.
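To illustrate the behavior difference, here is a simplified, self-contained model of the `loadDefaults` flag (this is not actual Spark code; the class and method names are made up for illustration): when the flag is `true`, `spark.*` JVM system properties — which is how SBT/Maven test-runner options reach the forked test JVM — are copied into the conf, and when it is `false` they are ignored.

```java
import java.util.HashMap;
import java.util.Map;

public class ConfDemo {
    // Simplified stand-in for SparkConf's constructor flag: when
    // loadDefaults is true, copy all spark.* system properties into
    // the conf; when false, start from an empty conf.
    static Map<String, String> makeConf(boolean loadDefaults) {
        Map<String, String> settings = new HashMap<>();
        if (loadDefaults) {
            for (Map.Entry<Object, Object> e : System.getProperties().entrySet()) {
                String key = e.getKey().toString();
                if (key.startsWith("spark.")) {
                    settings.put(key, e.getValue().toString());
                }
            }
        }
        return settings;
    }

    public static void main(String[] args) {
        // Test-runner options in SBT/Maven typically arrive as system
        // properties, e.g. -Dspark.ui.enabled=false for the test JVM.
        System.setProperty("spark.ui.enabled", "false");

        // new SparkConf(false)-style: the test default is silently dropped.
        System.out.println(makeConf(false).containsKey("spark.ui.enabled"));
        // new SparkConf(true)-style: the test default is picked up.
        System.out.println(makeConf(true).containsKey("spark.ui.enabled"));
    }
}
```

Under this model, the proposed change simply flips the mixin from the first call pattern to the second, so suites built on SharedSparkContext see the same test-only defaults as suites built on SharedSparkSession.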