Description
There appears to be a race condition in TestSparkClient#runTest. The test creates an in-memory RemoteDriver, which in turn creates a JavaSparkContext, and a new JavaSparkContext is created for each test that runs. Because the RemoteDriver is not always given enough time to shut down, the next test can create its JavaSparkContext while the previous one is still alive, which fails with an exception like: org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243).
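One way to close the race would be to have each test block until the previous driver has finished shutting down before creating a new context. Below is a minimal, self-contained sketch of such a polling wait; the class, method, and flag names are illustrative and not taken from the actual test code:

```java
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.function.BooleanSupplier;

public class AwaitShutdownSketch {

    // Polls the condition until it holds or the timeout elapses.
    // Returns true if the condition became true in time.
    static boolean await(BooleanSupplier condition, long timeoutMs)
            throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            if (condition.getAsBoolean()) {
                return true;
            }
            Thread.sleep(50); // back off briefly between checks
        }
        return condition.getAsBoolean();
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the RemoteDriver's shutdown state (hypothetical flag).
        AtomicBoolean driverStopped = new AtomicBoolean(false);

        // Simulate the driver shutting down asynchronously after a test ends.
        Thread shutdown = new Thread(() -> {
            try {
                TimeUnit.MILLISECONDS.sleep(200);
            } catch (InterruptedException ignored) {
            }
            driverStopped.set(true);
        });
        shutdown.start();

        // The next test would block here instead of racing ahead and
        // creating a second SparkContext in the same JVM.
        boolean stopped = await(driverStopped::get, 5000);
        System.out.println("driver stopped: " + stopped);
        shutdown.join();
    }
}
```

A call like this could run in the test's teardown, so each runTest invocation starts only after the prior JavaSparkContext is fully gone.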
Attachments