- Type: Improvement
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 3.0.0
- Fix Version/s: None
- Component/s: Project Infra
- Labels: None
Spark 3.0 is a major version change. We want to add the following new jobs:
1. SBT with hadoop-3.2
2. Maven with hadoop-3.2 (on JDK8 and JDK11)
Also, should we limit concurrent runs of the following existing job? Currently, it invokes multiple builds concurrently; we could save resources by limiting it to 1, like the other jobs.
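The ticket does not say how the limit would be enforced; one common way on Jenkins (a sketch only, assuming the job were converted to a declarative pipeline — the actual Spark jobs are configured differently, and the stage name and build command here are illustrative placeholders) is:

```groovy
// Hypothetical Jenkinsfile sketch: cap this pipeline at one concurrent run.
pipeline {
    agent any
    options {
        // Queue new builds while one is already running,
        // mirroring the "limit to 1" behavior of the other jobs.
        disableConcurrentBuilds()
    }
    stages {
        stage('build') {
            steps {
                // Placeholder build step; the real job's command may differ.
                sh './build/sbt -Phadoop-3.2 compile'
            }
        }
    }
}
```

For freestyle (non-pipeline) jobs, the equivalent is unchecking "Execute concurrent builds if necessary" in the job configuration, or using the Throttle Concurrent Builds plugin.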
We will drop the four `branch-2.3` jobs at the end of August 2019.