Details
- Type: Improvement
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Fix Version/s: v2.1.0
- Component/s: None
Description
Currently, when we discard a Spark cubing job, the Spark job keeps running on the cluster, and when we restart the JobServer, the SparkExecutable submits a new Spark job. We should handle the Spark job the same way we handle MR jobs, i.e. kill the underlying application when the job is discarded.
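A minimal sketch of the idea (not the actual patch): when the cubing job is discarded, look up the YARN application id recorded at submit time and kill it through the YARN client API, mirroring what is already done for MR jobs. The class and method names below (SparkJobKiller, killSparkApplication) and the example application id are hypothetical; in Kylin the logic would presumably live in SparkExecutable's kill/discard path.

{code:java}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.api.records.ApplicationId;
import org.apache.hadoop.yarn.client.api.YarnClient;

/**
 * Sketch: kill the underlying YARN application when a cubing job is
 * discarded, so the Spark job does not keep running and is not
 * re-submitted after a JobServer restart.
 */
public class SparkJobKiller {

    /**
     * @param appIdStr the YARN application id recorded when the Spark job
     *                 was submitted, e.g. "application_1525872123_0042"
     *                 (hypothetical example value)
     */
    public static void killSparkApplication(String appIdStr) {
        YarnClient yarnClient = YarnClient.createYarnClient();
        yarnClient.init(new Configuration());
        yarnClient.start();
        try {
            // ApplicationId.fromString requires Hadoop 2.8+;
            // older versions would use ConverterUtils.toApplicationId
            ApplicationId appId = ApplicationId.fromString(appIdStr);
            yarnClient.killApplication(appId);
        } catch (Exception e) {
            throw new RuntimeException("Failed to kill YARN application " + appIdStr, e);
        } finally {
            yarnClient.stop();
        }
    }
}
{code}

This assumes the application id is persisted in the step's output when the Spark job is submitted, so it is still available after a JobServer restart.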
Issue Links
- is duplicated by: KYLIN-3381 Stop/abort cubing job doesn't stop Spark job (Closed)