Description
Running Spark jobs should be killed when a query is cancelled. When a query is cancelled, Driver.close calls Driver.releaseDriverContext, which calls DriverContext.shutdown, which in turn invokes shutdown() on every running task:
public synchronized void shutdown() {
  LOG.debug("Shutting down query " + ctx.getCmd());
  shutdown = true;
  for (TaskRunner runner : running) {
    if (runner.isRunning()) {
      Task<?> task = runner.getTask();
      LOG.warn("Shutting down task : " + task);
      try {
        task.shutdown();
      } catch (Exception e) {
        console.printError("Exception on shutting down task " + task.getId() + ": " + e);
      }
      Thread thread = runner.getRunner();
      if (thread != null) {
        thread.interrupt();
      }
    }
  }
  running.clear();
}
However, SparkTask does not override the shutdown method to kill its running Spark job, so the Spark job may keep running after the query is cancelled. Killing the Spark job in SparkTask.shutdown would free cluster resources.
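A minimal sketch of the proposed fix, using simplified stand-in classes rather than Hive's real Task, SparkTask, and Spark job-handle APIs (all names below are illustrative assumptions, not the actual Hive code):

```java
// Illustrative sketch only: FakeJobHandle, Task, and SparkTask here are
// simplified stand-ins for Hive's real classes and the Spark job handle.
public class SparkShutdownSketch {

    // Stand-in for a handle to a submitted Spark job.
    static class FakeJobHandle {
        volatile boolean cancelled = false;

        // In real Hive-on-Spark code this would cancel the remote job,
        // e.g. via the job handle or the Spark client.
        void cancel() {
            cancelled = true;
        }
    }

    // Stand-in for Hive's Task base class, whose shutdown() is a no-op
    // unless a subclass overrides it.
    static abstract class Task {
        public void shutdown() { /* default: no-op */ }
    }

    // Sketch of a SparkTask that overrides shutdown() to kill its job,
    // mirroring what the description proposes.
    static class SparkTask extends Task {
        private final FakeJobHandle jobHandle;

        SparkTask(FakeJobHandle jobHandle) {
            this.jobHandle = jobHandle;
        }

        @Override
        public void shutdown() {
            super.shutdown();
            if (jobHandle != null) {
                // Cancel the running Spark job so cluster resources are freed
                // when DriverContext.shutdown() iterates over running tasks.
                jobHandle.cancel();
            }
        }
    }

    public static void main(String[] args) {
        FakeJobHandle handle = new FakeJobHandle();
        SparkTask task = new SparkTask(handle);
        task.shutdown();
        System.out.println("job cancelled: " + handle.cancelled);
    }
}
```

With this override in place, the existing DriverContext.shutdown loop shown above would reach the Spark job through task.shutdown() without any further changes on the driver side.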
Attachments
Issue Links
- relates to SPARK-21433: Spark SQL should support higher version of Hive metastore (Resolved)