Details
- Type: New Feature
- Status: In Progress
- Priority: Major
- Resolution: Unresolved
- Affects Version/s: 3.1.2
- Fix Version/s: None
- Component/s: None
Description
Many customers have been asking whether there is a setting they can use to kill idle spark-shell sessions, since they cannot realistically go to each developer's desk and ask them to press Ctrl+D or call exit() when their work is done. Our response so far has been to enable dynamic allocation, which releases idle executors after the configured timeout.
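The dynamic-allocation workaround described above can be sketched as a spark-defaults.conf fragment. These property names exist in Spark; the timeout and executor counts are illustrative values, not recommendations:

```
# Enable dynamic allocation so idle executors are released automatically
spark.dynamicAllocation.enabled                  true
# Shuffle tracking lets dynamic allocation work without the external shuffle service (Spark 3.x)
spark.dynamicAllocation.shuffleTracking.enabled  true
# Release an executor after it has been idle this long (example value)
spark.dynamicAllocation.executorIdleTimeout      60s
# Allow scaling all the way down to zero executors
spark.dynamicAllocation.minExecutors             0
```

Note that even with all executors released, the driver process and the YARN Application Master stay alive, which is exactly the gap this request is about.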
However, this is not always an ideal solution: the shell process itself remains alive, and although the AM occupies only a small amount of resources, someone still has to kill the idle spark-shell manually, either via CM > Applications > spark-shell > Kill or by running 'kill -9' at the OS level. It would be nice to have a property in Spark (and exposed in CM) that terminates idle spark-shell sessions, similar to the idle-session timeout in Beeline, and leave it to the admins to decide whether the idle spark-shell timeout should be one day or one week.