Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Not A Problem
- Affects Version/s: 2.1.0
- Fix Version/s: None
- Component/s: None
Description
Currently, Spark configuration cannot be changed dynamically: the Spark job must be killed and resubmitted for a new configuration to take effect.
This issue proposes enhancing Spark so that configuration changes can be applied dynamically, without requiring an application restart.
Example: if the batch interval of a streaming job is 20 seconds and the user wants to reduce it to 5 seconds, this currently requires resubmitting the job.
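As the report notes, the batch interval is fixed when the streaming job is created, so the only way to change it today is to stop the job and start a new one. Below is a minimal sketch of that stop-and-recreate pattern in plain Python; `StreamingJob` and `reconfigure` are hypothetical names for illustration, not Spark APIs:

```python
class StreamingJob:
    """Hypothetical stand-in for a streaming job whose batch interval
    is fixed at construction time, as in Spark Streaming."""

    def __init__(self, batch_interval_s):
        self.batch_interval_s = batch_interval_s
        self.running = True

    def stop(self):
        # Stop processing; the interval cannot be changed in place.
        self.running = False


def reconfigure(job, new_interval_s):
    # The only supported path: stop the old job and create a new one
    # with the desired configuration (analogous to resubmitting).
    job.stop()
    return StreamingJob(new_interval_s)


job = StreamingJob(batch_interval_s=20)
job = reconfigure(job, new_interval_s=5)
print(job.batch_interval_s)  # → 5
```

The enhancement requested in this issue would remove the need for this pattern by letting the running job pick up the new interval without a restart.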