Details
- Type: Improvement
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Fix Version: 3.0.1-incubating
- Labels: None
Description
To spare users the overhead of recreating Spark contexts between jobs, we can provide an option to keep a context alive between jobs within a given JVM. This would also allow multiple SparkGraphComputers to run simultaneously while sharing a single context.
I propose to enable this via a new property: gremlin.spark.persistContext (default: false, which preserves the old behavior).
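For illustration, here is a minimal sketch of how the property might be set from the Java API, assuming standard GraphFactory/Hadoop-Gremlin configuration. The class name, the spark.master value, and the omission of the input/output format keys are assumptions made here for brevity, not part of the proposal:

```java
import org.apache.commons.configuration.BaseConfiguration;
import org.apache.commons.configuration.Configuration;
import org.apache.tinkerpop.gremlin.process.computer.GraphComputer;
import org.apache.tinkerpop.gremlin.spark.process.computer.SparkGraphComputer;
import org.apache.tinkerpop.gremlin.structure.Graph;
import org.apache.tinkerpop.gremlin.structure.util.GraphFactory;

public class PersistContextSketch {
    public static void main(String[] args) {
        Configuration conf = new BaseConfiguration();
        // Standard Hadoop-Gremlin graph; input/output format keys omitted for brevity.
        conf.setProperty("gremlin.graph",
                "org.apache.tinkerpop.gremlin.hadoop.structure.HadoopGraph");
        conf.setProperty("spark.master", "local[4]");
        // The proposed flag: keep the SparkContext alive after a job completes
        // so later jobs in this JVM can reuse it (default: false = old behavior).
        conf.setProperty("gremlin.spark.persistContext", true);

        Graph graph = GraphFactory.open(conf);
        // Attach a VertexProgram/MapReduce and submit() as usual; with
        // persistContext=true the underlying SparkContext survives the job,
        // and a second SparkGraphComputer in this JVM would share it.
        GraphComputer computer = graph.compute(SparkGraphComputer.class);
    }
}
```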