During the graph construction phase, the given SDK generates an initial
execution graph for the program. At execution time, this graph is
executed, either locally or by a service. Currently, Beam only supports
parameterization at graph construction time. By contrast, both Flink and
Spark supply functionality that allows a pre-compiled job to be run with
updated runtime parameters, without any SDK interaction.
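The distinction can be sketched with a hypothetical deferred-value wrapper (this is an illustration, not Beam's actual API; the class name `RuntimeValue` and its methods are invented here): a construction-time parameter is baked into the graph when it is built, while a runtime parameter is only a placeholder until the runner resolves it at execution time.

```python
class RuntimeValue:
    """Hypothetical deferred value: declared at graph construction time,
    resolved only when the graph actually executes (a sketch, not Beam's API)."""

    def __init__(self, name, default=None):
        self.name = name
        self.default = default
        self._resolved = default
        self.is_set = default is not None

    def set(self, value):
        # Called by the runner at execution time with the submitted parameter.
        self._resolved = value
        self.is_set = True

    def get(self):
        # Called by pipeline steps while the graph is running.
        if not self.is_set:
            raise RuntimeError(
                f"'{self.name}' is only available at execution time")
        return self._resolved


# Graph construction: the value is referenced, but must not be read yet.
input_path = RuntimeValue("input_path")

# Execution time: the runner injects the parameter, then steps may read it.
input_path.set("gs://bucket/data.txt")
print(input_path.get())
```

A pre-compiled graph that only holds such placeholders can be re-run with different parameter values, which is the capability the rest of this proposal aims to add to the Beam model.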
In its current incarnation, Dataflow can read values of PipelineOptions at
job submission time, but this requires the presence of an SDK to properly
encode these values into the job. We would like to build a common layer
into the Beam model so that these dynamic options can be properly provided
at execution time. The remainder of this document describes the high-level
model and the specific API proposal.