(Sample code with a description is available at https://github.com/zdenulo/dataflow_bigquery_error)
I was running a Dataflow job for the first time (with version 2.1.1) to read data from BigQuery, make some modifications, and then write the data to a different table in BigQuery. It worked fine when I ran it locally (on a small subset of the data), but when I tried to run it on Dataflow I got the following exception:
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
(ade3180ffa878a6b): Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 706, in run
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 446, in _load_main_session
File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 247, in load_session
File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
module = unpickler.load()
File "/usr/lib/python2.7/pickle.py", line 858, in load
File "/usr/lib/python2.7/pickle.py", line 1182, in load_append
File "/usr/local/lib/python2.7/dist-packages/apitools/base/protorpclite/messages.py", line 1142, in append
AttributeError: 'FieldList' object has no attribute 'FieldList_field'
In my opinion, it looks like it has something to do with pickling the schema definition for the output table.
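One way to sidestep pickling of the protorpc message objects (a sketch of a possible workaround, not necessarily the fix used in the repo above) is to describe the output schema as a plain string instead of building a `TableSchema` with `TableFieldSchema` objects at module level. The BigQuery sink in the Beam Python SDK accepts a schema in the `'name:TYPE,name:TYPE'` string form, and plain strings pickle cleanly when the main session is saved. The field names below are made up for illustration:

```python
# Hypothetical output columns; replace with your table's actual fields.
FIELDS = [
    ("name", "STRING"),
    ("value", "INTEGER"),
    ("created_at", "TIMESTAMP"),
]


def build_schema_string(fields):
    """Build the 'name:TYPE,name:TYPE' schema string accepted by the
    Beam BigQuery sink in place of a TableSchema object.

    Keeping only strings and tuples in module-level state avoids
    pickling protorpc FieldList instances with the main session.
    """
    return ",".join("%s:%s" % (name, field_type) for name, field_type in fields)


SCHEMA = build_schema_string(FIELDS)
```

The pipeline would then pass `schema=SCHEMA` to the BigQuery write transform; since the module's global state now contains only pickle-friendly built-in types, loading the main session on the workers no longer touches `FieldList`.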