Beam / BEAM-11134

Using WriteToBigQuery FILE_LOADS in a streaming pipeline does not delete temp tables

Details

    • Type: Bug
    • Status: Triage Needed
    • Priority: P3
    • Resolution: Unresolved
    • Affects Version/s: 2.24.0
    • Fix Version/s: None
    • Component/s: io-py-gcp
    • Environment: Running on DataflowRunner on GCP Dataflow.

    Description

      When using the FILE_LOADS method in WriteToBigQuery, the pipeline initially appears to work: load jobs are issued and, at least sometimes, they succeed and the data lands in the correct destination tables.

      However, the temporary tables that were created are never deleted. Often the data is never even copied from the temp tables to the destination.

      In the code (https://github.com/apache/beam/blob/aca9099acca969dc217ab183782e5270347cd354/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py#L846), it appears that after the load jobs, Beam should wait for them to finish, then copy the data from the temp tables to the destination and delete the temp tables; however, when used in a streaming pipeline, it does not seem to complete these steps.

       

      In case it's not clear, this is for the Python SDK.
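      To make the report concrete, below is a minimal sketch of the kind of streaming pipeline that exhibits this behaviour. The project, dataset, table, subscription and schema names are placeholders rather than the actual pipeline's values; the relevant part is method=FILE_LOADS combined with streaming=True and a triggering_frequency.

      import apache_beam as beam
      from apache_beam.options.pipeline_options import PipelineOptions
      from apache_beam.io.gcp.bigquery import WriteToBigQuery, BigQueryDisposition

      # Placeholder names; any streaming source feeding FILE_LOADS shows the issue.
      options = PipelineOptions(streaming=True)

      with beam.Pipeline(options=options) as p:
          (p
           | "Read" >> beam.io.ReadFromPubSub(
                 subscription="projects/my-project/subscriptions/my-sub")
           | "Parse" >> beam.Map(lambda msg: {"value": msg.decode("utf-8")})
           | "Write" >> WriteToBigQuery(
                 table="my-project:my_dataset.my_table",
                 schema="value:STRING",
                 method=WriteToBigQuery.Method.FILE_LOADS,
                 triggering_frequency=300,  # issue a load job every 5 minutes
                 create_disposition=BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=BigQueryDisposition.WRITE_APPEND))

      With a pipeline along these lines, the load jobs run and the rows (sometimes) reach the destination table, but the temp tables created by the load step accumulate in the dataset instead of being copied from and deleted.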

       

      For reference: https://stackoverflow.com/questions/64526500/using-writetobigquery-file-loads-in-a-streaming-pipeline-just-creates-a-lot-of-t/64543619#64543619
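      As a stopgap, the leftover temp tables can be cleaned up out of band with the BigQuery client. This is only a hypothetical sketch: the project, dataset and especially the table-name prefix are assumptions, so check the actual names of the leftover tables in the dataset before deleting anything.

      from google.cloud import bigquery

      # Hypothetical cleanup script; project, dataset and the name prefix
      # are assumptions and should be verified against the real dataset.
      client = bigquery.Client(project="my-project")
      TEMP_TABLE_PREFIX = "beam_load_"  # assumed prefix of the leftover temp tables

      for table in client.list_tables("my-project.my_dataset"):
          if table.table_id.startswith(TEMP_TABLE_PREFIX):
              client.delete_table(table.reference)
              print("Deleted", table.table_id)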


          People

              Assignee: Unassigned
              Reporter: Luke Kavenagh (lkavenagh)
              Votes: 0
              Watchers: 3
