
BEAM-12915: No parallelism when using SDFBoundedSourceReader with Flink

Details

    • Type: Bug
    • Status: Open
    • Priority: P3
    • Resolution: Unresolved
    • Affects Version/s: 2.32.0
    • Fix Version/s: None
    • Component/s: runner-flink
    • Labels: None

    Description

      Background: I am using TFX pipelines with Flink as the runner for Beam (a Flink session cluster deployed via flink-on-k8s-operator). The Flink cluster has 2 taskmanagers with 16 cores each, and parallelism is set to 32. TFX components call beam.io.ReadFromTFRecord to load data, passing in a glob file pattern. I have a dataset of TFRecords split across 160 files. When I run the component, processing for all 160 files ends up in a single subtask in Flink, i.e. the effective parallelism is 1 (see the attached screenshots).

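      For reference, a minimal sketch of the read path in question (the file pattern, Flink master address, and option values below are hypothetical placeholders; only ReadFromTFRecord and the runner/parallelism settings reflect the setup described above):

      import apache_beam as beam
      from apache_beam.options.pipeline_options import PipelineOptions

      # Hypothetical sketch of the affected read path. The file pattern and
      # Flink master address are placeholders, not values from this report.
      options = PipelineOptions([
          "--runner=FlinkRunner",
          "--flink_master=flink-jobmanager:8081",  # hypothetical address
          "--parallelism=32",
      ])

      with beam.Pipeline(options=options) as p:
          _ = (
              p
              | "ReadTFRecords" >> beam.io.ReadFromTFRecord("gs://my-bucket/data/part-*.tfrecord")
              | "CountRecords" >> beam.combiners.Count.Globally()
          )
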
      I have tried all manner of Beam/Flink options and different versions of Beam and Flink, but the behaviour remains the same.

      Furthermore, the behaviour affects anything that uses apache_beam.io.iobase.SDFBoundedSourceReader; for example, apache_beam.io.parquetio.ReadFromParquet has the same issue. Either I'm missing some obscure setting in my configuration, or this is a bug in the Flink runner.
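
      A corresponding sketch with ReadFromParquet, which the report says goes through the same SDFBoundedSourceReader-based expansion (the path and options are again placeholders):

      import apache_beam as beam
      from apache_beam.io.parquetio import ReadFromParquet
      from apache_beam.options.pipeline_options import PipelineOptions

      # Same hypothetical options as the sketch above; only the source changes.
      # The path is a placeholder. Requires pyarrow for the Parquet source.
      options = PipelineOptions(["--runner=FlinkRunner", "--parallelism=32"])

      with beam.Pipeline(options=options) as p:
          _ = p | "ReadParquet" >> ReadFromParquet("gs://my-bucket/data/*.parquet")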

      Attachments

          People

            Assignee: Unassigned
            Reporter: Rogan Morrow (roganmorrow)
            Votes: 2
            Watchers: 5
