
FLINK-8240: Create unified interfaces to configure and instantiate TableSources


Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.5.0
    • Component/s: Table SQL / API
    • Labels: None

    Description

      At the moment, every table source has its own way of being configured and instantiated. Some table sources are tailored to a specific encoding (e.g., KafkaAvroTableSource, KafkaJsonTableSource), while others only support a single encoding for reading (e.g., CsvTableSource). Each of them might implement a builder or support table source converters for external catalogs.

      The table sources should have a unified interface for discovery, defining common properties, and instantiation. The TableSourceConverters provide similar functionality but require an external catalog. We might generalize this interface.
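
      One possible shape for such a unified contract, purely as a sketch for discussion: a factory that can be discovered (e.g., via Java's ServiceLoader mechanism) and that instantiates a table source from a normalized, flat string-property map. All names below (TableSourceFactory, requiredContext, supportedProperties, createTableSource) are hypothetical placeholders, not an agreed design.

      import java.util.List;
      import java.util.Map;

      import org.apache.flink.table.sources.TableSource;

      /**
       * Hypothetical sketch of a unified, discoverable factory.
       * Implementations could be registered as services and matched
       * against the properties of a declared table source.
       */
      public interface TableSourceFactory<T> {

          /** Properties that identify this factory, e.g. connector.type=kafka. */
          Map<String, String> requiredContext();

          /** Property keys this factory understands, e.g. connector.topic. */
          List<String> supportedProperties();

          /** Creates and configures a TableSource from flat string properties. */
          TableSource<T> createTableSource(Map<String, String> properties);
      }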

      In general, a table source declaration depends on the following parts (an illustrative property-based sketch follows the list):

      - Source
        - Type (e.g. Kafka, Custom)
        - Properties (e.g. topic, connection info)
      - Encoding
        - Type (e.g. Avro, JSON, CSV)
        - Schema (e.g. Avro class, JSON field names/types)
      - Rowtime descriptor/Proctime
        - Watermark strategy and Watermark properties
        - Time attribute info
      - Bucketization
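
      As a purely illustrative example (the keys, values, and helper class are made up for this sketch and not a fixed specification), the parts listed above could be normalized into a flat key-value property map that any matching factory can consume:

      import java.util.HashMap;
      import java.util.Map;

      /** Illustrative flattening of a Kafka + JSON declaration into string properties. */
      public class TableSourceDeclarationExample {

          public static void main(String[] args) {
              Map<String, String> properties = new HashMap<>();

              // Source: type and connection properties
              properties.put("connector.type", "kafka");
              properties.put("connector.topic", "sensor-readings");
              properties.put("connector.properties.bootstrap.servers", "localhost:9092");

              // Encoding: type and schema
              properties.put("format.type", "json");
              properties.put("format.fields.0.name", "temperature");
              properties.put("format.fields.0.type", "DOUBLE");
              properties.put("format.fields.1.name", "ts");
              properties.put("format.fields.1.type", "TIMESTAMP");

              // Rowtime: time attribute and watermark strategy
              properties.put("rowtime.timestamps.type", "from-field");
              properties.put("rowtime.timestamps.from", "ts");
              properties.put("rowtime.watermarks.type", "periodic-bounded");
              properties.put("rowtime.watermarks.delay", "5000");

              properties.forEach((key, value) -> System.out.println(key + "=" + value));
          }
      }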
      

      This issue needs a design document before implementation. Any discussion is very welcome.

            People

              Assignee: Timo Walther (twalthr)
              Reporter: Timo Walther (twalthr)