Flink / FLINK-20945

flink hive insert heap out of memory


    Details

    • Type: Improvement
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Table SQL / Ecosystem
    • Labels:
      None
    • Environment:

      flink 1.12.0
      hive-exec 2.3.5

    Description

      When using Flink SQL to insert into Hive from Kafka, heap out-of-memory errors occur randomly.
      The Hive table uses year/month/day/hour as partition columns. The maximum heap space needed appears to correspond to the number of active partitions, which grows when Kafka messages arrive out of order or delayed. As the active partition count increases, so does the required heap space, which can cause the heap to run out of memory.
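The growth described above can be put in rough numbers. The sketch below is back-of-the-envelope arithmetic, not Flink code; the 64 MiB per-partition writer buffer is an illustrative assumption, not a Flink default.

```java
public class PartitionHeapEstimate {

    /** Rough lower bound: each open partition writer keeps its own buffer on the heap. */
    static long estimatedHeapBytes(int activePartitions, long writerBufferBytes) {
        return activePartitions * writerBufferBytes;
    }

    public static void main(String[] args) {
        // With hourly partitions, late/out-of-order Kafka records can keep many
        // hours "active" at once. 48 open partitions at an assumed 64 MiB buffer
        // each already need ~3 GiB of heap for writer buffers alone.
        long mib = estimatedHeapBytes(48, 64L << 20) >> 20;
        System.out.println(mib + " MiB"); // 3072 MiB
    }
}
```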
      When writing records, would it be possible to take total heap usage into account in checkBlockSizeReached, or to use some other method to avoid the OOM?
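One way to picture the suggestion: in addition to the existing per-writer block-size condition, flush when overall JVM heap usage crosses a threshold, so that many open partition writers cannot collectively exhaust the heap. The class below is a hypothetical sketch of that idea, not Flink's actual checkBlockSizeReached implementation; it only relies on the standard java.lang.Runtime memory accessors.

```java
public class HeapAwareFlushCheck {

    private final long blockSizeBytes;
    private final double heapUsageThreshold; // e.g. 0.8 = also flush above 80% heap usage

    public HeapAwareFlushCheck(long blockSizeBytes, double heapUsageThreshold) {
        this.blockSizeBytes = blockSizeBytes;
        this.heapUsageThreshold = heapUsageThreshold;
    }

    /** Hypothetical replacement for a block-size-only check. */
    public boolean shouldFlush(long bufferedBytes) {
        if (bufferedBytes >= blockSizeBytes) {
            return true; // original per-writer condition
        }
        // Additional global condition: flush early when the whole JVM heap is nearly full,
        // regardless of how little this particular writer has buffered.
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory();
        return (double) used / rt.maxMemory() >= heapUsageThreshold;
    }

    public static void main(String[] args) {
        HeapAwareFlushCheck check = new HeapAwareFlushCheck(128L << 20, 0.8);
        System.out.println(check.shouldFlush(200L << 20)); // true: buffer exceeds block size
    }
}
```

A threshold like this would make flushing global rather than per-writer, trading smaller output files for a bounded heap footprint when the active partition count spikes.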

    Attachments

    Activity

    People

    • Assignee:
      Unassigned
    • Reporter:
      bruce-gao (Bruce GAO)
    • Votes:
      1
    • Watchers:
      3

    Dates

    • Created:
    • Updated: