Apache Arrow / ARROW-5086

[Python] Space leak in ParquetFile.read_row_group()

    Description

      I have a code pattern like this:

      import pyarrow.parquet as pq

      # A single reader object is reused for every row group.
      reader = pq.ParquetFile(path)

      for ix in range(0, reader.num_row_groups):
          table = reader.read_row_group(ix, columns=self._columns)
          # operate on table

       

      But it leaks memory over time, only releasing it when the reader object is collected. Here's a workaround:

      num_row_groups = pq.ParquetFile(path).num_row_groups

      # Instantiating a fresh ParquetFile for every row group lets the previous
      # reader be collected, so memory does not accumulate across iterations.
      for ix in range(0, num_row_groups):
          table = pq.ParquetFile(path).read_row_group(ix, columns=self._columns)
          # operate on table

       

      This puts an upper bound on memory usage and is what I'd expect from the code. I also added a gc.collect() call at the end of every loop iteration.

      I charted out memory usage for a small benchmark that just copies a file, one row group at a time, converting to pandas and back to Arrow on the writer path. The black line is the first approach, using a single reader object; the blue line is instantiating a fresh reader in every iteration (see the attached all.png).
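
      For reference, a minimal sketch of such a benchmark using the workaround; the function name copy_by_row_group, the src_path/dst_path arguments, and the absence of a column selection are illustrative assumptions, not the exact script:

      import gc

      import pyarrow as pa
      import pyarrow.parquet as pq

      def copy_by_row_group(src_path, dst_path):
          # Discover the row-group count without keeping a reader alive across iterations.
          num_row_groups = pq.ParquetFile(src_path).num_row_groups
          writer = None
          for ix in range(num_row_groups):
              # Fresh reader per iteration (the workaround described above).
              table = pq.ParquetFile(src_path).read_row_group(ix)
              df = table.to_pandas()            # Arrow -> pandas
              table = pa.Table.from_pandas(df)  # pandas -> Arrow
              if writer is None:
                  writer = pq.ParquetWriter(dst_path, table.schema)
              writer.write_table(table)
              gc.collect()  # explicit collection at the end of every loop
          if writer is not None:
              writer.close()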

      Attachments

        1. all.png (38 kB, Jakub Okoński)
        2. all.png (103 kB, Jakub Okoński)

            People

              Assignee: Wes McKinney (wesm)
              Reporter: Jakub Okoński (farnoy)
              Votes: 0
              Watchers: 4

                Time Tracking

                  Original Estimate: Not Specified
                  Remaining Estimate: 0h
                  Time Spent: 0.5h