Spark / SPARK-4384

Too many open files during sort in pyspark


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 1.2.0
    • Fix Version/s: 1.2.0
    • Component/s: PySpark
    • Labels: None

      Description

      Reported on the mailing list:

      On Thu, Nov 13, 2014 at 11:28 AM, santon <steven.m.anton@gmail.com> wrote:
      > Thanks for the thoughts. I've been testing on Spark 1.1 and haven't seen the
      > IndexError yet. I've run into some other errors ("too many open files"), but
      > these issues seem to have been discussed already. The dataset, by the way,
      > was about 40 Gb and 188 million lines; I'm running a sort on 3 worker nodes
      > with a total of about 80 cores.
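      The "too many open files" failure is characteristic of an external merge sort: when the data doesn't fit in memory, sorted chunks are spilled to disk, and the final merge phase holds one open file handle per spill simultaneously. With enough spills, the process exceeds the OS file-descriptor limit (`ulimit -n`). The following is a minimal pure-Python sketch of that pattern, illustrating the mechanism only; it is not Spark's shuffle code, and the function names are hypothetical:

      ```python
      import heapq
      import os
      import tempfile

      def _spill(sorted_chunk):
          # Write one sorted chunk to a temporary spill file (hypothetical helper).
          fd, path = tempfile.mkstemp()
          with os.fdopen(fd, "w") as f:
              f.writelines(line + "\n" for line in sorted_chunk)
          return path

      def external_sort(records, chunk_size, out_path):
          # Phase 1: spill sorted chunks to disk, one file per chunk.
          spill_paths = []
          chunk = []
          for rec in records:
              chunk.append(rec)
              if len(chunk) >= chunk_size:
                  spill_paths.append(_spill(sorted(chunk)))
                  chunk = []
          if chunk:
              spill_paths.append(_spill(sorted(chunk)))

          # Phase 2: k-way merge. Every spill file is open at the same
          # time here, so the spill count is bounded by `ulimit -n`.
          files = [open(p) for p in spill_paths]
          try:
              with open(out_path, "w") as out:
                  for line in heapq.merge(*files):
                      out.write(line)
          finally:
              for f in files:
                  f.close()
              for p in spill_paths:
                  os.remove(p)
      ```

      With many partitions and cores per node, each task can spill many files, multiplying the open handles per worker; raising the descriptor limit or merging spills in batches are the usual mitigations.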


            People

            • Assignee:
              Davies Liu (davies)
            • Reporter:
              Davies Liu (davies)
            • Votes: 0
            • Watchers: 3
