Spark / SPARK-2242

Running sc.parallelize(..).count() hangs pyspark


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 1.1.0
    • Fix Version/s: 1.1.0
    • Component/s: PySpark
    • Labels: None

    Description

      Running the following code hangs pyspark in a shell:

      sc.parallelize(range(100), 100).count()
      

      It happens in the master branch but not in branch-1.0, and it seems to happen only in a pyspark shell. andrewor14 helped confirm this bug.
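
      For reference, a standalone reproduction might look like the sketch below. The SparkContext setup (local master, app name) is an assumption and not part of the report; since the report says the hang only shows up in a pyspark shell, the same code could complete normally when run as a submitted script.

      # Hypothetical reproduction script; master URL and app name are assumptions.
      from pyspark import SparkContext

      sc = SparkContext("local[2]", "SPARK-2242-repro")

      # The one-liner from the report: 100 elements spread over 100 partitions.
      print(sc.parallelize(range(100), 100).count())

      sc.stop()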


People

    Assignee: Andrew Or (andrewor14)
    Reporter: Xiangrui Meng (mengxr)
    Votes: 0
    Watchers: 4
