Description
Running the following code hangs PySpark in a shell:
sc.parallelize(range(100), 100).count()
This happens on the master branch but not on branch-1.0, and it appears to occur only in a PySpark shell. andrewor14 helped confirm this bug.
Issue Links
- duplicates: SPARK-2244 pyspark - RDD action hangs (after previously succeeding) (Resolved)
- incorporates: SPARK-1850 Bad exception if multiple jars exist when running PySpark (Closed)
- is duplicated by: SPARK-2244 pyspark - RDD action hangs (after previously succeeding) (Resolved)