Details
Type: Improvement
Status: Resolved
Priority: Major
Resolution: Fixed
Fix Version/s: 4.0.0
Description
```
val rdd1 = spark.sparkContext.parallelize(Seq(1, 2, 3), numSlices = 65536)
val rdd2 = spark.sparkContext.parallelize(Seq(1, 2, 3), numSlices = 65536)
rdd2.cartesian(rdd1).partitions
```
This throws `ArrayIndexOutOfBoundsException: 0` at CartesianRDD.scala:69: the total number of partitions, 65536 * 65536 = 2^32, overflows Int and wraps to 0, so the partition index computed as `s1.index * numPartitionsInRdd2 + s2.index` lands outside the (empty) partitions array. We should provide a better error message that indicates the partition count overflowed, so it is easier for the user to debug.
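As a rough illustration of the overflow and of one possible shape for a clearer error, here is a minimal sketch assuming a pre-check done with Long arithmetic; the object name and the `require` message are illustrative, not the actual fix.

```
// Illustration only: reproduces the Int overflow in the partition-count math
// and sketches a guard that fails with an explicit message instead of an
// ArrayIndexOutOfBoundsException. Names and message text are hypothetical.
object CartesianOverflowSketch {
  def main(args: Array[String]): Unit = {
    val numPartitionsInRdd1 = 65536
    val numPartitionsInRdd2 = 65536

    // 65536 * 65536 = 2^32 wraps to 0 as an Int, so the partitions array
    // would be allocated with length 0.
    println(numPartitionsInRdd1 * numPartitionsInRdd2) // prints 0

    // The largest computed partition index also wraps (to -1 here).
    println((numPartitionsInRdd1 - 1) * numPartitionsInRdd2 +
      (numPartitionsInRdd2 - 1)) // prints -1

    // A possible guard: validate the product with Long arithmetic up front,
    // so the user sees the overflowing partition count in the error message.
    val totalPartitions = numPartitionsInRdd1.toLong * numPartitionsInRdd2
    require(totalPartitions <= Int.MaxValue,
      s"rdd2.cartesian(rdd1) would have $totalPartitions partitions, " +
        s"which exceeds the maximum of ${Int.MaxValue}")
  }
}
```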