Description
Using a Dataset of 10,000 rows, each containing a null column and an array of 5,000 Ints, I observe the following performance (in local mode). Projecting an extra column alongside explode makes the collect roughly 16x slower:
scala> time(ds.select(explode($"value")).sample(false, 0.0000001, 1).collect)
1.219052 seconds
res9: Array[org.apache.spark.sql.Row] = Array([3761], [3766], [3196])

scala> time(ds.select($"dummy", explode($"value")).sample(false, 0.0000001, 1).collect)
20.219447 seconds
res5: Array[org.apache.spark.sql.Row] = Array([null,3761], [null,3766], [null,3196])
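The transcript above omits how the Dataset and the `time` helper were built. A setup along the following lines reproduces the shape of the data; the object name, the `time` implementation, and the exact Dataset construction are assumptions, not taken from the report:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.explode

// Hypothetical reproduction sketch; the report does not show how `ds`
// or `time` were defined.
object ExplodeRepro {
  // Wall-clock timer matching the "N seconds" lines in the transcript.
  def time[T](body: => T): T = {
    val start = System.nanoTime()
    val result = body
    println(f"${(System.nanoTime() - start) / 1e9}%.6f seconds")
    result
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    // 10,000 rows, each with a null column and an array of 5,000 Ints.
    val ds = spark.range(10000)
      .map(_ => (Option.empty[String], (0 until 5000).toArray))
      .toDF("dummy", "value")

    // Fast: explode alone.
    time(ds.select(explode($"value")).sample(false, 0.0000001, 1).collect())
    // Slow: explode plus an extra projected column.
    time(ds.select($"dummy", explode($"value")).sample(false, 0.0000001, 1).collect())
  }
}
```

The extra projected column forces each output row of the generator to carry the parent row's fields, which is where the observed slowdown comes from per the linked issues.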
Issue Links
- duplicates: SPARK-15214 Implement code generation for Generate (Resolved)
- is related to: SPARK-21657 Spark has exponential time complexity to explode(array of structs) (Resolved)