Details
Type: Bug
Status: Resolved
Priority: Major
Resolution: Fixed
Description
RS_Values(raster, array(st_point(1, 1))) works as expected, but if the point array is serialized (for instance, because of a shuffle) an UnsupportedOperationException is thrown.
Spark has several ArrayData implementations. GenericArrayData is what expressions commonly return; once such an array is serialized, it is converted to an UnsafeArrayData, which throws an exception if its "array" method is called. See https://github.com/apache/spark/blob/94de3ca2942bb04852510abccf06df1fa8b2dab3/sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/UnsafeArrayData.java#L102
The relevant stack trace in Sedona:
Caused by: java.lang.UnsupportedOperationException: Not supported on UnsafeArrayData.
    at org.apache.spark.sql.catalyst.expressions.UnsafeArrayData.array(UnsafeArrayData.java:102)
    at org.apache.spark.sql.sedona_sql.expressions.raster.RS_Values.eval(Functions.scala:897)
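The failure mode can be sketched with simplified mock classes. The names below mirror Spark's ArrayData hierarchy, but these are hypothetical stand-ins, not the real implementations: per-element access works on both representations, while asking the serialized variant for its backing object array throws, as in the stack trace above.

```java
// Simplified stand-ins (NOT Spark's real classes) modeling why calling
// array() on the serialized representation fails while per-element access
// still works.
import java.util.Arrays;

public class ArrayDataDemo {
    abstract static class ArrayData {
        abstract int numElements();
        abstract Object get(int i);
        abstract Object[] array();
    }

    // Object-backed array, as expressions commonly return.
    static class GenericArrayData extends ArrayData {
        private final Object[] values;
        GenericArrayData(Object[] values) { this.values = values; }
        int numElements() { return values.length; }
        Object get(int i) { return values[i]; }
        Object[] array() { return values; }
    }

    // Buffer-backed array produced by serialization; array() is unsupported,
    // mirroring the behavior at UnsafeArrayData.java:102.
    static class UnsafeArrayData extends ArrayData {
        private final Object[] buffer; // stands in for the serialized bytes
        UnsafeArrayData(Object[] buffer) { this.buffer = buffer; }
        int numElements() { return buffer.length; }
        Object get(int i) { return buffer[i]; }
        Object[] array() {
            throw new UnsupportedOperationException("Not supported on UnsafeArrayData.");
        }
    }

    // A robust consumer copies elements via numElements()/get() instead of
    // calling array(), so both implementations work.
    static Object[] toObjectArray(ArrayData data) {
        Object[] out = new Object[data.numElements()];
        for (int i = 0; i < out.length; i++) out[i] = data.get(i);
        return out;
    }

    public static void main(String[] args) {
        Object[] points = {"POINT(1 1)", "POINT(2 2)"};
        System.out.println(Arrays.toString(new GenericArrayData(points).array()));
        System.out.println(Arrays.toString(toObjectArray(new UnsafeArrayData(points))));
        try {
            new UnsafeArrayData(points).array(); // reproduces the reported exception
        } catch (UnsupportedOperationException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

This suggests the expression-side fix is to iterate via numElements()/get() (or copy the elements) rather than calling array() directly.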
It is possible to work around the bug by adding a no-op array operation that converts the UnsafeArrayData back to a GenericArrayData after the shuffle.
expr("RS_Values(raster, points)")
Becomes
expr("RS_Values(raster, filter(points, x -> true))")
The filter function won't change the array's contents; its only purpose is to internally convert the UnsafeArrayData back to a GenericArrayData.
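Conceptually, the identity filter works because Spark's higher-order filter evaluates its predicate per element and materializes the surviving elements into a fresh object-backed array. A minimal sketch of that conversion, using hypothetical mock classes (not Spark's API):

```java
// Mock classes (NOT Spark's real implementations) showing why an identity
// filter works around the bug: every element is kept, but the result is
// rebuilt in a representation on which array() is supported again.
import java.util.Arrays;

public class IdentityFilterDemo {
    // Mock of UnsafeArrayData: per-element access works, array() does not.
    static class UnsafeArray {
        private final Object[] buffer;
        UnsafeArray(Object[] buffer) { this.buffer = buffer; }
        int numElements() { return buffer.length; }
        Object get(int i) { return buffer[i]; }
        Object[] array() {
            throw new UnsupportedOperationException("Not supported on UnsafeArrayData.");
        }
    }

    // Mock of GenericArrayData: backed by a plain Object[].
    static class GenericArray {
        private final Object[] values;
        GenericArray(Object[] values) { this.values = values; }
        Object[] array() { return values; }
    }

    // Mimics filter(points, x -> true): keeps every element but rebuilds
    // the array in the generic, object-backed representation.
    static GenericArray identityFilter(UnsafeArray in) {
        Object[] kept = new Object[in.numElements()];
        for (int i = 0; i < kept.length; i++) kept[i] = in.get(i);
        return new GenericArray(kept);
    }

    public static void main(String[] args) {
        UnsafeArray unsafe = new UnsafeArray(new Object[]{"POINT(1 1)", "POINT(2 2)"});
        GenericArray generic = identityFilter(unsafe);
        // Same elements, but array() now succeeds.
        System.out.println(Arrays.toString(generic.array()));
    }
}
```

The workaround trades a small per-row copy for correctness; a proper fix would make RS_Values itself avoid calling array() on its input.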