Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Fix Version: 1.3.1
- Environment: Apache Sedona master branch, Spark version 3.3.1
Description
The following code raises a SerializationException:
from pyspark import StorageLevel
from sedona.core.SpatialRDD import PointRDD, CircleRDD
from sedona.core.enums import FileDataSplitter

input_location = 'data/arealm-small.csv'
point_rdd = PointRDD(sc, input_location, 1, FileDataSplitter.CSV, True, 1, StorageLevel.MEMORY_ONLY)
circle_rdd = CircleRDD(point_rdd, 1.0)
circle_rdd_2 = CircleRDD(point_rdd, 2.0)
circle_rdd_2.rawSpatialRDD = circle_rdd.rawSpatialRDD
circle_rdd_2.analyze()
The stack trace is as follows:
23/03/10 20:12:08 ERROR Executor: Exception in task 0.0 in stage 4.0 (TID 3)
org.apache.sedona.python.wrapper.SerializationException: Can not deserialize object
	at org.apache.sedona.python.wrapper.translation.PythonGeometrySerializer.$anonfun$deserialize$1(PythonGeometrySerializer.scala:58)
	at org.apache.sedona.python.wrapper.translation.PythonRDDToJavaConverter.readGeometry(PythonRDDToJavaConverter.scala:57)
	at org.apache.sedona.python.wrapper.translation.PythonRDDToJavaConverter.$anonfun$translateToJava$1(PythonRDDToJavaConverter.scala:38)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
	at scala.collection.Iterator.foreach(Iterator.scala:943)
	at scala.collection.Iterator.foreach$(Iterator.scala:943)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
	at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
	at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
	at scala.collection.TraversableOnce.aggregate(TraversableOnce.scala:260)
	at scala.collection.TraversableOnce.aggregate$(TraversableOnce.scala:260)
	at scala.collection.AbstractIterator.aggregate(Iterator.scala:1431)
	at org.apache.spark.rdd.RDD.$anonfun$aggregate$2(RDD.scala:1198)
	at org.apache.spark.SparkContext.$anonfun$runJob$6(SparkContext.scala:2322)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:136)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
23/03/10 20:12:08 WARN TaskSetManager: Lost task 0.0 in stage 4.0 (TID 3) (kontinuation executor driver): org.apache.sedona.python.wrapper.SerializationException: Can not deserialize object (stack trace identical to the above)
We found that the deserialization code for Circle objects in the python-adapter does not match the serialization code, which causes this exception: assigning rawSpatialRDD forces a Python-to-Java translation, and the deserializer fails on the Circle geometries produced by the serializer.
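To illustrate this class of bug in isolation (the type tags and field layout below are hypothetical, not Sedona's actual wire format, which lives in PythonGeometrySerializer.scala): a serializer that writes one geometry type the deserializer was never taught to read fails with exactly this kind of "Can not deserialize object" error.

```python
import struct

# Hypothetical type tags -- not Sedona's real wire format.
POINT, CIRCLE = 0, 1

def serialize(geom):
    """Write a one-byte type tag followed by the payload.
    The writer knows both points and circles."""
    if geom["type"] == "point":
        return struct.pack("<bdd", POINT, geom["x"], geom["y"])
    # circle: tag, center x, center y, radius
    return struct.pack("<bddd", CIRCLE, geom["x"], geom["y"], geom["r"])

def deserialize_buggy(buf):
    """A reader that was never updated for the CIRCLE tag: any
    circle payload falls through to the generic failure, mirroring
    the SerializationException in the stack trace above."""
    tag = buf[0]
    if tag == POINT:
        x, y = struct.unpack_from("<dd", buf, 1)
        return {"type": "point", "x": x, "y": y}
    raise ValueError("Can not deserialize object")

# A point round-trips fine; a circle blows up on the read side.
point = {"type": "point", "x": 1.0, "y": 2.0}
assert deserialize_buggy(serialize(point)) == point
```

The fix for such a mismatch is to make the reader's branch-per-tag table agree with the writer's, which is what the resolution in 1.3.1 does for Circle objects.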