Apache Sedona / SEDONA-211

Enforce release managers to use JDK 8


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.3.0
    • Fix Version/s: 1.3.1

    Description

      When an app built with Sedona 1.3.0-incubating is deployed to AWS EMR Serverless (which only supports Java 8), executors that indirectly use ShapeSerde (e.g., when Kryo serializes a geometry column) crash with the following error:

      java.lang.NoSuchMethodError: java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;
      at org.apache.sedona.core.formatMapper.shapefileParser.parseUtils.shp.ShapeSerde.putHeader(ShapeSerde.java:163)
      at org.apache.sedona.core.formatMapper.shapefileParser.parseUtils.shp.ShapeSerde.serialize(ShapeSerde.java:149)
      at org.apache.sedona.core.formatMapper.shapefileParser.parseUtils.shp.ShapeSerde.serialize(ShapeSerde.java:73)
      at org.apache.sedona.core.geometryObjects.GeometrySerde.writeGeometry(GeometrySerde.java:104)
      at org.apache.sedona.core.geometryObjects.GeometrySerde.write(GeometrySerde.java:72)
      at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:651)
      at com.twitter.chill.SomeSerializer.write(SomeSerializer.scala:21)
      at com.twitter.chill.SomeSerializer.write(SomeSerializer.scala:19)
      at com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:575)
      at com.esotericsoftware.kryo.serializers.ObjectField.write(ObjectField.java:79)
      at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:508)
      at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:651)
      at org.apache.spark.serializer.KryoSerializationStream.writeObject(KryoSerializer.scala:270)
      at org.apache.spark.serializer.SerializationStream.writeValue(Serializer.scala:145)
      at org.apache.spark.serializer.DefaultSerializedRecordAppender.process(SerializedRecordAppender.scala:44)
      at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.insertRecordIntoSorter(UnsafeShuffleWriter.java:242)
      at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.write(UnsafeShuffleWriter.java:184)
      at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
      at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
      at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
      at org.apache.spark.scheduler.Task.run(Task.scala:138)
      at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
      at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1516)
      at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      at java.lang.Thread.run(Thread.java:750)
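
      For context, the failure comes from a covariant return type that Java 9 added to the NIO buffers: on Java 9+, ByteBuffer.position(int) returns ByteBuffer, while on Java 8 the only matching method is the inherited Buffer.position(int), which returns Buffer. The sketch below is illustrative only (it is not the actual ShapeSerde code); it shows the mismatch and the common cast-to-Buffer source-level workaround:

      import java.nio.Buffer;
      import java.nio.ByteBuffer;

      public class ByteBufferCompat {
          // Compiled on JDK 9+ with only -source 8 -target 8, this call is emitted
          // against ByteBuffer.position(I)Ljava/nio/ByteBuffer; (a covariant
          // override added in Java 9) and throws NoSuchMethodError on Java 8.
          static void newerJdkDescriptor(ByteBuffer buffer, int offset) {
              buffer.position(offset);
          }

          // Casting to Buffer pins the call to Buffer.position(I)Ljava/nio/Buffer;,
          // which exists on Java 8, regardless of the JDK used to build.
          static void java8SafeDescriptor(ByteBuffer buffer, int offset) {
              ((Buffer) buffer).position(offset);
          }
      }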

      From Spark's environment view:

      Java Home: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.342.b07-1.amzn2.0.1.x86_64/jre
      Java Version: 1.8.0_342 (Red Hat, Inc.)
      Scala Version: 2.12.15

      This appears to be caused by maven-compiler-plugin being configured with source and target set to 8: when the release build runs on a newer JDK, those options do not restrict which platform APIs the compiler links against, so the bytecode ends up referencing the ByteBuffer.position(int) override that only exists on Java 9+. Perhaps the release option, which compiles against the Java 8 API signatures, should be used instead?
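
      As an illustration only (not the project's actual pom), the suggested change could look roughly like this in the maven-compiler-plugin configuration; note that the release option requires building on JDK 9+ and a reasonably recent plugin version:

      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <configuration>
          <!-- Instead of <source>8</source> and <target>8</target>, which still let
               the compiler link against the build JDK's own class library: -->
          <release>8</release>
          <!-- With release set to 8, javac compiles against the Java 8 platform
               APIs, so a reference to a JDK 9+ method fails at build time instead
               of surfacing at runtime as a NoSuchMethodError. -->
        </configuration>
      </plugin>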



People

    • Assignee: Unassigned
    • Reporter: Seth Fitzsimmons (mojodna)
    • Votes: 0
    • Watchers: 3


Time Tracking

    • Original Estimate: Not Specified
    • Remaining Estimate: 0h
    • Time Spent: 40m