Details
- Type: Sub-task
- Status: Closed
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 2.4.1, 2.4.2
- Fix Version/s: None
- Component/s: None
Description
Spark throws the following error while reading big tables from a JDBC data source:
> Code:
sparkSession.read()
    .option("numPartitions", data.numPartitions)
    .option("partitionColumn", data.pk)
    .option("lowerBound", data.min)
    .option("upperBound", data.max)
    .option("queryTimeout", 180)
    .format("jdbc")
    .jdbc(dbURL, tableName, props)
    .repartition(10)
    .write()
    .mode(SaveMode.Overwrite)
    .parquet(tableF.getAbsolutePath());
> Stacktrace:
Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.lang.NoSuchMethodError: sun.nio.ch.DirectBuffer.cleaner()Lsun/misc/Cleaner;
at org.apache.spark.storage.StorageUtils$.cleanDirectBuffer(StorageUtils.scala:212)
at org.apache.spark.storage.StorageUtils$.dispose(StorageUtils.scala:207)
at org.apache.spark.storage.StorageUtils.dispose(StorageUtils.scala)
at org.apache.spark.io.NioBufferedFileInputStream.close(NioBufferedFileInputStream.java:130)
at java.base/java.io.FilterInputStream.close(FilterInputStream.java:180)
at org.apache.spark.io.ReadAheadInputStream.close(ReadAheadInputStream.java:400)
at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillReader.close(UnsafeSorterSpillReader.java:151)
at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillReader.loadNext(UnsafeSorterSpillReader.java:123)
at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillMerger$1.loadNext(UnsafeSorterSpillMerger.java:82)
at org.apache.spark.sql.execution.UnsafeExternalRowSorter$1.next(UnsafeExternalRowSorter.java:187)
at org.apache.spark.sql.execution.UnsafeExternalRowSorter$1.next(UnsafeExternalRowSorter.java:174)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:149)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
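
Root cause: JDK 9 moved sun.misc.Cleaner to jdk.internal.ref.Cleaner, so sun.nio.ch.DirectBuffer.cleaner() no longer returns the type Spark 2.4 was compiled against, and StorageUtils.cleanDirectBuffer fails with NoSuchMethodError when run on JDK 11. The proper fix is tracked in the linked SPARK-24421. The sketch below is only an illustration of the workaround idea (not the actual Spark patch): freeing a direct buffer on both JDK 8 and JDK 9+ by resolving the cleaner entirely through reflection, so no sun.misc.Cleaner reference is baked into the compiled bytecode.

import java.lang.reflect.Field;
import java.lang.reflect.Method;
import java.nio.ByteBuffer;

public final class DirectBufferCleaner {

    // Frees the native memory behind a direct ByteBuffer without a
    // compile-time dependency on sun.misc.Cleaner. Hypothetical helper,
    // shown only to explain why the JDK 11 call site breaks.
    public static void clean(ByteBuffer buffer) throws Exception {
        if (buffer == null || !buffer.isDirect()) {
            return;
        }
        Class<?> unsafeClass = Class.forName("sun.misc.Unsafe");
        Field theUnsafe = unsafeClass.getDeclaredField("theUnsafe");
        theUnsafe.setAccessible(true);
        Object unsafe = theUnsafe.get(null);
        try {
            // JDK 9+: Unsafe.invokeCleaner(ByteBuffer) hides the internal
            // Cleaner type behind a stable entry point.
            Method invokeCleaner =
                unsafeClass.getMethod("invokeCleaner", ByteBuffer.class);
            invokeCleaner.invoke(unsafe, buffer);
        } catch (NoSuchMethodException e) {
            // JDK 8: fall back to DirectBuffer.cleaner().clean(), resolved
            // reflectively so nothing links against sun.misc.Cleaner.
            Method cleanerMethod = buffer.getClass().getMethod("cleaner");
            cleanerMethod.setAccessible(true);
            Object cleaner = cleanerMethod.invoke(buffer);
            Method cleanMethod = cleaner.getClass().getMethod("clean");
            cleanMethod.setAccessible(true);
            cleanMethod.invoke(cleaner);
        }
    }
}

On JDK 9+ this goes through sun.misc.Unsafe.invokeCleaner(ByteBuffer), which was added precisely so callers need not reference the internal Cleaner type; the JDK 8 fallback branch is never reached on JDK 11.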
Issue Links
- duplicates: SPARK-24421 Accessing sun.misc.Cleaner in JDK11 (Resolved)
- is duplicated by: SPARK-27585 No such method error (sun.nio.ch.DirectBuffer.cleaner()) (Closed)