Spark / SPARK-28684 Hive module support JDK 11 / SPARK-28708

IsolatedClientLoader will not load hive classes from application jars


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.0
    • Fix Version/s: 3.0.0
    • Component/s: SQL
    • Labels: None

    Description

      How to reproduce this issue:

      export JAVA_HOME="/usr/lib/jdk-11.0.3"
      build/sbt "hive/test-only *.HiveSparkSubmitSuite" -Phive -Phadoop-3.2
      
      [info] - SPARK-18989: DESC TABLE should not fail with format class not found *** FAILED *** (9 seconds, 927 milliseconds)
      [info]   spark-submit returned with exit code 1.
      [info]   Command line: './bin/spark-submit' '--class' 'org.apache.spark.sql.hive.SPARK_18989_CREATE_TABLE' '--name' 'SPARK-18947' '--master' 'local-cluster[2,1,1024]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--jars' '/root/.m2/repository/org/apache/hive/hive-contrib/2.3.6-SNAPSHOT/hive-contrib-2.3.6-SNAPSHOT.jar' 'file:/root/opensource/spark/target/tmp/spark-36d27542-7b82-4962-a362-bb51ef3e457d/testJar-1565682620744.jar'
      [info]
      [info]   2019-08-13 00:50:22.073 - stderr> WARNING: An illegal reflective access operation has occurred
      [info]   2019-08-13 00:50:22.073 - stderr> WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/root/opensource/spark/common/unsafe/target/scala-2.12/classes/) to constructor java.nio.DirectByteBuffer(long,int)
      [info]   2019-08-13 00:50:22.073 - stderr> WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
      [info]   2019-08-13 00:50:22.073 - stderr> WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
      [info]   2019-08-13 00:50:22.073 - stderr> WARNING: All illegal access operations will be denied in a future release
      [info]   2019-08-13 00:50:28.31 - stderr> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/metadata/HiveException
      [info]   2019-08-13 00:50:28.31 - stderr> 	at java.base/java.lang.Class.getDeclaredConstructors0(Native Method)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at java.base/java.lang.Class.privateGetDeclaredConstructors(Class.java:3138)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at java.base/java.lang.Class.getConstructors(Class.java:1944)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:294)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:410)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:305)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:68)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:67)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:221)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:99)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:221)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:139)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:129)
      [info]   2019-08-13 00:50:28.31 - stderr> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:42)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$catalog$1(HiveSessionStateBuilder.scala:57)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:91)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:91)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.databaseExists(SessionCatalog.scala:244)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.requireDbExists(SessionCatalog.scala:178)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:317)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.command.CreateTableCommand.run(tables.scala:132)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:213)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3431)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$4(SQLExecution.scala:100)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:87)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3427)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:213)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:95)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:653)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.hive.SPARK_18989_CREATE_TABLE$.main(HiveSparkSubmitSuite.scala:829)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.hive.SPARK_18989_CREATE_TABLE.main(HiveSparkSubmitSuite.scala)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:920)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:179)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:202)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:89)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:999)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1008)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      [info]   2019-08-13 00:50:28.311 - stderr> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.metadata.HiveException
      [info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:588)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.doLoadClass(IsolatedClientLoader.scala:250)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.loadClass(IsolatedClientLoader.scala:239)
      [info]   2019-08-13 00:50:28.311 - stderr> 	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
      [info]   2019-08-13 00:50:28.311 - stderr> 	... 48 more
      [info]   2019-08-13 00:50:29.183 - stderr> java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@206c1dd rejected from java.util.concurrent.ThreadPoolExecutor@41b0e33e[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 0]
      [info]   2019-08-13 00:50:29.184 - stderr> 	at java.base/java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2055)
      [info]   2019-08-13 00:50:29.184 - stderr> 	at java.base/java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:825)
      [info]   2019-08-13 00:50:29.184 - stderr> 	at java.base/java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1355)
      [info]   2019-08-13 00:50:29.184 - stderr> 	at java.base/java.util.concurrent.Executors$DelegatedExecutorService.execute(Executors.java:687)
      [info]   2019-08-13 00:50:29.184 - stderr> 	at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.Promise.complete(Promise.scala:53)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.Promise.complete$(Promise.scala:52)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:874)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:872)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.Promise.complete(Promise.scala:53)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.Promise.complete$(Promise.scala:52)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
      [info]   2019-08-13 00:50:29.185 - stderr> 	at java.base/java.lang.Thread.run(Thread.java:834) (SparkSubmitTestUtils.scala:94)
      
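      The failure above boils down to class-loader visibility: the isolated Hive client loader resolves classes only from its own jar list, so a Hive class that is present only on the application classpath (e.g. passed via --jars) ends up as a ClassNotFoundException. The following is a minimal standalone sketch of that symptom, not Spark's actual IsolatedClientLoader code; the object name IsolationSketch and the empty jar list are illustrative assumptions.

      import java.net.{URL, URLClassLoader}

      // Sketch only: an "isolated" loader that searches just its own URLs and
      // never delegates to the application class loader. Here the URL list is
      // deliberately empty to stand in for a client classpath that is missing
      // the Hive classes.
      object IsolationSketch {
        def main(args: Array[String]): Unit = {
          val isolatedJars: Array[URL] = Array.empty

          // A null parent means the bootstrap loader, so nothing from the
          // application classpath is visible through this loader.
          val parent: ClassLoader = null
          val isolatedLoader = new URLClassLoader(isolatedJars, parent)

          try {
            isolatedLoader.loadClass("org.apache.hadoop.hive.ql.metadata.HiveException")
          } catch {
            case e: ClassNotFoundException =>
              // Mirrors the ClassNotFoundException in the stack trace above.
              println(s"Not visible to the isolated loader: ${e.getMessage}")
          }
        }
      }

      In the reported scenario the same thing happens inside IsolatedClientLoader's doLoadClass path: the Hive classes supplied through the application's --jars are not on the isolated loader's search path, so createClient fails with the NoClassDefFoundError shown in the log.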

      Attachments

      Issue Links

      Activity

      People

        Assignee: yumwang Yuming Wang
        Reporter: yumwang Yuming Wang
        Votes: 0
        Watchers: 2

      Dates

        Created:
        Updated:
        Resolved: