Apache Sedona / SEDONA-79

Pure SQL does not work with Thrift Server


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Workaround

    Description

      Current situation

      When I add Sedona SQL support to the Thrift Server on a local Spark cluster, a scala.ScalaReflectionException for the class org.locationtech.jts.geom.Geometry is thrown.

      /opt/spark/bin/spark-submit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 \
          --master $SPARK_MASTER  \
          --hiveconf hive.server2.thrift.port=10001 --hiveconf hive.server2.thrift.bind.host=0.0.0.0 \
          --packages org.apache.sedona:sedona-python-adapter-3.0_2.12:1.1.1-incubating,org.apache.sedona:sedona-viz-3.0_2.12:1.1.1-incubating,org.datasyslab:geotools-wrapper:geotools-24.0 \
          --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
          --conf spark.kryo.registrator=org.apache.sedona.viz.core.Serde.SedonaVizKryoRegistrator \
          --conf spark.sql.extensions=org.apache.sedona.viz.sql.SedonaVizExtensions,org.apache.sedona.sql.SedonaSqlExtensions
      

      The error is not triggered when I launch the server, but only when I connect to it with the Apache Hive JDBC driver (a minimal client is sketched below).
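
      A minimal client sketch (my own illustration, not from this report; it assumes hive-jdbc on the client classpath and the server on localhost:10001 as configured above). Merely opening the session triggers the failure, so the query itself is never reached:

      // Hypothetical minimal JDBC client (assumption: org.apache.hive:hive-jdbc is on the classpath).
      import java.sql.DriverManager

      object ThriftClientRepro {
        def main(args: Array[String]): Unit = {
          Class.forName("org.apache.hive.jdbc.HiveDriver")
          // Opening the session alone is enough to hit the ExceptionInInitializerError in the logs below.
          val conn = DriverManager.getConnection("jdbc:hive2://localhost:10001", "anonymous", "")
          val stmt = conn.createStatement()
          // Pure SQL: the client registers nothing; the UDFs must come from the server-side extensions.
          val rs = stmt.executeQuery("SELECT ST_AsText(ST_Point(1.0, 2.0))")
          while (rs.next()) println(rs.getString(1))
          conn.close()
        }
      }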

      The error does not occur when I use spark-shell with the same parameters; see the comparison sketch below.
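
      For comparison, a minimal check in spark-shell (my own illustration, assuming spark-shell is launched with the identical --packages and --conf flags as the spark-submit command above):

      // In spark-shell the Sedona extensions register cleanly, so a pure-SQL
      // call resolves the geometry UDFs; expected result is POINT (1 2).
      spark.sql("SELECT ST_AsText(ST_Point(1.0, 2.0))").show()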

      Expected situation

      I would expect to be able to start the Thrift Server to expose Sedona SQL on a standalone cluster.

      Logs

      docker-compose up spark-thrift
      WARNING: The Docker Engine you're using is running in swarm mode.
      
      Compose does not use swarm mode to deploy services to multiple nodes in a swarm. All containers will be scheduled on the current node.
      
      To deploy your application across the swarm, use `docker stack deploy`.
      
      spark-dbt_spark-master_1 is up-to-date
      spark-dbt_spark-worker-b_1 is up-to-date
      spark-dbt_spark-worker-a_1 is up-to-date
      Starting spark-dbt_spark-thrift_1 ... done
      Attaching to spark-dbt_spark-thrift_1
      spark-thrift_1    | WARNING: An illegal reflective access operation has occurred
      spark-thrift_1    | WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/opt/spark/jars/spark-unsafe_2.12-3.0.2.jar) to constructor java.nio.DirectByteBuffer(long,int)
      spark-thrift_1    | WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
      spark-thrift_1    | WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
      spark-thrift_1    | WARNING: All illegal access operations will be denied in a future release
      spark-thrift_1    | Ivy Default Cache set to: /root/.ivy2/cache
      spark-thrift_1    | The jars for the packages stored in: /root/.ivy2/jars
      spark-thrift_1    | :: loading settings :: url = jar:file:/opt/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
      spark-thrift_1    | org.apache.sedona#sedona-python-adapter-3.0_2.12 added as a dependency
      spark-thrift_1    | org.apache.sedona#sedona-viz-3.0_2.12 added as a dependency
      spark-thrift_1    | org.datasyslab#geotools-wrapper added as a dependency
      spark-thrift_1    | :: resolving dependencies :: org.apache.spark#spark-submit-parent-56248f53-61ad-4c72-bd1f-c13739ab99c0;1.0
      spark-thrift_1    | 	confs: [default]
      spark-thrift_1    | 	found org.apache.sedona#sedona-python-adapter-3.0_2.12;1.1.1-incubating in central
      spark-thrift_1    | 	found org.locationtech.jts#jts-core;1.18.0 in central
      spark-thrift_1    | 	found org.wololo#jts2geojson;0.16.1 in central
      spark-thrift_1    | 	found com.fasterxml.jackson.core#jackson-databind;2.12.2 in central
      spark-thrift_1    | 	found com.fasterxml.jackson.core#jackson-annotations;2.12.2 in central
      spark-thrift_1    | 	found com.fasterxml.jackson.core#jackson-core;2.12.2 in central
      spark-thrift_1    | 	found org.apache.sedona#sedona-core-3.0_2.12;1.1.1-incubating in central
      spark-thrift_1    | 	found org.scala-lang.modules#scala-collection-compat_2.12;2.5.0 in central
      spark-thrift_1    | 	found org.apache.sedona#sedona-sql-3.0_2.12;1.1.1-incubating in central
      spark-thrift_1    | 	found org.apache.sedona#sedona-viz-3.0_2.12;1.1.1-incubating in central
      spark-thrift_1    | 	found org.beryx#awt-color-factory;1.0.0 in central
      spark-thrift_1    | 	found org.datasyslab#geotools-wrapper;geotools-24.0 in central
      spark-thrift_1    | :: resolution report :: resolve 533ms :: artifacts dl 15ms
      spark-thrift_1    | 	:: modules in use:
      spark-thrift_1    | 	com.fasterxml.jackson.core#jackson-annotations;2.12.2 from central in [default]
      spark-thrift_1    | 	com.fasterxml.jackson.core#jackson-core;2.12.2 from central in [default]
      spark-thrift_1    | 	com.fasterxml.jackson.core#jackson-databind;2.12.2 from central in [default]
      spark-thrift_1    | 	org.apache.sedona#sedona-core-3.0_2.12;1.1.1-incubating from central in [default]
      spark-thrift_1    | 	org.apache.sedona#sedona-python-adapter-3.0_2.12;1.1.1-incubating from central in [default]
      spark-thrift_1    | 	org.apache.sedona#sedona-sql-3.0_2.12;1.1.1-incubating from central in [default]
      spark-thrift_1    | 	org.apache.sedona#sedona-viz-3.0_2.12;1.1.1-incubating from central in [default]
      spark-thrift_1    | 	org.beryx#awt-color-factory;1.0.0 from central in [default]
      spark-thrift_1    | 	org.datasyslab#geotools-wrapper;geotools-24.0 from central in [default]
      spark-thrift_1    | 	org.locationtech.jts#jts-core;1.18.0 from central in [default]
      spark-thrift_1    | 	org.scala-lang.modules#scala-collection-compat_2.12;2.5.0 from central in [default]
      spark-thrift_1    | 	org.wololo#jts2geojson;0.16.1 from central in [default]
      spark-thrift_1    | 	:: evicted modules:
      spark-thrift_1    | 	org.locationtech.jts#jts-core;1.18.1 by [org.locationtech.jts#jts-core;1.18.0] in [default]
      spark-thrift_1    | 	---------------------------------------------------------------------
      spark-thrift_1    | 	|                  |            modules            ||   artifacts   |
      spark-thrift_1    | 	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
      spark-thrift_1    | 	---------------------------------------------------------------------
      spark-thrift_1    | 	|      default     |   13  |   0   |   0   |   1   ||   12  |   0   |
      spark-thrift_1    | 	---------------------------------------------------------------------
      spark-thrift_1    | :: retrieving :: org.apache.spark#spark-submit-parent-56248f53-61ad-4c72-bd1f-c13739ab99c0
      spark-thrift_1    | 	confs: [default]
      spark-thrift_1    | 	0 artifacts copied, 12 already retrieved (0kB/13ms)
      spark-thrift_1    | 22/01/24 20:55:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      spark-thrift_1    | Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
      spark-thrift_1    | 22/01/24 20:55:06 INFO HiveThriftServer2: Started daemon with process name: 7@9d3fb159061a
      spark-thrift_1    | 22/01/24 20:55:06 INFO SignalUtils: Registered signal handler for TERM
      spark-thrift_1    | 22/01/24 20:55:06 INFO SignalUtils: Registered signal handler for HUP
      spark-thrift_1    | 22/01/24 20:55:06 INFO SignalUtils: Registered signal handler for INT
      spark-thrift_1    | 22/01/24 20:55:06 INFO HiveThriftServer2: Starting SparkContext
      spark-thrift_1    | 22/01/24 20:55:06 INFO HiveConf: Found configuration file null
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Running Spark version 3.0.2
      spark-thrift_1    | 22/01/24 20:55:07 INFO ResourceUtils: ==============================================================
      spark-thrift_1    | 22/01/24 20:55:07 INFO ResourceUtils: Resources for spark.driver:
      spark-thrift_1    | 
      spark-thrift_1    | 22/01/24 20:55:07 INFO ResourceUtils: ==============================================================
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Submitted application: SparkSQL::172.28.0.5
      spark-thrift_1    | 22/01/24 20:55:07 INFO SecurityManager: Changing view acls to: root
      spark-thrift_1    | 22/01/24 20:55:07 INFO SecurityManager: Changing modify acls to: root
      spark-thrift_1    | 22/01/24 20:55:07 INFO SecurityManager: Changing view acls groups to: 
      spark-thrift_1    | 22/01/24 20:55:07 INFO SecurityManager: Changing modify acls groups to: 
      spark-thrift_1    | 22/01/24 20:55:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
      spark-thrift_1    | 22/01/24 20:55:07 INFO Utils: Successfully started service 'sparkDriver' on port 34759.
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkEnv: Registering MapOutputTracker
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkEnv: Registering BlockManagerMaster
      spark-thrift_1    | 22/01/24 20:55:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
      spark-thrift_1    | 22/01/24 20:55:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
      spark-thrift_1    | 22/01/24 20:55:07 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-2f049248-da56-420b-b1ec-2c5fe1e924fc
      spark-thrift_1    | 22/01/24 20:55:07 INFO MemoryStore: MemoryStore started with capacity 434.4 MiB
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkEnv: Registering OutputCommitCoordinator
      spark-thrift_1    | 22/01/24 20:55:07 INFO Utils: Successfully started service 'SparkUI' on port 4040.
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkUI: Bound SparkUI to spark-thrift, and started at http://9d3fb159061a:4040
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Added JAR file:///root/.ivy2/jars/org.apache.sedona_sedona-python-adapter-3.0_2.12-1.1.1-incubating.jar at spark://9d3fb159061a:34759/jars/org.apache.sedona_sedona-python-adapter-3.0_2.12-1.1.1-incubating.jar with timestamp 1643057707087
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Added JAR file:///root/.ivy2/jars/org.apache.sedona_sedona-viz-3.0_2.12-1.1.1-incubating.jar at spark://9d3fb159061a:34759/jars/org.apache.sedona_sedona-viz-3.0_2.12-1.1.1-incubating.jar with timestamp 1643057707087
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Added JAR file:///root/.ivy2/jars/org.datasyslab_geotools-wrapper-geotools-24.0.jar at spark://9d3fb159061a:34759/jars/org.datasyslab_geotools-wrapper-geotools-24.0.jar with timestamp 1643057707087
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Added JAR file:///root/.ivy2/jars/org.locationtech.jts_jts-core-1.18.0.jar at spark://9d3fb159061a:34759/jars/org.locationtech.jts_jts-core-1.18.0.jar with timestamp 1643057707087
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Added JAR file:///root/.ivy2/jars/org.wololo_jts2geojson-0.16.1.jar at spark://9d3fb159061a:34759/jars/org.wololo_jts2geojson-0.16.1.jar with timestamp 1643057707087
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Added JAR file:///root/.ivy2/jars/org.apache.sedona_sedona-core-3.0_2.12-1.1.1-incubating.jar at spark://9d3fb159061a:34759/jars/org.apache.sedona_sedona-core-3.0_2.12-1.1.1-incubating.jar with timestamp 1643057707087
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Added JAR file:///root/.ivy2/jars/org.apache.sedona_sedona-sql-3.0_2.12-1.1.1-incubating.jar at spark://9d3fb159061a:34759/jars/org.apache.sedona_sedona-sql-3.0_2.12-1.1.1-incubating.jar with timestamp 1643057707087
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Added JAR file:///root/.ivy2/jars/org.scala-lang.modules_scala-collection-compat_2.12-2.5.0.jar at spark://9d3fb159061a:34759/jars/org.scala-lang.modules_scala-collection-compat_2.12-2.5.0.jar with timestamp 1643057707087
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Added JAR file:///root/.ivy2/jars/com.fasterxml.jackson.core_jackson-databind-2.12.2.jar at spark://9d3fb159061a:34759/jars/com.fasterxml.jackson.core_jackson-databind-2.12.2.jar with timestamp 1643057707087
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Added JAR file:///root/.ivy2/jars/com.fasterxml.jackson.core_jackson-annotations-2.12.2.jar at spark://9d3fb159061a:34759/jars/com.fasterxml.jackson.core_jackson-annotations-2.12.2.jar with timestamp 1643057707087
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Added JAR file:///root/.ivy2/jars/com.fasterxml.jackson.core_jackson-core-2.12.2.jar at spark://9d3fb159061a:34759/jars/com.fasterxml.jackson.core_jackson-core-2.12.2.jar with timestamp 1643057707087
      spark-thrift_1    | 22/01/24 20:55:07 INFO SparkContext: Added JAR file:///root/.ivy2/jars/org.beryx_awt-color-factory-1.0.0.jar at spark://9d3fb159061a:34759/jars/org.beryx_awt-color-factory-1.0.0.jar with timestamp 1643057707087
      spark-thrift_1    | 22/01/24 20:55:07 WARN SparkContext: Please ensure that the number of slots available on your executors is limited by the number of cores to task cpus and not another custom resource. If cores is not the limiting resource then dynamic allocation will not work properly!
      spark-thrift_1    | 22/01/24 20:55:07 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://spark-master:7077...
      spark-thrift_1    | 22/01/24 20:55:07 INFO TransportClientFactory: Successfully created connection to spark-master/172.28.0.2:7077 after 24 ms (0 ms spent in bootstraps)
      spark-thrift_1    | 22/01/24 20:55:08 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20220124205508-0004
      spark-thrift_1    | 22/01/24 20:55:08 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20220124205508-0004/0 on worker-20220124201503-172.28.0.3-7000 (172.28.0.3:7000) with 2 core(s)
      spark-thrift_1    | 22/01/24 20:55:08 INFO StandaloneSchedulerBackend: Granted executor ID app-20220124205508-0004/0 on hostPort 172.28.0.3:7000 with 2 core(s), 1024.0 MiB RAM
      spark-thrift_1    | 22/01/24 20:55:08 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20220124205508-0004/1 on worker-20220124201513-172.28.0.4-7000 (172.28.0.4:7000) with 2 core(s)
      spark-thrift_1    | 22/01/24 20:55:08 INFO StandaloneSchedulerBackend: Granted executor ID app-20220124205508-0004/1 on hostPort 172.28.0.4:7000 with 2 core(s), 1024.0 MiB RAM
      spark-thrift_1    | 22/01/24 20:55:08 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 39909.
      spark-thrift_1    | 22/01/24 20:55:08 INFO NettyBlockTransferService: Server created on 9d3fb159061a:39909
      spark-thrift_1    | 22/01/24 20:55:08 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
      spark-thrift_1    | 22/01/24 20:55:08 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 9d3fb159061a, 39909, None)
      spark-thrift_1    | 22/01/24 20:55:08 INFO BlockManagerMasterEndpoint: Registering block manager 9d3fb159061a:39909 with 434.4 MiB RAM, BlockManagerId(driver, 9d3fb159061a, 39909, None)
      spark-thrift_1    | 22/01/24 20:55:08 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 9d3fb159061a, 39909, None)
      spark-thrift_1    | 22/01/24 20:55:08 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 9d3fb159061a, 39909, None)
      spark-thrift_1    | 22/01/24 20:55:08 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20220124205508-0004/1 is now RUNNING
      spark-thrift_1    | 22/01/24 20:55:08 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20220124205508-0004/0 is now RUNNING
      spark-thrift_1    | 22/01/24 20:55:08 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
      spark-thrift_1    | 22/01/24 20:55:08 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/opt/spark/spark-warehouse').
      spark-thrift_1    | 22/01/24 20:55:08 INFO SharedState: Warehouse path is 'file:/opt/spark/spark-warehouse'.
      spark-thrift_1    | 22/01/24 20:55:08 INFO HiveUtils: Initializing HiveMetastoreConnection version 2.3.7 using Spark classes.
      spark-thrift_1    | 22/01/24 20:55:09 INFO SessionState: Created HDFS directory: /tmp/hive/root/9a5cefa7-7ee5-46cc-a7ff-2b8bb4eb7ac2
      spark-thrift_1    | 22/01/24 20:55:09 INFO SessionState: Created local directory: /tmp/root/9a5cefa7-7ee5-46cc-a7ff-2b8bb4eb7ac2
      spark-thrift_1    | 22/01/24 20:55:09 INFO SessionState: Created HDFS directory: /tmp/hive/root/9a5cefa7-7ee5-46cc-a7ff-2b8bb4eb7ac2/_tmp_space.db
      spark-thrift_1    | 22/01/24 20:55:09 INFO HiveClientImpl: Warehouse location for Hive client (version 2.3.7) is file:/opt/spark/spark-warehouse
      spark-thrift_1    | 22/01/24 20:55:10 WARN HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
      spark-thrift_1    | 22/01/24 20:55:10 WARN HiveConf: HiveConf of name hive.stats.retries.wait does not exist
      spark-thrift_1    | 22/01/24 20:55:10 INFO HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
      spark-thrift_1    | 22/01/24 20:55:10 INFO ObjectStore: ObjectStore, initialize called
      spark-thrift_1    | 22/01/24 20:55:10 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
      spark-thrift_1    | 22/01/24 20:55:10 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
      spark-thrift_1    | 22/01/24 20:55:10 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
      spark-thrift_1    | 22/01/24 20:55:11 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (172.28.0.4:46986) with ID 1
      spark-thrift_1    | 22/01/24 20:55:11 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (172.28.0.3:48170) with ID 0
      spark-thrift_1    | 22/01/24 20:55:11 INFO BlockManagerMasterEndpoint: Registering block manager 172.28.0.4:34799 with 434.4 MiB RAM, BlockManagerId(1, 172.28.0.4, 34799, None)
      spark-thrift_1    | 22/01/24 20:55:11 INFO BlockManagerMasterEndpoint: Registering block manager 172.28.0.3:34961 with 434.4 MiB RAM, BlockManagerId(0, 172.28.0.3, 34961, None)
      spark-thrift_1    | 22/01/24 20:55:12 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
      spark-thrift_1    | 22/01/24 20:55:14 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
      spark-thrift_1    | 22/01/24 20:55:14 INFO ObjectStore: Initialized ObjectStore
      spark-thrift_1    | 22/01/24 20:55:14 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 2.3.0
      spark-thrift_1    | 22/01/24 20:55:14 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore UNKNOWN@172.28.0.5
      spark-thrift_1    | 22/01/24 20:55:14 INFO HiveMetaStore: Added admin role in metastore
      spark-thrift_1    | 22/01/24 20:55:14 INFO HiveMetaStore: Added public role in metastore
      spark-thrift_1    | 22/01/24 20:55:14 INFO HiveMetaStore: No user is added in admin role, since config is empty
      spark-thrift_1    | 22/01/24 20:55:14 INFO HiveMetaStore: 0: get_all_functions
      spark-thrift_1    | 22/01/24 20:55:14 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_all_functions	
      spark-thrift_1    | 22/01/24 20:55:14 INFO HiveMetaStore: 0: get_database: default
      spark-thrift_1    | 22/01/24 20:55:14 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_database: default	
      spark-thrift_1    | 22/01/24 20:55:14 INFO HiveUtils: Initializing execution hive, version 2.3.7
      spark-thrift_1    | 22/01/24 20:55:14 INFO HiveClientImpl: Warehouse location for Hive client (version 2.3.7) is file:/opt/spark/spark-warehouse
      spark-thrift_1    | 22/01/24 20:55:14 INFO SessionManager: Operation log root directory is created: /tmp/root/operation_logs
      spark-thrift_1    | 22/01/24 20:55:14 INFO SessionManager: HiveServer2: Background operation thread pool size: 100
      spark-thrift_1    | 22/01/24 20:55:14 INFO SessionManager: HiveServer2: Background operation thread wait queue size: 100
      spark-thrift_1    | 22/01/24 20:55:14 INFO SessionManager: HiveServer2: Background operation thread keepalive time: 10 seconds
      spark-thrift_1    | 22/01/24 20:55:14 INFO AbstractService: Service:OperationManager is inited.
      spark-thrift_1    | 22/01/24 20:55:14 INFO AbstractService: Service:SessionManager is inited.
      spark-thrift_1    | 22/01/24 20:55:14 INFO AbstractService: Service: CLIService is inited.
      spark-thrift_1    | 22/01/24 20:55:14 INFO AbstractService: Service:ThriftBinaryCLIService is inited.
      spark-thrift_1    | 22/01/24 20:55:14 INFO AbstractService: Service: HiveServer2 is inited.
      spark-thrift_1    | 22/01/24 20:55:14 INFO AbstractService: Service:OperationManager is started.
      spark-thrift_1    | 22/01/24 20:55:14 INFO AbstractService: Service:SessionManager is started.
      spark-thrift_1    | 22/01/24 20:55:14 INFO AbstractService: Service:CLIService is started.
      spark-thrift_1    | 22/01/24 20:55:14 INFO ObjectStore: ObjectStore, initialize called
      spark-thrift_1    | 22/01/24 20:55:14 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
      spark-thrift_1    | 22/01/24 20:55:14 INFO ObjectStore: Initialized ObjectStore
      spark-thrift_1    | 22/01/24 20:55:14 INFO HiveMetaStore: 0: get_databases: default
      spark-thrift_1    | 22/01/24 20:55:14 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=get_databases: default	
      spark-thrift_1    | 22/01/24 20:55:14 INFO HiveMetaStore: 0: Cleaning up thread local RawStore...
      spark-thrift_1    | 22/01/24 20:55:14 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=Cleaning up thread local RawStore...	
      spark-thrift_1    | 22/01/24 20:55:14 INFO HiveMetaStore: 0: Done cleaning up thread local RawStore
      spark-thrift_1    | 22/01/24 20:55:14 INFO audit: ugi=root	ip=unknown-ip-addr	cmd=Done cleaning up thread local RawStore	
      spark-thrift_1    | 22/01/24 20:55:14 INFO AbstractService: Service:ThriftBinaryCLIService is started.
      spark-thrift_1    | 22/01/24 20:55:14 INFO AbstractService: Service:HiveServer2 is started.
      spark-thrift_1    | 22/01/24 20:55:14 INFO HiveThriftServer2: HiveThriftServer2 started
      spark-thrift_1    | 22/01/24 20:55:14 INFO ThriftCLIService: Starting ThriftBinaryCLIService on port 10001 with 5...500 worker threads
      spark-thrift_1    | 22/01/24 20:56:24 INFO ThriftCLIService: Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V8
      spark-thrift_1    | 22/01/24 20:56:24 INFO SessionState: Created HDFS directory: /tmp/hive/anonymous/77e79a4b-6753-4d2a-8d46-b1af590f3ab2
      spark-thrift_1    | 22/01/24 20:56:24 INFO SessionState: Created local directory: /tmp/root/77e79a4b-6753-4d2a-8d46-b1af590f3ab2
      spark-thrift_1    | 22/01/24 20:56:24 INFO SessionState: Created HDFS directory: /tmp/hive/anonymous/77e79a4b-6753-4d2a-8d46-b1af590f3ab2/_tmp_space.db
      spark-thrift_1    | 22/01/24 20:56:24 INFO HiveSessionImpl: Operation log session directory is created: /tmp/root/operation_logs/77e79a4b-6753-4d2a-8d46-b1af590f3ab2
      spark-thrift_1    | Exception in thread "HiveServer2-Handler-Pool: Thread-87" java.lang.ExceptionInInitializerError
      spark-thrift_1    | 	at org.apache.sedona.sql.UDF.UdfRegistrator$.registerAll(UdfRegistrator.scala:32)
      spark-thrift_1    | 	at org.apache.sedona.sql.utils.SedonaSQLRegistrator$.registerAll(SedonaSQLRegistrator.scala:34)
      spark-thrift_1    | 	at org.apache.sedona.sql.SedonaSqlExtensions.$anonfun$apply$1(SedonaSqlExtensions.scala:28)
      spark-thrift_1    | 	at org.apache.spark.sql.SparkSessionExtensions.$anonfun$buildCheckRules$1(SparkSessionExtensions.scala:174)
      spark-thrift_1    | 	at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
      spark-thrift_1    | 	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
      spark-thrift_1    | 	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
      spark-thrift_1    | 	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
      spark-thrift_1    | 	at scala.collection.TraversableLike.map(TraversableLike.scala:238)
      spark-thrift_1    | 	at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
      spark-thrift_1    | 	at scala.collection.AbstractTraversable.map(Traversable.scala:108)
      spark-thrift_1    | 	at org.apache.spark.sql.SparkSessionExtensions.buildCheckRules(SparkSessionExtensions.scala:174)
      spark-thrift_1    | 	at org.apache.spark.sql.internal.BaseSessionStateBuilder.customCheckRules(BaseSessionStateBuilder.scala:227)
      spark-thrift_1    | 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:100)
      spark-thrift_1    | 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:74)
      spark-thrift_1    | 	at org.apache.spark.sql.internal.BaseSessionStateBuilder.$anonfun$build$2(BaseSessionStateBuilder.scala:336)
      spark-thrift_1    | 	at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:83)
      spark-thrift_1    | 	at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:83)
      spark-thrift_1    | 	at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:68)
      spark-thrift_1    | 	at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
      spark-thrift_1    | 	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:138)
      spark-thrift_1    | 	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
      spark-thrift_1    | 	at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:138)
      spark-thrift_1    | 	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:68)
      spark-thrift_1    | 	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:66)
      spark-thrift_1    | 	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:58)
      spark-thrift_1    | 	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
      spark-thrift_1    | 	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
      spark-thrift_1    | 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
      spark-thrift_1    | 	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:607)
      spark-thrift_1    | 	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
      spark-thrift_1    | 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:602)
      spark-thrift_1    | 	at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:650)
      spark-thrift_1    | 	at org.apache.spark.sql.hive.thriftserver.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:70)
      spark-thrift_1    | 	at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:203)
      spark-thrift_1    | 	at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:353)
      spark-thrift_1    | 	at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:247)
      spark-thrift_1    | 	at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1377)
      spark-thrift_1    | 	at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1362)
      spark-thrift_1    | 	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:38)
      spark-thrift_1    | 	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
      spark-thrift_1    | 	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:53)
      spark-thrift_1    | 	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:310)
      spark-thrift_1    | 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
      spark-thrift_1    | 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
      spark-thrift_1    | 	at java.base/java.lang.Thread.run(Unknown Source)
      spark-thrift_1    | Caused by: scala.ScalaReflectionException: class org.locationtech.jts.geom.Geometry in JavaMirror with org.apache.hadoop.hive.ql.exec.UDFClassLoader@5a71c867 of type class org.apache.hadoop.hive.ql.exec.UDFClassLoader with classpath [] and parent being jdk.internal.loader.ClassLoaders$AppClassLoader@3c153a1 of type class jdk.internal.loader.ClassLoaders$AppClassLoader with classpath [<unknown>] and parent being jdk.internal.loader.ClassLoaders$PlatformClassLoader@435dacd9 of type class jdk.internal.loader.ClassLoaders$PlatformClassLoader with classpath [<unknown>] and parent being primordial classloader with boot classpath [<unknown>] not found.
      spark-thrift_1    | 	at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:146)
      spark-thrift_1    | 	at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:29)
      spark-thrift_1    | 	at org.apache.spark.sql.sedona_sql.expressions.TraitSTAggregateExec$$typecreator1$1.apply(AggregateFunctions.scala:42)
      spark-thrift_1    | 	at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:237)
      spark-thrift_1    | 	at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:237)
      spark-thrift_1    | 	at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:54)
      spark-thrift_1    | 	at org.apache.spark.sql.sedona_sql.expressions.TraitSTAggregateExec.$init$(AggregateFunctions.scala:42)
      spark-thrift_1    | 	at org.apache.spark.sql.sedona_sql.expressions.ST_Union_Aggr.<init>(AggregateFunctions.scala:56)
      spark-thrift_1    | 	at org.apache.sedona.sql.UDF.Catalog$.<init>(Catalog.scala:126)
      spark-thrift_1    | 	at org.apache.sedona.sql.UDF.Catalog$.<clinit>(Catalog.scala)
      spark-thrift_1    | 	... 46 more
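
      The "Caused by" above is the key line: Scala reflection resolves org.locationtech.jts.geom.Geometry through a JavaMirror built on Hive's UDFClassLoader, whose classpath is empty, so the class is not found even though jts-core was shipped with --packages. The failing lookup can be reproduced in isolation; a minimal sketch (my own illustration, assuming only scala-reflect on the classpath):

      import java.net.URLClassLoader
      import scala.reflect.runtime.universe

      object MirrorRepro {
        def main(args: Array[String]): Unit = {
          // A loader with an empty classpath and no parent, standing in for the
          // UDFClassLoader with "classpath []" seen in the exception message.
          val emptyLoader = new URLClassLoader(Array.empty[java.net.URL], null)
          val mirror = universe.runtimeMirror(emptyLoader)
          // Throws scala.ScalaReflectionException: class org.locationtech.jts.geom.Geometry ... not found
          mirror.staticClass("org.locationtech.jts.geom.Geometry")
        }
      }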
      
      


People

    • Assignee: Unassigned
    • Reporter: Jean-Denis Giguère (jdenisgiguere)
