Details
- Type: Improvement
- Priority: Major
- Status: Resolved
- Resolution: Duplicate
- Affects Version/s: 2.4.0, 3.0.0
Description
When starting up the Spark SQL CLI, extra jars are passed in through the Hive conf HiveConf.ConfVars.HIVEAUXJARS. We don't need complex shim APIs to work around differences between Hive versions; we can handle this through Spark's own SessionResourceLoader API, which adds the jars to both Spark and the SparkSession's running environment.
Using the SessionResourceLoader API:
val resourceLoader = SparkSQLEnv.sqlContext.sessionState.resourceLoader
StringUtils.split(auxJars, ",").foreach(resourceLoader.addJar(_))
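The pattern above (split the comma-separated aux-jars value and register each path with the session's resource loader) can be sketched without a Spark dependency. This is a minimal, hypothetical illustration: FakeResourceLoader is a stand-in for Spark's SessionResourceLoader, not the real class.

```scala
import scala.collection.mutable.ListBuffer

// Hypothetical stand-in for Spark's SessionResourceLoader: records added jars.
class FakeResourceLoader {
  private val jars = ListBuffer.empty[String]
  def addJar(path: String): Unit = jars += path
  def addedJars: List[String] = jars.toList
}

object AuxJarsDemo {
  // Split a comma-separated aux-jars value (as HIVEAUXJARS carries it)
  // and register each non-empty entry with the loader.
  def registerAuxJars(auxJars: String, loader: FakeResourceLoader): Unit =
    auxJars.split(",").map(_.trim).filter(_.nonEmpty).foreach(loader.addJar)

  def main(args: Array[String]): Unit = {
    val loader = new FakeResourceLoader
    registerAuxJars("/path/a.jar, /path/b.jar", loader)
    println(loader.addedJars)  // List(/path/a.jar, /path/b.jar)
  }
}
```

In the real CLI the loader would come from SparkSQLEnv.sqlContext.sessionState.resourceLoader, as shown above, and addJar distributes the jar to executors as well.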
By contrast, the current approach goes through per-version shims.

v1.2.1 ThriftServerShimUtils:

private[thriftserver] def addToClassPath(
    loader: ClassLoader,
    auxJars: Array[String]): ClassLoader = {
  Utilities.addToClassPath(loader, auxJars)
}

v2.3.5 ThriftServerShimUtils:

private[thriftserver] def addToClassPath(
    loader: ClassLoader,
    auxJars: Array[String]): ClassLoader = {
  val addAction = new AddToClassPathAction(loader, auxJars.toList.asJava)
  AccessController.doPrivileged(addAction)
}
Issue Links
- duplicates SPARK-28840: --jar argument with spark-sql failed to load the jars to driver classpath (Resolved)