Description
I have a local Hive 0.13.1 metastore backed by MySQL. The MySQL JDBC connector is placed under $JAVA_HOME/jre/lib/ext. With the following spark-defaults.conf:
spark.sql.hive.metastore.version  0.13.1
spark.sql.hive.metastore.jars     maven
the Spark shell fails to create a HiveContext because the MySQL JDBC driver cannot be loaded properly.
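For reference, a minimal repro sketch in the shell (exact stack trace omitted; sc is the shell's SparkContext):

// In spark-shell started with the spark-defaults.conf above
import org.apache.spark.sql.hive.HiveContext
val hiveContext = new HiveContext(sc)       // metastore client setup fails here or on first use,
hiveContext.sql("SHOW TABLES").collect()    // because the MySQL JDBC driver is not visible to it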
This is probably because IsolatedClientLoader ignores shared prefixes and barrier prefixes when spark.sql.hive.metastore.jars is set to maven. See this line.
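For illustration only, here is a small self-contained Scala sketch of the shared/barrier prefix check that an isolated classloader relies on. The names (IsolationConfig, isShared) are made up and this is not Spark's IsolatedClientLoader code; it only shows why dropping the prefixes keeps com.mysql.jdbc.Driver out of the metastore client's view.

// Sketch of prefix-based class sharing between a parent classloader and an
// isolated one. Shared classes come from the parent loader (so a JDBC driver
// on the JVM classpath stays visible); barrier classes are always isolated.
object PrefixIsolationSketch {

  final case class IsolationConfig(
      sharedPrefixes: Seq[String],    // class-name prefixes loaded by the parent loader
      barrierPrefixes: Seq[String])   // class-name prefixes always loaded in isolation

  // A class is shared only if it matches a shared prefix and no barrier prefix.
  def isShared(className: String, conf: IsolationConfig): Boolean =
    !conf.barrierPrefixes.exists(p => className.startsWith(p)) &&
      conf.sharedPrefixes.exists(p => className.startsWith(p))

  def main(args: Array[String]): Unit = {
    val conf = IsolationConfig(
      sharedPrefixes = Seq("com.mysql.jdbc", "java.", "scala."),
      barrierPrefixes = Seq("org.apache.hadoop.hive."))

    // With the prefixes applied, the MySQL driver is resolved by the parent
    // loader, while Hive metastore classes stay isolated.
    println(isShared("com.mysql.jdbc.Driver", conf))             // true
    println(isShared("org.apache.hadoop.hive.ql.Driver", conf))  // false

    // The reported problem: when spark.sql.hive.metastore.jars is maven, the
    // prefixes are effectively dropped, so the driver class is never shared
    // and the metastore connection fails.
  }
}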