Loading Spark configuration into the Hive driver. There are three ways to set Spark configuration properties:
- As Java system properties.
- In the Spark configuration file (spark-defaults.conf).
- In the Hive configuration file (hive-site.xml).
Ways listed later take higher priority: a property set by a later way overwrites the same property set by an earlier one.
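As a sketch of the first way (the exact mechanism is an assumption, not prescribed here): since the Hive driver runs in a Hadoop-launched JVM, a Spark property can be passed as a Java system property through HADOOP_OPTS.

```shell
# Assumption: pass a Spark property to the Hive driver JVM as a
# Java system property via HADOOP_OPTS (-Dkey=value form).
export HADOOP_OPTS="$HADOOP_OPTS -Dspark.master=local"
```

A property set this way would then be overwritten by the same key in spark-defaults.conf or hive-site.xml, per the priority order above.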
Please refer to http://spark.apache.org/docs/latest/configuration.html for all configurable Spark properties. You can set Spark configuration in Hive in the following ways:
- Configure through the Spark configuration file.
  - Create spark-defaults.conf and place it in the /etc/spark/conf configuration directory. Set properties in spark-defaults.conf in Java-properties format.
  - Set the $SPARK_CONF_DIR environment variable to the directory containing spark-defaults.conf.
  - Add $SPARK_CONF_DIR to the $HADOOP_CLASSPATH environment variable.
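The spark-defaults.conf steps above can be sketched as a shell session. The document places the file under /etc/spark/conf; a user-writable directory is used here for illustration, and the property values are examples only.

```shell
# Illustrative stand-in for /etc/spark/conf from the steps above.
CONF_DIR="$HOME/spark-conf"
mkdir -p "$CONF_DIR"

# Write spark-defaults.conf in Java-properties format
# (key and value separated by whitespace).
cat > "$CONF_DIR/spark-defaults.conf" <<'EOF'
spark.master    local
spark.app.name  Hive on Spark
EOF

# Point SPARK_CONF_DIR at the directory containing spark-defaults.conf,
# then add it to HADOOP_CLASSPATH so the Hive driver can load it.
export SPARK_CONF_DIR="$CONF_DIR"
export HADOOP_CLASSPATH="$SPARK_CONF_DIR:$HADOOP_CLASSPATH"
```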
- Configure through the Hive configuration file.
  - Edit hive-site.xml in the Hive conf directory and set the Spark properties there in XML format.
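In hive-site.xml the same Spark properties are written as Hadoop-style property elements; a minimal sketch (the values shown are the defaults from the table below):

```xml
<!-- Spark properties embedded in hive-site.xml (XML format). -->
<property>
  <name>spark.master</name>
  <value>local</value>
</property>
<property>
  <name>spark.app.name</name>
  <value>Hive on Spark</value>
</property>
```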
Default Spark properties set by the Hive driver:

|Property|Default|Description|
|spark.master|local|Spark master URL.|
|spark.app.name|Hive on Spark|Default Spark application name.|
NO PRECOMMIT TESTS. This is for spark-branch only.