Details
Description
In Spark 2.0.1, with Hive support enabled, initializing the SQLContext throws an AlreadyExistsException (message: Database default already exists), the same issue as reported in
https://www.mail-archive.com/dev@spark.apache.org/msg15306.html. My code is:
{code:scala}
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.FileSystem
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

private val master = "local[*]"
private val appName = "xqlServerSpark"

// FileSystem.get requires a Hadoop Configuration
val fileSystem = FileSystem.get(new Configuration())

val sparkConf = new SparkConf()
  .setMaster(master)
  .setAppName(appName)
  .set("spark.sql.warehouse.dir",
    s"${fileSystem.getUri.toASCIIString}/user/hive/warehouse")

val hiveContext = SparkSession.builder()
  .config(sparkConf)
  .enableHiveSupport()
  .getOrCreate()
  .sqlContext

print(sparkConf.get("spark.sql.warehouse.dir"))
hiveContext.sql("show tables").show()
{code}
The result is correct, but the exception is still thrown by this code.
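For what it's worth, since the query result is correct, the exception appears to be harmless log noise from the Hive metastore attempting to create the already-present default database. A commonly suggested workaround is to raise the log level of the metastore handler in log4j.properties; this is a sketch assuming a Log4j 1.x properties file, and the logger name is an assumption based on typical Hive metastore logging, not something confirmed in this report:

{code}
# Hypothetical log4j.properties fragment: silence the benign
# "Database default already exists" message emitted during metastore init.
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
{code}

This only hides the message; it does not change metastore behavior.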