Details
- Type: Bug
- Status: Closed
- Priority: Minor
- Resolution: Fixed
- Affects Version/s: 1.5.1
Description
1. Start the Thrift server.
2. Connect with ./bin/beeline -u '...' --hiveconf hive.exec.max.dynamic.partitions=10000
3. Run set hive.exec.max.dynamic.partitions; the value returned is the default 1000, not the 10000 passed via --hiveconf.
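The same check can be scripted over JDBC. A minimal sketch, assuming the Thrift server listens on localhost:10000 and hive-jdbc is on the classpath; the URL, the user name, and the expectation about the returned value are assumptions based on this report, not verified output:

import java.sql.DriverManager

object HiveConfRepro {
  def main(args: Array[String]): Unit = {
    Class.forName("org.apache.hive.jdbc.HiveDriver")
    // HiveServer2 JDBC URL; the part after '?' plays the role of beeline's --hiveconf.
    val url = "jdbc:hive2://localhost:10000/default?hive.exec.max.dynamic.partitions=10000"
    val conn = DriverManager.getConnection(url, "user", "")
    try {
      // "SET <key>" returns one row of the form "<key>=<value>".
      val rs = conn.createStatement().executeQuery("set hive.exec.max.dynamic.partitions")
      while (rs.next()) {
        // Expected 10000, but per this report the default 1000 comes back.
        println(rs.getString(1))
      }
    } finally {
      conn.close()
    }
  }
}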
Maybe we can pass the session conf to the context when opening the session, for example:
override def openSession(...): SessionHandle = {
  ...
  if (sessionConf != null) {
    import scala.collection.JavaConversions._
    // Copy every "set:hiveconf:"-prefixed session property into the session's context,
    // so values passed with --hiveconf are visible to SET and to query execution.
    for ((k, v) <- sessionConf) {
      if (k.startsWith("set:hiveconf:")) {
        val setK = k.split("set:hiveconf:")(1)
        ctx.setConf(setK, v)
      }
    }
  }
  ctx.setConf("spark.sql.hive.version", HiveContext.hiveExecutionVersion)
  ...
}
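For illustration, here is the prefix handling in isolation as a standalone snippet, runnable without the Thrift server; the object name, method name, and sample map are made up for this sketch, and only the "set:hiveconf:" prefix comes from the code above:

object HiveConfOverrideSketch {
  // The openSession code above looks for this prefix on session conf keys.
  private val Prefix = "set:hiveconf:"

  // Keep only the prefixed entries and strip the prefix, mirroring the loop above.
  def extractHiveConfOverrides(sessionConf: Map[String, String]): Map[String, String] =
    sessionConf.collect {
      case (k, v) if k.startsWith(Prefix) => k.stripPrefix(Prefix) -> v
    }

  def main(args: Array[String]): Unit = {
    val sessionConf = Map(
      "set:hiveconf:hive.exec.max.dynamic.partitions" -> "10000",
      "some:other:key" -> "ignored")
    // Prints: Map(hive.exec.max.dynamic.partitions -> 10000)
    println(extractHiveConfOverrides(sessionConf))
  }
}

Using stripPrefix avoids the split(...)(1) indexing, but the behavior is the same for keys that start with the prefix.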
Issue Links
- is duplicated by
  - SPARK-13983 HiveThriftServer2 can not get "--hiveconf" or "--hivevar" variables since 1.6 version (both multi-session and single session) (Resolved)