
SPARK-13768: Setting a Hive conf with --hiveconf fails when beeline connects to the Thrift server


Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 1.5.1
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      1. Start the Thrift server.
      2. Connect with beeline: ./bin/beeline -u '...' --hiveconf hive.exec.max.dynamic.partitions=10000
      3. Run set hive.exec.max.dynamic.partitions; – it returns the default value 1000 instead of 10000 (see the sketch after this list).
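
      Judging from the prefix handled in the proposed fix below, the Hive JDBC client forwards each --hiveconf option to the server as a session configuration entry whose key carries a "set:hiveconf:" prefix. A rough sketch of what openSession receives for the command above; the exact map contents here are an assumption, not copied from the code base:

      // Hypothetical session conf as delivered by the beeline client above.
      val sessionConf: java.util.Map[String, String] =
        java.util.Collections.singletonMap(
          "set:hiveconf:hive.exec.max.dynamic.partitions", "10000")
      // The Thrift server currently never applies these entries to the SQLContext,
      // so the later "set" query still reports the default value (1000).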

      Maybe we can pass the conf to the context when opening the session:

      override def openSession(...): SessionHandle = {
        ...
        // Apply any Hive conf overrides the client sent with the session
        // (beeline forwards --hiveconf values as "set:hiveconf:"-prefixed keys).
        if (sessionConf != null) {
          import scala.collection.JavaConversions._
          for ((k, v) <- sessionConf) {
            if (k.startsWith("set:hiveconf:")) {
              // Drop the "set:hiveconf:" prefix and set the remaining key on the context.
              val setK = k.split("set:hiveconf:")(1)
              ctx.setConf(setK, v)
            }
          }
        }
        ctx.setConf("spark.sql.hive.version", HiveContext.hiveExecutionVersion)
        ...
      }
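
      A self-contained variant of the same idea, sketched with scala.collection.JavaConverters and stripPrefix rather than the JavaConversions wildcard import; the object name HiveConfOverrides and the setConf parameter are placeholders for illustration, not existing Spark APIs:

      import scala.collection.JavaConverters._

      object HiveConfOverrides {
        // The prefix the client uses for --hiveconf entries.
        private val HiveConfPrefix = "set:hiveconf:"

        // Hypothetical helper; setConf stands in for ctx.setConf.
        def apply(sessionConf: java.util.Map[String, String],
                  setConf: (String, String) => Unit): Unit = {
          if (sessionConf != null) {
            sessionConf.asScala.foreach { case (k, v) =>
              if (k.startsWith(HiveConfPrefix)) {
                // Strip the client-side prefix before applying the Hive key.
                setConf(k.stripPrefix(HiveConfPrefix), v)
              }
            }
          }
        }
      }

      // Usage inside openSession would then be roughly:
      //   HiveConfOverrides(sessionConf, ctx.setConf)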
      


    People

      Assignee: Unassigned
      Reporter: Weizhong (Sephiroth-Lin)
