Spark / SPARK-19664

put 'hive.metastore.warehouse.dir' in hadoopConf place


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.1.0
    • Fix Version/s: 2.2.0
    • Component/s: SQL
    • Labels: None

    Description

      In SPARK-15959 we brought back 'hive.metastore.warehouse.dir'. In the current logic, when the value of 'spark.sql.warehouse.dir' is used to overwrite 'hive.metastore.warehouse.dir', the value is set on 'sparkContext.conf'. I think it should instead be put into 'sparkContext.hadoopConfiguration', overwriting the original value in hadoopConf.

      https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/internal/SharedState.scala#L64
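      The gist of the proposal, as a minimal sketch in Scala (assuming a live SparkContext named 'sc' and an already-resolved warehouse path; the helper and object names are hypothetical, and this is not the actual SharedState.scala change):

      {code:scala}
      import org.apache.spark.SparkContext

      // Illustrative-only sketch; names are hypothetical, not the SharedState patch.
      object WarehouseConfSketch {
        def propagateWarehouseDir(sc: SparkContext, sparkWarehouseDir: String): Unit = {
          // Instead of writing the resolved warehouse location back into the Spark
          // conf, set it on the Hadoop configuration so that the original value of
          // 'hive.metastore.warehouse.dir' in hadoopConf is overwritten and Hive
          // metastore access picks up the Spark warehouse location.
          sc.hadoopConfiguration.set("hive.metastore.warehouse.dir", sparkWarehouseDir)
        }
      }
      {code}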


          People

            Assignee: Song Jun (windpiger)
            Reporter: Song Jun (windpiger)
            Votes: 0
            Watchers: 3
