Spark / SPARK-25880

User-set Hadoop configurations do not take effect


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 2.2.0
    • Fix Version/s: None
    • Component/s: Spark Core, SQL
    • Labels: None

    Description

      When a user sets a Hadoop configuration in spark-defaults.conf, for instance:

      spark.hadoop.mapreduce.input.fileinputformat.split.maxsize   100000

      and then uses the SET command in spark-sql to override that configuration, the new value does not take effect; the value from spark-defaults.conf still applies.

       
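      The behavior reported here hinges on how spark.hadoop.* entries are handled: Spark copies them into the Hadoop Configuration (with the prefix stripped) when the context is created, whereas a later SET in spark-sql only updates the session conf. The sketch below is a hedged, standalone illustration of that prefix-stripping step, not Spark source code; the dict contents are illustrative.

```python
# Hedged sketch of the spark.hadoop.* mechanism (not actual Spark source):
# every conf key starting with "spark.hadoop." is copied into the Hadoop
# Configuration with the prefix removed when the context is built. A later
# `SET spark.hadoop...` in spark-sql changes the session conf, but values
# already baked into the Hadoop Configuration may not be re-read.

SPARK_HADOOP_PREFIX = "spark.hadoop."

def hadoop_conf_from_spark_conf(spark_conf: dict) -> dict:
    """Extract Hadoop properties from a Spark conf dict by stripping the prefix."""
    return {
        key[len(SPARK_HADOOP_PREFIX):]: value
        for key, value in spark_conf.items()
        if key.startswith(SPARK_HADOOP_PREFIX)
    }

# Illustrative spark-defaults.conf contents, mirroring the report above.
spark_defaults = {
    "spark.master": "local[*]",
    "spark.hadoop.mapreduce.input.fileinputformat.split.maxsize": "100000",
}

print(hadoop_conf_from_spark_conf(spark_defaults))
# → {'mapreduce.input.fileinputformat.split.maxsize': '100000'}
```

      Under this reading, a spark-sql `SET spark.hadoop.mapreduce.input.fileinputformat.split.maxsize=200000` would update only the session conf, which is consistent with the reporter seeing the original value still in effect.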

      People

          Assignee: Unassigned
          Reporter: gjhkael (guojh)
          Votes: 0
          Watchers: 2
