SPARK-19218: Fix SET command to show a result correctly and in a sorted order


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Trivial
    • Resolution: Fixed
    • Affects Version/s: 2.1.0
    • Fix Version/s: 2.2.0
    • Component/s: SQL
    • Labels: None

    Description

      This issue aims to fix the following two things.

      1. `sql("SET -v").collect()` or `sql("SET -v").show()` raises the following exception for a String configuration whose default value is `null`; a sketch of one way to avoid it follows the stack trace below. For the failing test, see the [Jenkins result](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/71539/testReport/) and https://github.com/apache/spark/commit/60953bf1f1ba144e709fdae3903a390ff9479fd0 in #16624.

      sbt.ForkMain$ForkError: java.lang.RuntimeException: Error while decoding: java.lang.NullPointerException
      createexternalrow(input[0, string, false].toString, input[1, string, false].toString, input[2, string, false].toString, StructField(key,StringType,false), StructField(value,StringType,false), StructField(meaning,StringType,false))
      :- input[0, string, false].toString
      :  +- input[0, string, false]
      :- input[1, string, false].toString
      :  +- input[1, string, false]
      +- input[2, string, false].toString
         +- input[2, string, false]
      
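      As a minimal illustration (not the actual Spark patch), the NullPointerException can be avoided by substituting a placeholder for a `null` default value before the `(key, value, meaning)` rows are built, so every field satisfies the non-nullable string schema. The sample entries and the `<undefined>` placeholder below are assumptions for illustration only.

      // Hypothetical (key, defaultValueString, doc) triples; the default may be
      // null for a String configuration, which is what triggers the NPE above.
      val rawEntries: Seq[(String, String, String)] = Seq(
        ("spark.sql.warehouse.dir", null, "location of managed tables"),
        ("spark.sql.shuffle.partitions", "200", "number of shuffle partitions"))

      // Replacing a null default with a placeholder keeps every field non-null,
      // so collecting the SET -v result no longer fails while decoding the rows.
      val safeEntries = rawEntries.map { case (key, default, doc) =>
        (key, Option(default).getOrElse("<undefined>"), doc)
      }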

      2. Currently, the `SET` and `SET -v` commands show their results in an unsorted order.
      Showing a sorted result improves the user experience and also matches Hive's behavior; a sketch of the sorting follows the AFTER output below.

      BEFORE

      scala> sql("set").show(false)
      ...
      |spark.driver.host              |10.22.16.140                                                                                                                                 |
      |spark.driver.port              |63893                                                                                                                                        |
      |spark.repl.class.uri           |spark://10.22.16.140:63893/classes                                                                                                           |
      ...
      |spark.app.name                 |Spark shell                                                                                                                                  |
      |spark.driver.memory            |4G                                                                                                                                           |
      |spark.executor.id              |driver                                                                                                                                       |
      |spark.submit.deployMode        |client                                                                                                                                       |
      |spark.master                   |local[*]                                                                                                                                     |
      |spark.home                     |/Users/dhyun/spark                                                                                                                           |
      |spark.sql.catalogImplementation|hive                                                                                                                                         |
      |spark.app.id                   |local-1484333618945                                                                                                                          |
      

      AFTER

      scala> sql("set").show(false)
      ...
      |spark.app.id                   |local-1484333925649                                                                                                                          |
      |spark.app.name                 |Spark shell                                                                                                                                  |
      |spark.driver.host              |10.22.16.140                                                                                                                                 |
      |spark.driver.memory            |4G                                                                                                                                           |
      |spark.driver.port              |64994                                                                                                                                        |
      |spark.executor.id              |driver                                                                                                                                       |
      |spark.jars                     |                                                                                                                                             |
      |spark.master                   |local[*]                                                                                                                                     |
      |spark.repl.class.uri           |spark://10.22.16.140:64994/classes                                                                                                           |
      |spark.sql.catalogImplementation|hive                                                                                                                                         |
      |spark.submit.deployMode        |client                                                                                                                                       |
      
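      As a minimal illustration (not the actual Spark patch), the sorted output above can be produced by ordering the configuration pairs by key before they are rendered. The sample map below is illustrative only; in Spark the pairs would come from the session's configuration.

      // Illustrative configuration pairs (hard-coded here for the sketch).
      val confs: Map[String, String] = Map(
        "spark.master" -> "local[*]",
        "spark.app.name" -> "Spark shell",
        "spark.driver.memory" -> "4G")

      // Sorting by key yields the stable, Hive-compatible ordering shown in
      // the AFTER output above.
      val sortedRows: Seq[(String, String)] = confs.toSeq.sortBy(_._1)
      sortedRows.foreach { case (key, value) => println(s"$key=$value") }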


            People

              Assignee: Dongjoon Hyun
              Reporter: Dongjoon Hyun
              Votes: 0
              Watchers: 3
