Spark / SPARK-26493

spark.sql.extensions should support multiple extensions

    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 3.0.0
    • Fix Version/s: 3.0.0
    • Component/s: SQL
    • Labels:

      Description

      The spark.sql.extensions configuration option should support multiple extensions. It is currently possible to load multiple extensions through the programmatic interface (e.g. SparkSession.builder().master("..").withExtensions(sparkSessionExtensions1).withExtensions(sparkSessionExtensions2).getOrCreate()), but the same cannot currently be done through the command line options without writing a wrapper extension that combines the others.
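
      For illustration, a minimal sketch of the programmatic approach described above, assuming two hypothetical extension objects (ExtensionsOne, ExtensionsTwo) that each register rules on the SparkSessionExtensions instance they are given:

        import org.apache.spark.sql.{SparkSession, SparkSessionExtensions}

        // Hypothetical extensions; in practice each would inject analyzer/optimizer
        // rules, parsers, etc. through the SparkSessionExtensions instance.
        object ExtensionsOne extends (SparkSessionExtensions => Unit) {
          override def apply(extensions: SparkSessionExtensions): Unit = {
            // e.g. extensions.injectOptimizerRule(...)
          }
        }

        object ExtensionsTwo extends (SparkSessionExtensions => Unit) {
          override def apply(extensions: SparkSessionExtensions): Unit = {
            // e.g. extensions.injectResolutionRule(...)
          }
        }

        // Multiple extensions can be chained through the builder:
        val spark = SparkSession.builder()
          .master("local[*]")
          .withExtensions(ExtensionsOne)
          .withExtensions(ExtensionsTwo)
          .getOrCreate()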

       

      Allowing multiple spark.sql.extensions would let the extensions be changed easily on the command line or via the configuration file. Multiple extensions could be specified as a comma-separated list of class names. This should maintain backwards compatibility, because an existing spark.sql.extensions value is a single class name and therefore should not contain a comma.
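
      A sketch of how the proposed comma-separated value might be used, assuming hypothetical class names com.example.ExtensionsOne and com.example.ExtensionsTwo (each implementing Function1[SparkSessionExtensions, Unit] with a no-argument constructor). The value is set on the builder here only to keep the example self-contained; the same string could instead be passed with --conf on spark-submit or in spark-defaults.conf:

        import org.apache.spark.sql.SparkSession

        // Hypothetical fully-qualified class names; with the proposed change, Spark
        // would instantiate each listed class in turn and apply it to the session's
        // SparkSessionExtensions.
        val spark = SparkSession.builder()
          .master("local[*]")
          .config("spark.sql.extensions",
                  "com.example.ExtensionsOne,com.example.ExtensionsTwo")
          .getOrCreate()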

              People

              • Assignee: Jamison Bennett (jamison.bennett)
              • Reporter: Jamison Bennett (jamison.bennett)
              • Votes: 0
              • Watchers: 2
