SPARK-13139: Create native DDL commands


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 2.0.0
    • Component/s: SQL
    • Labels: None

    Description

      We currently delegate most DDL statements directly to Hive via NativePlaceholder in HiveQl.scala. In Spark 2.0 we want to provide native implementations of these DDLs for both SQLContext and HiveContext.

      The first step is to properly parse these DDL statements and create logical commands that encapsulate them. The actual implementation can still delegate to HiveNativeCommand. For example, we should define a RenameTable command with the proper fields and simply delegate its implementation to HiveNativeCommand (we may need to track the original SQL query in order to run HiveNativeCommand, but we can drop the SQL query once we complete the next step); see the sketch below.
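
      As a minimal sketch of this first step (plain Scala with simplified stand-in traits, not the actual Spark internals; the RenameTable fields and names here are assumptions for illustration):

      // Simplified stand-ins for Spark's internal command abstractions
      // (assumptions for illustration, not the real org.apache.spark.sql types).
      trait RunnableDdlCommand {
        def run(): Unit
      }

      // Pass-through that hands a raw SQL string to Hive, mirroring the role
      // that HiveNativeCommand plays today.
      case class HiveNativeCommand(sql: String) extends RunnableDdlCommand {
        def run(): Unit = println(s"delegating to Hive: $sql")
      }

      // Step 1: the parsed DDL becomes a typed logical command with proper fields.
      // The original SQL text is kept only so the implementation can still delegate
      // to HiveNativeCommand; it can be dropped once the catalog API exists.
      case class RenameTable(
          oldName: String,
          newName: String,
          originalSql: String) extends RunnableDdlCommand {
        def run(): Unit = HiveNativeCommand(originalSql).run()
      }

      object RenameTableExample extends App {
        // "ALTER TABLE src RENAME TO dst" parsed into a typed command.
        RenameTable("src", "dst", "ALTER TABLE src RENAME TO dst").run()
      }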

      Once we flesh out the internal persistent catalog API, we can switch the implementation of these newly added commands to use the catalog API; a second sketch below illustrates that step.
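
      Continuing the sketch above, once a catalog API is available the same command can be reimplemented against it and the tracked SQL string dropped (PersistentCatalog here is a hypothetical placeholder for that API, not an existing Spark interface):

      // Hypothetical slice of the persistent catalog API (an assumption; the real
      // interface is whatever the internal catalog work ends up defining).
      trait PersistentCatalog {
        def renameTable(oldName: String, newName: String): Unit
      }

      // Step 2: the same logical command, now implemented natively against the
      // catalog instead of delegating raw SQL to Hive. The originalSql field is
      // no longer needed.
      case class RenameTable(oldName: String, newName: String) {
        def run(catalog: PersistentCatalog): Unit =
          catalog.renameTable(oldName, newName)
      }

      object CatalogBackedExample extends App {
        // Trivial in-memory catalog, just to exercise the command.
        val catalog = new PersistentCatalog {
          private val tables = scala.collection.mutable.Set("src")
          def renameTable(oldName: String, newName: String): Unit = {
            tables -= oldName
            tables += newName
            println(s"catalog tables: $tables")
          }
        }
        RenameTable("src", "dst").run(catalog)
      }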


            People

              Assignee: andrewor14 (Andrew Or)
              Reporter: rxin (Reynold Xin)
              Votes: 0
              Watchers: 6
