Spark / SPARK-19059

Unable to retrieve data from a parquet table whose name starts with underscore


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.1.0
    • Fix Version/s: 2.2.0
    • Component/s: Spark Core
    • Labels: None

    Description

      It looks like a bug introduced in Spark 2.1.0 prevents reading data from a parquet table (with Hive support enabled) whose name starts with an underscore. CREATE and INSERT statements on the same table, however, seem to work as expected.

      The problem can be reproduced from spark-shell through the following steps:
      1) Create a table with some values
      scala> spark.sql("CREATE TABLE `_a`(i INT) USING parquet").show
      scala> spark.sql("INSERT INTO `_a` VALUES (1), (2), (3)").show

      2) Select data from the newly created and populated table --> no results
      scala> spark.sql("SELECT * FROM `_a`").show
      +---+
      |  i|
      +---+
      +---+

      3) Rename the table so that the leading underscore disappears
      scala> spark.sql("ALTER TABLE `_a` RENAME TO `a`").show

      4) Select data from the renamed table --> results are shown
      scala> spark.sql("SELECT * FROM `a`").show
      +---+
      |  i|
      +---+
      |  1|
      |  2|
      |  3|
      +---+
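      A plausible explanation (an assumption on my part, not stated in this report) is that Hadoop-style directory listings conventionally treat paths whose names start with `_` or `.` as hidden metadata (e.g. `_SUCCESS`, `_metadata`); if such a filter is also applied to the table's own directory name, a table stored under `warehouse/_a` would be skipped at read time even though writes to it succeed. A minimal sketch of that filter convention (`isHiddenPath` is a hypothetical helper, not Spark's actual API):

      ```scala
      // Hidden-path convention assumed above: names beginning with '_' or '.'
      // are treated as hidden and skipped during listing.
      def isHiddenPath(name: String): Boolean =
        name.startsWith("_") || name.startsWith(".")

      // A table directory named "_a" would be filtered out, so SELECT sees no files:
      println(isHiddenPath("_a")) // true -> directory skipped, no rows returned
      // After RENAME TO `a`, the directory is listed normally:
      println(isHiddenPath("a"))  // false -> data files are found
      ```

      This would be consistent with the observed behavior: renaming `_a` to `a` makes the same data readable without rewriting any files.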

      Attachments

        Activity

          People

            Assignee: Jayadevan M (jayadevan.m)
            Reporter: Giambattista (gbloisi)
            Votes: 0
            Watchers: 5

            Dates

              Created:
              Updated:
              Resolved:
