Spark / SPARK-21852

Empty Parquet Files created as a result of spark jobs fail when read


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Cannot Reproduce
    • Affects Version/s: 2.2.0
    • Fix Version/s: None
    • Component/s: Input/Output
    • Labels: None

    Description

      I have intermittently faced an issue with certain Spark jobs that write Parquet files: the write apparently succeeds, but the resulting .parquet directory in HDFS is empty (with no _SUCCESS or _metadata parts, even). Surprisingly, the Spark DataFrame writer throws no errors.

      However, when attempting to read this written output, Spark throws the error:
      "Unable to infer schema for Parquet. It must be specified manually."
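      One defensive workaround on the read side (a sketch, not something from this issue or a Spark API) is to check the output directory for non-empty part files before handing it to the Parquet reader, so an empty write fails loudly instead of with the opaque schema-inference error. The helper below assumes a local or FUSE-mounted path; a raw HDFS path would need the Hadoop FileSystem API instead of `os`:

```python
import os

def has_parquet_parts(path):
    """Return True only if `path` is a directory containing at least one
    non-empty part file, i.e. something a Parquet reader could infer a
    schema from. (Hypothetical helper, not part of Spark.)"""
    if not os.path.isdir(path):
        return False
    return any(
        name.startswith("part-")
        and os.path.getsize(os.path.join(path, name)) > 0
        for name in os.listdir(path)
    )

# Guarded read (illustrative; `spark` is an existing SparkSession):
# if not has_parquet_parts("/data/out.parquet"):
#     raise RuntimeError("empty Parquet output directory; refusing to read")
# df = spark.read.parquet("/data/out.parquet")
```

      This does not explain why the writer produced an empty directory, but it separates "the job wrote nothing" from "the reader could not infer a schema", which are otherwise reported as the same failure.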


          People

            Assignee: Unassigned
            Reporter: Shivam Dalmia (sdalmia_asf)
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved: