  Spark / SPARK-6067

Spark SQL Hive dynamic partitions job will fail if a task fails


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: 1.2.0
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      When inserting into a Hive table from Spark SQL with dynamic partitioning, a single task failure causes each subsequent retry of that task to fail as well, eventually failing the whole job.

      /mytable/.hive-staging_hive_2015-02-27_11-53-19_573_222-3/-ext-10000/partition=2015-02-04/part-00001 for client <ip> already exists
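
      A minimal sketch (not from the report) of the kind of dynamic-partition insert that can hit this; the table, column, and partition names below are hypothetical:

      import org.apache.spark.{SparkConf, SparkContext}
      import org.apache.spark.sql.hive.HiveContext

      object DynamicPartitionInsert {
        def main(args: Array[String]): Unit = {
          val sc = new SparkContext(new SparkConf().setAppName("dynamic-partition-insert"))
          val hiveContext = new HiveContext(sc)

          // Enable Hive dynamic partitioning for this session.
          hiveContext.sql("SET hive.exec.dynamic.partition = true")
          hiveContext.sql("SET hive.exec.dynamic.partition.mode = nonstrict")

          // The partition value is taken from the data. If a task fails part-way
          // through writing its file under the .hive-staging directory, its retry
          // tries to create the same file again and hits the "already exists" error.
          hiveContext.sql(
            "INSERT INTO TABLE mytable PARTITION (dt) " +
            "SELECT col1, col2, dt FROM source_table")

          sc.stop()
        }
      }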

      A retried task attempt may need to clean up the output written by the previously failed attempt before it can write to the same location.
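
      A rough sketch of that cleanup idea, assuming the retried attempt knows the path of the stale part file; this is illustrative only and not Spark's actual Hive write path:

      import org.apache.hadoop.conf.Configuration
      import org.apache.hadoop.fs.{FileSystem, Path}

      object StagingCleanup {
        // Hypothetical helper: remove a stale part file left by a previous failed
        // attempt so the retry does not fail with "already exists".
        def cleanUpPreviousAttempt(outputFile: String, conf: Configuration): Unit = {
          val path = new Path(outputFile)
          val fs: FileSystem = path.getFileSystem(conf)
          if (fs.exists(path)) {
            // Delete only the single stale file, not the whole staging directory.
            fs.delete(path, false)
          }
        }
      }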

      Attachments

        1. job.log (10 kB, Jason Hubbard)


            People

              Assignee: Unassigned
              Reporter: Jason Hubbard (jahubba)
              Votes: 0
              Watchers: 3
