Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 2.1.0
    • Fix Version/s: 2.1.0
    • Component/s: SQL
    • Labels: None
    • Target Version/s:

Description

The first three test cases fail because the Hive client crashes when dropping partitions that don't contain any files. The last one deletes too many files due to a partition column case resolution failure (see the sketch after the test cases).

        test("foo") {
          withTable("test") {
            spark.range(10)
              .selectExpr("id", "id as A", "'x' as B")
              .write.partitionBy("A", "B").mode("overwrite")
              .saveAsTable("test")
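            // No explicit partition spec: the overwrite replaces the whole table, so only the single new row should remain.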
            spark.sql("insert overwrite table test select id, id, 'x' from range(1)")
            assert(spark.sql("select * from test").count() == 1)
          }
        }
      
        test("bar") {
          withTable("test") {
            spark.range(10)
              .selectExpr("id", "id as A", "'x' as B")
              .write.partitionBy("A", "B").mode("overwrite")
              .saveAsTable("test")
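            // Explicit dynamic partition spec written with lowercase column names (a, b), while the table's partition columns are (A, B).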
            spark.sql("insert overwrite table test partition (a, b) select id, id, 'x' from range(1)")
            assert(spark.sql("select * from test").count() == 1)
          }
        }
      
        test("baz") {
          withTable("test") {
            spark.range(10)
              .selectExpr("id", "id as A", "'x' as B")
              .write.partitionBy("A", "B").mode("overwrite")
              .saveAsTable("test")
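            // Explicit dynamic partition spec matching the table's partition column case (A, B).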
            spark.sql("insert overwrite table test partition (A, B) select id, id, 'x' from range(1)")
            assert(spark.sql("select * from test").count() == 1)
          }
        }
      
        test("qux") {
          withTable("test") {
            spark.range(10)
              .selectExpr("id", "id as A", "'x' as B")
              .write.partitionBy("A", "B").mode("overwrite")
              .saveAsTable("test")
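            // Static value for partition a=1 (lowercase) with dynamic b: only the A=1 partition should be overwritten, leaving the other 9 rows plus the new one.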
            spark.sql("insert overwrite table test partition (a=1, b) select id, 'x' from range(1)")
            assert(spark.sql("select * from test").count() == 10)
          }
        }
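
The case resolution problem can be illustrated in isolation. Below is a minimal standalone sketch, not Spark's actual resolution code: the object and the normalizeSpec helper are hypothetical names used only for illustration. It resolves a user-supplied partition spec against the table's partition columns case-insensitively, so that a spec written as (a=1, b) maps onto the table's (A, B) columns and targets only the intended partition; a case-sensitive lookup would fail to match "a" to "A" and could end up affecting far more partitions (and files) than intended.

    // Hypothetical sketch: resolve a user-written partition spec against the
    // table's partition columns, ignoring case, and keep the table's casing.
    object PartitionSpecResolution {
      def normalizeSpec(
          userSpec: Map[String, Option[String]],
          tablePartCols: Seq[String]): Map[String, Option[String]] = {
        userSpec.map { case (name, value) =>
          // Find the table's partition column that matches, ignoring case.
          val resolved = tablePartCols
            .find(_.equalsIgnoreCase(name))
            .getOrElse(throw new IllegalArgumentException(
              s"$name is not a partition column of the table"))
          resolved -> value
        }
      }

      def main(args: Array[String]): Unit = {
        val tablePartCols = Seq("A", "B")
        // The user wrote "partition (a=1, b)"; None marks a dynamic partition column.
        val userSpec = Map("a" -> Some("1"), "b" -> None)
        // Prints Map(A -> Some(1), B -> None): only the A=1 partition is targeted.
        println(normalizeSpec(userSpec, tablePartCols))
      }
    }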
      

People

    • Assignee: Eric Liang (ekhliang)
    • Reporter: Eric Liang (ekhliang)