Details

Type: Bug
Status: Open
Priority: Major
Resolution: Unresolved
Affects Version/s: 3.3.0
Fix Version/s: None
Component/s: None
Description
The issue does not occur in Spark 2.x (I am using 2.4.0), but only in 3.3.0 (also reproduced with 3.4.1).
Code to reproduce the issue:

scala> spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")
scala> val DF = Seq(("test1", 123)).toDF("name", "num")
scala> DF.write.option("path", "gs://test_bucket/table").mode("overwrite").partitionBy("num").format("orc").saveAsTable("test_schema.test_tb1")
The above code succeeds and creates an external Hive table, but no _SUCCESS file is generated.
Attaching the contents of the bucket after table creation.
The same code, when run with Spark 2.4.0 (with or without an external path), generates the _SUCCESS file:
scala> DF.write.mode(SaveMode.Overwrite).partitionBy("num").format("orc").saveAsTable("test_schema.test_tb1")
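For context (this note is not from the original report): the _SUCCESS marker is normally written by Hadoop's FileOutputCommitter on successful job commit, and whether it is written is controlled by a Hadoop flag that defaults to true. A minimal sanity check, assuming default Hadoop settings, is to confirm the flag has not been disabled in the cluster configuration:

```
# Hadoop flag controlling whether FileOutputCommitter writes the _SUCCESS
# marker on successful job commit; defaults to true.
mapreduce.fileoutputcommitter.marksuccessfuljobs=true
```

If this flag is at its default in both environments, the missing marker would point at the dynamic-partition-overwrite commit path in Spark 3.x rather than at the committer configuration.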