Description
For example, the test below:
test("SPARK-XXXXX: refresh cache in partition adding") { withNamespaceAndTable("ns", "tbl") { t => sql(s"CREATE TABLE $t (part int) $defaultUsing PARTITIONED BY (part)") sql(s"ALTER TABLE $t ADD PARTITION (part=0)") assert(!spark.catalog.isCached(t)) sql(s"CACHE TABLE $t") assert(spark.catalog.isCached(t)) checkAnswer(sql(s"SELECT * FROM $t"), Row(0)) sql(s"ALTER TABLE $t ADD PARTITION (part=1)") assert(spark.catalog.isCached(t)) checkAnswer(sql(s"SELECT * FROM $t"), Seq(Row(0), Row(1))) } }
fails with:
!== Correct Answer - 2 ==   == Spark Answer - 1 ==
!struct<>                   struct<part:int>
 [0]                        [0]
![1]
ScalaTestFailureLocation: org.apache.spark.sql.QueryTest$ at (QueryTest.scala:243)
because the ALTER TABLE .. ADD PARTITION command does not refresh the table cache.
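A minimal workaround sketch (not part of the original report), reusing the table name t and the spark session from the test above: explicitly refreshing the table after ADD PARTITION re-caches it and makes the final check pass, which points to the command itself needing to trigger such a refresh.

// Hypothetical workaround, not the actual fix: manually refresh the cached
// table after adding the partition so the cached plan sees the new partition.
sql(s"ALTER TABLE $t ADD PARTITION (part=1)")
spark.catalog.refreshTable(t)  // invalidates the stale cache entry and re-caches the table lazily
checkAnswer(sql(s"SELECT * FROM $t"), Seq(Row(0), Row(1)))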
Issue Links

is related to
    SPARK-34161 Check re-caching of v2 table dependents after table altering (Resolved)