Details
Type: Bug
Status: Closed
Priority: Critical
Resolution: Won't Fix
Description
Enabling the Hudi Spark SQL support breaks the ability to read a Hudi table registered in the Hive metastore from Spark:
bash-4.2$ ./spark3.0.2/bin/spark-shell \
  --packages org.apache.hudi:hudi-spark3-bundle_2.12:0.9.0,org.apache.spark:spark-avro_2.12:3.1.2 \
  --conf "spark.serializer=org.apache.spark.serializer.KryoSerializer" \
  --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
scala> spark.table("default.test_hudi_table").show
java.lang.UnsupportedOperationException: Unsupported parseMultipartIdentifier method
at org.apache.spark.sql.parser.HoodieCommonSqlParser.parseMultipartIdentifier(HoodieCommonSqlParser.scala:65)
at org.apache.spark.sql.SparkSession.table(SparkSession.scala:581)
... 47 elided
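For context, the stack trace shows that SparkSession.table resolves the table name through the session's SQL parser, and the parser injected by HoodieSparkSessionExtension throws from its parseMultipartIdentifier override instead of forwarding to Spark's built-in parser. A minimal sketch of the delegation that would avoid the error; the class name and the delegate parameter are illustrative, not Hudi's actual code:

import org.apache.spark.sql.catalyst.parser.ParserInterface

// Illustrative sketch only: forward multipart-identifier parsing
// (e.g. "default.test_hudi_table") to the session's original parser
// instead of throwing UnsupportedOperationException. "delegate" stands
// for the Spark parser that the extension wraps.
class DelegatingIdentifierParser(delegate: ParserInterface) {
  def parseMultipartIdentifier(sqlText: String): Seq[String] =
    delegate.parseMultipartIdentifier(sqlText)
}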
Removing the spark.sql.extensions config makes the Hive table readable from Spark again.
This affects at least Spark 3.0.x and 3.1.x.
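Until this is fixed, another possible workaround is to bypass the catalog lookup and read the table through the Hudi datasource directly; the base path below is hypothetical and would need to match the table's storage location:

scala> // read from storage instead of spark.table("default.test_hudi_table")
scala> spark.read.format("hudi").load("/path/to/test_hudi_table").show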