Apache Hudi / HUDI-2426

Spark SQL extensions break reading tables from the metastore


    Description

      Enabling the Hudi Spark SQL support breaks the ability to read a Hudi table registered in the Hive metastore from Spark:

       bash-4.2$ ./spark3.0.2/bin/spark-shell --packages org.apache.hudi:hudi-spark3-bundle_2.12:0.9.0,org.apache.spark:spark-avro_2.12:3.1.2 --conf "spark.serializer=org.apache.spark.serializer.KryoSerializer" --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'


      scala> spark.table("default.test_hudi_table").show
      java.lang.UnsupportedOperationException: Unsupported parseMultipartIdentifier method
      at org.apache.spark.sql.parser.HoodieCommonSqlParser.parseMultipartIdentifier(HoodieCommonSqlParser.scala:65)
      at org.apache.spark.sql.SparkSession.table(SparkSession.scala:581)
      ... 47 elided


      Removing the `spark.sql.extensions` config makes the Hive table readable again from Spark.

      This affects at least Spark 3.0.x and 3.1.x.
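      The stack trace shows that the failure comes from `HoodieCommonSqlParser.parseMultipartIdentifier` throwing `UnsupportedOperationException` instead of handing the call back to Spark. As a rough sketch of the delegation pattern that would avoid this (the class and trait names below are simplified stand-ins, not Hudi's or Spark's actual interfaces), an extension parser can forward any method it does not override to the underlying session parser:

```scala
// Simplified stand-in for Spark's ParserInterface (illustrative only).
trait SqlParser {
  def parseMultipartIdentifier(sqlText: String): Seq[String]
}

// Stand-in for Spark's built-in parser: splits "db.table" into parts.
object SparkDefaultParser extends SqlParser {
  def parseMultipartIdentifier(sqlText: String): Seq[String] =
    sqlText.split('.').toSeq
}

// An extension parser that delegates methods it does not implement
// to the wrapped Spark parser, rather than throwing
// UnsupportedOperationException as in the stack trace above.
class DelegatingExtensionParser(delegate: SqlParser) extends SqlParser {
  def parseMultipartIdentifier(sqlText: String): Seq[String] =
    delegate.parseMultipartIdentifier(sqlText)
}

object Demo extends App {
  val parser = new DelegatingExtensionParser(SparkDefaultParser)
  // With delegation, spark.table("default.test_hudi_table")-style
  // lookups can resolve the identifier instead of failing.
  println(parser.parseMultipartIdentifier("default.test_hudi_table").mkString("."))
}
```

      With this pattern, registering the extension no longer regresses plain metastore reads, because only the grammar the extension actually adds is intercepted.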


            People

              biyan900116@gmail.com Yann Byron
              parisni nicolas paris
              Shiyan Xu
