Description
There is an error when running the following test code:
test("create orc table") { spark.sql( s"""CREATE TABLE normal_orc_as_source_hive |USING org.apache.spark.sql.hive.orc |OPTIONS ( | PATH '${new File(orcTableAsDir.getAbsolutePath).toURI}' |) """.stripMargin) val df = spark.sql("select * from normal_orc_as_source_hive") spark.sql("desc formatted normal_orc_as_source_hive").show() }
Warning:
05:00:44.038 WARN org.apache.spark.sql.hive.test.TestHiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.hive.orc. Persisting data source table `default`.`normal_orc_as_source_hive` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
Root cause analysis:
The ORC-related code in HiveSerDe is incorrect:
org.apache.spark.sql.internal.HiveSerDe#sourceToSerDe
def sourceToSerDe(source: String): Option[HiveSerDe] = {
  val key = source.toLowerCase(Locale.ROOT) match {
    case s if s.startsWith("org.apache.spark.sql.parquet") => "parquet"
    case s if s.startsWith("org.apache.spark.sql.orc") => "orc"
    case s if s.equals("orcfile") => "orc"
    case s if s.equals("parquetfile") => "parquet"
    case s if s.equals("avrofile") => "avro"
    case s => s
  }
  // ... (rest of the method elided)
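The warning occurs because the provider name used in the test, org.apache.spark.sql.hive.orc, does not start with the prefix org.apache.spark.sql.orc, so it falls through to the default case and the resulting key has no entry in HiveSerDe's serde mapping. A minimal standalone Scala check illustrating the failed prefix match (the println lines are for illustration only):

val provider = "org.apache.spark.sql.hive.orc"
// The ORC guard in sourceToSerDe never fires for this provider:
println(provider.startsWith("org.apache.spark.sql.orc"))  // false
// The key therefore stays as the full provider name, which is not a
// recognized serde key, producing the "Couldn't find corresponding
// Hive SerDe" warning shown above.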
Solution:
change "org.apache.spark.sql.orc“ to "org.apache.spark.sql.hive.orc"