Description
In a kerberized Hadoop cluster, when Spark creates a table, the table owner is recorded as the full Kerberos PRINCIPAL string instead of the USER name. This is inconsistent with Hive and causes problems when using ROLEs in Hive, so we should fix it.
BEFORE
scala> sql("create table t(a int)").show
scala> sql("desc formatted t").show(false)
...
|Owner: |spark@EXAMPLE.COM | |
AFTER
scala> sql("create table t(a int)").show
scala> sql("desc formatted t").show(false)
...
|Owner: |spark | |
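The desired conversion is the one Hadoop itself performs: mapping a full Kerberos principal such as `spark@EXAMPLE.COM` (or `spark/host@EXAMPLE.COM`) to the short user name `spark`. In Spark the proper way is Hadoop's `UserGroupInformation.getShortUserName`; the helper below is only a minimal standalone sketch of the default mapping rule (strip everything from the first `/` or `@`), with hypothetical names:

```scala
// Sketch: derive a short user name from a Kerberos principal.
// `shortUserName` is a hypothetical helper; a real fix would rely on
// Hadoop's UserGroupInformation.getShortUserName instead.
object OwnerName {
  def shortUserName(principal: String): String =
    // The default auth_to_local rule keeps the primary component only,
    // i.e. everything before the first '/' (instance) or '@' (realm).
    principal.takeWhile(c => c != '/' && c != '@')

  def main(args: Array[String]): Unit = {
    println(shortUserName("spark@EXAMPLE.COM"))      // spark
    println(shortUserName("spark/host@EXAMPLE.COM")) // spark
  }
}
```

Note that real deployments may configure custom `hadoop.security.auth_to_local` rules, which is why delegating to Hadoop's own mapping is preferable to string manipulation.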