Hive compatibility is important for users who run both Hive and Spark SQL, or who migrate workloads from Hive to Spark SQL.
We plan to add a SQLConf for type coercion compatibility, spark.sql.typeCoercion.mode. Users can choose between Spark's native mode (the default, with the value default) and Hive mode (with the value hive).
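A minimal sketch of how the proposed conf might be used, assuming the name and values described above (the conf does not exist yet, and the exact behavior difference shown is illustrative):

```sql
-- Proposed conf: switch type coercion to Hive-compatible rules.
SET spark.sql.typeCoercion.mode=hive;

-- Example of a query whose result can depend on the coercion mode:
-- comparing an INT to a STRING requires an implicit cast, and the
-- target type of that cast is chosen by the coercion rules.
SELECT 1 = '1';
```

Keeping the switch behind a single conf means users can flip an entire session between the two rule sets without rewriting queries.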
Before we deliver the Hive compatibility mode, we plan to write a set of test cases that can be run easily against both Spark and Hive, so that we can compare whether the results match. When new type coercion rules are added, these tests also let us track the behavior changes. The test cases can additionally be backported to previous Spark versions to determine which changes we have made.
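To make the idea concrete, a few illustrative test-case queries are sketched below (these are assumptions about what the suite might contain, not queries from an actual test file). Each statement is plain SQL that can be run unchanged in both the Hive CLI and spark-sql, with the outputs diffed afterwards:

```sql
-- String vs. numeric comparison: the result depends on which common
-- type the two sides are coerced to.
SELECT 1 = '1';

-- Implicit cast in arithmetic between STRING and INT.
SELECT '2' + 3;

-- Type of a UNION of mismatched branches: the result column type is
-- decided by the coercion rules for finding a common type.
SELECT a FROM (SELECT 1 AS a UNION ALL SELECT '2' AS a) t;
```

Because the queries are engine-agnostic, the same file can drive both sides of the comparison, and rerunning it against an older Spark release shows exactly where the coercion behavior diverged.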