Description
The following SQL should work, but it fails:
spark-sql> CREATE TABLE tbl1 STORED AS PARQUET AS SELECT INTERVAL '1-1' YEAR TO MONTH AS YM;
21/10/07 21:35:59 WARN SessionState: METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
Error in query: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: Error: type expected at the position 0 of 'interval year to month' but 'interval year to month' is found
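
For reference, a minimal sketch of the same reproduction driven from a Scala application rather than the spark-sql shell. The session setup (app name, enableHiveSupport) is an assumption about the environment; the statement itself is the one from the report above.

import org.apache.spark.sql.SparkSession

object IntervalCtasRepro {
  def main(args: Array[String]): Unit = {
    // Assumed setup: a Hive-enabled session, matching the spark-sql shell used above.
    val spark = SparkSession.builder()
      .appName("interval-ctas-repro")
      .enableHiveSupport()
      .getOrCreate()

    // CTAS into a Hive SerDe Parquet table with an ANSI year-month interval column.
    // Per the report, this throws HiveException: "type expected at the position 0
    // of 'interval year to month'" while mapping the Spark type to a Hive type.
    spark.sql(
      "CREATE TABLE tbl1 STORED AS PARQUET AS " +
      "SELECT INTERVAL '1-1' YEAR TO MONTH AS YM")

    spark.stop()
  }
}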