Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Not A Problem
- Affects Version/s: 1.2.1
- Fix Version/s: None
- Component/s: None
Description
If you populate a table using INSERT OVERWRITE and then try to rename it with ALTER TABLE, the rename fails with:
Error: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Unable to alter table. (state=,code=0)
The following SQL statements reproduce the error:

CREATE TABLE `tmp_table` (salesamount_c1 DOUBLE);

INSERT OVERWRITE TABLE tmp_table
SELECT MIN(sales_customer.salesamount) salesamount_c1
FROM (
  SELECT SUM(sales.salesamount) salesamount
  FROM internalsales sales
) sales_customer;

ALTER TABLE tmp_table RENAME TO not_tmp;
However, if you change 'OVERWRITE' to 'INTO', the same sequence of statements works.
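For reference, the working variant would look like the following sketch (same statements as above, with only the INSERT clause changed; table and column names are taken from the repro case):

```sql
-- Identical DDL to the failing case
CREATE TABLE `tmp_table` (salesamount_c1 DOUBLE);

-- INSERT INTO appends rather than overwrites; with this variant
-- the subsequent rename reportedly succeeds
INSERT INTO TABLE tmp_table
SELECT MIN(sales_customer.salesamount) salesamount_c1
FROM (
  SELECT SUM(sales.salesamount) salesamount
  FROM internalsales sales
) sales_customer;

-- Rename no longer fails in this variant
ALTER TABLE tmp_table RENAME TO not_tmp;
```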
This happens on our CDH 5.3 cluster with multiple workers; on the CDH 5.3 QuickStart VM the same SQL does not produce an error. In both cases we used Spark 1.2.1 built for Hadoop 2.4+.