Details
- Type: Bug
- Status: Closed
- Priority: Minor
- Resolution: Fixed
- Affects Version/s: 1.4.1
- Fix Version/s: None
- Component/s: None
Description
On Spark 1.4.1, when using the DataFrameWriter.insertInto() method on a Hive table created in a non-default schema, like so:
`myDF.write.insertInto("my_db.some_table")`
the following exception is thrown:
`org.apache.spark.sql.AnalysisException: no such table my_db.some_table;`
The table exists, because I can query it with `sqlContext.sql("SELECT * FROM my_db.some_table")`.
However, when some_table is created in the default schema and the call to insertInto becomes:
`myDF.write.insertInto("some_table")`
it works fine.
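A minimal reproduction sketch, assuming a spark-shell built with Hive support (so `sc` is the SparkContext and `sqlContext` is a HiveContext), and that `my_db.some_table` already exists with a compatible schema; the DataFrame contents are made up for illustration:

```scala
// Hypothetical DataFrame; columns are assumed to match my_db.some_table's schema.
import sqlContext.implicits._
val myDF = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "value")

// Querying the qualified table name works:
sqlContext.sql("SELECT * FROM my_db.some_table").show()

// But inserting into the same qualified name throws:
// org.apache.spark.sql.AnalysisException: no such table my_db.some_table;
myDF.write.insertInto("my_db.some_table")

// Inserting into a table in the default schema works fine:
// myDF.write.insertInto("some_table")
```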