Details
Description
Since the introduction of DataFrames in Spark 1.3.0, and prior to SPARK-6908 landing in master, a user could obtain a DataFrame for a Hive table using `sqlContext.table("databaseName.tableName")`.
Since SPARK-6908, the same call throws a NoSuchTableException.
This is a behavior change in the non-experimental `sqlContext.table()` API and will require user code to be modified to work properly with 1.4.0.
The only viable workaround I could find is
`sqlContext.sql("select * from databaseName.tableName")`
which seems like a hack.
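To illustrate, a minimal sketch of the regression and the workaround, assuming a Spark 1.4 HiveContext bound to `sqlContext` and an existing Hive table `databaseName.tableName` (both names are illustrative, not from a real cluster):

```scala
// Requires a running Spark 1.4 application with Hive support;
// not runnable standalone.
import org.apache.spark.sql.hive.HiveContext

val sqlContext = new HiveContext(sc)  // sc: an existing SparkContext

// Before SPARK-6908 this returned a DataFrame for the Hive table:
//   val df = sqlContext.table("databaseName.tableName")
// After SPARK-6908 the same call throws NoSuchTableException,
// because table() no longer parses a database-qualified name.

// Workaround: go through the SQL parser instead of the table() API.
val df = sqlContext.sql("select * from databaseName.tableName")
```

The SQL path works because the parser still resolves database-qualified identifiers, while `table()` now treats the whole string as a table name in the current database.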
Issue Links
- is duplicated by
  - SPARK-8107 sqlContext.table() should be able to take a database name as an additional argument. (Resolved)
  - SPARK-8550 table() no longer supports specifying the database - ie. table([database].[table]). (Resolved)