Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 1.4.1, 1.5.1
- Fix Version/s: None
- Component/s: None
- Environment: Spark standalone cluster
Description
I am running spark-submit on Windows 8.1 against a Spark standalone cluster (one worker, one master), and the job throws an exception in the DataFrameWriter.jdbc(..) Scala function.
We found that the following test:
var tableExists = JdbcUtils.tableExists(conn, table)
always returns false, even when the table has already been created. This drives the function into the branch guarded by:
if (!tableExists)
which attempts to create the table from the specified DataFrame, and the generated CREATE TABLE statement then fails with a SQL syntax error.
This happens with Spark 1.4.1 and Spark 1.5.1 (our dev environment).
Please help!
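The failure mode can be sketched as follows. This is a hypothetical, simplified model of the existence-check pattern, not Spark's actual code; per the linked SPARK-9078, the probe query uses the non-standard LIMIT keyword, and any exception, including a syntax error from a database that rejects LIMIT, is read as "table does not exist":

```scala
// Simplified sketch (assumed shape, not Spark's real JdbcUtils):
// runQuery stands in for executing SQL over a JDBC connection.
def tableExists(runQuery: String => Unit, table: String): Boolean = {
  try {
    // LIMIT is not standard SQL; some databases reject it outright.
    runQuery(s"SELECT 1 FROM $table LIMIT 1")
    true
  } catch {
    // A syntax error is swallowed here and misreported as a missing table.
    case _: Exception => false
  }
}

// Stand-in for a database dialect that does not support LIMIT:
def strictDb(sql: String): Unit =
  if (sql.toUpperCase.contains("LIMIT"))
    throw new RuntimeException("syntax error near LIMIT")

// Returns false even though the table exists, so the caller proceeds
// to CREATE TABLE and hits the reported SQL syntax error.
println(tableExists(strictDb, "existing_table")) // prints false
```

Under this reading, the false result is not about the table at all; it is the probe query itself failing on the target database's SQL dialect.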
Issue Links
- duplicates: SPARK-9078 Use of non-standard LIMIT keyword in JDBC tableExists code (Resolved)
- is cloned by: SPARK-11953 Acknowledge Append Mode in DataFrameWriter (Resolved)