Description
In particular, it would be good if the dialect could handle Derby's user-defined types. The following script fails:
import org.apache.spark.sql._
import org.apache.spark.sql.types._

// the following script was used to create a Derby table
// which has a column of user-defined type:
//
// create type properties external name 'java.util.Properties' language java;
//
// create function systemProperties() returns properties
// language java parameter style java no sql
// external name 'java.lang.System.getProperties';
//
// create table propertiesTable( props properties );
//
// insert into propertiesTable values ( null ), ( systemProperties() );
//
// select * from propertiesTable;

// Spark cannot handle a table which has a column of type java.sql.Types.JAVA_OBJECT:
//
// java.sql.SQLException: Unsupported type 2000
//
val df = sqlContext.read.format("jdbc").options(
  Map("url" -> "jdbc:derby:/Users/rhillegas/derby/databases/derby1",
      "dbtable" -> "app.propertiesTable")).load()

// shutdown the Derby engine
val shutdown = sqlContext.read.format("jdbc").options(
  Map("url" -> "jdbc:derby:;shutdown=true",
      "dbtable" -> "")).load()

exit()
The inability to handle user-defined types probably affects other databases besides Derby.
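One conceivable workaround is to register a custom dialect that maps the vendor type code to a Catalyst type, using Spark's public JdbcDialects.registerDialect API. The sketch below is an assumption-laden illustration only: the choice of BinaryType as the target type, and whether the Derby driver can actually materialize a JAVA_OBJECT column that way, would need verification.

```scala
import java.sql.Types
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
import org.apache.spark.sql.types._

// Hypothetical dialect that maps Derby's JAVA_OBJECT columns (type code 2000)
// to Catalyst's BinaryType so that schema resolution no longer throws
// "Unsupported type 2000". Whether BinaryType is the right target depends on
// how the Derby JDBC driver serializes the underlying Java object.
object DerbyUdtDialect extends JdbcDialect {
  // Claim all Derby connection URLs.
  override def canHandle(url: String): Boolean =
    url.startsWith("jdbc:derby")

  // Return Some(catalystType) to override the default type mapping,
  // or None to fall back to Spark's built-in handling.
  override def getCatalystType(
      sqlType: Int,
      typeName: String,
      size: Int,
      md: MetadataBuilder): Option[DataType] =
    if (sqlType == Types.JAVA_OBJECT) Some(BinaryType) else None
}

// Register the dialect before loading the table:
// JdbcDialects.registerDialect(DerbyUdtDialect)
```

A built-in fix in the Derby dialect (or a default fallback for JAVA_OBJECT across dialects) would avoid requiring every user to register such a workaround.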
Issue Links
- is related to SPARK-11010: Fixes and enhancements addressing UDTs' api and several usability concerns (Resolved)