Spark / SPARK-22303

Getting java.sql.SQLException: Unsupported type 101 for BINARY_DOUBLE


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.3.0
    • Component/s: SQL
    • Labels: None

    Description

      When a table contains columns of type BINARY_DOUBLE or BINARY_FLOAT, the Spark JDBC connector throws an SQLException while resolving the table schema.
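
      A read along the following lines is enough to trigger it and produces the stack trace below (a minimal sketch; the URL, table name, and credentials are hypothetical):

        import org.apache.spark.sql.SparkSession

        val spark = SparkSession.builder().appName("binary-double-repro").getOrCreate()

        // Hypothetical Oracle connection; MEASUREMENTS is assumed to contain a
        // BINARY_DOUBLE column. The failure occurs while resolving the schema,
        // before any rows are fetched.
        val df = spark.read
          .format("jdbc")
          .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")
          .option("dbtable", "MEASUREMENTS")
          .option("user", "scott")
          .option("password", "tiger")
          .load()   // fails with: java.sql.SQLException: Unsupported type 101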

      java.sql.SQLException: Unsupported type 101
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$getCatalystType(JdbcUtils.scala:235)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$8.apply(JdbcUtils.scala:292)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$8.apply(JdbcUtils.scala:292)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.getSchema(JdbcUtils.scala:291)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:64)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:113)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:47)
        at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:306)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
      

      These types are Oracle-specific; they are described at
      https://docs.oracle.com/cd/E11882_01/timesten.112/e21642/types.htm#TTSQL148
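
      Until a fixed version is available, registering a custom JdbcDialect is one way to work around this. The sketch below assumes Oracle's driver reports BINARY_FLOAT as JDBC type code 100 and BINARY_DOUBLE as 101 (the code in the exception above); it is a workaround sketch, not the committed fix:

        import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
        import org.apache.spark.sql.types.{DataType, DoubleType, FloatType, MetadataBuilder}

        // Workaround sketch: map Oracle's BINARY_FLOAT / BINARY_DOUBLE type codes
        // to Catalyst types that JdbcUtils.getCatalystType cannot derive on its own.
        object OracleBinaryTypesDialect extends JdbcDialect {
          override def canHandle(url: String): Boolean =
            url.startsWith("jdbc:oracle")

          override def getCatalystType(
              sqlType: Int,
              typeName: String,
              size: Int,
              md: MetadataBuilder): Option[DataType] = sqlType match {
            case 100 => Some(FloatType)   // BINARY_FLOAT
            case 101 => Some(DoubleType)  // BINARY_DOUBLE
            case _   => None              // defer to the built-in Oracle dialect
          }
        }

        // Register before calling spark.read so this dialect is consulted first.
        JdbcDialects.registerDialect(OracleBinaryTypesDialect)

      With the 2.3.0 fix in place the workaround should no longer be needed.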

          People

            Assignee: Kohki Nishio (taroplus)
            Reporter: Kohki Nishio (taroplus)
