Spark / SPARK-8500

Support for array types in JDBCRDD


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 1.4.0
    • Fix Version/s: None
    • Component/s: SQL
    • Environment: MacOSX 10.10.3, Postgres 9.3.5, Spark 1.4 (Hadoop 2.6),
      Scala 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_40),
      launched with: spark-shell --driver-class-path ./postgresql-9.3-1103.jdbc41.jar

    Description

      Loading a table that has a text[] column via sqlContext raises an error.

      sqlContext.load("jdbc", Map("url" -> "jdbc:postgresql://localhost/my_db", "dbtable" -> "table"))

      The table has a column of a Postgres array type:
      my_col | text[] |

      Stacktrace: https://gist.github.com/8b163bf5fdc2aea7dbb6.git

      The same error occurs in the pyspark shell.
      Loading another table that has no text[] column works fine.

      Possible hint:
      https://github.com/apache/spark/blob/d986fb9a378416248768828e6e6c7405697f9a5a/sql/core/src/main/scala/org/apache/spark/sql/jdbc/JDBCRDD.scala#L57
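
      A possible workaround sketch (not part of the original report): pass a subquery as "dbtable" that casts the array column to text on the Postgres side, so the JDBC driver reports a column type that JDBCRDD can map. The alias "tmp" and the cast are illustrative assumptions; the URL, table, and column names are taken from the description above.

      val df = sqlContext.load("jdbc", Map(
        "url"     -> "jdbc:postgresql://localhost/my_db",
        // cast text[] to text so the JDBC type mapping does not hit ARRAY
        "dbtable" -> "(SELECT my_col::text AS my_col FROM table) AS tmp"))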

      People

        Assignee: Unassigned
        Reporter: mpisanko (michal pisanko)
        Votes: 1
        Watchers: 3
