Spark / SPARK-9505

DataFrames: MySQL JDBC does not support column names with special characters


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 1.3.0
    • Fix Version/s: None
    • Component/s: Spark Core, SQL
    • Labels: None

    Description

Hi all,

      I hit the issue above when connecting to a MySQL database through SQLContext. If a MySQL table's column name contains special characters such as #, [, ], or %, it throws the exception: "You have an error in your SQL syntax".

Below is the code:

      import org.apache.spark.sql.SQLContext

      Class.forName("com.mysql.jdbc.Driver").newInstance()

      val url = "jdbc:mysql://localhost:3306/sakila?user=root&password=xxx"
      val driver = "com.mysql.jdbc.Driver"

      val sqlContext = new SQLContext(sc)

      val output = sqlContext.load("jdbc", Map(
        "url" -> url,
        "driver" -> driver,
        "dbtable" -> "(SELECT `ID`, `NAME%` FROM `agent`) AS tableA"
      ))
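The underlying problem is that the generated SELECT list repeats the raw column names without quoting them, so MySQL rejects identifiers like `NAME%`. A minimal sketch of the quoting a JDBC dialect would need is below; the object and method names here are illustrative assumptions, not Spark's actual API:

```scala
// Sketch: quote MySQL identifiers so column names with special
// characters (#, [, ], %, spaces) survive in generated SQL.
// Names here are illustrative, not Spark's actual implementation.
object IdentifierQuoting {
  // Wrap a column name in backticks, doubling any embedded backtick
  // (MySQL's escape rule inside backtick-quoted identifiers).
  def quoteMySqlIdentifier(col: String): String =
    "`" + col.replace("`", "``") + "`"

  def main(args: Array[String]): Unit = {
    val cols = Seq("ID", "NAME%", "weird#col")
    // Build the SELECT list a JDBC source could safely emit.
    val selectList = cols.map(quoteMySqlIdentifier).mkString(", ")
    println(s"SELECT $selectList FROM `agent`")
  }
}
```

With quoting applied, the generated statement becomes ``SELECT `ID`, `NAME%`, `weird#col` FROM `agent` ``, which MySQL parses without a syntax error.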

I hope DataFrames via SQLContext can support special characters soon, as this has become a blocker for us.

      Thanks

People

        Assignee: Unassigned
        Reporter: Pangjiu
