Livy / LIVY-796

Spark submit via Livy is failing because of a JDBC driver issue


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: 0.9.0
    • Component/s: API
    • Labels: None
    • Environment: spark 2.4.4

    Description

I am facing a troubling issue when I submit my Spark job with Livy. This is the Spark code I am submitting:

      raw_df = spark.read.format("jdbc")\
      .option("url", url)\
      .option("query", query)\
      .option("user", user)\
      .option("password", password)\
      .option("numPartitions", 4)\
      .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")\
      .load()

      This code runs perfectly with a normal spark-submit, but I want to submit the job via Livy. This is the payload I am passing in the POST request:


      payload = {
          "queue": "default",
          "name": "appname",
          "proxyUser": "hadoop",
          "conf": {"spark.jars.packages": "com.microsoft.sqlserver:mssql-jdbc:7.2.1.jre8"},
          "file": "s3://dbbucket/emrbuilds/app/src/main_pull.py",
          "args": ["2020-10-10"]
      }
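For reference, the payload above can be assembled and submitted to Livy's /batches REST endpoint roughly as follows (the Livy host and port are assumptions; adjust them for your cluster):

```python
import json

# Hypothetical Livy endpoint; substitute your Livy server's host/port.
LIVY_URL = "http://localhost:8998/batches"

# Batch payload from the report above, as a plain Python dict.
payload = {
    "queue": "default",
    "name": "appname",
    "proxyUser": "hadoop",
    "conf": {"spark.jars.packages": "com.microsoft.sqlserver:mssql-jdbc:7.2.1.jre8"},
    "file": "s3://dbbucket/emrbuilds/app/src/main_pull.py",
    "args": ["2020-10-10"],
}

# Livy expects a JSON body with a JSON content type.
body = json.dumps(payload)
headers = {"Content-Type": "application/json"}

# Submit with any HTTP client, e.g.:
#   import requests
#   resp = requests.post(LIVY_URL, data=body, headers=headers)
#   print(resp.status_code, resp.json())
print(body)
```

This is only a sketch of the submission itself; the failure reported below happens later, inside the batch's Spark job when the JDBC relation is resolved.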

      But I am getting a Java NullPointerException:

      Traceback (most recent call last):
      File "/mnt/tmp/spark-327a7343-4558-4fd1-a63e-d3a1e68a1963/main_pull.py", line 87, in <module>
      .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")\
      File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/readwriter.py", line 172, in load
      File "/usr/lib/spark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
      File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
      File "/usr/lib/spark/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
      py4j.protocol.Py4JJavaError: An error occurred while calling o206.load.
      : java.lang.NullPointerException
      at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:71)
      at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:210)
      at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:35)
      at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
      at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
      at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
      at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:498)
      at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
      at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
      at py4j.Gateway.invoke(Gateway.java:282)
      at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
      at py4j.commands.CallCommand.execute(CallCommand.java:79)
      at py4j.GatewayConnection.run(GatewayConnection.java:238)
      at java.lang.Thread.run(Thread.java:748)


People

    Assignee: Unassigned
    Reporter: kalani.mahesh (mahesh)
    Votes: 0
    Watchers: 2
