[LIVY-590] ClassNotFoundException: javax.ws.rs.ext.MessageBodyReader on Livy 0.6.0

    Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 0.6.0
    • Fix Version/s: None
    • Component/s: Server
    • Labels:
      None

      Description

      After upgrading to Livy 0.6.0-incubating, Spark jobs submitted through Livy started failing. Details of the problem are below.

      1. When I start the Livy server, the following error is logged:

      19/04/18 23:13:35 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      java.lang.NoClassDefFoundError: javax/ws/rs/ext/MessageBodyReader
      at java.lang.ClassLoader.defineClass1(Native Method)
      at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
      ...
      Caused by: java.lang.ClassNotFoundException: javax.ws.rs.ext.MessageBodyReader
      at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
      at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
      ... 50 more
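
      A quick way to confirm whether the missing JAX-RS API class is visible on the server's classpath is a small standalone probe (a hypothetical helper, not part of Livy):

      ```java
      // Standalone probe: reports whether the JAX-RS API class that Livy fails
      // to load is visible on the current classpath. CheckJaxRs is a
      // hypothetical helper for diagnosis, not part of Livy itself.
      public class CheckJaxRs {

          static boolean classPresent(String name) {
              try {
                  Class.forName(name);
                  return true;
              } catch (ClassNotFoundException e) {
                  return false;
              }
          }

          public static void main(String[] args) {
              System.out.println(classPresent("javax.ws.rs.ext.MessageBodyReader")
                      ? "present" : "missing");
          }
      }
      ```

      Run it with the same classpath the Livy server uses (e.g. the jars under the server's jars directory); on an affected 0.6.0 install it prints "missing".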


      2. When I submit a Spark job through Livy, the job stays stuck in the "starting" state, and Livy cannot retrieve the job's appId.

      $ curl http://10.10.144.20:8998/batches
      {
        "from": 0,
        "total": 1,
        "sessions": [
          {
            "id": 0,
            "name": null,
            "state": "starting",
            "appId": null,
            "appInfo": { "driverLogUrl": null, "sparkUiUrl": null },
            "log": [
              "19/04/18 20:28:58 INFO MemoryStore: MemoryStore cleared",
              "19/04/18 20:28:58 INFO BlockManager: BlockManager stopped",
              "19/04/18 20:28:58 INFO BlockManagerMaster: BlockManagerMaster stopped",
              "19/04/18 20:28:58 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!",
              "19/04/18 20:28:58 INFO SparkContext: Successfully stopped SparkContext",
              "19/04/18 20:28:58 INFO ShutdownHookManager: Shutdown hook called",
              "19/04/18 20:28:58 INFO ShutdownHookManager: Deleting directory /mnt/tmp/spark-b8039adb-f3df-4526-8123-1bb2aee6ed7c",
              "19/04/18 20:28:58 INFO ShutdownHookManager: Deleting directory /mnt/tmp/spark-b48219fd-2607-4a7d-95bd-538c89f90ebb",
              "\nstderr: ",
              "\nYARN Diagnostics: "
            ]
          }
        ]
      }


      This happens because the Livy package no longer bundles the jersey-core jar. The change was introduced by LIVY-502.

      I think the Livy package should include the jersey-core jar. After modifying server/pom.xml (diff attached to this JIRA), I was able to run Spark jobs without this error.
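
      The attached LIVY-590.patch is the authoritative change; as a rough sketch only, re-adding the Jersey 1.x core artifact (which bundles the javax.ws.rs API classes) to server/pom.xml would look something like the fragment below. The coordinates and version here are illustrative assumptions, not taken from the patch.

      ```xml
      <!-- Illustrative sketch only; see the attached LIVY-590.patch for the
           real diff. com.sun.jersey:jersey-core (Jersey 1.x) bundles the
           javax.ws.rs API, including javax.ws.rs.ext.MessageBodyReader.
           The version below is an assumption. -->
      <dependency>
        <groupId>com.sun.jersey</groupId>
        <artifactId>jersey-core</artifactId>
        <version>1.19</version>
      </dependency>
      ```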

        Attachments

        1. LIVY-590.patch (0.5 kB, Aki Tanaka)


            People

            • Assignee: Unassigned
            • Reporter: tanakahda Aki Tanaka
            • Votes: 0
            • Watchers: 2
