Flink / FLINK-2987

Flink 0.10 fails to start on YARN 2.6.0


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.10.0
    • Fix Version/s: 0.10.1, 1.0.0
    • Labels:
      None
    • Environment:

      Microsoft Azure HDInsight on Linux: 3.2.1000.0

    Description

      While testing Flink for the release, I noticed that it does not start on YARN due to classloading issues.

      The error message is the following:

      robert@hn0-apache:~/flink-0.10.0$ ./bin/yarn-session.sh  -n 2
      Exception in thread "main" java.lang.RuntimeException: Could not instantiate type 'org.apache.flink.yarn.FlinkYarnClient' Most likely the constructor (or a member variable initialization) threw an exception: com/sun/jersey/api/client/config/ClientConfig
      	at org.apache.flink.util.InstantiationUtil.instantiate(InstantiationUtil.java:152)
      	at org.apache.flink.util.InstantiationUtil.instantiate(InstantiationUtil.java:118)
      	at org.apache.flink.client.FlinkYarnSessionCli.getFlinkYarnClient(FlinkYarnSessionCli.java:262)
      	at org.apache.flink.client.FlinkYarnSessionCli.createFlinkYarnClient(FlinkYarnSessionCli.java:107)
      	at org.apache.flink.client.FlinkYarnSessionCli.run(FlinkYarnSessionCli.java:400)
      	at org.apache.flink.client.FlinkYarnSessionCli.main(FlinkYarnSessionCli.java:351)
      Caused by: java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
      	at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:45)
      	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:163)
      	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
      	at org.apache.flink.yarn.FlinkYarnClientBase.<init>(FlinkYarnClientBase.java:157)
      	at org.apache.flink.yarn.FlinkYarnClient.<init>(FlinkYarnClient.java:23)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
      	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
      	at java.lang.Class.newInstance(Class.java:379)
      	at org.apache.flink.util.InstantiationUtil.instantiate(InstantiationUtil.java:139)
      	... 5 more
      Caused by: java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig
      	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
      	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
      	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
      	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
      	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
      	... 16 more
      
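      The wrapper message above ("Most likely the constructor ... threw an exception") can be puzzling, because the real problem is a missing class rather than constructor logic. The following standalone sketch (not Flink code; the class name is invented for illustration) shows why a NoClassDefFoundError raised while a constructor runs surfaces to the reflective caller as the constructor "throwing", which is exactly what InstantiationUtil then reports:

      ```java
      import java.lang.reflect.InvocationTargetException;

      // Illustration only: why a missing transitive dependency looks like a
      // constructor failure when a class is instantiated via reflection.
      public class ReflectionDemo {
          public static class NeedsJersey {
              public NeedsJersey() {
                  // Simulate touching a class that is absent from the classpath,
                  // as FlinkYarnClientBase does when it initializes YarnClientImpl.
                  throw new NoClassDefFoundError("com/sun/jersey/api/client/config/ClientConfig");
              }
          }

          // Returns the class name of the error the constructor raised.
          public static String causeOfReflectiveInit() {
              try {
                  NeedsJersey.class.getDeclaredConstructor().newInstance();
                  return "no error";
              } catch (InvocationTargetException e) {
                  // The linkage error is delivered as the cause of the
                  // reflective invocation's exception.
                  return e.getCause().getClass().getName();
              } catch (ReflectiveOperationException e) {
                  return e.getClass().getName();
              }
          }

          public static void main(String[] args) {
              System.out.println("constructor threw: " + causeOfReflectiveInit());
          }
      }
      ```

      (Note: the stack trace above goes through Class.newInstance, which propagates the error directly; getDeclaredConstructor().newInstance() wraps it in InvocationTargetException, but the root cause is the same missing class.)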

      The issue occurs with the following versions:

      • flink-0.10.0-bin-hadoop26-scala_2.11.tgz
      • flink-0.10.0-bin-hadoop26-scala_2.10.tgz
      • flink-0.10.0-bin-hadoop27-scala_2.10.tgz

      It works for:

      • flink-0.10.0-bin-hadoop24-scala_2.10.tgz
      • flink-0.9.1-bin-hadoop26.tgz

      Interestingly, the issue only occurs when HADOOP_CONF_DIR is set. Otherwise, the YARN client starts up until it is unable to connect to the ResourceManager.

      The missing class is part of the jersey-client dependency, which we exclude from Hadoop in flink-shaded-hadoop2.
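      A quick way to confirm this diagnosis is to probe the classpath for the class that YARN's TimelineClient needs. The sketch below is a generic probe (the method name is mine, not Flink's); run it with the same classpath as yarn-session.sh to see whether jersey-client is actually visible:

      ```java
      // Minimal classpath probe: reports whether a fully qualified class name
      // is loadable. On the failing builds, the jersey ClientConfig class that
      // TimelineClient.createTimelineClient needs is not on the classpath.
      public class ClasspathProbe {
          public static boolean isPresent(String className) {
              try {
                  Class.forName(className, false, ClasspathProbe.class.getClassLoader());
                  return true;
              } catch (ClassNotFoundException e) {
                  return false;
              }
          }

          public static void main(String[] args) {
              String cls = "com.sun.jersey.api.client.config.ClientConfig";
              System.out.println(cls + (isPresent(cls) ? " FOUND" : " MISSING"));
          }
      }
      ```

      On a stock JDK without jersey-client on the classpath this prints MISSING, matching the ClassNotFoundException in the trace; re-adding jersey-client (or not excluding it in flink-shaded-hadoop2) makes it resolvable again.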

            People

            • Assignee: rmetzger (Robert Metzger)
            • Reporter: rmetzger (Robert Metzger)