Spark / SPARK-12736

Standalone Master cannot be started due to NoClassDefFoundError: org/spark-project/guava/collect/Maps

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.0.0
    • Fix Version/s: 2.0.0
    • Component/s: Deploy, Spark Core
    • Labels: None

      Description

      After commit https://github.com/apache/spark/commit/659fd9d04b988d48960eac4f352ca37066f43f5c, starting the standalone Master (using ./sbin/start-master.sh) fails with the following exception:

      Spark Command: /Library/Java/JavaVirtualMachines/Current/Contents/Home/bin/java
      -cp /Users/jacek/dev/oss/spark/conf/:/Users/jacek/dev/oss/spark/assembly/target/scala-2.11/spark-assembly-2.0.0-SNAPSHOT-hadoop2.7.1.jar:/Users/jacek/dev/oss/spark/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar:/Users/jacek/dev/oss/spark/lib_managed/jars/datanucleus-core-3.2.10.jar:/Users/jacek/dev/oss/spark/lib_managed/jars/datanucleus-rdbms-3.2.9.jar
      -Xms1g -Xmx1g org.apache.spark.deploy.master.Master --ip japila.local
      --port 7077 --webui-port 8080
      ========================================
      Setting default log level to "WARN".
      To adjust logging level use sc.setLogLevel(newLevel).
      Exception in thread "main" java.lang.NoClassDefFoundError:
      org/spark-project/guava/collect/Maps
              at org.apache.hadoop.metrics2.lib.MetricsRegistry.<init>(MetricsRegistry.java:42)
              at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.<init>(MetricsSystemImpl.java:94)
              at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.<init>(MetricsSystemImpl.java:141)
              at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:38)
              at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:36)
              at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:120)
              at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:236)
              at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2156)
              at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2156)
              at scala.Option.getOrElse(Option.scala:121)
              at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2156)
              at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:214)
              at org.apache.spark.deploy.master.Master$.startRpcEnvAndEndpoint(Master.scala:1108)
              at org.apache.spark.deploy.master.Master$.main(Master.scala:1093)
              at org.apache.spark.deploy.master.Master.main(Master.scala)
      Caused by: java.lang.ClassNotFoundException:
      org.spark-project.guava.collect.Maps
              at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
              at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
              at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
              at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
              ... 15 more
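
      The trace shows Hadoop's MetricsRegistry referencing the relocated (shaded) Guava class org.spark-project.guava.collect.Maps, which the assembly no longer bundles. A minimal way to confirm whether that relocated class is visible on a given classpath is a reflective load; this is a diagnostic sketch, not part of the issue itself, and the class name CheckShadedGuava is hypothetical:

      ```java
      // Diagnostic sketch: try to load the relocated (shaded) Guava class that
      // Hadoop's MetricsRegistry references. If the running classpath no longer
      // bundles it, Class.forName throws ClassNotFoundException, matching the
      // "Caused by" line in the stack trace above. Note the hyphen in
      // "spark-project" is illegal in Java source packages but is fine in a
      // runtime class-name string passed to Class.forName.
      public class CheckShadedGuava {
          public static void main(String[] args) {
              String className = "org.spark-project.guava.collect.Maps";
              try {
                  Class.forName(className);
                  System.out.println(className + " is on the classpath");
              } catch (ClassNotFoundException e) {
                  System.out.println(className + " is missing from the classpath");
              }
          }
      }
      ```

      Run against the assembly jar from the command above (java -cp <assembly-jar> CheckShadedGuava) to verify whether the shaded class survived the change in that commit.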
      

              People

              • Assignee:
                jlaskowski Jacek Laskowski
              • Reporter:
                jlaskowski Jacek Laskowski
              • Votes:
                0
              • Watchers:
                2

                Dates

                • Created:
                • Updated:
                • Resolved: