Spark / SPARK-9441

NoSuchMethodError: com.typesafe.config.Config.getDuration


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 1.3.1
    • Fix Version/s: None
    • Component/s: Deploy
    • Labels: None

    Description

      I recently migrated my Spark-based REST service from Spark 1.0.2 to 1.3.1.

      15/07/29 10:31:12 INFO spark.SparkContext: Running Spark version 1.3.1
      15/07/29 10:31:12 INFO spark.SecurityManager: Changing view acls to: npatel
      15/07/29 10:31:12 INFO spark.SecurityManager: Changing modify acls to: npatel
      15/07/29 10:31:12 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(npatel); users with modify permissions: Set(npatel)
      Exception in thread "main" java.lang.NoSuchMethodError: com.typesafe.config.Config.getDuration(Ljava/lang/String;Ljava/util/concurrent/TimeUnit;)J
          at akka.util.Helpers$ConfigOps$.akka$util$Helpers$ConfigOps$$getDuration$extension(Helpers.scala:125)
          at akka.util.Helpers$ConfigOps$.getMillisDuration$extension(Helpers.scala:120)
          at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:171)
          at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:504)
          at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
          at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
          at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
          at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:55)
          at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
          at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837)
          at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
          at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
          at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:57)
          at org.apache.spark.SparkEnv$.create(SparkEnv.scala:223)
          at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
          at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
          at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)

      I read blog posts where people suggest reordering the classpath to put the right version first, putting the Scala libs first, and similar workarounds, which all seem wrong. I think the Typesafe Config package pulled in with the spark-core lib is the incorrect version. I made the following change to my Maven build and it now works, but I think someone needs to fix the spark-core packaging.

      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <exclusions>
          <exclusion>
            <groupId>com.typesafe</groupId>
            <artifactId>config</artifactId>
          </exclusion>
        </exclusions>
      </dependency>

      <dependency>
        <groupId>com.typesafe</groupId>
        <artifactId>config</artifactId>
        <version>1.2.1</version>
      </dependency>
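      As a quick sanity check after a change like this, you can probe at runtime whether a class actually exposes the method the error complains about — the general technique behind diagnosing any NoSuchMethodError. This is a minimal sketch, not from the original report: it probes java.time.Duration as a stand-in so it runs without the Config jar on the classpath; in the failing deployment you would probe com.typesafe.config.Config instead.

      ```java
      import java.util.concurrent.TimeUnit;

      public class MethodCheck {

          // True if clazz exposes a public method with this name and parameter types.
          static boolean hasMethod(Class<?> clazz, String name, Class<?>... params) {
              try {
                  clazz.getMethod(name, params);
                  return true;
              } catch (NoSuchMethodException e) {
                  return false;
              }
          }

          public static void main(String[] args) throws Exception {
              // In the failing deployment you would probe the real class:
              //   hasMethod(Class.forName("com.typesafe.config.Config"),
              //             "getDuration", String.class, TimeUnit.class)
              // which returns false on the old Config version and true on 1.2.1.
              // Here we probe a JDK class so the sketch is self-contained.
              Class<?> probe = Class.forName("java.time.Duration");
              System.out.println(hasMethod(probe, "toMillis"));           // prints "true"
              System.out.println(hasMethod(probe, "getDuration",
                                           String.class, TimeUnit.class)); // prints "false"
          }
      }
      ```

      If the probe against Config returns false even after the POM change, the old jar is still winning on the classpath and `mvn dependency:tree` is the next place to look.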


          People

            Assignee: Unassigned
            Reporter: nirav patel (tenstriker)