SPARK-7436

Cannot implement nor use custom StandaloneRecoveryModeFactory implementations

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.3.1
    • Fix Version/s: 1.3.2, 1.4.0
    • Component/s: Deploy
    • Labels: None

      Description

      At the very least, this code fragment in Master.scala is buggy:

            case "CUSTOM" =>
              val clazz = Class.forName(conf.get("spark.deploy.recoveryMode.factory"))
              val factory = clazz.getConstructor(conf.getClass, Serialization.getClass)
                .newInstance(conf, SerializationExtension(context.system))
                .asInstanceOf[StandaloneRecoveryModeFactory]
              (factory.createPersistenceEngine(), factory.createLeaderElectionAgent(this))
      

      The call clazz.getConstructor(conf.getClass, Serialization.getClass) looks up a constructor whose parameter types are org.apache.spark.SparkConf and the class of the akka.serialization.Serialization companion object. The subsequent newInstance(conf, SerializationExtension(context.system)) is then invoked with an instance of SparkConf and an instance of the Serialization class itself, not the companion object, so the constructor lookup never matches the actual constructor of a user-supplied factory and the factory cannot be instantiated.
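      For illustration, the lookup could be made consistent with the newInstance call by passing the declared parameter types instead of the runtime classes of the argument expressions. This is only a sketch of the idea against the fragment above, not necessarily the change that was committed:

            case "CUSTOM" =>
              // Sketch: resolve the constructor by the declared parameter types
              // (SparkConf, akka.serialization.Serialization) so that the lookup
              // matches the arguments passed to newInstance below.
              val clazz = Class.forName(conf.get("spark.deploy.recoveryMode.factory"))
              val factory = clazz
                .getConstructor(classOf[SparkConf], classOf[Serialization])
                .newInstance(conf, SerializationExtension(context.system))
                .asInstanceOf[StandaloneRecoveryModeFactory]
              (factory.createPersistenceEngine(), factory.createLeaderElectionAgent(this))

      With a lookup like this, a user-supplied factory only needs a public constructor taking a SparkConf and a Serialization for getConstructor to find it.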

            People

            • Assignee: Jacek Lewandowski (jlewandowski)
            • Reporter: Jacek Lewandowski (jlewandowski)
            • Votes: 0
            • Watchers: 3
