Spark / SPARK-22011

model <- spark.logit(training, Survived ~ ., regParam = 0.5) showing error

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 2.2.0
    • Fix Version/s: None
    • Component/s: Examples
    • Labels:
      None
    • Environment:
      SparkR on Windows (RStudio)

    Description

      # Point R at the local Spark installation and its bundled SparkR package
      Sys.setenv(SPARK_HOME = "C:/spark")
      .libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))

      # JDK used to launch the Spark JVM
      Sys.setenv(JAVA_HOME = "C:/Program Files/Java/jdk1.8.0_144/")

      library(SparkR)
      sc <- sparkR.session(master = "local")
      sqlContext <- sparkRSQL.init(sc)   # deprecated since Spark 2.0; see warning below

      Output shown in RStudio:
      Warning message:
      'sparkRSQL.init' is deprecated.
      Use 'sparkR.session' instead.
      See help("Deprecated")

      Can you help me understand what exactly this error/warning means? And next, when I run:

      model <- spark.logit(training, Survived ~ ., regParam = 0.5)

      Error in handleErrors(returnStatus, conn) :
      org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 35.0 failed 1 times, most recent failure: Lost task 0.0 in stage 35.0 (TID 31, localhost, executor driver): org.apache.spark.SparkException: SPARK_HOME not set. Can't locate SparkR package.
      at org.apache.spark.api.r.RUtils$$anonfun$2.apply(RUtils.scala:88)
      at org.apache.spark.api.r.RUtils$$anonfun$2.apply(RUtils.scala:88)
      at scala.Option.getOrElse(Option.scala:121)
      at org.apache.spark.api.r.RUtils$.sparkRPackagePath(RUtils.scala:87)
      at org.apache.spark.api.r.RRunner$.createRProcess(RRunner.scala:339)
      at org.apache.spark.api.r.RRunner$.createRWorker(RRunner.scala:391)
      at org.apache.spark.api.r.RRunner.compute(RRunner.scala:69)
      at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:51)
      at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
      at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
      at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)

    Activity

    srowen Sean Owen added a comment -

    The error pretty much says it all – set SPARK_HOME. See the SparkR docs.


    People

    • Assignee:
      Unassigned
    • Reporter:
      Atulsk06 Atul Khairnar
    • Votes:
      0
    • Watchers:
      1
