SPARK-14727

NullPointerException while trying to launch local spark job


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed

    Description

      OS : Windows 10
      Spark Version : 1.6.1
      Java version : 1.8

      I am trying to launch a simple Spark job from Eclipse, after starting the Spark master and registering one worker. JavaRDDs are created successfully; however, an NPE is thrown when the collect() operation executes. Below are the steps I performed:

      1. Downloaded Spark 1.6.1
      2. Built it locally with 'sbt package' and 'sbt assembly' commands
      3. Started Master with 'spark-class org.apache.spark.deploy.master.Master'
      4. Started Worker with 'spark-class org.apache.spark.deploy.worker.Worker spark://master:7077 -c 2'
      5. Verified both Master and Worker are up, and have enough resources in Spark UI
      6. Created a Maven project in Eclipse, with the Spark dependency
      7. Executed the attached "SparkCrud.java" in Eclipse
      8. NPE is thrown; logs are attached as "Logs.log"

      It seems Spark is trying to execute Hadoop binaries; however, I am not using Hadoop anywhere at all. I also tried placing winutils.exe in C: and configuring the "hadoop.home.dir" system property (as suggested in another JIRA), but that does not seem to have done the trick.
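      For context, Hadoop's Shell class looks for winutils.exe under a bin subdirectory of hadoop.home.dir, so the property must point at the parent directory (e.g. C:\hadoop when the binary sits at C:\hadoop\bin\winutils.exe), not at the directory containing the binary or at the drive root. A minimal sketch of the workaround, with a hypothetical path and class name (not taken from the attached SparkCrud.java):

```java
public class WinutilsWorkaround {
    public static void main(String[] args) {
        // Hypothetical layout: C:\hadoop\bin\winutils.exe must exist.
        // hadoop.home.dir points at C:\hadoop, NOT at C:\hadoop\bin.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");

        // The property must be set before the SparkContext is created,
        // since Hadoop's Shell class resolves it during initialization, e.g.:
        // JavaSparkContext sc = new JavaSparkContext(new SparkConf()
        //         .setMaster("spark://master:7077").setAppName("SparkCrud"));

        System.out.println(System.getProperty("hadoop.home.dir")); // prints C:\hadoop
    }
}
```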

      Attachments

        1. SparkCrud.java (2 kB, Darshan Mehta)
        2. Logs.log (17 kB, Darshan Mehta)


            People

              Assignee: Unassigned
              Reporter: Darshan Mehta (darshanmehta2)
              Votes: 0
              Watchers: 1
