Spark / SPARK-20465

Throw a proper exception rather than ArrayIndexOutOfBoundsException when temp directories cannot be created


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Trivial
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.2.0
    • Component/s: Spark Core, Spark Submit
    • Labels: None

    Description

      If none of the configured temp directories can be created, Spark throws an ArrayIndexOutOfBoundsException, as below:

      ./bin/spark-shell --conf spark.local.dir=/NONEXISTENT_DIR_ONE,/NONEXISTENT_DIR_TWO
      17/04/26 13:11:06 ERROR Utils: Failed to create dir in /NONEXISTENT_DIR_ONE. Ignoring this directory.
      17/04/26 13:11:06 ERROR Utils: Failed to create dir in /NONEXISTENT_DIR_TWO. Ignoring this directory.
      Exception in thread "main" java.lang.ExceptionInInitializerError
      	at org.apache.spark.repl.Main.main(Main.scala)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:497)
      	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:756)
      	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:179)
      	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:204)
      	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:118)
      	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      Caused by: java.lang.ArrayIndexOutOfBoundsException: 0
      	at org.apache.spark.util.Utils$.getLocalDir(Utils.scala:743)
      	at org.apache.spark.repl.Main$$anonfun$1.apply(Main.scala:37)
      	at org.apache.spark.repl.Main$$anonfun$1.apply(Main.scala:37)
      	at scala.Option.getOrElse(Option.scala:121)
      	at org.apache.spark.repl.Main$.<init>(Main.scala:37)
      	at org.apache.spark.repl.Main$.<clinit>(Main.scala)
      	... 10 more
      

      It seems we should throw a proper exception with a better message instead.
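      A minimal standalone sketch of the pattern in question (this is not the actual Spark patch; `LocalDirSketch` and `usableDirs` are hypothetical names standing in for `Utils.getLocalDir` and `Utils.getOrCreateLocalRootDirs`). Indexing `dirs(0)` on an empty array raises `ArrayIndexOutOfBoundsException: 0`; checking for emptiness first lets us raise an exception whose message names the directories that were configured:

      ```scala
      import java.io.{File, IOException}

      object LocalDirSketch {
        // Stand-in for Utils.getOrCreateLocalRootDirs: keep only directories
        // that already exist or could be created; the result may be empty.
        def usableDirs(configured: Seq[String]): Seq[String] =
          configured.filter { d =>
            val f = new File(d)
            f.isDirectory || f.mkdirs()
          }

        // Stand-in for Utils.getLocalDir. Previously the equivalent code did
        // `dirs(0)` with no emptiness check, which is the reported crash.
        def getLocalDir(configured: Seq[String]): String = {
          val dirs = usableDirs(configured)
          if (dirs.isEmpty) {
            throw new IOException(
              s"Failed to get a temp directory under [${configured.mkString(",")}].")
          }
          dirs.head
        }
      }
      ```

      With this check, the repro above would fail fast with a message listing both nonexistent directories, rather than an ExceptionInInitializerError wrapping an opaque index error.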

          People

            Assignee: gurwls223 Hyukjin Kwon
            Reporter: gurwls223 Hyukjin Kwon
