Spark / SPARK-32824

The error is confusing when the resource .amount is not provided


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.0
    • Fix Version/s: 3.0.2, 3.1.0
    • Component/s: Spark Core
    • Labels: None

    Description

      If the user forgets to specify the .amount suffix when requesting a resource, the error that comes out is confusing; we should improve it.
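      For reference, the documented way to request an executor resource uses an explicit .amount suffix (spark.executor.resource.<name>.amount). The sketch below is not taken from this report; it only uses the public SparkConf API to contrast the intended key with the truncated one that reproduces the error further down.

      import org.apache.spark.SparkConf

      object ResourceConfKeyShapes {
        def main(args: Array[String]): Unit = {
          // Documented shape: spark.<component>.resource.<name>.amount
          val intended = new SparkConf()
            .set("spark.executor.resource.gpu.amount", "1")

          // Easy typo: the ".amount" suffix is dropped; this is the case the issue covers.
          val truncated = new SparkConf()
            .set("spark.executor.resource.gpu", "1")

          // Both keys are accepted at SparkConf level; the truncated one only fails
          // later, deep inside resource parsing, with the stack trace shown below.
          println(intended.get("spark.executor.resource.gpu.amount")) // 1
          println(truncated.get("spark.executor.resource.gpu"))       // 1
        }
      }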

       

      $ $SPARK_HOME/bin/spark-shell  --master spark://host9:7077 --conf spark.executor.resource.gpu=1

       

      Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
      Setting default log level to "WARN".
      To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
      20/09/08 08:19:35 ERROR SparkContext: Error initializing SparkContext.
      java.lang.StringIndexOutOfBoundsException: String index out of range: -1
        at java.lang.String.substring(String.java:1967)
        at org.apache.spark.resource.ResourceUtils$.$anonfun$listResourceIds$1(ResourceUtils.scala:151)
        at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
        at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
        at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
        at scala.collection.TraversableLike.map(TraversableLike.scala:238)
        at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
        at org.apache.spark.resource.ResourceUtils$.listResourceIds(ResourceUtils.scala:150)
        at org.apache.spark.resource.ResourceUtils$.parseAllResourceRequests(ResourceUtils.scala:158)
        at org.apache.spark.SparkContext$.checkResourcesPerTask$1(SparkContext.scala:2773)
        at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2884)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:528)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2555)
        at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:930)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
        at $line3.$read$$iw$$iw.<init>(<console>:15)
        at $line3.$read$$iw.<init>(<console>:42)
        at $line3.$read.<init>(<console>:44)
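      The top frame points at a substring call inside ResourceUtils.listResourceIds. Below is a minimal sketch of the failure mode, assuming the parser strips the spark.executor.resource. prefix and takes everything before the first '.' as the resource name; the guard and its message are illustrative only, not the merged fix.

      object ResourceNameParseSketch {
        private val prefix = "spark.executor.resource."

        // Unguarded parse: indexOf('.') is -1 when ".amount" (or any suffix) is
        // missing, so substring throws StringIndexOutOfBoundsException: -1.
        def resourceNameUnchecked(fullKey: String): String = {
          val key = fullKey.stripPrefix(prefix)
          key.substring(0, key.indexOf('.'))
        }

        // Same parse with an explicit check that names the likely mistake instead
        // of surfacing a raw index error. The message wording is hypothetical.
        def resourceNameChecked(fullKey: String): String = {
          val key = fullKey.stripPrefix(prefix)
          val dot = key.indexOf('.')
          if (dot < 0) {
            throw new IllegalArgumentException(
              s"No suffix found for resource config '$fullKey'; did you mean '${fullKey}.amount'?")
          }
          key.substring(0, dot)
        }

        def main(args: Array[String]): Unit = {
          println(resourceNameUnchecked("spark.executor.resource.gpu.amount")) // gpu

          // Truncated key from the repro: the guarded version explains the problem...
          try resourceNameChecked("spark.executor.resource.gpu")
          catch { case e: IllegalArgumentException => println(e.getMessage) }

          // ...while the unguarded parse reproduces the confusing exception above.
          try resourceNameUnchecked("spark.executor.resource.gpu")
          catch { case e: StringIndexOutOfBoundsException => println(e) }
        }
      }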


    Attachments

    Activity

    People

        Assignee: Thomas Graves (tgraves)
        Reporter: Thomas Graves (tgraves)
        Votes: 0
        Watchers: 2
