Details
Type: Bug
Status: Resolved
Priority: Major
Resolution: Fixed
Affects Version/s: 3.2.0
Description
With resource-aware scheduling, if you specify a default value in spark-defaults.conf, a user can't override it to set it to 0.
Meaning spark-defaults.conf has something like:
spark.executor.resource.{resourceName}.amount=1
spark.task.resource.{resourceName}.amount=1
If the user tries to override these when submitting an application with spark.executor.resource.{resourceName}.amount=0 and spark.task.resource.{resourceName}.amount=0, they get an error:
23/06/21 09:12:57 ERROR Main: Failed to initialize Spark session.
org.apache.spark.SparkException: No executor resource configs were not specified for the following task configs: gpu
at org.apache.spark.resource.ResourceProfile.calculateTasksAndLimitingResource(ResourceProfile.scala:206)
at org.apache.spark.resource.ResourceProfile.$anonfun$limitingResource$1(ResourceProfile.scala:139)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.resource.ResourceProfile.limitingResource(ResourceProfile.scala:138)
at org.apache.spark.resource.ResourceProfileManager.addResourceProfile(ResourceProfileManager.scala:95)
at org.apache.spark.resource.ResourceProfileManager.<init>(ResourceProfileManager.scala:49)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:455)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
This used to work; my guess is that it got broken by the stage-level scheduling feature.
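For reference, a minimal sketch of how the override can be hit programmatically (the "gpu" resource name is taken from the error output; the local master, app name, and object name are assumptions for illustration only, and the non-zero defaults are assumed to be present in spark-defaults.conf):

import org.apache.spark.sql.SparkSession

object ResourceOverrideRepro {
  def main(args: Array[String]): Unit = {
    // Attempt to override the non-zero defaults from spark-defaults.conf with 0,
    // mirroring what --conf flags on spark-submit would do. With the bug present,
    // SparkSession creation fails in ResourceProfile.calculateTasksAndLimitingResource
    // with the error shown above.
    val spark = SparkSession.builder()
      .master("local[*]") // assumed; the original report hits this on application submit
      .appName("resource-override-repro")
      .config("spark.executor.resource.gpu.amount", "0")
      .config("spark.task.resource.gpu.amount", "0")
      .getOrCreate()

    spark.stop()
  }
}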