Description
Follow-up from https://github.com/apache/spark/pull/40199#discussion_r1119453996
If `OMP_NUM_THREADS` is not set explicitly, it should default to `spark.task.cpus` instead of `spark.executor.cores`, as described in PR #38699.
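The intended behavior, as a minimal sketch (not the actual PythonRunner code; the object and method names are hypothetical): default `OMP_NUM_THREADS` to the per-task CPU count only when the user has not set it, so OpenMP-backed libraries (e.g. NumPy) in concurrently running tasks do not each spawn one thread per executor core.

```scala
import org.apache.spark.SparkConf

object OmpDefaults {
  // Hypothetical helper illustrating the proposed defaulting rule.
  def defaultOmpNumThreads(conf: SparkConf, env: java.util.Map[String, String]): Unit = {
    // Respect an explicit user setting; only fill in a default when absent.
    if (!env.containsKey("OMP_NUM_THREADS")) {
      // Use the CPUs reserved per task (spark.task.cpus, default "1") rather
      // than spark.executor.cores: several tasks can run concurrently in one
      // executor, and each would otherwise oversubscribe the cores.
      env.put("OMP_NUM_THREADS", conf.get("spark.task.cpus", "1"))
    }
  }
}
```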
Issue Links
- is related to:
  - SPARK-41188 Set executorEnv OMP_NUM_THREADS to be spark.task.cpus by default for spark executor JVM processes (Resolved)
  - SPARK-42596 [YARN] OMP_NUM_THREADS not set to number of executor cores by default (Resolved)
  - SPARK-28843 Set OMP_NUM_THREADS to executor cores reduce Python memory consumption (Resolved)