Details
Type: Improvement
Status: Resolved
Priority: Minor
Resolution: Incomplete
Affects Version/s: 2.1.0
Component/s: None
Description
We can configure the number of executor cores with "spark.executor.cores". For example, if we configure 8 cores for an executor, the driver can schedule at most 8 tasks to that executor concurrently. In practice, a task does not always fully occupy a core; much of its time is spent on disk I/O or network I/O. If the executor reported more virtual cores to the driver (for example 16 or 32 instead of 8), the driver could schedule more than 8 tasks to that executor concurrently, which would make the whole job finish faster. See the sketch below.
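For reference, the per-executor concurrency limit described above is the ratio of spark.executor.cores to spark.task.cpus. A minimal Scala sketch (the configuration values are illustrative, not part of this issue):

{code:scala}
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object ExecutorCoresSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("executor-cores-sketch")
      .set("spark.executor.cores", "8") // cores each executor advertises to the driver
      .set("spark.task.cpus", "1")      // cores each task claims (default)

    // With these settings the driver schedules at most 8 / 1 = 8
    // concurrent tasks on each executor, even if those tasks are
    // mostly blocked on disk or network I/O.
    val spark = SparkSession.builder().config(conf).getOrCreate()

    // The proposal in this issue: let the executor advertise more
    // "virtual" cores (e.g. 16 or 32) than it physically has, so that
    // I/O-bound stages can run more tasks concurrently on the same hardware.
    spark.stop()
  }
}
{code}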
Attachments
Issue Links
- links to