MRQL / MRQL-73

Set the max number of tasks in Spark mode


Details

    • Type: Bug
    • Status: Closed
    • Priority: Critical
    • Resolution: Fixed
    • Fix Version/s: 0.9.6
    • Affects Version/s: None
    • Component/s: Run-Time/Spark
    • Labels: None

    Description

      The number of worker nodes in Spark distributed mode, which is specified by the MRQL -nodes parameter, must be propagated to the Spark settings SPARK_WORKER_INSTANCES (renamed SPARK_EXECUTOR_INSTANCES in Spark 1.3.*) and SPARK_WORKER_CORES; otherwise, Spark always uses all the available cores in the cluster.
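      A minimal sketch of the fix described above, as it might appear in a launcher script. The variable names NODES and CORES_PER_WORKER are hypothetical stand-ins for the value of MRQL's -nodes parameter and a per-worker core count; the SPARK_* environment variables are the real Spark settings named in the description:

      ```shell
      # Hypothetical wrapper logic: cap Spark's parallelism to the MRQL -nodes value
      # instead of letting Spark grab every core in the cluster.
      NODES=4              # stand-in for the value of MRQL's -nodes parameter
      CORES_PER_WORKER=1   # assumed: one core per worker instance

      # Spark 1.2.x and earlier read SPARK_WORKER_INSTANCES;
      # Spark 1.3.* renamed it to SPARK_EXECUTOR_INSTANCES, so set both.
      export SPARK_WORKER_INSTANCES=$NODES
      export SPARK_EXECUTOR_INSTANCES=$NODES
      export SPARK_WORKER_CORES=$CORES_PER_WORKER

      echo "instances=$SPARK_WORKER_INSTANCES cores=$SPARK_WORKER_CORES"
      ```

      With these variables exported before Spark is launched, the cluster starts exactly NODES worker instances with CORES_PER_WORKER cores each, matching the parallelism requested via -nodes.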


          People

            Assignee: Leonidas Fegaras (fegaras)
            Reporter: Leonidas Fegaras (fegaras)
            Votes: 0
            Watchers: 3
