SPARK-2154: Worker goes down.


Details

• Type: Bug
• Status: Closed
• Priority: Major
• Resolution: Fixed
• Affects Version/s: 0.8.1, 0.9.0, 1.0.0
• Fix Version/s: 1.0.2, 1.1.0
• Component/s: Spark Core
• Environment: Spark on a cluster of three nodes on Ubuntu 12.04.4 LTS

Description

The worker dies when I try to submit more drivers than there are allocated cores. When I submit 9 drivers, with one core per driver, on a cluster that has 8 cores altogether, the worker dies as soon as I submit the 9th driver. Everything works fine until all 8 cores are taken; as soon as I submit the 9th driver, its status remains "Submitted" and the worker crashes. I understand that we cannot run more drivers than the allocated cores, but the problem here is that instead of the 9th driver being queued, it is executed, and as a result it crashes the worker. Let me know if there is a way to work around this issue, or whether it will be fixed in an upcoming version. (A sketch of the kind of submission involved is shown after the cluster details below.)

Cluster Details:
Spark 1.0.0
2 nodes with 4 cores each.
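For reference, a loop along these lines reproduces the scenario. This is a sketch, not the reporter's exact command: the master URL, application jar, and main class are placeholders, and it assumes spark-submit in standalone cluster deploy mode, where each submitted driver requests one core:

    # Hypothetical reproduction: submit 9 drivers, one core each, to a
    # standalone cluster that has only 8 cores in total. The master URL,
    # jar path, and main class are placeholders.
    for i in $(seq 1 9); do
      ./bin/spark-submit \
        --master spark://master-host:7077 \
        --deploy-mode cluster \
        --driver-cores 1 \
        --class com.example.MyApp \
        /path/to/my-app.jar
    done

The behavior the report asks for is that the 9th driver stay queued until a core frees up, rather than being launched immediately and taking the worker down with it.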

Attachments

1. Sccreenhot at various states of driver ..jpg (263 kB, siva venkat gogineni)



People

• Assignee: Aaron Davidson (adav)
• Reporter: siva venkat gogineni (talk2siva8)
• Votes: 0
• Watchers: 4
