Apache DataLab (Retired) / DATALAB-953

[Notebook]: Investigate how memory is allocated for Spark


Details

    Description

This allocation logic is obsolete due to task https://issues.apache.org/jira/browse/DATALAB-1985.

Previously the allocation was (a minimal sketch in Python follows this list):
1. The Spark memory value equals 75% of total instance memory (if the instance shape has up to 8 GB).
2. The spark.executor.memory value equals total instance memory minus 3.5 GB (if the instance shape has more than 8 GB).
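
For clarity, here is a minimal sketch of the rule above; the helper name spark_memory_mb and the integer MB rounding are illustrative assumptions, not the actual DataLab implementation:

    def spark_memory_mb(instance_ram_gb: float) -> int:
        # Illustrative sketch of the allocation rule described in this issue;
        # spark_memory_mb is a hypothetical name, not actual DataLab code.
        if instance_ram_gb <= 8:
            # Up to 8 GB: Spark gets 75% of total instance memory.
            return int(instance_ram_gb * 0.75 * 1024)
        # Over 8 GB: spark.executor.memory = total memory minus a 3.5 GB reserve.
        return int((instance_ram_gb - 3.5) * 1024)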

Currently it is:
Case 1: Rstudio [4 GB]
By theory: 4 * 0.75 * 1024 = 3072 MB
But in practice: (see the attached screenshots)

Case 2: Rstudio [122 GB]
By theory: (122 - 3.5) * 1024 = 121344 MB
But in practice: (see the attached screenshots)
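
Checking both cases against the sketch above; these are the theoretical values from the formulas, while the attached screenshots record what was actually observed:

    print(spark_memory_mb(4))    # 3072   (Case 1: 4 GB shape, 75% rule)
    print(spark_memory_mb(122))  # 121344 (Case 2: 122 GB shape, minus 3.5 GB)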

Please investigate how memory should be allocated.

Attachments

  1. image-2019-07-26-17-55-04-023.png (38 kB, Vira Vitanska)
  2. image-2019-07-26-17-54-18-544.png (25 kB, Vira Vitanska)
  3. image-2019-07-26-17-49-24-994.png (37 kB, Vira Vitanska)


People

  Assignee: mykolabodnar (Mykola Bodnar)
  Reporter: vira_vitanska@epam.com (Vira Vitanska)
  Votes: 0
  Watchers: 2
