SPARK-23738: Memory usage for executors


Details

    • Type: Question
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 2.1.0
    • Fix Version/s: None
    • Component/s: Spark Core, Spark Submit

    Description

      Hi,

      I'm running a Spark cluster with 3 nodes (one acting as both master and worker, plus 2 other workers), with one executor per worker. I then run the PageRank example that ships with Spark. While the application is running, I need to collect the memory usage of each executor and write it to a file for further analysis.

      How can I do this?

      One idea is to get the PIDs of the executor and driver processes and then use a Linux command-line tool to read their memory usage. Is that the right approach?

      Please guide me.
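      The PID-based idea described above can be sketched as a small polling script. This is a minimal sketch, assuming a Linux host with the procps `ps` utility; the loop bound, interval, and log file name are placeholders, and the driver/executor PIDs would be passed as arguments (e.g. found via `jps` or `pgrep -f CoarseGrainedExecutorBackend`). Note that Spark also exposes per-executor memory figures through its monitoring REST API (`/api/v1/applications/[app-id]/executors`), which may be more robust than sampling process PIDs.

```shell
# Sketch: sample resident memory (RSS) of the given PIDs with `ps`
# and append the samples to a CSV file for later analysis.
LOGFILE=executor_mem.csv
echo "timestamp,pid,rss_kb" > "$LOGFILE"
for i in 1 2 3; do                  # placeholder: poll until the app finishes
  for pid in "${@:-$$}"; do         # default to this shell's PID for a dry run
    rss=$(ps -o rss= -p "$pid" | tr -d ' ')
    echo "$(date +%s),$pid,${rss:-exited}" >> "$LOGFILE"
  done
  sleep 1
done
```

      Running it with no arguments samples the shell's own process as a dry run; the resulting CSV can then be loaded into any analysis tool.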


          People

            Assignee: Unassigned
            Reporter: assia6 (assia ydroudj)
            Votes: 0
            Watchers: 2
