Details
- Type: Question
- Status: Resolved
- Priority: Major
- Resolution: Incomplete
- Affects Version/s: 2.1.0
- Fix Version/s: None
Description
Hi,
I'm running a Spark cluster with 3 nodes (one as master + worker, plus 2 other workers); each worker has one executor. I then run the PageRank example that ships with Spark. I need to collect the memory usage of each executor while the application is running and write it to a file for further analysis.
How can I do this?
One idea I have is to get the PIDs of the executor and driver processes and then use a Linux command to read this information. Is that the right approach?
Please guide me.
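As a minimal sketch of the PID-based idea described above: executors typically run the `CoarseGrainedExecutorBackend` main class and the driver is launched via `SparkSubmit`, so one could filter `ps` output on those names and append each process's resident set size (RSS) to a CSV for later analysis. The process-name patterns, sampling interval, and output file name here are assumptions; adjust them for your deployment.

```shell
#!/bin/sh
# Hypothetical sampler: log timestamp, PID, and RSS (in KB) of Spark
# executor/driver JVMs to a CSV file. Assumes executors appear in the
# process table as CoarseGrainedExecutorBackend and the driver as
# SparkSubmit -- verify with `ps -ef | grep java` on your nodes.
OUT=${OUT:-executor_mem.csv}   # hypothetical output file name

sample_once() {
  ts=$(date +%s)
  # pid=, rss=, args= suppress headers; the [C]/[S] bracket trick keeps
  # awk from matching its own command line in the ps snapshot.
  ps -eo pid=,rss=,args= |
    awk -v ts="$ts" '/[C]oarseGrainedExecutorBackend|[S]parkSubmit/ {
      print ts "," $1 "," $2   # timestamp, pid, RSS in KB
    }' >> "$OUT"
}

# Run one sample; to sample continuously while the application runs:
#   while true; do sample_once; sleep 5; done
sample_once
```

This would need to run on every node (each worker hosts its own executor process), with the per-node CSVs merged afterward. Note that RSS measures the whole JVM process, not just Spark's internal memory accounting.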