Details
- Type: Improvement
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Hadoop Flags: Reviewed
Description
User Map/Reduce applications can sometimes become extremely memory intensive, whether because of inadvertent bugs in the user code or simply the amount of data being processed. When this happens, the user tasks start to interfere with the proper execution of other processes on the node, including Hadoop daemons such as the DataNode and TaskTracker, and the node becomes unusable for any Hadoop work. There should be a way to prevent such tasks from bringing down the node.
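A minimal sketch of the kind of safeguard this issue asks for: a per-task memory watchdog run by the TaskTracker that polls the task JVM's resident memory via /proc/<pid>/status on Linux and kills the process if it exceeds a configured limit. This is an illustrative assumption, not the actual patch; the class name TaskMemoryWatchdog and its parameters are hypothetical, and Hadoop's real implementation and configuration keys may differ.
{code:java}
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

/**
 * Hypothetical per-task memory watchdog (sketch only). If the monitored
 * task's resident set size exceeds a configured limit, the task is killed
 * so that other processes on the node (DataNode, TaskTracker) stay healthy.
 */
public class TaskMemoryWatchdog implements Runnable {

    private final int taskPid;          // pid of the spawned task JVM
    private final long maxRssBytes;     // configured per-task memory limit
    private final long pollIntervalMs;  // how often to sample memory usage

    public TaskMemoryWatchdog(int taskPid, long maxRssBytes, long pollIntervalMs) {
        this.taskPid = taskPid;
        this.maxRssBytes = maxRssBytes;
        this.pollIntervalMs = pollIntervalMs;
    }

    /** Reads VmRSS (resident memory, reported in kB) from /proc/<pid>/status. */
    private long readRssBytes() throws IOException {
        List<String> lines = Files.readAllLines(Paths.get("/proc/" + taskPid + "/status"));
        for (String line : lines) {
            if (line.startsWith("VmRSS:")) {
                String[] parts = line.trim().split("\\s+");
                return Long.parseLong(parts[1]) * 1024L; // kB -> bytes
            }
        }
        return 0L; // process may already have exited
    }

    @Override
    public void run() {
        try {
            while (true) {
                if (readRssBytes() > maxRssBytes) {
                    // Over the limit: kill the runaway task rather than let it
                    // starve the node's other Hadoop daemons of memory.
                    Runtime.getRuntime().exec(new String[] {"kill", "-9", String.valueOf(taskPid)});
                    return;
                }
                Thread.sleep(pollIntervalMs);
            }
        } catch (IOException e) {
            // /proc entry gone: the task finished on its own, nothing to do.
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
{code}
In practice such a watchdog would also have to account for the whole process tree spawned by a task (e.g. streaming jobs) rather than a single pid, which is part of what makes this non-trivial.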
Attachments
Issue Links
- blocks
  - HADOOP-3759 Provide ability to run memory intensive jobs without affecting other running tasks on the nodes (Closed)
- is part of
  - HADOOP-3444 Implementing a Resource Manager (V1) for Hadoop (Resolved)