Details
- Type: Improvement
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Version/s: 3.1.2
- Fix Version/s: None
- Component/s: None
Description
In my Hadoop cluster:
set yarn.timeline-service.generic-application-history.enabled=true;
set yarn.timeline-service.generic-application-history.store-class=org.apache.hadoop.yarn.server.applicationhistoryservice.FileSystemApplicationHistoryStore;
set yarn.timeline-service.generic-application-history.fs-history-store.uri=/user/yarn/timeline/generic_history;
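These properties are normally set in yarn-site.xml rather than with `set ...;` statements; a minimal fragment with the same values from above might look like this (sketch only, assuming default yarn-site.xml conventions):

```xml
<!-- yarn-site.xml: enable the generic application history service and
     point it at a filesystem-backed store (values from the report above). -->
<property>
  <name>yarn.timeline-service.generic-application-history.enabled</name>
  <value>true</value>
</property>
<property>
  <name>yarn.timeline-service.generic-application-history.store-class</name>
  <value>org.apache.hadoop.yarn.server.applicationhistoryservice.FileSystemApplicationHistoryStore</value>
</property>
<property>
  <name>yarn.timeline-service.generic-application-history.fs-history-store.uri</name>
  <value>/user/yarn/timeline/generic_history</value>
</property>
```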
hdfs dfs -count /user/yarn/timeline/generic_history/ApplicationHistoryDataRoot
1 3150832 82436180812
The `hdfs dfs -count` output above shows the directory already holds 3,150,832 files (about 82 GB), and at the current growth rate the file count will hit the dfs.namenode.fs-limits.max-directory-items limit of 3200000 in less than one month. For now the files are being cleaned up manually. Is there any TTL mechanism that could clean them automatically? What should I do?
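Until a built-in TTL exists for FileSystemApplicationHistoryStore, one interim workaround is a scheduled cleanup script. Below is a minimal sketch: a shell function that reads `hdfs dfs -ls` output and selects paths whose modification date (column 6, YYYY-MM-DD) is older than a cutoff. The function name, the 30-day retention, and the commented pipeline are assumptions for illustration, not an official tool.

```shell
#!/bin/sh
# select_old_paths CUTOFF
#   Reads `hdfs dfs -ls` output on stdin and prints the path (field 8)
#   of every entry whose date field (field 6, YYYY-MM-DD) sorts before
#   CUTOFF. Dates in this format compare correctly as strings.
select_old_paths() {
  awk -v cutoff="$1" '$6 < cutoff { print $8 }'
}

# Hypothetical cron job (untested sketch; verify on a small directory first):
# hdfs dfs -ls /user/yarn/timeline/generic_history/ApplicationHistoryDataRoot \
#   | select_old_paths "$(date -d '30 days ago' +%F)" \
#   | xargs -r -n 100 hdfs dfs -rm -skipTrash
```

Batching deletions through `xargs -n 100` keeps each `hdfs dfs -rm` invocation's argument list bounded; `-r` avoids running `rm` at all when nothing matches.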