The new metrics instrumentation uses the existing Hadoop metrics implementation; this avoids duplicating code and maintaining it in two places.
The current patch provides metrics information for:
Metrics are available using the standard Hadoop Metrics context and JMX.
However, the Hadoop AbstractMetricsContext has a "bug/feature", depending on whom you ask:
it does not reset values between pushes, and therefore outputs only accumulated values instead of rates.
I've copied the Hadoop class into the Chukwa tree to fix this problem (it outputs both the rate and the accumulated value, for compatibility).
The idea is to test this functionality in Chukwa first and then submit the change to Hadoop.
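To illustrate the fix, here is a minimal sketch (class and method names are hypothetical, not the actual patched AbstractMetricsContext) of how a rate can be derived alongside the accumulated value by snapshotting the counter at each push:

```java
// Hypothetical sketch of the rate-plus-accumulated idea, not the real Chukwa class.
public class RateSketch {
    private long accumulated = 0;   // total since startup (what Hadoop emits today)
    private long lastSnapshot = 0;  // value at the previous push

    public void increment(long v) {
        accumulated += v;
    }

    // Rate over the last period: the delta since the previous push.
    public long rateSinceLastPush() {
        long rate = accumulated - lastSnapshot;
        lastSnapshot = accumulated;
        return rate;
    }

    // Accumulated value, kept for compatibility with the stock behavior.
    public long accumulatedValue() {
        return accumulated;
    }
}
```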
The patch also:
- fixes the chukwa-agent.jar creation
- includes class files, not just .java files
- fixes Log4JMetricsContext
The previous Log4JMetricsContext contained a bug (CHUKWA-49) that is fixed here, and it was also hard to configure.
For example, to output dfs metrics we had to configure both the standard hadoop-metrics.properties AND conf/chukwa-hadoop-metrics-log4j.properties.
The current implementation uses only the hadoop-metrics.properties file and dynamically registers all appenders/loggers.
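With this change, wiring a metrics context up might look like the sketch below in hadoop-metrics.properties alone (the fully-qualified class name and the "directory" property name are assumptions for illustration; check the shipped hadoop-metrics.properties template for the actual keys):

```properties
# Hypothetical sketch: route dfs metrics through the Chukwa Log4J context.
# Class name and property keys are illustrative assumptions.
dfs.class=org.apache.hadoop.chukwa.inputtools.log4j.Log4JMetricsContext
dfs.period=60
# The metrics output directory is now set here as well (key name assumed).
dfs.directory=/tmp/chukwa/metrics
```

No separate chukwa-hadoop-metrics-log4j.properties entry is needed; the appender/logger is registered dynamically at startup.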
There is one incompatible change:
the recordType was previously set in chukwa-hadoop-metrics-log4j.properties; now the recordType is set to the contextName.
This should not be a problem since we already have aliases on the demux parsers.
Also, the metrics output directory must now be provided in hadoop-metrics.properties.
The "uuid" parameter appends the current time in milliseconds to the log file's name to make it unique. This is required for the Hadoop jvm/rpc
metrics, since more than one process runs on the same machine.
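For instance, enabling the uuid behavior for the jvm context might look like this (the class name and the exact value format of the "uuid" property are assumptions; only the parameter name comes from the patch):

```properties
# Hypothetical sketch: make each JVM process write to a uniquely named log file.
jvm.class=org.apache.hadoop.chukwa.inputtools.log4j.Log4JMetricsContext
jvm.period=60
# Append the current time in ms to the log file name (value format assumed).
jvm.uuid=true
```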
I'm providing updated versions of chukwa-demux-conf.xml.template and hadoop-metrics.properties.