Details
- Type: Improvement
- Status: Closed
- Priority: Minor
- Resolution: Fixed
- Affects Versions: 1.2.2, 1.3.1
- Fix Versions: None
Description
Metrics are configured in the metrics.properties file. The path to this file is specified in SparkConf under the key spark.metrics.conf. The property is read when the MetricsSystem is created, i.e., during SparkEnv initialisation.
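For reference, a minimal metrics.properties might look like the following (a sketch using Spark's built-in console sink; the file path is whatever spark.metrics.conf points to):

```properties
# metrics.properties — path supplied via spark.metrics.conf
# Report metrics from all instances to the console every 10 seconds.
*.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
*.sink.console.period=10
*.sink.console.unit=seconds
```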
Problem
When users run their application, they have no way to provide the metrics configuration for the executors. Although one can specify the path to a metrics configuration file, (1) the path is common to all the nodes and the client machine, so there is an implicit assumption that all the machines have the same file in the same location, and (2) the user actually needs to copy the file to the worker nodes manually, because the file is read before the user files are copied to the executors' local directories. All of this makes it very difficult to experiment with the metrics configuration.
Proposed solution
I think the easiest and most consistent solution would be to move the configuration from a separate file directly into SparkConf. We could prefix all the settings from the metrics configuration with, say, spark.metrics.props. For backward compatibility, these properties would still be loaded from the specified file as they are now. Such a solution doesn't change the API, so it could perhaps even be included in a patch release of Spark 1.2 and Spark 1.3.
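Under the proposed scheme, the same sink settings could be passed inline through spark-defaults.conf or SparkConf.set, without any file on the worker nodes (a sketch; the spark.metrics.props prefix is the one suggested above, not an existing key):

```properties
# hypothetical spark-defaults.conf entries under the proposed prefix
spark.metrics.props.*.sink.console.class   org.apache.spark.metrics.sink.ConsoleSink
spark.metrics.props.*.sink.console.period  10
spark.metrics.props.*.sink.console.unit    seconds
```

MetricsSystem would then strip the spark.metrics.props. prefix and treat the remainder exactly as it currently treats the contents of metrics.properties.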
Appreciate any feedback.
Attachments
Issue Links
- is related to SPARK-5152 Let metrics.properties file take an hdfs:// path (Resolved)
- links to