Details
- Type: Sub-task
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Fix Version/s: 4.0.0
Description
Introduce a Structured Logging Framework as described in https://docs.google.com/document/d/1rATVGmFLNVLmtxSpWrEceYm7d-ocgu8ofhryVs4g3XU/edit?usp=sharing .
- The default logging output format will be JSON Lines (one JSON object per line); a rendering sketch follows this list. For example:
  {
    "ts": "2023-03-12T12:02:46.661-0700",
    "level": "ERROR",
    "msg": "Cannot determine whether executor 289 is alive or not",
    "context": { "executor_id": "289" },
    "exception": {
      "class": "org.apache.spark.SparkException",
      "msg": "Exception thrown in awaitResult",
      "stackTrace": "..."
    },
    "source": "BlockManagerMasterEndpoint"
  }
- Introduce a new configuration `spark.log.structuredLogging.enabled` to control the default log4j configuration. Users can set it to false to get plain-text log output (see the configuration example after this list).
- The change will start with the logError method; a sketch of how such an API could work follows this list. Example API change: from
`logError(s"Cannot determine whether executor $executorId is alive or not.", e)` to `logError(log"Cannot determine whether executor ${MDC(EXECUTOR_ID, executorId)} is alive or not.", e)`