Details
- Type: Sub-task
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Fix Version/s: 3.3.0
- Labels: None
Description
./bin/spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
21/12/31 10:55:04 INFO SignalUtils: Registering signal handler for INT
21/12/31 10:55:08 INFO HiveConf: Found configuration file null
21/12/31 10:55:08 INFO SparkContext: Running Spark version 3.3.0-SNAPSHOT
...
21/12/31 10:55:09 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, ..., None)
...
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.3.0-SNAPSHOT
      /_/

Using Scala version 2.12.15 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_291)
Type in expressions to have them evaluated.
Type :help for more information.
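The symptom above is that INFO lines still reach the console even though the shell announces a default level of WARN; per the linked SPARK-37746, this happens because Log4j 2 is always initialized with its own defaults before Spark's log4j2-defaults.properties takes effect. As a hedged workaround sketch (the file location and appender names follow standard Log4j 2 properties-format conventions and are not taken from this ticket), placing a conf/log4j2.properties with the root logger at WARN would suppress the INFO output:

```properties
# Sketch of a conf/log4j2.properties pinning spark-shell console logging to WARN.
# Assumes the standard Log4j 2 properties configuration format; the appender
# name "console" and the pattern below are illustrative choices.
rootLogger.level = warn
rootLogger.appenderRef.stdout.ref = console

appender.console.type = Console
appender.console.name = console
appender.console.target = SYSTEM_ERR
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

At runtime, the banner itself points to the alternative: calling sc.setLogLevel("WARN") inside the shell adjusts the level after initialization, but it cannot silence the INFO lines emitted before the SparkContext is up.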
Issue Links
- is caused by SPARK-37746: log4j2-defaults.properties is not working since log4j 2 is always initialized by default (Resolved)
- is related to SPARK-37887: PySpark shell sets log level to INFO by default (Resolved)