Description
In Spark 3.2:
scala> import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.SparkSession

scala> spark.sparkContext.setLogLevel("FATAL")

scala> SparkSession.builder.config("spark.abc", "abc").getOrCreate
res1: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@7dafb9f9

scala> spark.sparkContext.setLogLevel("WARN")

scala> SparkSession.builder.config("spark.abc", "abc").getOrCreate
21/12/23 21:08:18 WARN SparkSession$Builder: Using an existing SparkSession; some spark core configurations may not take effect.
res3: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@7dafb9f9

scala> spark.sparkContext.setLogLevel("FATAL")

scala> SparkSession.builder.config("spark.abc", "abc").getOrCreate
res5: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@7dafb9f9
In the current master:
scala> import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.SparkSession

scala> spark.sparkContext.setLogLevel("FATAL")

scala> SparkSession.builder.config("spark.abc", "abc").getOrCreate
res1: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@3e8a1137

scala> spark.sparkContext.setLogLevel("WARN")

scala> SparkSession.builder.config("spark.abc", "abc").getOrCreate
res3: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@3e8a1137

scala> spark.sparkContext.setLogLevel("FATAL")

scala> SparkSession.builder.config("spark.abc", "abc").getOrCreate
res5: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@3e8a1137
It seems the log level set via setLogLevel before the first getOrCreate takes effect, but it cannot be changed afterward: in the current master, the "Using an existing SparkSession" WARN message from SparkSession$Builder is suppressed even after setLogLevel("WARN") is called, whereas Spark 3.2 prints it.
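A minimal, Spark-free sketch of the suspected mechanism (the object and method names below are hypothetical, not Spark's actual internals): if getOrCreate remembers the log level from session creation and re-applies it on every call, a setLogLevel made in between is silently clobbered, which would match the transcripts above.

```scala
// Hypothetical model: getOrCreate re-applies the log level captured at
// first call, overriding any setLogLevel made in between.
object LogLevelDemo {
  // Stands in for the JVM-wide log level.
  var currentLevel: String = "WARN"
  private var initialLevel: Option[String] = None

  def setLogLevel(level: String): Unit = currentLevel = level

  def getOrCreate(): Unit = {
    // First call: remember the level in effect at "session creation".
    if (initialLevel.isEmpty) initialLevel = Some(currentLevel)
    // Every call: re-apply the remembered level, clobbering later changes.
    initialLevel.foreach(currentLevel = _)
  }

  def main(args: Array[String]): Unit = {
    setLogLevel("FATAL")
    getOrCreate()       // remembers FATAL
    setLogLevel("WARN") // user raises the level...
    getOrCreate()       // ...but it is reset back to FATAL
    println(currentLevel)
  }
}
```

Under this model the final println prints FATAL, so the WARN message a later getOrCreate would log at WARN level never appears.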