Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Incomplete
- Affects Version/s: 2.4.0
- Fix Version/s: None
Description
When running a Spark standalone cluster with spark.authenticate.secret set up, you cannot submit a program in cluster mode, even with the right secret. The driver fails with:
18/08/09 08:17:21 INFO SecurityManager: SecurityManager: authentication enabled; ui acls disabled; users with view permissions: Set(systest); groups with view permissions: Set(); users with modify permissions: Set(systest); groups with modify permissions: Set()
18/08/09 08:17:21 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: A secret key must be specified via the spark.authenticate.secret config.
    at scala.Predef$.require(Predef.scala:224)
    at org.apache.spark.SecurityManager.initializeAuth(SecurityManager.scala:361)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:238)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
    ...
This error is misleading: the check in SecurityManager.initializeAuth() is simply wrong. The secret is there; it is just stored in the environment variable _SPARK_AUTH_SECRET (so that it is not visible to other processes).
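For illustration, here is a paraphrased sketch of the failing logic (structure inferred from the error above; not the verbatim Spark source): the check consults only the SparkConf and never falls back to the environment variable the standalone Worker actually passes to the driver.

import org.apache.spark.SparkConf

class SecurityManagerSketch(sparkConf: SparkConf) {
  // Paraphrase of the 2.4.0 check; names and structure are approximate.
  def initializeAuth(): Unit = {
    if (!sparkConf.getBoolean("spark.authenticate", false)) {
      return
    }
    // Only the SparkConf entry is checked. The secret that the driver
    // actually received is in sys.env("_SPARK_AUTH_SECRET"), but that is
    // never consulted here, so this require() fails:
    require(sparkConf.contains("spark.authenticate.secret"),
      "A secret key must be specified via the spark.authenticate.secret config.")
  }
}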
Workaround: in your program, pass a dummy secret into your SparkConf. Its value does not matter at all; it is ignored later, and the secret from the environment variable is used when establishing connections. E.g.:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
conf.setIfMissing("spark.authenticate.secret", "doesn't matter")
val sc = new SparkContext(conf)
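The dummy value exists only to satisfy the require() in initializeAuth(); actual authentication still uses the secret from _SPARK_AUTH_SECRET, so the real secret never has to appear in the SparkConf.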