Several Spark properties equivalent to spark-submit command-line options are missing from the documentation.
The equivalent of spark-submit --num-executors should be spark.executor.instances when set in SparkConf.
Could you try setting that with sparkR.init()?
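A minimal sketch of what that might look like, assuming a working SparkR installation and a YARN cluster (spark.executor.instances is the documented SparkConf equivalent of --num-executors):

```r
library(SparkR)

# Pass spark.executor.instances via sparkEnvir. Note the property name:
# it is spark.executor.instances, not spark.num.executors.
sc <- sparkR.init(
  master = "yarn-client",
  sparkEnvir = list(spark.executor.instances = "6")
)
```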
From: Franc Carter <email@example.com>
Sent: Friday, December 25, 2015 9:23 PM
Subject: number of executors in sparkR.init()
I'm having trouble working out how to get the number of executors set when using sparkR.init().
If I start sparkR with
sparkR --master yarn --num-executors 6
then I get 6 executors
However, if I start sparkR with
sc <- sparkR.init(master="yarn-client", sparkEnvir=list(spark.num.executors='6'))
then I only get 2 executors.
Can anyone point me in the direction of what I might be doing wrong? I need to initialise it this way so that RStudio can hook into SparkR.