Details
Description
Hi,
When we try to set Spark memory from SparkR in a Jupyter notebook, Spark always falls back to the default memory configuration. We want to set the Spark memory configuration from SparkR code, but with the current Livy it does not take effect. The same approach works correctly with PySpark. I'm attaching screenshots of what we tried with both PySpark and SparkR with this mail.
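For reference, a minimal sketch of the kind of SparkR call that does not take effect through Livy (the exact memory values are illustrative assumptions; `sparkR.session` with a `sparkConfig` list is the standard SparkR API for setting session configuration):

```r
library(SparkR)

# Attempt to set driver/executor memory from SparkR code.
# Through Livy, the session keeps the default memory settings
# instead of picking up these values.
sparkR.session(sparkConfig = list(
  spark.driver.memory   = "4g",
  spark.executor.memory = "4g"
))
```

With PySpark the equivalent configuration is honored, which is why this looks like a SparkR-specific issue in Livy.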
If you need any further information regarding this issue, please let us know.
Livy version: 0.5 (latest release)
Spark: 2.1.1
Jupyter Notebook: 4.3.1
SparkMagic: 0.11.3