Description
Currently, sparkR.stop removes the relevant variables from .sparkREnv for the SparkContext and the backend. However, it does not clean up .sparkRSQLsc and .sparkRHivesc.
As a result, the following sequence
sc <- sparkR.init("local")
sqlContext <- sparkRSQL.init(sc)
sparkR.stop()
sc <- sparkR.init("local")
sqlContext <- sparkRSQL.init(sc)
sqlContext
produces
sqlContext
Error in callJMethod(x, "getClass") :
Invalid jobj 1. If SparkR was restarted, Spark operations need to be re-executed.
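
A minimal sketch of a possible cleanup, assuming sparkR.stop has direct access to the .sparkREnv environment and that the cached contexts are stored under the names mentioned above (the surrounding structure of sparkR.stop is not shown and is assumed):

# Inside sparkR.stop(), after the existing SparkContext/backend cleanup
# (assumed placement):
env <- .sparkREnv

# Drop the cached SQLContext and HiveContext so that a subsequent
# sparkRSQL.init()/sparkRHive.init() creates fresh contexts instead of
# returning stale jobj references.
if (exists(".sparkRSQLsc", envir = env, inherits = FALSE)) {
  rm(".sparkRSQLsc", envir = env)
}
if (exists(".sparkRHivesc", envir = env, inherits = FALSE)) {
  rm(".sparkRHivesc", envir = env)
}

With these entries removed, re-running the reproduction above should return a valid sqlContext after the second sparkRSQL.init(sc) instead of the "Invalid jobj" error.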