Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Incomplete
Description
The timeout for the connection from R to the JVM is hardcoded at 6000 seconds in https://github.com/apache/spark/blob/6c5bbd628aaedb6efb44c15f816fea8fb600decc/R/pkg/R/client.R#L22
As a result, any SparkR job that takes longer than 100 minutes always fails. We should make this timeout configurable through SparkConf.
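A minimal sketch of how the configurable timeout could be used once exposed through SparkConf. The property name `spark.r.backendConnectionTimeout` is the one later introduced by SPARK-17919; the script name is hypothetical:

```shell
# Hypothetical usage: raise the R-to-JVM connection timeout from the
# hardcoded 6000 seconds to 4 hours (14400 s) via SparkConf, so long-running
# SparkR jobs no longer fail after 100 minutes.
spark-submit \
  --conf spark.r.backendConnectionTimeout=14400 \
  my_long_job.R
```

The same property could be set programmatically when starting the session, e.g. via the `sparkConfig` argument of `sparkR.session()`.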
Issue Links
- duplicates
  - SPARK-17919 Make timeout to RBackend configurable in SparkR (Resolved)