Details
- Type: Wish
- Status: Resolved
- Priority: Minor
- Resolution: Won't Fix
Description
It would be great if Spark supported AWS Lambda as an execution backend in addition to Standalone, Mesos, and YARN, doing away with the concept of a fixed "cluster" in favor of a single, effectively infinite-sized one.
A couple of problems I see today:
- Execution time per invocation is limited to 60s. This will probably change in the future.
- "Burstiness" (the concurrency that can be ramped up at once) is still not very high.