Description
We now sometimes hit the 1.5-hour time limit, e.g. https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark/build/1676-master
I previously asked AppVeyor to raise this limit from one hour to 1.5 hours, and I have made the same request for my own account a few times before, but it looks like we cannot keep asking for increases, so we should fix this within Spark.
I identified two things that take quite a bit of time:
1. The build cache is disabled in pull request builds, so every PR build re-downloads the Maven dependencies (roughly 10 minutes).
https://www.appveyor.com/docs/build-cache/
Note: Saving cache is disabled in Pull Request builds.
See also http://help.appveyor.com/discussions/problems/4159-cache-doesnt-seem-to-be-working
This seems difficult to fix within Spark.
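For reference, enabling the build cache for non-PR builds would look roughly like the fragment below in appveyor.yml. This is a sketch using AppVeyor's documented `cache` syntax; the exact local Maven repository path is an assumption, and as noted above it would not help PR builds, where saving the cache is disabled by AppVeyor.

```yaml
# Hypothetical appveyor.yml fragment: cache the local Maven repository
# between builds. This only helps branch builds, not pull request builds,
# because AppVeyor does not save the cache in PR builds.
cache:
  # Invalidate the cached repository whenever pom.xml changes.
  - C:\Users\appveyor\.m2 -> pom.xml
```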
2. The "MLlib classification algorithms" tests take 30-35 minutes. The test below alone accounts for that time:
MLlib classification algorithms, except for tree-based algorithms: Spark package found in SPARK_HOME: C:\projects\spark\bin\.. ......................................................................
As a (I think) last resort, we could split this test into its own build matrix entry, so that the other tests run in one build and this test runs in a separate one. For example, I run the Scala tests with this workaround - https://ci.appveyor.com/project/spark-test/spark/build/757-20170716 (a matrix with 7 entries, each building and testing a subset).
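A sketch of what that split could look like in appveyor.yml. The `TEST_GROUP` variable name and the script that branches on it are assumptions for illustration, not existing Spark configuration; only the `environment.matrix` structure itself is standard AppVeyor syntax.

```yaml
# Hypothetical appveyor.yml fragment: split the slow MLlib classification
# tests into their own matrix entry, so each entry runs as a separate build
# with its own time limit.
environment:
  matrix:
    - TEST_GROUP: sparkr-default          # all SparkR tests except MLlib classification
    - TEST_GROUP: mllib-classification    # only the slow MLlib classification tests

# The test script would then branch on TEST_GROUP, e.g. (hypothetical script):
# test_script:
#   - cmd: .\dev\appveyor-run-test-group.cmd %TEST_GROUP%
```

The cost of this approach is a second full build per commit, which is why it is a last resort rather than a first choice.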
I am also investigating and testing other options.