Details
- Type: Bug
- Status: Closed
- Priority: Critical
- Resolution: Not A Problem
- Affects Version/s: None
- Fix Version/s: None
- Component/s: None
- Environment: Spark running locally
Description
The classpath retrieval uses a "-spark" flag that returns nothing. Using the default "mahout classpath" invocation instead seems to pick up all the needed jar paths, so commenting out the "-spark" argument makes it work for me. I'm not sure this is the best fix, though.
The change is in def mahoutSparkContext(...):

//val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "-spark", "classpath"))
val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "classpath"))
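A minimal sketch of the workaround in context. This is a hypothetical standalone helper, not Mahout's actual implementation: the method name `mahoutClasspathJars` is invented, `mahoutScript` is assumed to point at the `bin/mahout` launcher, and the script is assumed to print a single path-separator-joined classpath string (which is what "mahout classpath" produces):

```scala
import scala.io.Source

// Hypothetical helper illustrating the workaround: invoke `bin/mahout classpath`
// WITHOUT the "-spark" flag (which returns nothing in this environment) and
// split the printed classpath into individual jar paths.
def mahoutClasspathJars(mahoutScript: java.io.File): Array[String] = {
  // The original code passed Array(exec.getAbsolutePath, "-spark", "classpath");
  // dropping "-spark" falls back to the default classpath target.
  val p = Runtime.getRuntime.exec(Array(mahoutScript.getAbsolutePath, "classpath"))
  p.waitFor()
  // Assumption: the script writes one File.pathSeparator-joined string to stdout.
  val out = Source.fromInputStream(p.getInputStream).mkString.trim
  out.split(java.io.File.pathSeparator).filter(_.endsWith(".jar"))
}
```

The resulting jar paths would then be handed to the Spark context configuration, which is why an empty result from the "-spark" variant breaks local runs.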