Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
Description
SystemML's file handling currently relies solely on the default file system implementation specified in the active Hadoop configuration. This creates two major problems:
- When reading from object stores (e.g., via Swift or S3), the Hadoop configuration does not necessarily specify these as the default file system implementation, which leads to failures such as the one shown below.
- When HDFS is the default file system implementation, users cannot load local files by prefixing them with file:/// but are instead required to copy these files into HDFS first.
Caused by: org.apache.sysml.api.mlcontext.MLContextException: Exception occurred while validating script
	at org.apache.sysml.api.mlcontext.ScriptExecutor.validateScript(ScriptExecutor.java:506)
	at org.apache.sysml.api.mlcontext.ScriptExecutor.execute(ScriptExecutor.java:286)
	at org.apache.sysml.api.mlcontext.MLContext.execute(MLContext.java:303)
	... 43 more
Caused by: org.apache.sysml.parser.LanguageException: Invalid Parameters : ERROR: null -- line 2, column 2 -- Read input file does not exist on FS (local mode): swift://<logical_name>/census.csv
	at org.apache.sysml.parser.Expression.raiseValidateError(Expression.java:549)
	at org.apache.sysml.parser.DataExpression.validateExpression(DataExpression.java:641)
	at org.apache.sysml.parser.StatementBlock.validate(StatementBlock.java:592)
	at org.apache.sysml.parser.DMLTranslator.validateParseTree(DMLTranslator.java:143)
	at org.apache.sysml.api.mlcontext.ScriptExecutor.validateScript(ScriptExecutor.java:504)
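The underlying fix is to resolve the file system from each path's own URI scheme, falling back to the configured default only for scheme-less paths. In Hadoop terms this corresponds to calling FileSystem.get(path.toUri(), conf) instead of FileSystem.get(conf). The following Python sketch illustrates that dispatch logic; it is a hypothetical illustration of the approach (the handler names and registry are made up for the example), not SystemML code:

```python
from urllib.parse import urlparse

# Hypothetical registry of file-system handlers, keyed by URI scheme
# (in Hadoop, this lookup is performed by FileSystem.get(uri, conf)).
HANDLERS = {
    "hdfs": "DistributedFileSystem",
    "file": "LocalFileSystem",
    "swift": "SwiftNativeFileSystem",
    "s3": "S3FileSystem",
}

def resolve_filesystem(path, default_scheme="hdfs"):
    """Pick the handler for the path's own scheme, using the
    configured default only when the path has no scheme."""
    scheme = urlparse(path).scheme or default_scheme
    try:
        return HANDLERS[scheme]
    except KeyError:
        raise ValueError(f"no file system registered for scheme '{scheme}'")

# A file:/// path resolves to the local FS even when HDFS is the default:
assert resolve_filesystem("file:///tmp/census.csv") == "LocalFileSystem"
# An object-store path works even though it is not the default FS:
assert resolve_filesystem("swift://container/census.csv") == "SwiftNativeFileSystem"
# Scheme-less paths still go to the configured default:
assert resolve_filesystem("/user/data/census.csv") == "DistributedFileSystem"
```

With scheme-based resolution, both failure modes above disappear: object-store URIs no longer depend on being the configured default, and file:/// paths no longer require staging data into HDFS.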