Details
- Type: Bug
- Status: Resolved
- Priority: P0
- Resolution: Fixed
- Fix Version: 2.6.0
- Labels: None
Description
IllegalArgumentException when using the Hadoop file system for the WordCount example.
The error occurred when running the WordCount example with the Spark runner on a YARN cluster.
Command-line arguments:
--runner=SparkRunner --inputFile=hdfs:///user/myuser/kinglear.txt --output=hdfs:///user/myuser/wc/wc
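For reference, a minimal sketch of the pipeline being run (assuming the structure of Beam's Java MinimalWordCount example, with the reported hdfs:// paths hardcoded in place of the example's --inputFile/--output options):

import java.util.Arrays;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.Filter;
import org.apache.beam.sdk.transforms.FlatMapElements;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptors;

public class MinimalWordCountOnHdfs {
  public static void main(String[] args) {
    // Runner and cluster settings come from the command line (e.g. --runner=SparkRunner).
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline p = Pipeline.create(options);

    p.apply("ReadLines", TextIO.read().from("hdfs:///user/myuser/kinglear.txt"))
        .apply("SplitWords",
            FlatMapElements.into(TypeDescriptors.strings())
                .via((String line) -> Arrays.asList(line.split("[^\\p{L}]+"))))
        .apply("DropEmptyWords", Filter.by((String word) -> !word.isEmpty()))
        .apply("CountWords", Count.perElement())
        .apply("FormatResults",
            MapElements.into(TypeDescriptors.strings())
                .via((KV<String, Long> wordCount) ->
                    wordCount.getKey() + ": " + wordCount.getValue()))
        // The exception below is raised while this write finalizes: FileBasedSink
        // copies its temporary files to the final hdfs:// location via FileSystems.copy.
        .apply("WriteCounts", TextIO.write().to("hdfs:///user/myuser/wc/wc"));

    p.run().waitUntilFinish();
  }
}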
Stack trace:
java.lang.IllegalArgumentException: Expect srcResourceIds and destResourceIds have the same scheme, but received file, hdfs.
    at org.apache.beam.sdk.repackaged.com.google.common.base.Preconditions.checkArgument(Preconditions.java:122)
    at org.apache.beam.sdk.io.FileSystems.validateSrcDestLists(FileSystems.java:394)
    at org.apache.beam.sdk.io.FileSystems.copy(FileSystems.java:236)
    at org.apache.beam.sdk.io.FileBasedSink$WriteOperation.copyToOutputFiles(FileBasedSink.java:626)
    at org.apache.beam.sdk.io.FileBasedSink$WriteOperation.finalize(FileBasedSink.java:516)
    at org.apache.beam.sdk.io.WriteFiles$2.processElement(WriteFiles.java:592)
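The check that throws is FileSystems.validateSrcDestLists, which requires the sink's temporary (source) files and the final (destination) files to share one scheme; here the temporary files resolved to the local file scheme while the output is hdfs. One possible mitigation, sketched below and not the confirmed fix for this ticket, is to pin the sink's temporary directory to the same hdfs:// scheme as the output. The temp path is hypothetical, and the sketch assumes the beam-sdks-java-io-hadoop-file-system module is on the classpath so hdfs:// paths resolve:

import org.apache.beam.sdk.io.FileSystems;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.io.fs.ResourceId;

public class HdfsTempDirectoryWorkaround {
  static TextIO.Write wordCountWrite() {
    // Resolve an explicit temp directory on HDFS (isDirectory = true); the path is hypothetical.
    ResourceId hdfsTempDir =
        FileSystems.matchNewResource("hdfs:///user/myuser/wc/tmp/", true /* isDirectory */);

    return TextIO.write()
        .to("hdfs:///user/myuser/wc/wc")
        // With temporary files also on hdfs://, copyToOutputFiles performs an
        // hdfs -> hdfs copy, so validateSrcDestLists sees matching schemes.
        .withTempDirectory(hdfsTempDir);
  }
}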