Details
Type: Bug
Status: Resolved
Priority: Major
Resolution: Not A Problem
Affects Version/s: 1.4.1-incubating
Fix Version/s: None
Component/s: None
Description
We are getting an UnsupportedOperationException when trying to run a Sqoop import, and we are not sure of the reason. Other commands such as list-tables and list-databases work fine without any issues.
hduser@canberra:~/work/software/cloudera/sqoop-1.4.1-cdh4.0.0/bin$ sqoop create-hive-table --connect jdbc:mysql://localhost:3306/sqoop --username root --password root --table foo1
12/08/07 10:31:21 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
12/08/07 10:31:21 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/08/07 10:31:21 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/08/07 10:31:21 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hduser/work/software/cloudera/hadoop-2.0.0-cdh4.0.0/share/hadoop/common/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hduser/work/software/cloudera/hbase-0.94.0/lib/slf4j-log4j12-1.5.8.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hduser/work/software/cloudera/sqoop-1.4.1-cdh4.0.0/build/ivy/lib/sqoop/hadoop23test/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
12/08/07 10:31:21 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.UnsupportedOperationException: Not implemented by the DistributedFileSystem FileSystem implementation
java.lang.UnsupportedOperationException: Not implemented by the DistributedFileSystem FileSystem implementation
at org.apache.hadoop.fs.FileSystem.getScheme(FileSystem.java:202)
at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2118)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2128)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2145)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:80)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2184)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2166)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:302)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:158)
at org.apache.sqoop.hive.HiveImport.removeTempLogs(HiveImport.java:105)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:175)
at org.apache.sqoop.tool.CreateHiveTableTool.run(CreateHiveTableTool.java:58)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
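For context, a likely mechanism behind this stack trace (a sketch with hypothetical class names, not the actual Hadoop source): in Hadoop 2.x the FileSystem base class ships a default getScheme() that throws UnsupportedOperationException, and FileSystem.loadFileSystems() calls getScheme() on every implementation it discovers. If a jar on the classpath contains a FileSystem subclass compiled against an older API that never overrides getScheme(), the lookup itself fails with exactly this message, which is why mismatched Hadoop jar versions on the classpath are a common suspect here.

```java
import java.util.List;

// Minimal sketch of the failure mode. FileSystemBase, LocalFs, and the
// simplified DistributedFileSystem below are illustrative stand-ins, not
// the real org.apache.hadoop.fs classes.
public class SchemeLookupSketch {
    // Stand-in for the Hadoop 2.x FileSystem base class: getScheme()
    // throws unless a subclass overrides it.
    static abstract class FileSystemBase {
        public String getScheme() {
            throw new UnsupportedOperationException(
                "Not implemented by the " + getClass().getSimpleName()
                + " FileSystem implementation");
        }
    }

    // An implementation built against the newer API: overrides getScheme().
    static class LocalFs extends FileSystemBase {
        @Override
        public String getScheme() {
            return "file";
        }
    }

    // An implementation built against an older API: getScheme() did not
    // exist yet, so nothing overrides the throwing default.
    static class DistributedFileSystem extends FileSystemBase {
    }

    public static void main(String[] args) {
        // Stand-in for loadFileSystems(): query every discovered
        // implementation for its scheme.
        for (FileSystemBase fs : List.of(new LocalFs(), new DistributedFileSystem())) {
            try {
                System.out.println("registered scheme: " + fs.getScheme());
            } catch (UnsupportedOperationException e) {
                System.out.println("ERROR: " + e.getMessage());
            }
        }
    }
}
```

Running this prints "registered scheme: file" followed by "ERROR: Not implemented by the DistributedFileSystem FileSystem implementation", mirroring the trace above; checking the classpath for a mix of old and new Hadoop jars would be a reasonable first step.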