Sqoop / SQOOP-583

Zero exit code on Exception in sqoop import

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Not a Problem
    • Affects Version/s: 1.3.0
    • Fix Version/s: None
    • Component/s: None
    • Labels: None

      Description

      I am getting a zero exit code even when a real exception occurs while
      running a Sqoop import. The correct exit code (error or success) is
      important for our scheduling system to notify us of any errors.
      Should I file a JIRA issue for this bug?
      Here is what I get:

      For a regular sqoop command:

      [cloudera@localhost workhive]$ sqoop
      Warning: /usr/lib/hbase does not exist! HBase imports will fail.
      Please set $HBASE_HOME to the root of your HBase installation.
      Try 'sqoop help' for usage.
      [cloudera@localhost workhive]$ echo $?
      1
      

      So the exit code is correct here.

      But for the import:

      [cloudera@localhost workhive]$ sqoop import --username username
      --password password --hive-import --table ExternalPublisher
      --connect jdbc:sqlserver://url:port;databaseName=DBName;
      Warning: /usr/lib/hbase does not exist! HBase imports will fail.
      Please set $HBASE_HOME to the root of your HBase installation.
      12/08/17 20:52:39 WARN tool.BaseSqoopTool: Setting your password on
      the command-line is insecure. Consider using -P instead.
      12/08/17 20:52:39 INFO tool.BaseSqoopTool: Using Hive-specific
      delimiters for output. You can override
      12/08/17 20:52:39 INFO tool.BaseSqoopTool: delimiters with
      --fields-terminated-by, etc.
      12/08/17 20:52:39 INFO SqlServer.MSSQLServerManagerFactory: Using
      Microsoft's SQL Server - Hadoop Connector
      12/08/17 20:52:39 INFO manager.SqlManager: Using default fetchSize of 1000
      12/08/17 20:52:39 INFO tool.CodeGenTool: Beginning code generation
      12/08/17 20:52:42 INFO manager.SqlManager: Executing SQL statement:
      SELECT TOP 1 * FROM [ExternalPublisher]
      12/08/17 20:52:42 INFO manager.SqlManager: Executing SQL statement:
      SELECT TOP 1 * FROM [ExternalPublisher]
      12/08/17 20:52:43 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop
      12/08/17 20:52:43 INFO orm.CompilationManager: Found hadoop core jar
      at: /usr/lib/hadoop/hadoop-0.20.2-cdh3u4-core.jar
      12/08/17 20:52:45 ERROR orm.CompilationManager: Could not rename
      /tmp/sqoop-cloudera/compile/2c4caabe09a86fbb2055893836660076/ExternalPublisher.java
      to /home/cloudera/workhive/./ExternalPublisher.java
      java.io.IOException: Destination
      '/home/cloudera/workhive/./ExternalPublisher.java' already exists
              at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:1811)
              at com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
              at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
              at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:370)
              at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:456)
              at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
              at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
              at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
              at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
              at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
              at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
      12/08/17 20:52:45 INFO orm.CompilationManager: Writing jar file:
      /tmp/sqoop-cloudera/compile/2c4caabe09a86fbb2055893836660076/ExternalPublisher.jar
      12/08/17 20:52:45 INFO mapreduce.ImportJobBase: Beginning import of
      ExternalPublisher
      12/08/17 20:52:46 INFO manager.SqlManager: Executing SQL statement:
      SELECT TOP 1 * FROM [ExternalPublisher]
      12/08/17 20:52:48 INFO mapred.JobClient: Cleaning up the staging area
      hdfs://localhost/var/lib/hadoop-0.20/cache/mapred/mapred/staging/cloudera/.staging/job_201208072011_0004
      12/08/17 20:52:48 ERROR security.UserGroupInformation:
      PriviledgedActionException as:cloudera (auth:SIMPLE)
      cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output
      directory ExternalPublisher already exists
      12/08/17 20:52:48 ERROR tool.ImportTool: Encountered IOException
      running import job:
      org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory
      ExternalPublisher already exists
              at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:132)
              at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:872)
              at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
              at java.security.AccessController.doPrivileged(Native Method)
              at javax.security.auth.Subject.doAs(Subject.java:396)
              at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1177)
              at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
              at org.apache.hadoop.mapreduce.Job.submit(Job.java:476)
              at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:506)
              at com.cloudera.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:143)
              at com.cloudera.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:203)
              at com.cloudera.sqoop.manager.SqlManager.importTable(SqlManager.java:464)
              at com.microsoft.sqoop.SqlServer.MSSQLServerManager.importTable(MSSQLServerManager.java:145)
              at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:383)
              at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:456)
              at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
              at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
              at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
              at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
              at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
              at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
      
      [cloudera@localhost workhive]$ echo $?
      0
      

      The exit code indicates success here, which is undesirable. I am not
      interested in why I get the FileAlreadyExistsException; I know how to
      handle it. Getting the correct exit code is what matters for
      maintenance.
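
      Until the exit code is reliable, one workaround (a sketch, not part of
      the original report; the command and log path are illustrative) is a
      wrapper that captures the Sqoop output and fails when the log contains
      ERROR lines:

      ```shell
      #!/bin/sh
      # Workaround sketch: treat the job as failed if the captured log
      # contains any " ERROR " lines, even when the tool itself exits 0.

      check_log() {
          # Succeeds (exit 0) only when no ERROR lines are present.
          ! grep -q ' ERROR ' "$1"
      }

      LOG=/tmp/sqoop-import.log
      # sqoop import ... 2>&1 | tee "$LOG"   # real invocation would go here
      printf '%s\n' '12/08/17 20:52:48 ERROR tool.ImportTool: Encountered IOException' > "$LOG"

      if check_log "$LOG"; then
          echo "import ok"
      else
          echo "import failed (ERROR found in log)"
          # a real wrapper would 'exit 1' here so the scheduler sees the failure
      fi
      ```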

        Activity

        Jarek Jarcec Cecho added a comment -

        Hi Ruslan,
        I wasn't able to reproduce your issue with Sqoop 1.4.1; more precisely:

        
        root@ubuntu-cdh4:~# sqoop import --connect jdbc:mysql://172.16.252.1/sqoop --username XXXXX --password XXXXX --table pokus -m 1
        Warning: /usr/lib/hbase does not exist! HBase imports will fail.
        Please set $HBASE_HOME to the root of your HBase installation.
        12/08/20 06:27:58 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
        12/08/20 06:27:59 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
        12/08/20 06:27:59 INFO tool.CodeGenTool: Beginning code generation
        12/08/20 06:27:59 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `pokus` AS t LIMIT 1
        12/08/20 06:27:59 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `pokus` AS t LIMIT 1
        12/08/20 06:27:59 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop
        Note: /tmp/sqoop-root/compile/e4022ea411fadd7976b3f1de65291dd2/pokus.java uses or overrides a deprecated API.
        Note: Recompile with -Xlint:deprecation for details.
        12/08/20 06:28:00 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-root/compile/e4022ea411fadd7976b3f1de65291dd2/pokus.java to /root/./pokus.java
        org.apache.commons.io.FileExistsException: Destination '/root/./pokus.java' already exists
                at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
                at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
                at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
                at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:390)
                at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
                at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
                at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
                at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
                at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
                at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
                at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
        12/08/20 06:28:00 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/e4022ea411fadd7976b3f1de65291dd2/pokus.jar
        12/08/20 06:28:00 WARN manager.MySQLManager: It looks like you are importing from mysql.
        12/08/20 06:28:00 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
        12/08/20 06:28:00 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
        12/08/20 06:28:00 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
        12/08/20 06:28:00 INFO mapreduce.ImportJobBase: Beginning import of pokus
        12/08/20 06:28:02 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
        12/08/20 06:28:02 INFO mapred.JobClient: Cleaning up the staging area hdfs://ubuntu-cdh4/tmp/hadoop-mapred/mapred/staging/root/.staging/job_201208192257_0003
        12/08/20 06:28:02 ERROR security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory pokus already exists
        12/08/20 06:28:02 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory pokus already exists
                at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:132)
                at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:883)
                at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:844)
                at java.security.AccessController.doPrivileged(Native Method)
                at javax.security.auth.Subject.doAs(Subject.java:416)
                at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
                at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:844)
                at org.apache.hadoop.mapreduce.Job.submit(Job.java:481)
                at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:511)
                at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:141)
                at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:201)
                at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:464)
                at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:100)
                at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
                at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
                at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
                at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
                at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
                at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
                at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
                at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
        
        root@ubuntu-cdh4:~# echo $?
        1
        

        Could you please upgrade your Sqoop version to the most recent one?

        Jarcec

        Ruslan Al-Fakikh added a comment -

        We are using CDH, so for now we have to use 1.3.0-cdh3u4. We'll upgrade Sqoop when we upgrade all of Hadoop in the future. I'll use a workaround for now.
        Thanks

        Jarek Jarcec Cecho added a comment -

        Ruslan,
        for your information, I've tried this on CDH3u5 and it's working as expected:

        jarcec@vm-cdh3:~$ sqoop import --connect jdbc:mysql://172.16.252.1/sqoop --username XXXXX --password XXXXX --table pokus -m 1
        Warning: /usr/lib/hbase does not exist! HBase imports will fail.
        Please set $HBASE_HOME to the root of your HBase installation.
        12/08/20 08:35:31 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
        12/08/20 08:35:31 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
        12/08/20 08:35:31 INFO tool.CodeGenTool: Beginning code generation
        12/08/20 08:35:31 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `pokus` AS t LIMIT 1
        12/08/20 08:35:31 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `pokus` AS t LIMIT 1
        12/08/20 08:35:31 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop
        12/08/20 08:35:31 INFO orm.CompilationManager: Found hadoop core jar at: /usr/lib/hadoop/hadoop-core.jar
        12/08/20 08:35:32 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-jarcec/compile/e4cb26ef5a5142e253ef0cf132c64873/pokus.jar
        12/08/20 08:35:32 WARN manager.MySQLManager: It looks like you are importing from mysql.
        12/08/20 08:35:32 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
        12/08/20 08:35:32 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
        12/08/20 08:35:32 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
        12/08/20 08:35:32 INFO mapreduce.ImportJobBase: Beginning import of pokus
        12/08/20 08:35:33 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost:8020/var/lib/hadoop-0.20/cache/mapred/mapred/staging/jarcec/.staging/job_201208200830_0004
        12/08/20 08:35:33 ERROR security.UserGroupInformation: PriviledgedActionException as:jarcec (auth:SIMPLE) cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory pokus already exists
        12/08/20 08:35:33 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory pokus already exists
                at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:132)
                at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:921)
                at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:882)
                at java.security.AccessController.doPrivileged(Native Method)
                at javax.security.auth.Subject.doAs(Subject.java:396)
                at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
                at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:882)
                at org.apache.hadoop.mapreduce.Job.submit(Job.java:526)
                at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:556)
                at com.cloudera.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:143)
                at com.cloudera.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:203)
                at com.cloudera.sqoop.manager.SqlManager.importTable(SqlManager.java:464)
                at com.cloudera.sqoop.manager.MySQLManager.importTable(MySQLManager.java:101)
                at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:382)
                at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:455)
                at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
                at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
                at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
                at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
                at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
                at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
        
        jarcec@vm-cdh3:~$ echo $?
        1
        jarcec@vm-cdh3:~$ sqoop version
        Warning: /usr/lib/hbase does not exist! HBase imports will fail.
        Please set $HBASE_HOME to the root of your HBase installation.
        Sqoop 1.3.0-cdh3u5
        git commit id 291d657234526497fe516a611ac966bc55e2df5c
        Compiled by jenkins@ubuntu-slave02 on Mon Aug  6 19:39:05 PDT 2012
        
        
        Ruslan Al-Fakikh added a comment -

        Thanks again. Your version is a bit higher though. We decided to upgrade to CDH 4 soon. I'll let you know if I still see the error.

        Ruslan Al-Fakikh added a comment -

        Hey Jarek,
        I've just noticed that it works fine in our 1.3.0-cdh3u4 version of Sqoop if we use MySQL like you did. The problem occurs only with imports from MS SQL Server, for which we are using this connector:
        http://www.microsoft.com/en-us/download/details.aspx?id=27584
        So I guess the bug is in the connector. It would be interesting to know whether it is possible to let Microsoft know about it (it seems they haven't updated the connector for a long time) or to get their source code to fix it.

        Jarek Jarcec Cecho added a comment -

        Good catch Ruslan!

        Please do not hesitate and report this behavior if you know where. I tried to search for some bug submission form, but without much success.

        Jarcec

        Abhijeet Gaikwad added a comment -

        Ruslan,
        Try this for your feedback:
        https://connect.microsoft.com/SQLServer/Feedback

        Thanks!

        Ruslan Al-Fakikh added a comment -

        Submitted a bug: https://connect.microsoft.com/SQLServer/feedback/details/759648/sqoop-connector-zero-exit-code-on-exception-in-sqoop-import
        Jarek Jarcec Cecho added a comment -

        Thank you Ruslan for filing the bug report!

        Jarcec


          People

          • Assignee: Jarek Jarcec Cecho
          • Reporter: Ruslan Al-Fakikh
          • Votes: 0
          • Watchers: 3