Sqoop / SQOOP-382

Connection parameters should be used on the mapper

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.4.3
    • Component/s: None
    • Labels: None

      Description

      Currently you can specify connection parameters using --connection-param-file <properties-file>.
      This applies the connection parameters when generating the Sqoop code - but the parameters are not passed down to the mapper.
      Instead of specifying a parameters file, couldn't we have a comma separated list that could be specified on the command line or in sqoop-site.xml? That way it would be easier to override the settings per job, and they would be passed down to the mappers. It would then be simple to modify DBConfiguration.getConnection to read these.
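
      To illustrate the idea, a minimal sketch of such a change might look like the following (a sketch only: the property name sqoop.connection.params and the parsing are assumptions for illustration, not the committed patch; conf and the *_PROPERTY constants are DBConfiguration's existing members):

      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.SQLException;
      import java.util.Properties;

      // Sketch of a modified org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection.
      public Connection getConnection() throws ClassNotFoundException, SQLException {
        Class.forName(conf.get(DRIVER_CLASS_PROPERTY));

        Properties props = new Properties();
        // Read a comma-separated list such as "currentSchema=dbo,database=veodin"
        // from the job Configuration, so it is available inside every mapper.
        for (String pair : conf.get("sqoop.connection.params", "").split(",")) {
          String[] kv = pair.split("=", 2);
          if (kv.length == 2) {
            props.setProperty(kv[0].trim(), kv[1].trim());
          }
        }
        if (conf.get(USERNAME_PROPERTY) != null) {
          props.setProperty("user", conf.get(USERNAME_PROPERTY));
          props.setProperty("password", conf.get(PASSWORD_PROPERTY, ""));
        }
        return DriverManager.getConnection(conf.get(URL_PROPERTY), props);
      }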

      Attachments

      1. SQOOP-382.patch
        20 kB
        David Robson

        Issue Links

          Activity

          Jan Mechtel added a comment -

          I also have this issue when trying to connect to MS SQL on Azure. I can tell that the connection works, because it stops before the mappers if I try a wrong password or a table name that does not exist.

          If the password is correct then:

          12/11/20 23:16:49 INFO mapred.JobClient: Cleaning up the staging area hdfs://overlord-datanode1:8020/user/overlord/.staging/job_201211201516_0015
          12/11/20 23:16:49 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: Server name cannot be determined. It must appear as the first segment of the server's dns name (servername.database.windows.net). Some libraries do not send the server name, in which case the server name must be included as part of the user name (username@servername). In addition, if both formats are used, the server names must match.

          Complete output (I believe the first error about the connector file is not relevant):
          overlord@overlord-datanode1:~/Documents/hadoop/scripts/03_azure_sql$ ./test_sqoop_azure_connection.sh
          Enter password:
          12/11/20 23:30:57 ERROR sqoop.ConnFactory: Error loading ManagerFactory information from file /usr/lib/sqoop/bin/../conf/managers.d/mssqoop-sqlserver: java.io.IOException: the content of connector file must be in form of key=value
          at org.apache.sqoop.ConnFactory.addManagersFromFile(ConnFactory.java:153)
          at org.apache.sqoop.ConnFactory.loadManagersFromConfDir(ConnFactory.java:228)
          at org.apache.sqoop.ConnFactory.instantiateFactories(ConnFactory.java:83)
          at org.apache.sqoop.ConnFactory.<init>(ConnFactory.java:60)
          at com.cloudera.sqoop.ConnFactory.<init>(ConnFactory.java:36)
          at org.apache.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:200)
          at org.apache.sqoop.tool.ImportTool.init(ImportTool.java:83)
          at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:464)
          at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
          at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
          at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
          at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
          at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
          at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
          at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)

          12/11/20 23:30:57 INFO manager.SqlManager: Using default fetchSize of 1000
          12/11/20 23:30:57 INFO tool.CodeGenTool: Beginning code generation
          12/11/20 23:30:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [Unlocks] AS t WHERE 1=0
          12/11/20 23:30:58 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop
          Note: /tmp/sqoop-overlord/compile/74f660b6b6f05d767baeb6b4266f35c1/Unlocks.java uses or overrides a deprecated API.
          Note: Recompile with -Xlint:deprecation for details.
          12/11/20 23:31:00 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-overlord/compile/74f660b6b6f05d767baeb6b4266f35c1/Unlocks.jar
          12/11/20 23:31:01 INFO mapreduce.ImportJobBase: Beginning import of Unlocks
          12/11/20 23:31:01 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
          12/11/20 23:31:18 INFO mapred.JobClient: Cleaning up the staging area hdfs://overlord-datanode1:8020/user/overlord/.staging/job_201211201516_0016
          12/11/20 23:31:18 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: Server name cannot be determined. It must appear as the first segment of the server's dns name (servername.database.windows.net). Some libraries do not send the server name, in which case the server name must be included as part of the user name (username@servername). In addition, if both formats are used, the server names must match.
          java.lang.RuntimeException: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: Server name cannot be determined. It must appear as the first segment of the server's dns name (servername.database.windows.net). Some libraries do not send the server name, in which case the server name must be included as part of the user name (username@servername). In addition, if both formats are used, the server names must match.
          at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
          at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
          at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
          at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1011)
          at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1031)
          at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:172)
          at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:943)
          at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:896)
          at java.security.AccessController.doPrivileged(Native Method)
          at javax.security.auth.Subject.doAs(Subject.java:396)
          at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
          at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:896)
          at org.apache.hadoop.mapreduce.Job.submit(Job.java:531)
          at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:561)
          at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:141)
          at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:202)
          at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:465)
          at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
          at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
          at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
          at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
          at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
          at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
          at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
          at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
          at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
          Caused by: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: Server name cannot be determined. It must appear as the first segment of the server's dns name (servername.database.windows.net). Some libraries do not send the server name, in which case the server name must be included as part of the user name (username@servername). In addition, if both formats are used, the server names must match.
          at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:193)
          at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:162)
          ... 25 more
          Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Server name cannot be determined. It must appear as the first segment of the server's dns name (servername.database.windows.net). Some libraries do not send the server name, in which case the server name must be included as part of the user name (username@servername). In addition, if both formats are used, the server names must match.
          at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
          at com.microsoft.sqlserver.jdbc.TDSTokenHandler.onEOF(tdsparser.java:246)
          at com.microsoft.sqlserver.jdbc.TDSParser.parse(tdsparser.java:83)
          at com.microsoft.sqlserver.jdbc.SQLServerConnection.sendLogon(SQLServerConnection.java:2529)
          at com.microsoft.sqlserver.jdbc.SQLServerConnection.logon(SQLServerConnection.java:1905)
          at com.microsoft.sqlserver.jdbc.SQLServerConnection.access$000(SQLServerConnection.java:41)
          at com.microsoft.sqlserver.jdbc.SQLServerConnection$LogonCommand.doExecute(SQLServerConnection.java:1893)
          at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
          at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
          at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:1045)
          at com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:817)
          at com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:700)
          at com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:842)
          at java.sql.DriverManager.getConnection(DriverManager.java:582)
          at java.sql.DriverManager.getConnection(DriverManager.java:207)
          at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:181)
          at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:187)
          ... 26 more

          Jarek Jarcec Cecho added a comment -

          Hi Jan,
          thank you very much for your feedback on this issue. Would you mind sharing with us the final sqoop command that your script test_sqoop_azure_connection.sh generates?

          Jan Mechtel added a comment -

          Sure, my bad.
          overlord@overlord-datanode1:~/Documents/hadoop/scripts/03_azure_sql$ cat test_sqoop_azure_connection.sh
          /usr/lib/sqoop/bin/sqoop import -P --connect 'jdbc:sqlserver://ns74c5q6t9.database.windows.net' --connection-param-file sqoop_azure_connection.properties --table Unlocks --target-dir /data/azure/dbo_unlocks

          overlord@overlord-datanode1:~/Documents/hadoop/scripts/03_azure_sql$ cat sqoop_azure_connection.properties
          username=us@ns74c5q6t9
          database=veodin
          currentSchema=dbo

          I put the username inside the connection-param-file because otherwise it would not work with the @. Maybe one way around the issue would be to escape the @ (I tried {@})?

          Also, I was thinking that maybe, because of the first error:

          ERROR sqoop.ConnFactory: Error loading ManagerFactory information from file /usr/lib/sqoop/bin/../conf/managers.d/mssqoop-sqlserver: java.io.IOException: the content of connector file must be in form of key=value

          the parameters are not passed correctly because the default ConnectionFactory is used?

          Thanks for taking the time to look at this. I would have posted a question if I hadn't found this bug report which seemed to match.

          Jan Mechtel added a comment -

          Side note: I also found this thread: http://mail-archives.apache.org/mod_mbox/sqoop-user/201207.mbox/%3CCAL=o-uRrzu5_FsOhb3AtjTGgAubyKEfRdGNJXm=aRyE5aMHu4A@mail.gmail.com%3E

          > Actually when i pasted sql server connector and jdbc connectors in
          > hadoop lib folder,the errors are gone

          But I'm unsure which files to copy. Putting sqljdbc4.jar and sqoop-sqlserver-1.0.jar into /usr/lib/hadoop/lib didn't do the trick.

          Jarek Jarcec Cecho added a comment -

          Hi Jan,
          thank you very much for the additional information. I would break your particular issue into several sub-issues:

          1) Problematic "@" character.
          What shell are you using? I tried bash and did not have any issues with a password containing the "@" character. You can always put the entire password in single (') or double (") quotes on most commonly available shells.

          2) The content of connector file must be in form of key=value
          You have an invalid configuration of your Microsoft SQL Connector, and as a result this connector is not loaded. Sqoop might work regardless, as it will likely load the built-in Microsoft SQL Server connector instead. You should fix your configuration in /usr/lib/sqoop/bin/../conf/managers.d/mssqoop-sqlserver to follow the format com.solar.earth.class.name=/path/to/jar.

          Jan Mechtel added a comment -

          1) The "@" is in the username it's a Azure MS SQL specific thing where you need to connect with the username: <username>@<host>. When I put this directly into the connection string the server would deny authorization to "<username>" that lead me to the assumption that everything behind the @ get's ignored. When I put it into the connection.properties I pass this barrier at least.

          2) OK, I will go back and check the configuration of the MSSQL_SQOOP_CONNECTOR; I might have missed a step. With "com.solar.earth.class.name=/path/to/jar" you mean I should put the path to the connector jar as the value for the MSSQL Sqoop Connector?

          Jarek Jarcec Cecho added a comment -

          1) I see, I actually missed that it's in the username and not the password. However, from Sqoop's perspective it shouldn't matter. Would you mind trying to put your username, password, and currentSchema into the JDBC URL instead of the external file? Something like

          jdbc:sqlserver://ns74c5q6t9.database.windows.net;user=us@ns74c5q6t9;password=...;
          

          2) You seem to have a configuration file for the Microsoft SQL Connector at the following path:

          /usr/lib/sqoop/bin/../conf/managers.d/mssqoop-sqlserver
          

          Based on the provided error message, I'm guessing that its content is:

          com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
          

          However, this content is not valid, as it should be in the form of a key=value pair. You should change it to something like:

          com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory=/usr/lib/sqoop/lib/sqoop-sqlserver-1.0.jar
          

          Please note that I've guessed the path based on the information provided in this JIRA. It might not be accurate, and you should check that the sqoop-sqlserver-1.0.jar is indeed stored in this location.

          Jan Mechtel added a comment -

          1) I've tried it with all three combinations (1st: username in the connection string; 2nd: username in the connection string and the properties file; 3rd: username in the properties file).
          2) That's exactly what I've done now. By the way, you guessed the path correctly. The first error message is now gone, but I still can't get past this message:

          overlord@overlord-datanode1:~/Documents/hadoop/scripts/03_azure_sql$ ./test_sqoop_azure_connection.sh
          Enter password:
          12/11/21 18:20:05 INFO SqlServer.MSSQLServerManagerFactory: Using Microsoft's SQL Server - Hadoop Connector
          12/11/21 18:20:05 INFO manager.SqlManager: Using default fetchSize of 1000
          12/11/21 18:20:05 INFO tool.CodeGenTool: Beginning code generation
          12/11/21 18:20:06 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [Unlocks]
          12/11/21 18:20:06 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [Unlocks]
          12/11/21 18:20:06 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop
          Note: /tmp/sqoop-overlord/compile/25615050f948c94583e6659e87c39a58/Unlocks.java uses or overrides a deprecated API.
          Note: Recompile with -Xlint:deprecation for details.
          12/11/21 18:20:07 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-overlord/compile/25615050f948c94583e6659e87c39a58/Unlocks.jar
          12/11/21 18:20:08 INFO mapreduce.ImportJobBase: Beginning import of Unlocks
          12/11/21 18:20:08 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [Unlocks]
          12/11/21 18:20:09 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
          12/11/21 18:20:12 INFO mapred.JobClient: Cleaning up the staging area hdfs://overlord-datanode1:8020/user/overlord/.staging/job_201211201516_0077
          12/11/21 18:20:12 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: Login failed for user 'veodin_user'.
          java.lang.RuntimeException: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: Login failed for user 'veodin_user'.

          Any other ideas how I could escape the "@" or pass the properties file to the MapReduce job?

          Jarek Jarcec Cecho added a comment -

          Hi Jan,
          would you mind sharing with us the whole Sqoop log generated with the --verbose parameter, plus the map task log, for the following two cases:

          • Sqoop executed with the --username argument
          • Username stored in the JDBC URL

          (i.e. 4 files)

          Jan Mechtel added a comment -

          Thanks for looking into this for me:

          1) Sqoop executed with --username argument
          Sqoop output: http://pastebin.com/sML5yL7w

          12/11/22 11:07:25 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: Cannot open server "ns74c56t9" requested by the login. The login failed.
          java.lang.RuntimeException: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: Cannot open server "ns74c56t9" requested by the login. The login failed.

          2) Username stored in JDBC URL

          Sqoop output: http://pastebin.com/P2DZ7YqF

          12/11/22 10:32:35 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: Login failed for user 'veodin_user'.

          MR output: in both cases the job doesn't show up in the task tracker. I thought it would show up as job_201211201516_0089, based on this line from the output:
          12/11/22 10:32:35 INFO mapred.JobClient: Cleaning up the staging area hdfs://overlord-datanode1:8020/user/overlord/.staging/job_201211201516_0089

          Ideas?

          Jarek Jarcec Cecho added a comment -

          Hi Jan,
          I'm no Microsoft SQL Server on Azure expert; however, from the error message "Cannot open server "ns74c56t9" requested by the login. The login failed", I would say that your entire username was transferred correctly, as the error message contains the server name that we thought was not transported.

          Sqoop requires access to your database from all nodes in your Hadoop cluster, as it does several data movements in parallel. We've seen similar issues with, for example, MySQL, where the master node had sufficient privileges but the slave nodes did not. Would you mind checking that you are able to connect to your database from all nodes in your cluster? It's a pretty easy task with MySQL, but I'm not sure how to do it for Microsoft SQL Server, and I'm not sure that Microsoft ships any Linux-compatible client.

          Jan Mechtel added a comment -

          OK, basically you want to check that the other nodes can connect (so that we are sure that, for example, a DNS error isn't causing this)? You assume correctly that this is a bit more complicated than MySQL.

          Could we instead reduce our cluster to only one master node and then try to run Sqoop, to rule out that problem source? (I didn't install Cloudera; sorry if that's an easy question. I'll ask our admin too.)

          Jarek Jarcec Cecho added a comment -

          You're right, I would like you to check that you're able to connect from all your nodes to your Azure database, to verify that this is indeed a Sqoop-related issue.

          Reconfiguring your cluster seems weird to me (a lot of work for a small gain). What about putting together a very simple Java application (one class with main()) that will open a connection using JDBC and run a simple query? That should do the job, and it inherently must work on Linux.
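
          A minimal sketch of such a test class (the class name, URL format, and placeholder values follow later comments in this thread; the sqljdbc4.jar driver is assumed to be on the classpath):

          import java.sql.Connection;
          import java.sql.DriverManager;
          import java.sql.ResultSet;
          import java.sql.Statement;

          // Compile and run on each cluster node, e.g.:
          //   javac TestAzure.java
          //   java -cp .:sqljdbc4.jar TestAzure
          public class TestAzure {
            public static void main(String[] args) throws Exception {
              // Placeholder URL in the Azure format discussed in this thread.
              String url = "jdbc:sqlserver://YOUR_HOST.database.windows.net:1433;"
                  + "database=YOUR_DATABASE;user=YOUR_USER@YOUR_HOST;password=YOUR_PASSWORD";
              try (Connection conn = DriverManager.getConnection(url);
                   Statement stmt = conn.createStatement();
                   ResultSet rs = stmt.executeQuery("SELECT 1")) {
                while (rs.next()) {
                  System.out.println("Connection OK, SELECT 1 returned " + rs.getInt(1));
                }
              }
            }
          }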

          Jan Mechtel added a comment -

          OK, I'm trying that. I'm new to Linux, so I'm having trouble setting the classpath correctly. I'm trying to find a solution and will get back: http://stackoverflow.com/questions/13521330/run-mssql-jdbc-example-on-centos-java-lang-classnotfoundexception-com-microsoft

          David Robson added a comment -

          It looks like you need to add the jar to the classpath - e.g. "java -cp ./:./sqljdbc4.jar TestAzure" (the -cp option must come before the class name).
          You also probably need to add the connection string to your Java code to actually create the connection.
          As for your problem - I asked a colleague who has used Sqoop on Azure before, and he gave me the following command line, which worked successfully:
          sqoop import --connect "jdbc:sqlserver://YOUR_HOST.database.windows.net:1433;database=YOUR_DATABASE;user=YOUR_USER@YOUR_HOST;password=YOUR_PASSWORD" --table YOUR_TABLE --target-dir YOUR_DIRECTORY --split-by YOUR_SPLIT_COLUMN
          Could you try this command - getting rid of the connection parameters completely and specifying the password in the URL - just to see if it works?
          From my experience the connection parameters using the connection-param-file do not work (hence this bug) - so it's probably best to eliminate them altogether.

          Jan Mechtel added a comment -

          I'll try the connection string first. It looks very similar to what I started with, so I expect this to fail because of the @, but let's see. I am actually rearranging my string to include the password as well.

          The only thing I'm unfamiliar with is the --split-by parameter. From the manual ("Column of the table used to split work units"): is that similar to bucketing/partitioning? Can I skip it? The test table is not very large.

          Thanks for the suggestions.

          Jan Mechtel added a comment -

          I skipped the column split and it's running now. I have other errors, but they seem to be permissions-related. I'll try to find what change made it work and post it here. Thanks!

          Jarek Jarcec Cecho added a comment -

          Wonderful, I'm glad that it's working for you, Jan Mechtel. Thank you, David Robson, for your suggestion; I appreciate your help!

          Jan Mechtel added a comment -

          OK, it seems that the -P option makes the difference. With -P you provide the password on standard input; in that case the preparation runs, but the MapReduce job does not. When I include the password with password=<mypwd> in the connection string, it works. Inconvenient, but I can live with that.

          Thanks a ton for your help.

          David Robson added a comment -

          It looks to me like your problem might be with the Microsoft connector itself. I am not sure where to get support for that, as I am not familiar with it, but it seems like a bug they should fix.
          Anyway, in regards to SQOOP-382: the connection parameters don't work. Should this be fixed in the 1.x branch, or should we leave it to 2.x? If it's not going to be fixed in 1.x, should we at least update the documentation to warn people it does not work?

          Jarek Jarcec Cecho added a comment -

          Good question, David Robson. I'll try to take a look at this and either fix it before the next release or improve the documentation to note that it won't work for the import/export tools.

          David Robson added a comment -

          I actually looked into this a fair bit when I raised it. I think the easiest way is to just make it a Configuration parameter; then you can use the standard Hadoop Configuration object to get the array of connection parameters.
          Part of the problem with this for Sqoop 1.x is that each connector has to implement it, so while it can be fixed in Sqoop, it won't work in, say, the Microsoft connector or the Quest one. Of course, if it's fixed I can fix the Quest connector...
          Let me know if this feature is desirable; seeing as I'd have to fix OraOop anyway, I could fix it in Sqoop at the same time and submit a patch.
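
          A rough sketch of that Configuration-based approach (all names here are hypothetical and for illustration only; the actual patch may store the parameters differently):

          import java.util.Map;
          import java.util.Properties;
          import org.apache.hadoop.conf.Configuration;

          // Hypothetical helper: serialize the --connection-param-file properties into
          // the job Configuration under a prefix at submit time, then rebuild them in
          // the mapper before opening a connection.
          public final class ConnectionParams {
            private static final String PREFIX = "sqoop.connection.param.";

            public static void store(Configuration conf, Properties params) {
              for (String name : params.stringPropertyNames()) {
                conf.set(PREFIX + name, params.getProperty(name));
              }
            }

            public static Properties load(Configuration conf) {
              Properties params = new Properties();
              for (Map.Entry<String, String> entry : conf) {
                if (entry.getKey().startsWith(PREFIX)) {
                  params.setProperty(entry.getKey().substring(PREFIX.length()),
                      entry.getValue());
                }
              }
              return params;
            }
          }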

          Jarek Jarcec Cecho added a comment -

          David Robson: please do not hesitate to put something together. Contributions are more than welcome. Please feel free to reassign this ticket to yourself.

          David Robson added a comment -

          Added review request https://reviews.apache.org/r/8221/
          David Robson added a comment -

          This will make all connections use the connection parameters specified rather than just the initial one.

          Jarek Jarcec Cecho added a comment -

          The patch is in: https://git-wip-us.apache.org/repos/asf?p=sqoop.git;a=commit;h=817195ebb0bd54c9a81f1f1960858f0b495e63bd

          Thank you very much for your contribution David!

          Jarcec

          David Robson added a comment -

          Thanks Jarcec. For anyone using the Oracle Connector, I have fixed it there as well: https://questmos.jira.com/browse/ORAOOP-18
          Jan: the Microsoft connector may need to be updated as well; you might want to point them at this issue so they can fix it too.

          Hudson added a comment -

          Integrated in Sqoop-ant-jdk-1.6-hadoop20 #332 (See https://builds.apache.org/job/Sqoop-ant-jdk-1.6-hadoop20/332/)
          SQOOP-382: Connection parameters should be used on the mapper (Revision 817195ebb0bd54c9a81f1f1960858f0b495e63bd)

          Result = FAILURE
          jarcec : https://git-wip-us.apache.org/repos/asf?p=sqoop.git&a=commit&h=817195ebb0bd54c9a81f1f1960858f0b495e63bd
          Files :

          • src/java/org/apache/sqoop/mapreduce/JdbcUpsertExportJob.java
          • src/java/org/apache/sqoop/mapreduce/JdbcUpdateExportJob.java
          • src/test/org/apache/sqoop/mapreduce/db/TestDBConfiguration.java
          • src/java/org/apache/sqoop/mapreduce/DataDrivenImportJob.java
          • src/java/org/apache/sqoop/mapreduce/db/DBConfiguration.java
          • src/test/com/cloudera/sqoop/manager/PGBulkloadManagerManualTest.java
          • src/java/org/apache/sqoop/mapreduce/MySQLDumpImportJob.java
          • src/java/org/apache/sqoop/mapreduce/PGBulkloadExportJob.java
          • src/java/org/apache/sqoop/mapreduce/JdbcExportJob.java
          • src/test/com/cloudera/sqoop/mapreduce/db/TestDataDrivenDBInputFormat.java
          • src/java/org/apache/sqoop/mapreduce/MySQLExportJob.java
          Hudson added a comment -

          Integrated in Sqoop-ant-jdk-1.6-hadoop100 #315 (See https://builds.apache.org/job/Sqoop-ant-jdk-1.6-hadoop100/315/)
          SQOOP-382: Connection parameters should be used on the mapper (Revision 817195ebb0bd54c9a81f1f1960858f0b495e63bd)

          Result = FAILURE
          jarcec : https://git-wip-us.apache.org/repos/asf?p=sqoop.git&a=commit&h=817195ebb0bd54c9a81f1f1960858f0b495e63bd
          Files :

          • src/java/org/apache/sqoop/mapreduce/PGBulkloadExportJob.java
          • src/java/org/apache/sqoop/mapreduce/MySQLExportJob.java
          • src/test/com/cloudera/sqoop/mapreduce/db/TestDataDrivenDBInputFormat.java
          • src/java/org/apache/sqoop/mapreduce/MySQLDumpImportJob.java
          • src/test/org/apache/sqoop/mapreduce/db/TestDBConfiguration.java
          • src/java/org/apache/sqoop/mapreduce/db/DBConfiguration.java
          • src/java/org/apache/sqoop/mapreduce/DataDrivenImportJob.java
          • src/java/org/apache/sqoop/mapreduce/JdbcUpsertExportJob.java
          • src/test/com/cloudera/sqoop/manager/PGBulkloadManagerManualTest.java
          • src/java/org/apache/sqoop/mapreduce/JdbcUpdateExportJob.java
          • src/java/org/apache/sqoop/mapreduce/JdbcExportJob.java
          Hudson added a comment -

          Integrated in Sqoop-ant-jdk-1.6-hadoop23 #494 (See https://builds.apache.org/job/Sqoop-ant-jdk-1.6-hadoop23/494/)
          SQOOP-382: Connection parameters should be used on the mapper (Revision 817195ebb0bd54c9a81f1f1960858f0b495e63bd)

          Result = FAILURE
          jarcec : https://git-wip-us.apache.org/repos/asf?p=sqoop.git&a=commit&h=817195ebb0bd54c9a81f1f1960858f0b495e63bd
          Files :

          • src/java/org/apache/sqoop/mapreduce/DataDrivenImportJob.java
          • src/test/com/cloudera/sqoop/manager/PGBulkloadManagerManualTest.java
          • src/java/org/apache/sqoop/mapreduce/PGBulkloadExportJob.java
          • src/java/org/apache/sqoop/mapreduce/JdbcUpsertExportJob.java
          • src/java/org/apache/sqoop/mapreduce/JdbcExportJob.java
          • src/java/org/apache/sqoop/mapreduce/db/DBConfiguration.java
          • src/test/org/apache/sqoop/mapreduce/db/TestDBConfiguration.java
          • src/java/org/apache/sqoop/mapreduce/JdbcUpdateExportJob.java
          • src/java/org/apache/sqoop/mapreduce/MySQLExportJob.java
          • src/java/org/apache/sqoop/mapreduce/MySQLDumpImportJob.java
          • src/test/com/cloudera/sqoop/mapreduce/db/TestDataDrivenDBInputFormat.java
          Hudson added a comment -

          Integrated in Sqoop-ant-jdk-1.6-hadoop200 #327 (See https://builds.apache.org/job/Sqoop-ant-jdk-1.6-hadoop200/327/)
          SQOOP-382: Connection parameters should be used on the mapper (Revision 817195ebb0bd54c9a81f1f1960858f0b495e63bd)

          Result = SUCCESS
          jarcec : https://git-wip-us.apache.org/repos/asf?p=sqoop.git&a=commit&h=817195ebb0bd54c9a81f1f1960858f0b495e63bd
          Files :

          • src/java/org/apache/sqoop/mapreduce/MySQLExportJob.java
          • src/java/org/apache/sqoop/mapreduce/DataDrivenImportJob.java
          • src/java/org/apache/sqoop/mapreduce/PGBulkloadExportJob.java
          • src/java/org/apache/sqoop/mapreduce/JdbcUpsertExportJob.java
          • src/java/org/apache/sqoop/mapreduce/MySQLDumpImportJob.java
          • src/java/org/apache/sqoop/mapreduce/JdbcUpdateExportJob.java
          • src/java/org/apache/sqoop/mapreduce/db/DBConfiguration.java
          • src/test/com/cloudera/sqoop/mapreduce/db/TestDataDrivenDBInputFormat.java
          • src/test/com/cloudera/sqoop/manager/PGBulkloadManagerManualTest.java
          • src/test/org/apache/sqoop/mapreduce/db/TestDBConfiguration.java
          • src/java/org/apache/sqoop/mapreduce/JdbcExportJob.java
          Jarek Jarcec Cecho added a comment -

          I think that the Jenkins failures are not related to the introduced change, and I'll investigate them.


            People

            • Assignee: David Robson
            • Reporter: David Robson
            • Votes: 1
            • Watchers: 4

              Dates

              • Created:
                Updated:
                Resolved:
