Sqoop (Retired) · SQOOP-3290

Not able to import from SQL Server on HDFS Parquet (Date columns)


Details

    • Type: Test
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved

    Description

      Hi,

      I am able to import data from SQL Server to HDFS in Parquet format, but the script fails when there is a date, datetime, or datetime2 column in the SQL Server table.

      I tried to use the --map-column-java option as shown below:

      sqoop import --connect "jdbc:jtds:sqlserver://<<Servername>>;useNTLMv2=true;domain=APNET;databaseName=<<DB Name>>" \
      --connection-manager org.apache.sqoop.manager.SQLServerManager \
      --driver net.sourceforge.jtds.jdbc.Driver \
      --username *** --password *** \
      --table T1 --split-by C1 \
      --target-dir /user/user1/bkp1/11/ \
      --map-column-java D1=timestamp \
      --as-parquetfile \
      -- --schema=sch1

       

      The script completes successfully, but when I load the file into a Hive Parquet table, the date column comes back as NULL. Am I doing something wrong, or is this a known Sqoop bug in SQL type conversion? Please advise.
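      A commonly suggested workaround (a sketch only, not a fix confirmed in this issue): Sqoop's Parquet import tends to store date/timestamp values as epoch longs, which Hive reads as NULL unless the Hive column type matches. Mapping the problem columns to Java String makes Sqoop write them as plain text, readable as a Hive STRING column. The connection string, table, and column names below reuse the reporter's placeholders; note that --map-column-java expects a Java type name such as String, not the lowercase "timestamp" used above.

      ```shell
      # Sketch of a workaround: map the date/datetime columns to String so the
      # Parquet file stores readable text instead of epoch longs.
      # <<Servername>>, <<DB Name>>, T1, C1, D1 are the reporter's placeholders.
      sqoop import \
        --connect "jdbc:jtds:sqlserver://<<Servername>>;useNTLMv2=true;domain=APNET;databaseName=<<DB Name>>" \
        --connection-manager org.apache.sqoop.manager.SQLServerManager \
        --driver net.sourceforge.jtds.jdbc.Driver \
        --username '***' --password '***' \
        --table T1 --split-by C1 \
        --target-dir /user/user1/bkp1/11/ \
        --map-column-java D1=String \
        --as-parquetfile \
        -- --schema=sch1
      ```

      Alternatively, leaving the import unchanged and declaring the Hive column as BIGINT exposes the epoch value, which can then be converted with Hive's date functions.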


          People

            Assignee: Unassigned
            Reporter: ashishprem (Ashish Kumar Sinha)
