I have found a bug when importing a table from an Oracle database using the Oracle connector for Hadoop (OraOop).
The sqoop command looks like this:
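(The original command was not attached to this report; the following is a hypothetical sketch of such an import. The connection string, credentials, table name, and paths are placeholders, not values from the original report.)

```shell
# Hypothetical reproduction: import an Oracle table containing a DATE
# column as Avro, with OraOop mapping timestamps to strings.
# All identifiers below are placeholders.
sqoop import \
  -Doraoop.timestamp.string=true \
  --direct \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username SCOTT \
  --password-file /user/scott/.password \
  --table SCOTT.MY_TABLE \
  --as-avrodatafile \
  --target-dir /data/my_table
```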
The generated Avro schema for the DATE field is [null, long], which is why an error is thrown when Sqoop tries to write the date string to that field. The workaround for this problem is to add an explicit mapping for the column:
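(A hypothetical sketch of the workaround using Sqoop's --map-column-java option; the column name MY_DATE_COL and the other identifiers are placeholders.)

```shell
# Hypothetical workaround: force the DATE column to a Java String so the
# generated Avro schema for it becomes [null, string].
# MY_DATE_COL and the connection details are placeholders.
sqoop import \
  -Doraoop.timestamp.string=true \
  --direct \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username SCOTT \
  --password-file /user/scott/.password \
  --table SCOTT.MY_TABLE \
  --map-column-java MY_DATE_COL=String \
  --as-avrodatafile \
  --target-dir /data/my_table
```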
But since I want to build a general solution, adding a mapping for each DATE column is not an option.
The generated Avro schema for DATE columns should be [null, string] when using the Oracle connector for Hadoop with oraoop.timestamp.string=true.