Details
- Type: Bug
- Priority: Major
- Status: Resolved
- Resolution: Fixed
Description
It looks like there's one place left in the codebase, SpecificParquetRecordReaderBase, where we don't use SparkHadoopUtil's reflective accesses of TaskAttemptContext methods. This creates problems when a single Spark artifact is used with both Hadoop 1.x and 2.x, since TaskAttemptContext changed from a class in Hadoop 1.x to an interface in 2.x, so bytecode compiled directly against one fails at runtime against the other.
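A minimal sketch of the reflective-access pattern the issue refers to. The `FakeContext` class here is a hypothetical stand-in for Hadoop's TaskAttemptContext (not Hadoop's actual class), used only so the example is self-contained; the point is that looking up `getConfiguration` through reflection resolves the method at runtime, so the same compiled code works whether the method lives on a class (Hadoop 1.x) or an interface (Hadoop 2.x).

```java
import java.lang.reflect.Method;

public class ReflectiveAccess {

    // Hypothetical stand-in for Hadoop's TaskAttemptContext, which is a
    // class in Hadoop 1.x and an interface in 2.x.
    static class FakeContext {
        public String getConfiguration() {
            return "conf";
        }
    }

    // Invoke getConfiguration() reflectively. Because the lookup happens at
    // runtime against the actual class of `context`, the call succeeds
    // regardless of whether the method is declared on a class or an
    // interface in the Hadoop version on the classpath.
    static Object getConfiguration(Object context) throws Exception {
        Method m = context.getClass().getMethod("getConfiguration");
        return m.invoke(context);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(getConfiguration(new FakeContext()));
    }
}
```

A direct call like `context.getConfiguration()` would instead be compiled to an `invokevirtual` or `invokeinterface` instruction depending on the Hadoop version present at build time, which is exactly what breaks a single artifact spanning both versions.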
Issue Links
- relates to SPARK-10330: Use SparkHadoopUtil TaskAttemptContext reflection methods in more places (Resolved)