Details
- Type: Sub-task
- Status: Closed
- Priority: Minor
- Resolution: Not A Problem
- Affects Version/s: 3.0.0
- Fix Version/s: None
- Component/s: None
Description
A few classes in our test code extend Hadoop's LocalFileSystem. Scala 2.13 reports a compile error here, not for the Spark code itself, but because (it claims) the Hadoop code illegally overrides appendFile() with slightly different generic types in its return value. That code is evidently valid Java, and it doesn't actually define any generic types, so I even wonder whether this is a scalac bug.
So far I don't see a workaround for this.
This only affects the Hadoop 3.2 build, since it involves a method that is new in Hadoop 3. (There is another instance of a similar problem that affects Hadoop 2, but for that one I can see a tiny hack as a workaround.)
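For illustration, a minimal sketch of the pattern that hits the error: a test helper that simply extends LocalFileSystem and never touches appendFile() itself. The class name below is hypothetical, not one of Spark's actual test classes; under Scala 2.12 this compiles, while under Scala 2.13 against Hadoop 3.2 scalac rejects it because of the inherited appendFile() definitions.

{code:scala}
// Hypothetical test helper, not Spark's actual code: merely extending
// LocalFileSystem is enough to trigger the Scala 2.13 error on the Hadoop 3.2 build.
import org.apache.hadoop.fs.LocalFileSystem

class FakeLocalFileSystem extends LocalFileSystem {
  // No appendFile() override here. scalac 2.13 still reports an illegal
  // override, because the builder types that appendFile() returns in
  // FileSystem and the Hadoop subclass differ slightly in their generics.
}
{code}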