Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Won't Fix
- Affects Version/s: 0.14.3
- Fix Version/s: None
- Component/s: None
- Labels: None
Description
It looks like calling close() multiple times while reading files from DFS fails in Hadoop 0.14. This was somehow not caught in Hadoop 0.13.
The use case was to open a file on DFS as shown below:
<code>
FSDataInputStream fSDataInputStream = fileSystem.open(new Path(propertyFileName));
Properties subProperties = new Properties();
subProperties.loadFromXML(fSDataInputStream);
fSDataInputStream.close();
</code>
This failed with an IOException:
<exception>
java.io.IOException: Stream closed
</exception>
The stack trace shows the stream is being closed twice. This used to work in Hadoop 0.13, which silently ignored the second close().
Attached to this JIRA is a text file with the stack traces for both Hadoop 0.13 and Hadoop 0.14.
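The double close arises because Properties.loadFromXML is documented to close its input stream before returning, so the explicit close() afterwards is a second close. The snippet below is a self-contained sketch that reproduces the failure with plain JDK classes; StrictCloseStream is a hypothetical stand-in mimicking the Hadoop 0.14 DFSInputStream behavior, not actual Hadoop code:

```java
import java.io.ByteArrayInputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

public class DoubleCloseDemo {
    // Hypothetical stand-in for the 0.14 DFS stream: throws if close()
    // is called when the stream is already closed.
    static class StrictCloseStream extends FilterInputStream {
        private boolean closed = false;

        StrictCloseStream(InputStream in) {
            super(in);
        }

        @Override
        public void close() throws IOException {
            if (closed) {
                throw new IOException("Stream closed");
            }
            closed = true;
            super.close();
        }
    }

    public static void main(String[] args) throws Exception {
        String xml = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
                + "<!DOCTYPE properties SYSTEM \"http://java.sun.com/dtd/properties.dtd\">"
                + "<properties><entry key=\"k\">v</entry></properties>";
        StrictCloseStream in = new StrictCloseStream(
                new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        Properties subProperties = new Properties();
        // loadFromXML closes the stream itself before returning (per its javadoc).
        subProperties.loadFromXML(in);
        try {
            in.close(); // second close -> IOException, as on 0.14 DFS
            System.out.println("no exception");
        } catch (IOException e) {
            System.out.println("IOException: " + e.getMessage());
        }
    }
}
```

Running this prints `IOException: Stream closed`, matching the report: the explicit close() is the second close, not the first.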
How should this be handled from a user's point of view?
Thanks
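From the user's point of view, one defensive option (a sketch, not an official Hadoop recommendation) is to skip the explicit close() entirely, since Properties.loadFromXML already closes the stream, or to guard the close with a small helper; the closeQuietly helper below is hypothetical, not part of the Hadoop API:

```java
import java.io.Closeable;
import java.io.IOException;

public final class IOUtil {
    private IOUtil() {
    }

    // Hypothetical helper: close a stream, swallowing the IOException that a
    // redundant close() raises on Hadoop 0.14 DFS streams.
    public static void closeQuietly(Closeable c) {
        if (c == null) {
            return;
        }
        try {
            c.close();
        } catch (IOException ignored) {
            // already closed (or close failed); safe to ignore in this context
        }
    }
}
```

With this, the original snippet's trailing `fSDataInputStream.close();` becomes `IOUtil.closeQuietly(fSDataInputStream);` and no longer fails on 0.14.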
Attachments
Issue Links
- is duplicated by HADOOP-2998: Calling DFSClient.close() should not throw IOException when it is already closed. (Closed)