Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 0.8.2, 0.8.3, trunk
- Labels: None
Description
Steps to Reproduce
- Set up Atlas with very large data; at least one entity should be about 300 MB in size.
- Perform an export with parameters that include the large entity.
The following error is encountered:
java.lang.OutOfMemoryError: Requested array size exceeds VM limit
    at java.lang.StringCoding$StringEncoder.encode(StringCoding.java:300)
    at java.lang.StringCoding.encode(StringCoding.java:344)
    at java.lang.StringCoding.encode(StringCoding.java:387)
    at java.lang.String.getBytes(String.java:958)
    at org.apache.atlas.repository.impexp.ZipSink.addToZipStream(ZipSink.java:106)
    at org.apache.atlas.repository.impexp.ZipSink.saveToZip(ZipSink.java:95)
    at org.apache.atlas.repository.impexp.ZipSink.add(ZipSink.java:55)
    at org.apache.atlas.repository.impexp.ExportService.addEntity(ExportService.java:467)
Additional Information
While converting the entity string to bytes, a known JDK limitation is hit: String.getBytes() allocates a single byte array sized for the encoded output, and for an entity string of roughly 300 MB the requested array size exceeds the limit the VM allows, producing the OutOfMemoryError above.
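One way to avoid the single large allocation is to stream the entity string into the zip entry in chunks through a Writer instead of calling String.getBytes(). The sketch below is illustrative only; the class and method names (StreamingZipSink, addToZipStream) are hypothetical and this is not necessarily the fix that was applied in Atlas:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.charset.StandardCharsets;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class StreamingZipSink {

    private static final int CHUNK_SIZE = 8192;

    // Hypothetical replacement for ZipSink.addToZipStream: writes the entity
    // JSON to the zip entry in small chunks through a Writer, so the encoder
    // never needs one byte[] large enough for the whole string (which is what
    // String.getBytes() requests, triggering the OutOfMemoryError).
    static void addToZipStream(ZipOutputStream zip, String entryName, String json) throws IOException {
        zip.putNextEntry(new ZipEntry(entryName));
        Writer writer = new OutputStreamWriter(zip, StandardCharsets.UTF_8);
        for (int i = 0; i < json.length(); i += CHUNK_SIZE) {
            int len = Math.min(CHUNK_SIZE, json.length() - i);
            writer.write(json, i, len);  // encodes at most CHUNK_SIZE chars at a time
        }
        writer.flush();   // flush the encoder's buffer, but do NOT close the Writer:
                          // closing it would close the underlying ZipOutputStream
        zip.closeEntry();
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ZipOutputStream zip = new ZipOutputStream(bos)) {
            addToZipStream(zip, "entity.json", "{\"guid\":\"abc\"}");
        }
        System.out.println("zip bytes written: " + bos.size());
    }
}
```

Because each write touches at most CHUNK_SIZE characters, peak allocation is bounded by the chunk size regardless of how large the entity string is.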