Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Information Provided
- Affects Version/s: 4.7.0
- Fix Version/s: None
- Component/s: Unknown
- Labels: Regression
Description
Hello everyone,
I just discovered a StackOverflowError when reading files. Here's the smallest repro case I could find.
import java.io.BufferedWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import static java.util.stream.Collectors.joining;

import org.apache.camel.Exchange;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;
import org.apache.camel.processor.aggregate.GroupedExchangeAggregationStrategy;

// Create a temp directory with a CSV file
Path tempDirectory = Files.createTempDirectory("camel-test");
try (BufferedWriter writer = Files.newBufferedWriter(tempDirectory.resolve("file1.csv"))) {
    writer.write("fieldA,fieldB,fieldC,fieldD\n");
    for (int i = 0; i < 20000; i++) {
        writer.write("fieldA" + i + ",fieldB" + i + ",fieldC" + i + ",fieldD" + i + "\n");
    }
}

// Seems to fail if the target producer extends DefaultProducer
// and works if it extends DefaultAsyncProducer
String target = "file://output"; // this fails
//String target = "log://speed?groupSize=1000"; // this works

DefaultCamelContext context = new DefaultCamelContext();
context.addRoutes(new RouteBuilder() {
    @Override
    public void configure() {
        from("file://" + tempDirectory.toAbsolutePath() + "?noop=true")
            .to("direct:read")
            .log("Done!");

        from("direct:read").unmarshal().csv().split(body()).to("direct:agg");

        from("direct:agg")
            .aggregate(constant("SINGLE_GROUP"), new GroupedExchangeAggregationStrategy())
            .completionSize(1)
            .setBody((Exchange exchange) -> {
                List<Exchange> list = (List<Exchange>) exchange.getMessage().getBody();
                return list.stream()
                        .map(e -> e.getMessage().getBody().toString())
                        .collect(joining("\n"));
            })
            .to(target);
    }
});
context.start();
As mentioned in the example, it only seems to fail when the producer behind the aggregation's target endpoint extends DefaultProducer rather than DefaultAsyncProducer.
It still fails after converting my component to DefaultAsyncProducer, so that alone is not the cause.
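The mechanism suspected here can be illustrated outside of Camel. This is a plain-Java sketch, not the Camel API: the class and method names below are illustrative only. When each completed unit of work synchronously invokes the next one (as a DefaultProducer-style callback chain can), the call stack grows with the number of split lines; an async-style producer instead returns control to a driving loop ("trampolining"), so stack depth stays constant.

```java
public class Trampoline {
    // One "step" of work; returns the next step, or null when the chain is done.
    interface Step {
        Step run();
    }

    // Synchronous style: completing one unit of work directly invokes the next,
    // so stack depth grows linearly with the number of chained units.
    static void runSync(int remaining) {
        if (remaining > 0) {
            runSync(remaining - 1); // recursion stands in for "callback calls next processor"
        }
    }

    // Trampolined style: each step returns instead of calling onward, and a
    // plain loop drives the chain, so stack depth stays constant.
    static int runTrampolined(int n) {
        final int[] processed = {0};
        Step step = new Step() {
            int remaining = n;
            public Step run() {
                processed[0]++;
                return --remaining > 0 ? this : null;
            }
        };
        while (step != null) {
            step = step.run();
        }
        return processed[0];
    }

    public static void main(String[] args) {
        // Handles a million chained steps with a flat stack.
        System.out.println("trampolined: " + runTrampolined(1_000_000));
        // Ten million recursive frames overflows a default-sized JVM stack.
        try {
            runSync(10_000_000);
            System.out.println("sync: completed");
        } catch (StackOverflowError e) {
            System.out.println("sync: StackOverflowError");
        }
    }
}
```

With 20,000 CSV lines feeding an aggregate with completionSize(1), a synchronous chain of that shape would plausibly exceed the default thread stack, which matches the observed StackOverflowError.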
Can you have a look? Thank you
Attachments
Issue Links
- relates to: CAMEL-21494 camel-file - Producer should be AsyncProducer based (Resolved)