[CAMEL-21400] StackOverflowError when processing files


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Information Provided
    • Affects Version/s: 4.7.0
    • Fix Version/s: 4.10.0
    • Component/s: camel-core
    • Labels: None
    • Estimated Complexity: Unknown
    • Regression

    Description

      Hello everyone,

      I just discovered a StackOverflowError when reading files. Here's the smallest repro case I could find.

      // Create a temp directory with a CSV file
      Path tempDirectory = Files.createTempDirectory("camel-test");
      try (BufferedWriter writer = Files.newBufferedWriter(tempDirectory.resolve("file1.csv"))) {
          writer.write("fieldA,fieldB,fieldC,fieldD\n");
          for (int i = 0; i < 20000; i++) {
              writer.write("fieldA" + i + ",fieldB" + i + ",fieldC" + i + ",fieldD" + i + "\n");
          }
      }
      
      // Seems to fail if the target producer extends DefaultProducer and works if it extends DefaultAsyncProducer
      String target = "file://output"; // this fails
      //String target = "log://speed?groupSize=1000"; // this works
      
      DefaultCamelContext context = new DefaultCamelContext();
      context.addRoutes(new RouteBuilder() {
          @Override
          public void configure() {
              from("file://" + tempDirectory.toAbsolutePath() + "?noop=true").to("direct:read").log("Done!");
              from("direct:read").unmarshal().csv().split(body()).to("direct:agg");
              // aggregate each split row (completionSize(1) completes a group per exchange), join the grouped bodies and send to the target
              from("direct:agg").aggregate(constant("SINGLE_GROUP"), new GroupedExchangeAggregationStrategy())
                      .completionSize(1)
                      .setBody((Exchange exchange) -> {
                          List<Exchange> list = (List<Exchange>) exchange.getMessage().getBody();
                          return list.stream().map(e -> e.getMessage().getBody().toString()).collect(joining("\n"));
                      })
                      .to(target);
          }
      });
      context.start();
      

      As mentioned in the example, it only seems to fail when the target producer in the aggregation extends DefaultProducer and not DefaultAsyncProducer.
      It still fails after converting my component to DefaultAsyncProducer, so that is probably unrelated.
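
      For reference, here is a minimal sketch of the two base classes this refers to, assuming the standard org.apache.camel.support classes; MySyncProducer and MyAsyncProducer are hypothetical names, not part of the attached test.

      import org.apache.camel.AsyncCallback;
      import org.apache.camel.Endpoint;
      import org.apache.camel.Exchange;
      import org.apache.camel.support.DefaultAsyncProducer;
      import org.apache.camel.support.DefaultProducer;

      // Synchronous producer: the routing engine calls process() and waits,
      // so the exchange is handled on the caller's stack.
      class MySyncProducer extends DefaultProducer {
          MySyncProducer(Endpoint endpoint) {
              super(endpoint);
          }

          @Override
          public void process(Exchange exchange) throws Exception {
              // handle the exchange synchronously
          }
      }

      // Asynchronous producer: process() may return before the exchange is done
      // and signal completion through the callback instead.
      class MyAsyncProducer extends DefaultAsyncProducer {
          MyAsyncProducer(Endpoint endpoint) {
              super(endpoint);
          }

          @Override
          public boolean process(Exchange exchange, AsyncCallback callback) {
              // handle the exchange, then signal completion
              callback.done(true); // true = the exchange was completed synchronously in this sketch
              return true;
          }
      }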

      Can you have a look? Thank you

      Attachments

        1. SplitAggregateTest.java (3 kB) by Antoine DESSAIGNE


            People

              Assignee: Claus Ibsen (davsclaus)
              Reporter: Antoine DESSAIGNE (antoine.dessaigne)
              Votes: 0
              Watchers: 2
