I have a huge XML file on the file system.
I’m splitting it and processing every chunk in parallel.
It works pretty well. However, I need to detect the point at which the end of the file has been reached and the stream has ended, so that I can react to it accordingly.
This is my current route:
<code>
from("file:/data?fileName=huge_xml.xml&noop=true")
    .split().xtokenize("//my_node", 'i', new Namespaces("", "http://www.mynamespace.com/201907/"))
        .parallelProcessing().parallelAggregate()
        .streaming()
    .process(exchange -> System.out.println(exchange.getMessage().getBody()))
    .to("direct:endOFStream");
</code>
But currently direct:endOFStream is invoked for every chunk (I suppose). I want it to be called only once, after all the nodes have been successfully streamed.
What am I missing here?
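For clarity, this is roughly the shape I imagine the route should have. It is only a sketch; I am not sure whether closing the split block with .end() is actually the right way to get a single end-of-stream notification per file:
<code>
from("file:/data?fileName=huge_xml.xml&noop=true")
    .split().xtokenize("//my_node", 'i', new Namespaces("", "http://www.mynamespace.com/201907/"))
        .parallelProcessing().parallelAggregate()
        .streaming()
        // per-chunk work stays inside the split block
        .process(exchange -> System.out.println(exchange.getMessage().getBody()))
    .end()                      // my guess: close the split block here
    .to("direct:endOFStream");  // hopefully called once, after all chunks are done
</code>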