
Don’t Let Your DataMapper Streaming be Out of Control


Originally authored by Mariano Simone

As Nicolas pointed out in “7 things you didn’t know about DataMapper”, it’s not a trivial task to map a big file to some other data structure without eating up a lot of memory.

If the input is large enough, you might run out of memory: either while mapping, or while processing the data in your flow-ref.

Enabling DataMapper’s “streaming” option makes this a lot easier (and more efficient!).
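In the flow’s XML, that checkbox corresponds to a single attribute on the DataMapper transform. The sketch below is a minimal Mule 3 flow, assuming a DataMapper configuration named `csv_to_maps` and a file inbound endpoint; the `stream` attribute is what the Streaming option in Studio sets:

```xml
<!-- Minimal sketch, assuming a DataMapper config "csv_to_maps";
     stream="true" makes DataMapper emit records incrementally
     instead of materializing the whole result in memory -->
<flow name="streaming-example">
    <file:inbound-endpoint path="/tmp/input" doc:name="File"/>
    <data-mapper:transform config-ref="csv_to_maps" stream="true" doc:name="DataMapper"/>
    <flow-ref name="process-records" doc:name="Process Records"/>
</flow>
```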

(Screenshot: enabling the “Streaming” option in the DataMapper properties.)

But this alone doesn’t let you decide how many records at a time get passed on to the next processor: in the worst-case scenario, you might end up processing just one record at a time. If your next processor is a database, you will issue as many queries as there are lines in your file.

There is, however, a little trick to gain fine-grained control over how many records are processed at once: setting the batchSize property of the foreach scope:

<foreach batchSize="100" doc:name="For Each">
    <logger message="Got a bunch! (of #[payload.size()])" level="INFO" doc:name="Logger"/>
</foreach>
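Putting the two settings together, the streaming DataMapper can feed the foreach scope, and each iteration can issue a single bulk insert instead of one query per line. This is a sketch under stated assumptions: the DataMapper config name (`csv_to_maps`), the database config name (`MySQL_Config`), and the table and column names are all hypothetical placeholders:

```xml
<!-- Sketch: streaming DataMapper + batched foreach + bulk insert.
     With batchSize="100", each iteration's payload is a list of up
     to 100 records, so bulkMode turns it into one batched statement -->
<flow name="csv-to-db">
    <file:inbound-endpoint path="/tmp/input" doc:name="File"/>
    <data-mapper:transform config-ref="csv_to_maps" stream="true" doc:name="DataMapper"/>
    <foreach batchSize="100" doc:name="For Each">
        <db:insert config-ref="MySQL_Config" bulkMode="true" doc:name="Bulk Insert">
            <db:parameterized-query><![CDATA[
                INSERT INTO records (name) VALUES (#[payload.name])
            ]]></db:parameterized-query>
        </db:insert>
    </foreach>
</flow>
```

The design choice here is that memory usage is bounded by the batch size rather than the file size, while the database sees a hundredth of the round trips it would with per-record inserts.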

If you want to see this in action, grab the example app, import it into Anypoint Studio, and start playing around. ;)


Published at DZone with permission.
