Batch processing

Understand what batch processing is and how it works.
Many integration scenarios require the processing of large volumes of data. Obtaining all the data at once is highly inefficient or even impracticable, and can exhaust your pipeline memory.
When that happens, the best approach is to consume the data in smaller batches, in a continuous and controlled way, which is exactly what the concept of batch processing proposes.
Currently, the Platform has five components that natively support this strategy:
  • Stream DB
  • Stream DB V3
  • Stream Excel
  • Stream File Reader
  • For Each
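The core idea behind all of these components can be sketched in a few lines. The example below is a minimal, illustrative sketch (not part of any Platform API): it consumes an iterable source in fixed-size batches so that only one batch is held in memory at a time.

```python
# Minimal batch-processing sketch. The names `batched` and the
# example source are illustrative assumptions, not a Platform API.
from itertools import islice
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

def batched(rows: Iterable[T], size: int) -> Iterator[List[T]]:
    """Yield the source rows in fixed-size batches, keeping memory bounded."""
    it = iter(rows)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# Usage: process 100,000 records 1,000 at a time instead of all at once.
for batch in batched(range(100_000), 1_000):
    # each batch holds at most 1,000 items; process it, then discard
    pass
```

The same pattern underlies the streaming components listed above: each one fetches a slice of the source (database rows, spreadsheet lines, file records), hands it to the rest of the pipeline, and only then fetches the next slice.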