Batch processing

Learn why the best approach in integration scenarios is to consume data in smaller batches, in a continuous and controlled way.
Many integration scenarios require the processing of large volumes of data. Obtaining all the data at once is highly inefficient or even impractical, and can exhaust your pipeline's memory.
When that happens, the best approach is to consume the data in smaller batches, in a continuous and controlled way, which is exactly what the concept of batch processing proposes.
Currently, the Platform has five components that natively support this strategy:
  • Stream DB
  • Stream DB V3
  • Stream Excel
  • Stream File Reader
  • For Each
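To make the idea concrete, here is a minimal, hypothetical sketch of the batching pattern these components apply: instead of loading an entire dataset into memory, the source is consumed as a stream of fixed-size batches, so only one batch is held in memory at a time. The `batched` helper and batch size below are illustrative assumptions, not the Platform's actual implementation.

```python
from itertools import islice
from typing import Iterable, Iterator, List

def batched(records: Iterable, batch_size: int) -> Iterator[List]:
    """Yield successive fixed-size batches from any iterable.

    Only one batch is materialized in memory at a time, which is
    the core idea behind batch (stream) processing.
    """
    it = iter(records)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# Simulate a large source: 10 records consumed in batches of 3.
processed = []
for batch in batched(range(10), batch_size=3):
    # Per-batch work happens here; memory holds at most 3 records.
    processed.extend(x * 2 for x in batch)

print(processed)
```

In a real pipeline the per-batch work would be a database write, an API call, or a downstream component invocation; the controlled batch size is what keeps memory usage flat regardless of the total volume of data.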