Stream File Reader
Discover more about the Stream File Reader component and how to use it on the Digibee Integration Platform.
Stream File Reader reads a local file into a JSON structure (currently only CSV is supported) and triggers subpipelines to process each message. It is recommended for large files.
Parameters
Take a look at the configuration parameters of the component. Parameters supported by Double Braces expressions are marked with (DB).
Parameter | Description | Default value | Data type |
---|---|---|---|
File Name | File name or full file path (e.g. tmp/processed/file.txt) of the local file. | data.csv | String |
Charset | Name of the character encoding used to read the file. | UTF-8 | String |
Element Identifier | Attribute to be sent in case of errors. | data | String |
Parallel Execution Of Each Iteration | If activated, each iteration of the loop is executed in parallel. | False | Boolean |
Ignore Invalid Charset | If activated, an invalid charset configured in the component is ignored when reading the received file. | False | Boolean |
Fail On Error | If activated, the pipeline execution is interrupted in case of error; otherwise, the execution proceeds, but the output reports the "success" property as false. | False | Boolean |
Advanced | Enables the advanced parameters below. | False | Boolean |
Skip | Number of lines to skip at the beginning of the file before reading. | N/A | Integer |
Limit | Maximum number of lines to be read. | N/A | Integer |
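To make the streaming semantics concrete, here is a minimal sketch in Python of how the component behaves conceptually. This is an illustration, not the Platform's implementation: the function name, handler signature, and error handling are assumptions, standing in for the subpipeline invocation and the Skip, Limit, and Fail On Error parameters described above.

```python
import csv
import io

def stream_file_reader(text, process_line, skip=0, limit=None, fail_on_error=False):
    """Illustrative sketch of the component's semantics (not Digibee's code):
    stream CSV lines, trigger a handler per line, and count the results."""
    total = success = failed = 0
    reader = csv.reader(io.StringIO(text))
    for index, row in enumerate(reader):
        if index < skip:                          # "Skip": lines ignored at the start
            continue
        if limit is not None and total >= limit:  # "Limit": maximum lines to read
            break
        total += 1
        try:
            result = process_line(row)            # stands in for the subpipeline call
            if result.get("success"):
                success += 1
            else:
                failed += 1
        except Exception:
            if fail_on_error:                     # "Fail On Error": stop on error
                raise
            failed += 1
    return {"total": total, "success": success, "failed": failed}
```

For example, `stream_file_reader("a,1\nb,2\nc,3\n", lambda row: {"success": True}, skip=1)` skips the first line and processes the remaining two.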
Messages flow
Input
The component waits for a message in the following format:
If provided, the Local File Name in the input message overrides the file configured in the File Name parameter.
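As an illustration, an input message overriding the default local file could look like the fragment below. The field name `localFileName` is an assumption here; check the Platform for the exact key.

```json
{
    "localFileName": "tmp/processed/file.csv"
}
```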
Output
total: total number of processed lines.
success: total number of lines successfully processed.
failed: total number of lines whose processing failed.
A line is considered successfully processed only if its subpipeline returns { "success": true }.
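For example, a run over three lines in which one line failed would produce an output like (values illustrative):

```json
{
    "total": 3,
    "success": 2,
    "failed": 1
}
```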
The component throws an exception if the File Name doesn't exist or can't be read.
File manipulation inside a pipeline happens in a protected way: all files are accessed through a temporary directory only, where each pipeline key gives access to its own set of files.
This component performs batch processing. To better understand the concept, read the article about Batch processing.