Stream CSV Reader (Beta)
Discover more about the Stream CSV Reader connector and how to use it on the Digibee Integration Platform.
Stream CSV Reader reads a local CSV file row by row, delivering each line as a JSON structure, and triggers subflows to process each one. This resource is recommended for scenarios in which large files need to be processed efficiently and at scale.
Parameters
Take a look at the configuration parameters for the connector. Parameters supported by Double Braces expressions are marked with (DB).
General tab
| Parameter | Description | Default | Type |
| --- | --- | --- | --- |
| File Name (DB) | Name of the local CSV file to read. | data.csv | String |
| Charset | Character encoding used to read the file. | UTF-8 | String |
| Headers | Custom headers to replace the file's original header (comma-separated). | A,B,C | String |
| Delimiter | Character used to separate values in the CSV file. | , | String |
| Parallel Execution Of Each Iteration | If enabled, each line is processed in parallel. | False | Boolean |
| Ignore Invalid Charset | If enabled, invalid charset characters are ignored. | False | Boolean |
| Ignore Header | If enabled, skips the first line (header) of the file. | False | Boolean |
| Keep Header | If enabled, the header line is also processed as data. | False | Boolean |
| Advanced | Enables advanced parameters. | False | Boolean |
| Metadata Only | If enabled, returns only file metadata (row count and file size) without processing the data. | False | Boolean |
| Limit (DB) | Maximum number of rows to read from the file. A value of 0 means no limit. | 0 | Integer |
| Fail On Error | If enabled, interrupts the pipeline execution when an error occurs. If disabled, execution continues, but the "success" property is set to false. | False | Boolean |
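To make the main General-tab options concrete, the sketch below emulates the reader's behavior in Python. This is an illustration only, not the Platform's implementation; the exact interaction between Headers and the file's original header row is an assumption (the custom names simply replace the header row here).

```python
import csv
import io


def stream_csv_rows(text, delimiter=",", headers=None, limit=0):
    """Yield CSV rows as dicts, mimicking the connector's options (sketch).

    headers: comma-separated string replacing the file's original header.
    limit: maximum number of rows to read (0 means no limit).
    """
    reader = csv.reader(io.StringIO(text), delimiter=delimiter)
    rows = list(reader)
    if not rows:
        return
    # Custom headers replace the file's first (header) line.
    field_names = headers.split(",") if headers else rows[0]
    data = rows[1:]
    if limit:
        data = data[:limit]
    for row in data:
        yield dict(zip(field_names, row))
```

For example, with `headers="A,B"` and `limit=2`, only the first two data rows are emitted, keyed by the custom column names.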
Documentation tab
| Parameter | Description | Default | Type |
| --- | --- | --- | --- |
| Documentation | Optional field to describe the connector configuration and any relevant business rules. | N/A | String |
Messages flow
Input
The connector expects an input message in the following format:

```json
{
  "filename": "fileName"
}
```
Output
```json
{
  "total": 0,
  "success": 0,
  "failed": 0
}
```
- `total`: total number of processed rows.
- `success`: number of rows processed successfully.
- `failed`: number of rows whose processing failed.

A row counts as successfully processed only if its subflow returns `{ "success": true }` for that row.
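The accounting above can be sketched as follows: each per-line result counts toward `success` only when it contains `"success": true`, and everything else counts as failed. This is illustrative code, not the Platform's internals.

```python
def summarize(results):
    """Aggregate per-line subflow results into the connector's output shape."""
    success = sum(1 for r in results if r.get("success") is True)
    return {
        "total": len(results),
        "success": success,
        "failed": len(results) - success,
    }
```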
Additional information
- The connector throws an exception if the file specified in File Name doesn't exist or can't be read.
- File manipulation inside a pipeline happens in a protected way: all files are accessed through a temporary directory only, and each pipeline key grants access to its own set of files.
- This connector performs batch processing, which means it processes the data continuously and in a controlled manner, in smaller batches.
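Batch processing here means consuming the file in small, fixed-size slices instead of loading everything into memory at once. A rough illustration of the idea (the batch size is an assumption, not a documented connector setting):

```python
from itertools import islice


def batches(iterable, size):
    """Yield lists of up to `size` items, consuming the iterable lazily."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch
```

Because the source is consumed lazily, only one batch of rows needs to be held in memory at a time, which is what makes large files tractable.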