How to send pipeline logs to external monitoring systems
Learn how to export integration pipeline logs to external monitoring systems outside the Digibee Integration Platform.
Logs help you understand what happened during pipeline executions and troubleshoot unexpected behavior. You can view them directly in Monitor.
However, if you need to monitor logs using external monitoring systems such as Elasticsearch, Kibana, Graylog, or Splunk, you can build dedicated flows using the Log Stream pattern.
The Log Stream pattern uses Digibee’s internal Event Broker to decouple log transmission from the main flow and avoid overloading or delaying the primary integration execution. This pattern involves splitting your flow into two pipelines:
Business Rule pipeline: Handles core business logic and generates log events.
Log Stream pipeline: Listens to events and exports logs to the desired external services.
See how each of these pipelines should be configured below.
Business Rule pipeline
This pipeline defines the main flow logic and generates log events. At the point where a Log connector would normally be used, the log handling is built around two subflows:
OnProcess (mandatory): The main execution path.
OnException: Triggered only when an error occurs in the OnProcess subflow.
Inside OnProcess, add an Event Publisher connector and configure it to send the desired log payload, for example:
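All field names and values below are illustrative assumptions rather than a required schema; you can build the payload from the pipeline message and metadata (for example, with Double Braces expressions such as {{ metadata.$ }}):

```
{
  "logLevel": "ERROR",
  "message": "Order validation failed",
  "pipeline": "order-processing-pipeline",
  "timestamp": "2024-05-20T14:32:10Z",
  "details": {
    "orderId": "A-9876",
    "step": "payment-authorization"
  }
}
```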
In your Business Rule pipeline, this structure takes the place where a Log connector would normally be used.
Log Stream pipeline
This pipeline listens for the published log events and exports them to the desired external services. The step-by-step configuration of both pipelines is described further below.
Send logs to multiple destinations
For example, a Choice connector can direct logs to either Graylog or Kibana via a REST V2 connector.
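If Graylog is the destination, the REST V2 connector can POST the log to a Graylog GELF HTTP input (by default exposed at http://<graylog-host>:12201/gelf). A minimal sketch of a GELF 1.1 payload, with illustrative values and custom fields prefixed with an underscore as GELF requires:

```
{
  "version": "1.1",
  "host": "digibee-log-stream",
  "short_message": "Order validation failed",
  "level": 3,
  "_pipeline": "order-processing-pipeline",
  "_orderId": "A-9876"
}
```

For Kibana, logs are typically indexed into Elasticsearch first; a sketch of that call appears after the configuration steps below.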
Best practices
Avoid placing too many Log connectors in your pipeline: it can degrade performance and increase the required deployment size.
Mask or redact any sensitive information, such as PII, credentials, and tokens (see the example after this list).
Use field-level controls or Capsules to enforce compliance with data protection policies.
Instead of creating a dedicated Log Stream pipeline, consider using Capsules to centralize and reuse log-handling logic.
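As an illustration of the masking recommendation above, sensitive values can be replaced or truncated before the log event is published. The structure and masking style below are only an example:

```
{
  "logLevel": "ERROR",
  "message": "Payment authorization failed",
  "customer": {
    "email": "j***@example.com",
    "document": "***.***.***-42"
  },
  "authorizationToken": "[REDACTED]"
}
```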
Configure the Business Rule pipeline
Where you would typically place a Log connector, follow the steps below to improve log management using the Log Stream pattern:
Use a Session Management connector (in “put data” mode) to store key values from the pipeline context, such as user information or error details, that you will later retrieve before publishing logs. This ensures all relevant log data is available when triggering the Log Stream pipeline (see the example after these steps).
Add a connector that creates two subflows, OnProcess and OnException, as described above. Inside OnProcess, add:
A Session Management connector (in “get data” mode) to retrieve the previously stored context
An Event Publisher connector to trigger the Log Stream pipeline, configured with the log payload shown earlier
Tip: Use variables like {{ metadata.$ }} to extract data from the flow and the pipeline and make analysis easier during troubleshooting. Learn more about Double Braces.
Inside OnException, add a Log connector to avoid silent failures.
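As a sketch of the context handling above: the data stored with the Session Management connector in “put data” mode and later retrieved in “get data” mode is simply a JSON object you define. The keys below, and the Double Braces paths used to fill them, are illustrative assumptions:

```
{
  "transactionId": "{{ message.transactionId }}",
  "customerId": "{{ message.customerId }}",
  "currentStep": "payment-authorization",
  "errorDetail": null
}
```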
Configure the Log Stream pipeline
Set the pipeline’s trigger to Event Trigger and configure it to match the event name used in the Event Publisher in the Business Rule pipeline. This ensures the Log Stream pipeline is activated whenever a log event is published.
Use connectors such as REST V2 to send log data to the external tools (see the sketch after these steps). You can:
Route logs conditionally using a Choice connector.
Wrap external calls in a connector that provides OnProcess and OnException subflows:
OnProcess: Use the REST V2 connector to call the external service and validate its response before completing the flow.
OnException: Send an alert about the failed delivery so that errors are not lost silently.
Enable retry logic in the connector’s Advanced Settings.
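As a sketch of a destination call: if Kibana is used on top of Elasticsearch, the REST V2 connector can be configured to index each log event with an HTTP request along these lines (host, index name, and authentication are illustrative assumptions, and the JSON below only summarizes the request rather than showing an actual connector configuration file):

```
{
  "method": "POST",
  "url": "https://elasticsearch.example.com:9200/pipeline-logs/_doc",
  "headers": { "Content-Type": "application/json" },
  "body": {
    "@timestamp": "2024-05-20T14:32:10Z",
    "logLevel": "ERROR",
    "message": "Order validation failed",
    "pipeline": "order-processing-pipeline"
  }
}
```

Enabling retry in the connector and handling failures in OnException, as described above, keeps log delivery from failing silently when the destination is temporarily unavailable.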
Put your knowledge into practice by building your own Log Stream flow.