# How to send pipeline logs to external monitoring systems

Logs help you understand what happened during pipeline executions and troubleshoot unexpected behavior. You can view them directly on the [**Pipeline Logs**](https://app.gitbook.com/s/jvO5S91EQURCEhbZOuuZ/development-cycle/dashboards/pipeline-logs) page in Monitor.

However, if you need to monitor logs using external monitoring systems like Elasticsearch, Kibana, Graylog, or Splunk, you can build dedicated flows using the **Log Stream pattern**.

## **Architecture: Log Stream pattern**

The **Log Stream pattern** uses Digibee’s internal **Event Broker** to decouple log transmission from the main flow and avoid overloading or delaying the primary integration execution. This pattern involves splitting your flow into two pipelines:

1. **Business Rule pipeline**: Handles core business logic and generates log events.
2. **Log Stream pipeline**: Listens to events and exports logs to the desired external services.

See how each of these pipelines should be configured below.

{% hint style="info" %}
Before implementing this approach, learn more about [event-driven architecture](https://app.gitbook.com/s/aD6wuPRxnEQEsYpePq36/best-practices/event-oriented-architecture).
{% endhint %}

<figure><img src="https://content.gitbook.com/content/boT4qPJIk6PZotrxlJWL/blobs/rfQ3rm0M7lxCe6u1oCap/Untitled%20design%20(2).png" alt=""><figcaption><p>Diagram: Log Stream Pattern</p></figcaption></figure>

{% stepper %}
{% step %}

### **Create the Business Rule pipeline**

This pipeline defines the main flow logic and generates log events.

Where you would typically place a [**Log** connector](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/tools/log), follow the steps below to improve log management using the Log Stream pattern:

1. #### **(Optional) Store context with Session Management**

Use the [**Session Management** connector](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/structured-data/session-management) (in “put data” mode) to store key values from the pipeline context, such as user info or error details, that you will later retrieve before publishing logs. This ensures all relevant log data is available when triggering the **Log Stream** pipeline.
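For example, a "put data" payload could store the context under field names of your choosing. The fields below are purely illustrative, not a required schema:

```json
{
  "customerId": "{{ message.customerId }}",
  "operation": "customer-onboarding",
  "startedAt": "2025-05-05T12:00:00Z"
}
```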

2. #### **Use Block Execution**

The [**Block Execution** connector](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/logic/block-execution) creates two subflows:

* **OnProcess** (mandatory): The main execution path.
* **OnException**: Triggered only when an error occurs in the OnProcess subflow.

Inside **OnProcess**, add the following:

* A [**Log** connector](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/tools/log)
* A [**Session Management** connector](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/structured-data/session-management) (in “get data” mode) to retrieve previously stored context
* An [**Event Publisher**](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/queues-and-messaging/event-publisher) to trigger the **Log Stream** pipeline

Configure the **Event Publisher** to send the desired log payload, for example:

```json
{
  "timestamp": "2025-05-05T12:00:00Z",
  "pipelineId": "customer-onboarding",
  "executionId": "abc123",
  "message": "Customer onboarding started"
}
```

{% hint style="success" %}
**Tip:** Use variables like `{{ metadata.$ }}` to extract data from the flow and the pipeline, making analysis easier during troubleshooting. Learn more about [referencing metadata with Double Braces](https://docs.digibee.com/documentation/troubleshooting/integration-guides/broken-reference).
{% endhint %}
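As an illustration, the **Event Publisher** payload could reference metadata instead of hard-coded values. The metadata paths below are assumptions for the sake of the example; check the Double Braces metadata reference for the exact fields available in your pipeline:

```json
{
  "timestamp": "{{ metadata.timestamp }}",
  "pipelineName": "{{ metadata.pipeline.name }}",
  "executionId": "{{ metadata.execution.id }}",
  "message": "Customer onboarding started"
}
```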

Inside **OnException**, add a [**Throw Error** connector](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/tools/throw-error) to avoid silent failures.

{% hint style="info" %}
Although the Log Stream pattern is especially useful for capturing and managing error logs, a common best practice in production environments, it also supports a wide range of use cases, such as:

* Successful execution events
* Audit logs
* Custom business events
{% endhint %}

The resulting structure in your **Business Rule** pipeline (where a **Log** connector would normally be used) should look like this:

<figure><img src="https://content.gitbook.com/content/boT4qPJIk6PZotrxlJWL/blobs/gOun2tknDXiS8s1XhGOZ/English.png" alt=""><figcaption><p>Business Rule pipeline example</p></figcaption></figure>
{% endstep %}

{% step %}

### **Set up the Log Stream pipeline**

This pipeline receives and processes log events. Follow these steps to configure it:

1. #### **Configure the trigger**

Set the pipeline’s trigger to [**Event Trigger**](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/triggers/messaging-and-events/event) and configure it to match the event name used in the **Event Publisher** in the **Business Rule** pipeline. This ensures the **Log Stream** pipeline is activated whenever a log event is published.
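As a sketch, if the **Event Publisher** in the **Business Rule** pipeline publishes under an event name such as `log-stream` (a hypothetical name chosen here for illustration), the **Event Trigger** must be configured with the same name:

```json
{
  "eventPublisher": { "eventName": "log-stream" },
  "eventTrigger": { "eventName": "log-stream" }
}
```

This JSON is only a conceptual pairing, not a literal configuration file; each value is set in the respective connector's configuration form.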

2. #### **Forward logs to external tools**

Use connectors like [**REST V2**](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/web-protocols/rest-v2) or [**SOAP V3**](https://app.gitbook.com/s/SKBJ6ZiEWBU93x170HH4/connectors/web-protocols/soap-v3) to send log data to external tools. You can:

* Send logs to multiple destinations
* Route logs conditionally using a [**Choice** connector](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/logic/choice)
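As an illustration, the routing decision made by a **Choice** connector could be expressed as conditions like the ones below. This is a conceptual sketch of the logic, not the connector's exact configuration schema:

```json
[
  { "route": "graylog", "when": "{{ message.level }} equals \"ERROR\"" },
  { "route": "kibana", "when": "otherwise" }
]
```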

In the example below, a **Choice** connector directs logs to either Graylog or Kibana via a **REST V2** connector:

<figure><img src="https://content.gitbook.com/content/boT4qPJIk6PZotrxlJWL/blobs/KVp1m3M72A9szBoojqDr/image.png" alt=""><figcaption></figcaption></figure>

{% hint style="danger" %}
Avoid placing too many **Log** connectors in your pipeline. Doing so can degrade performance and increase the required deployment size.
{% endhint %}
{% endstep %}
{% endstepper %}

## **Additional information**

### **Best practices for external log calls**

* Wrap external calls in a [**Block Execution** connector](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/logic/block-execution).
  * **OnProcess:** Use the [**REST V2**](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/web-protocols/rest-v2) connector and validate responses with a [**Choice**](https://app.gitbook.com/s/SKBJ6ZiEWBU93x170HH4/connectors/logic/choice) connector followed by a [**Throw Error**](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/tools/throw-error), or use an [**Assert**](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/tools/assert-v2) connector.
  * **OnException**: Send alerts for failed deliveries using [**Email V2**](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/web-protocols/email-v2) followed by a [**Throw Error**](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/tools/throw-error).
* Enable retry logic in the [**REST V2**](https://app.gitbook.com/s/EKM2LD3uNAckQgy1OUyZ/connectors/web-protocols/rest-v2) connector’s Advanced Settings.
* Mask or redact any sensitive information (like PII, credentials, tokens).
* Use field-level controls or Capsules to enforce compliance with data protection policies.
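For example, a log payload forwarded to an external tool might have its sensitive fields redacted before sending. The field names and masking style below are illustrative:

```json
{
  "timestamp": "2025-05-05T12:00:00Z",
  "pipelineId": "customer-onboarding",
  "customerEmail": "j***@example.com",
  "authToken": "[REDACTED]",
  "message": "Customer onboarding started"
}
```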

Learn more about [Key practices for securing sensitive information in pipelines with Digibee](https://app.gitbook.com/s/aD6wuPRxnEQEsYpePq36/use-cases/key-practices-for-securing-sensitive-information).

### **Alternative: Use Capsules**

Instead of creating a dedicated Log Stream pipeline, consider using Capsules to centralize and reuse log-handling logic.

Learn more about [using Capsules](https://app.gitbook.com/s/jvO5S91EQURCEhbZOuuZ/development-cycle/build-overview/capsulas).

### **Practice challenge**

Put your knowledge into practice! Try the [Log Stream Pattern Challenge on Digibee Academy](https://digibee.academy/challenge-log-stream/).
