Execution panel

Learn how to test your pipelines or capsules using the Execution panel on the Digibee Integration Platform.

This feature was named "Test mode" until June 2023.

The Execution panel allows you to test a pipeline or capsule in the test environment while designing the flow. It uses test values from the Globals, Accounts, Relationship, and Multi-Instance services, enabling you to validate the implementation logic and debug the flow in real time.

You can access the Execution panel from the lower-left corner of the Canvas or by using the Cmd (Ctrl) + D shortcut. The panel is divided into three tabs:

  • Test: Input test data and run the execution.

  • Messages: Review the results of each connector’s execution individually.

  • Logs: View informational, error, and warning logs related to the execution.

Test

The Test tab allows you to input data, execute a test, and review the results.

Parameters column (only for capsules)

The Parameters column appears when running an execution in a capsule. Here, you should input the parameters and account values based on the capsule's configuration. This simulates a real execution, providing the same workflow that future users will experience.

Payload column

The Payload column is where you enter input data for the test. Once the data is entered, you can:

Select a specific instance

When a pipeline is configured as multi-instance, the Multi-instance dropdown is displayed, and you must choose the instance for the test. For more details, see the Multi-Instance documentation.

Save the payload

You can save the entered payload for future use in the pipeline or capsule you are building.

  • Click Save as Payload to store it. Payloads are saved per pipeline or capsule and cannot be accessed across different pipelines or capsules.

  • Once saved, you can access your payloads by clicking Payloads, where you can also select, delete or save additional payloads.

Payloads can’t be edited after saving. If you need to make changes, save the modified payload as a new one.

Format the JSON

For readability, use the magic wand icon in the upper-right corner of the Payload column to automatically format the JSON.
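
For illustration, a small hypothetical test payload (the field names are examples only, not required by the Platform) might look like this after formatting:

    {
      "orderId": "12345",
      "customer": {
        "name": "Jane Doe",
        "email": "jane.doe@example.com"
      },
      "items": [
        { "sku": "A-100", "quantity": 2 },
        { "sku": "B-200", "quantity": 1 }
      ]
    }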

Output column

Once the test is executed, the results are shown in the Output column. You can:

Download or copy the output

At the top-right corner, there are buttons to:

  • Download the JSON output as a file.

  • Copy the output to your clipboard.

Search for JSONPath

At the bottom of the Output column, you can apply a JSONPath expression to filter specific elements from the JSON output.
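
For example, assuming a hypothetical execution output like the one below, a filter expression on the items array would return only the entries that match the condition (exact JSONPath behavior can vary slightly between engines, so treat this as an illustrative sketch):

    Output:
    {
      "status": "SUCCESS",
      "items": [
        { "sku": "A-100", "quantity": 2 },
        { "sku": "B-200", "quantity": 1 }
      ]
    }

    JSONPath: $.items[?(@.quantity > 1)]
    Result:   [ { "sku": "A-100", "quantity": 2 } ]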

Messages

The Messages tab shows the execution result of each connector in the pipeline. Each connector receives the payload from the previous connector, processes it based on its function, and generates a new payload in response.

List of messages

The left column of the Messages tab displays the first 5,000 messages. This list includes:

  • Name: The message name, which corresponds to the Step Name of the connector.

  • Time: The execution time of the connector in milliseconds.

To find a specific message, use the Search for messages field below the list. You can search by the full or partial message name, or by a connector parameter.

Message preview

When you select a message, its preview appears in the right column. In this preview, you can:

Download or copy the message preview

At the top-right corner, there are buttons to:

  • Download the JSON of the message preview as a file.

  • Copy the message preview to your clipboard.

Search for JSONPath

At the bottom of the message preview, you can apply a JSONPath expression to filter specific elements from the JSON message.
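
The expression works the same way as in the Output column. For instance, on a hypothetical message preview that contains a customer object, the expression below would return only the customer's email address:

    JSONPath: $.customer.email
    Result:   "jane.doe@example.com"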

Logs

The Logs tab shows the event logs generated during the execution of a pipeline in the Execution panel.

List of logs

The log list displays the following details:

  • Log type: The classification of the log, indicated by an icon. The available types are:

    • Info: Informational logs.

    • Error: Logs that capture errors during execution.

    • Warning: Logs that indicate warnings.

  • Timestamp: The date and time the step was executed in the pipeline.

  • Log message: The message in each log.

In this tab, you can:

  • Filter logs by type.

  • Copy the message from any log.

  • Search for specific logs by entering part or all of the log message.

Export and import

The Execution panel allows you to export or import an execution.

Export

To download a file with the pipeline configuration and execution data, click Export execution. Ensure that all execution data is fully loaded on the screen before exporting to avoid missing information.

The exported file contains the following data:

  • pipelineId: The ID of the pipeline.

  • pipelineName: The name of the pipeline.

  • currentFlowSpec: The pipeline flow data from Canvas at the time of export, including all connectors and their configurations.

  • executedFlowSpec: The pipeline flow data as it was at the time of execution, including all connectors and their configurations.

  • realm: The realm of the pipeline.

  • execution: The execution data, which includes Payload, Output, Messages, and Logs.

If the pipeline is not executed before exporting, the file will only include pipeline-related information, such as the ID, name, currentFlowSpec, and executedFlowSpec.
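
As a rough sketch, the top-level structure of an exported file might look like the example below. The field names match the list above, but the values are placeholders and the exact nesting of the execution data (Payload, Output, Messages, and Logs) is an assumption for illustration:

    {
      "pipelineId": "...",
      "pipelineName": "...",
      "realm": "...",
      "currentFlowSpec": { },
      "executedFlowSpec": { },
      "execution": {
        "payload": { },
        "output": { },
        "messages": [ ],
        "logs": [ ]
      }
    }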

Import

To upload a file from your computer containing pipeline configuration and execution data, click Import execution.

Once imported, the execution data will appear in the Execution panel. However, the currentFlowSpec and executedFlowSpec will not be displayed in Canvas.

Additionally, you can only run an imported execution if you are in the same pipeline from which it was exported and if the original pipeline flow still exists.

Executing the flow

By executing a flow, you can test and validate your pipeline or capsule to make sure it works as expected. You can execute the entire flow or just part of it, depending on your needs.

How to execute the entire flow

You can execute the entire flow using one of the following methods:

  • Press Cmd (Ctrl) + Enter.

  • Open the Execution panel and click Play.

  • Open the Execution panel, click the dropdown next to Play, and select Run pipeline or Run capsule.

Regardless of any selected connectors, these options will always execute the entire flow.

How to execute part of the flow

To execute a specific part of the flow:

Step 1: Select the connectors

Choose the connectors you want to execute using one of these methods:

  • Hold Shift and drag the mouse over the desired connectors.

  • Hold Shift and click each connector individually.

For a selection to be valid, the chosen connectors must be connected and in sequence. Skipping a connector in the middle of the sequence will invalidate the selection.

Step 2: Execute the selection

After selecting the connectors, you can execute the flow in two ways:

  • Press Cmd (Ctrl) + Shift + Enter.

  • Open the Execution panel, click the dropdown next to Play, and select Run selected steps.

How to execute from a specific step

You can start the pipeline execution from any chosen step, ensuring that all subsequent steps in the flow are executed. To do this, follow these steps:

  1. Click the connector where you want the execution to begin.

  2. Open the Execution panel.

  3. Click the dropdown next to Play and select Run from step. You can also use the shortcut Ctrl + Alt + Enter for Windows or Cmd (⌘) + Option (⌥) + Enter for macOS.

The pipeline is then executed from the selected step and continues with the remaining flow.

Additional information

Below you will find some important details about the Execution panel and useful keyboard shortcuts.

Execution panel inactivity

If you don’t access the Execution panel for 24 hours, it will become inactive. When you reopen the Canvas, it may take up to 3 minutes for the Execution panel to become active again.

Keyboard shortcuts
  • Cmd (Ctrl) + D: Open or close the Execution panel.

  • Cmd (Ctrl) + Enter: Execute the entire flow.

  • Cmd (Ctrl) + Shift + Enter: Execute the selected connectors.

  • Ctrl + Alt + Enter (Windows): Execute from the selected step.

  • Cmd (⌘) + Option (⌥) + Enter (macOS): Execute from the selected step.
