
Test Cases

Discover how to build effective test scenarios for different integration cases.


A Test Case is a configuration that includes a defined set of inputs, conditions, and expected outcomes used to verify that a specific integration pipeline is working as intended. Test cases are especially useful for ensuring the quality of your integrations and documenting the expected behavior of your flows.

They allow you to:

  • Simulate various input conditions.

  • Mock connector responses.

  • Automatically compare actual and expected results.

Creating the Test Case

  1. In the Canvas sidebar, go to Flow.

  2. Click Create a new test case.

Define the conditions

When creating a test case, you must configure the elements that define the test scenario:

Payload

The payload represents the input data for your test. When adding a payload, provide the following:

  • Name: A name for the payload.

  • Description: A short description explaining the context or purpose of the payload.

  • JSON: The input data in JSON format. You can also reuse payloads previously saved in the Execution panel by clicking Load to edit.
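
As a quick illustration, a payload for a customer flow might look like this (the structure and field names are just examples):

{
  "customer": {
    "name": "Jane Doe"
  }
}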

Mock Response

Mock responses let you simulate connector outputs, which is helpful when validating flows under different conditions. When adding a mock response, provide the following:

  • Connector: The connector whose response you want to mock.

  • JSON: The simulated response, in JSON format.

  • Name: A name for the mock response.

  • Description: A brief explanation of the scenario represented by this mock.

To simulate more connectors, click Add more mocks.
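
For instance, to validate a flow without calling a real database, you could mock a database connector with a response such as the following (illustrative values; the actual shape depends on the connector you're mocking):

{
  "product": {
    "id": "12345",
    "stock": 25
  }
}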

Expected result

The expected result defines what the output of the flow should be after execution. It’s used to compare the actual result with the expected one.

Provide the following:

  • Operator: The comparison operator to be used. Currently the only available option is Equals.

  • JSONPath: The JSONPath expression pointing to the result field.

  • Value: The expected value, in JSON format.
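
For example, to assert that a flow returns an active customer status (a hypothetical field), the configuration could look like this (illustrative values):

  • Operator: Equals

  • JSONPath: $.customer.status

  • Value: "active"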

Document the test

Before saving the test case, make sure to provide the following information:

  • Name: A unique name to identify the test case later.

  • Description: A brief explanation of the test case. Use this field to document the purpose or specific scenario being tested.

Save the test

Once all required information has been provided, click Save.

The test case must contain at least the name and the expected result to be saved.

The test case will then appear in the left sidebar under the Flow section. From there, you can click the three-dots icon to:

  • Edit: Modify the test case settings.

  • Remove: Delete the test case.

  • Duplicate: Make a copy of the test case.

  • Single Run: Execute the test case.

  • Open execution: View the execution details in the Execution panel (available only after the test case has been executed).

Running the Test Case

To execute the test case:

  1. Ensure the flow is connected to a trigger. If the flow isn’t connected, the test case won’t run.

  2. In the sidebar, locate the test case and click the three-dots icon.

  3. Select Single Run to start the execution.

  4. The test will run, and the platform will compare the actual output with the expected result:

  • If it passes: A green success icon appears next to the test case name.

  • If it fails: A red failure icon appears next to the test case name.

  5. To view the execution details in the Execution panel, click Open execution. More information can be found on the Results tab.

Example: Add default birth date

  • Use case: You want to ensure that a default birth date is added when it's missing from the input JSON.

  • Objective: Use the Transformer (JOLT) V2 to add a default "birthDate": "01/01/1970" field when it's not present in the input.

Pipeline overview

This pipeline uses a single connector:

  • Transformer (JOLT) V2: Adds a default birth date if it’s missing from the input payload.

We’ll test this transformation using a test case with a sample input that lacks the birthDate field.

Step 1: Configure the Transformer (JOLT) V2

We’ll use the default operation to add a field when it doesn't already exist.

JOLT specification:

[
  {
    "operation": "default",
    "spec": {
      "customer": {
        "birthDate": "01/01/1970"
      }
    }
  }
]

This spec checks whether birthDate exists under customer. If not, it adds the field with the default value "01/01/1970".
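
Worth noting: the default operation only adds birthDate when it's absent. Given an illustrative input that already carries the field, such as the one below, the spec leaves the existing value untouched and the output matches the input:

{
  "customer": {
    "name": "Customer Default",
    "birthDate": "05/10/1990"
  }
}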

Step 2: Create the Test Case

We'll simulate a payload that contains only the name and SSN, and verify that the Transformer (JOLT) V2 connector adds the default birth date.

  • Payload:

{
  "customer": {
    "name": "Customer Default",
    "ssn": "123-45-6789"
  }
}
  • Mock Response: (Leave empty for this example)

  • Expected result:

    • JSONPath: $.customer

    • Value:

{
  "name": "Customer Default",
  "ssn": "123-45-6789",
  "birthDate": "01/01/1970"
}

Result validation

After running the test case, the output is as follows:

{
  "customer": {
    "name": "Customer Default",
    "ssn": "123-45-6789",
    "birthDate": "01/01/1970"
  }
}

This confirms that the test passed:

  • The Transformer (JOLT) V2 applied the default operation as expected.

  • Since the original input didn’t include a birthDate, the field was added with the default value "01/01/1970".

Example: Organize product catalog

  • Use case: You want to restructure product data returned from a database into a new format.

  • Objective: Group product details under a details object and rename fields for better organization.

Pipeline overview

This pipeline uses the following connectors:

  • DB V2: Retrieves product data.

  • Transformer (JOLT) V2: Reorganizes and renames the fields using the shift operation.

As we don’t have access to the actual database, we’ll simulate it using a test case with a mocked DB response.

Step 1: Configure the Transformer (JOLT) V2

We’ll use a JOLT shift operation to map and reorganize the fields.

JOLT specification:

[
  {
    "operation": "shift",
    "spec": {
      "product": {
        "id": "productId",
        "name": "details.name",
        "brand": "details.brand",
        "price": "details.pricing.amount",
        "currency": "details.pricing.currency",
        "stock": "details.availability"
      }
    }
  }
]

This spec renames id to productId and nests the other fields under a details object. Note that shift acts as a whitelist: only fields matched by the spec are carried to the output, so a hypothetical input field not listed in the spec (a sku, for example) would be dropped from the result.

Step 2: Create the Test Case

We'll simulate the database response and validate the transformation.

  • Payload: (Leave empty for this example)

  • Mock Response:

    • Connector: DB V2

    • JSON Payload:

{
  "product": {
    "id": "12345",
    "name": "Smartphone X1",
    "brand": "TechBrand",
    "price": 699.99,
    "currency": "USD",
    "stock": 25
  }
}
  • Expected result:

    • JSONPath: $.details

    • Value:

{
  "name": "Smartphone X1",
  "brand": "TechBrand",
  "pricing": {
    "amount": 699.99,
    "currency": "USD"
  },
  "availability": 25
}

Result validation

After running the test case, the output is as follows:

{
  "productId": "12345",
  "details": {
    "name": "Smartphone X1",
    "brand": "TechBrand",
    "pricing": {
      "amount": 699.99,
      "currency": "USD"
    },
    "availability": 25
  }
}

This confirms that the test passed:

  • The Transformer (JOLT) V2 applied the specification correctly to restructure the input data.

  • Fields were renamed and nested under the details object as intended.

  • The productId field remained at the top level, while other attributes were grouped logically under details.
