REST Trigger

Learn more about the REST Trigger and how to use it on the Digibee Integration Platform.


When a pipeline is configured and published with the REST Trigger, a REST endpoint is automatically created. You can view this endpoint after deployment by clicking the pipeline card on the Run screen.

With this trigger, you can create APIs that conform to the REST standard and quickly define which methods your endpoint responds to.

Parameters

Take a look at the configuration parameters of the trigger. Parameters supported by Double Braces expressions are marked with (DB).

  • Methods: Configures the HTTP verbs to be supported by the endpoint after the deployment. If no value is informed, the default value is considered. Default value: POST, PUT, GET, PATCH, DELETE, and OPTIONS. Data type: String.

  • Maximum Timeout: Time limit (in milliseconds) for the pipeline to process information before returning a response. Limit: 900000. Default value: 30000. Data type: Integer.

  • Maximum Request Size: Maximum size of the payload (in MB). The maximum configurable payload size is 5 MB. Default value: 5. Data type: Integer.

  • Response Headers (DB): Headers to be returned by the endpoint when processing in the pipeline is complete. Cannot be left empty. Accepts Double Braces. Default value: N/A. Data type: String.

  • Add Cross-Origin Resource Sharing (CORS): Adds the CORS headers to be returned by the endpoint when processing in the pipeline is complete. Default value: False. Data type: Boolean.

  • CORS Headers: Specifies the CORS headers for the pipeline. Default value: N/A. Data type: Key-value pairs.

  • External API: If enabled, publishes the API in an external gateway. Default value: True. Data type: Boolean.

  • Internal API: If enabled, publishes the API in an internal gateway. The pipeline can have both the External API and Internal API options enabled simultaneously. Default value: False. Data type: Boolean.

  • mTLS enabled API: If enabled, publishes the API to a dedicated gateway with mTLS enabled by default. Default value: False. Data type: Boolean.

  • API Key: If enabled, the endpoint can only be consumed if an API key is configured in the Digibee Integration Platform. Default value: False. Data type: Boolean.

  • Token JWT: If enabled, the endpoint can only be consumed if a JWT token previously generated by another endpoint is sent in the request. Read the Digibee JWT implementation article for more details. Default value: False. Data type: Boolean.

  • Basic Auth: If enabled, the endpoint can only be consumed if a Basic Auth setting is present in the request. This setting can be registered beforehand through the Consumers page in the Digibee Integration Platform. Default value: False. Data type: Boolean.

  • Additional API Routes: If enabled, allows the configuration of new routes for the trigger. See the Additional API Routes section below for more details. Default value: False. Data type: Boolean.

  • Remove Digibee Prefix from Route: Removes the default Digibee route prefix if certain conditions are met. See the Remove Digibee Prefix from Route section below for more details. Default value: False. Data type: Boolean.

  • Routes: Displayed when the Additional API Routes parameter is enabled; used to define additional endpoint routes. Default value: N/A. Data type: String.

  • Rate Limit: If activated, applies a rate limiting configuration on the API gateway. Available if API Key or Basic Auth is active. Default value: False. Data type: Boolean.

  • Limit by: Defines the entity to which the limits are applied. Options: API. Default value: API. Data type: String.

  • Aggregate by: Defines the entity for aggregating the limits. Options: Consumer and Credential (API Key, Basic Auth). Default value: Consumer. Data type: String.

  • Options: Defines the limits of requests that can be made within a time interval. Default value: N/A. Data type: Options of Rate Limit.

  • Interval: Defines the time interval for the limit of requests. Options: second, minute, hour, day, and month. Default value: Second. Data type: String.

  • Limit: Defines the maximum number of requests that users can make in the specified time interval. Default value: N/A. Data type: Integer.

  • Allow Redelivery Of Messages: If enabled, allows the message to be resent in case of a Pipeline Engine failure. Read the Pipeline Engine article for more details. Default value: False. Data type: Boolean.

There is a global configuration option that requires all pipelines to be published with at least the API Key or Basic Auth option enabled.

Parameters additional information

Maximum Request Size

If the payload sent by the endpoint consumer exceeds the limit, the endpoint returns status code 413 with the following message:

{  
    "message": "Request size limit exceeded"
}

Add Cross-Origin Resource Sharing (CORS) - CORS Headers

Cross-Origin Resource Sharing (CORS) is a mechanism that lets you tell the browser which origins are allowed to make requests. This parameter defines CORS specifically for the pipeline and its constraints. To configure CORS globally rather than individually on each pipeline, see the CORS HTTP header policy.

Use a comma to enter multiple values in a header, without a space before or after the comma. Special characters should not be used in keys, due to possible failures in proxies and gateways.
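As a hedged sketch, the CORS Headers key-value pairs could look like this (the origins and header names below are illustrative only):

Access-Control-Allow-Origin: https://app.example.com,https://admin.example.com
Access-Control-Allow-Methods: GET,POST,OPTIONS
Access-Control-Allow-Headers: Content-Type,Authorization

Note that the multiple values after each key are separated by commas without surrounding spaces, as recommended above.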

mTLS enabled API

The pipeline can have both the External API and Internal API options enabled at the same time, but it is recommended to leave them inactive. This parameter does not support API Key, JWT, or Basic Auth.

To use it in your realm, make a request via chat and we will send you the information needed to install this service.

Additional API Routes

As previously explained, this option allows you to configure new routes for the endpoint.

When a pipeline is deployed, a URL is automatically created. However, you can customize the route at your convenience, including receiving parameters through the route.

After the pipeline's deployment, the URL has the following structure:

TEST:

https://test.godigibee.io/pipeline/{realm}/v{n}/{pipeline-name}

or PROD:

https://api.godigibee.io/pipeline/{realm}/v{n}/{pipeline-name}
  • {realm}: corresponds to the realm name.

  • v{n}: pipeline's major version.

  • {pipeline-name}: name given to the pipeline.

Custom static route

Let’s say you’ve created the product-list pipeline. Considering the structure above, its URL would look like this:

https://test.godigibee.io/pipeline/realm/v1/product-list

Now, see how to configure a static route for this case.
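One possible trigger configuration for this case (the route value is illustrative only) would be:

Additional API Routes: enabled
Routes: /products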

With the configurations applied and the pipeline deployed, you will get a new URL:

https://test.godigibee.io/pipeline/realm/v1/products

Custom route with parameter in the path

Using the same pipeline previously configured as an example, see how to configure the route:
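A possible configuration for this route (the value is illustrative only) would be:

Additional API Routes: enabled
Routes: /products/:id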

With the configurations applied and the pipeline deployed, you will get a new URL:

https://test.godigibee.io/pipeline/realm/v1/products/:id

In this case, the endpoint consumer can send a request with the id of a product and receive information about that product only. Example of a request URL:

https://test.godigibee.io/pipeline/realm/v1/products/10156

To use this value sent through the route inside the pipeline, use the Double Braces syntax:

{{ message.queryAndPath.id }}
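For example, a connector placed after the trigger could reference this value in its JSON configuration. A minimal sketch (the productId field name is illustrative):

{
    "productId": {{ message.queryAndPath.id }}
}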

Remove Digibee Prefix from Route

As previously explained, this option removes the default Digibee route prefix from the pipeline route.

Let’s say you’ve created a pipeline and set the trigger as follows:
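A hedged sketch of such a trigger configuration (values are illustrative only):

Remove Digibee Prefix from Route: enabled
Additional API Routes: enabled
Routes: /products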

With the configurations applied and the pipeline deployed, you will get a new URL:

https://test.godigibee.io/products

When removing the default prefix and setting the pipeline route through the Additional API Routes parameter, be careful not to set an existing route used by other pipelines. If you have more than one pipeline major version, also keep in mind that route versioning must be done by you, due to the absence of a versioning path parameter such as the one in the default prefix (for example, /pipeline/realm/v1/).
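As an illustration only, one way to keep major versions apart is to carry the version in the custom routes themselves:

Routes (pipeline major version 1): /v1/products
Routes (pipeline major version 2): /v2/products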

Rate Limit

When creating APIs, we usually want to limit the number of API requests users can make in a given time interval.

This action can be performed by activating the Rate Limit option and applying the following settings:
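As a hedged sketch, a rate limit configuration could look like this (all values are illustrative only):

Rate Limit: enabled
Limit by: API
Aggregate by: Consumer
Options:
  Interval: minute, Limit: 100
  Interval: hour, Limit: 1000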

If the API has additional paths, the limit is shared among all of them. To apply the rate limit settings, the API must be configured with an API Key or Basic Auth. The Aggregate by parameter then aggregates the limit by groups of credentials when the Consumer option is selected, or by an individual credential when the Credential (API Key, Basic Auth) option is selected.

If multiple interval parameters are configured with repeated values, only one of them is considered. The Limit parameter must also be set to a value greater than zero.

If the rate limiting options aren't set correctly, they'll be ignored and a warning log will be issued. You can view this log on the Pipeline Logs page.

REST Trigger in Action

See below how the trigger behaves in specific situations and the corresponding configuration for each.

Query API of information with response in JSON

See how to configure a pipeline with the REST Trigger to return information from inside the pipeline in JSON format and how the response must be handled specifically for this trigger.

First of all, create a new pipeline and configure the trigger. The configuration can be made in the following way:
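A hedged sketch of the trigger configuration for this example (values are illustrative only):

Methods: GET
External API: enabled
API Key: disabled
Token JWT: disabled
Basic Auth: disabled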

With the configuration above, you determine that:

  • the endpoint answers only to the GET verb;

  • the API is external and doesn't require a token for the communication.

This example is for educational purposes only. In some cases you can't leave the endpoint open for security reasons.

Now see how to configure a MOCK in the pipeline so it becomes the data provider whose content the endpoint returns in the end. Place the indicated component, connect it to the trigger, and configure it with the following JSON:

{
    "data": {
        "products": [
            {
                "name": "Samsung 4k Q60T 55",
                "price": 3278.99
            },
            {
                "name": "Samsung galaxy S20 128GB",
                "price": 3698.99
            }
        ]
    }
}

After doing that, the endpoint automatically returns the JSON defined above as its response.

After the deployment, take the generated URL and send a GET request. The endpoint must return status code 200, and the response body must look the same as the JSON previously defined inside the MOCK component.

Dispatch API of information with response in JSON

See how to configure a pipeline with the REST Trigger to receive and return information in JSON format inside the pipeline and how the response must be handled specifically for this trigger.

First of all, create a new pipeline and configure the trigger. The configuration can be made in the following way:
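A hedged sketch of the trigger configuration for this example (values are illustrative only):

Methods: POST
External API: enabled
API Key: disabled
Token JWT: disabled
Basic Auth: disabled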

With the configuration above, you determine that:

  • the endpoint answers only to the POST verb;

  • the API is external and doesn't require a token for the communication.

This example is for educational purposes only. In some cases you can't leave the endpoint open for security reasons.

Now see how to configure a MOCK in the pipeline so it changes the received data that the endpoint returns in the end. Place the indicated component, connect it to the trigger, and configure it with the following JSON:

{
    "data": {
        "products": [
            {
                "name": "Samsung 4k Q60T 55",
                "price": 3278.99
            },
            {
                "name": "Samsung galaxy S20 128GB",
                "price": 3698.99
            },
            {{ message.body.product }}
        ]
    }
}

With this configuration, a payload with a new product is received and added to the array. After that, the pipeline returns the array with the newly added product to the consumer.

After the deployment, take the generated URL and send a POST request with the following body:

{
    "product": {
        "name": "Samsung galaxy note 10 256GB",
        "price": 2879.99
    }
}

The endpoint must return status code 200 and the response body must look like this:

{
  "data": {
    "products": [
      {
        "name": "Samsung 4k Q60T 55",
        "price": 3278.99
      },
      {
        "name": "Samsung galaxy S20 128GB",
        "price": 3698.99
      },
      {
        "name": "Samsung galaxy note 10 256GB",
        "price": 2879.99
      }
    ]
  }
}

Every time you make a request to the created endpoint, the structure of the message the trigger delivers to the pipeline is always the same and follows this pattern:

{
  "body": "{}",
  "form": {},
  "headers": {
    "Host": "pipeline-trigger-http:8100",
    "Connection": "keep-alive",
    "X-Forwarded-For": "***",
    "X-Forwarded-Proto": "http",
    "X-Forwarded-Host": "***",
    "my-custom-header": "a"
  },
  "queryAndPath": {
    "id": "1"
  },
  "method": "POST",
  "contentType": "application/json",
  "path": "/pipeline/digibee/v1/trigger-rest/1"
}
  • body: the content sent in the request payload, transformed into a string in this field.

  • form: if form-data is used in the request, the submitted data is delivered in this field.

  • headers: the headers sent in the request are delivered in this field; some of them are filled automatically depending on the tool used to make the request.

  • queryAndPath: the query and path parameters provided in the URL are delivered in this field.

  • method: the HTTP method used in the request.

  • contentType: when informed in the request, the Content-Type value is passed on to the pipeline in this field.

  • path: the path used in the request URL.
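Any of these fields can be referenced later in the flow using Double Braces. A minimal sketch (the field names on the left are illustrative only):

{
    "receivedMethod": {{ message.method }},
    "requestedPath": {{ message.path }},
    "productId": {{ message.queryAndPath.id }}
}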
