HTTP Trigger

Discover more about the HTTP Trigger and how to use it on the Digibee Integration Platform.

When a pipeline is configured and published with the HTTP Trigger, an HTTP endpoint is automatically created. You can view this endpoint after deployment: just click the pipeline card on the Run screen.

With this trigger, you have the flexibility to define different content types not only for the request, but also for the endpoint response.

Parameters

Take a look at the configuration parameters of the trigger. Parameters that support Double Braces expressions are marked with (DB).

  • Methods: Configures the HTTP verbs supported by the endpoint after deployment. Default value: POST, PUT, GET, PATCH, DELETE, and OPTIONS. Data type: String.

  • Request Content Types: Determines the content types the endpoint can receive. Default value: text/xml, application/xml, and application/x-www-form-urlencoded. Data type: String.

  • Response Content Types (DB): Content types to be returned by the endpoint when the pipeline processing ends. This parameter can't be left blank (the response depends on the treatment with mock + Double Braces). Default value: text/xml, application/xml. Data type: String.

  • Response Headers (DB): Headers to be returned by the endpoint when pipeline processing is complete. This parameter can't be left empty and accepts Double Braces. Special characters should not be used in keys, due to possible failures in proxies and gateways. Default value: N/A. Data type: key-value pairs.

  • Add Cross-Origin Resource Sharing (CORS) - CORS Headers: CORS headers to be returned by the endpoint when pipeline processing is complete. This parameter defines CORS specifically for the pipeline and its constraints. Default value: N/A. Data type: key-value pairs.

  • Maximum Timeout: Maximum time, in milliseconds, that the pipeline can take to process information before returning a response. Limit: 900000. If processing takes longer than this value, the request is finished with status code 500 and no body. Default value: 30000. Data type: Integer.

  • Maximum Request Size: Maximum payload size, in MB. The maximum configurable payload is 5 MB. Default value: 5. Data type: Integer.

  • External API: If enabled, the API is published to an external gateway. Default value: True. Data type: Boolean.

  • Internal API: If enabled, the API is published to an internal gateway. The pipeline can have both the External API and Internal API options enabled simultaneously. Default value: False. Data type: Boolean.

  • mTLS enabled API: If enabled, the API is published to a gateway dedicated to APIs with mTLS enabled by default. In this case, the access host is different from the others. The pipeline can have both the External API and Internal API options enabled at the same time, but it is recommended to leave them inactive. Default value: False. Data type: Boolean.

  • API Key: If enabled, the endpoint can be consumed only if an API key is previously configured in the Digibee Integration Platform. Default value: False. Data type: Boolean.

  • Token JWT: If enabled, the endpoint can be consumed only if a JWT token previously generated by another endpoint with this capability is sent. Read the JWT implementation article for more details. Default value: False. Data type: Boolean.

  • Basic Auth: If enabled, the endpoint can be consumed only if a Basic Auth setting is present in the request. This setting can be registered beforehand through the Consumers page in the Digibee Integration Platform. Default value: False. Data type: Boolean.

  • Additional API Routes: If enabled, the trigger allows you to configure new routes. See more about this parameter in the section below. Default value: False. Data type: Boolean.

  • Remove Digibee Prefix from Route: Available only when the External API and Internal API parameters are disabled and the mTLS enabled API and Additional API Routes parameters are enabled. Enable this option to remove the default Digibee route prefix "/pipeline/{realm}/v{n}" from the pipeline route. See more about this parameter in the section below. Default value: False. Data type: Boolean.

  • Routes: Displayed only when the Additional API Routes parameter is enabled. Defines the endpoint's additional routes. Default value: N/A. Data type: String.

  • Rate Limit: If enabled, a rate limiting configuration is applied on the API gateway. This option is only available if API Key or Basic Auth is active. See more about this parameter in the section below. Default value: False. Data type: Boolean.

  • Limit by: Defines the entity to which the limits are applied. Default value: API. Data type: String.

  • Aggregate by: Defines the entity for aggregating the limits. Options: Consumer and Credential (API Key, Basic Auth). Default value: Consumer. Data type: String.

  • Options: Defines the limit of requests that can be made within a time interval. Default value: N/A. Data type: options for limit and interval.

  • Interval: Defines the time interval for the limit of requests. Options: second, minute, hour, day, and month. Default value: second. Data type: String.

  • Limit: Defines the maximum number of requests users can make in the specified time interval. Default value: N/A. Data type: Integer.

  • Allow Redelivery Of Messages: If enabled, the message can be redelivered in case the Pipeline Engine fails. Read the article about the Pipeline Engine for more details. Default value: False. Data type: Boolean.

There is a global configuration parameter that requires all pipelines to be published with at least the API Key or Basic Auth option enabled.

Parameters additional information

Add Cross-Origin Resource Sharing (CORS) - CORS Headers (DB)

Cross-Origin Resource Sharing (CORS) is a mechanism that lets you tell the browser which origins are allowed to make requests. This parameter defines CORS for the pipeline individually; to configure it globally rather than on each pipeline, see the CORS HTTP header policy.

Use a comma to enter multiple values in a header, without a space before or after the comma. Special characters should not be used in keys, due to possible failures in proxies and gateways.
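
For example, assuming you want to allow requests from two specific origins, the CORS Headers key-value pairs could look like this (the origins and methods below are illustrative):

Access-Control-Allow-Origin: https://portal.example.com,https://app.example.com
Access-Control-Allow-Methods: GET,POST,OPTIONS

Note that multiple values share a single header entry, separated by commas with no surrounding spaces.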

Maximum Request Size

If the payload sent by the endpoint consumer exceeds the limit, the request returns status code 413 with the following message:

{  
    "message": "Request size limit exceeded"
}

mTLS enabled API

This parameter does not support API Key, JWT, or Basic Auth. To use it in your realm, request it via chat and we will send you the information needed to install this service.

Additional API Routes

As previously explained, this option lets you configure new endpoint routes.

When a pipeline is deployed, a URL is automatically created. However, you can customize the route to suit your needs, including the parameters received through the route.

After deployment, the URLs have the following structure:

TEST:

https://test.godigibee.io/pipeline/{realm}/v{n}/{pipeline-name}

or PROD:

https://api.godigibee.io/pipeline/{realm}/v{n}/{pipeline-name}
  • {realm}: corresponds to your realm.

  • v{n}: pipeline's major version.

  • {pipeline-name}: name given to the pipeline.

Custom static route

Let’s say you’ve created the product-list pipeline. Given the URL structure above, your URL would look like this:

https://test.godigibee.io/pipeline/realm/v1/product-list

Now, see how to configure a static route for this case.

With the configurations applied and the pipeline deployed, you will get a new URL:

https://test.godigibee.io/pipeline/realm/v1/products 

Custom route with parameter in the route

Using the same example of the previously configured pipeline, see how to customize the route:

With the configurations applied and the pipeline deployed, you will get a new URL:

https://test.godigibee.io/pipeline/realm/v1/products/:id

In this case, the endpoint consumer can send a request containing the id of a product and receive information about that product only. Example of the request URL:

https://test.godigibee.io/pipeline/realm/v1/products/10156

To use this value sent through the route inside the pipeline, use the Double Braces syntax:

{{ message.queryAndPath.id }}
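
As an illustration, the same expression can be used anywhere Double Braces are accepted, for example in a connector configuration that looks the product up by its id (the field name productId below is hypothetical):

{
    "productId": {{ message.queryAndPath.id }}
}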

Remove Digibee Prefix from Route

As previously explained, this option is recommended for removing the default Digibee route prefix from the pipeline route.

Let’s say you’ve created a pipeline and set the trigger as follows:

With the configurations applied and the pipeline deployed, you will get a new URL:

https://test.godigibee.io/products

When removing the default prefix and setting the pipeline route through the Additional API Routes parameter, be careful not to set a route already used by other pipelines. If you have more than one pipeline major version, also keep in mind that route versioning must be done by you, due to the absence of a versioning path parameter (for example: /pipeline/realm/v1/).
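
For instance, assuming you keep two major versions of the same pipeline deployed, you could carry the version in the custom routes yourself (illustrative routes):

/v1/products
/v2/products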

Rate Limit

When creating APIs, we usually want to limit the number of API requests users can make in a given time interval.

This action can be performed by activating the Rate Limit option and applying the following settings:

If the API has additional paths, the limit is shared among all paths. To apply rate limit settings, the API must be configured with an API Key or Basic Auth. The Aggregate by parameter then applies the limit per group of credentials when Consumer is selected, or per individual credential when Credential (API Key, Basic Auth) is selected.

If multiple interval parameters are configured with repeated values, only one of them is considered. The Limit parameter must also be set to a value greater than zero.

If the rate limiting options aren't set correctly, they'll be ignored and a warning log will be issued. You can view this log on the Pipeline Logs page.
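
As an illustration, a configuration that allows each individual credential up to 100 requests per minute could look like this (the values are examples only):

Limit by: API
Aggregate by: Credential (API Key, Basic Auth)
Interval: minute
Limit: 100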

HTTP Trigger in action

See below how the trigger behaves in a specific scenario and what its configuration looks like.

Information query API with XML response

Check how to configure a pipeline with HTTP Trigger to return information from inside the pipeline in XML format and how the response must be handled specifically for this trigger.

First of all, create a new pipeline and configure the trigger. The configuration can be made as follows:

With the configurations above, you set that:

  • the endpoint works with the GET verb only;

  • the request accepts XML-related content types only;

  • the response returns XML-related content types only.

You also determine that the API is external and doesn’t require a token for communication.

This example is for educational purposes only. In real scenarios, you usually can’t leave the endpoint open, for security reasons.

Now observe how to configure a MOCK in the pipeline so it becomes the data provider the endpoint returns in the end. Add the indicated component, connect it to the trigger, and configure it with the following JSON:

{
    "data": {
        "products": [
            {
                "name": "Samsung 4k Q60T 55",
                "price": 3278.99
            },
            {
                "name": "Samsung galaxy S20 128GB",
                "price": 3698.99
            }
        ]
    }
}

The next step is to add a component that transforms the previously created JSON into an XML pattern. For that, use the JSON To XML Transformer: add it to the canvas, connect it to the MOCK added earlier, and keep the following configuration:

After that, the last step is to define how this information is returned to the consumer. Add a MOCK again, given that its only function is to set the response. Connect it to the JSON To XML Transformer output.

To configure this MOCK, use the following JSON:

{    
    "code": 200,    
    "body": {{ message.data }},    
    "Content-Type": "text/xml"
}
  • code: HTTP status code to be returned by the endpoint after the request is finished.

  • body: response message body. Double Braces are used to pass on the information converted in the previous step. This item must be a string; if the data you want to send is a JSON, use the TOSTRING function.

  • Content-Type: content type of the response body. All types are supported, but they must be declared in the Response Content Types field.
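
For instance, if you wanted the endpoint to return the untransformed JSON instead of XML, a sketch of the response MOCK could look like the one below, assuming application/json is also declared in the Response Content Types field and that the TOSTRING Double Braces function is used to serialize the JSON:

{
    "code": 200,
    "body": {{ TOSTRING(message.data) }},
    "Content-Type": "application/json"
}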

When done with all these configurations, you should have a pipeline that looks like the image below:

After the deployment, get the generated URL and send a GET request. The endpoint should return status code 200, as previously shown, and the response body should look like this:

<?xml version='1.0' encoding='UTF-8' standalone='no' ?>
<doc>
  <products>
    <name>Samsung 4k Q60T 55</name>
    <price>3278.99</price>
  </products>
  <products>
    <name>Samsung galaxy S20 128GB</name>
    <price>3698.99</price>
  </products>
</doc>

Whenever a request is made to the created endpoint, the structure of the message the trigger delivers to the pipeline is always the same and follows this pattern:

{
"body": "<xml>\n\t<id>1</id>\n</xml>\t",
  "form": {},
  "headers": {
    "Host": "pipeline-trigger-http:8100",
    "Connection": "keep-alive",
    "X-Forwarded-For": "***",
    "X-Forwarded-Proto": "http",
    "X-Forwarded-Host": "***",
    "my-custom-header": "a"
  },
  "queryAndPath": {
    "id": "1"
  },
  "method": "POST",
  "contentType": "application/xml",
  "path": "/pipeline/digibee/v1/trigger-http/1"
}
  • body: the content sent in the request payload, delivered as a string in this field.

  • form: if form-data is used in the request, the sent data is delivered in this field.

  • headers: the headers sent in the request are delivered in this field; some of them are automatically filled in by the tool used to make the request.

  • queryAndPath: the query and path parameters provided in the URL are delivered in this field.

  • method: the HTTP method used in the request is delivered in this field.

  • contentType: when provided in the request, the Content-Type value is passed on to the pipeline in this field.

  • path: the URL path used in the request is passed on in this field.
