LLM Connector

Discover more about the LLM Connector and how to use it on the Digibee Integration Platform.


The LLM Connector sends requests to Large Language Models (LLMs) within Digibee pipelines, enabling tasks such as text classification, information extraction, summarization, and content evaluation.

It supports built-in authentication and works with major providers: Anthropic Claude, DeepSeek, Google Gemini, and OpenAI. The configuration allows you to control model behavior, response format, and output structure based on your integration needs.

Parameters

Take a look at the configuration parameters for the connector. Parameters that support Double Braces expressions are marked with (DB).

General

  • LLM Provider: Specifies the LLM provider to use. Available options are: Anthropic Claude, DeepSeek, Google Gemini, and OpenAI. Default value: N/A. Data type: String.

  • Model: The AI model to be used, based on the selected provider. Only text models are supported; image generation is not available. Default value: N/A. Data type: String.

  • Account: The account used to authenticate with the connector. It must be previously registered on the Accounts page. Supported type: Secret Key. Default value: N/A. Data type: Account.

  • User Prompt (DB): The prompt sent to the AI model. Supports Double Braces syntax to include data or variables from earlier steps. Default value: N/A. Data type: Plain Text.

  • Output Format: When enabled, allows you to define a custom output format for the AI response. Default value: False. Data type: Boolean.

  • Output Format Body (DB): The structure of the desired output format. Default value: N/A. Data type: JSON.

System Prompt

  • System Prompt (DB): A predefined instruction that sets the tone and behavior of the AI. You can use it to define roles or the type of response the model should always follow. Default value: N/A. Data type: Plain Text.

  • Maximum Output Token (DB): Sets the maximum number of tokens allowed in the AI response. Lower values may reduce the quality and completeness of the output. Default value: 1024. Data type: Integer.

Settings

  • Stop On Client Error: If enabled, stops the pipeline execution when an HTTP 4xx error occurs. Default value: False. Data type: Boolean.

  • Stop On Server Error: If enabled, stops the pipeline execution when an HTTP 5xx error occurs. Default value: False. Data type: Boolean.

Error Handling

  • Fail On Error: If enabled, interrupts the pipeline execution when an error occurs. If disabled, execution continues, but the "success" property is set to false. Default value: False. Data type: Boolean.
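
As an illustration of the Fail On Error behavior: when the parameter is disabled and the provider returns an error, the pipeline keeps running and the connector output carries the "success" property set to false. The fields other than "success" in the sketch below are assumptions modeled on the success outputs shown later on this page; the exact shape may vary by provider and error:

{
  "success": false,
  "status": 429,
  "body": "Too many requests to the LLM provider"
}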

Documentation

  • Documentation: Optional field to describe the connector configuration and any relevant business rules. Default value: N/A. Data type: String.

LLM Connector in action

Minimal configuration: User Prompt request

This configuration uses only the User Prompt parameter to send a request to the AI model.

Advantages:

  • Easy to set up with just one input.

  • Good for testing different prompts quickly.

  • Works well for simple requests.

Practical example

  • Use case: A pipeline integrated with Zendesk receives a new customer ticket. The LLM Connector is used to analyze the request and classify its topic.

  • Goal: Classify the topic of a support ticket.

User Prompt:

Classify the topic of the following customer request:  
"My payment was declined, but the amount was debited from my account. I need help fixing this."

Example output:

{
  "status": 200,
  "body": "Payment Issues"
}

Mid-level configuration: User + System Prompts request

This configuration uses both the User Prompt and System Prompt parameters to guide the AI response.

Advantages:

  • Helps guide the AI’s tone and behavior.

  • Makes responses more consistent.

  • Adds context that helps the AI understand the prompt better.

Practical example

  • Use case: After classifying the support ticket, the pipeline queries a knowledge base. The LLM Connector is then used again to generate a personalized response for the customer.

  • Goal: Generate a custom response using predefined tone and style.

System Prompt:

You are a friendly and helpful support agent. Always use an empathetic tone and provide clear instructions. Return the message as plain text with no line breaks.

User Prompt:

Write a response to the customer below, explaining that we will investigate the payment and get back to them within 24 hours:  
"My payment was declined, but the amount was debited from my account. I need help fixing this."

Example output:

{
  "status": 200,
  "body": "Thank you for reaching out, and I’m sorry to hear about the payment issue. I completely understand how frustrating this must be. We’ll investigate this right away and get back to you with an update within 24 hours. In the meantime, please rest assured that we’re on it and will do everything we can to resolve this for you. If you have any additional details or questions, feel free to share them. We appreciate your patience!"
}

High-level configuration: Prompts + Output Format request

This configuration uses User Prompt, System Prompt, and Output Format to generate a structured response.

Advantages:

  • Produces output in a clear and structured format.

  • Useful when the result needs to follow a specific structure.

  • Helps manage the AI’s unpredictability by setting a fixed format.

Practical example

  • Use case: A pipeline receives a user-generated comment from an ISV (independent software vendor) platform. The LLM Connector sends the comment to the AI to evaluate whether it’s harmful or offensive. The returned score is then used to decide whether the comment should be published or if the user should be flagged.

  • Goal: Evaluate and score a comment’s harmfulness and determine whether it should be approved.

System Prompt:

You are a content moderator. Evaluate whether the comment is harmful, assign a score from 0 to 1 for severity, and indicate whether it should be approved.

User Prompt:

Evaluate the following comment:  
"This company is a joke. Everyone working there is completely incompetent."

Output Format Body:

{
  "score": "",
  "label": "",
  "should_approve": 
}

Possible output:

{
  "status": 200,
  "body": {
    "score": "0.6",
    "label": "potentially harmful",
    "should_approve": false
  }
}

Dynamic configuration: Prompt with Double Braces reference

This configuration uses the User Prompt field to dynamically inject data from a previous connector using Double Braces expressions. In addition, the System Prompt and Output Format fields are used to guide the AI and generate a structured response.

Advantages:

  • Enables contextual prompts based on pipeline data.

  • Connects the AI response to runtime information.

Practical example

  • Use case: A pipeline receives address data from a REST connector that queries a Brazilian public ZIP code API (OpenCEP). The LLM Connector is then used to classify the type of address as residential, commercial, or rural, based on the street name and neighborhood returned by the API.

  • Goal: Categorize the address type using dynamic data from the previous connector.

System Prompt:

You are an address classification assistant. Based on the street name and neighborhood, classify the address as residential, commercial, or rural. Explain your reasoning.

User Prompt with Double Braces:

Use the following address to make your evaluation: {{message.body}}

Output Format Body:

{
  "type": "",
  "reason": ""
}

Possible output:

{
  "status": 200,
  "body": {
    "type": "residential",
    "reason": "The street name 'Rua Abilio Carvalho Bastos' and the neighborhood 'Fósforo' suggest a typical residential area. The presence of house numbers (até 799/800) further supports this classification, as commercial areas are more likely to have business names or larger ranges of numbers."
  }
}
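
For reference, the {{message.body}} value injected by the preceding REST connector could look like the payload below. This is an illustrative sketch: the field names are assumptions about the ZIP code API response, and the values are taken from the output example above:

{
  "logradouro": "Rua Abilio Carvalho Bastos",
  "complemento": "até 799/800",
  "bairro": "Fósforo"
}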

FAQ

How can I test and experiment with my prompts?

Use the Execution Panel to test your prompts. The Run Selected Steps option is especially useful for testing prompts separately from the rest of the pipeline.

Can I use data from previous connectors?

Yes. You can use Double Braces expressions to reference data from previous connectors and include it in your prompt.

How is sensitive data handled?

The connector doesn’t redact or filter payload data. We recommend following the same data handling practices used with other connectors.

Can I chain multiple LLM calls in one pipeline?

Yes. You can use the output of one LLM call as input for another. For example, first classify a support ticket, then generate a response based on the classification.
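
As a sketch, the second LLM Connector can reference the first one's response through Double Braces. Here, {{message.body}} is assumed to contain the classification returned by the previous step (for example, "Payment Issues"):

User Prompt (second LLM Connector):

Write a short, empathetic reply to a customer whose ticket was classified as: {{message.body}}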

What if the LLM produces inaccurate or made-up results?

For critical tasks, reduce hallucination risk by splitting the process into smaller steps, such as generating first and verifying afterward. This gives you more control and lets you validate the result before using it.
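
One way to structure this is a dedicated verification step after the generation step. The prompts below are illustrative, and {{message.body}} is assumed to contain the draft produced by the previous LLM Connector:

System Prompt (verification step):

You are a reviewer. Check whether the draft response below is accurate, polite, and free of invented details. Answer only "approve" or "revise".

User Prompt (verification step):

{{message.body}}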

What happens if the provider takes too long to respond?

If the provider takes too long to respond, the request will time out and an error message will be shown in the Execution Panel.
