AI Assistant

Discover the AI Assistant resources available on the Digibee Integration Platform.

The AI Assistant is an AI-powered tool within the Canvas of the Digibee Integration Platform. As part of our AI Pair Programmer approach, it helps accelerate time-consuming tasks, such as:

  • Designing integration logic

  • Generating JOLT specifications

  • Documenting integrations

  • Getting general guidance and support

With the AI Assistant, you can complete these tasks without leaving the environment where you're building your integrations.

How to use the AI Assistant

Access the AI Assistant by clicking the AI icon in the left menu. Once opened, you will see a list of options tailored to your goal, described in the sections below.

In any of these options, you can give feedback by liking or disliking the answer, helping us improve our AI models.

Scaffold a pipeline

Provide a prompt describing the logic of your integration, and receive a structured pipeline in return. This includes:

  • A flow tree

  • The integration steps

  • A summary of the structure

Once you receive the pipeline structure, you can:

  • Adjust the flow by asking the AI for changes until you achieve the desired result.

  • Copy the flow to use elsewhere.

  • Insert the flow directly into the Canvas of your pipeline.
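As an illustration, a prompt such as "Fetch orders from a REST API, validate the response, and log the result" might produce a flow tree along these lines (a hypothetical output for this example; the actual structure depends entirely on your prompt):

```
REST Trigger
└── REST V2 (fetch orders from the API)
    └── Choice (evaluate the API response)
        ├── valid response → JSON Generator (build the output message)
        └── error → Log (record the failure)
```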

Create JOLT spec from samples

Send the JSON input to the AI; you can edit the input if needed. Once it's processed, provide the expected output, and the AI will return the corresponding JOLT specification.
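For example, given the sample input and expected output below, the assistant might return a shift specification like the following (an illustrative sketch with hypothetical field names; the `input`, `expectedOutput`, and `joltSpec` keys are only labels for presentation):

```json
{
  "input": { "user": { "firstName": "Ana", "id": 42 } },
  "expectedOutput": { "name": "Ana", "userId": 42 },
  "joltSpec": [
    {
      "operation": "shift",
      "spec": {
        "user": {
          "firstName": "name",
          "id": "userId"
        }
      }
    }
  ]
}
```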

After receiving the JOLT specification, you can:

  • Copy it and paste it into the Transformer (JOLT) connector.

If the answer isn’t what you expected, you can give the AI more context to refine the output until it meets your needs.

Generate documentation

Generate real-time documentation for your pipeline. The AI analyzes your pipeline and creates a document that includes:

  • Flow description: A summary of what the pipeline does.

  • External Systems Involved: The systems involved in the integration.

  • Events: The events triggered by the pipeline.

  • Globals: The global variables used in the connectors.

  • Accounts: The accounts used in the connectors, except for the Object Store connector, where account identification is disabled due to its inherent configuration in the Digibee Integration Platform.

Once you receive the document, you can:

  • Export the document as a PDF file.

Get general help

Ask anything about Digibee and get an answer based on our Documentation Portal. This gives you fast access to the information you need to create, configure, and troubleshoot your integrations.

Writing effective prompts for AI-generated pipelines

When using AI to generate a pipeline, the way you describe your request matters. Providing clear details helps the AI understand your needs and generate an accurate pipeline structure. Below are examples of good and bad prompts to help you get better results.

Prompts with specific connectors

If you already know which connectors your integration needs, mention them in your prompt. This helps the AI generate a more accurate pipeline.

Do: Provide clear, step-by-step details.

✅ I want to create a pipeline that starts by using the REST V2 connector to make a request. After that, it will use a Choice connector to evaluate the API response, followed by a Log connector to check the result. If the request is successful, the pipeline should use the JSON Generator to create the output message. If there’s an error, the Email V2 connector will be used to send a notification, ending the flow.

Don’t: Be vague or general. Without connector names, the AI might not generate the correct flow.

❌ I want to create a pipeline that makes an API request, checks the response, and then sends an email if something goes wrong.

Why is this unclear? This doesn’t specify which connectors to use, what kind of API request, or what “something going wrong” means.

Prompts with endpoint types

If you don’t know the exact connectors, but know the types of endpoints involved, you can describe the integration at a high level and let the AI select the appropriate connectors.

Do: Mention the systems or endpoint types clearly.

✅ Connect my SAP ERP system to a database and a REST API.

Don’t: Be too generic. If you only say “ERP system”, the AI won’t know which kind you’re referring to.

❌ Connect my ERP system to a database and a REST API.

Why is this unclear? Different ERP systems might require different connectors. Providing the system name helps the AI choose the best options.

Prompts without mentioning connectors

If you are unsure which connectors to use, you can still write a detailed prompt that describes your goals. The AI will then determine the best approach.

Do: Clearly describe the steps of the process.

✅ I need to create a pipeline integration that reads data from a stream database, publishes an event, and writes the data to another database. It starts by connecting to the stream database to read the necessary data. The data is then published as an event to notify other systems or services. Finally, the data is written to another database for storage or further processing.

Don’t: Leave out key details. A vague request might result in an incomplete or incorrect pipeline.

❌ I need a pipeline that reads data, publishes it, and writes it somewhere else.

Why is this unclear? It doesn’t specify what kind of data, where it’s coming from, what kind of event needs to be published, or which database is involved.

Additional tips

Keep these in mind when formulating your prompt:

  • Be as specific as possible about the structure of the flow you want to generate.

  • Write clear, direct instructions describing the part or the entire pipeline you want the AI to generate.

  • Mention specific connectors if you already know which ones to use.

  • If your prompt is complex, break it into parts and ask the AI to make adjustments if needed.

For more advanced guidance, OpenAI’s documentation provides a comprehensive guide on creating more precise prompts through prompt engineering.

FAQs

Can I create large pipelines using AI?

Yes, but the AI may generate hallucinations when handling excessively large pipelines.

Does the AI understand the existing pipeline on the Canvas?

No, the AI doesn’t recognize the context of an existing pipeline on the Canvas. It generates new content without associating it with the current pipeline.

Does the AI configure the connector’s parameters?

No, the AI doesn’t set connector parameters. Its main focus is on defining the logical structure and connections between connectors.

Can I use the AI Assistant for both pipelines and capsules?

No, for now the AI Assistant is only available for pipelines. However, some AI-powered features are available for capsules. Learn more about the AI Assistant for Capsules Docs Generation.

Can I generate unlimited documentation for my pipeline?

No. OpenAI processes text using tokens, which are units of text; each request is limited by the model’s capacity. Once the limit is reached, the model stops generating output.

Our documentation generator supports up to 128,000 tokens per request, which typically covers around 500 to 600 connectors. If this limit is exceeded, the documentation is not generated and an error message is displayed.
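As a rough back-of-the-envelope check, the figures above imply roughly 210–260 tokens per connector on average. The sketch below estimates whether a pipeline is likely to fit in one request (the 250-token average is an assumption for illustration, not a platform guarantee):

```python
TOKEN_LIMIT = 128_000  # documented per-request token limit

def likely_fits(num_connectors: int, avg_tokens_per_connector: int = 250) -> bool:
    """Heuristic check: 128,000 tokens / ~500-600 connectors suggests
    roughly 210-260 tokens per connector. The 250-token default is an
    assumed average, used only for this estimate."""
    return num_connectors * avg_tokens_per_connector <= TOKEN_LIMIT

print(likely_fits(100))  # a 100-connector pipeline comfortably fits
print(likely_fits(700))  # a 700-connector pipeline likely exceeds the limit
```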

Important notes:

  • The token quota is shared across all Digibee accounts and resets daily.

  • Since the documentation is generated by AI, hallucinations may occur. Note that empty pipelines don’t generate documentation or hallucinations.

Is the chat history saved in the AI Assistant?

Yes. Chat history is saved per pipeline and per user. Every time you access a pipeline where you previously interacted with the AI Assistant, you will be able to retrieve the chat history for that pipeline.
