Object Store

Discover more about the Object Store connector and how to use it on the Digibee Integration Platform.

The Object Store connector performs operations to store any document in Digibee's Object Store. It's a simple and quick way to persist useful JSON data, with operations that support many uses during the creation of a pipeline.

Parameters

Take a look at the configuration parameters of the connector. Parameters that support Double Braces expressions are marked with (DB).

  • Account: Account to be used by the connector. This item can't be changed. Default: Digibee Object Store Account (String).
  • Operation: Operation to be executed in the Object Store: Find by Object ID, Find By Query, Insert, Aggregate, Update By Object ID, Update By Query, Delete By Object ID, Delete By Query, Create Index, List Indexes, or Drop Index. Default: Find by Object ID (String).
  • Object Store Name: Name of the collection used to record or read information. If it doesn't exist, it is created automatically. Default: sales (String).
  • Expire After Seconds (DB): Number of seconds after which a document in the Object Store expires. Default: 0 (Integer).
  • Object ID (DB): Identifier of the object to be stored or searched. It can be a unique number or a UUID. Default: 1 (String).
  • Limit (DB): Maximum number of objects to be returned by a search. Default: 0 (Integer).
  • Skip (DB): Number of objects to skip before returning the query results; usually combined with Limit to implement pagination. Default: 0 (Integer).
  • Sort (DB): Specification of the query sorting rules. Default: N/A (String).
  • Query (DB): JSON query, available only when the Find by Query, Aggregate, Update by Query, Delete by Query, Create Index, or Drop Index operation is selected. Default: N/A (String).
  • Document (DB): JSON document, available only when the Insert, Update by Object ID, or Update by Query operation is selected. Default: N/A (String).
  • Unique Index: If activated, the Object ID index accepts unique values only; otherwise, a non-unique index is created. Default: True (Boolean).
  • Isolated: If activated, all queries are delimited to the execution scope. Default: False (Boolean).
  • Upsert: Available only when the Update By Object ID or Update By Query operation is selected. When enabled, the informed object is inserted into the collection if it doesn't exist. Default: False (Boolean).
  • Fail On Error: If enabled, a pipeline execution with an error is interrupted; otherwise, the pipeline execution proceeds, but the result shows a false value for the "success" property. Default: False (Boolean).
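
The Sort specification follows the MongoDB-like query syntax noted in the Technology section below. As an illustrative sketch (the field name is an assumption, not part of the original examples), a descending sort by price could be expressed as:

{
    "product.price": -1
}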

Best practices

Object Store is an auxiliary NoSQL database for integrations. It brings more agility and practicality to the development of integrations. To illustrate the applicability of this connector, here are some good usage practices:

  • The Object Store connector acts as an intermediary database, that is, it mediates information between the flows of an integration. It must therefore be used only to store information that is relevant to the integration in question.

  • The Object Store is a temporary database. Since it intermediates information relevant to the integration flow, old and dispensable data must be regularly removed from it. You can remove the data manually or create an index with a TTL mechanism to expire old data automatically.

  • Since it is an auxiliary database, the Object Store connector must not be used as a permanent database; it serves only certain cases, to support the user in the development of integrations.

  • All data is stored securely within the Digibee Integration Platform. Even so, we recommend encrypting any sensitive data stored in the Object Store. To do this, use the encryption connectors available in Canvas.

Messages flow

Input

For this connector, the only mandatory input is a message in JSON format applied to the object. Input parameters can use the Double Braces syntax to send data from the received message to the connector.
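
As an illustrative sketch, assuming the incoming message carries a product object, the Document parameter could reference it with a Double Braces expression (the path product is an assumption, not part of the original example):

{
    "product": {{ product }}
}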

Output

  • Insert

{
    "data": [],
    "updateCount": 1
}
  • Find

{
   "data": [
       {
           "name": "Galaxy s20",
           "uuid": "123",
           "_oId": "1"
       }
   ],
   "rowCount": 1
}
  • Update

{
    "data": [],
    "updateCount": 1
}
  • Delete

{
    "data": [],
    "updateCount": 1
}
  • Aggregate

{
    "data": [],
    "rowCount": 0
}

Object Store in action

Output examples for each operation were shown above. The examples below demonstrate how to configure the connector to obtain a specific result:

Insert multiple items at once in a collection

When an array of objects is sent in the query field, the connector inserts each item separately into the selected collection.

Observe how to configure the connector with the Operation (Insert), Unique Index (False) and Query parameters:

[
   {
       "id": 1,
       "name": "Galaxy s20",
       "price": 5000
   },
   {
       "id": 2,
       "name": "Samsung 4k 55\"",
       "price": 5000
   },
   {
       "id": 3,
       "name": "Galaxy A10",
       "price": 699
   },
   {
       "id": 4,
       "name": "Galaxy A51",
       "price": 1620
   }
]

Output

{
    "data": [],
    "updateCount": 4
}

Inserting multiple objects at once is allowed only in collections created with Unique Index set to False. The Unique Index property is defined when the collection is created; after the index is created, the property can't be changed.

Find items from a determined query

As an example, consider an Object Store that already has registered product-type items and whose characteristics are name and price.

Observe how to configure the connector with the Operation (Find By Query) and Query parameters:

{
    "product.price": { $gt: 2000 }
}

Output

{
   "data": [
       {
           "product": {
               "id": 1,
               "name": "Galaxy s20",
               "price": 5000
           },
           "_oId": "1"
       },
       {
           "product": {
               "id": 2,
               "name": "Samsung 4k 55\"",
               "price": 5000
           },
           "_oId": "2"
       }
   ],
   "rowCount": 2
}

Find all the items from a query

As an example, consider an Object Store that already has registered product-type items and whose characteristics are name and price.

Observe how to configure the connector with the Operation (Find By Query), Limit (10) and Query parameters:

{}

Output

{
   "data": [
       {
           "product": {
               "id": 1,
               "name": "Galaxy s20",
               "price": 5000
           },
           "_oId": "1"
       },
       {
           "product": {
               "id": 2,
               "name": "Samsung 4k 55\"",
               "price": 5000
           },
           "_oId": "2"
       },
       {
           "product": {
               "id": 3,
               "name": "Galaxy A10",
               "price": 699
           },
           "_oId": "3"
       },
       {
           "product": {
               "id": 4,
               "name": "Galaxy A51",
               "price": 1620
           },
           "_oId": "4"
       }
   ],
   "rowCount": 4
}

In this scenario, the Limit parameter was configured to avoid unnecessary overload when returning objects from an Object Store. Without this configuration, an "Out of memory" error can occur inside the pipeline. Configured this way, you control how many objects are returned in the response.
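
Skip can be combined with Limit to page through a collection, as noted in the Parameters section. As an illustrative sketch, a second page of ten objects could be fetched by configuring Operation (Find By Query), Limit (10), Skip (10), and the same empty Query:

{}

The first ten objects are skipped and the next ten are returned.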

Update an item from a specific ID

As an example, consider an Object Store that already has registered product-type items and whose characteristics are name and price.

Observe how to configure the connector with the Operation (Update By Object ID), Object ID (3) and Document parameters:

{
   $set: {
       "product": {
         "id": 3,
         "name": "Galaxy A10",
         "price": 605
       }
   }
}

Output

{
    "data": [],
    "updateCount": 1
}

In this scenario, the output only indicates that an update occurred. To check whether the object was properly updated, repeat the ID search scenario.
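
Objects can also be updated by query instead of by ID. As an illustrative sketch (the field names and values are assumptions), configure the Operation (Update By Query) and Upsert (True) parameters, so the document is inserted when no match exists, with the Query parameter:

{
    "product.id": 3
}

and the Document parameter:

{
   $set: {
       "product.price": 599
   }
}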

Remove an item from a specific ID

As an example, consider an Object Store that already has registered product-type items and whose characteristics are name and price.

Observe how to configure the connector with the Operation (Delete By Object ID) and Object ID (4) parameters:
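
Since this operation requires no query, the configuration amounts to the following sketch:

Operation: Delete By Object ID
Object ID: 4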

Output

{
    "data": [],
    "updateCount": 1
}

In this scenario, the output is only an object indicating that a removal occurred. To check whether the object was properly removed, repeat the ID search scenario.

Aggregation to copy the collection

As an example, consider an Object Store called "product" that already has registered product-type items and whose characteristics are name and price. From that, create a new Object Store called "product-backup", copying all the items of the collection mentioned above.

The Query parameter must receive an array of objects containing the aggregation pipeline stages.

Observe how to configure the connector with the Operation (Aggregate) and Query parameters:

[
   {
       $merge: {
           into: "product-backup",
           on: "_id",
           whenMatched: "replace",
           whenNotMatched: "insert"
       }
   }
]

In this scenario, the aggregation was configured to replace items that already exist in the target collection and to insert the ones that don't.

Output

{
    "data": [],
    "rowCount": 0
}

To check whether the collection was properly created with the proposed items, repeat the find-all scenario, specifying the new collection.

Aggregation to filter collection items

The Query parameter must receive an array of objects containing the aggregation pipeline stages.

Observe how to configure the connector with the Operation (Aggregate) and Query parameters:

[
   {
       $match: {
           "product.price": {
               $gt: 3000
           }
       }
   },
   {
       $group: {
           _id: null,
           count: {
               $sum: 1
           }
       }
   }
]

In this scenario, the aggregation was configured to match products above a given price and to return the count of matching items.

Output

{
    "data": [{
        "count": 2
    }],
    "rowCount": 0
}

Create an index with expiration time

With the Object Store, it's possible to create an index with a TTL (time to live) attribute, which lets you define an expiration strategy for documents. This behavior is controlled by the Expire after seconds parameter. Here are some examples of how to create indexes with expiration strategies:

Create an index with constant TTL

See an example for the configuration of the parameters Expire after seconds and Query:
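
A minimal sketch (the field name expiresAt follows the outputs below; the 3600-second TTL is an illustrative assumption) is to configure Operation (Create Index), Expire after seconds (3600), and Query:

{
    "expiresAt": 1
}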

The field name must match the date attribute of your document that controls the object's lifetime. All documents containing this field are automatically deleted after the number of seconds specified in the Expire after seconds parameter.

Output

{
    "data": "expiresAt_1",
    "rowCount": 0
}

Create an index with custom dates

See an example for the configuration of the parameters Expire after seconds and Query:
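
A minimal sketch (the field name expiresAt is an illustrative assumption) is to configure Operation (Create Index), Expire after seconds (0), and Query:

{
    "expiresAt": 1
}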

The field name must match the date attribute of your document that controls the object's lifetime. You must also set the Expire after seconds parameter to 0.

In this scenario, the expiration is defined by the datetime stored in the corresponding field of each document, as referenced in the Query parameter.

When you insert new documents, make sure that the field you have configured in the index is created with a date value.

{
    "id": 1,
    "name": "Galaxy s20",
    "price": 5000,
    "expiresAt": new Date()
}

The TTL index doesn't ensure that expired data is deleted immediately upon expiration. There may be a delay between the time a document expires and the time the Object Store removes it from the database, because the background task that removes expired documents runs every 60 seconds.

List all indexes

This operation lists all the indexes that the user has created in an Object Store.
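
No other parameter is required; as a sketch, the configuration is simply:

Operation: List Indexes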

Output

{
  "data": [
    {
      "v": 2,
      "key": {
        "expiresAt": 1
      },
      "name": "expiresAt_1",
      "expireAfterSeconds": 0
    }
  ],
  "rowCount": 1
}

Drop an existing index

In this operation, it’s possible to drop an index previously created by the customer.

See how to configure the connector with the Drop Index Operation and Query parameter:

{
    "expiresAt": 1
}

Output

{
    "data": "expiresAt_1",
    "rowCount": 0
}

Technology

If the Object Store connector performs updates inside an iteration connector (For Each, Stream File Reader, etc.) with parallel executions, there can be concurrency in record updates when the update instructions are exactly the same. In that case, one instruction returns "updateCount": 1 and the other "updateCount": 0. This happens when two identical records enter the Object Store operation pool and the update or insert instructions (with the Upsert parameter enabled) are executed sequentially: the first instruction performs the update, and the second finds the already-persisted record, verifies there is nothing to change, and returns that no action was needed ("updateCount": 0).

Object Store uses search operators and object aggregation with a syntax similar to MongoDB's. Refer to the external MongoDB documentation to learn more.

The Object Store data is isolated between environments. However, the Test environment shares the same data as executions run in the Execution panel.
