DB V2
Discover more about the DB V2 component and how to use it on the Digibee Integration Platform.
DB V2 executes SELECT, INSERT, DELETE, and UPDATE operations and also makes procedure calls, returning the values in a JSON structure. To see all the databases supported by this component, read the Supported databases documentation.
Take a look at the configuration parameters of the component. Parameters supported by Double Braces expressions are marked with (DB).
Important information:
When the use of credentials is activated, the connection pool is created each time the pipeline is executed and closed when the execution completes. Connection pools are still shared between database connectors if the same settings are applied to them.
Currently, the Use Dynamic Account, Account Name and Scoped parameters can only be used in Pipeline Engine v2 and are only available in the Restricted Beta phase. To learn more about it, read the article Beta program.
In situations where an Apache Hive database is used, the update count information may be unavailable due to a system behavior. This information is available only if updated-row counting is enabled on the Apache Hive server. To learn more about Apache Hive support on the Digibee Integration Platform, refer to Supported databases.
The Key parameter refers to a property declared through a Double Braces expression in the SQL Statement, in the case of Procedures and INSERT queries that handle CLOB/BLOB data types. Every Double Braces declaration has an index, and this index must be used to set the parameter.
Example:
INSERT INTO TABLE (MY_CLOB, MY_BLOB, MY_STRING) VALUES ({{ message.clob }}, {{ message.blob }}, {{ message.string }})
In this case, index 0 corresponds to {{ message.clob }}, index 1 to {{ message.blob }}, and index 2 to {{ message.string }}.
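As a sketch of how the indexes map to Type Properties entries for the INSERT above (the JSON field names below are illustrative; in the component UI you fill in the Key and Type fields added via the Add button):

```json
[
  { "key": 0, "type": "CLOB" },
  { "key": 1, "type": "BLOB" }
]
```

Index 2 ({{ message.string }}) needs no entry, since VARCHAR is the default type.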
To bring more flexibility when using DB V2, you can activate the Raw SQL Statement option, configure a query beforehand, and reference it through Double Braces in the SQL Statement parameter, as follows:
Important: as a good practice, we strongly recommend that, when the Raw SQL Statement option is activated, queries be defined beforehand using the Template Transformer component. Template Transformer allows parameter validation through FreeMarker technology and also parameter declaration through Double Braces. These Double Braces parameters are not resolved by Template Transformer but by the DB V2 component, which by default configures and validates the parameters of the SQL statement beforehand (PreparedStatement). By applying these security measures, you reduce the risk of attacks such as SQL injection.
In the image below, we have an example of recommended use of the component on the left (with the Double Braces in the WHERE clause, highlighted in green); and another example of non-recommended use on the right (with the FreeMarker in the WHERE clause, highlighted in red), which may pose risks to pipeline safety:
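A minimal sketch of the recommended pattern (table, column, and message names are illustrative): the query is defined in a Template Transformer step with the filter value declared through Double Braces, so DB V2 resolves it as a PreparedStatement parameter rather than interpolating it as text. The DB V2 SQL Statement parameter then only references the prepared query, for example {{ message.query }}, with Raw SQL Statement enabled.

```sql
-- Defined in a Template Transformer step (illustrative names).
-- The Double Braces value is bound by DB V2 as a PreparedStatement
-- parameter, not expanded by FreeMarker into the query text:
SELECT NAME, EMAIL FROM CUSTOMERS WHERE ID = {{ message.customerId }}
```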
When you need to process a set of statements in batch, you can make batch-mode calls in the queries.
Example
Let's say you need to provide an array of objects to the component, to be used in the batch execution:
Items
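For example, the Items array could look like this (all values are illustrative):

```json
[
  { "name": "item1", "type": "A" },
  { "name": "item2", "type": "B" },
  { "name": "item3", "type": "C" }
]
```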
In the SQL statement, you must reference it in the following way:
SQL
INSERT INTO TABLE VALUES ( {{ item.name }}, {{ item.type }} )
When you use the Double Braces expression {{ item.name }}, the component iterates over the array (provided in Items) and looks up the corresponding property in each object. In this case, the property is "name".
After the execution, 3 records are inserted. The expected return is:
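The exact shape of the return varies by database and driver, but based on the properties referenced in this article ("success" and the update count), a successful batch return can be sketched as:

```json
{
  "success": true,
  "updateCount": 3
}
```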
If one of the executions fails, an object will be returned with the "errors" property:
Important: the errors returned in the “errors” property vary according to the database driver. Some drivers don't return all the errors that occurred during the execution in batch mode.
If this option is activated, the operations are committed only if all of them succeed. Otherwise, all the batch operations are rolled back.
If the option is inactive, the successful operations are committed even if an error occurs between the executions.
Important: for some databases, especially for Oracle, it's not possible to return the consolidated number of successful or unsuccessful executions.
If an error occurs, an object containing all the errors is returned (inside the "errors" property) and the consolidated count is returned as -1:
For other databases, such as Firebird, the occurrence of errors isn't reported. Therefore, an object with no errors can be returned even if there was a failure:
For these error cases in batch mode, don't forget to check the "success" property. If it returns "false", at least one error occurred during the execution.
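As an illustrative sketch only (the actual error messages depend on the database driver), a failed batch return on a database such as Oracle could combine the properties described above:

```json
{
  "success": false,
  "updateCount": -1,
  "errors": [
    "ORA-00001: unique constraint violated"
  ]
}
```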
By default, we use a pool based on the configuration of the deployed pipeline. For example, if the pipeline is SMALL, the pool size is 10; for MEDIUM, 20; and for LARGE, 40.
It's also possible to manage the pool size at deployment time. To do so, enable the Pool Size By Actual Consumers property in the component; the value manually configured on the deployment screen is then used.
See in the image below the configuration of a SMALL pipeline with 5 consumers. If you want the pool of the database components (DB V2 and Stream DB V3) to use this size, you must enable the Pool Size By Actual Consumers property in every such component:
Important: be careful when manually configuring the pool size so that no deadlock occurs in concurrent calls to the same database.
The pool is shared between the database components that access the same database inside the pipeline. If an exclusive pool for a given component is necessary, enable the Exclusive DB Pool property.
By default, the connector defines how the connections in the pool are handled based on the Keep Connection parameter. This configuration aims to simplify the connection pool setup for scenarios in which, for example, high availability is required, as well as for the opposite case.
If enabled, the connections in the pool are kept open all the time and renewed every 30 minutes. This can be used in high-availability scenarios, reducing the cost of frequently opening new connections.
However, in the opposite scenario, where connections are needed only at longer intervals, you can disable the parameter so the pool keeps no open connections and opens one only when really necessary. After a connection has been used, it is kept in the pool for no more than 5 minutes and is not renewed.
Although this means you don’t have to worry about configuring the connection pool, this default option may not be the best for some use cases.
For more flexibility on the pool configuration, you can enable the Custom Pool option to ignore the default configuration and set a custom configuration.
In this case, you need to set the three parameters below (which are implicitly defined in the default configuration):
Connection Maximum Lifetime
Minimum Idle Connections
Idle Connection Timeout
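As a sketch, a custom pool configuration consistent with the limits described in the parameters table could be (the JSON key names simply mirror the parameter names; all values are illustrative and in milliseconds):

```json
{
  "connectionMaximumLifetime": 1800000,
  "minimumIdleConnections": 5,
  "idleConnectionTimeout": 300000
}
```

Here the maximum lifetime is 30 minutes (the minimum allowed is 30000 ms), the minimum idle connections stay below the deployment pool size, and the idle timeout of 5 minutes is lower than the maximum lifetime, as required for it to take effect.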
Customizing a connection pool can be a difficult task as it requires a deeper knowledge of the subject. Besides, when applied to the context of the Digibee Integration Platform, you have to consider the variables that can affect the pool performance.
Factors like the deployment size and its replicas; access to the same databases using distinct credentials and properties on DB connectors inside the same or different pipelines; and the Exclusive DB Pool option on DB connectors directly affect how the connection pool should be configured.
Given all of the above, it is highly recommended that you enable the Custom Pool option only if you have some knowledge of the topic and it is really necessary.
The connector uses the HikariCP framework to manage the connection pools. Additional information can be found in its official documentation.
It's possible to authenticate via Kerberos in database components. To do so, all you have to do is:
inform a KERBEROS-type account
configure a Kerberos principal
configure a keytab (which must be the Base64 encoding of the generated keytab file)
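For example, assuming you already have a generated keytab file (the file name below is illustrative), you can produce the Base64 content expected in the account's keytab field like this:

```shell
# Encode the generated keytab file as a single-line Base64 string,
# which is the value expected in the KERBEROS account's keytab field:
base64 svc.keytab | tr -d '\n' > svc.keytab.b64
```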
See in the following documentation how to use DB V2 in different scenarios:
| Parameter | Description | Default value | Data type |
| --- | --- | --- | --- |
| Use Dynamic Account | When the option is active, the component uses the account dynamically. When deactivated, it uses the account statically. | False | Boolean |
| Account Type | Defines the account type to be used by the component. Options are Basic, AWS V4, and Kerberos. | N/A | String |
| Account | Account to be used by the component to connect. | N/A | String |
| Fail On Error | If the option is activated, the pipeline's execution is interrupted in case of error. Otherwise, the pipeline execution proceeds, but the result shows a false value for the "success" property. | False | Boolean |
| Type | The data type of the property that is declared in the SQL Statement. | VARCHAR | String |
| Database URL | Define a database URL. | jdbc:mysql://35.223.175.97/db-training | String |
| SQL Statement (DB) | Accepts any SQL statement supported by the underlying database. Double Braces expressions are allowed, e.g. {{ message.id }}. | SELECT DATE_FORMAT(SYSDATE(), '%Y-%m-%d') as DATA | String |
| Batch | If the option is enabled, batch processing of statements is performed. | False | Boolean |
| Rollback On Error | If this option is enabled, operations are committed only if all of them succeed. Otherwise, all batch operations are rolled back. This option is only available if the Batch parameter is enabled. | False | Boolean |
| Batch Items | If Batch is enabled, specify the batch items. | N/A | String |
| Blob As File | If the option is enabled, all BLOB parameters of Query or Procedure operations are expected to receive the file path. | N/A | Boolean |
| Clob As File | If the option is enabled, all CLOB parameters of Query or Procedure operations are expected to receive the file path. | False | Boolean |
| Charset | Name of the character encoding used for file reading (default UTF-8). | UTF-8 | String |
| Type Properties | Click the Add button to activate the following additional parameters: Key, Type, Out Parameter Name, and Parameter Type. | N/A | Options of Type Properties |
| Custom Connection Properties | Specific connection and database properties defined by the user. | N/A | String |
| Pool Size By Actual Consumers | If the option is enabled, the number of pooled connections is equal to the number of consumers configured in the pipeline deployment. If the option is disabled, the pool size is determined by the pipeline deployment size, regardless of the number of consumers. | False | Boolean |
| Exclusive DB Pool | If the option is enabled, a new non-shared pool is always created for the sole use of this connector. If the option is disabled, a pool might be shared between connectors if the URL, credentials, and custom connection properties are the same. Important: the custom connection properties must be defined in the exact same order between connectors for a pool to be shared; otherwise, the configuration is considered different and another pool is created. | False | Boolean |
| Custom Pool | If the option is enabled, the base configuration for the connection pool is set based on the Connection Maximum Lifetime, Minimum Idle Connections, and Idle Connection Timeout parameters. If the option is disabled, the pool is configured based on the Keep Connection parameter. Important: this is an advanced feature and must be used carefully. See the dedicated Connection pool section of this article. | False | Boolean |
| Connection Maximum Lifetime | Defines the maximum lifetime of a connection in the connection pool. An in-use connection is never retired; it is removed only when it is closed. The minimum value allowed is 30000 milliseconds (30 seconds). If a lower value is set, the default of 1800000 milliseconds (30 minutes) is used. This option is only available if the Custom Pool parameter is enabled. | N/A | Integer |
| Minimum Idle Connections | Defines the minimum number of idle connections to be maintained in the pool. The maximum value allowed is based on the deployment size, that is, 10, 20, or 40. If a higher value is set, the maximum for the current deployment size is used. This option is only available if the Custom Pool parameter is enabled. | N/A | Integer |
| Idle Connection Timeout | Defines the maximum amount of time that a connection is allowed to sit idle in the pool. For this option to take effect, its value must be lower than the one set in Connection Maximum Lifetime, and the value configured in Minimum Idle Connections must be lower than the pool size (defined through Pool Size By Actual Consumers). This option is only available if the Custom Pool parameter is enabled. | N/A | Integer |
| Keep Connection | If the option is enabled, the connection pool always maintains a minimum number of open connections ready to use, renewed every 30 minutes. The minimum number of open connections is defined based on the Pool Size By Actual Consumers parameter. If the option is disabled, the pool is created empty and connections are opened on demand, being kept in the pool for no more than 5 minutes; in this case, the connections are not renewed. This option is only available if the Custom Pool parameter is disabled. | True | Boolean |
| Output Column From Label | For some databases, if your SELECT uses an alias, you must enable this flag so that the column name is displayed exactly like the alias. | False | Boolean |
| Connection Test Query | SQL statement to be executed before each connection is established. This is an optional parameter and should be used with databases that do not provide reliable connection status information. | N/A | String |
| Raw SQL Statement (DB) | If the option is active, the SQL Statement parameter allows the use of dynamic queries through Double Braces statements. When using this functionality, you must ensure that the pipeline has security measures against unwanted SQL statements (SQL injection). See more about this parameter in the dedicated section of this article. | False | Boolean |
| Documentation | Section for documenting any necessary information about the connector configuration and business rules. | N/A | String |