
DB V2

Know the component and how to use it.
DB V2 executes SELECT, INSERT, DELETE and UPDATE operations and also makes PROCEDURE calls, returning the values in a JSON structure.
Take a look at some of the component's configuration parameters:
  • Operation: operation to be performed (Query or Procedure).
  • Account: account to be used by the component to connect.
  • Database URL: define a Database URL.
  • SQL Statement: accepts any SQL statement supported by the underlying database. Double Braces expressions are allowed, e.g. {{ message.id }} (see the example right after this list).
  • Batch: if the option is enabled, batch processing of some statements is performed.
  • Rollback On Error: if this option is enabled, the commits of operations are performed only if all of them are successful. Otherwise, a rollback of all batch operations is performed.
  • Batch Items: if the Batch mode is enabled, specify the batch items.
  • Fail On Error: if the option is enabled, the execution of the pipeline with error will be stopped; otherwise, the pipeline execution continues, and the result will show the value false for the "success" property.
  • Blob As File: if the option is enabled, all the BLOB parameters for QUERY or PROCEDURE operations are expected to receive the file path.
  • Clob As File: if the option is enabled, all the CLOB parameters for QUERY or PROCEDURE operations are expected to receive the file path.
  • Charset: name of the character set used to read files (default UTF-8).
  • Custom Connection Properties: specific connection and database properties defined by the user.
  • Keep Connection: if the option is enabled, the database connections will be kept for up to 30 minutes; otherwise, they'll be kept for 5 minutes.
  • Advanced: allows you to define a query to be executed before the query defined in SQL Statement, to ensure that the connection to the database is established and errors are avoided.
  • Pool Size By Actual Consumers: if the option is enabled, the number of pooled connections is equal to the number of consumers configured in the pipeline deployment. If the option is disabled, the pool size is determined by the pipeline deployment size, regardless of the number of consumers.
  • Exclusive DB Pool: if the option is enabled, a new non-shared pool is always created for the sole use by this component. If the option is disabled, a pool might be shared between components if the URL is the same.
  • Output Column From Label: for some databases, if your SELECT uses an alias, you must enable this flag so that the column name is displayed exactly like the alias.
  • Connection Test Query: SQL statement to be used before each connection is established. This is an optional parameter and should be used with databases that do not provide reliable connection status information.
  • Raw SQL Statement: if the option is active, the SQL Statement parameter allows the use of dynamic queries through Double Braces statements. When using this functionality, you must ensure that the pipeline has security measures against unwanted SQL statements (SQL Injection). See more about this parameter in the section below.
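As an illustration of the SQL Statement parameter, a simple query with a Double Braces expression might look like the sketch below (the CUSTOMERS table and the id property are hypothetical examples):
SQL
SELECT ID, NAME FROM CUSTOMERS WHERE ID = {{ message.id }}
The {{ message.id }} expression is resolved from the incoming message and, by default, bound as a prepared-statement parameter by the component.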
IMPORTANT: in situations where an Apache Hive database is used, the updateCount data may be unavailable due to the system's behavior. This information will be available only if the updated-row control is enabled on the Apache Hive server. To learn more about Apache Hive support on the Digibee Integration Platform, refer to Supported databases.

Raw SQL Statement

To bring more flexibility to the use of DB V2, we can activate the Raw SQL Statement option, configure a query beforehand, and reference it through Double Braces in the SQL Statement parameter, as follows:

Query previously defined via Template Transformer

Raw SQL Statement activation

Query referenced in SQL Statement parameter

IMPORTANT: as a good practice, we strongly recommend that, when activating the Raw SQL Statement option, the queries be previously defined using the Template Transformer component. The Template Transformer allows parameters to be validated through FreeMarker technology and also declared through Double Braces. These parameters are not resolved by the Template Transformer but by the DB V2 component, which by default configures and validates the parameters of the SQL statement beforehand (PreparedStatement). By applying these security measures, you reduce the risk of attacks such as SQL Injection.
In the image below, we have an example of recommended use of the component on the left (with the Double Braces in the WHERE clause, highlighted in green) and an example of non-recommended use on the right (with the FreeMarker interpolation in the WHERE clause, highlighted in red), which may pose risks to pipeline security:
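As a sketch of the recommended pattern (the query and type properties and the CUSTOMERS table below are hypothetical, and we assume the transformed query ends up available in the message under a query property): the Template Transformer step produces the query text, keeping the values as Double Braces parameters, and the SQL Statement parameter of DB V2 only references that text.
Query defined in the Template Transformer:
SQL
SELECT NAME, TYPE FROM CUSTOMERS WHERE TYPE = {{ message.type }}
SQL Statement in DB V2, with Raw SQL Statement enabled:
SQL
{{ message.query }}
Because the value stays as a Double Braces expression, it is still resolved and validated by DB V2 as a prepared-statement parameter; interpolating the value directly with FreeMarker (for example, WHERE TYPE = '${type}') bypasses that validation and corresponds to the non-recommended form highlighted in red.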

DB V2 in Action

Batch mode

When you need to process a set of statements in batch, you can make the calls in batch mode.
Example
Let's say you need to inform an array of objects to the component, to be used in the batch execution:
ITEMS
[ { "name": "Mathews", "type":"A"},
{ "name": "Jules", "type":"A"},
{ "name": "Raphael", "type":"B"} ]
In the SQL statement, you must inform it in the following way:
SQL
INSERT INTO TABLE VALUES ( {{ item.name }}, {{ item.type }} )
When you use a Double Braces expression such as {{ item.name }}, an iteration is made over the array (informed in the items) and the corresponding property is looked up inside each object. In this case, the property is "name".
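In this case, the statement is executed once for each object of the array, which is equivalent to the three insertions below (shown with literal values only for illustration; the values are still handled as statement parameters by the component):
SQL
INSERT INTO TABLE VALUES ( 'Mathews', 'A' )
INSERT INTO TABLE VALUES ( 'Jules', 'A' )
INSERT INTO TABLE VALUES ( 'Raphael', 'B' )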
After the execution, 3 records are inserted. The expected return is:
{
"totalSucceeded":3,
"totalFailed":0
}
If one of the executions fails, an object will be returned with the "errors" property:
{
"totalSucceeded":2,
"totalFailed":1,
"errors": ["error1"]
}
IMPORTANT: the errors returned in the “errors” property vary according to the database driver. Some drivers don't return all the errors that occurred during the execution in batch mode.

Rollback On Error

If this option is enabled, the operations are committed only if all of them are successful; otherwise, all the batch operations are rolled back.
If the option is disabled, the successful operations are committed even if an error occurs between the executions.
IMPORTANT: for some databases, especially for Oracle, it's not possible to return the consolidated number of successful or unsuccessful executions.
If an error occurs, an object containing all the errors (inside the "errors" property) will be returned, and the consolidated totals will be returned with the value -1:
{
"totalSucceeded":-1,
"totalFailed":-1,
"errors": ["erro1", "error2"],
"success": false
}
For other databases, such as Firebird, the occurrence of errors isn't informed. Therefore, an object without the "errors" property can be returned even if there was a failure:
{
"totalSucceeded":0,
"totalFailed":3,
"success": false
}
For these error cases in Batch Mode, don't forget to check the "success" property. If it returns "false", it means at least one error occurred during the execution.

Connection pool

By default, we use a pool based on the configuration of the deployed pipeline. For example, if the pipeline is SMALL, the pool size will be 10; for MEDIUM, 20; and for LARGE, 40.
It's also possible to manage the pool size at deployment time. To do that, enable the "Pool Size By Actual Consumers" property in the component. With it, the number of consumers manually configured on the deployment screen is used.
See in the image below the configuration of a SMALL pipeline with 5 consumers. If you want the pool of the database components (DB V2 and Stream DB V3) to use this size, you'll have to enable the “Pool Size By Actual Consumers” property in every existing component:
IMPORTANT: be careful when manually configuring the pool size so that there's no deadlock in concurrent calls to the same database.
Our pool is shared between the database components that access the same database inside the pipeline. If an exclusive pool is necessary for a specific component, enable the "Exclusive DB Pool" property.

Technology

Authentication via Kerberos

It's possible to authenticate via Kerberos in database components. To do that, all you have to do is:
  • inform a KERBEROS-type account
  • configure a Kerberos principal
  • configure a keytab (which must be the Base64 of the generated keytab file)

More usage scenarios with DB V2

See in the following documentation how to use DB V2 in different scenarios: