Data Sources Documentation: Database

This section outlines the supported database-to-database (DB-to-DB) connections for Optidata, including the prerequisites for each source. Leveraging Matillion’s capabilities, Optidata can connect to several popular databases, enabling direct data extraction and integration.

Below is a list of databases with which Optimove has successfully integrated or confirmed compatibility. Each database requires specific connection details from you, so please ensure all necessary information is provided before initiating the connection.

With a direct DB-to-DB connection, Optimove extracts data directly from your database, eliminating the need for daily data uploads on your end. It is essential, however, to maintain up-to-date data in your database.

📘 Optimove supports the following security protocols for DB-to-DB connections:

IP Allowlisting: To enable secure communication between Optimove and your environment, please allowlist the specific IP addresses provided in this section. Allowlisting ensures that only approved Optimove addresses can reach your database.

If you have any questions about IP allowlisting or the integration process, review this documentation or reach out to our support team for assistance.

For Optimove's ETL (Extract, Transform, Load) process, you will need to allow list the following IP addresses. Click here to view the full list of required IPs for secure data integration.

| Data Delivery Method | Connection Details | Extra Information Needed |
| --- | --- | --- |
| SQL Server | Host<br>UserName<br>Password<br>Port<br>Database Name | We expect an UpdatedAt column in all incremental tables in the DB so we can pull the data (see the sketch below this table). |
| MySQL, MariaDB | Host<br>UserName<br>Password<br>Port<br>Database Name | We expect an UpdatedAt column in all incremental tables in the DB so we can pull the data. |
| PostgreSQL | Host<br>UserName<br>Password<br>Port<br>Database Name | We expect an UpdatedAt column in all incremental tables in the DB so we can pull the data. |
| Google BigQuery | 1. For data loading only, grant the following permissions to [email protected]: bigquery.datasets.get, bigquery.tables.get, bigquery.tables.getData, bigquery.tables.list, bigquery.jobs.create, resourcemanager.projects.get, bigquery.readsessions.create, bigquery.readsessions.getData, bigquery.readsessions.update<br>2. Once these permissions are configured, send us the Project ID and the Dataset ID for the Google BigQuery connection (see the IAM example below this table). | We expect an UpdatedAt column in all incremental tables in the DB so we can pull the data. |
| Amazon Redshift | Host<br>UserName<br>Password<br>Port<br>Database Name | We expect an UpdatedAt column in all incremental tables in the DB so we can pull the data. |
| Oracle | Host<br>UserName<br>Password<br>Port<br>Database Name | We expect an UpdatedAt column in all incremental tables in the DB so we can pull the data. |
| Snowflake | Host<br>UserName<br>Password<br>Port<br>Database Name | We expect an UpdatedAt column in all incremental tables in the DB so we can pull the data. |
| Teradata | Host (URL)<br>UserName<br>Role<br>Password<br>Port<br>Database Name<br>Warehouse<br>Schema | We expect an UpdatedAt column in all incremental tables in the DB so we can pull the data. |
| Delta Lake | Host (URL)<br>UserName<br>Password<br>Port | We expect an UpdatedAt column in all incremental tables in the DB so we can pull the data. |
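To illustrate the UpdatedAt requirement that appears in every row above: when each incremental table carries a timestamp that is refreshed on every insert and update, an extraction run only needs the rows changed since the previous run. The sketch below is minimal and hypothetical; the table, its columns, and the :last_extracted_at parameter are illustrative placeholders, and the actual extraction queries are generated on Optimove's side.

```sql
-- Hypothetical incremental table: UpdatedAt must be refreshed on every
-- INSERT and UPDATE so that changed rows can be identified.
CREATE TABLE customer_transactions (
    TransactionId BIGINT        NOT NULL PRIMARY KEY,
    CustomerId    BIGINT        NOT NULL,
    Amount        DECIMAL(12,2) NOT NULL,
    UpdatedAt     TIMESTAMP     NOT NULL
);

-- Incremental pull: fetch only rows modified since the last run, where
-- :last_extracted_at is the high-water mark saved from the previous run.
SELECT TransactionId, CustomerId, Amount, UpdatedAt
FROM customer_transactions
WHERE UpdatedAt > :last_extracted_at
ORDER BY UpdatedAt;
```

Ordering by UpdatedAt makes it straightforward to store the last timestamp seen as the high-water mark for the next run.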
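For the Google BigQuery row, one way to grant exactly the listed permissions is to bundle them into a custom IAM role. This is a sketch assuming the gcloud CLI; PROJECT_ID, the role ID optimoveDataLoading, and SERVICE_ACCOUNT_EMAIL are placeholders for the values exchanged during onboarding (the service-account address is provided by Optimove).

```sh
# Minimal sketch, assuming the gcloud CLI is installed and authenticated.
# Write a custom role definition containing only the listed permissions.
cat > role.yaml <<'EOF'
title: Optimove Data Loading
stage: GA
includedPermissions:
- bigquery.datasets.get
- bigquery.tables.get
- bigquery.tables.getData
- bigquery.tables.list
- bigquery.jobs.create
- resourcemanager.projects.get
- bigquery.readsessions.create
- bigquery.readsessions.getData
- bigquery.readsessions.update
EOF

# Create the custom role, then grant it to the Optimove service account.
gcloud iam roles create optimoveDataLoading --project=PROJECT_ID --file=role.yaml
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
  --role="projects/PROJECT_ID/roles/optimoveDataLoading"
```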