Data Sources Documentation: Database

This section outlines the supported database-to-database (DB-to-DB) connections for Optidata, including the prerequisites for each source. Leveraging Matillion's capabilities, Optidata can connect to several popular databases, enabling direct data extraction and integration.

Below is a list of databases with which Optimove has successfully integrated or confirmed compatibility. For each database, specific connection details are required from you, so please ensure all the necessary information is provided before initiating the connection.

With a direct DB-to-DB connection, Optimove extracts data directly from your database, eliminating the need for daily data uploads on your end. It is essential, however, to maintain up-to-date data in your database.
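
In practice, this direct extraction follows a standard watermark pattern against the UpdatedAt column required in the table below. The following is a minimal sketch of that pattern, not Optimove's actual pipeline (which runs on Matillion): the `customers` table, the pyodbc driver, and the connection string are illustrative assumptions, and any DB-API-compatible connector would work the same way.

```python
from datetime import datetime

import pyodbc  # pip install pyodbc; illustrative -- any DB-API driver works


def pull_incremental(conn_str: str, last_watermark: datetime):
    """Fetch rows updated since the last successful pull, keyed on UpdatedAt."""
    conn = pyodbc.connect(conn_str)
    try:
        cursor = conn.cursor()
        # Only rows touched after the previous watermark are extracted, which
        # is why every incremental table must carry an UpdatedAt column.
        cursor.execute(
            "SELECT * FROM customers WHERE UpdatedAt > ? ORDER BY UpdatedAt",
            last_watermark,
        )
        rows = cursor.fetchall()
    finally:
        conn.close()
    # The newest UpdatedAt seen becomes the lower bound for the next run.
    new_watermark = max((row.UpdatedAt for row in rows), default=last_watermark)
    return rows, new_watermark
```

As long as UpdatedAt is refreshed on every write, each run picks up exactly the rows changed since the previous one, which is why the data in your database must stay current.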

📘 Optimove supports the following security protocols for DB-to-DB connections:

IP Allowlisting
To secure all Optimove integrations, please allowlist the IP addresses listed in our IP Allowlisting Guide.

ETL-Specific IPs
For Optimove’s ETL (Extract, Transform, Load) process, refer to the ETL section of that same guide for the additional IPs required for secure data transfer.

If you have any questions about IP allowlisting or the integration process, don't hesitate to contact our support team.

| Database | Connection Details | Extra Information Needed |
| --- | --- | --- |
| SQL Server | Host, Username, Password, Port, Database Name | All incremental tables in the DB must include an UpdatedAt column so that we can pull the data. |
| MySQL, MariaDB | Host, Username, Password, Port, Database Name | All incremental tables in the DB must include an UpdatedAt column so that we can pull the data. |
| PostgreSQL | Host, Username, Password, Port, Database Name | All incremental tables in the DB must include an UpdatedAt column so that we can pull the data. |
| Google BigQuery | 1. For data loading only, grant the following permissions to [email protected]: bigquery.datasets.get, bigquery.tables.get, bigquery.tables.getData, bigquery.tables.list, bigquery.jobs.create, resourcemanager.projects.get, bigquery.readsessions.create, bigquery.readsessions.getData, bigquery.readsessions.update. 2. Once these permissions are configured, send us the Project ID and the Dataset ID for the Google BigQuery connection (a permission-check sketch follows the table). | All incremental tables in the DB must include an UpdatedAt column so that we can pull the data. |
| Amazon Redshift | Host, Username, Password, Port, Database Name | All incremental tables in the DB must include an UpdatedAt column so that we can pull the data. |
| Oracle | Host, Username, Password, Port, Database Name | All incremental tables in the DB must include an UpdatedAt column so that we can pull the data. |
| Snowflake | Host, Username, Password, Port, Database Name | All incremental tables in the DB must include an UpdatedAt column so that we can pull the data. |
| Teradata | Host (URL), Username, Role, Password, Port, Database Name, Warehouse, Schema | All incremental tables in the DB must include an UpdatedAt column so that we can pull the data. |
| Delta Lake | Host (URL), Username, Password, Port | All incremental tables in the DB must include an UpdatedAt column so that we can pull the data. |
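
Every source above shares the same UpdatedAt requirement. Below is a minimal sketch of one way to satisfy it without application changes, assuming MySQL/MariaDB, a hypothetical `orders` table, and placeholder credentials: MySQL's ON UPDATE CURRENT_TIMESTAMP clause refreshes the column on every write, while other engines would typically use a trigger or application-side timestamps instead.

```python
import mysql.connector  # pip install mysql-connector-python

# Hypothetical table: the DEFAULT / ON UPDATE pair keeps UpdatedAt accurate
# on both inserts and updates, so incremental pulls can filter on it.
DDL = """
CREATE TABLE IF NOT EXISTS orders (
    OrderId   INT PRIMARY KEY,
    Amount    DECIMAL(10, 2) NOT NULL,
    UpdatedAt TIMESTAMP NOT NULL
              DEFAULT CURRENT_TIMESTAMP
              ON UPDATE CURRENT_TIMESTAMP
)
"""

conn = mysql.connector.connect(
    host="db.example.com",   # the Host value you share with Optimove
    user="optimove_reader",  # illustrative credentials
    password="********",
    database="analytics",
)
try:
    cursor = conn.cursor()
    cursor.execute(DDL)
    conn.commit()
finally:
    conn.close()
```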
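
If you are connecting Google BigQuery, the sketch below shows one way you might confirm the grants listed above took effect before sending us the Project ID and Dataset ID. It assumes the google-cloud-bigquery client library, a downloaded service-account key file, and placeholder project and dataset names.

```python
from google.cloud import bigquery           # pip install google-cloud-bigquery
from google.oauth2 import service_account   # pip install google-auth

# Authenticate as the service account that received the grants.
creds = service_account.Credentials.from_service_account_file("sa-key.json")
client = bigquery.Client(project="your-project-id", credentials=creds)

# Exercises bigquery.datasets.get and bigquery.tables.list.
dataset = client.get_dataset("your-project-id.your_dataset")
tables = list(client.list_tables(dataset))
print(f"Dataset {dataset.dataset_id} exposes {len(tables)} tables")

if tables:
    # Running a query exercises bigquery.jobs.create and bigquery.tables.getData.
    t = tables[0]
    query = f"SELECT COUNT(*) AS n FROM `{t.project}.{t.dataset_id}.{t.table_id}`"
    for row in client.query(query).result():
        print(f"{t.table_id}: {row.n} rows readable")
```

If any of these calls fails with a permission error, re-check the grants before sharing the connection details.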