Data Delivery to Optimove

This section covers the various methods available to deliver data to Optimove, allowing you to choose the most suitable approach for your system. It includes a link to the QA processes documentation and an overview of the batch data process to help ensure smooth and accurate data transfers.

Data delivery methods

Optimove can access the data you have prepared using one of the following methods:

  • Delivery of files
  • Connection to a database
  • API connection

Delivery of files

The common file formats used in Optimove are CSV, JSON, and Parquet.

All supported file formats and requirements for each source are available here.

To ensure optimal performance and seamless integration, please consider the following limitations when delivering files to Optimove:

  1. Compression:
    • Supported Compression: GZ (ZIP is not supported)
  2. File Size:
    • The maximum supported file size is 1 GB; however, for optimal performance and faster data processing, it is recommended to split large files into smaller batches of 100–250 MB each.
  3. JSON Object Structure:
    • Each JSON object must appear on its own line (newline-delimited JSON), with no enclosing brackets at the beginning or end of the file and no commas between objects. Each line is a standalone object enclosed in curly braces {}:
      {"Id":"30515240","SubscriptionId":null}
      {"Id":"30515241","SubscriptionId":null,"Amount":"99.00","Currency":"INR"}
      
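As a sketch of the format above, the snippet below writes records as gzip-compressed, newline-delimited JSON and rotates to a new part file once a size threshold is reached, in line with the 100–250 MB batch recommendation. The file naming, record source, and threshold are illustrative, not Optimove requirements:

```python
import gzip
import json

# Write records as newline-delimited JSON (one object per line, no
# enclosing brackets or trailing commas), gzip-compressed, rotating to
# a new part file once the uncompressed size passes a threshold
# (~100 MB here, per the 100-250 MB batch recommendation).
def write_ndjson_parts(records, prefix, max_bytes=100 * 1024 * 1024):
    paths, part, written = [], 0, 0

    def open_part(n):
        path = f"{prefix}.part{n:03d}.json.gz"
        paths.append(path)
        return gzip.open(path, "wt", encoding="utf-8")

    out = open_part(part)
    for record in records:
        line = json.dumps(record) + "\n"
        if written and written + len(line) > max_bytes:
            out.close()
            part, written = part + 1, 0
            out = open_part(part)
        out.write(line)
        written += len(line)
    out.close()
    return paths
```

Each resulting .gz part can then be delivered through whichever storage option you use below.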

Files can be provided through the following storage options. For detailed requirements, click here.

  1. AWS S3:
  2. Google Cloud Storage:
    • Required Details: Bucket Name
    • Optimove will provide the relevant permissions details; please contact your PM/CSM.
  3. Azure Blob:
    • Required Details: SAS Token, URL (Stage URL referencing the Azure account)
  4. SFTP:
    • If Cloud Storage is unavailable, SFTP is an alternative option. Optimove will provide the SFTP details; please contact your PM/CSM.
    • If SFTP is managed on your end, provide the following details:
      • Host
      • Username
      • Password
      • Port

Connection to a Database

All supported database types and requirements for each source are available here.

You can configure security by whitelisting specific IP addresses. Please contact your PM/CSM for the IP addresses list.

Required Details for Database Connection:

To initiate a successful database connection, the following details must be provided:

  • Host
  • Username
  • Password
  • Port
  • Database Name
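To illustrate how these details fit together, here is a minimal sketch that assembles them into a standard driver://user:password@host:port/database connection URL. The driver name and all values shown are placeholders, not Optimove-specific settings:

```python
from urllib.parse import quote

# Combine the required connection details into a standard
# driver://user:password@host:port/database URL, the form most
# database clients and ETL tools accept. The username and password
# are URL-encoded so special characters do not break the URL.
def connection_url(driver, host, username, password, port, database):
    return (
        f"{driver}://{quote(username, safe='')}:{quote(password, safe='')}"
        f"@{host}:{port}/{database}"
    )

# Example with placeholder values:
url = connection_url("postgresql", "db.example.com", "optimove_ro",
                     "p@ss:word", 5432, "analytics")
# -> postgresql://optimove_ro:p%40ss%3Aword@db.example.com:5432/analytics
```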

For Google BigQuery, the relevant permissions details are available here. After configuring the permissions, provide the following details:

  • Project ID
  • Dataset ID

❄️ Optimove x Snowflake

In addition to the supported database connections mentioned above, Optimove has partnered with Snowflake (a cloud-based data warehouse provided as a SaaS platform).

Snowflake Secure Data Sharing is a feature of the Snowflake data platform that allows users to securely share data with external parties. With Snowflake Secure Data Sharing, you can create a virtual copy of your data, called a "share", which can be shared with Optimove.

Snowflake allows for faster, cheaper, and more reliable data sharing:

  • A shorter, simplified data transfer process, so campaigns execute on time
  • Querying costs are covered by Optimove, making it significantly cheaper for you
  • No reliance on daily file transfers, which increases the stability of the process

More information can be found here and in the external guide.

Optimove will provide the relevant steps required. Please contact your PM/CSM.

API connection

  • To establish the connection, please send Optimove full documentation, including details of the available API calls (with specific examples) and the relevant credentials.
  • A connection may be made available with or without IP whitelisting.
  • Kindly remove any restrictions that might impact the data extraction process (for example, a firewall). In addition, please provision your sources for the expected data volume to avoid overloading your servers.

💡Optimove’s batch-data ETL process will extract the data directly from your server. It is important to have the LastUpdated filter on the API call (since Optimove only extracts incremental data from the transactional and fact tables).
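The incremental pull can be sketched as follows. The endpoint, the LastUpdated query parameter name, and the paging scheme are illustrative placeholders and should be matched to your actual API:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Illustrative sketch of an incremental extraction: each run requests
# only rows changed since the last successful run, using a LastUpdated
# filter on the API call, and pages through results until a short
# (final) page is returned. The fetch callable is injectable for testing.
def fetch_incremental(base_url, last_updated, page_size=1000, fetch=None):
    fetch = fetch or (lambda url: json.load(urlopen(url)))
    page, rows = 1, []
    while True:
        query = urlencode({"LastUpdated": last_updated,
                           "page": page, "pageSize": page_size})
        batch = fetch(f"{base_url}?{query}")
        rows.extend(batch)
        if len(batch) < page_size:
            return rows
        page += 1
```

On each run, last_updated would be the checkpoint saved after the previous successful extraction, so only new or changed rows from the transactional and fact tables are transferred.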

All supported API types and requirements for each source are available here.

Questions

We are here to help! Please contact us with any questions, and we will be happy to assist you. Email us at [email protected] or call us (seven days a week from 5:00 am to 8:00 pm GMT) at +972-3-672-4546.