Optimove Data Delivery Guide

Step 1: Understand the Data Requirements

Before setting up data delivery, ensure you understand what data Optimove requires. The data provided must offer a complete and actionable view of client activity.

Data Sets

  • Customer Profiles
    Demographic information, account details, and other static data.

  • Transaction History
    Records of purchases, deposits, withdrawals, and other key actions.

  • Product or Service Catalog
    Details about products or services available to customers.

  • Customer Events
    Real-time actions, such as new registrations, account updates, or purchases.

Supported File Formats: CSV, JSON, Parquet (gzip compression recommended).
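
For illustration, the snippet below writes a small customer-profile extract as a gzip-compressed CSV using only the Python standard library. The file name, field names, and sample values are placeholders rather than Optimove's required schema; align them with the structures agreed with your Optimove representative.

```python
import csv
import gzip

# Placeholder field names -- replace with the customer-profile fields agreed with Optimove.
FIELDS = ["CustomerID", "FirstName", "Country", "RegistrationDate", "LastUpdated"]

rows = [
    {"CustomerID": "1001", "FirstName": "Dana", "Country": "US",
     "RegistrationDate": "2024-01-15", "LastUpdated": "2024-06-01T08:00:00Z"},
]

# Write the extract as a gzip-compressed CSV (one of the supported formats).
with gzip.open("customer_profiles.csv.gz", "wt", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```

The same approach works for JSON (for example, newline-delimited records) or for Parquet written with a library such as pyarrow.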

Please Note: Ensure that all data is structured according to Optimove's specifications to prevent integration issues.


Step 2: Prepare Your Data

Before transferring data, confirm that it meets Optimove's format and readiness requirements.

Key Data Prerequisites

  • Data Readiness
    Designate a daily data readiness time (in UTC) when your data is clean, complete, and up-to-date.

  • Data Structure Alignment
    Ensure field names, formats, and values match Optimove’s required structures. Custom structures may require additional configuration with Optimove’s support.

  • Event Structure
    Maintain consistency in how events are recorded (including timestamps). For reference, see Optimove’s event structure guidelines.

  • File Size Management
    Use GZ (gzip) compression for data files. Keep each file under 100 MB for optimal transfer speeds; large datasets should be split into multiple files.

  • Incremental Data Updates
    Include accurate LastUpdated fields to enable Optimove to ingest only new or modified records—rather than reprocessing entire datasets.
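
As a rough sketch of the last two points, the script below keeps only records whose (assumed) LastUpdated column is newer than the previous delivery and rotates the output into gzip-compressed parts of roughly 100 MB of uncompressed data. The source file, column names, and size threshold are placeholders, and you should confirm with Optimove whether the 100 MB guideline applies before or after compression.

```python
import csv
import gzip
from datetime import datetime, timezone

SOURCE = "transactions.csv"                           # placeholder source extract
LAST_RUN = datetime(2024, 6, 1, tzinfo=timezone.utc)  # readiness time of the previous delivery
MAX_BYTES = 100 * 1024 * 1024                         # ~100 MB per part, before compression

def open_part(index, fieldnames):
    """Start a new gzip-compressed CSV part and write its header."""
    f = gzip.open(f"transactions_part{index:03d}.csv.gz", "wt", newline="", encoding="utf-8")
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    return f, writer

with open(SOURCE, newline="", encoding="utf-8") as src:
    reader = csv.DictReader(src)
    part, written = 1, 0
    out, writer = open_part(part, reader.fieldnames)
    for row in reader:
        # Incremental update: keep only records modified since the last delivery.
        # Assumes ISO 8601 timestamps, e.g. 2024-06-02T08:00:00Z.
        if datetime.fromisoformat(row["LastUpdated"].replace("Z", "+00:00")) <= LAST_RUN:
            continue
        writer.writerow(row)
        written += sum(len(v) for v in row.values()) + len(row)  # rough size estimate
        if written >= MAX_BYTES:
            out.close()
            part, written = part + 1, 0
            out, writer = open_part(part, reader.fieldnames)
    out.close()
```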

Please Note: Validate data before submission to avoid processing delays or errors.


Step 3: Choose Your Data Delivery Method

Optimove supports multiple data delivery methods based on your business infrastructure. The sections below describe each option.

Cloud Storage Solutions
  • AWS S3
    Requires an Access Key ID, Secret Access Key, and Bucket Name (an upload sketch follows this list).

  • Google Cloud Storage
    Deliver data to a designated bucket, with access managed by Optimove.

  • Azure Blob
    Uses a SAS Token and stage URL for data transfer.
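
If you deliver to AWS S3, a minimal upload could look like the sketch below, assuming the boto3 SDK; the credentials, bucket, and key are placeholders to be replaced with the values exchanged with Optimove. Equivalent client libraries exist for Google Cloud Storage and Azure Blob.

```python
import boto3

# Placeholder credentials and bucket -- use the values agreed with Optimove.
s3 = boto3.client(
    "s3",
    aws_access_key_id="YOUR_ACCESS_KEY_ID",
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
)

# Upload today's compressed extract to the designated bucket and prefix.
s3.upload_file(
    Filename="transactions_part001.csv.gz",
    Bucket="your-optimove-bucket",
    Key="daily/2024-06-02/transactions_part001.csv.gz",
)
```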

SFTP (Secure File Transfer Protocol)

  • Upload files to a designated SFTP server, either provided by Optimove or hosted by you, and share the login details with your Optimove representative (a connection sketch follows).
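
A minimal connection sketch, assuming the paramiko library and placeholder host, credentials, and paths:

```python
import paramiko

# Placeholder host and credentials -- use the SFTP details coordinated with Optimove.
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # pin the host key in production
client.connect("sftp.example.com", username="your_user", password="your_password")

# Upload the compressed extract to the agreed remote folder (placeholder path).
sftp = client.open_sftp()
sftp.put("transactions_part001.csv.gz", "/upload/transactions_part001.csv.gz")
sftp.close()
client.close()
```
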
Snowflake Data Share

For clients using Snowflake, Optimove supports Snowflake Data Sharing:

  • Direct sharing of structured tables (daily or near real-time).
  • Coordinate with Optimove to confirm setup and account permissions.
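
For reference only, the sketch below shows one way the provider-side share and grants might be issued through the snowflake-connector-python package; the database, schema, table, share, and consuming account names are placeholders and should be confirmed with Optimove before anything is run.

```python
import snowflake.connector

# Placeholder connection details -- use your own Snowflake account and an appropriate role.
conn = snowflake.connector.connect(
    user="YOUR_USER", password="YOUR_PASSWORD", account="YOUR_ACCOUNT", role="ACCOUNTADMIN"
)
cur = conn.cursor()

# Create a share and expose the agreed tables to it (placeholder object names).
cur.execute("CREATE SHARE IF NOT EXISTS optimove_share")
cur.execute("GRANT USAGE ON DATABASE analytics TO SHARE optimove_share")
cur.execute("GRANT USAGE ON SCHEMA analytics.crm TO SHARE optimove_share")
cur.execute("GRANT SELECT ON TABLE analytics.crm.transactions TO SHARE optimove_share")

# Add the consuming account supplied by Optimove (placeholder identifier).
cur.execute("ALTER SHARE optimove_share ADD ACCOUNTS = OPTIMOVE_ACCOUNT")

cur.close()
conn.close()
```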

Please Note: Select the method that best aligns with your internal systems and security policies. For more detail, see Optimove's full guide to Choosing Your Data Delivery Method.


Step 4: Set Up and Connect with Optimove

Once your data is prepared and a delivery method is chosen, connect with Optimove for secure integration.

  • Provide Access Credentials
    Share the necessary credentials (SFTP login, database access, API keys, etc.) with your Optimove representative.
    For SFTP, confirm whether you are using an Optimove-provided server or your own.

  • Configure Automated Transfers
    Set up a scheduled process for regular data uploads (e.g., daily at your data readiness time).
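
As a sketch of what such a job might look like, the script below is intended to be triggered by your scheduler (for example, a cron entry shortly after your readiness time) and hands each ready file to whichever delivery method you chose in Step 3; the directory, schedule, and delivery function are placeholders.

```python
from datetime import datetime, timezone
from pathlib import Path

# Example scheduler entry (cron, daily at 06:15 UTC for a 06:00 UTC readiness time):
#   15 6 * * * /usr/bin/python3 /opt/etl/deliver_to_optimove.py

READY_DIR = Path("/data/optimove/outbox")  # placeholder folder holding today's extracts

def deliver(path: Path) -> None:
    """Placeholder: push one file using the delivery method chosen in Step 3 (S3, SFTP, ...)."""
    print(f"{datetime.now(timezone.utc).isoformat()} delivering {path.name}")

if __name__ == "__main__":
    if READY_DIR.exists():
        for file in sorted(READY_DIR.glob("*.gz")):
            deliver(file)
    else:
        print(f"Nothing to deliver: {READY_DIR} does not exist")
```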

Please Note: Involve your IT or data team for a seamless setup and to ensure security compliance.


Step 5: Test and Validate Data Transfers

Before going live, conduct a test data transfer to confirm that Optimove can successfully access and process your data.

  1. Send a Sample File
    Upload a test dataset to verify structure, format, and connectivity.

  2. Confirm Data Accuracy
    Check that all required fields are correctly populated and that event timestamps are consistent.

  3. Validate Processing
    Ensure Optimove successfully ingests and maps the data without errors.
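
A lightweight pre-submission check along the following lines can catch the most common problems (missing columns, empty required fields, unparseable timestamps). The file name, required columns, and timestamp format are placeholder assumptions; validate against the structure agreed with Optimove.

```python
import csv
import gzip
from datetime import datetime

SAMPLE = "transactions_part001.csv.gz"  # placeholder test file
# Placeholder required columns -- replace with the fields agreed with Optimove.
REQUIRED = ["CustomerID", "TransactionID", "Amount", "EventTime", "LastUpdated"]

with gzip.open(SAMPLE, "rt", newline="", encoding="utf-8") as f:
    reader = csv.DictReader(f)
    missing = [c for c in REQUIRED if c not in (reader.fieldnames or [])]
    if missing:
        raise SystemExit(f"Missing required columns: {missing}")

    for i, row in enumerate(reader, start=2):  # start=2: row 1 is the header
        for col in REQUIRED:
            if not row[col]:
                print(f"Row {i}: empty value in {col}")
        try:
            # Assumes ISO 8601 timestamps in UTC, e.g. 2024-06-02T08:00:00Z.
            datetime.fromisoformat(row["EventTime"].replace("Z", "+00:00"))
        except ValueError:
            print(f"Row {i}: unparseable EventTime {row['EventTime']!r}")

print("Validation finished.")
```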

Please Note: Repeat test loads if needed to correct any issues before full deployment.


Step 6: Start Daily Data Updates

Once test transfers are successful, begin regular data submissions.

  • Maintain a Consistent Schedule
    Deliver data at the agreed-upon frequency (daily, hourly, etc.) and at or after the specified readiness time.

  • Monitor Data Integrity
    Regularly review logs and reports to identify any errors or anomalies.
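
One lightweight integrity check, sketched here for an S3 delivery, is to confirm that the expected files actually landed and are not empty. The bucket and prefix are placeholders, boto3's default credential chain is assumed, and the same idea applies to SFTP or any other method.

```python
from datetime import datetime, timezone

import boto3

BUCKET = "your-optimove-bucket"                            # placeholder bucket
PREFIX = f"daily/{datetime.now(timezone.utc):%Y-%m-%d}/"   # placeholder prefix per delivery date

# Uses boto3's default credential chain (environment variables, profile, or instance role).
s3 = boto3.client("s3")
response = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
objects = response.get("Contents", [])

if not objects:
    print(f"ALERT: no files delivered under {PREFIX}")
else:
    for obj in objects:
        status = "OK" if obj["Size"] > 0 else "ALERT: empty file"
        print(f"{obj['Key']}: {obj['Size']} bytes ({status})")
```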

Please Note: Assign a dedicated team member to oversee ongoing data transfers and troubleshoot potential issues.