Optimove Data Delivery Guide
Step 1: Understand the Data Requirements
Before setting up data delivery, make sure you understand what data Optimove requires. The data you provide must offer a complete and actionable view of customer activity.
Data Sets:
- Customer Profiles: Demographic information, account details, and other static data.
- Transaction History: Records of purchases, deposits, withdrawals, and other key actions.
- Product or Service Catalog: Details about products or services available to customers.
- Customer Events: Real-time actions, such as new registrations, account updates, or purchases.
Supported File Formats: CSV, JSON, PARQUET (gzip compression recommended).
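As an illustration of the expected output, the sketch below writes a small, gzip-compressed CSV of customer profile records using Python's standard library. The field names (customer_id, country, signup_date, LastUpdated) are placeholders; your actual columns must follow the structure agreed with Optimove.

```python
import csv
import gzip

# Placeholder customer-profile records; real field names must follow the
# data structure agreed with Optimove for your account.
rows = [
    {"customer_id": "1001", "country": "US", "signup_date": "2024-01-15",
     "LastUpdated": "2024-06-01T08:30:00Z"},
    {"customer_id": "1002", "country": "DE", "signup_date": "2024-02-03",
     "LastUpdated": "2024-06-01T09:10:00Z"},
]

# Write a gzip-compressed CSV (CSV + gzip is one supported combination).
with gzip.open("customer_profiles.csv.gz", "wt", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
```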
Please Note: Ensure that all data is structured according to Optimove's specifications to prevent integration issues.
Step 2: Prepare Your Data for Delivery
Before transferring data, confirm that it meets Optimove’s format and readiness requirements.
Key Data Prerequisites:
- Data Readiness: Designate a daily data readiness time (in UTC) when your data is clean, complete, and up-to-date.
- Data Structure Alignment: Ensure field names, formats, and values match Optimove's required structures. Custom structures may require additional configuration with Optimove's support.
- Event Structure: Maintain consistency in how events are recorded (including timestamps). For reference, see Optimove's event structure guidelines.
- File Size Management: Use GZ (gzip) compression for data files. Keep each file under 100 MB for optimal transfer speeds; split large datasets into multiple files (see the sketch after this list).
- Incremental Data Updates: Include accurate LastUpdated fields so that Optimove can ingest only new or modified records rather than reprocessing entire datasets.
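For File Size Management, the hypothetical sketch below splits a large CSV export into numbered, gzip-compressed parts by row count. The source file name and ROWS_PER_CHUNK value are assumptions; tune the chunk size so each compressed part stays comfortably under the 100 MB guideline.

```python
import csv
import gzip
from itertools import islice

SOURCE_FILE = "transactions.csv"   # hypothetical large export
ROWS_PER_CHUNK = 500_000           # tune so each .csv.gz stays under ~100 MB

def split_into_gzip_chunks(source_path: str, rows_per_chunk: int) -> None:
    """Split a large CSV into numbered, gzip-compressed chunk files."""
    with open(source_path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)
        part = 1
        while True:
            rows = list(islice(reader, rows_per_chunk))
            if not rows:
                break
            chunk_name = f"transactions_part{part:03d}.csv.gz"
            with gzip.open(chunk_name, "wt", newline="", encoding="utf-8") as out:
                writer = csv.writer(out)
                writer.writerow(header)   # repeat the header in every chunk
                writer.writerows(rows)
            part += 1

split_into_gzip_chunks(SOURCE_FILE, ROWS_PER_CHUNK)
```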
Please Note: Validate data before submission to avoid processing delays or errors.
Step 3: Choose Your Data Delivery Method
Optimove supports multiple data delivery methods based on your business infrastructure. Review the sections below and select the method that best fits your architecture.
Secure Data Sharing
This is the most efficient way to deliver data. It uses "Zero-Copy" sharing, allowing Optimove to query your data directly from your data warehouse without file transfers.
- Snowflake Secure Data Sharing: Native sharing for Snowflake users.
- Apache Iceberg: Supported for Databricks and Snowflake (on AWS/GCP).
- Benefits: Real-time availability, highest security, and no ETL file maintenance.
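For Snowflake users, secure data sharing is configured with SQL share statements. The sketch below, using the snowflake-connector-python package, is only a minimal illustration: the database, schema, table, and the consumer account identifier are placeholders that your Optimove representative would confirm.

```python
import snowflake.connector

# Connection details are placeholders; use your own account and credentials.
conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
    role="ACCOUNTADMIN",
)

# Create a share and expose the relevant objects to it. The database, schema,
# table, and consumer account below are illustrative placeholders.
statements = [
    "CREATE SHARE IF NOT EXISTS optimove_share",
    "GRANT USAGE ON DATABASE analytics TO SHARE optimove_share",
    "GRANT USAGE ON SCHEMA analytics.public TO SHARE optimove_share",
    "GRANT SELECT ON TABLE analytics.public.customer_profiles TO SHARE optimove_share",
    "ALTER SHARE optimove_share ADD ACCOUNTS = OPTIMOVE_ACCOUNT_ID",  # placeholder account
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```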
Direct Database Connection (DB2DB)
Optimove can connect directly to your database to ingest data incrementally. This is preferred for clients who wish to automate extraction without managing file uploads.
- Supported Databases: Standard SQL Databases (SQL Server, PostgreSQL, MySQL, etc.) and Google BigQuery.
- Configuration: Requires whitelisting Optimove's IP addresses and providing read-only credentials.
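As one example of providing read-only credentials, the sketch below creates a read-only PostgreSQL role via psycopg2. The host, database, schema, role name, and password are placeholders, and the equivalent statements differ for SQL Server, MySQL, or BigQuery.

```python
import psycopg2

# Connect as an administrative user; connection details are placeholders.
conn = psycopg2.connect(
    host="db.internal.example.com", dbname="analytics",
    user="admin_user", password="admin_password",
)

# Grant read-only access on the schema that Optimove will ingest from.
read_only_grants = [
    "CREATE ROLE optimove_reader WITH LOGIN PASSWORD 'choose_a_strong_password'",
    "GRANT CONNECT ON DATABASE analytics TO optimove_reader",
    "GRANT USAGE ON SCHEMA public TO optimove_reader",
    "GRANT SELECT ON ALL TABLES IN SCHEMA public TO optimove_reader",
]

with conn, conn.cursor() as cur:
    for stmt in read_only_grants:
        cur.execute(stmt)
conn.close()
```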
Cloud Storage Solutions
If your data already resides in a cloud environment, you can grant Optimove access to specific buckets for ingestion.
- Supported Providers: AWS S3, Google Cloud Storage (GCS), and Azure Blob Storage.
- Configuration: Varies by provider (Access Keys, SAS Tokens, or IAM permissions).
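If you choose AWS S3, for example, a daily drop can be as simple as the boto3 sketch below. The bucket name and key prefix are placeholders agreed with Optimove; GCS and Azure uploads follow the same pattern with their respective SDKs.

```python
from datetime import date

import boto3

BUCKET = "your-optimove-drop-bucket"       # placeholder bucket shared with Optimove
KEY_PREFIX = "optimove/customer_profiles"  # placeholder folder structure

def upload_daily_file(local_path: str) -> None:
    """Upload one gzip-compressed data file to the agreed S3 location."""
    s3 = boto3.client("s3")  # credentials come from the environment / IAM role
    key = f"{KEY_PREFIX}/{date.today():%Y-%m-%d}/customer_profiles.csv.gz"
    s3.upload_file(local_path, BUCKET, key)

upload_daily_file("customer_profiles.csv.gz")
```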
API Connection
Optimove can extract data via API calls if your system supports data retrieval over HTTP.
- Use Case: Best for systems where direct DB access is restricted, but an API is available.
- Configuration: Requires providing endpoint documentation, authentication details, and a LastUpdated filter for incremental fetching (illustrated in the sketch below).
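The sketch below shows the kind of incremental pull Optimove would perform against your API. The endpoint URL, the updated_since parameter, and the token header are hypothetical; your actual endpoint documentation defines the real names.

```python
import requests

# Hypothetical endpoint and parameter names; replace with the values from
# your own API documentation when sharing it with Optimove.
API_URL = "https://api.example.com/v1/customers"
API_TOKEN = "replace_with_real_token"

def fetch_updated_records(last_run_utc: str) -> list[dict]:
    """Fetch only records modified since the previous run (incremental fetch)."""
    response = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        params={"updated_since": last_run_utc},  # LastUpdated-style filter
        timeout=60,
    )
    response.raise_for_status()
    return response.json()

records = fetch_updated_records("2024-06-01T00:00:00Z")
print(f"Fetched {len(records)} new or modified records")
```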
SFTP (Secure File Transfer Protocol)
The traditional method of transferring flat files (CSV, JSON, PARQUET) via a secure server.
- Options: You can host your own SFTP server, or Optimove can host one for you.
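A minimal upload to an SFTP drop folder might look like the paramiko sketch below; the host, credentials, and remote path are placeholders that you or Optimove would supply.

```python
import paramiko

# Placeholder connection details for the agreed SFTP server.
HOST, PORT = "sftp.example.com", 22
USERNAME, PASSWORD = "optimove_upload", "replace_with_real_password"

def upload_via_sftp(local_path: str, remote_path: str) -> None:
    """Upload one data file over SFTP."""
    transport = paramiko.Transport((HOST, PORT))
    try:
        transport.connect(username=USERNAME, password=PASSWORD)
        sftp = paramiko.SFTPClient.from_transport(transport)
        sftp.put(local_path, remote_path)
        sftp.close()
    finally:
        transport.close()

upload_via_sftp("customer_profiles.csv.gz", "/upload/customer_profiles.csv.gz")
```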
Please Note: Select the method that best aligns with your internal systems and security policies. Find the full technical reference for Data Delivery here.
Step 4: Set Up and Connect with Optimove
Once your data is prepared and a delivery method is chosen, connect with Optimove for secure integration.
- Provide Access Credentials: Share the necessary credentials (SFTP login, database access, etc.) with your Optimove representative. For SFTP, confirm whether you are using an Optimove-provided server or your own.
- Configure Automated Transfers: Set up a scheduled process for regular data uploads (e.g., daily at your data readiness time); a minimal scheduling sketch follows this list.
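Many teams drive the upload from cron or an orchestrator such as Airflow. As a self-contained alternative, the sketch below uses the third-party schedule package to run a hypothetical daily export job; the job body and the readiness time are placeholders.

```python
import time

import schedule  # third-party package: pip install schedule

def run_daily_export() -> None:
    """Placeholder job: extract, compress, and upload the daily files."""
    print("Extracting, compressing, and uploading today's data files...")
    # e.g., call the split/compress and SFTP/S3 upload helpers shown earlier.

# Run every day at the agreed readiness time (schedule uses the server's
# local time, so align this with your UTC data readiness time).
schedule.every().day.at("06:00").do(run_daily_export)

while True:
    schedule.run_pending()
    time.sleep(60)
```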
Please Note: Involve your IT or data team for a seamless setup and to ensure security compliance.
Step 5: Test and Validate Data Transfers
Before going live, conduct a test data transfer to confirm that Optimove can successfully access and process your data.
- Send a Sample File: Upload a test dataset to verify structure, format, and connectivity (a pre-send validation sketch follows this list).
- Confirm Data Accuracy: Check that all required fields are correctly populated and that event timestamps are consistent.
- Validate Processing: Ensure Optimove successfully ingests and maps the data without errors.
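Before sending a sample file, a quick local check like the sketch below can catch missing fields or malformed timestamps. The required column names and file name are placeholders for your agreed structure.

```python
import csv
import gzip
from datetime import datetime

REQUIRED_COLUMNS = {"customer_id", "event_type", "event_time", "LastUpdated"}  # placeholders

def validate_file(path: str) -> list[str]:
    """Return a list of problems found in a gzip-compressed CSV test file."""
    problems = []
    with gzip.open(path, "rt", newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            problems.append(f"missing columns: {sorted(missing)}")
            return problems
        for line_no, row in enumerate(reader, start=2):
            if not row["customer_id"]:
                problems.append(f"row {line_no}: empty customer_id")
            try:
                # Expect ISO-8601 timestamps, e.g. 2024-06-01T08:30:00Z
                datetime.fromisoformat(row["event_time"].replace("Z", "+00:00"))
            except ValueError:
                problems.append(f"row {line_no}: bad event_time {row['event_time']!r}")
    return problems

issues = validate_file("customer_events.csv.gz")
print("OK" if not issues else "\n".join(issues))
```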
Please Note: Repeat test loads if needed to correct any issues before full deployment.
Step 6: Start Daily Data Updates
Once test transfers are successful, begin regular data submissions.
- Maintain a Consistent Schedule: Deliver data at the agreed-upon frequency (daily, hourly, etc.) and at or after the specified readiness time.
- Monitor Data Integrity: Regularly review logs and reports to identify errors or anomalies (a simple integrity-check sketch follows this list).
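A lightweight integrity check, such as comparing today's row count against yesterday's, can flag anomalies before the files reach Optimove. The sketch below is one way to do it; the per-day file-naming pattern and the 50% threshold are assumptions.

```python
import csv
import gzip
from datetime import date, timedelta

def row_count(path: str) -> int:
    """Count data rows (excluding the header) in a gzip-compressed CSV."""
    with gzip.open(path, "rt", newline="", encoding="utf-8") as f:
        return max(sum(1 for _ in csv.reader(f)) - 1, 0)

# Assumed naming convention: one file per day, e.g. transactions_2024-06-01.csv.gz
today = date.today()
yesterday = today - timedelta(days=1)
today_rows = row_count(f"transactions_{today:%Y-%m-%d}.csv.gz")
yesterday_rows = row_count(f"transactions_{yesterday:%Y-%m-%d}.csv.gz")

# Flag sudden drops (here, more than 50%) for manual review before delivery.
if yesterday_rows and today_rows < 0.5 * yesterday_rows:
    print(f"WARNING: row count dropped from {yesterday_rows} to {today_rows}")
else:
    print(f"Row count looks normal: {today_rows}")
```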
Please Note: Assign a dedicated team member to oversee ongoing data transfers and troubleshoot potential issues.
