Data Sources Documentation: Files

This section provides details on data sources that can be delivered as files to Optidata, along with the specific prerequisites required for successful integration. Optidata supports various file formats and transfer methods, allowing for flexible data ingestion through file-based delivery.

Please review the details below to understand the supported file sources and any client requirements needed for setup.

There are 4 main ways to share files with Optimove:

SFTP on Optimove's side (the most common method)

  • Connection details: Optimove will provide the SFTP details to the client.
  • Valid file formats: CSV, JSON, PARQUET, TXT, AVRO, XML
  • Compression options: GZIP
  • Note: Zip and Excel formats are not supported.
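Because GZIP is the only supported compression option, files are typically compressed before being placed on the SFTP server. A minimal sketch using Python's standard library (file names here are illustrative, not part of any Optimove requirement):

```python
import gzip
import shutil
from pathlib import Path

def gzip_file(src: Path) -> Path:
    """Compress `src` to `<src>.gz` using GZIP, the compression
    format accepted for file delivery. Returns the new path."""
    dst = src.with_name(src.name + ".gz")
    with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)
    return dst

# Example: compress a daily export before uploading it over SFTP.
# (Zip and Excel are not supported, so plain CSV plus GZIP is the safe choice.)
```

The original file is left in place; upload the returned `.gz` path to the SFTP location Optimove provides.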

SFTP on the client's side

  • Connection details: the client needs to provide the host, username, password, and port.
  • Valid file formats: CSV, JSON, PARQUET, TXT, AVRO, XML
  • Compression options: GZIP
  • Note: Zip and Excel formats are not supported.

Amazon S3

  • Connection details (preferred method): Access Key ID, Secret Access Key, and bucket name.
  • Additional option (not recommended): S3 integration with Snowflake. It offers better connectivity but requires R&D support per client.
  • Valid file formats: CSV, JSON, PARQUET, TXT, AVRO, XML
  • Compression options: GZIP
  • Note: Zip and Excel formats are not supported.
  • Please note: Optimove's infrastructure runs on GCP, so we cannot provide AWS user credentials from our end.
  • Snowflake documentation (preferred method): https://docs.snowflake.com/en/user-guide/data-load-s3-config-aws-iam-user#step-1-configure-an-s3-bucket-access-policy
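The Snowflake guide linked above walks through attaching a read/list bucket policy to the IAM user. As a sketch of what that policy document looks like (the bucket name is a placeholder, and the exact action list should be verified against the linked Snowflake page):

```python
import json

def s3_read_policy(bucket: str, prefix: str = "*") -> str:
    """Build an IAM policy document granting the read/list access
    Snowflake needs for data loading from an S3 bucket. The action
    list follows the linked Snowflake guide; verify it there."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {   # Read objects (and their versions) under the prefix.
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:GetObjectVersion"],
                "Resource": f"arn:aws:s3:::{bucket}/{prefix}",
            },
            {   # List the bucket and resolve its region.
                "Effect": "Allow",
                "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
                "Resource": f"arn:aws:s3:::{bucket}",
            },
        ],
    }
    return json.dumps(policy, indent=2)

# "client-data-feed" is a placeholder bucket name for illustration.
print(s3_read_policy("client-data-feed"))
```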

Google Cloud Storage (GCS)

  • For Snowflake US tenants, send the following to the client:
    For data loading only, you need to grant the following permissions to [email protected]:
      storage.buckets.get
      storage.objects.get
      storage.objects.list
    In addition, please share the bucket name.
  • For Snowflake EU tenants, send the following to the client:
    For data loading only, you need to grant the following permissions to [email protected]:
      storage.buckets.get
      storage.objects.get
      storage.objects.list
    In addition, please share the bucket name.
  • Valid file formats: CSV, JSON, PARQUET, TXT, AVRO, XML
  • Compression options: GZIP
  • Note: Zip and Excel formats are not supported.
  • GCS documentation: https://docs.snowflake.com/en/user-guide/data-load-gcs-config
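One way for the client to grant exactly the three permissions above is a GCP custom IAM role bound to the service account. A sketch of such a role definition (the role id and title are illustrative placeholders, not names Optimove requires):

```python
def data_loading_role(role_id: str = "snowflakeDataLoader") -> dict:
    """Return a GCP custom-role definition carrying exactly the
    permissions this section asks the client to grant for data
    loading. Role id and title are illustrative placeholders."""
    return {
        "roleId": role_id,
        "title": "Snowflake data loading (read-only)",
        "includedPermissions": [
            "storage.buckets.get",   # read bucket metadata
            "storage.objects.get",   # read object contents
            "storage.objects.list",  # list objects in the bucket
        ],
        "stage": "GA",
    }
```

The client would create this role and bind it to the service account listed above, then share the bucket name.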

Azure Blob

  • Connection details: SAS token and URL (the stage URL references the Azure account).
  • Valid file formats: CSV, JSON, PARQUET, TXT, AVRO, XML
  • Compression options: GZIP
  • Note: Zip and Excel formats are not supported.
  • Snowflake documentation: https://docs.snowflake.com/en/user-guide/data-load-azure-config#step-2-create-an-external-stage
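The two pieces the client provides combine into a single signed URL: the container URL (which references the Azure account) plus the SAS token as its query string. A sketch of that composition (the account, container, and token values are placeholders for illustration only):

```python
def signed_blob_url(account: str, container: str, sas_token: str) -> str:
    """Join an Azure Blob container URL with a SAS token.
    The URL references the Azure storage account; the SAS token
    is appended as the query string (a leading '?' is tolerated)."""
    base = f"https://{account}.blob.core.windows.net/{container}"
    return f"{base}?{sas_token.lstrip('?')}"

# Placeholder values for illustration only.
url = signed_blob_url("clientaccount", "optimove-feed", "sv=2024-01-01&sig=abc123")
```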