Snowplow Behavioral Data Platform Product Description

Last updated: 20 September 2024

This Snowplow Product Directory provides descriptions of the products and services you have purchased on an Order Form or enabled as part of a pilot service. While our Product Directory may be updated from time to time, the descriptions of the products as of the Start Date in your Order Form will apply to the Products or Services specified in your Order Form. If new terms are introduced for new features or functions made available within a Product or Service during the Term of your Agreement, these new terms will apply to the use of those new features or functions if you use them.

Product Overview

Snowplow helps businesses gain actionable insights from customer behavior across all digital touchpoints, including web, mobile, and more. Our platform collects, manages, and delivers this important data to your chosen destinations through our Behavioral Data Platform (“BDP”), to help you maximize your data quality and usability.

Our BDP product includes Data Pipeline, Data Management, and Data Applications components. Our services cover Onboarding, Technical Account Management, and Service Levels. The specific Products and Services applicable to you are detailed in your Order Form.

Data Pipeline

The Snowplow Data Pipeline enables the collection of your behavioral data across multiple digital touchpoints and applications, including but not limited to:

  • Web applications
  • Mobile applications
  • Desktop applications
  • Server applications
  • Smart TV applications

Your data is processed by Snowplow in real time. Data processing steps include, but are not limited to:

  • Validating the data against schemas
  • Enriching the data with first-party and third-party data sets
  • Obfuscating data fields (e.g., for data protection)
  • Transforming the data into a format optimized for loading into downstream data destinations
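The validation and obfuscation steps above can be sketched as follows. This is a simplified, hypothetical illustration (the schema, field names, and hashing choice are assumptions for the example, not Snowplow's actual implementation; real pipelines validate against versioned JSON Schemas):

```python
import hashlib

# Hypothetical event schema: required fields and their expected types.
SCHEMA = {"event_id": str, "user_id": str, "page_url": str}

def validate(event: dict) -> list:
    """Return a list of validation errors (empty if the event is valid)."""
    errors = []
    for field, ftype in SCHEMA.items():
        if field not in event:
            errors.append(f"missing field: {field}")
        elif not isinstance(event[field], ftype):
            errors.append(f"wrong type for {field}")
    return errors

def obfuscate(event: dict, fields=("user_id",)) -> dict:
    """Pseudonymize sensitive fields by hashing them (e.g., for data protection)."""
    out = dict(event)
    for field in fields:
        if field in out:
            out[field] = hashlib.sha256(out[field].encode()).hexdigest()
    return out

event = {"event_id": "e1", "user_id": "alice", "page_url": "https://example.com"}
assert validate(event) == []  # the event conforms to the schema
masked = obfuscate(event)     # user_id is replaced by a SHA-256 digest
```

In a real deployment, events that fail validation are routed aside rather than silently dropped, so data quality issues can be inspected and replayed.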

The data is delivered into downstream destinations activated by you. These cloud data destinations can include:

  • Data warehouses and lakehouses (e.g., Snowflake, Databricks, GCP BigQuery, AWS Redshift, AWS S3, Azure Fabric)
  • Streaming technologies (e.g., Kafka, GCP Pub/Sub, AWS Kinesis, Azure Event Hubs)

Distribution: Cloud or Private Managed Cloud (“PMC”)

Snowplow BDP is distributed via two models: Cloud and Private Managed Cloud.

If you choose Cloud distribution, all your customer behavioral data is processed on Snowplow’s own infrastructure, in our own cloud environment, before being delivered to your selected cloud data destination (e.g., data warehouse or lakehouse).

If you choose the Private Managed Cloud (PMC) distribution, your customer behavioral data is processed end-to-end in your own cloud infrastructure (on your AWS, GCP, or Azure Cloud accounts).

Outage Protection

(only available for PMC customers who use AWS)
The Outage Protection product protects you against data loss in the event of a region-wide AWS outage. Should such an outage occur, this service redirects your data processing to a secondary region selected by you until the outage is over and your data can be redirected back to your previous region. While we cannot guarantee that data loss will be completely eliminated, Outage Protection will minimize it as far as practicable.

Infrastructure and Security (PMC only)

The following Infrastructure and Security bolt-ons can be added by PMC customers to their Data Pipeline. Each bolt-on is available in one or both of two tiers, "High" and "Advanced", as noted against each feature below.

HTTP Access controls (High and Advanced)

All HTTP (i.e., non-encrypted) traffic to internet-facing load balancers deployed as part of Snowplow BDP can be disabled.

VPC Peering (AWS and GCP only; High and Advanced)

As part of the Snowplow pipeline setup, a Virtual Private Cloud (VPC) housing the pipeline is created in your cloud account. If you wish to enable VPC peering between an existing VPC you own and the new Snowplow VPC, you can choose the CIDR/IP range used in the Snowplow-created VPC so that peering is possible.

SSH access controls (AWS only; High and Advanced)

To comply with your internal security policies, Snowplow's SSH access to the environment can be disabled.

CVE Reporting (AWS and GCP only; High and Advanced)

CVE Reporting provides a periodic report on Common Vulnerabilities and Exposures (CVEs) identified in any relevant software component, as well as regular patching of the same.

Custom IAM Policy (AWS only; Advanced)

Agent installation on EC2 nodes may require extra IAM permissions (e.g., for the SSM agent) to function correctly. The IAM policies attached to EC2 servers can be extended with a customer-defined policy if needed.

Custom VPC integration (AWS only; Advanced)

As part of a Private Managed Cloud deployment, Snowplow deploys a VPC within which all other Snowplow infrastructure is deployed. Customers who require Snowplow to set up pipelines and other Snowplow infrastructure in a pre-existing VPC (rather than creating one from scratch) should select this option.

This VPC must allow Snowplow access to the internet via a directly connected Internet Gateway (IGW) and have NACL rules sufficient for the deployment to function as expected; it must be signed off by the Snowplow team prior to deployment.

Custom security agents (AWS and GCP only; Advanced)

On AWS, a customer's custom security agents may be installed on all EC2 servers deployed as part of the service via an S3 object made available by the customer. On all EKS clusters deployed as part of the service, a customer's custom security agent can be deployed via a Helm chart.

On GCP, a customer's custom security agents may be installed on all GKE clusters deployed as part of the service via a Helm chart.

Custom EKS AMIs (AWS only; Advanced)

Provision of a custom hardened AMI (machine image) for use in EKS node pools instead of standard AWS images.

Note: The features that apply to you depend on the cloud infrastructure vendor you are using. For example, the Custom EKS AMIs feature described above applies only if you use AWS and are a PMC customer with a subscription to the "Advanced" Infrastructure and Security tier.

Data Management

Snowplow provides standard access to functionality to help you manage your behavioral data. This functionality is called “Event Data Management” and is further described in the table below.

You can also subscribe to our “Data Product Studio” product. This service provides you with enhanced functionality to manage your behavioral data and is further described in the Data Product Studio section of the table below.

Event Data Management

Access to a library of Snowplow SDKs for collecting behavioral data in different application environments. A full list of our current Snowplow SDKs can be found in our documentation.

Tools for developers to support your setup of Snowplow’s SDKs including:

  • Snowtype: A tool that generates type-safe, client-specific functions and methods from your data design, enabling developers to integrate Snowplow tracking SDKs more easily.
  • Snowplow Micro: A tool to enable developers to inspect Snowplow data from a development environment easily, and set up automated tests to fail builds that break Snowplow tracking.
  • Snowplow browser extension: A tool to enable developers to conveniently inspect and validate web tracking via a Chrome plugin.
  • ID service: A tool to help you set your own first-party persistent cookies for tracking your users on your web domains.

A user interface that enables you to:

  • Instrument and configure out-of-the-box Snowplow behavioral data sets (behavioral data products).
  • Enable and configure enrichments on the behavioral data sets.
  • View the different behavioral data products you are using Snowplow to deliver, including the data definitions and instructions on how to access and understand the data.
  • Receive alerts for any deviations (quality issues) of the collected data from those definitions. (Failures can be reviewed via a live dashboard.)

Data Product Studio

Access to functionality to help you define, extend, manage, and socialize the data generated by the Snowplow Data Pipeline. The Data Product Studio includes the following functionality to enable you to:

  • Design new behavioral data sets (behavioral data products), including defining new schemas (data structures) and event specifications (semantics).
  • Assign ownership to those data sets.
  • Control who can create and update behavioral data set definitions.
  • Generate machine-readable data contracts for those behavioral data sets.
  • Report against the data set definition/design.
  • Record changes to data set definitions over time.
  • Enable users within your organization to “subscribe” to updates and receive notifications on changes to the associated definitions.
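A machine-readable data contract for a behavioral data set can take the form of a self-describing JSON Schema, addressed by vendor, name, format, and version (e.g., iglu:com.acme/checkout/jsonschema/1-0-0). The sketch below is illustrative: the vendor and field names are hypothetical, and the helper function is not part of any Snowplow API:

```python
import json

def data_contract(vendor: str, name: str, version: str,
                  properties: dict, required: list) -> dict:
    """Build a minimal self-describing JSON Schema in the spirit of
    Snowplow's Iglu schemas (all names here are illustrative)."""
    return {
        "self": {
            "vendor": vendor,      # e.g., your organization's domain
            "name": name,          # the event or entity being described
            "format": "jsonschema",
            "version": version,    # SchemaVer, e.g., "1-0-0"
        },
        "type": "object",
        "properties": properties,
        "required": required,
        "additionalProperties": False,  # reject fields outside the contract
    }

contract = data_contract(
    "com.acme", "checkout", "1-0-0",
    properties={"order_id": {"type": "string"}, "total": {"type": "number"}},
    required=["order_id"],
)
print(json.dumps(contract, indent=2))
```

Because the contract is plain JSON, it can be versioned, diffed, and validated against automatically, which is what makes the change tracking and subscription features above possible.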

Data Applications

Snowplow Data Applications utilize the Data Pipeline and Data Management to address specific use cases.

Digital Analytics

The Digital Analytics package comprises four data apps:

  1. User and marketing analytics application: Provides you with models, dashboards, and reports to understand your customer engagement with digital channels.
  2. Marketing Attribution application: Provides you with models, reports, and dashboards to understand the impact of different marketing channels on conversions and traffic levels.
  3. Funnel analytics application: Enables you to build funnels on tables of data in the data warehouse via a user interface.
  4. Video and media analytics application: Visualize engagement with video, audio, and streaming content on your site, including clicks through to conversions and advertisements.

The package includes data models (written using dbt open source software) that aggregate the underlying event-level data in the data warehouse into AI and Business Intelligence-ready tables. Example tables include a user-level table, a session-level table, and a pageview-level table. These tables directly power the graphs and charts in the user interfaces, and can be used by you to perform more sophisticated analytics and AI.

The data models implement several data processing steps, including but not limited to:

  • Deduplicating the underlying event data.
  • Stitching user identities across different platforms and channels (e.g., web and mobile).
  • Accurately calculating time spent engaging with different content items (e.g., web pages, mobile screens).
  • Sessionizing the data.

The data models aggregate the data in a performant, incremental fashion which may reduce your cost of data processing and increase the speed of data delivery. The data models are extendable and run in your selected cloud data destination.
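The deduplication and sessionization steps described above can be sketched as follows. This is a simplified illustration, not the dbt models themselves; a 30-minute inactivity window is assumed for session boundaries:

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)  # assumed inactivity window

def dedupe_and_sessionize(events):
    """Deduplicate events by event_id, then assign session numbers per user
    based on gaps of inactivity. Events are (event_id, user_id, timestamp)."""
    seen, deduped = set(), []
    for eid, user, ts in sorted(events, key=lambda e: e[2]):
        if eid in seen:
            continue  # drop duplicate event IDs
        seen.add(eid)
        deduped.append((eid, user, ts))

    last_seen, session_no, out = {}, {}, []
    for eid, user, ts in deduped:
        if user not in last_seen or ts - last_seen[user] > SESSION_TIMEOUT:
            session_no[user] = session_no.get(user, 0) + 1  # start a new session
        last_seen[user] = ts
        out.append((eid, user, session_no[user]))
    return out

t0 = datetime(2024, 9, 20, 12, 0)
events = [
    ("e1", "u1", t0),
    ("e1", "u1", t0),                       # duplicate: removed
    ("e2", "u1", t0 + timedelta(minutes=5)),  # within 30 min: same session
    ("e3", "u1", t0 + timedelta(hours=2)),    # gap > 30 min: new session
]
assert dedupe_and_sessionize(events) == [("e1", "u1", 1), ("e2", "u1", 1), ("e3", "u1", 2)]
```

In production these steps run as incremental SQL transformations inside your warehouse, processing only newly arrived events on each run rather than the full history.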

Ecommerce Analytics

The Ecommerce Analytics package includes all the applications and underlying dbt models (written using dbt open source software) in the Digital Analytics package, but also includes an additional ecommerce application backed by an associated ecommerce dbt model.

The ecommerce data application provides dashboards and reports which may be useful for marketers and merchandisers to help them understand and optimize a digital shopping experience.

The ecommerce dbt model creates AI- and Business Intelligence-ready tables describing carts, checkouts, product performance, transactions, and sessions. These tables directly power the visualizations in the ecommerce application, and can be used directly to power more sophisticated analytics and AI.

Event Streaming

Technology to stream Snowplow data in near real-time into third-party SaaS applications (e.g., Amplitude and Braze).