Snowplow Behavioral Data Platform Product Description

Last updated: 03 April 2025

This Snowplow Product Directory provides descriptions of the products and services you have purchased on an Order Form or enabled as part of a pilot service. While our Product Directory may be updated from time to time, the descriptions of the products as of the Start Date in your Order Form will apply to the Products or Services specified in your Order Form. If new terms are introduced for new features or functions made available within a Product or Service during the Term of your Agreement, these new terms will apply to the use of those new features or functions if you use them.

Product Overview:

Snowplow enables businesses to own and unlock the value of their customer behavioral data across all digital touchpoints to fuel AI-driven analytics, real-time customer experiences, fraud mitigation, and agentic applications. Our platform collects, manages, and delivers this BI- and AI-ready data to your chosen destinations through our Behavioral Data Platform (“BDP”), to help you improve your data quality, data governance, and usability.

Our BDP product includes Data Pipeline, Data Management, and Extensions components organized into Workspaces. Our services cover Onboarding, Technical Account Management, and Service Levels. The specific Products and Services applicable to you are detailed in your Order Form.

Workspace:

A Workspace is an isolated environment that includes a production Data Pipeline as well as optional quality assurance (QA) or development Data Pipelines.

Data Pipeline:

The Snowplow Data Pipeline enables the collection of your behavioral data across multiple digital touchpoints and applications, including but not limited to:

  • Web applications
  • Mobile applications
  • Desktop applications
  • Server applications
  • Smart TV applications

Your data is processed by Snowplow in real time. Data processing steps include, but are not limited to:

  • Validating the data against schemas: objects that define the structure of the data you collect, including the fields recorded with each event and the validation criteria for each of those fields (an illustrative example follows this list).
  • Enriching the data with first-party and third-party data sets
  • Obfuscating data fields (e.g., for data protection)
  • Transforming the data into a format optimized for loading into downstream data destinations
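
To make the schema concept concrete, below is a minimal sketch of a self-describing schema, written here as a TypeScript constant. The vendor (“com.example”), event name (“button_click”), and fields are hypothetical illustrations, not part of any shipped Snowplow data product.

    // A minimal, illustrative self-describing schema. The vendor, name,
    // and fields are hypothetical; real schemas are authored and
    // versioned within your Snowplow Workspace.
    const buttonClickSchema = {
      $schema:
        'http://iglucentral.com/schemas/com.snowplowanalytics.self-desc/schema/jsonschema/1-0-0#',
      self: {
        vendor: 'com.example',
        name: 'button_click',
        format: 'jsonschema',
        version: '1-0-0',
      },
      type: 'object',
      properties: {
        // Each field carries its own validation criteria.
        label: { type: 'string', maxLength: 128 },
        value: { type: ['number', 'null'] },
      },
      required: ['label'],
      additionalProperties: false,
    };

Events that fail validation against their schema surface as failures that can be reviewed and alerted on, as described under Data Management below.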

The data is delivered into downstream destinations activated by you. These cloud data destinations can include:

  • Data warehouses and lakehouses (e.g., Snowflake, Databricks, GCP BigQuery, AWS Redshift, AWS S3, Azure Fabric)
  • Streaming technologies (e.g., Kafka, GCP Pub/Sub, AWS Kinesis, Azure Event Hubs)

Distribution: Cloud or Private Managed Cloud (“PMC”)

Snowplow BDP is distributed via two models: Cloud and Private Managed Cloud.

If you choose Cloud distribution, all your customer behavioral data is processed on Snowplow’s own infrastructure, in our own cloud environment, before being delivered to your selected cloud data destination (e.g., data warehouse or lakehouse).

If you choose the Private Managed Cloud (PMC) distribution, your customer behavioral data is processed end-to-end in your own cloud infrastructure (on your AWS, GCP, or Azure Cloud accounts).

Outage Protection (only available for PMC customers who use AWS)

The Outage Protection product protects you against data loss in the event of a region-wide AWS outage. If such an outage occurs, this service redirects your data processing to a secondary region selected by you until the outage is over and your data can be redirected to your original region. While we cannot guarantee that data loss will be completely eliminated, Outage Protection will minimize it as far as practicable.

Infrastructure and Security (PMC only)

The following Infrastructure and Security features can be added by PMC customers to their Data Pipeline:

Each feature is available under the “High” or “Advanced” Infrastructure and Security tier (see the note below):

  • HTTP Access Controls: All HTTP (i.e., non-encrypted) traffic to internet-facing load balancers deployed as part of Snowplow BDP can be disabled.
  • VPC Peering (AWS and GCP only): As part of the Snowplow pipeline setup, a Virtual Private Cloud (VPC) housing the pipeline is created in your cloud account. If you wish to enable VPC peering between an existing VPC you own and the new Snowplow VPC, you can choose the CIDR/IP range used in the Snowplow-created VPC so that peering is possible.
  • SSH Access Controls (AWS only): To comply with your internal security policies, Snowplow’s SSH access to the environment can be disabled.
  • CVE Reporting (AWS and GCP only): Provides a periodic report on Common Vulnerabilities and Exposures (CVEs) identified in any relevant software component, as well as regular patching of the same.
  • Custom IAM Policy (AWS only): Agent installation on EC2 nodes can require extra IAM permissions (e.g., for the SSM agent) to function correctly. The IAM policies attached to EC2 servers can be extended with a customer-defined policy if needed.
  • Custom VPC Integration (AWS only): As part of a Private Managed Cloud deployment, Snowplow deploys a VPC for all other Snowplow infrastructure to be deployed within. If you require Snowplow to set up pipelines and other Snowplow infrastructure in a pre-existing VPC (rather than creating one from scratch), select this option. The VPC must allow Snowplow access to the internet via a directly connected Internet Gateway (IGW) and must have sufficient NACL rules for the deployment to function as expected; it must be signed off by the Snowplow team prior to deployment.
  • Custom Security Agents (AWS and GCP only): On AWS, a customer’s custom security agents may be installed on all EC2 servers deployed as part of the service via an S3 object made available by the customer, and on all EKS clusters deployed as part of the service via a Helm chart. On GCP, custom security agents may be installed on all GKE clusters deployed as part of the service via a Helm chart.
  • Custom EKS AMIs (AWS only): Provision of a custom hardened AMI (machine image) for use in EKS node pools instead of the standard AWS images.

Note: the features that apply to you depend on the cloud infrastructure vendor you are using. For example, the Custom EKS AMIs feature described above applies only if you use AWS and are a PMC customer with a subscription to the “Advanced” Infrastructure and Security tier.

Event Forwarding

Event Forwarding empowers organizations to seamlessly deliver enriched behavioral data to their preferred downstream destinations in real time. With support for platforms like Google Server-Side Tag Manager, Braze, and Amplitude, Event Forwarding ensures data flows efficiently and accurately into the business applications where it’s most valuable. This feature enables businesses to deliver their data instantly, driving use cases such as real-time personalization, customer engagement, and analytics with minimal latency.

Data Management:

Snowplow provides standard access to functionality to help you manage and govern your behavioral data. This functionality is called “Event Data Management” and is further described in the table below.

You can also subscribe to our “Data Product Studio” and “Data Model Pack” products. These services provide enhanced functionality to manage and model your behavioral data, further described in the table below.

Event Data Management

Access to a library of Snowplow SDKs for collecting behavioral data in different application environments. A full list of our current Snowplow SDKs can be found in our documentation.

Tools for developers to support your setup of Snowplow’s SDKs (a brief tracking example follows this list), including:

  • Snowtype: A tool that generates type-safe, client-specific functions and methods for instrumenting Snowplow tracking, enabling developers to integrate Snowplow tracking SDKs more easily based on their data design.
  • Snowplow Micro: A tool that enables developers to easily inspect Snowplow data from a development environment and to set up automated tests that fail builds which break Snowplow tracking.
  • Snowplow browser extension: A tool to enable developers to conveniently inspect and validate web tracking via a Chrome plugin.
  • ID service: A tool to help you set your own first-party persistent cookies for tracking your users on your web domains.
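
As a brief sketch of what instrumenting one of these SDKs looks like, the example below uses Snowplow’s browser tracker; the collector endpoint, application ID, and schema reference are illustrative assumptions, not values from this document.

    // Minimal sketch using the Snowplow browser tracker.
    // The collector endpoint, appId, and schema URI are hypothetical.
    import {
      newTracker,
      trackPageView,
      trackSelfDescribingEvent,
    } from '@snowplow/browser-tracker';

    // Point a tracker instance at your collector endpoint.
    newTracker('sp1', 'https://collector.example.com', { appId: 'my-web-app' });

    // Track a standard page view event.
    trackPageView();

    // Track a custom event, validated by the pipeline against its schema.
    trackSelfDescribingEvent({
      event: {
        schema: 'iglu:com.example/button_click/jsonschema/1-0-0',
        data: { label: 'buy-now' },
      },
    });

Tools such as Snowplow Micro and the browser extension can then be used to confirm that events like these are emitted and validated as expected.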

A user interface that enables you to:

  • Instrument and configure pre-built Snowplow behavioral data sets (behavioral data products), including authoring up to 5 new custom schemas (data structures).
  • Enable and configure enrichments on the behavioral data sets.
  • View the different behavioral data products you are using Snowplow to deliver, including the data definitions and instructions on how to access and understand the data.
  • Receive alerts for any deviations (quality issues) in the data collected from those definitions. (Failures can be reviewed via a live dashboard.)

Data Product Studio

Access to functionality to help you define, extend, manage, and socialize the data generated by the Snowplow Data Pipeline. The Data Product Studio includes the following functionality to enable you to:

  • Design unlimited new behavioral data sets (behavioral data products), including defining new schemas (data structures) and event specifications (semantics).
  • Assign ownership to those data sets.
  • Provide controls on who can create and update behavioral data set definitions.
  • Generate machine-readable data contracts for those behavioral data sets (sketched after this list).
  • Report against the data set definition/design.
  • Record changes to data set definitions over time.
  • Enable users within your organization to “subscribe” to updates and receive notifications on changes to the associated definitions.
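
To illustrate what a machine-readable data contract can enable, the sketch below expresses a hypothetical contract for the “button_click” event as a TypeScript type with a runtime guard; the actual artifact format generated by the Data Product Studio may differ.

    // Hypothetical rendering of a data contract in TypeScript. The field
    // names mirror the illustrative schema shown earlier in this document.
    interface ButtonClickEvent {
      label: string;          // required by the contract
      value?: number | null;  // optional field
    }

    // A runtime guard lets producing applications verify conformance
    // before an event is sent.
    function isButtonClickEvent(input: unknown): input is ButtonClickEvent {
      if (typeof input !== 'object' || input === null) return false;
      const candidate = input as Record<string, unknown>;
      if (typeof candidate.label !== 'string') return false;
      if (
        'value' in candidate &&
        candidate.value !== null &&
        typeof candidate.value !== 'number'
      ) {
        return false;
      }
      return true;
    }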

Data Model Packs

The Digital Analytics Data Model Pack comprises data models and dashboards with visualizations for the following use cases:

  1. User and Marketing Analytics: Understand your customer engagement with digital channels.
  2. Marketing Attribution: Understand the impact of different marketing channels on conversions and traffic levels.
  3. Funnel Analytics: Understand the sequential steps users take toward a specific goal, identifying drop-off points and optimizing the user journey for higher conversions.
  4. Video and Media Analytics: Understand engagement with video, audio, and streaming content, including clicks through to conversions and advertisements.

The Data Model Pack includes data models (written using dbt open source software) that aggregate the underlying event-level data in the cloud data destination into AI- and Business Intelligence-ready tables. Example tables include a user-level table, a session-level table, and a pageview-level table. These tables directly power the graphs and charts in the example visualizations and user interfaces, and can be used and customized by you to power more sophisticated analytics and AI. The included dbt packages are:

  • Unified Digital: Understand user behavior across web and mobile apps.
  • Attribution: Attribute conversions and revenue through multiple attribution methods.
  • Media Player: Calculate aggregate play and ad statistics across video, audio, and streaming content.
  • Normalize: Filter and flatten your event data into a format more suitable for downstream applications.
  • Utils: Contains our base processing logic for all other packages.

The data models implement several data processing steps, including but not limited to:

  • Deduplicating the underlying event data
  • Stitching user identities across different platforms and channels (e.g., web and mobile)
  • Accurately calculating time spent engaging with different content items (e.g., web pages, mobile screens)
  • Sessionizing the data (see the sketch below)
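
The models themselves run as SQL in your cloud data destination; purely as a conceptual illustration of the sessionization step, here is a minimal TypeScript sketch that groups one user’s events into sessions using an inactivity window. The event shape and the 30-minute threshold are illustrative assumptions.

    // Conceptual sketch of sessionization for a single user's events:
    // a new session starts whenever the gap between consecutive events
    // exceeds the inactivity window.
    interface TrackedEvent {
      timestamp: number; // epoch milliseconds
    }

    const SESSION_GAP_MS = 30 * 60 * 1000; // illustrative 30-minute window

    function sessionize(events: TrackedEvent[]): TrackedEvent[][] {
      const sorted = [...events].sort((a, b) => a.timestamp - b.timestamp);
      const sessions: TrackedEvent[][] = [];
      let previous: number | null = null;
      for (const event of sorted) {
        // Start a new session on the first event or after an inactivity gap.
        if (previous === null || event.timestamp - previous > SESSION_GAP_MS) {
          sessions.push([]);
        }
        sessions[sessions.length - 1].push(event);
        previous = event.timestamp;
      }
      return sessions;
    }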

The data models aggregate the data in a performant, incremental fashion which may reduce your cost of data processing and increase the speed of data delivery. The data models are extendable and run in your selected cloud data destination.

The Ecommerce Analytics Data Model Pack includes the underlying dbt models (written using dbt open source software) in the Digital Analytics Data Model Pack, and adds an associated ecommerce dbt package to help you understand and optimize a digital shopping experience.

The ecommerce dbt package creates AI and Business Intelligence-ready tables describing carts, checkouts, product performance, transactions, and sessions. These tables directly power the example visualizations and user interfaces, and can be used and customized to power more sophisticated analytics and AI.

Extensions:

Snowplow Extensions introduce tools and integrations that enhance Snowplow’s core functionality, enabling organizations to seamlessly extend the value of their behavioral data. Extensions are designed to empower teams to operationalize insights, streamline workflows, and sync data by connecting Snowplow to a broader ecosystem of platforms and tools.

Reverse ETL

Reverse ETL, powered by Census, empowers data teams to operationalize their data by seamlessly syncing insights from data warehouses to business tools like CRMs, marketing platforms, and analytics tools. Designed for flexibility and precision, Reverse ETL enables organizations to create personalized customer experiences, streamline workflows, and drive better decision-making by making data actionable across teams. With robust automation, field-level controls, and support for complex data models, Reverse ETL bridges the gap between data warehouses and the tools where business happens.

Audience Hub

Audience Hub, powered by Census, enables teams to create, manage, and activate highly targeted customer segments directly from their data warehouse. With an intuitive, no-code interface, Audience Hub empowers marketing, sales, and customer success teams to craft dynamic audiences based on real-time data, behavioral insights, and business logic. By eliminating the need for complex engineering workflows, it accelerates the ability to personalize campaigns, drive engagement, and improve customer retention.