
September Release Roundup

By Rob Edwards
October 9, 2023

Welcome back to our monthly Release Roundup. In this post we cover all of our September releases, including Snowplow ID Service support, the Snowplow Lake Loader, and an upgrade to the Snowplow Inspector.

✨ Top features this month

Avoid the impact of ITP on cookie expiration with Snowplow’s ID Service

Safari 16.4, released in April 2023, caps the lifetime of server-set first-party cookies at a maximum of seven days.

These changes can have a significant impact, limiting the ability to identify returning users and impacting many downstream applications such as marketing attribution and product analytics, to name a few.

To extend cookie lifetime, Snowplow has introduced support for an "ID Service" alongside our latest JavaScript Tracker, version 3.15. The ID Service enables the generation of a unique browser identifier that enhances Snowplow's tracking capabilities in browsers where ITP is enabled.

Deploying an ID Service alongside your tracker extends cookie lifespan to up to two years, bypassing the latest ITP changes and giving you greater insight into your customers' behavior.

Read our full blog post on the release here, or visit our documentation.

Load to new destinations with Snowplow Lake Loader

In August, we introduced the ability to run Snowplow on Azure using our Open Source Quick Start Guide to collect data and load it into Snowflake.

This month we are expanding our support for Azure with additional destinations using our new Snowplow Lake Loader, including:

  • Databricks
  • Azure Synapse Analytics
  • Azure Fabric and OneLake

The new Snowplow Lake Loader streams data to your data lake using open table formats like Delta. As with other Snowplow loaders, the Lake Loader automatically manages schema changes as you design and evolve your custom events.

The new Lake Loader offers an exciting vision, with planned support for a variety of clouds (Azure, AWS, GCP) and open table formats (Delta, Apache Iceberg, Apache Hudi). These combinations enable new destinations for Snowplow data (e.g. ClickHouse, via S3 and Apache Iceberg) and provide an alternative way to load into existing destinations (e.g. Snowflake, Databricks, BigQuery).

Our current version already supports some of these combinations, including:

  • Azure + Delta: compatible with Synapse Analytics, Databricks, etc.
  • GCP + Delta: compatible with Databricks

As with all of our releases, we encourage you to share your thoughts and feedback on our Discourse to help shape future development.

If you are a BDP Enterprise customer and interested in Azure, join our waiting list here.

_________________________________________________________________________

🆕 Other notable releases and updates

A refreshed debugging experience with our upgraded Snowplow Inspector

Snowplow Inspector, our in-browser debugging tool, underwent a major overhaul in September. In addition to a new design, you can now log into the Snowplow Console via the extension, enabling you to:

  • Import Iglu/mini registries into the extension for schema detection and validation, automatically creating Iglu credentials as required
  • Send events to pipelines defined in the Console (recognized via the configured domain names), including enrichment configurations for that pipeline
  • Import Tracking Scenarios into the extension as Test Suites
  • Support newer Bad Rows formats via the Import > Bad Rows feature

Want to find out more? See our GitHub post for additional details, or download the extension for yourself.

_________________________________________________________________________

🔧 Fixes and performance improvements

Tracker updates

dbt Packages

Core Pipeline
