Experiment with Snowplow Signals in Minutes with New Sandbox and Accelerators
Behavioral data used to be the domain of data teams: fuel for analytics, reporting, and business intelligence. But as applications become more intelligent and adaptive, that same data is now essential application infrastructure.
Software engineers building adaptive products need the same thing data teams need: high-quality behavioral data. But they need it served differently, through APIs, SDKs, and real-time computation engines that integrate with their application code, not batch pipelines that update overnight.
Traditional personalization platforms abstract away your data. Feature stores require heavy orchestration. Engineers need to ship features in weeks, instead of spending months building infrastructure.
Today we're releasing two Solution Accelerators that make it easy to explore what’s possible with Snowplow Signals, plus Signals Sandbox to try them without infrastructure setup. Each accelerator provides open-source reference code, schema definitions, and a clear architectural pattern for production-grade, agentic applications.
Real-Time Personalization for Digital Travel Bookings
This accelerator simulates a travel platform that adapts to user intent in real time, powering dynamic offers and AI-driven recommendations as customers browse destinations.
Architecture Overview:
- Event Collection: Snowplow trackers stream user events (page views, dwell time, destination interactions) into Signals
- Feature Computation: Signals computes contextual features like recent_destination_views, avg_dwell_time_dest, and luxury_interest_score
- Profile Update: Each session updates the user's profile in real time, combining live behavioral signals with historical attributes such as loyalty tier and booking history
- Agentic Context: An OpenAI-powered agent queries the Profiles API to personalize responses on the fly:
from openai import OpenAI

openai_client = OpenAI()

# Fetch the user's live profile from the Signals Profiles API
profile = profiles_api.get(user_id)

prompt = f"""
The user has viewed {profile["recent_destination_views"]} luxury destinations
and has a loyalty_tier of {profile["loyalty_tier"]}.
Suggest an upgrade or high-value recommendation.
"""

ai_response = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}]
)
- Triggered Intervention: Signals executes a rule-based Intervention (e.g., display premium offer, open chat prompt) in response to the agent's output
This accelerator shows how Snowplow Signals acts as the context engine for intelligent agents, providing live state awareness, historical understanding, and event-driven activation in one seamless loop.
ML-Based Prospect Scoring
This accelerator demonstrates how to integrate machine learning models directly into the Signals workflow to drive real-time scoring and engagement triggers—perfect for growth and product teams.
Architecture Overview:
- Model Training (Batch): A predictive model is trained on historical behavioral data (e.g., in Databricks or Snowflake) and outputs a base_prospect_score
- Streaming Updates: Incoming events feed live features such as feature_usage_10min and trial_activity_rate
- Profile Merge: Signals combines both data sets in real time, recalculating a live_prospect_score as behaviors change
- Trigger Action: When the score crosses a threshold, an Intervention is fired via webhook or SDK:
if (liveProspectScore > 0.8) {
interventions.trigger({
id: "notify_sales_team",
payload: { userId, liveProspectScore },
});
}
This accelerator demonstrates how Snowplow Signals unifies batch ML outputs and streaming feature pipelines, eliminating the need for a separate feature store or heavy data orchestration.
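The Profile Merge step above can be sketched as a simple combination of the batch score with live features. The weighting scheme below is an assumption made for illustration; in the accelerator, Signals performs this recalculation as behaviors change:

```python
# Illustrative sketch of the Profile Merge step: combine a batch-trained
# base_prospect_score with streaming features into a live score.
# The weights (0.02, 0.1) are assumptions for demonstration only.

def live_prospect_score(base_score: float, feature_usage_10min: int,
                        trial_activity_rate: float) -> float:
    # Boost the batch score with recent behavioral signals, capped at 1.0
    live_boost = 0.02 * feature_usage_10min + 0.1 * trial_activity_rate
    return min(1.0, base_score + live_boost)

score = live_prospect_score(base_score=0.65, feature_usage_10min=8,
                            trial_activity_rate=0.9)
if score > 0.8:
    print("trigger notify_sales_team")  # mirrors the webhook rule above
```

The key design point is that the batch model output is just another profile attribute, so recomputing the live score requires no separate feature store.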
Try These Patterns in the Signals Sandbox
Developers shouldn't need a distributed systems team just to experiment with real-time context.
The Signals Sandbox provides a ready-to-use environment for testing how behavioral data becomes real-time intelligence. In the Sandbox, you can:
- Spin up a Signals environment in minutes with no Kafka, Flink, or Redis setup required
- Stream sample data from a live e-commerce demo site
- Watch features compute and profiles update in real time
- Write and test Interventions with Python scripts
- Connect to AI models like OpenAI or AWS Bedrock to experiment with agentic behaviors
Go from event stream → computed feature → in-session action, all without setting up infrastructure.
Building With Context
Signals is the connective tissue between raw event data, intelligent models, and real-time decisioning, and it's designed for the engineers building tomorrow's AI-native products.
Whether you're a data engineer managing pipelines or a software engineer building adaptive UIs, you get the same foundation—just different tools for different jobs.
Start building today:
- Try the Signals Sandbox
- Complete the E-Commerce Interventions Tutorial
- Explore the Solution Accelerators
- Get a Demo of the Signals Console: Talk with our experts to see how to deploy Signals in your own cloud, connect it to your stack, and customize workflows to your products
Build faster. Build smarter. Build with context.