Jobs: an improved interface for monitoring warehouse jobs
Modeling your raw data is a critical step in your data stack: it transforms and aggregates your immutable, raw events into tables that generate insights for your business. Snowplow provides this data modeling functionality through its open source offering, SQL Runner.
Snowplow BDP customers can monitor their data modeling jobs, and other SQL jobs running on their warehouse, in the Snowplow BDP Console. We are pleased to present an overhaul of this important monitoring feature.
This release brings a number of significant improvements and benefits to the Jobs interface in Snowplow BDP Console:
Visual and functional timeline improvements
- Each warehouse job now appears on a separate track, with sub-tracks displaying successful, skipped, and failed job runs. This makes the timeline much more intuitive to read and makes it easier to see what needs attention.
- Warehouse jobs you aren’t interested in can be collapsed to reduce the clutter and make it easier to focus on the jobs that you do care about.
- Four different zoom levels, so you can see runs across the whole day or zoom into a single hour.
- Auto-refresh so Console keeps your monitoring view up-to-date in the background.
A new way to navigate warehouse jobs
- You can now view a tabular summary of the last 24 hours of activity, making it easy to spot recently failed runs, as well as to see total runs and the average time each job takes to run.
- From this view you can dive deeper into all runs of a particular job, and then further into the details and logs of a specific job run.
An API to consume the information programmatically
- The underlying information that powers the timeline and tables is available via the Snowplow API layer, meaning you can consume it into other tools or applications.
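As a rough illustration of what you might build on top of that API, here is a minimal Python sketch that aggregates job-run records into the kind of 24-hour summary the Console table shows (total runs, failures, average run time). The JSON shape used here is a made-up assumption, not the actual API payload; consult the Snowplow API documentation for the real endpoints and response structure.

```python
# Hypothetical sketch: aggregate job-run records (as they might be
# returned by the Snowplow API layer) into a per-job summary.
# The record shape below is an assumption for illustration only.
from collections import defaultdict

def summarize_runs(runs):
    """Roll raw job runs up into total runs, failed runs, and
    average duration per job, similar to the Console's 24-hour table."""
    stats = defaultdict(lambda: {"total": 0, "failed": 0, "duration_sum": 0.0})
    for run in runs:
        s = stats[run["job"]]
        s["total"] += 1
        if run["status"] == "failed":
            s["failed"] += 1
        s["duration_sum"] += run["duration_s"]
    return {
        job: {
            "total_runs": s["total"],
            "failed_runs": s["failed"],
            "avg_duration_s": s["duration_sum"] / s["total"],
        }
        for job, s in stats.items()
    }

# Example with made-up runs for a single modeling job:
runs = [
    {"job": "web-model", "status": "success", "duration_s": 120.0},
    {"job": "web-model", "status": "failed",  "duration_s": 30.0},
    {"job": "web-model", "status": "success", "duration_s": 150.0},
]
summary = summarize_runs(runs)
print(summary["web-model"])
# → {'total_runs': 3, 'failed_runs': 1, 'avg_duration_s': 100.0}
```

From a summary like this you could drive alerting, dashboards, or reports in whatever tooling your team already uses.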
All Snowplow BDP customers will automatically receive this upgrade.
Not a Snowplow BDP customer yet? Get in touch with us to learn more.