KPI Standardisation and Kimball Warehouse
Eliminated conflicting metric definitions across five teams at 500K+ daily order scale.
Blinkit was processing 500,000+ daily orders across a hyperlocal quick commerce network. At that scale, even small discrepancies in how key metrics were calculated across teams — retained margin, order counts, fulfilment rates — could lead to significant misalignment in decisions. Multiple sources of truth existed across the organisation, each producing subtly different numbers.
The objective was to establish a single, authoritative source of truth for critical business metrics, backed by a warehouse design robust enough to guarantee consistency across every consumer of that data at scale.
Before this engagement, Blinkit's analytics layer was a stack of denormalised flat tables layered on top of other denormalised flat tables, a legacy architecture that had grown organically and become nearly impossible to audit, correct, or extend. Metric discrepancies were untraceable because the spaghetti structure left no authoritative definition of how any number was calculated.
The Kimball approach was chosen specifically to eliminate that ambiguity: dimensional modelling produces fact tables at a defined grain with conformed dimensions, making it structurally harder to generate conflicting numbers from the same underlying data. The migration to incremental dbt models also addressed a severe operational problem: every model was being dropped and fully rebuilt each day, and that ETL load had grown to the point of being unsustainable.
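A minimal sketch of how those two ideas combine in a dbt model, assuming a generic order-item fact table; every model, table, and column name below is hypothetical rather than Blinkit's actual schema:

```sql
-- models/marts/fct_order_items.sql
-- Illustrative only: model, table, and column names are hypothetical.
{{
    config(
        materialized='incremental',
        unique_key='order_item_id'
    )
}}

select
    oi.order_item_id,                  -- grain: exactly one row per order item
    o.order_id,
    o.customer_id,                     -- key into conformed dim_customer
    o.store_id,                        -- key into conformed dim_store
    oi.product_id,                     -- key into conformed dim_product
    o.ordered_at,
    oi.quantity,
    oi.selling_price,
    oi.quantity * oi.selling_price as gross_revenue
from {{ ref('stg_order_items') }} as oi
join {{ ref('stg_orders') }} as o
    on oi.order_id = o.order_id

{% if is_incremental() %}
-- On incremental runs, load only orders newer than what is already in the table,
-- instead of dropping and rebuilding the full history every day.
where o.ordered_at > (select max(ordered_at) from {{ this }})
{% endif %}
```

Because every row sits at a single declared grain and joins to the same conformed dimensions used by every other fact table, any metric built on top of it is forced through one shared definition of an order, a customer, or a store.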
1. Ran workshops across finance, product, and growth to surface every competing definition of each key metric and trace discrepancies to their root cause in the legacy model structure.
2. Agreed canonical metric definitions with cross-functional sign-off: retained margin, order count, fulfilment rate, and all other critical KPIs formalised in a metric glossary.
3. Designed the Kimball-style dimensional warehouse: fact tables at the correct grain, conformed dimensions shared across business functions, and a clear raw-to-modelled layer separation replacing the legacy denormalised structure.
4. Migrated all models to dbt with full documentation, schema tests, and freshness checks; every model carries inline documentation and a defined owner (a reconciliation test of this kind is sketched after this list).
5. Converted all models from full daily rebuilds to incremental loads, substantially reducing ETL overhead and enabling the platform to scale with data volume.
6. Established the metric governance process: new metrics now require a documented definition and cross-functional sign-off before being built into the warehouse.
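As a flavour of the checks involved, a singular dbt test can assert that a modelled total reconciles with the staging layer it was built from; this is a minimal sketch, with the model names, column names, and tolerance all illustrative rather than taken from the actual project:

```sql
-- tests/assert_gross_revenue_reconciles.sql
-- Illustrative singular dbt test; dbt treats any returned rows as a failure,
-- so this test fails whenever the modelled fact table drifts from staging.
with modelled as (
    select sum(gross_revenue) as total
    from {{ ref('fct_order_items') }}
),
staged as (
    select sum(quantity * selling_price) as total
    from {{ ref('stg_order_items') }}
)
select
    modelled.total as modelled_total,
    staged.total as staged_total
from modelled
cross join staged
where abs(modelled.total - staged.total) > 0.01
```

Generic tests on the grain key (unique, not_null) and source freshness thresholds sit alongside checks like this in the models' YAML configuration.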
- 0% error rate on critical business KPIs including retained margin, sustained across a platform processing 500,000+ daily orders
- Single source of truth adopted across all business functions — finance, product, and growth operating from identical numbers
- ETL overhead substantially reduced by converting all models from full daily rebuilds to incremental loads
- Metric governance process established, preventing definition drift as the organisation scaled
Start a conversation.
Every engagement begins with a focused discussion of your current data environment and priorities. To schedule an initial consultation, reach out directly.