No Cookies, No Problem: Grammarly’s Ad Experiment
How Grammarly Measured Ad Impact Without Tracking 30M+ Users
TL;DR
Situation
Grammarly’s growth to 30M+ daily users relies heavily on paid ads, but traditional attribution failed to measure YouTube’s true impact, especially with third-party cookies disappearing.
Task
Use geo experimentation to find a privacy-first way to determine whether YouTube ads drive new users or merely capture sign-ups that would have happened organically.
Action
Geo Experimentation: Stopped ads in select regions while maintaining them in others to measure impact.
Data Modeling: Used Google’s TBR package with a BSTS model to estimate incremental user acquisition.
Geo Split Selection: Created balanced test/control groups using clustering and randomization.
Power Analysis: Optimized experiment duration and sample size for accuracy and minimal business disruption.
KPI Measurement: Focused on new active users instead of revenue, aligning with Grammarly’s freemium model.
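The time-based-regression step above can be sketched in a few lines. Google’s TBR tooling itself ships as an R package built on BSTS; the Python sketch below uses a plain linear regression as a stand-in for the full BSTS model, and every number (geo signup counts, period lengths, the simulated ad pause) is hypothetical:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical daily new-user counts for aggregated test and control geos.
rng = np.random.default_rng(42)
days = 90                      # 60-day pre-period + 30-day test period
control = 1000 + rng.normal(0, 20, days)
test = 0.8 * control + 150 + rng.normal(0, 20, days)
test[60:] -= 120               # simulated drop after ads are paused in test geos

pre, post = slice(0, 60), slice(60, days)

# Step 1: in the pre-period, learn how test-geo signups track control-geo signups.
model = LinearRegression().fit(control[pre].reshape(-1, 1), test[pre])

# Step 2: predict the counterfactual — what test geos would have done
# had ads kept running — from control-geo behavior in the test period.
counterfactual = model.predict(control[post].reshape(-1, 1))

# Step 3: incremental effect = observed minus counterfactual, summed.
lift = (test[post] - counterfactual).sum()
print(f"Estimated incremental signups over test period: {lift:.0f}")
```

Because ads were paused in the simulated test geos, the estimated lift comes out negative: signups lost relative to the counterfactual, which is exactly the incrementality signal the experiment is after.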
Result
YouTube ads were most effective before peak seasons but had less impact during them. When ads were paused, new sign-ups dropped immediately, showing that YouTube plays a stronger mid-funnel role than expected.
Use Cases
Marketing Attribution Improvement, Media Spend Optimization, Incrementality Measurement
Tech Stack/Framework
Google TBR (Time-Based Regression), Bayesian Structural Time Series (BSTS) Model, Clustering, Randomization
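The clustering-plus-randomization geo split in the stack above can be sketched as follows. Geo names and signup volumes are invented, and KMeans is only a stand-in for whatever clustering Grammarly actually used; the point is that randomizing within clusters of similar markets yields balanced test/control arms without hand-picking geos:

```python
import random
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical: average daily signups per geo (e.g., per region or DMA).
rng = np.random.default_rng(7)
geos = [f"geo_{i}" for i in range(20)]
signups = rng.lognormal(mean=6, sigma=1, size=len(geos))

# Step 1: cluster geos by (log) signup volume so similar markets group together.
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(
    np.log(signups).reshape(-1, 1)
)

# Step 2: within each cluster, randomly assign geos to test or control.
assignment = {}
shuffler = random.Random(0)
for cluster in np.unique(labels):
    members = [g for g, label in zip(geos, labels) if label == cluster]
    shuffler.shuffle(members)
    half = len(members) // 2
    for g in members[:half]:
        assignment[g] = "test"
    for g in members[half:]:
        assignment[g] = "control"

print(assignment)
```

Stratifying the randomization by cluster keeps each arm’s pre-period signup profile comparable, which is what makes the later counterfactual regression credible.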
Explained Further
Understanding the Attribution Challenge
Measuring marketing effectiveness is challenging, especially for ads that don’t involve direct clicks. It’s easy to track whether someone clicks on a Google search ad, but what about an ad they only view on YouTube? Traditional attribution models often struggle to assign proper credit to top-of-funnel marketing, leading to an undervaluation of video ads and TV campaigns.
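To see concretely why click-based models short-change view-only ads, consider a toy last-click attribution pass over hypothetical user journeys (all touchpoint names invented):

```python
# Hypothetical user journeys: ordered touchpoints before a sign-up.
journeys = [
    ["youtube_view", "google_search_click"],
    ["youtube_view", "direct"],
    ["google_search_click"],
    ["youtube_view", "google_search_click"],
]

def last_click_credit(journeys):
    """Last-touch attribution: the final trackable click gets 100% credit."""
    credit = {}
    for journey in journeys:
        # View-through touchpoints are invisible to click-based attribution,
        # so credit falls to the last click (or the final touch if no clicks).
        clicks = [t for t in journey if t.endswith("_click")] or [journey[-1]]
        credit[clicks[-1]] = credit.get(clicks[-1], 0) + 1
    return credit

print(last_click_credit(journeys))
# YouTube views appear in three of the four journeys yet receive zero credit —
# the undervaluation of top-of-funnel video described above.
```

This is the measurement gap geo experimentation closes: instead of tracing individual journeys, it compares aggregate sign-ups across regions with and without the ads.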