When to freak out

We’ve looked at what noise is and how to measure it with an A/A test, so you know how long your experiments need to run before you trust the results.

But a much more urgent need to understand noise arises when your numbers take a sudden downturn.

Is something broken? Did you break it? Is this because of that new tag we pushed to Google Tag Manager last Thursday? OMG SHOULD WE REMOVE THAT TAG?

There are methods from Stats 100 we can use to put questions like this in perspective, but here’s a super simple way to bring a bit of context to the conversation - without typing any formulas into Excel.

Zoom out before you freak out

Take whatever metric you’re concerned about, over whatever time period you’re considering, and look at 10-20 other relevant data points - same metric, same length of time.

Say you’re on the verge of firing your new marketing manager because last week your new visitor conversion rate was 1.3% (a precipitous downturn from 2.1% the week before 📉).

Before you drop the hammer, let’s take a look at new visitor conversion rate (same metric) on a week-by-week basis (same length of time) for, say, the past 16 weeks.

Is 1.3% the lowest it’s been? If not, how many weeks saw a lower conversion rate?

If the answer is “Well, yeah, a few …”, you’re officially authorized not to freak out.

It doesn’t prove that nothing’s broken. It just tells you that this result seems to fall within the natural variance of this metric on this time scale.

(If last week’s result really was the lowest you’ve seen in 16 weeks, by all means look closer. It can also be instructive to zoom out further for perspective: when was the last time the number was this low?)
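
If you’d rather script this sanity check than eyeball a spreadsheet, here’s a minimal sketch in Python. The weekly numbers are invented for illustration; swap in your own.

```python
# "Zoom out" sanity check: where does this week's number fall among
# the past 16 weeks of the same metric? (All figures invented.)

weekly_conversion_rates = [
    2.0, 1.8, 2.3, 1.2, 1.9, 2.4, 1.1, 2.2,
    1.7, 2.0, 1.4, 2.5, 1.6, 2.2, 2.1, 1.3,  # last value = this week's 1.3%
]

this_week = weekly_conversion_rates[-1]
earlier_weeks = weekly_conversion_rates[:-1]

# How many earlier weeks were lower than the week we're worried about?
lower_weeks = sum(rate < this_week for rate in earlier_weeks)

print(f"This week: {this_week}%")
print(f"{lower_weeks} of the previous {len(earlier_weeks)} weeks were lower")

if lower_weeks > 0:
    print("Within the usual range - officially authorized not to freak out.")
else:
    print("Lowest in recent memory - worth a closer look.")
```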

Caveats

If seasonality is a significant factor in the metric you’re monitoring, it might make more sense to look at, say, the last 8 weeks + the same 8 weeks last year.
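
In code, that seasonal comparison set might look like this (again just a sketch with placeholder data, assuming you have weekly values in chronological order):

```python
# Seasonality-aware version: compare this week against the previous
# 7 weeks plus the same 8 calendar weeks a year earlier.
# `weekly_rates` is placeholder data, ordered oldest to newest.

weekly_rates = [round(1.5 + 0.1 * (i % 9), 2) for i in range(110)]

WEEKS_PER_YEAR = 52

this_week = weekly_rates[-1]
recent_weeks = weekly_rates[-8:-1]  # the 7 weeks before this one
same_weeks_last_year = weekly_rates[-8 - WEEKS_PER_YEAR:-WEEKS_PER_YEAR]

comparison_set = recent_weeks + same_weeks_last_year

lower_weeks = sum(rate < this_week for rate in comparison_set)
print(f"{lower_weeks} of {len(comparison_set)} comparison weeks were lower than this week")
```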

But beware your brilliant and creative brain’s ability to cook the books here. Don’t bring in other metrics, or start concocting fancy “three-week rolling averages with outliers discarded” comparisons. If you’re freaking out about a week’s results, look at other weeks’ results.

This is scary stuff! But being reactive to fluctuations in data should also scare you; that’s how you waste effort, undermine potentially valuable improvements to your program, and end up busily making zero progress for months at a time.

So, zoom out. Deep breaths. Your heart and your team will thank you.

