A/A Test
An experiment with no changes to the user interface, used to validate testing platform installation & metrics, and to measure noise in conversion data. (Read more here.)
Above the Fold
Content that is visible on page load, without scrolling, to most visitors.
The tragic state of affairs when you’ve run an experiment, gathered data, and can’t make a decision based on it. (Read more here.)
Bounce Rate
The percentage of visitors who leave your site without navigating past the page they landed on.
Call To Action
A step we encourage visitors to take, or a button or link representing that step.
Conversion
A valuable action taken by a website visitor – usually a purchase, signup, or form submission. You can say “conversion” when referring to clicks, engagements, and other interactions, but optimizing for them doesn’t necessarily make the business more money. (Read more here.)
Conversion Rate
A fraction or percentage representing conversions per session (or per visitor).
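The arithmetic is simple enough to sketch. A minimal example, with made-up counts and a hypothetical function name:

```python
def conversion_rate(conversions, sessions):
    """Conversions per session, expressed as a fraction."""
    return conversions / sessions

# 45 purchases out of 1,500 sessions is a 3% conversion rate.
print(conversion_rate(45, 1500))  # 0.03
```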
Conversion Rate Optimization
The practice of systematically changing your website with the hope/belief that what you’re doing will increase conversions.
CR
Abbreviation for Conversion Rate.
CTA
Abbreviation for Call To Action.
Engagement
Literally any interaction on your website. The opposite of a bounce. Feels good, but does not make you money 🙂
Funnel
A possibly useful mental model that pictures visitors to your site taking a series of sequential steps before converting. An ecommerce checkout flow is a good example of a funnel. Your site may not really have one.
Heatmaps
Visual representations of where visitors click on a page. (Read more here.)
Hero
The topmost section of a web page, excluding navigation and promotional banners. It’s a terrible place for a carousel.
Hypothesis
At best, a rigorous statement of experiment parameters and success criteria. At worst, pseudoscientific gibberish – a fancy way of saying “I wanna test this”. You may not need one.
Impact on Revenue
A combination of math and hope that seeks to communicate how much extra money a business will make as a result of an experiment. (Read more here.)
Inconclusive Result
The result of a test that fails to find any clear winners or losers. The worst result imaginable, but if you’ve reached it after testing sufficiently different experiences, it tells you that the page / element / type of change you’re testing just doesn’t matter.
Lift
The percent increase achieved by an A/B test variation. If the control experience has a 3% conversion rate and a variation has a 3.6% conversion rate, the variation has 20% lift.
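Using the numbers from the definition, the lift calculation can be sketched as follows (function name hypothetical):

```python
def lift(control_rate, variation_rate):
    """Percent increase of a variation over the control."""
    return (variation_rate - control_rate) / control_rate * 100

# Control converts at 3%, the variation at 3.6%: a 20% lift.
print(round(lift(0.03, 0.036)))  # 20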
Abbreviation for Minimum Detectable Effect.
Micro-conversion
An action on your website that is not tied directly to revenue – newsletter signups, page visits, clicks.
Minimum Detectable Effect
A measure of test sensitivity. Based on traffic, conversions, desired statistical significance, and test duration, this number tells you how much lift you’ll be able to detect with statistical methods.
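One common way to approximate this number is the two-proportion normal approximation. A sketch, not a substitute for your platform’s calculator – the defaults below assume 95% significance and 80% power, and the traffic figures are made up:

```python
from math import sqrt

def relative_mde(baseline_rate, visitors_per_arm, z_alpha=1.96, z_power=0.84):
    """Smallest relative lift detectable, via the two-proportion normal
    approximation (z_alpha: 95% significance, z_power: 80% power)."""
    p = baseline_rate
    absolute = (z_alpha + z_power) * sqrt(2 * p * (1 - p) / visitors_per_arm)
    return absolute / p

# With a 3% baseline and 10,000 visitors per arm, you can only
# detect relative lifts of roughly 23% or more.
print(round(relative_mde(0.03, 10_000), 2))  # 0.23
```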
Noise
A simple, non-statistician’s term for the fact that traffic, conversion rates, and pretty much everything else about website data tend to vary day to day, hour to hour. Noise is what causes identical variations in A/A tests to show different conversion rates. (Read more here.)
Painted Door Test
An experiment that measures interest in a feature not yet built, or a resource not yet created. Visitors are invited to take the first step toward using the nonexistent resource, and their clicks are used to validate or invalidate demand.
The Peeking Problem
Checking an experiment’s results repeatedly while it runs, and stopping as soon as they look significant. Doing so greatly inflates the odds of declaring a winner that is really just noise.
Personalization
Serving different experiences to different visitors based on their behavior, demographics, or some other arbitrary set of rules. (Read more here.)
Session Recordings
Video playback of actual site visitors’ clicks, scrolls, and navigation. (Read more here.)
Statistical Significance
Loosely, the odds that a given test’s results are not due to random noise. 95% seems to be a magic number.
Stopping Rules
Rules that dictate when an experiment should be concluded, so as to avoid the peeking problem. Based on target sample size, minimum test duration, and level of statistical significance.
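A stopping rule can be as simple as a pair of thresholds. In this sketch (names and numbers hypothetical), the check deliberately ignores how significant the interim results look – that’s the point:

```python
def should_stop(visitors_per_arm, days_running,
                target_sample_size=10_000, min_days=14):
    """Conclude the test only once both the target sample size and
    the minimum duration have been reached."""
    return visitors_per_arm >= target_sample_size and days_running >= min_days

print(should_stop(12_000, days_running=7))   # False: too early, keep running
print(should_stop(12_000, days_running=15))  # True: safe to conclude
```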
User Testing
Qualitative research method that involves assigning actual humans (who hopefully fall within your target customer base) to complete tasks on your website while sharing their thought process.
Variation
A unique experience in an A/B test.
Visitor
An actual human using your website, or a fictional human reconstructed by your analytics platform based on click and page view data. (Read more here.)