Webinar: Common Mistakes in A/B Testing


A/B-testing Mistakes & Quick Fixes

Your hosts

CRO Expert & A/B-testing Ninja

Partner Manager, Benelux & Nordics

No. 1 Website Optimization Platform

Delivering the best customer experiences at every touchpoint, on the web and in mobile apps

No. 1 in Scandinavia in online conversion rate optimization

The only 2-Star Solution Partner in Scandinavia!

Agenda

● Brief overview of A/B-testing
● Common A/B-testing mistakes
● Some customer cases
● Summary
● Q&A

Brief overview of A/B-testing

What is A/B-testing?

➔ An optimization method that involves testing different versions of a web page (or app)

➔ The variations are identical except for a few things that might affect a user's behavior

➔ Statistical calculations are made to verify that the effect is not a coincidence

Here’s how it works

An A/B-testing tool in a nutshell: three primary things

● Visitors are randomly selected to see different variations (cookies are stored)

● Keeping track of your KPIs

● Downloading content from the cloud or redirecting the visitor to a different URL
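To make the first of these concrete, here is a minimal sketch of deterministic, cookie-friendly bucketing; the function name, cookie name, and hashing scheme are illustrative assumptions, not Optimizely's actual implementation:

```python
import hashlib

# Hypothetical sketch: sticky bucketing for one experiment. Hashing the
# visitor id means the same visitor always lands in the same variation,
# which is exactly what the stored cookie preserves across visits.
def assign_variation(visitor_id: str, experiment: str,
                     variations=("original", "variation_1")) -> str:
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variations)  # uniform split across variations
    return variations[bucket]

# A real tool would set a cookie such as "exp_signup=variation_1" on the
# first visit so later page views skip the computation.
print(assign_variation("visitor-123", "signup-test"))
```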

Why should you test?

➔ To learn more about the visitor’s behaviour in order to formulate new hypotheses

➔ We want to achieve our online goals, e.g. increased sales or more leads

10 common mistakes

You test everything

Put your findings into buckets:

Testing area: If there is an obvious opportunity to shift behaviour, expose insight or increase the number of conversions

Just Do It (JFDI): Issues where a fix is easy to identify or the change is a no-brainer

Explore: You need more information to triangulate the problem. If an item is in this bucket, you need to do further digging and gather more data points

(red = not suitable for testing)

No (analytics) integration

● Troubleshooting tests
● (Segmenting results)
● Tests that “flip”
● Tests that don’t make any sense
● Broken tests
● What drives the difference

Best-in-class Integrations

Your test will finish in 100 years!

★ Use a test duration calculator
★ https://www.optimizely.com/resources/sample-size-calculator
★ http://apps.conversionista.se/visual-test-duration-calculator/
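For a rough sense of what those calculators do, here is a minimal fixed-horizon sample size sketch, assuming the standard two-proportion power calculation with defaults of 5% significance and 80% power:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variation(baseline_rate, mde_relative,
                              alpha=0.05, power=0.80):
    """Visitors needed per variation to detect a relative lift with a
    two-sided two-proportion z-test (classic fixed-horizon statistics)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + mde_relative)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Example: 3% baseline conversion, hoping to detect a 10% relative lift.
print(sample_size_per_variation(0.03, 0.10))  # roughly 53,000 per variation
```

With low baseline rates and small lifts, the required sample grows fast, which is exactly why low-traffic pages produce tests that "finish in 100 years".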

You draw conclusions based on an ongoing test

Optimizely’s Stats Engine
● A new way of measuring significance in a dynamic environment

Results
● Make a decision as soon as you see significant results
● Test many goals and variations accurately at the same time
● No extra work for experimenters

Traditional statistics vs. Stats Engine:

➔ Percent of tests with winners or losers declared: 36% vs. 22%
➔ Percent of tests with a changed significance declaration: 37% vs. 4%
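Why peeking hurts: under traditional fixed-horizon statistics, every extra look at a running test is another chance for a false positive. A small A/A simulation (an illustration only, not Optimizely's Stats Engine; numpy and all parameters are assumptions) shows the inflation:

```python
import numpy as np

rng = np.random.default_rng(42)
n_tests = 1000        # simulated A/A tests: both arms truly identical
n_per_arm = 10_000    # planned visitors per arm
peeks = np.linspace(1_000, n_per_arm, 10, dtype=int)  # 10 interim looks

def z_significant(a, b, n):
    # Two-proportion z-test on the first n visitors of each arm.
    pooled = (a[:n].sum() + b[:n].sum()) / (2 * n)
    se = np.sqrt(2 * pooled * (1 - pooled) / n)
    return se > 0 and abs(a[:n].mean() - b[:n].mean()) / se > 1.96

fp_final = fp_peeking = 0
for _ in range(n_tests):
    a = rng.random(n_per_arm) < 0.05   # 5% conversion in both arms
    b = rng.random(n_per_arm) < 0.05
    fp_final += z_significant(a, b, n_per_arm)                 # one planned look
    fp_peeking += any(z_significant(a, b, n) for n in peeks)   # stop at any peek

print(f"False positives, one final look:   {fp_final / n_tests:.1%}")    # near 5%
print(f"False positives, peeking 10 times: {fp_peeking / n_tests:.1%}")  # well above 5%
```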

Your hypothesis is crap

Use input from:
● Segmenting
● Customer service
● Session replay
● Eyetracking
● User testing
● Form analytics
● Search analysis
● A/B-testing
● Web analysis
● Competitors
● Customer contacts
● Surveys

Solution: Question your ideas

http://dah.la/hypothesis-creator

Read the blog post about how to use the formula: https://conversionista.se/ab-test-hypoteser/

IAR

Magine TV: Internet TV Streaming Service

The challenge: More leads without changing the sign-up

The landing page

Scroll map analysis: Generates a map based on where the visitors of your website click or scroll

The analysis with Google Analytics

In the funnel visualization reports we found a bigger drop-off between the signup page and the thank-you page than between the landing page and the signup page

The Hypothesis
Since we have observed that [we have a big drop-off between the signup and the thank-you page], by [analyzing the data in Google Analytics and Crazy Egg], we want to [move up the “Instructions”], which should lead to [more people signing up]. The effect will be measured by [the number of people signing up].

http://dah.la/hypothesis-creator

The hypothesis formula
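As a purely illustrative way to see the formula's five blanks, here is a tiny hypothetical helper that just restates the template above:

```python
# Illustrative only: the five blanks of the hypothesis formula as a template.
def hypothesis(observation, data_source, change, expected_effect, metric):
    return (f"Since we have observed that [{observation}], "
            f"by [{data_source}], we want to [{change}], "
            f"which should lead to [{expected_effect}]. "
            f"The effect will be measured by [{metric}].")

print(hypothesis(
    "we have a big drop-off between the signup and the thank-you page",
    "analyzing the data in Google Analytics and Crazy Egg",
    'move up the "Instructions"',
    "more people signing up",
    "the number of people signing up",
))
```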

The test: Original vs. Variation

KEY CHANGES: Move the instructions to the top of the page

Variation 1 outperforms the Original

[Funnel diagram: micro conversion goals leading up to the macro conversion goal]

Your tests are not prioritized

[Prioritization matrix: Opportunity (low to high) on one axis, Effort (low to high) on the other]

Opportunity factors to take into consideration:

➔ Potential
➔ Scale
➔ Goal

Effort factors to take into consideration:

➔ Complexity
➔ Resources
➔ Decisions
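A backlog can then be ranked by netting these factors against each other; here is a toy sketch where the 1-to-5 scores and the equal weighting are assumptions for illustration, not a prescribed model:

```python
# Toy prioritization: higher opportunity and lower effort float to the top.
ideas = [
    # (name, potential, scale, goal, complexity, resources, decisions)
    ("Move instructions above the fold", 4, 5, 5, 1, 1, 2),
    ("Rebuild the whole checkout",       5, 5, 5, 5, 5, 5),
    ("New button colour on a dead page", 1, 1, 2, 1, 1, 1),
]

def priority(idea):
    _, potential, scale, goal, complexity, resources, decisions = idea
    opportunity = potential + scale + goal      # how much there is to gain
    effort = complexity + resources + decisions  # how hard it is to run
    return opportunity - effort                 # simple net score

for idea in sorted(ideas, key=priority, reverse=True):
    print(f"{priority(idea):+3d}  {idea[0]}")
```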

Spotify: Premium Trial Flow

Original

● Premium Trial page (US)
● High drop-off
● Asked to provide credit card details to start the premium trial

Input

● User testing
● Short survey →
○ Data shows that the primary reason not to start the premium trial is that users do not want to give away their credit card details

Test Hypothesis

“Since we have observed through DATA ANALYSIS that a large share of those who leave the premium flow (in the data) do so because they DO NOT WANT TO GIVE AWAY their payment details, we will tell them WHY they have to provide them, which will lead to more people doing so. Something we will measure in the number of purchases.”

http://dah.la/hypothesis-creator

The hypothesis formula

Hypothesis: “Give the user a reason...”

B: “We only use this to verify your account, you won't be charged anything for your trial”

C: “We need this because our music deals only allow free trials for users that are credit card or PayPal holders”

D: “We need this just in case you decide to stay Premium after your free month”

Test Results

A. Original
B. “Verify your account…”
C. “Because of our music...”
D. “If you want to continue...”

[Results chart: conversions per variation at the credit-card page and the thank-you page]

You run a “bad” test

Swedoffice: B2B E-Commerce Site

Original

Solution

Test Results: No difference between the variations

A/B-test (1)

Original vs. Variation

Why?!

Retake

A/B-test (2)

Conversions +6%
Revenue per Visitor +10%

Original vs. Variation

You don’t isolate the variations and end up with no change

Different traffic sources not taken into consideration

Maximize ROI on your PPC investment

Optimizely: How Optimizely Maximized ROI on their PPC Investment

Google Keyword Insertion

Creating Symmetry

Original

Variation
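One common way to create that symmetry is to echo the ad's inserted keyword in the landing page headline, the same idea as {KeyWord:Default} insertion in the ad itself. A hedged sketch follows; the `kw` query parameter name and the headline copy are assumptions for illustration, not Optimizely's feature:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical sketch: mirror the ad's keyword in the landing page headline,
# falling back to default copy when the parameter is missing.
def headline_for(url: str, default: str = "Website Optimization Made Easy") -> str:
    params = parse_qs(urlparse(url).query)
    keyword = params.get("kw", [None])[0]   # "kw" is an assumed parameter name
    return f"{keyword.title()} Made Easy" if keyword else default

print(headline_for("https://example.com/landing?kw=split%20testing"))
print(headline_for("https://example.com/landing"))
```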


Results

➔ 39% increase in # of sales leads
➔ Bounce rate decreased
➔ Google Quality Score went up
➔ Cost per lead went down

Do not take unnecessary risks: be aware of bugs

- Make sure not to direct all traffic to a “broken” or badly performing variation
- Preview your variations in cross-browser tests
- Use phased rollouts to avoid dissatisfaction (a sketch follows below)
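A phased rollout can be as simple as widening a hash-based traffic gate over time. This is a minimal illustration with assumed percentages and names, not the Optimizely feature itself:

```python
import hashlib

# Illustrative phased rollout: only visitors whose hash falls below the
# current rollout percentage see the new variation; everyone else sees
# the original. Raising the percentage day by day widens the exposure.
def in_rollout(visitor_id: str, feature: str, percent: int) -> bool:
    digest = hashlib.md5(f"{feature}:{visitor_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < percent

# Day 1: 5% of traffic; if no bugs surface, move to 25%, 50%, then 100%.
for pct in (5, 25, 50, 100):
    exposed = sum(in_rollout(f"v{i}", "new-checkout", pct) for i in range(10_000))
    print(f"{pct:>3}% target -> {exposed / 100:.1f}% actually exposed")
```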


Phased Rollouts: The Sad Story...

Using code blocks to be flexible

Phased Rollouts: The Happy Story...

Inkcards’ challenge


Summary: Common Testing Mistakes

➔ You test everything on your site
➔ No integrations
➔ Your test will finish in 100 years
➔ You draw conclusions based on an ongoing test
➔ You put in too little effort on your hypothesis
➔ Your test isn’t prioritized
➔ You don’t learn anything
➔ You change everything at once
➔ You don’t account for different traffic sources
➔ Be aware of bugs

Key takeaways

1. The only bad test is the one where you don’t learn anything

2. Expect the unexpected

3. Only test where you can trigger a behaviour change, where decisions are made

4. Formulate your test hypothesis WELL !important

REMEMBER & DON’T FORGET

Q&A

Thanks!

CRO Expert & A/B-testing Ninja

Partner Manager, Benelux & Nordics

conversionista.se optimizely.com