[CXL Live 16] Beyond Test-by-Test Results: CRO Metrics for Performance & Insight by Claire Vo



CRO Metrics for Performance & Insight

Beyond test-by-test results: measuring testing program effectiveness

Things I know to be true:


Most CRO teams focus on and measure the outputs of testing.


(wins make us feel warm & fuzzy)


Justin Rondeau, "Busted! DigitalMarketer Calls Bull$hit on 4 Conversion Rate Optimization (CRO) Case Studies": http://www.digitalmarketer.com/conversion-rate-optimization-case-studies/

But more than anything, the success of a testing program is driven by the inputs.


How to do testing, better:


1. Increase Tests Run (Quantity)

2. Increase Tests Won (Quality)

3. Profit


How to do testing, better—really:


1. Set Goals

2. Measure Performance Regularly

3. Adjust & Iterate

We need a set of CRO performance metrics to track and improve CRO quantity and quality.


Quantity


“We run [X] tests per week.”


“We run [X] tests per week… usually.”


Instead you should know:

What to know → Metric to measure

How many tests can I run? → Testing Capacity

How many am I running? → Testing Velocity & Coverage

Am I getting any better? → Trends Over Time

When I’m not running tests, why? → Annotations


Testing Capacity (CRO Performance Metrics)

How many tests can you run per year?

Testing Capacity = (52 / test duration in weeks) × (number of simultaneously testable pages/funnels)
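
As a minimal sketch, this is the same arithmetic in Python; the function name and example inputs are illustrative, not from the talk:

```python
def testing_capacity(avg_test_duration_weeks: float,
                     simultaneous_test_slots: int) -> float:
    """Testing Capacity = (52 / test duration in weeks)
    x (number of simultaneously testable pages/funnels)."""
    return (52 / avg_test_duration_weeks) * simultaneous_test_slots

# Example: 3-week tests across 2 independently testable funnels -> ~35 tests/year
print(round(testing_capacity(3, 2)))  # 35
```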


Testing Velocity (CRO Performance Metrics)

How many experiments are run per [time period]?

[Chart: tests run per week, May through December 2015, against a goal of 8 tests per week]

• Weekly for high-traffic sites

• Monthly for low-traffic sites


Testing Velocity (CRO Performance Metrics)

How many experiments are run per [time period]?

[Chart: tests run per week, May through December 2015, with a trend line tracking progress toward the goal of 8 per week]

• Weekly for high-traffic sites

• Monthly for low-traffic sites

Bonus: track the trend toward your goal.
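
A hedged sketch of how velocity can be computed by bucketing test launch dates by ISO week; the data shape and dates here are made up for illustration:

```python
from collections import Counter
from datetime import date

# Launch dates of experiments (illustrative data only)
launches = [date(2015, 5, 26), date(2015, 5, 28), date(2015, 6, 1),
            date(2015, 6, 2), date(2015, 6, 4)]

# Bucket launches by ISO (year, week) and count tests launched per week
velocity = Counter(d.isocalendar()[:2] for d in launches)
for (year, week), n in sorted(velocity.items()):
    print(f"{year}-W{week:02d}: {n} tests launched")
```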


Make sure you’re not wasting traffic.

(use it or lose it.)


Łukasz Twardowski, "A/B Testing and the Infinite Monkey Theory": http://www.slideshare.net/useitbetter/ab-testing-and-the-infinite-monkey-theory

[Pie chart: 13% / 87% split of traffic]


Testing Coverage (CRO Performance Metrics)

What % of testable days are you running a test?

[Calendar view of July and August with tested days marked: 83% coverage]
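
A minimal sketch of the coverage calculation, assuming you have a (start, end) window for each test; the windows below are invented and only roughly echo the calendar above:

```python
from datetime import date, timedelta

def testing_coverage(test_windows, period_start, period_end):
    """Share of days in [period_start, period_end] with at least one live test.

    test_windows: list of (start_date, end_date) tuples, inclusive.
    """
    total_days = (period_end - period_start).days + 1
    days = (period_start + timedelta(n) for n in range(total_days))
    covered = sum(any(s <= d <= e for s, e in test_windows) for d in days)
    return covered / total_days

# Illustrative: two tests covering most of July and August 2015
windows = [(date(2015, 7, 1), date(2015, 7, 24)),
           (date(2015, 8, 4), date(2015, 8, 31))]
print(f"{testing_coverage(windows, date(2015, 7, 1), date(2015, 8, 31)):.0%}")
```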


Time Since Last Zero-Test Day (CRO Performance Metrics)

"It has been 100 days since we had no tests live."
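
This counter is just a date difference; a tiny illustrative sketch (the dates are made up):

```python
from datetime import date

def days_since_last_zero_test_day(last_zero_test_day: date, today: date) -> int:
    """How long the streak of 'at least one test live every day' has run."""
    return (today - last_zero_test_day).days

print(days_since_last_zero_test_day(date(2016, 1, 6), date(2016, 4, 15)))  # 100
```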

Quality


Stop evaluating the quality of your program on a test-by-test basis.

(it makes you look good, until you look bad)


You should also track:

What to know → Metric to measure

Am I running effective tests? → Win Rate, Lift Amount, Expected Value

Am I running tests effectively? → ROI

Am I getting any better? → Trends Over Time

Win Rate (CRO Performance Metrics)

What % of tests run win / lose / are inconclusive?

[Pie chart of test outcomes; the slide shows figures of 53%, 29%, 19%, and a 13% callout]


Rolling Win Rate (CRO Performance Metrics)

What % of tests run win / lose / are inconclusive over time?

[Chart: monthly test outcomes, April through November, against a goal of a 15% win rate]

Bonus: track win rate by page.

Once you hit a steady testing velocity, ideally your win rate stays level or increases.
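
A sketch of computing the overall outcome mix and a month-by-month win rate from a list of test results; the data shape, labels, and numbers are assumptions, not figures from the talk:

```python
from collections import Counter, defaultdict
from datetime import date

# Each completed test: (end date, outcome in {"win", "loss", "inconclusive"})
results = [
    (date(2015, 4, 10), "inconclusive"), (date(2015, 4, 28), "win"),
    (date(2015, 5, 6), "loss"),          (date(2015, 5, 20), "inconclusive"),
    (date(2015, 6, 15), "win"),          (date(2015, 6, 30), "loss"),
]

# Overall outcome mix (win / lose / inconclusive shares)
mix = Counter(outcome for _, outcome in results)
for outcome, n in mix.items():
    print(f"{outcome}: {n / len(results):.0%}")

# Rolling win rate by month, to trend against a goal (e.g. 15%)
by_month = defaultdict(list)
for d, outcome in results:
    by_month[(d.year, d.month)].append(outcome)
for (year, month), outcomes in sorted(by_month.items()):
    win_rate = outcomes.count("win") / len(outcomes)
    print(f"{year}-{month:02d}: win rate {win_rate:.0%}")
```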


Expected Value (CRO Performance Metrics)

The game: $1 per roll of the die; every time you roll a 3, I pay you $5.

Win rate = 1/6 ≈ 16%; value of a win = $5.

Expected value of a roll = 1/6 × $5 ≈ 83 cents, less than the $1 it costs to play.

DO NOT PLAY THIS GAME!


Expected Value (CRO Performance Metrics)

On average, what is the expected value of any test run?

Expected Value of Test = Win Rate × Average Lift of Winning Test × Revenue Value of Test


Expected Value (CRO Performance Metrics)

The game: it costs $1,500 every time you run a test; every time you win, you get a 10% increase on your revenue, which is $1,000,000, and you win 10% of the time.

Win rate = 10%; value of a win = 10% × $1,000,000 = $100,000.

Expected value of the test = 10% × $100,000 = $10,000. RUN THIS TEST FOR SURE!
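
The same formula with the slide's numbers plugged in, as a short Python sketch (the function name is mine, not from the talk):

```python
def expected_value_of_test(win_rate: float,
                           avg_lift_of_win: float,
                           revenue_value: float) -> float:
    """Expected Value of Test = Win Rate x Avg Lift of Winning Test x Revenue Value."""
    return win_rate * avg_lift_of_win * revenue_value

# Slide example: 10% win rate, 10% lift on $1,000,000 of revenue
ev = expected_value_of_test(0.10, 0.10, 1_000_000)
print(f"${ev:,.0f}")  # $10,000 -- well above the $1,500 cost per test
```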


Annual Expected Value of Program (CRO Performance Metrics)

What is the estimated potential of my testing program?

Annual Expected Value of Program = Expected Value of Test × Annual Testing Capacity


Annual Expected Value of Program (CRO Performance Metrics)

What is the estimated potential of my testing program?

                                   High Complexity / Low Velocity   Low Complexity / High Velocity
Type of tests:                     take longer, bigger changes      easier, smaller changes
Win rate:                          20%                              10%
Avg lift:                          30%                              10%
Expected value of test:            $30,000                          $10,000
Tests / year:                      25                               100
Annual expected value of program:  $1.5M                            $1M
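
And the program-level roll-up, shown here for the low-complexity / high-velocity column, whose figures multiply out cleanly; again a sketch with a hypothetical helper name:

```python
def annual_program_value(expected_value_per_test: float,
                         annual_testing_capacity: float) -> float:
    """Annual Expected Value of Program = EV per test x annual testing capacity."""
    return expected_value_per_test * annual_testing_capacity

# Low-complexity / high-velocity column from the table above
print(f"${annual_program_value(10_000, 100):,.0f}")  # $1,000,000
```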


Yes, you should trend this over time.

(and set a goal!)

Putting it all together


CRO Metrics for Performance:


1. Track quantity by measuring velocity & coverage

2. Track quality by win rate, avg. lift, and expected value

3. Set goals & measure everything over time


Questions?

claire@experimentengine.com

experimentengine.com/cxl-live-2016