[Webinar] Visa's Journey to a Culture of Experimentation
The Journey to a Culture of Experimentation
Ramkumar Ravichandran, Visa
Nate Wright, Optimizely
A look at the defining moments that shaped Visa's experimentation program
Nate Wright, Director of Product Marketing, Optimizely
Ramkumar Ravichandran, Director of A/B Testing, Visa
Housekeeping
• We are recording this webinar
• You'll receive the recording and slides
• We'll answer all questions at the end
OUR COMPANY
• Digital Experience Optimization: digital products, commerce & campaigns
• Up to 5X Increase in Yield: revenue, share of wallet, funnel conversion, risk mitigation, ops efficiency
• Partner of Choice: works with leading global enterprises & "digital disruptors," including 26 of the Fortune 100
OUR SOLUTION
• Digital Experimentation Platform: next-gen "Test and Learn" system
• Replaces Digital Guesswork with evidence-based optimization
• Speeds Innovation & Optimization: a single platform for marketing & product teams; best-in-class stats & machine learning; consumer-grade usability; enterprise program management & professional services
"Our success is a function of how many experiments we do per year, per month, per week, per day."
"Instead of saying 'I have an idea,' what if you said 'I have a new hypothesis, let's go test it.'"
"Our company culture encourages experimentation and free flow of ideas."
"One of the things I'm most proud of, and I think what is the key to our success, is this testing framework we've built."
Experimentation is the Next Great Business Transformation
Jeff Bezos, Larry Page, Mark Zuckerberg, Satya Nadella
"The Surprising Power of Online Experiments"
• 10x more experiments: consumer-grade usability; open data integration
• Maximum yield of business value: UX and feature-level experiments and personalization at every digital touchpoint
• Enterprise-wide management & governance: captures, governs and shares ideation, analyses & results
• World's most trusted outcomes: best-in-class Stats Engine; fast time to results via ML
• Accelerates digital innovation: speeds DevOps & deployment; de-risks continuous feature delivery
• Ensures success of new features: unifying flagging & experiments enables controlled testing of new features while maintaining high performance
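Unifying flags and experiments hinges on deterministic traffic splitting: the same user must land in the same variation on every visit, without storing per-user state. The deck doesn't describe Optimizely's internal algorithm, so the following is only a generic hash-bucketing sketch of the idea (the function and experiment names are hypothetical):

```python
import hashlib

def bucket(user_id: str, experiment: str, traffic: dict[str, float]) -> str:
    """Deterministically assign a user to a variation.

    `traffic` maps variation name -> share of traffic (shares sum to 1.0).
    Hashing (experiment, user) keeps assignments stable across sessions
    and independent across experiments, with no per-user state stored.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    point = int(digest, 16) / 16**64  # uniform point in [0, 1)
    cumulative = 0.0
    for variation, share in traffic.items():
        cumulative += share
        if point < cumulative:
            return variation
    return list(traffic)[-1]  # guard against floating-point rounding

# A feature flag is then just a two-arm split gating the new code path:
variation = bucket("user-42", "new_checkout_flow", {"on": 0.5, "off": 0.5})
```

Rolling a winner out is then just a change of shares, e.g. `{"on": 1.0, "off": 0.0}`, which is why flagging and experimentation combine so naturally.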
[Platform diagram: a workflow of Ideate → Manage → Store/Govern → Analyze → Share, built on Open Data Integration, Security & Compliance, Stats Engine, Stats Accelerator, consumer-grade usability, APIs & Developer Tools, Feature Flags and open-source SDKs; cross-channel Web Experimentation, Full Stack Experimentation, Personalization and Recommendations across products, commerce & campaigns]
Optimizely X Unlocks the Experimentation Best Practices of the World's Greatest Digital Companies
We're Proud to Work With Great Global Enterprises
26 of the Fortune 100 have chosen Optimizely to drive their digital experience
[Maturity chart: business value vs. velocity/volume]
LEVEL 1: Executional Start
LEVEL 2: Foundational Growth
LEVEL 3: Cross-Functional Advancement
LEVEL 4: Operational Excellence
LEVEL 5: Culture of Experimentation
Our Products and Services Take You on Your Experimentation Journey
MARKED BY THREE BROAD ERAS: GETTING BUY-IN; FOUNDATION & TRUST BUILDING; AND GROWTH
2014: • Product launched • KPIs & goals established • Selling experimentation • Build vs. buy
2015: • Optimizely integrated • POC tests • New test pipeline • A/B testing Kanban (process, team & KPIs)
2016: • New flow launch with Learn, Listen & Test framework • Cross-functional stakeholder engagement
2017: • Educate on full potential and vision for experimentation • Evangelism and point of contact for other business units within Visa • Data-driven strategy (analytics & testing)
2018: • Workflow management and program optimization • Targeting & personalization
Pivotal decisions along the journey!
PIVOTAL DECISION 1: GETTING EXPERIMENTATION IN THE DOOR
A compelling need, USP and impact story, the right stakeholder sponsorship, and executive support got us in…

The Story
• Gap vs. KPI goals
• Time to action on actionable insights
• UX decisions that could have been answered better (vs. HiPPO, small samples, competitions)
• Technology investment, time and effort in delivering fixes that didn't move the needle
• Personalization
• Learning goals for future initiatives

Stakeholder Sponsorship
• The right stakeholders with the right need (Product Launch Management) and the right "heft"
• Positioning at the right time (after baselines and analytics)
• Clear success criteria

Executive Support
• Customer focus: UX, UX, UX
• Demands on accountability (progress and results)
• Focus on execution efficiency and optimization (Agile)
• Long-term vision
PIVOTAL DECISION 2: BUILD VS. BUY ON EXPERIMENTATION TOOL
The tool needs to support a variety of experimentation needs while remaining easy for non-technical users to learn and use, supporting workflow management, and keeping latency low…
• Supported experiment designs: Multi level, multi factor, multiple A/B
• Custom Traffic distribution: Segment filters, Universal Controls
• Type of Tests: Placement/Prominence/Messaging, Funnels, Omni Channel, Algorithms
• Test Metrics: Standard & Custom
• Implementation effort
• Supported Channels: Web, Mobile Web, Native SDK, Single Page Applications
• Pricing packages
• Programming experience
• Analysis options: Integration with Web Analytics/CXM and data export to data lakes
• Security limitations
…the key factors being whether an in-house tool can be kept current with market needs, the migration or integration cost, and the support needed (and its cost) from the engineering team.
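One item on the list above, "Universal Controls," pairs with the program KPI "KPI delta vs. universal control" later in the deck: a slice of users is held out of every experiment to provide a clean baseline for the program as a whole. The deck doesn't show an implementation, so this is a minimal, hypothetical sketch using two independent hash layers:

```python
import hashlib

def _unit(user_id: str, salt: str) -> float:
    """Map (salt, user) to a stable point in [0, 1)."""
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    return int(digest, 16) / 16**64

def assign(user_id: str, experiment: str, holdout: float = 0.05) -> str:
    """Assign a user to an experiment arm, honoring a universal control.

    Users falling in the holdout slice are excluded from all experiments;
    comparing their KPIs against everyone else's estimates the aggregate
    value of the whole testing program, not just one test.
    """
    if _unit(user_id, "universal-control") < holdout:
        return "universal_control"
    return "treatment" if _unit(user_id, experiment) < 0.5 else "control"
```

Because the holdout hash uses its own salt, membership in the universal control is independent of any individual experiment's split.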
PIVOTAL DECISION 3: SETTING UP THE RIGHT FOUNDATION (PROCESS, TEAM & PROGRAM KPIs)
We iterated our way into a working team framework and a process for selecting the right experiments, setting them up correctly and quickly, and ensuring Dev/QA/PM support, all guided by product strategy…

Team
• A/B Test Analyst (Analytics): The driver of the testing program. Involved from start to finish, up until the hand-off of a successful test to its respective product owner. An SME in the Optimizely tool; owner of test setup, deployment, and analysis.
• Product Partner: Talks to and brings in the right people for different steps of the process. Offers the product's perspective in terms of gatekeeping duties on test ideas. Well connected to different product owners and acts as the liaison toward the product team.
• QA Partner: Helps ensure that there are no bugs in the test setup, from a usability standpoint.
• Technology Partner: Offers consultation on feasibility for tests; assists in the setup of advanced tests.
• Design Partner: Helps the team germinate ideas, as well as give the team visuals to work off of in a test.

Process
Ideation → Prioritization/Grooming → Setup → QA → Deployment → Analysis → Implementation, with stages owned by combinations of Analytics, Product, Design, Tech and QA
PIVOTAL DECISION 3: SETTING UP THE RIGHT FOUNDATION (PROCESS, TEAM & PROGRAM KPIs) contd…
Apart from the business KPIs, we defined a set of internal operational KPIs for the Experimentation Program to ensure we are driving value both efficiently and effectively…
Program KPIs (Operational)
• # of tests run per month
• % successful tests
• % learning tests
• % workaround/bug-fix tests
• # of channels tested on
• Time from ideation to deployment
• Time from test outcome to product implementation
• Program RoI
• Stakeholder NPS
• KPI delta vs. universal control
…tracked in both raw and YoY growth forms
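Operational KPIs like those above stay honest only if they are computed mechanically from the test log rather than assembled by hand. As an illustration only (the record fields below are hypothetical, not Visa's actual schema), a rollup of a few of the listed metrics might look like:

```python
from dataclasses import dataclass

@dataclass
class TestRecord:
    name: str
    outcome: str         # "win", "loss", or "flat"
    purpose: str         # "growth", "learning", or "bugfix"
    days_to_deploy: int  # calendar days from ideation to deployment

def program_kpis(tests: list[TestRecord]) -> dict[str, float]:
    """Roll up a few of the operational program KPIs from raw test records."""
    n = len(tests)
    return {
        "tests_run": n,
        "pct_successful": 100 * sum(t.outcome == "win" for t in tests) / n,
        "pct_learning": 100 * sum(t.purpose == "learning" for t in tests) / n,
        "pct_bugfix": 100 * sum(t.purpose == "bugfix" for t in tests) / n,
        "avg_days_to_deploy": sum(t.days_to_deploy for t in tests) / n,
    }
```

Running the same rollup on last year's records gives the YoY growth form the slide mentions.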
PIVOTAL DECISION 4: LEARN-LISTEN-TEST FRAMEWORK FOR NEW FLOW ROLL OUT
Analytics provides insights into "user behavior," research provides context on "motivations," and testing helps verify the "tactics" in the field; everything has to be productized…
Key benefits: focus on big wins; reduced wastage; quick fixes; adaptability; assured execution; learning for future initiatives
[Framework diagram: an iterative loop spanning Strategy, Data Tagging, Data Platform, Reporting, Analytics, Research, Cognitive and Optimization]
PIVOTAL DECISION 4: LEARN-LISTEN-TEST FRAMEWORK FOR NEW FLOW ROLL OUT contd…
Iterative & quick improvement (15 percentage points) in KPI performance during the new experience launch helped us gain the trust of our stakeholders…
[Chart: KPI performance of the new experience vs. the current experience over the first 8 months since launch; iterative testing helped us improve the performance of the new product]
…doing it as a cross-functional group of PM, data scientists, engineering/QA and UX helped us demonstrate the value & impact of experimentation
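The 15-point lift above came from Visa's own dashboards; to show what "verifying the tactics in the field" means statistically, here is a minimal two-proportion z-test on made-up conversion counts. Note this is a classical fixed-horizon test, whereas Optimizely's Stats Engine uses sequential methods, so treat it purely as an illustration:

```python
from math import sqrt, erfc

def lift_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-proportion z-test for a conversion-rate lift.

    Returns (absolute_lift, two_sided_p_value) for variation B vs. A,
    using a pooled standard error under the null of equal rates.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_b - p_a, erfc(abs(z) / sqrt(2))  # two-sided p-value

# Hypothetical numbers: 10% -> 15% conversion on 1,000 users per arm
lift, p = lift_z_test(100, 1000, 150, 1000)
```

With no difference between arms the p-value is 1; a lift that survives a small p-value is what earns the stakeholder trust the slide describes.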
PIVOTAL DECISION 5: MOVING EXPERIMENTATION UP TO THE TOP OF THE PRODUCT DEVELOPMENT LIFECYCLE
Leveraging insights from experiments to prioritize new ideas, features, functionalities and forms, and making Test & Learn a standard part of the rollout process…
[Lifecycle diagram: Concept → Design → Prototype → Build → Run → Retire, annotated with "Business Case" and "What happens here?"; the flow runs from testing the waters through development to launch & sustain, then migration from the current product to the newer one]
Moving up the strategic value chain signified the arrival of "A/B Testing"
PIVOTAL DECISION 6: EDUCATING STAKEHOLDERS ON FULL POTENTIAL AND LONG TERM PROGRAM VISION
Over the past few years we have progressed along the maturity curve, but there is still a way to go. The most critical element for leveling up experimentation and continuously engaging stakeholders is showing that a lot more is possible and should be done…
[Chart: value add across the phases of maturity: SELL → SCALE → EXPAND → DEEPEN → TRANSFORM, with a "We are here" marker]
• Sell the value and get it in
• Solid foundation of team/process/KPIs
• Successful deployment of Test & Learn to drive impact
• Complex tests; data-driven design
• Personalization; champion/challenger; platform
• Algorithmic testing; test modularity & portability; monetizable product
More at: https://www.slideshare.net/RamkumarRavichandran/advancing-testing-maturity-in-your-organization
OTHER BEST PRACTICES THAT WORKED FOR US
Knowing what we are testing & how much to expect, i.e., rank ordering between visual changes (messaging, prominence, placement), page design (flow, form, CTA), platform performance, and content/personalization.
Saying no: keeping the pipeline focused on high-impact tests and leveraging alternatives for low-value tests (prototypes, usability studies, surveys) ensures that real tests don't suffer from low sample sizes or contamination.
Sharing the wins: credit where it is due. Engineers, testers and program managers are as critical to a test's success as analysis, product strategy or design. Ensure they get the credit and make it a win-win for everyone.
Communication: regular reporting of pipeline, impact and learnings helps with mindshare & engagement.
Planning ahead: intake criteria/process, prioritizing against strategic goals, pre-analyses (impact, success criteria, proxies, past learnings), multi-KPI tracking setup and decision protocols help improve effectiveness.
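"Saying no" and "planning ahead" both reduce to scoring the backlog before anything hits the pipeline. The deck doesn't name a scoring model, so this sketch uses the common ICE heuristic (impact x confidence x ease) purely as an example; the idea names and scores are invented:

```python
def ice_score(impact: int, confidence: int, ease: int) -> int:
    """ICE score: each factor rated 1-10; a higher score means run it sooner."""
    return impact * confidence * ease

def prioritize(backlog: dict[str, tuple[int, int, int]]) -> list[str]:
    """Rank test ideas by ICE score, highest first."""
    return sorted(backlog, key=lambda idea: ice_score(*backlog[idea]), reverse=True)

backlog = {
    "checkout CTA copy": (8, 7, 9),       # cheap, well understood, high traffic
    "homepage hero redesign": (9, 4, 3),  # big swing, but costly and uncertain
    "form field reduction": (7, 8, 8),
}
ranked = prioritize(backlog)
# Low scorers go to cheaper alternatives (prototypes, surveys) instead of
# consuming scarce test traffic.
```

Any consistent scoring scheme works; the point is that rank ordering happens on paper, before test traffic is spent.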
LESSONS LEARNT THE HARD WAY
Part-time involvement: not everyone on the Kanban team is fully dedicated to testing, so rotation affects continuity. Build in buffers for managing external dependencies and get the right help when needed.
Not keeping key external-facing teams in the loop: lax testing governance & lineage can severely impact brand integrity, pose legal challenges or create surprises for external-facing teams. Proactive communication is mandatory.
A platform means everything required to make it self-serve: optimize the onboarding exercise, simplify adoption, make it easy to learn, engage, ask questions and get help, create an active community, sell the vision and level up the conversation.
What works once and in one place doesn't work the same every time and everywhere: offsite QA is necessary.
A soft target for product issues: whenever anything went wrong, testing got the blame. The only response was to jointly investigate each claim and prove that testing was indeed not the cause.
STILL HAVEN’T BEEN ABLE TO SOLVE CONCLUSIVELY
Resourcing & budgeting: as experimentation matures, the investment and support needs spike. Repeat selling becomes tougher and tougher with an increasingly complex message and large-scale dependencies.
Moral hazard: since Optimizely can do things quicker, it's often used for bug fixes to work around engineering protocols. And since Optimizely can ramp the winning variation to 100% right away, the incentive to make the product change immediately becomes weaker. This leads to multiple concurrent experiments.
External factors: regulatory requirements, privacy issues, non-traditional GUIs and AI solutions.
Globalization issues.
Victim to the vagaries of the setup: the program cannot and should not be an independent fiefdom; it depends on the overall setup and has to work within its constraints.
IF GOD DECIDED TO CREATE AN A/B TEST PROGRAM, WHAT WOULD IT LOOK LIKE…
• Every major product change has been iterated, quantified & contextualized
• A centralized but modular & integrated Learn, Listen and Test framework covering all domains
• A single-source-of-truth testing data mart within the organization's data lake for year-end program effectiveness studies
• Unified workflow & project management with a searchable knowledge repository & centralized admin capabilities
• Programmatic testing with human-intervention protocols
SUMMARY
As with any user journey, ours began with education and selling, followed by a POC and quick wins that got us tapped for a major initiative. The victories in that initiative helped us gain the engagement and support to grow along the experimentation maturity curve.
The benefit from experimentation is best realized when it's anchored to strategic goals, supported with insights from analytics and research, and when what to test is driven by the right stakeholders.
Growing along the maturity curve gets progressively more difficult because of increasing needs (resourcing & budgeting). Keep iterating on multiple selling approaches and get help when needed. Most importantly, remember it's a "long haul."
Organizations with a disciplined experimentation culture in their DNA are poised to reap the benefits of higher accountability, focus on business performance, and optimized customer experience management.
The testing program was successful because of the right foundation of team, ownership and success criteria. Knowing what to test, what not to, and why helped us deliver stronger program RoI.