How to effectively implement different online research methods - UXPA 2015 - Fadden & Bedekar



How to effectively implement different online research techniques for rapid unmoderated feedback

Niyati Bedekar @nbedekar

Steve Fadden @sfadden

Presented at UXPA 2015, San Diego Slides: https://goo.gl/X8dolV

Agenda

Online techniques

Method toolkit

Common requests and solutions

Case studies and templates

Effective practices

Image source: http://pixabay.com/en/modesto-california-scenic-trail-205544/

Introductions

Steve Fadden Niyati Bedekar

University of Pune

Who We Are

Who are you?

Years of experience in user research:

<1 | 1-2 | 2-5 | 5+

Image source: Karen Arnold (http://www.publicdomainpictures.net/view-image.php?image=45018)

Who are you?

Total number of employees:

1-20 | 21-100 | 101-500 | 500+

Image source: Karen Arnold (http://www.publicdomainpictures.net/view-image.php?image=45018)

Who are you?

Most recent research request?

Most common research request?

Jot down

Image source: http://pixabay.com/en/photos/note%20paper/

Online methods, especially asynchronous

Confession: Agility over comprehensiveness

Image source: https://www.flickr.com/photos/smithser/3735204251

Variety of Online Methods

Image source: https://flic.kr/p/jZUByi

Methods toolkit

What People Do

What People Say

Why & How to Fix

How Many & How Much

Behavioral

Attitudinal

Qualitative Quantitative

Rohrer, C. October 12, 2014. When to use which user experience research methods. Retrieved from http://www.nngroup.com/articles/which-ux-research-methods/

Toolkit is growing (Rohrer’s framework)

Image source: http://www.freestockphotos.biz/stockphoto/1772

Method (participant effort) Types of answers provided

Click Behavioral: Where to start or go next?

Preference Attitudinal: Compare between options

Recall Hybrid: What do you remember? What are your first impressions?

Sentiment Attitudinal: How does this make you feel?

Embedded questions Hybrid: What happens next, and why? How would you rate this?

Terminology/naming Attitudinal: What does something mean?

Commenting Hybrid: What comes to mind while reviewing a concept/flow? OR Open feedback

Go-to methods

Image source: http://www.geograph.org.uk/photo/1911269

Method (participant effort) Types of answers provided

Card sorting Hybrid: What items belong together and what should they be called?

Discussion groups / Focus groups

Attitudinal: What comes to mind while reviewing other feedback?

Unmoderated usability testing

Hybrid: What do you expect? What do you do? Why?

Additional methods to consider

Image source: http://www.geograph.org.uk/photo/1911269

Sample mockups used

“Finals week starts on June 1. Where would you first click to put a reminder on your calendar?”

Click methods (Behavior: Where do users click)

UsabilityTools

“Describe what you would expect to see after you clicked the area in the previous screen.”

Embedded question (Hybrid: What happens next)

Qualtrics

“Please click the variation you prefer. [after] Why did you choose it?”

Preference (Attitude: Which do you prefer)

Verify

“You will see a screen for 5 seconds. After reviewing the screen, you’ll be asked questions about it. [after] What do you remember?”

Recall (Hybrid: What do you remember)

Verify

“Review this screen and think about how it makes you feel.”

Sentiment (Attitude: How does this make you feel)

Verify

“Do you find this design to be attractive?”

Embedded question (Attitude: How do you rate this)

SurveyMonkey

“Label each marker with what you would call the icon.”

Terminology/naming (Attitude: What does this mean)

Verify

“This design shows what happens when you click the ‘+’ icon. Comment on areas you find confusing, problematic, helpful, usable.”

Commenting (Hybrid: What comes to mind)

Verify

Fast

Convenient

Focused

↑ People

↓ Resources

Benefits

Image source: http://commons.wikimedia.org/wiki/File:Double-alaskan-rainbow-airbrushed.jpg

Context

Environment

Participant profiles

Tasks

Responses

Challenges

Image source: http://commons.wikimedia.org/wiki/File:Angela-Whyte-Hurdle-Posed.jpg

Activity

Form groups of 3-5
Review common requests
Discuss how you typically research
Consider online solutions
Discuss pros/cons

Discussion: Research requests

Image source: http://en.wikipedia.org/wiki/Fischer's_lovebird

Reference (for Activity)

Image source: http://www.geograph.org.uk/photo/1911269

Method (participant effort) Types of answers provided

Click Behavioral: Where to start or go next?

Preference Attitudinal: Compare between options

Recall Hybrid: What do you remember? What are your first impressions?

Sentiment Attitudinal: How does this make you feel?

Embedded questions Hybrid: What happens next, and why? How would you rate this?

Terminology/naming Attitudinal: What does something mean?

Commenting Hybrid: What comes to mind while reviewing a concept/flow? OR Open feedback

Group discussion: Share thoughts
● Problem
● Typical solution
● Online research solution
● Pros/cons

Discussion: Research requests

Image source: http://en.wikipedia.org/wiki/Fischer's_lovebird

Case Studies and Templates

Case Study 1: Evaluate new data export concept

Background

- New functionality for an existing product
- Integrated with 3rd party software
- To be implemented ASAP

Goals

- “Boil the ocean” to learn if concept was understood, desired, and usable

Methods

Embedded question: Critical incident

Embedded question: Comprehension rating

Commenting: On each storyboard panel, after presenting full story

Embedded question: Open feedback, questions, and expectations

“Consider the last time you had to export data. Describe why you needed to export data, and list the steps you remember from that process. (If you haven’t exported data before, or don’t remember the last time, just skip to the next question).”

Embedded question (Critical Incident)

“I’m pretty old school, so I export my credit card transaction data about every quarter. My credit card site has a button to export to CSV, so I just click that and it downloads to my computer.”

“We have our marketing, sales, and inventory data in different systems. I have to export data from each system in order to combine it into a spreadsheet for my stakeholders. The export process is easy. Combining the data is more involved.”

“Consider the concept presented on the next 4 slides. After reading about the concept, you will be asked about what you found to be confusing, problematic, useful, and appealing about the concept.”

[New concept scenario: storyboard panels 1-4]

“How understandable is this concept?”

Embedded question (Comprehension)

Commenting (Identify strengths and weaknesses)

“You will now be shown each concept slide again. On each slide, indicate anything you found to be particularly confusing, problematic, useful, and appealing.”

“Doing this would require a lot of clicks, even for a small number of columns.”

“You should embed best practices for naming here. Otherwise, the result could be messy.”

“Will we be able to save the mappings? That could save time in the future.”

“Any final comments, questions, or feedback you’d like to share?”

Embedded question (Open feedback)

“It’s great that you don’t have to jump around different parts of the system to do this. Very valuable to be able to complete this from one place.”

“Seems very clear to me. I think anyone who has used [XYZ] would be able to understand it too.”

“Hi, I wanted to follow up to reiterate that this is a REALLY COOL idea and it fills a much needed requirement for our use of the product. Please consider me for future studies like this, because we need this functionality!”

Template 1: Exploring a new concept

NDA, Confidentiality, Demographics

Embedded Question: Critical incident to activate

[Present concept] Video, illustration, storyboard, description

Embedded Question: Comprehension rating, after presenting concept

Commenting: Concept slides (storyboards work well)

Embedded Question: Open feedback

Case Study 2: Identify problems and preferences for calendar range selection tools

Background

- Tool developed without support
- Early stage prototype, only worked within company firewall
- Team wanted feedback before further refinement

Goals

- Recruit internal participants only
- Identify heuristic violations
- Gauge preference compared to existing tools

Methods

Click: How would you start the task?

Commenting: (after using prototype) See screenshots of tool in different states

Preference: Compare tool to existing tool

Embedded Question: Explain preference and next steps

Template 2: Eliciting usability/heuristic feedback

NDA, Confidentiality, Demographics

Recall: What is remembered? [or]

Sentiment: How does this make you feel?

Click: How would you start this task?

Embedded Question: What would you expect to see after clicking?

Commenting: Open feedback, after engaging

Embedded Question: Usability rating

Case Study 3: Redesign chart type & update visual treatment

Background

- Existing component used frequently by customers and loved by many!
- Not scalable
- Prone to misinterpretation
- Team wanted to test new designs

Goals

- Understand if users comprehend the new design
- Gauge preference among 3 different approaches (including existing)
- Mix of internal users and customers

Methods

Embedded question: Understandability of information

Preference: Among the various options

Commenting: Open feedback, expectations

Template 3: Redesigned visual treatment

NDA, Confidentiality

Embedded Question: To gather understanding of information on chart (randomize)

Preference: Which design do you prefer? (randomize)

Embedded Question: Why the selected design?

Commenting: Open feedback

Demographics
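The "(randomize)" steps in the template above matter because a fixed presentation order biases comprehension and preference answers. If you assemble survey links yourself rather than relying on a tool's built-in randomization, the per-participant shuffle can be scripted. A minimal sketch (the design labels and function name are illustrative, not from the talk), seeding on the participant ID so each person sees a stable but independently randomized order:

```python
import random

def randomized_order(options, participant_id):
    """Return a per-participant random ordering of the design options.

    Seeding with the participant ID keeps the order stable if the same
    participant reloads the survey, while still varying across people.
    """
    rng = random.Random(participant_id)  # independent, reproducible RNG
    shuffled = options[:]                # copy; leave the master list untouched
    rng.shuffle(shuffled)
    return shuffled

# Hypothetical labels for the three chart treatments in Case Study 3
designs = ["Existing design", "Redesign A", "Redesign B"]
print(randomized_order(designs, participant_id=42))
```

With only three options and enough participants, random order is usually sufficient; for very small samples, a full counterbalance (all six orderings assigned round-robin) controls order effects more tightly.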

Case Study 4: Understand how people find content

Background

- Team assigned to build new system
- Wanted to create a system where content was easy to locate

Goals

- Identify how users locate content
- Discover differences based on content type
- Understand pain points to see if they can be reduced or eliminated

Methods

Click: (for each method) Where do you click first to locate this kind of content?

Sentiment: What feeling is associated?

Commenting: Open feedback, expectations

Embedded Question: (after each method) What do you find most/least usable?

Template 4: Understanding behavior and expectations

NDA, Confidentiality, Demographics

Embedded Question: Critical incident to activate

Click: What do you do first?

Sentiment: How do you feel when you do this?

Commenting: What works well and not well?

Embedded Question: Open feedback

Effective Practices

Maintain your own panel; build a snowball

Image source: https://flic.kr/p/aZhJF

Image source: https://flic.kr/p/qSsTmF

Recruit from social media, communities, classifieds & search

Match screening to goals and use creative incentives

Image source: http://www.pexels.com/photo/numbers-money-calculating-calculation-3305/

Protect confidentiality and collect useful demographics

Image source: https://flic.kr/p/8xzAnc

Image source: http://pixabay.com/en/mark-marker-hand-leave-516279/

Order questions intentionally, limit required questions and total number

Image source: https://flic.kr/p/oWyYTz

Launch a pilot study, but remove the pilot data to save time later

Provide follow-up channels

Image source: http://commons.wikimedia.org/wiki/File:Old_British_telephones.jpg

Other hints and tips?

Image source: https://flic.kr/p/6yoj2L

Final Considerations & Surprises

People really do participate

Image source: https://flic.kr/p/g9agMi

Engagement and response quality are surprisingly high

Image source: https://flic.kr/p/hfWrxQ

Incentives don’t need to be high; no incentive could be the right price

Image source: http://commons.wikimedia.org/wiki/File:Money_Cash.jpg

Many participants (and researchers) want follow-up opportunities

Image source: https://flic.kr/p/7qcudQ

Triangulation is critical

Image source: https://flic.kr/p/2NJxPz

Thank you

Questions? Answers?

(Please leave your note with common research requests)

Additional Resources

Type of Test

Tools | Click/Success | Preference | Recall | Sentiment | Question | Terminology/Label | Commenting | Card sorting | Discussion | Unmoderated usability + video on website | Metrics & Results

Verify ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓

Usabilla ✓ ✓ ✓ ✓

Loop11 ✓ ✓ ✓

UserTesting.com ✓ ✓

UserZoom ✓ ✓ ✓ ✓ ✓ ✓

Optimal Workshop ✓ ✓

Yahoo Groups, Facebook, LinkedIn

Survey tools (Getfeedback, Qualtrics, SurveyMonkey) ✓ ✓ ✓ ✓

Examples of types of tests available (Incomplete list)

Christian Rohrer’s NN/g article about the appropriateness of a method for answering specific questions: http://www.nngroup.com/articles/which-ux-research-methods/

A review of usability and UX testing tools: http://www.smashingmagazine.com/2011/10/20/comprehensive-review-usability-user-experience-testing-tools/

How to select an unmoderated user testing tool to fit your needs: http://www.nngroup.com/articles/unmoderated-user-testing-tools/

List of tools for unmoderated testing:

1. http://remoteresear.ch/tools/
2. http://www.infragistics.com/community/blogs/ux/archive/2012/11/07/6-tools-for-remote-unmoderated-usability-testing.aspx

Kyle Soucy’s article in UX Matters (Unmoderated, Remote Usability Testing: Good or Evil?) http://www.uxmatters.com/mt/archives/2010/01/unmoderated-remote-usability-testing-good-or-evil.php

Additional Links