Ch 13. Asking Users & Experts. Team 3: Jessica Herron, Lauren Sullivan, Chris Moore, Steven Pautz.


Page 1:

Ch 13. Asking Users & Experts

Team 3: Jessica Herron

Lauren Sullivan

Chris Moore

Steven Pautz

Page 2:

Asking Users and Experts: Introduction

• We want to find out what users do, want to do, like, or don’t like

• Several tools & techniques are available:

• Interviews

• Questionnaires

• Heuristic Evaluation

• Cognitive Walkthroughs

• Strengths/Weaknesses of these tools

Page 3:

Interviews

Unstructured interviews are not directed by a script. Rich but not replicable.

Structured interviews are tightly scripted, often like a questionnaire. Replicable but may lack richness.

Semi-structured interviews are guided by a script, but interesting issues can be explored in more depth. They can provide a good balance between richness and replicability.

Page 4:

Interview Basics

• Remember the DECIDE framework:
– Determine goals, Explore questions, Choose the evaluation paradigm, Identify practical issues, Decide how to deal with ethical issues, and Evaluate the data.

• Goals and questions guide all interviews
• Two types of questions:

Closed questions have a predetermined answer format, e.g., ‘yes’ or ‘no’. They are quicker and easier to analyze.

Open questions have no predetermined answer format; they aim to draw out the interviewee’s views and feelings.

Page 5:

Avoid Interview Pitfalls

• Long questions
• Compound sentences - split them into two
• Jargon & language that the interviewee may not understand
• Leading questions that make assumptions, e.g., “Why do you like …?” - they might not actually like the product
• Be aware of unconscious biases, e.g., gender stereotypes

Page 6:

Interview Components

Introduction: introduce yourself, explain the goals of the interview, reassure about the ethical issues, ask permission to record, and present an informed consent form.

Warm-up: make the first questions easy & non-threatening.

Main body: present questions in a logical order.

Cool-off: include a few easy questions to defuse any tension at the end.

Closure: thank the interviewee and signal the end, e.g., switch the recorder off.

Page 7:

The Interview Process

• Dress in a similar way to participants
• Check recording equipment in advance
• Devise a system for coding participants’ names to preserve confidentiality
• Be pleasant
• Ask participants to complete an informed consent form
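The point about coding participant names can be illustrated with a short sketch. This is just one possible approach, using keyed hashing from the Python standard library; the key and the code format are invented for the example.

```python
import hashlib
import hmac

# Hypothetical secret, kept separate from the interview data files.
SECRET_KEY = b"study-secret-key"

def participant_code(name: str) -> str:
    """Map a participant's name to a short, non-reversible code."""
    digest = hmac.new(SECRET_KEY, name.encode("utf-8"), hashlib.sha256)
    return "P-" + digest.hexdigest()[:8].upper()

# The same name always maps to the same code, so notes and transcripts
# stay linkable, but the code alone does not reveal the name.
print(participant_code("Jessica Herron"))
```

Transcripts and notes are then labeled only with the code; the key (and any name-to-code list) is stored separately from the data.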

Page 8:

Group interviews

• Also known as focus groups
• Typically 3-10 participants
• Provide a diverse range of opinions
• Need to be managed to:
- ensure everyone contributes
- keep the discussion from being dominated by one person
- cover the agenda of topics and eliminate digressions

Page 9:

Other Sources

• Telephone interviews
– Can’t see the participant’s body language or nonverbal cues.

• Online interviews
– Asynchronous (e-mail) or synchronous (chat room).

• Retrospective interviews
– Check with the participant to be sure the interviewer correctly understood their remarks.

Page 10:

Analyzing interview data

• Depends on the type of interview
• Structured interviews can be analyzed like questionnaires
• Unstructured interviews generate data like that from participant observation
• It is best to analyze unstructured interviews as soon as possible, to identify topics and themes from the data

Page 11:

Links

• UPA (Usability Professionals Association)
– Discovering User Needs

• Sage Services – User Research & Usability Techniques Available Through Sage

Page 12:

Questionnaires

• Questions can be closed or open
• Closed questions are easiest to analyze, and the analysis may be done by computer
• Can be administered to large populations
• Paper, email & the web are used for dissemination
• An advantage of electronic questionnaires is that the data goes into a database & is easy to analyze
• Sampling can be a problem when the size of a population is unknown, as is common online
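To make concrete why closed questions are easy to analyze by computer, here is a minimal tally of responses to one closed question, using only the Python standard library; the 5-point Likert data is invented.

```python
from collections import Counter

# Hypothetical answers to one closed question on a 5-point Likert scale
# (1 = strongly disagree ... 5 = strongly agree).
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

counts = Counter(responses)
n = len(responses)
for rating in range(1, 6):
    print(f"{rating}: {counts[rating]:2d} ({counts[rating] / n:.0%})")

mean = sum(responses) / n
print(f"mean rating: {mean:.2f}")  # 3.90 for this sample
```

With responses exported from an electronic questionnaire into a database or file, the same few lines scale to any number of respondents.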

Page 13:

Questionnaire Style

• Questionnaire formats can include:
- ‘yes’/‘no’ checkboxes
- checkboxes that offer many options
- Likert rating scales
- semantic differential scales
- open-ended responses

Page 14:

Designing Questionnaires

• Make questions clear and specific.
• When possible, ask closed questions and offer a range of answers.
• Consider including a ‘no opinion’ option for questions that seek opinions.
• Think about the ordering of questions.
• Avoid complex multiple questions.

Page 15:

Designing Questionnaires

• When scales are used, make sure the range is appropriate and does not overlap.

• Make sure the ordering of scales is intuitive and consistent.

• Avoid jargon.
• Provide clear instructions on how to complete the questionnaire.
• Strike a balance between white space and the need to keep the questionnaire compact.

Page 16:

Encouraging a good response

• Ensure the questionnaire is well designed so that participants do not get annoyed.

• Provide a short overview section.
• Include a stamped, self-addressed envelope.
• Contact respondents through a follow-up letter.
• Offer incentives.
• Explain why you need the questionnaire to be completed.

Page 17:

Online Questionnaires

• Advantages:
– Responses are received quickly.

– Copying and postage costs lower.

– Data can be transferred immediately to database.

– Time required is reduced.

– Errors in questionnaire can be corrected easily.

Page 18:

Online Questionnaires

• Disadvantages:
– Sampling is problematic if the population size is unknown.

– It is hard to prevent individuals from responding more than once.

– Individuals have also been known to change the questions in email questionnaires.
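One common mitigation for repeat responses, sketched here under the assumption that respondents can be invited individually, is to issue each person a single-use token and reject reuse; the token scheme is illustrative, not a standard.

```python
import secrets

# Issue each invited respondent a unique, unguessable token.
issued = {secrets.token_urlsafe(16) for _ in range(3)}
used = set()

def accept_response(token: str) -> bool:
    """Accept a submission only for a valid, not-yet-used token."""
    if token not in issued or token in used:
        return False
    used.add(token)
    return True

token = next(iter(issued))
print(accept_response(token))  # True: first submission is accepted
print(accept_response(token))  # False: the repeat is rejected
```

This trades anonymity for control: tokens tie responses to invitations, so the mapping from token to person must be protected or discarded.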

Page 19:

Developing a Web-based Questionnaire

• Steps:
1. Produce an error-free interactive electronic version from the original paper-based one.

2. Make the questionnaire accessible from all common browsers and readable for different sized monitors and network locations.

3. Make sure information identifying each respondent will be captured and stored confidentially.

4. User-test the survey with pilot studies before distributing.

Page 20:

Asking experts

• Experts use their knowledge of users & technology to review software usability

• Expert critiques (crits) can be formal or informal reports

• Heuristic evaluation is a review guided by a set of heuristics

• Walkthroughs involve stepping through a pre-planned scenario noting potential problems

Page 21:

Heuristic evaluation

• Developed by Jakob Nielsen in the early 1990s
• Based on heuristics distilled from an empirical analysis of 249 usability problems
• These heuristics have been revised for current technology, e.g., HOMERUN for the web
• Heuristics are still needed for mobile devices, wearables, virtual worlds, etc.
• Design guidelines form a basis for developing heuristics

Page 22:

Nielsen’s heuristics

• Visibility of system status
• Match between system and the real world
• User control and freedom
• Consistency and standards
• Help users recognize, diagnose, and recover from errors
• Error prevention
• Recognition rather than recall
• Flexibility and efficiency of use
• Aesthetic and minimalist design
• Help and documentation

Page 23:

HOMERUN

• High-quality content
• Often updated
• Minimal download time
• Ease of use
• Relevant to users’ needs
• Unique to the online medium
• Netcentric corporate culture

Page 24:

Discount evaluation

• Heuristic evaluation is referred to as discount evaluation when 5 evaluators are used.

• Empirical evidence suggests that on average 5 evaluators identify 75-80% of usability problems.
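The figure above traces back to Nielsen and Landauer's problem-discovery model, in which the proportion of problems found by i evaluators is 1 - (1 - λ)^i, where λ is the chance that a single evaluator spots a given problem. λ varies from study to study; 0.31 is an often-cited average for heuristic evaluation, and plugging it in shows how quickly coverage grows:

```python
# Nielsen & Landauer's problem-discovery curve:
#   proportion found by i evaluators = 1 - (1 - L)**i
# L is the probability that one evaluator finds a given problem.
# L = 0.31 is an often-cited average; real values vary by study.

def proportion_found(i: int, L: float = 0.31) -> float:
    return 1 - (1 - L) ** i

for i in (1, 3, 5, 10):
    print(f"{i:2d} evaluators: {proportion_found(i):.0%}")
```

Under these assumptions five evaluators cover the large majority of problems, in the neighborhood of the slide's figure, and each further evaluator adds little, which is the economic argument behind discount evaluation.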

Page 25:

3 stages for doing heuristic evaluation

• Briefing session to tell experts what to do
• Evaluation period of 1-2 hours in which:
- each expert works separately
- one pass is taken to get a feel for the product
- a second pass focuses on specific features
• Debriefing session in which the experts work together to prioritize problems

Page 26:

Heuristic evaluation of websites

• MEDLINEplus – medical information website

• Keith Cogdill, commissioned by NLM, evaluated this website

• Based the evaluation on the following:
– Internal consistency

– Simple dialog

– Shortcuts

– Minimizing the user’s memory load

– Preventing errors

– Feedback

– Internal locus of control

Page 27:

MEDLINEplus suggestions for improvement

• Arrangement of health topics
– Arrange topics alphabetically as well as in categories

• Depth of navigation menu
– A higher “fan-out” in the navigation would enhance usability, meaning many short menus rather than a few deep ones.

Page 28:

Turning guidelines into heuristics for the web

• Navigation, access and information design are the three categories for evaluation

• Navigation
– Avoidance of orphan pages

– Avoidance of long pages with excessive white space (scrolling problems)

– Provide navigation support

– Avoidance of narrow, deep, hierarchical menus

– Avoidance of non-standard link colors

– Provide a consistent look and feel

Page 29:

Turning guidelines into heuristics for the web (cont.)

• Access
– Avoidance of complex URLs

– Avoid long download times that annoy users:
• Pictures

• .wav files

• Large software files

• Information Design
– Have good graphical design

– No outdated or incomplete information

– Avoid excessive color usage

– Avoid gratuitous use of graphics and animation

– Be consistent

Page 30:

Advantages and problems

• Few ethical & practical issues to consider
• Can be difficult & expensive to find experts
• The best experts have knowledge of both the application domain & its users
• Biggest problems:
- important problems may get missed
- many trivial problems are often identified

Page 31:

Walkthroughs

• An alternative approach to heuristic evaluation
• An expert walks through the task, noting usability problems
• Users are usually not involved
• Several forms:

• Cognitive Walkthrough

• Pluralistic Walkthrough

Page 32:

Cognitive Walkthroughs

• Involves simulating the user’s problem-solving process at each step in the task

• Focus on ease of learning
• Examines user problems in detail, without needing users or a working prototype
• Can be very time-consuming and laborious
• Somewhat narrow focus
• Adaptations may offer improvements

Page 33:

Cognitive Walkthrough Steps

• One or more experts walk through the design description or prototype with a specific task or scenario

• For each task, they answer three questions:
• Will users know what to do?

• Will users see how to do it?

• Will users understand from feedback whether the action was correct or not?

Page 34:

Cognitive Walkthrough Steps

• A record of critical information is compiled:
• Assumptions about what would cause problems and why, including explanations of why users would face difficulties

• Notes about side issues and design changes

• A summary of results

• The design is revised to fix the problems identified
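The compiled record can be kept as a simple structured list, one entry per step, answering the three walkthrough questions. A minimal sketch follows; the field names and example steps are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class StepFinding:
    """One walkthrough step, scored against the three questions."""
    step: str
    knows_what_to_do: bool      # Will users know what to do?
    sees_how_to_do_it: bool     # Will users see how to do it?
    understands_feedback: bool  # Will feedback confirm the action?
    notes: str = ""

    @property
    def problematic(self) -> bool:
        return not (self.knows_what_to_do
                    and self.sees_how_to_do_it
                    and self.understands_feedback)

findings = [
    StepFinding("Open the settings menu", True, True, True),
    StepFinding("Enable sync", True, False, False,
                notes="Toggle is hard to find; no confirmation shown"),
]

# Summarize: which steps need design changes?
problems = [f.step for f in findings if f.problematic]
print(problems)  # ['Enable sync']
```

A "no" to any of the three questions flags the step, and the notes field carries the assumptions about why users would struggle, ready for the summary and revision stages.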

Page 35:

Pluralistic Walkthroughs

• A variation of the Cognitive Walkthrough
• Users, developers, and experts work together to step through a scenario of tasks
• Strong focus on users’ tasks
• Provides quantitative data
• May be limited by scheduling difficulties and time constraints

Page 36:

Pluralistic Walkthrough Steps

• Scenarios are developed in the form of a series of screens

• Panel members individually determine the sequence of actions they would use to move between screens

• Everyone discusses their sequence with the rest of the panel

• The panel moves to the next scenario

Page 37:

Asking Users and Experts: Summary

• Various techniques exist for getting user opinions
• Structured, unstructured, and semi-structured interviews

• Interviews, focus groups, questionnaires

• Expert evaluations: heuristic evaluation and walkthroughs

• Various techniques are often combined for added accuracy and usefulness

Page 38:

Interview: Jakob Nielsen

• “Father of Web Usability”
• Pioneered heuristic evaluation
• Demonstrated cost-effective methods for usability testing & engineering
• Author of several acclaimed design books
• www.useit.com
• www.nngroup.com