Human-Computer Interaction
Usability Evaluation 1: Introduction and Analytic Methods

Page 1: Human-Computer Interaction

Human-Computer Interaction

Usability Evaluation 1: Introduction and Analytic Methods

Page 2: Human-Computer Interaction


Lecture Overview

• Definition and motivation
• Industrial practice and interest
• Types of evaluation
• Analytic methods
  • Heuristic evaluation
  • Keystroke level model
  • Cognitive walkthrough

Page 3: Human-Computer Interaction


Evaluation: Definition and Motivation

• Definition
  • Gathering information about the usability or potential usability of a system
• Motivation
  • Suggest improvements or confirm acceptability of the interface and/or supporting materials
  • Ensure competitive reputation
    • ‘Users will evaluate your interface sooner or later’ (Hix and Hartson, 1993)
    • Match or exceed the usability of competitors’ products (and statutory requirements)

Page 4: Human-Computer Interaction


MUSiC Project - Metrics for Usability Standards in Computing

• Early 1990s - European survey
• Generally high appreciation of the importance of usability evaluation
• Knowledge of evaluation methods limited
• Lack of metrics a major problem (after time and money limitations)
• Intuitiveness of a product, and the ability to learn it quickly without manuals, is an increasingly important usability factor

Page 5: Human-Computer Interaction


Reasons why Interface Evaluation is Often Omitted or Poorly Performed

• Assumption designer’s personal behaviour is ‘representative’

• Implicit unsupported assumptions about human performance

• Acceptance of traditional/standard interface design

• Postponement of evaluation until ‘a more convenient time’

• Lack of expertise in analysing experiments

Page 6: Human-Computer Interaction


What to Evaluate

• Usability specifications at all lifecycle stages
• Initial designs (pre-implementation)
  • Partial
  • Integrated
• Prototypes at various stages
• Final(?) implementation
• Documentation

Page 7: Human-Computer Interaction


Formative Evaluation

• Repeatedly, as development proceeds
• Purpose: to support iterative refinement
• Nature: structured, but fairly informal
• Average of 3 major ‘design-test-redesign’ cycles, with many minor cycles to check minor changes

The earlier poor design features or errors are detected, the easier and cheaper they are to correct.

Page 8: Human-Computer Interaction


Summative Evaluation

• Once, after implementation (or nearly so)
• Important in field or ‘beta’ testing
• Purpose: quality control - the product is reviewed to check it meets
  • Its own specifications
  • Prescribed standards, e.g. Health and Safety, ISO
• Nature: formal, often involving statistical inferences (a sketch of one such test follows)
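As a minimal illustration of such an inference (not from the lecture: the 30-second target, the timing data and the 5% level are all invented), a one-sample t-test in Python could check whether measured completion times meet a usability specification:

    # Hypothetical summative check: are task-completion times
    # significantly below a specified target? All data are invented.
    from scipy import stats

    SPEC_SECONDS = 30.0  # assumed usability specification

    # invented completion times (seconds) for 10 test users
    times = [24.1, 31.5, 27.8, 29.0, 22.4, 33.2, 26.7, 28.9, 25.3, 30.1]

    # one-sided, one-sample t-test: is the mean time below the spec?
    result = stats.ttest_1samp(times, SPEC_SECONDS, alternative='less')
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
    if result.pvalue < 0.05:
        print("Mean completion time is significantly below spec.")
    else:
        print("No evidence at the 5% level that the spec is met.")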

Page 9: Human-Computer Interaction


Where to Evaluate

• Designer’s mind
• Discussion workshops
• Representative workplace
• Experimental laboratory

Page 10: Human-Computer Interaction


How to Evaluate: Evaluation Methods

Method         Interface development        User involvement
Analytic       Specification                No users
Expert         Specification or prototype   No users (role playing only)
Observational  Simulation or prototype      Real users
Survey         Simulation or prototype      Real users
Experimental   Normally full prototype      Real users

(Observational, survey and experimental methods are empirical; costs rise down the table.)

Page 11: Human-Computer Interaction


Types of Data

• Quantitative data
  • Objective measures
    • Directly observed
    • E.g. time to complete, accuracy of recall
  • User performance or attitudes can be recorded in numerical form
• Qualitative data
  • Subjective responses
  • Reports and opinions that may be categorized in some way but not reduced to numerical values

Page 12: Human-Computer Interaction


Measurement Tools

• Semi-structured interview
• Questionnaire - personal/postal administration
• Incident diary
• Feature checklist
• Focus group
• Think-aloud
• Interactive experiment

Compare on:
• Cost
• Number of subjects

Page 13: Human-Computer Interaction


Analytic Evaluation

Advantages:
• Usable early in design
• Little or no advance planning
• Cheap
• Quick

Disadvantages:
• Focus on problems
• Lack of diagnostic output for redesign
• Encourages strengthening of existing solution
• Broad assumptions of users’ cognition
• Can be difficult for evaluator

Page 14: Human-Computer Interaction


Analytic Evaluation: Heuristic Evaluation

• Assess design against known usability criteria, e.g. Brown, 1994 (one way of recording such findings is sketched after the list):
  • Coffee break test
  • Data overload
  • Engineering model
  • Help
  • Mode test
  • Dead ends
  • Unable to complete
  • Stupid questions
  • Jotting test
  • Standalone test
  • Maintenance test
  • Consistency test
  • Reversibility
  • Functionality test
  • Knowledge of completion
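Purely as a hypothetical sketch of how findings from such a checklist might be captured (the Finding record and the 1-4 severity scale are this sketch's assumptions, not the lecture's):

    # Record heuristic-evaluation findings and report the worst first.
    # Heuristic names follow the slide; everything else is assumed.
    from dataclasses import dataclass

    @dataclass
    class Finding:
        heuristic: str   # which criterion is violated
        location: str    # where in the interface it occurs
        severity: int    # assumed scale: 1 (cosmetic) .. 4 (blocks task)

    def report(findings: list[Finding]) -> None:
        for f in sorted(findings, key=lambda f: f.severity, reverse=True):
            print(f"[{f.severity}] {f.heuristic}: {f.location}")

    # invented example findings
    report([
        Finding("Dead ends", "help browser has no way back to the task", 3),
        Finding("Consistency test", "OK/Cancel order differs between dialogs", 2),
    ])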

Page 15: Human-Computer Interaction


Analytic Evaluation: Keystroke Level Model (Card et al., 1980)

• Best known analytic evaluation technique
• Simple way of analysing expert user performance - usually of unit tasks of, say, 20 secs
• Applies time constants to operations - the total gives the completion time for an error-free dialogue sequence
• Proven predictive validity (±20%)
  • Human motor system is well understood
  • No high-level mental activity

Page 16: Human-Computer Interaction


KLM Constants

Operator Meaning Time(secs)

K Press key (good) 0.12 0.28 1.20 (poor)

B Mouse button press

Down or up 0.10

Click0.20

P Point with mouse 1.10 (Fitt’s law: K log2(Distance/Size + 0.5)

H Hand to keyboard or mouse 0.40

M Mental preparation for physical 1.35 action - 1 M per ‘chunk’

R System response time Measure

Averages - modify to suit
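A minimal sketch encoding these constants in Python; the dictionary and function names are this sketch's own, and the ~0.1 s Fitts's-law coefficient is an assumption (the slide leaves it unstated):

    import math

    # KLM operator times in seconds, taken from the slide
    KLM = {
        "K": 0.28,   # press key (0.12 good typist, up to 1.20 poor typist)
        "B": 0.10,   # mouse button down or up; a full click is 2 B = 0.20
        "P": 1.10,   # point with mouse (average)
        "H": 0.40,   # move hand to keyboard or mouse
        "M": 1.35,   # mental preparation, one per 'chunk'
    }

    def point_time(distance: float, size: float, k: float = 0.1) -> float:
        """Slide's Fitts's-law form: K * log2(Distance/Size + 0.5).
        k ~ 0.1 s is assumed here, not given on the slide."""
        return k * math.log2(distance / size + 0.5)

    def sequence_time(ops: str, response_times=None) -> float:
        """Total time for an operator string such as 'MHPBB'; each 'R'
        consumes the next measured system response time."""
        rs = iter(response_times or [])
        return sum(next(rs) if op == "R" else KLM[op] for op in ops)

    # decide (M), reach for mouse (H), point (P), click (B, B)
    print(f"{sequence_time('MHPBB'):.2f} s")  # 1.35+0.40+1.10+0.10+0.10 = 3.05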

Page 17: Human-Computer Interaction


KLM - Worked Example (adapted from Browne, 1994)

Operation                    Operator  Time
Decide to deal               M         1.35
System response, Windows     R         2.00
Locate and grasp mouse       H         0.40
Point at option              P         1.10
Press mouse button           B         0.10
Release mouse button         B         0.10
Identify source (customer)   M         1.35
Point at source on menu      P         1.10
Press mouse button           B         0.10
System response, window      R         1.00
Release mouse button         B         0.10
Ascertain product            M         1.35
Press mouse button           B         0.10
Release mouse button         B         0.10
System response, window      R         1.00
Calculate quote (complex)    M         2.70
Return hand to keyboard      H         0.40
Type 6 quote characters      K * 6     1.20

Total                                  15.55
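The slide's arithmetic can be checked mechanically. The short sketch below lists the steps exactly as given and sums them; note the example charges 0.20 s per keystroke (6 × 0.20 = 1.20 s):

    # Steps and times as listed on the slide (adapted from Browne, 1994);
    # summing them reproduces the 15.55 s prediction.
    steps = [
        ("Decide to deal",             "M",     1.35),
        ("System response, Windows",   "R",     2.00),
        ("Locate and grasp mouse",     "H",     0.40),
        ("Point at option",            "P",     1.10),
        ("Press mouse button",         "B",     0.10),
        ("Release mouse button",       "B",     0.10),
        ("Identify source (customer)", "M",     1.35),
        ("Point at source on menu",    "P",     1.10),
        ("Press mouse button",         "B",     0.10),
        ("System response, window",    "R",     1.00),
        ("Release mouse button",       "B",     0.10),
        ("Ascertain product",          "M",     1.35),
        ("Press mouse button",         "B",     0.10),
        ("Release mouse button",       "B",     0.10),
        ("System response, window",    "R",     1.00),
        ("Calculate quote (complex)",  "M",     2.70),
        ("Return hand to keyboard",    "H",     0.40),
        ("Type 6 quote characters",    "K * 6", 1.20),
    ]
    total = sum(t for _, _, t in steps)
    print(f"Predicted task time: {total:.2f} s")  # 15.55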

Page 18: Human-Computer Interaction


Analytic Evaluation: Cognitive Walkthrough

• Analyses a design in terms of exploratory learning, i.e. the user:
  • Has a rough plan
  • Explores the system for possible actions
  • Selects the apparently most appropriate action
  • Interprets the system’s response and assesses whether it progresses the task
• Suits systems primarily learned by exploration, e.g. walk-up-and-use
• Overall question: how successfully does this design guide the unfamiliar user through the performance of the task?

Page 19: Human-Computer Interaction


Analytic Evaluation: Cognitive Walkthrough - Key Questions

Simulation of exploration, selection and interpretation at each state of interaction (a procedural sketch follows the questions below):

• Will the correct action be made sufficiently evident to the user?

• Will the user connect the correct action’s description with what he or she is trying to do?

• Will the user interpret the system’s response to the chosen action correctly, that is, will the user know if he or she has made a right or a wrong choice?
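As a hypothetical sketch (the step records and the ticket-machine task are invented; the three questions are the slide's), the questions can drive a per-step walkthrough log:

    # Apply the three walkthrough questions at every step of a task;
    # any 'no' answer flags a likely failure point for a new user.
    QUESTIONS = (
        "Is the correct action sufficiently evident to the user?",
        "Does the action's description connect with the user's goal?",
        "Will the user interpret the system's response correctly?",
    )

    def walkthrough(steps):
        """steps: list of (action, (bool, bool, bool)) pairs,
        one boolean per question above."""
        for action, answers in steps:
            for question, ok in zip(QUESTIONS, answers):
                if not ok:
                    print(f"Problem at '{action}': {question}")

    # invented example: first steps at a walk-up-and-use ticket machine
    walkthrough([
        ("Touch 'Buy ticket' on the idle screen", (True, True, True)),
        ("Pick fare type from a coded list",      (False, True, True)),
    ])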

Page 20: Human-Computer Interaction


Lecture Review

• Definition and motivation
• Industrial practice and interest
• Types of evaluation
• Analytic methods
  • Heuristic evaluation
  • Keystroke level model
  • Cognitive walkthrough