Topics for today: April 7, 2004
• Finish discussion of error recovery/documentation from last week
• Ratner Ch. 9 – study of improved error messages
• Ratner Ch. 15 – study of “live help” systems
• Finish discussion of usability life cycle from Monday
• Ratner Ch. 4 – methodology for cost/benefit analysis of usability engineering activities
Final exam: Wed April 21, 8:00 a.m., 424 HA
Documentation Guidelines- Organization
• State the educational objectives of each section
• Introduce concepts in a logical order of increasing difficulty
• After each “chunk” of material (7 ± 2 concepts): provide a “walkthrough” example showing how the concepts are used.
• Avoid forward references
Documentation Guidelines - Appearance & Style
• CONSISTENCY - Develop written guidelines for consistent organization, style, and appearance
• READABILITY - Use white space and text-organizing conventions to avoid large text blocks. (use headings/subheadings, bullet lists, short paragraphs)
• SIMPLICITY - Use simple writing style, even if users are well-educated (users are engaged in many tasks at once)
Tutorial Material
• Should describe capabilities at task/functional level.
• Should describe capabilities in an action-oriented way.
• Use a conceptual model (OAI model) to structure explanations
• Start by explaining the task model objects, from the highest level down to “atomic” elements.
• Then explain the task model actions, from the user’s goals down to specific action steps.
• Once the user understands the task objects and actions, show the interface model objects and the mechanisms or command syntax needed to accomplish tasks.
• Finally, describe shortcuts.
Object/Action Interface Model (Shneiderman, Sec. 2.3)

Task Model (information design stage)
• Objects: domain information
• Actions: system tasks, steps

Interface Model (mechanism/visual design)
• Objects: program objects, visible objects, visible symbols
• Actions: user operations, physical actions
Creating Good Documentation - Summary
Good:
• Progressive approach
• Task-oriented examples
• Readable explanations

Bad:
• Complete specification presented in one text block
• Abstract formal notations
• Terse technical prose or complex prose style
On-line Help
Pros:
• It’s there whenever you need it.
• Can be updated at low cost
• Enhanced by string search, indexes, TOC, bookmarks, hypertext links
• Use of color, sound, animation

Cons:
• Readability may be less than printed manuals
• Presents another user interface to master
• Blocks user’s view of workspace
Reading from Paper v. Displays
Studies through the 1980s showed performance disadvantages in reading from display screens: about 30% slower task times and slightly lower accuracy.
Readability issues:
• screen size (frequent paging)
• placement (looking down is better; rigid posture)
• contrast, flicker, resolution, curved display surface
• fonts, layout, formatting
Other issues: health concerns, fatigue, and stress
But: Later studies showed no difference with better quality display.
Context-sensitive on-line Help
• For the part of the program that is active
• For a selected object:
   • using a function key (F1)
   • Balloon help
• Prompts for fill-in fields
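The lookup behind context-sensitive help can be sketched in a few lines: when the help key is pressed, the topic shown depends on which part of the program is active. All names below (the table, the widget ids) are hypothetical, not from any particular system.

```python
# Hypothetical mapping from the active UI element to a help topic.
HELP_TOPICS = {
    "search_box": "Type keywords and press Enter to search.",
    "date_field": "Enter a date as YYYY-MM-DD.",
}

def on_help_key(focused_widget_id):
    """Called when the user presses F1; returns help for the active widget."""
    # Fall back to general help rather than showing nothing.
    return HELP_TOPICS.get(focused_widget_id,
                           "No specific help for this item; see the manual.")
```

The same table-driven idea underlies balloon help and fill-in-field prompts: the context (active object) selects the message.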
Networked human help available:
• FAQs
• Help desk
• User discussion groups/Newsgroups
New approaches for on-line help
Four empirical studies
1. Error messages
2. Live help systems
3. Eye-hand coordination
4. Scent of the Web (searching for information)
Advice on reading empirical studies
1. What question or issue is being investigated?
2. Describe the experimental methodology:
   i. What was the set-up (HW/SW)?
   ii. What were subjects asked to do?
   iii. How were the data analyzed?
3. What conclusions were drawn?
4. What additional questions do you have about the methodology?
5. What were the strengths and weaknesses of the study?
6. Do you think the conclusions were justified (why)?
Revising error messages
Background review:
Norman: 3 ways to approach errors
• minimize root causes
• make actions reversible
• make errors easy to discover, and make it clear how to correct them
Norman: 3 kinds of errors
• slip
• mistake
• situational
Shneiderman: 3 attributes of good error messages
• positive tone
• specific
• constructive
(also: non-anthropomorphic)
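A hypothetical before/after rewrite makes the three attributes concrete. The function names and message text below are invented for illustration; they are not from the study.

```python
def bad_message():
    # Vague, hostile tone, no guidance -- everything the attributes forbid.
    return "FATAL ERROR 0x2F: ILLEGAL INPUT."

def good_message(field, value, expected):
    # Specific (names the field and value), constructive (says what to do
    # next), positive tone (no blame), non-anthropomorphic (no "I couldn't").
    return (f"The {field} '{value}' was not recognized. "
            f"Please enter {expected}.")

msg = good_message("date", "4/7/04", "a date in YYYY-MM-DD form")
```

The rewrite costs nothing at run time; the work is in deciding, at design time, what "specific" and "constructive" mean for each input.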
Revising error messages (cont.)
1. What question or issue is being investigated?
2. Describe the experimental methodology:
   i. What was the set-up (HW/SW)?
   ii. What were subjects asked to do?
   iii. How were the data analyzed?
3. What conclusions were drawn?
4. What additional questions do you have about the methodology?
5. What were the strengths and weaknesses of the study?
6. Do you think the conclusions were justified (why)?
Live Help System
Interaction elements:
• Knowledge base of FAQ items
   • Continuously updated by assistants
• User types NL question, matched to FAQs
• Chat interface interacts with human assistants (if retrieved FAQs do not satisfy the user)
   • Feedback on availability of assistants
   • Feedback on your assistant’s “state”
   • Dialog history
   • Text entry area
• User model displayed to assistant
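The FAQ-matching step above can be sketched as simple keyword overlap: score the user's natural-language question against each FAQ item, and escalate to a human assistant when no item matches well. The FAQ entries and threshold below are illustrative assumptions, not the study's actual algorithm.

```python
# Toy FAQ knowledge base: (question, answer) pairs.
FAQS = [
    ("How do I upload an image?", "Use the Upload page and pick a file."),
    ("How do I reset my password?", "Click 'Forgot password' on the login page."),
]

def match_faq(question, threshold=2):
    """Return the best-matching FAQ answer, or None to escalate to chat."""
    words = set(question.lower().split())
    best, best_score = None, 0
    for q, answer in FAQS:
        score = len(words & set(q.lower().split()))  # shared-keyword count
        if score > best_score:
            best, best_score = answer, score
    # Below the threshold, hand the user to a live assistant instead.
    return best if best_score >= threshold else None
```

Real systems would at least strip punctuation, remove stop words, and weight rare terms; the point here is only the retrieve-then-escalate structure.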
Usability Testing of Live Help System
Methodology: field study using Elfwood
Issues to investigate:
• Impact on user attitudes, especially trust
• Quality of support
• Quality of assistants’ work situation
Assistants and users volunteered.
Evaluation by questionnaires (2 for users, 1 for assistants)
Usability Testing of Live Help System (cont.)
Group 1 – users who interacted with assistants
• questions to evaluate efficiency
• questions to evaluate attitude
Group 2 – users who did not interact with assistants. Why not?
• 15% were satisfied with the FAQ
• 38% were “just browsing”
• 24% could not get the system to work
• 29% had no assistants available at the time
Group 3 – volunteer assistants
Design implications for future Live Help System
1. Emphasize the availability of live help, since users don’t expect it.
2. Make the initiation process very easy
3. Do not use platform-dependent software (Java applet)
4. Make availability hours for getting human help clear
5. Provide queuing status
6. Provide a call-back option
7. Use visual and audio alerts when help becomes available
8. Consider email or voice options
Cost-justifying usability
Applying traditional cost-benefit analysis to Web UE projects
Context:
• Complex Web apps vs. simple content-only sites
• Development time and cost approaching other software projects
• Surveys show ease of use is critical to Web success

Some benefit categories for Web sites:
• increased buy-to-look ratios (e-commerce model)
• increased number of visitors (advertising model)
• decreased cost of other customer service channels
• decreased user training cost (internal KM model)
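The first benefit category reduces to simple arithmetic: a higher buy-to-look ratio means more orders from the same traffic. All numbers below are invented for illustration.

```python
# Invented figures for an e-commerce site before/after a usability effort.
visitors_per_month = 50_000
avg_order_value = 40.0
ratio_before, ratio_after = 0.020, 0.025   # 2.0% -> 2.5% of visitors buy

extra_orders = visitors_per_month * (ratio_after - ratio_before)
benefit_per_month = extra_orders * avg_order_value
```

Each benefit category needs its own such formula (visitor counts times ad revenue, support calls avoided times cost per call, and so on) before the costs and benefits can be compared.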
Cost-justifying usability (cont.)
Steps in the methodology:
1. Start with the UE plan
2. Establish analysis parameters
3. Estimate the cost of each lifecycle task in the plan
4. Select relevant benefit categories
5. Estimate monthly benefits
6. Compare costs to benefits:
   i. Benefits per month
   ii. One-time cost
   iii. Payback period
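The final comparison is a one-line calculation: divide the one-time usability-engineering cost by the monthly benefit to get the payback period. The dollar figures below are made up for illustration.

```python
def payback_months(one_time_cost, benefit_per_month):
    """Months until cumulative monthly benefits cover the one-time UE cost."""
    return one_time_cost / benefit_per_month

# e.g. a $60,000 usability effort that yields $15,000/month in benefits
months = payback_months(60_000, 15_000)
```

A short payback period (a few months) is what makes the funding case; a multi-year payback suggests trimming the UE plan.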
Cost-justifying usability (cont.)
Usability Engineering Plan - activities
I. User profile
I. Task analysis (problem scenarios)
I. Usability goal setting
II. Information architecture (activity scenarios)
II. Conceptual design (information scenarios)
II. Paper prototype development
II. Usability test
III. Coordinated mechanism and screen design (interaction scenarios)
III. Document design standards
IV. Live prototype development
V. Usability test
VI. Complete user interface design/prototype
    Usability test
Compare with Nielsen Usability Life Cycle – 7 Stages
I. Preliminary analysis
Know the user
• user characteristics
• users’ current and desired tasks
• functional analysis
• co-evolution of tasks and artifacts
Competitive analysis (automated and non-automated alternatives)
Setting usability goals
• financial impact analysis
Usability Life Cycle (cont.)
II. Early design
Parallel design
Participatory design
• Domain experts (get used up)
• Paper mock-ups or sample screens (not system specs!)

III. Middle design
Coordinated design of the total interface
Apply guidelines and heuristic analysis
Usability Life Cycle (cont.)
IV. Implemented design
Prototyping/scenarios (storyboarding)

V. Empirical testing

VI. Iterative design
Solution may or may not help
Database (hypertext) of design rationale
VII. Studying usability in the field
Cost-justifying usability (cont.)
Usability Engineering Plan – cost components:
• Usability engineer hours
• Developer hours
• User hours
• Equipment
Cost-justifying usability (cont.)
Goals of this activity:
• win funding for UE
• plan appropriate UE programs

Discussion of Web statistics and their limitations:
• number of visitors v. how many were satisfied
• how many bought v. how many did not buy
• how many customer support calls were processed v. how many customer problems were resolved
Better data would lead to after-the-fact validation and greater credibility in the future
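The limitation can be shown numerically: raw counts (visitors, calls processed) are easy to collect but say nothing about satisfaction or resolution, which are the quantities a benefit estimate actually needs. All numbers below are invented.

```python
# Raw Web statistics (cheap to collect) vs. the rates that matter.
visits = 10_000
satisfied = 1_200            # would require survey data, rarely collected
calls_processed = 500
problems_resolved = 350      # also rarely tracked

satisfaction_rate = satisfied / visits              # not 10,000 "successes"
resolution_rate = problems_resolved / calls_processed
```

Tracking the rate, not the raw count, is what would allow after-the-fact validation of a cost/benefit estimate.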