Designing a site (3/4) – 1h Testing and Evaluation.


Transcript of Designing a site (3/4) – 1h Testing and Evaluation.

Page 1: Designing a site (3/4) – 1h Testing and Evaluation.

Designing a site (3/4) – 1h

Testing and Evaluation

Page 2: Designing a site (3/4) – 1h Testing and Evaluation.

Lazar’s Development Lifecycle

• Define the mission & target users

• Collect user requirements

• Create and modify conceptual design

• Create and modify physical design

• Perform usability testing

• Implement and market the website

• Evaluate and improve the website

Page 3: Designing a site (3/4) – 1h Testing and Evaluation.

Testing

Focused on checking the physical design

• Test schedule
– Milestones: modular, alpha, beta

• Black box testing (functional testing)
– Does it do what it’s supposed to? Aims to test a program from the specification

• White box testing (structural testing)
– Is the code correct? Aims to test a program by checking the implemented code

• Revision documentation
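As a small sketch of the black-box idea: the test below exercises a hypothetical `slugify` function purely through its specification (input in, expected output out), without inspecting the implementation. The function and its expected behaviour are illustrative assumptions, not part of the lecture.

```javascript
// Hypothetical function under test: turns a page title into a URL slug.
function slugify(title) {
  return title
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, "-")   // collapse runs of non-alphanumerics into one hyphen
    .replace(/^-+|-+$/g, "");      // strip leading/trailing hyphens
}

// Black-box tests: only the specified behaviour matters, not the code above.
console.assert(slugify("Testing and Evaluation") === "testing-and-evaluation");
console.assert(slugify("  Page 1!  ") === "page-1");
```

A white-box test of the same function would instead check the implemented code itself, e.g. that each regular expression branch is exercised.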

Page 4: Designing a site (3/4) – 1h Testing and Evaluation.

Test Documents

• Test Schedule
– Module/Unit
– Integration
– Fit for Purpose

• Report Forms
– Alpha
– Beta
– Bug report

• Revision document

Page 5: Designing a site (3/4) – 1h Testing and Evaluation.

Software Engineer’s Perspective: Verification

• "Program testing can be used to show the presence of bugs, but never to show their absence" (Dijkstra's dictum)

• Analysing and testing of code is big business. Issues:
– Unit-level
– Integration-level
– Acceptance-level

Page 6: Designing a site (3/4) – 1h Testing and Evaluation.

Other approaches to testing

• Media production
– Catch problems when they happen (take 2, 3, 4…)
– Don’t think you can fix it in the mix/post-production
– But now the studio could make or supply talent!

• Journalism
– Problems in spelling and phrasing (house style)
– Problems in presentation (the “delayed drop”)
– Potential legal liability
– “Tomorrow’s fish and chip wrapper” – or a valuable archive for the future?

Page 7: Designing a site (3/4) – 1h Testing and Evaluation.

Activity - Philosophies of Testing

• Three examples of different approaches
– Software Engineer: functional, mathematical testing?
– Media Producer: is it professional, distinctive?
– Journalist: can I get away with it? Is it truthful? “Good writing”?

• Form into three role-playing groups, one for each of these three professions.

• Identify the five most important criteria for judging whether work is acceptable in your discipline

Page 8: Designing a site (3/4) – 1h Testing and Evaluation.

Philosophies of Testing

• Software Engineer
– Unit: Compliance, Code objects, Bugs
– Integration: Compatibility, Navigation
– Acceptance: Accessibility, Efficiency

• Media Producer
– Unit: Font size/colour, limited use of Flash
– Integration: Links, Navigation, Menu bar clear, Colour scheme, Grouping layout

• Journalist
– Unit: Titles, Headings, Compliance
– Integration: Links, Legal, Group decisions
– Acceptance: Formatting, Accessibility, Up-to-date

Page 9: Designing a site (3/4) – 1h Testing and Evaluation.

Verification: unit-level test

• Before you write your own JavaScript, consider how it will be maintained.

• eXtreme Programmers spend more time writing automated test-scripts than code.

• England & Finney: "The lower level of coding you undertake, the longer it will take".

• Code generators (e.g. Dreamweaver) can be just as easy to manage as binary file producers (e.g. Flash), as long as you don’t tinker with the code.

• Is there much to unit test in a web page?
– HTML/XHTML valid?
– Navigation links present and correct?
– Accessibility compliance tested and checked?
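The navigation-link check above is exactly the kind of thing an automated test-script can cover. The sketch below assumes the page’s link targets have already been extracted into an array of href strings; the required-link list is a made-up example, not a real site map.

```javascript
// Sketch of a navigation-link unit test. The required links below are
// illustrative assumptions; substitute your own site's navigation targets.
const requiredLinks = ["index.html", "about.html", "contact.html"];

function missingLinks(pageHrefs, required) {
  // Return every required link that does not appear among the page's hrefs.
  return required.filter((link) => !pageHrefs.includes(link));
}

// Example run against a page that lacks its contact link:
const hrefs = ["index.html", "about.html", "news.html"];
console.assert(missingLinks(hrefs, requiredLinks).join() === "contact.html");
```

Run as part of a test schedule, a non-empty result from `missingLinks` would feed straight into a bug report form.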

Page 10: Designing a site (3/4) – 1h Testing and Evaluation.

Web Validation Tools

• http://netmechanic.com/ (web pages)

• http://www.w3.org (HTML, CSS)

• http://webxact.watchfire.com (accessibility)

Page 11: Designing a site (3/4) – 1h Testing and Evaluation.

Verification: Acceptance Testing

• Match the requirements specification

• But what if there is no requirements specification? Then you use pragmatic tests:
– Justifies payment of invoice
– Adequate for release to customers
– Impact assessment

Page 12: Designing a site (3/4) – 1h Testing and Evaluation.

Some categories

• Low-level, generally simple problems
– Spelling mistakes, incorrectly typed links

• Mistakes that only happen once you make the site live
– “Hard-coded” rather than “relative” links
– Download times

• More abstract problems
– Does it communicate with the audience?
– Is it entertaining?
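The hard-coded link problem can be caught mechanically before the site goes live. The helper below is a minimal sketch: it flags hrefs that embed an absolute origin rather than a relative path. The development URL shown is a hypothetical example.

```javascript
// Flag links that hard-code an origin instead of using relative paths.
function findHardCodedLinks(hrefs) {
  return hrefs.filter((href) => /^https?:\/\//i.test(href));
}

const links = [
  "http://localhost:8080/about.html", // hard-coded: breaks once the site moves
  "images/logo.png",                  // relative: survives a move to the live server
  "../index.html",
];
console.assert(findHardCodedLinks(links).length === 1);
```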

Page 13: Designing a site (3/4) – 1h Testing and Evaluation.

Smithson & Hirschheim – 3 layers of “quality”

• Efficiency
– Hardware/software monitor, Quality assurance (manufacturing)
– Simulation, Total quality management
– Code inspection, Software metrics, Clean-room

• Effectiveness
– System usage, Resource utilisation (Economics)
– Critical success factors, Cost benefit analysis (Management)
– User satisfaction (Organisational behaviour, Risk analysis)
– Gap analysis (Marketing, Management)

• Understanding…

Page 14: Designing a site (3/4) – 1h Testing and Evaluation.

Evaluation

Focuses on the conceptual design

• Formative evaluation (“Usability testing”?)
– carried out during the development cycle
– information gained is fed back to influence the development

• Summative evaluation
– normally done on completion of the project
– have aims and objectives been met?

• Integrative evaluation
– carried out once the application is being used
– has the system as a whole been improved?

• Evaluation can be qualitative and quantitative

Page 15: Designing a site (3/4) – 1h Testing and Evaluation.

Evaluation – who does it?

Users
• Perform tasks (appropriate and realistic)
• Observe and collect data (errors, misunderstandings, task completed successfully/unsuccessfully, time taken)
• Debriefing (questionnaire, focus groups)

Experts
• Heuristic evaluation
• Guidelines
• Cognitive walkthrough
• Consistency

Page 16: Designing a site (3/4) – 1h Testing and Evaluation.

Usability testing – where?

• Usability Laboratory– Artificial situation– More control

• In the context of use– Realistic– Harder to control

• Over the web

• Anywhere?

Page 17: Designing a site (3/4) – 1h Testing and Evaluation.

Data collection from the user

• How many users?
• Task performance (time performance, errors encountered)
• Observation
• Video
• Tracking
• Thinking-aloud protocol
• Audio recording
• Questionnaires
• Interview/Focus group
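Figures like “time taken” and “errors encountered” can be reduced from a simple event log collected during a session. The event shape below (`{t, type}`) is an assumption for illustration; real logging or tracking tools will record richer data.

```javascript
// Summarise one user's task from a list of timestamped events.
// The event shape {t: milliseconds, type: "start"|"error"|"done"} is assumed.
function summariseTask(events) {
  const start = events.find((e) => e.type === "start");
  const done = events.find((e) => e.type === "done");
  return {
    completed: Boolean(done),
    timeTakenMs: done ? done.t - start.t : null, // null if task never completed
    errors: events.filter((e) => e.type === "error").length,
  };
}

const log = [
  { t: 0, type: "start" },
  { t: 4000, type: "error" },   // e.g. the user clicked the wrong menu item
  { t: 12500, type: "done" },
];
console.assert(summariseTask(log).timeTakenMs === 12500);
console.assert(summariseTask(log).errors === 1);
```

Aggregating such summaries across users gives the quantitative side of the evaluation; debriefing questionnaires supply the qualitative side.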

Page 18: Designing a site (3/4) – 1h Testing and Evaluation.

Usability testing steps

1. Select representative users and set up time/location for usability testing

2. Introduce the usability test (tell them the system is under test – not the user), get informed consent, give users the tasks

3. Ask users to attempt to perform tasks; collect data (notes, audio- /video-tape, logging)

4. After the usability test, ask users to fill out a questionnaire

5. Thank users for participation (and compensate them)

Page 19: Designing a site (3/4) – 1h Testing and Evaluation.

Data Collection from Experts 1/2

• Heuristic Review
– e.g. see Shneiderman’s 8 golden rules or Nielsen’s 10 Usability Heuristics

• Guideline Review
– An expert review against a larger list of design guidelines

Page 20: Designing a site (3/4) – 1h Testing and Evaluation.

Data Collection from Experts 2/2

• Cognitive Walkthrough (tasks/goals) – at each step:
– Does the user know what to do?
– Does the user know how and where to do it?
– Does the user know what they have done?

• Consistency
– Design & Navigation

• Formal Usability Inspection
– Designers justify choices to expert reviewers screen by screen.

Page 21: Designing a site (3/4) – 1h Testing and Evaluation.

Heuristics: Shneiderman’s 8 golden rules of dialogue design

• Strive for consistency
• Enable frequent users to use shortcuts
• Offer informative feedback
• Design dialogs to yield closure
• Offer simple error handling
• Permit easy reversal of actions
• Support internal locus of control
• Reduce short-term memory load

Page 22: Designing a site (3/4) – 1h Testing and Evaluation.

Nielsen’s 10 Usability Heuristics

1. Visibility of system status

2. Match between system and the real world

3. User control and freedom

4. Consistency and standards

5. Error prevention

6. Recognition rather than recall

7. Flexibility and efficiency of use

8. Aesthetic and minimalist design

9. Help users recognise, diagnose and recover from errors

10. Help and documentation

Page 23: Designing a site (3/4) – 1h Testing and Evaluation.

Cooperative Usability Evaluation (Monk et al, 1995)

1. Prepare a set of scenarios

2. Each scenario is a story of a particular use-case of your website

3. Try to prepare different scenarios

4. From each identify the tasks users will have to perform

5. Tasks consist of a series of activities that are performed to accomplish a goal, e.g.
– Place an online order for a textbook
– Find out some specific information from the web site
– E-mail feedback to the client