
©Brian Whitworth 2005, bwhitworth@acm.org


THE WEB OF SYSTEM PERFORMANCE (WOSP)

Excellence requires Balance


Overview

I. System performance issues & problems

II. The Web of System Performance (WOSP) model

III. Implications for system design

IV. Validation experiments on how users evaluate information systems

V. Summary tips


Past Predictions

– Leisure society (4-day week) – cf. a 6-day week!
– Paperless office (Toffler) – cf. more paper than ever before!
– No more programmers (Martin) – programmers still in demand
– AI will replace people – we still prefer human telephone help
– Star Trek's video-phone – we have the technology, but it didn't happen
– Video-conferencing – millions of dollars invested, but it didn't happen
– Virtual-reality (headset) games – cf. MMORPGs & WADs
– A mainly multi-media Internet – the Internet is still mainly text; major advances like blogs and wikis are just text!
– E-commerce will dominate trade – dot-coms note, it is < 10%

People who invested in these predictions, like Internet bandwidth and video-conferencing, lost money


Bleeding Edge Theory

• Media Richness Theory
  – Rich medium → rich message?
  – A "richness" dimension?
  – "Lean" e-mail, chat & text-messaging are the success stories!
• Technology Acceptance Model (TAM)
  – Usefulness + Usability define end-user acceptance
  – Yet Security, Privacy and Reliability are key issues!
    • Microsoft spent millions on NT to make Windows more reliable
  – "Mr Clippy" (Word's Assistant) was a Bayesian, smart, friendly animation, yet was such a failure that its exclusion was an XP selling point!


Leading Edge Practice

• Practitioners develop "unpredicted" products like:
  – E-mail: Considered too simple a text messaging system
  – Chat rooms: Not multi-media, so not predicted
  – WWW: Pundits argued it needed centralized control
  – HTML: Experts felt it was too simple a tag language
  – Cell phones: Originally thought to be a yuppie toy
  – Google: Was a simple search engine (unlike multi-media Yahoo)
  – Text messaging: Why text with a multi-media phone? Because!

Is IS theory out of touch with IS practice?


Yet we need theory

• Imagine if we ran an atomic or nuclear program by trial and error.
• Imagine if we ran the space shuttle program by trial and error.
• Are we developing the technical infrastructure of the future world society by trial and error?
  – Theory distills experience, and lets us avoid costly mistakes. It prevents predictable errors.

We Need Better IS Theories!


Question

• Propose: The problem is a limited view of system performance.
• Question: What is system performance?
  – Is it one, two or multi-dimensional?
  – Is it absolute or relative to the system environment?
  – What happens when performance increases?
  – How does one produce high performance systems?


Performance Tensions

• Alexander's "Notes on the Synthesis of Form" raised the idea of "design tension" over 40 years ago. IS developers now use it as "Pattern Theory". It argues that system performance is always a balancing act.
• A vacuum cleaner is a design "form" in the following problem design space:

[Diagram: design space spanning Reliability, Usability, Functionality and Cost]


Evolution Resolves Performance Tensions

• Successful biological systems range from simple viruses to powerful predators
  – They balance performance design tensions differently
  – Not just the strong are "fit"
  – One-sided "excellence" can cause extinction
• Likewise, information systems may have a virtual "evolution", where success has many forms, e.g.
  – Mobile laptops vs more powerful desktops
  – "Light" background utilities vs mainline applications
  – One-purpose add-ins vs all-purpose suites
  – Open source public vs closed private systems


System Levels

The model proposes four levels of "system":

1. Social: norms, culture, laws, zeitgeist, sanctions, roles (Sociology)
2. Personal: semantics, attitudes, beliefs, opinions, ideas (Psychology)
3. Software: programs, data, bandwidth, memory, processing (Computing)
4. Hardware: computers, wires, printer, keyboard, mouse (Engineering)


Overlapping levels

• Each level is a different "view", not a different system! (See Alter; Grudin; Kuutti, 1996)
  – Physics --> Data: Physical hardware actions create data or information (as Shannon & Weaver define it)
  – Data --> Meaning: Data/information creates personal meaning (based on human psychological processes)
  – Meaning --> Culture: Personal meanings common across a group that continue over time create a culture
• Each level depends on and "emerges" from the previous
• Higher levels offer higher performance benefits, e.g. the benefits of cooperative society


People vs Technology?

• A plane is a technical system, but plane plus pilot is also a system. How should people relate to technology?
  – Human-like technology? Machines that replace people, e.g. the Terminator, IBM's Deep Blue, the movie AI.
  – Technologized people? Machines that mechanize people, e.g. from the industrial dark ages to the Matrix.
  – People dominated by technology add-ons? E.g. Star Trek's Borg, Star Wars' Darth Vader.
• Human and machine can only work together if technology supports human processes


Technology support for human processes

• Define the human process, then design the technology:

  Human Process           Technology Support
  1:1 conversation        Email
  Many-sense processing   Multi-media systems
  Trading                 eBay-type systems
  Human foraging          Browser design
  Group conversations     Chat
  Associative memory      Hypertext
  Learning by feedback    The wonderful Back button
  Normative behavior      Reputation systems

• NOTE: THERE IS NOT JUST ONE SOLUTION!


Levels combine

[Diagram: nested levels – Mechanical → Hardware system; + Informational → Software system; + Personal → Human-computer interaction (HCI) system; + Social → Socio-technical system (STS)]


Social-Technical Systems

• Hardware systems involve physical exchanges
• Technology systems are hardware + data flows
• HCI systems are technology plus people
• Social-technical systems (STS) are social systems, built on HCI systems, built on software systems, built on hardware systems
• A social-technical system involves all four system levels
• The Web of System Performance (WOSP) is a multi-goal model of the IS design space for any level of a social-technical system


System performance involves?

• Four common system elements:
  – Boundary: monitors system entry/exit
  – Internal structure: supports and controls
  – Effectors: generate output effects
  – Receptors: process input signals
• Examples:
  – People: skin, brain and organs, muscles and senses
  – Computers: case, motherboard architecture, peripheral output and input


System performance is?

How well a system interacts with its environment

• An environment can offer any system:
  – Opportunity – the environment gives benefits, and/or
  – Risk – the environment gives damage or loss


WOSP Goals

• Each system element can be designed to gain value (opportunity) and/or avoid loss (risk):
  – Boundary
    • To enable useful entry (extendibility)
    • To deny harmful entry (security)
  – Internal structure
    • To accommodate external change (flexibility)
    • To accommodate internal change (reliability)
  – Effector
    • To maximize external effects (functionality)
    • To minimize internal effort (usability)
  – Receptor
    • To enable meaning exchange (connectivity)
    • To limit meaning exchange (privacy)


A System is not “high performance” if:

1. Ineffectual – it cannot do the job.
2. Unusable – you cannot make it work.
3. Unreliable – it breaks down often.
4. Insecure – it succumbs to viruses.
5. Inflexible – it fails when things change.
6. Incompatible – it cannot import standard plug-ins or data.
7. Disconnected – it cannot communicate.
8. Indiscreet – it reveals private information.


Effectors: change external environment

Functionality (what it can do)

Usability (reduced effort to do)


Structure: controls and maintains

Flexibility (performs differently)

Reliability (performs the same)


Boundary: controls entry/exit

Security (can reject)

Extendibility (can accept)


Receptors: control communication

Connectivity (can communicate)

Privacy (can hide)


[WOSP diagram: the eight performance dimensions – Functionality, Usability, Connectivity, Security, Privacy, Reliability, Flexibility and Extendibility – as the spokes of an octagonal web]


Functionality

• Goal: To act on the environment
• What can the system do?
• Also called: functionality, capability, power, usefulness, effectiveness
• Examples:
  – A car that goes fast, stops well, turns well, etc.
  – Software that does what the user wants


Functional systems tend to be:

• Multi-functional
• All-purpose (e.g. Office suites)
• Feature driven ("bloatware")
• Able to always "get the job done"
• Complicated – large menus
• Used at 50% or less of their potential by most people
• Hard to use


Usability

• Goal: To operate efficiently or easily, to minimize the resource costs of action
• How easy is it to operate/run?
• Also called: easy to use, simple, user friendly, efficient
• Examples:
  – A car that is easy to drive, with simple controls
  – People can use usable software without a lot of training or help
  – HTML is usable, email is usable


Usable systems tend to:

• Be easy to use, learn and operate (intuitive)
• Require little training, help or documentation
• Provide only what is necessary now (contextual screen information)
• Be simple, focus on essentials, remove unnecessary "bells and whistles"
• Be less powerful


Reliability

• Goal: To continue operating despite internal data or component failure (errors)
• How often can it perform?
• Also called: dependability, stability, trustworthiness, ruggedness, MTBF, robustness, durability, maintainability, recoverability
• Examples:
  – A reliable car starts every day
  – Reliable software doesn't "hang" with internal "bugs" or data errors
  – Windows reliability has increased over the years – XP was sold on this


Reliable systems tend to:

• Be long lasting (so lifetime warranties are economic)
• Be predictable (so users can plan)
• Have redundant parts – if one part fails, another takes over
• Be minimally coupled – one error does not cause another
• Recover quickly from breakdown (module or data recovery)
• Have Undo/Back operations to reverse errors
• Be hard to change


Flexibility

• Goal: To still work in new environments, e.g. by changing its operation
• Where can it perform?
• Also called: adaptability, tailorability, customizability, portability, platform independence, plasticity, agility, modifiability, pervasiveness
• Examples:
  – An all-terrain vehicle works anywhere
  – Mobile/pervasive computing lets people use software in any environment


Flexible systems tend to be:

• Easily customized to individuals (e.g. control panel)
• Easily "fitted" to different situations (e.g. hardware-independent software, OS-independent programming languages)
• Able to learn usage trends (e.g. cache prediction algorithms)
• Demand driven (e.g. CSMA/CD Ethernet vs polling networks)
• Interconnected, holistic – one change can affect all
• Error prone (e.g. changing the Windows 95 screen display could be catastrophic! – until an Undo was provided)


Extendibility

• Goal: To use outside components/data
• With what can the system perform?
• Also called: tool use, openness, scalability, interoperability, standards, permeability, compatibility
• Examples:
  – Cars can fit add-ons (e.g. towbar, spoiler, rack ..) if the towbar ball is a standard size
  – Open software, like Netscape, accepts "plug-ins"
  – Clip and paste between applications allows one system to use another's data easily


Extendible systems tend to:

• Accept third-party "add-ons" (the open IBM PC vs the sealed Macintosh)
• Be open source (system code openly displayed)
• Support general standards
• Be easily added to – Open Systems Architecture, "Plug & Play"
• Be easily scalable (like the World Wide Web)
• Have decentralized control – object-oriented
• Not be as secure


Security

• Goal: To resist or repel unauthorized entry, misuse or takeover
• Who controls the system?
• Also called: anti-virus, firewall, protectiveness, defense, integrity, safety
• Examples:
  – A secure car's lock cannot be picked
  – Secure software (and data) cannot be used or modified by hackers or viruses


Secure systems tend to be:

• Centrally controlled, bureaucratic
• Strong on boundary checks (logon codes, passwords)
• Resistant to external attack or entry
• Internally idiosyncratic
• Risk avoiding rather than opportunity enhancing
• Hard to add on to, not opportunistic


Connectivity

• Goal: To communicate with other similar systems
• Who can we talk to?
• Also called: communication, information exchange, networking, sociability, interactivity
• Examples:
  – A connected car could sense other cars and avoid a crash
  – Connected software uses the Internet: email, chat rooms and groupware


Connected systems tend to:

• Be informed by other systems, and "group aware"
• Be up to date by download/upload
• Have many-channeled (multi-media), two-way duplex (reciprocal), high-linkage (many-to-many) communication
• Have a low "cycle time" (user complaints, software patches)
• Experience information overload/distraction


Privacy

• Goal: To control internal information release and disclosure
• Who sees us?
• Also called: Tempest-proof, opaqueness, privacy, autonomy, camouflage, stealth, confidentiality, secrecy
• Examples:
  – A car with tinted windows, a radar scrambler
  – Privacy software controls interactions with others (e.g. Black Ice, Zone Alarm)


Private systems tend to:

• Be regenerative ("packing" a file needs privacy)
• Restrict information use (e.g. digital signatures, copy protection)
• Encrypt their information
• Respect rights to own information, where one needs permission before data access
• Protect personal information (documents, credit cards)
• Be hard to connect to


Performance is a Balance


Derived vs Observed

• WOSP goals were derived from the general nature of systems
• Yet they match observed system requirements quite well
• Current system requirements are very confused and confounded
• E.g. are privacy and reliability part of security?


WOSP Area = Performance

• The WOSP area best estimates "performance"
• Performance (the area) needs all the WOSP dimensions
• "Experts" predict based on recent advance(s)
• If the most recent advance is the strongest dimension, it also has the greatest "tension" with the other factors
• So it soon gives diminishing returns
• Then IS progress moves "unexpectedly" to a dimension where the tension is less
• The gaming development of the last decade was not graphics but social gaming (MMORPGs – massively multiplayer online role-playing games)
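The "area" idea can be made concrete. On a radar chart with the eight goals as axes 45° apart, the web is eight triangles, and each triangle needs both of its adjacent dimensions, so a lopsided profile covers less area than a balanced one with the same total score. A geometric sketch, assuming equal axis spacing and an arbitrary 0–10 score scale:

```python
import math

def wosp_area(scores):
    """Area of the WOSP radar polygon for eight goal scores.

    Adjacent axes are 45 degrees apart, so the polygon is eight
    triangles, each of area 1/2 * r_i * r_{i+1} * sin(45 deg).
    """
    assert len(scores) == 8
    s = math.sin(2 * math.pi / 8)
    return sum(0.5 * scores[i] * scores[(i + 1) % 8] * s for i in range(8))

balanced = wosp_area([5] * 8)                     # all-round system
lopsided = wosp_area([12, 4, 4, 4, 4, 4, 4, 4])   # same total, one "excellent" axis
print(balanced > lopsided)  # True – balance beats one-sided excellence
```

This is also why one-dimensional progress gives diminishing returns: raising a single score only grows the two triangles it touches.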


WOSP Lines = Tensions

• WOSP lines are cross-cutting tensions
• Increasing any aspect of performance means any other can "bite back" (Edward Tenner)
  – "Open" systems can increase security risks, and more security makes systems less open, e.g. the US
  – More functions can mean more for users to learn
  – A flexible control panel can reduce reliability, when users change the wrong things
• Yet one can have security and openness, ease of use and functionality, connectivity and privacy, etc.
• As systems advance, progress needs tension resolution


Innovation can resolve tensions

• Find innovative ways to reconcile "opposites":
  – Functionality + Usability = elegance
  – Extendibility + Security = discrimination
  – Reliability + Flexibility = autonomy
  – Connectivity + Privacy = legitimacy
• Avoid one-dimensional progress (it bites back)
• Find progress at the points of least tension

System design is an art as well as a science


Expanding the Web

• In 1992, Apple CEO John Sculley introduced the handheld Newton, saying portability (flexibility) was the wave of the future – he was right
• But the Newton's portability reduced its data-input usability, and in 1998 Apple dropped the line
• When Palm's Graffiti language improved handwriting recognition, the PDA market took off
• Innovations may need combination advances


Why are "killer apps" functionally simple?

• A basic email system can be written in a weekend, yet email is a killer app
• When systems begin, the web is "slack". As they evolve, tensions arise
• A simple new application allows WOSP performance expansion
• A powerful new function, like video e-mail, is much more difficult to support all round

Minimize functionality to increase your system performance!


WOSP shape/profile and Environment "Fit"

• Performance has no "perfect" form
• In opportunity environments, right action gives benefit, favoring the four success-creating goals:
  – Functionality, Reliability, Extendibility and Connectivity
• In risk environments, wrong action gives great loss, favoring the four failure-avoiding goals:
  – Security, Privacy, Usability, Flexibility
• Different environments, organizations and applications favor different performance "shapes"


Analyze the performance profile

• What % of your project should be spent on:
  – Security: Resist outside attack/takeover?
  – Extendibility: Use outside components/data?
  – Functionality: Provide the required functionality?
  – Usability: Conserve user/system effort?
  – Reliability: Avoid/recover from internal failure?
  – Flexibility: Predict/adapt to external changes?
  – Connectivity: Communicate with other systems?
  – Privacy: Control self-disclosure?
• Don't spend 90% of project effort on what will be only 50% of system performance


Convert WOSP goals to performance requirements

• For a browser:
  – Privacy
    • Any sensitive information I give the browser, like logon passwords, is encrypted, so others can't see it
    • Password information always shows as asterisks, so others cannot look over my shoulder to see them
    • It stops web sites from getting my name or email from my computer's data
  – Security
    • It can detect and prevent spyware from installing
    • It can detect and prevent popup ads


System design "layers"

• Specialist specifications/teams for:
  – Functionality: Traditional specs (main module)
  – Usability: User costs (HCI) (interface, help, wizards)
  – Reliability: Error analysis (error & recovery code)
  – Flexibility: Portability analysis (control panel, preferences)
  – Extendibility: Compatibility analysis (plug-in manager, import/export data, clip and paste)
  – Security: Threat analysis (logon/registration module)
  – Connectivity: Internet/network analysis (comms module)
  – Privacy: Legitimacy analysis (rights control module)


Design Integration

• For projects producing known and established systems, whose goals are well known and well defined, specialization and specialists maximize results (they give more local output per local criterion)
• Projects producing new systems, whose goals are not well known or defined, require integration as well as specialization, to increase synergy and decrease cross-cutting conflicts


Increasing Integration

• A cross-disciplinary leader or project "guru" – hard to find
• Increased cross-discipline communication: regular cross-specialist meetings under multi-discipline chairs – hard to run
• Combined specialty teams, e.g. for: actions (functionality + usability), interactions (security + extendibility), contingencies (reliability + flexibility) and sociability (connectivity + privacy)


Extreme Programming

• For small teams producing innovative/new projects:
  – Everyone contributes to every system part
  – All team members are involved in every aspect of system design
  – Common entry point for all system changes
• Maximizes system integration (goal synergies)
• Minimizes design conflicts


Validation – Conjoint Analysis

• Subjects were told they were company managers who had to choose a new common browser for their company
• They were given 30 browsers to choose from, each with different ratings on the eight WOSP factors
• They were then asked to rate and rank the browsers


Results: Criteria Weights

Performance Factor   Avg. Importance   Std Dev.   99% Conf.      % > 12.5
Security                 22.78           12.78    16.07–29.50      70.83
Privacy                  15.47            9.19    10.64–20.30      58.33
Usability                14.16            9.88     8.97–19.36      50.00
Functionality            12.02            8.21     7.70–16.33      29.17
Reliability              11.64            8.15     7.36–15.93      33.33
Connectivity              9.24            6.54     5.80–12.68      33.33
Extendibility             7.69            4.56     5.30–10.09      16.67
Flexibility               6.99            6.46     3.59–10.39      16.67

Security, Privacy & Usability came before Functionality!


Analytic Hierarchy Process (AHP) Study

• TAM proposes that perceived usefulness and ease of use affect technology acceptance (2 factors)
• Subjects used both TAM and WOSP criteria
• Subjects rated 3 browsers using AHP pair-wise comparisons
• They then compared the two methods
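AHP derives criterion weights from a matrix of pair-wise comparisons; a common approximation takes the geometric mean of each row and normalizes. A generic sketch of the method with toy numbers (not the study's actual comparison data):

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights from a pairwise comparison matrix.

    matrix[i][j] says how much more important criterion i is than j
    (a reciprocal matrix: matrix[j][i] == 1 / matrix[i][j]). The geometric
    mean of each row, normalized, approximates the principal eigenvector.
    """
    n = len(matrix)
    gm = [math.prod(row) ** (1 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Toy 3-criterion example: A is 3x as important as B and 5x as important as C.
m = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_weights(m)
print(round(sum(w), 6))  # 1.0 – the weights are normalized
```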


Results: TAM vs WOSP

Effect                 N    TAM    WOSP   df   MS     F      p
Confidence             20   3.7    2.4    1    16.9   12.8   .002*
Accuracy               20   3.6    2.45   1    13.2   12.4   .002*
Completeness           19   4.21   2.42   1    30.4   16.8   .001*
Outcome Satisfaction   18   3.14   2.35   1    5.64   9.63   .006*


CA vs AHP WOSP Ranks

Criteria        AHP Weights          Rank
                Local    Global      AHP   CA
Extendibility   0.21     0.04        8     7
Security        0.79     0.17        2     1
Flexibility     0.43     0.09        7     8
Reliability     0.57     0.12        4     5
Usability       0.58     0.09        3     3
Functionality   0.42     0.12        6     4
Privacy         0.64     0.20        1     2
Connectivity    0.36     0.11        5     6

Has the Web become a threat environment?


What Users Want

• Functionality and Usability
• Reliability and Flexibility
• Security and Extendibility
• Connectivity and Privacy
• They want it all!
• Why? We ourselves are such a balanced system, so we expect no less of the systems we work with


Design Tips Summary

• Define the human process, then design the technology
• Less (functionality) may mean more performance (avoid the Version 2 flop problem)
• Match project effort to your app's performance profile
• For new applications, increase design integration by cross-disciplinary staff, or more/better communication
• Expect cross-cutting conflicts
• Convert WOSP goals to technology requirements
• Check your web of system performance – success needs many causes, but system failure needs only one


Design Questions

• What is the human process/task your technology will support?
• What is your system level? (S/W, HCI, STS)
• What is your application's performance profile? (e.g. what are your top 3-4 criterion goals?)
• What are the specific requirements of each goal?
• Do you have cross-cutting design issues?
• What designs satisfy this "performance space"?


Conclusion

Excellence requires Balance

See brianwhitworth.com for example papers:
1. The Web of System Performance
2. Polite Computing
3. Spam and the Social-Technical Gap
4. Legitimate by Design
5. Voting Before Discussing


TAM

• TAM proposes that perceived usefulness and perceived ease of use affect technology acceptance
• It is a system characteristics tension model

[Diagram: Perceived Usefulness and Perceived Ease of Use → Behavioural Intention → Use Behaviour]


Unified Theory of Acceptance and Use of Technology (UTAUT)

[Diagram: Performance Expectancy, Effort Expectancy, Social Influence and Facilitating Conditions → Behavioral Intention → Use Behavior, moderated by Gender, Age, Experience and Voluntariness of Use]


WOSP & UTAUT

[Diagram: as UTAUT, with the WOSP Criteria replacing Performance and Effort Expectancy – WOSP Criteria, Social Influence and Facilitating Conditions → Behavioral Intention → Use Behavior, moderated by Gender, Age, Experience and Voluntariness of Use]


AHP Design

[Diagram: AHP hierarchy – Main Goal: select software for system performance; Criteria: WOSP dimensions vs TAM dimensions; Alternatives: the software packages rated; Output: WOSP rankings and TAM rankings, with a weight per criterion and alternative]


Technology acceptance

• Three factor types affect technology acceptance:
  1. System characteristics: Is it useful, easy to use, secure, etc.? E.g. TAM (original)
  2. Individual characteristics: Age, gender, experience, attitude to computers. E.g. social cognitive theory
  3. Organizational characteristics: Corporate values, technology infrastructure, social structure, normative effects. E.g. innovation diffusion theory

WOSP is a system characteristics tension model, but is still compatible with theories 2 and 3.


Summary

• One-dimensional progress bites back
• Progress is a train on many tracks at once
• Advanced system performance has many goals
• Reconciling these goals requires innovation
• Innovation finds the points of least tension
• System design is an art as well as a science, needing integration as well as specialization
• The "killer" advances of the last decade (email, browsers, chat, blogs) succeeded by balance as well as excellence