Multimodal, crossmedia, multi platform

Input modalities

UXD minor theme ‘Multimodal, Crossmedia and Multi-Platform’


Transcript of Multimodal, crossmedia, multi platform

Page 1: Multimodal, crossmedia, multi platform

Input modalities

UXD minor theme ‘Multimodal, Crossmedia and Multi-Platform’

Page 2: Multimodal, crossmedia, multi platform

Quarterly programme

22 November: BuzzCapture + theme intro

29 November: Workshop on the assignments

6 December: Philips Design: service design

13 December: TBD

3 January: Fabrique: ‘Mental Notes’ workshop

Page 3: Multimodal, crossmedia, multi platform

Quarterly programme

10 January: Usevine: the iPad and the second screen

17 January: TBD

24 January: TBD

31 January: Final presentations and wrap-up

Page 4: Multimodal, crossmedia, multi platform

Learning new words

Some important UXD concepts

Page 5: Multimodal, crossmedia, multi platform

Theme in the scheme of things

Media, modalities and platforms provide us with the nuts and bolts of the user experience.

The quality of the user experience is determined by our ability to utilize the media, modalities and platforms at our disposal.

Page 6: Multimodal, crossmedia, multi platform

Crossmedia

‘Crossmedia (also known as Cross-Media, Cross-Media Entertainment, Cross-Media Communication) is a media property, service, story or experience distributed across media platforms using a variety of media forms.’

http://en.wikipedia.org/wiki/Crossmedia

Page 7: Multimodal, crossmedia, multi platform

Multi-platform

‘In computing, cross-platform (also known as multi-platform) is a term used to refer to computer software or computing methods and concepts that are implemented and inter-operate on multiple computer platforms.’

http://en.wikipedia.org/wiki/Multiplatform

Page 8: Multimodal, crossmedia, multi platform

Multimodal

‘Multimodal interaction provides the user with multiple modes of interfacing with a system beyond the traditional keyboard and mouse input/output.’

http://en.wikipedia.org/wiki/Multimodal_interaction

Page 9: Multimodal, crossmedia, multi platform

Modality

‘A modality is a path of communication between the human and the computer.’

http://en.wikipedia.org/wiki/Modality_(human-computer_interaction)

Page 10: Multimodal, crossmedia, multi platform

Input and output modalities

‘In human-computer interaction, a modality is the general class of:

– a sense through which the human can receive the output of the computer (for example, vision modality)

– a sensor or device through which the computer can receive the input from the human’

http://en.wikipedia.org/wiki/Modality_(human-computer_interaction)

Page 11: Multimodal, crossmedia, multi platform

Output modalities (computer-to-human)

‘Any human sense can be translated to a modality:

• Major modalities
– Seeing or vision modality
– Hearing or audition modality

• Haptic modalities
– Touch, tactile or tactition modality — the sense of pressure
– Proprioception modality — the perception of body awareness

• Other modalities
– Taste or gustation modality
– Smell or olfaction modality
– Thermoception modality — the sense of heat and the cold
– Nociception modality — the perception of pain
– Equilibrioception modality — the perception of balance’

http://en.wikipedia.org/wiki/Modality_(human-computer_interaction)

Page 12: Multimodal, crossmedia, multi platform

Input modalities (human-to-computer)

An input device is any peripheral (piece of computer hardware equipment) used to provide data and control signals to an information processing system (such as a computer).

http://en.wikipedia.org/wiki/Input_devices

Page 13: Multimodal, crossmedia, multi platform

Pointing devices

Ivan Sutherland (MIT) demoing Sketchpad (1962), introduced by Alan Kay in 1987

Page 14: Multimodal, crossmedia, multi platform

Pointing devices

‘Pointing devices are input devices used to specify a position in space.
– Direct/indirect
– Absolute/relative’

http://en.wikipedia.org/wiki/Input_devices
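The direct/indirect and absolute/relative distinctions map directly onto how pointer input is processed. The TypeScript sketch below is an illustration added here (names and structure are assumptions, not from the slides): an absolute device such as a touchscreen reports the pointer position itself, while a relative device such as a mouse reports offsets that move an on-screen cursor.

```typescript
// Illustrative sketch only: absolute vs. relative pointing.

interface Point { x: number; y: number; }

let cursor: Point = { x: 0, y: 0 };

// Absolute input (e.g. touchscreen, tablet in absolute mode):
// the reported coordinates are the new pointer position.
function onAbsoluteInput(position: Point): void {
  cursor = position;
}

// Relative input (e.g. mouse, trackpad): the report is a displacement
// from the previous position, so the cursor accumulates deltas.
function onRelativeInput(delta: Point): void {
  cursor = { x: cursor.x + delta.x, y: cursor.y + delta.y };
}
```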

Page 15: Multimodal, crossmedia, multi platform

Fitts’ law

‘The time it takes to move from a starting position to a final target is determined by the distance to the target and the size of the object.’ (Saffer, 2007)
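Saffer’s informal statement has a standard quantitative form. As an addition not shown on the slide, the widely used Shannon formulation of Fitts’ law reads:

```latex
% Fitts' law (Shannon formulation):
% MT = movement time, D = distance to the target, W = target width (size),
% a and b = constants fitted empirically for a given pointing device.
MT = a + b \log_2\!\left(\frac{D}{W} + 1\right)
```

Larger targets and shorter distances lower the logarithmic term, which is why big, nearby targets (and screen edges, which behave as effectively infinite targets) are fast to hit.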

Page 16: Multimodal, crossmedia, multi platform

Pointing devices

And you can point at more than merely pixels on a screen…

Page 17: Multimodal, crossmedia, multi platform

Alphanumeric input: keyboards

Page 18: Multimodal, crossmedia, multi platform

Alphanumeric input: keyboards

Page 19: Multimodal, crossmedia, multi platform

Alphanumeric input: keyboards

Page 20: Multimodal, crossmedia, multi platform

Alphanumeric input: speech recognition

– Speaker dependent/independent
– Discrete-word/connected-word input
– Limited/large vocabulary

Page 21: Multimodal, crossmedia, multi platform

Alphanumeric input: handwriting recognition

‘Recognition’ patents as early as 1914

‘Electronic ink’ and recognition in Vista

http://www.freepatentsonline.com/1117184.pdf

Page 22: Multimodal, crossmedia, multi platform

Pen Computing

‘The return of the pen’

Switching modes: ‘pointing’ vs. ‘ink’

Page 23: Multimodal, crossmedia, multi platform

Tap is the New Click

"One of the things our grandchildren will find quaintest about us is that we distinguish the digital from the real."

William Gibson (from: Saffer, 2009)

Page 24: Multimodal, crossmedia, multi platform

Ubiquitous computing

‘Ubiquitous computing (ubicomp) is a post-desktop model of human-computer interaction in which information processing has been thoroughly integrated into everyday objects and activities.’

http://en.wikipedia.org/wiki/Ubiquitous_computing

Page 25: Multimodal, crossmedia, multi platform

Wearable computing

‘Wearable computers are computers that are worn on the body.’

http://en.wikipedia.org/wiki/Wearable_computer

Page 26: Multimodal, crossmedia, multi platform

Tangible/Natural user interfaces

Hiroshi Ishii (MIT)

Page 28: Multimodal, crossmedia, multi platform

Ergonomics of Interactive Gestures

"Hands are underrated. Eyes are in charge, mind gets all the study, and heads do all the talking. Hands type letters, push mice around, and grip steering wheels, so they are not idle, just underemployed."

—Malcolm McCullough, Abstracting Craft (from: Saffer, 2009)

Page 29: Multimodal, crossmedia, multi platform

Patterns for Touchscreens and Interactive Surfaces

Tap to select

Page 30: Multimodal, crossmedia, multi platform

Patterns for Touchscreens and Interactive Surfaces

Drag to move object

Page 31: Multimodal, crossmedia, multi platform

Patterns for Touchscreens and Interactive Surfaces

Pinch to shrink and spread to enlarge
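As a rough illustration of how this pattern translates to code, the TypeScript sketch below (hypothetical, not taken from Saffer) derives a scale factor from the changing distance between two touch points:

```typescript
// Illustrative sketch of the pinch/spread pattern.

interface TouchPoint { x: number; y: number; }

function distance(a: TouchPoint, b: TouchPoint): number {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// scale > 1: fingers spread apart (enlarge the object);
// scale < 1: fingers pinch together (shrink the object).
function pinchScale(
  start: [TouchPoint, TouchPoint],
  current: [TouchPoint, TouchPoint]
): number {
  return distance(current[0], current[1]) / distance(start[0], start[1]);
}

// Example: fingers started 100 px apart and are now 150 px apart, so scale = 1.5.
const scale = pinchScale(
  [{ x: 0, y: 0 }, { x: 100, y: 0 }],
  [{ x: 0, y: 0 }, { x: 150, y: 0 }]
);
```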

Page 32: Multimodal, crossmedia, multi platform

Patterns for Free-Form Interactive Gestures

Point to select/activate

Page 33: Multimodal, crossmedia, multi platform

Patterns for Free-Form Interactive Gestures

Shake to change
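A minimal sketch of how ‘shake to change’ might be detected from accelerometer samples; the threshold and all names are assumptions for illustration, not a prescribed implementation:

```typescript
// Illustrative sketch of the shake pattern: a large spike in acceleration
// magnitude, well beyond gravity, is treated as a shake.

interface Acceleration { x: number; y: number; z: number; } // in m/s^2

const GRAVITY = 9.81;        // magnitude measured when the device is at rest
const SHAKE_THRESHOLD = 15;  // assumed excess magnitude that counts as a shake

function isShake(sample: Acceleration): boolean {
  const magnitude = Math.hypot(sample.x, sample.y, sample.z);
  return Math.abs(magnitude - GRAVITY) > SHAKE_THRESHOLD;
}
```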

Page 35: Multimodal, crossmedia, multi platform

Reader

Wearable computers:

Steve Mann. Eyetap.org. http://about.eyetap.org/

Ubiquitous computing:

Mark Weiser (1991). The Computer for the 21st Century. http://www.ubiq.com/hypertext/weiser/SciAmDraft3.html

Adam Greenfield (2006). Everyware: The Dawning Age of Ubiquitous Computing. New Riders, Berkeley, CA.

Donald Norman (1998). The Invisible Computer: Why Good Products Can Fail, The Personal Computer Is so Complex, and Information Appliances Are the Solution. The MIT Press, Cambridge, MA.

Mike Kuniavsky (2010). Smart Things. Morgan Kaufmann.

Page 36: Multimodal, crossmedia, multi platform

Reader

Input devices:

Doug Engelbart (1968). The mother of all demos. Google video stream

Wikipedia. http://en.wikipedia.org/wiki/The_Mother_of_All_Demos

Page 37: Multimodal, crossmedia, multi platform

Reader

Fitts’ Law:

Dan Saffer (2007). Designing for Interaction: Creating Smart Applications and Clever Devices. New Riders, Berkeley, CA. (page 53)

Speech recognition:

Microsoft. Microsoft Speech Technologies.

http://www.microsoft.com/speech/speech2007/default.mspx

Page 38: Multimodal, crossmedia, multi platform

Reader

Handwriting recognition:

Wacom. Unleash Windows Vista With A Pen. http://www.wacom.com/vista/index.php

Gestural Interfaces:

Dan Saffer (2009). Designing Gestural Interfaces. O’Reilly Media, Sebastopol, CA.

Ergonomics:

Henry Dreyfuss (1955). Designing for People. Allworth Press, New York, NY.