Human-Computer Interaction Research on the Endeavour Expedition



James A. Landay

Jack Chen, Jason Hong, Scott Klemmer, Francis Li, Mark Newman, Anoop Sinha

Endeavour Retreat

January 20, 2000


Several Projects on UIs of the Future

• Designer’s Outpost
• SUEDE
• Multimodal Design Assistant
• Context-aware PDA Infrastructure
• Context-based Information Agent


The Future: Information-centric User Interfaces


Designer’s Outpost: Tangible Tools for Information Design


• Two 640x480 USB web cams (fusion of the two views is sketched below)
  – camera above desk
    • captures ink / IDs
  – camera below desk
    • occlusion free
    • captures structure
• ITI Visionmaker Desk
  – 1280 x 1024 resolution
  – direct pen input
• Cross iPen tablet
  – high-res ink capture
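To make the division of labor between the two cameras concrete, here is a minimal Python sketch of how the two views could be fused; the detect_structure and capture_ink helpers and the note/board representation are hypothetical stand-ins, not the actual Outpost code.

```python
# Hypothetical sketch: the camera below the desk sees note positions without
# occlusion (structure), while the camera above sees the ink on each note.
# detect_structure() and capture_ink() are illustrative placeholders only.

def fuse_views(frame_below, frame_above, detect_structure, capture_ink):
    """Build a board model: positions/IDs from below, ink from above."""
    board = []
    for note in detect_structure(frame_below):          # note IDs + positions, occlusion free
        ink = capture_ink(frame_above, note["bounds"])  # strokes inside that note's region
        board.append({"id": note["id"], "bounds": note["bounds"], "ink": ink})
    return board
```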


One Future: Our Access to Information will be via Speech


SUEDE: Low-fidelity Prototyping for Speech-based User Interfaces


A Better Future: Our Access to Information will be via Multimodal UIs

• How do we combine speech, gesture, etc. into a UI design?
  – rapid production of “rough cuts”
    • informal (sketching / “Wizard of Oz”)
    • iterative design (user testing / fast mods)
  – generate initial code
    • UIs for multiple devices
    • designer adds detail / improves UI
• Study
  – uses of these novel modes alone
    • e.g., SUEDE
  – construction of multimodal apps
    • e.g., a multimedia notebook
• Early stages so far
  – based on inferring models


The Best Future: Multimodal UIs that are Aware of User’s Context

• Applications can be aware of
  – location
  – who is the user
  – what are they doing
  – who is nearby…
• DARPA has funded us to purchase 25-50 PDAs (Symbol SPT 1700) w/
  – wireless communications
  – wireless infrastructure
  – built-in bar code scanners
• Idea
  – deploy across Soda
  – build/study interesting applications
  – looking for students to get involved
    • use Hill/Culler/Ninja infrastructure & Motes


A Context-based Information Agent: Motivation

• Proactive Meeting Support (the capture → search → present loop is sketched below)
  – Capture human-to-human communication and context
  – Use communication and context to proactively find relevant info
  – Present the most useful info as non-intrusively as possible

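As a rough illustration of this capture → search → present loop, here is a minimal Python sketch; next_utterance, find_relevant, and show_unobtrusively are hypothetical placeholders for the speech capture, search, and peripheral-display pieces, not the actual agent.

```python
# Minimal sketch of the proactive meeting-support loop described above.
# next_utterance(), find_relevant(), and show_unobtrusively() stand in for
# speech capture, information search, and non-intrusive presentation.

def meeting_agent(context, next_utterance, find_relevant, show_unobtrusively):
    """Listen to the conversation, search proactively, present quietly."""
    for utterance in next_utterance():           # capture human-to-human communication
        query = {"text": utterance, **context}   # fold in who/where/when context
        results = find_relevant(query)           # proactively look for relevant info
        if results:
            show_unobtrusively(results)          # present as non-intrusively as possible
```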

Design Space

• Input
• Presentation
• Search


Design Space - Input

• Human-Human Communication
  – Speech, Ink, Vision, Text
• Context (a sample context record is sketched below)
  – Who am I?
  – Who is speaking?
  – Who else is here?
  – Where am I?
  – When is it?
  – What calendar event am I in?
  – What todo item am I doing?
  – How busy am I?
  – …

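One way to picture these context attributes is as a single record passed along with each query; the following Python dataclass is purely illustrative, and the class and field names are assumptions rather than the project's actual representation.

```python
# Illustrative context record covering the attributes listed above.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class Context:
    user: str                                                # who am I?
    speaker: Optional[str] = None                            # who is speaking?
    people_nearby: List[str] = field(default_factory=list)   # who else is here?
    location: Optional[str] = None                           # where am I?
    time: datetime = field(default_factory=datetime.now)     # when is it?
    calendar_event: Optional[str] = None                     # what calendar event am I in?
    todo_item: Optional[str] = None                          # what todo item am I doing?
    busyness: float = 0.0                                    # how busy am I? (0 free, 1 swamped)
```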

Design Space - Search

• Structuring queries from input
  – Continuous streams of input
  – Simple approach: spot keywords (sketched below)
  – But, given richer sources of input, can we formulate better queries?
  – Which kinds of communication and context are useful for this task?
• Information sources to search
  – Personal Information (local files)
  – Group Information (local web pages)
  – Global Information (web search)

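A minimal Python sketch of the simple keyword-spotting approach and the fan-out to the three source tiers; the keyword set and the three search_* callables are assumed placeholders, not the system's real query logic.

```python
# Keyword spotting over a continuous input stream, with the resulting query
# fanned out to personal, group, and global sources. INTERESTING_KEYWORDS and
# the search_* callables are hypothetical placeholders.

INTERESTING_KEYWORDS = {"endeavour", "prototype", "speech", "ubiquitous"}

def spot_keywords(utterance: str) -> list:
    """Keep only the words considered worth querying on."""
    return [w for w in utterance.lower().split() if w in INTERESTING_KEYWORDS]

def formulate_and_search(stream, search_personal, search_group, search_global):
    """Turn a continuous stream of utterances into tiered searches."""
    for utterance in stream:
        keywords = spot_keywords(utterance)
        if not keywords:
            continue
        query = " ".join(keywords)
        yield {
            "personal": search_personal(query),   # local files
            "group": search_group(query),         # local web pages
            "global": search_global(query),       # web search
        }
```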

Design Space - Presentation

• Synchronous / Asynchronous
  – Show me as I do my work
  – Show me after I get back from lunch
• Aim for minimal attention
  – Attention should be on the task
• Tailor output to context (a small policy sketch follows)
  – I'm busy, don't bother me at all
  – Show me only if it's really important

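The tailoring rules above could be captured in a small policy function like the following; the numeric thresholds and the busyness/importance inputs are illustrative assumptions, not values from the actual system.

```python
# Sketch of a context-tailored presentation policy. Thresholds are made up
# for illustration; busyness and importance are assumed to be in [0, 1].

def presentation_mode(busyness: float, importance: float) -> str:
    """Decide how (or whether) to surface a result right now."""
    if busyness > 0.8 and importance < 0.9:
        return "suppress"      # I'm busy, don't bother me at all
    if importance > 0.8:
        return "interrupt"     # show me only if it's really important
    if busyness > 0.5:
        return "defer"         # show me after I get back from lunch
    return "peripheral"        # glanceable display, keep attention on the task
```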

Low-Fidelity Prototype

• Run a low-fidelity prototype
  – Quick way of testing a system that doesn't exist yet
  – Parts requiring lots of programming simulated by a human
• First iteration
  – Speech-based agent, listens in on conversations
  – Uses web search engines
  – Presents combined search results
  – Basically an alternative front-end for web search engines


• Combined results (a small categorization sketch follows this list)
  – Categorized by time
  – Categorized by topic
  – Material explicitly referenced
    • Endeavour homepage
    • "Do you have a link to that?"
    • "There's a paper from Xerox PARC…"
  – Related material not explicitly mentioned
    • UW Portolano
    • MIT Oxygen Project
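For illustration, a small Python sketch of merging results from several engines and bucketing them by topic and by time; the result fields (topic, when) and the idea that each engine returns a list of dicts are assumptions made for this sketch.

```python
# Merge result lists from several engines, then bucket by topic and by time.
# Each result is assumed (hypothetically) to be a dict with 'title', 'topic',
# and 'when' keys; real results would need normalization first.
from collections import defaultdict
from itertools import chain

def combine_results(result_lists):
    merged = list(chain.from_iterable(result_lists))
    by_topic, by_time = defaultdict(list), defaultdict(list)
    for r in merged:
        by_topic[r.get("topic", "other")].append(r)
        by_time[r.get("when", "undated")].append(r)
    return {"by topic": dict(by_topic), "by time": dict(by_time)}
```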


Low-Fidelity Prototype Results

• Some survey results
  – People liked the general concept
  – Search results rated ok, not great
  – Didn't want to spend too much time (spent more time than we expected)
  – Wanted control of when the agent was running (turn on and off)
  – Wanted real-time results
  – Wanted multiple ways of organizing, accessing, and filtering the info


Design Space – Refined

• Input
  – Communication: Ink, Text, Speech
  – Context: Who, Where, When, How, What
• Search (the tiers are written out as a small table below)
  – Personal Info: Calendar, Contacts, Email, Personal webpages, Notes
  – Group Info: Calendar, Group Notes, Contacts, Group webpages
  – Public Info: Webpages, Newsgroups, Digital Libraries
• Presentation
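Written out as a plain configuration table (source names are taken from the diagram above; the variable name and structure are just one possible encoding):

```python
# Refined search tiers as a simple lookup table; illustrative only.
SEARCH_TIERS = {
    "personal": ["Calendar", "Contacts", "Email", "Personal webpages", "Notes"],
    "group":    ["Calendar", "Group Notes", "Contacts", "Group webpages"],
    "public":   ["Webpages", "Newsgroups", "Digital Libraries"],
}
```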


Prototype

• Implementation
  – Speech-based input (IBM ViaVoice)
  – Start with keyword-based search
  – Uses Google search engine
• Problems
  – Speech recognition is poor
  – "Interesting" keywords not in dictionary
• Still in the process of improving the software


Continuing Work

• Better organization and navigation of results

• Peripheral displays
• Figuring out "right" rate to update results
• Another low-fi prototype using "precise" speech recognition
• Improving recognition rate
  – ICSI Speech Recognition Engine / ViaVoice 2000
  – Crawl local web pages to expand dictionary (sketched below)
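A minimal sketch of the dictionary-expansion idea: crawl local pages, pull out words the recognizer doesn't know yet, and hand them to the recognizer's vocabulary. Only standard-library fetching and a trivial tokenizer are shown; the seed URLs and the hand-off to ViaVoice/ICSI are assumptions.

```python
# Harvest vocabulary from local web pages to expand the recognizer's
# dictionary. Uses only the standard library; seed_urls and known_words
# would come from the deployment (hypothetical here).
import re
from html.parser import HTMLParser
from urllib.request import urlopen

class TextExtractor(HTMLParser):
    """Collect lower-cased alphabetic words from a page's text nodes."""
    def __init__(self):
        super().__init__()
        self.words = set()
    def handle_data(self, data):
        self.words.update(w.lower() for w in re.findall(r"[A-Za-z]{3,}", data))

def expand_dictionary(seed_urls, known_words):
    """Return words seen on the pages that the recognizer doesn't know yet."""
    new_words = set()
    for url in seed_urls:
        parser = TextExtractor()
        parser.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
        new_words |= parser.words - set(known_words)
    return sorted(new_words)
```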


Comments?


Related Work

• Cyberguide and others
• Letizia
• Remembrance Agent
• MSR Implicit Queries
• XLibris