A Multimodal Approach To Accessible Web Content On Smartphones


Portable and Mobile Systems in Assistive Technology: A Multimodal Approach to Accessible Web Content on Smartphones. Lars Emil Knudsen.


A Multimodal Approach To Accessible Web Content On Smartphones

Lars Emil Knudsen, Harald Holone

Østfold University College

11.07.2012


Introduction

- Master's student at Østfold University College, Norway
- Applied Computer Science

Motivation
- Programming
- Rewarding to contribute
- Research value
- Exciting combination of multimodality and smartphones


Introduction

I will present our current work in the area of multimodal interfaces on smartphones:

1. Our implementation of W3C's Multimodal Interaction Framework on smartphones running the Android OS.
2. The results from user tests and interviews with blind and visually impaired users.


Background

- The SMUDI project
  - Norwegian speech recognition
  - Multimodal interface
  - Goal: achieve universal design
  - Run by MediaLT, a Norwegian research company
  - This project was created in relation to the SMUDI project
- Rapid development in the mobile market
  - Smartphones with new capabilities
  - New opportunities for interface design, multimodal interaction being one of them


Designing Robust Multimodal Systems for Universal Access, Oviatt 2001

- “Given the right context, temporal disability applies to everyone.”


Multimodality

- What is multimodality?
- Modalities describe the different paths of communication between a human and the computer.


Multimodality and universal access

Oviatt:
- Claims that multimodality can greatly expand the accessibility of computing for diverse and non-specialist users.
- Also claims that multimodality can promote new forms of computing not previously available.


Multimodality, universal access, visually impaired and blind

- Spoken and multimodal bus timetable systems: design, development and evaluation, Turunen et al., 2005
  - Multimodality generally improves performance
  - Users need training to be able to use the system


Voice recognition

- External server
- Several commercial services:
  - Google Voice
  - (Siri)
  - Nuance
  - Vlingo
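The external-server pattern above can be sketched as a small interface behind which any of these commercial services could sit. This is a minimal sketch: the interface name and the canned stub are illustrative assumptions, not any vendor's actual API; a real implementation would stream the captured audio to the remote service.

```java
// Sketch of the "recognition on an external server" pattern: the app
// depends on a narrow interface, so the concrete service behind it
// (Google, Nuance, Vlingo, ...) can be swapped out. All names here are
// hypothetical, chosen for illustration only.
public class RecognizerSketch {

    /** Turns captured audio into a recognition hypothesis. */
    interface SpeechRecognizer {
        String recognize(byte[] audio);
    }

    /** Stand-in for a networked service; a real one would POST the audio. */
    static class CannedRecognizer implements SpeechRecognizer {
        private final String result;

        CannedRecognizer(String result) {
            this.result = result;
        }

        @Override
        public String recognize(byte[] audio) {
            return result;
        }
    }

    public static void main(String[] args) {
        SpeechRecognizer recognizer = new CannedRecognizer("weather in Halden");
        // The rest of the app only ever sees the interface.
        System.out.println(recognizer.recognize(new byte[0]));
    }
}
```

Keeping recognition behind an interface like this is what makes the "several commercial services" situation manageable: the choice of vendor becomes a one-line change.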


Frameworks

- MONA (Niklfeld et al.)
- Miranda (Paay et al.)
- Others (Reithinger et al., Nardelli et al.)
- W3C Multimodal Interaction Framework


Framework Implementation

- Based on W3C's specification of the Multimodal Interaction Framework
- Used EMMA
  - EMMA: "Extensible MultiModal Annotation markup language"
  - An XML markup language
  - Represents the semantics and meaning of data
  - Implemented on a need-to-have basis
- Consists of a set of components:
  - Recognition component
  - Interpretation component
  - Interaction manager
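As a rough illustration of the EMMA representation mentioned above, the sketch below parses a minimal EMMA 1.0 document and extracts one interpreted value, which is roughly what an interpretation component hands to the interaction manager. The weather-domain `location` element and the helper name are assumptions for illustration, not the implementation's actual schema; only the `emma:` namespace and attributes come from the W3C specification.

```java
// Sketch: reading an interpreted value out of an EMMA document using
// only the JDK's built-in DOM parser. The application payload
// (<location>) is a hypothetical weather-domain field.
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.xml.sax.InputSource;

public class EmmaParser {

    /** Returns the text of the first element with the given tag name, or null. */
    public static String firstTagText(String xml, String tag) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        Element e = (Element) doc.getElementsByTagName(tag).item(0);
        return e == null ? null : e.getTextContent();
    }

    public static void main(String[] args) throws Exception {
        // A minimal EMMA 1.0 document carrying one speech interpretation.
        String emma =
            "<emma:emma version=\"1.0\" xmlns:emma=\"http://www.w3.org/2003/04/emma\">"
          + "<emma:interpretation id=\"int1\" emma:medium=\"acoustic\" emma:mode=\"voice\">"
          + "<location>Halden</location>"
          + "</emma:interpretation>"
          + "</emma:emma>";
        System.out.println(firstTagText(emma, "location")); // prints "Halden"
    }
}
```

The point of EMMA in this pipeline is exactly this decoupling: the recognition component can annotate its result (medium, mode, confidence), and the interaction manager consumes the interpretation without caring which modality produced it.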


Method

- Test the framework implementation
- Evaluate multimodal mobile interfaces
- User test
  - Four blind and visually impaired users
  - Two phases:
    1. Practical: users were given tasks to carry out
    2. Semi-structured interview


Prototype

- Developed on top of, and in parallel to, the multimodal framework
- For Android
- Used a Norwegian weather service (yr.no)
- Used Norwegian voice recognition from Nuance
- Input modalities:
  - Speech
  - Touch
  - Touch gestures
  - Orientation
  - Acceleration gestures
  - Touch keyboard
  - Navigation keys
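One way an interaction manager might route events from modalities like these to a single handler can be sketched as follows. The modality names follow the slide; the dispatch logic itself is an illustrative assumption in plain Java, not the prototype's actual Android code.

```java
// Sketch: one registry that maps each input modality to a handler, so
// several modalities can trigger the same underlying action. The
// handler and payload types are hypothetical simplifications.
import java.util.EnumMap;
import java.util.Map;

public class InputRouter {

    enum Modality {
        SPEECH, TOUCH, TOUCH_GESTURE, ORIENTATION,
        ACCELERATION_GESTURE, TOUCH_KEYBOARD, NAVIGATION_KEYS
    }

    interface Handler {
        String handle(String payload);
    }

    private final Map<Modality, Handler> handlers = new EnumMap<>(Modality.class);

    void register(Modality m, Handler h) {
        handlers.put(m, h);
    }

    String dispatch(Modality m, String payload) {
        Handler h = handlers.get(m);
        return h == null ? "unhandled" : h.handle(payload);
    }

    public static void main(String[] args) {
        InputRouter router = new InputRouter();
        // Speech and touch both trigger the same weather query, which is
        // the point of a multimodal interface: one action, many paths in.
        router.register(Modality.SPEECH, text -> "forecast for " + text);
        router.register(Modality.TOUCH, city -> "forecast for " + city);
        System.out.println(router.dispatch(Modality.SPEECH, "Halden"));
    }
}
```

A structure like this also makes it cheap to add or drop modalities per user, which matters when, as the results below show, different users prefer very different input channels.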


Results

- Speech input was preferred
- Touch gestures and orientation were perceived as more fun than useful
- Acceleration gestures were seen as simple to use
- Touch keyboard: nice but slow
- Navigation keys: an OK way of navigating


Results

- It is feasible to implement W3C's Multimodal Interaction Framework on Android
- The users were positive towards a multimodal interface
- A multimodal interface can help support universal access


Thank you!

Questions?
