
Use of hierarchical model-view-controller architecture for user interaction in AAL environments

Alejandro M. Medrano Gil∗, Dario Salvi∗, María Teresa Arredondo Waldmeyer∗ and Patricia Abril Jimenez∗

∗Life Supporting Technologies, Departamento de Tecnología Fotónica y Bioingeniería, Escuela Técnica Superior de Ingenieros de Telecomunicación, Universidad Politécnica de Madrid, Avenida Complutense n. 30, "Ciudad Universitaria", 28040 Madrid (Spain). {amedrano, dsalvi, mta, pabril}@lst.tfo.upm.es

Abstract—The Ambient Assisted Living (AAL) concept applies a model of provision of various services for the elderly and people with special needs. AAL services should include invisible but always present technology; natural, simple and effortless interaction with users; device-independent and reactive user interfaces; and continuous, autonomous and ubiquitous service adaptation.

In this context, the presentation of information is a challenge. The system installed in an AAL environment should be able to support dynamic presentation of data according to the user's preferences and needs and to the environment conditions. Separating the view from the data and logic presents an advantage when establishing coding patterns and a software architecture for the domain.

This paper presents the details and advantages of using Hierarchical Model-View-Controller (HMVC), proving its feasibility with a generic design for User Interaction (UI) renderers and the concrete solution implemented in an EU-funded research project.

I. INTRODUCTION

Ambient Assisted Living (AAL) is a technological framework whose main focus is to make life easier for elderly and disabled people by introducing smart technology into their daily lives. These kinds of users are particularly demanding in terms of user interaction requirements. Elderly users do not usually have the required knowledge or skills to fully interact with modern electronic systems, while other users may present disabilities that limit their daily activities, and thus also their interaction with software. Although addressing these impairments is particularly challenging for software developers, there is also a long-established experience in application development for these particular users.

Universal Design¹, or Design for All² (D4All), are fields that have attracted significant research effort [2] and attention from European policies³. ICT D4All aims to build systems usable by the vast majority of people, regardless of factors such as their abilities or age, the interaction platform and the context in which they operate [4].

¹http://www.ncsu.edu/project/design-projects/udi/
²http://www.designforall.org/
³http://ec.europa.eu/information_society/activities/einclusion/policy/accessibility/dfa

On the methodological side, User Centered Design methodologies are based on the assumption that the product must be created around and for its final users, independently of their abilities. These methodologies have been successfully applied in many projects related to AAL [6], [3], [1].

User Centered Design methodologies have driven the development of multiple user interaction (UI) solutions such as gesture recognition, voice activation and different kinds of visual representation, which means that application developers have to be extremely flexible when designing the interaction. However, as each of these modalities has its own particularities, the information to be provided to the users has to be adapted, and this causes problems and difficulties in the development process.

II. DEFINITIONS

In order to represent the User Interaction (UI) independently of the interaction modality, a meta model of the UI must be defined. This meta model is then interpreted and translated into a concrete UI means by what can be called a UI renderer. The following paragraphs offer a deeper overview of these two concepts.

A. UI Meta Models

The objective of the meta model is to model the user interaction in a way that is neutral to the modality or the technology used. This model represents the type of information that can be sent to users and received from them, as well as the actual content of the message.

An example of a standard meta model for UI is XForms [10]; other specific examples can be found within AAL platforms, where user interaction models have proliferated [8], [7], [11].

Meta models are defined by carefully examining the UI requirements, trying to factorize and categorize the information types exchanged between users and applications. The basic concept behind these models is the element, a piece of UI that contains a message. Elements are then grouped in containers that represent the user interaction at a given point in time.


Usually elements are classified into a number of subclasses which further specify their properties. The most general classification of elements includes input elements, output elements, labels and groups.

• Input elements are elements whose value can be changed by the user. Input elements may also be seen as output-input, as they show the state both before and after the user interaction.

• Output elements, on the other hand, do not enable the user to change their status, and they are used to display static information to the user.

• Labels model the tag that the rest of the elements should have, displaying a name or an icon to the user. These labels are also used to model the options provided for elements that involve the user making a selection out of a discrete set of options.

• Groups are used to bind a set of elements together, helping the UI renderer, and ultimately the user, to understand that those elements are conceptually related.

For each of these classes there are specializations: for instance, some display and retrieve boolean values, others specialize in simple text, or in numerical values within a specific range.

The choice of these subclasses is in the domain of the application. Applications use the meta model by creating instances of the desired container and filling them with instances of the elements they need to show to the user, or need to retrieve from the user. This model is then presented to the user by a UI renderer; when the user interacts with the renderer, the latter injects the information provided by the user into the model. The application then receives this information through the mechanism the model provides, either a callback or an event.
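As a minimal sketch, the following Java example (with purely hypothetical class names, not taken from any specific AAL platform) illustrates this pattern: the application builds a container of elements, registers a callback, and leaves the presentation entirely to a renderer.

```java
// Hypothetical meta model classes, for illustration only.
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

abstract class Element {
    final String label;
    Element(String label) { this.label = label; }
}

// Input element: its value can be changed by the user.
class TextInput extends Element {
    String value = "";
    TextInput(String label) { super(label); }
}

// Output element: static information shown to the user.
class TextOutput extends Element {
    final String text;
    TextOutput(String label, String text) { super(label); this.text = text; }
}

// Container: groups the elements of one interaction at a given point in time.
class Container {
    final List<Element> elements = new ArrayList<>();
    Consumer<Container> onSubmit;   // mechanism through which the application receives user input
    void add(Element e) { elements.add(e); }
}

public class ReminderApp {
    public static void main(String[] args) {
        Container dialog = new Container();
        dialog.add(new TextOutput("Info", "It is time to take your medication."));
        TextInput note = new TextInput("Add a note");
        dialog.add(note);
        // The application never decides how this is presented; a UI renderer does.
        dialog.onSubmit = c -> System.out.println("User wrote: " + note.value);
        // renderer.render(dialog);  // handed over to whichever renderer is available
    }
}
```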

This process forces application developers to design the UI in modality-neutral terms, preventing them from manipulating the end result and keeping them agnostic of the hardware or technology used in the actual presentation.

The use of the meta model also enforces the use of the MVC pattern, as the meta model itself represents the view part of any application and provides mechanisms to help the design of the controller section.

B. UI Renderers

The function of a UI renderer is to interpret the meta model into a useful form that the user understands and can interact with. Usually one UI renderer is associated with one technology (and the compatible hardware) and one or more communication channels.

A UI renderer may be designed for a wide spectrum of users, but it can also be specialized to fit a set of users with special needs. For example, a GUI renderer may be configured to show bigger fonts if the user has a mild visual impairment, or may be purposely built for cognitively impaired users who are unable to remember short-term events or cannot read.

The containers defined in the meta model are rendered according to their definition, as closely as the hardware, technology and/or communication channel allows the UI renderer to do so. The task of the UI renderer is to translate the meta model into the underlying technology's components, and the user input back into the meta model.

Figure 1: Layered UI architecture (applications App1, App2 and App3, the shared meta model, and the GUI and VUI renderers)

As most of the information is modelled in a modality-neutral way, there can be many UI renderers, each one managing a different set of channels with different modalities.

C. Interaction Model among Applications, Meta Models and UI Renderers

The global picture of how the whole system works is presented in figure 1. With a canonical MVC approach, application developers would have to take care of the whole stack; they would use neither a meta model nor an independent UI renderer. By introducing the meta model, application developers can focus on the requirements of their application and ignore the details of the user interface. The meta model also allows several applications to share a common interface and look.

Applications create instances of the meta model, including the required containers with their properties, and pass them to UI renderers. In some cases this process of transporting the meta model instance from the application to the UI renderer is done through another component. When this occurs, it is normally because the meta model is part of a platform, and the transport mechanism is also a subsystem of the UI management. Platforms with this kind of UI approach usually enable different applications and UI renderers to run at the same time, as shown in figure 1.

D. Nested and Hierarchical MVC

This approach implies MVC stacking. The applications use MVC where the view is a meta model instance, and the UI renderer also uses MVC, but for it this very same instance is the model. This concept is called nested MVC, as the MVC of the UI renderer operates within the view of the MVC of the application (see figure 2a).

Figure 2: Difference between Nested (a) and Hierarchical (b) MVC

This concept should not be confused with hierarchical MVC (HMVC) (see figure 2b). HMVC refers to MVC being applied several times at different levels. In the scope of this paper, HMVC is the application of MVC to meta model containers at the top level, then to grouping elements, and to simple elements at the bottom level. HMVC means that each component of the meta model is treated autonomously, independently from all other components of the meta model, the only relationship between them being a hierarchical one.

III. METHODOLOGY

UI renderers should follow the MVC pattern, not just because MVC is imposed by the usage of a meta model, but because it helps the design and clarity of the implementation. In this scope the model is represented by the incoming instances of the meta model, the view role is played by the underlying rendering technology, and the controller is the UI renderer's logic.

The main idea behind this design is to produce highly configurable, extensible and maintainable UI renderers.

The advantages of adopting such a methodology can be classified in terms of extensibility, reusability, maintainability, testability and scalability of the produced UI renderers. In the following, these five aspects are analysed separately.

A. Extensibility

Expected extensions of the renderer comprise meta model extensions, with new containers and/or new elements, and the configuration of the actual look and feel of the components of the underlying technology (for example the usage of different colours, fonts or layouts for GUI renderers). By separating the model from the controller and the view, these extensions are dealt with separately in different modules of the solution.

B. Reusability

Reusability is the capability of applying the same solution to other UI renderers; in fact, not only can the design be applied, but a large part of the implementation also has great potential to be reused. The MVC pattern decouples model from view; this way the controller part remains unaltered when the view part is changed to develop a new UI renderer.

C. Maintainability

Maintainability is the measure of how easily the project can be repaired, evolved or adapted in the future. The most significant contribution to maintainability is a clear and understandable design, where the key features are immediately spotted and developers have easy access to them.

D. Testability

Testability in user interaction always implies tricky techniques, as the UI by definition needs a user to interact with it.

Nevertheless, testability also means the ease of testing individual parts of the design independently; with the decoupling of model and view, both can be tested separately. The proposed design also provides an easy way to test individual meta model container interpretations and even individual elements.

E. Scalability

Scalability means that the design remains equally applicable as functionality increases. In this context, increased functionality is the extension of the meta model with new containers or elements. We propose the use of reflection, available in object-oriented programming languages like Java, in order to guarantee a dynamic mapping between model and view. This way any change in the meta model is immediately applicable by either modifying an existing class or adding a new class per new element.

IV. RESULTS

This section presents the results of the design and the implementation of a UI renderer within the AAL project universAAL. Section IV-A proposes an architectural design which is independent of the meta model and of the underlying technology used for generating the user interaction. Section IV-B1 shows how we implemented this design in the universAAL platform using Java Swing as the graphical rendering engine.

A. UI Renderer Design

The proposed architecture is composed of four main components: a Meta Model Component (MMC), a Container Management Unit (CMU), a View Mapper (VM) and a Main Hub (MH), along with view modules per container and per element, as shown in figure 3.

The MMC is the module in charge of interfacing with the meta model, listening to meta model events and delegating the received instances to the CMU.

Figure 3: Proposed UI renderer architecture (MH, CMU, VM and MMC, with container view and element view modules)

The CMU decides which container to show at each moment, according to the context or to meta-data included in the meta model. This meta-data can describe specific needs: for example the priority a container has, the language used in the content, or the preference for one modality over another. The CMU may implement logic to queue a container that has a lower priority than the container currently being displayed, or to filter out containers that do not match the user's language or the renderer's modality. The CMU is then responsible for obtaining the implementation of the container by querying the VM to get the container view component.
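To make the queuing behaviour concrete, here is a hedged Java sketch of a CMU; the class names and the convention that a higher number means a higher priority are assumptions of this example, not part of the proposed architecture.

```java
// Sketch of a CMU that queues lower-priority containers until the current dialog finishes.
import java.util.Comparator;
import java.util.PriorityQueue;

class PendingDialog {
    final Object container;   // the meta model container instance
    final int priority;       // assumed convention: higher value = more urgent
    PendingDialog(Object container, int priority) {
        this.container = container;
        this.priority = priority;
    }
}

class ContainerManagementUnit {
    private final PriorityQueue<PendingDialog> queue =
            new PriorityQueue<>(Comparator.comparingInt((PendingDialog d) -> d.priority).reversed());
    private PendingDialog current;

    /** Called by the MMC when a new container arrives from the meta model. */
    synchronized void containerReceived(Object container, int priority) {
        PendingDialog incoming = new PendingDialog(container, priority);
        if (current == null || priority > current.priority) {
            if (current != null) queue.add(current);   // push back the interrupted dialog
            current = incoming;
            show(current);
        } else {
            queue.add(incoming);                        // wait until the current dialog finishes
        }
    }

    /** Called when the user finishes the current dialog. */
    synchronized void currentFinished() {
        current = queue.poll();
        if (current != null) show(current);
    }

    private void show(PendingDialog d) {
        // Here the CMU would query the View Mapper for the container view
        // component and hand the container over to it.
        System.out.println("Showing container with priority " + d.priority);
    }
}
```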

The VM is the piece of software that locates the view component for any container or element instance of the meta model. For example, when the VM is required to search for an element or container of type T, the viewing component for that type is loaded, for instance on the basis of a naming convention that associates T with TView. Reflection can be used to easily implement this mechanism.
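A reflection-based VM following the T → TView naming convention can be sketched in a few lines of Java; the package name and the ViewComponent interface below are assumptions made for this example.

```java
// Sketch of a reflective View Mapper that resolves a view class by naming convention.
public class ViewMapper {

    public interface ViewComponent {
        void bind(Object modelComponent);   // translate the model component into the view technology
    }

    private final String viewPackage;

    public ViewMapper(String viewPackage) {
        this.viewPackage = viewPackage;     // e.g. "org.example.renderer.views"
    }

    /** Locates and instantiates the view component for a meta model component. */
    public ViewComponent viewFor(Object modelComponent) throws ReflectiveOperationException {
        String viewClassName =
                viewPackage + "." + modelComponent.getClass().getSimpleName() + "View";
        Class<?> viewClass = Class.forName(viewClassName);                    // T -> TView
        ViewComponent view = (ViewComponent) viewClass.getDeclaredConstructor().newInstance();
        view.bind(modelComponent);
        return view;
    }
}
```

With such a mapper, extending the meta model (section III-E) amounts to shipping one additional view class per new element; the renderer core does not need to change.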

The MH acts as a placeholder for all commonly needed variables, including the MMC and the CMU. This makes it possible to replace these components, for example when another meta model is used or when another display management strategy is required.

The container view component makes use of the VM to find the element view component for each of its children (matching the expected behaviour of top-level components of the HMVC hierarchy). Grouping element views use the same technique, but their hierarchical level is lower, as they can be "owned" by containers or other grouping elements.

View components in general (element and container view components) are atomic translators for the meta model components. For each meta model component's MVC (as part of the HMVC) there is a view component, which contains all the necessary instructions to perform the translation into the underlying technology's model.

Furthermore, each view component can be decoupled into two major parts: the information translation layer and the look and feel (L&F) layer. The idea is to decouple what a component shows from how it shows it.

The L&F layer is independent from the proper viewing component (the information translation layer), and several L&F templates can be developed for the same UI renderer. L&F templates take care of colours, fonts, layouts, shapes, sizes and every display detail that the proper view component does not handle. The capability of providing different L&Fs is appreciated by any user, but in the case of AAL users it is especially important. This mechanism offers AAL developers a simple, yet powerful, way to specialize the UI renderer for certain impairments. For example, the renderer may normally use fancy colours and fonts, while in the presence of visually impaired users it would adopt another template with bigger sizes, simplified fonts and high-contrast colours.

Figure 4: Extensions context (meta model extensions and L&F extensions plugged into the UI renderer)

This design inherently allows meta model extensions and L&F extensions. These extensions can be provided in separate packages and dynamically incorporated into the UI renderer (see figure 4). Since reflection is used in the VM, the new meta model components are directly mapped to the provided viewing components.

For instance, it is possible to enrich the choice of L&F templates by adding them to the VM (e.g. a simplified L&F, customised colours, etc.), or it is possible to enrich the meta model by extending it and providing proper viewing components for it (e.g. a viewer that plots arrays on a two-axis graph).

L&F packages are loaded by instructing the VM to load the L&F components instead of the proper view packages. As L&F components are extensions of the proper components, they incorporate all the functionality required from a viewing component.
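The following Swing-flavoured sketch, with illustrative class names, shows the idea: the base class implements only the information translation, while an L&F specialization overrides only the presentation, so the VM can simply be instructed to load the subclass.

```java
// Illustrative view/L&F split; class names are assumptions for this example.
import java.awt.Color;
import java.awt.Font;
import javax.swing.JLabel;

// Information translation layer: decides WHAT is shown.
class TextOutputView {
    protected final JLabel label = new JLabel();

    void bind(String text) {
        label.setText(text);
    }

    JLabel component() {
        return label;
    }
}

// L&F layer: decides HOW it is shown, without repeating the translation logic.
class HighContrastTextOutputView extends TextOutputView {
    HighContrastTextOutputView() {
        label.setFont(label.getFont().deriveFont(Font.BOLD, 28f));
        label.setForeground(Color.WHITE);
        label.setBackground(Color.BLACK);
        label.setOpaque(true);
    }
}
```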

Meta model extensions can be very useful: for example, an application may want to display graphs, which are not usually defined in the original meta model, since the meta model is defined in a modality-neutral way and graphs only make sense in graphical modalities. The way to provide this functionality is by providing meta model extensions; in some sense these extensions are stackable, so they may be thought of as plugins.

The proposed design automatically incorporates these plugins. When a reflective VM is used, the new meta model components are directly mapped to the provided viewing components, so the packaged meta model extension plugin for a UI renderer just contains the viewing components for the given extension.

B. UniversAAL Technological Solution

UniversAAL is a European research funded project [11] which copes with the challenge of creating a standard open platform that makes the development of personalized AAL solutions technically feasible and economically viable.

As part of its platform, universAAL provides UI support as a basic functionality. The main software artefacts of the platform around which the UI management is built are the input and output buses. In universAAL, a bus is a virtual channel of communication through which semantically rich information travels; these buses connect pieces of software within the same node and between nodes, providing the power of distributed processing. The output bus is the mechanism through which output events are presented to the user; likewise, the input bus does the same job in the opposite direction, taking user input events and presenting them to the applications.

The meta model in universAAL, wrapped in these input and output events, is a definition of resources called IO RDF, even though its implementation is not in actual RDF [9] but in an equivalent Java representation. This meta model features four types of forms, grouping elements, several input and output elements, and a special kind of element called submit, which triggers the user's input events and terminates dialogs. This meta model is very basic, as in AAL environments the UI is not expected to be very complex due to users' limitations. The IO RDF is based on XForms [10] and, like it, it is susceptible to being extended.

UniversAAL adds another component to the IO buses and the IO RDF: the Dialog Manager (DM). The DM is a dialog orchestrator which coordinates all output and input events coming from and going to applications, playing the role of the CMU at a system-wide level. This coordination involves multi-user coordination, multi-modality synchronization, priority management, impairment mapping, and many other important administration tasks for the whole UI.

Last but not least, the UI renderers, here called UI handlers, cope with the human-machine interaction and the mapping to and from the IO RDF (meta model).

1) The Swing Handler: Within this framework, we have developed a UI renderer based on the well-established Java Swing library [5] as the graphical rendering engine, namely the Swing Handler, to prove the applicability of HMVC in UI renderers for an AAL environment. This Swing Handler follows the design proposed in section IV-A, specially adapted to universAAL's unique features.

Figure 5: Simple test showing some of the key features of the default look and feel package.

UniversAAL provides two elements to receive output events and to publish input events respectively: the output subscriber and the input publisher. Together they form the meta model component, as their role is to interact directly with the IO buses. Elements, or Form Controls as the IO RDF defines them, have their views following the HMVC paradigm. Submit form controls play a main role in the control part of the handler's MVC: they are connected to the input publisher to fulfil their mission, that is, to send all user input when the submit is activated by the user.
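As an illustration only (this is not the actual Swing Handler source), a submit form control could be wired to the input publisher roughly as follows, with stand-in types for the universAAL artefacts:

```java
// Sketch of rendering a submit control as a button that hands the filled form
// to the input publisher; all types here are placeholders.
import javax.swing.JButton;

class SubmitView {

    interface InputPublisher {               // stand-in for the platform's input publisher
        void publishUserInput(Object filledForm);
    }

    JButton render(Object submitControl, Object form, InputPublisher publisher) {
        JButton button = new JButton("OK");  // the label would come from the submit's meta model label
        button.addActionListener(e -> publisher.publishUserInput(form));
        return button;
    }
}
```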

The handler had no need for a queuing CMU, as universAAL's DM already manages the priorities.

Look and feel is supported by means of the standard Java Swing look and feel mechanism. A standard template for all universAAL applications has been produced, although it leaves open the possibility of extending it or implementing a new one.
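The underlying Swing mechanism is the pluggable look and feel: a template class is installed and the visible components are refreshed, as in the sketch below (the template class name would be whatever template a renderer ships; it is a placeholder here, not an actual universAAL artefact).

```java
// Switching the Swing look and feel, even at runtime.
import javax.swing.JFrame;
import javax.swing.SwingUtilities;
import javax.swing.UIManager;

public class LookAndFeelSwitcher {
    public static void applyTemplate(String lookAndFeelClassName, JFrame frame) {
        try {
            UIManager.setLookAndFeel(lookAndFeelClassName);   // install the template class
            SwingUtilities.updateComponentTreeUI(frame);      // repaint already visible dialogs
        } catch (Exception e) {
            System.err.println("Could not load L&F template: " + e.getMessage());
        }
    }
}
```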

The visual result of the implementation is an elegant and simple graphical output which can run seamlessly on different environments and operating systems and with different screen resolutions.

V. CONCLUSIONS AND FUTURE WORK

This paper shows how to apply the hierarchical MVC pattern to the design of AAL user interaction renderers. The proposed architecture and the example implementation provided for the universAAL platform show that the use of this pattern simplifies the effort of developing UI renderers.

The Swing Handler has been compared with an already existing handler of the platform. Thanks to our design, locating the functionality for each IO RDF resource representation was much easier in our implementation than in the existing one. Moreover, the Swing Handler allows the redefinition of the IO RDF, extending it with hardly any need to change the central logic of the handler. The adoption of Swing templates also provides the capability of changing the look and feel, even at runtime.

Nevertheless, the implementation of a single class for each element in the meta model was tedious work, and for large meta models it would require a bigger development team. The duality of view component and L&F template makes the development of new templates equally demanding in terms of human resources. However, already created L&F packages will usually serve as reference and source templates to develop new L&F packages, which eases the procedure of creating new L&Fs.

As future improvements we propose the development of plugins for the Swing Handler, including L&F packages specialized for specific hardware like hand-held devices or TV displays. Other interesting L&F packages are AAL impairment specific, such as for short-sighted or cognitively impaired users. As far as the L&F template mechanism allows, there should be a specific template per hardware platform and per impairment, so we propose research on how to combine L&F packages. This would make it easy to develop a specific handler: for example, a new L&F template could be created by combining existing templates for hand-held devices and mild sight impairment, instead of creating the same handler from scratch.

Finally, another future improvement involves the other type of plugin, the meta model extension. We propose the creation of GUI modality-specific features, for example a graph and chart element which would draw a graphical representation of signal data or any type of numerical data series.

ACKNOWLEDGEMENTS

We would like to thank the whole universAAL Consortium for their valuable contribution to the realization of this work. This work has been partially funded by the European Union under the 7th Framework Programme through the universAAL (FP7-247950) research project.

REFERENCES

[1] Jan O. Borchers. A pattern approach to interaction design. In Satinder Gill, John Karat, and Jean Vanderdonckt, editors, Cognition, Communication and Interaction, Human-Computer Interaction Series, pages 114–131. Springer London, 2008. 10.1007/978-1-84628-927-9_7.

[2] Laura Burzagli, Pier Luigi Emiliani, and Francesco Gabbanini. Design for all in action: An example of analysis and implementation. Expert Systems with Applications, 36(2, Part 1):985–994, 2009.

[3] Annette De Vito Dabbs, Brad A. Myers, Kenneth R. McCurry, Jacqueline Dunbar-Jacob, Robert P. Hawkins, Alex Begey, and Mary Amanda Dew. User-centered design and interactive health technologies for patients. CIN: Computers, Informatics, Nursing, 27, May/June 2009.

[4] P. L. Emiliani and C. Stephanidis. Universal access to ambient intelligence environments: opportunities and challenges for people with disabilities. IBM Syst. J., 44:605–619, August 2005.

[5] Amy Fowler. A Swing architecture overview. http://java.sun.com/products/jfc/tsc/articles/architecture/.

[6] Sonja Müller, Ingo Meyer, Ilse Bierhoff, Sarah Delaney, Andrew Sixsmith, and Sandra Sproll. Iterative User Involvement in Ambient Assisted Living Research and Development Processes: Does It Really Make a Difference? 2011.

[7] AALuis project. Facilitate the connection of different services to different types of user interfaces and thus enable the future users of ambient assisted living systems to use more services interacting in their preferred way. http://www.aaluis.eu/.

[8] GoldUI project. European AAL project for adaptive embedded human interfaces designed for older people. http://www.goldui.eu/.

[9] W3C RDF Working Group. Resource Description Framework: a standard model for data interchange on the web. http://www.w3.org/RDF/.

[10] W3C The Forms Working Group. Next generation of forms technology for the world wide web. http://www.w3.org/MarkUp/Forms/.

[11] universAAL project. Universal open platform and reference specification for ambient assisted living. http://www.universaal.org/.