D7.5 Report on evaluation of the robot trolley

Workpackage: WP7
Task: T7.3
Document type: Deliverable D7.5
Title: Report on evaluation of the robot trolley
Subtitle:
Author(s): A. Green, C. Bogdan, J. Falb, D. Ertl, K. Severinson Eklundh and H. Kaindl
Internal Reviewer(s): FZI, TUW and KTH teams
Version: v1.0, 14.6.2010
Status: Final
Distribution: Public


Summary

This report describes the activities of Task T7.3 in workpackage 7 (WP7) related to the evaluation of the usability and user experience of the shopping trolley. Such a trolley should, through its design, invite the user to a shopping activity that is efficient but also enjoyable. Therefore, evaluation methods are used that combine assessment of task performance with subjective measures, assessed through questionnaires and interviews. The results concerning task performance are partly reported in D6.3 (System tests). The report describes the following studies:

– An experimental study on the effect of motion cues on users’ behaviour.

– A study of user experience in shopping tasks with the robot trolley.

– A pilot study of the Walking Aid.

– An exploratory study of a shared shopping scenario.

A questionnaire survey of acceptance and satisfaction with the robot trolley was made with participants in the first two studies. Finally, two focus group sessions about the trolley and its interfaces were carried out with stakeholders from the retail industry.

These studies have shown that the system can be used for its main task, i.e. shopping in a realistic setting, by untrained users. About a third of the participants in the first two studies were satisfied with the robot's behaviour and virtually all of them thought the robot was interesting. Problems with robustness and recognition rate in interaction with the robot were encountered, and the participants rated the touch screen interaction highest of the four integrated user interfaces. The use of Motion Cues as an interaction element is an interesting aspect of HRI that should be considered in future research.

CommRob IST-045441Advanced Behaviour and High-Level Multimodal Communication with and among Robots


Contents

Summary
1 Introduction
2 Background
   2.1 Evaluation of usability and user experience
3 Preparation for Evaluation Activities
   3.1 Preparations during the final year
   3.2 Comments on the work process
4 Movement-centric Evaluation
   4.1 Introduction
      4.1.1 Related work
      4.1.2 The Motion Cue through Speech Change
      4.1.3 Experiment with a Semi-Autonomous Robot
5 Implications: HRI on the Move
   5.1 Constraints for HRI on the Move
      5.1.1 Walking Cognitive Workload
      5.1.2 Time Constraints
      5.1.3 Reduced Perception Capabilities
   5.2 Mixed Initiative
6 Shopping Study
   6.1 Introduction
   6.2 Setup and procedure for the Robot Shopping study
   6.3 Task-completion in the Shopping study
7 Evaluation of the Multimodal User Interface
   7.1 Users' acceptance and satisfaction with the robot trolley
      7.1.1 Participants
   7.2 Questions about the acceptance and satisfaction with the robot trolley
      7.2.1 Ease of use and learnability
      7.2.2 The level of support for the user's shopping activity
      7.2.3 Preferences towards the interface modalities of a shopping trolley
      7.2.4 Touch screen interaction
      7.2.5 Speech input and output
      7.2.6 Movements of the trolley
      7.2.7 Haptic interaction as a way to steer the robot trolley
      7.2.8 Open questions about the usability and user acceptance of the shopping trolley
   7.3 Summary and discussion
8 Evaluation with stakeholders
   8.1 Discussion with industrial stakeholders about the functionalities and interface of a robot trolley
9 Exploratory study of the Walking Aid
   9.1 Introduction
   9.2 Setup and task
   9.3 Results
10 Exploratory study of a shared-shopping scenario
   10.1 Introduction
   10.2 Setup and task
   10.3 Results
11 Conclusions
Appendix
A Histogram figures
B Further questionnaire data from MoC and Shopping studies


1 Introduction

The purpose of this report is to describe the activities of Task T7.3 in workpackage 7 (WP7). The objectives of Task T7.3, Evaluation of the overall functionality of the robot trolley, can be summarised as follows:

• Evaluate the functionality and user acceptance of the robot trolley in a realistic context of use.

• Assess the quality of the final system as an input to dissemination and exploitation activities carried out in WP8.

The evaluation has been done in close collaboration with the partners of WP6, in which system tests were carried out. These system tests are reported in the deliverable D6.3.¹

The system has been evaluated with formative evaluations and user studies throughout the final year of the project. The results of the formative evaluations have fed into the development and integration process. To evaluate the system in a more formal way we have carried out two studies: one study on Robot Motion as a Communicative Cue (MoC) and one study on Robot Shopping (RS). The results of these studies are described in Section 4 and Section 7.

The process of evaluating the system has been determined by several factors:

• The robots could only be evaluated on location at the FZI laboratory in Karlsruhe.

• Participants were non-native speakers using an English speech system.

• Evaluation took place in a robotics laboratory constructed as a mini-store.

• Technical issues:

– The delay of the CoRoD (the robot built by Zenon), which did not allow for full-scale evaluation of multi-user and multi-robot scenarios with users.

– System breakdowns (both hardware and software) right before or at the beginning of an evaluation week delayed or reduced the size of the planned studies.

– Iterative refinement of user interface and system components also took place during the evaluation weeks. This was a result both of having the integration weeks right before the evaluation weeks and of the changes and performance tuning required to make the system work well enough to allow for user studies.

The consequence of this was that, apart from the formative evaluation, we could carry out two user studies, with 20 and 11 users respectively. A pilot study of the Walking Aid and an exploratory study of a shared shopping scenario were also performed. In the final phase we also carried out a focus group evaluation with representatives of the retail industry.

¹ D6.3, Report on system tests


2 Background

Theoretical approaches and design practices that have emerged from research on Human-Computer Interaction (HCI) are relevant for Human-Robot Interaction (HRI) as well. However, methods and approaches from HCI cannot be transferred directly to HRI without reflecting on the implications [26]. One central idea of HRI is that robots and users can be seen as part of a complex system in which interaction is carried out in a physically situated environment [50, 55]. Situated communication between a mobile service robot and its users takes place in a physically shared environment, and typically concerns entities and activities that can be referenced, viewed and manipulated by the participants, including the robot itself. Communication is proposed as an important factor for how socially interactive robots are perceived and accepted by humans [16].

Designing for the general-purpose setting of "supermarket shopping" and other settings where a trolley can be used (e.g., airports, hospitals, etc.) raises the question of user involvement in design, as the setting is highly familiar to many potential users. One way to approach the interaction design is to ground it in this familiarity by involving the potential end-users in the design. During the first phases of CommRob this was achieved through field studies and a series of workshops with potential users who took part in prototyping activities.

2.1 Evaluation of usability and user experience

As there are¹ no design guidelines or widely accepted methods for the design and evaluation of human-robot communication for autonomous robots, we were compelled not only to consider the design of human-robot communication but also to find ways of studying the interplay between humans and robots. A consequence of this is that there are no obvious heuristics or design guidelines to pick from when it comes to evaluating human-robot interaction, as there are for evaluating the usability of more mainstream computer systems, such as graphical user interfaces with mouse and keyboard as input devices. Even without such guidelines and heuristics, the initial pilot studies carried out during the Evaluation Weeks can be seen as a variant of discount usability methods (e.g. [42]). As such methods are focused on the discovery of problems and challenges in the design, they have allowed for the formative evaluation needed to prepare the robot systems for the usability-oriented user studies carried out during the last year of the project.

In recent years the focus in the HCI community has shifted from the core usability evaluation of a specific interface to the evaluation of the totality of use, involving the context of use and the users' experience related to the use of a system. As the development of robots targets a mass market, evaluation of the user experience becomes a key activity in human-robot interaction. A robot system like the one sketched in the scenario of this project will be evaluated with users that are already engaged in a familiar and routine activity, grocery shopping. From the user perspective, compared with a manual trolley, the robotic trolley needs to improve the quality of the experience of shopping.

¹ There are attempts to establish performance metrics for autonomous systems (cf. [25, 24]).


This means that interaction design has to shift focus, from design aimed to improve or sustain usability to design of the user experience. The evaluation criteria need to be extended to involve hedonic attributes of the design, i.e., properties of systems that affect the type and quality of the experience rather than the efficiency of the system. Evaluation of robots intended for professional use has long focused on usability rather than experience. Approaches for large-scale evaluation of professional robot systems, which assess the performance of user and robot acting as a system, provide means of comparing competing approaches for a well-specified task [52]. In a scenario where task efficiency cannot be measured in a straightforward manner (in terms of low task completion time, shortest distance travelled, etc.), the experience of engaging in the activity of shopping while interacting with the system comes into focus, and evaluation of the user experience becomes central.

Gaining insights about how users experience the interaction with service robots is an important factor in determining how they should be designed. However, there exist many different views on what the term "user experience" refers to. According to Norman [44] there is an inevitable link between user experience and emotions. Norman provides a psychological model to explain why our emotional response affects the way we understand and appreciate artefacts and systems. According to Norman [43], when we are in a state of positive affect we are more curious, creative and effective when it comes to learning new things. On the other hand, when we are anticipating danger or experiencing something negative we become more likely to get entangled by details and flaws of the interface. Norman [44] argues that "[d]esigners can get away with more if the product is fun and enjoyable".

Alben [2] describes user experience as: "all aspects of how people use an interactive product: the way it feels in their hands, how well they understand how it works, how they feel about it while they're using it, how well it serves their purposes, and how well it fits into the entire context in which they are using it".

There are several methods that are focused on the users' experience, such as Kansei engineering [51], interface aesthetics [5] and research on robot appearance [14]. Hassenzahl [23] relates two general types of attributes of a product: pragmatic attributes, concerning the utility and usability of the product, and hedonic attributes, concerning the psychological well-being of the user. A product whose pragmatic and hedonic attributes are both strong is a desired product, and conversely a product where both attributes are weak is an undesired product. McCarthy and Wright [38] stress the holistic and dialogical dimension of user experience (following Dewey [13] and Bakhtin [4]), something which they describe as actively constructed through "a process of sense making" where elements of interaction are related to the users and their context. The way interaction unfolds can be seen as a sense-making process that appears as a narrative structure where the users and system act to interpret, reflect on and recount the experience of using the artefact [38]. This becomes especially important when attempting to analyse the complex relationship between user, robot and environment. A robot can be seen as an agent acting together with other artificial or human agents in a complex system involving decision making, planning and cooperative actions [50]. In cognitive robotics, this process can be seen as cooperative service discovery and configuration, or human-augmented mapping, where the robot and user engage in conversation to gain a shared understanding of their common environment [22, 57, 56].

When practicing design that is directed towards user experience we can define a set of usability goals for an interactive product, e.g., those listed by Preece et al. [46]: fun, satisfying, emotionally fulfilling, rewarding, supportive of creativity, aesthetically pleasing, motivating, helpful, entertaining, enjoyable. Fulfilling these goals sometimes conflicts with usability goals (i.e., efficiency, effectiveness, learnability, safety, utility, etc.). In a scenario involving a robotic shopping trolley we need to take both types of goals into consideration.

The trolley should, through its design, invite the user to a shopping activity that is efficient but also enjoyable. We have therefore chosen to use evaluation methods that connect assessment of task performance with subjective measures, assessed through questionnaires and interviews. Through the cooperation between integration and evaluation teams, task performance has partly been assessed in connection with the system tests reported in Deliverable D6.3.


3 Preparation for Evaluation Activities

3.1 Preparations during the final year

The evaluation of the robot system is a central activity of CommRob in the final year. There are many factors that influence the success of evaluation, most importantly the progress and state of the system development and integration. Therefore it was decided early on that evaluations with users would take place in a number of dedicated evaluation weeks (EWs), each one preceded by a week of integration. All this activity took place at the premises of FZI in Karlsruhe, and each evaluation week there were participants from at least two (mostly three or four) partners present. The activities are visualised in Figure 3.1. This structure of the evaluation process turned out to be successful in terms of the quality of collaboration between partners, but there were some occasions when only limited user evaluation could be performed. At those times pilot studies were performed that paved the way for a formal user evaluation in a following week. Overall, the integration and evaluation activities had to be adapted more or less continuously with respect to the availability of system elements.

During the final year of the project the evaluation preparation started right after the review meeting. After the review we documented the status of the system at the time.

[Figure 3.1: timeline of evaluation activities from May 2009 to June 2010, covering the MoC pilot studies, the MoC study, the Walking Aid pilot study, study planning for the Shopping study, the Shopping study, the exploratory study of shared shopping, and the analysis phase]

Figure 3.1: Graphical representation of the evaluation activities

In May 2009 we started by checking the state of the trolley and its components at the time against the project goals. The functions and features were further divided between mainly individual technical features and components to test, components to test in different combinations, and, last but not least, features of the CommRob shopping trolley that users would be able to experience and evaluate. A strategy to tightly couple so-called "integration" weeks with "evaluation" weeks was drafted, together with the planning until spring 2010. We also inspected the FZI laboratory settings to be used for the envisioned series of evaluations.


In June 2009 we carried out a pilot study involving three participants to test and iteratively refine the system components and the user evaluation set-up, and to define and test the envisioned shopping mission. This resulted in a list of system components that needed to be improved before the next iteration of evaluation.

In August 2009 another pilot study, with 5 participants, was performed to iteratively refine and improve both the system components and their interworking interfaces. In this study we tested that the task scenario would work sufficiently well to allow for user testing. Some improved components were installed and tested (e.g., the wireless barcode scanner). Some performance improvements of the GUI were carried out. We also made an informal speech interface evaluation.

In the beginning of September 2009, the movement-centric user study involving about 20 participants was carried out. This study is described in Chapter 4.

During the semi-annual meeting in September 2009 we discussed and planned the further evaluationactivities. Focus was on staffing constraints and the narrow time frame for HW/SW integration.

In October and November we met in Karlsruhe to analyse and plan evaluation of the Shopping scenario. The CoRoD trolley was tested to establish whether evaluations were possible. Initial integration and testing of the vision system was conducted.

In March 2010 we carried out a heuristic evaluation of individual robot components that had not been available for testing before. We also did a heuristic evaluation of the integrated robot system. Some issues related to modules that crashed when shifting between users were identified and fixed. We also prepared a more realistic shopping environment; shelves and products were acquired. The week after, we carried out the study with 11 users described in Chapter 6. Towards the end of the project we also carried out exploratory studies of the Walking Aid, described in Chapter 9, and of the shared-shopping scenario.

3.2 Comments on the work process

Early in our evaluation activities we learned that a specific kind of work is required to bring the system from an integrated prototype, which passes system tests, to a system which can actually be put in the hands of untrained users in a proper environment for the planned runs with the users. We found that one day of work with users requires about 2-3 days of such system preparation work with a user focus. Let us detail this specific activity.

• The robot platform needs to be operated by technical staff with different specializations. This constrained access to both the components and the robot. To do evaluation, input was needed from many partners: those specialized in interaction design and evaluation, those specialized in the communication platform and application logic, as well as those specialized in the robot hardware and lower software layers.

• While many user interface parts were prepared "at home" by partners and tested as units or in simulated environments, their ultimate tests were on the robot, integrated with other components and with feedback from interaction design specialists. This in turn resulted in cycles of formative evaluation in which changes driven by input from users were made. For instance, the wireless barcode scanner needed to be adopted after pilot trials showed users being uncomfortable with the wired scanner. The actual design of user experiments and trials often depends on the progress of the formative evaluation; this design had to be altered after partner discussions on site with the robot.

• A number of interaction design improvements came only when the integrated system was tested, and thus time was taken near the robot for their rapid development and for cycles of testing and improvement.

• Since the lab at FZI had to be shared with other projects, we needed to create and build up a store each time. We had to put up shelves, get products, consider their placement, and enter each product into the database (with price, bar code, picture, and coordinates). When the shelf configuration changed to match the requirements of the user tests, the topological map of the robot also needed to change, depending on the placement of RFID barriers.
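The per-product bookkeeping this implies can be sketched as a minimal record type. This is only an illustrative sketch: the field names, barcodes, and prices below are assumptions for the example, not the project's actual database schema.

```python
from dataclasses import dataclass


@dataclass
class Product:
    """One entry in the mini-store database (illustrative schema)."""
    name: str
    barcode: str      # EAN/UPC string read by the trolley's scanner
    price_cents: int  # integer cents avoids floating-point rounding
    picture_path: str # image shown on the trolley's GUI
    x: float          # shelf coordinates in the store map (metres)
    y: float


# Rebuilding the store each time means re-entering every product:
store_db = {
    p.barcode: p
    for p in [
        Product("Shampoo", "4001234567890", 349, "img/shampoo.png", 2.5, 0.6),
        Product("Wine", "4009876543210", 799, "img/wine.png", 0.5, 0.6),
    ]
}

# A scan event then resolves to a full record by barcode lookup:
scanned = store_db["4001234567890"]
print(scanned.name, scanned.price_cents)  # Shampoo 349
```

Keying the database by barcode mirrors the scanner-driven workflow described above; the robot's topological map would reference the same coordinates.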


4 Movement-centric Evaluation

4.1 Introduction

User interfaces of semi-autonomous robots are typically multimodal, including speech output and a graphical user interface (GUI), see, e.g., [45]. A few such robots can also communicate with 'arm' or 'hand' gestures, see, e.g., [10, 53]. In analogy to communication between humans, however, the embodiment of such a robot should allow it to also include body movements in 2D space in multimodal communication with a human.

Therefore, this work introduces and investigates the modality Motion Cue (MoC), enacted by a semi-autonomous robot in the context of multimodal communication. It is integrated with the GUI and speech output modalities to reinforce them. The GUI generation has been described in previous work [15, 31].

The robot used in this study is the FZI InBot [19]. Since the robot lacks manipulators, it may be thought of as a kind of "torso", whose only possible movements are in 2D, by changing position, orientation or speed.

We have identified several possibilities for communication with such 2D-space movements. This work is focused exclusively on a specific scenario in the context of shopping support, where a robot cart uses the following motion cue:

• The robot slows down before incidentally passing a product from a given shopping list, while explaining this fact through GUI and speech output. This is also an example of mixed initiative, since the robot takes the initiative in such a situation.

For studying the effects of this new modality, we designed an experiment around this scenario, with and without MoC. We performed this experiment in a laboratory that simulates a small part of a supermarket, sufficient for enacting this scenario with human participants. Our statistical analysis of the experimental results includes both quantitative measurements and subjective statements from the participants.

4.1.1 Related work

Motion cues as a form of non-verbal communication are part of the robot's 'body language'. This concept was discussed, e.g., in [18, 36] with a focus on emotional body movements. Previous work [17] presents the definition of a social interface: robots shall provide a natural interface by employing human-like social cues and communication modalities. Social behaviour is thus only modeled at the interface, which usually results in shallow models of social cognition. Moreover, the authors aim for readable social cues: a socially interactive robot must send signals to the human in order to provide feedback on its internal state and allow the human to interact in a facile, transparent manner. The proposed channels for such emotional expressions include, among others, body and pointer gesturing.


Figure 4.1: GUI screen while robot performs motion cue.

Dance as a form of social interaction was more recently used in [40] to explore the properties and importance of rhythmic movement in general social interaction. All these authors focus on movement as an expression of emotion, not as a modality that reinforces the output of other modalities in a multimodal user interface.

Other work [35] discusses humanoid robot body orientation for reconfiguring spatial formation arrangements with people. The human and the robot are in a so-called F-formation. The authors show that a humanoid robot that interacts with a person can reconfigure the spatial formation of the user by rotating the robot's body. Moreover, they point out that it is more effective to rotate the whole humanoid body of the robot than to rotate only the head. In contrast to our work, the authors do not consider a moving semi-autonomous shopping cart but a humanoid robot with a different morphology, and thus a different embodiment.

The perception of affect elicited by robot motion is presented in [48]. The authors point out that in inter-human communication, motion cues can reveal details about a person's current physical and mental state. Research has shown that people do not only interpret motion cues of humans, but also motion cues of devices such as robots. They therefore studied the relationship between the motion characteristics of a robot and perceived affect, and suggest that acceleration and curvature appear to be most influential for how motion is perceived. Their work differs in the morphology of their much smaller robots and the different trajectories that these robots had to follow when driving around. Additionally, their work does not involve a cooperative task that the human and the robot fulfill jointly.

Other work points out that humans perceive all motor actions as semantically rich, whether or not they were intended to be [7], or demonstrates that unpleasant bodily expressions of the robot elicit unpleasant impressions in the user, and vice versa [32].


4.1.2 The Motion Cue through Speech Change

MoC is a kind of Spatial Prompt [21, 28] that, used as an output modality in a multimodal UI, reinforces other modalities such as the GUI and speech output. The physical representation of MoC in our current approach is the acceleration and deceleration of a robot with a given cruising speed. Moreover, the user does not actively accelerate or decelerate the robot while MoC is used as an output modality; it is an autonomous robot behaviour. This modality is based on two quantities, velocity and displacement. Velocity is the rate of change of displacement with time, and acceleration and deceleration, respectively, are changes in velocity over time. For implementing MoC on a given robot, the Newtonian equations for these quantities simply have to be used. So, the main properties of MoC are the concrete values for the intended robot torso acceleration and deceleration. Changing these values at design time lets humans experience MoC differently at run time.

We have implemented MoC prototypically on a robot cart. The speed of our robot is 50 cm/sec when it drives from one place to another. In its layered architecture, the communication layer (top layer) sends a MoC command for a product and the position of the product to the operational layer (middle layer). The operational layer triggers the beginning of MoC once the distance of the robot to the product position is less than one meter. With a short delay, the communication layer puts a screen as shown in Figure 4.1 on the robot's touch screen, and triggers speech output with the content "We are now passing product name from your shopping list." Then the robot decelerates to 10 cm/sec and drives at this speed for about 1.5 m.

While the robot moves more slowly, the user may stop the robot via the stop button on the GUI. If the user stops the robot during the MoC behaviour, she can resume later and the robot accelerates again to 50 cm/sec. If the user does not stop the robot, it accelerates again to 50 cm/sec once the product is left 50 cm behind, and drives to the originally given location.
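A minimal sketch of this behaviour, modelled as a distance-driven state machine. All class and method names are illustrative; the actual layered CommRob implementation is not shown here.

```python
# Minimal sketch of the MoC behaviour described above, as a state machine
# driven by the signed distance to the product (negative = product passed).
# All names are illustrative; the actual CommRob layers are not shown.

CRUISE, SLOW = 0.5, 0.1  # m/s, speeds from the prototype

class MotionCue:
    def __init__(self):
        self.state = "cruise"
        self.stopped = False

    def target_speed(self, dist_to_product: float) -> float:
        """Return the commanded speed given the distance to the product."""
        if self.stopped:
            return 0.0
        if self.state == "cruise" and 0 < dist_to_product < 1.0:
            self.state = "moc"        # trigger: product less than 1 m ahead
        elif self.state == "moc" and dist_to_product < -0.5:
            self.state = "cruise"     # product left 0.5 m behind: resume
        return SLOW if self.state == "moc" else CRUISE

    def stop(self):     # user presses the stop button on the GUI
        self.stopped = True

    def resume(self):   # user resumes; the cart accelerates back to cruise
        self.stopped = False
```

The operational layer would evaluate something like `target_speed()` in its control loop, while the communication layer handles the GUI screen and speech output in parallel.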

Figure 4.2 visualizes MoC, where the red line shows the robot moving at 50 cm/sec and the green line indicates the MoC behaviour without a stop.

4.1.3 Experiment with a Semi-Autonomous Robot

For evaluating MoC, we designed and ran an experiment with our semi-autonomous robot in a laboratory. The basic scenario was to have the robot guide the participant to some product. On the way, both the participant and the robot would pass by three other products from the participant's shopping list. While the participant originally does not know about that, the robot has full information about these product positions and the fact that they are on the shopping list. So, it would try to inform the participant when approaching such a product. This communicative act of informing is always done through the GUI and speech output. In addition, the robot slows down when MoC (as described above) is enabled.

In our experiment, we wanted to find out whether MoC reinforces the message or not. So, we compared trials with MoC enabled with other trials where MoC was disabled, i.e., only GUI and speech were used.


[Sketch: a shelf along the aisle with Wine, Coca Cola, Cleaner and Shampoo, the robot cart, and a plot of the robot cart's speed [m/s] over the distance to the product [m].]

Figure 4.2: Motion cue experiment setup in our laboratory.

Laboratory Setting

We designed and built the laboratory setting for the experiment as sketched in Figure 4.2 and as depicted in Figure 4.3. It simulates a small part of a supermarket, more precisely a 10 meter aisle. The shelf contained about 10 different products that the robot had stored in its product database (including their positions). Each of them could be scanned with the bar code scanner mounted on the robot.

Given Task

We gave the participants a print-out of a shopping list with the four items Coca Cola, Cleaner, Wine, and Shampoo. The list had to be entered in such a way that the robot had Shampoo as the first target location. We asked the participants to imagine that this was their own shopping list. The idea was to motivate them to 'buy', much as one would usually be motivated to buy items from one's own shopping list. After that, the last instruction was to let oneself be guided to the first target location (Shampoo).

After that, the robot was supposed to take the initiative and guide the participant. When passing one of the products on the shopping list, the robot would initiate a dialogue about that fact. That is, there were no other explicit tasks given to the participants, in particular no explicit instruction to 'buy' (to take such a product, to scan it and to put it into the cart). This was only implicitly suggested by having entered them on 'one's own' shopping list.


Figure 4.3: Participant stopping the robot via GUI after a motion cue.

Participants

We hired participants of about the same age (between 20 and 30 years old, with an average of 24.1 years) and about the same level of education (students), with the incentive of a small gift (5 Euros) after having participated. Then we assigned them to two groups (with the same gender distribution), one designated to be 'treated' with MoC first and one without.

Two Trials Each

In addition, we wanted each of them to experience both 'treatments', so that they might better explain their user experience with MoC. So, we let each participant repeat the same trial a second time, but this time receiving the other 'treatment'. It was clear that there would be a learning effect, so that comparisons of data from the first and the second trials could, of course, not be made for testing our null hypothesis. Still, we found it interesting to get data for better understanding the learning effect through comparing the data from the (repeated) second trials.

In order to get more information from the participants about their user experience, we actually let them receive the MoC 'treatment' once again after these two formal trials. This time, however, we did not evaluate any measurable results but asked them to verbalize their experience through thinking aloud. In addition, the test leader asked them to reflect in detail on the robot's behaviour when it was slowing down at a product from the shopping list.


Null Hypothesis

We defined the null hypothesis that employing MoC would show essentially the same results as without MoC. These results were measured through the number of products taken and the number of attempts to stop the cart, both in the course of a given trial. These metrics are supposed to indicate whether the communicated message, that a product from the shopping list is in the close vicinity, gets across.

Execution

Before these trials, each of the participants was given a short introduction on how to use the basic functionality of the robot: initiating it to guide one to a given product, and stopping it. For 'buying' a product, they were advised not only to put the product into the basket of the robot cart, but also to scan it with its bar code scanner. In the course of this introduction, the participants got acquainted with the bar code scanner and the touch screen.

Of course, however, they were neither shown the feature of indicating a product from the shopping list nor told about it. And they were not shown or told anything about a difference in behaviour (slowing down or not).

Then the given task was explained to the participant. So far, everything was identical for all participants. Depending on the group they had been assigned to, however, they were given the 'treatment' with MoC first and then without, or the other way round.

Before the introduction to the robot, in between the two trials, and after them, as well as at the end, we asked the participants to fill out various questionnaires. In particular, we were interested in the subjective opinion of each participant on a few issues. Primarily, we wanted to know whether the participants felt that the robot influenced the execution of their planned action (going to Shampoo) through the suggestions for interleaved actions (such as taking Coca Cola while passing).

Experimental Results

The statistical analysis of the experiment's results includes data of the 15 participants that completed both trials of the experiment. (5 participants encountered technical problems with the robot, so that their data cannot be included.) For each participant, we measured the number of products the participant had taken. Each participant may have taken between 0 and 3 suggested products (Cola, Cleaner, and Wine) before arriving at the originally targeted product (Shampoo). The frequencies of the suggested products taken in the first trial are shown in Figure 4.4.

Figure 4.4 illustrates how many participants (drawn on the ordinate) have taken how many of the suggested products (drawn on the abscissa), with the left blue bars showing the number of participants treated with MoC and the right red bars the ones treated without MoC. Figure 4.4 shows that participants using the robot with the MoC behaviour activated have taken more products in the first trial than participants using the robot without. The contrary trends of the red and blue bars in Figure 4.4 show that the number of products taken correlates with the behaviour of the robot. The strong correlation of 0.821 is highly


[Bar chart: frequency over the number of suggested products taken (0-3), with separate bars for the MoC and w/o MoC conditions.]

Figure 4.4: Frequencies of products taken in the first trial.

statistically significant at the 0.01 level.1 Since the situation in the first trial was identical for all participants besides the activation or non-activation of MoC, we are safe to assume that the strong correlation is caused by the robot's behaviour.
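The reported coefficient is a point-biserial correlation, i.e., the Pearson correlation between a binary condition (MoC = 1, no MoC = 0) and the number of products taken. A minimal sketch with illustrative data (not the study's raw measurements):

```python
# Sketch: point-biserial correlation computed as a Pearson correlation
# between a binary condition and a count. The data are illustrative,
# not the study's raw measurements.
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

condition = [1, 1, 1, 1, 0, 0, 0, 0]   # MoC vs. no MoC (illustrative)
taken     = [3, 2, 3, 2, 1, 0, 1, 0]   # products taken (illustrative)
r = pearson(condition, taken)          # r ≈ 0.894 for these values
```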

In the second trial, participants using the robot with MoC behaviour have taken approximately the same number of products as participants using the robot without.

Based on the data of both trials and the different results in the two trials, we calculated a two-factorial ANOVA (analysis of variance) with the robot's behaviour (with MoC and without MoC) and the trial (first and second) as independent fixed factors. Thus, we wanted to know if the means of the suggested products taken differ between the four groups and, therefore, we analyzed whether the variance between the groups is greater than the variance within the groups. Table 4.1 shows the ANOVA results, which are visualized in the interaction graph shown in Figure 4.5.

Table 4.1: Effects of behaviour, trial and behaviour ∗ trial on the participants' actions, with effect size F and significance level sig.

Factor              Mean Square   F        Sig.
behaviour           4.715         5.975    0.022
trial               8.715         11.043   0.003
behaviour ∗ trial   11.834        14.996   0.001
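For illustration, a balanced two-factorial ANOVA of this kind can be computed in pure Python as follows. The `cells` dictionary maps (behaviour, trial) to the observations in that cell; any example data used with it is illustrative, not the study's raw measurements.

```python
# Sketch of a balanced two-factorial ANOVA (behaviour x trial) in pure
# Python. `cells` maps (level_of_A, level_of_B) to the list of
# observations in that cell; equal cell sizes are assumed.
import itertools

def two_way_anova(cells):
    """Return {effect: (mean square, F)} for a balanced two-way design."""
    A = sorted({a for a, _ in cells})
    B = sorted({b for _, b in cells})
    n = len(next(iter(cells.values())))          # observations per cell
    N = n * len(A) * len(B)
    grand = sum(x for v in cells.values() for x in v) / N

    mean_a = {a: sum(x for b in B for x in cells[(a, b)]) / (n * len(B)) for a in A}
    mean_b = {b: sum(x for a in A for x in cells[(a, b)]) / (n * len(A)) for b in B}
    mean_ab = {k: sum(v) / n for k, v in cells.items()}

    ss_a = n * len(B) * sum((mean_a[a] - grand) ** 2 for a in A)
    ss_b = n * len(A) * sum((mean_b[b] - grand) ** 2 for b in B)
    ss_ab = n * sum((mean_ab[(a, b)] - mean_a[a] - mean_b[b] + grand) ** 2
                    for a, b in itertools.product(A, B))
    ss_err = sum((x - mean_ab[k]) ** 2 for k, v in cells.items() for x in v)

    df_a, df_b = len(A) - 1, len(B) - 1
    ms_err = ss_err / (N - len(A) * len(B))
    return {
        "behaviour": (ss_a / df_a, ss_a / df_a / ms_err),
        "trial": (ss_b / df_b, ss_b / df_b / ms_err),
        "behaviour*trial": (ss_ab / (df_a * df_b), ss_ab / (df_a * df_b) / ms_err),
    }
```

Comparing each F against the F distribution with the corresponding degrees of freedom then yields significance levels like those in Table 4.1.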

The significance levels in Table 4.1 show that both factors, the robot's behaviour as well as the trial, and additionally an interaction between the robot's behaviour and the trial, have significant effects on the participants' actions. This means that in the first trial, the different behaviours lead to a significant difference in the means of the numbers of products taken. Looking at the sequence of both trials, we see that in the second trial this difference disappears and thus the number of trials also has an effect, that is

1The correlation value is the Pearson correlation coefficient of a point-biserial correlation.


[Interaction graphs: estimated marginal means (0-3) of products taken, plotted over trial (1, 2) for each behaviour (MoC, NonMoC), and over behaviour for each trial.]

Figure 4.5: Interaction graph of robot’s behaviour and trial.

even stronger than the robot's behaviour, comparing the effect sizes F in Table 4.1. We assume that the effect of the trial number is caused by a high learning effect that superimposes all other effects. Thus, we cannot say whether an effect of the robot's behaviour is present by looking only at the second trial's data. The results in Table 4.1 regarding the interaction effect of both factors (behaviour ∗ trial) confirm that the effect of the robot's behaviour changes (decreases) significantly with the participant's learning effect. On the other hand, the interaction effect shows that if MoC is activated in the beginning, less learning is required for the participant to act in the same way (take as many suggested products as possible) in the second trial. Overall, MoC improves the communication of the robot with users not yet familiar with the robot's behaviour.

To exclude any learning effect in the analysis (even learning effects within the first trial), we also analyzed the participants' actions on the very first product (Cola) in their first trial separately, including all 20 participants that completed the first trial successfully. The results show a similar and highly significant effect at the 0.01 level of the robot's behaviour on the participants' actions, with 9 of 10 participants taking Cola in the MoC condition and only 1 of 10 in the condition without MoC.

We further calculated crosstabs between the numbers of products taken in the first and second trial to verify that the participants behaved consistently in taking products in both trials, which we verified successfully. Thus, participant-specific effects appear in the same way in both trials, e.g., a participant that usually does not take wine acted consistently in both trials.

We also took the metric 'participant attempts to stop' into account, which measures whether a participant attempted to stop the cart but did not necessarily stop it. The analysis of this metric shows similar statistical


results as the metric 'participant takes the product' as reported above. We also discovered a significant correlation of that metric with the robot's behaviour in the first trial, where the MoC behaviour drew more attention to the product than the behaviour without MoC.

In addition, we asked the participants to rate the statement "The robot cart got me to execute my planned actions only after the suggested actions" after the first trial, to see if the suggestions by the robot influenced their actions. The participants had to answer on a 5-point rating scale from 'totally agree' to 'totally disagree'. The frequency distribution of the answers after the first trial is shown in Figure 4.6. In the case of the MoC behaviour, most participants agreed (median x̃ = 1.5), whereas in the case of the behaviour without MoC more participants disagreed (x̃ = 4). A univariate analysis of variance with the robot's behaviour as factor shows that around 25% (partial η² = 0.252, sig = 0.048) of the variance can be explained by the robot's behaviour.
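The effect-size measure used here, partial η², relates an effect's sum of squares to the sum of effect and error sums of squares. A minimal sketch:

```python
# Sketch: partial eta squared, the effect-size measure reported above:
# eta_p^2 = SS_effect / (SS_effect + SS_error).

def partial_eta_squared(ss_effect: float, ss_error: float) -> float:
    return ss_effect / (ss_effect + ss_error)

# An effect accounting for a quarter of effect-plus-error variance:
quarter = partial_eta_squared(1.0, 3.0)   # 0.25
```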

[Bar chart: rating frequencies over the scale points 1-5, with separate bars for the MoC and w/o MoC conditions.]

Figure 4.6: Rating frequencies of the statement "The robot cart got me to execute my planned actions only after the suggested actions", from 'totally agree' (= 1) to 'totally disagree' (= 5).

Based on these results, we think that the MoC behaviour draws more attention to the robot and the actions it suggests, primarily for new users. This may increase the acceptance of robots.


5 Implications: HRI on the Move

Most settings in task-oriented Human-Robot Interaction regard sensing, processing and, in general, interacting with the user at zero or very low robot speed, and the human is not assumed to be moving much either. In fact, even research on spatially oriented robotics relies on the robot and the user being in a relatively fixed position [58, 29, 39]; other work focuses on cognitive models of spatial relationships [49, 9, 54, 41], i.e., on how to interpret or represent communicative input related to movement. In the course of this project we discovered the need to reconsider Human-Robot Interaction with regard to the movement of both the human and the robot.

• For example, using a touch screen or speech input is much harder for a user in a crowded environment when the robot is allowed to move during interaction, compared to a static robot in a crowded environment that moves from time to time but may not be used when "on the move".

• Moreover, it is a challenging task when both the human and the robot can propose their own initiatives for interaction (mixed initiative) in a socially acceptable manner.

So, to overcome some of the challenges of HRI on the move with mixed initiative, we present sequential communication phases charting a design space for multimodal UIs that are dedicated to HRI on the move.

Although around 30 users (see Chapters 4 and 7) have done shopping with the robotic shopping trolley in a laboratory environment with various degrees of success, the problems that they encountered and the issues we found while trying to improve the interaction led us to delimit several categories of "HRI on the move" challenges, which we present below.

5.1 Constraints for HRI on the Move

We identified the following constraints for designing interaction for HRI on the move: walking cognitive workload, time constraints, and perception capabilities. They are discussed in the following.

5.1.1 Walking Cognitive Workload

Slowing down when the robot "shows" something to the user is a functionality of our robot (cf. Chapter 4), yet only later did we come to reflect on why this is appropriate. Our current understanding is that walking poses a certain cognitive workload which reduces the attention that the user can invest in looking in directions other than the walking direction, looking at specific details like the touchscreen, etc. Reducing the movement speed can thus be necessary if important aspects of the environment must be considered, or some communication from the robot perceived. Again, as for other aspects of interaction on the move, we should consider this cognitive load aspect in a more general way, as a generic condition for HRI on the move.


5.1.2 Time Constraints

In order for the human-robot communication to take place in a socially acceptable manner, some operations need to be performed within a certain time window. The robot slow-down exemplified a situation where the robot tries to give the user more time to consider an offer and react. In any user interface, swift and relevant feedback is vital for successful communication (cf. [8, 11, 20]).

We can exemplify "latest" time constraints. Some interactions have to be performed in a specific short time frame ("let's go to the right", "oh, stop a second", "no - left, not right"). These are cases where the complete interaction cycle has to take place in a few seconds. This means: detection of the need for interaction, taking the initiative and starting a communicative act, perception of the communicative act by the other partner, interpretation of the act, performing an action to respond to the communication, detection of the action by the first partner, and interpretation by the first partner.

"Earliest" time constraints exist too. Sometimes some time has to pass before a reaction to a communicative act is possible, or before taking the initiative to start a communicative act is possible. This depends on the available modalities. For example, the available modality is a GUI on a touch screen. Robot: "Here is a special offer; if you want me to stop, touch the screen!" But the user is 3 m away from the robot, and therefore it takes some seconds for the user to reach the screen.

If both types of constraints appear in the same situation, one must provide heuristics for determiningwhich one is more important, and how that affects the communication to the user.
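One simple way to combine the two constraint types is to treat them as a feasibility check on the interaction's time window. The function and the numeric values below are illustrative design-time sketches, not part of the actual trolley software.

```python
# Sketch: reconciling "earliest" and "latest" time constraints for one
# interaction as a feasibility check on its time window. The numeric
# values are illustrative design-time estimates, not measured data.

def interaction_window(earliest: float, latest: float):
    """Feasible (start, end) window in seconds, or None if the earliest
    possible reaction already misses the latest deadline."""
    if earliest > latest:
        return None   # infeasible, e.g. user too far from the touch screen
    return (earliest, latest)

# User needs about 3 s to reach the screen; the cue stays relevant for 5 s:
window = interaction_window(3.0, 5.0)    # (3.0, 5.0)
# User needs 6 s but the deadline is 5 s: another modality must be chosen.
blocked = interaction_window(6.0, 5.0)   # None
```

An empty window is exactly the case where a heuristic must pick a different modality (e.g. speech instead of the touch screen) or drop the interaction.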

5.1.3 Reduced Perception Capabilities

On the technical side, many sensors behave differently (generally more poorly) while on the move. Cameras may give fuzzy or shaky pictures when mounted on a moving robot due to the shaking platform. As mentioned already, noise from the robot drive system disturbs speech recognition, meaning that electro-mechanical systems like robots have reduced perception capabilities while on the move.

5.2 Mixed Initiative

Mixed initiative allows either the human user or the robot to start an interaction. This allows for a more natural interaction between a human and a robot [1]. We present here two examples of mixed initiative in the course of our running example.

Suggestion of product: The robot suggests an item that the user is probably interested in. This can be an item on the shopping list which is passed at that very moment. Such a shopping list is administered via the touch screen of the robot. The robot can choose between the different modalities available to it. In this case, slowing down and uttering a verbal comment (e.g. 'we are passing' followed by the product name) can be most familiar to the user. This could be assisted by turning the robot in the direction of the item, which would catch more attention, thus reinforcing the speech and GUI output.


Suggestion of route: The robot suggests an alternative route which probably offers advantages for the user, e.g., because it is shorter than the originally planned route, or because the planned route is crowded or blocked. Another interesting application can be leading a visually impaired person, who is holding on to the force-sensitive handlebar, along a planned route, this way acting as an intelligent white cane. Here the most suitable modality might be slowing down a little and slightly turning in the direction to be indicated while uttering a speech output. This provides the user with an extra modality to react to the robot's action: the resistance force on the handlebar.

Machine initiative is the first phase of the process. The design considerations for this phase depend on the application, i.e., what the initiative is taken about. We have exemplified above two such applications and the possible design decisions with regard to the modalities used (speech, GUI) and how to use them.

Human perception of machine initiative follows, after a certain time, the initial initiative. The time needed for the user to notice the initiative, as well as other aspects of human perception, are important factors to consider in a human-robot interaction set-up where human attention may be directed to other components of the environment. Further robot actions may come too early for the user to comprehend them in their logical relation to the initiative.

Human response to the machine initiative can be of different types. It can range from tacit perception, whereby the user notices the initiative but does not react in any way that can be understood by the machine, to acknowledgement, whereby the user reacts to the initiative without attempting to explicitly respond by interacting with the machine, to interactional response, in which the user interacts with the machine to accept, strengthen, weaken, or reject the machine initiative. For example, for a product suggestion of our exemplary trolley, the user can acknowledge by slowing down and/or looking at the product, without explicitly interacting with the trolley; yet both such acknowledging cues can help the trolley in shaping its further actions. An interactional response would occur when the user stops the trolley via the GUI touchscreen or by scanning the offered product with the bar code scanner. For a route suggestion, the user can amplify or reduce the robot's turn using the robot's haptic handle.
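This three-level classification can be sketched as a simple decision over sensed cues. All names below are illustrative, not part of the actual trolley software; in particular, which cues a real trolley can reliably sense is an open question discussed later in this section.

```python
# Sketch: the three response types described above, as a classifier over
# hypothetical sensed cues following a robot initiative. All names are
# illustrative, not part of the actual trolley software.
from enum import Enum

class Response(Enum):
    TACIT = "tacit perception"
    ACKNOWLEDGEMENT = "acknowledgement"
    INTERACTIONAL = "interactional response"

def classify(user_slowed: bool, looked_at_product: bool,
             touched_gui: bool, scanned_product: bool) -> Response:
    # Explicit interaction with the machine dominates.
    if touched_gui or scanned_product:
        return Response.INTERACTIONAL
    # Cues the trolley might sense that are not addressed to it.
    if user_slowed or looked_at_product:
        return Response.ACKNOWLEDGEMENT
    return Response.TACIT
```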

It is important to consider whether the interaction is expected to take place in one of the modalities in which the initiative was presented, or whether it can come in other modalities. For example, in the case of a route suggestion, the user may choose to press a Stop button on the robot's touchscreen GUI, or more directly steer the robot in the opposite direction from where it turned to suggest another route.

In case of user acknowledgment without further interaction, a default action may be defined during design. In some cases acknowledgment without explicit interaction may be enough to consider the robot initiative as accepted, while in other cases it may only count as partial acceptance, thereby weakening the initiative or canceling it altogether. The difference between acknowledgment and explicit interactional response has also played a role for us in evaluating movement-as-communication designs with mock-up robots [6] and currently with the InBOT platform. We found that it is much easier to establish whether the initiative was taken than it is to establish whether it was noticed or acknowledged.


Design also plays a role in facilitating interactional response. Not only the launching of the initiative can be designed, but also the levers offered to the user for responding to it can be made more available. In the case of a product offering, the user could respond not only by speech or touchscreen GUI interaction, but also by scanning the offered product with a portable scanner, which may be more handy than reaching for the touchscreen. To reject an offer, the user could also employ the movement modality by pushing the slowing-down trolley (in the case of a product offering) or turning it in the initial direction (in the case of a route suggestion or a product offering via a turn).

Such prompting for user interaction is also useful when the user has the initiative: e.g., when the user asks the trolley to move to a destination, the trolley can start slowly and accelerate progressively, to allow the user to stop the action or alter it.


6 Shopping Study

6.1 Introduction

We carried out a study to evaluate the experience of doing shopping with the trolley. The study was carried out with eleven participants at the FZI premises in Karlsruhe.

6.2 Setup and procedure for the Robot Shopping study

The shopping trolley that was used (InBot) could move autonomously to a product in the simulated store by using the commands "guide me to X" or "meet me at X". In the Shopping study, it could also be steered through a haptic handle.

Figure 6.1: The test leader instructing a test person on how to use the speech interface. The background shows the shelves with products.

The participants were greeted and a consent form was handed out. After they agreed to participate in the study, we invited them to fill out Questionnaire 1 (background data). We also informed them that we would record the session with the robot on video.

The test leader then introduced the shopping task to the test persons. The task was to use the cart to shop according to a list of products placed in the store (see Table 6.1). The instruction for the first three items was to use the touch screen user interface (GUI), and for the next three to use the speech interface (Speech). For the three last products they were allowed to use either of the two modalities that were available (GUI/Speech). The haptic steering was available, but was introduced in the following session.

The participants were given a short demonstration of both the touch screen interface and the speech interface (see Figure 6.1). For the touch screen interface, they entered one or two products and the trolley moved to a location. Once the robot had reached its goal position they were instructed on how to use the


Product      Command    Mode
Tea          Guide me   GUI
Coffee       Guide me   GUI
Eggs         Meet me    GUI
Juice        Meet me    Speech
Carrots      Guide me   Speech
Toothbrush   Meet me    Speech
Chocolate    (free choice)
Battery      (free choice)
Magazine     (free choice)

Table 6.1: The user tasks with corresponding commands for the Shopping Study

bar code reader. After this they were told how to mount and use the headset microphone, and could try out phrases like "guide me to the next product", etc.

Once the demonstration was complete, the trolley was moved back to the starting position by the test leader. Then the participant was handed the shopping list with the task (shown in Table 6.1). The participant then entered the products of the shopping list. Although we gave a specific shopping list to the participant, we did not require or check that the list was entered in a particular order.

After the session with the GUI/speech UIs was completed, the users were introduced to the haptic steering. The participants were given a short introduction by a test leader on how to guide the robot by gently pushing/pulling the handle. The participants could then try out steering the trolley in different directions. Once informed about this way of controlling the trolley, they were asked to carry out a small task, i.e., shopping three products using the haptic handle. Also in this control mode the participants were shopping according to a specified list, but this time with products they had not shopped for in the previous part of the study. Since the trolley does not navigate autonomously in the haptic control mode, the map in the right-hand corner of the GUI was pointed out to the participants.

After completing the haptic part of the session, the users were invited to another room and we asked them to fill out Questionnaire 2 (usability assessment).

All in all, the following questionnaires were used in the study:

• Questionnaire 1: background data

• Questionnaire 2: assessment of users’ attitudes towards the trolley with the multimodal user inter-face and haptic control.

The questions in Questionnaires 1 and 2 were in essence the same as in the study on Motion Cues (in Questionnaire 2 we added a few questions specifically focused on speech). The results from these questionnaires and the ones from the study on Motion Cues are reported in the next chapter (Chapter 7).


6.3 Task-completion in the Shopping study

The participants all managed to complete the shopping task. The time they spent on the task differed quite a lot: they spent between 5 and 15 minutes on the task (see Figure 6.2). This can be explained by the fact that they did not shop exactly the nine items that were on their shopping list, but deviated from it (see Figure 6.4) by getting 2-5 products more. In Appendix 1, the results are presented in the form of histogram charts.

The mean time for shopping one product differed between the participants (range between 0.45-1.25 minutes). Across all participants the mean was about one minute. The absolute time to collect one product in the store is not so interesting, since it largely depends on the distance traveled. The differences in time to get each product, though, show that there are some differences in the way the interface worked (or not). During the inspection of the videos, two major issues were found:

– Technical problems with the barcode scanner.

– The user had to repeat phrases several times during speech input.

The impression was that users who used the speech input only for the three products for which the given task was to use the speech input (according to the task sheet) had the lowest task completion times.

Figure 6.2: Task completion, mean duration (mm:ss) per trial person of complete shopping session.

Figure 6.3: Task completion, mean duration of task (mm:ss) per trial person.


Figure 6.4: Task completion, number of shopped items (#) per trial person. Max=19, min=9.


7 Evaluation of the Multimodal User Interface

7.1 Users’ acceptance and satisfaction with the robot trolley

This chapter presents the results of a questionnaire survey concerning users' acceptance of and satisfaction with the shopping trolley. The same two questionnaires were administered to participants in two studies described earlier (Sections 4 and 6): the study on Robot Motion as a Communicative Cue (MoC) and the Robot Shopping study (RS). There were also a few extra questions directed to the participants of the Robot Shopping study. For the sake of clarity, we first repeat some characteristics of the studies and then report the results of the survey.

The MoC study was a controlled experiment with 20 participants. Its purpose was to study the effect of lowering the speed of the trolley when passing a certain product in a simulated shopping scenario. The users interacted with the trolley via touch screen, and the trolley responded via touch screen and spoken output (in English).

The Shopping study was an observational study with 11 participants, whose purpose was to evaluate the multimodal interface of the trolley. The users interacted with the trolley via touch screen, speech input and haptic handle. The trolley responded via touch screen and spoken output.

In both studies, users were first introduced to the trolley and then instructed to move around the shop environment with the trolley in order to buy certain products according to a prepared script. Before the trial they were asked to fill in a questionnaire with demographic questions and some questions about their shopping habits. After the trial they filled in a second questionnaire with questions about their acceptance of the trolley and their satisfaction with its interfaces and behaviour. Both studies were conducted in German (instructions and questionnaires).

The two studies differ somewhat in purpose and methodology, but they also have important similarities. Both included interaction with the same robot shopping trolley, albeit with two different interfaces (touch screen in the MoC study vs. touch screen + speech input + haptic handle in the Shopping study). Both used a shopping scenario in the same simulated shop environment; the only difference was that the environment was somewhat larger in the Shopping study, with a few additional shelves to allow for more extensive movement.

Given these similarities, we consider it meaningful to present the results of the questionnaire surveys together and to compare them in some respects. In particular, we will sometimes discuss the implications of the additional modalities of the Shopping study. We are aware that the number of participants is too low to allow generalisation of the results. However, the results can contribute to the growing knowledge of multimodal interaction with mobile robots and help generate hypotheses for further research.

The setup and procedure of the Robot Motion study were described in Section 4.1.3; the next few sections describe the setup and the questionnaire parts of both studies.


7.1.1 Participants

The MoC study group comprised 20 participants, 16 men and 4 women. Their mean age was 24.1 years, and all of them were students in areas such as informatics, electrical engineering, machine design and economics.

The Robot Shopping study group comprised 11 participants, 6 men and 5 women, with a mean age of 29.9 years. Six were students, mostly in technical areas, and 4 of the others were employed at the university.

The participants were frequent users of information technology. The tools and services they used the most were MP3 players (95 %/91 % in the MoC and Shopping groups, respectively), digital cameras (80 %/64 %) and online banking. Most of them had no previous experience with speech recognition systems, but 85 %/82 % had experience with touch screens, which they had used with, e.g., cash dispensers, ticket machines, mobile phones and navigation systems.

Most of the participants live in small households, buy groceries 1-2 times a week and often use a bike or walk to the supermarket. A few participants report that they sometimes drive a car to the grocery store. To the question of whether shopping is an important leisure activity for them, the majority answered "less important".

It can be argued that this selection of participants does not match the assumed target group for the shopping trolley very well. However, given that the robot was an early prototype with practical limitations and occasional technical failures, we considered it appropriate that the evaluation included people with a certain technical competence and understanding, though not knowledgeable in robotics.

7.2 Questions about the acceptance and satisfaction with the robot trolley

The second questionnaire included a number of questions about the participants' subjective experience of the robot trolley and its behaviour, including the interfaces. The questions were often phrased as statements to be answered on so-called Likert scales (e.g. Strongly agree, Agree, Neither agree nor disagree, Disagree, Strongly disagree). It is often recommended to treat such data as ordinal rather than interval data and to avoid standard statistical analyses, since the points on the scale cannot be assumed to be equidistant. Therefore we have mostly limited the analysis to a compilation of frequencies. Some other questions were answered as numbers on a bipolar scale (such as "cooperative - uncooperative"). We treat those data as ordinal as well, using only frequencies and median values to assess the results.
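This ordinal treatment can be sketched in a few lines of code; the response values below are hypothetical, not data from the survey:

```python
from collections import Counter
from statistics import median

# Hypothetical Likert responses coded 1 (strongly agree) .. 5 (strongly
# disagree); the values are illustrative, not the survey's data.
responses = [1, 2, 2, 3, 2, 4, 1, 2, 3, 5, 2]

# Treat the data as ordinal: report only frequencies and the median,
# avoiding means and standard deviations.
freq = Counter(responses)
n = len(responses)
for level in sorted(freq):
    print(f"level {level}: {freq[level]} ({100 * freq[level] / n:.1f} %)")
print("median:", median(responses))
```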

General satisfaction with the trolley

2:1 How satisfied were you with the behaviour of the shopping trolley?

Very satisfied or satisfied: 35 % (MoC Group); 36.3 % (Shopping group)


The initial question was a general one, subsuming a number of specific issues dealt with later in the questionnaire. In both groups, the response ratings were rather low: about 35 % answered that they were very satisfied or satisfied with the trolley's behaviour. This result probably reflects certain problems that occurred in handling the trolley, as well as the fact that some of the participants did not consider themselves to have a need for an advanced shopping trolley. We examine these issues further below.

2:2 Please rate the shopping trolley on the following scales: (shown in Table 7.1)

Table 7.1: Dimensions of question 2:2, "Please rate the shopping trolley on the following scales" (medians)

Scale                                    Group MoC (n=20)   Group Shopping (n=11)
predictable 1 ←→ 5 unpredictable               3.00                2.60
talkative 1 ←→ 5 quiet                         3.00                3.00
fast 1 ←→ 5 slow                               3.00                4.00
conspicuous 1 ←→ 5 unobtrusive                 2.50                2.00
expensive 1 ←→ 5 cheap                         1.00                2.00
controllable 1 ←→ 5 uncontrollable             2.00                3.00
unpleasant 1 ←→ 5 pleasant                     3.00                3.00
cooperative 1 ←→ 5 uncooperative               2.00                2.00
useful 1 ←→ 5 useless                          2.50                2.00
noisy 1 ←→ 5 silent                            4.00                3.00
attractive 1 ←→ 5 unattractive                 3.00                3.00
dangerous 1 ←→ 5 harmless                      4.00                4.00
boring 1 ←→ 5 interesting                      5.00                5.00

The participants were asked to rate the shopping trolley on a number of dimensions expressing both task-related and social/emotional experiences of the trolley. The results were similar for the MoC and Shopping groups, with a few exceptions (see Table 7.1). The participants in the MoC group experienced the trolley as faster than those in the Shopping group, whereas participants in the Shopping group experienced the trolley as noisier.

The median values were usually close to the middle of the scales, with some exceptions. Notably, participants in both groups generally considered the trolley to be very interesting or interesting.


7.2.1 Ease of use and learnability

Several questions in the questionnaire dealt with the general experience of ease of use of the shopping trolley.

2:3 The handling of the shopping trolley was easy to learn.

Strongly agree or agree: 95 % (group: MoC); 91 % (group: Shopping)

2:7 The concepts and the language used are easy to understand.

Strongly agree or agree: 75 % (group: MoC); 75.75 % (group: Shopping).

2:4 It is easy to operate the shopping trolley.

Strongly agree or agree: 40 % (group: MoC); 45.45 % (group: Shopping)

The overwhelming majority of the participants in both groups considered it easy to learn to handle the shopping trolley. Most of them also found the interface conceptually easy to understand. However, the lower ratings for 2:4 suggest that making the trolley move towards its destination sometimes raised considerable difficulties. This could be related to the usability of the speech/GUI command interfaces as well as to the trolley's movements (see further below).

2:5 It is easy to make the shopping trolley do what I want.

Strongly agree or agree: 55 % (group: MoC), 27.27 % (group: Shopping).

2:6 Altogether, it is unproblematic to use the shopping trolley.

Strongly agree or agree: 60 % (group: MoC); 36.36 % (group: Shopping)

Question 2:5 concerns the user's experience of control over the robot trolley. The results show a clear difference between the groups: in the Shopping group, only half as many thought it was easy to control the trolley. A similar pattern occurs in the results of question 2:6. We should keep in mind that the Shopping group comprised only 11 people, so the differences may well be due to chance. But adding this result to those of other questions, a picture emerges that the Shopping group had to control additional modalities, i.e. speech input and the haptic handle, both of which introduced practical difficulties for some of the participants.

7.2.2 The level of support for the user’s shopping activity

Apart from the ease of handling the trolley, an important issue is whether it supports the participants' shopping activity. On this general question, more than half of the participants in both groups agreed:

2:8 My shopping activity was well supported by the shopping trolley.

Strongly agree or agree: 55 % (group: MoC), 72.73 % (group: Shopping)


This result may seem surprising, considering that most participants were not heavy shoppers and lived in small households. A specific feature of the trolley that almost all participants appreciated was that it can locate products in the store. This was also a frequent answer to the open question 2:23, "What did you like about the shopping trolley?" (see below).

2:9 The function of finding products in the store is helpful.

Strongly agree or agree: 90 % (group: MoC), 100 % (group: Shopping).

One question directly addressed the characteristic that the trolley can move autonomously, with two options for the answer.

2:10 How do you find the idea that a shopping trolley can move by itself?

I prefer to push it by myself: 60 % (group: MoC); 18.2 % (group: Shopping)

It is helpful when it moves by itself: 45 % (group: MoC); 82.8 % (group: Shopping)

We may note that the answer pattern differs strongly between the two groups. In the MoC group, more than half of the users preferred to push the shopping trolley themselves. In the Shopping group, the majority of users had a positive attitude towards the trolley moving autonomously.

7.2.3 Preferences towards the interface modalities of a shopping trolley

The participants were asked to rank the interface modalities in the following question.

2:11 What kind of trolley interface would you prefer? Please assign each of them a ranking, orderingthem from 1 to 4:

Touch screen: 1
Haptic handle: 2
Speech input/output: 3
Gestures: 4

Overall, the touch screen was ranked highest by most participants: 90 % in the MoC group and 73 % in the Shopping group ranked it as number 1 of the four alternatives.

If the frequencies for ranks 1 and 2 are added for each modality, the order of the interfaces for the MoC group was screen (95 %), haptic handle (60 %), speech (30 %), gestures (15 %). For the Shopping group, there was no difference between haptic handle and speech, yielding the order: screen (100 %), haptic handle and speech (both 45 %), gestures (11 %).
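The rank-1-plus-rank-2 aggregation described above can be sketched as follows; the per-participant rankings in the example are hypothetical, not the study's raw data:

```python
# Hypothetical per-participant modality rankings (1 = most preferred,
# 4 = least preferred); the rankings are illustrative, not the raw data.
rankings = [
    {"screen": 1, "haptic": 2, "speech": 3, "gestures": 4},
    {"screen": 1, "haptic": 3, "speech": 2, "gestures": 4},
    {"screen": 2, "haptic": 1, "speech": 4, "gestures": 3},
]

# Percentage of participants placing each modality at rank 1 or rank 2.
n = len(rankings)
top2 = {
    m: 100 * sum(r[m] <= 2 for r in rankings) / n
    for m in ("screen", "haptic", "speech", "gestures")
}
for m in sorted(top2, key=top2.get, reverse=True):
    print(f"{m}: {top2[m]:.0f} %")
```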

It should be noted that the MoC group had only experienced the touch screen and speech output in the experiment with the trolley, whereas the Shopping group had experienced all of the interface modalities except gestures (no gesture recognition functionality was operative on the robot trolley). The touch screen interface was evidently felt to be robust and received high ratings on several specific questions (see the next section). Looking at the speech and haptic handle modalities, the Shopping group – which had


experienced both interfaces in the trials – ranked those modalities equally high but considerably lower than the touch screen. This difference is intriguing, but the limited number of participants prevents us from drawing general conclusions from the results.

7.2.4 Touch screen interaction

The use of a stored shopping list was an important part of the scenario in both studies. Therefore, a couple of questions addressed handling the shopping list on the touch screen. In general, entering the shopping list was found to be quite easy in the graphical interface.

2:12 To enter the shopping list was (easy 1 ←→ 5 complicated)

Easy or rather easy: 95 % (group: MoC); 90.91 % (group: Shopping)

2:13 To edit the shopping list was (difficult 1 ←→ 5 simple) (Only group Shopping.)

Simple or rather simple: 33.33 %

Only 9 users in the Shopping group answered the second question, about editing the shopping list. (In the MoC study, the shopping list was not changed during the experiment.) Three of them considered it to be simple or rather simple.

Most of the functions of the touch screen were apparently easy for the participants to find and understand, as reflected in the following results.

2:14 The functions on the screen were easy to find.

Strongly agree or agree: 95 % (group: MoC); 81.82 % (group: Shopping)

2:15 The shopping situation was well explained through the information on the touch screen.

Strongly agree or agree: 65 % (group: MoC); 90.91 % (group: Shopping).

2:16 The screen always informed me about the current state of the shopping trolley.

Strongly agree or agree: 58 % (group: MoC); 72.73 % (group: Shopping)

In summary, the touch screen with its graphical interface was rated as accessible and transparent. Notably, most participants had experienced such interfaces before the study and were more accustomed to them than to the other interface modalities.

7.2.5 Speech input and output

2:17 The commands that can be given with the speech interface are... (appropriate 1 ←→ 5 inappropriate) (Only group Shopping.)

Appropriate or rather appropriate: 45.45 %


2:18 The shopping trolley has understood my speech commands. (Only group Shopping.)

Strongly agree or agree: 27.27 %.

Spoken input was used only in the Shopping study, where it was used in parallel with the touch screen. The speech commands were few, and their efficiency of use cannot be compared with that of the touch screen. There was only a very brief introduction to using the speech interface, and the participants were not native English speakers. Moreover, there were outliers who were unable to utter the English commands and did not get a single command right (this was not just a matter of accent). Better training for the speech interface, including recorded examples of speech input, practice of the different commands and training on the user's voice, could improve the success and acceptance rates.

Still, the spoken output of the trolley was judged comprehensible by more than half of the participants in both groups.

2:19 I understood the speech output of the shopping trolley.

Strongly agree or agree: 65 % (group: MoC); 54.55 % (group: Shopping).

2:20 On the whole, the speech output of the trolley is... (inappropriate 1 ←→ 5 appropriate)

Appropriate or rather appropriate: 55 % (group: MoC), 54.55 % (group: Shopping)

The speech interface used on the trolley was in English. In the open questions about the users' experience of the trolley (see below), several participants expressed that they would have preferred to communicate with the trolley in German. This might be one explanation for the comparatively low ratings above, in addition to the speech recognition problems encountered by several participants.

7.2.6 Movements of the trolley

2:21 The movements of the shopping trolley were...

(smooth 1 ←→ 5 irregular)

Smooth or rather smooth: 5 % (group: MoC), 0 % (group: Shopping)

The question about the movements of the trolley received quite unanimous answers. Almost no participants (5 %) in the MoC group and none in the Shopping group thought that the movements were smooth or rather smooth. This confirms the impressions from the video observations and is also expressed in many of the answers to the open questions.

2:22 I could rely on the shopping trolley to guide me.

Strongly agree or agree: 52.63 % (group: MoC); 81.82 % (group: Shopping).


Even though the robot trolley did not move smoothly, the participants were satisfied with its navigation and guiding, especially those in the Shopping group. Indeed, "guide-me" was one of the commands used in the trials, meaning that the robot moved autonomously together with the user.

7.2.7 Haptic interaction as a way to steer the robot trolley

The last part of the trial in the Shopping study consisted of an introductory test of the haptic handle interface on the InBot. The participants were shown how to hold the handle and steer the trolley in different directions for a short while. They then carried out a series of shopping tasks with this interface, similar to the previous tasks. Technical problems occurred in two of the trials, so the results below cover only 9 participants.

3:3 The handling of the haptic handle is easy to learn.

Strongly agree or agree: 55.6 %

3:4 The trolley is easy to move.

Strongly agree or agree: 22.22 %

Apparently, the way the haptic handle was intended to work was felt to be easy and intuitive. In practice, however, the trolley was often quite hard to steer this way, as the results for 3:4 show. The video observations confirm that the movements of the trolley were sometimes abrupt and irregular.

3:5 It is easy to carry out commands with the haptic handle.

Strongly agree or agree: 12.5 %

3:6 The commands of the haptic handle are easy to learn.

Strongly agree or agree: 37.5 %

The haptic functionality includes some commands in which a certain grip on the handle carries a special meaning; however, these commands were not introduced to the participants. It is therefore hard to interpret the answers to 3:5 and 3:6.

3:7 The shopping trolley was too slow.

Strongly agree or agree: 77.78 %

This result seems to show that it was difficult to move the trolley around at a sufficient speed. Indeed, the observations showed that participants generally steered the trolley at a slow pace, although they could have moved it significantly faster. This result is somewhat puzzling, but it seems clear that using a haptic handle requires more training than the participants received.

3:8 I could move the shopping trolley to where I wanted to go.


Strongly agree or agree: 44.44 %

It is clear that the participants had some difficulties moving the trolley smoothly, so the responses to question 3:8 seem to support those to 3:4 above.

In two open questions, the participants were asked to describe their experience of the haptic functionality. There were rather few answers; all of them are given below.

3:9 I would like the trolley to move in other ways as well, namely:

P4: ”Like a normal shopping trolley, without supporting systems”

P8: ”In parallel to myself”

P7: ”Yes, but I don’t know which ways are possible”

Participant P4 missed the ease of steering an ordinary shopping trolley. P8 suggested that it should be possible to walk beside the trolley and steer it at the same time.

3:10 What do you think in general about such a steering mechanism for a shopping trolley?

Some answers were positive, others were negative and some wanted an improved haptic functionality:

P3: ”Necessary”.

P7: ”Great.”

P8: ”Very helpful, then I have flexibility to go back and forth.”

P1: ”Not meaningful, since the navigation function is missing; I am strong enough and need no servo ← waste of energy.”

P2: ”Automatic ← yes; manual (self) drive - no.”

P5: ”If it works well, it is probably meaningful, but I have rather been ’struggling’ with the trolley.”

P4: ”If it is predictable/similar to the behaviour of a ’normal’ shopping trolley, it is pleasant.”

P9: ”If the functions could be improved, such a shopping trolley could turn out to be useful, but it is not altogether necessary.”

7.2.8 Open questions about the usability and user acceptance of the shopping trolley

Three questions in the questionnaire were open ones addressing the participants' views on the functionality of the shopping trolley. They were answered by both the MoC group and the Shopping group, 31 participants in total. All individual responses are given in the Appendix.


2:23 What did you like about the shopping trolley?

There were three themes that recurred in the responses to this question.

• The trolley locates the products; it guides the user to the products; the user avoids searching for the products.

• Easy handling/intuitive interface

• Advanced and intelligent technology

2:24 What should be improved on the shopping trolley? Are there functions that you miss?

Recurring themes:

• The trolley should move more smoothly.

• It should stop when passing an item on the shopping list.

• German language would be preferred instead of English.

Many interesting suggestions for improvements were proposed; see the Appendix.

2:25 Would you use such a shopping aid in the supermarket? Why/why not?

Group MoC (n=20)
Yes: 8
No: 8
Yes and no, it depends, etc.: 3
No response: 1

Group Shopping (n=11)
Yes: 10
No: 1

Clearly, the participants in the Shopping group were more positive towards using such a trolley themselves. The motivations given for why they would use the trolley were partly the same as the answers to question 2:23 above, including ”products can be found quicker”, ”to try out new technology” and ”it makes the shopping more interesting”.

Among the motivations for why they would not use such a trolley were: ”I simply don’t need it, it is too slow”, ”it is not helpful for my shopping situation, since I usually run quickly through the store and also know where the products are” and ”I am young, I can do it myself”. All responses and motivations can be found in the Appendix.


7.3 Summary and discussion

The surveys have shown a rather mixed picture of the participants' attitudes towards the robot shopping trolley, reflecting that the interaction with the robot was not without problems. We are, after all, dealing with advanced technology in the form of an early prototype; some technical problems therefore occurred now and then, making it difficult to reproduce exactly the same conditions for each run.

In general, about a third of the participants reported that they were satisfied with the behaviour of the shopping trolley. Rating the trolley on a number of subjective dimensions gave mixed results, the strongest being that the technology was seen as very interesting. The trolley interface was easy to learn, but operating the trolley in practice was more difficult. Trying out the haptic handle gave somewhat mixed results: several users reported difficulty in manoeuvring the trolley, though some also appreciated the potential of the technology.

The touch screen interface was ranked as the most preferred one, and its design was felt to be supportive and informative. The speech interface was also somewhat appreciated, but the speech recognition did not always work well, and several participants missed the possibility of speaking their native language (German) when interacting with the trolley. In the Shopping group, participants ranked the speech and haptic interfaces equally high, but lower than the touch screen.

In their answers to the open questions about the trolley's functionality, the participants mentioned some themes particularly often. The most frequently mentioned advantages of the trolley were that it can easily locate the items sought in the store, that it is easy to handle, and that the technology is interesting. With both groups taken together, a majority reported that they would use such a technology if it existed in their supermarket, while others found that it would not be of much use to them in their present situation. Many suggestions for improving the trolley and its interfaces were proposed.


8 Evaluation with stakeholders

8.1 Discussion with industrial stakeholders about the functionalities and in-terface of a robot trolley

A meeting was arranged at KTH in late spring 2010 with 30 representatives of the French industrial network PICOM (Pôle de compétitivité Industrie du Commerce) who were visiting Sweden. The network focuses on research and cooperation in the area of ”commerce in the future”, in particular innovation within retailing and distribution. The members are mostly industrial executives and researchers, and the visit was arranged in collaboration with the French embassy in Stockholm. The group was particularly interested in learning about the CommRob project, which was presented together with another KTH project. This was also an opportunity for the CommRob team to obtain responses to the project's ideas and results from an important stakeholder group.

As part of this two-hour meeting, two focus group sessions were arranged around the topic of having a robotic shopping trolley in real stores. Each session started with a video presentation in which a person performs shopping tasks with the shopping trolley in the laboratory. The purpose of the ensuing discussion was to elicit the group's reactions to the functionalities of a robot shopping trolley. The discussion focused on 1) whether the shopping trolley could become a product, and 2) what the most suitable modalities are for communicating with the robot.

The prototype evoked considerable interest, and many comments and questions were raised by the groups. The following are some of the themes that were brought up.

• Elderly and disabled persons are a potential target group for such a trolley.

• The idea of a robot trolley is interesting for project shopping, where it would be helpful to pack the products in the appropriate order and calculate the best possible path.

• The robot trolley is also interesting for heavy products or large quantities of goods.

• It is necessary to change the infrastructure of the shops in order to host robot trolleys.

• A supermarket is a noisy place. It would be difficult to use a speech interface.

• Nowadays you can localize products in the store with a mobile phone. You don’t need a robot todo it.

• The robot prototype with all the technological components looks a bit clumsy.

• Entering the shopping list seems to take time. How do you remove products?

• Points of concern: battery management (the tablet needs to be replaced and put in a charger); difficulty using the trolley outside in snow and rain.

• What happens when a robot meets another robot?


9 Exploratory study of the Walking Aid

9.1 Introduction

In the project, we have explored how a robotic shopping trolley, apart from its main functions of carrying goods and guiding to locations in the store, could have a second function as a walking aid.1 Most other projects dealing with something similar focus solely on the walking function and have implemented the walking aid as a primary, single function, either by assisting with the driving using the handle as input [3, 47] or by assisting visually impaired users to a location using a physical interface modelled after the leashes of guide dogs, where the user grabs the handle and follows the robot [33, 34].

In order to carry out a formative evaluation of the walking aid (Figure 9.1), we conducted a small user study in which two CommRob staff members, untrained on the walking aid, drove the robot.

9.2 Setup and task

The system we are investigating is intended for sighted users and is controlled using a set of buttons placed underneath the handles (see Figure 9.1). How these buttons operate is shown in Figure 9.2. The robot used was the CoRod demonstrator, modified with a pair of handles.

As the prototype is motorised and at a very early stage, we did not let people with actual walking impairments use the system. If this is to take place, proper safety measures need to be in place, something which is out of the scope of this project.

The participants were introduced to the notion of a walking aid. They were also shown how to operatethe robot using the buttons on the handles (see Figure 9.2).

Using a kind of think-aloud protocol, the test persons reported which of the functionalities they attempted to use. They then had to perform the tasks depicted in Figure 9.3. The first task was to drive a slalom course to a point located on the map (see Point 1). Once at Point 1, they were instructed to drive backwards to Point 2. The grey boxes shown in the figure are empty cartons measuring about 60 x 60 x 60 cm. The participants were informed that they should not touch the boxes with the robot.

After they had performed the task, the participants received and filled out a questionnaire.

¹ This is detailed in D6.2.

CommRob IST-045441: Advanced Behaviour and High-Level Multimodal Communication with and among Robots


Report on evaluation of the robot trolley – D7.5: Exploratory study of the Walking Aid

Figure 9.1: The Walking Aid

9.3 Results

Both participants managed to learn to drive the robot using the interface, and they also succeeded with the driving task. Since only two people took part in the study, it makes little sense to provide quantitative data. There were some open questions in which the participants expressed their opinions:

“[I] would like the direction button on the other hand.”

“A bigger turning radius would be better for handicapped people.”

“The global behaviour is easy to understand even if a few tries are needed to learn the commands.”

“[It’s] easier to learn by using than by reading instructions.”

The construction was considered stable and pleasant to grasp, and its handling and use easy to learn. Interestingly, the instructions presented through the illustration in Figure 9.2 were considered less effective than just trying the system out. Still, the optimal placement of the buttons could be further studied.


Figure 9.2: Walking Aid control scheme (CoRod and Walking Aid shown in bird’s-eye view). Buttons 1 and 2 are on the left handle and Buttons 3 and 4 on the right handle; together they provide the following functionality: shift robot left/right, move forward/backward, turn left/right, and backward turn left/right. Precondition: the walking-aid handles have to be lowered.
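The control scheme can be sketched as a simple lookup from pressed-button combinations to motion commands. Note that which specific combination triggers which of the eight motions is an assumption made for illustration; only the four buttons, the set of motions, and the handles-lowered precondition come from Figure 9.2.

```python
# Hypothetical sketch of the Walking Aid button logic from Figure 9.2.
# The pairing of combinations to motions is assumed, not taken from the
# CommRob software.

BUTTON_MAP = {
    frozenset({1}):    "shift robot left",
    frozenset({3}):    "shift robot right",
    frozenset({1, 3}): "move forward",
    frozenset({2, 4}): "move backward",
    frozenset({1, 4}): "turn left",
    frozenset({2, 3}): "turn right",
    frozenset({2}):    "backward turn left",
    frozenset({4}):    "backward turn right",
}

def walking_aid_command(pressed_buttons, handles_lowered):
    """Map the currently pressed handle buttons to a motion command.

    Per the precondition in Figure 9.2, no motion is issued unless the
    walking-aid handles are lowered."""
    if not handles_lowered:
        return "stop"
    return BUTTON_MAP.get(frozenset(pressed_buttons), "stop")
```

Any unrecognised combination, or raised handles, maps to a safe stop, which matches the safety emphasis of the pilot study.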


Figure 9.3: The map showing the task that was used for the pilot study; Point 1 marks the end of the slalom course and Point 2 the target of the backwards drive.


10 Exploratory study of a shared-shopping scenario

10.1 Introduction

In the DoW (Description of Work), cooperative robot behaviours are discussed both in relation to navigation and physical path planning, and with respect to the organisation of users’ tasks, i.e., collaborative shopping. Whereas other approaches to collaboration during shopping usually stress the collaboration between human and robot [30], this work focuses on the collaboration between humans who use the robot as a tool. Such collaboration is possible or necessary in other tasks as well, such as fetch-and-carry, e.g. [27].

The communication platform provides support for sharing shopping lists between two robots. “Shared shopping” means that two persons buy products in a supermarket together. In such a scenario, each person has their own trolley, which can in principle be used alone, as in single shopping with a robot trolley. The trolleys are equipped with a multimodal UI with a touch screen. When the persons decide to shop together, they link their robots and then share one shopping list represented on both screens. When a person edits this shared shopping list or scans a product while shopping, their robot trolley informs the other one, and vice versa.
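The linking and mutual-notification behaviour described above can be sketched as follows. The `Trolley` class and its method names are hypothetical stand-ins for the real communication platform, which carries these updates over a wireless network between the trolley computers.

```python
# Minimal sketch of shared shopping lists between two linked trolleys.
# Names are illustrative; they are not taken from the CommRob code.

class Trolley:
    def __init__(self, name):
        self.name = name
        self.shopping_list = set()  # items still to buy
        self.basket = []            # items already scanned (bought)
        self.peer = None            # the linked trolley, if any

    def link(self, other):
        """Link two trolleys so that they share one shopping list."""
        self.peer, other.peer = other, self
        shared = self.shopping_list | other.shopping_list
        self.shopping_list = set(shared)
        other.shopping_list = set(shared)

    def add_item(self, item):
        """Edit the shared list; the peer trolley is informed."""
        self.shopping_list.add(item)
        if self.peer:
            self.peer.shopping_list.add(item)

    def scan_item(self, item):
        """Scanning confirms buying: the item leaves both shared lists."""
        self.basket.append(item)
        self.shopping_list.discard(item)
        if self.peer:
            self.peer.shopping_list.discard(item)
```

For example, after `a.link(b)`, an item added on trolley `a` appears on trolley `b`’s list, and an item scanned on either trolley disappears from both lists.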

In order to evaluate how the concept of such shared shopping lists can be used for collaborative shopping, we carried out an exploratory study, with a special focus on how the user interfaces to such a system are received by potential users.

More precisely, we ran it in two different contexts. In one, we used our simulator, so that the two untrained users each sat in front of a GUI screen. Each screen was connected with the top layer and the simulator. The other context was a lab set up as a small supermarket. The leader of the evaluation used a conventional (non-motorised) shopping trolley equipped with a touch screen and a computer for the top layer and application logic, like the CommRob Trolleys. The untrained user, however, used the InBot Trolley.

10.2 Setup and task

Now let us describe the setup and task for this exploratory study in more detail. While the setup was clearly different in the different contexts, the given task was the same.

The first context, based on the simulator, simply used two PCs disconnected from any robots. It was set up at TUW in Vienna, where only one of the three identical touch screen computers was available (the other two were mounted on the two CommRob robots in the FZI lab). We therefore had to use a substitute for the second GUI and chose a regular laptop PC with a mouse. Still, the graphical user interface used on the CommRob demonstrators was installed on both PCs. Each computer also had a bar-code scanner attached to it.

CommRob IST-045441Advanced Behaviour and High-Level Multimodal Communication with and among Robots

page 41/67

Report on evaluation of the robot trolley – D7.5Exploratory study of a shared-shoppingscenario

Both PCs were placed on tables in such a way that the participants were able to talk with each other (much as they might in a supermarket) but could not see each other’s screens. Due to logistical and time constraints, no mobile platforms were involved.

To connect the two interfaces and the underlying top layer (including the application logic of the CommRob robots), the Middle Layer Interface Simulator, as described in D6.3, was used to provide a visualisation of robot navigation and product locations. Both virtual robots “moved” in this virtual supermarket to specified virtual products, and these movements were shown in a navigation window of the GUI, just as when the real robots move in the physical world.

We invited 10 participants to the TUW laboratory, where they were formed into pairs of collaborative shoppers. The participants were students and employees at TUW (1 female, 9 male, average age 27.9). Initially, each participant filled out a form providing us with the necessary permission to make video recordings of the user study. After this, each participant pair was introduced to the graphical user interface and shown how to operate the bar-code scanner.

Scanning of a product was done on a sheet of paper placed beside each screen. Each participant was then given an identical list of 8 products and instructed to use their laptop/touch screen to enter the products and collaboratively “shop” for them.

The second context was provided in the FZI lab, partly set up as a (small) supermarket, similarly to the setup of previous studies. InBot was available for one untrained participant, who was guided to products by it. Instead of a second robot, we used eTrolley, configured as a more or less conventional shopping trolley but with a touch screen and computer for the top layer and application logic like the CommRob Trolleys. While eTrolley cannot, of course, actively move to a selected product, it can give directions, and it can communicate with its user and with InBot based on the same communication platform. The machine-machine communication is carried over a wireless network. The leader of the evaluation himself executed the shopping task on eTrolley.

In fact, the same task was given in both contexts. The participants were told that they had to perform shopping together, using a shared-shopping application installed on the GUI. In the simulator context, they were additionally informed that such screens are originally installed on two real robots at FZI/Karlsruhe and that these robots could be run in a supermarket-like laboratory; for the purpose of this study, they should imagine that the screens were installed on real robots. In both contexts, the participants were then motivated to assume that they wanted to buy products from a shopping list and that they wanted to perform the shopping together, supported by the shared-shopping application. In order to complete the given mission, they had to shop for all the products on the list, something which needed to be coordinated between the participants, either by using the screen as an information source or by talking to each other.

Then we described to them how they could

1. start the system,

2. add items to the shopping list,

3. use the guide-me button to “move” to products in the store,

4. use the bar-code scanner to confirm buying an item, and


5. use the “go-to-checkout”-button.
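The five steps above amount to a small session workflow, which can be sketched as follows. The class and method names are illustrative, not taken from the CommRob code, and `guide_me` simplifies the real robot navigation to returning the next remaining product.

```python
# Hypothetical sketch of the five-step shopping workflow demonstrated
# to the participants; the real GUI exposes these steps as buttons.

class ShoppingSession:
    def __init__(self):
        self.state = "idle"
        self.shopping_list = []
        self.bought = []

    def start(self):                      # step 1: start the system
        self.state = "shopping"

    def add_item(self, item):             # step 2: edit the shopping list
        self.shopping_list.append(item)

    def guide_me(self):                   # step 3: "move" to a product
        """Return the next product on the list not yet bought
        (standing in for the robot guiding the user there)."""
        remaining = [i for i in self.shopping_list if i not in self.bought]
        return remaining[0] if remaining else None

    def scan(self, item):                 # step 4: confirm buying an item
        self.bought.append(item)

    def go_to_checkout(self):             # step 5: only once the list is done
        if all(i in self.bought for i in self.shopping_list):
            self.state = "checkout"
        return self.state
```

A session thus cycles between steps 3 and 4 until every listed product has been scanned, at which point the checkout step succeeds.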

During the demonstration, the participants were instructed in steps and could immediately try out the interface by clicking the buttons (either with their fingers or using the mouse on the laptop computer). We also pointed out the existence of a navigation window in the upper right of the GUI where they could follow the positions of the robots.

Once the participants had finished shopping for the products, they proceeded to checkout. Another (older and larger) touch screen PC was used by both participants for the checkout; identical PCs were used for this purpose in both contexts. After the checkout, the task was finished and the participants were given a questionnaire to assess their experience of the interface in this collaborative shared-shopping scenario.

10.3 Results

All five pairs (Pair 1–Pair 5) in the simulator context, as well as the pair consisting of the evaluator and the untrained participant in the robot context, were able to fulfil the task of collaborative shopping and the subsequent checkout at the cashier touch screen. This means that they were able to add the items to the shared shopping list and to activate the system so that the (simulated) robot could guide them to the products. One participant in the first session (Pair 1) experienced technical problems due to a configuration problem with a bar-code scanner; this was fixed, and the scanner worked in the following trials. Note that all participants said that they easily understood the linking/unlinking of two robots in this scenario.

Figure 10.1: Shared Shopping is a useful application.

Figures 10.1 and 10.2 show some results from the filled-in questionnaires. We may interpret Figure 10.1 as indicating slightly more agreement than disagreement on whether shared shopping is a useful application, though this is far from conclusive. Likewise, Figure 10.2 shows slightly more agreement than disagreement on whether the participants would use such applications with two linked robots in practice; this, too, is far from conclusive.

We should also interpret these results with great care, as they did not involve the use of real robots. It seems as though the users expect that the robots should be able to provide support that is directly related to how to solve the task. This is interesting, as in a real shop the task would not be to enter and tick off items from a list, but to navigate between actual products with robots and users possibly out of sight. Knowing who goes where and gets what product seems to be of great importance to the participants.


Figure 10.2: I would use such applications with 2 linked robots in practice as well. yes=6, no=4 (N=10)

In fact, the participant in the robot context fully agreed with the usefulness of shared shopping and that he would use such applications with two linked robots in practice as well. He also seemed to understand better than the participants in the simulator context that the guidance was towards the nearest product on the list not yet taken.

We prepared several additional questions for the trial in the robot context. For instance, the participant in this trial fully agreed that shared shopping is fun. He was undecided, however, about the support of communication with the shopping partner through this shared-shopping application. Still, he slightly agreed that it provides a new communication experience, and fully agreed that it also supports the communication with information about the locations of the products to buy. Compared with using mobile phones or SMS/emails for the same purpose, he rated mobile phones slightly better than the shared-shopping application but SMS/emails slightly worse.

In summary, our new support for cooperative shopping through shared shopping lists (implemented through high-level robot-robot communication) seems to be both interesting to and essentially usable by untrained users, as well as promising for shopping in the future. It certainly needs to be further improved and studied, but we may have already made a step into Robot-Supported Cooperative Work (RSCW) [37], as an extension of Computer-Supported Cooperative Work (CSCW).


11 Conclusions

The evaluation process in CommRob during P3 has been driven by close cooperation between representatives of robotics and of human-robot interaction. During a series of dedicated weeks (one week for integration, followed by a week of evaluation), efforts were made to assemble the resources required for inviting end users into the laboratory and letting them evaluate the new technology of a robotic shopping trolley. The evaluations were preceded by a number of pilot trials in which the robot was gradually adapted to fit the needs of the study setting.

A special interest throughout the project has been the mechanisms for spatial adaptation between robot and human. In human-human interaction, there are intricate rules for how we approach one another and share the social space. This topic was also central in the Cogniron project, in which some of the CommRob partners took part. In CommRob, there has been a particular focus on the notion of the motion cue, or “movement as communication”, i.e. how a robot can be designed to use spatial and motion cues in addition to other modalities in communication with the user. A controlled experiment was performed with 20 users, aiming to investigate the impact of a motion cue on users’ behaviour when passing a certain product in the store. The study confirmed that the message about incidentally passing by a product from the shopping list got across more strongly if the robot trolley slowed down as a motion cue, in addition to announcing it with speech and GUI, than without this motion cue.

We have also performed a study intended to evaluate the users’ experience when performing shopping tasks with the robot. Eleven users took part in a setting similar to the first study, taking place in a simulated store environment. The results showed that all users were able to complete their shopping tasks with the robot. A questionnaire survey was administered to participants in both studies, aiming to assess the users’ acceptance of and satisfaction with the robot trolley and its multimodal interface. The results showed that about a third of the participants in both groups were satisfied with the behaviour of the trolley, and most of them thought that the robot was interesting. A particular advantage mentioned by many users was that the robot made it easy to locate products in the store. There were mixed results on certain usability issues; in particular, the robot’s movements were often not as smooth as users expected.

Exploratory studies of the Walking Aid and of the shared-shopping scenario have also been performed. Although limited in scope, these studies point at new interaction forms within HRI and demonstrate their principal feasibility.

To complement these studies, we performed a small study including two focus group sessions with industrial stakeholders, i.e. a network of executives from the retail industry who were visiting Scandinavia. They were very interested in the CommRob trolley, and the discussion centred on the potential for the trolley to work in a real store. Although this was only a single opportunity for discussion, several interesting issues were raised and important contacts were made.

Despite these limitations, we think that the robot prototype has been thoroughly tested and evaluated with untrained users, and important insights have been gained, especially with regard to the multimodal interface. The concept of the motion cue is clearly a promising one. Speech is also promising in the shopping context, but the problems with noise and poor recognition rates have to be solved. Speech-out turned out


to be very valuable, but speech-in has its usual limitations. Furthermore, the haptic steering needs to be studied in a setting where users have the opportunity for more extensive training. Participants in our studies generally ranked the touch screen highest among the four modalities in the shopping setting.

Altogether, these results show that users and stakeholders find the technology quite interesting and innovative, but there are clearly many possibilities for improvement. Moreover, these studies have raised a number of new research challenges for HRI, involving robots and humans interacting and communicating in a new but at the same time familiar setting.


List of Figures

3.1 Graphical representation of the evaluation activities

4.1 GUI screen while robot performs motion cue.
4.2 Motion cue experiment setup in our laboratory.
4.3 Participant stopping the robot via GUI after a motion cue.
4.4 Frequencies of products taken in the first trial.
4.5 Interaction graph of robot’s behaviour and trial.
4.6 Rating frequencies of the statement “The robot cart got me to execute my planned actions only after the suggested actions” from ‘totally agree’ (= 1) to ‘totally disagree’ (= 5).

6.1 The test leader instructing a test person on how to use the speech interface. The background shows the shelves with products.
6.2 Task completion, mean duration (mm:ss) per trial person of complete shopping session.
6.3 Task completion, mean duration of task (mm:ss) per trial person.
6.4 Task completion, number of shopped items (#) per trial person. Max=19, min=9.

9.1 The Walking Aid
9.2 Walking Aid
9.3 The map showing the task that was used for the pilot study.

10.1 Shared Shopping is a useful application.
10.2 I would use such applications with 2 linked robots in practice as well. yes=6, no=4 (N=10)


A Histogram figures


B Further questionnaire data from MoC and Shopping studies

Open answers, questions 2:23-2:25.

Question 2:23: What did you like about the shopping trolley?

MoC Study (n=20)

1: Intuitive handling, ”something new”

2: Transportation to all products in an unknown store (saves time).

3: See D3 (Durchgang 3)

4: Very interesting, but for me it is useless.

5: Generally it is helpful to be informed about the position of the products.

7: It was practical and automatized.

8: That it finds the products. The screen is quite easy to use. The possibility to directly scan the products.

9: It can run by itself and it tells when it has found the item.

10: Simple to use, can be very helpful for blind persons.

11: Simple handling.

12: The shopping trolley is doubtlessly more useful with full shelves than in the trial.

13: The idea that one does not have to push the trolley, and the guiding to the products (avoiding tedious searches for the desired products).

14: (1) It knew its way better than me; (2) Research technology, (3) It really does things quite well(?)

15: Touchscreen, voice, scanner

16: The touchscreen; the scanner

17: Simple product search, to see the price of the products that are in the trolley. Total price?

18: The turnability (movable in all directions)

19: It knows the product (??)¹

¹ ‘??’ = handwriting is not clear.

CommRob IST-045441Advanced Behaviour and High-Level Multimodal Communication with and among Robots

page 58/67

Report on evaluation of the robot trolley – D7.5Further questionnaire data from MoC and Shopping studies

20: The idea to spare oneself of the product search in a supermarket.

Shopping Study (n=11)

1: To find products in unknown territory.

2: Tone pleasant, Menu well arranged and easy

3: Tells when it passes an item from the list. Map of where the products are.

4: Guiding mode; speech input; remarks about passing other products on the shopping list.

5: To be guided; informs when a product is reached.

6: To find products without tedious searching; my products are being carried, so I do not have to push a shopping cart.

7: Everything. (A smaller barcode reader to hang around your neck or in the form of a ring would be nicer and easier to handle.)

8: Simple to handle and intelligent.

9: The items are well shown on the screen with respect to where they are located.

10: Navigation to products.

11: Simple to handle.

Question 2:24: What should be improved on the shopping trolley? Are there functions that you miss?

MoC Study (n=20)

1: A smoother drive, no unexpected movements

2: Smoother drive (without bumps); reverse drive; German language.

3: See D3.

4: Semiautomatic drive.

8: The movements could be smoother. When it says “You are passing a product” it should stop. A smoother acceleration and braking.

9: It lacks perhaps a function that allows you to say that you want two of this item. Improvement: The stop should be precisely in front of the item.

10: It should not go so fast. The trolley could pick up the products itself.

11: Shopping list to be done while still at home; stopping in front of the product; monitor the distance to the user; slow acceleration.

CommRob IST-045441Advanced Behaviour and High-Level Multimodal Communication with and among Robots

page 59/67

Report on evaluation of the robot trolley – D7.5Further questionnaire data from MoC and Shopping studies

12: Impression: one follows a plan to a far greater extent. Spontaneous shopping becomes more difficult. To enter categories instead of products.

13: Explicit stopping, when one passes a product on the shopping list (with notification).

14: Supermarket ground plan with product markers; start/stop also from other places than behind/by the screen; to sort the shopping list according to path; going faster on long distances.

15: Help (?): Unpredictable navigation; bad timing; acceleration/braking in too few steps; too slow; does not take account of when I stop, I must adapt to it (should be reversed).

16: Giving out routes like in car navigation; no motorization; a map of the supermarket on the display.

17: Slow (??) of the trolley is not optimal and unexpected. RFID would be an advantage.

18: Instruction about the scanner.

19: <Transcription not possible.>

20: Ground plan; it should tell me what it plans next for me (direction, products etc.)

Shopping Study (n=11)

1: German; Showing the accumulated bill at the check-out counter.

2: Movement; Function: calling out specific products by speech.

3: Smoother drive.

4: To edit the shopping list should be easier; generally the system should act more quickly/it seems rather slow.

5: I hardly managed to deal with the handle function, and became impatient with it.

6: The shopping trolley should slide more smoothly, it often causes an uneasy feeling; the height should be optimized also for smaller persons.

7: Announcement of special offers; partial sum of the price of items bought so far; payment functionality for credit/EC card or cash.

8: The direction where the products are. Perhaps if there was a shelf with 4 levels, how do I know where the product is located?

9: The shaky movement and the difficult driving of the shopping trolley.

10: a) The description of the products bought so far is difficult to understand; perhaps a list of the products that were already taken from the shelf. b) One should decide on only a few spoken commands; “Guide me to next product” should also do exactly this and not try to recognize the name of the product. c) Sorting functions.

11: It moves in too shaky a fashion.

CommRob IST-045441Advanced Behaviour and High-Level Multimodal Communication with and among Robots

page 60/67

Report on evaluation of the robot trolley – D7.5Further questionnaire data from MoC and Shopping studies

Question 2:25: Would you use such a shopping aid in the supermarket? Why/why not?

MoC Study (n=20)

Yes: 8
No: 8
Yes and no, depends, etc.: 3
No response: 1

Comments on why/why not:

1 (yes) Products can be found quicker, paths optimized

2 (yes) To save time in unknown stores.

3 (yes) To try out new technology

7 (yes) For fun

9 (yes) It is easier when we don’t know where the items are.

13 (yes) To some extent such a trolley could make the shopping more efficient (shorter paths, goal direction). In addition, not to have to push the trolley is comfortable and keeps both hands free.

17 (yes) To quickly find products that I need.

18 (yes) It makes the shopping more interesting; one does not have to search any more.

4 (no) I simply don’t need it; too slow.

6 (no) It is not helpful for my shopping situation, since I usually run quickly through the store and also know where the products are.

8 (no) I am young. I can do that myself. I can also control everything.

10 (no) I don’t need these aids.

12 (no) I know my supermarket and know where everything is. I am therefore faster with conventional shopping.

15 (no) See 24.

16 (no) Efficient shopping is not yet possible; I prefer my own tempo.

20 (no) I really know what I buy and where the products are in the store. Therefore the trolley would only be an obstacle for me.

11 (yes and no) I mostly go off with a backpack, too much space (?). With a car and large amounts of shopping, however, I would be happy to use it.


14 (it depends) When I know my way in the store, then I am quicker without the shopping trolley. When I have no plan of the supermarket, the trolley can be helpful.

19 (yes and no) Yes: at an unknown supermarket it is certainly helpful, though hardly for casual shopping. No: the pleasure of spontaneous shopping is lost.

Shopping Study (n=11)

Yes: 10
No: 1

Comments on why/why not:

1 (yes) Because I save time in unfamiliar stores.

2 (yes) In large stores very helpful when trying to find something new. In small stores superfluous.

3 (yes) Simplifies the shopping.

4 (yes) Out of curiosity; hoping that it makes the shopping quicker.

5 (yes) Because of the function with the touchscreen. It is good, I think, when you can “drive off” a previously planned shopping list.

7 (yes) Because it is fun.

8 (yes) In large stores the shopping trolley can reduce the time for the shopping, by using the fastest route :)

10 (yes) Efficient shopping; a liking for play

11 (yes) Quicker to find the groceries, less use of energy (we are all a bit lazy:)

9 (no) It takes too long until I am guided to the item I want.


Bibliography

[1] Julie A. Adams, Pramila Rani, and Nilanjan Sarkar. Mixed initiative interaction and robotic systems. In AAAI-04 Workshop on Supervisory Control of Learning and Adaptive Systems, Technical Report WS-04-10, 2004.

[2] Lauralee Alben. Quality of experience: defining the criteria for effective interaction design. interactions, 3(3):11–15, 1996.

[3] R. Annicchiarico, C. Barrué, T. Benedico, F. Campana, U. Cortés, and A. Martínez-Velasco. The i-Walker: an intelligent pedestrian mobility aid. In Proceedings of the 2008 Conference on ECAI 2008, pages 708–712, Amsterdam, The Netherlands, 2008. IOS Press.

[4] M. M. Bakhtin. Towards a Philosophy of the Act. University of Texas Press, Austin, Texas, 1993.

[5] Olav W. Bertelsen and Sören Pold. Criticism as an approach to interface aesthetics. In NordiCHI ’04: Proceedings of the Third Nordic Conference on Human-Computer Interaction, pages 23–32, New York, NY, USA, 2004. ACM Press.

[6] Cristian Bogdan, Anders Green, Helge Hütenrauch, Minna Räsänen, and Kerstin Severinson Eklundh. Cooperative design of a robotic shopping trolley. In Proceedings of COST298: The Good, the Bad and the Challenging, Copenhagen, Denmark, May 11–13, 2009.

[7] Cynthia Breazeal and Paul Fitzpatrick. That certain look: Social amplification of animate vision. In Proceedings of the AAAI Fall Symposium, Socially Intelligent Agents – The Human in the Loop, pages 3–5, 2000.

[8] S. E. Brennan and E. Hulteen. Interaction and Feedback in a Spoken Language System: A Theoretical Framework. Knowledge-Based Systems, 8:143–151, 1995.

[9] Timothy Brick and Matthias Scheutz. Incremental natural language processing for HRI. In HRI ’07: Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction, pages 263–270, New York, NY, USA, 2007. ACM Press.

[10] Sylvain Calinon and Aude Billard. Incremental learning of gestures by imitation in a humanoid robot. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI ’07), pages 255–262, New York, NY, USA, 2007. ACM.

[11] Justine Cassell and Kris R. Thórisson. The Power of a Nod and a Glance: Envelope vs. Emotional Feedback in Animated Conversational Agents. Applied Artificial Intelligence, 13:519–538, 1999.

[12] K. Dautenhahn, editor. Robots in the Wild: Exploring Human-Robot Interaction in Naturalistic Environments. Special issue of Interaction Studies: Social Behaviour and Communication in Biological and Artificial Systems, 10(3), 2009.

[13] J. Dewey. Art as Experience. Perigee, 1934.


[14] Carl F. DiSalvo, Francine Gemperle, Jodi Forlizzi, and Sara Kiesler. All robots are not created equal: the design and perception of humanoid robot heads. In Proceedings of the Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques, pages 321–326. ACM Press, 2002.

[15] Jürgen Falb, Sevan Kavaldjian, Roman Popp, David Raneburger, Edin Arnautovic, and Hermann Kaindl. Fully automatic user interface generation from discourse models. In Proceedings of the 13th International Conference on Intelligent User Interfaces (IUI ’09), pages 475–476, New York, NY, USA, 2009. ACM.

[16] Terrence Fong, Illah Nourbakhsh, and Kerstin Dautenhahn. A survey of socially interactive robots. Robotics and Autonomous Systems, 42(3-4):143–166, 2003.

[17] Terrence Fong, Illah Nourbakhsh, and Kerstin Dautenhahn. A survey of socially interactive robots. Robotics and Autonomous Systems, 42(3-4):143–166, 2003.

[18] Nico Frijda. Recognition of emotion. Advances in Experimental Social Psychology, 4, 1969.

[19] M. Göller, T. Kerscher, J. M. Zollner, R. Dillmann, M. Devy, T. Germa, and F. Lerasle. Setup and control architecture for an interactive shopping cart in human all day environments. In Proceedings of the International Conference on Advanced Robotics (ICAR ’09), pages 1–6, June 2009.

[20] Anders Green. The need for contact and perception feedback to support natural interactivity in human-robot communication. In Proceedings of the 16th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2007), pages 552–557, Jeju, Korea, August 26–29, 2007.

[21] Anders Green and Helge Hüttenrauch. Making a Case for Spatial Prompting in Human-Robot Communication. In Multimodal Corpora: From Multimodal Behaviour Theories to Usable Models, workshop at the Fifth International Conference on Language Resources and Evaluation (LREC 2006), Genova, Italy, May 22–27, 2006.

[22] Anders Green, Helge Hüttenrauch, and Kerstin Severinson Eklundh. Applying the Wizard-of-Oz framework to Cooperative Service Discovery and Configuration. In Proceedings of the 13th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN 2004), pages 575–580, September 20–22, 2004.

[23] M. Hassenzahl. The thing and I: Understanding the relationship between user and product. In M. A. Blythe, K. Overbeeke, A. F. Monk, and P. C. Wright, editors, Funology: From Usability to Enjoyment, pages 31–42. Kluwer Academic Publishers, Dordrecht, NL, 2003.

[24] Hui-Min Huang. Autonomy Levels for Unmanned Systems (ALFUS) Framework: Safety and Application Issues. In Proceedings of the Performance Metrics for Intelligent Systems (PerMIS) Workshop, Gaithersburg, MD, USA, August 2007.

[25] Hui-Min Huang, Kerry Pavek, Brian Novak, James Albus, and Elena Messina. A framework for autonomy levels for unmanned systems (ALFUS). In Proceedings of AUVSI’s Unmanned Systems North America 2005, Baltimore, Maryland, USA, June 2005.

[26] Helge Hüttenrauch. From HCI to HRI: Designing Interaction for a Service Robot. PhD thesis, KTH Royal Institute of Technology, 2007.


[27] Helge Hüttenrauch and Kerstin Severinson Eklundh. To Help or Not to Help a Service Robot. In Proceedings of the 12th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN 2003), Millbrae, CA, USA, 2003. IEEE.

[28] Hermann Kaindl, Jürgen Falb, and Cristian Bogdan. Multimodal communication involving movements of a robot. In CHI ’08 Extended Abstracts on Human Factors in Computing Systems, pages 3213–3218, New York, NY, USA, 2008. ACM.

[29] Takayuki Kanda, Hiroshi Ishiguro, Tetsuo Ono, Michita Imai, and Ryohei Nakatsu. Development and Evaluation of an Interactive Humanoid Robot “Robovie”. In IEEE International Conference on Robotics and Automation (ICRA 2002), pages 1848–1855, 2002.

[30] Takayuki Kanda, Masahiro Shiomi, Zenta Miyashita, Hiroshi Ishiguro, and Norihiro Hagita. An affective guide robot in a shopping mall. In HRI ’09: Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, pages 173–180, New York, NY, USA, 2009. ACM.

[31] Sevan Kavaldjian, David Raneburger, Jürgen Falb, Hermann Kaindl, and Dominik Ertl. Semi-automatic user interface generation considering pointing granularity. In Proceedings of the 2009 IEEE International Conference on Systems, Man and Cybernetics (SMC 2009), San Antonio, TX, USA, October 2009.

[32] Abdelaziz Khiat, Masataka Toyota, Yoshio Matsumoto, and Tsukasa Ogasawara. Investigating the relation between robot bodily expressions and their impression on the user. In Proceedings of the 11th International Conference on Intelligent User Interfaces (IUI ’06), pages 339–341, New York, NY, USA, 2006. ACM.

[33] Vladimir Kulyukin, John Nicholson, and Daniel Coster. ShopTalk: toward independent shopping by people with visual impairments. In Assets ’08: Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility, pages 241–242, New York, NY, USA, 2008. ACM.

[34] Vladimir A. Kulyukin and Chaitanya Gharpure. Ergonomics-for-one in a robotic shopping cart for the blind. In HRI ’06: Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction, pages 142–149, New York, NY, USA, 2006. ACM.

[35] Hideaki Kuzuoka, Yuya Suzuki, Jun Yamashita, and Keiichi Yamazaki. Reconfiguring spatial formation arrangement by robot body orientation. In Proceedings of the 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI ’10), New York, NY, USA, 2010. ACM.

[36] Tomomasa Satō, M. Wada, and Y. Kakazu. Expression of emotion and intention by robot body movement. In Intelligent Autonomous Systems, IAS-5, 1998.

[37] T. Machino, Y. Nanjo, Y. Yanagihara, S. Iwaki, H. Kawata, and K. Shimokura. Proposal of robot-supported cooperative work: remote-collaboration system based on a shared field of view. Journal of the Robotics Society of Japan, 24(7):830–837, October 2006.

[38] J. McCarthy and P. Wright. Technology As Experience. The MIT Press, 2004.

[39] Marek Michalowski, S. Sabanovic, and Reid Simmons. A Spatial Model of Engagement for a Social Robot. In 9th IEEE International Workshop on Advanced Motion Control, pages 762–767, March 2006.


[40] Marek P. Michalowski, Selma Sabanovic, and Hideki Kozima. A dancing robot for rhythmic social interaction. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI ’07), pages 89–96, New York, NY, USA, 2007. ACM.

[41] R. Moratz, T. Tenbrink, K. Fischer, and J. Bateman. Spatial knowledge representation for human-robot interaction. In C. Freksa, C. Habel, and K. F. Wender, editors, Spatial Cognition III, pages 263–286, Berlin, 2003. Springer Verlag.

[42] Jakob Nielsen and Thomas K. Landauer. A mathematical model of the finding of usability problems. In CHI ’93: Proceedings of the INTERACT ’93 and CHI ’93 Conference on Human Factors in Computing Systems, pages 206–213, New York, NY, USA, 1993. ACM.

[43] D. A. Norman, A. Ortony, and D. M. Russell. Affect and machine design: Lessons for the development of autonomous machines. IBM Systems Journal, 42(1):38–44, 2003.

[44] Donald A. Norman. Emotional Design: Why We Love (or Hate) Everyday Things. Basic Books,May 2005.

[45] D. Perzanowski, A. C. Schultz, W. Adams, E. Marsh, and M. Bugajska. Building a multimodal human-robot interface. IEEE Intelligent Systems, 16(1):16–21, January–February 2001.

[46] Jennifer Preece, Yvonne Rogers, and Helen Sharp. Interaction Design. Wiley, January 2002.

[47] Thomas Röfer, Tim Laue, and Bernd Gersdorf. iWalker: an intelligent walker providing services for the elderly. In Technically Assisted Rehabilitation 2009, Berlin, Germany, 2009.

[48] Martin Saerbeck and Christoph Bartneck. Perception of affect elicited by robot motion. In Proceedings of the 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI ’10), pages 53–60, New York, NY, USA, 2010. ACM.

[49] Matthias Scheutz, Paul Schermerhorn, James Kramer, and David Anderson. First steps toward natural human-like HRI. Autonomous Robots, 22(4):411–423, 2007.

[50] Jean C. Scholtz. Theory and Evaluation of Human-Robot Interaction. In Proceedings of the 36th Hawaii International Conference on System Sciences (HICSS), Waikoloa, Hawaii, January 6–9, 2003.

[51] Simon Schütte. Engineering Emotional Values in Product Design: Kansei Engineering in Development. PhD thesis, Linköping Institute of Technology, Linköping, Sweden, 2006.

[52] Aaron Steinfeld, Terrence Fong, David Kaber, Michael Lewis, Jean Scholtz, Alan Schultz, and Michael Goodrich. Common Metrics for Human-Robot Interaction. In Proceedings of the 1st Annual Conference on Human-Robot Interaction (HRI 2006), Salt Lake City, UT, USA, March 2–3, 2006. ACM.

[53] R. Stiefelhagen, C. Fügen, P. Gieselmann, H. Holzapfel, K. Nickel, and A. Waibel. Natural human-robot interaction using speech, head pose and gestures. In Proceedings of the International Conference on Intelligent Robots and Systems (IROS ’04), Sendai, Japan, October 2004.

[54] T. Tenbrink, K. Fischer, and R. Moratz. Spatial Strategies in Human-Robot Communication. KI 4/02, 2002. Themenheft Spatial Cognition, Christian Freksa (ed.), arenDTaP Verlag.

[55] Thora Tenbrink. Communicative aspects of human-robot interaction. In Helle Metslang and Mart Rannut, editors, Languages in Development. Lincom Europa, 2003.


[56] Elin Anna Topp. Initial steps toward human augmented mapping. Licentiate thesis, KTH Royal Institute of Technology, Stockholm, Sweden, 2006.

[57] Elin Anna Topp, Helge Hüttenrauch, Henrik Christensen, and Kerstin Severinson Eklundh. Acquiring a Shared Environment Representation. In Proceedings of the 1st Annual Conference on Human-Robot Interaction (HRI 2006), Salt Lake City, UT, USA, March 2–3, 2006. ACM.

[58] M. L. Walters, K. Dautenhahn, K. L. Koay, C. Kaouri, R. te Boekhorst, C. L. Nehaniv, I. Werry, and D. Lee. Close encounters: Spatial distances between people and a robot of mechanistic appearance. In Proceedings of the IEEE-RAS International Conference on Humanoid Robots (Humanoids 2005), pages 450–455, Tsukuba, Japan, December 5–7, 2005.
