
www.rrc.mae.ntu.edu.sg

Robotics Research Centre

Funding Agency: Agency for Science, Technology and Research (A*STAR)
Principal Investigator: Prof Gerald Seet
Team Members: Dr Iastrebov Viatcheslav, Mr Pang Wee Ching, Mr Dinh Quang Huy

Project Motivation & Objectives

Efficient detection, identification, tracking and programming of industrial robots by hand-pointing demonstration can be achieved by integrating multiple sensory modalities: haptic input, gesture input, laser pointing, range data, visual confirmation and task simulation. The goal is to create a partnership dialogue between human and robot.

Methodology

1) Haptic and Gesture User Interface (HGIF) Controller

The HGIF controller is a one-handed multimodal remote controller for industrial robots. All five fingers are ergonomically interfaced to buttons, and the motions of the hand are measured by an IMU. A slide haptic input sensor provides smooth fine-tuning control. Because only one hand is occupied, the operator keeps the other hand free for self-support in a constrained industrial environment. All controls can be operated without looking at the device, relying only on finger and hand sensations. A laser pointer and a range LIDAR are used for Programming by Demonstration (PBD).
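To make the input mapping concrete, here is a minimal sketch of how the controller's five buttons, slide sensor and IMU readings might be fused into a single jog command. All names, the dead-man-switch assignment and the speed mapping are illustrative assumptions, not the actual HGIF implementation.

```python
# Hypothetical sketch of combining the HGIF controller's inputs into one
# command message. Field names and the button/speed mapping are assumptions.
from dataclasses import dataclass, field

@dataclass
class HGIFState:
    buttons: list = field(default_factory=lambda: [False] * 5)  # one per finger
    slider: float = 0.0          # slide haptic sensor, 0.0..1.0, for fine-tuning
    orientation: tuple = (0.0, 0.0, 0.0)  # IMU roll, pitch, yaw in radians

def to_jog_command(state: HGIFState) -> dict:
    """Map controller state to a robot jog command.

    Assumes the index-finger button acts as a dead-man switch; the hand
    attitude sets the jog direction and the slider scales the speed for
    smooth fine control, as the poster describes.
    """
    if not state.buttons[1]:                 # assumed dead-man switch released
        return {"mode": "idle"}
    roll, pitch, yaw = state.orientation
    return {
        "mode": "jog",
        "direction": (roll, pitch, yaw),     # reuse hand attitude as jog axes
        "speed": 0.05 + 0.95 * state.slider, # slider fine-tunes velocity gain
    }

# Example: half-slider jog with the dead-man button held
cmd = to_jog_command(HGIFState(buttons=[False, True, False, False, False],
                               slider=0.5, orientation=(0.0, 0.1, 0.0)))
print(cmd)
```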

2) Use of lasers in the generation of graphics for augmented reality applications

A custom-designed laser outlining projection system has been implemented as part of the HRI. It generates graphics for augmented reality applications. Users who want to monitor the tasks being executed by the robot may experience difficulties using transparent LCD HMD glasses outdoors. Without wearable displays, visual interaction between human and robot is still possible through notifications projected by the robot via the "Laser Writer" mounted on the wheeled robot. A bright, real-time display of the robot's comments, intentions and predictions, visible even in daylight, can partially replace AR in an HMD. Whilst the unfilled line drawings may be considered a disadvantage, the ability to project images in a bright outdoor environment is an advantage.
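The poster only states that the Laser Writer draws unfilled line graphics; the sketch below illustrates one plausible way such vector text projection could work, tracing each character as polylines of galvanometer set-points with the laser blanked between strokes. The stroke font and the (x, y, laser_on) output format are assumptions for illustration.

```python
# Illustrative sketch of vector text projection: each character becomes a set
# of polylines of galvo (x, y) set-points. Only the unfilled line-drawing idea
# comes from the poster; the tiny stroke font below is a made-up example.
STROKES = {
    "H": [[(0, 0), (0, 1)], [(1, 0), (1, 1)], [(0, 0.5), (1, 0.5)]],
    "I": [[(0.5, 0), (0.5, 1)]],
}

def text_to_galvo_points(text, scale=0.1, spacing=1.3):
    """Yield (x, y, laser_on) tuples tracing the text as unfilled line drawings."""
    for i, ch in enumerate(text.upper()):
        x0 = i * spacing * scale
        for stroke in STROKES.get(ch, []):
            first = True
            for (x, y) in stroke:
                # blank the laser while jumping to the start of each stroke
                yield (x0 + x * scale, y * scale, not first)
                first = False

for pt in text_to_galvo_points("HI"):
    print(pt)
```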

3) Framework of Human & Robot Partner Interactions

This is a functional framework for interaction with the robot through haptic and hand-motion PBD and AR visual feedback. It is not an architecture; it is a level of abstraction above the architecture, describing the information each component requires and the data flowing among components. The key point is the human's decision about which mode of interaction and control to use at any moment: a direct control mode, or an assisted mode in which a task-execution scenario is presented for the human's approval, correction or cancellation.
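As a hedged illustration of this mode decision, the following sketch encodes the direct/assisted choice and the approve-correct-cancel loop the framework describes. The robot and human interfaces and all method names are hypothetical, not taken from the project.

```python
# Sketch of the framework's mode-selection logic: direct teleoperation, or an
# assisted mode where the robot proposes a plan that the human may approve,
# correct or cancel. All object interfaces here are hypothetical.
from enum import Enum, auto

class Mode(Enum):
    DIRECT = auto()    # haptic/gesture teleoperation via the HGIF controller
    ASSISTED = auto()  # robot proposes, human approves/corrects/cancels

class Verdict(Enum):
    APPROVE = auto()
    CORRECT = auto()
    CANCEL = auto()

def interaction_step(mode, robot, human):
    if mode is Mode.DIRECT:
        robot.execute(human.read_hgif_command())  # pass jog commands through
        return
    plan = robot.propose_plan(human.pointed_target())  # e.g. laser pointer + LIDAR
    human.display(plan)                # shown in the HMD or via Laser Writer
    verdict, hint = human.review(plan)
    if verdict is Verdict.APPROVE:
        robot.execute(plan)
    elif verdict is Verdict.CORRECT:
        robot.execute(robot.refine_plan(plan, hint))
    # Verdict.CANCEL: discard the plan and wait for the next command
```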

Results

a. Human-robot partnership arises from dialogue between the parties: the human must be able to define and refine his needs, and the robot must deliberate and formulate alternative options for the human.
b. All haptic controls can be operated without looking at the device; only finger and hand sensations are involved. Feedback is displayed in the HMD glasses and via the laser graphics.
c. A laser pointer and a range LIDAR are used for PBD, and both are easily activated by a haptic input or a one-handed gesture (see the sketch after this list).
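To show how point c could turn a pointing action into a taught waypoint, here is a small sketch that combines a pointing direction with the LIDAR range to the laser spot. The sensor-frame convention and the spherical-to-Cartesian geometry are illustrative assumptions, not the project's actual PBD pipeline.

```python
# Hedged sketch of the laser-pointer + range-LIDAR PBD idea: the LIDAR gives
# the range to the laser spot, and with the pointing direction this yields a
# 3D waypoint. The frame (x forward, y left, z up) is an assumption.
import math

def pointed_waypoint(azimuth_rad, elevation_rad, lidar_range_m):
    """Convert a pointing direction plus LIDAR range into an (x, y, z) waypoint
    in the sensor frame."""
    ce = math.cos(elevation_rad)
    return (lidar_range_m * ce * math.cos(azimuth_rad),
            lidar_range_m * ce * math.sin(azimuth_rad),
            lidar_range_m * math.sin(elevation_rad))

# Example: pointing 10 degrees left, 5 degrees up, at a surface 2.4 m away
print(pointed_waypoint(math.radians(10), math.radians(5), 2.4))
```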

[Figure: HGIF Controller]