
kth.diva-portal.org/smash/get/diva2:1076454/FULLTEXT01.pdf · 2017-02-23

KTH ROYAL INSTITUTE OF TECHNOLOGY

SCHOOL OF INDUSTRIAL ENGINEERING AND MANAGEMENT

Toward a Sustainable Human-Robot Collaborative Production Environment ABDULLAH ALHUSIN ALKHDUR

DOCTORAL THESIS IN PRODUCTION ENGINEERING

STOCKHOLM, SWEDEN 2017


Toward a Sustainable Human-Robot

Collaborative Production Environment

ABDULLAH ALHUSIN ALKHDUR

Doctoral Thesis

School of Engineering and Management

Department of Production Engineering

The Royal Institute of Technology

Stockholm, Sweden 2017


Thinking is the hardest work there is, which is probably the reason why so few engage in it.

Henry Ford


Abstract

This PhD study aimed to address the sustainability of robotic systems from the environmental and social perspectives. During the research, three approaches were developed. The first is an online, programming-free, model-driven system that utilises a web-based distributed human-robot collaboration architecture to perform distant assembly operations. It uses a robot-mounted camera to capture silhouettes of the components from different angles; the system then analyses those silhouettes and constructs the corresponding 3D models. Using the 3D models together with a model of the robotic assembly cell, the system guides a distant human operator to assemble the real components in the actual robotic cell.

To satisfy the safety aspect of human-robot collaboration, a second approach has been developed for effective online collision avoidance in an augmented environment, where virtual three-dimensional (3D) models of robots and real images of human operators from depth cameras are used for monitoring and collision detection. A prototype system is developed and linked to industrial robot controllers for adaptive robot control, without the need for programming by the operators. The result of collision detection drives four safety strategies: the system can alert an operator, stop a robot, move the robot away, or modify the robot's trajectory away from an approaching operator. These strategies are activated based on the operator's location with respect to the robot. The case study of the research further discusses the possibility of implementing the developed method in realistic applications, for example, collaboration between robots and humans in an assembly line.
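The distance-based activation of the four strategies can be sketched as follows; the threshold values and the function name are illustrative assumptions, not figures from the thesis:

```python
# Illustrative sketch of mapping the operator's distance to one of the
# four safety strategies. Threshold values are assumed examples only.

def select_safety_strategy(min_distance_m: float) -> str:
    """Return a safety response for the minimum human-robot distance."""
    if min_distance_m > 2.0:
        return "none"               # operator far away: no action needed
    if min_distance_m > 1.5:
        return "alert_operator"     # operator approaching: warn them
    if min_distance_m > 1.0:
        return "modify_trajectory"  # closer: re-plan around the operator
    if min_distance_m > 0.5:
        return "move_robot_away"    # very close: retreat
    return "stop_robot"             # imminent collision: stop immediately

print(select_safety_strategy(1.7))  # → alert_operator
```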

To tackle the energy aspect of sustainability in the human-robot production environment, a third approach has been developed that aims to minimise the robot's energy consumption during assembly. Given a trajectory, and based on the inverse kinematics and dynamics of the robot, a set of attainable configurations for the robot can be determined, followed by calculating the required forces and torques on the joints and links of the robot. The energy consumption is then calculated for each configuration along the assigned trajectory, and the configurations with the lowest energy consumption are selected.
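The configuration-selection idea can be sketched as below; the candidate list and the energy model are simplified placeholders for the inverse kinematics and dynamics computations, not the thesis's actual formulation:

```python
# Sketch of selecting the lowest-energy configuration for a trajectory.
# The candidates and the energy estimate are simplified placeholders:
# the thesis derives joint torques via inverse kinematics/dynamics.

def joint_energy(torques, joint_velocities, dt):
    """Approximate mechanical energy as sum(|torque * angular velocity|) * dt."""
    return sum(abs(t * w) for t, w in zip(torques, joint_velocities)) * dt

def select_min_energy(configurations, dt=0.01):
    """Pick the (torques, velocities) pair with the lowest energy estimate."""
    return min(configurations, key=lambda c: joint_energy(c[0], c[1], dt))

# Two hypothetical configurations reaching the same end-effector pose:
candidates = [
    ([5.0, 3.0, 1.0], [0.2, 0.4, 0.1]),  # e.g. elbow-up
    ([2.0, 2.5, 0.8], [0.3, 0.3, 0.2]),  # e.g. elbow-down, cheaper here
]
best = select_min_energy(candidates)     # selects the second candidate
```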


Keywords

vision sensor, 3D image processing, collision detection, safety, robot, kinematics, dynamics, collaborative assembly, energy consumption, optimisation, manufacturing


Sammanfattning

This doctoral study aimed to examine the sustainability of robotic systems from the environmental and social perspectives. During the research, three methods were developed. The first is an online, programming-free human-robot system that uses a web-based architecture to perform remote assembly. The developed system uses an industrial robot to assemble components unknown in advance. It uses a robot-mounted camera to capture silhouettes of the components from different angles; the system then analyses those silhouettes and constructs the corresponding 3D models. Using the 3D models together with a model of the robotic assembly cell, the system guides a remote human operator to assemble the real components in the actual robot cell. The results show that the developed system can construct the 3D models and assemble the components within a suitable time frame.

To satisfy the safety aspects of human-robot collaboration, a second method has been developed for effective collision detection, where virtual three-dimensional (3D) models of robots and real images of human operators from stereoscopic cameras are used for monitoring and collision detection. A prototype system is developed and linked to industrial robot controllers for adaptive robot control, without the need for programming by the operators. The result of collision detection drives four safety strategies: the system can alert an operator, stop a robot, move the robot away, or modify the robot's path away from an approaching operator. These strategies are activated based on the operator's location with respect to the robot. The case study of the research further discusses the possibility of implementing the developed method in realistic applications.

To address the energy aspect of sustainability in the human-robot production environment, a third method has been developed that aims to minimise the robot's energy consumption during assembly. Given a trajectory, and based on the inverse kinematics and dynamics of the robot, a set of attainable configurations can be determined by calculating the appropriate forces and torques on the joints and links of the robot. The energy consumption is then calculated for each configuration along the assigned trajectory, and the configurations with the lowest energy consumption are selected.


Acknowledgements

I remember four years ago when I received the decision email showing that I had been admitted to start my PhD study. That moment was definitely one of the turning points in my life.

I was fortunate enough to work with a professional research group at the production engineering department of the university. This has given me the ability to work closely with a team of researchers and to understand the values of good scientific research.

I would like to express my appreciation to all the people who helped me during my PhD journey. My work would not have seen the light without their help and support.

I am genuinely grateful to my research supervisor, Professor Lihui Wang, for his superb support and guidance. Each meeting with him added important aspects to the implementation and broadened my perspective on the research. From him I have learned to think critically, to identify challenges, to overcome them and to present feasible solutions.

I would also like to express my deep appreciation to Professor Mauro Onori for his supervision and all the support provided to me while I was a student in the department. It was a great pleasure for me to have the chance of working with him.

I am also grateful to Mr Bernard Schmidt for the excellent time of collaboration between us.

I would like to thank my colleagues in the sustainable manufacturing group for their feedback, cooperation and, of course, friendship.

Last but not least, I would like to thank my family for supporting me throughout this research and my life in general. I take this opportunity to dedicate this work to my parents, who have made me what I am; from them, in my childhood, I learnt to aspire to a career in research.


Acronyms

2D Two-Dimensional

3D Three-Dimensional

CBM Counterbalanced Mechanism

CCD Charge-Coupled Device

CPU Central Processing Unit

D-H Denavit-Hartenberg

FSA Finite State Automata

HRLC Human-Robot Local Collaboration

HRRC Human-Robot Remote Collaboration

ISO International Organization for Standardization

NC Numerical Control

PC Personal Computer

RAM Random-Access Memory

RGB Red, Green and Blue

RNEA Recursive Newton-Euler Algorithm

TCP Tool Centre Point

TS Technical Specification

TSP Travelling Salesman Problem


List of author’s relevant publications

Journal papers

Paper 1: A. Mohammed, B. Schmidt, and L. Wang, “Energy-efficient robot configuration for assembly,” Transactions of the ASME, Journal of Manufacturing Science and Engineering, vol. 139, no. 5, Nov. 2017.

Paper 2: A. Mohammed, B. Schmidt, and L. Wang, “Active collision avoidance for human–robot collaboration driven by vision sensors,” International Journal of Computer Integrated Manufacturing, pp. 1–11, 2016.

Paper 3: L. Wang, A. Mohammed, and M. Onori, “Remote robotic assembly guided by 3D models linking to a real robot,” CIRP Annals - Manufacturing Technology, vol. 63, no. 1, pp. 1–4, 2014.

Conference Proceedings

Paper 4: A. Mohammed and L. Wang, “Vision-assisted and 3D model-based remote assembly,” in Proceedings of the International Conference on Advanced Manufacturing Engineering and Technologies, 2013, pp. 115–124.

Paper 5: A. Mohammed, L. Wang, and M. Onori, “Vision-assisted remote robotic assembly guided by sensor-driven 3D models,” in Proceedings of the 6th International Swedish Production Symposium, 2014.

Paper 6: A. Mohammed, B. Schmidt, L. Wang, and L. Gao, “Minimising energy consumption for robot arm movement,” Procedia CIRP, vol. 25, pp. 400–405, 2014.

Other Publications

Paper 7: X. V. Wang, L. Wang, A. Mohammed, and M. Givehchi, “Ubiquitous manufacturing system based on cloud: a robotics application,” Robotics and Computer-Integrated Manufacturing, vol. 45, pp. 116–125, 2017.


Paper 8: X. V. Wang, A. Mohammed, and L. Wang, “Cloud-based robotic system: architecture framework and deployment models,” in Proceedings of the 25th International Conference on Flexible Automation and Intelligent Manufacturing, Volume: 2, 2015.

Paper 9: B. Schmidt, A. Mohammed, and L. Wang, “Minimising energy consumption for robot arm movement,” in Proceedings of the International Conference on Advanced Manufacturing Engineering and Technologies, 2013, pp. 125–134.


Summary of appended papers

Paper A: Remote robotic assembly guided by 3D models linking to a real robot

Abstract: This paper presents a 3D model-driven remote robotic assembly system. It constructs 3D models at runtime to represent unknown geometries at the robot side, using a sequence of images from a calibrated camera in different poses. Guided by the 3D models over the Internet, a remote operator can manipulate a real robot instantly for remote assembly operations. Experimental results show that the system can meet industrial assembly requirements with an acceptable level of modelling quality and a relatively short processing time. The system also enables programming-free robotic assembly, where the real robot follows the human's assembly operations instantly.
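The runtime modelling follows a shape-from-silhouette idea: each calibrated view carves away the volume that projects outside the component's silhouette. A minimal sketch, with a hypothetical orthographic camera standing in for the thesis's calibrated camera model:

```python
# Minimal shape-from-silhouette (voxel carving) sketch: a voxel is kept
# only if it projects inside the object's silhouette in every calibrated
# view. The orthographic `project` below is a hypothetical stand-in for
# a real calibrated camera projection.

def carve(voxels, views):
    """voxels: 3D points; views: (project, silhouette) pairs, where
    project maps a 3D point to pixel coordinates and silhouette is the
    set of pixels the object covers in that image."""
    return [v for v in voxels
            if all(project(v) in silhouette for project, silhouette in views)]

# Toy example: one orthographic "camera" looking along the z axis.
silhouette = {(0, 0), (0, 1)}            # pixels covered by the object
views = [(lambda p: (p[0], p[1]), silhouette)]
voxels = [(0, 0, 0), (0, 1, 2), (3, 3, 0)]
print(carve(voxels, views))              # → [(0, 0, 0), (0, 1, 2)]
```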

Paper B: Active collision avoidance for human–robot collaboration driven by vision sensors

Abstract: Establishing safe human–robot collaboration is an essential factor for improving efficiency and flexibility in today's manufacturing environment. Targeting safety in human–robot collaboration, this paper reports a novel approach for effective online collision avoidance in an augmented environment, where virtual three-dimensional (3D) models of robots and real images of human operators from depth cameras are used for monitoring and collision detection. A prototype system is developed and linked to industrial robot controllers for adaptive robot control, without the need for programming by the operators. The result of collision detection drives four safety strategies: the system can alert an operator, stop a robot, move the robot away, or modify the robot's trajectory away from an approaching operator. These strategies are activated based on the operator's existence and location with respect to the robot. The case study of the research further discusses the possibility of implementing the developed method in realistic applications, for example, collaboration between robots and humans in an assembly line.

Paper C: Energy-Efficient Robot Configuration for Assembly

Abstract: Optimising the energy consumption of robot movements has been one of the main focuses of most of today's robotic simulation software. This optimisation is based on minimising a robot's joint movements and, in many cases, does not take the dynamic features into consideration. Therefore, reducing energy consumption is still a challenging task, and it involves studying the robot's kinematic and dynamic models together with the application requirements. This research aims to minimise the robot's energy consumption during assembly. Given a trajectory, and based on the inverse kinematics and dynamics of the robot, a set of attainable configurations can be determined, followed by calculating the required forces and torques on the joints and links of the robot. The energy consumption is then calculated for each configuration along the assigned trajectory, and the configurations with the lowest energy consumption are selected. Given that energy-efficient robot configurations lead to reduced overall energy consumption, this approach becomes instrumental and can be embedded in energy-efficient robotic assembly.

Paper D: Ubiquitous manufacturing system based on cloud: a robotics application

Abstract: Modern manufacturing industry calls for a new generation of production systems with better interoperability and new business models. As a novel information technology, the Cloud provides new service models and business opportunities for the manufacturing industry. In this research, recent Cloud manufacturing and Cloud robotics approaches are reviewed. Function block-based integration mechanisms are developed to integrate various types of manufacturing facilities. A Cloud-based manufacturing system is developed to support ubiquitous manufacturing, providing a service pool that maintains physical facilities in terms of manufacturing services. The proposed framework and mechanisms are evaluated by both machining and robotics applications. In practice, it is possible to establish an integrated manufacturing environment across multiple levels with the support of the manufacturing Cloud and function blocks. This provides a flexible architecture as well as ubiquitous and integrated methodologies for the Cloud manufacturing system.


Contents

ABSTRACT
KEYWORDS
SAMMANFATTNING
ACKNOWLEDGEMENTS
LIST OF AUTHOR'S RELEVANT PUBLICATIONS
SUMMARY OF APPENDED PAPERS
CONTENTS
LIST OF FIGURES
LIST OF TABLES

CHAPTER 1: INTRODUCTION
1.1. Background
1.2. The scope of the research
1.3. Research questions
  1.3.1. Primary research questions
  1.3.2. Secondary research questions
1.4. Methods selection
1.5. Organisation of the dissertation

CHAPTER 2: RELATED WORK
2.1. Human-robot collaboration
  2.1.1. Human-robot local collaboration (HRLC)
  2.1.2. Human-robot remote collaboration (HRRC)
2.2. Robot energy efficiency

CHAPTER 3: HUMAN-ROBOT COLLABORATION
3.1. Remote human-robot collaboration
  3.1.1. System development
  3.1.2. Methodology
    3.1.2.1. Capturing snapshots
    3.1.2.2. Converting to grayscale
    3.1.2.3. Adjusting brightness and contrast
    3.1.2.4. Gaussian smoothing
    3.1.2.5. Image thresholding
    3.1.2.6. Silhouettes labelling
    3.1.2.7. Camera calibration
    3.1.2.8. Construction of pillars
    3.1.2.9. Trimming of pillars
    3.1.2.10. Solid prism representation
  3.1.3. Implementation
3.2. Local human-robot collaboration
  3.2.1. System development
  3.2.2. Methodology
    3.2.2.1. Kinect sensors calibration
    3.2.2.2. Depth image capturing
    3.2.2.3. Depth image processing
    3.2.2.4. Minimum distance calculation
  3.2.3. Implementation

CHAPTER 4: ENERGY MINIMISATION OF ROBOT MOVEMENT
4.1. System development
4.2. Methodology
  4.2.1. Denavit-Hartenberg (D-H) notation
  4.2.2. Forward kinematics
  4.2.3. Inverse kinematics
  4.2.4. Inverse dynamics
    4.2.4.1. Forward recursion
    4.2.4.2. Backward recursion
  4.2.5. Energy consumption
  4.2.6. Energy optimisation
4.3. Implementation

CHAPTER 5: CASE STUDIES AND EXPERIMENTAL RESULTS
5.1. Remote human-robot assembly
  5.1.1. Case study for human-robot remote assembly
  5.1.2. Results of remote human-robot assembly
5.2. Local human-robot collaborative assembly
  5.2.1. Minimum safe distance
  5.2.2. Safe speed of robot
  5.2.3. Safety scenarios for active collision avoidance
  5.2.4. Collaboration scenarios
  5.2.5. Results for local human-robot collaboration
5.3. Robot energy minimisation
  5.3.1. Single trajectory energy minimisation
    5.3.1.1. Straight line scenario for weight carrying
    5.3.1.2. Square path scenario for friction-stir welding
  5.3.2. 3D energy map
  5.3.3. Energy measurement in predefined paths

CHAPTER 6: CONCLUSIONS AND FUTURE WORK
6.1. Conclusions
6.2. Limitations of the study
6.3. Future work

BIBLIOGRAPHY
APPENDIX A: MAIN IMPLEMENTATION CLASSES OF THE REMOTE HUMAN-ROBOT COLLABORATION
APPENDIX B: MAIN IMPLEMENTATION CLASSES OF THE LOCAL HUMAN-ROBOT COLLABORATION
APPENDIX C: MAIN IMPLEMENTATION CLASSES OF THE ROBOT ENERGY MINIMISATION


List of Figures

Figure 1 Different aspects of sustainable manufacturing ............................ 4 Figure 2 Levels of safety in human-robot shared environment .................. 5 Figure 3 Organisation of the dissertation .................................................... 8 Figure 4 Example of conventional industrial setup ....................................15 Figure 5 From static to dynamic robotic safety installations .................... 16 Figure 6 Human-robot remote collaboration ............................................ 18 Figure 7 Energy based selection of robot trajectory .................................. 21 Figure 8 System overview for remote human-robot collaborative

assembly ...................................................................................... 24 Figure 9 Shape approximation by trimming process ................................ 24 Figure 10 Parameters and coordinate systems for camera calibration .... 27 Figure 11 Construction of initial pillars ...................................................... 29 Figure 12 Example of pillar-trimming process .......................................... 30 Figure 13 Prism creation with different cut cases ...................................... 31 Figure 14 Class diagram for remote human-robot collaborative

assembly ...................................................................................... 32 Figure 15 Active collision avoidance system .............................................. 33 Figure 16 Kinect's detection range .............................................................. 34 Figure 17 Retrieving the depth data from a Kinect sensor ........................ 35 Figure 18 Stages of depth data processing ................................................. 36 Figure 19 Class diagram for the active collision avoidance ....................... 39 Figure 20 Energy minimisation module .................................................... 42 Figure 21 (A) Robot's frame assignments, (B) D-H parameters. .............. 43 Figure 22 The first joint angle projection on X-Y plane ............................ 44 Figure 23 The second and third joint angles’ projection ........................... 45 Figure 24 Module diagram for energy minimisation ................................ 48 Figure 25 Results of case study for 3D modelling and remote

assembly ...................................................................................... 50 Figure 26 Comparison of computation time of the processing steps ........51 Figure 27 Modelling errors vs. number of snapshots processed ...............51 Figure 28 Parameters used for determining the minimum

safe distance ................................................................................ 52

Page 23: New Toward a Sustainable Human- Robot Collaborative Environmentkth.diva-portal.org/smash/get/diva2:1076454/FULLTEXT01.pdf · 2017. 2. 23. · TSP Travelling Salesman Problem . IX

XVII

Figure 29 Parameters used for calculating the robot's safe speed ............ 54
Figure 30 Levels of danger for the operator's body regions ...................... 55
Figure 31 Maximum permissible pressures applied on human body ....... 56
Figure 32 Maximum permissible forces applied on human body ............. 56
Figure 33 The effective mass values of the human's body regions ........... 58
Figure 34 (A) Collision-free trajectory planner, (B) Trajectory modification to avoid collision ... 59
Figure 35 Modification of movement vector .............................................. 60
Figure 36 The human-robot safety strategies ............................................. 61
Figure 37 Case study for human-robot collaboration ................................ 62
Figure 38 Experimental setup for human-robot collaboration ................. 63
Figure 39 Experimental setup for velocity measurement .......................... 64
Figure 40 Results for velocity measurement using Kinect sensor ............ 65
Figure 41 Energy minimisation results for square shape case study ........ 67
Figure 42 An energy map in the workspace of an ABB IRB 1600 robot ... 68
Figure 43 Case study paths for analysing the 3D energy map .................. 69
Figure 44 The results of the energy optimisation for the case study paths ... 70
Figure 45 The measurements on the real robot of the case study paths ... 71
Figure 46 Cloud-based framework for human-robot collaboration .......... 76
Figure 47 Cloud-based framework of robot's energy minimisation .......... 77
Figure 48 MainMenu class ........................................................................... 87
Figure 49 ImageProcessingUI class ............................................................ 88
Figure 50 PillarsCreateAndTrim class ........................................................ 89
Figure 51 Robot class ................................................................................... 89
Figure 52 ColourAdjustment and BrightnessContrast classes .................. 89
Figure 53 CameraTransformation class ..................................................... 90
Figure 54 Coordinates_projection class ..................................................... 90
Figure 55 MainApplication class ................................................................. 91
Figure 56 Manager class .............................................................................. 92
Figure 57 Controlling class .......................................................................... 93
Figure 58 RobotControl class ...................................................................... 93
Figure 59 Visualization2D and Visualization3D classes ............................ 94
Figure 60 Wrapper class .............................................................................. 94
Figure 61 Velocity class ................................................................................ 94
Figure 62 OnlinePlannerCallable module .................................................. 95
Figure 63 RobotSpec module ...................................................................... 96
Figure 64 Trajectory, Target and Path modules ........................................ 96
Figure 65 EnergyPoint module ................................................................... 97

Figure 66 Map3D module ............................................................................ 97
Figure 67 WorkingEnvelope module ........................................................... 98

List of Tables

Table 1 Relationship between the publications and the research questions ... 7
Table 2 Calibration parameters for the Kinect sensor ............................... 35
Table 3 Results of straight-line path from the optimisation module ........ 66
Table 4 Square path scenario results from the optimisation module ....... 66
Table 5 Square path scenario results from RobotStudio® ......................... 66
Table 6 The joint values (deg) of the experiment paths with corresponding simulated energy consumption ... 69
Table 7 Case study paths energy consumption comparison ...................... 71


TOWARD A SUSTAINABLE HUMAN-ROBOT COLLABORATIVE PRODUCTION ENVIRONMENT


Chapter 1: Introduction

This chapter presents the motivation of the research work. The first section gives a brief description of the research background. The second section defines the scope of the research, and the third section presents the research questions. The fourth section describes the selection of methods. Finally, the last section outlines the organisation of this dissertation.

1.1. Background

To reduce poverty and improve quality of life, many countries have worked actively during recent decades to develop and maintain strong economies. However, this economic development has led to excessive consumption of natural resources, pollution and ecological problems [1]. Consequently, many international organisations and countries have recognised the importance of sustainable development and defined strategies and policies toward better sustainability. It is therefore essential for companies in those countries to work toward greener production and to be prepared when the corresponding regulations come into force.

Sustainable development comprises several aspects, one of the most important being sustainable manufacturing. A clear definition of sustainable manufacturing is given by the US Department of Commerce [2]: "the creation of manufactured products that use processes that minimise negative environmental impacts, conserve energy and natural resources, are safe for employees, communities, and consumers and are economically sound". This means that different aspects of manufacturing need to be improved and aligned with the objectives of sustainability. A good explanation of these aspects is reported in [3], which takes into

consideration the three dimensions of sustainability: economy, society

and environment. Figure 1 illustrates the different aspects of sustainable

manufacturing.

Figure 1 Different aspects of sustainable manufacturing

Based on this description of sustainable manufacturing, the following aspects are addressed as part of the objectives of this dissertation work:

1. Innovation: During the research work, the following innovative

solutions were developed:

a. Introduced a 3D model-driven robotic system that allows a

distant operator to control a robot and perform a remote

assembly. Using a single camera mounted on the robot’s end

effector, the system is able to identify unknown objects within the

robot’s workspace and introduce them automatically to a virtual

3D environment to facilitate remote assembly operations. The

system has been implemented on a physical robot and tested with

basic assembly tasks.

b. Presented a system to actively detect and avoid any unexpected

collision between humans and robots to safeguard a collaborative

environment. The system has been implemented and tested in a

physical setup.

c. Introduced an approach to constructing the dynamic model of the

robot and determining the energy consumption of its movement.

The developed approach has been evaluated using experimental

measurements on the physical robot.

2. Using energy and resources efficiently: Reducing environmental impact and saving resources by minimising energy consumption is realised in this research work. A module has been developed that minimises the energy consumption of robot movements by selecting the most energy-efficient robot joint configurations.

3. Good working conditions: Improving the working conditions of

shop floor operators was one of the main intentions of this research.

Therefore, the safety aspect was one of the major topics addressed in

the research. As shown in Figure 2, the highest priority is dedicated to

the human’s safety. This is achieved by using a set of depth sensors in

a robotic cell to monitor human operators and control robots to avoid

any possible collision.

Figure 2 Levels of safety in human-robot shared environment
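The joint-configuration selection described under aspect 2 above can be sketched in code. This is a minimal illustration under stated assumptions: the torque function passed in is a placeholder, not the robot dynamic model that the dissertation later derives, and the energy figure uses simple rectangular integration.

```python
# Minimal sketch (not the thesis implementation): rank candidate joint-space
# trajectories by an approximate energy figure and keep the cheapest one.
# `torque_fn` is a placeholder for the robot's dynamic model.

def movement_energy(trajectory, torque_fn, dt=0.01):
    """Approximate energy of a trajectory sampled at fixed time steps.

    trajectory: list of (joint_angles, joint_velocities) samples
    torque_fn:  maps (angles, velocities) -> joint torques
    """
    energy = 0.0
    for angles, velocities in trajectory:
        torques = torque_fn(angles, velocities)
        # mechanical power drawn by the joints at this sample
        power = sum(abs(t * w) for t, w in zip(torques, velocities))
        energy += power * dt  # rectangular integration over the time step
    return energy


def pick_min_energy_configuration(candidates, torque_fn):
    """Among IK configurations reaching the same target, keep the cheapest."""
    return min(candidates, key=lambda traj: movement_energy(traj, torque_fn))
```

In the actual work, the placeholder is replaced by a dynamic model validated against measurements on the physical robot (Chapter 4).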

1.2. The scope of the research

This PhD study focuses on the human-robot collaborative assembly of mechanical components, both on-site and remotely. It also addresses sustainability issues from the environmental and societal

perspectives. The main research objective is to develop safe, energy-

efficient and operator-friendly solutions for human-robot collaborative

assembly in a dynamic factory environment.

1.3. Research questions

The questions that are addressed in this research are explained in the

following sections:

1.3.1. Primary research questions

The primary research questions to be addressed in this study include:

PRQ 1. How to safely protect humans in a robot-coexisting environment?

PRQ 2. How to plan and assign assembly tasks to humans and robots

dynamically?

PRQ 3. How to control robots and instruct humans during assembly in

real time?

PRQ 4. How to generate task-specific control codes without tedious low-

level programming?

1.3.2. Secondary research questions

The secondary research questions extend to:

SRQ 1. How to decide robot trajectory and process parameters that lead

to better energy efficiency while satisfying assembly quality and

productivity?

SRQ 2. How to deal with ad-hoc assembly scenarios remotely where the

robot is treated as a manipulator?

SRQ 3. How to calibrate robots adaptively to changes and effectively to

interruptions?

During the development of the study the research questions were

answered and reported in a number of publications. Table 1 links the

research questions with their relevant papers and dissertation chapters.

1.4. Methods selection

In order to carry out the research work, the following approaches are pre-

selected:

1. Depth images of humans and 3D models of the robotic cell need to be

used for active collision avoidance, where potential collisions are

detected in an augmented environment and avoided by active robot

control automatically.

2. Vision cameras need to be installed for automatic robot calibration

and for remote trouble shooting, whereas remote monitoring and

remote assembly are facilitated by sensor-driven Java 3D models.

The standard Web browser is the primary user interface for remote

assembly.

3. Ad-hoc components arriving at an assembly cell need to be detected

by camera images, based on which 3D models can be generated to

guide remote assembly. In this case, a camera should only be used at

the initial preparation stage; actual remote assembly is facilitated by

the 3D models without cameras for better network performance.

4. A dynamic model of an industrial robot needs to be composed to

develop an energy minimisation module. To validate and evaluate the

developed model, measurements and experiments on the physical

industrial robot must be carried out.
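The collision check at the core of the first method above can be sketched as a distance test between the human's point cloud and the robot's geometry. The axis-aligned bounding boxes and the 0.3 m threshold below are illustrative assumptions for the sketch; the actual minimum safe distance is derived later in the dissertation.

```python
# Sketch of the augmented-environment collision check of method 1:
# the human is a point cloud from depth sensors, the robot a set of
# axis-aligned bounding boxes (representation and threshold illustrative).
import math


def point_to_box_distance(point, box_min, box_max):
    """Euclidean distance from a 3D point to an axis-aligned box (0 if inside)."""
    dx = max(box_min[0] - point[0], 0.0, point[0] - box_max[0])
    dy = max(box_min[1] - point[1], 0.0, point[1] - box_max[1])
    dz = max(box_min[2] - point[2], 0.0, point[2] - box_max[2])
    return math.sqrt(dx * dx + dy * dy + dz * dz)


def min_separation(human_cloud, robot_boxes):
    """Smallest distance between any human point and any robot link box."""
    return min(point_to_box_distance(p, lo, hi)
               for p in human_cloud
               for lo, hi in robot_boxes)


def collision_imminent(human_cloud, robot_boxes, safe_distance=0.3):
    """True when any human point comes closer than the safety threshold."""
    return min_separation(human_cloud, robot_boxes) < safe_distance
```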

Table 1 Relationship between the publications and the research questions

Chapter                                   Primary   Secondary   Relevant papers
Chapter 3: Human-robot collaboration      PRQ 1     -           Paper 2
                                          PRQ 2     SRQ 2       Papers 3, 4, 5
                                          PRQ 3     SRQ 3       Paper 2
                                          PRQ 4     -           Papers 2, 3, 4, 5
Chapter 4: Energy minimisation of
robot movement                            -         SRQ 1       Papers 1, 6, 9
Chapter 6: Conclusions and future work    PRQ 1     -           Paper 7
                                          -         SRQ 1       Papers 7, 8

1.5. Organisation of the dissertation

As shown in Figure 3, this dissertation consists of six chapters.

Figure 3 Organisation of the dissertation

Chapter 1 introduces the research background and challenges, as well as the primary and secondary research questions. The chapter also highlights the scope of the research work. The literature related to

the conducted research is reviewed in Chapter 2. Chapter 3 describes two

approaches developed to facilitate a collaboration between an industrial

robot and both remote and on-site operators. Chapter 4 explains the

approaches developed in the research to minimise the energy

consumption of the robot movement. Case studies and experimental

results of this research are described in Chapter 5. Finally, discussions

and conclusions together with the future work needed for expanding the

research are explained in Chapter 6.


Chapter 2: Related work

This chapter consists of a literature review of the work related to this PhD

study. The first section reports the related work in the human-robot

collaboration field and reviews the different approaches presented by

other researchers. This section is divided into two subsections: the first

one reviews the local collaboration between humans and robots; the

second one goes through the related work in the remote human-robot

collaboration field. The last section is dedicated to reviewing the literature

related to the energy efficiency of robots.

2.1. Human-robot collaboration

Based on the characterisation scheme proposed by [4], this work targets level 8 of collaboration between the human and the robot. This means that the system facilitates the collaboration automatically, making the decisions needed to avoid accidents and warning the operator when necessary. The operator can observe the status of the system at run time.

To address the collaboration between humans and robots, two types of human operators need to be taken into consideration: remote operators, and local operators on the shop floor. The following sections explain the related work along these two directions.

2.1.1. Human-robot local collaboration (HRLC)

A human-robot collaborative setting demands local cooperation between humans and robots, so the safety of humans in such a setting is highly important. It relies on both passive collision detection and active collision avoidance to monitor the shop floor operators and control the robots in a synchronised manner.

shop floor operators and control the robots in a synchronised means.

In recent years, human-robot local collaboration has been reported by

a number of researchers. [5] and [6] introduced an approach for controlling a humanoid robot to cooperate with an operator. [7] provided a tool to flexibly allocate shop floor operators and industrial robots in a shared assembly setting. [8] presented a simulation-based optimisation module with multiple objectives to assign and generate

strategies for human-robot assembly tasks. [9] drew the attention to the

availability and advantages of using different technologies in human-

robot collaboration environments. The main benefit of using human-

robot collaboration in a shared setup is to combine the reliability of the

robots with the flexibility of the humans. Nevertheless, such a setup with

fence-less interaction needs an accurate design to avoid any additional

stress for the human working in that environment. Therefore, [10]

intended to develop a suitable hybrid assembly setup by assessing the

stress level of a human caused by the movement of a proximate robot.

Moreover, [11] estimated the human's affective state in real time by considering the robot movement as a stimulus; this is achieved by

measuring the human biological activities such as heart beats, facial

expression and perspiration level. In addition, [12] proposed a trust

measurement scale that can be used for industrial human-robot

collaboration cases. The study consists of two stages: the first one is to

acquire a feedback from a number of operators based on a defined

questionnaire; the participants’ answers were then used in the second

stage to apply the results using three different industrial robots.

Furthermore, [13] presented a roadmap that showed the importance of

the human factor in the collaboration between the human and the robot.

The research also explained how this factor can affect the level of success

in establishing the collaboration. The human aspect is also highlighted by

[14] which aimed to design an optimal human-robot collaborative

environment. Their approach took into consideration both operational

time and biomechanical load to define the means of collaborations in an

industrial case study.

In addition, [15] presented a theoretical framework to review the effect

of the human organisational factor in the human-robot collaboration

setup. It was achieved by introducing a case study to evaluate the level of

success for the collaboration between the human and the robot. The

results of the study showed that several human factors (participation,

communication and management) can play an important role in the

implementation of the human-robot collaboration. Furthermore, [16]

developed a sensor based framework that uses both depth and force

sensors to control an industrial robot and establish a safe interaction

between the human and the robot.

Meanwhile, other researchers focused on using different techniques to

develop methods for human-robot collaborations. Researchers such as

[17] used Finite State Automata (FSA) to develop an approach that

considers the collaboration between multiple operators and multiple

robots. Another research [18] used physical interaction to define a

collaboration between the human and the robot. The presented approach

measured the currents of the robot’s motors and controlled the robot

accordingly. Other researchers [19] introduced a collaborative robot that

can interact with the human without the need for additional sensors using

the counterbalanced mechanism (CBM). In addition, [20] presented an

augmented reality tool for supporting the human operator in a shared

assembly environment. The tool uses video and text based visualisations

to instruct the operator during the assembly process.

A number of recent studies aimed to develop approaches that can

perceive and protect humans working with robots in shared

environments. These approaches are mainly based on two methods: (1)

constructing a vision based system that can track the human in the

robotic cell [21] by combining the detection of the human's skin colour

and its 3D model, and (2) inertial sensor-based approach [22] using

geometry representation of human operators generated with the help of a

motion capturing suit. Real industrial experiments indicate that the latter approach may not be a realistic solution: it relies heavily on a set of sensors attached to a specific suit that the operator must wear, and it captures only the movement of the person wearing the suit, leaving neighbouring objects undetected. This can create a safety issue, as there is a possibility of

collision between a moving object and a standing-still operator. This is

explained in detail together with other sensing methods in the literature

survey [23].

The efficiency of vision-based collision detection has been of interest to several research groups. For instance, [24] developed a multi-camera

collision detection system, whereas a high-speed emergency stop was

utilised in [25] to avoid a collision using a specialised vision chip for

tracking. In addition, [26] presented a projector-camera based approach.

It is achieved by identifying a protected zone around the robot and

defining the boundary of that zone. The reported approach is capable of

detecting dynamically and visually any safety interruption. [27] presented

an approach consisting of a triple stereovision system for capturing the

motion of a seated operator (upper body only) who wears coloured

markers. However, methods that depend on colour consistency may not

be suitable in environments with uneven lighting conditions.

Furthermore, the markers for tracking of moving operators may not be

clearly visible within the monitored area. Contrary to markers, a ToF

(time-of-flight) camera was utilised in [28] for collision detection, and an

approach using 3D depth information was introduced in [29] for the

same purpose. The laser scanners used in these approaches provide adequate resolution but introduce high computational latency, since each pixel or row of the captured scene must be analysed individually. ToF cameras, in contrast, offer a high-performance solution for depth image acquisition, but with limited pixel resolution (up to about 200 x 200) and relatively high cost. Recently,

[30] built a three-dimensional grid to trace foreign objects and recognise

humans, robots and background using data from 3D imaging sensors.

More recently, [31] used depth information from a single Kinect sensor to develop an approach for collision avoidance.
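As background to such depth-sensor approaches, each valid depth pixel is typically back-projected to a 3D point with the pinhole camera model before any collision reasoning is done. The sketch below illustrates this step; the intrinsic parameters are assumed values, not the calibration results reported in this work (Table 2).

```python
# Sketch: back-projecting depth pixels (u, v, depth) to 3D camera-frame
# points with the pinhole model. Intrinsics (fx, fy, cx, cy) are assumed
# illustrative values, not this work's calibration parameters.

def depth_pixel_to_point(u, v, depth_m, fx, fy, cx, cy):
    """Pinhole back-projection: pixel plus metric depth -> (x, y, z) in metres."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)


def depth_image_to_cloud(depth_image, fx, fy, cx, cy):
    """Convert a row-major depth map (metres, 0 = no reading) to a point list."""
    cloud = []
    for v, row in enumerate(depth_image):
        for u, d in enumerate(row):
            if d > 0:  # skip invalid pixels reported by the sensor
                cloud.append(depth_pixel_to_point(u, v, d, fx, fy, cx, cy))
    return cloud
```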

Moreover, other researchers aimed to integrate different sensing

techniques to track humans and robots on shop floors, such as the work

reported in [32], which developed a collision-free robotic setting by

combining the outputs of both ultrasonic and infrared proximity sensors.

In addition, researchers such as [33] introduced a direct interaction

between the human and the robot by fusing both force/torque sensors

and vision systems into a hybrid assembly setup.

At the same time, several commercial systems were presented as safety

protection solutions. One of the widely accepted choices is SafetyEYE®

[34], which uses a single stereo image to determine 2½D data of a

monitored region and detect any interruption of predefined safety zones.

An emergency stop will be triggered once an operator enters into any of

the safety zones of the monitored environment. Nevertheless, these safety

zones are static since they cannot be updated during run time.

In order to fulfil the market demands for high productivity, several

companies used a large number of industrial robots in their production

lines. However, the same companies are faced today with new demands

for a wide range of products with different specifications. To fulfil these

new demands, many companies try to increase the adaptability of their

production lines. This is a challenging task due to the fact that these

production lines were designed initially for high productivity. One of the

most valuable solutions is to establish safe and collaborative

environments in these production lines. These environments allow the

operators to work side by side with the robots. With such environments,

the production lines can combine the high productivity of the industrial

robots with the high adaptability of the human operators.

However, the current interaction between a robot and a human in an

industrial setup is relatively limited. In such a setup, the human and the

robot are working on different tasks, in different time intervals and with

different resources. Figure 4 shows an example of a conventional industrial

setup with human-robot interaction.

Figure 4 Example of conventional industrial setup

It is important to sustain a high productivity during human-robot

collaboration. Therefore, there is a need to adopt low-priced and reliable

online safety systems for effective production lines where shop floor

operators share the tasks with industrial robots in a collision-free

fenceless environment. Despite the successes of safety approaches in the

past decade, the reported methods and systems are either highly

expensive or immensely limited in handling real industrial applications.

In addition, most of the current commercial safety solutions rely on

defining static regions in the robotic cell with limited possibilities of

collaboration. Focusing on finding a solution to the mentioned problem,

this part of the dissertation presents a novel approach to introducing a

safe and protected environment for human operators to work locally with

robots on the shop floor. Its novelty includes: (1) successful recognition of

any possible collision between the robot’s 3D models and the human’s

point cloud captured by depth sensors in an augmented-reality

environment, and (2) active avoidance of any collision through active

robot control. The developed system can dynamically represent the robot

with a set of bounding boxes and control the robot accordingly. This

opens the door to several possibilities of closer collaborations between the

human and the robot. Figure 5 explains this approach.

Figure 5 From static to dynamic robotic safety installations
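The transition from static to dynamic safety that Figure 5 illustrates can be summarised as a graded threshold policy on the measured human-robot separation. The distances and action names below are illustrative assumptions for the sketch, not the safe distances computed later in the dissertation.

```python
# Sketch of a graded safety response: the reaction depends on how close
# the human is to the robot. Thresholds are illustrative placeholders.

def safety_action(separation_m,
                  close_collab_dist=1.2,   # within this: collaboration zone
                  slow_down_dist=0.8,      # within this: reduce robot speed
                  move_away_dist=0.4):     # within this: actively retreat
    """Map the measured separation to a robot control action."""
    if separation_m <= move_away_dist:
        return "move_away"
    if separation_m <= slow_down_dist:
        return "slow_down"
    if separation_m <= close_collab_dist:
        return "allow_close_collaboration"
    return "full_speed"
```

Unlike a static fence, the thresholds are evaluated continuously against the depth-sensor measurements, so the robot can keep working at reduced speed instead of triggering an emergency stop.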

2.1.2. Human-robot remote collaboration (HRRC)

In recent years, researchers have developed various tools to program,

monitor and control industrial robots. The aim is to reduce possible robot

downtime and avoid collisions caused by inaccurate programming,

through simulation. However, these tools require prior knowledge about the robotic system. Introducing unknown objects to the robotic system may break the simulation, since the robot programs are no longer valid.

Much research has focused on defining frameworks for distant collaboration between shop floor operators and industrial robots. One example is the approach of [35], which presented a cyber-

virtual system for modelling and visualising remote industrial sites. The

approach showed the possibilities of imitating the behaviour of the

physical robot using a virtual one. Another example is a research

presented by [36] which focused on defining a distributed multi-robotic

system that allows shop floor operators to control remote robots and

perform industrial collaborative tasks. The results of this research showed

that the developed approach can be considered as a tool for training the

novice operators. Another approach, reported by [37], focused on

developing a multilayer distributed architecture to control remote robots

and establish a collaborative behaviour among robots and humans driven

by a set of defined rules. Other research such as [38] indicated the

possibility of establishing collaboration between a mobile robot and an

expert operator. The research also showed that the approach can be used

to perform maintenance and service tasks in industrial settings.

Other researchers focused on defining a convenient method for remote

interaction between the human and the robot. For instance, the work

presented by [39] introduced a cyber-hub infrastructure that allows

remote operators to command the robot using hand gestures. The

approach also showed that a group of distributed operators can

concurrently control several distant robots. Other research groups,

such as [40], investigated the use of motion and force scaling to

establish human-robot remote collaborative manipulation based on

virtual tool dynamics. Another study [41] showed the potential of

using a master-slave distributed system to remotely control a lightweight

microsurgery robot. The operator in this case was supported by force

feedback simulating the force reflections applied at the end-effector of

the robot.

Establishing remote robotic laboratories was the main focus of other

researchers, such as [42]. The latter research focused on developing a

platform for remote human-machine interaction using a simulation-based

cloud server for visualisation. Another example is the approach presented

by [43], which introduced a collaboration tool that allows learners to

access a remote laboratory consisting of robots and automation

equipment.

Both laser scanners and vision cameras are commonly used to

convert unknown objects into virtual 3D models. Modelling objects using

stereo vision cameras was the main focus of several researchers [44]–

[46], whereas others, including [47], adopted a pre-defined library of 3D

models to match the real desired objects. However, the stereo-vision-

camera-based approach suffers from two drawbacks: (1) the utilised

equipment is expensive and bulky, and (2) it lacks the ability to

capture and model complex shapes from a single fixed viewpoint due to

limited visibility.


18 CHAPTER 2: RELATED WORK

2D vision systems can also be applied to model unknown objects. By

taking a number of snapshots of an object from different viewpoints, the

object can be modelled by analysing the silhouette in each

snapshot. For example, [47] and [48] focused on modelling the object

with high accuracy and detail.

Despite the fact that these approaches were successful in their

reported applications, they are unable to model multiple objects in a

single run. Besides, they lack the ability to model objects remotely.

This part of the research proposes a new approach to constructing 3D

models of multiple arbitrary objects simultaneously, based on a set of

snapshots taken of the objects from different angles. The approach is

implemented through a system that allows an operator to perform

assembly operations from a distance. Figure 6 illustrates the concept.

Figure 6 Human-robot remote collaboration

2.2. Robot energy efficiency

The significance of minimising energy consumption has been understood

by many researchers and equipment manufacturers. Related efforts are

numerous. Okwudire and Rodgers [50] presented an approach to

controlling a hybrid feed drive for energy-efficient NC machining; their

results showed several improvements in energy consumption and

performance over traditional approaches. Another study [51]

presented a thorough analysis of the energy consumed during the production

of a single product. The study reported many improvements to the

product's production process and design; these improvements reduced the


energy consumption by almost 50%. Another method, presented by

Weinert et al. [52], focused on identifying energy consumption; it

managed to reduce the energy consumption by describing the production

operations as a set of energy blocks and then determining the energy

consumed in each block.

In addition, another study [53] focused on optimising the energy

consumption by planning the trajectory of the robot based on its

dynamic model. Furthermore, the research presented by [54]

developed an energy-efficient trajectory planner that minimised the

robot's mechanical energy consumption using a sequential quadratic

programming solver. Another study [55] took into consideration the

dynamic model of the robot, including friction losses as well as servo

motor and inverter losses, to plan the trajectory of the robot. Moreover,

other researchers [56] calculated the energy consumption by taking into

account the actuating energy of the robot and the grasping forces of

its gripper. Furthermore, the research reported in [57] presented a

method to optimise the energy consumption of industrial robots with

serial or parallel structures. Another work [58] presented a literature

survey on analysing the energy consumption of industrial robots; the

survey addresses the different losses industrial robots suffer from, how

these losses can be modelled mathematically, and how the energy

consumption can be calculated. In addition, [59] developed a

multi-criteria trajectory planner that takes the robot's energy

consumption into consideration, while also avoiding the obstacles around

the robot and minimising its travelling time. Designing the robot from an

energy-efficiency perspective was the focus of [60], who presented a

design for a five-degree-of-freedom industrial robot arm based on three

holistic design criteria, energy being one of them.

Meanwhile, other researchers such as [61] explained how the dynamic

behaviour of an industrial robot in assembly settings can affect its energy

consumption. The energy consumption was analysed using a multi-

domain simulation tool, and the study compared the simulation results

with those from the real robot to evaluate the accuracy of the approach.

Other researchers, such as [62], worked on minimising the energy

consumption of multiple robots sharing the same environment and

performing multiple tasks. The research indicated the importance of

synchronising the robots' operations within a multi-robot system.


Another interesting approach [63] introduced methods to generate a

collision-free robot path with reduced energy consumption in

a simulation environment. Furthermore, another study [64] presented

a method to determine the excitations of the robot's motors while

minimising the electromechanical losses. Another

approach [65] investigated the possibilities of minimising the energy

consumption by optimising the load distribution between two industrial

robots sharing the same task.

Several researchers, on the other hand, examined the possibilities of

minimising the energy consumption of machine tools. For example, Mori

et al. [66] demonstrated the ability to reduce energy consumption by

adjusting specific cutting conditions and controlling the

acceleration to maintain proper synchronisation between the spindle

acceleration and its feed system; this approach provides a useful tool for

changeable machining operations. Furthermore, the work presented by

Vijayaraghavan and Dornfeld [67] highlighted the impact of monitoring

the energy consumption of machine tools by associating consumed

energy with performed machining tasks. Behrendt et al. [68] investigated

the nature of energy consumption in different machine tools and

evaluated their efficiencies, accompanied by energy demand modelling

[69].

Analysing the robot workspace from the energy perspective was also

the focus of other researchers. For example, [70] introduced an

approach to generating an energy map of a robot's working envelope. The

map was presented as a set of 3D grid points describing the energy

consumption of the robot's movements between them.

Within the assembly domain, many research teams focused their work

on studying the energy efficiency of industrial robots. Several tools have

been developed to calculate and analyse the energy consumption of the

robots. For example, the work presented by Verscheure et al. [71]

suggested several strategies to efficiently reduce energy consumption in

robot-related applications. Another example is the scheduling method

presented in [72], which generates multiple energy-optimal robotic

trajectories using dynamic time scaling. The method showed that it is

possible to reduce the energy consumption of a multi-robot system by

changing the execution time frames of the robots' paths. Furthermore,

another study [73] addressed the problem of finding the optimal

robotic task sequence to reduce energy consumption. The research


introduced an extension of the Travelling Salesman Problem (TSP) to

handle the multiple degrees of freedom of industrial robots.

Despite the significant effort made toward energy-efficient machines

and machining, efficient use of energy during robotic assembly remains

a challenge and requires further investigation. This is because the

kinematic and dynamic features of the robots are governed by robot

controllers rather than by shop-floor operators. This part of the dissertation

introduces an approach to minimising the energy consumption of an

industrial robot's movements. It is achieved by developing a module that

optimises the robot's joint configuration to follow a certain trajectory

defined by an operator. The optimisation is based on selecting the most

energy-efficient robot configurations. Figure 7 shows an example of the

functionality of the developed approach.
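The configuration-selection step described above can be sketched as follows. This is an illustrative example only, not the thesis implementation: the per-joint weights and the displacement-based cost function are hypothetical placeholders for the energy values that the developed module computes for each candidate configuration.

```java
// Illustrative sketch of energy-based configuration selection.
// The cost model below (weighted joint displacement) is hypothetical;
// the thesis module obtains an energy value per candidate configuration.
class ConfigSelector {
    // Hypothetical cost: weighted sum of absolute joint displacements from
    // the current configuration (proximal joints weighted more heavily).
    static double energyCost(double[] current, double[] candidate, double[] weights) {
        double cost = 0.0;
        for (int i = 0; i < current.length; i++) {
            cost += weights[i] * Math.abs(candidate[i] - current[i]);
        }
        return cost;
    }

    // Returns the index of the candidate with the lowest estimated energy;
    // all other candidates are discarded, as in Figure 7.
    static int selectLowestEnergy(double[] current, double[][] candidates, double[] weights) {
        int best = 0;
        double bestCost = energyCost(current, candidates[0], weights);
        for (int i = 1; i < candidates.length; i++) {
            double c = energyCost(current, candidates[i], weights);
            if (c < bestCost) { bestCost = c; best = i; }
        }
        return best;
    }
}
```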

Figure 7 Energy based selection of robot trajectory


Chapter 3: Human-robot collaboration

This chapter presents the environment developed in this research, which

enables collaboration between humans and robots. The first section

describes the methodology and system implementation of remote human-

robot collaboration, followed by the system implementation of local

human-robot collaboration in the second section.

3.1. Remote human-robot collaboration

The proposed system demonstrates the ability to identify and model any

arbitrary incoming objects to be assembled using an industrial robot. The

new objects are then integrated with the existing 3D model of the robotic

cell in the structured environment of Wise-ShopFloor [74]–[76], for 3D

model-based remote assembly.

3.1.1. System development

The system consists of four modules: (1) an application server,

responsible for image processing and 3D modelling; (2) a robot, for

performing assembly operations; (3) a network camera, for capturing

silhouettes of unknown/new objects; and (4) a remote operator, for

monitoring and controlling the entire operation of both the camera and the

robot. The system is connected to a local network and the Internet. Figure

8 shows the details of the developed system.

The network camera is mounted near the end effector of the robot.

First, the robot moves to a position where the camera is facing the objects

from above to capture a top-view snapshot. The system then constructs

the primary models of the objects by converting their silhouettes in the

top-view snapshot to a set of vertical pillars with a default initial height.

After that, the camera is used to take a sequence of new snapshots of the

objects from other angles. Projecting the silhouettes of each snapshot


back to the 3D space generates a number of trimmed pillars. The

intersections of these pillars identify the final 3D models of the objects.

Figure 9 shows the 2D layer of a simplified trimming process of one

object after the top-view snapshot, where the bounding polygon including

errors is used to approximate the actual object.

Figure 8 System overview for remote human-robot collaborative assembly

Figure 9 Shape approximation by trimming process


3.1.2. Methodology

Stage 1: Image processing

Image processing steps are performed to recognise the features of the

captured objects through their extracted silhouettes. The details of those

steps are explained below.

3.1.2.1. Capturing snapshots

An IP-enabled network camera mounted near the end-effector of the

robot is used to take a sequence of snapshots. These images are then sent

to the application server for processing.

3.1.2.2. Converting to grayscale

The colour images are converted to grayscale, to reduce computational

complexity, by taking the average of the RGB values of each pixel in the

images.
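The averaging described above can be sketched as follows, assuming pixels packed as 0xRRGGBB integers (a common representation; the thesis does not state its pixel format):

```java
// Grayscale conversion by averaging the R, G and B channels of each pixel.
// Pixels are assumed to be packed 0xRRGGBB integers.
class Grayscale {
    static int toGray(int rgb) {
        int r = (rgb >> 16) & 0xFF;
        int g = (rgb >> 8) & 0xFF;
        int b = rgb & 0xFF;
        return (r + g + b) / 3;   // average intensity, 0..255
    }

    static int[][] convert(int[][] colour) {
        int[][] gray = new int[colour.length][colour[0].length];
        for (int i = 0; i < colour.length; i++)
            for (int j = 0; j < colour[0].length; j++)
                gray[i][j] = toGray(colour[i][j]);
        return gray;
    }
}
```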

3.1.2.3. Adjusting brightness and contrast

Finding the right pixel intensity relies heavily on the lighting conditions of

the working environment and the settings of the camera. Therefore, the

brightness and contrast are adjusted based on the lighting conditions of

the developed system.

3.1.2.4. Gaussian smoothing

A zero-mean Gaussian filter is used to remove noise from the image

and improve the accuracy of the extracted silhouette. This is achieved by

applying Equation 1, where the output image 𝐻(𝑖, 𝑗) is the convolution of

the input image 𝑓(𝑖, 𝑗) and the Gaussian mask 𝑔(𝑘, 𝑙).

𝐻(𝑖, 𝑗) = 𝑓(𝑖, 𝑗) ∗ 𝑔(𝑘, 𝑙) = ∑_{k=−(n−1)/2}^{(n−1)/2} ∑_{l=−(m−1)/2}^{(m−1)/2} 𝑓(𝑖 − 𝑘, 𝑗 − 𝑙) · 𝑔(𝑘, 𝑙)    (1)

The discrete form of the convolution is performed by going through

each element of the convolution mask and multiplying it by the value of

the corresponding pixel of the input image; the sum of these

products is assigned to the pixel of the output image.
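As a minimal sketch of Equation 1, the following applies a 3×3 Gaussian mask (an assumed mask size; the thesis does not state the one it uses), clamping border pixels to the nearest valid pixel, which is one common border-handling choice:

```java
// Discrete 2D convolution of Equation 1 with a 3x3 Gaussian mask
// (n = m = 3). Border pixels are clamped to the nearest valid pixel;
// the thesis does not specify its border handling.
class GaussianSmooth {
    static final double[][] MASK = {
        {1 / 16.0, 2 / 16.0, 1 / 16.0},
        {2 / 16.0, 4 / 16.0, 2 / 16.0},
        {1 / 16.0, 2 / 16.0, 1 / 16.0}
    };

    static double[][] convolve(double[][] f) {
        int rows = f.length, cols = f[0].length;
        double[][] h = new double[rows][cols];
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                double sum = 0.0;
                for (int k = -1; k <= 1; k++) {
                    for (int l = -1; l <= 1; l++) {
                        int ii = Math.min(rows - 1, Math.max(0, i - k));
                        int jj = Math.min(cols - 1, Math.max(0, j - l));
                        sum += f[ii][jj] * MASK[k + 1][l + 1];  // f(i-k, j-l) * g(k, l)
                    }
                }
                h[i][j] = sum;
            }
        }
        return h;
    }
}
```

Because the mask weights sum to one, a uniform image is left unchanged.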


3.1.2.5. Image thresholding

This process identifies the silhouette pixels in the image by assigning

certain intensity values to them. It starts by scanning the image pixel by

pixel, comparing each pixel's intensity with a threshold value. Each

pixel is then given either a white or a black intensity value,

depending on whether it is above or below the threshold.
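A minimal sketch of this scan, assuming 8-bit intensities and a caller-supplied threshold:

```java
// Binary thresholding: each pixel becomes white (255) or black (0)
// depending on whether its intensity exceeds the threshold.
class Threshold {
    static int[][] apply(int[][] gray, int threshold) {
        int[][] binary = new int[gray.length][gray[0].length];
        for (int i = 0; i < gray.length; i++)
            for (int j = 0; j < gray[0].length; j++)
                binary[i][j] = (gray[i][j] > threshold) ? 255 : 0;
        return binary;
    }
}
```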

3.1.2.6. Silhouettes labelling

This process assigns a specific label to each silhouette in the image.

The connected-component labelling algorithm [77] is chosen due to its

efficiency. The process starts by scanning the image pixel by pixel to find

one that belongs to one of the silhouettes, followed by examining its

neighbouring pixels. If one or more neighbouring pixels already have a

label, the algorithm assigns the lowest of those labels to the pixel; otherwise,

a new label is assigned. The outcome of the labelling operation is a two-dimensional

array in which each element represents a pixel and each silhouette is

represented by a unique label.
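The labelling can be sketched as follows. Note that this sketch uses a stack-based flood fill, which produces the same outcome (one unique label per 4-connected silhouette) but is not the two-pass, equivalence-resolving algorithm of [77]:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Simplified silhouette labelling by 4-connected flood fill. The thesis
// uses the connected-component labelling algorithm of [77]; this sketch
// yields the same result (a unique label per silhouette) a different way.
class Labeller {
    static int[][] label(int[][] binary) {
        int rows = binary.length, cols = binary[0].length;
        int[][] labels = new int[rows][cols];   // 0 = background/unlabelled
        int next = 1;
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++)
                if (binary[i][j] != 0 && labels[i][j] == 0)
                    flood(binary, labels, i, j, next++);
        return labels;
    }

    private static void flood(int[][] binary, int[][] labels, int i, int j, int lab) {
        Deque<int[]> stack = new ArrayDeque<>();
        stack.push(new int[]{i, j});
        while (!stack.isEmpty()) {
            int[] p = stack.pop();
            int r = p[0], c = p[1];
            if (r < 0 || c < 0 || r >= binary.length || c >= binary[0].length) continue;
            if (binary[r][c] == 0 || labels[r][c] != 0) continue;
            labels[r][c] = lab;                 // claim this silhouette pixel
            stack.push(new int[]{r + 1, c});    // visit 4-connected neighbours
            stack.push(new int[]{r - 1, c});
            stack.push(new int[]{r, c + 1});
            stack.push(new int[]{r, c - 1});
        }
    }
}
```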

Stage 2: 3D modelling

3.1.2.7. Camera calibration

The mathematical model of the camera is defined using the pinhole

camera model [9] due to its acceptable level of approximation.

Constructing that model requires calibrating the camera to determine its

parameters and identify its physical location. The calibration covers:

(1) the focal length, (2) the optical centre, (3) the radial distortion coefficients, and

(4) the tangential distortion coefficients. Figure 10A illustrates some of the

parameters.


Figure 10 Parameters and coordinate systems for camera calibration

The camera’s position and orientation with respect to the robot’s end-

effector are described as well using a transformation matrix. Since the

camera is mounted near the end-effector of the robot, the calibration

needs to be performed only once as long as the camera has a fixed

position and orientation with respect to the end-effector.

To construct the 3D models, a coordinate frame is defined and placed

at the optical centre of the camera. The transformation matrix between

the base and the TCP (tool centre point) of the robot is known a priori,

and is combined with the transformation matrix between the TCP and the

camera's centre. Together, they define the relationship between the base

coordinate system of the robot and that of the camera. Figure 10B

describes the locations and specifications of those coordinate systems.

In addition, a 2D coordinate system must be defined in the image plane;

it specifies the locations of pixels in a captured image to simplify the

processing.
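Combining the two transforms amounts to a 4×4 homogeneous matrix product. The sketch below illustrates this; the matrix values in the usage note are hypothetical, purely for demonstration:

```java
// Composing the known base->TCP transform with the fixed TCP->camera
// transform yields the base->camera transform. Transforms are 4x4
// homogeneous matrices (3x3 rotation plus translation column).
class Transforms {
    static double[][] multiply(double[][] a, double[][] b) {
        double[][] c = new double[4][4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                for (int k = 0; k < 4; k++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }
}
```

Usage: `baseToCamera = Transforms.multiply(baseToTcp, tcpToCamera)`, where `baseToTcp` is read from the robot and `tcpToCamera` is the fixed, once-calibrated offset.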

3.1.2.8. Construction of pillars

The first snapshot taken by the camera provides the top view of the

objects. The system first extracts the silhouettes of the objects from that


snapshot; these silhouettes help the system to construct an initial

representation of the 3D models. These models are represented by a set of

pillars in 3D space, each of which corresponds to one pixel with a non-

zero value in the image plane. The construction of the initial pillars is

accomplished by applying Tsai’s camera model [78], as described in

Equations 2-6.

x′ = x/z    (2)

y′ = y/z    (3)

r² = x′² + y′²    (4)

x″ = x′(1 + k1·r² + k2·r⁴ + k3·r⁶) + 2·p1·x′·y′ + p2·(r² + 2x′²)    (5)

y″ = y′(1 + k1·r² + k2·r⁴ + k3·r⁶) + 2·p2·x′·y′ + p1·(r² + 2y′²)    (6)

where k1, k2, k3 are the radial distortion coefficients and p1, p2 are the

tangential distortion coefficients. The projected point on the image plane,

represented in UV pixel coordinates, is calculated using Equations 7-10.

fx = f × sx    (7)

fy = f × sy    (8)

u = fx × x″ + cx    (9)

v = fy × y″ + cy    (10)

Equations 7 and 8 introduce two different focal length coefficients, fx

and fy; this is because the individual pixels on a typical CCD

image sensor are rectangular. The pixel's dimensions are therefore defined by

sx and sy.
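Equations 2-10 can be combined into a single projection routine, sketched below. The intrinsic values in the test are hypothetical placeholders, and fx, fy are assumed to be already computed via Equations 7 and 8:

```java
// Projection of a 3D point (camera frame, z > 0) to pixel coordinates,
// following Equations 2-10: pinhole model with radial and tangential
// distortion. fx and fy are the pre-combined focal lengths of Eqs. 7-8.
class PinholeProjection {
    double fx, fy, cx, cy;       // focal lengths (px) and optical centre
    double k1, k2, k3, p1, p2;   // radial and tangential distortion

    PinholeProjection(double fx, double fy, double cx, double cy,
                      double k1, double k2, double k3, double p1, double p2) {
        this.fx = fx; this.fy = fy; this.cx = cx; this.cy = cy;
        this.k1 = k1; this.k2 = k2; this.k3 = k3; this.p1 = p1; this.p2 = p2;
    }

    // Returns {u, v}, the pixel coordinates of the point (x, y, z).
    double[] project(double x, double y, double z) {
        double xp = x / z;                              // Eq. 2
        double yp = y / z;                              // Eq. 3
        double r2 = xp * xp + yp * yp;                  // Eq. 4
        double radial = 1 + k1 * r2 + k2 * r2 * r2 + k3 * r2 * r2 * r2;
        double xpp = xp * radial + 2 * p1 * xp * yp + p2 * (r2 + 2 * xp * xp); // Eq. 5
        double ypp = yp * radial + 2 * p2 * xp * yp + p1 * (r2 + 2 * yp * yp); // Eq. 6
        double u = fx * xpp + cx;                       // Eq. 9
        double v = fy * ypp + cy;                       // Eq. 10
        return new double[]{u, v};
    }
}
```

With all distortion coefficients set to zero, the routine reduces to the plain pinhole projection, which makes it easy to sanity-check.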

Figure 11 illustrates the construction of the pillars. The system

introduces a temporary height for the pillars to overcome the lack of

depth information in the top view snapshot. This height is based on the

maximum height of the objects. The extra length of the pillars will be

trimmed off later during the process.


Figure 11 Construction of initial pillars

3.1.2.9. Trimming of pillars

The initial pillars need to be trimmed iteratively to construct the 3D

models that are appropriate to approximate the real objects. Figure 12

shows the trimming process of one pillar as an example.


Figure 12 Example of pillar-trimming process

The trimming process starts after analysing the second captured image

and extracting the silhouettes. First, it projects the pillars one by one

from the 3D space to the image plane. Since each pillar is represented by

two 3D end points, only these points are projected. The projection

creates two 2D points on the image plane, calculated by Equations 9 and 10.

Then, a line connecting the two 2D points is created using the Bresenham

algorithm [79]. The process continues by extracting the pixels that are

shared by the projected line and the silhouette of the object, which yields

a trimmed line. Finally, the trimmed 2D line is projected back to the 3D

space to replace the old pillar, resulting in a new, trimmed pillar. The

same trimming process is repeated for all pillars and for all snapshots.
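The line-drawing step can be sketched with a standard integer formulation of the Bresenham algorithm (one common variant), returning the pixels covered by the projected line between the two end points:

```java
import java.util.ArrayList;
import java.util.List;

// Bresenham's line algorithm [79]: rasterises the segment between the two
// projected end points of a pillar into the image-plane pixels it crosses.
class Bresenham {
    static List<int[]> line(int u0, int v0, int u1, int v1) {
        List<int[]> pixels = new ArrayList<>();
        int du = Math.abs(u1 - u0), dv = Math.abs(v1 - v0);
        int su = u0 < u1 ? 1 : -1;   // step direction along u
        int sv = v0 < v1 ? 1 : -1;   // step direction along v
        int err = du - dv;
        int u = u0, v = v0;
        while (true) {
            pixels.add(new int[]{u, v});
            if (u == u1 && v == v1) break;
            int e2 = 2 * err;
            if (e2 > -dv) { err -= dv; u += su; }
            if (e2 < du)  { err += du; v += sv; }
        }
        return pixels;
    }
}
```

Intersecting the returned pixel list with the silhouette pixels then yields the trimmed line described above.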

3.1.2.10. Solid prism representation

Despite the fact that the aforementioned processes can trim the pillars as

closely as possible to mimic the real objects, the pillars alone are neither

intuitive nor computationally efficient for 3D visualisation, because they

do not form solid geometry. Moreover, the modelled shapes need to be

compatible with the robot 3D model in Wise-ShopFloor [74]–[76]. This,


however, can be achieved by creating a solid prism representation for the

trimmed pillars.

Figure 13 Prism creation with different cut cases

Two objectives are considered here: (1) to localise the process, and (2)

to create a uniform representation of a given object. The pillars are first

divided into groups of three according to their immediate neighbouring

connectivity. The prism creation is then divided into three steps to

construct: (1) the top surface, (2) the bottom surface, and (3) the three

sides of each prism. The order of the end points is crucial when building a


surface patch of each prism, as its surface normal affects its visibility. As

shown in Figure 13, three end points in counter-clockwise order are used

to create a surface patch with outward visibility. Moreover, three cases of

pillar division (cuts) caused by the subsections of pillars are also

considered during prism creation, as illustrated in Figure 13.

3.1.3. Implementation

The system reported in Section 3.1.1 is developed in Java, by

constructing classes that handle the functionalities needed for the system.

Figure 14 illustrates the structure of those classes. The main classes

developed for the system are highlighted with numbers and described in

detail in Appendix A.

Figure 14 Class diagram for remote human-robot collaborative assembly

3.2. Local human-robot collaboration

3.2.1. System development

An active collision avoidance solution is developed based on the Wise-

ShopFloor framework [74]–[76], as shown in Figure 15. Both C++ and

Java are utilised for the system development due to their high


performance and flexibility. An industrial robot, an ABB IRB 1600, is

utilised to construct a physical human-robot collaborative assembly cell

for testing and verification. A PC with an Intel 2.7 GHz Core i7 CPU and 12 GB

of RAM, running a 64-bit Windows 7 operating system, serves as the local

server responsible for collision avoidance. With the aid of Java

3D, this collision avoidance server is utilised for image processing and for

establishing a collision-free environment. Moreover, two Microsoft Kinect

sensors (depth cameras) are installed to obtain depth images of the

operators in the robotic cell. The interface with the Kinect sensors is

implemented with the help of a C++ open-source library.

Figure 15 Active collision avoidance system

3.2.2. Methodology

The developed approach starts by calibrating the Kinect sensors and

then acquiring the depth information from them. The process continues


by determining the closest distance between the robot and any

obstacles (including operators); active collision avoidance is then

performed. Furthermore, the velocity of an approaching operator is

calculated to improve the system's responsiveness. The following

sections explain the details of the mechanism of this approach.

3.2.2.1. Kinect sensors calibration

The depth vision systems chosen in this research are Microsoft Kinect

sensors (depth cameras) with a spatial resolution of 640×480,

11-bit depth, and a field of view of 58×40 deg., which can measure

depth information from 0.5 to 5.0 m, as described in Figure 16. A

calibration of the Kinect sensors is needed to determine the distance

values. It is accomplished by measuring a particular surface from

different distances. To improve the precision of the distance calculation,

the sensor parameters in Equation 11 are optimised.

Figure 16 Kinect's detection range

d = k3 · tan(n/k2 + k1) − k4    (11)


where n is the raw 11-bit value describing the distance from the

Kinect sensor, and k1–k4 are the calibration parameters. The parameters

are computed through optimisation; their values are given in Table 2.

Table 2 Calibration parameters for the Kinect sensor

Parameter   Kinect sensor 1   Kinect sensor 2   Unit

k1          1.1873            1.18721           rad

k2          2844.7            2844.1            1/rad

k3          0.12423           0.1242            m

k4          0.0045            0.01              m
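Instantiated with the Kinect sensor 1 parameters from Table 2, Equation 11 can be sketched as:

```java
// Raw-to-metric depth conversion of Equation 11, using the calibrated
// parameters of Kinect sensor 1 from Table 2.
class KinectDepth {
    static final double K1 = 1.1873;    // rad
    static final double K2 = 2844.7;    // 1/rad
    static final double K3 = 0.12423;   // m
    static final double K4 = 0.0045;    // m

    // n is the raw 11-bit sensor value (0..2047); returns metres.
    static double depth(int n) {
        return K3 * Math.tan(n / K2 + K1) - K4;
    }
}
```

The conversion is monotonic over the sensor's working range, so larger raw values map to larger distances.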

3.2.2.2. Depth image capturing

To communicate with the depth sensors and access their raw data (colour,

depth, etc.), the libfreenect userspace driver for the Microsoft Kinect is

selected and used. The first step is initialising the driver so that all

connected Kinect sensors are detected and their depth data can be accessed;

the second step is reading the depth streams through a call-back

function. The third stage retrieves the depth frame, which is then

processed pixel by pixel in the final stage, as shown in Figure 17. As

soon as a new depth frame is available, a call-back function is

triggered to copy the data, perform pre-processing, and pass it to the

processing stage.

Figure 17 Retrieving the depth data from a Kinect sensor


Figure 18 Stages of depth data processing


3.2.2.3. Depth image processing

In this research, 3D models are utilised to represent a well-defined shop-

floor environment. A set of position sensors is linked to the collision

avoidance system to define the behaviour of the 3D models and monitor

the shop floor in real time.

The current pose of the robot can be introduced and visualised in the

human-robot shared environment by reading the joint values from the

position sensors and reflecting those readings on the 3D model of the

robot.

Concurrently, the human operator can be represented as a point cloud

with the assistance of the depth images from the Kinect sensors. Three

Kinect sensors are employed for surveillance of unstructured, unknown

objects in the robotic cell, including mobile operators who otherwise lack a

representation in the 3D space. The concept is illustrated in Figure 15.

The closest distance between the virtual 3D model of the robot

and the camera-derived information about the operator's location is utilised to detect

any collision in an augmented environment; this approach helps to

sustain rapid processing. An appropriate decision can then be made

based on the calculated minimum distance, leading to efficient

avoidance of any possible collision.

The detailed procedure of depth-image capturing and analysis is described in Figure 18. To maintain the efficiency of the system, the procedure continues by subtracting the background, captured during the calibration stage, from the depth images. This step helps to decrease the processing time. Depth information originating from the movement of the robot is subtracted as well, by projecting the robot model back onto the depth images. Consequently, only the unknown objects are kept in the captured depth information, as shown in Figure 18, where the images in the third row depict a human operator as the subject of interest after applying a noise-removal filter and a connected-component algorithm. The noise associated with the captured point cloud is then eliminated by applying a statistical outlier removal filter.
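The background-removal step described above can be sketched as follows. This is a minimal illustration, not the thesis implementation: the 16-bit depth values in millimetres and the noise tolerance are assumptions made for the example.

```java
// Hypothetical sketch of the background-removal step: a depth pixel is kept
// only if it deviates from the calibrated background by more than a tolerance.
public class DepthBackgroundRemoval {

    /**
     * Returns a depth frame in which background pixels are zeroed out.
     * depth/background hold per-pixel depth values in millimetres; the
     * tolerance absorbs sensor noise. A value of 0 marks an invalid or
     * removed pixel.
     */
    public static int[] subtractBackground(int[] depth, int[] background, int tolerance) {
        int[] out = new int[depth.length];
        for (int i = 0; i < depth.length; i++) {
            int d = depth[i];
            // keep only valid pixels that differ from the stored background
            if (d > 0 && Math.abs(d - background[i]) > tolerance) {
                out[i] = d;
            }
        }
        return out;
    }
}
```

Zeroed pixels mark background or invalid readings, so the later filtering and labelling stages only need to touch the remaining foreground.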

Identifying the human operator from three cameras is achieved by converting the captured images to point clouds represented in the robot coordinate system, then fusing them into one point cloud after registering the images. The captured point cloud of the human operator is then superimposed on the 3D model of the robot to construct an augmented environment, allowing the system to calculate the minimum distance between the point cloud and the 3D robot model. Figure 18 additionally shows the outcome of the depth-image processing at different stages.

3.2.2.4. Minimum distance calculation

The point cloud of the human operator contains a considerable amount of data, even though the size of the image data is considerably reduced after background removal. Consequently, minimising the point-cloud representation becomes essential in the implementation, which leads to enhanced system performance. Minimum bounding boxes are selected and assigned to the 3D point clouds to speed up the collision-detection calculation. A bounding box aligned with the axes of the 3D space is introduced in the developed system for smooth visualisation and simple representation using the two opposite corners of a box.
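Under the two-corner representation described above, the minimum distance between two axis-aligned boxes reduces to a per-axis gap computation. The sketch below is a generic formulation, not the thesis code:

```java
public class Aabb {
    public final double[] min, max; // two opposite corners of the box

    public Aabb(double[] min, double[] max) {
        this.min = min;
        this.max = max;
    }

    /**
     * Minimum Euclidean distance between two axis-aligned boxes;
     * returns 0 when they overlap or touch.
     */
    public static double minDistance(Aabb a, Aabb b) {
        double sq = 0;
        for (int k = 0; k < 3; k++) {
            // the gap along axis k is positive only when the intervals are disjoint
            double gap = Math.max(Math.max(a.min[k] - b.max[k], b.min[k] - a.max[k]), 0);
            sq += gap * gap;
        }
        return Math.sqrt(sq);
    }
}
```

Because each axis is handled independently, the test is cheap enough to run every frame against all tracked boxes.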

3.2.3. Implementation

The collision avoidance system presented in Section 3.2.1 is implemented in Java. A set of classes has been created to handle the functionalities needed for the system. Figure 19 shows the relationships among these classes. The main classes developed for the system are highlighted with numbers and detailed in Appendix B.


Figure 19 Class diagram for the active collision avoidance



Chapter 4: Energy minimisation of robot movement

This chapter describes the approach to minimising the robot's energy consumption by selecting the most energy-efficient robot configurations. This is achieved by constructing the robot's dynamic model and analysing the robot's trajectory from the energy perspective. The first section of the chapter explains the developed system and its components. The second section provides a description of the methodology used in the development. The last section highlights how the system is implemented.

4.1. System development

In order to fulfil the research objective, an optimisation module has been developed using MATLAB®. As shown in Figure 20, this module aims to solve five tasks: ❶ to associate the predefined trajectory of a robot's TCP (tool centre point) with its velocity and acceleration; ❷ to solve the inverse kinematics of the trajectory and determine the robot's joint configurations; ❸ to calculate both the forward and backward recursions of the robot model, the results of which are used for solving the inverse dynamics of the robot trajectory; ❹ to determine the energy consumption for each joint configuration; and ❺ to select the robot's optimal joint configuration based on the calculated values of energy consumption. Here, each joint configuration represents one set of joint angles that places the TCP in the defined position and orientation along the trajectory.


Figure 20 Energy minimisation module (legend: θn(t) = joint angle; θ'n(t), θ''n(t) = its first and second derivatives; T(t) = transformation matrix of the robot target; Conf. n = robot joint configuration; Wn = energy consumption of configuration n)

4.2. Methodology

The optimisation module is based on the mathematical model of the robot, which describes the kinematic and dynamic behaviours of the robot. After analysing the force and torque on each joint, the most energy-efficient joint configuration of the robot is determined. As an example, an ABB IRB 1600 industrial robot is chosen for the explanations below.

4.2.1. Denavit-Hartenberg (D-H) notation

By implementing the D-H notation, the kinematic coefficients are identified and the joint frames of the robot are defined. The notation assigns frames at the joints of the robot from its base to its end-effector. The z-axes of these frames are aligned with the joints' rotation axes, as shown in Figure 21. For simplification, frames 4-6 are placed at the robot wrist with the same origin.

Figure 21 (A) Robot's frame assignments, (B) D-H parameters.


4.2.2. Forward kinematics

The forward kinematics of the robot is calculated by multiplying the transformation matrices of the robot joints, as shown in Equation 12, to define the pose of the robot's end-effector with respect to its base.

$$\mathbf{T}^{0}_{TCP}(\theta_1 \ldots \theta_6) = \mathbf{T}^{0}_{1}(\theta_1)\,\mathbf{T}^{1}_{2}(\theta_2)\,\mathbf{T}^{2}_{3}(\theta_3)\,\mathbf{T}^{3}_{4}(\theta_4)\,\mathbf{T}^{4}_{5}(\theta_5)\,\mathbf{T}^{5}_{6}(\theta_6)\,\mathbf{T}^{6}_{TCP} = \begin{bmatrix}\mathbf{R}^{0}_{TCP} & \mathbf{P}\\ \mathbf{0} & 1\end{bmatrix} \quad (12)$$

where $\mathbf{T}^{i}_{j}$ is the transformation matrix between link $i$ and link $j$, $\mathbf{T}^{6}_{TCP}$ is that between joint 6 and the TCP, and $\mathbf{R}^{0}_{TCP}$ and $\mathbf{P}$ are the rotation matrix and translation vector, respectively.
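The chaining of per-joint transforms in Equation 12 can be illustrated with the standard D-H homogeneous transform. The parameter values in the usage are a generic two-link example, not the IRB 1600's actual D-H table:

```java
public class DhForwardKinematics {

    /** Standard D-H homogeneous transform for one joint. */
    public static double[][] dh(double theta, double d, double a, double alpha) {
        double ct = Math.cos(theta), st = Math.sin(theta);
        double ca = Math.cos(alpha), sa = Math.sin(alpha);
        return new double[][] {
            { ct, -st * ca,  st * sa, a * ct },
            { st,  ct * ca, -ct * sa, a * st },
            {  0,       sa,       ca,      d },
            {  0,        0,        0,      1 }
        };
    }

    /** Plain 4x4 matrix product. */
    public static double[][] multiply(double[][] A, double[][] B) {
        double[][] C = new double[4][4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                for (int k = 0; k < 4; k++)
                    C[i][j] += A[i][k] * B[k][j];
        return C;
    }

    /** Chains the per-joint transforms base-to-TCP, as Equation 12 does.
     *  Each row of dhRows holds (theta, d, a, alpha) for one joint. */
    public static double[][] forward(double[][] dhRows) {
        double[][] T = dh(dhRows[0][0], dhRows[0][1], dhRows[0][2], dhRows[0][3]);
        for (int i = 1; i < dhRows.length; i++)
            T = multiply(T, dh(dhRows[i][0], dhRows[i][1], dhRows[i][2], dhRows[i][3]));
        return T;
    }
}
```

For a planar two-link arm with unit link lengths and all joint angles zero, the TCP translation column of the chained transform lands at (2, 0, 0), as expected.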

4.2.3. Inverse kinematics

Based on the kinematic features of the robot, the first three joints control the end-effector's position and the last three joints control its orientation. The process starts by solving for the first joint angle $\theta_1$ in Equation 13, considering that $\theta_1$ changes the position of the robot's wrist in the X-Y plane, as illustrated in Figure 22.

$$\theta_1 = \begin{cases} \operatorname{atan2}(P_{yw},\; P_{xw}) \\ \operatorname{atan2}(-P_{yw},\; -P_{xw}) \end{cases} \quad (13)$$

Figure 22 The first joint angle projection on X-Y plane

The calculation continues by determining the values of the next two joints. This is achieved by using Link 2 and Link 3 to form an XY0-Z0 plane, as shown in Figure 23. It starts by calculating the value of $\cos\theta_3$ using Equation 14 and then the value of $r$ using Equation 15. Equation 16 shows the calculation of the third joint angle, yielding two possible solutions.


Figure 23 The second and third joint angles’ projection

$$\cos\theta_3 = \frac{-d_2^{\,2} + (a_2+a_3)^2 - \left[(P_{zw}-d_1)^2 + r^2\right]}{2\,(a_2+a_3)\,d_2} \quad (14)$$

$$r = \sqrt{(P_{xw}-a_1\cos\theta_1)^2 + (P_{yw}-a_1\sin\theta_1)^2} \quad (15)$$

$$\theta_3 = \begin{cases}\operatorname{atan2}\left(+\sqrt{1-\cos^2\theta_3},\;\cos\theta_3\right)\\ \operatorname{atan2}\left(-\sqrt{1-\cos^2\theta_3},\;\cos\theta_3\right)\end{cases} \quad (16)$$

The calculation then continues by finding the second joint angle $\theta_2$ using Equation 17.

$$\theta_2 = \operatorname{atan}\frac{P_{zw}-d_1}{r} - \operatorname{atan}\frac{(a_2+a_3)\sin\theta_3}{d_2+(a_2+a_3)\cos\theta_3} \quad (17)$$
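The law-of-cosines plus two-branch atan2 pattern of Equations 14-17 can be illustrated on a simplified planar two-link arm. This is a sketch under assumed geometry (links l1, l2 in a plane), not the IRB 1600 solution:

```java
public class PlanarInverseKinematics {

    /**
     * Planar two-link analogue of Equations 14-17: returns {theta1, theta2}
     * for the target point (x, y). The elbowUp flag selects one of the two
     * branches, mirroring the +/- square root of Equation 16.
     */
    public static double[] solve(double x, double y, double l1, double l2, boolean elbowUp) {
        // law of cosines, the planar counterpart of Equation 14
        double c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2);
        double s2 = Math.sqrt(1 - c2 * c2) * (elbowUp ? 1 : -1);
        double theta2 = Math.atan2(s2, c2);               // two branches, as in Eq. 16
        double theta1 = Math.atan2(y, x)
                      - Math.atan2(l2 * s2, l1 + l2 * c2); // same structure as Eq. 17
        return new double[] { theta1, theta2 };
    }
}
```

Feeding either branch back through the forward kinematics reproduces the target point, which is a convenient self-check for sign conventions.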

The orientation of the robot wrist with respect to the base is then determined using Equation 18, and the rotation matrix of the remaining joints can be calculated by Equation 19.

$$\mathbf{R}^{0}_{w} = \mathbf{R}^{0}_{3}(\theta_1,\theta_2,\theta_3)\cdot \mathbf{R}^{3}_{4}(\theta_4 = 0) \quad (18)$$

$$\mathbf{R}_{zxz}(\theta_4,\theta_5,\theta_6) = \left(\mathbf{R}^{0}_{w}\right)^{T}\cdot \mathbf{R}^{0}_{TCP}\cdot \left(\mathbf{R}^{6}_{TCP}\right)^{T} = \begin{bmatrix} r_{11} & r_{12} & r_{13}\\ r_{21} & r_{22} & r_{23}\\ r_{31} & r_{32} & r_{33} \end{bmatrix} \quad (19)$$

The configurations of the last three joints can be computed in Equations 20-25 using the Euler angles $\alpha$, $\beta$ and $\gamma$.


$$\beta = \begin{cases}\operatorname{atan2}\left(+\sqrt{r_{31}^2+r_{32}^2},\; r_{33}\right)\\ \operatorname{atan2}\left(-\sqrt{r_{31}^2+r_{32}^2},\; r_{33}\right)\end{cases} \quad (20)$$

$$\alpha = \operatorname{atan2}\!\left(\frac{r_{13}}{\sin\beta},\; \frac{-r_{23}}{\sin\beta}\right) \quad (21)$$

$$\gamma = \operatorname{atan2}\!\left(\frac{r_{31}}{\sin\beta},\; \frac{-r_{32}}{\sin\beta}\right) \quad (22)$$

$$\theta_4 = \alpha \quad (23)$$

$$\theta_5 = -\beta \quad (24)$$

$$\theta_6 = \gamma \quad (25)$$

4.2.4. Inverse dynamics

The Recursive Newton-Euler Algorithm (RNEA) [80] is adopted in this research for its reliable results. The first step is to calculate the inertia tensor matrices of the robot using the robot's 3D model together with SolidWorks® software.

The procedure of solving the inverse dynamics is divided into two steps, forward and backward recursions, as explained below.

4.2.4.1. Forward recursion

Starting from the first robot link and proceeding to the last one, the algorithm determines the linear and angular motion of each link of the robot. Since the robot starts from a standstill, the initial values of the velocities and accelerations are set to zero. The angular velocity $\omega_i$ and angular acceleration $\alpha_i$ of link $i$ are then calculated, together with the linear accelerations $a_i$ and $a_{ci}$ of link $i$, using Equations 26-29.

$$\omega_i = \left(\mathbf{R}^{i-1}_{i}\right)^{T}\omega_{i-1} + z_i\,\dot{\theta}_i \quad (26)$$

$$\alpha_i = \left(\mathbf{R}^{i-1}_{i}\right)^{T}\alpha_{i-1} + z_i\,\ddot{\theta}_i + \omega_i \times z_i\,\dot{\theta}_i \quad (27)$$

$$a_i = \left(\mathbf{R}^{i-1}_{i}\right)^{T}a_{i-1} + \alpha_i \times r_{i-1,i} + \omega_i \times (\omega_i \times r_{i-1,i}) \quad (28)$$

$$a_{ci} = \left(\mathbf{R}^{i-1}_{i}\right)^{T}a_{i-1} + \alpha_i \times r_{i-1,ci} + \omega_i \times (\omega_i \times r_{i-1,ci}) \quad (29)$$


4.2.4.2. Backward recursion

The process continues by obtaining the force and torque that affect the movement of each joint. The calculation begins with the last link and ends with the base of the robot. Using the angular velocity and acceleration values calculated in the previous step, the gravity vector $g_0$ is represented in the frame of each link in Equation 30.

$$g_i = \left(\mathbf{R}^{0}_{i}\right)^{T} g_0 \quad (30)$$

The force $f_i$ and torque $\tau_i$ of link $i$ are calculated using Equations 31-32. At the same time, the external force $f_{N+1}$ and torque $\tau_{N+1}$ applied to the robot's end-effector are considered implicitly. The mechanical losses are modelled through Coulomb and viscous friction with respective coefficients $\tau_{ci}$ and $\tau_{vi}$.

$$f_i = \mathbf{R}^{i}_{i+1} f_{i+1} + m_i\,(a_{ci} - g_i) \quad (31)$$

$$\tau_i = \mathbf{R}^{i}_{i+1}\tau_{i+1} - f_i \times r_{i-1,ci} + \left(\mathbf{R}^{i}_{i+1} f_{i+1}\right) \times r_{i,ci} + \omega_i \times (I_i\,\omega_i) + I_i\,\alpha_i + \tau_{ci}\operatorname{sign}(\dot{\theta}_i)\,z_i + \tau_{vi}\,\dot{\theta}_i\,z_i \quad (32)$$

4.2.5. Energy consumption

First, the power consumed at each joint in a certain time interval $k$ is calculated. Then, the power consumption of all joints is accumulated to obtain the total power consumption of the robot, as described in Equations 33 and 34.

$$P_i(k) = \tau_i(k)\cdot\dot{\theta}_i(k) \quad (33)$$

$$P(k) = \sum_{i=1}^{n} P_i(k) \quad (34)$$

The process continues by computing the energy consumption in Equation 35, where $dt_k$ is the duration of time interval $k$ along the robot path.

$$E = \int_{t_0}^{t_M} P(t)\,dt \cong \sum_{k=0}^{M} P(k)\,dt_k \quad (35)$$
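Equations 33-35 amount to summing sampled joint power and integrating over time. A minimal sketch, assuming uniformly spaced samples (the thesis allows a per-sample duration $dt_k$):

```java
public class EnergyEstimator {

    /**
     * Implements Equations 33-35: instantaneous joint power tau * thetaDot,
     * summed over all joints per sample (Eq. 34) and integrated over the
     * sampled path (Eq. 35, here with a uniform time step dt).
     * tau[k][i] and thetaDot[k][i] index time sample k and joint i.
     */
    public static double energy(double[][] tau, double[][] thetaDot, double dt) {
        double e = 0;
        for (int k = 0; k < tau.length; k++) {        // time samples
            double p = 0;
            for (int i = 0; i < tau[k].length; i++)   // joints
                p += tau[k][i] * thetaDot[k][i];      // Eq. 33 and Eq. 34
            e += p * dt;                              // Eq. 35
        }
        return e;
    }
}
```

Evaluating this sum for every candidate joint configuration gives the per-configuration energies $W_n$ from which the optimiser picks the minimum.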

4.2.6. Energy optimisation

Energy optimisation is performed as a final step to select the most energy-efficient robot configuration from the calculated configurations to perform the assembly tasks.


4.3. Implementation

A set of modules in MATLAB® has been created to handle the functionalities needed for the system. Figure 24 shows the relationships among these modules. The main modules developed for the system are highlighted with numbers and detailed in Appendix C.

Figure 24 Module diagram for energy minimisation


Chapter 5: Case studies and experimental results

This chapter describes the validation experiments and testing results of the developed system. The first section presents the case studies used to evaluate the remote human-robot collaborative assembly. The second section clarifies the case studies and results used to evaluate the functionalities of the local human-robot collaborative assembly; this evaluation is based on the risk assessments reported in the ISO safety standards. The last section explains the experimental results of the energy minimisation module.

5.1. Remote human-robot assembly

The remote assembly modules and functions are implemented in Java and integrated into the Wise-ShopFloor. The application server utilised in the approach runs on a computer with an Intel Core i5 2.8 GHz processor and 4 GB RAM, running the Windows Vista Home Basic operating system.

5.1.1. Case study for human-robot remote assembly

Four basic objects are selected for a case study to assess the practicality of the developed system. Figure 25 shows the experimental results of the main stages of the image processing and 3D model generation.

In this case study, seven snapshots were taken for 3D model generation. Increasing the number of snapshots from different angles could improve the quality of the generated 3D models; however, the processing time would be longer.


Figure 25 Results of case study for 3D modelling and remote assembly

To evaluate the developed system, a hypothetical scenario is envisioned. A remote operator "assembles" the models one on top of another to create a stack using a 3D robot model. Concurrently, the real robot moves the actual objects accordingly. During the assembly, the needed control commands are transmitted from the virtual robot to the real robot automatically, without the need for additional robot programming.

Additional features have been developed to improve the cycle time of the assembly operations by introducing vision-based tools to pick and place the objects. By analysing the top-view snapshot, the system is able to calculate the centres of gravity of the objects and generate suitable robot trajectories for pick-and-place.

(Figure 25 panels: ❶ initially no 3D models; ❷ construction of pillars; ❸ during trimming process; ❹ three modelled parts; ❺ during remote assembly; ❻ final assembled parts)


5.1.2. Results of remote human-robot assembly

The system has been verified ten times, and the average computation time for each processing step was computed. Figure 26 shows the percentage of the processing time of each step with respect to the total time required to perform the image processing. It is found that although the system can process an image in about 4 s, the silhouette labelling consumes 36.4% of the total processing time. This high share is due to the nature of the labelling algorithm, which scans the image pixel by pixel and examines the neighbouring pixels of each pixel.
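The pixel-by-pixel, neighbour-examining behaviour described above can be sketched with a flood-fill connected-component labelling. This is a generic 4-connectivity version for illustration, not necessarily the exact algorithm used in the thesis:

```java
import java.util.ArrayDeque;

public class SilhouetteLabeller {

    /**
     * Flood-fill connected-component labelling (4-connectivity) of a binary
     * silhouette mask; returns per-pixel labels starting at 1, with 0 for
     * background. mask is stored row-major with the given width and height.
     */
    public static int[] label(boolean[] mask, int width, int height) {
        int[] labels = new int[mask.length];
        int next = 0;
        ArrayDeque<Integer> stack = new ArrayDeque<>();
        for (int start = 0; start < mask.length; start++) {
            if (!mask[start] || labels[start] != 0) continue;
            next++;                       // found an unlabelled foreground pixel
            labels[start] = next;
            stack.push(start);
            while (!stack.isEmpty()) {
                int p = stack.pop();
                int x = p % width, y = p / width;
                int[] neigh = { p - 1, p + 1, p - width, p + width };
                boolean[] valid = { x > 0, x < width - 1, y > 0, y < height - 1 };
                for (int n = 0; n < 4; n++) {
                    // spread the current label to in-bounds foreground neighbours
                    if (valid[n] && mask[neigh[n]] && labels[neigh[n]] == 0) {
                        labels[neigh[n]] = next;
                        stack.push(neigh[n]);
                    }
                }
            }
        }
        return labels;
    }
}
```

Since every pixel is visited and each of its four neighbours is inspected, the cost grows with the image area, which is consistent with labelling dominating the measured processing time.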

Figure 26 Comparison of computation time of the processing steps

Figure 27 Modelling errors vs. number of snapshots processed

(Figure 26's vertical axis shows processing time in msec, up to 3000. Figure 27 plots modelled versus actual height in mm, from 18 to 24 mm, over snapshots 2-7, with the cumulative processing time of each snapshot, from 1.8 to 6 s, indicated in the graph.)


To perform the 3D modelling, seven snapshots were taken, and the results of the pillar trimming in each snapshot were recorded. Figure 27 shows the difference between the actual pillar height of an object and that of its 3D model after processing each snapshot. The accuracy of pillar trimming is quite high, as the error converges quickly to a small value after snapshot 7, in about 24 s.

5.2. Local human-robot collaborative assembly

5.2.1. Minimum safe distance

The distance between the human and the robot in a shared environment is the key factor that describes the level of collaboration between them. On the other hand, working in close proximity to the robot increases the probability of collisions leading to serious injuries. Therefore, the challenge is to develop a system that can provide a safe and collaborative environment for the operator and the robot.

To control the robot effectively, it is important to determine the safe distance between the human and the robot. This distance is the key factor for selecting the suitable safety strategy for the developed system and controlling the robot properly. As Figure 28 illustrates, different aspects affect the calculation of the safe distance: the first is the approaching speed of the human operator, and the second is the response time of the control system. Based on the ISO 13855:2010 standard [81], Equation 36 shows how the safe distance is calculated.

Figure 28 Parameters used for determining the minimum safe distance

(Legend: Ds = minimum safe distance; K = human approaching speed; Td = safety decision-making time; Ts = robot stopping time. The monitored zone separates the approaching operator from the industrial robot.)


$$D_s = K \times (T_d + T_s) + C \quad (36)$$

where:

Ds = the minimum safe distance, in millimetres (mm)

K = the maximum speed at which the human can approach the robot, in millimetres per second (mm/s)

Td = the time between detecting the distance and issuing a control command, in seconds (s)

Ts = the response time of the robot, which is the time required for the robotic system to react, in seconds (s)

C = a parameter used to adjust the proximity between the human and the robot based on the level of collaboration; the value should be equal to or greater than zero, in millimetres (mm)

By examining the distance formula, it is found that the distance relies on the following factors:

1. The approaching speed of the operator toward the robot (K); the following are the typical types of approach considered in the formula:

a. The operator approaching the robot by walking. Walking speed varies from one person to another, as it depends on factors such as age, body type, physical condition and personality. To fulfil the safety requirement, the maximum walking speed needs to be considered.

b. The movement of the human's upper extremity; this relies on the physical features of the human arms. Therefore, the maximum movement needs to be considered.

2. The maximum response time of the detection system (Td). This represents the time between the detection of a human by the Kinect sensors and the generation of proper robot control commands. It can be decomposed into two portions: the first is the maximum processing time for calculating the closest distance between the human and the robot, and the second is the maximum decision-making time of the collision avoidance system.

3. The stopping time of the robot (Ts). This represents the maximum time between issuing the control command from the collision avoidance system and obtaining the physical response from the robot.

In addition, abnormal approaches such as running, jumping and falling are not considered in the formula. To handle these scenarios, two additional measures need to be taken:

a. A set of clear warnings and safety instructions must be in place where the operator can see them before approaching the robot.

b. Hard-wired sensing devices need to be installed to detect these kinds of behaviour and react immediately by stopping the robot completely.
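Equation 36 itself is a one-line computation. The sketch below guards the standard's requirement that C be non-negative; the parameter values in the usage are illustrative, not measured ones:

```java
public class SafeDistance {

    /**
     * Minimum safe (protective separation) distance in the style of
     * ISO 13855, Equation 36: Ds = K * (Td + Ts) + C.
     * approachSpeed in mm/s, times in s, intrusionMargin C in mm.
     */
    public static double minimumSafeDistance(double approachSpeed, double decisionTime,
                                             double stopTime, double intrusionMargin) {
        if (intrusionMargin < 0)
            throw new IllegalArgumentException("C must be equal to or greater than zero");
        return approachSpeed * (decisionTime + stopTime) + intrusionMargin;
    }
}
```

For example, an approach speed of 1600 mm/s with Td = 0.1 s, Ts = 0.3 s and C = 200 mm yields a safe distance of 840 mm; all four numbers here are assumed for illustration.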

5.2.2. Safe speed of robot

Controlling the robot speed is the first step in establishing a collaborative environment that allows the operator to work side by side with the robot and share tasks and resources. To facilitate this, the robot speed needs to be reduced to an acceptable safety level. This reduction needs to take into consideration the several human and robot parameters highlighted in Figure 29.

Figure 29 Parameters used for calculating the robot's safe speed

Based on the ISO/TS 15066:2016 standard [82], the maximum permissible robot speed is then calculated using Equation 37.

$$V_{max} = \frac{p_{max}\, A}{\sqrt{\left(\dfrac{1}{m_H} + \dfrac{1}{m_R}\right)^{-1} k\,}} \quad (37)$$

where:

A = the area of contact between the robot and the human body region

Vmax = the maximum permissible speed between the robot and the human body region

mH = the effective mass of the operator's body region, in kilograms (kg)

mR = the effective mass of the robot, in kilograms (kg)

pmax = the maximum permissible pressure value, in newtons per square centimetre (N/cm²)

k = the effective spring constant, in newtons per millimetre (N/mm)

It is clear that different aspects affect the speed calculation. The following is a description of these aspects:

1. The human biomechanical limits: Several approaches [83]–[85] have been reported to analyse and assess the limits of the different parts of the human body. These studies focused on addressing the potential injuries caused by an external object colliding with different regions of the human body. In addition, the limits for permissible forces and pressures applied to those regions are identified. The results of these studies were also highlighted in the recent guidance of ISO/TS 15066:2016 [82].

Based on these studies, the different regions of the operator's body have been examined and ranked based on the level of danger in a collision. Figure 30 shows the operator body model with the level of danger for each part.

Figure 30 Levels of danger for the operator's body regions

(The figure ranks the body regions, front and rear, from the middle of the forehead, temple and neck muscle down to the kneecap, shin and calf muscle, on a danger scale of low, moderate, high and very high.)


Figure 31 Maximum permissible pressures applied on human body

Figure 32 Maximum permissible forces applied on human body

(Both charts cover the body regions of Figure 30; Figure 31 plots the maximum permissible pressure in N/cm², up to about 600, and Figure 32 the maximum permissible force in N, up to about 450, per region.)


Furthermore, the permissible pressures applied to different parts of the human body are determined. Figure 31 presents the maximum permissible pressures applied to different regions of the body, and Figure 32 shows the limits of the forces that can be applied to the human body.

2. The area of contact between the human and the robot: When the human and the robot collide, they make contact over certain areas of their surfaces. These areas are known as the contact areas, and they affect the robot speed control depending on the collision scenario. Therefore, the following cases of collision are considered to determine the value of the area:

a. If the human and the robot collide at a single point, then the smallest area of contact is selected, whether it is on the surface of the operator or of the robot. For instance, if the operator's hand collides with the robot, then the contact area of the hand is selected.

b. If the human and the robot collide at multiple points, then the smallest area of contact among these points is considered.

3. The spring constants of the human body parts: These constants have different values for different parts of the operator's body. The values are determined based on the human body features listed in a table in the ISO/TS 15066:2016 standard [82].

4. The effective mass of the human body part: Different parts of the operator's body have different effective mass values; these values are shown in Figure 33. It is assumed that the operator will be standing up during the usual human-robot collaboration. Therefore, the thighs, knees and lower legs of the operator carry the impact of the operator's full body, and this is reflected in the mass values of these parts.

5. The effective mass of the robot: This parameter can be determined based on the robot posture and motion using Equation 38.

$$m_R = \frac{M}{2} + m_L \quad (38)$$

where:

M = the total mass of the moving parts of the robot

mL = the effective payload of the robot, which includes both the gripper and the workpiece.
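Equations 37 and 38 can be combined into a small calculator. This is a sketch only: the numeric values in the usage are illustrative, and consistent units across pmax, A, k and the masses are assumed:

```java
public class SafeSpeed {

    /** Effective robot mass, Equation 38: mR = M/2 + mL. */
    public static double effectiveRobotMass(double movingMass, double payloadMass) {
        return movingMass / 2.0 + payloadMass;
    }

    /**
     * Maximum permissible speed, Equation 37:
     * Vmax = pmax * A / sqrt(mu * k), where mu = (1/mH + 1/mR)^-1 is the
     * reduced (two-body) mass of the human body region and the robot.
     */
    public static double maxSpeed(double pMax, double area, double mH, double mR, double k) {
        double mu = 1.0 / (1.0 / mH + 1.0 / mR);
        return (pMax * area) / Math.sqrt(mu * k);
    }
}
```

Because the reduced mass mu is dominated by the lighter of the two bodies, a small body region such as a hand forces a lower permissible speed than a massive one, which matches the intent of the standard's formula.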

Figure 33 The effective mass values of the human's body regions [82]

5.2.3. Safety scenarios for active collision avoidance

Two safety scenarios are introduced to provide a collision-free environment during human-robot collaboration. The first is to stop the movement of the robot if operators are perceived at close range and resume the robot motion once the operators walk away. Tasks with limited degrees of freedom (for example, inserting operations in assembly) can profit from this scenario. The second scenario, shown in Figure 34 (A), is to dynamically modify a robot trajectory to avoid collision with unidentified obstacles (including humans). This scenario is well suited to applications such as material transfer, where the path modification has less influence on the operations while keeping the humans safe.


Figure 34 (A) Collision-free trajectory planner, (B) Trajectory modification to avoid collision

Detecting an obstacle along the planned robot path activates the system to step in and dynamically adjust the robot trajectory. Based on the calculated distance to the obstacle, the robot trajectory is modified during the robot movement. Figure 34 (B) shows the vectors needed for dynamically controlling the robot path. The collision vector c is defined as the vector from the robot's end-effector to the nearest obstacle. The robot's direction of movement is represented by the vector vc, which can be decomposed into a parallel component vc∥c and a perpendicular component vc⊥c with respect to the collision vector c. The parallel component is determined in Equation 39 by calculating the dot product between the movement vector and a collision versor pointing in the direction of the collision vector; the collision versor is a unit vector sharing the direction of the collision vector. The perpendicular component is then computed using Equations 40 and 41. The parallel component, which represents the motion approaching the obstacle, is modified to prevent any possible collision. The modification is supported by the computation of the distance between the obstacle and the robot end-effector, resulting in a vector va∥c. A new vector of modified robot motion can then be generated from vc⊥c and va∥c, as shown in Equation 42.

vc∥c = (vc · ĉ) ĉ (39)


60 CHAPTER 5: CASE STUDIES AND EXPERIMENTAL RESULTS

ĉ = c / |c| (40)

vc⊥c = vc − vc∥c (41)

va = va∥c + vc⊥c (42)
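Equations 39–42 amount to replacing the component of the movement vector that points at the obstacle. The following is a minimal sketch in plain Python with list-based 3D vectors; the function and argument names are illustrative, not taken from the thesis implementation.

```python
import math

# Small helpers for 3D vectors represented as plain lists.
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def scale(a, s): return [x * s for x in a]
def sub(a, b): return [x - y for x, y in zip(a, b)]
def add(a, b): return [x + y for x, y in zip(a, b)]

def modified_motion(v_c, c, v_parallel_new_mag):
    """Replace the component of v_c along the collision vector c (Eqs. 39-42)."""
    c_hat = scale(c, 1.0 / math.sqrt(dot(c, c)))  # Eq. 40: collision versor
    v_par = scale(c_hat, dot(v_c, c_hat))         # Eq. 39: vc parallel to c
    v_perp = sub(v_c, v_par)                      # Eq. 41: vc perpendicular to c
    v_a_par = scale(c_hat, v_parallel_new_mag)    # modified parallel component
    return add(v_a_par, v_perp)                   # Eq. 42: modified motion va
```

For example, a motion [1.0, 0.5, 0.0] toward an obstacle along the x-axis, with the parallel magnitude reduced to −0.5, yields [−0.5, 0.5, 0.0]: the approach is reversed while the sideways motion is preserved.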

Taking the distance-to-obstacle ||c|| into consideration, Figure 35 illustrates how the parallel component deviates from the movement vector. The colour distribution indicates the modified value va∥c of the parallel component along the collision vector. Line ① is the predicted movement toward the obstacle and line ② is the predicted movement in the opposite direction. Two threshold values dth1 and dth2 are defined to change the parallel component of the movement vector. The component remains unchanged as long as the distance-to-obstacle ||c|| is greater than dth2. When ||c|| is smaller than dth1, the parallel component receives a defined negative value, pointing away from the obstacle. When dth1 ≤ ||c|| ≤ dth2, the parallel component is modified linearly, as shown in Figure 35, to maintain the steadiness of the movement.
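The piecewise rule of Figure 35 can be sketched as follows; `v_retreat`, the defined negative value used below dth1, is an assumed parameter name for illustration.

```python
def parallel_magnitude(dist, v_par, d_th1, d_th2, v_retreat):
    """Modified magnitude of the parallel component as a function of the
    distance-to-obstacle (the piecewise-linear rule sketched in Figure 35)."""
    if dist >= d_th2:          # far from the obstacle: keep the component
        return v_par
    if dist <= d_th1:          # inside the inner threshold: back away
        return v_retreat
    # between the thresholds: interpolate linearly for a steady motion
    t = (dist - d_th1) / (d_th2 - d_th1)
    return v_retreat + t * (v_par - v_retreat)
```

The linear blend between the two thresholds avoids any jump in the commanded velocity when the obstacle crosses a threshold boundary.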

Figure 35 Modification of movement vector

Based on the calculated shortest distance-to-obstacle, one of four safety strategies is activated to control the robot. Figure 36 describes four cases where the strategies can be utilised: ① an audio warning is triggered once an operator enters the monitored area, and at the same time the speed of the robot is decreased to prepare for a full stop; ② a retrievable stop interruption is sent to the robotic system if the human enters a defined hazard zone; ③ when the human continues to walk towards the robot (e.g. for inspection), the robot arm moves away automatically to maintain a safe distance from the operator and avoid any possibility of collision; and ④ the current robot trajectory is modified to actively prevent any collision with the operator while the robot is moving. Once the operator exits the monitored area, the robot (in cases ② and ③) resumes the task from where it stopped. Applying the four safety strategies ensures minimum downtime on the robot side, as it replaces traditional emergency stops with recoverable temporary interruptions. At the same time, the operator can step in and out of the robotic cell freely and safely even while the robot is in operation, which strengthens productivity on the human side. As expected, the last three safety strategies are especially important for operators performing shared tasks with robots. Switching between the strategies is possible at any moment.
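The distance-banded choice among the four strategies can be sketched as below; the threshold values are assumptions for illustration, not the thesis's calibrated distances.

```python
def safety_strategy(dist_m, warn_d=2.0, hazard_d=1.0, contact_d=0.5):
    """Select a safety strategy from the shortest distance-to-obstacle.
    Thresholds (metres) are illustrative assumptions."""
    if dist_m >= warn_d:
        return "normal operation"
    if dist_m >= hazard_d:
        return "audio warning + speed reduction"   # strategy 1
    if dist_m >= contact_d:
        return "retrievable stop"                  # strategy 2
    return "move away / modify trajectory"         # strategies 3 and 4
```

Because the selection depends only on the current distance, the controller can switch between strategies at any moment, as the text describes.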

Figure 36 The human-robot safety strategies

5.2.4. Collaboration scenarios

A clear example of the collaboration scenario is shown in Figure 37, where the robot follows the operator's hand to provide essential assistance during a shared assembly operation. Another collaboration case is that the robot sustains a safe distance from the collaborating operator during assembly. In Figure 38, for example, the robot follows the human hand to facilitate a shared task. The operator's responsibility is to coordinate the robot during the assembly; the collision avoidance system's responsibility is to control the robot when picking and delivering assembly parts. Switching between the following and avoiding behaviours of the robot can be implemented by a simple button press or through a voice command, which provides consistent collaboration between the human and the robot.

Figure 37 Case study for human-robot collaboration


Figure 38 Experimental setup for human-robot collaboration

5.2.5. Results for local human-robot collaboration

Additional analyses of the developed system indicate the possibility of determining the velocity of any foreign object in the robotic cell. The calculation of the object's velocity is based on tracking the object's position in 3D space over time. To assess the accuracy of the velocity calculation, an experimental setup was established by mounting a pendulum in the robotic cell and tracking its movement during steady-state swinging. Equations 43 and 44 describe the calculation of the maximum velocity of the pendulum mathematically. The calculation is then compared with the actually measured velocity. Figure 39 shows the pendulum setup in the robotic cell and Figure 40 depicts the measured velocity of the pendulum during the experiment.

PE + Ek = const → mgh + mv²/2 = const → mghmax = mvmax²/2 (43)

vmax = ±√(2ghmax) = ±√(2 × 9.80665 m/s² × 0.2 m) = ±1.98 m/s (44)

where PE is the potential energy, Ek is the kinetic energy, m is the mass of the pendulum, g is the gravitational acceleration, h is the pendulum position above the reference level, hmax is the pendulum position at its highest point, v is the pendulum velocity, and vmax is the maximal velocity of the pendulum at its lowest point.
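Equation 44 can be checked numerically:

```python
import math

def pendulum_vmax(h_max, g=9.80665):
    """Peak pendulum speed from energy conservation (Equation 44)."""
    return math.sqrt(2 * g * h_max)

v_max = pendulum_vmax(0.2)  # h_max = 0.2 m as in the experiment, approx. 1.98 m/s
```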


Figure 39 Experimental setup for velocity measurement

It is found that the measurement suffers from a degree of uncertainty and statistical noise. Therefore, a Kalman filter [86] is implemented to reduce the noise. It uses a recursive approach to estimate the object velocity based on a series of measurements captured by a Kinect sensor over time. Using this filter leads to more accurate measurement results.
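A scalar Kalman filter of the kind referenced in [86] can be sketched as follows; the process-noise and measurement-noise values are assumptions, not the tuning used with the Kinect data.

```python
def kalman_1d(measurements, q=1e-3, r=0.05, x0=0.0, p0=1.0):
    """Recursively smooth noisy scalar velocity readings.
    q: process noise, r: measurement noise (both assumed values)."""
    x, p, out = x0, p0, []
    for z in measurements:
        p = p + q                 # predict (constant-velocity model)
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update the estimate with measurement z
        p = (1 - k) * p           # update the estimate covariance
        out.append(x)
    return out
```

Fed a series of noisy readings around a true velocity, the estimate converges toward that value while damping the measurement noise.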


Figure 40 Results for velocity measurement using Kinect sensor

The results show that the system is able to detect the velocity of any foreign object and control the robot accordingly. To integrate the results of this section with the developed system, additional implementation is needed. For instance, detecting the velocity allows the developed system to predict the position of the obstacle, which can help handle high-speed human motions and compensate for processing delays. Additional results related to the developed active collision avoidance system are reported in [87].
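In its simplest constant-velocity form, the position prediction mentioned above reduces to:

```python
def predict_position(pos, vel, dt):
    """Constant-velocity prediction of an obstacle's position dt seconds ahead,
    used to compensate sensing and processing delays."""
    return [p + v * dt for p, v in zip(pos, vel)]
```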

5.3. Robot energy minimisation

5.3.1. Single trajectory energy minimisation

5.3.1.1. Straight line scenario for weight carrying

A straight-line segment starting at (596, 0, 662) and ending at (596, -211, 580) was examined as a weight-carrying movement in an assembly operation. The performance of the optimisation module was evaluated with and without a payload, based on a comparison between the optimisation module and the robotic simulation software RobotStudio®. Table 3 shows that the module resulted in a more energy-efficient configuration.


Table 3 Results of straight-line path from the optimisation module

Start conf. (0, 0, 0, x), start joint values: 0, 5.95, 0, 7.42, 13.38, 0
End conf. (-1, 0, 0, x), end joint values: 52.25, 6.72, 55.28, 6.28, 22.27, 21.29

Payload | Energy, module [J] | Energy, RobotStudio [J]
0 kg    | 19.6               | 28.7
6 kg    | 27.5               | 38.4

5.3.1.2. Square path scenario for friction-stir welding

The second case study emulates friction-stir welding along a square-shaped path. Similar to the previous case, the optimisation module was tested with and without a payload. The results of energy-optimised path following with a payload are shown in Figure 41.

Tables 4 and 5 summarise the results from the optimisation module and from RobotStudio®, respectively. Due to the optimisation of robot dynamics in terms of joint configurations, differences in energy consumption are clear from the results.

Table 4 Square path scenario results from the optimisation module

Corner | Joint values (deg)
1      | 22.08, 46.61, 30.56, 4.49, 37.83, 21.69
2      | 20.27, 54.73, 32.60, 22.54, 27.45, 26.10
3      | 29.59, 55.62, 45.16, 12.63, 33.22, 35.82
4      | 32.88, 47.18, 43.57, 5.17, 43.19, 30.37

Energy consumption [J]: 68.7 without payload, 71.3 with payload
Optimal configuration: (-1, 1, 0, x)

Table 5 Square path scenario results from RobotStudio®

Joint configuration | Energy without payload [J] | Energy with payload [J]
(-1, 1, 0, x)       | 80.5                       | 88.8
(1, 1, 0, x)        | 150.0                      | 158.0
(-1, -1, 1, x)      | 104.9                      | 112.5
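The selection step behind these comparisons, picking the configuration with the lowest simulated energy, reduces to a minimum over candidates; the dictionary below reuses the RobotStudio® figures of Table 5 (without payload).

```python
# Energies in joules without payload, from Table 5 (RobotStudio figures).
candidates = {
    "(-1, 1, 0, x)": 80.5,
    "(1, 1, 0, x)": 150.0,
    "(-1, -1, 1, x)": 104.9,
}
best = min(candidates, key=candidates.get)  # lowest-energy configuration
```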


Figure 41 Energy minimisation results for square shape case study


5.3.2. 3D energy map

In this case study, the square path was examined at different locations within the robot's workspace, with the energy consumption optimised at each location. Figure 42 depicts a 3D energy map of the robot, where energy consumption is represented in colours from green to red. Using the energy map as a guide, an engineer can place a workpiece at a low-energy location in a robotic assembly cell; areas in red or close to red should clearly be avoided for energy saving.
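Consulting the energy map then amounts to a lookup for the lowest-energy location; the grid values below are illustrative placeholders, not readings from Figure 42.

```python
# Illustrative energy map: workspace location (x, y, z) in metres mapped to
# optimised energy in joules. Values are placeholders, not data from Figure 42.
energy_map = {
    (0.6, 0.0, 0.7): 463.0,
    (0.6, -0.4, 0.5): 201.0,
    (0.3, 0.3, 0.9): 350.0,
}

def best_placement(emap):
    """Return the workspace location with the lowest optimised energy."""
    return min(emap, key=emap.get)
```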

Figure 42 An energy map in the workspace of an ABB IRB 1600 robot


Figure 43 Case study paths for analysing the 3D energy map

5.3.3. Energy measurement in predefined paths

A case study emulating an assembly task that requires moving the robot along a certain trajectory is introduced. As shown in Figure 43, this path is tested at two different locations to examine the validity of the energy map: two identical paths placed in the "green" and "red" zones, respectively, are considered.

Table 6 lists the robot's joint values for the first target in the first and second paths.

Table 6 The joint values (deg) of the experiment paths with corresponding simulated energy consumption

Path   | Joint 1 | Joint 2 | Joint 3 | Joint 4 | Joint 5 | Joint 6 | Energy [J]
First  | 53      | 38      | -13     | 30      | -91     | -139    | 463
       | -126    | 68      | -2      | 90      | 149     | 131     | 434
Second | -6      | -9      | -29     | 45      | -22     | -153    | 201
       | 173     | 110     | 1       | 18      | 121     | 79      | 451


Figure 44 The results of the energy optimisation for the case study paths

The case study is introduced to the optimisation module as well as to RobotStudio®, whose results are recorded for later comparison. The results of energy-optimised path following are shown in Figure 44.

Furthermore, the case study paths are performed on the real robot (ABB IRB 1600) to evaluate the accuracy of the optimisation. The results are illustrated in Figure 45.


Figure 45 The measurements on the real robot of the case study paths

The measurements are obtained using a 3-phase voltage module (NI 9244) and a current module (NI 9238) from National Instruments™. Table 7 summarises the results from the measurements, the optimisation module, and RobotStudio®. Due to the different placements of the path, a difference in energy consumption is clear from the results. These results indicate that the energy map presented in Figure 42 is valid and can be used for energy optimisation of the robot.

Table 7 Case study paths energy consumption comparison

Path   | Measured [J] | RobotStudio® [J] | Optimisation module [J]
First  | 635          | 170              | 463
Second | 275          | 60               | 201
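The agreement between measurement and prediction in Table 7 can be seen by comparing the relative saving of the low-energy placement:

```python
# Relative energy saving of the second (low-energy) path, using Table 7 data.
def saving(high, low):
    """Fraction of energy saved by moving from the high- to the low-energy zone."""
    return (high - low) / high

measured = saving(635, 275)  # real-robot measurements
module = saving(463, 201)    # optimisation-module predictions
```

Both the measured and the predicted figures give a saving of roughly 57 %, which supports the validity of the energy map.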


Chapter 6: Conclusions and future work

This chapter summarises the conclusions and the future work related to the approaches developed in this research. Based on the results in Chapter 5, several conclusions have been drawn; these are reported in the first section of this chapter. The second section identifies the limitations of the current implementation that need further improvement and states the plans for future work.

6.1. Conclusions

This PhD study is focused on developing a sustainable and safe human-

robot collaborative production environment. It includes three systems

that can be deployed on the shop floor to enhance the sustainability of

production lines from the environmental and societal perspectives. The

results of the development show that the systems are capable of achieving

their intended objectives.

The first system provides a novel approach for remote operators to monitor and control a physical industrial robot in a virtual robotic environment. The 3D models of the components to be assembled are constructed from snapshots captured by a robot-mounted IP camera. Due to real-time network speed constraints, the camera cannot be used during assembly, hence this vision-assisted, 3D model-based approach. It allows the operator to perform remote assembly tasks using robots located in hazardous sites. A case study has been performed to evaluate the performance of the system. The experimental results show that the developed system can generate a set of 3D models in about 24 seconds based on seven snapshots. The efficiency can be improved by parallel image processing during the travel time while the robot is moving to the next snapshot position. Additional snapshots can be utilised to improve the modelling accuracy of more complex objects.

The accuracy of a 3D model may be attributed to: (1) camera

calibration, (2) camera resolution, and (3) the distance between an object

and the camera.

The second system is an effective tool for actively avoiding collisions between humans and robots, providing reliable and coherent safety protection in a human-robot collaborative environment. The aim of the developed system is to enhance the overall performance of the robotic system. This is achieved by fusing the 3D models of the robots with the point clouds of human operators in an augmented environment, using a series of depth and vision sensors for active collision detection and avoidance. Compared with traditional safety systems, the developed approach offers two major benefits: it allows immersive collaboration between humans and robots, and it reduces the downtime of the robotic cell when humans are present. The approach can detect possible collisions in real time and actively control the robot in one of four safety modes: alarming the human operator, stopping the robot, moving the robot away from the approaching operator through recoverable interruptions, or modifying the robot trajectory online at runtime. It thus provides better adaptability and productivity. Furthermore, a local human-robot collaborative scenario has been validated in which the robot tracks the operator to facilitate a shared assembly task.

The third system developed in the research is an optimisation module that combines the selection of the robot's energy-efficient configurations with the deliberate use of low-energy areas of the robot workspace. The optimisation module selects the joint configurations that consume the least energy. Given that energy consumption also depends on the location of a workpiece in the robot workspace, the concept of an energy map is proposed and developed to guide shop-floor engineers in placing workpieces for low energy consumption. Energy consumption in robotic assembly is influenced by robot kinematics, dynamics, task requirements, and the technical design features of robots. Except for the last, these energy behaviours are modelled mathematically in this research, which is useful for energy-efficient robot trajectory planning.


6.2. Limitations of the study

A few limitations were identified during the study; they can be summarised in the following points:

1. Achieving sustainability in manufacturing is a large-scale goal that requires joint efforts from researchers around the world. This research therefore focused on developing solutions to a few aspects of sustainable manufacturing. The developed solutions can be integrated with other researchers' solutions, which would lead to a significant enhancement of the sustainability of robotic systems in production lines.

2. The largest share of robot usage today is linked to assembly applications. The research therefore aimed to define sustainable solutions for robots used mainly in assembly lines. Other robotic applications (such as welding or cutting) can be considered in the future to further evaluate the developed solutions.

6.3. Future work

The future work for the remote assembly system will be centred on the

following aspects:

1. Enhancing the accuracy of the assembly operation.

2. Improving the repeatability of the developed approach.

3. Extending the developed user interface to include friendlier and more

interactive components.

4. Enhancing the efficiency of the developed tool by improving the

performance and reducing the computational time.

5. Introducing more complex and realistic case studies to evaluate the

overall functionalities of the developed system.

The future work for the active collision avoidance system can be

summarised as follows:

1. Introducing the safety strategies to assembly planning to match the

behaviour of the robot with the nature of the planned task. This leads

to more advanced collaborative assembly between humans and

robots.

2. As illustrated in Figure 46, assigning the less time-critical tasks of the

developed system to high-performance cloud-based resources [88].

This can shorten the execution time and improve the scalability of the

system.


On the other hand, the future work of the energy optimisation module

can be focused on the following aspects:

1. Improving the modelling of the robot’s dynamic components by

identifying the technical features of robots such as energy loss in

motors, gears and couplings.

2. Performing additional validations for the energy model and the

optimisation approach.

3. As shown in Figure 47, allocating the simulation module of the

implementation to high performance cloud-based resources [88].

This can reduce the computational time and improve the

scalability of the system.

Figure 46 Cloud-based framework for human-robot collaboration


Figure 47 Cloud-based framework of robot's energy minimisation


Bibliography

[1] N. A. Ashford, “Government Strategies and Policies for Cleaner Production,” United Nations Cleaner Production Programme, no. August 2002, 1994.

[2] “Department of Commerce.” [Online]. Available: https://www.commerce.gov/.

[3] “About sustainable manufacturing and the toolkit - OECD.” [Online]. Available: https://www.oecd.org/innovation/green/toolkit/aboutsustainablemanufacturingandthetoolkit.htm.

[4] R. Parasuraman, T. B. Sheridan, and C. D. Wickens, “A model for types and levels of human interaction with automation.,” IEEE transactions on systems, man, and cybernetics. Part A, Systems and humans : a publication of the IEEE Systems, Man, and Cybernetics Society, vol. 30, no. 3, pp. 286–297, 2000.

[5] D. J. Agravante, A. Cherubini, and A. Kheddar, “Using vision and haptic sensing for human-humanoid joint actions,” IEEE Conference on Robotics, Automation and Mechatronics, RAM - Proceedings, pp. 13–18, 2013.

[6] C. A. Monje, P. Pierro, and C. Balaguer, “A new approach on human–robot collaboration with humanoid robot RH-2,” Robotica, vol. 29, no. 6, pp. 949–957, 2011.

[7] S. Takata and T. Hirano, “Human and robot allocation method for hybrid assembly systems,” CIRP Annals - Manufacturing Technology, vol. 60, no. 1, pp. 9–12, 2011.

[8] F. Chen, K. Sekiyama, J. Huang, B. Sun, H. Sasaki, and T. Fukuda, “An assembly strategy scheduling method for human and robot coordinated cell manufacturing,” in International Journal of Intelligent Computing and Cybernetics, 2011, pp. 487–510.

[9] J. Krüger, T. K. Lien, and A. Verl, “Cooperation of human and machines in assembly lines,” CIRP Annals - Manufacturing Technology, vol. 58, no. 2, pp. 628–646, Jan. 2009.

[10] T. Arai, R. Kato, and M. Fujita, “Assessment of operator stress induced by robot collaboration in assembly,” CIRP Annals - Manufacturing Technology, vol. 59, no. 1, pp. 5–8, 2010.

[11] D. Kulić and E. A. Croft, “Affective State Estimation for Human–Robot Interaction,” IEEE Transactions on Robotics, vol. 23, no. 5, pp. 991–1000, 2007.

[12] G. Charalambous, S. Fletcher, and P. Webb, “The Development of a Scale to Evaluate Trust in Industrial Human-robot Collaboration,” International Journal of Social Robotics, vol. 8, no. 2, pp. 193–209, 2016.

[13] G. Charalambous, S. Fletcher, and P. Webb, “Development of a Human Factors Roadmap for the Successful Implementation of Industrial Human-Robot Collaboration,” in Proceedings of the AHFE 2016 International Conference on Human Aspects of Advanced Manufacturing, 2016, pp. 195–206.

[14] F. Ore, B. R. Vemula, L. Hanson, and M. Wiktorsson, “Human – Industrial Robot Collaboration: Application of Simulation Software for Workstation Optimisation,” Procedia CIRP:6th CIRP Conference on Assembly Technologies and Systems (CATS), vol. 44, pp. 181–186, 2016.

[15] G. Charalambous, S. Fletcher, and P. Webb, “Identifying the key organisational human factors for introducing human-robot collaboration in industry: an exploratory study,” International Journal of Advanced Manufacturing Technology, vol. 81, no. 9–12, pp. 2143–2155, 2015.

[16] A. Cherubini, R. Passama, P. Fraisse, and A. Crosnier, “A unified multimodal control framework for human-robot interaction,” Robotics and Autonomous Systems, vol. 70, pp. 106–115, 2015.

[17] H. Ding, M. Schipper, and B. Matthias, “Collaborative behavior design of industrial robots for multiple human-robot collaboration,” 2013 44th International Symposium on Robotics, ISR 2013, vol. 49, 2013.

[18] M. Geravand, F. Flacco, and A. De Luca, “Human-robot physical interaction and collaboration using an industrial robot with a closed control architecture,” in 2013 IEEE International Conference on Robotics and Automation (ICRA), 2013, pp. 4000–4007.

[19] S.-W. Song, S.-D. Lee, and J.-B. Song, “5 DOF Industrial Robot Arm for Safe Human-Robot Collaboration Seung-Woo,” in 8th International Conference, ICIRA 2015, 2015, vol. 9245, pp. 121–129.

[20] G. Michalos, P. Karagiannis, S. Makris, Ö. Tokçalar, and G. Chryssolouris, “Augmented Reality (AR) Applications for Supporting Human-robot Interactive Cooperation,” Procedia CIRP, vol. 41, pp. 370–375, 2016.

[21] J. Krüger, B. Nickolay, P. Heyer, and G. Seliger, “Image based 3D Surveillance for flexible Man-Robot-Cooperation,” CIRP Annals - Manufacturing Technology, vol. 54, no. 1, pp. 19–22, 2005.

[22] J. A. Corrales, F. A. Candelas, and F. Torres, “Safe human-robot interaction based on dynamic sphere-swept line bounding volumes,” Robotics and Computer-Integrated Manufacturing, vol. 27, no. 1, pp. 177–185, 2011.


[23] Z. M. Bi and L. Wang, “Advances in 3D data acquisition and processing for industrial applications,” Robotics and Computer-Integrated Manufacturing, vol. 26, no. 5, pp. 403–413, 2010.

[24] T. Gecks and D. Henrich, “Human-robot cooperation: Safe pick-and-place operations,” Proceedings - IEEE International Workshop on Robot and Human Interactive Communication, vol. 2005, pp. 549–554, 2005.

[25] D. Ebert, T. Komuro, A. Namiki, and M. Ishikawa, “Safe human-robot-coexistence: Emergency-stop using a high-speed vision-chip,” 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS, pp. 1821–1826, 2005.

[26] C. Vogel, C. Walter, and N. Elkmann, “A projection-based sensor system for safe physical human-robot collaboration,” IEEE International Conference on Intelligent Robots and Systems, pp. 5359–5364, 2013.

[27] J. T. C. Tan and T. Arai, “Triple stereo vision system for safety monitoring of human-robot collaboration in cellular manufacturing,” Proceedings - 2011 IEEE International Symposium on Assembly and Manufacturing, ISAM 2011, pp. 1–6, 2011.

[28] R. Schiavi, A. Bicchi, and F. Flacco, “Integration of active and passive compliance control for safe human-robot coexistence,” Proceedings - IEEE International Conference on Robotics and Automation, pp. 259–264, 2009.

[29] M. Fischer and D. Henrich, “3D Collision Detection for Industrial Robots and Unknown Obstacles Using Multiple Depth Images,” Advances in Robotics Research, pp. 111–122, 2009.

[30] P. Rybski, P. Anderson-Sprecher, D. Huber, C. Niessl, and R. Simmons, “Sensor fusion for human safety in industrial workcells,” IEEE International Conference on Intelligent Robots and Systems, pp. 3612–3619, 2012.

[31] F. Flacco, T. Kroeger, A. De Luca, and O. Khatib, “A Depth Space Approach for Evaluating Distance to Objects: with Application to Human-Robot Collision Avoidance,” Journal of Intelligent and Robotic Systems: Theory and Applications, vol. 80, pp. 7–22, 2015.

[32] B. Dániel, P. Korondi, and T. Thomessen, “Joint Level Collision Avoidance for Industrial Robots,” IFAC Proceedings Volumes, vol. 45, no. 22, pp. 655–658, 2012.

[33] A. Cherubini, R. Passama, A. Crosnier, A. Lasnier, and P. Fraisse, “Collaborative manufacturing with physical human-robot interaction,” Robotics and Computer-Integrated Manufacturing, vol. 40, pp. 1–13, 2016.

[34] “Pilz GmbH & Co. KG,” 2016. [Online]. Available: http://www.safetyeye.com/.

[35] J. O. Blech, M. Spichkova, I. Peake, and H. Schmidt, “Cyber-Virtual Systems: Simulation, Validation & Visualization,” 2014.

[36] X. G. Wang, M. Moallem, and R. V. Patel, “An internet-based distributed multiple-telerobot system,” IEEE Transactions on Systems, Man, and Cybernetics Part A:Systems and Humans., vol. 33, no. 5, pp. 627–633, 2003.

[37] J. M. Junior, “Scara3D: 3-Dimensional HRI integrated to a distributed control architecture for remote and cooperative actuation,” 2008.

[38] E. Vartiainen, V. Domova, and M. Englund, “Expert on wheels: An approach to remote collaboration,” HAI 2015 - Proceedings of the 3rd International Conference on Human-Agent Interaction, pp. 49–54, 2015.

[39] H. Zhong, J. P. Wachs, and S. Y. Nof, “HUB-CI model for collaborative telerobotics in manufacturing,” IFAC Proceedings Volumes, vol. 46, no. 7, 2013.

[40] T. Itoh, K. Kosuge, and T. Fukuda, “Human-machine cooperative telemanipulation with motion and force scaling using task-oriented virtual tool dynamics,” IEEE Transactions on Robotics and Automation, vol. 16, no. 5, pp. 505–516, 2000.

[41] S. Charles, H. Das, T. Ohm, C. Boswell, G. Rodriguez, R. Steele, and D. Istrate, “Dexterity-enhanced telerobotic microsurgery,” 8th International Conference on Advanced Robotics. Proceedings. ICAR’97, pp. 5–10, 1997.

[42] I. Peake, J. O. Blech, L. Fernando, H. Schmidt, R. Sreenivasamurthy, and S. D. Sudarsan, “Visualization facilities for distributed and remote industrial automation: VxLab,” IEEE International Conference on Emerging Technologies and Factory Automation, ETFA, pp. 1–4, 2015.

[43] J. E. Ashby, “The effectiveness of collaborative technologies in remote lab delivery systems,” Proceedings - Frontiers in Education Conference, FIE, pp. 7–12, 2008.

[44] I. Karabegovic, S. Vojic, and V. Dolecek, “3D Vision in Industrial Robot Working Process,” in 12th International Power Electronics and Motion Control Conference, 2006, pp. 1223–1226.

[45] M. A. Moreno-Armendariz, “A new fuzzy visual servoing with application to robot manipulator,” Proceedings of the American Control Conference, vol. 1, no. 3, pp. 3688–3693, 2009.

[46] S. T. Aristos, D., “Simultaneous Object Recognition and Position Tracking for Robotic Applications,” in Proceedings of the IEEE International Conference on Mechatronics, 2006, pp. 145–169.

[47] Y. Sumi, Y. Kawai, T. Yoshimi, and F. Tomita, “3D object recognition in cluttered environments by segment-based stereo vision,” International Journal of Computer Vision, vol. 46, no. 1, pp. 5–23, 2002.

[48] C. H. Esteban and F. Schmitt, “Silhouette and stereo fusion for 3D object modeling,” Proceedings of International Conference on 3-D Digital Imaging and Modeling, 3DIM, pp. 46–53, 2003.

[49] B. Petit, J. D. Lesage, C. Menier, J. Allard, J. S. Franco, B. Raffin, E. Boyer, and F. Faure, “Multicamera real-time 3D modeling for telepresence and remote collaboration,” International Journal of Digital Multimedia Broadcasting, vol. 2010, 2010.

[50] C. Okwudire and J. Rodgers, “Design and control of a novel hybrid feed drive for high performance and energy efficient machining,” CIRP Annals - Manufacturing Technology, vol. 62, no. 1, pp. 391–394, 2013.

[51] S. Rahimifard, Y. Seow, and T. Childs, “Minimising embodied product energy to support energy efficient manufacturing,” CIRP Annals - Manufacturing Technology, vol. 59, no. 1, pp. 25–28, 2010.

[52] N. Weinert, S. Chiotellis, and G. Seliger, “Methodology for planning and operating energy-efficient production systems,” CIRP Annals - Manufacturing Technology, vol. 60, no. 1, pp. 41–44, 2011.

[53] J. Gregory, A. Olivares, and E. Staffetti, “Energy-optimal trajectory planning for robot manipulators with holonomic constraints,” Systems and Control Letters, vol. 61, no. 2, pp. 279–291, 2012.

[54] K. Paes, W. Dewulf, K. Vander, K. Kellens, and P. Slaets, “Energy efficient trajectories for an industrial ABB robot,” in Procedia CIRP 15, 2014, vol. 15, pp. 105–110.

[55] C. Hansen, J. Oltjen, D. Meike, and T. Ortmaier, “Enhanced Approach for Energy-Efficient Trajectory Generation of Industrial Robots,” pp. 1–7, 2012.

[56] S. F. P. Saramago and M. Ceccarelli, “Effect of basic numerical parameters on a path planning of robots taking into account actuating energy,” Mechanism and Machine Theory, vol. 39, no. 3, pp. 247–260, 2004.

[57] M. Pellicciari, G. Berselli, F. Leali, and A. Vergnano, “A method for reducing the energy consumption of pick-and-place industrial robots,” Mechatronics, vol. 23, no. 3, pp. 326–334, 2013.

[58] P. Matthias and B. Martin, “Reducing the energy consumption of industrial robots in manufacturing systems,” pp. 1315–1328, 2015.

[59] R. Saravanan, S. Ramabalan, and C. Balamurugan, “Evolutionary multi-criteria trajectory modeling of industrial robots in the presence of obstacles,” Engineering Applications of Artificial Intelligence, vol. 22, no. 2, pp. 329–342, 2009.

[60] R. Chhabra and M. Reza Emami, “Holistic system modeling in mechatronics,” Mechatronics, vol. 21, no. 1, pp. 166–175, 2011.

[61] M. Brossog, J. Kohl, J. Merhof, S. Spreng, and J. Franke, “Energy consumption and dynamic behavior analysis of a six-axis industrial robot in an assembly system,” Procedia CIRP, vol. 23, pp. 131–136, 2014.

[62] A. Vergnano, C. Thorstensson, B. Lennartson, P. Falkman, M. Pellicciari, F. Leali, and S. Biller, “Modeling and optimization of energy consumption in cooperative multi-robot systems,” IEEE Transactions on Automation Science and Engineering, vol. 9, no. 2, pp. 423–428, 2012.

[63] Z. Liu, W. Bu, and J. Tan, “Motion navigation for arc welding robots based on feature mapping in a simulation environment,” Robotics and Computer-Integrated Manufacturing, vol. 26, no. 2, pp. 137–144, 2010.

[64] E. S. Sergaki, G. S. Stavrakakis, and A. D. Pouliezos, “Optimal robot speed trajectory by minimization of the actuator motor electromechanical losses,” Journal of Intelligent and Robotic Systems: Theory and Applications, vol. 33, no. 2, pp. 187–207, 2002.

[65] Y. F. Zheng and J. Y. S. Luh, “Optimal Load Distribution for Two Industrial Robots Handling a Single Object,” Journal of Dynamic Systems, Measurement, and Control, vol. 111, no. 2, p. 232, 1989.

[66] M. Mori, M. Fujishima, Y. Inamasu, and Y. Oda, “A study on energy efficiency improvement for machine tools,” CIRP Annals - Manufacturing Technology, vol. 60, no. 1, pp. 145–148, 2011.

[67] A. Vijayaraghavan and D. Dornfeld, “Automated energy monitoring of machine tools,” CIRP Annals - Manufacturing Technology, vol. 59, no. 1, pp. 21–24, 2010.

[68] T. Behrendt, A. Zein, and S. Min, “Development of an energy consumption monitoring procedure for machine tools,” CIRP Annals - Manufacturing Technology, vol. 61, no. 1, pp. 43–46, 2012.

[69] T. Peng, X. Xu, and L. Wang, “A novel energy demand modelling approach for CNC machining based on function blocks,” Journal of Manufacturing Systems, vol. 33, no. 1, pp. 196–208, 2014.

[70] P. Ystgaard, T. B. Gjerstad, T. K. Lien, and P. A. Nyen, “Mapping Energy Consumption for Industrial Robots,” 2012.

[71] D. Verscheure, B. Demeulenaere, J. Swevers, J. De Schutter, and M. Diehl, “Time-optimal path tracking for robots: A convex optimization approach,” IEEE Transactions on Automatic Control, vol. 54, no. 10, pp. 2318–2327, 2009.

[72] H. Forslund and P. Jonnson, “High-Level Scheduling of Energy Optimal Trajectories,” International Journal of Operations and Production Management, vol. 27, no. 1, pp. 90–107, 2007.

[73] S. Alatartsev, V. Mersheeva, M. Augustine, and F. Ortmeier, “On optimizing a sequence of robotic tasks,” IEEE International Conference on Intelligent Robots and Systems, pp. 217–223, 2013.

[74] L. Wang, “Wise-ShopFloor: An integrated approach for web-based collaborative manufacturing,” IEEE Transactions on Systems, Man and Cybernetics Part C: Applications and Reviews, vol. 38, no. 4, pp. 562–573, 2008.

[75] L. Wang, M. Givehchi, G. Adamson, and M. Holm, “A sensor-driven 3D model-based approach to remote real-time monitoring,” CIRP Annals - Manufacturing Technology, vol. 60, no. 1, pp. 493–496, 2011.

[76] L. Wang, R. Sams, M. Verner, and F. Xi, “Integrating Java 3D model and sensor data for remote monitoring and control,” Robotics and Computer-Integrated Manufacturing, vol. 19, no. 1–2, pp. 13–19, 2003.

[77] M. Jankowski and J. Kuska, “Connected components labelling–algorithms in Mathematica, Java, C++ and C#,” … of the 6th International Mathematica …, 2004.

[78] R. Y. Tsai, “A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses,” IEEE Journal on Robotics and Automation, vol. 3, no. 4, pp. 323–344, 1987.

[79] Z. Mahmood, M. A. Ansari, and S. T. Rehman, “Adapting Bresenham Algorithm,” Journal of Theoretical and Applied Information Technology, vol. 2, no. 2, pp. 27–30, 2006.

[80] J. Y. S. Luh, M. W. Walker, and R. P. C. Paul, “On-Line Computational Scheme for Mechanical Manipulators,” Journal of Dynamic Systems, Measurement, and Control, vol. 102, no. 2, pp. 69–76, Jun. 1980.

[81] ISO 13855:2010: Safety of machinery – Positioning of safeguards with respect to the approach speeds of parts of the human body. Geneva, Switzerland: International Organization for Standardization, 2010.

[82] ISO/TS 15066:2016: Robots and robotic devices -- Collaborative robots. Geneva, Switzerland: International Organization for Standardization, 2016.

[83] D. Mewes and F. Mauser, “Safeguarding crushing points by limitation of forces.,” International journal of occupational safety and ergonomics : JOSE, vol. 9, no. 2, pp. 177–191, 2003.

[84] K. Suita, Y. Yamada, N. Tsuchida, K. Imai, H. Ikeda, and N. Sugimoto, “A failure-to-safety ‘Kyozon’ system with simple contact detection and stop capabilities for safe human-autonomous robot coexistence,” Proceedings of 1995 IEEE International Conference on Robotics and Automation, vol. 3, pp. 3089–3096, 1995.

[85] J. A. Marvel, J. Falco, and I. Marstio, “Characterizing task-based human-robot collaboration safety in manufacturing,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 45, no. 2, pp. 260–275, 2015.

[86] R. E. Kalman, “A New Approach to Linear Filtering and Prediction Problems,” Journal of Basic Engineering, vol. 82, no. 1, pp. 35–45, Mar. 1960.

[87] A. Mohammed, B. Schmidt, and L. Wang, “Active collision avoidance for human–robot collaboration driven by vision sensors,” International Journal of Computer Integrated Manufacturing, pp. 1–11, 2016.

[88] X. V. Wang, L. Wang, A. Mohammed, and M. Givehchi, “Ubiquitous manufacturing system based on Cloud: A robotics application,” Robotics and Computer-Integrated Manufacturing, Feb. 2016.

Appendix A: Main implementation classes of the remote human-robot collaboration

The following are descriptions of the main classes used in the implementation of the remote human-robot collaboration:

1. MainMenu: As presented in Figure 48, this class is the main component of the implementation. It is responsible for managing the steps of the image processing and 3D modelling by calling the related methods one by one.

Figure 48 MainMenu class
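The step-by-step sequencing that MainMenu performs can be sketched as a simple pipeline. The step names below are illustrative stand-ins, not the actual methods of the thesis implementation:

```java
import java.util.ArrayList;
import java.util.List;

public class MainMenuSketch {
    private final List<String> log = new ArrayList<>();

    // Hypothetical processing steps; each records its name when run.
    public void captureSnapshot() { log.add("capture"); }
    public void detectObjects()   { log.add("detect"); }
    public void createPillars()   { log.add("pillars"); }
    public void buildModel3D()    { log.add("model3d"); }

    // Runs the image-processing and 3D-modelling steps one by one,
    // mirroring how a main component would sequence them.
    public List<String> run() {
        captureSnapshot();
        detectObjects();
        createPillars();
        buildModel3D();
        return log;
    }

    public static void main(String[] args) {
        System.out.println(new MainMenuSketch().run());
    }
}
```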

2. ImageProcessingUI: As detailed before, the development is based on the Wise-ShopFloor framework. This class therefore constructs a plugin for the user interface of the image-processing stage. As shown in Figure 49, the class consists of several methods that generate the components of the interface and handle the mouse behaviour.

Figure 49 ImageProcessingUI class

3. PillarsCreateAndTrim: This class is responsible for creating and trimming the pillars used in the implementation to construct the 3D shapes. As shown in Figure 50, the class has several methods for creating, trimming and visualising the pillars.

Figure 50 PillarsCreateAndTrim class
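The trimming idea can be illustrated with a minimal sketch: a vertical pillar is sampled along its height, and only the samples whose image projections fall inside the object silhouette are kept. The silhouette mask and the orthographic projection below are toy assumptions, not the calibrated projection of the thesis:

```java
import java.util.ArrayList;
import java.util.List;

public class PillarTrimSketch {
    // Hypothetical silhouette mask: true where the object appears in the image.
    public static boolean insideSilhouette(double u, double v) {
        return u >= 2 && u <= 5 && v >= 1 && v <= 4;
    }

    // Toy orthographic projection of a 3D sample (x, y, z) -> image (u, v);
    // it ignores y, as if the camera looked along the y-axis.
    public static double[] project(double x, double y, double z) {
        return new double[]{x, z};
    }

    // Keeps only the heights along a vertical pillar at (x, y) whose
    // projections fall inside the silhouette: the trimmed pillar.
    public static List<Double> trimPillar(double x, double y,
                                          double zMax, double step) {
        List<Double> kept = new ArrayList<>();
        for (double z = 0; z <= zMax; z += step) {
            double[] uv = project(x, y, z);
            if (insideSilhouette(uv[0], uv[1])) kept.add(z);
        }
        return kept;
    }

    public static void main(String[] args) {
        // Pillar at x = 3: only heights 1..4 survive the trim.
        System.out.println(trimPillar(3, 0, 6, 1));
    }
}
```

With several cameras, a sample would have to pass the silhouette test in every view, carving the pillars down to the observed 3D shape.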

4. Robot: As shown in Figure 51, this class constructs the 3D environment of the user interface and adds the 3D model of the robot to it.

Figure 51 Robot class

5. ColourAdjustment and BrightnessContrast: These classes process the captured snapshots as an initial step towards detecting the unknown objects and extracting them later. As illustrated in Figure 52, the classes have methods to remove noise and to adjust image parameters such as the brightness and contrast levels. These parameters can be adjusted either manually (through the user interface) or automatically (running in the background).

Figure 52 ColourAdjustment and BrightnessContrast classes
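A common way to realise such an adjustment, and a plausible reading of what these classes do, is the linear pixel transform p' = alpha * p + beta with clamping; the sketch below is a generic illustration, not the thesis code:

```java
public class BrightnessContrastSketch {
    // Applies a contrast gain (alpha) and a brightness offset (beta) to an
    // 8-bit grey value, clamping the result to the valid range [0, 255].
    public static int adjust(int pixel, double alpha, int beta) {
        int v = (int) Math.round(alpha * pixel + beta);
        return Math.max(0, Math.min(255, v));
    }

    public static void main(String[] args) {
        System.out.println(adjust(100, 1.5, 10)); // 160
        System.out.println(adjust(200, 1.5, 10)); // 310 clamped to 255
    }
}
```

An automatic mode could pick alpha and beta from image statistics (e.g. stretching the observed minimum and maximum to the full range), while the manual mode exposes the two parameters in the user interface.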

6. CameraTransformation: This class handles the transformation of coordinate systems between the robot’s base and the camera, which is important for correctly placing the pillars and the generated models in the 3D space. Figure 53 shows the class and its methods.

Figure 53 CameraTransformation class
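Such a transformation is typically a rigid-body transform p_base = R * p_cam + t. The rotation and translation values below are invented for illustration; in practice they come from the camera-to-robot calibration:

```java
public class CameraTransformSketch {
    // Illustrative extrinsics (assumed, not calibrated): a 90-degree
    // rotation about z (row-major 3x3) and a translation in metres.
    static final double[][] R = {{0, -1, 0}, {1, 0, 0}, {0, 0, 1}};
    static final double[] t = {0.5, 0.0, 0.2};

    // Maps a point from the camera frame to the robot base frame:
    // p_base = R * p_cam + t.
    public static double[] cameraToBase(double[] p) {
        double[] out = new double[3];
        for (int i = 0; i < 3; i++) {
            out[i] = t[i];
            for (int j = 0; j < 3; j++) out[i] += R[i][j] * p[j];
        }
        return out;
    }

    public static void main(String[] args) {
        double[] p = cameraToBase(new double[]{1, 0, 0});
        System.out.println(p[0] + " " + p[1] + " " + p[2]); // 0.5 1.0 0.2
    }
}
```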

7. Coordinate_projection: This class projects points from the 3D world coordinate system onto the image plane, which is important for creating and trimming the pillars as well as for generating the 3D models. As Figure 54 illustrates, the class also accounts for reducing the optical distortion in the camera’s images.

Figure 54 Coordinates_projection class
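A projection of this kind usually follows the pinhole model with a radial distortion term, in the spirit of Tsai-style calibration [78]. The intrinsics and the distortion coefficient below are assumed values for illustration only:

```java
public class ProjectionSketch {
    // Illustrative intrinsics: focal length in pixels and principal point.
    static final double f = 800, cx = 320, cy = 240;
    // Assumed first-order radial distortion coefficient.
    static final double k1 = -0.1;

    // Projects a camera-frame point (x, y, z) to pixel coordinates (u, v),
    // applying first-order radial distortion to the normalised coordinates.
    public static double[] project(double x, double y, double z) {
        double xn = x / z, yn = y / z;      // normalised image coordinates
        double r2 = xn * xn + yn * yn;      // squared radial distance
        double d = 1 + k1 * r2;             // radial distortion factor
        return new double[]{f * d * xn + cx, f * d * yn + cy};
    }

    public static void main(String[] args) {
        double[] uv = project(0.1, 0.05, 1.0);
        System.out.println(uv[0] + " " + uv[1]);
    }
}
```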

Appendix B: Main implementation classes of the local human-robot collaboration

The main classes used in the implementation of the local human-robot collaboration are the following:

1. MainApplication: This class constructs the user interface of the collision avoidance system. As shown in Figure 55, it has methods for the interface visualisation, for the mouse behaviour, and for the different commands that the user can send to the robot.

Figure 55 MainApplication class

2. Manager: This class manages the functionalities of the collision avoidance system. It has several methods for controlling the robot and activating the safety strategies, as well as methods for activating the 2D and 3D visualisations. The list of these methods is presented in Figure 56.

Figure 56 Manager class

3. Controlling: As illustrated in Figure 57, this class activates and deactivates the connection with the robot in order to monitor and control it.

Figure 57 Controlling class

4. RobotControl: As shown in Figure 58, this class generates the commands that move the robot, stop it or control its speed. Furthermore, it handles the communication with the robot by sending control messages and receiving the robot’s latest status.

Figure 58 RobotControl class
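The command generation can be pictured as building short control messages of the kind sent over a socket. The message format below is invented for illustration and is not the protocol of the thesis implementation:

```java
import java.util.Locale;

public class RobotCommandSketch {
    // Builds a hypothetical motion command: a target position in metres
    // and a speed as a percentage of the robot's maximum.
    public static String moveCommand(double x, double y, double z, int speedPct) {
        return String.format(Locale.ROOT,
                "MOVE %.3f %.3f %.3f SPEED %d", x, y, z, speedPct);
    }

    // A stop command carries no parameters.
    public static String stopCommand() { return "STOP"; }

    public static void main(String[] args) {
        System.out.println(moveCommand(0.45, 0.10, 0.30, 50));
        System.out.println(stopCommand());
    }
}
```

The receiving side would parse the robot's status replies in the same way, closing the monitor-and-control loop the class is responsible for.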

5. Visualization2D and Visualization3D: As illustrated in Figure 59, these classes maintain the 2D and 3D visualisations of the depth data, the human 3D point cloud and the robot’s 3D model.

Figure 59 Visualization2D and Visualization3D classes

6. Wrapper: This class handles the communication between the Java implementation and the C++ implementation by translating the data coming from the C++ side into the Java environment. Figure 60 shows the methods used in the class.

Figure 60 Wrapper class
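One typical translation task on such a boundary is turning a flat native buffer into Java objects. The sketch below assumes the C++ side delivers the point cloud as packed (x, y, z) triples; the buffer layout is an assumption, not the documented interface:

```java
import java.util.ArrayList;
import java.util.List;

public class WrapperSketch {
    // Converts a flat float buffer of packed (x, y, z) triples, as it
    // might arrive from the C++ side, into a list of 3D points for Java.
    public static List<float[]> toPointCloud(float[] buffer) {
        List<float[]> cloud = new ArrayList<>();
        for (int i = 0; i + 2 < buffer.length; i += 3) {
            cloud.add(new float[]{buffer[i], buffer[i + 1], buffer[i + 2]});
        }
        return cloud;
    }

    public static void main(String[] args) {
        float[] raw = {0f, 0f, 1f, 0.5f, 0.2f, 1.3f};
        System.out.println(toPointCloud(raw).size()); // 2
    }
}
```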

7. Velocity: As illustrated in Figure 61, this class determines the velocity of the human operator using the depth data captured by the Kinect sensors. The velocity calculation is needed for maintaining a safe distance between the human and the robot.

Figure 61 Velocity class
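The essence of such a calculation is finite differencing of the tracked position between depth frames, and feeding the resulting speed into a protective-separation formula in the spirit of ISO 13855 [81]. The stopping time and intrusion constant below are placeholder values:

```java
public class VelocitySketch {
    // Speed of the operator from two successive 3D positions (metres)
    // captured dt seconds apart.
    public static double speed(double[] p1, double[] p2, double dt) {
        double dx = p2[0] - p1[0], dy = p2[1] - p1[1], dz = p2[2] - p1[2];
        return Math.sqrt(dx * dx + dy * dy + dz * dz) / dt;
    }

    // Minimum protective separation in the spirit of ISO 13855: approach
    // speed times the system stopping time plus an intrusion constant C.
    public static double safeDistance(double speed, double stopTime, double c) {
        return speed * stopTime + c;
    }

    public static void main(String[] args) {
        // Operator moved 0.1 m towards the robot between two 20 Hz frames.
        double v = speed(new double[]{0, 0, 2.0}, new double[]{0, 0, 1.9}, 0.05);
        System.out.println(v + " m/s -> " + safeDistance(v, 0.3, 0.2) + " m");
    }
}
```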

Appendix C: Main implementation classes of the robot energy minimisation

The following are details of the main modules of the implementation of the robot energy minimisation:

1. OnlinePlannerCallable: This module manages the stages of processing in the energy module. As illustrated in Figure 62, it has functions for solving the kinematic and dynamic models of the robot. In addition, the module calculates the energy consumption of each robot configuration and selects the one with the lowest consumption.

Figure 62 OnlinePlannerCallable module
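The selection step can be sketched as a minimum search over candidate configurations. The energy model below (sum of absolute joint angles) is a deliberately simple stand-in for the dynamic model described in the thesis:

```java
import java.util.List;

public class PlannerSketch {
    // Hypothetical energy model: the cost of a joint configuration, here
    // simply the sum of absolute joint angles as a stand-in for the
    // kinematic/dynamic energy calculation.
    public static double energyOf(double[] joints) {
        double e = 0;
        for (double q : joints) e += Math.abs(q);
        return e;
    }

    // Among the candidate configurations for one target, returns the one
    // with the lowest predicted energy consumption.
    public static double[] lowestEnergy(List<double[]> candidates) {
        double[] best = null;
        double bestE = Double.POSITIVE_INFINITY;
        for (double[] c : candidates) {
            double e = energyOf(c);
            if (e < bestE) { bestE = e; best = c; }
        }
        return best;
    }

    public static void main(String[] args) {
        double[] pick = lowestEnergy(List.of(
                new double[]{1.5, -1.0, 0.5},
                new double[]{0.25, 0.5, -0.25}));
        System.out.println(energyOf(pick)); // prints 1.0
    }
}
```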

2. RobotSpec: As shown in Figure 63, this module has a set of methods to retrieve the robot’s mechanical parameters and pass them to the other modules upon request.

Figure 63 RobotSpec module

3. Trajectory, Target and Path: These modules define the parameters of the trajectory and the locations of the targets that the robot needs to reach. Figure 64 illustrates the relationship between the modules and their functionalities.

Figure 64 Trajectory, Target and Path modules

4. EnergyPoint: As illustrated in Figure 65, this module generates the energy points, each consisting of a calculated energy value and the location along the performed trajectory. This module represents the basic unit for constructing the 3D energy map.

Figure 65 EnergyPoint module

5. Map3D: This module constructs the 3D energy map of the robot by determining the energy consumption of the robot at different locations within the robot’s working envelope. Using the calculated energy values, the module generates a colour-based 3D map that visualises the robot workspace from the energy perspective. Figure 66 illustrates the functions used in the module.

Figure 66 Map3D module
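The colour coding can be sketched as a linear interpolation from green (low energy) to red (high energy) over the observed energy range; the colour scheme and the energy point layout are assumptions made for illustration:

```java
public class EnergyMapSketch {
    // One sample of the 3D energy map: a trajectory location plus the
    // energy the robot would consume reaching it.
    public static class EnergyPoint {
        public final double x, y, z, energy;
        public EnergyPoint(double x, double y, double z, double energy) {
            this.x = x; this.y = y; this.z = z; this.energy = energy;
        }
    }

    // Maps an energy value onto a green-to-red RGB colour by linear
    // interpolation over [min, max], clamping values outside the range.
    public static int[] colourOf(double e, double min, double max) {
        double t = Math.max(0.0, Math.min(1.0, (e - min) / (max - min)));
        return new int[]{ (int) Math.round(255 * t),
                          (int) Math.round(255 * (1 - t)), 0 };
    }

    public static void main(String[] args) {
        EnergyPoint p = new EnergyPoint(0.4, 0.1, 0.6, 75.0);
        int[] rgb = colourOf(p.energy, 50.0, 100.0);
        System.out.println(rgb[0] + "," + rgb[1] + "," + rgb[2]); // 128,128,0
    }
}
```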

6. WorkingEnvelope: This module defines the working envelope of the robot, which is needed to describe the volume of the 3D energy map. The functionality of this module is shown in Figure 67.

Figure 67 WorkingEnvelope module
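A minimal membership test for such an envelope, assuming for illustration a spherical shell between the robot's minimum and maximum reach (the actual envelope of an articulated robot is more complex):

```java
public class WorkingEnvelopeSketch {
    // Illustrative reach limits in metres; real values would come from
    // the robot's specification.
    static final double R_MIN = 0.3, R_MAX = 1.2;

    // A point belongs to the envelope if its distance from the robot
    // base lies between the minimum and maximum reach.
    public static boolean inEnvelope(double x, double y, double z) {
        double r = Math.sqrt(x * x + y * y + z * z);
        return r >= R_MIN && r <= R_MAX;
    }

    public static void main(String[] args) {
        System.out.println(inEnvelope(0.8, 0.3, 0.4)); // true
        System.out.println(inEnvelope(1.5, 0.0, 0.0)); // false: beyond reach
    }
}
```

Sampling only points that pass this test keeps the 3D energy map confined to locations the robot can actually reach.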


www.kth.se

TRITA IIP-17-03 ISSN 1650-1888 ISBN 978-91-7729-301-9