Industrial Robot
Teleoperation with EMG-based fuzzy-control for robotic grasping
Journal: Industrial Robot
Manuscript ID IR-11-2016-0278
Manuscript Type: Original Manuscript
Keywords: Teleoperation, Robotics, Control
Abstract—The use of a manipulator robot controlled through a natural human-computer interaction offers new possibilities for the execution of complex tasks in dynamic workspaces. In this context, reflex-based control is understood to be a promising approach to human-robot interaction. Human motor control naturally adapts the arm motion to the external environment with stability, compliance and safety. Inspired by this highly efficient natural HRI, we propose a Kinect-based vision system for robot teleoperation with a biofeedback-based grasping task. The proposed approach takes the grasping task into account to make the robot perform the desired action based on human arm behavior. Extracting handling and position control from natural human behavior is a promising way to transfer experienced human skills to robots. Indeed, a gesture control method that takes advantage of the stability and robustness of the natural human controller appears more effective than conventional mathematical modelling. We therefore propose a user-friendly human-machine interface (HMI) that allows the user to control a robotic arm in real time. The Kinect sensor generates depth images and 3D data of the human body, and the Kinect software development kit (SDK) provides tools to track the human body skeleton and abstract it into three-dimensional coordinates. The Kinect sensor is thus integrated into our control system to detect the coordinates of the relevant user joints. The robot dynamic model has been implemented in a real-time sliding mode control algorithm, and a fuzzy logic controller manages the grasping task based on the EMG signal. Experiments were carried out to test the effectiveness of the manipulating system, and the results verify its tracking ability, stability and robustness.
Index Terms—Gesture control; Teleoperation; Human-Robot-Interface; EMG signal-based control
I. INTRODUCTION
Recently, there has been a growing interest in the development of robotic applications for telemanipulation purposes (Mima
et al., 2012, Schlegel et al., 2015). For robot teleoperation, much research has been done on controlling robotic systems
dedicated to complex tasks (Hernansanz et al., 2015, Ajoudani et al., 2012).
Human-arm gesture recognition is an important research issue in the field of the natural human-computer interaction.
Despite previous works, building a robust human-arm gesture recognition system that is applicable in robotics remains a
challenging problem (Tsarouchi et al., 2016). Based on hand-gesture recognition, several human–robot interfaces have been
developed (Terrence et al., 2000, Ueda et al., 2003, Knoll, 2003, Bogdan et al., 2005, Kim et al., 2014).
Regarding such previous works, body-machine interfaces are frequently used for robot teleoperation. Ajoudani et al. (2012)
present position-tracking markers for motion planning and an EMG sensor for estimating the master operator's arm stiffness to
control a robotic slave arm. However, the developed approach lacks normalization of the EMG signal for stiffness estimation,
although EMG signals are very sensitive to muscle fatigue and noise (a subject-dependent system).
Other works present methods for real-time robot-manipulator teleoperation using markerless image- and EMG-based
hand-arm tracking (Siddharth and Xianghai, 2007, Vogel et al., 2011). The EMG control established by Vogel
et al. (2011) is a subject-independent system that needs neither precise electrode placement nor computation of a human
arm/hand model. However, the proposed approach needs a large surface for the electromyography electrodes and requires slow
movement for end-effector position estimation; thus, velocity was not considered in the robot-control approach.
Vision-based teleoperation can therefore be considered as a solution for these kinds of issues.
The Kinect camera can be considered among the best computer peripherals to bring cutting-edge computer vision capabilities.
Kinect is a motion sensor that provides a natural user interface available for several applications in different fields including
game-based learning systems (Tsai et al., 2015, Chuang et al., 2014), stroke rehabilitation (Bower et al., 2015, Štrbac et al.,
2014), helping visually impaired people (Kanwal et al., 2015, Pham et al., 2016), navigation systems (Tuvshinjargal et al.,
2015, Xu et al., 2015) and other fields (Kim and Kim, 2015, Aragón et al., 2013). Based on its internal processor, the Kinect
sensor can recognize human movement patterns and generate the corresponding skeleton coordinates, which can be provided in a
computing environment such as Matlab (Brecher et al., 2012, Li, 2013), LabVIEW (Muhiddin et al., 2013) and the .NET
environment (Lee et al., 2015). Dynamic gesture recognition technology has therefore gained increasing attention.
Several approaches have been developed to improve recognition or to take advantage of existing ones. Ibañez et al.
(2014) propose an easy gesture recognition approach to reduce the effort involved in implementing gesture recognizers
with Kinect. The practical results with these developed packages are acceptable.
was developed by Ding and Chang (2015) for identifying and recognizing human gestures by using 3D Kinect data. Based on
Microsoft’s ‘Kinect for Windows SDK’, Galna et al. (2014) use the API for measuring movement in people with Parkinson’s
disease.
Gesture control is mainly used for telemanipulation in several modes. In the study by Qian et al. (2013), a client/server-
structured robot teleoperation application system is developed in networked robot mode. Gesture-based telemanipulation of an
industrial robotic arm in master-slave mode for unstructured and hazardous environments is described by Oskoei and Hu
(2008): a Maximum Velocity Drift Control approach allows an amateur user to control the robot through simple
gestures, and force control is combined with the gesture-based control to perform safe manipulation. However, this solution
is costly and depends on the reliability of the force-torque sensor. In addition, a method of human–robot interaction using
markerless Kinect-based tracking of the human hand for teleoperation of a dual robot manipulator is presented by Du and
Zhang (2014). Based on hand motion, the user can control the robot manipulator to perform pick-and-place tasks.
However, this work was applied only in a virtual environment.
The Kinect sensor thus offers new perspectives, particularly for remote-control approaches.
Nevertheless, many problems occur during human-robot interaction when using the Kinect sensor. After the acquisition of 3D
Kinect data, these data are processed to control the robot, and several control approaches have been used. A
Cartesian impedance control is used to control a dual robot arm (Luo et al., 2013); this approach allows the robot to follow
the movement of the human arm while avoiding self-collision, without solving inverse kinematics problems. In the study
by Al-Shabi (2015), a PID controller is used to control a 2-degree-of-freedom Lego Mindstorms NXT robotic arm. Based
on online HIL (Hardware-In-the-Loop) experimental data, the input data acquired from the Kinect sensor are processed in a
closed-loop PID controller with feedback from the motor encoders. Recently, gesture control has been used to build stable
and efficient teleoperation systems (Doisy et al., 2016). Indeed, Kinect-based human kinematics identification with human
postural impedance adaptation is used by Liang et al. (2016) to make a Baxter robot arm imitate human arm behaviors.
Nevertheless, none of these cited works addresses haptic control, although force feedback remains important,
especially for grasping tasks.
Much research has focused on the design and development of robotic systems for haptic interaction in the context of
virtual reality and teleoperation applications (Fontana et al., 2013). Robotic grasping is a challenging problem involving
perception, planning, and control, and several previous works have studied it (Palli et al., 2014, Jiang et al., 2011,
Lenz et al., 2015). The problem of hand posture detection and pattern recognition has previously been approached using vision-
based methods (Chaudhary et al., 2011, Lamberti and Camastra, 2012) or glove-based systems (Deyou, 2006, Palm and Hiev,
2008), depending on the constraints of the specific application. The glove-based approach is effective for posture
detection, but it requires many sophisticated sensors and can be seen as a costly solution. It is also evident that
solutions based only on vision cannot lead to efficient recognition in some situations, e.g. when parts of the human
hand are occluded.
In this paper, we propose a gesture-based telemanipulation scheme to control a Lynxmotion robotic arm with a biofeedback-
based grasping task. The presented work proposes a human-assisting manipulator teleoperated via the Kinect sensor, with the
grasping movement controlled using the EMG signal.
In our case, given the constraints of the control scheme, vision-based approaches for recognizing hand postures and
patterns are not suitable, as the hand is practically occluded during the gesture control process. Therefore, recognition
should be based on the biofeedback naturally generated by the muscles responsible for each posture. However, open-loop EMG-
based grasping control lacks efficiency: the designed control scheme ensures hand pattern recognition and copies the hand
state to the gripper without considering the object rigidity. In this case, the operator cannot estimate the real
grasping state and has no idea of the applied force. Moreover, using a force sensor in a closed-loop control scheme
without operator intervention is ineffective in special cases in which the manipulated objects vary over a wide range of
material characteristics such as stiffness, sensitivity and smoothness. Therefore, a human-in-the-loop
technique can imitate the natural human postures in the grasping task. To achieve this goal, a force sensor has been installed
in the gripper, and the grasping force is processed and transmitted wirelessly to a vibrator module attached to the human arm.
In this paper, we present a new gesture control approach combining Kinect-based pattern recognition, EMG-based posture
detection and haptic feedback to control a grasping robot.
The rest of the paper is organized as follows. Section 2 presents the recognition method. Section 3 demonstrates the control
design of the robot dynamics in two subsections, one describing the development of the dynamic model and the second
discussing the stability of the sliding mode-based controller. Section 4 gives an overview of the grasping task method,
discussing the EMG-based fuzzy controller design as well as the chosen feature extraction. In the last section, the
experimental results are discussed.
II. RECOGNITION METHOD
Several methods are used to detect human arm movements. The Kinect sensor is the most popular thanks to its skeleton detection
and depth computing. The Kinect is a motion sensor that can measure the three-dimensional motion of a person: it generates a
depth image (x, y and z) of the human body and can provide space coordinates for each joint (see figure 1).
Figure 1. Kinect for Windows SDK detected joints
In this paper, we focus on the human-arm portion of the skeleton data. Assuming the user controls the robot with his right
arm gestures, the three considered movements are:
• Flexion-extension of the right shoulder joint
• Abduction-adduction of the right shoulder joint
• Flexion-extension of the right elbow joint
After the coordinates of each elementary joint are acquired, they are used to determine the joint positions. The algorithm
identifies the distance between each pair of consecutive joints; the angle of each joint can then be computed from the
triangle containing that joint. Several vectors should be considered to obtain each joint angle.
The vectors "a", "b" and "ed" model the human arm; each vector holds the distance between two successive joints. The
vector "a" reflects the distance between the right elbow and the right wrist, determined by the following equation
a = √((x_w − x_e)² + (y_w − y_e)² + (z_w − z_e)²)  (1)
Similarly, the vector "b" gives the distance between the right shoulder and the right elbow, and "ed" the distance
between the right wrist and the right hand. These two vectors are determined by the following equations
b = √((x_e − x_sh)² + (y_e − y_sh)² + (z_e − z_sh)²)  (2)
ed = √((x_h − x_w)² + (y_h − y_w)² + (z_h − z_w)²)  (3)
The vectors "c" and "d" are used only to determine the angles j2 and j3: "c" holds the distance between the right
shoulder and the right wrist, and "d" the distance between the right elbow and the right hand. These two vectors can
be expressed by the two following equations
c = √((x_sh − x_w)² + (y_sh − y_w)² + (z_sh − z_w)²)  (4)
d = √((x_h − x_e)² + (y_h − y_e)² + (z_h − z_e)²)  (5)
The vector "c" is used to calculate the angle j2, as shown in figure 2.1. In the triangle with sides "a", "b" and "c",
j2 can be determined by the following equation
j2 = cos⁻¹((a² + b² − c²) / (2ab))  (6)
Similarly, the vector "d" is used to compute the angle j3, as shown in figure 2.3. In the triangle with sides "a", "ed"
and "d", j3 can be determined by the following equation
j3 = cos⁻¹((a² + ed² − d²) / (2·a·ed))  (7)
The base joint angle Tn is determined from the orthogonal projection of the vector "b" onto the (x, z) plane, as
indicated in figure 2.2. Tn can be determined by the following equation
Tn = tan⁻¹(√((z_e − z_sh)²) / √((x_e − x_sh)²))  (8)
The angle j1 is the angle that the vector "b" forms with the vertical (y axis), as described in figure 2.4. j1
can be determined by the following equation
j1 = tan⁻¹(√((y_e − y_sh)²) / √((x_e − x_sh)² + (z_e − z_sh)²))  (9)
At each execution of the algorithm, an additional calculation is performed to determine the desired speed and acceleration.
All these parameters will be considered as the desired trajectories for the manipulator robot control algorithm.
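As an illustration, the geometric reconstruction of eqs. (1)-(9) can be sketched directly from the Kinect joint coordinates (a minimal Python sketch with our own function and argument names; the paper implements this step in LabVIEW):

```python
import math

def dist(p, q):
    """Euclidean distance between two 3D joint positions (x, y, z)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def arm_angles(shoulder, elbow, wrist, hand):
    """Joint angles (radians) from the right-arm joints, eqs. (1)-(9)."""
    a = dist(elbow, wrist)       # eq. (1)
    b = dist(shoulder, elbow)    # eq. (2)
    ed = dist(wrist, hand)       # eq. (3)
    c = dist(shoulder, wrist)    # eq. (4)
    d = dist(elbow, hand)        # eq. (5)
    # Law of cosines in the triangles (a, b, c) and (a, ed, d), eqs. (6)-(7)
    j2 = math.acos((a**2 + b**2 - c**2) / (2 * a * b))
    j3 = math.acos((a**2 + ed**2 - d**2) / (2 * a * ed))
    # Base angle from the projection of "b" onto the (x, z) plane, eq. (8)
    tn = math.atan2(abs(elbow[2] - shoulder[2]), abs(elbow[0] - shoulder[0]))
    # Shoulder flexion-extension against the vertical y axis, eq. (9)
    j1 = math.atan2(abs(elbow[1] - shoulder[1]),
                    math.hypot(elbow[0] - shoulder[0], elbow[2] - shoulder[2]))
    return tn, j1, j2, j3
```

A right-angled elbow, for instance, yields j2 = π/2 regardless of limb lengths, since only the three triangle side lengths enter eq. (6).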
Figure 2. (1) Elbow joint computing. (2) Shoulder abduction-adduction movement computing. (3) Wrist joint computing. (4) Shoulder flexion-extension
movement computing.
III. CONTROL DESIGN OF ROBOT MOTION
A. Dynamic model of Lynxmotion robotic arm
The dynamic model deals with the torques and forces causing the motion of the mechanical structure. In this section, the
robotic arm dynamic model is computed based on the Euler-Lagrange formulation. The potential and kinetic energies of each
link are computed by the two following equations
u_i = −m_i gᵀ P_ci  (10)
K_i = ½ m_i v_ciᵀ v_ci + ½ ω_iᵀ I_i ω_i  (11)
where m_i is the mass of the i-th link, v_ci its linear velocity, ω_i its angular velocity, I_i its inertia tensor and
P_ci the position of its center of mass.
The dynamics of an n degree-of-freedom robot manipulator composed of rigid bodies are expressed, based on Newton's and
Euler's equations, as follows

M(q)q̈ + C(q, q̇)q̇ + g(q) = τ  (12)

where:
• τ ∈ Rⁿ is the vector of applied generalized torques
• q ∈ Rⁿ denotes the vector of generalized displacements
• q̇ ∈ Rⁿ is the joint velocity vector
• q̈ ∈ Rⁿ is the joint acceleration vector
• M(q) ∈ Rⁿˣⁿ is the inertia matrix, which is symmetric, uniformly bounded and positive definite
• C(q, q̇)q̇ ∈ Rⁿ contains the Coriolis and centrifugal forces
• g(q) ∈ Rⁿ is the vector of gravitational forces
In this paper, we develop the model of a 3-DOF robotic manipulator, determining the inertia matrix, which satisfies the
previous conditions, as well as the Coriolis-centrifugal and gravitational force matrices. All these terms are
implemented to apply the desired control.
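To make the structure of eq. (12) concrete, the sketch below evaluates τ = M(q)q̈ + C(q, q̇)q̇ + g(q) for a planar 2-link arm with hypothetical parameters (the paper models a 3-DOF Lynxmotion arm; this reduced example only illustrates how the three terms combine):

```python
import numpy as np

# Hypothetical 2-link parameters: masses, lengths, CoM offsets, inertias
m1, m2, l1, lc1, lc2, I1, I2, g0 = 1.0, 1.0, 1.0, 0.5, 0.5, 0.1, 0.1, 9.81

def dynamics_torque(q, dq, ddq):
    """tau = M(q)ddq + C(q, dq)dq + g(q) for a planar 2-link arm."""
    c2 = np.cos(q[1])
    # Inertia matrix M(q): symmetric and positive definite
    M = np.array([
        [m1*lc1**2 + m2*(l1**2 + lc2**2 + 2*l1*lc2*c2) + I1 + I2,
         m2*(lc2**2 + l1*lc2*c2) + I2],
        [m2*(lc2**2 + l1*lc2*c2) + I2, m2*lc2**2 + I2]])
    # Coriolis/centrifugal matrix C(q, dq)
    h = -m2 * l1 * lc2 * np.sin(q[1])
    C = np.array([[h*dq[1], h*(dq[0] + dq[1])],
                  [-h*dq[0], 0.0]])
    # Gravity vector g(q)
    g = np.array([(m1*lc1 + m2*l1)*g0*np.cos(q[0])
                  + m2*lc2*g0*np.cos(q[0] + q[1]),
                  m2*lc2*g0*np.cos(q[0] + q[1])])
    return M @ ddq + C @ dq + g
```

With q = q̇ = q̈ = 0 the first two terms vanish and τ reduces to the gravity vector g(q), a quick sanity check on any implementation of (12).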
B. Sliding mode control
The robot manipulator has highly nonlinear dynamics; for this reason, a robust controller is required.
In this section, we design the motion controller based on Sliding-Mode Control (SMC) theory.
The dynamic model of the robot manipulator is described by (12).
Let q_d(t) denote the desired trajectory. The tracking error is defined as

e = q_d(t) − q(t)  (13)

Define q̇_r = q̇_d + μ(q_d − q), where μ is a positive definite matrix.
According to the linearity of the robot dynamics in its parameters, we obtain

M̂(q)q̈_r + Ĉ(q, q̇)q̇_r + ĝ(q) = γ(q, q̇, q̇_r, q̈_r)θ̂

where θ̂ is the vector of estimated robot parameters and γ(q, q̇, q̇_r, q̈_r) is the regression matrix. Define the model errors

M̃(q) = M(q) − M̂(q)
C̃(q, q̇) = C(q, q̇) − Ĉ(q, q̇)  (14)
g̃(q) = g(q) − ĝ(q)

The sliding vector is selected as

s = ė + μe  (15)

We propose the sliding mode control law

τ = M̂(q)q̈_r + Ĉ(q, q̇)q̇_r + ĝ(q) + s + k·sgn(s)  (16)

where k_i = Σ_j γ̄_ij θ̄_j, with the bounds θ̄_j and γ̄_ij chosen such that |θ̃_j| ≤ θ̄_j and |γ_ij| ≤ γ̄_ij for all i, j.
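A minimal sketch of the control law (16), assuming the estimated model terms M̂, Ĉ, ĝ have already been evaluated at the current state (function and argument names are ours):

```python
import numpy as np

def smc_torque(M_hat, C_hat, g_hat, q, dq, q_d, dq_d, ddq_d, mu, k):
    """Sliding-mode control law of eq. (16)."""
    e = q_d - q                # tracking error, eq. (13)
    de = dq_d - dq
    s = de + mu * e            # sliding vector, eq. (15)
    dq_r = dq_d + mu * e       # reference velocity
    ddq_r = ddq_d + mu * de    # reference acceleration
    # tau = M^ ddq_r + C^ dq_r + g^ + s + k sgn(s), eq. (16)
    tau = M_hat @ ddq_r + C_hat @ dq_r + g_hat + s + k * np.sign(s)
    return tau, s
```

The discontinuous k·sgn(s) term is what dominates the bounded parameter error in (19); in practice sgn is often replaced by a saturation function to reduce chattering.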
Theorem: The proposed controller (16) guarantees the asymptotic stability of the system.
Proof: select the Lyapunov function candidate

V(t) = ½ sᵀM(q)s  (17)

Differentiating along the trajectories and using the skew-symmetry of Ṁ(q) − 2C(q, q̇),

V̇(t) = sᵀM(q)ṡ + ½ sᵀṀ(q)s
     = sᵀM(q)ṡ + sᵀC(q, q̇)s
     = sᵀ[M(q)q̈_r + C(q, q̇)q̇_r + g(q) − τ]  (18)

Replacing τ by its expression (16) yields

V̇(t) = sᵀ[M̃(q)q̈_r + C̃(q, q̇)q̇_r + g̃(q) − k·sgn(s) − s]
     = sᵀ[γ(q, q̇, q̇_r, q̈_r)θ̃ − k·sgn(s) − s]  (19)

where γ(q, q̇, q̇_r, q̈_r) = [γ_ij] with |γ_ij| ≤ γ̄_ij and θ̃ = [θ̃_j] with |θ̃_j| ≤ θ̄_j. It follows that

V̇(t) = Σ_i Σ_j s_i γ_ij θ̃_j − Σ_i s_i k_i sgn(s_i) − Σ_i s_i²
     ≤ Σ_i Σ_j |s_i| γ̄_ij θ̄_j − Σ_i Σ_j |s_i| γ̄_ij θ̄_j − Σ_i s_i²
     = −Σ_i s_i² ≤ 0
This proves the stability.
IV. GRASPING TASK METHOD
A. EMG acquisition and processing
The myoelectric signal is an electrical potential generated by the muscles. Generally, EMG signals are measured by two
methods. The first is invasive, using needle electrodes; this method is not recommended in our case, since it requires
clinical skills and the needle electrodes can cause pain to the subject. The second is non-invasive, using surface
electrodes; it is easy to apply and provides significant information usable in several applications (Merletti and
Hermens, 2004, Micera et al., 1999).
In this paper, we control the grasping task from multi-channel surface electromyography (SEMG) signals recorded during
dynamic contraction of the hand muscles. The selection of the muscles, as well as the placement of the electrodes, was
based on the related literature (Cram et al., 2010, Phinyomark et al., 2012, Perotto and Delagi, 2005). The first electrode
is placed at the Flexor Digitorum Superficialis (FDS) muscle to measure gross finger flexion and thus the grasping task, and the second
at the Extensor Digitorum Communis (EDC) to measure gross finger extension and thus the opening task.
First, the SEMG signals are acquired using the E-health Sensor Shield V2.0, which allows users to perform biometric
and medical applications where body monitoring is needed, using different sensors including the muscle/electromyography
(EMG) sensor. The EMG sensor measures the filtered and rectified electrical activity of a muscle; using the centered
potentiometer on the shield, the EMG gain is adjusted to 1000.
This sensor uses disposable pre-gelled electrodes, and the resolution of the acquisition system ADC is 10 bits. Myoelectric
signals are detected by placing three electrodes: two for measurement, separated by 3 cm (Potvin and Bent,
1997), and a third operating as a reference electrode placed at the proximal end of the elbow.
B. Feature extraction
To optimize the quality of the received signal, we design a Butterworth band-pass filter with a range of 20–400 Hz and
a 50 Hz notch filter to remove power-line noise. This filter is implemented in the user interface block diagram
using the Digital Filter Design module for LabVIEW. The sampling frequency of the acquisition system is set at 1000 Hz.
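The same preprocessing chain can be sketched with SciPy (the Butterworth order and the notch quality factor below are assumptions; the paper builds the filters with LabVIEW's Digital Filter Design module):

```python
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

FS = 1000.0  # sampling frequency of the acquisition system (Hz)

def preprocess_emg(raw, fs=FS):
    """Band-pass 20-400 Hz plus a 50 Hz notch, applied zero-phase."""
    b_bp, a_bp = butter(4, [20.0, 400.0], btype="bandpass", fs=fs)
    b_n, a_n = iirnotch(50.0, Q=30.0, fs=fs)
    x = filtfilt(b_bp, a_bp, raw)    # keep the 20-400 Hz EMG band
    return filtfilt(b_n, a_n, x)     # remove power-line interference
```

Feeding a synthetic 50 Hz + 100 Hz mixture through this chain suppresses the 50 Hz component while leaving the 100 Hz component essentially intact.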
Taking into account the very complex nature of biomedical signals, feature extraction is a very important issue in EMG
signal processing. Its main goal is to extract the useful information hidden in the SEMG signal while removing unwanted
EMG components and interference. In the literature, EMG features can be decomposed into several groups: time domain,
frequency (spectral) domain, and time-scale or time-frequency domain (TFD). Based on the related survey (Phinyomark et
al., 2012c), we focused on time domain features. Time domain (TD) features are extracted directly from the raw EMG time
series and do not need any additional transformation; they are therefore usually quick and easy to implement. In our
case, feature extraction is mainly used to analyze the EMG data for onset detection and for estimating the applied
grasping force, so as to copy the desired hand pattern to the robot gripper. Two features are used at this stage: one as
an onset detector for grasp onset detection, and the other as input to the fuzzy logic controller to estimate handgrip force.
The onset-detector feature is selected to launch the pick-and-place cycle. Based on the related literature, one of the
most popular features used for onset detection is the mean absolute value (MAV) (Phinyomark et al., 2012c). This feature,
similar to integrated EMG, is also known under other names: average rectified value (ARV), integral of absolute value
(IAV), or averaged absolute value (AAV). As its name indicates, it is the average of the absolute value of the EMG
signal amplitude in a given segment, defined in (20). To improve the onset index, two modified versions of this feature
were developed by Phinyomark et al. (2009a). The first modified MAV (MAV1) is an extension of the MAV feature in which a
weighted window function w_i is introduced to improve robustness, as calculated by (21). The second modified version
(MAV2) is a further expansion, similar to the first, in which the weighted window function w_i is a continuous function,
to improve the smoothness of the weighting, as defined by (22).
MAV = (1/N) Σ_{i=1}^{N} |X_i|  (20)

MAV1 = (1/N) Σ_{i=1}^{N} w_i |X_i|  (21)

where w_i = 1 if 0.25N ≤ i ≤ 0.75N, and w_i = 0.5 otherwise.

MAV2 = (1/N) Σ_{i=1}^{N} w_i |X_i|  (22)

where w_i = 1 if 0.25N ≤ i ≤ 0.75N, w_i = 4i/N if i < 0.25N, and w_i = 4(i − N)/N otherwise.
In our case, the MAV2 feature was selected to improve the handgrip onset index and thus the launching of the fuzzy logic
controller. Hence, two MAV2 features are extracted, from the FDS and EDC muscles, to detect the grasping and releasing
onsets, respectively.
Once the fuzzy logic controller is launched, a second feature is used as input to estimate the handgrip power. According
to the related survey (Phinyomark et al., 2012c), the root mean square (RMS) is the most popular feature in EMG signal
analysis for both clinical and engineering applications. It models the signal as an amplitude-modulated Gaussian random
process related to constant-force, non-fatiguing contraction, and it can be used for handgrip force prediction (Cao et
al., 2015). The mathematical definition of the RMS feature is expressed by (23); it is also similar to the standard deviation.
RMS = √((1/N) Σ_{i=1}^{N} X_i²)  (23)
After signal processing, the two RMS features are considered as inputs of the fuzzy logic controller in order to estimate
the handgrip force and then perform the pick-and-place task.
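Both features can be sketched as follows. For the MAV2 tail weight we use the decreasing form 4(N − i)/N so the window decays to zero at the segment end; the sign convention of this term varies across statements of eq. (22):

```python
import numpy as np

def mav2(x):
    """Modified mean absolute value (MAV2), eq. (22): a weighted MAV
    whose window emphasizes the middle 50% of the segment."""
    x = np.abs(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    w = np.where(i < 0.25 * n, 4 * i / n,
                 np.where(i <= 0.75 * n, 1.0, 4 * (n - i) / n))
    return float(np.mean(w * x))

def rms(x):
    """Root mean square of the segment, eq. (23)."""
    x = np.asarray(x, dtype=float)
    return float(np.sqrt(np.mean(x ** 2)))
```

In the control loop, mav2 of the FDS channel gates the grasp onset while rms of both channels feeds the fuzzy force estimator.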
C. Fuzzy logic control
Based on the EMG signal, many algorithms have been applied to hand pattern recognition, including Back-Propagation Neural
Networks (BPNN), Multiple Nonlinear Regression (MNLR), fuzzy logic and others. Here, a fuzzy logic system is used to
estimate the handgrip force during the manipulation process. The RMS time domain parameters obtained from the raw SEMG
signals are fed as inputs to the fuzzy logic system, while the desired gripper position is taken as its output. The
block-level diagram of the fuzzy logic system is shown in the following figure.
Figure 3. Block level diagram of fuzzy logic system
A fuzzy system based on the LabVIEW PID and Fuzzy Logic Toolkit is designed to estimate the handgrip state and control
the grasping task for pick-and-place applications. For both inputs and output, we use triangular membership functions
(see fig. 4 and fig. 5), and the center-of-area method, calculated by (25), is used for defuzzification. Since the EMG
is a subject-dependent biosignal, the EMG obtained from any subject should be normalized by mapping the time domain
parameters using formula (24).
Y_N = (Y_Nmax − Y_Nmin)(Y − Y_min)/(Y_max − Y_min) + Y_Nmin  (24)

where [Y_min, Y_max] is the range of the input parameter Y before mapping and [Y_Nmin, Y_Nmax] is the output range of the
normalized parameter Y_N. The normalized time domain parameters range from 0 to 10.
CoA = ∫_{x_min}^{x_max} f(x)·x dx / ∫_{x_min}^{x_max} f(x) dx  (25)

where CoA is the center of area, x is the value of the linguistic variable, and x_min and x_max represent the range of
the linguistic variable.
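The normalization (24) and a discrete version of the center-of-area defuzzification (25) can be sketched as follows, with the aggregated output membership function sampled on a uniform grid (function names are ours):

```python
import numpy as np

def normalize(y, y_min, y_max, yn_min=0.0, yn_max=10.0):
    """Map a raw feature value Y into [yn_min, yn_max], eq. (24)."""
    return (yn_max - yn_min) * (y - y_min) / (y_max - y_min) + yn_min

def center_of_area(x, mu):
    """Discrete center-of-area defuzzification, eq. (25): the weighted
    mean of the universe x under the aggregated membership degrees mu."""
    x, mu = np.asarray(x, dtype=float), np.asarray(mu, dtype=float)
    return float(np.sum(mu * x) / np.sum(mu))
```

For a symmetric triangular membership function the CoA falls at its apex, which is a convenient sanity check on the defuzzifier.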
For both RMS time domain inputs, extracted from the FDS and EDC, five membership functions are designed: very low, low,
medium, high and very high. In addition to the biofeedback inputs, the last gripper state must be considered in the
pick-and-place process. Indeed, after grasping, the raw EMG decreases in the hold state, yet the gripper must stay
latched until the release action; in this phase the EMG data are not significant for interpretation by the fuzzy logic
controller. Adding the last state to the fuzzy inputs overcomes this drawback. Seven membership functions are
reserved for the gripper position output: opened, very wide, wide, very low grasping, low grasping, medium grasping
and high grasping. Figure 4 and figure 5 show the membership functions of the inputs and output, respectively. One
hundred seventy-five if-then rules are implemented to perform the pick-and-place task.
D. System overview of the grasping task method
The global system consists of three parts. The first step deals with gripper initialization to the rest state, in which
it is opened. The second step takes care of biosignal acquisition and processing; in this part, a MAV2 feature is
extracted to detect the grasping onset. Once this feature value exceeds the threshold, a fuzzy logic controller is
executed to control the gripper position according to the applied handgrip force.
Figure 4. Input variable membership functions: (1) RMS1, (2) RMS2 and (3) Last State.
According to the handgrip force evaluated by the two RMS features extracted from the FDS and EDC muscles, and taking
into account the last state of the gripper position output, the fuzzy logic controller performs the grasp and hold
states for object moving. The user can release the grasped object by means of the MAV2 feature extracted from the EDC
muscle. The final step consists of returning to the rest state and testing the EMG signal for a new pick-and-place cycle.
Figure 5. Output variable membership functions: Gripper Position
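The three-step cycle can be summarized as a small state machine (a hypothetical sketch: the threshold handling, names and the 0-to-1 gripper command are illustrative, not taken from the paper):

```python
REST, GRASP = "rest", "grasp"

class GraspCycle:
    """Rest -> grasp/hold (fuzzy-driven) -> release on EDC onset -> rest."""

    def __init__(self, onset_threshold):
        self.state = REST
        self.th = onset_threshold

    def step(self, mav2_fds, mav2_edc, fuzzy_position):
        """One control tick: returns the commanded gripper position."""
        if self.state == REST:
            if mav2_fds > self.th:      # grasp onset detected on FDS
                self.state = GRASP
            return 0.0                  # gripper stays opened
        if mav2_edc > self.th:          # release onset detected on EDC
            self.state = REST
            return 0.0
        return fuzzy_position           # hold/grasp per fuzzy output
```

Each tick feeds the two MAV2 onset features and the fuzzy controller's position output; the latched GRASP state plays the role of the "last state" input described above.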
V. EXPERIMENTAL RESULTS AND DISCUSSIONS
First, the required modules and toolkits for LabVIEW were installed via JKI's VI Package Manager: the Kinesthesia
Toolkit for Microsoft Kinect and the LabVIEW Interface for Arduino (LIFA), to ensure skeleton detection and
communication between the computer and the Arduino board, respectively. Afterwards, the required firmware was uploaded
to the Arduino board. In addition, we installed the other required modules: the Control Design and Simulation Module to
implement the sliding mode-based control law, the PID and Fuzzy Logic Toolkit to design the fuzzy system containing the
membership functions and the related rule base for the grasping task control, the Robotics Module for LabVIEW to
communicate with the SSC-32 servo controller, and the Digital Filter Design toolkit for EMG preprocessing. The flowchart
of the grasping task control process is represented in Figure 6.
In the experimental setup, the basic components of the platform are the Lynxmotion robotic arm interfaced through the
SSC-32 servo controller board, power sources, connecting wires, the Kinect sensor, a laptop, a Mega 2560 Arduino board
supporting both the E-health shield and an Xbee shield, a USB-Xbee adapter and two Xbee modules. Figure 7 illustrates
the experimental platform of the teleoperation system, including these basic components. As shown in this figure, two
power sources were used: a 6 V DC adapter supplies the SSC-32 and hence the robot motion and control, while a 12 V DC
adapter powers the Arduino board and its extensions, including both the E-health shield and the Xbee shield. This
ATmega2560 microcontroller-based board was selected to establish the wireless communication and the biomedical signal
acquisition. It communicates with the laptop via two Xbee modules, one plugged into the Xbee adapter shield and the
other connected to the laptop through a USB-Xbee adapter. In addition, this board ensures the EMG acquisition through
the E-health shield and actuates the vibrating motor mounted on the human arm.
Figure 8 presents the detailed architecture of the proposed control scheme. The upper section represents the grasping
task method, while the lower section represents the robot motion control. The global diagram of the telemanipulation
system illustrates how the pick-and-place task is managed solely through human arm behavior. Using reflex-based control,
the user can place the robot end effector at the target position thanks to the Kinect sensor. In addition, a
human-in-the-loop technique performs the grasping task. Naturally, reflex-based control is a promising way to manage
unexpected scenarios using
Figure 6. Grasping task control process for the pick-and-place cycle (flowchart blocks: initialization and rest state, EMG measurement and processing, MAV2 and RMS features from FDS/EDC, onset detection, fuzzy logic system for handgrip force, gripper position via the robot end-effector HS422 servomotor, human-in-the-loop feedback)
the human controller with natural human sensors. On the one hand, the target position is considered as the visual input for end-
effector positioning. On the other hand, proprioceptive feedback has been used to convey the applied gripper force through a
vibrating mini disc motor. Both the handgrip-force output of the human controller and the end-effector grasping force are
transmitted wirelessly via the Xbee protocol. A human-machine interface has been developed to interact with the telemanipulation
system; it contains many options for establishing a reliable and user-friendly human-robot interaction. These characteristics can be
summarized in a few points: hardware configuration, experimental recording and experimental results. Indeed, this HMI
ensures the skeleton detection and the acquisition of the 3D human joint coordinates captured by the Kinect sensor. The tracking-
process state control and the user-position checking (see Figure 9) are ensured by our security algorithm. When
another person enters the Kinect sensing area during the monitoring process, the recognition method can return a non-numeric
or infinite joint position, and the security program indicates a skeleton-detection error. In
this case, the last valid computed angle of each joint is sent to the robot until the operator solves the problem. In addition,
a signal-smoothing algorithm can be enabled to reject the unwanted random movements generated by twitches of the human arm.
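The paper does not name the exact smoothing filter; one simple candidate is an exponential moving average over each joint's 3D coordinates, where the tuning parameter `alpha` (assumed here) trades tracking speed against twitch rejection:

```python
class JointSmoother:
    """Exponential moving average over a joint's (x, y, z) coordinates.
    alpha near 1 tracks fast motion; small alpha suppresses brief twitches."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha
        self.state = None          # last smoothed (x, y, z), None until first sample

    def update(self, xyz):
        if self.state is None:
            self.state = list(xyz)                     # seed with the first sample
        else:
            self.state = [self.alpha * v + (1 - self.alpha) * s
                          for v, s in zip(xyz, self.state)]
        return tuple(self.state)
```

Each Kinect skeleton frame would pass every tracked joint through its own `JointSmoother` before the angle computation.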
This HMI has been developed to integrate the grasping-task control, the control law (including the sliding mode control) and
the wireless communication using the Xbee protocol. Comprising two screens, the HMI allows the user to control the manipulator robot
by arm gestures and muscle activation. First, the human arm joint coordinates are acquired and processed by the
recognition algorithm presented in the first section. Then, each human arm joint position is determined together with the related velocity
and acceleration. These results are the inputs of the robot motion controller. Moreover, the dynamic behavior of
the manipulator robot is implemented to apply the sliding mode control, with 3 DOF considered for the robot motion control.
After the position-checking process, the user can test the grasping task based on the EMG signal measured by the placed electrodes.
The developed HMI indicates the gripper position by informing the user about the state of the gripper. As shown in Figure 10,
the LED indicator turns green when the grasping process is done.
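As an illustration of the flowchart logic in Figure 6 — an MAV onset on the flexor (FDS) channel closing the gripper and one on the extensor (EDC) channel reopening it — the following sketch uses an assumed common threshold rather than the paper's calibrated values:

```python
def mav(window):
    """Mean absolute value (MAV) feature of one EMG window."""
    return sum(abs(x) for x in window) / len(window)

class GripperLogic:
    """Toggle the gripper state from flexor (FDS) and extensor (EDC) MAV features.
    The threshold is an illustrative placeholder, not the paper's tuned value."""

    def __init__(self, threshold: float = 0.2):
        self.threshold = threshold
        self.state = "opened"

    def update(self, fds_window, edc_window):
        if self.state == "opened" and mav(fds_window) > self.threshold:
            self.state = "closed"      # flexor onset -> grasp the object
        elif self.state == "closed" and mav(edc_window) > self.threshold:
            self.state = "opened"      # extensor onset -> release the object
        return self.state
```

In the real system the two windows would come from the E-health shield's EMG channels, and the resulting state would drive the HS422 gripper servomotor.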
The next step consists of testing the teleoperation process. Figure 11 shows the teleoperation process, including two pick-and-
place tasks controlled by gestures and muscle activity. The sequences in this figure illustrate how the user can perform the
pick-and-place task through the user-friendly interface. Figure 12 presents the extracted graphs comparing the
desired and real position of each joint.
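For intuition about the tracking behavior behind Figure 12, a single joint modeled as a unit-inertia double integrator (a deliberate simplification; the paper implements the full Lynxmotion dynamic model) can be driven by a sliding mode law with a boundary layer; the gains `lam` and `k` below are assumed for illustration:

```python
import math

def smc_step(q, dq, qd, dqd, ddqd, lam=5.0, k=20.0):
    """One sliding-mode control step for a unit-inertia joint (illustrative model)."""
    e, de = q - qd, dq - dqd
    s = de + lam * e                                # sliding surface
    # tanh boundary layer replaces sign() to avoid chattering
    u = ddqd - lam * de - k * math.tanh(s / 0.05)
    return u

# Simulate tracking of a constant 1 rad target from rest
q, dq, dt = 0.0, 0.0, 0.001
for _ in range(5000):
    u = smc_step(q, dq, qd=1.0, dqd=0.0, ddqd=0.0)
    dq += u * dt                                    # unit-inertia dynamics: ddq = u
    q += dq * dt
```

On the surface s = 0, the error decays as e' = -lam * e, which is what makes the real joint trajectories converge onto the desired ones.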
Figure 7. Experimental platform of the teleoperating system: (1) Kinect sensor, (2) USB-Xbee adapter + Xbee module, (3) USB-RS232 adapter, (4) 6V DC
adapter, (5) 12V DC adapter, (6) SSC-32 board, (7) Lynxmotion robotic arm, (8) Object to grasp, (9) Mega2560 Arduino board + E-health shield + Xbee shield + Xbee module, (10) Wired vibrator motor, (11) Surface EMG electrodes, (12) User.
Figure 8. Global diagram of the telemanipulation system
Figure 9. User position checking algorithm
Figure 10. Test of grasping task
Figure 11. Sequences of the teleoperation system
Figure 12. Desired and real positions
VI. CONCLUSION
This paper applies a gesture-based telemanipulation scheme to control a robotic arm with biofeedback-based grasping. In
this work, we propose a human-assisting manipulator teleoperated through the Kinect sensor, with the grasping movement
controlled using the EMG signal. In addition, we propose a user-friendly human-machine interface (HMI) which allows the user
to control a robotic arm in real time.
The developed control approach can be used in all applications that require real-time human-robot cooperation. Trajectory
tracking and system stability are ensured by the sliding mode control (SMC). Both control and supervision are
provided by a Kinect-based human-machine interface developed in the NI LabVIEW graphical programming language. A fuzzy
logic controller has been implemented to manage the grasping task based on the EMG signal. Experiments have shown the
high efficiency of the proposed approach.
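A minimal sketch of such a fuzzy mapping, from a normalized EMG MAV feature to a grip-force command, with illustrative membership functions and output levels (the paper's calibrated rule base is not given, so all breakpoints below are assumptions):

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_grip_force(mav_norm):
    """Map a normalized MAV feature in [0, 1] to a grip-force command in [0, 1].
    Rules: LOW activation -> light grip, MEDIUM -> moderate, HIGH -> firm.
    Defuzzification is a weighted average of singleton outputs (Sugeno-style)."""
    rules = [
        (tri(mav_norm, -0.4, 0.0, 0.4), 0.1),   # LOW    -> light grip
        (tri(mav_norm,  0.1, 0.5, 0.9), 0.5),   # MEDIUM -> moderate grip
        (tri(mav_norm,  0.6, 1.0, 1.4), 0.9),   # HIGH   -> firm grip
    ]
    num = sum(w * f for w, f in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

Intermediate activation levels blend adjacent rules smoothly, which is what lets the gripper force follow the graded muscle contraction rather than jumping between discrete levels.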
In conclusion, the proposed control scheme provides a feasible teleoperation system for many applications, such as pick and
place in hazardous and unstructured environments. However, the method has some constraints,
such as the sensitivity of the desired trajectory generated by the human arm, even in the case of random and unwanted movements,
which can damage the manipulated object during the teleoperation process. Operator skill is therefore highly required
to secure both the teleoperated robot and the manipulated object. In the same context, our telemanipulation approach
relies mainly on the human-in-the-loop technique. Indeed, the use of proprioceptive feedback to the human controller
requires quick reactions from the operator, so the operator should be selected according to these criteria. In addition,
several preliminary tests must be performed until the selected operator has learned the teleoperation process.
The future objective of this work is to respect the workspace and the regional constraints in the control schemes by filtering
out unaccepted trajectories and compensating for any lack of operator skill.