2010 - Robotic Tactile Sensor System and Applications


    1074 IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, VOL. 57, NO. 3, MARCH 2010

Robotic Tactile Sensor System and Applications

Kitti Suwanratchatamanee, Student Member, IEEE, Mitsuharu Matsumoto, Member, IEEE, and Shuji Hashimoto, Member, IEEE

Abstract: This paper presents a tactile sensor system for a robot manipulator and an active-sensing technique to realize 3-D object recognition concerning object shape, object surface normal, and object edge tracing, with experimental results. The proposed tactile sensor units implemented on the robot hand consist of three thin sheets of force-sensitive resistors arranged triangularly with the peripheral circuits. One potential application of the proposed techniques is to realize effective human-robot cooperation to move an object together by controlling the hand pose to keep the direction of the hand normal to the object surface in three dimensions, which is often necessary when pushing an object. Another is 3-D object edge tracing. The proposed techniques can be employed in industrial processes such as welding and inspection to eliminate manual teaching procedures by searching the object edge automatically before the welding process. In these applications, information about the object shape or orientation is not required in advance.

Index Terms: Human-robot interactions, object recognition, robot tactile systems, robots, tactile sensors.

    I. INTRODUCTION

    RECENTLY, a variety of sensors have been reported for

    robots, such as vision-type sensors [1], [2] and tactile-

    type sensors. While computer vision is often employed to

    recognize the object shape with the position and orientation,

    tactile sensing is an essential ability for a robot to handle an

    object [3]. The tactile sensor attached on the robot hand cansense the object surface, even when the robot vision cannot get

    the occluded surface image. In bilateral teleoperation, informa-

    tion is transmitted not only from the master to the slave but

    also from the slave to the master. Therefore, the operator on the

Manuscript received May 9, 2008; revised August 21, 2009. First published September 1, 2009; current version published February 10, 2010. This work was supported in part by the Global Robot Academia Grant-in-Aid for Global COE Program by the Ministry of Education, Culture, Sports, Science and Technology; by Fundamental Study for Intelligent Machines to Coexist with Nature, Research Institute for Science and Engineering, Waseda University; by the CREST project Foundation of technology supporting the creation of digital media contents of the Japan Science and Technology Agency; by the Grant-in-Aid for the WABOT-HOUSE Project by Gifu Prefecture; by the Research Fellowships of the Japan Society for the Promotion of Science for Young Scientists (DC2: 20-56621); by a research grant from the Support Center for Advanced Telecommunications Technology Research; by a research grant from the Foundation for the Fusion of Science and Technology; by Special Coordination Funds for Promoting Science and Technology; and by the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Young Scientists (B) (20700168, 2008). This study was conducted as part of the humanoid project at the Humanoid Robotics Institute, Waseda University.

K. Suwanratchatamanee and S. Hashimoto are with the Graduate School of Advanced Science and Engineering, Waseda University, Tokyo 169-8555, Japan (e-mail: [email protected]; [email protected]).

M. Matsumoto is with the Education and Research Center for Frontier Science, University of Electro-Communications, Tokyo 182-8585, Japan (e-mail: [email protected]).

Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.

    Digital Object Identifier 10.1109/TIE.2009.2031195

    master side can feel tactile sensation from the slave side as well

    [4]. A variety of tactile-sensing systems have been proposed

not only for robots but also for human-machine interfaces and

human-system interactions [5]–[8], force feedback, and pattern

    recognition. The tactile sensor equipped on the fingertip gives

    a signal to maintain a stable grasp [9]. In real situations, the

    friction between the object and the finger should be measured

    for grasping an object [10]. Although manipulation is one of

    the most interesting tasks, local shape recognition is another

    valuable tactile-sensing application. Object surface orientation

    is important information when the robot contacts the object

    [11]. Object edge sensing and tracking are also important to

    recognize the shape without a vision system [12]. Concern-

    ing object tracking for industrial welding robots, there are

    a variety of techniques. Some research works have used the

visual system (charge-coupled device camera) [13]–[15], while

    others focused on the range-sensing methodology of echo pulse

    amplitude and time of flight [16], [17]. In order to acquire

    more information on an object, some research works combine

    a tactile sensor with an actively controlled arm. Some of them

    aimed to identify surface patterns [18], while others focused on

    recognizing the roughness and softness of objects [19]. To track

    humans and to avoid obstacles, some researchers have reported

    the robot equipped with 16 tactile and 16 ultrasonic sensors

with 360° coverage [20]. There are a variety of techniques and sensing devices utilized as a part of tactile sensing. Previous

    works have used traditional strain gauges, electromagnetic de-

    vice sensors, force-sensitive resistors, capacitive tactile array,

    optical devices, piezoelectric resonance, and shape-memory-

    alloy devices as microcoil actuators used for 2-D and 3-D tactile

    displays [21], [22]. The use of 3-D or 6-D force sensors located

    within the body can perform the same task. They are robust and

    have good performance over a period. They are widely used

in robotics [23]–[25]. It may also be easy to use these types of

sensors. However, this paper aims to study the application range of the simplest possible system, with respect not only to usability but also to principle. According to [21], a tactile sensor is a device or system that

    can measure a given property of an object or contact between

    the sensor and the object. The tactile sensor units with two and

    three elements of low-cost force-sensitive resistors have been

    proposed in the previous paper with preliminary experimental

results [26]–[29]. The development of the proposed sensor unit

    is one of the simple implementations structured with three

    sensing elements (FSRs). The proposed technique is to use

    such devices with a layout specialized for object edge tracing

    and object surface sensing. There are some previous works

    using the same sensing device. For a control task, the dynamic

    behavior has to be evaluated. However, there are only a few

0278-0046/$26.00 © 2010 IEEE


    Fig. 1. Diagram of the robotic tactile sensor system.

    works utilizing this thin-film sensor. Although a comparison

    of the respective performances of commercial products has

been reported [30], that work covered only the static condition. An experimental analysis of the dynamic behavior of such sensors has been reported [31]. This paper presents a tactile sensor

    system for a robot manipulator and various active-sensing tech-

    niques. The tactile sensor unit implemented on the robot hand

acquires the force distribution on a planar surface by arranging three

    force-sensitive resistors triangularly. To show the suitability

of the proposed system for practical use, this paper also

    introduces some applications. One is hand pose control to keep

    the direction of movement normal to the 3-D plane object,

    which is often required to push an object for positioning.

    This technique can be used for a cooperative task between

    a human and a robot to move an object together. Another is

hand pose and motion control on a 3-D object's surface for active edge recognition and tracing. The proposed techniques

    can be employed for industrial purposes to eliminate manual

    procedures for improving the safety in the human workspace.

    As a typical example, a simulated welding torch for a robot

    has been developed. This torch unit has two functions, namely,

    detecting the object information such as welding points before

    doing the welding process and simulating the welding function.

    In these applications, the user does not need any information

    about the object shape or orientation in advance.

II. TACTILE-SENSING SYSTEM

The diagram of the tactile-sensing system is shown in Fig. 1. To perform the real-time sensing process for controlling the

    robot manipulator, a suitable interfacing system was developed.

    The block diagram of the controller parts for this system is

    shown in Fig. 2. The developed sensing robot can interact with

both an object-holder robot and a human. The robotic arm was set on a line-tracking system. The tactile sensor torch unit is equipped at the top of the arm (end effector). The arm robot is

specially modified for this study based on the Mitsubishi Corporation's Movemaster-EX. The robot is controlled by a personal

computer (PC). The PC also controls the line-tracking unit to

    move the robot along the X-axis. The sensor unit can also be

    controlled by the same PC through the sensor interface module.

    As the proposed tactile sensor unit works together with the

    robotic arm and can scan the space, the sensor unit does not

    need to have a lot of sensing elements. The minimum number of

    sensing points required for detecting the local shape and surface

    orientation is three. The global shape measurement can be done

    by moving the arm along the surface of the object.

    The prepared sensing devices are Flexi-Force, which are

a sort of force-sensitive resistor produced by Tekscan, Inc.

    [32]. This sensing device is made of thin polyester film with

    0.127-mm thickness. The active-sensing area is 9.53 mm in

    diameter. The device is capable of sensing forces between 0 and

    4.4 N according to the information provided by Tekscan, Inc.

    The sensor resistance decreases when the force is applied to the

    sensing-element sheet. The resistances of three pieces of force-

    sensitive resistors have similar values when the force is applied

to the center of all the sensing elements. Hence, by utilizing the differences between the three force-sensitive resistors, users


    Fig. 2. Block diagram of the controller parts.

    can detect the gradient of the surface. Fig. 3 shows the design

    concept of the proposed tactile sensor unit. The three sensing el-

    ements are fixed to make a triangular position and are located in

120° intervals. They are covered with a sponge

    rubber plate (soft material) whose thickness can be changed

    depending on the object. The other side of the device is covered

    with a hard plate and fixed on the end effectors of the robot

    arm. To simplify the functional check, the circuit of the sensing

    devices has LED indicators to show the sensor that received the

    strongest force. It also has an LCD display on the interface unit,

    as well as a communication channel to send out the data.

As noted above, the gradient of the surface can be detected by utilizing the differences among the three force-sensitive resistors. The resistance is measured by using the

    charge and discharge of the RC circuit, as shown in Fig. 4.

After charging the capacitor, the discharge starts through the force-sensitive resistor. The microprocessor measures the

    discharge time using the software clock counter. To measure

    the variable resistance of the sensing element, the proposed

    method utilizes the RC time constant method. The step input

    is applied to the circuit, and the discharge time is measured.

The microcontroller checks the voltage of the capacitor at a 2-µs interval. To measure the discharge time of the capacitor,

    the proposed method estimates the time when the voltage of the

    capacitor is less than the logic threshold voltage. The variable

    resistance of the sensing element (R) can be obtained as

R = t / (C · ln(V_Supply / V_I/O)) (1)

where V_Supply and V_I/O represent the supply and logic threshold voltages, respectively; C is the capacitance of the capacitor; and t is the discharge time. In this method, V_Supply, V_I/O, and C were set to 5 V, 1.4 V, and 0.01 µF, respectively.
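As a numerical check of (1), the conversion from a measured discharge time to the sensing-element resistance can be sketched as follows (the function name is illustrative, not from the paper):

```python
import math

# Constants stated in the paper.
V_SUPPLY = 5.0   # supply voltage, V
V_IO = 1.4       # logic threshold voltage, V
C = 0.01e-6      # capacitance, F (0.01 uF)

def resistance_from_discharge(t: float) -> float:
    """Recover the sensing-element resistance R from the discharge
    time t via R = t / (C * ln(V_Supply / V_I/O)), i.e., Eq. (1)."""
    return t / (C * math.log(V_SUPPLY / V_IO))
```

With the stated maximum discharge time of 3 ms, this gives roughly 236 kΩ, a plausible resistance for a lightly loaded force-sensitive resistor.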

The relation among force, resistance, and time in (1) is nonlinear. Although such behavior complicates the electronics and affects the computational time, modern microcontrollers offer computational speeds sufficient to handle this problem, and many also provide RC time-measurement functions. Thus, the system can simply connect the RC circuits to

    the microcontroller. As the research aim is to introduce a simple

    system, it is considered that FSRs are better than other available

    sensing devices. The selected FSR devices are thin-sheet-type

    sensors, which have suitable dimensions and are flexible to be

    structured in a sensing module such as the proposed sensing

    unit. In addition, the selected FSRs are cost-effective sensing

    elements. As the maximum discharge time is less than 3 ms,

the cycle time of the measurement is short enough for real-time control of the robot manipulator. In addition, linearity is not essential for these applications, because the research aim is not

    to use the proposed sensor for measuring the exact load forces

    but to use the relationship between three sensor outputs for

    sensing data analysis to define the contacted-object information

    for robot movement control.

    III. PRELIMINARY EXPERIMENT

    This experiment aims to confirm the filtering effect from

    the soft material of the proposed sensor unit by observing

the quantitative relation between the sensing outputs and the applied load forces. To evaluate the filtering effect, two different


    Fig. 3. Prototype of a tactile sensor unit utilizing three elements.

    Fig. 4. Electronic circuit diagram of tactile sensor units.

    kinds of sensor units with and without a soft material were used

    for performing the experiments. The prepared soft material is

a 15-mm-thick sponge with a Young's modulus of 0.496 N/mm².

    First, the sensor unit was fixed. Then, the applied load forces

    were increased from 3.626 to 25.382 N with 3.626-N interval.

    The load was normal to the soft-material surface, so the three

sensing elements received the same value of load force. The relationship between resistance and load is shown in Fig. 5.

    The resistance of the sensor unit gradually decreases with the

    increase of load for both cases with and without a soft material.

    Next, a suitable material was chosen. The researchers found that

    the range of resistance change in the case of the sensor unit

    with a soft material was larger than that without a soft material.

    Based on these results, the researchers decided to use a sensor

    unit with a sponge for the following experiments.

    The result shown here is for the case of static force, whereas

    the active touch involves dynamic forces. However, the sensing

    system works in real time. The rise time of the sensing device

is less than 20 µs. The conversion time of the A/D converter is

    less than 3 ms. Thus, the proposed sensor can be used in the

    case of dynamic forces in the applications given in this paper.

    As described previously, the proposed sensor unit was tested to

    confirm the practical limitation in the preliminary experimental

    result for industrial purposes. As a result, the researchers con-

    firmed that it could work even when the applied force was up to

30 N. The device was not damaged even when such a large force was applied, which is attributed to the sensing device being made of simple thin polyester film.

IV. TACTILE-SENSING TECHNIQUES

    A. Object-Edge-Sensing Technique

    To perform real-time sensing control for hand pose and mo-

    tion control for object edge tracing, as shown in Fig. 6, the dif-

    ference among the resistance values of those sensing-element

    sheets was used. The coordinate system is also shown in Fig. 6.

    The robot controls its hand pose direction based on the sensor

    values. The object edge is found by scanning. The angle change

    is large when the robot hand reaches an edge. At the object

    edge, the force on sensor 1 should be equal to that on sensor 2,

    as shown in Fig. 6 (front view). Based on the sensor values,

    when the force on sensor 2 is larger than that on sensor 1, the

    robot recognizes that the position of its hand is above the object

edge. On the other hand, when the force on sensor 1 is larger than that on sensor 2, the robot recognizes that the position of

    its hand is under the object edge.
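The above/below decision rule can be sketched as a simple comparison (a hedged illustration; the function and the tolerance parameter are not from the paper):

```python
def classify_edge_position(f1: float, f2: float, tol: float = 0.05) -> str:
    """Classify the hand position relative to the object edge from the
    forces on sensors 1 and 2 (numbering as in Fig. 6). The tolerance
    band for "equal" forces is an assumed parameter."""
    if abs(f1 - f2) <= tol * max(f1, f2, 1e-9):
        return "on edge"      # equal forces: hand is aligned with the edge
    if f2 > f1:
        return "above edge"   # sensor 2 pressed harder
    return "below edge"       # sensor 1 pressed harder
```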

    Object edge recognition can be performed by utilizing the

    two signals from sensors 1 and 2. However, there are two

    important advantages of using three sensing elements. The first

    advantage is to keep the robot hand along the edge when tracing

    the object edge. The relation of the three outputs should be kept

    so that the output of the central sensing device (sensor 3) is

    smaller than that of the other two (sensors 1 and 2), as shown in

    Fig. 6 (side view), while these two outputs should be the same,

    as shown in Fig. 6 (front view).
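The combined tracing condition, balanced forces on sensors 1 and 2 (front view) together with a smaller force on the central sensor 3 (side view), might be checked as follows (sketch; the tolerance is an assumption):

```python
def on_edge_with_orientation(f1, f2, f3, tol=0.05):
    """True when sensors 1 and 2 are balanced while the central
    sensor 3 reads less than both, per Fig. 6."""
    balanced = abs(f1 - f2) <= tol * max(f1, f2, 1e-9)
    return balanced and f3 < min(f1, f2)
```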

    The second advantage is for scanning the end object. After

finding the edge, the outputs of sensors 1 and 2 are the same. Then, the difference between sensor 3 and sensors 1 and 2 was


    Fig. 5. Relationship between the resistance and load forces of a tactile sensor unit.

    Fig. 6. Hand pose and motion control for object edge detection and tracing.

    used for scanning the end of the object. The tracing motion

    continues to the (1, 0, 0) direction until it reaches the rough

    end point, i.e., no forces appear on sensors 1 and 2. In other

    words, only sensor 3 has touched an object. To recognize the

    details of the end point, the robot then moves back to the

(−1, 0, 0) direction to detect the sensing forces on sensors 1

    and 2 again. After the robot detected the hand position on the

    object, it moves to the (1, 0, 0) direction for fine scanning. The

    movement continues until the robot does not detect any forces

    from all three sensors, i.e., the sensors do not contact the object.

Then, the robot moves back in the (−1, 0, 0) direction by 26 mm,

    which is the length of the sensing unit. Then, the robot moves

    13 mm forward to the (1, 0, 0) direction, which means that

    the center of the sensing unit is set to the end of the object.
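The end-point search described above can be viewed as a small state machine over the contact states of the three sensors; a minimal sketch (the phase names and zero-force thresholds are assumptions, not the paper's implementation):

```python
def scan_step(f1, f2, f3, phase):
    """One decision step of the end-of-object scan. Returns the next
    phase and a unit move direction along the X-axis (or None when
    the final 26-mm-back / 13-mm-forward alignment should start)."""
    touching_12 = f1 > 0 or f2 > 0
    touching_any = touching_12 or f3 > 0
    if phase == "coarse":          # trace forward until sensors 1, 2 lift off
        if touching_12:
            return "coarse", (1, 0, 0)
        return "backtrack", (-1, 0, 0)
    if phase == "backtrack":       # move back until sensors 1, 2 touch again
        if not touching_12:
            return "backtrack", (-1, 0, 0)
        return "fine", (1, 0, 0)
    if touching_any:               # fine scan until all sensors lift off
        return "fine", (1, 0, 0)
    return "done", None            # then back 26 mm and forward 13 mm
```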

    To perform the sensing technique efficiently, the sensor unit

    should keep the orientation, as shown in Fig. 6. Otherwise,

    the edge sensing will fail. For example, let us consider that

    the line contacts to the center of two sensors parallel to the

    object edge. In this case, there is no difference between the two

sensors' outputs. If only two sensors are used, the method has a limitation. Therefore, the three sensing elements are used to keep the correct orientation, as described before.

    B. Object-Surface-Normal-Sensing Technique

    To perform real-time sensing control to follow a normal

    surface in three dimensions, the sensor unit should have the

    ability to detect not only one axis but also two axes. It can be

    done because the proposed tactile sensor unit has been designed

    with three sensing elements placed in a triangular position,

    as described in Section II. The control criterion is to make

the three sensing outputs equal. Fig. 7 shows the flowchart of sensing data analysis to detect an object position and to determine the robot movement based on the relationship between the

    pushing and received forces in three dimensions. By utilizing

    these data, the proposed system can detect the gradient of the

    soft-material (sponge) surface. To keep the robot hand normal

    to the object surface, the force data from three sensor devices

    is used to control the robot-hand direction. To complete the

    movement in three dimensions, this paper introduces the eight

    directions of movements, as shown in Fig. 7. This figure shows

    an analysis of tactile-sensing feedback to define an object-

    touch position for robot control. It also provides the relationship

    between the pushing and received forces of the three sensing

elements. The relationship among the values of the sensing elements is also given.


    Fig. 7. Flowchart of sensing data analysis to define an object position for robot movement control.

    For example, as shown in Fig. 7, when the object touches

    on the right side of the sponge, a pushing force appears on

    the right side as well. Consequently, the force on sensor 3 is

    greater than those on sensors 1 and 2, which are equal. To

    follow the object surface normal, the robot hand needs to move

    to the left until the forces on all three sensing elements are

    equal. In the same way, the robot hand is controlled to the

    appropriate direction based on the sensor outputs, as shown in

    Fig. 4. In order to verify the tactile information and sensing

    performance as a tactile interface, the program for analyzing

    the distributed pressure patterns was created. The load force is

    applied to the sensor unit. The receiving-force patterns are used

    for deciding on the robot movements automatically following

    the flowchart to control the hand pose normal to the object

    surface.
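The Fig. 7 decision logic, equalizing the three outputs by moving or tilting toward the weaker side, can be sketched as follows (the direction labels and the tolerance are illustrative assumptions):

```python
def normal_following_move(f1, f2, f3, tol=0.05):
    """Choose a hand adjustment from the three sensing outputs.
    The control criterion is to make all three outputs equal."""
    fmax = max(f1, f2, f3, 1e-9)
    eq12 = abs(f1 - f2) <= tol * fmax
    eq13 = abs(f1 - f3) <= tol * fmax
    if eq12 and eq13:
        return "hold"                  # balanced: hand is normal to surface
    if eq12 and f3 > f1:
        return "move left"             # right side (sensor 3) pressed harder
    if eq12 and f3 < f1:
        return "move right"
    if f1 > f2:
        return "tilt toward sensor 2"  # equalize the sensor 1-2 pair
    return "tilt toward sensor 1"
```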

    V. EXPERIMENT

    To confirm the ability of the proposed sensing system, seven

experiments were conducted. The first experiment is to measure the angle of the object surface. The second experiment is to

    measure the object shape using a mechanical scan method. The

    third experiment is to search an object surface and to keep

    the sensor unit orientation normal to the object surface. The

    fourth experiment is to confirm the effectiveness of the robot-

    hand pose action and to show some examples of humanrobot

    cooperation to move an object in three dimensions. The fifth

    experiment is object edge recognition to confirm the edge

    detection. To show the robustness of the proposed system, the

    experiments from three kinds of initial points were conducted,

i.e., points above, under, and on the edge. The sixth experiment is object edge detection starting from the object surface.

    The last experiment is to show the continuous procedure of

    detecting the object edge for welding purposes. The robot

    tracks the object edge to obtain the object information and then

    repeats the movement from the starting point to the end point

along the object edge by utilizing the simulated torch, as in a welding process. As the first two experiments aim to confirm

    the principle of the proposed system, two sensing elements

    are used to simplify the experiments. On the other hand, in

    the following five experiments, the researchers utilized a three-sensing-element module, as shown in Fig. 3.


    Fig. 8. Surface angle measurement system setup.

    A. Surface Angle Measurement

    This experiment aims to test the effectiveness of the pro-

    posed sensor for surface angle measurement. Fig. 8 shows the

experiment setup. The sensor unit has two sensing elements with a soft material (sponge). The thickness of the sponge

    was changed from 5 to 20 mm with 5-mm interval in this

    experiment. To collect the data, the robot pushed its hand on

the measuring plane. The plane was tilted at angles of 0° (flat), 5°, 10°, 15°, 20°, and 25°. The right side of Fig. 8 shows the measuring plane movement at the example angle of 25°. The resistances of the two sensing elements were used to detect the attached surface angle. The experiments were

    conducted ten times for each sponge size and plane degree.

    Fig. 9 shows the average of the results. The vertical axis in this

figure shows the difference in the RC time-constant readings between the two sensing elements, while the horizontal axis shows the measured angle in degrees. According to the experimental results, the

    measurable angle is limited by the thickness of the soft material.

As shown in Fig. 9, the measurable limitation angles of the 5-, 10-, 15-, and 20-mm-thick sponges are within 5°, 15°, 20°, and 20°, respectively. This experiment also shows that the characteristic becomes linear for every thickness of soft material at small angles. If the thickness of the soft material is 15 mm, the linear region is around −10° to 10°. According to the experiments, the range of the measurable angle is limited at large angles due to the nonlinear characteristics of the

    sensor unit structure.

    B. Shape Measurement

    This experiment aims to test the effectiveness of the proposed

    sensor for shape measurement. Fig. 10 shows the experiment

    setup. In this experiment, the sensor unit has two sensing

    elements with a soft material (sponge). The thickness of the

    sponge was set to 15 mm. In this experiment, the robot scanned

    the surface of the object by measuring the angle along the object

    surface. The system collected the data by pushing the robot

hand on the object surface. The scanning motion starts with the tip position controlled by pushing the robot hand down along the vertical axis until it touches the object, and it continues pushing until the force reaches a certain value. Then, the robot releases its hand from the object by moving the robot

    hand up in the vertical axis. The robot then moves its hand

    2 mm forward in the horizontal axis. It repeats this process

    until scanning is finished for the whole object. The shape of the

    surface is obtained by integrating the angle data along the scan

    direction. The experimental result and the photograph of the

    measured object are shown in Fig. 11. As a result, the resistance

change between the two sensing elements could be used for shape measurement. As can be seen in Fig. 11, the scanned results

    are similar to the real object surface. This result shows that the

proposed system can be used for shape recognition.
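The reconstruction step, integrating the measured angle along the scan, can be sketched as follows (the 2-mm step matches the experiment; converting tilt to height change per step via tan() is an assumption about the geometry):

```python
import math

def reconstruct_profile(angles_deg, step_mm=2.0):
    """Rebuild a surface height profile from tilt angles sampled every
    step_mm along the scan direction: each step adds step * tan(angle)."""
    heights = [0.0]
    for a in angles_deg:
        heights.append(heights[-1] + step_mm * math.tan(math.radians(a)))
    return heights
```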

    C. Surface Normal Following

    This experiment aims to apply the proposed tactile-sensing

    system to hand pose control in order to keep the hand direction

    normal to the object surface in three dimensions. Two robot

    arms were prepared for this experiment, namely, a sensing

    robot arm and an object-holder robot arm. Two robot arms

    can interact with each other only through the objects. The

    experimental setup is shown in Fig. 12(a). In order to make

    the sensing robot follow the object surface normal smoothly,

the speed of the robot motion is controlled proportionally to the angle between the robot hand and the object plane. The controller

unit moves the robot hand to change its orientation following the changes of the attached object plane. The robot hand turns to reduce the angle, increasing the speed of the motion when the attached angle becomes larger and decreasing it when the attached angle becomes smaller. The sensing robot moves first to the

object surface. It sets its hand at 0° and then moves in the (0, 1, 0) direction to contact the surface of the object held by the object-holder robot. After that, the object-holder robot begins turning the object with a sine function through 0°, +20°, and −20°, as shown in Fig. 12(b)–(d), respectively, for two cycles with 147 sensing times. It then keeps the attitude at 0° for

30 sensing times. The term "sensing times" means the number of sensing operations for each moving step of the object-holder robot. In

    this experiment, the sampling interval was set at 550 ms, and the

    total experimental time was 96.25 s (175 sensing times). During

    the object movement, the sensing robot moves its hand to follow

    the object surface normal, as shown in Fig. 13. The dotted

    line shows the object angle, and the solid line shows the angle

between the robot hand and the object plane. As a result, the robot moved smoothly. During the motion, the three sensing

    outputs are used for controlling the robot to follow the object

surface normal. Thus, some of the sensing elements may sometimes lose contact with the object because the object was moving.

    However, the results show the contact stability and effectiveness

    of the proposed system. The robot could keep the angle between

the robot hand and the object surface at 90°, within ±5° error.

    In other words, the robot could follow the surface normal while

    changing the object angles, as shown in Fig. 13.
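The follow-up control described above can be condensed into a short sketch. This is a minimal illustration only: the force readings f1, f2, f3, the linear imbalance-to-angle mapping, and the gains K_ANGLE and K_SPEED are all assumptions for the example, not values from the paper.

```python
# Sketch of the follow-up control: the hand turns so the contact angle
# becomes smaller, with motion speed proportional to the angle magnitude.
# K_ANGLE and K_SPEED are assumed gains, not values from the paper.
K_ANGLE = 5.0     # deg of tilt per unit of normalized force imbalance
K_SPEED = 0.5     # commanded speed per degree of tilt error
MAX_SPEED = 10.0  # speed limit

def estimate_tilt(f1, f2, f3):
    """Estimate the hand-to-plane angle from the imbalance between
    sensors 1 and 2, normalized by the total contact force."""
    total = f1 + f2 + f3
    if total <= 0.0:        # no contact: the robot should not move
        return 0.0
    return K_ANGLE * (f1 - f2) / total

def follow_up_command(f1, f2, f3):
    """Return (turn_direction, speed): turn so the angle shrinks,
    faster when the attached angle is larger, slower when smaller."""
    tilt = estimate_tilt(f1, f2, f3)
    speed = min(K_SPEED * abs(tilt), MAX_SPEED)
    if tilt > 0.0:
        return -1.0, speed
    if tilt < 0.0:
        return 1.0, speed
    return 0.0, 0.0
```

Normalizing by the total force makes the tilt estimate less sensitive to how hard the object presses on the unit, which matches the intent of speed scaling with angle rather than with force.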

    D. Experiment on Robot-Hand Pose Actions

This experiment aims to realize an effective human–robot interaction, particularly an effective cooperation between them

    Authorized licensed use limited to: Guru Anandan Saminathan. Downloaded on May 10,2010 at 05:49:53 UTC from IEEE Xplore. Restrictions apply.


    SUWANRATCHATAMANEE et al.: ROBOTIC TACTILE SENSOR SYSTEM AND APPLICATIONS 1081

    Fig. 9. Surface angle measurement result.

    Fig. 10. Shape measurement system setup.

    via the object and the proposed sensor system. A person utilized

    just one hand or finger to push the object toward the sensor. The

    robot needed to follow the person to hold the object. A robot

    arm was equipped with the proposed sensor to perform active

    surface recognition. The technique to estimate the contact angle

    between the sensor and a touched object plane was utilized.

    Hand pose control was used to keep the direction of movement

normal to the 3-D plane object, which is often required to push an object for positioning. This technique can therefore be used

    for a cooperative task between a person and a robot to move a

    large object. The person might not be able to maintain object

    orientation during the movement due to the object weight.

    Therefore, to assist the person, the robot should support the

    person in different positions. In order to enable the robot to do

    this kind of task, the robot must move its hand normal to the

    object to support at different angles and levels.

Although the human is close to the industrial robot during the interaction tasks, this experiment was conducted under safe conditions. For safety reasons, the researchers implemented a function so that the robot does not move when the object does not touch the sensor unit. The researchers would also like to note that this experiment is an example of human–machine interaction. For real applications, the researchers would employ a safer robot system instead of an ordinary industrial robot.

    The first experiment in this section is to show the failed

    and successful examples of humanrobot interaction with and

    without the follow-up control of the robot. Fig. 14 shows the

    procedure of the failed example when the robot is not controlled

    and does not keep the robot-hand direction normal to the object

    surface. Fig. 14(a) shows an initial setup of the experiment.

Fig. 14(b) and (c) shows the actual robot movement without and with the follow-up control, respectively. In this procedure,

the sensing robot first set its hand at 0° and then moved to the

    waiting area. Then, the sensing robot waited for a person to

    place an object to the sensor, as shown in Fig. 14(a). Then,

a person began turning the object freely up to 45°. When a

    person moved the object without a robot follow-up control, the

    object fell down, as shown in Fig. 14(b). On the other hand,

    when a person moved the object with a robot follow-up control,

the robot could follow the movement and keep its hand direction

    normal to the object. The object was held by a human and a

    robot, as shown in Fig. 14(c).

    The second experiment is to realize an effective cooperation

    task between a human and a robot to hold the various objects

together. Fig. 15 shows a photograph of human–machine cooperation to move the various objects. As can be seen, a robot

    can interact with a person through an object. Throughout this

    experiment, the person moved the touched object plane freely,

    and the robot was able to follow the person to hold an object

together. Moreover, the experiments also confirmed human–robot cooperation

    with various objects. The test objects include a 30-g box, an

18-g sphere, a 536-g cylinder, and a 264-g block. Fig. 15(a)–(c)

    shows the photographs of cooperation between a human and a

    machine to move the aforementioned objects. Of course, such

    interactions are not possible if there is no friction between

the robot and the object. Thus, the system performance also depends on the touching area between the sensor and the object.


    Fig. 11. Shape measurement result.

    Fig. 12. Surface normal following setup and experiment.

    Fig. 13. Following normal direction result.

    Fig. 14. Actual movement when the robot is (uncontrolled/controlled) to keep the direction normal to the object surface.


Fig. 15. Human–robot interaction through various objects.

    Fig. 16. Actual movement when the robot recognizes an object edge (the starting point is around an object surface).

During the experiment, the system performance differed depending on the contacted object. The box gave better results for the interaction task than the cylinder and the sphere because of their different shapes: the flat surface of the box allows the sensor to contact the object better than the other shapes do. In

human–robot interaction through various objects, as it is difficult to show quantitative evaluations of the forces applied to a human's hand, the researchers gave various examples of

human–machine interaction to show the effectiveness of the

    proposed method.

    E. Experiment on Object Edge Recognition

    Fig. 16 shows the object-edge-detecting procedure and shows

    the actual robot movement during the sensing process parallel

    to the X-axis. As shown in Fig. 16, the robot detects the edge by

    utilizing the difference between the received forces on sensors 1

    and 2. To confirm the robustness of the proposed method, the

    average error can be calculated as follows:

E(Avg) = (1/n) Σ_{i=1}^{n} √[(Xi − XAvg)² + (Yi − YAvg)² + (Zi − ZAvg)²]    (2)

where E(Avg) represents the mean square error, (Xi, Yi, Zi) denotes the object edge coordinates in the ith measurement, n is the data length, and (XAvg, YAvg, ZAvg) represents the average coordinates of the object edge throughout all the

    measurements.
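For instance, the average error in (2) could be computed from the recorded edge coordinates as follows. This is a minimal sketch; the function name and the example coordinates are illustrative only.

```python
import math

def average_edge_error(points):
    """Average Euclidean distance of each measured edge point
    (Xi, Yi, Zi) from the mean edge position, as in (2)."""
    n = len(points)
    x_avg = sum(p[0] for p in points) / n
    y_avg = sum(p[1] for p in points) / n
    z_avg = sum(p[2] for p in points) / n
    return sum(
        math.sqrt((x - x_avg) ** 2 + (y - y_avg) ** 2 + (z - z_avg) ** 2)
        for x, y, z in points
    ) / n
```

Because each term is a Euclidean distance, the result carries the same unit (millimeters) as the measured coordinates, consistent with the error values reported below.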

    1) When the Initial Point Is Set Above the Edge: The robot

first sets its hand at 45° and moves in the (0, 1, 1) direction to

    contact its hand on the object. At the object edge, the force on

    sensor 1 should be equal to that on sensor 2. Based on the sensor

    values, when the force on sensor 2 is larger than that on sensor 1,

as shown in Fig. 16(a), the robot recognizes that the position of its hand is above the object edge. After recognizing the position of its hand, the robot

moves in the (0, −1, −1) direction until all forces on three sensors

    are equal to zero, i.e., the sensors do not contact the object.

    The robot then moves to the (0, 1, 1) direction with 1-mm

    interval based on the force difference between sensors 1 and 2.

    The robot repeats the movements until the forces on sensors 1

and 2 are equal, as shown in Fig. 16(c). At this stage, the robot finds the object edge. The experiments were conducted ten times. The mean square error (E(Avg)) was 0.29 mm.
2) When the Initial Point Is Set Under the Edge: In this case,

    when the robot touched the object, the force on sensor 2 is

    smaller than that on sensor 1, as shown in Fig. 16(b). Based

on the sensor values, the robot can recognize that the position of its hand is under the object edge position. After recognizing


    Fig. 17. Actual movement when the robot recognizes an object edge (the starting point is on an object surface).

    Fig. 18. Actual movement when the robot tracks an object edge.

    Fig. 19. Actual movement when the robot does a simulated welding task.

    the hand position on the object, the robot moves to the

(0, −1, −1) direction until all forces on three sensors are equal to

    zero, i.e., the sensors do not contact the object. The robot then

    moves to the (0, 1, 1) direction with 1-mm interval based on

    the force difference between sensors 1 and 2. The robot repeats

    the movements until the forces on sensors 1 and 2 are equal,

    as shown in Fig. 16(c). The experiments were conducted ten

times. The mean square error (E(Avg)) was 0.44 mm.
3) When the Initial Point Is Set on the Edge: In this case,

    after the robot touched the object, the force on sensor 2 is

    similar to that on sensor 1, as shown in Fig. 16(c). If the

    forces are not similar, the robot will do the object-edge-finding

procedure. The experiments were conducted ten times. The mean square error (E(Avg)) was 0.32 mm.
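The three cases above share one search loop: back off until contact is lost, then re-approach in 1-mm steps until the forces on sensors 1 and 2 balance. A minimal sketch under stated assumptions: the robot interface (read_forces, move) is hypothetical, and a toy one-dimensional contact model stands in for the real manipulator.

```python
def find_edge(robot, step_mm=1.0, threshold=0.05):
    """Back off until no sensor touches the object, then re-approach in
    1-mm steps until the forces on sensors 1 and 2 are equal (the edge)."""
    while any(f > 0.0 for f in robot.read_forces()):
        robot.move((0, -1, -1), step_mm)       # retreat from the object
    while True:
        robot.move((0, 1, 1), step_mm)         # re-approach
        f1, f2, _ = robot.read_forces()
        if f1 > 0.0 and abs(f1 - f2) < threshold:
            return                             # edge found

class ToyRobot:
    """1-D stand-in: the edge sits at y = 5; the sensor-1/sensor-2
    imbalance grows with the distance of the hand from the edge."""
    def __init__(self, y):
        self.y = y
    def read_forces(self):
        if self.y < 0.0:                       # backed off: no contact
            return (0.0, 0.0, 0.0)
        d = self.y - 5.0
        return (max(0.0, 1.0 + d), max(0.0, 1.0 - d), 1.0)
    def move(self, direction, mm):
        self.y += direction[1] * mm
```

Running `find_edge(ToyRobot(8.0))` drives the toy hand back past the contact region and then up to the balance point y = 5, regardless of whether it starts above or under the edge.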

    F. Experiment on Object Edge Finding

    Fig. 17 shows the procedure of finding the edge from the

    object surface and shows the actual robot movement when the

    robot finds the object edge from the object surface. In this case,

    the robot sets first its hand at 0 and then moves to the (0, 1,

    0) direction to contact its hand on the object surface. At this

    contacted point, the forces on three sensors should be equal, as

    shown in Fig. 17(a). The robot recognizes the hand position.

    Then, the robot keeps moving until it reaches the rough edge

point, as shown in Fig. 17(b). In other words, it keeps moving until the force on sensor 1 is equal to zero and that on sensor 2

is the largest of all. Then, the robot turns its hand by 45° and uses the object-edge-finding procedure to reach

    an exact object edge, as shown in Fig. 17(c). The experiments

were conducted ten times. The mean square error (E(Avg)) was 0.43 mm.
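The surface-to-edge procedure can be sketched in the same style. The robot interface and the sliding direction are assumptions for the example (the actual direction depends on the setup), and a toy contact model is used in place of the manipulator.

```python
def find_edge_from_surface(robot, step_mm=1.0):
    """Slide along the surface until sensor 1 reads zero while sensor 2
    carries the largest force (the rough edge point, Fig. 17(b)),
    then turn the hand by 45 deg for the exact edge search."""
    while True:
        f1, f2, f3 = robot.read_forces()
        if f1 == 0.0 and f2 >= f3:
            break                              # rough edge reached
        robot.move((0, 0, 1), step_mm)         # keep sliding
    robot.rotate_hand(45.0)

class ToySurface:
    """Stand-in model: flat contact until z = 5, where sensor 1
    slips off the edge and sensor 2 takes most of the load."""
    def __init__(self):
        self.z = 0.0
        self.hand_angle = 0.0
    def read_forces(self):
        return (1.0, 1.0, 1.0) if self.z < 5.0 else (0.0, 2.0, 1.0)
    def move(self, direction, mm):
        self.z += direction[2] * mm
    def rotate_hand(self, deg):
        self.hand_angle = deg
```

The 45° rotation at the end hands control over to the exact edge search of Section V-E, mirroring the two-stage structure of the experiment.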

    G. Continuous Tracking and Tracing an Object Edge

    This experiment aims to obtain the object information by

    tracing an object edge continuously. Welding is one potential

    application of this hand pose and motion control technique

    for finding and tracing the edge and shape of a 3-D object.

This technique can be used instead of manual teaching by a person, which is currently often necessary to obtain the object

    information for welding points before carrying out the welding

    process. Fig. 18 shows the procedure of tracking an object

    edge and shows the actual robot movement when the robot

    tracks an object edge. Fig. 19 shows the actual robot movement

when the robot performs a simulated welding task. In this experiment,

    the robot moves first to the object edge position. It then sets

its hand at 45° and moves in the (0, 1, 1) direction until its

    hand comes into contact with the object. At the object edge,

    the forces on sensors 1 and 2 should be equal. If they are not

    equal, the robot will repeat the edge recognition procedure,

    which is described in Section V-E. At this stage, the sensors

were set on the object edge. Fig. 18(a) shows the illustration and the photograph of the starting point when the robot tracks


    Fig. 20. Absolute error along an object edge during the tracking process.

    Fig. 21. Absolute error along an object edge.

an object edge. Fig. 19(a) shows the starting point when the robot does a simulated welding task. In order to do the welding

    task, the distance between the end of the welding torch and the

    object edge should be kept along the object edge by utilizing

    the set point of the reading force on sensor 3. The output of

    sensor 3 should be kept constant throughout the object-edge-

    tracing procedure. After detecting the object edge, the program

records the coordinates of the position (X, Y, Z). Then, the robot moves in the (1, 0, 0) direction with 10-mm interval and traces the object edge until it reaches the rough end point. At

    the end point, the forces on sensors 1 and 2 are equal to zero

    because only sensor 3 has touched an object. To recognize

    the details of the end point, the robot then moves back to the

(−1, 0, 0) direction with 1-mm interval to detect the sensing

    forces on sensors 1 and 2 again. After recognizing the hand po-

    sition on the object, the robot moves to the (1, 0, 0) direction

    with 1-mm interval for fine scanning. The movement continues

    until all three sensor forces are equal to zero, i.e., the sensors

    do not contact the object. Then, the robot moves back to the

(−1, 0, 0) direction by 26 mm, which is the length of the

    sensing unit. At this position, the robot confirms the object edge

    again. Then, the robot moves 13 mm forward to the (1, 0,

    0) direction, which means that the center of the sensing unit

    is set to the end of the object. Figs. 18(b) and 19(b) show

    the illustration and the photograph of the end point when the

robot tracks an object edge and does a simulated welding task, respectively. After the edge tracing, the robot goes back to

the starting point and waits for a user command. When the robot receives a command from a user, it starts the movement

    from the starting point to the end point and does a simulated

    welding task based on the obtained edge shape. Fig. 19(a) and

    (b) shows the continuous procedure when the robot does a

    simulated welding task. When a user utilizes this system, he/she

    can repeat the welding task with various welding speeds. In the

    experiments, the object was set in parallel to the X-axis, and

    the robot moved by 159 mm along the X-axis and obtained

    21 tracing points by searching the object edge. The average

of the obtained object edge (YAvg, ZAvg) was (501.6 mm, 7.2 mm), while the actual object edge was set to (500 mm,

    5 mm). The average absolute error was 1.37 mm. The absolute

    error along the edge is shown in Figs. 20 and 21.
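The coarse-to-fine end-point search described above can be condensed into the following sketch. The robot interface is hypothetical, the end-point centering is simplified to half the 26-mm sensing-unit length, and a toy edge model replaces the real workpiece.

```python
def trace_edge(robot, coarse_mm=10.0, fine_mm=1.0, unit_mm=26.0):
    """Trace along +X in 10-mm steps, recording edge coordinates, until
    sensors 1 and 2 both read zero (rough end point); then back up and
    rescan in 1-mm steps, and finally center the unit on the object end."""
    points = []
    while True:
        f1, f2, _ = robot.read_forces()
        if f1 == 0.0 and f2 == 0.0:
            break                               # rough end point
        points.append(robot.position())
        robot.move((1, 0, 0), coarse_mm)
    robot.move((-1, 0, 0), coarse_mm)           # back up one coarse step
    while any(f > 0.0 for f in robot.read_forces()):
        robot.move((1, 0, 0), fine_mm)          # fine 1-mm scan
    robot.move((-1, 0, 0), unit_mm / 2.0)       # center the sensing unit
    return points

class ToyEdge:
    """Stand-in: the object edge runs from x = 0 to x = 50 mm."""
    def __init__(self):
        self.x = 0.0
    def read_forces(self):
        return (1.0, 1.0, 1.0) if self.x <= 50.0 else (0.0, 0.0, 0.0)
    def move(self, direction, mm):
        self.x += direction[0] * mm
    def position(self):
        return (self.x, 0.0, 0.0)
```

The two-resolution scan mirrors the experiment: coarse 10-mm steps collect the tracing points cheaply, and the 1-mm rescan recovers the end-point detail lost to the coarse stride.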

VI. CONCLUSION AND FUTURE WORK

    This paper has proposed a tactile-sensing system and a

    control method for a robot to realize active object recognition.

    The tactile sensor implemented on the robot hand consists of

    three pieces of force-sensitive resistors arranged triangularly

    with the peripheral circuits. The robot arm equipped with the

    sensor has been controlled for active object recognition. The

    developed sensing device may not be enough for accurate force

    measurement. It is, however, useful for controlling the robot

arm to recognize the object shape and to maintain the hand orientation for human–robot cooperative tasks. The bandwidth


    of the sensor unit has not been examined yet. The total response

    of the sensor unit is slower than that of the FSR because the FSR

    is covered with a soft material whose mechanical reaction is

    slow. However, the proposed sensing unit works sufficiently in

    all the applications shown in this paper, as the motion of the ro-

    bot arm is slow enough. Various applications and experimental

results were introduced, such as object surface angle and object shape measurement, surface normal following for human–robot

    interactions, object edge recognition, and object edge finding

    and tracing.

    The experimental results show that the proposed tactile-

    sensing system can be used practically for obtaining the active

    object information. Accordingly, future works should include

    the application of the proposed sensor unit to actual welding

    and interactive tasks. The welding task is one potential applica-

    tion for 3-D object recognition. The researchers aim to develop

    a novel robot with an actual sensing and welding torch. As

the experimental results also show, this paper has shown only

    the tracing of a straight trajectory. Hence, future works should

    include other trajectories and error reduction to realize the

    efficient automatic welding task. Another potential application

is human–machine cooperation. The researchers would also

    like to implement the proposed system on humanoid robot feet

    to control the foot pose motion and to maintain the balance

    of the whole body to make more stable biped walking in an

    unstructured environment to support the human in his/her daily

    working tasks.

    REFERENCES

[1] Y. Motai and A. Kosaka, "Hand-eye calibration applied to viewpoint selection for robotic vision," IEEE Trans. Ind. Electron., vol. 55, no. 10, pp. 3731–3741, Oct. 2008.
[2] P. Vadakkepat, P. Lim, L. C. De Silva, L. Jing, and L. Ling, "Multimodal approach to human-face detection and tracking," IEEE Trans. Ind. Electron., vol. 55, no. 3, pp. 1385–1393, Mar. 2008.
[3] M. Rucci and P. Dario, "Active exploration procedures in robotic tactile perception," in Proc. Intell. Robot. Syst., 1993, pp. 20–24.
[4] E. Ishii, H. Nishi, and K. Ohnishi, "Improvement of performances in bilateral teleoperation by using FPGA," IEEE Trans. Ind. Electron., vol. 54, no. 4, pp. 1876–1884, Aug. 2008.
[5] H. Iwata, K. Tomita, and S. Sugano, "Quantification of human–robot physical contact states based on tactile sensing," in Proc. IEEE Int. Conf. Adv. Intell. Mechatronics, 2003, pp. 610–615.
[6] H. Iwata and S. Sugano, "Human–robot-contact-state identification based on tactile recognition," IEEE Trans. Ind. Electron., vol. 52, no. 6, pp. 1468–1477, Dec. 2005.
[7] K. Suwanratchatamanee, M. Matsumoto, and S. Hashimoto, "Balance control of robot and human–robot interaction with haptic sensing foots," in Proc. 2nd IEEE Int. Conf. Human Syst. Interaction, 2009, pp. 68–74.
[8] K. Suwanratchatamanee, M. Matsumoto, and S. Hashimoto, "Haptic sensing foot system for humanoid robot and ground recognition with one leg balance," IEEE Trans. Ind. Electron., to be published.
[9] J. Jockusch, J. Walter, and H. Ritter, "A tactile sensor system for a three-fingered robot manipulator," in Proc. IEEE Int. Conf. Robot. Autom., 1997, pp. 3080–3086.
[10] M. E. Tremblay and M. R. Cutkosky, "Estimating friction using incipient slip sensing during a manipulation task," in Proc. IEEE Int. Conf. Robot. Autom., 1993, pp. 429–434.
[11] N. Chen, R. Rink, and H. Zhang, "Local object shape from tactile sensing," in Proc. IEEE Int. Conf. Robot. Autom., 1996, pp. 3496–3501.
[12] N. Chen, H. Zhang, and R. Rink, "Edge tracking using tactile servo," in Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., 1995, pp. 84–89.
[13] L. H. Sharif, S. Yamane, T. Sugimoto, and K. Oshima, "Intelligent cooperative control system in visual welding robot," in Proc. 27th IEEE Annu. Int. Conf. Ind. Electron. Soc., 2001, pp. 439–443.
[14] X. Liu and C. Xie, "Robotic seam tracking utilizing arc light," in Proc. 6th IEEE Int. Conf. Intell. Syst. Des. Appl., 2006, pp. 616–621.
[15] X. Liu and C. Xie, "Arc-light based real-time seam tracking system in welding robot," in Proc. IEEE Int. Conf. Control Autom., 2007, pp. 2462–2467.
[16] C. Umeagukwu, B. Maqueira, and R. Lambert, "Robotic acoustic seam tracking: System development and application," IEEE Trans. Ind. Electron., vol. 36, no. 3, pp. 338–348, Aug. 1989.
[17] P. Koseeyaporn, "Continuous surface tracking for welding robot," in Proc. IEEE Int. Tech. Conf., 2004, pp. 491–494.
[18] M. Shimojo and M. Ishikawa, "An active touch sensing method using a spatial filtering tactile sensor," Trans. Inst. Electron., Inf. Commun. Eng. C-II, vol. J74-C-II, no. 5, pp. 309–316, 1991.
[19] M. Tanaka, N. Li, and S. Chonan, "Active tactile sensing using a two-finger system," in Proc. Int. Conf. Motion Vibration Control, 2002, pp. 762–767.
[20] P. Vadakkepat, P. Lim, L. C. De Silva, L. Jing, and L. L. Ling, "Multimodal approach to human-face detection and tracking," IEEE Trans. Ind. Electron., vol. 55, no. 3, pp. 1385–1393, Mar. 2008.
[21] M. H. Lee and H. R. Nicholls, "Tactile sensing for mechatronics – A state of the art survey," Mechatronics, vol. 9, no. 1, pp. 1–31, 1999.
[22] T. Matsunaga, K. Totsu, M. Esashi, and Y. Haga, "Tactile display for 2-D and 3-D shape expression using SMA microactuators," in Proc. IEEE Annu. Int. Conf. Microtechnologies Med. Biol., 2005, pp. 88–91.
[23] A. Bicchi, J. K. Salisbury, and D. L. Brock, "Contact sensing from force and torque measurements," Int. J. Robot. Res., vol. 12, no. 3, pp. 249–262, 1993.
[24] T. Takeda, Y. Hirata, and K. Kosuge, "Dance step estimation method based on HMM for dance partner robot," IEEE Trans. Ind. Electron., vol. 54, no. 2, pp. 699–706, Apr. 2007.
[25] T. Tsuji, Y. Kaneko, and S. Abe, "Whole-body force sensation by force sensor with shell-shaped end-effector," IEEE Trans. Ind. Electron., vol. 56, no. 5, pp. 1375–1382, May 2009.
[26] K. Suwanratchatamanee, M. Matsumoto, R. Saegusa, and S. Hashimoto, "A simple tactile sensor system for robot manipulator and object edge shape recognition," in Proc. 33rd IEEE Annu. Int. Conf. Ind. Electron. Soc., 2007, pp. 245–250.
[27] K. Suwanratchatamanee, M. Matsumoto, and S. Hashimoto, "A simple robotic tactile sensor for object surface sensing," Int. J. Robot. Soc. Jpn., Adv. Robot., vol. 22, no. 8, pp. 867–892, 2008.
[28] K. Suwanratchatamanee, M. Matsumoto, and S. Hashimoto, "A tactile sensor system for robot manipulator and continuous object edge tracking," in Proc. 7th France-Jpn./5th Eur.-Asia Congr. Mechatronics, 2008, CD-ROM.
[29] K. Suwanratchatamanee, M. Matsumoto, and S. Hashimoto, "Human–machine interaction through object using robot arm with tactile sensors," in Proc. 17th IEEE Int. Symp. Robot Human Interactive Commun., 2008, pp. 683–688.
[30] F. Vecchi, C. Freschi, S. Micera, A. Sabatini, and P. Dario, "Experimental evaluation of two commercial force sensors for applications in biomechanics and motor control," in Proc. IFESS, 2000.
[31] C. Lebosse, B. Bayle, M. de Mathelin, and P. Renaud, "Nonlinear modeling of low cost force sensors," in Proc. IEEE Int. Conf. Robot. Autom., 2008, pp. 3437–3442.
[32] Flexi-Force User Manual and Technical Data Sheet (Model A101), Tekscan, Inc., Boston, MA. [Online]. Available: www.Tekscan.com

Kitti Suwanratchatamanee (S'07) received the B.Eng. degree in electronics and telecommunications engineering from King Mongkut's University of Technology Thonburi, Bangkok, Thailand, in 2002, and the M.Eng. degree in electronics engineering from RMIT University, Melbourne, Australia, in 2004. He is currently working toward the Dr.Eng. degree in pure and applied physics (information engineering) in the Graduate School of Advanced Science and Engineering, Waseda University, Tokyo, Japan.

His research interests are in robotics and automation, including sensors, tactile and haptic sensing systems, robot protocol, industrial robots, humanoid robots, and intelligent welding systems.

Mr. Suwanratchatamanee is a Student Member of the Robotics Society of Japan and a Research Fellow of the Japan Society for the Promotion of Science (DC2: 20-56621). He received the IECON'07 Student Scholarship Award, the HCIMA'08 Third-Prize Award, and the HSI'09 Best Paper Award in the area of intelligent systems from the IEEE in 2007, 2008, and 2009, respectively.


Mitsuharu Matsumoto (M'06) received the B.Eng. degree in applied physics and the M.Eng. and Dr.Eng. degrees in pure and applied physics from Waseda University, Tokyo, Japan, in 2001, 2003, and 2006, respectively.

He has published 13 books and more than 50 papers in refereed international conference proceedings and journals. He is currently an Assistant Professor in the Education and Research Center for Frontier Science, University of Electro-Communications, Tokyo. His research interests

    include array signal processing, blind source separation, image processing,optical devices, pattern recognition, self-assembly, and robotics.

Dr. Matsumoto is a member of the Institute of Electronics, Information and Communication Engineers. He received the Ericsson Young Scientist Award from Nippon Ericsson K.K., Japan, in 2009.

Shuji Hashimoto (M'09) received the B.S., M.S., and Dr.Eng. degrees in applied physics from Waseda University, Tokyo, Japan, in 1970, 1973, and 1977, respectively.

He is currently a Professor in the Department of Applied Physics and the Dean of the Faculty of Science and Engineering, Waseda University, where he has also been the Director of the Humanoid Robotics Institute since 2000. He is the author of over 400 technical publications, proceedings papers, editorials, and books. His research interests are in human communication and KANSEI information processing, including image processing, music systems, neural computing, and humanoid robotics.

Dr. Hashimoto is a member of the International Computer Music Association, the Institute of Electronics, Information and Communication Engineers, the Information Processing Society of Japan, the Society of Instrument and Control Engineers, the Institute of Systems, Control and Information Engineers, the Institute of Image Electronics Engineers of Japan, the Robotics Society of Japan, the Human Interface Society of Japan, and the Virtual Reality Society of Japan.