Uppsala University
Department of Information Technology
Embedded Control Systems, spring 2009
Instructors: Prof. Alexander Medvedev, Karl Marklund, Romeo Buenaflor, Egi Hidayat
Smooth Walker Group One
Hanzheng Zou
Hongyi Zhao
Zhao Huang
Zhiqian Yuan
Abstract
The goal of this project is to let the AIBO robot walk smoothly and have better vision
ability. The standard gaits concentrate on effective locomotion but also result in
significant displacement of the optical axis of the AIBO's head camera. Thus, there is
a need for adapting a standard gait so that the head is kept as steady as possible
without much loss in the motion velocity. Relevant software tools have to be
developed.
Acknowledgments
We would like to thank the Seeing While Walking groups of year 2008 for their work
and report that we could to some extent use in our work this year.
We want to thank Professor Alexander Medvedev for the theoretical help and also for
his ideas during the course. Mr. Romeo Buenaflor has helped us with the OPENR
programming during the software development.
Index
Abstract
Acknowledgments
Index
1. Introduction
2. Project description
  2.1 Goal
  2.2 Platform
    2.2.1 Hardware
    2.2.2 Software
  2.3 Development Tools
    2.3.1 Tekkotsu
    2.3.2 OPEN-R
    2.3.3 Medit
3. Previous study
4. Experiments on AIBO
  4.1 Use MotionCommander to get the default gait
  4.2 Design a New Gait with Medit
  4.3 Evaluate body height in Matlab
  4.4 Implement Java client for taking pictures
  4.5 Message sending for controlling the robot
  4.6 Use Matlab to evaluate the shaking
  4.7 Use Camera to adjust the walking direction
5. Result
6. Future work
Reference
Appendix A: Camera Vision while walking
Appendix B: New Gait Parameters
Work division
Hanzheng Zou: Chapters 1, 2.1, 4.2, 4.5, 4.7, 5, 6
Hongyi Zhao: Chapters 1, 2.2, 2.3, 4.1, 4.3
Zhao Huang: Chapters 2.1, 2.2, 2.3, Appendix
Zhiqian Yuan: Chapters 3, 4.4, 4.6
1. Introduction
AIBO (Artificial Intelligence roBOt) was one of several types of robotic pets designed
and manufactured by Sony; from 1999 to 2006, several different models were designed.
The AIBO robotic pets are considered autonomous robots, since they are able to learn
and act based on the external environment, and even learn from other AIBOs. An AIBO
robot is able to walk on its mechanical legs, recognize some spoken commands, and
"see" the environment via the camera in its head. Because it has a computer, a vision
system and articulators integrated inside, yet is much cheaper than other conventional
research robots, AIBO is used in numerous research areas such as artificial
intelligence, robotics, and real-time and embedded control systems.
RoboCup (originally called the Robot World Cup Initiative) is an international
competition founded in 1993. It is an international research and education initiative:
an attempt to foster AI and intelligent robotics research by providing a standard
problem in which a wide range of technologies can be integrated and examined, and
which can also be used for integrated, project-oriented education. For this purpose,
RoboCup chose soccer as its primary domain. The goal of the RoboCup project is to
create a soccer team of autonomous robots that can beat the FIFA world champions by
the year 2050.
AIBO robots were earlier used as the official robot in RoboCup, but they have now
been replaced by a new type of robot since Sony discontinued the production of AIBO
in 2006. Still, research on how AIBO can play soccer better remains valuable for
future robot research.
A programmer's kit for 'non-commercial' use was released by Sony. The kit has since
been expanded into three distinct tools: R-CODE, the AIBO Remote Framework, and
the OPEN-R SDK. These three tools are combined under the name AIBO SDE
(Software Development Environment). All of them are free to download and can be
used for commercial or non-commercial purposes. Since the first release of OPEN-R,
several AIBO programming tools have been developed by university labs, including
URBI, Tekkotsu, Pyro and Cognitive Vision. In this project, OPEN-R is mainly used.
Uppsala University gave its first robot course using the Sony AIBO in early spring
2004, and each year since then several groups have worked on AIBO projects,
pursuing specific goals that contribute to robot research.
This report presents our project, named "Smooth Walker", carried out during the
Embedded Control Systems course in the spring semester of 2009. The project's aim is
to create an approach that lets the dog walk more steadily than it usually does, and
also to build a method for evaluating the new gait. The gait is required to be fast,
straight and steady, like a natural dog's. It can then be used in robot soccer games.
There are several challenges, such as how to create a brand-new gait from scratch,
how to compensate for the mechanical error of the dog, and how to evaluate the gait.
We discuss the solutions we created for these problems later in this report.
Among all the available development tools for AIBO, we chose OPEN-R as our
development environment; we think it is the most powerful one and gives us the
ability to tweak the dog for advanced functions. We also use Matlab for gait
evaluation and MEdit (a tool from Sony) to simulate new gaits.
2. Project description
2.1 Goal
The project is called Smooth Walker. As the name suggests, it aims to let the
AIBO robot walk smoothly and straight.
The goal of this project is to let the AIBO robot walk smoothly and have better
vision ability. The standard gaits concentrate on effective locomotion but also result
in significant displacement of the optical axis of the AIBO's head camera. Thus, there
is a need for adapting a standard gait so that the head is kept as steady as possible
without much loss in the motion velocity. Relevant software tools have to be
developed.
* Tools for on-line estimation of head shaking.
* Tools for automatically adjusting a gait in order to minimize the shaking.
We observed the standard gait frame by frame in a recorded video clip. We found
that the shaking is caused by the robotic gait, which lets the dog walk, but not
naturally or smoothly at all.
After a little investigation of the OPEN-R SDK, we think it is possible to create a
brand-new gait over which we have full charge of all parameters. Thus we plan to
create a new gait to weaken the shaking while walking.
We also encountered the issue that the dog will not go straight even if the gait is
totally symmetric about the z-axis, due to mechanical error. How to eliminate this
issue, or at least work around it, is therefore also a goal of this project.
Finding a proven and reliable way to evaluate the gait and the shaking is another
essential goal of this project: we need to show that the new gait not only looks
better than the standard gait but is also numerically better. We found some
experiments in previous work that evaluate a gait, and we can simply take advantage
of them.
2.2 Platform
2.2.1 Hardware
The robot we are using is the AIBO ERS-210 from Sony Corporation. The dog has the
following specification:

CPU: 64-bit RISC processor
CPU clock speed: 192 MHz
Internal memory: 32 MB
External memory/program storage: 8 MB AIBO-ware Memory Stick
Movable parts: mouth (1 degree of freedom), head (3 degrees of freedom),
  legs (3 degrees of freedom x 4), ears (1 degree of freedom x 2),
  tail (2 degrees of freedom); 20 degrees of freedom in total
Input/output: PC Card slot (in/out), Memory Stick slot (in/out),
  AC IN power supply connector (input)
Image input: 100,000-pixel CMOS image sensor
Audio input: stereo microphones
Audio output: speaker
Built-in sensors: temperature sensor, infrared distance sensor, acceleration
  sensor, touch sensors (head switch; back, chin & bottom of legs), vibration sensor
Built-in clock: date & time
Power consumption: approx. 9 W (standard operation in autonomous mode)
Operating time: approx. 1.5 hours (standard operation in autonomous mode using a
  fully charged ERA-201B1 battery)
Dimensions (W x H x L): approx. 152 x 281 x 250 mm / 6 x 11 1/8 x 9 7/8 in.
  (not including the ears & tail)
Weight: approx. 1.5 kg / 3 lb 3 oz (including battery & Memory Stick)
Color: gold/silver/black
Supplied accessories: AC adapter, lithium-ion battery pack ERA-201B1 (x 1),
  manual, pink ball
Operating temperature: 5 C to 35 C / 41 F to 95 F
Operating humidity: 10%-80%

Apart from the dog, we are using a Dell desktop PC, borrowed from the department,
for development and as a host to connect to the AIBO. The PC runs Ubuntu 8.10.
2.2.2 Software
The AIBOs run a special operating system developed by Sony, called Aperios. Sony
has released a software development kit, the OPEN-R SDK (written in C++); we give
an introduction to the OPEN-R SDK later. Programming for the AIBO is for the most
part very similar to programming in a UNIX environment, except for process control
and inter-process communication. The development environment uses a slightly
patched version of the GCC 3.3 compiler and can be run on almost any UNIX-based
platform, including Mac OS X and Cygwin under Windows.
Sony developed the proprietary real-time operating system Aperios for the
autonomous dog AIBO (Artificial Intelligence Robot). Development took about five
years, and the first model hit the market in 1999. That robot was equipped with a
64-bit MIPS RISC processor (100 MHz) and 8 MB of DRAM main memory. A color
video camera with a resolution of 180,000 pixels, together with the audio input and
output interfaces, lets the robot orient itself. Altogether 18 motors control the
movability of this mechanical pet for the entertainment market. A battery provides
electrical power for 90 minutes; to this day the robot cannot charge itself. The
hardware is built on the OPEN-R architecture, and the operating system reacts to
sensor signals in real time. Aperios is optimized for high transfer rates of audio
and video streams.
2.3 Development Tools
2.3.1. Tekkotsu
Tekkotsu is an application development framework for robots, originally created
for the AIBO. Tekkotsu is object-oriented, making extensive use of C++ templates,
and it is open source. This makes it easy for application developers to create their
own customized code without having to change Tekkotsu's source code.
We can interact with Tekkotsu through a Java interface. The GUI tool provides many
useful capabilities: you can execute many behaviors, both pre-installed ones and
behaviors written by the user. Behaviors are C++ classes with specific methods
defined by Tekkotsu. Similar capabilities are available from the console (telnet to
port 10001 on the robot) if you do not wish to use the Java-based GUI.
Figure 1. Sample Controller GUI Menu of Tekkotsu
Some of the services Tekkotsu provides include basic visual processing, forward and
inverse kinematics solvers, remote monitoring and teleoperation tools, and wireless
networking support.
Figure 2. Overview of communication between Tekkotsu and AIBO
Tekkotsu is a C++ wrapper around the OPEN-R API. It is handy to start with a tool
like Tekkotsu, since it wraps many low-level functions; with Tekkotsu, you do not
need to start from scratch, which is painful and hard. But since we planned to create
our own gait to see if we could minimize the shaking, Tekkotsu did not seem to be a
good choice, because it encapsulates many low-level details of the motions. We could
have modified its source code to achieve this, but we decided to start with OPEN-R,
which gives us almost full charge of the low-level details.
2.3.2. OPEN-R
As introduced previously, OPEN-R is a development SDK created by Sony.
Applications created with the OPEN-R SDK are object-driven: objects are loaded
upon boot and are the executable modules of the application. The executables
typically have a '.bin' extension. Objects communicate via a message-passing
mechanism.
OPEN-R programs are built as a collection of concurrently running OPEN-R objects.
An OPEN-R object is implemented and compiled as a C++ object. OPEN-R provides
an API to programmatically change the angles of all of AIBO's joints; this is the
power we are looking for. There are also plenty of code samples to learn from. We
recommend starting with the MovingLegs sample, which already contains a skeleton
for the routine of changing joint angles. After you play with the sample and dig into
the code, you can create your own gait by modifying it. You can also check our code,
because we already implemented new gaits through the OPEN-R SDK.
2.3.3. Medit
MEdit is a GUI tool from Sony for AIBO simulation. It can simulate a position with
customized joint angles, and it can generate motions from a series of positions. You
can adjust the angles dynamically and watch the motion again. This is a really handy
tool for simulating a new gait without the bother of coding and debugging. We think
it was a key to the success of creating a new gait.
Figure 3. MEdit user interface.
3. Previous study
There were five groups working on AIBO projects in 2008; four of them aimed at
achieving more stable movement by adapting AIBO's original gait. The last group
dealt with the relationship between neck jerking and leg movement to reduce the
head shaking, which is beyond our aim.
Group one designed a gait-developer tool for automatic gait construction and
evaluation. The tool accepts a set of user-defined parameters, generates a gait,
tests it, and then outputs the relevant data to a Matlab script for evaluation.
Raw camera data can be retrieved over the network and used to generate off-line
images for analysis, which is done automatically by shell scripts. This makes the
whole testing process automated, except that the battery cannot support long runs
without charging, which reduced the tests' efficiency.
Group two used the structural framework Tekkotsu to aid their design. They tried
to tune the PID parameters of all moving joints to minimize high-frequency
vibrations. They used an upright gait of their own design, so stabilization and
speed were not taken into account. The results show that their method works, but
it is still not perfect.
Group four built a model for analysis. To gather data, they used telnet, which can
send control commands to AIBO, as well as the graphical FTP client gFTP to transfer
files to and from AIBO. The use of FTP requires configuration on the AIBO side,
which is done with the tinyftpd program, a sample included with OPEN-R. Group
four also built some text-based user interfaces that can be accessed from the
terminal running telnet; the UI accepts action commands and scripts.
To evaluate gaits, some former groups made AIBO walk towards a target, which
could be a red dot on white paper, a light source in a dark room, or anything with
a strong contrast against the background. The camera in the front of AIBO then
takes a series of pictures, which can be used for further evaluation. The target is
located in each picture; its movement over the whole process corresponds to the
vibration of AIBO's gait, and graphs are plotted from those data, showing AIBO's
vibration.
4. Experiments on AIBO
The experiments to design and implement the new gait are discussed in this
chapter. These experiments are the fundamental solution for how to reach the goal
of the project.
4.1. Use MotionCommander to get the default gait
There are various ways to make the dog walk. We can use a tool like Tekkotsu,
which has a graphical user interface and has already been introduced in the Tools
section. But the gait that Tekkotsu produces is quite ugly and unnatural; you may
want to try it out yourself, or watch some videos on YouTube to see what kind of
gait Tekkotsu provides. Alternatively, we can use the sample project
MotionCommander to make the dog walk. The shaking when walking with
MotionCommander is much reduced, though not ideal, because MotionCommander
contains a set of fairly good gaits inside it. We think this is a good starting point;
otherwise we would have to start from scratch.
MotionCommander is a sub-project contained in the SoccerLion200 sample code
folder. Basically, what the SoccerLion200 program does is: once AIBO is standing,
press the head sensor to activate it; AIBO will search for the pink ball, and once
it is found, AIBO will dribble the ball or kick it. MotionCommander is a sample
project with which you can control AIBO's motion wirelessly by simply typing
commands into a console. It contains several gaits for the user to choose from, and
you can choose a specific motion such as walking forward, walking leftward, etc.
You can even specify how many steps you want the dog to take, from 1 to N. After
looking into the source code of MotionCommander, we found that the core part of
the project is not public to us; it is encapsulated in a library file
(../SoccerLion200/OMWares/lib/libOMWares.a). So we cannot simply read out the
angle data of the gaits that MotionCommander provides, as we do not have the
source code for that part.
Assuming we can make the dog walk in the ways described above, that is still not
our final goal. The project's name is "SmoothWalker", so we need to make the walk
as smooth as possible. We need to improve the dog's walking so as to reduce the
shaking of the head; then we can get stable images from the internal camera. We
think we need a way to take full charge of the walking process, rather than sending
the dog one or two commands and letting it walk by itself. Full control means that
we can tweak the angle of each joint in each step. The freedom of changing angles
gives us powerful flexibility when designing a new gait. But it is quite hard for us
to create a brand-new gait from scratch, especially as we are new to AIBO. Thus we
created a way to extract the angle information in each step through
MotionCommander; the angle information is then printed back to the terminal.
The idea is to print the angle value of each joint back to the console when a motion
command is done. We perform a one-step motion each time to capture all the angles
of all joints in a single step; after performing this four times, we are done with the
angle collection. This process is a kind of reverse engineering, although it is
impossible for us to get accurate angle results due to mechanical errors. But what
we need is just a base to start from, and these results are far more helpful than
nothing in forming a gait that will work.
The command used to connect to AIBO:
$ telnet 192.168.0.4 59000
The command used to let AIBO walk a single step:
$ w d f 1
After each step finishes, the angle data can be printed to the terminal via
OSYSPRINT calls. The code used to print the angle data is added in
MotionCommander::notifyMotionResult(const ONotifyEvent& event) in
MotionCommander.cc.
double start[NUM_JOINTS];
for (int i = 0; i < NUM_JOINTS; i++) {
    OJointValue current;
    OPENR::GetJointValue(jointID[i], &current);
    // current.value is reported in micro-radians; convert to radians, then degrees.
    start[i] = degrees(current.value / 1000000.0);
    OSYSPRINT(("%f\n", start[i]));
}
MotionCommander::notifyMotionResult(const ONotifyEvent& event) is a callback
method that is invoked after a motion is done.
So, is AIBO able to walk with these four sets of angles? The answer is no. The
process above only captures the ending position of each step, whereas a step is
composed of more than one motion (roughly two). Imagine the dog stepping: the leg
is first lifted to a certain height and at the same time moved forward; then, after
a very short stop, the leg is put back on the ground. We discuss in section 4.2 how
to really make this data useful.
4.2. Design a New Gait with Medit
In the previous part, we modified the MotionCommander project and obtained the
values of each joint in each step of the dog's walking. In theory, these values
could be assembled into a gait. Considering the errors in the extracted gait, and
that even the default gait is not a perfect one, these data cannot be used to design
a gait directly. But they are valuable data that can be studied when designing a new
gait, and a good prototype for seeing how the dog walks.
To design a gait, we have the excellent software called MEdit, which is provided by
Sony. Using this software, we can simulate the dog's walking and make an animation
to preview the design.
In principle, there are four steps to designing a new gait using this tool.
First, find out how many poses there are in the dog's walking cycle. For example,
if you design a gait that follows iterations of
"RightFront --> LeftRear --> LeftFront --> RightRear", there will be at least 4
poses in the walking loop. The best approach is to use 2 groups of data for each
pose, which means eight groups of joint values for one walking loop.
Second, set the value of each joint in the different poses. The preview window at
the top left shows the current state of the dog as you enter different values into
the Pose Control list.
Third, preview the walking. There is a play button in the Motion Control window;
it links the 4 poses together, and an animation of your motion design is shown in
the top-left window.
Fourth, check whether the walking poses hang together, and change the joint values
until the animation looks like reasonable walking. A coherent gait is the key to a
stable walking gait.
The fourth step will be repeated several times if you want to design a "perfect"
gait. A "perfect" gait should be symmetric: the left and right feet should step the
same distance forward. If the dog does not walk symmetrically, it may drift to one
side or walk an "S"-shaped route. The OPEN-R implementation can then be done based
on the joint values of the gait designed in MEdit.
4.3. Evaluate body height in the Matlab
At the very beginning of the project, even before we find a way to extract a gait from
MotionCommander, we have an idea about how to make a gait smooth. The idea is
talking two properties. One is the vertical height from joint 2, which connect main
body and a leg, to the sole of a leg. Another is the horizontal length of how far the
leg has moving forward comparing with joint 1.
Figure 4. Robot leg calculation
We use some measurements from previous reports. The length from joint 2 to joint 1
is fixed at 61.85 mm. The length from joint 1 to the sole (point P) is 66.71 mm for
the front legs (76.49 mm for the rear legs). Thus we can calculate the height using
the following equation:
H = A * cos(a) + B * cos(a+b)
and the horizontal length:
L = A * sin(b) + B * sin(a+b)
Each leg has such a height property. The idea is to keep the heights of all legs the
same while walking; then, theoretically, the main body, including the head, will not
shake due to inconsistent leg heights. The idea is simple to understand, but later
in the project we found it is not easy to make it really practical.
Here is what we are trying to do with this idea: we apply the theory explained
above in the process of designing a gait. After a gait is designed, we have 12
angles per motion (3 for each leg). We can then use the equations above to calculate
the height of a specific leg in a single motion. We use Matlab for this tedious
task. The code to be run in Matlab is:
function countHeight = count( )
% Read joint angles (one per line) from dat.txt and print the height and
% horizontal length of each leg for every step.
    A  = 61.85;   % joint 2 to joint 1 (mm)
    Bf = 66.71;   % joint 1 to sole, front legs (mm)
    Br = 76.49;   % joint 1 to sole, rear legs (mm)
    fid = fopen('dat.txt');
    index = 1;
    while 1
        tline = fgetl(fid);
        if ~ischar(tline), break, end
        if (mod(index,3) == 1)
            a = str2num(tline);
            if (mod(index,4) == 1)
                disp(sprintf('Step # %d\n', (index - 1)/12 + 1));
            end
        elseif (mod(index,3) == 2)
            % The middle joint of each leg is not used in this calculation.
            index = index + 1;
            continue;
        elseif (mod(index,3) == 0)
            b = str2num(tline);
            if (mod(index/3,4) == 1)
                h = A*cosd(a) + Bf*cosd(a+b);
                l = A*sind(b) + Bf*sind(a+b);
                disp(sprintf('Right front height: %f -- %f\n', h, l));
            elseif (mod(index/3,4) == 2)
                h = A*cosd(a) + Bf*cosd(a+b);
                l = A*sind(b) + Bf*sind(a+b);
                disp(sprintf('Left front height: %f -- %f\n', h, l));
            elseif (mod(index/3,4) == 3)
                h = A*cosd(a) + Br*cosd(a+b);
                l = A*sind(b) + Br*sind(a+b);  % rear legs use Br
                disp(sprintf('Right rear height: %f -- %f\n', h, l));
            elseif (mod(index/3,4) == 0)
                h = A*cosd(a) + Br*cosd(a+b);
                l = A*sind(b) + Br*sind(a+b);  % rear legs use Br
                disp(sprintf('Left rear height: %f -- %f\n', h, l));
            end
        end
        index = index + 1;
    end
    fclose(fid);
end
The code takes an input file named "dat.txt", which contains all the angle data,
one angle per line. The calculation results are printed to the console as follows:
Step # 1
Right front height: 87.935377 -- 121.370152
Left front height: 102.017698 -- 102.842673
Right rear height: 109.167099 -- 92.858056
Left rear height: 96.564963 -- 116.706403
Step # 2
Right front height: 109.410256 -- 87.140644
Left front height: 113.694860 -- 84.174658
Right rear height: 112.211615 -- 94.781690
Left rear height: 112.745333 -- 88.576318
……
……
……
The first number is the height property; the second number is the length property.
4.4. Implement Java client for taking picture
We found a simple way to get images from AIBO: we wrote a Java program that sends
an HTTP request to the port of AIBO's image-taking object. When the data is
received, the program decodes it, recreates a JPEG image, and saves it to disk.
These pictures show how AIBO "sees" while it is walking; you can see AIBO's vision
as a sequence of pictures in Appendix A.
4.5. Message sending for controlling the robot
In OPEN-R, the executable code is based on OObjects, and each OObject runs
separately in the embedded operating system. In general, several independent
OObjects control the robot's motions, but they need to interact with each other to
accomplish their work, so message sending between the OObjects is important in the
project. In the SmoothWalker project, message passing is used in two places: first,
the user sends commands to control the robot; second, the head-controlling object
sends commands about the environment information. In this part we mainly talk about
message sending for the user to control the robot.
Here is a diagram showing the message passing and state changes of ControlSubject
and MovingLegs; both objects inherit from the OPEN-R OObject.
Figure 5. Overview of message sending for controlling robot movement.
As you can see from the diagram, the ControlSubject object has a function called
SendCommand(); the commands "start", "stop", "go", etc. are sent in this function.
In the MovingLegs object, a function called NotifyCommand() handles the messages
sent by ControlSubject, and it also contains code to send a message back to report
the result. This sending-receiving mechanism requires a message-passing channel to
be defined, which is done in the configuration file "stub.cfg".
Stub.cfg in ControlSubject:
ObjectName : ControlSubject
NumOfOSubject : 1
NumOfOObserver : 1
Service : "ControlSubject.MovingLegs.ControlSubjectCommand.S", null,
SendCommand()
Service : "ControlSubject.MovingLegs.ControlSubjectResult.O", null,
NotifyMovingLegsResult()
Stub.cfg in MovingLegs:
ObjectName : MovingLegs
NumOfOSubject : 3
NumOfOObserver : 2
Service : "MovingLegs.Move.OCommandVectorData.S", null, Ready()
Service : "MovingLegs.Result.ControlSubjectResult.S", null, null
Service : "MovingLegs.Command.ControlSubjectCommand.O", null, NotifyCommand()
Service : "MovingLegs.TargetResult.BallTrackingHeadResult.S", null, null
Service : "MovingLegs.Target.BallTrackingHeadCommand.O", null,
NotifyTargetCommand()
Then, in "\MS\OPEN-R\MW\CONF\CONNECT.CFG", add the following definitions:
ControlSubject.MovingLegs.ControlSubjectCommand.S MovingLegs.Command.ControlSubjectCommand.O
MovingLegs.Result.ControlSubjectResult.S ControlSubject.MovingLegs.ControlSubjectResult.O
This means the message sent on "ControlSubject.MovingLegs.ControlSubjectCommand.S"
is received on "MovingLegs.Command.ControlSubjectCommand.O". With that, all
preparations are done before you send messages between these OObjects. In the same
way you can design the message passing between any OObjects.
4.6. Use Matlab to evaluate the shaking
We use Matlab to evaluate the performance of various gaits, including both their
adjusted and unadjusted versions.
We first retrieve the images captured by AIBO's camera; here is an example of what
they look like:
Figure 6. Example of image taken by AIBO camera.
These images are then processed with a Matlab file, imdetect.m. In this file, the
image is first converted to a grayscale image using the function RGB2GRAY. Then the
dot is shrunk to a one-pixel dot. The images before and after processing are shown
below:
Figure 7. Images processing in Matlab.
Now we can obtain the coordinates of the dot by read the coordination of this white
pixel.
The Matlab file measuremove.m reads a batch of images, calls the function
imdetect from imdetect.m, and applies the processing described above to the
whole group of images. After that, we obtain the coordinates of the dots by
executing:
[y,x] = find(img==1);
and the size of the image by executing:
[n, m] = size(img);
Next we plot a graph showing the shakiness of the gait:
[AX,H1,H2]=plotyy(0:to-from,coord(:,1)-m/2,0:to-from,coord(:,2) - n/2,'plot');
Besides the image analysis, we also use an offline analysis to compare the
shakiness of various gaits, both to verify the effect of the adjustment and to
draw conclusions about stable gait design. For this we obtain the dot positions
directly from the AIBO, read the position sequence into Matlab, and plot a
graph. A group of such graphs follows:
Figure 8. Evaluation of non-adjusted walking in the x-direction.
As we can see, the gait is quite unstable before adjustment.
The evaluation of the adjusted gait follows:
Figure 9. Evaluation of auto-adjusted walking in the x-direction.
The evaluation shows that the adjustment keeps the AIBO walking towards the
target.
Finally, we calculate the variance of the various gaits so that they can be
analyzed mathematically.
4.7. Use the camera to adjust the walking direction
Before an AIBO robot can kick the ball in RoboCup, it must first find a way to
reach the ball. In principle, the robot looks around the environment with the
camera in its head, finds the direction of the ball, and then tries to walk
towards it. One problem arises here: if the robot cannot walk straight, it will
never reach the ball even though it knows where the ball is located. The
inability to walk straight has two main causes: the robot may walk with an
unsuitable gait, and mechanical errors occur while the robot is running. We
should therefore design as good a gait as we can so that the robot walks
straight, and try to minimize the mechanical errors.
A better way to make it walk straight is to use the camera to obtain target
information and choose a dynamic gait that compensates for the mechanical
errors. Message passing between OObjects is also needed in this solution. The
following diagram gives an overview of how it runs.
Figure 10. Overview of the auto-adjusting algorithm implemented with the camera.
The code below shows the picture analysis in the BallTrackingHead class. The
NotifyImage() function is executed whenever a new picture is taken by the
camera:
[Diagram: the camera triggers NotifyImage() in BallTrackingHead, which sends "adjust to right", "adjust to left", or "no adjust" to MovingLegs via NotifyTargetCommand(); Ready() in MovingLegs then changes the gait dynamically.]
void BallTrackingHead::NotifyImage(const ONotifyEvent& event) {
    ……
    if (ballTrackingHeadState == BTHS_SEARCHING_BALL) {
        if (cdtImage.ColorFrequency(BALL_CDT_CHAN) >= BALL_THRESHOLD) {
            found++;
        } else {
            found = 0;
            //SearchBall();
        }
        if (found == FOUND_THRESHOLD) {
            //OSYSPRINT(("### BALL FOUND ### \n"));
            found = 0;
            CentroidAndDeltaAngle(cdtImage, &xc, &yc, &d_pan, &d_tilt);
            //OSYSPRINT(("### BallHeadTracking %d , %d ##### pixelnumber %d into File\n", xc, yc, cdtImage.ColorFrequency(BALL_CDT_CHAN)));
            OSYSPRINT(("%d %d\n", xc, yc));
            if (xc > 48) {
                //OSYSPRINT(("send to right message\n"));
                BallTrackingHeadCommand legscmd(BALLTHCMD_MOVE_TO_ADJUSTTORIGHT);
                subject[sbjMovingLegs]->SetData(&legscmd, sizeof(legscmd));
                subject[sbjMovingLegs]->NotifyObservers();
            } else if (xc < 40) {
                BallTrackingHeadCommand legscmd(BALLTHCMD_MOVE_TO_ADJUSTTOLEFT);
                subject[sbjMovingLegs]->SetData(&legscmd, sizeof(legscmd));
                subject[sbjMovingLegs]->NotifyObservers();
            } else {
                BallTrackingHeadCommand legscmd(BALLTHCMD_MOVE_TO_NOADJUST);
                subject[sbjMovingLegs]->SetData(&legscmd, sizeof(legscmd));
                subject[sbjMovingLegs]->NotifyObservers();
            }
        }
    }
}
The code performs the image analysis that decides what compensation the robot's
legs should make to walk straight. The picture taken by the camera is 88*72
pixels. We first extract the "pink" color in the image and then use
CentroidAndDeltaAngle to get the centre point (xc, yc) of the pink ball in the
image. If the robot is walking straight towards the pink ball, xc should in
principle be around 44. If xc is bigger than 48, the centre of the target is in
the right side of the dog's vision and the dog is drifting to the left; the
message BALLTHCMD_MOVE_TO_ADJUSTTORIGHT is then sent to tell MovingLegs to
adjust its direction to the right. If xc is smaller than 40, the message
BALLTHCMD_MOVE_TO_ADJUSTTOLEFT is sent instead. MovingLegs thus compensates for
the mechanical errors using the real-time adjustment messages from
BallTrackingHead.
To let the dog choose its gait dynamically based on the real-time camera
analysis, MovingLegs has three suites of gaits corresponding to the
compensation decisions, defined as ADJUST_TORIGHT, ADJUST_TOLEFT, and
ADJUST_NONE. The code handling the dynamic gait adjustment is as follows:
BallTrackingHeadCommand* cmd = (BallTrackingHeadCommand*)event.Data(0);
if (cmd->type == BALLTHCMD_MOVE_TO_ADJUSTTORIGHT) {
//OSYSPRINT(("##############################Should adjust leg walk to right a bit\n"));
adjustDirect = ADJUST_TORIGHT;
}else if (cmd->type == BALLTHCMD_MOVE_TO_ADJUSTTOLEFT) {
//OSYSPRINT(("##############################Should adjust leg walk to LEFT a bit\n"));
adjustDirect = ADJUST_TOLEFT;
}else {
adjustDirect = ADJUST_NONE;
}
This algorithm, implemented in MovingLegs, receives the decision from
BallTrackingHead through message passing. In the leg-controlling part of the
walk, the robot then uses an adjustable set of leg joint values.
if (movingLegsState == MLS_MOVING_TO_WALKINGPOS8 && command == GO) {
    OSYSDEBUG(("MLS_MOVING_TO_8\n"));
    //MovingResult r = MoveToPos(WALKINGPOS8_ANGLE, WALKINGPOS4_MAX_COUNTER, isStarting);
    MovingResult r;
    if (isSlowWalk) {
        r = MoveToPos(WALKINGPOS8_ANGLE, WALKINGPOS4_MAX_COUNTER_SLOW, isStarting);
    }
    else {
        if (isAdjustOn == false || adjustDirect == ADJUST_NONE) {
            r = MoveToPos(WALKINGPOS8_ANGLE, WALKINGPOS4_MAX_COUNTER, isStarting);
        }
        else if (isAdjustOn == true && adjustDirect == ADJUST_TORIGHT) {
            //OSYSPRINT(("MLS_MOVING_TO_8 used to right gait\n"));
            r = MoveToPos(WALKINGPOSTORIGHT4_ANGLE, WALKINGPOS4_MAX_COUNTER, isStarting);
        }
        else if (isAdjustOn == true && adjustDirect == ADJUST_TOLEFT) {
            r = MoveToPos(WALKINGPOSTOLEFT4_ANGLE, WALKINGPOS4_MAX_COUNTER, isStarting);
        }
    }
}
This code snippet shows how a dynamic set of joint vectors is selected while
the robot is walking, which is the core idea of the dynamic walking adjustment
for the AIBO. It improves the stability of the AIBO's walking considerably, and
the result is clearly visible both in the Matlab evaluations and when the robot
actually runs: the AIBO can always get to the ball!
5. Result
The plan in the SmoothWalker project was to carry out research focused on how
to design and implement a stable AIBO walking gait, and to find a way to
evaluate the robot's head shaking. To design a reasonable and suitable gait,
several experiments in the related areas were performed. In the end, the
following work has been done:
Find an approach to design a gait.
Implement an algorithm that adjusts the gait automatically.
Find methods to evaluate the robot's shaking.
In the final project, the dynamic walking solution works as in the following
figure.
Figure 11. Workflow of the SmoothWalker final solution.
This figure shows a stable and straight walking solution for the robot. The
shaking is clearly reduced, as the Matlab analysis results show. The robot's
horizontal shaking (x-direction) has been largely eliminated with the newly
designed gait and the dynamic adjustment algorithm.
Gait 1, x-direction, without compensation: variance = 159.3209
Gait 1, x-direction, with compensation: variance = 16.1256
Gait 1, y-direction, without compensation: variance = 10.6001
Gait 1, y-direction, with compensation: variance = 7.7584
Figure 12. Evaluation comparisons in the x and y directions for SmoothWalker.
The experience and the implementation results also give AIBO program designers
some principles for designing a reasonable and stable gait. First, try to keep
the four leg-body joints at the same height while the robot is walking; the
feet then carry the robot body in one plane, which makes the walking more
stable. Second, let the robot walk symmetrically, i.e. with the same height and
stride on both sides; this clearly improves the robot's walking. These
principles can be applied when a user wants to design a new gait from scratch.
6. Future work
Robots are always an interesting topic to work with. In the SmoothWalker
project we have carried out research based on the AIBO robots manufactured by
SONY. Totally new gaits were designed in the project, and they show good
results while walking. Even though many different experiments and solutions
were tried, some points could still be improved in future research.
How to design an "accurate" gait. We have found an approach to design a stable
and "good" gait, but the design is still not perfect. Although we found the
principle that the robot should walk symmetrically and keep its body at the
same height, we could not really keep the robot's legs at the same height
while walking. A method for designing an "accurate" gait is needed.
Methods to improve an existing gait. A new gait is not always needed in
practice; a user might want to keep an old gait but optimize it. Ways to
optimize an existing gait should be investigated.
Research on inverse kinematics. More inverse kinematics research should be
done, from which further conclusions can be drawn for designing the robot's
animations.
References
[1] Sven Westermark. Programming tools for Aibo. April 2005.
http://www.it.uu.se/edu/course/homepage/styrsystem/vt07/Intro
[2] Ricardo A. Téllez. Introduction to the Aibo programming environment. 17th
July 2005.
[3] Etienne Dysli. Ordinary Differential Equations Framework for the Robotic
Dog Aibo. 14th February 2005.
[4] Marie Emilsson, Eric Hansander, Michael Skarped. Seeing While Walking. June
2007.
http://www.it.uu.se/edu/course/homepage/styrsystem/vt07/Generellt/Rapport7.pdf
[5] Johan Simonsson, Johan Nilsson. Seeing While Walking. June 2007.
http://www.it.uu.se/edu/course/homepage/styrsystem/vt07/Generellt/Rapport8.pdf
[6] Erik Bengtsson, Henrik Larsson, Anton Smedby. Seeing While Walking. April
2007.
http://www.it.uu.se/edu/course/homepage/styrsystem/vt07/Generellt/Rapport9.pdf
[7] D. Yazar, M. A. Khan, E. Aksu, M. Alzate and L. Yang. Seeing While Walking.
2008.
http://www.it.uu.se/edu/course/homepage/styrsystem/vt08/Nyheter/Grupper/g5_Final_Report.pdf
Appendix A: Camera Vision while
walking.
Appendix B: New Gait Parameters
The gait 1 definitions are shown below. There are 5 groups of gaits in the
source code.
//New gait 1
const double WALKINGPOS1_ANGLE[] = {
    0, 0, 0,        // head: TILT, PAN, ROLL
    5, 5, 100,      // then four legs, 3 joints each
    -25, 5, 96,
    -35, 5, 96,
    -55, 5, 120
};
const double WALKINGPOS2_ANGLE[] = {
    0, 0, 0,        // head: TILT, PAN, ROLL
    5, 5, 55,
    -40, 5, 95,
    -12, 5, 85,
    -35, 5, 95
};
const double WALKINGPOS3_ANGLE[] = {
    0, 0, 0,        // head: TILT, PAN, ROLL
    -25, 5, 96,
    5, 5, 100,
    -55, 5, 120,
    -35, 5, 96
};
const double WALKINGPOS4_ANGLE[] = {
    0, 0, 0,        // head: TILT, PAN, ROLL
    -40, 5, 95,
    5, 5, 55,
    -35, 5, 95,
    -12, 5, 85
};
const double WALKINGPOSTORIGHT1_ANGLE[] = {
    0, 0, 0,        // head: TILT, PAN, ROLL
    5, 20, 100,
    -25, 5, 96,
    -35, 20, 96,
    -55, 5, 120
};
const double WALKINGPOSTORIGHT2_ANGLE[] = {
    0, 0, 0,        // head: TILT, PAN, ROLL
    5, 20, 55,
    -40, 5, 95,
    -12, 20, 85,
    -35, 5, 95
};
const double WALKINGPOSTORIGHT3_ANGLE[] = {
    0, 0, 0,        // head: TILT, PAN, ROLL
    -25, 20, 96,
    5, 5, 100,
    -55, 20, 120,
    -35, 5, 96
};
const double WALKINGPOSTORIGHT4_ANGLE[] = {
    0, 0, 0,        // head: TILT, PAN, ROLL
    -40, 20, 95,
    5, 5, 55,
    -35, 20, 95,
    -12, 5, 85
};
const double WALKINGPOSTOLEFT1_ANGLE[] = {
    0, 0, 0,        // head: TILT, PAN, ROLL
    5, 5, 100,
    -25, 20, 96,
    -35, 5, 96,
    -55, 20, 120
};
const double WALKINGPOSTOLEFT2_ANGLE[] = {
    0, 0, 0,        // head: TILT, PAN, ROLL
    5, 5, 55,
    -40, 20, 95,
    -12, 5, 85,
    -35, 20, 95
};
const double WALKINGPOSTOLEFT3_ANGLE[] = {
    0, 0, 0,        // head: TILT, PAN, ROLL
    -25, 5, 96,
    5, 20, 100,
    -55, 5, 120,
    -35, 20, 96
};
const double WALKINGPOSTOLEFT4_ANGLE[] = {
    0, 0, 0,        // head: TILT, PAN, ROLL
    -40, 5, 95,
    5, 20, 55,
    -35, 5, 95,
    -12, 20, 85
};