7/27/2019 Python SDK2 0
1/42
7/27/2019 Python SDK2 0
2/42
© 2002 Evolution Robotics, Inc. All rights reserved. Evolution Robotics and the Evolution
Robotics logo are trademarks of Evolution Robotics, Inc.
Microsoft DirectX is a trademark of Microsoft Corporation.
Microsoft Speech SDK 5.1 is a trademark of Microsoft Corporation.
Microsoft Windows is a trademark of Microsoft Corporation.
Other product and brand names may be trademarks or registered trademarks of their respective
owners.
Part number MC6080.
Last revised 11/15/02.
7/27/2019 Python SDK2 0
3/42
ER1 SDK User Guide

Table of Contents

Chapter 1: Introduction
    Task Overview
    Running Tasks
    Customer Support
    Registration
    ER1 Community
    Accessories

Chapter 2: Software Installation
    Installing the Software
    Resource Configuration
    User Configuration File
        setUserConfig
        getUserConfig
    Assigning a Camera

Chapter 3: Evolution Robotics Software Development Kit
    Using Tasks
    Parallel Tasks
    Defining New Tasks
    TaskContext
    Events
    Example Program
    Units
        setDefaultUnits
        getDefaultUnits
    About X, Y Coordinates
    Overview of Tasks
    Tasks in Detail
        DetectColor
        DetectMotion
        DetectObject
        DetectSound
        DoAtATime
        DoPeriodically
        DoWhen
        DriveStop
        GetImage
        GetPosition
        Move
        MoveRelative
        MoveTo
        PlaySoundFile
        RecognizeSpeech
        SendMail
        SetDriveVelocities
        Speak
        SpeakFromFile
        Stop
        Turn
        TurnRelative
        TurnTo
        Wait
        WaitUntil
        WatchInbox

Chapter 4: Tutorials
    Object Sonar
        Imports
Chapter 1: Introduction
The ER1 Software Development Kit (ER1 SDK) for Python was created around a
powerful, flexible API that empowers you to create intricate robot behaviors that build on the capabilities of the ER1. (Python is especially suited for this application due to its
versatility and ease of use.)
The ER1 SDK is organized as a series of tasks. Each task describes a discrete action that
the robot can perform, such as moving forward or playing a sound. These tasks can be
used singly, strung together in sequence or executed in parallel. Tasks communicate with
each other using pieces of information called Events.
If you want your robot to perform a task that is not provided with the API, you can write
your own. You can mix tasks you've written yourself with the provided ER1 tasks.
Task Overview
Tasks are divided into two distinct groups: Action Tasks and Monitoring Tasks. Both
task types can produce Events.
Action tasks perform a specific action then exit. These actions are usually short in
duration. An example of an action task is PlaySoundFile. This task simply plays a
sound file and then exits.
Monitoring tasks check the status of their environment until they are terminated.
An example of a monitoring task is DetectMotion. This task runs continuously,
raising MotionDetection events whenever motion is detected. Other interested
tasks can then use the information contained in the MotionDetection events.
Events are pieces of information that are published by one task and are received by
other interested tasks.
Running Tasks
There are two ways to run multiple tasks: Sequentially and in Parallel. Each of these is
suited to different situations and different task types.
Sequential Tasks - Action tasks are the only type of task that can be run sequentially. In
this scenario, the tasks are performed one after the other with no two overlapping. An
example of this would be using the MoveRelative task to move forward five feet and
then the Speak task to say "Hello". Because monitoring tasks never self-terminate, they
cannot be used in a sequential construct. In this case the parallel construct is needed.
Parallel Tasks - The parallel construct is used to run two or more tasks at the same time.
You will need this construct every time you run a monitoring task. The parallel construct can be used to mix and match any of the different action or monitoring tasks in
order to create an unlimited number of robot behaviors. (Note that a group of parallel
tasks can be run in two different ways: they can wait for one task to terminate, or they can
wait for all the tasks to terminate.)
The following is an example of using a parallel construct. Imagine that we want to make
the robot listen for voice commands and respond to them. This goal will require at least
two tasks running simultaneously: one to perform speech recognition, and another to
respond.
We can use the built-in RecognizeSpeech task to generate events when a voice
command is heard. RecognizeSpeech is a monitor task, which means that it will never
complete on its own, but instead keeps running and generating speech events until it is explicitly terminated. Now we need to run another task in parallel with
RecognizeSpeech to wait for the speech events.
The built-in WaitUntil task waits for an event and runs some other task when it receives
that event. So we could have WaitUntil wait for speech recognition events, then run the
Speak task with an argument of "hello". WaitUntil would then terminate immediately
after running Speak.
When we add RecognizeSpeech and WaitUntil to the same parallel construct, they run
simultaneously. As soon as a speech recognition event is raised, WaitUntil will catch the
event and run Speak, causing the robot to respond. When WaitUntil terminates, the
parallel construct will terminate all remaining tasks in the construct, in this case
RecognizeSpeech, and we are done.
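The coordination pattern described above (a monitoring task that raises events, a waiting task that consumes one event and terminates, and a parallel construct that then shuts everything down) can be sketched outside the SDK in plain, modern Python with threads and a queue. This is an illustration of the pattern only; none of the names below come from the ER1 SDK:

```python
import queue
import threading

def monitor(events, stop):
    # Simulated monitoring task: raises one event, then keeps
    # running until explicitly terminated (like RecognizeSpeech).
    events.put({"type": "SpeechRecognized", "text": "hello"})
    stop.wait()

def waiter(events, results):
    # Simulated WaitUntil: blocks until an event arrives, acts on
    # it once, then terminates on its own.
    event = events.get()
    results.append("You said '%s'." % event["text"])

events = queue.Queue()
stop = threading.Event()
results = []

t_monitor = threading.Thread(target=monitor, args=(events, stop))
t_waiter = threading.Thread(target=waiter, args=(events, results))
t_monitor.start()
t_waiter.start()

t_waiter.join()   # "wait for first task" to finish...
stop.set()        # ...then terminate the remaining monitoring task
t_monitor.join()
print(results[0])  # prints: You said 'hello'.
```

Here the stop flag plays the role of the parallel construct terminating the remaining tasks once the first one completes.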
Customer Support
Evolution Robotics Customer support is available by email at
[email protected] or by filling out the form at
www.evolution.com/support/. Customer Service representatives are available by calling
toll free at 866-ROBO4ME or, for international customers, at 626-229-3198, Monday through
Friday, 9 A.M. to 5 P.M. Pacific Time.
Registration
Remember to register your robot online at www.evolution.com. By registering you get the
following benefits:
World-class customer support
A RobotMail email account
Membership in the ER1 SDK development community
Software updates and bug fixes
Information on new accessories and software
ER1 Community
The ER1 community is a place to share development ideas, lessons learned, shortcuts,
pictures and applications. After you have completed registration, visit the ER1 SDK
section of the ER1 community at www.evolution.com.
Accessories
Evolution Robotics is continually developing new accessories for your ER1. Each of these
accessories is designed to expand the number and variety of applications you can perform
with your ER1.
XBeams Expansion Pack - The XBeams Expansion Pack allows you to change the
configuration of your robot. For example, you can create your own SnakeBot, add a
backpack or design something unique.
ER1 Gripper Arm - The ER1 Gripper is Evolution Robotics' latest addition to our
ER1 accessories. The gripper arm allows your ER1 to grasp, carry and release
objects.
IR Sensors - The IR sensors will enhance the ER1's obstacle avoidance
capabilities. The IR Sensor Pack will be available for sale later this year.
Chapter 2: Software Installation
Installing the Software
The ER1 Software Development Kit is available on the Evolution Robotics website at
www.evolution.com. Follow the directions posted on the website to download the
software, then follow these steps:
1. Download the executable named ER1_Python_Setup.exe to your Windows Desktop.
This may take several minutes, depending on your Internet connection.
2. Click on the ER1_Python_Setup.exe file to initiate the Installshield Wizard. The
Installshield Wizard will walk you through the installation process. Click on Next.
3. Read the License Agreement carefully and then click on Yes. The installation process
will begin. You can cancel the installation process at any time by clicking on the
Cancel button.
4. When the installation process is complete, you will get the message "Setup has
finished installing ER1 SDK for Python on your computer." Click on the Finish button.
Resource Configuration
A resource describes a software or hardware interface that provides additional capabilities
to the ER1 SDK. For the ER1 SDK for Python system to load resources correctly, it must
be told which resources to load and how. The resource configuration system provides this
and other information about resources and the external environment.
Resource information is stored in XML format. The primary resource file is named
resource-config.xml, and is located in the config directory in the ER1 SDK
installation directory. This file lists the resources present on the system, with parameters
to the appropriate drivers, and provides additional information about the physical
configuration of the system. This file is shipped with a default hardware setup for the
ER1. In addition, a number of optional hardware resources are listed in the resource
config, but are commented out (i.e. enclosed in <!-- and --> comment markers). To enable one or more of these
resources, simply remove the comments from around the appropriate section.
Optional resources listed in the file include:
A second camera
ER1 Gripper Arm
ER1 IR sensors
ViaVoice speech recognition and Text to speech (not included)
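As an illustration only (the element and attribute names below are invented for this example and are not the actual resource-config.xml schema), enabling a commented-out resource means deleting the surrounding comment markers:

```xml
<!-- Disabled: the entry is wrapped in comment markers. -->
<!--
<resource id="camera2" driver="example_camera_driver">
  <parameter name="device" value="1"/>
</resource>
-->

<!-- Enabled: the same entry with the comment markers removed. -->
<resource id="camera2" driver="example_camera_driver">
  <parameter name="device" value="1"/>
</resource>
```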
User Configuration File
Some tasks in the ER1 SDK allow you to use a user configuration file in order to define
frequently used values. The file is named user-config.xml and can be found in the
config directory of the ER1 SDK installation directory. This file can be used to control
how a task operates. An example of this would be setting the mail server to be used in the
SendMail task. Any values defined in this file will apply to all tasks, so set these values
carefully. To set and get values in the user config file, use the setUserConfig and
getUserConfig functions.
setUserConfig
This function is used to set any user configuration values.
Usage
ersp.task.setUserConfig(key, value, 1|0)
Parameters
key This parameter defines the name of the config setting.
value This specifies the value of the config setting.
1|0 (Optional) A value of 1 saves the parameter to the user config file, and a value of 0 does
not. By default, any values set are not saved and are only used during that Python session.
Returns
Nothing.
Example
The following example shows how to set your mailserver:
ersp.task.setUserConfig("MailHostname", "mail.yourmail.com")
getUserConfig
This function is used to see any user config values that have been set.
Usage
ersp.task.getUserConfig(key)
Parameters
key This parameter specifies the name of the config setting.
Returns
This function returns all of the settings in the user-config.xml file.
Assigning a Camera

The Camera Chooser GUI is used to select a camera to use for obstacle avoidance and a
camera to use for object recognition. It is initiated by clicking on camerachooser.exe
in the installation directory.
Click on the Swap Camera button to change the camera used for each function.
Chapter 3: Evolution Robotics Software Development Kit
The following sections go into detail about the concepts and tasks you will need to use.
These sections are:
Using Tasks
Parallel Tasks
Defining New Tasks
TaskContext
Events
Units
About X, Y Coordinates
Overview of Tasks
Tasks in Detail
Setting a Camera
Using Tasks
Tasks are, at their heart, a lot like functions. To have the robot execute a task, you call it with some arguments. For example, to make the robot move forward 10 centimeters, you
can use the following command in a Python script:
from ersp.task import navigation
navigation.MoveRelative(0, [10, 0, 5, 4])
Or to make the robot move in a square:
from ersp.task import navigation
# Move forward 50 cm
navigation.MoveRelative(20.0, [50, 0, 10])
# Move to the right 50 cm
navigation.MoveRelative(20.0, [50, 50, 10])
# Move back 50 cm, then move to the left 50 cm
navigation.MoveRelative(40.0, [0, 50, 10], [0, 0, 10])
Some tasks also return useful values. For example, to determine the location and
orientation of the robot, you can use the GetPosition task:
from ersp.task import resource
(x, y, theta, timestamp) = resource.GetPosition()
print "I am at (%s, %s)." % (x, y)
Parallel Tasks
For a robot that lives in a complex, changing environment, sometimes simple sequences
of actions aren't enough; sometimes you want your robot to do more than one thing at a
time. To make the robot perform multiple tasks in parallel, you can use the Parallel class.
For example, the following script will cause your robot to rotate 360 degrees around and
simultaneously speak to you:
from ersp import task
from ersp.task import speech
from ersp.task import navigation
import math
p = task.Parallel()
p.addTask(navigation.TurnRelative, [math.pi * 2])
p.addTask(speech.Speak, ["I'm starting to feel a little dizzy."])
p.waitForAllTasks()
You use addTask to specify the tasks and their arguments that you want the robot to
perform, and you can add as many tasks as you want. Remember that adding a task doesn't start it running. The waitForAllTasks method starts up all the tasks at the same time,
then waits until all the tasks have completed.
Defining New Tasks
In addition to using the predefined tasks that are provided as part of the ER1 SDK, you
can define new tasks of your own. It's as simple as defining a new function, plus one
additional step of registering the function as a task.
For example, here's a script that defines a new, very simple task and then calls the task:
from ersp import task

# First define the task function
def MyTaskFun(context):
    print "hello!"

# Then register the task
MyTask = task.registerTask("MyTask", MyTaskFun)

# Finally, call the task (not the function).
MyTask()
TaskContext
There are a few things you have to know when writing a function that you plan on
registering as a task. First, the function must take a single argument. When the task is
executed, that argument will be bound to the TaskContext object, which represents the
task context. One of its most important functions is packaging up the task arguments.
To extract the task arguments you can use the context's getArguments method, which
returns a tuple containing the arguments. For example, here's a task that adds its two
arguments together and returns the sum:
from ersp import task

def AddFun(context):
    # a is bound to the first argument, b to the second.
    (a, b) = context.getArguments()
    return a + b

Add = task.registerTask("Add", AddFun)
print Add(5, 4)
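To make the calling convention concrete, here is a minimal, self-contained Python sketch of what a task registry of this shape could look like. It is an illustration of the pattern only, not the ER1 SDK's actual implementation:

```python
class TaskContext(object):
    # Packages up the arguments a task was called with.
    def __init__(self, args):
        self._args = args

    def getArguments(self):
        return self._args

def registerTask(name, function):
    # Wrap a task function so that calling the registered task
    # builds a TaskContext and passes it as the single argument.
    def task(*args):
        return function(TaskContext(args))
    task.__name__ = name
    return task

def AddFun(context):
    (a, b) = context.getArguments()
    return a + b

Add = registerTask("Add", AddFun)
print(Add(5, 4))  # prints 9
```

This also shows why you call the task rather than the function: the wrapper is what builds and supplies the context.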
Events
Some tasks are designed to watch for particular events or situations, for example a loud
noise, a recognizable object in front of the camera, or a spoken command. These tasks
need a way to signal when the situation they have been monitoring for occurs, and the
way they signal other tasks is by raising events. You can make your tasks wait for these
events, and take appropriate actions.
Example Program
Here is an example that waits for the speech recognition system to hear a spoken
command:
from ersp.task import speech
from ersp import task

def WaitForCommandFun(context):
    task = context.getTask()
    event = task.waitForEvent(speech.EVENT_SPEECH_RECOGNIZED)
    text = event.getProperty("text")
    print "You said '%s'." % (text,)

WaitForCommand = task.registerTask("WaitForCommand", WaitForCommandFun)

p = task.Parallel()
p.addTask(WaitForCommand)
p.addTask(speech.RecognizeSpeech, ["mygrammar.something.something"])
p.waitForFirstTask()
First, here's an overview of what this program does. It starts up two tasks in parallel, the
custom WaitForCommand task, and RecognizeSpeech. RecognizeSpeech is a task that
runs forever and raises events when the speech recognition system recognizes speech.
WaitForCommand waits for one of the speech recognition events, prints the text that was
recognized, then returns. Because we call Parallel's waitForFirstTask
method, the program ends as soon as the WaitForCommand task returns.
The first thing WaitForCommand needs to do is get a handle on the Task object that
represents the currently executing instance of the WaitForCommand task. This is required
in order to call the Task's waitForEvent method.
The waitForEvent method waits for an event to occur and returns the event. You specify
which events you're waiting for by passing the type as the first argument (EVENT_SPEECH_RECOGNIZED in the example).
The method does not return until an event of that type is raised by another task.
Once a speech recognition event is raised and waitForEvent returns, we can look up the
value of the "text" property, which for recognition events contains the text that was
recognized.
You can even write tasks that raise their own events. It's as easy as 1-2-3:
1. Creating the event
2. Setting any properties
3. Raising the event
Here's an updated version of the Add task that instead of returning the sum of its two
arguments, raises an event that contains the sum.
This is just a fictional example of how to use events:
from ersp import task

def AddFun(context):
    (a, b) = context.getArguments()
    event = Event("Sum")
    event["sum"] = a + b
    manager = context.getTaskManager()
    manager.raiseEvent(event)
Units
The ER1 SDK uses a certain set of default units for its functions. These are centimeters for
forward and backward motion, radians for rotation, and seconds for time. However, you
may change these units to inches, feet, or meters for movement, degrees for rotation, or
minutes for time. How do you do this? You use the setDefaultUnits function to set the
units or the getDefaultUnits function to find out how your units are set.
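Internally, a units setting like this amounts to converting user-supplied values into the SDK's base units (centimeters, radians, seconds). The sketch below is plain Python showing that bookkeeping; the conversion table is standard arithmetic, not the SDK's actual code:

```python
import math

# Conversion factors from each supported unit to the base units:
# centimeters (distance), radians (angle), seconds (time).
TO_BASE = {
    "cm": 1.0, "in": 2.54, "ft": 30.48, "m": 100.0,  # distance -> cm
    "rad": 1.0, "deg": math.pi / 180.0,              # angle -> rad
    "sec": 1.0, "min": 60.0,                         # time -> sec
}

def to_base_units(value, unit):
    # Convert a value expressed in the user's chosen unit
    # into the corresponding base unit.
    return value * TO_BASE[unit]

print(to_base_units(12, "in"))   # roughly 30.48 (cm)
print(to_base_units(90, "deg"))  # roughly 1.5708 (rad)
```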
setDefaultUnits
Usage
import ersp.task
ersp.task.setDefaultUnits(ersp.task.UNIT_type, unit)
Parameters
UNIT_type This parameter specifies the unit type: UNIT_DISTANCE, UNIT_ANGLE, and/or
UNIT_TIME.
unit This parameter sets the units to be used for each UNIT_type. These are:
DISTANCE - This parameter can be set to cm (centimeters), ft (feet), m (meters), or
in (inches).
ANGLE - The ANGLE parameter can be set to rad (radians) or deg (degrees).
TIME - This can be set to sec (seconds) or min (minutes).
Returns
Nothing.
getDefaultUnits
Usage
import ersp.task
ersp.task.getDefaultUnits(UNIT_type)
Parameters
UNIT_type This parameter can be set to UNIT_DISTANCE, UNIT_ANGLE, or UNIT_TIME.
Returns
This function returns the distance, angle and/or time setting requested.
About X, Y Coordinates
This coordinate system, with the positive x axis pointing forward and the positive y axis
pointed toward the left, is the coordinate system we all know and love (positive x-axis
pointed to the right, positive y-axis pointed forward, +x, +y values in the forward-right
quadrant), but rotated 90 degrees counter-clockwise. The reason for the rotation is that we
want the 0 degree mark (i.e. the positive x-axis) to be pointed forward. This coordinate
system, with the x-axis pointed forward, is the standard in all of robotics, and that is why
the ER1 SDK uses it.
1. Robot starting position (0, 0) with front of robot pointing along X+ axis.
2. Robot path to new relative position of x=10, y=20.
3. Robot position after first relative move of x=10, y=20. Axes are redrawn so that robot
is again at the position (0, 0), with the front of the robot pointing along the X+ axis.
4. Robot path to new relative position of x=10, y= -30
5. Robot position after relative move of x=10, y= -30. Robot is facing in the direction it
would have been facing if the robot had traveled in a straight line to its new position.
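The five steps above can be checked numerically. The plain-Python sketch below (not SDK code) tracks the robot's pose in the original global frame: each relative target (x forward, y left) is rotated into the global frame, and the robot ends up facing along its direction of travel:

```python
import math

def apply_relative_move(pose, dx, dy):
    # pose is (x, y, theta) in the original global frame;
    # (dx, dy) is the target in the robot's current frame.
    x, y, theta = pose
    gx = x + dx * math.cos(theta) - dy * math.sin(theta)
    gy = y + dx * math.sin(theta) + dy * math.cos(theta)
    # After the move the robot faces along its direction of travel.
    return (gx, gy, theta + math.atan2(dy, dx))

pose = (0.0, 0.0, 0.0)                     # step 1: at (0, 0) facing +x
pose = apply_relative_move(pose, 10, 20)   # steps 2-3
print(pose[:2])                            # (10.0, 20.0)
pose = apply_relative_move(pose, 10, -30)  # steps 4-5
print("(%.2f, %.2f)" % pose[:2])           # final global position
```

After the second move the robot sits at about (41.30, 15.53) in the original frame, even though in its own re-anchored frame it is once again at (0, 0).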
Overview of Tasks
The following are the tasks with brief descriptions:
DetectColor - This task begins watching for a specified color and raises events
when it is seen.
DetectMotion - Returns the amount of motion between consecutive frames from
the camera.
DetectObject - This task begins watching for a specified object and raises events
when it is seen.
DetectSound - Returns the perceived volume level from the microphone.
DoAtATime - Performs a task at a specific time.
DoPeriodically - Performs a task periodically, every so often.
DoWhen - Waits for an event, and executes a task when the event is raised.
DriveStop - Sends a stop command to the drive system.
GetImage - Returns an image from the camera.
GetPosition - Returns the robot's position in the global coordinate system (x, y, theta).
Move - Moves at the given velocity, with obstacle avoidance.
MoveRelative - Moves a set of relative distances.
MoveTo - Moves to a set of absolute points.
PlaySoundFile - Plays a sound file.
RecognizeSpeech - Recognizes speech according to the specified grammar,
raising events when speech is detected.
SendMail - A task that sends mail and waits for its completion.
SetDriveVelocities - Sets the drive system to move at a given velocity.
Speak - "Speaks" the given text using TTS, waiting until completed.
SpeakFromFile - "Speaks" the specified file using TTS, waiting until completed.
Stop - Stops.
Turn - Moves at the given angular velocity, with obstacle avoidance.
TurnRelative - Turns a relative distance.
TurnTo - Turns to an absolute heading.
Wait - Waits for the specified amount of time.
WaitUntil - Waits for an event.
WatchInbox - A task that waits for new mail and raises an event when it is
available.
Tasks in Detail
DetectColor
When the color is recognized, an Evolution.ColorDetected event is raised. The event
will have a "label" property corresponding to the label passed to this task, and it will have
a "heading" property whose value is the angle (in radians) to the color.
This task does not terminate.
Usage:
DetectColor(label, color)
Parameters:
label [string] The label to use in the event.
color [DoubleArray] The color to watch for. This should be a DoubleArray of size 5 containing
values generated by the color training application.
Returns:
None.
Events Raised
ersp.task.vision.EVENT_COLOR_DETECTION("Evolution.ColorDetected")
Raised when the specified color is detected.
label [string] The label argument given to the task.
heading [double] The angle to the color (0 is straight ahead).
DetectMotion
Returns the amount of motion between consecutive frames from the camera.
Usage:
DetectMotion(threshold, min_change_per_px, min_event_interval, resource)
Parameters:
threshold [integer, double] (Optional) Pixel delta threshold. The default value is 15.
min_change_per_px [integer, double] (Optional) Minimum delta per pixel across the entire
visual field. The default value is 1.
min_event_interval [integer, time] (Optional) Minimum interval, in seconds, between
MotionDetected events. The default value is .1 seconds.
resource [integer, string] (Optional) Camera resource.
Events Raised
ersp.task.vision.EVENT_MOTION_DETECTION_STARTED("Evolution.MotionRecognitionStarted")
Raised once DetectMotion has finished initializing and has started watching for motion.
Has no properties.
ersp.task.vision.EVENT_MOTION_DETECTED("Evolution.MotionDetected")
Raised when motion over the specified threshold is detected.
diff_per_pixel [double] The amount of motion.
DetectObject
When the object is recognized, an Evolution.ObjectDetected event is raised. The
event will have a "label" property whose value is the name of the object recognized, and a
"heading" property whose value is the angle (in radians) to the object.
DetectObject may have to load a large model set file before it can begin performing
recognition; if you're interested in knowing exactly when loading completes and
recognition starts you can wait for an Evolution.ObjectRecognitionStarted event
(which has no properties).
This task does not terminate.
Usage
DetectObject(model_set, min_time)
Parameters:
model_set [string] The filename of the model set to use.
min_time [double] An optional minimum time between detection events for each model.
More parameters may be given to restrict which models within the model set will be
recognized.
Returns:
Nothing.
Events Raised
ersp.task.vision.EVENT_OBJECT_RECOGNITION_STARTED("Evolution.ObjectRecognitionStarted")
Raised when DetectObject has finished initializing and has started watching for objects. Has no properties.
ersp.task.vision.EVENT_OBJECT_DETECTION("Evolution.ObjectDetected")
Raised when an object is detected.
label [string] Identifies the object that was recognized.
heading [double] The heading to the object (0 is straight ahead).
elevation [double] The elevation of the object (0 is level).
distance [double] The distance to the object.
period [double] The minimum time between events.
DetectSound
Returns the perceived volume level from the microphone.
Usage
DetectSound(threshold, min_time_between_events, resource_id)
Parameters:
threshold [double] Sound level threshold (0 ≤ threshold ≤ 1).
min_time_between_events [time] (Optional) Event interval.
resource_id [string] (Optional) WinVoice resource.
Returns:
NULL; this task raises events
Events Raised
ersp.task.net.EVENT_SOUND_DETECTED ("Evolution.SoundDetected")
Raised when the audio level peaks above the threshold level.
loudness [double] Loudness of the sound (between 0.0 and 1.0).
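The threshold and minimum-interval parameters interact as sketched below. This is a pure-Python illustration of the event logic (report a peak above the threshold, then suppress further events until the interval has passed), not SDK code:

```python
def detect_peaks(samples, threshold, min_gap):
    # Illustrative sketch of DetectSound's event logic: report a
    # (index, loudness) pair whenever loudness exceeds the threshold,
    # but suppress further reports until min_gap samples have passed.
    events = []
    last = None
    for i, loudness in enumerate(samples):
        if loudness > threshold and (last is None or i - last >= min_gap):
            events.append((i, loudness))
            last = i
    return events

# Two peaks survive: the second loud sample falls inside the gap.
peaks = detect_peaks([0.1, 0.8, 0.9, 0.2, 0.7], threshold=0.5, min_gap=2)
```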
DoAtATime
Performs a task at a specific time.
Usage
DoAtATime(time, task)
Parameters:
time
[double]
The time argument specifies the time at which to execute the given task, given as the
number of seconds since the Epoch. You can use Python's time.time() function to
determine the current time and adjust it. For example,
import time
now = time.time()
# Add 10 seconds
future = now + 10
DoAtATime(future, MyTask)
task
[pointer]
Task to be run at the specified time.
Any additional arguments are passed to the task when it is run.
Returns:
Nothing
DoPeriodically
Performs a task periodically, once every given interval.
Usage
DoPeriodically(time, task)
Parameters:
time [double] Seconds between task invocations.
task [pointer] Task to run at each interval.
Any additional arguments are passed to the task when it is run.
Returns:
Return value of task.
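Conceptually, DoPeriodically behaves like the following pure-Python loop. This is a sketch of the semantics, not the SDK implementation: the real task runs under the task manager and loops until terminated, so the three-round cap here exists only so the sketch finishes.

```python
import time

def do_periodically(interval, task, *args):
    # Conceptual sketch of DoPeriodically: invoke the task, sleep for
    # the interval, repeat. Capped at three rounds for illustration.
    results = []
    for _ in range(3):
        results.append(task(*args))
        time.sleep(interval)
    return results

# Invokes the task three times, 10 ms apart.
calls = do_periodically(0.01, lambda x: x * 2, 21)
```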
DoWhen
Waits for an event, and executes a task when the event is raised.
Usage
DoWhen(timeout, eventspec, task)
Parameters:
timeout [double] (Optional) Time, in seconds, to wait for the event. Defaults to infinity.
eventspec [string] The event to wait for.
task [pointer] (Optional) If specified, a task to run if the event is raised. The event will be passed to the
task as the first argument.
Any additional arguments are passed to the task when it is run.
Returns:
Return value of task.
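The wait-then-run flow can be sketched in plain Python as follows. do_when here is an illustrative stand-in for the SDK task, with events modeled as (name, payload) pairs; it shows the documented contract that the event is passed to the task as its first argument, followed by any extra arguments.

```python
def do_when(events, eventspec, task, *args):
    # Conceptual sketch of DoWhen: scan a stream of (name, payload)
    # events; when the awaited event arrives, call the task with the
    # event first, then any extra arguments.
    for name, payload in events:
        if name == eventspec:
            return task(payload, *args)
    return None  # timeout: the event never arrived

result = do_when(
    [("Other", {}), ("Evolution.SpeechRecognized", {"text": "hi"})],
    "Evolution.SpeechRecognized",
    lambda ev, prefix: prefix + ev["text"], "heard: ")
```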
DriveStop
Sends stop command to the drive system.
Usage
DriveStop()
Parameters
None.
Returns
Nothing.
GetImage
Returns an image from the camera.
Usage
GetImage(resource_id)
Parameters:
resource_id [string] (Optional) Camera resource ID.
Returns:
image [out, Image]; image data
GetPosition
Polls the range sensor and returns distance-timestamp pairs in one contiguous array.
Usage
GetPosition(resource_id)
Parameters:
resource_id [string] Range sensor ID. Multiple sensors can be specified.
Returns
distances [out, double_array]; { d0, t0, d1, t1, ..., dX, tX}
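The flat return array interleaves distances and timestamps, so a small helper can regroup it into pairs. pairs is an illustrative helper, not an SDK function:

```python
def pairs(distances):
    # The flat array { d0, t0, d1, t1, ... } interleaves distances and
    # timestamps; regroup it into (distance, timestamp) tuples.
    return list(zip(distances[0::2], distances[1::2]))

# Two readings, 0.1 s apart:
readings = pairs([1.5, 0.0, 1.4, 0.1])
```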
Move
Moves at the given velocity, with obstacle avoidance.
Usage
Move(v, w)
Parameters:
v [double] Linear velocity.
w [double] Angular velocity.
Returns
Nothing.
MoveRelative
Moves a set of relative distances. See About X, Y Coordinates for details about
specifying x, y coordinates.
Usage
MoveRelative(timeout, dx, dy [, v, w, eps, final_heading])
Parameters:
timeout [double] Seconds to wait for completion; zero means no timeout.
dx (Required) Distance to move forward.
dy (Required) Distance to the left.
v (Optional) Velocity.
w (Optional) Angular velocity.
eps (Optional) How close to stop from object.
final_heading (Optional) Direction the robot should face when finished.
Returns
Nothing.
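The dx/dy convention above (forward and left, measured in the robot's own frame) can be made concrete with a little trigonometry. This sketch assumes a standard right-handed pose (x, y, heading in radians); it is an illustration of the coordinate convention, not SDK code:

```python
import math

def relative_target(x, y, heading, dx, dy):
    # Convert a relative move (dx forward, dy left, in the robot's
    # frame) into an absolute target point in the world frame.
    tx = x + dx * math.cos(heading) - dy * math.sin(heading)
    ty = y + dx * math.sin(heading) + dy * math.cos(heading)
    return tx, ty

# Robot at the origin facing +x: "forward 2, left 1" lands at (2, 1).
target = relative_target(0.0, 0.0, 0.0, 2.0, 1.0)
```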
MoveTo
Moves to a set of absolute points.
Usage
MoveTo(timeout, dx, dy [, v, w, eps, final_heading])
Parameters
timeout [double] Seconds to wait for completion; zero means no timeout.
dx (Required) Distance to move forward.
dy (Required) Distance to the left.
v (Optional) Velocity.
w (Optional) Angular velocity.
eps (Optional) How close to stop from object.
final_heading (Optional) Direction the robot should face when finished.
Returns
Nothing.
PlaySoundFile
Plays a sound file.
Usage
PlaySoundFile(file, async)
Parameters:
file [string] Path to the sound file.
async [boolean] (Optional) Whether the sound should be played asynchronously, meaning
that the task returns immediately but the sound keeps playing. True means async. The
default is false (sync).
Returns:
Nothing.
RecognizeSpeech
Recognizes speech according to the specified grammar, raising events when speech is
detected. This task runs until terminated.
Usage
RecognizeSpeech(grammar, resource)
Parameters:
grammar [string] Grammar file to use.
resource [string] (Optional) ASR (recognizer) resource to use.
Returns
Nothing
Events Raised
ersp.task.speech.EVENT_SPEECH_RECOGNIZED("Evolution.SpeechRecognized")
Raised when speech was recognized and accepted.
text [string] The text of the speech.
confidence [double] The confidence value.
ersp.task.speech.EVENT_SPEECH_REJECTED("Evolution.SpeechRejected")
Raised when speech was possibly recognized, but rejected.
text [string] The text of the speech.
confidence [double] The confidence value.
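A caller might post-filter recognitions by confidence, much as the recognizer itself separates Recognized from Rejected events. The helper and the 0.5 cutoff below are illustrative, not SDK constants:

```python
def accept_recognitions(results, min_confidence=0.5):
    # Hypothetical post-filter: keep only (text, confidence) results
    # whose confidence meets the threshold, mirroring the way the
    # recognizer raises SpeechRejected for low-confidence matches.
    return [text for text, conf in results if conf >= min_confidence]

# The garbled low-confidence result is dropped.
kept = accept_recognitions([("stop", 0.9), ("sbob", 0.2)])
```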
SendMail
A task that sends mail and waits for its completion.
Usage
SendMail(to, from, subject, message, attachment)
Parameters:
to [string] Email address.
from [string] (Optional) Sender name.
subject [string] (Optional) Subject.
message [string] (Optional) Text message.
attachment [string] (Optional) Attachment to send (must specify complete path).
Returns
Immediately if an error occurs; otherwise once the mail has been sent.
SetDriveVelocities
Sets the drive system to move at the given velocities.
Usage
SetDriveVelocities(v, w)
Parameters
v [double] Linear velocity, in default units.
w [double] Angular velocity, in default units.
Returns
Nothing.
Speak
"Speaks" the given text using TTS, waiting until completed.
Usage
Speak(text, voice, resource)
Parameters:
text [string] Text to speak.
voice [string] (Optional) Default voice to use (the text can override).
resource [string] (Optional) TTS resource to use.
Returns
Nothing.
SpeakFromFile
"Speaks" the specified file using TTS, waiting until completed.
Usage
SpeakFromFile(file, voice, resource)
Parameters:
file [string] File from which to read.
voice [string] (Optional) Default voice to use (the file can override it).
resource [string] (Optional) TTS resource to use.
Returns:
Nothing.
Stop
Stops the robot.
Usage
Stop(stop_type)
Parameters:
stop_type [double] (Optional) The type of stop. STOP_OFF, STOP_SMOOTH, and
STOP_ABRUPT are allowed values.
Returns:
Nothing
Turn
Moves at the given angular velocity, with obstacle avoidance.
Usage
Turn(w)
Parameters:
w [double] Angular velocity in default angular units.
Returns:
Nothing
TurnRelative
Turns through a relative angle.
Usage
TurnRelative(angle, w)
Parameters:
angle [double] The angle to turn through, in radians.
w [double] (Optional) The desired angular velocity (defaults to pi/6 rad/s).
Returns:
Nothing.
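Ignoring acceleration, the default angular velocity of pi/6 rad/s implies a simple duration estimate for a turn. turn_duration is an illustrative helper, not an SDK function:

```python
import math

def turn_duration(angle, w=math.pi / 6):
    # Rough duration of a TurnRelative call: time = |angle| / angular
    # velocity, using the documented default of pi/6 rad/s and
    # ignoring acceleration and deceleration.
    return abs(angle) / w

# A quarter turn (pi/2 rad) at the default speed takes about 3 seconds.
duration = turn_duration(math.pi / 2)
```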
TurnTo
Turns to an absolute heading.
Usage
TurnTo(timeout, eps, angle, angular_velocity)
Parameters:
timeout [double] Seconds to wait for completion; zero means no timeout.
eps How close to get to the target.
angle Angle, in radians.
angular_velocity Velocity.
Returns
Nothing.
Wait
Waits for the specified amount of time.
Usage
Wait(duration)
Parameters:
duration [double] Seconds to wait.
Returns:
Nothing.
WaitUntil
Waits for an event.
Usage
WaitUntil(eventspec, timeout)
Parameters:
eventspec [string] Event to wait for.
timeout [double] (Optional) Time to wait for the event, in seconds. Defaults to infinity.
Returns:
Nothing.
WatchInbox
A task that waits for new mail and raises an event when it is available.
Usage
WatchInbox(pop_server, user_name, password, from_filter,
to_filter, subject_filter, message_filter, poll_interval)
Parameters:
pop_server [string] (Optional) Incoming server name (searched for in the user config file if omitted).
user_name [string] (Optional) User name (searched for in the user config file if omitted).
password [string] (Optional) Password (searched for in the user config file if omitted).
from_filter [string] (Optional) Sender name filter.
to_filter [string] (Optional) Receiver name filter.
subject_filter [string] (Optional) Subject line filter.
message_filter [string] (Optional) Mail body filter.
poll_interval [string] (Optional) Time interval (seconds) to check for new mail. The default is 30 seconds.
Returns:
Immediately if an error occurs; otherwise runs until the task is killed.
Events Raised
ersp.task.net.EVENT_MAIL_RECEIVED ("Evolution.MailReceived")
Raised when new mail is received. Depending on the filter criteria, the event contains one
or all of the following properties:
* to [string]; address of the receiver
* from [string]; address of the sender
* subject [string]; the subject line
* body [string]; the contents of the mail
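How the filters combine is sketched below. The SDK does not document the exact matching rules, so the substring matching here, and the helper itself, are assumptions for illustration only:

```python
def matches_filters(mail, from_filter=None, subject_filter=None):
    # Hypothetical illustration (not SDK code) of how WatchInbox's
    # filters might be applied to an incoming message before an
    # Evolution.MailReceived event is raised. Substring matching is
    # an assumption; every supplied filter must match.
    if from_filter is not None and from_filter not in mail["from"]:
        return False
    if subject_filter is not None and subject_filter not in mail["subject"]:
        return False
    return True

mail = {"from": "er1@evolution.example", "subject": "status report"}
```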
Chapter 4: Tutorials
Object Sonar
This program makes the robot act as if it has sonar, and demonstrates the techniques
you will need to write your own programs.
Wait for a person to say the name of an object the robot can recognize ("box", "dollar bill",
"logo"), then turn slowly in place. Make a sonar ping noise every 5 seconds. If the object
is spotted, make a sonar ping noise more often. Don't stop until a person says "stop". A
person can say a new object to be looked for at any time.
Say "quit" to quit the program completely.
The tutorial exercises object recognition, speech recognition, repeating tasks at
intervals, playing audio files, and turning at a given velocity.
Imports
# The first thing we need to do is import the ersp.task module so we
# can use the API.
import ersp.task
# We know we want to be able to use some movement tasks, the speech
# recognition task, and some general utility tasks, so we have to
# import those modules. We use the 'from module import submodule'
# syntax so we can refer to the tasks like 'speech.DetectSpeech'.
from ersp.task import navigation
from ersp.task import speech
from ersp.task import vision
from ersp.task import util
# We'll also be using the sleep function from the standard Python time
# module.
import time
# And pi from the math module.
import math
## -- Constants
# Here we define the list of objects that the robot can recognize.
KNOWN_OBJECTS = {"cereal": "cranberries.png"}
# This is the audio file containing the sonar ping sound.
PING_SOUND_FILE = "ping.wav"
# The speed at which the robot should rotate, in radians/s.
ANGULAR_VELOCITY = 10.0 * (math.pi / 180.0)
## -- Custom Tasks and Utility Functions
# Here we define a function for a task that will be in charge, and
# figure out what to do based on what commands we hear.
def MainSonarLoopFun(context):
    timeToQuit = 0
    task = context.getTask()
    # Enable the speech recognition event so that we don't accidentally
    # miss one while we're in the body of the loop.
    task.enableEvent(speech.EVENT_SPEECH_RECOGNIZED)
    # Run in a loop until either this task is terminated from outside,
    # or we decide it's time to quit (a person gave us the 'quit'
    # command).
    while not (context.terminationRequested() or timeToQuit):
        # Wait for a speech recognition event from the DetectSpeech task.
        event = task.waitForEvent(speech.EVENT_SPEECH_RECOGNIZED)
        # Check to make sure we really got an event (waitForEvent will
        # unblock and return None when this task is terminated).
        if event != None:
            # Grab the value of the event property that contains the actual
            # text that was recognized.
            command = event["text"]
            # Check whether we got the pause command, and if so then pause
            # the sonar.
            if command == "pause":
                print "Pausing."
                stopSonar()
            # Check whether we got the quit command, and if we did, then stop
            # the sonar and set a flag that lets us know we can end this
            # while loop.
            elif command == "quit" or command == "exit":
                print "Exiting."
                stopSonar()
                timeToQuit = 1
            # Check whether the word we heard was the name of an object we
            # know about. If it was, then look for that object.
            elif command in KNOWN_OBJECTS:
                scanMessage = "Scanning for %s." % (command,)
                print scanMessage
                speech.Speak(scanMessage)
                runSonar(KNOWN_OBJECTS[command])
            # Whatever it was the person said, it wasn't something we were
            # expecting or know how to handle. Let the person know.
            else:
                print "Sorry, I don't know what you mean."
                speech.Speak("Sorry, I don't know what you mean.")
    # Don't forget to disable the event.
    task.disableEvent(speech.EVENT_SPEECH_RECOGNIZED)
# Now that we've defined the MainSonarLoop function, the way we use it
# as a task is to call registerTask (and we assign the resulting task
# to MainSonarLoop).
MainSonarLoop = ersp.task.registerTask("MainSonarLoop", MainSonarLoopFun)
# This task just starts up the DetectObject task and the RecognitionHandler
# task.
def SonarFun(context):
    # objectName should be a string containing the name of the object
    # to look for.
    [objectName] = context.getArguments()
    # We want to run the two tasks in parallel.
    # Note here that we pass our context in to create this Parallel
    # object. This is always good practice, if you're in a task that
    # has been passed its own context object. For this task it is
    # required so that task termination happens properly.
    p = ersp.task.Parallel(context)
    # Tell DetectObject to look for the object specified by objectName.
    detector = p.addTask(vision.DetectObject, ("base.dla", 2.5, objectName))
    # Add the RecognitionHandler task (which doesn't need any arguments).
    handler = p.addTask(RecognitionHandler)
    # Neither of the tasks we just added will ever end, so this really
    # just waits until this task is explicitly terminated.
    p.waitForAllTasks()
    # When we do finally get terminated, we want to terminate these
    # tasks as well.
    detector.terminate()
    handler.terminate()
    return None
Sonar = ersp.task.registerTask("Sonar", SonarFun)
# This task does the actual pinging
def RecognitionHandlerFun(context):
    task = context.getTask()
    while not context.terminationRequested():
        # Wait for an object recognition event, then ping. Or if 4
        # seconds go by without an event, just ping anyway.
        event = task.waitForEvent(vision.EVENT_OBJECT_DETECTION, 4000.0)
        if event != None:
            print event
        util.PlaySoundFile(PING_SOUND_FILE)
RecognitionHandler = ersp.task.registerTask("RecognitionHandler", RecognitionHandlerFun)
# We start a Sonar task in the background. We keep track of the task
# in this variable.
sonarTask = None
# This function starts the Sonar task.
def runSonar(objectName):
    global sonarTask
    print "Starting sonar. It can take about 10-15 " + \
          "seconds before the sonar is ready to detect things."
    # If there's already a Sonar task running, stop it.
    if sonarTask != None:
        stopSonar()
    else:
        util.PlaySoundFile(PING_SOUND_FILE)
    # Start a new Sonar task in the background.
    sonarTask = ersp.task.getTaskManager().installTask(Sonar, [objectName])
    navigation.Move(0.0, 0.1)
# This function stops the Sonar task.
def stopSonar():
    global sonarTask
    navigation.Stop()
    if sonarTask != None:
        print "Terminating %s." % (sonarTask,)
        # Terminate the task, then sleep for .2 seconds to give the task a
        # chance to exit cleanly.
        sonarTask.terminate()
        time.sleep(.200)
    else:
        print "No sonarTask to terminate."
def FakeASRFun(context):
    manager = context.getTaskManager()
    while (not context.terminationRequested()):
        print "enter a command:"
        line = raw_input()
        if line != "":
            event = ersp.task.Event(speech.EVENT_SPEECH_RECOGNIZED)
            event["text"] = line
            manager.raiseEvent(event)
FakeASR = ersp.task.registerTask("FakeASR", FakeASRFun)
log = ersp.task.Log("Evolution.Tasks.Vision")
log.setPriority(ersp.task.LOG_DEBUG)
p = ersp.task.Parallel()
p.addTask(MainSonarLoop)
# if you want to use speech recognition, uncomment this line
# and comment out the next line that adds FakeASR.
#p.addTask(speech.RecognizeSpeech)
p.addTask(FakeASR, ["sonar.grammar"])
# We use try and finally here to make sure that no matter how this
# program ends, whether you say "quit" or you press Control-Break, the
# robot stops moving.
try:
    p.waitForFirstTask()
finally:
    navigation.Stop()
print "all done."
Index

C
Customer support 1-3
D
DetectColor 3-8, DetectMotion 3-8, DetectObject 3-9, DetectSound 3-10, DoAtATime 3-11, DoPeriodically 3-11, DoWhen 3-12, DriveStop 3-12
E
Events 3-4
G
getDefaultUnits 3-6, GetImage 3-12, getUserConfig 2-3
M
Move 3-13, MoveRelative 3-13, MoveTo 3-14
P
PlaySoundFile 3-14
R
RecognizeSpeech 3-15, Registration 1-3
S
SendMail 3-15, setDefaultUnits 3-5, SetDriveVelocities 3-16, SetFaceVector 3-16, setUserConfig 2-2, Speak 3-16, SpeakFromFile 3-16, Stop 3-17
T
TaskContext 3-3, Turn 3-17, TurnRelative 3-17, TurnTo 3-18
U
User config 2-2
W
Wait 3-18, WaitUntil 3-18, WatchInbox 3-19
X
X, Y coordinates 3-6