Paper ID #10727
PROGRAMMING A SCARA ROBOT FOR A MANUFACTURING CELL TO ASSEMBLE AND PRODUCE MEDICAL DEVICES
Prof. Akram Hossain, Purdue University Calumet (College of Technology)
Akram Hossain is a professor in the department of Engineering Technology and Director of the Center for Packaging Machinery Industry at Purdue University Calumet, Hammond, IN. He worked eight years in industry in various capacities. He has been with Purdue University Calumet for the past 24 years. He consults for industry on process control, packaging machinery system control and related disciplines. He is a senior member of IEEE. He served in the IEEE Industry Applications Society for 15 years in various capacities. He served as chair of the Manufacturing Systems Development Applications Department of IEEE/IAS. He has authored more than 25 refereed journal and conference publications. In 2009, as PI, he received an NSF-CCLI grant entitled "A Mechatronics Curriculum and Packaging Automation Laboratory Facility." In 2010, as Co-PI, he received an NSF-ATE grant entitled "Meeting Workforce Needs for Mechatronics Technicians." From 2003 through 2006, he was involved with Argonne National Laboratory, Argonne, IL, in developing direct computer control for hydrogen-powered automotives. He is also involved in several direct computer control and wireless process control related research projects. His interests are in the areas of industrial transducers, industrial process control, modeling and simulation of mechatronics devices and systems, wireless controls, statistical process control, computer-aided design and fabrication of printed circuit boards, programmable logic controllers, programmable logic devices and renewable energy related projects.
Mr. md jubair hossain, Purdue University Calumet
Currently I am working on my master's degree at Purdue University Calumet. My major is Mechatronics. I have worked at machine assembly and manufacturing companies on industrial automation. At Purdue, I worked with a SCARA robot in the lab for a project, from which I gained experience in programming SCARA robots. This is my first publication.
EDUCATION: M.S. Engineering Technology, Purdue University Calumet (in progress); B.S. Electrical & Electronic Engineering, CUET, Bangladesh, August 2010
Ms. Mafruha Jahan
© American Society for Engineering Education, 2014
PROGRAMMING A SCARA ROBOT FOR A MANUFACTURING CELL TO ASSEMBLE AND PRODUCE MEDICAL DEVICES
Abstract:
This research paper focuses on a single-cell manufacturing machine setup that can be programmed to perform certain processing functions according to requirements. The manufacturing cell's operation depends on the parts to be assembled. The primary goal of this manufacturing cell is to glue three parts together to produce a small medical device with limited human intervention. Three different trajectory actions are required to completely assemble the part. This paper describes how to program a low-cost SCARA robot for a manufacturing cell that performs multiple sequential operations to produce the device.
The first operation is to glue part “A” and part “B” together to produce part “AB”. The second operation is the glue drying time of part “AB”. The third operation is to glue part “C” to part “AB”. The fourth operation is to dry part “ABC”. Since there is minimal robot trajectory activity during the glue drying process, a buffer of part “AB” is created to utilize that time. By utilizing the glue drying time, the buffered part “AB” becomes ready for the third operation. This means that as soon as a buffered part “AB” has dried, the robot performs its third operation, joining part “C” with the dried buffered part “AB”. The robot then performs its 1st and 3rd operations sequentially. Although minimal robot trajectory activity is required during the fourth operation, the glue drying of part “ABC”, it is nevertheless a required step for the complete part assembly. In this way the process maximizes throughput and minimizes production cycle time.
For multiple part-type operation in this single-machine cell, we provide an efficient algorithm that simultaneously optimizes robot movement and part sequencing. The results of this research are promising for creating small, compact manufacturing cells.
INTRODUCTION
SCARA robots are generally the first choice of manufacturers because of their speed, ruggedness, price and durability. SCARAs are ideal for a variety of general-purpose applications requiring fast, repeatable and articulate point-to-point movements such as palletizing, depalletizing, machine loading, unloading and assembly. Due to their 'elbow' motions, SCARA robots are also used for applications requiring constant acceleration through circular motions like dispensing. [1]
This paper considers the scheduling of operations in a single manufacturing cell that repetitively produces a family of similar parts. We provide a sequential scheme for performing certain jobs through programming. The single manufacturing cell can perform several operations and can be interfaced with Windows-based programming software tools with which we can easily teach the robot. In this paper we explain how a single-cell manufacturing machine can be programmed according to job requirements to perform certain processing stages that depend upon the parts being manufactured. Without involving a complicated robot programming language, this software tool allows for quick and easy teaching, whatever the application may be. Figure 1 represents a typical tracheostomy device to be assembled.
Figure 1: Tracheostomy Device [2]
DESCRIPTION OF PARTS AND THEIR ASSEMBLY PROCESS
At the beginning of the process, two parts, “A” and “B”, come to the robot on two conveyor belts. The arm first puts a measured amount of glue on part “A” from the liquid dispenser valve. At the same time, “B” arrives on the second conveyor belt and the robot picks up “B” to join it with “A”; this is the first operation. Figure 2 shows the three different parts to be assembled. Figure 3 shows the SCARA robot, conveyors, and parts arrangement inside the manufacturing cell.
Figure 2: Three Different Parts to be Assembled
Figure 3: Arrangement inside the Manufacturing Cell
As mentioned earlier, the second operation is the glue drying time. Therefore, the robot first performs the 1st operation 5 times and keeps the joined parts “AB” on the 2nd conveyor for a while, since part “AB” takes time to dry. If it takes 10 seconds to join part “A” with part “B”, then it takes 50 seconds to complete the 5 joined products “AB”, which is more than the glue drying time for a single joined product (assume the glue drying time is 20 seconds). Figure 4 shows the 1st assembly operation.
Figure 4: 1st Operation
Now we have 5 joined products “AB” ready for the next operation. We consider this a buffer, and from now on the robot will try to maintain 5 joined products “AB” on the 2nd conveyor. When the 5th joined product “AB” is ready (meaning completely dry), part “C” comes in on the 3rd conveyor belt. Figure 5 shows the movement of parts on the three conveyor belts.
Figure 5: Movement of Parts on the Conveyors
After the 2nd operation, the robot glues part “AB” to part “C” to produce part “ABC”. Joining part “AB” with part “C” to produce part “ABC” is the 3rd operation. Figure 6 shows the 3rd operation process.
Figure 6: 3rd Operation
Now the robot performs the 1st and 3rd operations sequentially, eliminating the glue drying time, and then sends the whole product to the production line. The process continues in this way. Figure 7 shows the different products in the operation.
Figure 7: Different Parts [3]
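The buffered sequence described above can be sketched as a small discrete-event model. This is an illustrative Python model, not the robot's actual control program; it uses the example timings given later in the paper (5 seconds per move/glue step, 20 seconds of glue drying, a buffer of 5 “AB” parts).

```python
STEP_TIME = 5      # seconds per part move / glue action (from the paper's example)
DRY_TIME = 20      # seconds for glue to dry
BUFFER_SIZE = 5    # joined "AB" parts kept on the 2nd conveyor

def simulate(n_products):
    """Return the completion time (s) of each finished "ABC" product."""
    t = 0
    ab_ready = []                      # times at which buffered "AB" parts are dry
    done = []
    # STEP 1: fill the buffer with 5 joined "AB" parts (glue A, join B).
    for _ in range(BUFFER_SIZE):
        t += 2 * STEP_TIME
        ab_ready.append(t + DRY_TIME)
    # STEP 2: each 20 s cycle joins part C to the oldest dry "AB"
    # (3rd operation) and rebuilds the buffer with a fresh "AB" (1st operation).
    while len(done) < n_products:
        t = max(t, ab_ready.pop(0))    # wait only if the oldest "AB" is still wet
        t += 4 * STEP_TIME             # 3rd operation + 1st operation
        done.append(t)
        ab_ready.append(t + DRY_TIME)
    return done
```

With these timings the model reproduces the completion times worked out in the Discussion section: the first product at 70 seconds and one additional product every 20 seconds thereafter, because the drying wait never binds once the buffer is full.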
HARDWARE AND SOFTWARE REQUIREMENTS FOR ROBOT PROGRAMMING
Hardware
Personal computer: a PC equipped with a CPU and software compatible with the Microsoft Windows operating system.
COM port: a vacant COM port to connect to the robot.
SCARA robot: Model TMB100, 4-axis.
Operating range: J1 arm ±90°, J2 arm ±150°, Z axis 100 mm, R axis ±360°
Arm length: J1 arm 260 mm, J2 arm 180 mm, J1 + J2 440 mm
Load: tool 5 kgf
R axis inertia: 90 kg·cm²
Maximum speed, J1 + J2: 1500 mm/sec (1 kgf load), 1400 mm/sec (3 kgf load), 1300 mm/sec (5 kgf load)
Maximum speed, Z & R axes: Z = 320 mm/sec, R = 1000°/sec
Repeatability: XY ±0.02 mm per axis, Z ±0.02 mm, R axis < ±0.02°
Data memory capacity: 100 programs, 6000 points
Drive system: 5-phase stepping motor
Operation system: point-to-point and continuous path
Interpolation: XYZ & R simultaneous 3D linear interpolation
Teaching method: direct, remote & manual data input (MDI) teaching
CPU: 32 bit (MC68EC020, MC68882)
PLC: 50 programs, 100 steps for each program
I/O signals: 25 input & 24 output signals
External interface: RS232C - one channel for PC, one channel for the teach pendant and one channel for external equipment (optional)
Wiring & piping to tool: 15 wires for signals, 4 air pipes (4 mm)
Interface - interlocking: 4 input signals from interlock equipment
Power supply: 220 V AC (180-250 V), consumption 200 VA
Working temperature: 0 - 40°C
Relative humidity: 20 - 95%, no condensation
Weight: 41 kg
Table I: SCARA TMB100 4-axis model specification [4]
Figure 8 shows a Scara robot Model TMB100 4-axis.
Figure 8: Scara Robot Model TMB100 4-axis
Software
The software tool “JR Points” is used to program the robot. Each program contains many location point data entries, each coupled with a specified speed.
SCARA ROBOT PROGRAMMING
The way we programmed the robot is illustrated below by an example. A program is a set of commands in which a series of actions and movements performed by the robot, and the positions where dispensing is carried out, are registered in order. The SCARA robot can store up to 100 programs, numbered 1 to 100. A program consists of two parts. One is the program data, which controls the program itself; the other is the point data (or a series of point data where there is more than one point), which contains information such as the robot's coordinates. A coordinate is a point where the robot may perform a job. The program data consists of the following seven items: [5]
Program name
Work home position
Dispense condition
Cycle mode
PTP condition
Tool data
Move area limit
Teaching:
We tried to make the teaching process as simple as possible. There is no need to learn the
complicated programming language. Axes of the scara robot (TMB100) can be taught
individually to perform certain task. This innovative teaching method allows the user to register
work points by simply grabbing the robot arm and moving it to the desired location. This greatly
simplifies the teaching process, making it fast for all users of all levels. The systems can also be
taught by traditional teaching methods. The JOG teaching method allows the user to drive the
robot arm to the desired location by pressing buttons on a teach-box.
A Typical Programming Example:
Figure 9 shows a typical SCARA robot arm movement trajectory for this operation.
Figure 9: A Typical Scara Robot Arm Movement Trajectory
Desired Point Configuration
Point 1: CP Start Point
Point 2: CP Stop Point
Point 3: CP Stop Point
Point 4: CP Arc Point
Point 5: CP Passing Point
Point 6: CP Arc Point
Point 7: CP End Point
In JR Points, under the Program menu, add a new program, as shown in Figure 10.
Figure 10: Add a Program in JR Points
Add program No. 1, as shown in Figure 11.
Figure 11: Add Program
Now select point number 1 with the mouse pointer, as shown in Figure 12.
Figure 12: Select Point Number
Under the Robot menu, select JOG, as shown in Figure 13.
Figure 13: Select Jog Menu
Now we need to define the coordinate values, or we can move the robot arm by clicking the buttons shown in red in Figure 13. After moving to the desired point, click the Register button. On this screen the line speed needs to be selected; we selected a line speed of 20, as shown in Figure 14.
Figure 14: Define Coordinate Values & Line Speed
Now click the icon (shown in red in Figure 15) to add additional new points. After this we need to define the coordinate values for these data points. This is shown in Figure 15.
Figure 15: Add Additional Points
Now we need to send these values to the robot, which is shown in Figure 16.
Figure 16: Sending Data Points to the Robot
Now use the drop-down menu and select Robot > Test Running to run the whole program. This is shown in Figure 17.
Figure 17: Running a Program
All the coordinate data are shown in Table II.

Point         1          2          3          4          5           6          7
Point type    CP Start   CP Stop    CP Stop    CP Arc     CP Passing  CP Arc     CP End
Arm shape     Righty     Righty     Righty     Righty     Righty      Righty     Righty
X             -101.158   -101.153   -111       -118.722   -126.366    -118.692   -111
Y             323.172    337.013    337        329.258    336.753     344.745    337
Z             96         96         96         96.136     96.136      96.136     96
R             0          0          0          0          0           0          0
Line speed    20         20         20         20         20          20         -

Table II: All Data Points in JR Points
DISCUSSION OF THE RESULTS
Symbols
t_A = time for moving part A (in seconds)
t_B = time for moving part B and gluing (in seconds)
t_C = time for moving part C and gluing (in seconds)
t_AB = time for moving joined part “AB” and gluing (in seconds)
t_dry = glue drying time (in seconds)
Comparison between Traditional & Enhanced Operational Sequences
Traditional Operational Sequence
If we move one part at a time and join the parts together, we must wait for the glue to dry after each gluing step. We can calculate the time for this operation.

Steps followed in the traditional operation to produce one unit:

Operation     A    B    G    A+B    C    G
Time (sec)    5    5    20   5      5    20

Here A, B, and C denote moving (and gluing) the corresponding parts, A+B denotes moving the joined part “AB”, and G denotes glue drying.

Total time required to produce 1 product = 5 + 5 + 20 + 5 + 5 + 20 = 60 seconds
Enhanced Operational Sequence
We divided the enhanced operational sequence into two steps. The time required for STEP 1 is shown below:

Sequence in STEP 1

Operation     A   B   A   B   A   B   A   B   A   B
Time (sec)    5   5   5   5   5   5   5   5   5   5

Total time required for STEP 1 = 10 × 5 = 50 seconds

Sequence in STEP 2

Operation     A+B   C   A   B
Time (sec)    5     5   5   5

Total time required for STEP 2 = 4 × 5 = 20 seconds

For the first 50 seconds the robot follows STEP 1. After that, the robot continuously repeats the sequence in STEP 2.

Total time required to produce the 1st product = 50 + 20 = 70 seconds
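The two timing analyses above reduce to simple closed-form expressions, sketched here in Python for convenience (not part of the original analysis):

```python
# Closed-form completion time (in seconds) of the n-th product,
# following the step timings worked out above.
def traditional_time(n):
    # one full cycle per unit: 5 + 5 + 20 + 5 + 5 + 20
    return 60 * n

def enhanced_time(n):
    # STEP 1 fills the buffer once (50 s); each STEP 2 cycle adds 20 s
    return 50 + 20 * n
```

Note that the enhanced sequence is actually behind on the very first unit (70 s versus 60 s) because of the buffer-filling step, but it overtakes the traditional sequence from the second unit onward (90 s versus 120 s).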
Data Generated from Traditional and Enhanced Sequences

The completion times follow directly from the two sequences: the traditional sequence completes unit n at 60n seconds, while the enhanced sequence completes unit n at 50 + 20n seconds. Selected values from the collected data are shown below.

Produced unit    Traditional time (s)    Enhanced time (s)
1                60                      70
2                120                     90
3                180                     110
4                240                     130
5                300                     150
10               600                     250
25               1500                    550
50               3000                    1050
75               4500                    1550
100              6000                    2050
Analysis of the Collected Data

In this experiment, the glue drying time = 20 seconds and each part movement & gluing step = 5 seconds.

Figure 18 shows the number of products versus the time required to produce them using the traditional and enhanced sequences.

Figure 18: Products Produced Versus Time Required for the Traditional and Enhanced Sequences

Products per minute (traditional sequence) = (100/6000) × 60 = 1
Products per minute (enhanced sequence) = (100/2050) × 60 = 2.93

Thus we can conclude that the enhanced sequence is almost 3 times faster than the traditional sequence.
Glue drying time (s)    Improvement of enhanced over traditional sequence
5                       1.471
10                      1.961
20                      2.941
30                      3.922
40                      4.902
50                      5.882
60                      6.862
70                      7.843
80                      8.824
90                      9.804
100                     10.784
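The factors in the table above can be reproduced with a simple model (our interpretation for illustration, not stated explicitly in the original text): for 100 units, the traditional sequence spends four 5-second move/glue steps plus two drying waits of d seconds per unit, while the enhanced sequence's total is treated as roughly constant (about 2040 s) because drying overlaps with robot motion while the buffer lasts.

```python
# Model behind the improvement-factor table (an interpretation):
# traditional total grows with the drying time d, while the enhanced
# total is assumed to stay near a constant ~2040 s because drying
# overlaps with robot motion.
UNITS = 100
MOTION_PER_UNIT = 4 * 5       # four 5 s move/glue steps per unit
ENHANCED_TOTAL = 2040         # s; assumed constant while the buffer covers drying

def improvement_factor(dry_time):
    """Ratio of traditional to enhanced total time for 100 units."""
    traditional_total = (MOTION_PER_UNIT + 2 * dry_time) * UNITS
    return traditional_total / ENHANCED_TOTAL
```

For very long drying times a 5-part buffer would no longer hide the drying completely, so the constant-total assumption is an idealization that holds only while the buffer covers the drying interval.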
Figure 19 shows the improvement achieved by the enhanced sequence compared to the traditional sequence as a function of glue drying time. Using this curve, we can benchmark our enhanced sequence against the traditional sequence.
Figure 19: Factor of Improvement of the Enhanced Sequence over the Traditional Sequence
CONCLUSION
Assembling and producing a medical device with a SCARA robot is one of the cheapest robotic solutions available. Because of advances in hardware and software design, this robot is more compact, more efficient, easier to use and less expensive than its predecessors.
References
[1] http://www.robotics.org/content-detail.cfm/Industrial-Robotics-Featured-Articles/Scara-vs-Cartesian-Robots-
Selecting-the-Right-Type-for-Your-Applications/content_id/1001
[2] http://endo.co.id/romsons-tracheostomy-tube.html
[3] http://www.aic.cuhk.edu.hk/web8/Tracheostomy%20tube.htm
[4] http://www.intertronics.co.uk/products/ctmb100.htm
[5] TMB100 Dispensing Manual from http://www.fisnar.com/
Paper ID #9275
Virtual Joystick Control of Finch Robot
Prof. David R. Loker, Pennsylvania State University, Erie
David R. Loker received the M.S.E.E. degree from Syracuse University in 1986. In 1984, he joined General Electric (GE) Company, AESD, as a design engineer. In 1988, he joined the faculty at Penn State Erie, The Behrend College. In 2007, he became the Chair of the Electrical and Computer Engineering Technology Program. His research interests include wireless sensor networks, data acquisition systems, and communications systems.
Mr. Stephen A. Strom, Penn State Behrend
Stephen Strom joined the faculty of Penn State Erie, The Behrend College, and the School of Engineering in Fall 2010. He is a lecturer in the Electrical and Computer Engineering department and holds a B.S. in electrical engineering from Carnegie Mellon University. Steve comes to Penn State Behrend with more than thirty years' experience in designing and programming embedded systems and has multiple patents for both hardware designs and software algorithms.
© American Society for Engineering Education, 2014
Joystick Control of Finch Robot
Abstract
Junior-level students in the Electrical and Computer Engineering Technology program complete a
3-credit Measurements & Instrumentation course. There are three main sections of the course: (1)
Programming applications using LabVIEW, (2) Data acquisition, sensors, and signal conditioning,
and (3) Design of measurement systems. Weekly laboratory activities mirror the lecture materials.
The course requirements include an end-of-semester team design project, where one possible option is to design and implement a software application for the Finch Robot. Students are
provided LabVIEW SubVIs for all of the robot’s low-level functions (audio buzzer, tri-color LED,
left/right motor control, light sensors, obstacle detectors, temperature sensor, and tri-axis
accelerometer values) as well as the corresponding DLL files to run the SubVIs. The objectives for
the project are to utilize their LabVIEW programming skills to design a joystick control for the
speed and direction of the robot, display the pitch and roll of the robot, and audibly alert the user of
the presence of an obstacle in front of the robot.
This paper provides a detailed listing of the engineering requirements for the project. An example of
student work is provided, along with a project assessment. Recommendations are included to help
ensure student success on the project.
Introduction to the Measurements and Instrumentation Course
This is a required junior-level course for Electrical and Computer Engineering Technology students.
The purpose of the course is several-fold:
Learn principles of LabVIEW programming.
Use LabVIEW to design software for programming PC-based data acquisition (DAQ)
systems
Understand various sensors and design signal conditioning circuits to interface the sensors to
DAQ systems
Integrate all of these components into the design of measurement systems
This course is lab intensive and utilizes LabVIEW with a data acquisition (DAQ) device as a primary vehicle for the design of measurement systems.1-3 The course is 3 credits and consists of 2
hours of lecture and 2 hours of lab per week. The lecture content of the course is divided into three
areas: Programming applications using LabVIEW (5 weeks), Data acquisition, sensors, and signal
conditioning (4 weeks), and Design of measurement systems (7 weeks). LabVIEW is a graphical
programming environment that allows a developer to access a wide variety of I/O and sensor
interfaces, perform mathematical analysis, and link all of these operations to custom designed
“control panels” or user interfaces.
The lab content of the course is designed to reinforce concepts discussed during lecture. Each lab is
considered a project since it lists a series of engineering requirements and depending upon the scope
of the project, requires either 2 or 3 weeks to complete. Each project is completed by a student team
that consists of no more than 2 students (some students prefer to work by themselves), where
students pick their team members at the beginning of the semester.
For nearly all of the projects, students are expected to work outside of the scheduled lab time in
order to complete the objectives. Grading for the project consists of 60% based on meeting all of the
engineering requirements, 30% based on the content of the lab report, and 10% based on spelling,
grammar, and writing style. There is a 5% reduction for late lab report submittals. A listing of the
projects for the course is shown below.
Lab 1: Software-defined Calculator Project (2 weeks)
Lab 2: Thermocouple Project (2 weeks)
Lab 3: Waveform Generator Project (2 weeks)
Lab 4: Digital Voltmeter Project (2 weeks)
Lab 5: Digital Multimeter Project (2 weeks)
Lab 6: Temperature Control System Project (2 weeks)
Also, a team final project is required for the course. Three weeks of scheduled lab time are provided
at the end of the semester for students to work on their project. One possible option for the final
project is to design and implement a software application for the Finch Robot. The purpose of this
paper is to describe the details about the Finch Robot project.
Project Definition
For the Finch Robot project, students were given the task of designing a “Joystick” control for the
speed and direction of a Finch Robot4. The resulting LabVIEW program used a (USB) serial link to
the robot to send commands to the device as well as poll the unit and read back its sensor
information.
This project fits within the course outcomes as it requires the students to:
Design a measurement system using LabVIEW
Incorporate simultaneous operations all within the program
Perform software signal conditioning (digital averaging of the accelerometer values)
Take a complex problem, break it into smaller operations, implement each component, and
combine all operations together into an overall program
Engineering Requirements for the Project
The overall project objectives are to:
Use LabVIEW for the design of a state machine to control the motion of the robot
Design a joystick control to move the robot forward, back, left and right
Play an audible beep on the robot’s speaker when an obstacle is detected
Display the pitch and roll of the robot on the screen
The students were given detailed engineering requirements for the project as shown below.
User Interface Requirements:
Joystick control
Indicators for displaying pitch and roll of the robot
Audible alert for warning the user of the presence of an obstacle in front of the robot
Functional Requirements:
Speed of the robot controlled by the joystick
Direction of the robot controlled by the joystick
Documentation:
VI online description
Title information is shown on front panel
Appropriate comments are provided on block diagram
Deliverables:
Soft copy of lab report
Soft copies of VI and all SubVIs
Finch Robot
The Finch Robot is a device that is tethered to the PC via a USB cable. Both power and control
come across the cable. The robot has two independent drive motors that power the left/right wheels
and can be run at varying speeds in both forward and reverse.
The robot also has two (photocell) light sensors, one temperature sensor, two forward IR obstacle
sensors, a 3-axis accelerometer, an audio buzzer and a tri-color LED.
When plugged into a PC, its USB interface registers itself as a human interface device (HID) similar
to that of a keyboard or mouse. By using HID packets, the PC can send and receive information to
the robot. The low-level code consists of a dynamically linked library and a series of LabVIEW
SubVIs, as shown in Figure 1, that allow the user to interface to each of the sensors as well as to the
motor.
The SubVIs can be grouped into four categories: Initialization/Exit, Motion, Audio/Visual, and
Sensors. Each of the SubVIs’ categories, descriptions, and inputs/outputs are listed in Table 1.
Figure 1. Finch LabVIEW SubVIs.
Initialization/Exit
Fin_Init - Initialize the interface to the Finch Robot.
Fin_Exit - Terminate the interface to the Finch Robot and close the connection to the USB port.

Motion
Fin_Mtr - Enable/disable the left/right motor speeds. Inputs are integers for:
  Time on, in tenths of a second
  Left/right motor speed (values range from -255 to +255, where -255 is full speed backward, 0 is stop, and +255 is full speed forward)
Fin_Wait - Wait for the Finch Robot to stop. This is used in conjunction with the Fin_Mtr SubVI to wait for the robot to come to a complete stop.

Audio/Visual
Fin_Buzz - Enable/disable the audio buzzer. Inputs are integers for:
  Time for the buzzer to remain on, in ms
  Frequency of the buzzer in Hz (use 0 to turn the buzzer off)
Fin_Led - Enables/disables the tri-color LED. Brightness of each color (red, green, and blue) is controlled by an integer ranging from 0 to 255, where 0 is off and 255 is full brightness.

Sensors
Fin_Accel - Reads the tri-axis accelerometer values. Outputs are:
  3 floating point numbers representing the x, y, and z axis acceleration values
  2 integers representing Boolean values (1/0) indicating whether the Finch Robot has been tapped or shaken
Fin_Light - Reads the left/right light sensors. Each output is an integer ranging from 0 to 255, where 0 indicates no light (completely dark).
Fin_Sense - Reads the left/right IR sensors. Each output is an integer representing a Boolean value (1/0) indicating the presence of an obstacle.
Fin_Temp - Reads the temperature sensor on the Finch Robot. Output is a floating point number representing temperature in degrees Celsius.

Table 1. SubVI Descriptions.
The Finch Robot LabVIEW SubVIs represent the basic operations of the robot and must be
included into a larger LabVIEW program to perform full control and display operations. A typical
LabVIEW program, shown in Figure 2, has multiple SubVIs chained together using the error in and
error out terminals. Thus, as one SubVI completes its operation, the next one begins.
Figure 2. Linking One Finch Robot SubVI to Another.
Project Implementation
The project includes several programming and design aspects. The first step was to separate the
project into smaller operations, where each operation was addressed via its own LabVIEW SubVIs.
The students came to the conclusion that they needed the following operations:
Initialization operation
Joystick controller operation
Motor speed operation
Obstacle detector operation
Accelerometer operation
Since a state machine was required, each state was assigned to represent one of the above
operations. Each state would have its set of controls and indicators, and students needed to
determine which SubVIs to use in each state. After completing each state, students determined how
to merge the states together to design the full program. Lastly, testing was performed on the
completed VI.
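The decomposition above maps naturally onto a state-machine loop. The following is an illustrative Python analogue of the LabVIEW pattern, shown here only to make the control flow concrete; the state and handler names are hypothetical, not the students' actual SubVI or state names.

```python
# Illustrative Python analogue of the LabVIEW state machine described
# above. Each handler performs its state's work and returns the name
# of the next state; returning None ends the run.
def run_state_machine(handlers, start):
    state = start
    while state is not None:
        state = handlers[state]()

trace = []

def make_state(name, next_state):
    def handler():
        trace.append(name)          # stand-in for the state's real work
        return next_state
    return handler

# One pass through the five operations identified by the students
# (a real VI would loop back from the last state to the first):
handlers = {
    "init":     make_state("init", "joystick"),
    "joystick": make_state("joystick", "motor"),
    "motor":    make_state("motor", "detect"),
    "detect":   make_state("detect", "accel"),
    "accel":    make_state("accel", None),
}
run_state_machine(handlers, "init")
```

The dictionary-of-handlers shape mirrors the LabVIEW case structure inside a while loop: each case does one state's work and selects the next case.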
LabVIEW Block Diagram
Each state of the state machine is described below, along with its function. Screen shots of each
state were obtained from a student’s LabVIEW program.
Initialization State
The initialization state, shown in Figure 3, is a “one-time” call to the SubVIs that connects the
program to the Finch Robot and joystick. While the Finch-Init SubVI is required, setting the motor
speed to zero is optional. Concurrent with this, the program is connecting to the joystick and setting
the initial values for the SubVIs in the next state.
Figure 3. LabVIEW Initialization State.
The joystick control moved the robot forward, back, left and right and at varying speeds. This
portion required multiple iterations in that the students had to solve several problems:
Read in the position of the joystick using the X- and Y-axis from the controller and convert its
input range to the values expected by the Finch Robot SubVIs.
The Finch Robot can accept any input speed from -255 to +255. However, some speed settings
are invalid as the robot does not have enough “traction” to implement a full speed turn. As a
result, it was necessary to limit the speed settings that were sent to the robot.
When the joystick is pushed completely forward, it returns a value of +32767, when pulled
completely back, it returns a value of -32768, and when released, it reads 0. The students found
that when the joystick was released, the X/Y values did not go to exactly zero. Thus, it was
necessary to add a “deadband” zone around the zero point.
If an obstacle was detected, the program needed to halt the motion of the robot. Thus, both the
set motor speed and the obstacle detection operations had to work with each other.
The solution was implemented via two states: read joystick and set motor speed.
Read Joystick State and Set Motor Speed State
The joystick returns its position as a range of values from +32767 to -32768. This will get converted
to a motor speed of +255 to -255 (which is in the range of the Finch motor SubVI). In addition, the
X/Y values from the joystick are used to determine if the robot is to move straight or turn. In Figure
4, several examples are shown to illustrate the joystick axis values which are read and converted to
left/right motor speed values.
Figure 4. Joystick Axis Values Converted to Motor Speeds.
Once calculated, the motor speed settings are then sent to the detect state where they can be passed
onto the motor or blocked (if an obstacle is detected). In Figures 5 and 6, the joystick data is
converted/scaled, checked against a deadband, and limited to the range expected by the Finch
Robot.
Figure 5. LabVIEW Read Joystick State.
Figure 6. LabVIEW Set Motor Speed State.
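The scale, deadband, and limit steps described above can be sketched as follows. This is a Python illustration of the logic, not the students' LabVIEW diagram, and the deadband width is an assumed value since the paper does not give the actual threshold.

```python
# Sketch of the joystick-to-motor-speed conversion described above.
# The deadband width (1000 counts) is an assumed value for illustration.
JOY_FWD = 32767       # joystick fully forward
JOY_BACK = -32768     # joystick fully back
MOTOR_MAX = 255       # Finch full-speed magnitude
DEADBAND = 1000       # counts around zero treated as "released"

def joystick_to_speed(axis):
    if abs(axis) < DEADBAND:                  # released stick -> stop
        return 0
    span = JOY_FWD if axis > 0 else -JOY_BACK
    speed = round(axis * MOTOR_MAX / span)    # scale to -255..+255
    return max(-MOTOR_MAX, min(MOTOR_MAX, speed))  # clamp to valid range
```

Applying the same function to both the X and Y axes gives the turn/straight mix before the per-motor limiting mentioned above.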
Detect State
When an obstacle is detected, an audible beep on the robot’s speaker is played. This integrated the
forward IR sensors as well as the motor speed settings from the previous state and either enabled the
motion of the robot, or stopped the robot and played an audible tone. Once the obstacle is removed,
the tone is disabled and the robot is allowed to move.
This state needed to "or" the values from the two forward obstacle sensors and set a flag to
True/False (indicating if an object is present). Depending on this flag, it would send the motor speed
calculated in the previous state to the robot or send the motor a speed setting of zero (to stop the
robot). This is shown in Figure 7.
Figure 7. LabVIEW Detect State.
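In textual form, the detect state amounts to a few lines of logic. The function below is a hypothetical Python rendering of the state shown in Fig. 7, not code from the paper:

```python
def detect_state(left_obstacle, right_obstacle, left_speed, right_speed):
    """'Or' the two forward obstacle sensors; if either sees an object,
    block the motor speeds and raise a flag to play the audible tone."""
    obstacle = left_obstacle or right_obstacle
    if obstacle:
        return 0, 0, True                     # stop the robot, sound the beep
    return left_speed, right_speed, False     # pass the speeds through
```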
Measure Sensors State
While reading the X, Y and Z-axis acceleration sensor data is straightforward, it was necessary to
perform averaging on the data. This was done by determining a simple average over the last 20
readings. The “conditioned” result was then converted from “gravity” to the actual tilt of the robot
in both pitch (front-to-back) and roll (side-to-side) using simple arithmetic. This is illustrated in
Figure 8. Both data values were indicated on the front panel.
Figure 8. LabVIEW Measure Sensors State.
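The 20-sample average and the gravity-to-angle conversion can be sketched as follows. The paper does not spell out its "simple arithmetic," so the arcsine mapping used here (1 g of acceleration along an axis corresponds to 90 degrees of tilt) is an assumed but common choice:

```python
import math
from collections import deque

class TiltEstimator:
    """Average the last N accelerometer readings for one axis, then
    convert the smoothed gravity value to a tilt angle in degrees."""
    def __init__(self, window=20):
        self.samples = deque(maxlen=window)   # keeps only the last N readings
    def update(self, g):
        self.samples.append(g)
        avg = sum(self.samples) / len(self.samples)
        avg = max(-1.0, min(1.0, avg))        # keep asin() in its domain
        return math.degrees(math.asin(avg))   # 1 g -> 90 degrees of tilt
```

One estimator per axis yields the pitch (front-to-back) and roll (side-to-side) values shown on the front panel.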
LabVIEW Front Panel
The front panel was the user interface that displayed the controls and indicators. The main
components were:
Joystick X/Y axis values
Motor power left/right speed settings
Pitch and roll values in both “degrees” and “gravity” (blue and red indicators on each gauge)
A screen shot of a student’s front panel is shown in Figure 9. It also includes several
other fields, added as the students incorporated additional features. The joystick controller
has several finger buttons and knobs, and these were used to light up the tri-color LED as well as
play a series of tones from the robot’s speaker.
Figure 9. LabVIEW Front Panel.
Assessment by the Students
Assessment data is provided from students completing the project during the fall 2013 semester,
since this was the first time the project was offered. The two student teams that completed the
project were asked to assess their work. Their responses were:
The project was reasonable for the course, since the design for the project was based on the
materials learned during the semester.
It was reasonable for a final project, since it took approximately 8 hours to complete and it
incorporated a number of programming features.
It was an interesting and challenging project, since there were multiple solutions to the problem.
It allows for “bonus” features, such as using the buzzer and tri-color LED for extra operations. A
motivated student could easily create additional engineering requirements for the project.
There were also some suggestions to improve the project:
Find a wireless controlled robot to eliminate the need for a tether to operate.
The project did not use a DAQ device (the robot had internal sensors), so it did not involve a
hardware design element.
Find a robot that has better position and speed control. The Finch Robot had very little weight
and minimal wheel friction, and as a result, it was hard to get it to drive in a straight line.
Conclusions/Recommendations
Both student teams successfully completed the project. However, it was interesting to see how the
two student teams approached their designs and where they focused their development. One team
identified the user interface as the section that required the most attention, while the other team
focused on sensors and motion. Both teams had trouble in converting gravity readings to degrees
and determining pitch from a 3-axis accelerometer.
One observation is that the team that worked mainly on the user interface spent very little time on
the sensor and signal conditioning. In addition, they received less satisfaction from the project than
the team working on integrating the sensors with the motion and control.
One goal in defining a final lab project is to empower students to determine creative solutions and
encourage them to go beyond the listed requirements. This occurred in one of the teams as they
implemented a series of audio/visual effects using some of the buttons on the joystick. This can be
encouraged by allowing students to select one of several possible options for the final project, and to
provide extra credit for students who want to add additional requirements to the project.
Additionally, it was important for student success on the project to have them break the project into
a series of smaller operations. Once these were implemented, it was easier for them to combine the
operations together to design the complete program.
Acknowledgement
We would like to thank Luke Elliott, a junior in our program, who completed the Finch Robot
project and provided the LabVIEW program from which the screen shots were obtained.
Bibliography
1. Bishop, Robert H., Learning with LabVIEW 2009, Pearson Education, 2010.
2. Travis, Jeffrey and Jim Kring, LabVIEW for Everyone, 3rd Edition, Pearson Education, 2007.
3. Essick, John, Hands-On Introduction to LabVIEW for Scientists and Engineers, 2nd Edition, Oxford University Press, 2013.
4. Finch Robot web site: http://www.finchrobot.com/
Paper ID #9584
Inductive Learning Tool Improves Instrumentation Course
Prof. James Andrew Smith P.Eng., Ryerson University
Dr. Smith specializes in Biomedical Engineering at Ryerson University in Toronto, Canada. He was Biomedical Engineering Program Director in 2010/11 and is currently Biomedical Engineering Stream Coordinator. His research combines aspects of biomechanics and robotics, with active research projects in legged systems, obstetrics and surgical systems. In addition to teaching awards received at the University of Alberta and Ryerson University, he is a recipient or co-recipient of four IEEE Real World Engineering Projects awards between 2007 and 2010. He was co-recipient of second place in the 2012 Healthcare Innovation Conference’s design competition in Houston, Texas.
© American Society for Engineering Education, 2014
Inductive Learning Tool Improves Instrumentation Course
Abstract
Engineering instructors are typically restricted to a narrow range of simplified models that apply to both white-board lectures and hands-on labs. This is as true in a mechanical vibrations course as it is in electrical circuits or biomedical instrumentation courses. More complex systems, which are more representative of reality, require more time or more advanced tools for derivation or understanding than is often available in an undergraduate curriculum.
In 2011 we addressed this disconnect by introducing our students to Maplesoft’s MapleSim multi-domain simulation software in an instrumentation course. MapleSim permits students to schematically represent electrical and mechanical systems and simulates their changes over time. Using Maplesoft’s Maple application, MapleSim can then produce the analytical equations which underlie the model in a form that is similar to what is shown to the students in class or in their textbooks. This process can be self-directed and can occur even if students don’t have advanced skills in the mathematics required to derive the equations by hand. This tool, used in conjunction with traditional hands-on electronics lab activities permits the students to explore the behaviour of the systems of interest in an inductive learning manner more representative of natural everyday learning. To illustrate the inductive process taken by students, a typical example undertaken by students of examining temperature effects on a resistor and diode circuit is given.
Using MapleSim and Maple in an inductive context, it is now far easier for the student to use basic concepts conveyed by the textbook or standard lectures, vary them in subtle or extreme ways and to see the effect not only on the numerical simulation output but on the underlying mathematics as well. Examples that were not practical to attempt by anyone but the brightest students are now within the reach of motivated and curious students.
Student evaluations have shown an improvement since the introduction of the inductive approach along with Maple and MapleSim. The positive response of students to the use of MapleSim as a front-end tool and Maple as a support tool has encouraged us to use it as the core of a new distance education course in embedded systems architecture.
Introduction
The ELE 604 Sensors and Measurement class at Ryerson University is presented to Electrical and Computer Engineering students in the third (junior) year of the undergraduate program.1 The objective is to expose students to basic instrumentation systems, including both analog and digital aspects. The laboratory component focuses on the development of a microcontroller-based instrumentation system that can provide readings to a PC and an onboard LCD from switches, accelerometers and load cells.
The course is heavily-focused on hands-on work in the laboratory. From student feedback in 2009 and 2010 it became apparent that there was a mismatch between the theoretical background provided in class and the practical work being undertaken in the laboratory. This is reflective of the general approach taken in Engineering classes. Engineering instructors are typically restricted to a narrow range of simplified models that apply to both white-board lectures and
hands-on labs. This is as true in a mechanical vibrations course as it is in electrical circuits or biomedical instrumentation courses. More complex systems, which are more representative of reality, require more time or more advanced tools for derivation or understanding than is often available in an undergraduate curriculum. Furthermore, most subject-specific simulation products do not provide a method for linking the GUI-based model’s dynamics with the fundamental equations used to model the simpler examples in class.
In 2011 we addressed this disconnect by introducing our students to Maplesoft’s MapleSim2 multi-domain simulation software in our instrumentation course. Like its competitors, MapleSim permits students to schematically represent electrical and mechanical systems and simulates their changes over time. It can also produce the analytical equations which underlie the model in a form that is similar to what is shown to the students in class or in their textbooks. Students can examine these equations to derive further insight into the dynamics of their systems. This process can be self-directed and can occur even if students don’t have advanced skills in the mathematics required to derive the equations by hand. This tool, used in conjunction with traditional hands-on electronics lab activities permits the students to explore the behaviour of the systems of interest in an inductive learning manner more representative of natural everyday learning.
Observations
The learning strategy that underlies the introduction of MapleSim is that the professor-driven deductive learning approach of memorizing rules should be complemented with a student-driven inductive approach of determining intuitive rules through experience. The inductive approach3,4 is more natural and typically leads to a deeper, more thorough understanding of the material -- it is also in stark contrast to the artificial deductive approach we experience so often in school. A comparison of the rule-to-behaviour approach used in deductive learning versus the behaviour-to-rule approach used in inductive learning is shown graphically in Fig. 1.
Watch how a three year old learns how to play videos on an iPhone. No one tells her how the capacitive sensing array under the glass works -- she simply moves and swipes, sometimes succeeding, sometimes failing at making the device work the way she wants. These hands-on, trial-and-error tests induce a change in how the child uses the phone. After a few attempts of learning like this, children are often more adept at using the phone than their parents are! It’s the opinion of the author that this kind of learning should be encouraged within the standard engineering curriculum.
Here we describe an example of using MapleSim in an inductive manner during a laboratory exercise. In the three years since introducing MapleSim as an inductive learning tool the student feedback has grown more and more positive, as is reported later.
Figure 1 The traditional deductive learning approach (left) compared to the inductive approach (right). The deductive approach focuses on rule-learning first, while the inductive approach emphasizes exposure to system behaviour first.
Methods
Over a 12 week period students are tasked with developing an instrumentation system from scratch using a Freescale 9s12 microcontroller.5,6 They design and implement circuitry and software which link the 9s12 to sensors and a desktop computer. Here we look at two introductory activities which use Maple and MapleSim in the learning of fundamental instrumentation concepts.
In the fourth and fifth weeks of the class students examine methods for driving LEDs and switches on the 9s12. One of the important aspects to explore is the temperature dependence of the instrumentation system. It is impractical to do cold weather testing within the laboratory so students explore the effect that temperature will have on the LED’s light through simulation.
(a) (b)
Figure 2 Schematic of the diode system (a) and closeup of the resistor (b).
The students set up a simple LED schematic in MapleSim, as shown in Fig. 2a. Normally students would set the resistance and temperature values to very specific values. However, here the numeric values are replaced with parametric values ResistorValue and TemperatureValue, as shown in Fig. 2b. Also, two probes have been inserted to measure diode voltage and current. These are given variable names DiodeVoltage and probecurrent. The model will then be called from Maple, where the resistance and temperature values are set. The voltage and current values will be returned to Maple. The MapleSim model is called from Maple using the Simulate command. This is similar to how Simulink models are called from Matlab. The main command for this is
simData := A:-Simulate(output = datapoint, duration = 1.0, params = ['ResistorValue' = 100, 'TemperatureValue' = 300])
where the resistance has been set to 100 Ohms and the temperature to 300 Kelvin. A more detailed set of commands is given in Fig. 5 in the Appendix.
The result is a set of three numbers giving the time, current value and voltage value from the simulation. While students could explicitly run a few dozen simulations like this to determine the relationship between resistance, temperature, voltage and current, it is much more efficient to have the students run a parameter sweep, instead.
The parameter sweep centers on a nested For loop and outputs a pair of surface plots that show how voltage and current vary based on changing temperature and inline resistance. These are shown in Fig. 3, below. The Maple script for executing this is shown in Fig. 6 to 9 in the
Appendix. Without knowing the fundamental mathematical models for the resistor or diode the students can visualize how diode voltage and current are highest for the lowest resistance and temperature.
Figure 3 Parameter sweep results: highest current and voltage are found when resistance and temperature are lowest.
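The sweep’s structure (a nested For loop accumulating outputs) can also be illustrated outside Maple. In the Python stand-in below, simulate() is a placeholder for MapleSim’s A:-Simulate call, and the toy formulas and sweep ranges are assumptions for illustration only:

```python
def simulate(resistance, temperature):
    """Placeholder for the MapleSim model call: a toy monotone model,
    not the real diode physics."""
    current = (300.0 / temperature) / resistance
    voltage = 0.7 + 0.01 * current * resistance
    return current, voltage

results = []
for r in range(100, 1001, 100):        # resistance sweep, ohms
    for t in range(250, 351, 25):      # temperature sweep, kelvin
        i, v = simulate(r, t)
        results.append((r, t, i, v))
# Each (R, T, current, voltage) row is one grid point, ready to be
# reshaped into the two surface plots of Fig. 3.
```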
These graphs permit the student to observe the behaviour of the system as it is subjected to variations in system parameters. The rules governing the behaviour can be induced from these graphs, without requiring the student to perform high-level mathematical analysis. This type of parameter sweep could be performed in any number of software packages, including Matlab or a PSPICE variant. What distinguishes Maple and MapleSim from these other programs is the ability to generate the underlying equations of the system. This is a critically important feature.
MapleSim contains an “attach equation” option which populates a Maple worksheet with live interactive hooks to the MapleSim file. Three equations are created by the worksheet which describe the mathematical rules governing the system:
Vbatt − OutputVolt(t) − D_s(t) · {D_Goff if D_off(t); 1 otherwise} − D_Goff · D_VKnee = 0
D_off(t) = (D_s(t) < 0)
OutputVolt(t) = D_s(t) · {1 if D_off(t) = true; D_Ron otherwise} + D_VKnee
where Vbatt is the input battery voltage, OutputVolt(t) is the voltage of the output voltage probe, and D_VKnee is the knee voltage of the diode. Other characteristics of the diode include the conductance in the off state, D_Goff, the resistance in the on state, D_Ron, and the saturation current, D_s(t). The diode’s on/off state is recorded by D_off(t), which is true when the diode is off and false when it is on.
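For experimentation, the piecewise relations can be transcribed into a short function. The Python rendering below is a sketch; the parameter values are placeholders, not values from the paper’s model:

```python
def diode_relations(s, VKnee=0.7, Ron=1e-5, Goff=1e-5):
    """Evaluate the piecewise diode relations returned by MapleSim's
    'attach equation' feature. Parameter values are illustrative."""
    off = s < 0                                   # D_off(t) = D_s(t) < 0
    v = s * (1 if off else Ron) + VKnee           # OutputVolt(t)
    i = s * (Goff if off else 1) + Goff * VKnee   # current term, first relation
    return off, v, i
```

For negative s the diode is off and carries only the tiny leakage set by Goff; for positive s it conducts through the small on-resistance Ron, with the output voltage pinned near the knee voltage.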
The traditional deductive approach to teaching this material would require the student to have a relatively extensive foundation of mathematical and electrical engineering knowledge before being able to determine these equations by him or herself. The student’s exposure to the classroom professor and teaching assistant is limited and the relevant examples in the textbook or on the Internet tend to be simplified and repetitive.
The MapleSim approach provides a far more flexible and accessible resource to the student than is possible in the classroom or in the standard reference material. Can the student simply hit the “attach equations” button and submit the result for homework? No. The equations typically require some understanding of the underlying variables and structure. The student must still have some basic understanding of what many of the components are, and will benefit from being able to discuss questions and concerns with a teacher or teaching assistant. However, it is now far easier for the student to use basic concepts conveyed by the textbook or standard lectures, vary them in subtle or extreme ways and to see the effect not only on the numerical simulation output but on the underlying mathematics as well. Examples that were not practical to attempt by anyone but the brightest students are now within the reach of motivated and curious students.
Results
Inductive learning is very effective, and the author has adopted software in class designed specifically to support this approach. Using MapleSim, students start by drawing the system schematic and then simulating it. They then extract the underlying equations in the software, explore them under different scenarios, and analyze them to draw conclusions. Importantly, students can match this process with what they see in their textbooks, since the progression mirrors the textbook treatment. Students in ELE 604 value the inductive approach. Since 2010 I have made the course more inductive and less deductive. The students' answers to the two pertinent questions on the Faculty Course Survey, Q1: “Concepts are clearly explained with appropriate use of examples” and Q2: “The way this course is taught helps me to learn”, shown in Fig. 4, provide good evidence that the approach is working.
Figure 4 Student evaluations have shown notable improvement. In particular Question 1: “Concepts are clearly explained with appropriate use of examples” and Question 2: “The way this course is taught helps me to learn” illustrate that since inductive tools were introduced and emphasized in class the students have given more positive ratings.
Perhaps the most significant feedback was the following comment from a student on the student evaluations in 2013, illustrating that students get this inductive approach:
“The professor for this course was very good. His approach in teaching the material was very helpful. He decided to teach us the material in a way where we learned the material through the labs and from the labs we would then learn the theory. This was an effective method of teaching, since the labs made me learn the material before the lectures and then understand the `why’ of the material I learned in the labs through the theory presented in the lectures. ... Overall, the professor was friendly and really helpful with the students -- he really wanted us to learn the material and see the actual application of it in the real world.”
As seen in Fig. 4, it took two years to see a real change in student response to the introduction of MapleSim as an inductive learning tool. The main reason was that it was initially used only in the laboratories and without the equation extraction feature. Over time it became more integrated into classroom exercises; the students began using the equation extraction feature and components from outside the electrical engineering domain (primarily the mechanical toolboxes). In 2014 more Maple-only exercises (without MapleSim) were introduced during classroom discussions to illustrate methods for solving mathematical equations related to instrumentation problems. Students have reported that they are now self-selecting Maple in their Control Engineering class rather than the instructor’s preferred tool, Matlab. This could be due to the Maple interface more closely resembling their textbooks and lecture material.
Finally, in 2014 MapleSim will be introduced in an online distance education course at Ryerson University, CKRE 130 Embedded Systems Hardware Architecture and Implementation. The possibility for self-directed learning using MapleSim is particularly attractive in the context of this course and the lessons learned from teaching with it in ELE 604 will be applied.
Conclusions
End-of-semester student evaluations have shown a quantifiable improvement in the course since the introduction of the inductive approach. As MapleSim and Maple have become more integrated into the course, question categories such as “The way this course is taught helps me to learn” and “Concepts are clearly explained with appropriate use of examples” have seen quantifiable improvements over the past three years. The positive response of students to the use of MapleSim and Maple has encouraged us to use it as the core of a new distance education course in embedded systems architecture. Students are now electing to use Maple rather than Matlab in other engineering courses, without instructor intervention, likely due to Maple’s interface more closely resembling lecture and text material. The inductive approach is
complementary to the traditional deductive approach and should at least be considered for portions of courses where student self-exploration could result in positive educational outcomes. With insightful and intuitive tools like MapleSim and Maple it is far easier to find an appropriate mixture of both of these styles within your own courses.
Bibliographic Information
1. Ryerson University. ELE 604 Sensors and Measurement, 2013. URL: http://www.ryerson.ca/calendar/2012-2013/pg3180.html#249649 (Last visited January 1, 2014.)
2. MapleSoft. MapleSim product webpage. URL: http://www.maplesoft.com/products/maplesim/ (Last visited January 1, 2014.)
3. M. J. Prince and R. M. Felder. Inductive teaching and learning methods: Definitions, comparisons, and research bases. Journal of Engineering Education, 95(2):123 – 138, April 2006.
4. N. Knobloch. Is experiential learning authentic? Jour. of Agr. Education, 44(4):22–34, 2003.
5. M. A Mazidi and D Causey. HCS12 Microcontrollers and Embedded Systems. Prentice Hall Press, 2008.
6. F. M. Cady. Microcontrollers and Microcomputers: Principles of Software and Hardware Engineering. Oxford University Press, Inc., 2009.
Appendix: Maple script codes to run parametric searches in a MapleSim model.
Figure 5 Simple example of calling the MapleSim model from Maple a single time.
Figure 6 Parameter Sweep, Part 1: Setting up Maple and MapleSim for the parameter sweep. Tell Maple where and which files to simulate, then run the simulation specifying which parameters to modify.
Figure 7 Parameter Sweep, Part 2: Define the parameters. A parameter sweep requires boundaries for each parameter; here the first parameter is resistance and the second is temperature.
Figure 8 Parameter Sweep, Part 3: run the parameter sweep. Vary each of the parameters and accumulate the resulting outputs from the MapleSim simulation in two columns.
Figure 9 Parameter Sweep, Part 4: plotting a pair of surface plots to visualize the result of the parameter sweep.
Paper ID #10699
Designing, Building, and Testing an Autonomous Search and Rescue Robot— An Undergraduate Applied Research Experience
Zachary Cody Hazelwood
Cody Hazelwood is currently a software developer at the Alpha High Theft Solutions division of Checkpoint Systems. He received the B.S. degree in Professional Computer Science from Middle Tennessee State University in May 2013. He currently does freelance projects involving mobile software development, microcontroller applications, and electronics. He enjoys learning about and testing ways to improve people’s lives with technology.
Dr. Saleh M. Sbenaty, Middle Tennessee State University
Saleh M. Sbenaty is currently a professor of engineering technology at Middle Tennessee State University. He received the B.S. degree in E.E. from Damascus University and the M.S. and Ph.D. degrees in E.E. from Tennessee Technological University. He is actively engaged in curriculum development for technological education. He has authored and co-authored several industry-based case studies. He is also conducting research in the areas of mass spectrometry, power electronics, lasers, instrumentation, digital forensics, and microcontroller applications.
© American Society for Engineering Education, 2014
Designing, Building, and Testing an Autonomous Search and Rescue Robot —
An Undergraduate Applied Research Experience
Preamble
Middle Tennessee State University Undergraduate Research Center, MTSU URC, was created in
2004 to promote research at the undergraduate level and to provide university support for
undergraduate students and the faculty members who mentor them in scholarly and creative
activities. This includes providing information and financial support through Undergraduate
Research Experience and Creative Activity, URECA, grants. The URECA Committee evaluates
proposals based on their merits. This committee is composed of accomplished and passionate
faculty representatives from all five colleges at MTSU.
Universities usually do research as part of their missions (teaching, research, and service). As the
institution with the largest undergraduate population in TN, MTSU is committed to being a
leader in undergraduate education in the state. MTSU is known for student-centered learning and
great classroom teaching. A natural extension of the classroom is the one-on-one interaction
between a research student and his/her mentor that can shape a student's career.
URC Mission
As part of the Office of Research, the URC mission is to be the central hub for
communication about undergraduate research grant programs and other related opportunities
on and off campus and to distribute university funds for undergraduate research and creative
projects and travel to disseminate results.
URC Vision
The URC is pursuing its vision to nurture a culture of research and creative activity through
support for undergraduate students and their faculty mentors.
URC Values
Implement the goals of the University's Academic Master Plan related to the URC mission
with the following values:
Excellence in research, scholarship, and creative projects.
Opportunities for student-centered learning.
Productive internal and external collaborations and partnerships.
Success in academic and professional careers of our undergraduate students and their
faculty mentors.
Why Should MTSU Fund Undergraduate Research?
Through URECA grants and other center activities, MTSU invests $120,000 in
undergraduate research grants each year. Research involving undergraduates is a logical
investment because it helps the university to:
Maintain strong ties with alumni,
generate a workforce of accomplished and sought-after graduates,
build strong graduate programs,
provide the extra challenge and preparation for high-end students going to top-notch
graduate schools, and
attract the ‘best and brightest’ to our campus.
We consider undergraduate research to be a signature program at MTSU.
Why Should an Undergraduate Student Do Research?
Undergraduate students are encouraged to conduct research since this unique experience puts
them a cut above the rest when applying for jobs. This is true since undergraduate research
helps students by:
Integrating coursework through “hands-on” projects.
Creating independent and autonomous researchers.
Building a resume: writing a proposal, completing a research project, writing a final
report, and orally presenting the results greatly enhance the student’s experience.
Preparing for graduate school, where a main goal is a research project.
Developing “soft skills” important for entering and succeeding in the job market.
URECA grants allow students to build skills in their chosen fields without having to
work an outside job.
The current paper describes the undergraduate research experience for an applied hands-on
project and how this benefited the student.
Introduction
Robotics is a relatively young field. One of the first demonstrations of robotics as we know it
occurred in 1898, when Nikola Tesla built a remote-controlled boat and demonstrated it at
Madison Square Garden. The Unimate, an industrial robotic arm, was first introduced at General
Motors1 in 1962. As robotics has improved, robots have become very useful, especially in
dangerous situations. They can enter into locations that humans could not without placing lives
in danger. They can be strong enough to lift cars, or they can be accurate enough to perform
delicate surgery. So why do we not use robotics more often? It is because robotics is an
expensive field. For example, including development costs, the Northrop Grumman RQ-4
Global Hawk, currently used by the US Air Force and Navy, NASA, and the German Air Force,
costs $218 million2. Despite these numbers, it is now possible to develop a low-cost robot
prototype that, with some minor physical improvements and scaling up of component quality,
could be used and afforded by small-town law enforcement, rescue, and emergency
management teams. The outcome of the project is important, because it demonstrates that the
accessibility of robotics is increasing rapidly, and that it would be feasible for robots to be used,
even on a budget.
Burning buildings, collapsed mines, and hostage situations all have one thing in common: they
are dangerous for rescuers. Robots are, therefore, a preferred substitute! Robots can accomplish
many tasks without requiring a human being to enter a dangerous location, thereby possibly
saving lives. In the past, robots were not an option due to the high cost involved and the training
required to operate them. However, with modern advancements in technology, robots are
becoming increasingly used in situations and locations that were unthinkable in the past.
The objective of this project is to develop an inexpensive, easy-to-operate, autonomous robot
capable of navigating itself in dangerous situations. This is a significant project,
because it demonstrates that using robots is not too far out of reach, even for local
emergency crews and law enforcement. This project explores artificial intelligence as it relates
to self-guided robots, microcontroller programming and code optimization, wireless video
streaming, and remote control using a smartphone or tablet accelerometer. Upon the project’s
completion, the plan is to develop a very simple to use robot capable of driving itself through a
building while sending a video feed from a user-controllable camera back to the smartphone.
The strategy for implementing this project was to use and integrate the knowledge that the
student had obtained from several courses such as Microprocessor Operation and Control,
Intelligent Robot Systems, and Electronics along with open-source hardware and software
libraries to develop a prototype robot. The project started by developing a very simple robot on
which to build a software platform. The initial robot would have a remotely operated design,
and development would then move to the implementation of autonomy. One important issue to keep in mind when
implementing autonomy was that there needed to be a proper balance between full autonomy and
user control. In a search and rescue type environment, because of the unpredictability of an
environment in a disaster situation, it is important that accurate manual control can be obtained if
needed3.
Project Background
Carnegie Mellon University is on the forefront of research for Search and Rescue robots. One of
the most notable developments is the Snake Robot (also known as a hyper-redundant robot).
The National Institute of Standards and Technology is also doing research on Search and Rescue
robots. They have created a competition with three arenas of varying difficulties. Participants
are required to navigate through the arenas to complete tasks without damaging the environment
or the manikins. With careful data collection and observations of these competitions, they were
able to explore the variety of robot implementations, the pros and cons of each implementation,
and human and robot interactions. Researchers are able to use the data collected to learn about
and improve situational awareness, which is essential for robot autonomy.
According to the Springer Handbook of Robotics, there is a lot of work to be done in
autonomous robotics for search and rescue. Due to the unpredictability of the environments
encountered in disasters, better programming algorithms and better sensors need to be developed.
Also, a proper balance needs to be struck between full autonomy and user control. Current
search and rescue robots require hours of training for humans and animals.
This paper summarizes the development process of the robot. It covers the circuitry, hardware,
and software components used (as well as some alternatives) and the reasoning for using those
components. It also covers the results of this project, including reactions from the general
populace upon presentation at the Middle Tennessee State University Scholar's Week poster
presentations.
Methodology
1. Circuit Design
The circuit used for power distribution had to meet the following specifications: it should be
powered by a single 7.2 volt 3300 mAh remote-control car hobby battery; it should provide two
separately regulated power sources, each supplying up to one amp; it should be compact; and it
should be built on a custom printed circuit board. Based on these specifications, the 7805
voltage regulator was chosen.
The 7805 is low-cost, and it is available locally at electronic and hobby stores. Along with a
couple of capacitors, using two 7805s made it possible to design a single compact circuit board
to power 5 servos, a microcomputer, a microcontroller, a camera, a light, and a handful of
sensors. EAGLE CAD was used to design the schematic and PCB layout.
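As a rough sanity check on this choice (an illustration, not taken from the paper), the heat a linear regulator such as the 7805 must shed is simply the input-output voltage differential multiplied by the load current:

```python
def regulator_dissipation_w(v_in, v_out=5.0, i_load=1.0):
    """Approximate heat dissipated by a linear regulator such as the 7805."""
    return (v_in - v_out) * i_load

# At the 7.2 V battery and the one-amp-per-regulator budget:
worst_case_w = regulator_dissipation_w(7.2)  # roughly 2.2 W per regulator
```

At roughly 2.2 W per regulator under full load, modest heat sinking is needed, and this dissipation is the main efficiency cost of the linear-regulator approach.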
An assortment of hardware and software was needed for the robot. It needed a method of
wireless communication, a device to host communication and delegate commands, a camera,
sensors, motors, a body, wheels, and a device to control it. The robot was divided into several
phases of functionality, and based on the requirements of those phases, appropriate hardware and
software solutions were chosen.
Figure 1. Power Distribution Circuit Layout
Figure 2. Power Distribution and Microcontroller
2. Hardware Design
The first consideration for the hardware was the need for a high-quality video feed while keeping
cost in mind. To tackle this issue, it was necessary to have a quality camera, high-speed data
transfer, and a device to handle the video stream. For the camera, a standard Logitech C310 HD
Webcam was chosen. The C310 is highly rated, very easy to obtain, inexpensive, and it provides
acceptable video quality up to 1280 x 720 pixels with a JPEG-based video stream4. For wireless
communication, the 802.11n wireless Ethernet protocol was chosen because of its wide support
and large bandwidth. A TP-Link TL-WR703N was chosen to act as the central point of
communication for the robot. The TL-WR703N is a miniature wireless router with a USB port,
Ethernet port, and 802.11n Wi-Fi support. It is very “hackable” (easy to modify and install
alternative firmware) and has large support in the hobby community. After flashing OpenWrt's
alternative firmware10 and installing MJPEG-Streamer11, this wireless router is able to route
network connections to the robot and handle all of the video streaming.
The next consideration for hardware was the need for a controller for motors and sensors. For
this purpose, an Arduino Uno SMD microcontroller was chosen. The Arduino Uno is a very
common open-source microcontroller platform based around an ATmega328 microcontroller.
The ATmega328 provides 6 channels of PWM (pulse width modulation) output and 6 analog
inputs along with several more digital inputs and outputs, which is plenty of I/O for a simple
robot5.
The next step in choosing hardware was to find a way to establish communication between the
TP-Link wireless router and the Arduino Uno. There were a couple of options available for this
with large differences in the pros and cons of each method. Adopting the appropriate method
was essential, because this connection is the heart of all communication that occurs with the
robot.
Figure 4. TP-Link TL-WR703N
Figure 5. Logitech C310 – Image from Logitech.com
The first option was to use an Arduino Ethernet shield. The Ethernet shield provides one
10/100 Mbit Ethernet port attached to a board that fits on top of the Arduino. This option would
make networking very easy, but because of the speed of the Ethernet connection, it would use a
lot of the Arduino’s available processing power. The next option was to modify the TP-Link
wireless router and use a connection to the built in serial circuit to directly route packetized serial
messages to the Arduino’s serial bus. This option was good, because it required no additional
hardware. It would also have very low latency, because there is no additional device in the
signal chain. However, it would not be expandable
at all, and there is a risk of damage when
modifying a multi-layer machine-soldered circuit
board. The final option was to use a RaspberryPi
to receive communications and pass them on to the
Arduino. This is the option that was chosen,
because it allows for expandability (such as adding
computer vision for autonomy), and it allows more
communications options than just packetized serial.
The RaspberryPi Model-B provides a 700 MHz
ARM11 CPU, a Broadcom VideoCore IV GPU
with OpenGL support, and two USB 2.0 ports.6
Another important step in choosing hardware was to decide the body style of the robot and the
method to be used for mobility. Since the goal of this project was to create a low-cost prototype,
Polymethyl methacrylate (PMMA) (the proper name for what most people know as Plexiglas or
Acrylite) was chosen. PMMA is easy to work with, and it provides plenty of strength for
developing a simple robot. For movement, tank style treads were chosen. This will allow using
only 2 motors for movement in any direction. Also, tank treads work well in unpredictable
environments due to their large surface area and the teeth that normally characterize industrial
tank tread design. In the case of this project, the treads were plastic. Standard Hitec HS-322HD
hobby servos were used for tilting and panning of the camera. Two Hitec HS-425BB servos
were chosen and modified to be continuously rotational for attaching to gears to drive the robot.
One additional step in choosing hardware was to determine the needed sensors for autonomy.
Based on the original project specifications of a heat-seeking robot, a thermal sensor was
necessary. Also, as is the case in any autonomous robot, sensors for obstacle avoidance were
needed.
Figure 6. RaspberryPi Board
Figure 7. Parallax Ping))) Sensor
Figure 8. Sharp IR Sensor
For the temperature sensor, the Melexis MLX90614 Infrared Thermometer was chosen.
This device is a medical grade temperature sensor that uses the 2-wire I2C protocol for
communication. It can be programmed to have a temperature range of −70 °C to 382.2 °C7.
For obstacle avoidance, multiple sensors were chosen. A Parallax Ping))) was chosen for the
front of the robot. It mounts on a servo, so measurements can be taken across a 180° arc. For
the sides of the robot, Sharp GP2D12 IR sensors were used. These can be used with minimal
error while the robot is moving, so they work well for situational-awareness tasks such as
centering the robot in hallways and doorways.
The final step in choosing the hardware was to determine a method of remote control for the
robot. Because of the desire to make control as user friendly as possible, an Android smartphone
(Motorola Droid 3) and Android tablet (Asus Nexus 7) were chosen. By developing a remote
control system on a mobile computing device, the learning curve for operation is decreased
significantly as well as the cost of the project.
3. Software Design
The greatest challenge in designing a robotic system from the ground up is the software. For
this project, multiple layers of software were needed. It was important to keep the modules
independent so that if one portion of the system changed, the entire software stack would not
have to be rewritten. The software consisted of modules for the microcontroller, the wireless
the camera server, the communication and logic server, and for the Android device that would
issue commands to the robot.
The first stage of software development consisted of developing a custom Android application
to control the robot. This application needed to be easy to use, able to drive the robot using the
accelerometer, able to issue commands to the robot, and able to view the camera feed. In
addition, it helped to be able to observe the temperature readings obtained by the robot. In
order to get some practice with Android development, and to become familiar with the
accelerometer API from the Android Software Development Kit, a simple test application was
written to display accelerometer values to the screen. Once that was completed, the project was
copied and modified to add some additional
libraries. The Autobahn WebSocket library for Android was chosen for communication. It
provided RFC 6455 WebSocket support, multi-threading, and an easy-to-use API8. Next, some
code was added from a developer with the forum handle 'padde' that provided a way to take an
MJPEG stream and display it full-screen9. This code was heavily modified to remove
features that were unnecessary, to clean up the coding style, and to increase the efficiency. It
was also modified to allow a custom, resizable SurfaceView to be used, so that buttons to send
commands to the robot could be displayed simultaneously while viewing the camera. Once these
libraries were added, code was written to interpret accelerometer data and send it to the robot.
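The tilt-to-drive mapping can be sketched as follows. This is a hypothetical illustration (the application's actual algorithm and axis conventions are not given in the paper): pitch sets forward speed, roll steers by differencing the two tread speeds, and a dead zone keeps the robot still when the device is held level.

```python
def tilt_to_treads(pitch, roll, max_speed=100, dead_zone=0.1):
    """Map device tilt (roughly -1..1 g per axis) to left/right tread speeds."""
    def clamp(v):
        return max(-max_speed, min(max_speed, v))

    # Ignore small tilts so the robot stays still when the phone is level.
    if abs(pitch) < dead_zone:
        pitch = 0.0
    if abs(roll) < dead_zone:
        roll = 0.0

    forward = pitch * max_speed
    turn = roll * max_speed
    # Differential (tank-style) drive: turning slows one tread, speeds the other.
    return clamp(forward + turn), clamp(forward - turn)
```

Holding the phone level yields (0, 0); a full forward tilt drives both treads at full speed; combining forward and sideways tilt produces a sweeping turn.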
Once the Android application was complete, it was possible to work on the wireless router. The
factory-installed software on the router was in Chinese, which made operation very difficult. It
was also very limited in features. To overcome these limitations, OpenWRT was installed.
OpenWRT is a custom Linux-based firmware that allows heavy customization of the router's
internal settings10. MJPEG Streamer, a software package that provides a simple web server and
image streamer, was also installed. This allowed the router to function as a video-streaming
server as well as a communications router11. The Linux installation was customized to start
streaming video as soon as the router is powered on.
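The streaming setup on the router amounts to launching mjpg-streamer at boot. The exact flags used in the project are not documented here, so the command below is an assumption based on mjpg-streamer's standard UVC input and HTTP output plugins; the device path and port are likewise placeholders.

```python
import subprocess

# Hypothetical boot-time launch command; device path and port are assumptions.
# input_uvc.so captures MJPEG frames from the webcam; output_http.so serves them.
MJPG_STREAMER_CMD = [
    "mjpg_streamer",
    "-i", "input_uvc.so -d /dev/video0 -r 1280x720",
    "-o", "output_http.so -p 8080",
]

def start_streamer():
    """Launch the streamer; requires the hardware and binaries, so not run here."""
    return subprocess.Popen(MJPG_STREAMER_CMD)
```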
The next phase of software development involved the server (which runs on the RaspberryPi)
and the microcontroller code (which runs on the Arduino). These pieces of software needed to
be developed in tandem, because the logic for operating the robot was shared between the two
components.
Before beginning development, a method of communication between the RaspberryPi and the
Arduino needed to be chosen. The first method chosen was a library called Firmata. Firmata is
an application that can be installed on the Arduino that allows nearly complete control of the
board via serial. JohnnyFive, an asynchronous JavaScript library that interfaces with Firmata,
was chosen for the server side. Tests worked extremely well for motor control, and even for
turning an LED on and off fast enough to simulate PWM. However, when the time-sensitive
Ping sensor was added, the motors became jerky and communications became unreliable.
The next method chosen was a system of short numeric codes created to allow very short serial
messages to be sent. The server section of the code was based on the Node.JS platform.
Node.JS was chosen because it provides an asynchronous, JavaScript-based programming
environment that is great for rapid development of scalable web applications12. By using web
technologies, the application becomes extremely flexible and very easy to develop
cross-platform. Although this is not technically a web application, it is important to have an
asynchronous application, because network connections and robot communications need to
occur simultaneously. The Node.JS application used the Serialport-Node13 library and the
WebSocket-Node14 library to receive JSON (JavaScript Object Notation) messages over a
WebSocket network connection, parse and process the messages, and send them over serial to
the Arduino. The Arduino software would receive these messages, process them, and then send
the appropriate commands to the motors. The Arduino was also responsible for sending
temperature readings back to the client application. The Arduino stored states for the motors
and also handled sensor readings when necessary.
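The message path just described (JSON over a WebSocket, translated to a short numeric code over serial) can be sketched as follows. The field names are hypothetical, and the project's actual server was written in Node.js; this Python sketch only illustrates the translation step using a subset of the codes from Table 1.

```python
import json

# Subset of the numeric message codes defined in Table 1.
COMMAND_CODES = {
    "stop_all": 1,
    "leds_on": 2,
    "leds_off": 3,
    "drive_forward": 12,
    "drive_backward": 13,
    "drive_stop": 16,
}

def websocket_to_serial(message):
    """Parse a JSON command message and return the byte to write to the Arduino."""
    command = json.loads(message)["command"]
    code = COMMAND_CODES[command]
    return bytes([code])  # a one-byte message keeps serial traffic minimal

# e.g. serial_port.write(websocket_to_serial('{"command": "drive_forward"}'))
```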
For autonomous operation, the Arduino handled all logic. Communications were still
maintained with the remote-control client so the user could stop and start the robot, reposition
the camera, and take over manual control if needed. The autonomous system used a
subsumption-based reactive robot architecture. Potential fields were used to process sensor
information to avoid obstacles and move toward heat.
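A minimal potential-field update (a sketch of the general approach, not the project's Arduino code) sums an attractive vector toward the heat source with repulsive vectors pushing away from nearby obstacles:

```python
def potential_field_step(heat_bearing, obstacles, repulse_gain=50.0):
    """Combine attraction toward heat with repulsion from obstacles.

    heat_bearing: (x, y) unit-ish vector toward the strongest heat reading.
    obstacles: list of (x, y, distance_cm) from the Ping))) and IR sensors.
    Returns a steering vector; its angle picks the drive direction.
    """
    fx, fy = heat_bearing  # attraction pulls with constant magnitude
    for ox, oy, dist in obstacles:
        if dist <= 0:
            continue
        # Repulsion falls off with the square of distance, so only
        # nearby obstacles dominate the steering decision.
        strength = repulse_gain / (dist * dist)
        fx -= ox * strength
        fy -= oy * strength
    return fx, fy
```

With no obstacles the robot drives straight toward the heat; a close obstacle directly ahead flips the resultant vector, turning the robot away.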
4. Project Assembly
Assembly was the last phase of the project. A frame for holding the treads, a bottom panel, and
a rear panel were designed using AutoCAD. The designs were uploaded to a laser cutter, where
the PMMA could be shaped to the design specifications.
Pieces of PVC pipe were used for mounting sensors that needed to be elevated as well as the
camera and its pan and tilt servos. Once the pieces were cut, the frame was assembled using
machine screws and a gel-based cyanoacrylate glue. Wires were laid out and cut to length. The
power distribution board was populated and tested. Connections were soldered, and heat shrink
was applied. Once everything was tested to be working, zip ties were used to keep the
appearance of the wiring clean.
Code  Message           Code  Message
1     Stop All          20    Camera Left
2     LEDs On           21    Camera Right
3     LEDs Off          22    Camera Up
4     Autonomy On       23    Camera Down
5     Autonomy Off      24    Camera LR Stop
10    Drive Left        25    Camera UD Stop
11    Drive Right       26    Camera Stop
12    Drive Forward     29    Camera Dir
13    Drive Backward
14    Drive F Left
15    Drive F Right
16    Drive Stop
Table 1 – Numeric Message Codes
Figure 9. Prototype Tread Frames
Figure 10. Final Robot Assembly
Results
Running at full power, the battery lasted about 45 minutes. Running without the motors, the
battery lasted about 3 hours. This demonstrates that the power distribution circuitry is fairly
efficient and adequate for use in a real-world application. Battery life could be improved either
by using a larger battery or by using more efficient voltage regulators.
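These runtimes imply average current draws that can be back-calculated (a rough estimate from the figures above, ignoring regulator losses and battery voltage sag):

```python
def avg_current_ma(capacity_mah, runtime_hours):
    """Average draw implied by running a battery of given capacity flat."""
    return capacity_mah / runtime_hours

full_power = avg_current_ma(3300, 0.75)  # with motors: about 4400 mA
idle = avg_current_ma(3300, 3.0)         # electronics alone: about 1100 mA
```

The gap between the two figures shows that the drive servos, not the electronics, dominate the power budget.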
The hardware chosen fulfilled the project’s specifications. The drive servos turned out to be
slower than expected. One of them also had to be replaced during testing. The drive system
could be greatly improved by using gears to drive the tank treads. It would also be better if the
servos were not supporting any of the robot's weight. The wireless router was determined to be
unnecessary, as a USB 802.11n Wi-Fi dongle could have been used to provide an Ad-Hoc
network from the RaspberryPi and the MJPEG Streamer software is also compatible with the
RaspberryPi. Since the router was already purchased, it was used anyway in the design.
However, one advantage of using the router is that if something malfunctioned with the robot,
video was still available. Also, keeping the video separate allows many people to view the video
stream without compromising communication with the robot.
Software development accounted for a very large portion of the project’s total time. A lot of
research and trial and error went into the process of choosing various libraries and determining a
method of communication among all of the devices. In testing the software, the best-case time
for the robot to respond to a message from the Android device was undetectable by the human
eye. The worst-case time was still less than one second.
The first demonstrations of the robot’s teleoperation occurred in small settings early in the
project’s development. Based on feedback from alpha-testers, accelerometer algorithms and
camera control methods were modified until they were more comfortable to use. The result was
very simple robot operation.
The robot was publicly demonstrated at the Middle Tennessee State University Scholar’s Week
Poster Presentations. The response from the public was highly positive. The MTSU UAV
program, as well as a forensic anthropologist who works for MTSU, expressed interest in further
discussions about ways the technology developed in this project could be used in their
respective fields.
Further testing showed great success with the robot's teleoperation; however, many
improvements still need to be made to the autonomy. Using computer vision to assist in
tracking objects, as well as implementing path-planning algorithms, would make the robot
much more reliable. Also, modifying the heat sensor to allow it to focus better, or adding a
thermal imaging camera whose output is processed with computer vision, would allow much
better tracking of heat.
Conclusion
This project was a very large undertaking for a first research project. Although the teleoperation
portion of the project was very successful, the fully autonomous operation could not be
completed in the URECA project timeline of 200 hours. Despite this shortcoming, the outcome
of this project was still a success. This project demonstrates that developing a robot for search
and rescue that is within reach of local emergency crews is something that could occur in the very
near future. With some minor hardware changes, an upgraded chassis, and additional software
development time, a usable product is not far off from the prototype developed in this project.
Appendix A: Project Timeline
References
1. Isom, James. "MegaGiant Robotics." MegaGiant Robotics. 2002. Web. 20 Apr. 2013.
2. Drew, Christopher. "Costly Drone Is Poised To Replace U-2 Spy Plane." The New York Times. The New
York Times Media Group, 2 Aug. 2011. Web. 20 Apr. 2013.
3. Wolf, Alon, Howard H. Choset, Benjamin H. Brown, and Randall W. Casciola. "Design and Control of a
Mobile Hyper-redundant Urban Search and Rescue Robot." Advanced Robotics 19.3 (2005): 221-48. Print.
4. Logitech. "Logitech HD Webcam C310." Logitech. Web. 20 Apr. 2013.
5. "Arduino Uno." Arduino. Arduino. Web. 21 Apr. 2013. <http://arduino.cc/en/Main/ArduinoBoardUno>.
6. "RPi Hardware." ELinux.org RaspberryPi Wiki. Web. 22 Apr. 2013. <http://elinux.org/RPi_Hardware>.
7. Melexis. "Melexis MLX90614 Datasheet." Microelectronic Integrated Systems, 30 Mar. 2009. Web. 22
Apr. 2013. <https://www.sparkfun.com/datasheets/Sensors/Temperature/SEN-09570-datasheet-
3901090614M005.pdf>.
8. "Autobahn|Android." Autobahn. Tavendo. Web. 22 Apr. 2013. <http://autobahn.ws/android>.
9. Padde. "MJPEG on Android Anyone?" Anddev.org. Web. 22 Apr. 2013.
<http://www.anddev.org/mjpeg_on_android_anyone-t1871.html>.
10. "TP-Link TL-WR703N." OpenWrt. Web. 22 Apr. 2013. <http://wiki.openwrt.org/ru/toh/tp-link/tl-wr703n>.
11. "Mjpeg Streamer." Sourceforge. 9 Apr. 2010. Web. 22 Apr. 2013.
<http://sourceforge.net/apps/mediawiki/mjpg-streamer/index.php?title=Main_Page>.
12. "Node.js." Node.js. Joyent. Web. 21 Apr. 2013. <http://www.nodejs.org/>.
13. Voodootikigod. "Node-serialport." GitHub Node-serialport. 24 Aug. 2012. Web. 21 Apr. 2013.
<https://github.com/voodootikigod/node-serialport>.
14. Worlize. "Websocket-Node." GitHub. Jan. 2013. Web. 21 Apr. 2013.
<https://github.com/Worlize/WebSocket-Node>.