12 December 2016
Auto-bot
Team #14
PROJECT PROPOSAL FEASIBILITY STUDY
Josiah Markvluwer (Mechanical)
Levi Dobson (Electrical)
Peter Jung (Electrical)
Peter Ye (Electrical)
Engr339/340 Senior Design Project
© 2016, Calvin College and Josiah Markvluwer, Levi Dobson, Peter Jung, Peter Ye
Funded partly by Delphi
Abstract
This report contains the feasibility study and proposed design of an autonomous driving vehicle by senior engineering students at Calvin College: Josiah Markvluwer (ME), Peter Ye (EE), Levi Dobson (EE), and Peter Jung (EE). The project is intended to showcase the capabilities of low-cost self-driving cars on a smaller scale and also serves as the capstone course of the Calvin Engineering Program. Primarily, the vehicle should be able to navigate itself around the Knollcrest loop of Calvin College, using sensors and GPS to detect obstructions and stop signs. Secondarily, the low-cost vehicle should demonstrate the reliability of its decision-making abilities.
Table of Contents

TABLE OF FIGURES
TABLE OF TABLES
1. INTRODUCTION
2. PROJECT MANAGEMENT
  2.1 TEAM ORGANIZATION
  2.2 SCHEDULE
  2.3 BUDGET
  2.4 METHOD OF APPROACH
3. REQUIREMENTS
  3.1 FUNCTIONAL REQUIREMENTS
  3.2 HARDWARE REQUIREMENTS
    3.2.1 Sensors
    3.2.2 Central Processing Unit
    3.2.3 Motor Control
    3.2.4 Interface
  3.3 SENIOR DESIGN REQUIREMENTS
4. TASK SPECIFICATIONS AND SCHEDULE
  4.1 PPFS REPORT WRITE-UP
    4.1.1 Individual Assignment
    4.1.2 Write-Up Meetings
  4.2 MATERIAL AND RESEARCH PLANNING
    4.2.1 Decision Matrices and Design Norms
    4.2.2 Positioning and Location System
    4.2.3 Obstacle Detection Network
    4.2.4 Motor Drive and System Actuator
    4.2.5 Vehicle Motors
    4.2.6 Steering System and Frame
  4.3 HARDWARE PURCHASING AND INSTALLATION
    4.3.1 Sensor Units - LIDAR, Camera, Ultrasonic
    4.3.2 Location System - DGPS
    4.3.3 Vehicle Frame and Motors
    4.3.4 Processors / Single Board Computers
  4.4 SOFTWARE DEVELOPMENT
    4.4.1 LIDAR Unit Sensors
    4.4.2 Arduino Motor Drive
    4.4.3 DGPS
    4.4.4 Camera Detection
  4.5 SYSTEM INTEGRATION
  4.6 WEBSITE DEVELOPMENT
  4.7 PRESENTATION WORK
  4.8 PLANNING MEETINGS
  4.9 BUDGETING AND PURCHASING
  4.10 SYSTEM PLANNING/MAPPING
5. SYSTEM ARCHITECTURE
  5.1 HARDWARE
  5.2 SOFTWARE
  5.3 MOTOR & CHASSIS
  5.4 INTERCONNECTIONS
6. DESIGN
  6.1 DESIGN CRITERIA
    6.1.1 Mapping and Navigation
    6.1.2 Obstacle Avoidance and Detection
    6.1.3 Cost Constraints
    6.1.4 Design Norms
  6.2 DESIGN ALTERNATIVES AND DECISIONS
    6.2.1 GPS Module
    6.2.2 LIDAR and Sensors
    6.2.3 Processing Unit
    6.2.4 Motor Control
7. INTEGRATION, TEST, DEBUG
8. CONCLUSION
9. ACKNOWLEDGEMENTS
10. REFERENCES
11. APPENDICES
  11.1 SOFTWARE CODE
    11.1.1 Arduino Code for Motor Drive
    11.1.2 Code for LIDAR
  11.2 DATASHEETS
    11.2.1 Victor SP Speed Controller
    11.2.2 LIDAR Data Sheets
  11.3 GANTT CHART
Table of Figures

Figure 1: Team Structure
Figure 2: Main Control Components
Figure 3: DGPS Diagram
Figure 4: NEO-M8P Module Diagram
Figure 5: LIDAR Operation Diagram
Figure 6: Motor Drive Components

Table of Tables

Table 1: DGPS Unit C94-M8P - Hardware Pieces
Table 2: DGPS Unit C94-M8P - Interfaces
Table 3: Positional System Decision Matrix
Table 4: Decision Matrix for DGPS
1. Introduction
The goal of this project is to create a vehicle that uses distance and camera sensors to follow a path assigned by a GPS module. These sensors will give feedback to the motors, allowing the vehicle to direct itself. Specifically, the team is building a prototype of an autonomous vehicle that can drive around Calvin's campus. This is an important problem because autonomous vehicles are typically thought of as expensive systems that can only be developed in high-technology industrial settings. The goal is to realize a small-scale system built on the same concepts that self-driving cars use today.
With these goals in mind, the final deliverables for the project will consist of a mechanical car and a motor system that uses sensor inputs to provide appropriate feedback to the car. The project could likely be expanded upon by future senior design teams at Calvin College, since many features could be added to the vehicle system if more time were available. Because vehicle autonomy is becoming the future of the car market, it is extremely relevant to all electrical and mechanical engineers graduating in the 2010s. This project therefore serves not only as a great introduction to the field of automation but also as a good pilot for what driving automation can look like in small-scale settings.
The team consists of three electrical engineering students, Peter Ye, Levi Dobson, and Peter
Jung; and one mechanical engineering student, Josiah Markvluwer. This project is part of a year-
long senior design course, which is required for every senior engineering student at Calvin
College. The goal of this senior design course is to give students opportunities to work on real-
world problems that suit their passions.
2. Project Management
2.1 Team Organization
The roles of the team members are diverse yet complementary. Team members are individually tasked with owning certain aspects of the project while collectively supporting one another in finishing the project and completing the final outcome. The team members' specific tasks are as follows. Peter Jung owns the vision system, camera performance, and budgeting. Levi Dobson owns the LIDAR/obstacle detection sensors. Peter Ye owns the DGPS and movement tracking unit and the motor control interface. Finally, Josiah Markvluwer owns the vehicle fabrication and keeps track of meeting minutes. Mr. Eric Walstra is the team's industrial mentor, having experience with self-driving vehicles. Professor Michmerhuizen is the team's advisor as well as a course instructor for Senior Design. There is a designated folder in Google Drive where all documents are kept so that all team members can access critical files at any time, anywhere there is an internet connection.
Figure 1: Team Structure
2.2 Schedule
The team's approach to scheduling is to have three update meetings per week that follow the senior design class meeting time at 3:30 P.M. on Mondays, Wednesdays, and Fridays. The intent of the meetings is to go over the organization chart and get a status update on how each task is going and whether it will meet its deadline. The status meeting is also a time for the team to gather input from other team members. This creates an environment where the team can review how realistic the time estimates are for a given task. The team is thus constantly updating the schedule and placing team members in support roles in order to drive through tasks that are critical to the development of the project as a whole. Another important aspect of the status meeting is assigning a time commitment to each task and its components.
Josiah Markvluwer will be in charge of maintaining the schedule, keeping track of the meeting minutes, and moderating the meetings to keep them on track. He also has the duty of taking notes of any critical decisions made during the meetings. These notes are
stored in the team’s Google Drive folders for reference. A detailed Gantt Chart is included in
Appendix 11.3.
2.3 Budget
Funding is being sought through Calvin's regional gift officer, Bill Haverkamp. The budget will be maintained by Peter Jung, who will keep track of spending in an Excel spreadsheet and update the budget whenever a new cost is invoiced. The budget will also be reviewed as an action item during the status meetings to actively track how much the team is spending. When budget issues arise, action items will go to a collective decision of the team. No spending will happen without the whole team's agreement and approval on an item.

So far Auto-bot has received $800 in funding from Calvin's regional gift office (by way of Delphi and Mr. Glen DeVos) and has used about $400 on a DGPS module from U-blox and a compass from Adafruit.
2.4 Method of Approach
The approach to design is to have the team member who owns an area lead that area in design and development. The team members and their areas are as stated in Section 2.1. Each member owns their process and consults the team for design reviews. A big part of the design process and review will be seeking outside experience and resources to ensure the design is successful. The design lead is in charge of calling the appropriate members to a meeting and keeping an organized approach throughout it. The leader of each system component is in charge of organizing the work flow and following an engineering decision matrix to ensure that all options are covered. Each leader is responsible for notifying the team when help is needed so another member can assist in meeting deadlines and requirements.
One important Biblical principle to keep in mind during meetings is to have respect for the opinions of each team member. A large part of an effective design review meeting is hearing out others' ideas even when they contradict one's own. Doing unto others as you would have them do unto you is a timeless lesson that is especially important to keep in mind when working on design.
3. Requirements
Since the project goal is to make a scaled-down autonomous vehicle, Team 14 does not have specific customers in mind; the main purpose of the project is a proof of concept. The project's requirements are broken into three categories: functional requirements, hardware requirements, and senior design course requirements.
3.1 Functional Requirements
There are three main functional requirements.
1. It shall be able to follow the roads on campus and stop at stop signs.
2. It shall know where to start and stop based on user’s inputs.
3. It shall adjust its speed based on the obstacles in its environment, including stop signs,
people, etc.
3.2 Hardware Requirements
Figure 2: Main Control Components
Four parts of hardware have to be implemented for the Auto-bot to drive on the road without human intervention (beyond inputting a destination). The Auto-bot has to collect data from its GPS and distance sensors and communicate with the motor control unit through a central processing unit to actuate vehicle movement. A user interface is needed to set the starting point and ending point of the travel route.
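The data flow above (sensors and GPS feeding a central processing unit, which commands the motor control) can be pictured as a simple control step. The function names, readings, and coordinates below are hypothetical placeholders for illustration, not the team's actual code.

```python
# Illustrative sketch of the sensor -> CPU -> motor-control data flow.
# All function names and readings are hypothetical placeholders.

def read_distance_sensor():
    """Stand-in for the distance sensors (meters to nearest obstacle)."""
    return 3.0

def read_gps():
    """Stand-in for the GPS fix (latitude, longitude)."""
    return (42.930, -85.588)

def central_processing(obstacle_m, position, destination):
    """Central processing unit: turn sensor data into a motor command."""
    if obstacle_m < 0.5:          # obstacle dangerously close
        return "stop"
    if position == destination:   # arrived at the user-set end point
        return "stop"
    return "drive"

# The user interface would set the destination; on the real vehicle this
# decision would run continuously in a loop.
destination = (42.931, -85.589)
command = central_processing(read_distance_sensor(), read_gps(), destination)
print(command)  # -> drive
```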
3.2.1 Sensors
The sensors have to be able to detect objects around the Auto-bot so that it can avoid obstacles or stop if needed. Since the Auto-bot must have time to slow down and stop, the range of these sensors has to be between 0.5 and 5 meters. A visual sensor (camera) also has to be implemented to detect stop signs along the road. Lastly, the sensors should communicate with the processing unit fast enough that it can perform the right operations in response to the sensory inputs.
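As a rough illustration of the 0.5 to 5 meter sensing window described above, each raw reading could be classified before the processing unit acts on it. The threshold constants come from the requirement; the function name and action labels are assumptions for illustration only.

```python
# Coarse classification of a distance reading against the 0.5-5 m window.
# Thresholds come from the requirement above; names are illustrative.

MIN_RANGE_M = 0.5   # closer than this leaves no room to stop
MAX_RANGE_M = 5.0   # beyond this the sensors need not report obstacles

def classify_reading(distance_m):
    """Map a raw distance (meters) to a coarse action for the processor."""
    if distance_m < MIN_RANGE_M:
        return "stop"
    if distance_m <= MAX_RANGE_M:
        return "slow"   # obstacle inside the window: begin slowing down
    return "clear"      # nothing within the useful range

print([classify_reading(d) for d in (0.3, 2.0, 8.0)])  # -> ['stop', 'slow', 'clear']
```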
3.2.2 Central Processing Unit
The central processing unit for the vehicle should be small enough to fit on the vehicle but powerful enough to process data from the different sensory inputs. It should also be able to display the status of the vehicle and be user-friendly.
3.2.3 Motor Control
Electric motors are needed for the Auto-bot to drive on the road, so motor driver boards are needed to slow down and accelerate the Auto-bot according to the information given by the central processing unit.
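The slow-down and accelerate behavior can be sketched as a rate-limited speed command of the kind a motor driver board might receive each control tick. The limit value, units, and names here are hypothetical.

```python
# Hypothetical rate-limited speed command: the motor driver moves toward
# the CPU's target speed gradually rather than jumping instantly.

def step_speed(current, target, max_delta=0.1):
    """Move the current speed toward the target, at most max_delta per tick."""
    delta = max(-max_delta, min(max_delta, target - current))
    return current + delta

speed = 0.0
trace = []
for _ in range(5):                  # accelerate smoothly toward 0.35 (arbitrary units)
    speed = step_speed(speed, 0.35)
    trace.append(round(speed, 2))
print(trace)  # -> [0.1, 0.2, 0.3, 0.35, 0.35]
```

The same function decelerates the vehicle when the target is lower than the current speed, since the delta is clamped in both directions.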
3.2.4 Interface
The interface should allow anyone to set a waypoint for the Auto-bot along with the starting and stopping points. It should also make it possible to monitor the process and make adjustments when something goes wrong.
3.3 Senior Design Requirements
The senior design course requires that each student complete a project proposal feasibility study (PPFS) before December 12, 2016. The final report and a working prototype are due before May 6, 2017. Throughout the year, presentations and the project website will be updated to show progress.
4. Task Specifications and Schedule
The first stage of this project is the planning and research phase. The major step needed to remove critical schedule linkages in developing a prototype is to order sensor components and begin testing them. This stage is nearly complete, and only a few sensors are still needed for integration to begin. The hope is that mastery of each component by a team member will lead to a group effort in the integration process.
The next major step is to test the capability of the sensors that will be incorporated into the system in order to form a good idea of what they can tell us about the environment. This will give clearer definition to what the final project will look like. One example of these tests is seeing what the LIDAR unit can detect outside. This will be tested as a lab project in the Computer Architecture class taken by the electrical engineers.
In parallel to this task, Josiah Markvluwer will be working on repurposing vehicle parts in order
to make a vehicle that will serve as the final prototype. The hope is that these two tasks will be
completely finished by the end of the fall semester so that the spring semester’s focus will be on
integration and completion of the project.
The most complicated step will be the integration of all these parts into a system that can drive
the motors directly. Compilation of the components into one connected system is the most
significant part of the project because it will allow us to begin testing the interaction between
parts of the system.
Another main task for this project is the Project Proposal Feasibility Study (PPFS) report, and the report write-up schedule is detailed below. For a detailed version of the overall schedule, please visit Team 14's Asana WBS [10].
4.1 PPFS Report Write-up
4.1.1 Individual Assignment
In order to be efficient in writing the feasibility report, the team assigned each section of the report to a specified team member to write in advance. These sections were to be completed in time for the compilation meeting so that the bulk of the writing and content could be reviewed and formatted at the write-up meeting.
4.1.2 Write Up Meetings
A write-up meeting was held on the weekend before the draft of the Project Proposal Feasibility Study was due and again on the weekend before the final copy was due. The purpose of the meetings was to review the material written by each member of the team and make any major editing decisions. The meetings also looked to identify any content voids in order to verify that the report was comprehensive and complete.
4.2 Material and Research Planning
Research for this project was first based on looking for system components that could meet the project goals. For example, the first aspect of the project that stands out is that a vehicle is being made; what follows is research into motors, wheels, sensors, and so on. Next, the Auto-bot vehicle is described as autonomous, so some sort of unit must be in place inside the vehicle to command it to stop, turn, or go. The goal of the Auto-bot is to travel to a user-given location and stop whenever a potential object gets in the way, so what follows is the question of what kind of items are needed to position the vehicle on a map and what kind of devices will grant "vision" to the vehicle. The research phase branches out into more and more questions until a path is found to answer the original question completely. Because of the nature of the project, it is not possible to purchase and test each potential branch of the system, so decision matrices and design norms are used to decide which ideas to pursue. An example of this is using a differential global positioning system instead of a local beaconing system to find the location of the vehicle on campus. The research was typically split among the group, with only one or two people focusing on finding the right elements to meet a specific project goal.
4.2.1 Decision Matrices and Design Norms
Decision matrices were drawn on a whiteboard in a team meeting based on each team member's summary of the system and product options resulting from their research. For finding the right type of system, it was usually important to look at factors like precedence (i.e., finding hobbyists who made the system work as part of their self-driving vehicle) or affordability. Once a system type was established, the team set out to research vendors for the best fit for the Auto-bot vehicle. The research was done individually by talking to advisors or searching online, and the options were compiled in the regular team meetings on Sunday nights until the team felt ready to make a decision about buying a particular product. The design section (Section 6.2, specifically) details this process.
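A whiteboard decision matrix of this kind amounts to a weighted score per option. The criteria, weights, and 1-5 scores below are invented purely to illustrate the mechanics; they are not the team's actual evaluations.

```python
# Minimal weighted decision matrix. Weights and 1-5 scores are invented
# for illustration; they are not the team's actual numbers.

weights = {"precedence": 0.4, "affordability": 0.35, "accuracy": 0.25}

options = {
    "DGPS":          {"precedence": 4, "affordability": 3, "accuracy": 5},
    "local beacons": {"precedence": 2, "affordability": 4, "accuracy": 4},
}

def weighted_score(scores):
    """Sum each criterion's score times its weight."""
    return sum(weights[c] * s for c, s in scores.items())

totals = {name: round(weighted_score(s), 2) for name, s in options.items()}
print(totals)                       # -> {'DGPS': 3.9, 'local beacons': 3.2}
print(max(totals, key=totals.get))  # -> DGPS
```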
4.2.2 Positioning and Location System
The team chose a DGPS for the positioning and location system. Levi Dobson oversaw researching the product options and specifications. The details of the selected DGPS unit are shown in the system architecture section.
4.2.3 Obstacle Detection Network
Team 14 has decided to use LIDAR units as the primary means of obstacle detection. In addition,
further research will be done next semester to determine the sufficiency of these units. Peter Jung
primarily researched the libraries available for running the LIDAR and camera. The hope is to
also use an open source program to run a camera detection algorithm.
4.2.4 Motor Drive and System Actuator
For the motor driving unit, Team 14 chose to use an Arduino to actuate the system. Peter Ye
used his experience with Arduino programming and looked up the necessary resources to make
the motors run using sensory input.
4.2.5 Vehicle Motors
Team 14 has settled on using the 12-volt motors that were used by the RoboSnow senior design team from 2015-2016 to run the vehicle. Josiah Markvluwer was responsible for researching these motors from the RoboSnow team's documents.
4.2.6 Steering System and Frame
For the chassis and steering system, Josiah Markvluwer was responsible for reworking the old RoboSnow vehicle into something suitable for the system.
4.3 Hardware purchasing and installation
At this point in the project, most of the major parts of the system have been purchased and have already arrived or are currently being fabricated. This section describes some of the purchases and donations. It also discusses installing these components and testing each part of the system individually when possible.
4.3.1 Sensor Units- LIDAR, Camera, Ultrasonic
Peter Jung and Levi Dobson have already begun working on the LIDAR sensor unit, which Team 14 did not need to purchase because it was graciously donated by Mr. Paul Vander Kuyl. The hardware provided includes a touch screen, a keyboard, and a Raspberry Pi 3B with a camera attached. The hardware was installed, and the Raspberry Pi can currently store data that reveals the angle and distance of objects from six meters away. The LIDAR unit is described in detail in the system architecture section.
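Angle-and-distance readings of the kind described can be converted into x/y coordinates relative to the vehicle for obstacle mapping. The sample readings and axis convention below are made up for illustration.

```python
import math

# Convert hypothetical LIDAR (angle, distance) readings into x/y points
# relative to the vehicle. Sample readings are invented for illustration;
# the convention assumed here is x forward, y to the left.

def to_cartesian(angle_deg, distance_m):
    """Polar (degrees, meters) -> Cartesian (x, y) in meters."""
    rad = math.radians(angle_deg)
    return (distance_m * math.cos(rad), distance_m * math.sin(rad))

readings = [(0, 2.0), (90, 1.5), (180, 4.0)]   # (degrees, meters)
for angle, dist in readings:
    x, y = to_cartesian(angle, dist)
    print(round(x, 2), round(y, 2))   # e.g. 2.0 0.0 for the first reading
```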
4.3.2 Location System- DGPS
The differential global positioning system development kit has been purchased from U-blox. The unit is the NEO-M8P, which includes a rover and base station along with a board that has UART serial communication and USB plugins. The system, as described by U-blox, is detailed in the system architecture section.
4.3.3 Vehicle Frame and Motors
Currently, Team 14 is working with materials from last year's RoboSnow team plow in order to avoid incurring extra costs, since vehicle specifications are not strictly defined for this project.
4.3.4 Processors / Single Board Computers
The team is using a Raspberry Pi for maximum versatility in processing sensor and location data. This choice was made early on in team discussions, since the Raspberry Pi was found to be very adaptable and is generally used for prototyping projects that are not going to be mass produced (in the case of mass production, the team would most likely choose a microchip).
4.4 Software Development
4.4.1 LIDAR unit sensors
Peter Jung has worked with the LIDAR unit software packages and has a ROS system in place that takes the raw data and displays it in a readable form.
4.4.2 Arduino Motor Drive
Peter Ye has written code that gets the motor drive to work with both an ultrasonic sensor and a joystick potentiometer. Although it is unlikely that Team 14 will end up using the sensors in this manner, given the team's inclination to work with the donated LIDAR unit, the experience shows that the code is easily attainable and readily available.
4.4.3 DGPS
The differential GPS was purchased this past week, as mentioned before, so the team has not yet received it to test and install. The supporting documents from U-blox's website show that the product is highly accurate; they are discussed in the software section of the system architecture.
4.4.4 Camera Detection
Camera detection would add an extra layer of redundancy to sensing and would hopefully serve to make the design very safe.
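One way to picture that redundancy is a simple OR-style fusion, where either sensing path alone is enough to trigger a stop. This is an illustrative sketch under that assumption, not the team's planned algorithm.

```python
# Illustrative OR-style redundancy: either the LIDAR path or the camera
# path alone triggers a stop. Not the team's actual fusion algorithm.

def should_stop(lidar_sees_obstacle, camera_sees_stop_sign):
    """Stop if either independent sensing path raises a flag."""
    return lidar_sees_obstacle or camera_sees_stop_sign

print(should_stop(False, True))   # -> True  (camera alone triggers the stop)
print(should_stop(False, False))  # -> False
```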
4.5 System Integration
System integration is the biggest challenge facing the group in the second semester. Team 14 has many parts to integrate, since the team plans on being done with partial testing soon.
4.6 Website Development
Website development was assigned to Levi Dobson. Some requirements were outlined and met in order to formalize the page, and documentation and details are being added to the site as needed.
4.7 Presentation Work
The first presentation was given by Josiah Markvluwer in order to introduce the project to the Engineering 339 class. The second presentation was given by Peter Ye and Levi Dobson at the conclusion of the semester to show the progress of the design and its feasibility based on that progress.
4.8 Planning meetings
Meetings were regularly held every Sunday night at 7 P.M. to determine what was due and to schedule any special meetings needed to complete group work. In addition, brief meetings were regularly held just after senior design class concluded on Mondays, Wednesdays, and Fridays at 3:30. These meetings were typically informal and involved discussion of the current task at hand.
4.9 Budgeting and Purchasing
Peter Ye has handled a lot of the purchasing of the units through Bob DeKracker. In addition,
team Auto-bot has sought out donations due to the expensive nature of the project. Josiah
Markvluwer has coordinated discussions with Mr. Bill Haverkamp, an advancement coordinator
at Calvin College, for possible funding. Mr. Glen DeVos, a supervisor for Delphi’s self-driving
car project, is graciously funding much of the Auto-bot project.
4.10 System Planning/Mapping
The general system and block diagram planning were done early in the semester, during the first few meetings, in order to determine in general terms the major components of the system. As more research was done, the system UML diagrams were updated to reflect the refined understanding of the system.
5. System Architecture
Team 14 has divided the Auto-bot vehicle system architecture into four major components: the sensor network system, the location and navigation system, the motor drive and actuator system, and the mechanical motor and chassis system. At the top level, the sensor network and the location and navigation system both feed information to the motor drive and actuator system, which uses signals from a speed controller to drive the system. This top-level perspective describes each unit in terms of its goals. Before discussing the system as a whole, the components, their functions, and their communication methods are covered in the next few sections. Section 5.4, Interconnections, provides a more in-depth look that combines details from 5.1, 5.2, and 5.3. For simplicity, the team has divided the explanation of components into hardware and software.
5.1 Hardware
DGPS:
As discussed in Design Decisions (Section 6.2), Team 14 has selected the C94-M8P application board as a means of positioning. This product from U-blox is made specifically to provide “high accuracy solutions for RTK (real-time kinematics) for professional prototyping.” This is accomplished with Base and Rover station functionality, which U-blox explains with the diagram below.
Figure 3: DGPS Diagram
Figure 4: NEO-M8P Module Diagram
This product comes with two NEO-M8P chips. The total package contents of this system are
shown in the table below.
Table 1: DGPS Unit C94-M8P – Hardware Pieces
Hardware Content | Purpose
2 application boards (both with NEO-M8P-2) | Contain the chip that gathers NMEA-standard GPS information and the ports needed for communication
2 external UHF antennas | Communicate between the “Rover” station and the “Base” station to reduce error in positioning
2 external active GNSS antennas | Communicate with the satellites to give location data
2 antenna ground planes | Improve antenna signal reception
2 micro-USB cables | Used for power and to transfer GNSS (global navigation satellite system) data
In addition to the hardware needed to collect data, team Auto-bot is provided with several ways to interface with the device, as shown in the table below.
Table 2: DGPS Unit C94-M8P – Interfaces
Interface | Purpose
RS232 | Serial connection for configuration and data output to a host device
USB | Supplies power and carries configuration and GNSS data
UART | Serial communication between the application boards
Antenna connectors | Attach the GNSS and UHF antennas
This unit has been ordered but will not arrive until about December 22nd. This means that knowledge of the unit so far comes from studying the manufacturer's sheets, and Team 14 will have to devote time over the semester break and interim to installing and testing it. This is reflected in the Gantt chart from Sections 2 and 4. The hardware is well documented, and Team 14 has a good idea of the functions it will need from the DGPS unit and can expect to implement them with certainty. These include implementing what U-blox refers to as its patented RTK (real-time kinematic) technology in order to produce highly accurate data in the form of NMEA coordinates. In addition, the hope is to test the functionality of installing a virtual geo-fence in order to operate the vehicle only in known environments. This geo-fencing is discussed in the software section. There is much testing to be done with the DGPS unit, which will begin as soon as the unit is received over Christmas break.
LIDAR:
The second known part of the system comes from the sensor network. LIDAR is a great tool that is gaining relevance in the field of autonomous vehicle driving. The basic operational diagram of a LIDAR unit is shown in the figure below.
Figure 5: LIDAR Operation Diagram [8]
The LIDAR unit the team is using was donated by Mr. Paul Vander Kuyl. It is a 5 Hz, 360-degree, two-dimensional RoboPeak LIDAR. It uses serial communication at a baud rate of 115200 and has a sampling frequency of 2 kHz with a range of up to 6 meters. The angular range is 0-360 degrees with a clockwise rotation, and the resolution is about 0.5 mm. The frequency of the scanner can be easily adjusted using PWM to the motor. The 6-meter range gives plenty of time to stop the vehicle in the event that the laser scanner sees an object in front of the vehicle [11].
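The claim that 6 meters leaves enough time to stop can be sanity-checked with a rough stopping-margin calculation. Only the 6 m range and 5 Hz scan rate come from the specifications above; the speed and deceleration values below are illustrative assumptions, not measured Auto-bot parameters.

```python
# Rough stopping-margin check for the 6 m LIDAR range.
# Speed and deceleration are assumed values for illustration.

def stopping_margin(detection_range_m, speed_mps, scan_period_s, decel_mps2):
    """Return the distance (m) left between vehicle and obstacle after one
    full scan period of latency plus a constant-deceleration stop."""
    latency_travel = speed_mps * scan_period_s       # distance covered before the scan reports the obstacle
    braking = speed_mps ** 2 / (2.0 * decel_mps2)    # kinematic stopping distance v^2 / (2a)
    return detection_range_m - latency_travel - braking

# Example: 3 m/s (~6.7 mph), 5 Hz scanner (0.2 s period), 2 m/s^2 braking
margin = stopping_margin(6.0, 3.0, 0.2, 2.0)
```

With these assumed numbers the vehicle would still have roughly 3 m of margin, which supports the claim that 6 m is ample at low campus speeds.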
Motor Drive:
The current motor drive system uses a Victor SP speed controller from last year's RoboSnow design team. Team 14 has decided to use an Arduino that will take inputs and use code to send commands to the speed controller that runs the motors. The speed controller is shown below along with the Arduino as it is hooked up and running the motors.
Figure 6: Motor Drive Components
Camera:
Team 14 has not yet decided which camera unit to use to assist with detection, but the team does have a few cameras and cables to attach them to the Raspberry Pi units. The constraint is that the project needs cameras with a low enough resolution to avoid overloading the processors with too much data.
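A quick back-of-envelope calculation shows why resolution matters here. The frame sizes and frame rates below are generic examples, not settings chosen for the Auto-bot cameras:

```python
# Back-of-envelope raw video data rates, illustrating why lower resolution
# eases the processing load. Numbers are generic, not measured.

def data_rate_mb_per_s(width, height, fps, bytes_per_pixel=3):
    """Raw (uncompressed) RGB video data rate in megabytes per second."""
    return width * height * bytes_per_pixel * fps / 1e6

low_res = data_rate_mb_per_s(320, 240, 15)     # small frame, modest frame rate
high_res = data_rate_mb_per_s(1920, 1080, 30)  # full HD at 30 fps
```

Even before any detection work, full-HD raw video is roughly fifty times the data of a small low-rate stream, which is why a low-resolution camera is the safer fit for a Raspberry Pi.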
Ultrasonic Sensor:
In addition to the camera and LIDAR, there is the possibility of adding ultrasonic sensors as an inexpensive redundant safeguard, which will be explored further next semester. Ultrasonic sensors emit a high-frequency sound and listen for the time it takes an echo to return in order to determine distance. Ultrasonic sensors are relatively cheap, and the team has already run a test with the motor drive unit in which ultrasonic sensors were used to change the speed of the motor at various distances. Since the Auto-bot team had the luxury of receiving the donation of a LIDAR unit early in the semester, the majority of effort was put into making that unit work. Adding the ultrasonic unit is something Team 14 will consider if it is determined that the sensor network cannot operate as expected without one [9].
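The echo-time conversion these sensors rely on is the same one used in the team's Arduino test code (Appendix 11.1.1): sound travels about 29.1 microseconds per centimeter one way, and the measured echo time covers the round trip. A minimal Python sketch of that conversion:

```python
# Echo-time-to-distance conversion for an ultrasonic sensor, mirroring the
# (duration/2)/29.1 formula in the Arduino code. 29.1 us/cm corresponds to
# the speed of sound (~343 m/s); the echo time is a round trip, so halve it.

def echo_to_distance_cm(echo_us):
    """Convert a round-trip echo time in microseconds to distance in cm."""
    return (echo_us / 2.0) / 29.1

# An object 50 cm away returns an echo after about 2 * 50 * 29.1 = 2910 us
d = echo_to_distance_cm(2910)
```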
5.2 Software
LIDAR
The LIDAR unit code was procured by Peter Jung and Levi Dobson, who used the ROS libraries to implement a module known as RoboPeak LIDAR, which performs real-time mapping showing obstacles as they appear. In addition, code was written or obtained to get the data from the LIDAR as a list. With the help of the LIDAR's donating party, Team 14 is already working on code to interpret the data list to get meaningful results about whether the system is facing any potential obstacles it must stop for. Much open-source code was discovered from several references, including [12], [13], [14], [15], [16] and [17]. This code is shown in Appendix 11.1.2.
Motor Drive
The motor driving code was written in the Arduino programming language, a set of C/C++ functions that can be called by the unit. The Arduino can drive the motors at variable speeds. The code written so far controls the motor based on a potentiometer input and an ultrasonic sensor that stops or slows the motors as obstacles come closer to the sensor. The code is shown in Appendix 11.1.1.
DGPS
The DGPS unit software is mostly an already-written package for Team 14. The software does, however, come with the possibility of implementing a geo-fencing system. This may be used by the team to start a map that defines where the vehicle can and should drive to remain on the correct parts of campus.
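As an illustration of how a geo-fence check might look on the Raspberry Pi side, the sketch below uses a standard ray-casting point-in-polygon test. The polygon coordinates are invented placeholders, not the actual Knollcrest loop boundary, and this is a hand-rolled check, not the U-blox geo-fencing feature itself.

```python
# Minimal geo-fence sketch: treat the allowed driving area as a polygon of
# (lat, lon) vertices and test whether a DGPS fix falls inside it using
# ray casting. Coordinates here are made-up placeholders.

def inside_fence(point, polygon):
    """Ray-casting point-in-polygon test; point and vertices are (lat, lon)."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the horizontal ray cast from the point?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

fence = [(42.928, -85.588), (42.932, -85.588), (42.932, -85.582), (42.928, -85.582)]
ok = inside_fence((42.930, -85.585), fence)    # a fix inside the rectangle
bad = inside_fence((42.940, -85.585), fence)   # a fix outside it
```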
5.3 Motor & Chassis
The motor and chassis are mostly a rework of what the RoboSnow team developed last year. Two coupled motors drive the wheels, and the wheels are a combination of types. The chassis has been stripped down to the frame so that testing can begin. The Auto-bot will also have a gearing system sized for a top speed of 25 miles per hour. The Auto-bot shell will be designed for optimal functionality of the implemented sensors. The structure also needs enough space for a power supply that can drive the vehicle completely around campus. The purpose of going with a simple design is so testing can begin over interim instead of worrying about making the vehicle a complete and final product. It will be Josiah Markvluwer's job to determine whether more needs to be done in developing the frame. The current frame of the system is shown below.
5.4 Interconnections
With all these pieces in mind, Team 14 has a good idea of how to implement the system and how each component will talk to the next. A more complex diagram is shown below.
The DGPS unit will use a USB connection to the Raspberry Pi unit onboard the vehicle and will share NMEA-protocol data with the Raspberry Pi 3 B. The Raspberry Pi will then relay the interpreted data to the Arduino. The LIDAR will likewise use the Raspberry Pi to get data out to the Arduino; this data is already being retrieved through the ROS library and the built-in functionality of the LIDAR unit.
6. Design
6.1 Design Criteria
Automated vehicles typically require immense investments of capital and time that exceed those of a capstone undergraduate engineering project. The Auto-bot project focuses on making a proof-of-concept version, which means special considerations must be made on factors like price, scale, and quality. Finding system components at low prices relies on the ability to define a proper scope by limiting the needs of the implementation. For example, industry camera-detection algorithms are very complex and could be a senior design project unto themselves, so the team is avoiding using them as a primary data input.
6.1.1 Mapping and Navigation
A few very important criteria are directing many of the choices in this project. While keeping prices low to fit within the scope is important, Team 14 must also consider the accuracy of the system. Accurate direction for the vehicle was the top priority in component selection. This is because, with a high-precision input of where the vehicle is on a predefined course (like the Calvin loop, for example), the user can tell the vehicle exactly where the road is supposed to be. This means that team Auto-bot can create a low-cost alternative by knowing the environment precisely and telling the vehicle exactly where it should be on the road. The team must therefore program a route beforehand so that the vehicle can run it with minimal error.
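The pre-programmed-route idea can be sketched as a list of waypoints that the vehicle works through, advancing when the DGPS fix comes within a tolerance of the current target. The coordinates, tolerance, and flat-earth distance approximation below are illustrative assumptions, not the team's actual route data:

```python
# Waypoint-following sketch: store the route as (lat, lon) waypoints and
# return the next one not yet reached. Coordinates are invented examples.
import math

def flat_distance_m(p1, p2):
    """Approximate meters between two nearby (lat, lon) points using an
    equirectangular projection, which is adequate at campus scale."""
    lat1, lon1 = map(math.radians, p1)
    lat2, lon2 = map(math.radians, p2)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2.0)
    y = lat2 - lat1
    return 6371000.0 * math.hypot(x, y)   # Earth's mean radius in meters

def next_waypoint(position, route, reached_m=2.0):
    """Return the first waypoint farther than reached_m from the vehicle."""
    for wp in route:
        if flat_distance_m(position, wp) > reached_m:
            return wp
    return None  # route complete

route = [(42.9300, -85.5880), (42.9305, -85.5880), (42.9305, -85.5875)]
target = next_waypoint((42.9300, -85.58801), route)  # first waypoint already reached
```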
6.1.2 Obstacle avoidance and detection
The previous solution makes no mention of moving obstacles like people, so the vehicle must include auxiliary sensors that will detect such cases and stop the vehicle until safe operation can resume. The possible auxiliary sensors the team is discussing are LIDAR, ultrasonic, and a camera with color-detection algorithms. Since cost is a major consideration, the team is first working with the LIDAR and camera units because they were gifted to the team. In addition to these, the team was also able to salvage motors, wheels, a motor relay module, and some Raspberry Pi boards that will be used at least with the original prototype. The reasoning behind working with the donated units is that the primary focus of the project is the automation of a vehicle with sensors; the project outcomes are not concerned with the speed of execution of the route or the size of the prototype. In the interest of devoting time and resources to integrating the various components, the Auto-bot team has therefore decided to use easily attained parts.
6.1.3 Cost constraints
The first major item in the design specifications was to define the minimal components needed to get the vehicle functioning and making decisions on its own. Since making an automated vehicle can be extremely complex and very expensive, these two factors were the major restrictions. Another important factor is the availability of support for the products, since these components are naturally quite expensive and one product failure could leave the system inoperable and over budget.
6.1.4 Design Norms
In addition to all these technical considerations, it is important to acknowledge the design norms that are major factors in the decisions made. Two design norms that stand out for a complex system are transparency and caring. In designing this car, it is important to work together to solve problems and support one another, developing a level of care for each other. When working with companies and outside sources, the same care and cooperation must be extended. The limitations of the system being created and of the Auto-bot itself must be made clear in case future design teams expand upon the system's capability. Users also need to understand that this is an attempt to replicate some of the successes of a self-driving car on a very small scale and will not have safety considerations comparable to those of automotive companies. Transparency in the operation of any automated project is very important because the user needs to understand the decision-making process of the automated vehicle in order to determine how much trust can be placed in the product. Safe testing environments are important for prototyping this product, since it is a moving vehicle with no human operator to stop it immediately.
6.2 Design Alternatives and Decisions
The design alternatives and decisions were made using decision matrices. Some of the important criteria that were considered for each design were price, reliability, and ease of use.
6.2.1 GPS Module
Table 3: Positional System Decision Matrix
Positioning, and understanding where the vehicle is at all times, is vital to directing where the car should go. There are many ways to implement positioning systems. One major objective of this project is to keep the car on the right side of the road and operating safely. Top weighting priority was placed on the Price and Accuracy sections, since the team needs a cost-effective way to determine exactly where the vehicle is. A medium weight is placed on the Reliability of the system, since Team 14 is planning on operating the system only in predictable situations with clear weather and signals. Lower weights are placed on the variety of products and implementation methods, since marketplace options are a nice way to update the system but are not extremely important when the design objective is to make the vehicle functional. The weight on Ease of Use refers to whether the system is very complex to implement. This was necessary to include especially for the camera-detection method, because the complexity of detection algorithms could not be completed in a timely manner and would hinder progress in the actual integration of the system.
Table 4: Decision Matrix for DGPS
Using Table 4, Team 14 was able to pick out the best DGPS on the market. The team chose the U-blox DGPS because it had the best accuracy and ease of use while also having the best price on the market.
6.2.2 LIDAR and Sensors
Obstacle detection is another part of the Auto-bot that is very important to the project. The vehicle needs to be able to stop when it detects possible obstacles in its path. For this part of the project, instead of choosing one sensor, team Auto-bot is implementing multiple units in order to check different cases of possible obstacles. The car design will implement a LIDAR unit that was donated by Mr. Vander Kuyl. It will also implement a camera unit that will use very basic pattern-detection algorithms in order to look for stop signs. Other options still to be determined include the possible use of ultrasonic sensors and IR (infrared) sensors to cover the cases that LIDAR misses.
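As a hedged illustration of the "very basic pattern detection" mentioned above, the sketch below flags a frame as a possible stop sign when enough of its pixels are strongly red. The thresholds are placeholders, a real detector would also check the octagonal shape, and this is not the team's implemented algorithm; it works on plain nested lists so it needs no imaging library.

```python
# Crude color-based stop-sign flag: count the fraction of strongly red
# pixels in an RGB frame. Thresholds are illustrative placeholders.

def red_fraction(frame):
    """frame: list of rows of (r, g, b) tuples with 0-255 channels."""
    red = total = 0
    for row in frame:
        for r, g, b in row:
            total += 1
            if r > 150 and g < 80 and b < 80:  # crude "stop-sign red"
                red += 1
    return red / float(total)

def maybe_stop_sign(frame, threshold=0.10):
    """Flag the frame if more than `threshold` of its pixels look red."""
    return red_fraction(frame) > threshold

# A toy 2x4 frame: half red pixels, half gray background
frame = [[(200, 30, 30), (200, 30, 30), (120, 120, 120), (120, 120, 120)],
         [(200, 30, 30), (200, 30, 30), (120, 120, 120), (120, 120, 120)]]
flag = maybe_stop_sign(frame)
```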
6.2.3 Processing Unit
The tentative processing-unit decision is to use two Raspberry Pi 3 Model B units to process the data and an Arduino board to drive the motors. Further research will be done on the processing needs, and the decisions will be updated once module and unit testing begins. The speed-control unit was given to the team along with the motor and wheels, so it is known to be appropriate for driving the vehicle.
6.2.4 Motor Control
Steering and motor implementation were another component considered in order to actuate the designed system. The decision was to use a front-drive system with two casters in the back. This is optimal for a tighter turning radius and improves safety in avoiding vehicles and obstacles. The cost of implementing this system is also significantly less than a rack and pinion. These were the options compatible with the system that was donated to the team. Since the size of the motors and wheels was not significant to fulfilling the project goals (other than having them to test the system), the team accepted donated parts from a past senior design team.
7. Integration, Test, Debug
Testing of the system will begin with component testing to determine the capabilities of the system. The components will be wired together and connected to a processing unit that communicates the software directive to the motors that actuate the vehicle. Integrated testing will take place as needed to determine how these systems work together. It is essential, for example, that stopping for obstacles has a greater priority than moving forward.
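The priority rule that stopping outranks driving can be captured in a few lines of arbitration logic. The PWM values mirror the Arduino convention in Appendix 11.1.1 (185 = neutral), but the arbitration function itself is an illustrative assumption, not the team's implemented code:

```python
# Sketch of "stopping outranks moving": the obstacle check always gets
# the final say over the navigation command. PWM convention (185 = neutral)
# follows the Arduino code in Appendix 11.1.1; the logic is illustrative.

NEUTRAL_PWM = 185

def arbitrate(nav_pwm, obstacle_detected):
    """Return the motor command: navigation unless an obstacle is seen."""
    if obstacle_detected:
        return NEUTRAL_PWM  # stop overrides any drive command
    return nav_pwm

cmd_clear = arbitrate(220, False)  # path clear: drive as navigation requests
cmd_block = arbitrate(220, True)   # obstacle: force neutral/stop
```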
The LIDAR unit will be tested for its range and the different obstacles it can detect. Since the LIDAR uses a laser and relies on reflection, unreflective objects might be hard to detect. However, other distance detectors will be added to the vehicle so that it detects things the LIDAR might not pick up.
Team 14 will need to concentrate heavily on calibration of the DGPS to make sure the most accurate measurements of position are obtained. Accurate measurements will ensure that the car stays in the right lane and knows to look for stop signs when it nears crossroads. The DGPS should also inform the car about its pathing and destination so that it knows its route.
8. Conclusion
The major component that made the project feasible was the differential GPS unit. After looking at options within the project budget, it was determined that the NEO-M8P-2 from U-blox would be the high-precision GNSS (Global Navigation Satellite System) receiver selected to drive the system. Additional units to direct car motion will be a LIDAR unit to detect distances from obstacles and a camera that notices key pieces of the environment in order to provide a critical stopping warning.
After much preliminary research into the needed components of the system, the proposed project has been deemed feasible. The major challenge unique to this project is the number of modules that must be connected to the vehicle system in order to provide a complete project.
Due to the complex and expensive nature of making autonomous vehicles, simplifying the vehicle to drive in limited environments was necessary. For example, the project's goal is for the vehicle to avoid accidents while maintaining a course. To set appropriate goals, it was necessary to limit the scope of the project to slow environments, where stopping and waiting for an obstacle to clear (which is not feasible for autonomous vehicles on high-speed roads) can be an appropriate response.
The project is inspired by the latest major trend in the automotive industry, autonomy, and has several design benefits and goals. It will primarily serve as a proof-of-concept project to introduce the team to a topic on which a career can be built. The project is also meant to demonstrate that technology has advanced to the point where even a small project with a small budget can make a car direct itself with caution.
By its nature, this project can have many components added to it to help it make more accurate decisions, and upgrading the detection code and sensors is a great goal for the future of this project in order to improve the safety and usefulness of the vehicle.
9. Acknowledgements
Professor Michmerhuizen……………………….Main Project Advisor
Mr. Glen DeVos…………………………………Project Funding Patron
Professor Tubergen………………………………Vehicle Mechanics Consultant
Mr. Bill Haverkamp……………………………...Project Funding Consultant
Mrs. Michelle Krul………………………………Senior Design Administrative Coordinator
Mr. Bob DeKraker………………………………Parts Ordering Consultant
Mr. Eric Walstra…………………………………Industrial Advisor
Mr. Paul Vander Kuyl.……………………….....LIDAR donor
Mr. Matthew Budde…………………………….LIDAR operation advisor
10. References
[1] https://www.bloomberg.com/news/articles/2015-01-08/driverless-car-global-market-seen-reaching-42-billion-by-2025
[2] http://www.businessinsider.com/report-10-million-self-driving-cars-will-be-on-the-road-by-2020-2015-5-6
[3] https://www.cbinsights.com/blog/autonomous-driverless-vehicles-corporations-list/
[4] https://www.technologyreview.com/s/520431/driverless-cars-are-further-away-than-you-think/
[5] http://www.trimble.com/gps_tutorial/dgps.aspx
[6] https://en.wikipedia.org/wiki/Differential_GPS
[7] https://app.asana.com/0/194027505107134/list
[8] http://www.renishaw.com/en/optical-encoders-and-lidar-scanning--39244
[9] https://www.bananarobotics.com/shop/HC-SR04-Ultrasonic-Distance-Sensor?gclid=Cj0KEQiAsrnCBRCTs7nqwrm6pcYBEiQAcQSznPpMZk7xx03GCAz4cgXkYuigeqAQdLTdoOJvfzNOJFgaAjKU8P8HAQ
[10] Asana Schedule https://app.asana.com/0/194027505107134/list
[11] https://roborescue.nl/index.php/RPLidar
[12] http://wiki.ros.org/rplidar
[13] https://github.com/robopeak/rplidar_ros
[14] https://github.com/robopeak/rplidar_ros/wiki
[15] http://www.slamtec.com/en/Lidar/A1
[16] http://web.pdx.edu/~jduh/courses/geog493f12/Week04.pdf
[17] http://www.nps.edu/Academics/Centers/RSC/documents/IntroductiontoLIDAR.pdf
11. Appendices
11.1 Software Code
11.1.1 Arduino Code for Motor Drive
//setting pins used
const int analogOutPin = 9; //PWM pin for motor control
#define trigPin 13 //ultrasonic sensor trig pin
#define echoPin 12 //ultrasonic sensor echo pin
// analog
const int pin_x = 0; //joystick x position pin
const int pin_y = 1; //joystick y position pin
const int pin_switch = 8; //joystick switch pin
int outputPWMValue = 185; // value output to the PWM (analog out), 185 as the neutral
void setup()
{
  pinMode(analogOutPin, OUTPUT);
  TCCR1B = TCCR1B & B11111000 | B00000011; // set timer 1 divisor to 64 for PWM frequency of 490.20
  // TCCR1B = TCCR1B & B11111000 | B00000100; // set timer 1 divisor to 256 for PWM frequency of 122.55
  //TCCR0B = TCCR0B & B11111000 | B00000100; // set timer 0 divisor to 256 for PWM frequency of 244.14
  // Serial.begin (9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  pinMode(pin_switch, INPUT);
}
void loop()
{
  int x = analogRead(pin_x);
  // int y = analogRead(pin_y);
  // int b = digitalRead(pin_switch);
  long duration, distance;
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  duration = pulseIn(echoPin, HIGH);
  distance = (duration/2) / 29.1; //get distance in cm
  if (distance >= 100 || distance <= 0){
    // Serial.println("Out of range");
    if (x > 900) {
      outputPWMValue = 250;
      analogWrite(analogOutPin, outputPWMValue);
      // delay(500);
    }
    else if (x > 600) {
      outputPWMValue = 220;
      analogWrite(analogOutPin, outputPWMValue);
      // delay(500);
    }
    else if (x > 550) {
      outputPWMValue = 200;
      analogWrite(analogOutPin, outputPWMValue);
      // delay(500);
    }
    else if (x > 500) {
      outputPWMValue = 185;
      analogWrite(analogOutPin, outputPWMValue);
      // delay(500);
    }
    else if (x > 300) {
      outputPWMValue = 160;
      analogWrite(analogOutPin, outputPWMValue);
      // delay(500);
    }
    else if (x > 100) {
      outputPWMValue = 140;
      analogWrite(analogOutPin, outputPWMValue);
      // delay(500);
    }
    else if (x >= 0) {
      outputPWMValue = 120;
      analogWrite(analogOutPin, outputPWMValue);
      // delay(500);
    }
  }
  else if (distance >= 50 ){
    if (outputPWMValue > 205) {
      outputPWMValue = outputPWMValue - 20;
      analogWrite(analogOutPin, outputPWMValue);
      // delay(500);
    }
    else if (outputPWMValue < 165) {
      outputPWMValue = outputPWMValue + 20;
      analogWrite(analogOutPin, outputPWMValue);
      // delay(500);
    }
  }
  else {
    outputPWMValue = 185;
    analogWrite(analogOutPin, outputPWMValue);
    // delay(500);
  }
  // Serial.print(distance);
  // Serial.println(" cm");
}
11.1.2 Code for LIDAR
# Sample RPLiDAR test code for Levi Dobson, for gathering of data
# December 5, 2016
# Matthew Budde
import rospy
import roslib
from std_msgs.msg import String
import sensor_msgs.msg

# Set up a threshold at which to start avoidance maneuvers
global THRESHOLD
THRESHOLD = .05

# Callback function for a LiDAR scan
def callback(data):
    # Pull all data from the sensor message into local variables
    startAngle = data.angle_min
    endAngle = data.angle_max
    angleInc = data.angle_increment
    ranges = data.ranges
    intensities = data.intensities
    # Look for any points in the front half of the scan
    # "Front half" will depend on how the LiDAR is oriented
    # This is assuming the LiDAR is oriented with the 0-point at one side of the vehicle
    for i in range( len( ranges ) // 2 ):
        if( ranges[i] < THRESHOLD ):
            currentAngle = startAngle + i * angleInc
            # Turn the other way or something

# Set up the node and subscribe to the output of the LiDAR
def listener():
    rospy.init_node( 'listener', anonymous = True )
    rospy.Subscriber( "/scan", sensor_msgs.msg.LaserScan, callback )
    rospy.spin() # Wait for data

# Run listener if this is the main program
if __name__ == '__main__':
    listener()
11.2 Datasheets
11.2.1 Victor SP Speed Controller
11.2.2 LIDAR Data Sheets
11.3 Gantt Chart