Student Competitions: 2016 & 2018 NSF Student CPS Challenges. Retrospective and the Road Ahead
Presenter: Jnaneshwar Das & Ani Hsieh July 18, 2018
Robot left behind in 2016 NSF Student CPS Challenge
The telemetry link to the UAV was lost, and it crashed in the desert.
Multiple attempts to locate the UAV failed due to inaccessible and dangerous (rattlesnakes) terrain.
The UAV was found a couple of days later, across a fence, by a search team properly dressed for the task.
It was found in the vicinity of where expected, based on the final telemetry logs on the ground control station before the connection was lost.
https://cps-vo.org/group/CPSchallenge
UAV competitions
High barrier to entry
Legacy code and knowledge
OpenUAV testbed
- Open-source
- UAV swarms running the PX4 flight stack
- Seamless integration with hardware
Desired outcomes
Debug autonomy software in simulation rigorously
Help improve tools through open-source contributions
Enable teams to collaborate and have fun while learning.
Reusability
Final teams
1. Embry-Riddle Aeronautical University
2. Halmstad University (Sweden)
3. University of Pennsylvania
4. Vanderbilt University
5. NCSU
Teams that signed up or showed interest in participating (9):
Georgia Tech, Texas A&M, GIK Institute of Engineering (Pakistan), Lamar University, Duke University, University of North Texas, High-school team from China, ASU, Cornell
video
UAV considerations
Intel Aero, ErleCopter, UPenn F450
• PX4/DroneCore, ROS, OpenCV, TensorFlow
• MAVLink, position control, velocity control, waypoints, state estimation
• Intel NUC i5 Skylake, Hardkernel ODROID, NVIDIA Jetson TX2
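As an illustration of the position/velocity-control interface listed above, here is a minimal, simulation-only sketch of a P-controller that turns a waypoint into clamped velocity setpoints. This is illustrative Python, not any team's flight code; the gain and speed limit are made-up values.

```python
import math

def velocity_command(pos, waypoint, gain=0.8, v_max=3.0):
    """P-controller: velocity setpoint toward a waypoint, clamped to v_max (m/s)."""
    dx = [w - p for p, w in zip(pos, waypoint)]
    dist = math.sqrt(sum(d * d for d in dx))
    if dist < 1e-6:
        return [0.0, 0.0, 0.0]
    speed = min(gain * dist, v_max)           # slow down near the waypoint
    return [speed * d / dist for d in dx]

def fly_to(pos, waypoint, dt=0.1, tol=0.5, max_steps=2000):
    """Integrate the commanded velocity until within tol meters of the waypoint."""
    pos = list(pos)
    for _ in range(max_steps):
        if math.dist(pos, waypoint) < tol:
            break
        v = velocity_command(pos, waypoint)
        pos = [p + vi * dt for p, vi in zip(pos, v)]
    return pos
```

In a real stack the velocity setpoint would be sent over MAVLink (e.g., via offboard mode) rather than integrated locally.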
VO Simulation Architecture
Docker containers
OpenUAV server: Intel Xeon, 16×2 cores; 2× NVIDIA TITAN X (Pascal); 64 GB RAM; 1 TB SSD, 6 TB HDD
cps-vo.org
Internet
M. Schmittle, A. Lukina, L. Vacek, J. Das, C. P. Buskirk, S. Rees, J. Sztipanovits, R. Grosu, and V. Kumar, "OpenUAV: A UAV Testbed for the CPS and Robotics Community," in Proc. 2018 ACM/IEEE International Conference on Cyber-Physical Systems (ICCPS), Porto, Portugal (presented and demonstrated).
VO-based Swarm Testbed
2018 vs 2016
Compared to 2016, in 2018:
● We didn't pay for hardware
● We did not have to assemble, or help assemble, hardware
What we did:
● Announcements
● Webinars with a live OpenUAV demo
● Guidance with hardware
The ‘optimal stopping’ scoring rule worked well:
● If you fly too soon, you might crash your drone prematurely and lose the opportunity for a good run
● If you wait too long, you may not get enough chances before the wind picks up on the last day
● This policy produced better engagement than ‘best of three’ scored trials
Videos:
https://www.facebook.com/david.cicotte.16/videos/2190394534309180/ (ERAU test flight)
https://www.facebook.com/hogskolanihalmstad/videos/10160456736735302/ (Halmstad at TIMPA)
May 15-17, 2018, TIMPA Airfield, Marana, AZ
Competition info
Scoring was done as optimal stopping: each new attempt forfeits the previous score, and teams keep attempting until they are satisfied or the event is over (out of time).
● ERAU (E): 7 attempts, 80 points
● Halmstad (H): 8 attempts, 40 points
● Vandy (V): 4 attempts, 0 points
● Penn (P): 2 attempts, 0 points
Attempt sequence: (Day 1) E(10), E(40), E(40), E(60), (Day 2) H(0), E(30), H(40), E(30), V(0), H(0), E(80), V(0), H(0), P*(0), H(30), H(0), V(0), P(0), H(0), P(0), V(0), H(40)
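The scoring rule is simple to state in code: a team's final score is just its last attempt. A minimal sketch, replayed over the attempt sequence above (assumption: the asterisked P* attempt was an unscored practice run and is omitted, which reproduces the stated attempt counts):

```python
def optimal_stopping_scores(attempts):
    """Each new attempt forfeits the previous score, so a team's final score
    is its last attempt. Returns {team: (num_attempts, final_score)}."""
    results = {}
    for team, score in attempts:
        n, _ = results.get(team, (0, 0))
        results[team] = (n + 1, score)
    return results

# Day 1-2 attempt sequence from the event (P* practice attempt omitted):
sequence = [("E", 10), ("E", 40), ("E", 40), ("E", 60),
            ("H", 0), ("E", 30), ("H", 40), ("E", 30), ("V", 0), ("H", 0),
            ("E", 80), ("V", 0), ("H", 0), ("H", 30), ("H", 0), ("V", 0),
            ("P", 0), ("H", 0), ("P", 0), ("V", 0), ("H", 40)]
```

Replaying the sequence yields exactly the tallies above: E (7, 80), H (8, 40), V (4, 0), P (2, 0).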
ERAU
Simplicity, discipline, team experience, guidance (Prof. Brian Butka has experience in supervising teams)
Halmstad
- Example of a team new to drones that did remarkably well
- Well prepared, complex system, many points of failure (heat, shadow, servos).
- Robust to wind by virtue of power and weight
Vanderbilt
- Robust platform (Intel Aero), good recoveries, persistent team; not enough experience or time for testing before the competition, but they exploited the Day 0 trial opportunity
Penn
- Very few attempts
- A good example of what can go wrong with too much dependence on simulations
"Engineering students in second place in international drone competition" , Halmstad University News, June 7, 2018.
https://news.erau.edu/headlines/daytona-beach-students-dominate-nsfs-autonomous-aerial-vehicles-competition/
The challenge for the undergraduate teams was to use an unmanned autonomous quadrotor aircraft with a downward facing camera, and possibly other sensors, to scan an area for a lost aircraft and recover it safely back to base.
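A boustrophedon (lawnmower) sweep is the standard way to scan such an area with a downward-facing camera. A minimal sketch, assuming a flat rectangular search zone and a fixed camera swath; the function and parameter names are illustrative, not from any team's code:

```python
def lawnmower_waypoints(width, height, swath, altitude):
    """Boustrophedon sweep over a width x height rectangle (meters), with pass
    spacing equal to the camera swath; returns (x, y, z) waypoints."""
    waypoints, y, leftward = [], swath / 2.0, False
    while y < height:
        # Alternate sweep direction each pass to avoid dead transit legs.
        xs = (width, 0.0) if leftward else (0.0, width)
        waypoints.append((xs[0], y, altitude))
        waypoints.append((xs[1], y, altitude))
        y += swath
        leftward = not leftward
    return waypoints
```

For a 100 m × 40 m zone with a 10 m swath this produces four passes (eight waypoints), each flown at the given altitude.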
“Despite a number of equipment failures during the competition, the team overcame the problems and won the competition in a convincing manner,” said Brian Butka, Ph.D., associate professor of electrical and computer engineering and faculty advisor for the team.
Event log
5/14 6:00p Preliminary briefing to ERAU and Halmstad in the hotel lobby. Halmstad needed assistance with one of their Pixhawk 2 cables for the 915 MHz telemetry link (Europe uses 433 MHz).
5/15 (Day 0: assembly, preparation, calibration)
~8:00a Event briefing to teams at TIMPA:
1. Michael safety-checks each UAV; auto-to-manual takeover using RC as the failsafe
2. Optimal stopping scoring: a team keeps flying, forfeiting its previous score with each attempt, until it is satisfied or the event is over
AM: Penn's transmitter stick broke during transit due to poor padding; the break is where the bearing mounts for one of the stick axes
AM: Halmstad's servo broke (for the landing-leg-based gripper)
ERAU did safety check
5/16 (Day 1)
7:55 ERAU crash; they have spares
8:00 Chair shortage; JD signed the release form
8:09 Channel conflict on the 915 MHz link between Halmstad and Penn. Penn changed from channel 110 (still had intermittent conflicts on Day 2, fixed by a firmware upgrade of the 915 MHz modem)
8:11 Michael safety-checked Vandy's Aero, approved!
9:30 ERAU motor came apart mid-air; lost a motor circlip
9:40 Halmstad getting GPS reads, gearing up for the mission
9:58 Halmstad finished basic tests over the search zone
10:00 ERAU out to buy a motor (Brian, Caleb, and one more student; David stayed back). Hobby Lobby is a 30-min drive one way, close to UofA
11:15 Penn did a manual test
11:45 Penn looks ready; issues with PX4Flow. Recommended debugging using the QGC Analyze widget
11:56 Penn (Damian claims) PX4Flow is not working in the desert (sun too bright?)
12:00 Halmstad changed Pixhawks, since the GPS cable fixed to the old Pixhawk might have been faulty and they had position-control issues
12:10 Halmstad flew with the new Pixhawk; seems fine
12:20 DJI Phantom props fit the E310 motors that Penn gifted ERAU; David and I (JD) checked
12:45 Penn did its safety check
13:30 Vandy flew to collect image data for a lost-UAV template
13:43 ERAU doing tests with Penn motors and 3 legs, one leg duct-taped :)
ERAU duct-taping broken drone leg!
Vandy doing tests with Intel Aero
5/16 (Day 1) contd.
13:43 Halmstad doing tests
13:40 ERAU fixed drone; flight successful
14:02 ERAU test flight, target detection
14:20 ERAU scored a test flight, 10 points (false positive: detected a blue umbrella that matched the blue lost-UAV frame-template color)
- Restricting object detection to account for the camera FOV and UAV pitching, in addition to the search-zone lawnmower planning, could have avoided this false positive
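A minimal sketch of that gating idea: project the detection ray to flat ground using the UAV's altitude and camera tilt, then reject detections whose ground point falls outside the search zone. The geometry and parameter names are illustrative assumptions, not the teams' code:

```python
import math

def ground_point(uav_xy, altitude, heading, pitch, det_angle):
    """Project a detection ray to flat ground. pitch is camera tilt from nadir
    (rad); det_angle is the detection's angular offset within the FOV (rad)."""
    tilt = pitch + det_angle           # total off-nadir angle of the ray
    r = altitude * math.tan(tilt)      # horizontal range to the ground point
    return (uav_xy[0] + r * math.cos(heading),
            uav_xy[1] + r * math.sin(heading))

def in_search_zone(pt, zone):
    """zone = (xmin, ymin, xmax, ymax), an axis-aligned search rectangle."""
    return zone[0] <= pt[0] <= zone[2] and zone[1] <= pt[1] <= zone[3]
```

A pitched-forward UAV at 10 m altitude tilted 60 degrees off nadir looks roughly 17 m ahead; a detection there can land well outside a small search zone, which is exactly how an umbrella beyond the zone boundary gets picked up.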
5/17 (Day 2)
7:29 Vandy did detection tests
ERAU can lift the modified (lightened) UAV template
Halmstad did a test flight
Vandy broke a carbon-fiber Aero leg; fixed in two stages, first looping and securing it with a cable tie, then stabilizing it with epoxy that Michael loaned them
ERAU flipped their drone on the ground before takeoff; seemingly no damage
Halmstad tried a recovery attempt; it didn't seem to succeed
Penn's electro-permanent magnet was not making good contact with the colored center, so they are modifying the arms to add color by wrapping colored paper
Vandy had RC interference with Penn when Penn accidentally bound to Vandy's RC receiver
Halmstad's Pixhawk overheating due to the sealed UAV casing and direct sun
12:17p wind picking up
ERAU detection false-positive culprit
Vandy
Halmstad
ERAU
Penn
Event log
Pedagogy
● Learn by doing (and failing!), teamwork
● Support coursework on the VO
● Capstone and senior design projects, robotics clubs
● Different from AIAA, AUVSI, IARC
● A challenge, as opposed to a competition
● CPS research oriented
Source: from Facebook page of Halmstad University, showing their team autonomy brainstorming session.
Lessons: model complexity. The UAV's shadow produced a lost-UAV false positive
● Halmstad used a deep neural network (R-CNN) to detect the lost UAV frame directly, without any fiducials.
● In simulation they observed the drone detecting its own shadow as the lost drone
● They assumed they would run their tests early in the day, when the shadow was outside the field of view (FoV). In reality, most tests after 10:00a had the shadow in the FoV, especially in the bright desert sun
● On the last day, they attempted to correct the problem by modifying the (flying) drone's shadow, fixing spare propellers on the arms
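The shadow geometry is easy to predict: with a nadir-pointing camera, the UAV's shadow sits at an off-nadir angle of 90 degrees minus the sun's elevation, so it enters the field of view exactly when the sun climbs high enough. A small illustrative sketch (not Halmstad's code; flat ground assumed):

```python
import math

def shadow_offset(altitude, sun_elevation_deg, sun_azimuth_deg):
    """Ground offset (east, north), in meters, of the UAV's shadow from its
    nadir point, assuming flat ground."""
    r = altitude / math.tan(math.radians(sun_elevation_deg))
    az = math.radians(sun_azimuth_deg + 180.0)  # shadow falls away from the sun
    return (r * math.sin(az), r * math.cos(az))

def shadow_in_fov(sun_elevation_deg, half_fov_deg):
    """Nadir camera: the shadow's off-nadir angle is 90 deg minus the sun's
    elevation, so it is in view whenever the sun is high enough."""
    return (90.0 - sun_elevation_deg) < half_fov_deg
```

With a 60-degree camera FOV (30-degree half-angle), the shadow is out of view only while the sun is below 60 degrees elevation, which in a desert summer is roughly the early-morning window the team was counting on.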
How it helped
Halmstad (Sweden) started early, had a working simulation in March.
UAV detected own shadow in simulation
They switched gears to hardware testing after the first checkpoint, though they kept running simulations on their local machine.
Where it did not
Penn AiR was too dependent on simulations and did not do sufficient hardware functional tests
Lessons: Are simulations doomed to succeed?
Sustainability of student competitions
• Special resources
  • CPS-VO forum, design tools, OpenUAV simulation framework
• Support from companies
  • Sponsorship for cash awards
  • Shape the competition
  • Help with capability development (e.g., MathWorks for dockerized simulation access)
• Special recognition
  • Competing and winning
  • Templates: Platinum, Gold, Silver sponsors
• Award structure, usage, legalities, sponsorship (feedback from Penn AiR, ERAU)
• Transitionable results
  • To courses, research testbeds, broadly usable open-source tools, and tools for other organizations (ARL DCIST OpenUAV extensions)
The Intel Aero was robust at the competition and, in recent tests with summer undergraduate interns, easy to set up and get working.
However, precision flight and field usage require R&D before the platform can be used in UAV competitions. This is a potential collaboration opportunity with Intel, especially since the Aero is also being branded within their IoT initiative.
Inside the Aero are an Intel Atom CPU, an Altera MAX 10 FPGA, and a 32-bit microcontroller, powering two RGB cameras and an Intel RealSense RGB-D camera.
2019 Challenge
1. ‘No robot left behind’
2. Substantial prize money, split across winners who finish the mission fully autonomously
3. OpenUAV examples based on 2018 competition solutions
4. MATLAB support (subject to licensing terms)
2019
Logistics
● Arizona State University and University of Arizona
  ● TIMPA airfield
  ● Food, lodging
● Arizona State University and UPenn
  ● OpenUAV software
  ● Pre-competition activities through the VO
  ● Support with hardware
  ● Webinars, team checkpoints
● Vanderbilt University
  ● VO
  ● Support with travel and budgeting, certificates
● UCLA
  ● Support at competition, verification research on OpenUAV
Next steps: contact companies
1. Raytheon
2. Intel
3. Microsoft
4. DJI
5. Aerial Application
6. MathWorks
7. Boeing
Templates: Platinum, Gold, Silver sponsors
Award structure, usage, legalities, sponsorship (feedback from Penn AiR, ERAU)
Conclusions
Lowering the barrier to entry for student competitions
Students can carry out end-to-end simulations before actually building hardware
Enabling CPS education (and research)
Virginia Tech will host two teams in a student design competition setting in 2019 (through Prof. Pratap Tokekar and NAVAIR, http://www.navair.navy.mil/)
ARL DCIST collaboration on OpenUAV capabilities (adding complex scenes, ground robots)
Extensions (offshoot challenges)
Camera-trap deployment and recovery (sensor placement) for animal behavior modeling in Etosha National Park, Namibia, and Kruger National Park, South Africa