Mission Critical: Sensors


Description: A look at the sensors involved in robotics and unmanned systems.

Transcript of Mission Critical: Sensors

Page 1: Mission Critical: Sensors

Mission Critical • Winter 2012

VOLUME 2 NO. 4 • WINTER 2012 • AUVSI • 2700 South Quincy Street, Suite 400, Arlington, VA 22206, USA

Inside this issue:

MEMS go unmanned • Localizing with lidar • Talking to robots

Sensors show the way

Page 2: Mission Critical: Sensors

REGISTER NOW
auvsi.org/uspr

Not the Same Old Briefs…

• DARPA's Robotics Challenge
• Commercial Applications for UGVs
• Precision Agriculture
• ACC Perspectives on UAS Ops
• UMVs in Offshore Oil & Gas
• UMVs & COLREGS

12–14 Feb. 2013 • THE RITZ-CARLTON, TYSONS CORNER • McLEAN, Va., USA

Ground Day: Tuesday, 12 Feb. • Air Day: Wednesday, 13 Feb. • Maritime Day: Thursday, 14 Feb.

3 DAYS • 3 DOMAINS • ALL SYSTEMS

… WHERE BUSINESS HAPPENS

"AUVSI's Unmanned Systems Program Review continues to be an invaluable forum for understanding the nuances of the defense and commercial autonomous robotic market."

– Rob Hughes, Rockwell Collins

Speaker Lineup Includes…

• Maj. Gen. Charles Lyon, Director of Operations, HQ ACC, U.S. Air Force
• Mr. Steve Markofski, Corporate Planning, Yamaha Motor Corp., U.S.A.
• Dr. Kathryn D. Sullivan, Deputy Administrator and Acting Chief Scientist, NOAA
• Dr. Missy Cummings, Program Officer, AACUS, ONR, U.S. Navy
• Dr. Robert Ambrose, Division Chief, Software, Robotics and Simulation, NASA
• Dr. Karlin Toner, Director, JPDO, FAA

Page 3: Mission Critical: Sensors


4 Essential components: News from the sensors market

VOLUME 2 NO.4 • WINTER 2012

8 Tiny and everywhere: A look at the unmanned MEMS movement

14 Q&A: John Marion, director of persistent surveillance at Logos Technologies

On the cover: How a self-driving car sees the world. The circles recreate how a lidar would perform a 360-degree scan of the surroundings. The boxes are objects, and the green path is a map of the road ahead. For more on lidar technology, see the feature on Page 16. AUVSI image.

CONTENTS

22 State of the art: A look at the security cameras watching cities around the world

25 Pop culture corner: Sensor ideas imagined by "Star Trek" that became reality

Page 8

Page 4: Mission Critical: Sensors


Mission Critical is published four times a year as an official publication of the Association for Unmanned Vehicle Systems International. Contents of the articles are the sole opinions of the authors and do not necessarily express the policies or opinion of the publisher, editor, AUVSI or any entity of the U.S. government. Materials may not be reproduced without written permission. All advertising will be subject to publisher’s approval and advertisers will agree to indemnify and relieve publisher of loss or claims resulting from advertising contents. Annual subscription and back issue/reprint requests may be addressed to AUVSI.

16 Lost in space? How lidar ensures robots know more about their surroundings

26 Timeline: The sensors paving the way for self-driving cars

35 Market report: Pivot to Asia drives new sensors

39 Testing, testing: Mesh networking, robots setting up communications

41 Technology gap: ADS-B tests may help expedite UAS flights in public airspace

43 End users: IHMC's tongue sensor fills in for human sight

Page 16

Page 29

Talking to robots: Researchers look for novel ways to communicate with unmanned systems.

Page 5: Mission Critical: Sensors


Editorial

Vice President of Communications and Publications, Editor: Brett Davis
[email protected]

Managing Editor: Danielle Lucey
[email protected]

Contributing Writers: Ramon Lopez, David Rockwell

Advertising

Senior Advertising and Marketing Manager: Lisa Fick
[email protected]
+1 571 255 7779

A publication of AUVSI

President and CEO: Michael Toscano
Executive Vice President: Gretchen West

AUVSI Headquarters
2700 South Quincy Street, Suite 400
Arlington, VA 22206 USA
+1 703 845 9671
[email protected]

Sensors don't always get the credit they deserve. Without them, robots and unmanned systems would mostly be really expensive toys, incapable of detecting and moving about their surroundings. Sensors of various types enable all the smart and sophisticated motions and learning that will one day make robots as sophisticated as their human creators. In an effort to shine the spotlight on this often-overlooked sector of robotics, AUVSI dedicated this entire issue of Mission Critical to the topic.

Freelance writer, and former AUVSI editor, Ramon Lopez tackled how MEMS, or microelectromechanical, sensors are making their way into a multitude of smarter projects. The sensors themselves are notable because of their tiny size, producing astonishingly small products, like photovoltaic cells for collecting solar energy that are the size of a fleck of glitter. He also explores how the company Xsens is proliferating these micro-sensors into unmanned technology. That story is on Page 8.

I spoke with AUVSI member company Velodyne about its lidar, which aids robots like the Google self-driving cars by helping them detect the many moving objects in their surroundings. The company got its roots in the DARPA Grand Challenges, and now its product is featured on an endless list of large military ground vehicles. Leveraging this laser-based, radar-like technology enables object detection to within a centimeter of accuracy, and it could one day be featured on every car on the road. Read more about that on Page 16.

Brett Davis, editor of Mission Critical and Unmanned Systems magazine, tackles robotic communication, which leverages many more senses than a simple satellite transmission. Computer giant IBM aims, within five years, to be able to relay textures, the quietest of sounds and even smell and taste over computers and wireless networks. This technology could smell disease before a person even thinks to visit a doctor or hear a mudslide days before it actually occurs. Look for that story on Page 29.

In addition to those features, we have many more departments that encompass other aspects of sensing, like the possibly ubiquitous Automatic Dependent Surveillance-Broadcast system, a sensor that can be placed on blind people's tongues to relay visual information and a well-rounded market report — written by Teal Group's David Rockwell — that explores where future hot spots in sensors will be. We hope you enjoy it!

Editor’s message

Danielle Lucey

Page 6: Mission Critical: Sensors


Essential Components

Longest UAS flight aided by sensor

Although the company UAV Factory broke the record for the longest recorded flight by a small UAS, its supplier Gill Sensors played a part in ensuring the platform was able to make its historic flight.

Gill Sensors developed a fuel level sensor that enabled the Penguin B UAS to stay in the air for 54.5 hours, so it could accurately monitor the fuel left in its 7.5-liter tank.
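The article doesn't describe the calibration math, but converting a level reading from an irregularly shaped tank into remaining fuel volume is commonly done with an empirically built lookup table plus interpolation, since no simple geometric formula applies. A minimal sketch with invented depth/volume pairs, not Gill's actual calibration data:

```python
import bisect

# Hypothetical calibration table for an irregular tank:
# probe depth reading (mm) -> measured fuel volume (liters),
# built empirically by filling the tank in known increments.
DEPTHS_MM = [0, 20, 40, 60, 80, 100]
VOLUMES_L = [0.0, 0.9, 2.1, 3.8, 5.9, 7.5]

def fuel_volume(depth_mm: float) -> float:
    """Linearly interpolate remaining fuel from a probe depth reading."""
    if depth_mm <= DEPTHS_MM[0]:
        return VOLUMES_L[0]
    if depth_mm >= DEPTHS_MM[-1]:
        return VOLUMES_L[-1]
    i = bisect.bisect_right(DEPTHS_MM, depth_mm)
    d0, d1 = DEPTHS_MM[i - 1], DEPTHS_MM[i]
    v0, v1 = VOLUMES_L[i - 1], VOLUMES_L[i]
    # Interpolate between the two calibration points bracketing the reading.
    return v0 + (v1 - v0) * (depth_mm - d0) / (d1 - d0)
```

A denser calibration table directly improves accuracy in the oddly shaped regions of the tank, which is why the probe's full-depth coverage mattered here.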

The task was challenging: the fuel tank had an irregular shape, and space was extremely limited, so the company could not mount the sensor through the top of the tank, as is customary. Engineers at Gill created a unique sensor that could instead be mounted to the side of the tank's wall, using an angled probe to take measurements of the tank depth. Gill also had to cut down on the weight of the sensor, slicing it to 60 grams.

"Key to the excellent performance and suitability of the Gill fuel sensor for this aviation application is the use of new microelectronics that offers a 50 percent space saving compared to standard electronics," the company said in a press release.

"We were delighted when we were told about this fantastic achievement by UAV Factory," says Mike Rees, head of marketing at Gill Sensors. "Our design engineers relished the challenge when we first met UAV Factory at AUVSI's Unmanned Systems [North America] conference in Washington in 2011, and were able to utilize the proven microelectronic level sensor technology that is currently supplied by Gill into other specialist applications, such as Formula 1 race cars and military land vehicles. It was a great achievement for UAV Factory, and we are proud to be a part."

When Penguin B made its record-breaking endurance flight, a custom Gill Sensors fuel detector played a pivotal role. Photo courtesy UAV Factory.

Page 7: Mission Critical: Sensors

Infrared sensor to fill Japan security gap

Japan is developing an unmanned aircraft outfitted with an infrared sensor after its existing sensor suite failed to pick up an attempted satellite launch by North Korea in April, according to an IHS Jane's report.

Japan's Ministry of Defense wants 3 billion yen ($37.6 million) for a replacement of the ballistic missile defense system, slotted for fiscal year 2013. The existing system consists of land-based early warning radars and destroyers outfitted with Aegis, which have RIM-161 Standard Missile 3 systems attached.

Japan is interpreting its failure to locate the North Korean launch, despite the fact that the satellite never reached very high in orbit before crashing, as a security gap.

The UAV would have a long-endurance capability, according to a November budget request, operating over the Sea of Japan at around 44,000 feet. Japan would like a prototype of the unmanned system by 2014, with initial deployment not scheduled until 2020 or 2021.

Air Force, Raytheon put sense and avoid to the test

The U.S. Air Force and Raytheon Co. have conducted concept evaluation demonstrations, showing that existing air traffic equipment could be modified to include ground-based sense and avoid to track the presence of UAS.

The pair recently completed the testing near Edwards Air Force Base at Gray Butte Airfield using a moving "dynamic protection zone," a collision avoidance alert system. The zone creates a series of alerts sent to the UAS pilot as an object approaches the aircraft, to avoid near mid-air collisions.
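The article doesn't give the zone's geometry, but a "series of alerts" as an intruder closes can be pictured as nested range rings around the aircraft with escalating urgency. The radii and messages below are invented for illustration, not Raytheon's actual parameters:

```python
import math

# Hypothetical nested alert rings, in nautical miles, innermost first.
ALERT_RINGS = [
    (1.0, "WARNING: maneuver now"),
    (3.0, "CAUTION: traffic converging"),
    (5.0, "ADVISORY: traffic in area"),
]

def alert_for(own_xy, intruder_xy):
    """Return the most urgent alert whose ring the intruder has entered,
    or None if the intruder is outside all rings.

    Positions are (x, y) tuples in nautical miles in a local frame."""
    dist = math.dist(own_xy, intruder_xy)
    for radius, message in ALERT_RINGS:  # checked innermost-first
        if dist <= radius:
            return message
    return None
```

A real system would also project both trajectories forward in time, so a fast converging target triggers earlier than a slow one at the same range.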

They used a sense-and-avoid system based on the Airport Surveillance Radar Model-11 and a repurposed Standard Terminal Automation Replacement System for air traffic control. Using these two items reduces the need for new infrastructure to integrate a sense-and-avoid system.

"Our solution provides the Federal Aviation Administration and the Department of Defense with a cost-effective and safe approach to handle the thousands of unmanned aerial systems that'll be flying in our airspace in the next few years," says Joseph Paone, director of Air Traffic Management for Raytheon's Network Centric Systems business. "Our system properly notifies controllers and pilots of intrusions and accurately shows aircraft altitude, which is important in keeping commercial aircraft, unmanned aerial systems and other hazards safely separated."

Raytheon says it will continue this testing at other sites around the United States.

A thinking man’s robot

Researchers at the CNRS-AIST Joint Robotics Laboratory and the CNRS-LIRMM Interactive Digital Human group are working on creating a robot that could be controlled entirely by thought.

The interface they are planning will use flashing symbols that will tell the robot how to move and interact with its environment.

"Basically we would like to create devices which would allow people to feel embodied in the body of a humanoid robot," says Abderrahmane Kheddar, professor and director at the robotics lab. "To do so we are trying to develop techniques from brain-computer interfaces so that we can read people's thoughts and then try to see how far we can go from interpreting brain wave signals, to transform them into actions to be done by the robot."

The user wears a cap, covered in electrodes. Then that electric brain activity is transferred to a computer. A signal-processing unit takes what the user is thinking and then assigns different frequencies to icons on the screen. Then they instruct the robot which task is related to the icon, so it can perform the thought.
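The icon-selection step described above resembles a steady-state visually evoked potential (SSVEP) interface: each icon flickers at its own frequency, and the system picks the icon whose frequency dominates the EEG spectrum. A toy sketch of that frequency-matching idea, with made-up icon names, frequencies and a synthetic signal rather than CNRS' actual pipeline:

```python
import numpy as np

FS = 250.0  # EEG sample rate in Hz (a typical value, assumed here)
ICON_FREQS = {"grasp": 8.0, "walk": 10.0, "look": 12.0}  # hypothetical icons

def selected_icon(eeg: np.ndarray) -> str:
    """Pick the icon whose flicker frequency carries the most spectral power."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / FS)

    def power_at(f: float) -> float:
        # Power in the FFT bin nearest the icon's flicker frequency.
        return float(spectrum[np.argmin(np.abs(freqs - f))])

    return max(ICON_FREQS, key=lambda icon: power_at(ICON_FREQS[icon]))

# Synthetic 4-second trial: the user attends to the 10 Hz "walk" icon.
t = np.arange(0, 4, 1 / FS)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal(t.size)
```

A real BCI adds multi-electrode spatial filtering, artifact rejection and per-user calibration; this only shows the core frequency-tagging step.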

"Basically what you see is how with one pattern … which is the ability to associate flickering things with actions … we associate actions with objects and then we bring this object to the attention of the user," he says. "Then by focusing their intention, the user is capable of inducing which actions they would like with the robot, and then this is translated."

Scan it or click it: To see a video of CNRS' work, click or scan this barcode with your smartphone.

Page 8: Mission Critical: Sensors

HEARBO can hear you now

The Honda Research Institute-Japan has developed a robot that can differentiate between four sounds at once, including voices, and tell where they are coming from.

Such a capability could one day lead to robots that are able to respond to various verbal commands. In one experiment with the robot, it took food orders from four people speaking at once and knew which person ordered which dish.

The robot is named HEARBO, for Hearing Robot, and the audio system is named HRI-Japan Audition for Robots with Kyoto University, or HARK. The university is a partner on the team developing the system.

"We have the ability to consciously or unconsciously listen to what we want to hear when there is noise around (cocktail party effect), but this is not the case in robots and their systems," HRI-Japan says on its website. "Furthermore, the systems have a severe limitation. In general voice recognition systems, all sounds input are recognized as voices. Therefore, not only human voices but music and sounds from a television set are also recognized as voices."

HARK overcomes that limitation, allowing the robot to recognize human voices as being distinct from other sounds.

"By using HARK, we can record and visualize, in real time, who spoke and from where in a room," HRI-Japan says. "We may be able to pick up voices of a specific person in a crowded area, or take minutes of a meeting with information on who spoke what by evolving this technology."
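HARK itself performs full multichannel separation and localization, but the underlying idea of telling where a sound comes from, estimating bearing from the arrival-time difference between microphones, can be sketched with a simple cross-correlation of two channels. The sample rate, microphone spacing and sign convention below are illustrative assumptions, not HARK's implementation:

```python
import numpy as np

FS = 16000.0            # sample rate, Hz (assumed)
MIC_SPACING = 0.2       # distance between the two microphones, meters (assumed)
SPEED_OF_SOUND = 343.0  # m/s at room temperature

def direction_of_arrival(left: np.ndarray, right: np.ndarray) -> float:
    """Estimate bearing in degrees (0 = straight ahead, positive toward
    the left microphone) from the inter-microphone time delay."""
    corr = np.correlate(left, right, mode="full")
    # Offset, in samples, that best aligns the two channels;
    # lag < 0 means the sound reached the right microphone later.
    lag = int(np.argmax(corr)) - (len(right) - 1)
    tdoa = -lag / FS  # seconds by which the right channel arrives late
    sin_theta = np.clip(tdoa * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))
```

With more than two microphones, the same pairwise delays can be combined to resolve direction unambiguously in a full circle, which is how an array can track several talkers around a room.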

Integration is the name of the UAS game

Two recent announcements showcase how unmanned systems companies are teaming to integrate new sensors and capabilities onto existing platforms, expanding their capability.

Insitu of Bingen, Wash., has teamed with Melbourne, Australia-based Sentient to incorporate its Kestrel land and maritime software detection systems into Insitu's unmanned aircraft, including the ScanEagle and Integrator.

The Kestrel software is able to automatically detect moving targets on land or on the surface of the water.

"Many ScanEagle customers already use Kestrel to provide an automated detection functionality and are very satisfied with the results," says Simon Olsen, Sentient's head of sales and marketing. "This agreement allows customers to benefit from the two technologies working together seamlessly to enhance airborne ISR missions."

Scan it or click it: To see and hear HEARBO in action, click or scan this barcode with your smartphone.

In one demonstration, HEARBO could play rock-paper-scissors by listening to people's voices and determining who won. Image courtesy HRI-Japan.

Page 9: Mission Critical: Sensors



Saab integrates Roke altimeter onto Skeldar

Roke Manor Research Ltd. of the United Kingdom has worked with Saab to integrate its miniature radar altimeter into Saab's Skeldar unmanned helicopter, increasing its performance.

Roke’s MRA Type 2 will be integrated into the Skeldar’s landing system to enable it to determine its height above ground, even in misty or dusty conditions.

Saab’s Skeldar.

Roke Manor’s miniature radar altimeter, now standard equipment on Saab’s Skeldar. Photo courtesy Roke Manor.

"Roke's MRA will deliver the very high accuracy required in order to be a part of the avionics suite in Skeldar. This will effectively support Skeldar's high autonomy during landing to maximize the safe conclusion of missions," says Jonas Carlsson, senior product manager at Sweden's Saab. "The MRA's compact size and light weight also allows us to free up space on Skeldar and maximize payload."

Page 10: Mission Critical: Sensors

Tiny and everywhere: The unmanned MEMS movement

By Ramon Lopez

MEMS devices — tiny machines with moving parts — are everywhere these days, and they have wrought a revolution in shrinking the sensors that operate unmanned systems.

An acronym for microelectromechanical systems, the shrunken sensors can be found throughout daily technologies. Arrays of micromirrors, for instance, enabled digital film projectors, and MEMS gyros and accelerometers like those in Nintendo's Wii controller have changed gaming forever. MEMS accelerometers provide orientation for smartphones and image stabilization for digital cameras. And smartphone speakers incorporate one or more MEMS microphones.

MEMS devices monitor air pressure in car tires, and auto GPS devices won't work without their MEMS-based inertial navigation systems. Airbag crash sensors and side-impact airbags are lifesavers because of MEMS accelerometers, as are MEMS-based stability control systems that activate during hydroplaning or skids. MEMS accelerometers control auto parking brakes, and MEMS-based anti-rollover systems are becoming standard fit in automobiles.

Meanwhile, automakers are stepping up efforts to see if a car can monitor driver stress or illness, saving the operator from having an accident. Vehicles with MEMS-based biometric sensors would keep tabs on a driver's pulse and breathing. The steering wheel would sense sweaty palms, a possible prelude to a heart attack or a fainting spell. The driver's vital health signs would be fed into a car's safety system that would take action in an emergency. Cars wouldn't start if a drunk driver gets behind the wheel. Already, some autos have steering sensors that detect drowsy drivers.

Page 11: Mission Critical: Sensors


Devices such as seatbelt-based respiration sensors are getting cheaper and smaller through the magic of MEMS. The technology could also lead to self-driving cars that combine artificial intelligence software, a global positioning system and an array of sensors to navigate through traffic. Taxicabs might shuttle fares without a driver; people with medical conditions who are ineligible for a driver's license would get around with a virtual chauffeur.

Digital health feedback systems use MEMS sensors the size of a grain of sand to detect medications and record when they were taken. And one day, electro-responsive fibers in sleepwear and soft electronics in pillows will monitor your blood pressure, sleep patterns and stress levels while you slumber.

Researchers in Europe have developed a vest embedded with sensors that measure the wearer's muscle tension and stress level. At the core of the vest is wearable electronics consisting of sensors woven into the fabric that register the electrical excitation of the muscle fibers and thin conducting metallic fibers that pass the signals to an electronic analysis system.

Muscle tension changes with stress level, and though the change is barely perceptible, electrodes register it. Electrodes affixed to test subjects' chests induce stress themselves, making clinical test results of very little use; the smart vest was developed for inconspicuous measuring during stress studies. The vest can also contribute to workplace safety, and sports coaches could use it to measure whether athletes have reached their performance limits.

MC10, a startup U.S. company that makes flexible electronics, recently unveiled a new product: a sports skullcap that measures contact-sport impacts that could cause severe concussions. The device is thought to incorporate accelerometers wired up with the firm's stretchable electronics. The device can also support research into combat brain trauma.

The technology could lead to skin patches that monitor whether the wearer is sufficiently hydrated and other adhesive patches that monitor heartbeat, respiration, temperature and blood oxygenation. The skin patches can wirelessly transmit the

Norway's Northern Research Institute has developed an unmanned fixed-wing aircraft, named CryoWing, which can be used for power line inspection, environmental monitoring (land and sea), aerial mapping and meteorological measurements. The CryoWing is well suited for operations in extremely cold weather. Xsens provides the CryoWing's heading and attitude control. Photo courtesy of Xsens.

Page 12: Mission Critical: Sensors


medical data to a smartphone. One day, an inflatable balloon catheter equipped with sensors will snake through the heart to treat cardiac arrhythmias.

Surgery to treat strokes, hardened arteries or blockages in the bloodstream may be helped by MEMS-based micromotors small enough to be injected into the human bloodstream.

Australian researchers are harnessing piezoelectricity to power microbot motors just a quarter of a millimeter wide. Remote-controlled miniature robots small enough to swim up arteries could save lives by reaching parts of the body, like a stroke-damaged cranial artery, that catheters are unable to reach.

With the right sensors attached to the microbot motor, a surgeon's view of a patient's troubled artery can be enhanced, and the ability to work remotely also increases the surgeon's dexterity.

Researchers at Louisiana Tech University are taking a different tack regarding piezoelectricity. They have developed a technology that harvests power from small generators embedded in the soles of shoes. It is based on new voltage regulation circuits that efficiently convert a piezoelectric charge into usable voltage for charging batteries or for directly powering electronics. The technology, for example, could power emergency locators for lost hikers or cell phones.

Energy harvesting is an attractive way to power MEMS sensors and locator devices such as GPS. However, power-harvesting technologies often fall short in terms of output, as many of today's applications require higher power levels.

This technology breakthrough uses a low-cost polymer transducer that has metalized surfaces for electric contact. Unlike conventional ceramic transducers, the polymer-based generator is soft and robust, matching the properties of regular shoe fillings. The transducer can therefore replace a regular heel on shoes.

Scientists at the University of Pennsylvania are thinking along the same lines, having developed a power-generating backpack. The suspended-load backpack converts mechanical energy from walking into electricity. It incorporates a rigid frame pack. Rather than being rigidly attached to the frame, a sack carrying the load is suspended from the frame by vertically oriented springs. It is this vertical movement of the backpack contents that provides the mechanical energy to drive a small generator mounted on the frame.

Meanwhile, Sandia National Laboratories scientists have developed tiny glitter-sized photovoltaic cells that could revolutionize the way solar energy is collected and used. The tiny cells, fastened to clothing, could turn a person into a walking solar battery charger. The cells are fabricated using MEMS techniques.

MEMS goes unmanned

Nowhere has MEMS penetration been more pronounced than in the area of sensors and avionics for unmanned systems.

Founded in 2000, Xsens is a privately held company with headquarters in Enschede, Netherlands, and a U.S. subsidiary in Los Angeles. The founders were interested in measuring the performance of athletes, and a company was born with the launch of a measurement unit used for human motion and industrial applications.

Clients include Sony Pictures Imageworks, Daimler, Sagem, Siemens, Saab Underwater Systems and Kongsberg Defence & Aerospace.

Xsens is a leading innovator in 3-D motion tracking technology and products based upon MEMS inertial sensor technology. Since its inception in 2000, several thousands of motion sensors and motion capture solutions have successfully been

Xsens MTi units are used for navigation and control on Saab's multipurpose underwater vehicles. Photo courtesy of Xsens.

Page 13: Mission Critical: Sensors


deployed in areas such as 3-D character animation, rehabilitation and sports science, and robot and camera stabilization.

Xsens officials have found new uses for MEMS sensors initially designed for rollover detection and impact detection in cars and for MEMS gyroscopes used in smartphones and game controllers.

It is a market leader in MEMS inertial measurement units (IMUs), attitude and heading reference systems (AHRS) and inertial navigation systems (INS). Xsens' IMU consists of 3-D gyroscopes, 3-D accelerometers and a 3-D magnetometer. The AHRS adds filtering to that, estimating 3-D orientation based on the IMU sensor data. An INS additionally uses the accelerometers to find velocity and position, using GPS as a reference. Xsens offers an alternative to bulky and heavy fiber optic IMUs and ring-laser gyros, shrinking similar tracking performance into a significantly smaller package. Xsens is able to offer high performance in a package that is tens of times smaller than the traditional IMUs and INS used for sonar and unmanned aircraft, according to company officials.

Marcel van Hak, Xsens' product manager for industrial applications, says his product line wouldn't exist if not for MEMS technology. Using MEMS subcomponents allows Xsens to produce IMUs, AHRS and INS that average 2 inches in length, 1.5 inches in width and 1 inch in height. A traditional IMU, by comparison, snugly fits into a 4-inch cube.

He said Xsens uses the same MEMS hardware used by the automotive industry, such as in smart seatbelts, but for a different application: stabilization and control of unmanned systems, whether air, maritime or ground vehicles. Xsens also applies the technology to camera systems or platform systems that need to be stabilized.

Xsens, says van Hak, provides systems for smaller unmanned aircraft, weighing between 3 and 300 pounds. The firm is aboard unmanned aerial systems made by Delft Dynamics and Area-I's PTERA (Prototype Technology Evaluation Research Aircraft). He said his equipment is also on several robotic underwater vehicles.

Area-I's PTERA provides a bridge between wind tunnel testing and manned flight by providing a low-risk, low-cost platform to flight test high-risk technologies. The 200-pound aircraft has a 50-pound payload capacity. The unmanned aircraft operates with an Xsens MTi-G INS. Photo courtesy Xsens.

The MTi OEM is a board-only version of the Xsens MTi. The housing-less MTi OEM is a small and ultra-light (11-gram) AHRS with the same functionality as the regular MTi. Photo courtesy Xsens.

Xsens also makes systems that keep telecommunications satellites and roving vehicles, whether trucks or maritime vessels, connected. He said half of the firm's earnings come from that application.

The Dutch company's current MTi product portfolio includes the MTi-10 IMU, the MTi-20 VRU (Vertical Reference Unit) and the MTi-30 AHRS. The MTi 100-series includes the MTi-100 IMU, MTi-200 VRU and MTi-300 AHRS.

The MTi-G-700 GPS/INS is the successor of the MTi-G introduced in 2007. Deliveries of the MTi-G-700 GPS/INS started in December 2012. The MTi-100 series can serve as a cost-effective replacement for high-grade IMUs, making the end product more economically viable.

The MTi-G-700 GPS/INS is now being used to navigate an unnamed European target drone, replacing fiber optic gyros in test aircraft. Xsens established that the unit can cope with very high accelerations during launch and cornering. With similar performance to the fiber optic gyro it replaced, the unit is 15 to 20 percent lower in cost, produces a weight savings and provides more room for payload, says van Hak.

He said the MTi-G-700 GPS/INS will work with other target drones and unmanned air systems. "We are searching for additional customers. We are in discussions with three other customers who are actively considering the MTi-G-700 GPS/INS for their target drones."
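The IMU-to-AHRS step this article describes, filtering gyro data against an accelerometer gravity reference to estimate orientation, is classically done with a complementary filter: the gyro is trusted over short timescales and the accelerometer corrects its long-term drift. A single-axis sketch of that idea, not Xsens' proprietary algorithm; the blend factor is an illustrative value:

```python
import math

ALPHA = 0.98  # short-term trust in the gyro (illustrative tuning value)

def complementary_pitch(pitch_prev, gyro_rate, ax, az, dt):
    """One update of a single-axis complementary filter.

    pitch_prev: previous pitch estimate, radians
    gyro_rate:  pitch rate from the gyro, rad/s (smooth but drifts)
    ax, az:     accelerometer readings, m/s^2 (noisy but drift-free)
    dt:         timestep, seconds
    """
    gyro_pitch = pitch_prev + gyro_rate * dt  # integrate angular rate
    accel_pitch = math.atan2(ax, az)          # tilt from the gravity vector
    # Blend: gyro dominates short-term motion, accelerometer bounds drift.
    return ALPHA * gyro_pitch + (1.0 - ALPHA) * accel_pitch
```

An INS layer would further integrate acceleration into velocity and position, using GPS as the drift-free reference, exactly the IMU/AHRS/INS hierarchy the article lays out. A full 3-D AHRS fuses all three gyro axes plus the magnetometer for heading, typically with a Kalman-style filter rather than this scalar blend.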

Page 14: Mission Critical: Sensors


"We have integrated the Xsens MTi-G AHRS sensor with a range of products designed for installation on land, sea and air platforms, including tactical and rotary wing aircraft," says Paul Wynns, aircraft systems program manager at Argon ST, a wholly owned subsidiary of Boeing. "We value the Xsens MTi product line for its ease of integration, reliability and accuracy, along with its small size and rugged packaging."

Xsens is not alone in supplying MEMS-based sensors to the unmanned systems industry.

MicroStrain is a Vermont business specializing in combining microsensors with embedded processors to autonomously track operational usage and to navigate and control unmanned systems. It has the 3DM-GX3-45 GPS/INS for vehicle tracking, camera pointing, antenna pointing, and unmanned aerial and micro vehicle navigation, and the 3DM-GX3-35 AHRS with GPS. MicroStrain also offers the 3DM-GX3-15 IMU and Vertical Gyro. The 3DM-GX3-15 is a miniature IMU that utilizes MEMS sensor technology and combines a triaxial accelerometer and a triaxial gyro to maintain the inertial performance of the original GX3-25. Applications include unmanned vehicle navigation and robotic control.

Two other players in the field are De Leon Springs, Fla.-based Sparton, with its AHRS-8 MEMS-based attitude heading reference system, and Dallas-based Tronics, which has introduced a high-performance angular rate sensor (gyrometer) for demanding applications such as platform stabilization. The Tronics product is based on the company's long-standing expertise in high-end inertial sensors using MEMS-on-SOI and high-vacuum wafer-level packaging technologies.

Trends in manufacturing

MEMS have revolutionized every market in which they play, but the trend for the still-nascent mini technology is just beginning. Analysts predict rapid growth for the types of MEMS now in widespread use and in the making.

MEMS devices, especially motion sensors like accelerometers, have changed consumer electronics forever and, more recently, have enabled an emerging market for facial recognition, motion-controlled apps, location-based services, augmented reality and pressure-based altimeters.

The growing use of disposable medical devices and respiratory monitoring is due to MEMS technology. The most common medical pressure sensor is the disposable catheter used to monitor blood pressure. Another type of disposable, low-cost MEMS pressure sensor is found in the infusion pump used to introduce fluids, medication and nutrients into a patient's circulatory system. MEMS pressure sensors are used in respiratory monitoring, such as in the Continuous Positive Air Pressure device used to treat sleep apnea, and in oxygen therapy machines.

MEMS devices will proliferate as cheaper manufacturing techniques for the micro machines are developed. Massachusetts Institute of Technology researchers have found a way to manufacture them by stamping them on plastic film, opening up the possibility of coating large areas with tiny sensors.

That should significantly reduce their cost, but it also opens up the possibility of large sheets of sensors that could, say, cover the wings of an airplane to gauge their structural integrity. The printed devices are also flexible, so they could be used to make sensors with irregular shapes.


The Delft Biorobotics Lab’s FLAME robot is an active walker that uses the MTi for its stability. Photo courtesy Xsens.

Northrop Grumman supplies the fiber optic, gyrocompassing LCR-100 AHRS for Embraer Legacy 500 and Legacy 450 aircraft. The LCR-100 AHRS provides navigation information regarding the aircraft’s position, heading and attitude. Photo courtesy Northrop Grumman.


And since the stamping process dispenses with the harsh chemicals and high temperatures ordinarily required for the fabrication of MEMS, it could allow them to incorporate a wider range of materials.

Conventional MEMS are built through the same process used to manufacture computer chips, which is called photolithography: different layers of material are chemically deposited on a substrate — usually a wafer of some semiconducting material — and etched away to form functional patterns.

Photolithography requires sophisticated facilities that can cost billions of dollars, so MEMS manufacturing has high initial capital costs. And since a semiconductor wafer is at most 12 inches across, arranging today's MEMS into large arrays requires cutting them out and bonding them to some other surface.

Besides serving as sensors to gauge the structural integrity of aircraft and bridges, sheets of cheap MEMS could also change the physical texture of the surfaces they're applied to, altering the airflow over a plane's wing, or modifying the reflective properties of a building's walls or windows.

How they did it: The MIT process begins with a grooved sheet of a rubbery plastic, which is coated with the electrically conductive material indium tin oxide. The researchers use what they call a "transfer pad" to press a thin film of metal against the grooved plastic. Between the metal film and the pad is a layer of organic molecules that weakens the metal's adhesion to the pad. If the researchers pull the pad away fast enough, the metal remains stuck to the plastic.

Once the transfer pad has been ripped away, the metal film is left spanning the grooves in the plastic like a bridge across a series of ravines. Applying a voltage between the indium-tin-oxide coating and the film can cause it to bend downward, into the groove in the plastic: The film becomes an "actuator" — the moving part in a MEMS device.

Varying the voltage would cause the film to vibrate, like the diaphragm of a loudspeaker. Selectively bending different parts of the film would cause them to reflect light in different ways, and dramatically bending the film could turn a smooth surface into a rough one. Similarly, if pressure is applied to the metal film, it will generate an electric signal that the researchers can detect. The film is so thin that it should be able to register the pressure of sound waves.

Next steps

The researchers are working on better ways to bond the metal films to the plastic substrate, so that they don't have to rely on tearing the transfer pad away quickly to get the film to stick. They're also developing prototypes of some of the applications they envision for the technology.

Ramon Lopez is an aviation, aerospace and defense journalist who previously served as editor-in-chief of Air Safety Week, editor of AUVSI's Unmanned Systems and Washington correspondent for Flight International, Jane's Defence Weekly and International Defense Review.

Australia’s EM Solutions was awarded a contract to develop a Mounted Battle Command Ka-band Satcom On-The-Move System by the Australian Defence Force. The system employs an Mti-G AHRS. Photo courtesy Xsens.


Q & A with John Marion

John Marion is the director of the persistent surveillance division of Logos Technologies in Fairfax, Va., which offers systems for wide-area surveillance, remote sensing, cyber security and other areas.

Q: What does persistent surveillance bring to the table, both for military and civilian users?

A: While standard full-motion video cameras only have a "soda straw" field of view, wide-area persistent surveillance systems can provide video coverage of city-sized areas. They do this at medium resolution, enough to track vehicles and people in real time. On the battlefield, these systems provide overwatch, giving the warfighter greater situational awareness and the user the ability to monitor multiple areas or targets at one time, from one sensor.

Wide-area persistent surveillance systems also give analysts a way of backtracking events. For example, suppose an IED was found by the side of the road — the sensor operator could use the stored sensor imagery to go back in time to discover when the IED was emplaced. He could then go even further back to find out where the emplacer came from. Finally, he could fast forward to where the emplacer went after planting the IED.

By using clues gleaned from the stored sensor data, we could eventually map out a whole network of individuals, right up to the group's leadership. And that's true both in a military setting, where the target is the Taliban, and in a local police scenario, where the target is an urban drug smuggling operation.

Q: Has the military's use of these systems changed in the years since they first became available?

A: In terms of basic uses, much has stayed the same since the U.S. Army deployed the first wide-area persistent surveillance system, Constant Hawk, on turboprop planes back in 2006. What has improved is how we task assets, use persistent surveillance imagery with other intelligence sources and cross-cue different sensors. In addition, we are now putting a strong emphasis on the automation and efficiency of analysis tools — a concept we call "intelligent persistent surveillance," or IPS.

Q: What is the best way to cope with the massive amounts of data such systems can provide?

A: The issue of big data is usually framed the following way: "How can we possibly look at all this data?" That's the wrong way to think about the problem.

We collect all this data because we don't know when, where or what sort of bad things will happen. But even as we collect imagery of a city-sized area, we don't intend to look at all the data, only the parts that matter, directed by clues from other sources.

We can think of big data from a couple of different angles. For example, there is the storage issue. But as disks get cheaper and denser, this becomes less of a problem.

Then there is the data transfer issue. But by using novel compression techniques, we can compress the imagery by 50 times. And we can compress the data by 1,000 times if we just represent the moving targets and don't update the background map.

Q: Can you describe how intelligent persistent surveillance systems work?

A: Many in the persistent surveillance field tend to focus on the platform, be it fixed wing, rotary wing or lighter than air. Others look at the sensors that go on those platforms. However, the real challenge with the new persistent surveillance systems is the data analysis. That's why attention should be directed to IPS.

IPS tools index data by transactions — geo-temporally tagging the starts and stops of all the targets within a field of view and then storing that information. When geo-temporal tagging is done across various intelligence sources, analysts can quickly search recorded sensor data for targets at a specific location and over a specific time period, efficiently exploiting those intelligence sources.

So, as you can see, IPS goes way beyond the platform, sensor and mere data collection. It gives the analysts a means of efficiently extracting the intelligence value from the available data. That means fewer analysts producing better products, much faster.
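The transaction indexing Marion describes reduces to a geo-temporal query over tagged start/stop events. The sketch below is illustrative only; the record layout and field names are hypothetical, not the actual Logos schema.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    """One geo-temporally tagged 'start' or 'stop' of a tracked target.

    Fields are hypothetical stand-ins for whatever an IPS tool stores.
    """
    target_id: str
    lat: float
    lon: float
    time: float  # seconds since start of collection

def query(index, lat_min, lat_max, lon_min, lon_max, t0, t1):
    """Return all transactions inside a lat/lon box and a time window."""
    return [tx for tx in index
            if lat_min <= tx.lat <= lat_max
            and lon_min <= tx.lon <= lon_max
            and t0 <= tx.time <= t1]

# Toy index: two vehicles stopping in different places at different times.
index = [
    Transaction("veh-1", 31.33, -110.94, 120.0),
    Transaction("veh-2", 31.40, -110.90, 3600.0),
]

# "Who stopped near this intersection in the first ten minutes?"
hits = query(index, 31.30, 31.35, -111.00, -110.90, 0, 600)
```

Because each transaction is tiny compared with the imagery it summarizes, an analyst can run this kind of search over hours of collection without touching the raw pixels, which is the efficiency gain Marion points to.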

Q: Are aerostats the best platform for such systems, or do they make sense for smaller systems as well?

A: The platform choice really depends on the application. Aerostats are great for surveilling fixed locations, such as the perimeter area of a forward operating base, or FOB, in Afghanistan and urban areas along the U.S.-Mexico border. By contrast, unmanned aircraft are best used when the target location changes frequently or where friendly forces don't control the ground.

This is why we have developed intelligent persistent surveillance systems for both aerostats and unmanned aircraft.

Our Kestrel system is mounted on an aerostat located at a FOB. The system collects about 350 megapixels of day/night data per second in the air, while the processing can be performed on the ground with relatively large processing computers. We can do this because we send the imagery data down through a fiber-optic cable in the aerostat tether.

We also developed an IPS system for tactical, fixed-wing unmanned aerial systems. Called LEAPS, it provides ISR to ground forces on the move, so it cannot pump persistent surveillance imagery down a tether to large computers on the ground.

Instead, LEAPS performs all the processing, geo-registration, nonuniformity correction, etc., in an 11-pound processor that shares a gimbal with the wide-area sensor.

Q: You’ve said that such systems have home-

land security applications. Can you describe a couple of them?A: We have demonstrated both air-craft- and aerostat-based wide-area persistent surveillance along the southern border. With these sys-tems, we can track illegal activity in both rural and urban areas, focusing on illegal border crossing and map-ping networks of drug traffickers op-erating in the urban areas.

This past March, the Department of Homeland Security conducted a weeklong test of an aerostat system in Nogales, Ariz. The demonstration was very successful. The Customs and Border Protection agents found it easy to work with the wide-area persistent surveillance system, and within seven days, they nabbed 100 suspects.

Likewise, a couple of years ago, we demonstrated LEAPS on a manned aircraft for more dynamic border security operations.

Q: Is there any commercial potential for such systems as well?

A: There's definitely a strong domestic market for them. Besides local law enforcement, wide-area persistent surveillance could be used for disaster relief, public event security and environmental missions, like mapping the location of oil slicks in an offshore spill or counting polar bears over a large swath of Arctic wilderness.

Q: Assuming there is commercial potential, how can the issue of privacy best be handled?

A: I think it's good that the UAS industry is thinking about the privacy issue. In the case of persistent surveillance systems used for law enforcement, I would point out that they are like any other police tool, and their use will have to be governed by strict rules and regulations. We already have police helicopters; airborne persistent surveillance systems just stay in the air longer.

Q: What technological hurdles, if any, remain to be overcome for persistent surveillance?

A: We will continue to improve the sensors — miniaturizing them and expanding beyond black and white imagery and into the multi- and hyperspectral area. Still, the largest challenges are in IPS as we develop the tools to make sensor analysts faster, more efficient and able to deliver better products. That's the area that needs the most focus.


Lost in space? How lidar ensures robots know more about their surroundings

By Danielle Lucey

Warning: Objects in your mirror are closer than they appear. And robotics has the answer for bringing that archaic notion into the 21st century.

Most drivers might currently use a series of mirrors to determine their surroundings, but for many robots, including the Google car, lidar is proving a better substitute than a quick glance and a prayer.

"If you're driving on the street and somebody passes you, you want to know if somebody comes from behind before you start a passing maneuver," says Wolfgang Juchmann, product marketing manager at Velodyne Acoustics' lidar division. "Essentially each time you look in your rearview mirror, you want to look backwards."

Velodyne Lidar's sensors provide this capability on a lot of high-profile projects. It makes the sensor of choice for Google's self-driving car program, Oshkosh's TerraMax, Lockheed Martin's Squad Mission Support System and TORC Robotics' Ground Unmanned Support System, to name a few. They also were tapped by rock band Radiohead to create their Grammy-nominated "House of Cards" music video.

The company got its start as a spinoff of the DARPA Grand Challenges, where company founders David and Bruce Hall entered the competitions as Team Digital Audio Drive, or DAD. The brothers had previous robotics experience in competitions such as "BattleBots," "Robotica" and "Robot Wars" in the beginning of the 2000s.

After the first Grand Challenge, the Halls realized all the teams had a sensor gap they could fill. Stereovision was not good enough for the task, so they invented the HDL-64 lidar in 2005 and entered the second Grand Challenge with the sensor, though a steering control board failure ended their run prematurely. By 2006, the company started selling a more compact version of the sensor, the HDL-64E. By then, the teams were gearing up for DARPA's Urban Challenge event. Instead of entering the competition themselves, the brothers sold their device to other competitors. Five out of the six teams that finished used their lidar, including the top two teams.

Scan it or click it: Click or scan this barcode with your smartphone to see Radiohead's "House of Cards" video, which was shot using Velodyne's lidar. The video shows how many robots use the sensor to perceive their environment.


How lidar works

Though the device proved a breakthrough in autonomous sensing technology, lidar is not a new concept.

“The lidar itself is a technology that’s been around for a long time,” says Juchmann. “The laser beam hits an object and the object reflects light back. The time this takes tells us how far away the object is and the amount of light reflected back gives us an idea about the reflectivity of the object.”

Lidar works in a similar way to radar, in that it measures the time it takes for a signal to return to its point of origin, though it ditches radio waves for laser beams. Because of the different nature of the two mediums, while radar excels at measuring faraway objects, Velodyne's sweet spot is in the 100-meter radius range, says Juchmann. However, lidar overall has a better angular resolution.
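The round-trip timing described here reduces to a one-line formula: range equals half the round-trip time multiplied by the speed of light. A minimal sketch, with illustrative numbers rather than Velodyne specifications:

```python
C = 299_792_458.0  # speed of light in air (approximated as vacuum), m/s

def tof_range_m(round_trip_s: float) -> float:
    """Range from a lidar time-of-flight measurement.

    The pulse travels out to the object and back, so the one-way
    distance is half the round trip.
    """
    return C * round_trip_s / 2.0

# A return arriving after roughly 667 nanoseconds corresponds to
# about 100 m, the sweet spot quoted in the article.
print(round(tof_range_m(667e-9)))  # prints 100
```

The same timing also explains the intensity remark in the quote: the elapsed time gives range, while the amount of light that comes back hints at the target's reflectivity.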

What makes Velodyne’s product dif-ferent than simple lidar technology, explains

Juchmann, is that instead of using one laser to determine an object’s range, it uses 64.

“Instead of just shooting one laser to the wall, we shoot 64 all on top of each other so if you look at the wall you’ll see a [vertical] line of dots,” says Juchmann. “This means you can see a wall with a resolution of 64 lines in a vertical field of view of about 26 degrees.”

Instead of measuring the time-to-distance correlation of this series of dots at the same time, Velodyne measures them one after the other, in a series, to capture the distance data from each point. If you were shooting the lasers toward a flat wall, it would be a fairly easy measurement, says Juchmann, because the laser data would return almost simultaneously. However, if the series of laser points were flashed toward a staircase, it would mark faster returns on the lower-level stairs and longer returns as the steps ascend, giving the user an idea of the varying distances.

The measurement of a single vertical line in space is not very useful though, especially to large cars trying to navigate their environment at fairly high speeds. Velodyne's sensor also spins these 64 points, so there are 64 lines moving through the whole room. "The amazing part is the amount of data that is measured in a very short time," he says.

A human blink lasts about two-fifths of a second. In that time span, Velodyne's lidar has done a 360-degree scan of its surroundings four times. This 10-times-per-second scan produces 1.3 million data points per second. At this speed, lidar can measure an object's location to within about a centimeter. While much older methods, like surveying, can measure an object's location down to the millimeter range, high-definition lidar's speed versus breaking out some tripods is no contest.
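The figures in this passage are easy to sanity-check with a few lines of arithmetic. The per-laser angular resolution at the end is derived from the article's numbers rather than quoted from a spec sheet:

```python
SCANS_PER_S = 10          # rotation rate cited in the article
POINTS_PER_S = 1_300_000  # data rate cited for the 64-laser sensor
BLINK_S = 0.4             # "about two-fifths of a second"
LASERS = 64

scans_per_blink = SCANS_PER_S * BLINK_S            # 4 full sweeps per blink
points_per_rev = POINTS_PER_S // SCANS_PER_S       # 130,000 points per revolution
points_per_laser_rev = points_per_rev // LASERS    # ~2,031 returns per laser
horiz_resolution_deg = 360 / points_per_laser_rev  # ~0.18 degrees between returns

print(scans_per_blink, points_per_rev, round(horiz_resolution_deg, 2))
```

So each of the 64 lasers fires roughly every fifth of a degree as the head spins, which is what makes the point cloud dense enough to track vehicles and people.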

After the success of the company's HDL-64E, it has also released the HDL-32E, which uses the same concept but uses 32 laser points instead of 64. This is useful for smaller vehicles, because Velodyne's HDL-32E lidar weighs 1 kilogram, versus 15 kilograms for double the laser points. This is a huge factor when people want to mount their lidar on something lighter, explains Juchmann. It's also less than half the price.

Velodyne’s lidar mounted atop Google’s self-driving Lexus. Photo courtesy Google.


To make all this data useful, companies integrate Velodyne's lidar data with GPS and IMU data to determine how their robots should move.

“The vehicle needs to know where ex-actly it is,” says Juchmann. “Typically you have to feed in GPS information so you know where you actually are. With our sensor you can integrate and synchronize GPS information in order to determine not only the range, but also were you are.”

The IMU compensates for movements and angles that inherently occur when the sensor is moved in real life. The key to all this data, though, is the software each company creates that analyzes it all.

The Google self-driving car, for instance, integrates this data with its Google Maps product so the robot will know the long-range terrain data and also can detect if, for example, a bicyclist is coming up behind the car that is about to turn.

“If you have a robot or a self-driving car that moves around, it’s impor-tant to see what’s around it,” says Juchmann.

Not all of lidar's technological hurdles have been overcome. Lidar sensors are affected, the same way human eyes are, by low-visibility situations. For instance, the laser beam can detect drops of rain, but if the rain is heavy enough it might view a downpour as an object. Juchmann likens it to watching an antenna TV with some white noise.

“You still see a picture, but only once in a while you have the full picture. If the rain becomes really, really heavy, you have more rain than picture.”

The same is true for fog and snowfall. "If you have a little bit of that it's all fine, and computer algorithms can figure out the once-in-a-while reflection, but if it's heavy snowfall" the reflections will outweigh the actual picture, explains Juchmann.
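Juchmann's point about algorithms discarding the "once-in-a-while reflection" can be illustrated with a toy persistence filter: a raindrop returns a different range on every revolution, while a real surface returns nearly the same range each time. This is only an illustrative sketch, not Velodyne's actual processing.

```python
def persistent_points(scans, tol=0.2):
    """Keep ranges that repeat (within tol meters) across every scan.

    scans: list of range lists, one per revolution, same beam order.
    Transient returns (rain, snow) flicker between revolutions and
    are dropped; stable surfaces survive the filter.
    """
    first, *rest = scans
    keep = []
    for i, r in enumerate(first):
        if all(abs(s[i] - r) <= tol for s in rest):
            keep.append(r)
    return keep

# Beam 1 hits a wall at ~10 m in all three revolutions; beam 0's
# return jumps around (a raindrop), so it is discarded.
scans = [[2.1, 10.0], [7.5, 10.1], [4.3, 9.9]]
print(persistent_points(scans))  # prints [10.0]
```

In heavy precipitation the transient returns outnumber the stable ones, which is exactly the "more rain than picture" failure mode described above.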

Other applications

Lidar has a lot of applications outside robotics. Right now, Velodyne is addressing the security and surveillance market, says Juchmann, which could use lidar to monitor military perimeters and border fences. Currently, many fences are monitored with cameras, which at their best have around 130-degree fields of view.

Another big market that uses lidar is mobile mapping. Transportation department contractors put the sensors on manned vehicles and, using cameras and other sensors, give state transportation departments information on the conditions of bridges and roads. The accurate mapping provides an idea of roadwork and maintenance that needs to be done.

AAI Textron uses Velodyne's lidar on its Common Unmanned Surface Vehicle to determine if there are intruders in the immediate vicinity and for collision avoidance.


How the Google car sees the path and obstacles ahead, using lidar integrated with other data and sensors. Photo courtesy Google.


Aside from Google, Juchmann says nearly every single major car manufacturer in the world uses one or two of the company's lidars to test out some of the other sensors that have made their way onto cars in the last 10 years. Auto companies will compare the results of the lidar with their backup warning signals, lane keeping and blindspot detection to measure their accuracy.

Juchmann predicts, however, that the auto industry will be the big boon for lidar once the sensors are adopted on every vehicle.

“The next big step is to get integrated into the large volume products,” he says.

For this to happen, the cost needs to come down and the sensors have to get smaller. Also, many cars still rely on outdated computing technology that "isn't adequate for modern sensor technology anymore." While this isn't a problem for Google, which uses its own computers, in traditional cars these old systems can be a bottleneck to how we actually use all this data.

And there is one more big hurdle.

“People don’t want to have that thing on the top [of their car], and that’s where the balance between form and function needs to be found,” says Juchmann. The first thing all the car designers say is, “There’s no way in hell this thing is going to be on top of the car.”

No one has come up with an answer for that yet, says Juchmann, but similar problems have been solved in the past. He points to satellite radio, which originally required large antennas.

“But at some point somebody made the decision that we’re going to have satellite radio inside the car. That’s the future we need, and the only way that’s physically going to work is to have an antenna on the outside,” he says. “Let’s come up with a design that doesn’t look really bad, so they came up with the shark fin design.”

The small fin is now on the back of most cars with satellite radio. The best spot for lidar remains at the top of a vehicle, though, so how this final challenge plays out is still a question.

Danielle Lucey is managing editor of Mission Critical.


AAI/Textron’s CUSV uses a Velodyne lidar, the small sensor at the very top of the vessel, to image the maritime landscape. Photo courtesy Textron.


State of the Art

Smile! Surveillance cameras by city

Security concerns and needs are different across the globe, but police departments, federal agencies and militaries everywhere are increasingly opting to respond using robots. With air systems much cheaper than their manned counterparts and ground robots' ability to keep people out of dangerous point-man positions, unmanned systems are adding more distance than ever to the long arm of the law.

While UAS are known for their 60,000-foot view of areas of interest around the globe, many surveillance cameras are eyeing the residents of major cities mere feet from street level. While it's difficult to get authoritative numbers, here is a compilation of what the Mission Critical staff could find.

4,468 cameras in Manhattan (2005, New York Civil Liberties Union)

17,000 cameras in Chicago (2005, VinTechnology.com)

4,775 cameras in Washington, D.C. (2008, Los Angeles Times)

378 publicly owned security cameras in Rio de Janeiro (2011, study: "Cameras in Context: A Comparison of the Place of Video Surveillance in Japan and Brazil")

2,000 cameras in Buenos Aires (2011, InfoSurHoy.com)

13,000 cameras in Mexico City (2011, Los Angeles Times)

422,000 cameras in London (2012, CCTV.co.uk)

2,200 cameras in Sydney (2009, Daily Telegraph)

300 cameras in Paris, with plans to install more than 1,100 more (2012, France 24)

400,000 cameras in Beijing (2011, Beijing Daily)

500,000 cameras in Chongqing, China (2012, VinTechnology.com)

184 cameras in Johannesburg central policing district (2003, book: "Rainbow Tenement: Crime and Policing in Inner Johannesburg")



Pop Culture Corner

From 'Star Trek' to your house: Communicators, phasers and other ideas that came true

They had some cool stuff in the TV show "Star Trek" … even in the original show, where the sets were sometimes cardboard and the aliens looked a lot like humans wearing body paint.

One memorable piece of equipment was the communicator, a flip-top walkie-talkie that was truly revolutionary in the late 1960s. Back then, when most homes had party-line rotary phones, being able to flip open a little box to talk was miraculous.

In the intervening decades, it has become much less so. While the original cell phones of the early 1980s were clunky beasts that barely made phone calls, they have morphed into designs that would make Capt. Kirk quite envious. Well, in most ways — the "Star Trek" communicators could operate over vast distances and rarely seemed to drop calls.

Martin Cooper, who created the first personal cell phone while working at Motorola, has cited the "Star Trek" communicator as his inspiration. He hated talking on wired devices, and envied the freedom he saw on TV, so he helped create it.

Another nifty device was the Tricorder, a doodad about the size of a tape recorder (now an obsolete piece of equipment) that could scan a surrounding area and analyze it. Various versions appeared on the TV show and its offspring, including a medical version that could diagnose illnesses. Alas, this is one area where science has yet to catch up, though not for lack of trying.

Various researchers have built something resembling the Tricorder, but if you'd like to try your hand at it, the X Prize Foundation this year kicked off the Tricorder X Prize, a $10 million competition to develop a mobile solution that could diagnose patients better than a panel of board-certified physicians. The prize is a collaboration with Qualcomm Inc., and the team used the son of "Star Trek" creator Gene Roddenberry to promote it.

"It's great to see two amazing organizations … bring the technology of 'Star Trek' to life and make the Tricorder a reality for people everywhere," Eugene Wesley "Gene" Roddenberry Jr. said in a press release.

"Star Trek" also had the very futuristic transporters, which could "beam" anybody most anywhere. Like the Tricorder, such an invention has also proven to be a bridge too far, although here, too, science is giving it a whirl.

In the November issue of AUVSI's Unmanned Systems magazine, writer Dianne Finch reported on the phenomenon of quantum entanglement, where particles, such as photons, can be linked over great distances. If you change the state of one, the other changes to match. While this has given rise to technologies that may be able to use this effect, such as quantum computers, the teleporter remains well out of reach — for now.

The field of "spooky science" has also tackled another "Star Trek" technology, the Romulan cloaking device, which could render an entire spaceship invisible ("Harry Potter" has since borrowed the idea on a smaller level).

While this, too, has not yet come to pass, the field of metamaterials is taking a look at it, so to speak, by altering the path of light as it moves through special materials. Numerous universities around the world are working on it, some funded by government agencies. Scientists at the University of Texas in Austin recently revealed that they had cloaked a cylinder from the microwave part of the energy spectrum, although, sadly, the scientists could still see it.

Eventually, however, such an application could be useful to warplanes, which is essentially what the Romulans used it for.

"What we are thinking about is not necessarily cloaking the whole warplane but some hot spots, a part such as the tailplane that you would want to cloak because it reflects most of the energy" from microwaves, one of the researchers said in the "New Journal of Physics."

"Star Trek" also had the Phaser, a raygun that could be set to stun or kill. At the time, the only existing technology was the regular gun, which had only one setting.

In 1969 — right about the time the original "Star Trek" series was canceled — a NASA employee named Jack Cover began working on a stun gun that used small tethered darts to disable opponents. In the mid-1970s he had finished his work on the Taser.


Driving factorstiMElinE

GPSThough GPS existed long before

1998 in military technology, President Bill Clin-ton signed a law requiring the military to stop scrambling the systems of civilian GPS signals, so the general public could benefit from the technol-ogy. This move paved the way for in-car navigation devices, which Google’s fleet of self-driving cars rely on for mapping.

ElEctronic cruiSE controlAutomotive Electronic Cruise Control was invented in

1968 by an engineer for RCA’s Industrial and Automotive Systems Division. One of the two patents filed describes “digital memory,” where electronics would play a role in controlling a car — an industry first.

Antilock brAkinG SyStEmThough the technology was originally developed for aircraft

in 1929, Antilock Braking Systems got their automotive debut in 1971 through a technology called Sure Brake on that year’s Chrysler Imperial.

AdAPtivE cruiSE controlThe Mitsubishi Diamante was the first to use laser-based

adaptive cruise control; however, instead of applying the brakes, the car would simply throttle down to a lower speed. Toyota added braking control to its radar-based cruise control system in 2000.

Backup warning signals (1996): In 1996, the National Highway Traffic Safety Administration tested backup warning signals, where ultrasonic sensors on a rear bumper and audible warnings work together to allow drivers to get a sense of how close an object is to the back of their car. Through these systems, the average driver is able to stop a vehicle from hitting an object in 1.5 seconds, with little difference in response times by age group.
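The ultrasonic ranging these systems rely on reduces to timing an echo: the sensor emits a pulse, and distance is half the round-trip time multiplied by the speed of sound. A minimal sketch; the 1-meter warning threshold is our illustration, not NHTSA's.

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 C

def echo_to_distance(round_trip_s: float) -> float:
    """Convert an ultrasonic echo's round-trip time to distance in meters.

    The pulse travels out to the obstacle and back, so halve the path.
    """
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

def should_warn(round_trip_s: float, threshold_m: float = 1.0) -> bool:
    """Sound the backup alarm when the obstacle is inside the threshold."""
    return echo_to_distance(round_trip_s) < threshold_m
```

A 10-millisecond echo corresponds to an obstacle about 1.7 meters behind the bumper, safely outside a 1-meter warning zone.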


Page 29: Mission Critical: Sensors


Although Google and auto manufacturers have made a lot of inroads into self-driving cars, technologies like lidar and Google Maps rest on the shoulders of a lot of sensor work that’s been going on under the hood for decades. Here’s a look at some of the formative sensor suites that have enabled more autonomy in our automobiles.

Traffic jam assist (2014): Volvo recently announced that its traffic jam assist feature would be ready by 2014, allowing drivers to keep their hands off the wheel in low-speed, high-congestion situations. The technology will work in traffic flowing at less than 30 mph.

Lane keeping (2001): Nissan was the first company to offer a lane-keeping system, on its Cima, which it sold in Japan. The first car available stateside didn't debut until 2004, and Europe got the technology in 2005.

Parking assist (2003): Lexus and Toyota introduced the world to the Intelligent Parking Assist System, which uses a rear-facing camera to guide a car into a spot and also helps avoid objects. The system has a series of arrows that help the driver tell how he is aligned in a space. Using these arrows, the driver would determine the parameters of the spot and press "Set," allowing the car to park on its own. The system debuted in the United States in 2006.

Advanced front-lighting system (2004): Pan-European research and development organization EUREKA worked to develop front-lighting systems, which use sensors to automatically make the headlights of a car work directionally. This around-the-corner lighting system was actually featured on cars dating back to the late 1920s; however, it was mechanical instead of automated.

Blind spot detection (2005): In 2005, Volvo introduced its Blind Spot Information System, which used a camera-based system to keep an eye on the area alongside and near the rear of its XC70, V70 and S60 models. The system uses warning lights to inform the driver when another vehicle enters this area.


Page 30: Mission Critical: Sensors

VAULT.AUVSI.ORG — AUVSI's Knowledge Vault

Your Go-To Industry Search Engine

Maritime: 700+ platforms, 200+ companies
Ground: 750+ platforms, 250+ companies
Air: 2,400+ platforms, 700+ companies

AUVSI’s Knowledge Vault is now live. Use one search bar to easily explore all AUVSI proceedings and publications. Did you miss AUVSI Unmanned Systems North America 2012? Visit the Vault to purchase and access proceedings from the conference.

COMING SOON: The world’s largest database of unmanned vehicles and company information.

Page 31: Mission Critical: Sensors


By Brett Davis

Researchers and end users are constantly seeking new ways to communicate with robots and unmanned systems.

One goal is to make such interactions as easy and intuitive as interaction with other humans, but that poses tough challenges for engineers and programmers. Research continues, however, on new ways to talk to robots.

Five in five

For the past seven years, IBM has been releasing a list of five technologies its researchers think have the potential to change the way people live and work. While not specific to robotics, most of the 2013 technologies singled out could lead to a revolution in the way people interact with unmanned systems of all kinds.

The first is touch: In the next five years, you'll be able to touch through a phone.

"You'll be able to share the texture of a basket woven by a woman in a remote village halfway across the globe," says IBM Retail Industry Expert Robyn Schwartz in a company video. "The device becomes just as intuitive as we understand touch in any other form today."

The second is sight. In five years, IBM posits, computers won't just be able to look at images, but will understand them. A computer could, for example, scan photos of skin melanomas taken on patients over time, possibly diagnosing cancer before physical problems result. This could be a boon for the emerging market of medical robotics.

Dmitri Kanevsky, an IBM master inventor who lost his hearing at age three, says in another video that in five years computers will be able to hear "what matters," such as monitoring mountainsides in Brazil for audible signs that a mudslide is imminent.

"It can hear that a flood is coming," Kanevsky says. "This is an example

Page 32: Mission Critical: Sensors


of how hearing sensors can help to prevent catastrophes.”

Another sense coming to computers is smell, according to the IBM researchers. This could lead to sensors in the home that literally can smell disease and then communicate that to a doctor.

"Smelling diseases remotely, and then communicating with a doctor, will be one of the techniques which will promise to reduce costs in the healthcare sector," says Hendrik Hamann, a research manager of physical analytics, who adds that "your phone might know that you have a cold before you do."

IBM further predicts that computers will be able to detect how food tastes, helping create healthier diets and even developing unusual pairings of food to help humans eat smarter.

"These five predictions show how cognitive technologies can improve our lives, and they're windows into a much bigger landscape — the coming era of cognitive systems," says Bernard Meyerson, IBM's chief innovation officer.

As an example, he cites a track-inspecting robot doing its work inside a train tunnel. A current robot could evaluate track but wouldn't understand a train barreling down that same track.

“But what if you enabled it to sense things more like humans do — not just vision from the video camera but the ability to detect the rumble of the train and the whoosh of air?” he asks on the IBM website. “And what if you enabled it to draw inferences from the evidence that it observes, hears and feels? That would be one smart computer — a machine that would be able to get out of the way before the train smashed into it.”

In the era of cognitive systems, he says, "humans and machines will collaborate to produce better results — each bringing their own superior skills to the partnership. The machines will be more rational and analytic. We'll provide the judgment, empathy, moral compass and creativity."

Verbal commands

DARPA has been working for years with the Legged Squad Support System, or LS3, the follow-on to the legendary Big Dog robotic mule. In a new video, the defense research agency demonstrated how a ground robot could obey verbal commands, giving it roughly the same capability to follow a soldier as an animal and handler would do.

In December, the LS3 was put through its paces, literally, at Virginia's Fort Pickett, where it followed a human soldier and obeyed voice commands.

"This was the first time DARPA and MCWL [the Marine Corps Warfighting Lab] were able to get LS3 out on the testing grounds together to simu-

Talking to Robots — continued from Page 29

The LS3 goes through its paces at Virginia’s Fort Pickett. Photo courtesy DARPA.

Page 33: Mission Critical: Sensors

Start your year at the unmanned systems industry's premier event, featuring updates on the latest ground, air and maritime programs.

Mark your calendar: Tuesday, 12 February through Thursday, 14 February 2013, Ritz-Carlton, McLean, Va., USA

Ground: Tuesday, 12 February
Air: Wednesday, 13 February
Maritime: Thursday, 14 February

. . . W h e r e   B u s i n e s s   h a p p e n s

auvsi.org/uspr • 12–14 February 2013 • McLean, Va.

Page 34: Mission Critical: Sensors


Talking to Robots — continued from Page 30

Page 35: Mission Critical: Sensors


late military-relevant training conditions," Lt. Col. Joseph Hitt, DARPA program manager, says in a DARPA press release. "The robot's performance in the field expanded on our expectations, demonstrating, for example, how voice commands and follow-the-leader capability would enhance the robot's ability to interact with warfighters. We were able to put the robot through difficult natural terrain and test its ability to right itself with minimal interaction from humans."

In a DARPA video, the LS3 turns itself on after a voice command, and then begins following the human leader.

"The LS3 program seeks to demonstrate that a highly mobile, semi-autonomous legged robot can carry 400 pounds of a squad's equipment, follow squad members through rugged terrain and interact with troops in a natural way similar to a trained animal with its handler," DARPA says.
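At its simplest, interpreting a recognized utterance is a dispatch problem: map a command phrase to a behavior, and fall back to a safe no-op otherwise. DARPA has not published the LS3's command grammar, so every command string and behavior name below is invented for illustration.

```python
# Hypothetical command vocabulary; the real LS3 grammar is not public.
COMMANDS = {
    "power on": "start_engine",
    "follow tight": "follow_leader_close",
    "follow corridor": "follow_leader_path",
    "sit": "crouch_and_wait",
}

def dispatch(utterance: str) -> str:
    """Map a recognized utterance to a behavior, ignoring case and padding.

    Unrecognized speech maps to a safe no-op rather than a guess, which
    matters when the listener is a 400-pound load-carrying robot.
    """
    return COMMANDS.get(utterance.strip().lower(), "ignore")
```

The hard part in the field is the speech recognition upstream of this table, not the table itself.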

LS3 is being developed by Boston Dynamics, leading a team that includes Bell Helicopter, AAI Corp., Carnegie Mellon, the Jet Propulsion Laboratory and Woodward HRT.

The December testing was the first in a series of demonstrations planned to continue through the first half of 2014, according to DARPA.

Social interactions

Interacting with robots in a social manner could become more important in the future, as service robots take on a greater role in everyday life.

An IBM chart showing how computers could understand photographs in the next five years.

Page 36: Mission Critical: Sensors


Researchers at Carnegie Mellon University have been working on what seems like a simple problem: how to let a robot tell where people are looking.

"It's a common question in social settings, because the answer identifies something of interest or helps delineate social groupings," the university's Robotics Institute says.

The institute developed a method for detecting where people’s gazes intersect, by using head-mounted cameras.

"By noting where their gazes converged in three-dimensional space, the researchers could determine if they were listening to a single speaker, interacting as a group or even following the bouncing ball in a ping-pong game," the institute says.
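The gaze-convergence idea has clean geometry behind it: each head-mounted camera contributes a ray (an origin and a gaze direction), and the point where gazes "converge" can be taken as the point minimizing the summed squared distance to all rays, which is the solution of a small 3-by-3 linear system. This sketch illustrates that math only; it is not the institute's algorithm.

```python
def gaze_convergence(rays):
    """Least-squares intersection of 3-D gaze rays.

    rays: list of (origin, direction) pairs, each a 3-tuple; directions
    need not be normalized. For each ray, the projector P = I - d d^T
    measures distance from the ray; summing P x = P o over rays gives a
    3x3 system whose solution is the convergence point.
    """
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for o, d in rays:
        n = sum(c * c for c in d) ** 0.5
        d = [c / n for c in d]
        for i in range(3):
            for j in range(3):
                A[i][j] += (1.0 if i == j else 0.0) - d[i] * d[j]
            b[i] += sum(((1.0 if i == k else 0.0) - d[i] * d[k]) * o[k]
                        for k in range(3))
    # Solve A x = b by Gauss-Jordan elimination with partial pivoting.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return tuple(M[i][3] / M[i][i] for i in range(3))
```

With noisy real gaze estimates the rays never intersect exactly, which is why the least-squares formulation, rather than exact intersection, is the natural fit.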

The algorithm used for determining "social saliency" could be used to evaluate various kinds of social cues, including people's facial expressions or body movements.

"This really is just a first step toward analyzing the social signals of people," says Hyun Soo Park, a Ph.D. student in mechanical engineering, who worked on the project with Yaser Sheikh, assistant research professor of robotics, and Eakta Jain of Texas Instruments, who was awarded a Ph.D. in robotics last spring. "In the future, robots will need to interact organically with people and to do so they must understand their social environment, not just their physical environment," Park said in a university press release.

Head-mounted cameras, as worn by soldiers, police officers and search-and-rescue officials, are becoming more common. Even if they don't become ubiquitous, they could still be worn in the future by people who work in cooperative teams with robots.

Tapping the phones

Ground robots have sometimes been plagued by issues of bandwidth and range. These problems are especially acute in urban areas, particularly in modern, multistory buildings, where communications can drop off fast.

A research team from the U.S. Army, University of Washington and Duke University has demonstrated one way to help expand the communications bandwidth of ground robots inside buildings, by using the existing electrical systems to create a "super antenna" to achieve wireless, non-line-of-sight communications.

The concept is based on the idea of power line networking, or using the bandwidth in electrical connections to send information as well. Such applications are already in use for streaming high-definition television and music and even providing high-speed Internet service using existing wall plugs.

"The power line's ability to receive wireless signals is a well-known phenomenon, but only recently has it been exploited for in-building communication," says a paper presented by the Army's David Knichel at

AUVSI’s Unmanned Systems North America 2012.

The downside for current power line systems is that users on both ends of such a connection have to be plugged into a wall, not a viable concept for a moving, stair-climbing robot. A team led by Shwetak Patel of the University of Washington, which included the U.S. Army and Duke University, has developed a concept that takes the power line idea and makes it mobile.

According to the paper presented at AUVSI's Unmanned Systems North America 2012, the concept is called Sensor Nodes Utilizing Power line Infrastructure, or SNUPI. SNUPI uses tiny, lightweight sensor nodes that contain antennas that can connect wirelessly to a power line infrastructure, dramatically boosting their transmission range.

A soldier could be on the bottom floor of a building, or even outside it, and use a single base station connected to the system to control and communicate with a robot exploring the upper floors.

SNUPI features a low-power microcontroller that can provide coverage for an entire building while consuming less than one milliwatt of power. The initial prototype of the system is just 3.8-by-3.8-by-1.4 centimeters and weighs only 17 grams, including the battery and antenna.
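At sub-milliwatt average draw, a node's lifetime is set almost entirely by its battery: energy stored (capacity times voltage) divided by average power gives hours of operation. A back-of-envelope sketch; the coin-cell figures used below are illustrative, not measurements from the SNUPI prototype.

```python
def node_lifetime_days(capacity_mah: float, voltage_v: float,
                       avg_power_mw: float) -> float:
    """Estimate sensor-node lifetime from battery capacity and average draw.

    Energy stored (mWh) = capacity (mAh) x voltage (V); dividing by
    average power (mW) gives hours, converted here to days.
    """
    return capacity_mah * voltage_v / avg_power_mw / 24.0
```

For example, a typical 225 mAh, 3-volt coin cell at a 0.5 mW average draw would last roughly eight weeks, which is why shaving the radio's power budget matters so much for building-scale sensing.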

Brett Davis is editor of Mission Critical.

Talking to Robots — continued from Page 33

Page 37: Mission Critical: Sensors

MARKET REPORT

Pivot to Asia to drive new sensors

By David L. Rockwell

Retention of the current administration in the U.S. will mean some consistency regarding defense spending. A decade of war has taught U.S. and European services an unforgettable lesson — scout with your unmanned aircraft, not with your soldiers. This applies to no-boots-on-the-ground conflicts such as Libya, where Europe's painful intelligence, surveillance and reconnaissance inadequacies finally inspired NATO's $1.7 billion Alliance Ground Surveillance buy, as well as to grueling attrition battles like Afghanistan, where dominant ISR at all levels from tactical to strategic has prevented a grueling bloodbath like Vietnam.

President Barack Obama's pivot to Asia will require new sensor capabilities much more than new strike platforms. Just as in the geographical pivot after the Cold War, the West's new paradigm will not be arming against an adjacent land threat with thousands of tanks and fighters, but a potential threat with limited power projection capability, requiring monitoring — ISR — rather than bulked-up defensive lines on the Rhine.

Tomorrow's need for improved capability with decreased spending will lead to new UAS sensors, electronics upgrades and funding increases, even while manned shooter fleets shrink and nonsensor upgrades, such as new engines for manned JSTARS aircraft, are put on hold.

Electro-optical/infrared

Teal Group Corp. forecasts substantial growth in UAS EO/IR system funding available to U.S. manufacturers once the pivot is well underway, rising from $754 million in fiscal year 2013 to $1.2 billion in fiscal year 2021, but with a slow decrease in funding over the next few years as current systems and programs wind down. Production has now ramped up for the U.S. Army Gray Eagle, and Teal Group expects continuing orders beyond current plans, but with hundreds of Air Force Predators and Reapers already in service, and Block 30 Global Hawk production likely to end soon even if current air vehicles are not retired, endurance UAS electro-optics spending will shrink in the near term. New technologies like wide field-of-view (WFOV) and hyperspectral imaging systems have a strong future, and development and production of increasingly sophisticated sensors for smaller tactical and mini/micro-UAS will continue, but if there is any segment of the UAS sensor market likely to suffer losses in the near term, the already-ubiquitous gimbaled EO/IR sensor ball is it.

With the Air Force already beginning to wonder what it is going to do with all those non-stealthy but not expendable Predator and Reaper orbits once the U.S. leaves Afghanistan, interest is moving to next-generation systems and sensors. In mid-2012, Northrop Grumman cold-called Canada to offer three Block 30 "Polar Hawks" for Arctic surveillance, but there are few big opportunities out there (except the U.S. Navy). Instead, the vultures are already circling.

NATO expects to spend 2 billion euros over the next two decades to operate its five AGS Global Hawks. Photo courtesy Northrop Grumman.

In mid-2012, General Atomics offered its new extended range Predator B as an alternative to Global Hawk. The new version adds two fuel pods and a lengthened 27-meter wingspan, allowing a claimed 42-hour maximum

Page 38: Mission Critical: Sensors


endurance at 45,000 feet, versus Global Hawk's 30-36 hours. But non-stealthy UAS at 45,000 feet offer neither the safety nor discretion of a Global Hawk at 60,000. Instead, the Predator C Avenger offers a much better future for near-peer UAV ISR, especially for a pivot to Asia. In January 2012, General Atomics flew its second Avenger. The Air Force has bought one, to be delivered by the end of 2014, to evaluate its performance characteristics. General Atomics has also considered developing a carrier-borne Avenger, with folding wings and a tail hook, for the Navy's stealthy UCLASS (Unmanned Carrier-Launched Airborne Surveillance and Strike) development program.

Regarding sensors, it is also a whole new ballgame. In mid-2012, the Avenger was in testing with a Goodrich MS-177 multispectral EO targeting system, a follow-on to the SYERS sensor on the U-2. In February 2012, the Air Force acquired one BAE Systems SPIRITT hyperspectral system for the U-2, with more buys likely and transition to Global Hawk or Avenger possible. In 2012 the Army also evolved plans for a wide-area surveillance capability for Gray Eagle, with autonomous scanning for its EO/IR payload. And General Atomics has suggested an internal WFOV sensor for Avenger.

But all these programs are big future possibilities with little production planned for the next few years. Instead, the fastest growth will be seen in synthetic aperture radar (SAR) and electronic warfare systems.

Synthetic aperture radars

In January 2012, the USAF completed an analysis of alternatives for its next-generation SAR/Ground Moving Target Indicator fleet, calling for a mix of Block 40 Global Hawks with the Multi-Platform Radar Technology Insertion Program (MP-RTIP) radar and a manned, business jet-based ISR aircraft. But the Air Force also decided it did not have the money for a new manned program and would keep JSTARS flying indefinitely. MP-RTIP testing is to continue through 2013, and while fleet numbers are not certain — Teal Group's best guess is 19 for the Air Force — expect MP-RTIP to remain the world's most important SAR for decades.

In May 2012, NATO finally signed a $1.7 billion contract for five MP-RTIP Global Hawks for AGS. The first air

Electro-optic/infrared ($ millions)

                        FY13  FY14  FY15  FY16  FY17  FY18  FY19  FY20  FY21  FY22  Total
Global Hawk               97    12     8    10    12    16    18    10    14    16    213
BAMS                      10    23    21    25    23    30    28    29    26    28    243
Predator/Warrior         259   280   246   280   310   330   335   345   330   334  3,049
UCAV                      28    40    32    38    52   100   123   120   118   126    777
Tactical                  77    81    81    68    90    90    92    91   101   112    883
Mini/Nano                 80    78    97   112   114   120   130   150   170   168  1,219
Other U.S.               108   112   130   134   144   170   188   201   220   222  1,629
Available international   95   106   108   100   122   120   156   130   144   154  1,235
Total                    754   732   723   767   867   976 1,070 1,076 1,123 1,160  9,248

Synthetic aperture radars ($ millions)

                        FY13  FY14  FY15  FY16  FY17  FY18  FY19  FY20  FY21  FY22  Total
Global Hawk MP-RTIP      135   214   270   367   377   326   367   355   246   148  2,805
BAMS MFAS                 42    72    78    90    84    98   110    98   102   108    882
Lynx/Starlite            160   137   142   148   139   120   110    97    99    92  1,244
Other endurance          100   123   142   143   156   160   194   190   198   210  1,616
UCAV                      26    36    34    40    60    68    88   102   126   128    708
Tactical UAV              75    81    76    96   108   134   134   144   158   168  1,174
Mini/Micro/Nano-UAV       10    22    28    32    36    44    60    58    66    74    430
Available international   44    50    54    56    58    60    62    75    90    98    647
Total                    592   735   824   972 1,018 1,010 1,125 1,119 1,085 1,026  9,506

Market Report — continued from Page 35

Page 39: Mission Critical: Sensors


vehicle is to arrive at Sigonella air base in Sicily around 2015, with IOC in 2016. In February 2012, a NATO official also stated NATO expected to spend 2 billion euros over the next two decades to operate its five Alliance Ground Surveillance (AGS) Global Hawks. Germany may independently buy a few more.

With a worldwide 20-30 air vehicle MP-RTIP fleet now pretty much settled, the biggest SAR wild card may be the Navy's Global Hawk Broad Area Maritime Surveillance (BAMS) program. In mid-2012, the Navy spoke of the successful use of BAMS demonstrators providing maritime surveillance for the Navy's 5th Fleet in the Persian Gulf region. Despite a BAMS-D crash in testing in 2012, the Navy plans to acquire 68 Global Hawks with the marinized Multi-Function Active Sensor SAR/inverse SAR, to maintain a standing operational fleet of 22 five-aircraft orbits and allow for "attrition and depot maintenance requirements."

But if only 22 operational aircraft are needed, Teal Group sees considerable scope for a reduced BAMS buy, especially as the Navy is becoming something of a lame duck user of non-MP-RTIP Global Hawks (South Korea canceled its planned Global Hawk buy due to cost increases, and Australia is no longer in the BAMS program). With Boeing's 737-based manned P-8A Poseidon (and P-8I for India) ISR maritime patrol aircraft now ramping up a large production run to replace P-3C Orions, Teal Group sees BAMS as a top sequestration target, especially if the Navy wants to buy manned Joint Strike Fighters. As AGS plans and JSTARS history show, upgrade and operations and maintenance funding for a 68-aircraft fleet with a unique radar will be massive. BAMS' initial operational capability is currently planned for December 2015, but Teal believes this will be stretched, reduced or simply canceled.

The biggest growth market for UAS SARs in the second half of the forecast period will likely be for tactical and smaller UAS. In September 2012, the Army awarded Northrop Grumman a contract option for an additional 44 Starlite SARs for Gray Eagle, bringing the total number of systems under contract to 174, and Starlite is also being downsized to about 45 pounds for the Shadow tactical UAS. Expect a much broader expansion of SARs to small UAS over the next decade, especially as small UAS endurance increases. All-weather radio frequency sensors will offer great benefits compared to EO/IR when opponents can no longer hide behind clouds or smoke.

Overall, Teal Group forecasts the UAS SAR market will grow from $592 million in fiscal year 2013 to $1 billion in FY22, with an 11.3 percent compound annual growth rate from fiscal 2013 to 2018 and 6.3 percent from fiscal 2013 to 2022.
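The growth rates quoted here follow from the endpoint figures by the standard compound-annual-growth-rate formula, CAGR = (end/start)^(1/years) - 1. Plugging in the FY13 and FY22 totals from the SAR forecast table reproduces the quoted figures (nine annual steps separate FY13 from FY22, five separate FY13 from FY18):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values `years` apart."""
    return (end / start) ** (1.0 / years) - 1.0

# Totals from the SAR forecast table, in $ millions.
sar_2013_2022 = cagr(592.0, 1026.0, 9)  # about 0.063, i.e., 6.3 percent
sar_2013_2018 = cagr(592.0, 1010.0, 5)  # about 0.113, i.e., 11.3 percent
```

The same formula applied to the SIGINT and EA totals later in the article ($340 million to $836 million over nine years) lands at roughly 10.5 percent, inside the quoted 10-to-13-percent band.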

SIGINT & EA

Teal Group has forecast UAS signals intelligence (SIGINT) and electronic attack (EA) as the fastest growing UAS sensor market, but the future of several major programs — Northrop Grumman's Advanced Signals Intelligence Payload (ASIP) and BAE Systems' Tactical SIGINT Payload (TSP) — has recently become uncertain. ASIP is tied to the endangered Block 30 Global Hawk, and in early 2012 the Army issued an RFI for TSP production, requesting no more than 95 systems at a mere $955,000 per unit. Quick Reaction Program T-Pod systems were bought from BAE Systems for Gray Eagle UAS for just $12.3 million.

Teal Group still sees SIGINT sensors (essentially radio frequency ISR) migrating to nearly all types of UAS,

SIGINT & EA

($ millions)              FY13  FY14  FY15  FY16  FY17  FY18  FY19  FY20  FY21  FY22  Total
Global Hawk/Predator ASIP   40    72    70    72    80    80    76    88    84    86    748
TSP/T-Pod                   32    34    36    38    36    42    44    40    46    44    392
Other endurance             78    84    80   120   134   148   160   180   212   242  1,438
UCAV+EA                     70    68    80    92   142   156   160   188   200   212  1,368
Other tactical              50    54    66    64    80    88    80    94   102   108    786
Available international     70    84    86   100   114   118   136   140   146   144  1,138
Total                      340   396   418   486   586   632   656   730   790   836  5,870

Page 40: Mission Critical: Sensors


but many small systems from urgent non-program-of-record developments are now already in service. With coming budget cuts, these may just suffice, with expensive major programs such as ASIP being considerably reduced or eliminated. As examples, BAE Systems has developed the company-funded NanoSIGINT payload for small UAS, and the Office of Naval Research has developed the Software Reprogrammable Payload C4I/SIGINT system for Marine Corps Shadow and other small UAS.

A new market, likely to see much classified funding, is electronic attack systems for stealthy UAS. The Air Force is undoubtedly working on this, while the Navy will lead the non-black market with its UCLASS or a dedicated carrier-borne EA UCAV. In mid-2012, reports indicated the Navy planned to subject its X-47B UCAS-D to a burst of electromagnetic interference (EMI) of 2,000 volts per meter, about 10 times the level used for most carrier-based aircraft testing. This reportedly indicates plans to develop an EMI-resistant EA platform, perhaps with a high-power microwave "weapon" intended to damage opposing electronics systems.

But as with SIGINT, smaller programs have already put EA systems into service. The Army's Communications Electronic Attack with Surveillance and Reconnaissance (CEASAR) pod, based on Raytheon's AN/ALQ-227 Communications Countermeasures Set from the EA-18G Growler, has been flying in Afghanistan since 2011 on two Beechcraft King Air manned ISR aircraft. The Army plans to integrate CEASAR on the MQ-1C Gray Eagle in 2013.

All told, Teal Group forecasts the UAS SIGINT and EA market growing from $340 million in fiscal 2013 to $840 million in fiscal 2022, with a fairly steady 10 to 13 percent compound annual growth rate throughout.

David L. Rockwell is senior electronics analyst for Teal Group Corp., a provider of aerospace and defense competitive intelligence based in Fairfax, Va. His email address is [email protected].

Discover the Endless Benefits of Unmanned Systems

Increasinghumanpotential.org promotes the use of unmanned systems and robotics in the following categories:

• By Land, Air and Sea
• Jobs and Economy
• Enhancing Public Safety
• Mitigating and Monitoring Disasters
• Helping the Environment
• Fostering Education and Learning
• Increasing Efficiency in Agriculture
• FAA Flight Restrictions

Market Report — continued from Page 37

Page 41: Mission Critical: Sensors


TESTING, TESTING

Mesh networking: Robots set up networks where there aren’t any

Having a wireless network can be key in remote areas that lack infrastructure and dangerous areas like battle zones; unfortunately, those are some of the very places where such networks aren't likely to be found.

Several companies and universities have been looking into the use of "mesh networks," where robots either deploy network nodes or become part of the network themselves.

One such effort was undertaken by students at Northeastern University, who chose a robotic deployment system as part of their capstone design program, a project intended to display their computer and electrical engineering knowledge. The work began in the summer of 2011 and wrapped up in December, with the results published on a website later that month.

The group's idea was for a ground robot to not only deploy network nodes to create a network, but for it to be controlled over that network at the same time. The students had hoped to use tiny Wi-Fi repeaters, but found them too expensive, so instead they resorted to off-the-shelf Linksys units running open-source firmware.

The robot was built from scratch and ended up being more than 3 feet long, 2 feet wide and weighing 150 pounds. It proved capable of dropping two Linksys routers encased in Pelican waterproof cases, giving it a total range of one kilometer. And, in the words of one of its inventors, it proved to be "utterly badass."

The Northeastern University node-dropping ground robot. Photo courtesy Northeastern University.

The Northeastern robot is also described by its inventors as a "beast," and indeed it could carry multiple versions of another robot intended to create mesh networks. That would be iRobot's FirstLook 110, the tiny, throwable robot that has made a splash at recent AUVSI Unmanned Systems North America conferences.

The 5-pound robot, now being evaluated by the Joint Improvised Explosive Device Defeat Organization, which ordered 100 of them last spring, is partly designed to use mesh networking to "allow multiple robots to relay communications over greater distances," the company says.

The U.K.-based Cobham specializes in providing radios to create flexible mesh networks that can be installed on mobile robots, manned vehicles or at fixed locations, and which can shift as the mobile nodes move.

The Cobham technology was recently used by the Los Angeles police force to monitor the crosstown progress of the space shuttle Endeavour as it made its way to the California Science Center in October. The network was hooked up to a network of Axis cameras that could "leapfrog" coverage as the shuttle moved along, according to Fox News.

In the cloud

Some efforts to develop mesh networking rely on small unmanned aircraft instead of ground vehicles.

Germany's Project AirShield, demonstrated last spring at Rotterdam Harbor, was intended to show how a swarm of small UAS could share information with each other, and operators on the ground, to study the content of burning clouds of hazardous smoke.

Such measurements can be made on the ground, but in the case of

Page 42: Mission Critical: Sensors


fires, most pollutants are in the air and moving.

"At present the fire brigade personnel are provided with special handheld devices that can only measure the concentration of different pollutants at ground level but are unable to survey and quantify the level of contamination carried in the atmosphere by winds and/or ascending columns of smoke," says a project briefing prepared by small UAS maker Microdrones, one of the partners in the program.

"Such a measurement is critical to the safety of outlying communities that may be affected by these aerial pollutants."

The demonstration, which involved setting fires in three surface bins on the docks, used a single Microdrones md4-1000 vehicle, but the concept calls for a swarm of such systems using mesh network software to communicate with each other and the ground. The vehicle used in the demonstration employed a tiny Gumstix microcomputer with mesh networking software supplied by Germany's TU Dortmund University, which has sponsored a series of conferences on robotic mesh networking.

An overview of Project AirShield, funded by Germany's Federal Ministry of Education and Research.

Microdrones' md4-1000, a version of which was used in a test of AirShield.

File sharing

Not all of the network-in-the-sky ideas are aimed at generating networks in war zones or for first responders. One recent effort, in fact, was undertaken to allow file sharing.

The group Pirate Bay, a Sweden-based site that allows users to swap content (much of it copyrighted, hence the name), announced in 2012 that in the future some of its network could be based on clouds of small UAS.

Pirate Bay has its own motives for becoming mobile — 14 countries have ordered their Internet providers to block its site, and its founders were found guilty of allowing illegal file sharing — but supporters of the idea say mobile, airborne networks could also be useful in places like Egypt and Syria, where governments moved to shut down Internet access.

Thinking along those same lines, the London-based think tank Tomorrow's Thoughts Today said it has already created a small fleet of Internet-capable UAS, which it dubbed an "aerial Napster," and demonstrated it in late 2011 at a festival in the Netherlands.

At that event, the UAS hovered over the crowd, which could interact with the aircraft using cell phones.

"As we signal the drones they break formation and are called over," the think tank says on its website. "Their bodies illuminate, they flicker and glow to indicate their activity. The swarm becomes a pirate broadcast network, a mobile infrastructure that passersby can interact with. Impromptu, augmented communities form around the glowing flock."

Testing, Testing — continued from Page 39

Page 43: Mission Critical: Sensors


Technology Gap
ADS-B tests may help expedite UAS flights in public airspace

Sense and avoid has become a key technological goal for allowing flights of unmanned aircraft in uncontrolled airspace around the world. One technology expected to help with sense and avoid is ADS-B, or Automatic Dependent Surveillance-Broadcast.

It’s part of the U.S. Federal Aviation Administration’s NextGen system, and is basically a GPS-based transponder that reports an aircraft’s position, including heading and altitude.

Several demonstrations have been conducted in recent months showing the utility of ADS-B for sense-and-avoid use.

One took place at Camp Roberts, Calif., and involved cooperation between two AUVSI members: Arcturus UAV, based in Rohnert Park, Calif., and Sagetech Corp., based in White Salmon, Wash.

In the demonstration, Kelvin Scribner, the president and founder of Sagetech, piloted a Cirrus SR-22 aircraft, taking off from nearby Paso Robles airport. An Arcturus T-20 UAV took off from Camp Roberts’ McMillan Airfield via its rail launch system, and the two aircraft then flew an aerial ballet, although Scribner didn’t stray into the restricted airspace and the T-20 didn’t stray out of it.

The system used Sagetech’s tiny XP transponders to broadcast ADS-B position messages, which were then received by the company’s Clarity receivers, which relayed them via Wi-Fi to an iPad running Hilton Software’s WingX application. The aircraft then appeared over a terrain map, identified by name. The system also identified other aircraft flying nearby if they were using ADS-B.

The companies don’t claim that ADS-B is a magic bullet that solves the sense-and-avoid issue for unmanned aircraft, but say it’s a tool that could be used to speed the use of UAS in some instances, such as aircraft firefighting operations under temporary flight restrictions or during military range operations.

ADS-B in the news

Sagetech and Arcturus aren’t the only companies testing the uses of ADS-B. General Atomics Aeronautical Systems, the builder of the Predator, Reaper and Gray Eagle line of UAS, says it recently tested ADS-B on the Guardian, U.S. Customs and Border Protection’s marinized Predator platform.

The aircraft used a prototype of BAE Systems’ Reduced Size Transponder, which has the military designation AN/DPX-7. It’s a friend-or-foe transponder that can operate with both military and civilian air traffic control systems and is capable of sending and receiving ADS-B signals.

In a test off the Florida coast on 10 Aug., the Guardian detected other ADS-B-equipped aircraft in the vicinity, displaying their locations on a ground control station display, and also sent its own location via ADS-B Out.

R3 Engineering, based in Lusby, Md., said in October that it had successfully tested its All Weather Sense and Avoid (AWSAS) system, which com-

The ADS-B display inside the Cirrus aircraft used in the North Dakota demonstration. Photo courtesy NASA.
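To make the traffic picture in these demonstrations concrete, here is a minimal sketch of how ADS-B position reports could feed a sense-and-avoid check. The record fields, function names and the 5 NM / 1,000 ft thresholds are illustrative assumptions, not any vendor’s actual software.

```python
# Hypothetical sketch: each aircraft broadcasts a GPS-derived state vector,
# and a receiver flags traffic inside a simple protection volume.
import math
from dataclasses import dataclass

@dataclass
class AdsbReport:
    icao: str      # 24-bit aircraft address, as a hex string
    lat: float     # latitude, degrees
    lon: float     # longitude, degrees
    alt_ft: float  # altitude, feet

def horizontal_nm(a, b):
    """Great-circle distance between two reports in nautical miles (haversine)."""
    r_nm = 3440.065  # mean Earth radius in NM
    p1, p2 = math.radians(a.lat), math.radians(b.lat)
    dphi = p2 - p1
    dlmb = math.radians(b.lon - a.lon)
    h = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r_nm * math.asin(math.sqrt(h))

def is_conflict(own, traffic, h_nm=5.0, v_ft=1000.0):
    """Flag traffic inside an illustrative 5 NM / 1,000 ft protection volume."""
    return (horizontal_nm(own, traffic) < h_nm
            and abs(own.alt_ft - traffic.alt_ft) < v_ft)
```

A real sense-and-avoid system would also project each track forward along its reported heading and speed; this sketch only shows the geometry of a single-instant check.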

Page 44: Mission Critical: Sensors


manded an unmanned aircraft’s autopilot to depart from its flight path to avoid another aircraft.

The flight was conducted on 10 Aug. in Argentia, Newfoundland, Canada, following earlier demonstrations in Arizona, California and North Dakota. The development and testing of AWSAS has been funded by the Office of Naval Research, the Defense Safety Oversight Council’s Aviation Safety Technologies program and Naval Air Systems Command. R3 plans to conduct further testing that will include sensor data tracking noncooperative aircraft — that is, aircraft without transponders showing their location.

On 20 Sept., aircraft flown by NASA and the University of North Dakota took to the skies to demonstrate a new kind of unmanned sense-and-avoid technology. Over the course of two weeks of testing, NASA, UND and the MITRE Corp. worked on technology that could one day help unmanned aircraft better integrate into the National Airspace System.

MITRE and UND developed automatic sense-and-avoid computer software algorithms that were uploaded onto a NASA Langley Cirrus SR-22 general aviation aircraft. A supporting UND Cessna 172 flew as a simulated “intruder” aircraft.

The Cirrus, which was developed as a test bed to assess and mimic unmanned aircraft systems, had a safety pilot in the cockpit, but researchers say computer programs developed by MITRE and UND automatically maneuvered the aircraft to avoid conflicts.

NASA and its partners are planning additional test flights in 2013. Follow-on testing is to feature additional advanced software by MITRE and UND, as well as sense-and-avoid software managed by a task automation framework developed by Draper Laboratory.

The Arcturus T-20 before the flight. AUVSI photo.

The ADS-B information was available both on computer monitors and iPad and iPhone screens. AUVSI photo.

Page 45: Mission Critical: Sensors


The key to overcoming blindness might be on the tip of our tongues.

That’s the approach Dr. Anil Raj at the Florida Institute for Human and Machine Cognition is taking with research subjects who lost their sight while serving in the military. Raj is repurposing a tactile sensor placed on the tongue, along with an array of wearable infrared sensors, to give a blind person an impression, quite literally, of his surroundings.

The project uses a commercial tactile tongue sensor, called the BrainPort, which creates a touch-based impression on the tongue of what its camera detects in the visible light spectrum. If a subject were, for example, focused on the letter E, like on an eye chart, he could feel the contrast of the E on his tongue.

“The big advantage of the tongue for vision is its high resolution, and that allows you to use the camera in a way that is very similar to the central vision that we’re all familiar with, that we read with or that we recognize faces with,” explains Raj.

The perception is high resolution because the tongue and pharynx are tied into a lot of brain matter via the nervous system — the reason humans can talk and similar species cannot. After the retina, the tongue and hands account for the largest amount of the brain’s ability to process senses.

But the tongue has a leg up on the hands — its position on the body. The tongue’s short distance from the brain allows for very low signal lag between its location and where an image is processed, much like the eyes.

“The signal delay caused by nerve conduction velocity is not terribly significant, so if we can present a signal there on the tongue, it is perceived quickly relative to, say, sending that same tactile signal somewhere else on the body.”

These signals have quickly translated to a fill-in for sight in the subjects Raj has tested.

“The blind individuals we’ve tested it in … fairly consistently describe their perception in visual terms,” he says. “They’ll say ‘I looked at this,’ ‘I saw that he moved this.’ They think in terms of it being visual after they get used to it.”

For one individual, the experience of wearing the tongue display was so akin to sight that he kept seeking sensory inputs even after the experiment was over.

End Users
Lip reading: IHMC’s tongue sensor fills in for sight

“We had one gentleman that after he took the tongue display out, after we were done with training for the day, just off the cuff asked, ‘Well, when does this image go away?’” says Raj. “Well, what image? And he said ever since he became blind, everything seemed pitch black, the coldest, darkest black you can ever imagine. That was his entire visual perception. But after using the system for six hours in one day, he said it felt like there was a television screen in his visual perception that was just showing signals [and] that the station had gone off the air.”

IHMC’s work focuses on those who have recently lost their sight, who tend to have a better visual memory than people who lost their vision decades ago or have never had a visual memory, says Raj.

“If you have somebody that’s been born blind, they’ve never seen the letter E,” he says. “They have no sense of what the relationship is in general. They might have felt it, a raised letter E somewhere, but to translate it … to what a camera picks up, I don’t necessarily think that maps directly.”

When Raj wears the sensor himself, he suspects he’s filling in missing information with his own visual knowledge.

“We had an interest from a military standpoint of trying to address the problems of our service members that are coming back blinded,” he says. “And so these are all young, healthy individuals … who now are blind and trying to get on with their lives. We found there’s a big difference in their ability to incorporate the information. For somebody that’s been blind or has been born blind, their visual memories are not great or aren’t that rich anymore, or they might not have any visual memories if they were born blind.”

Page 46: Mission Critical: Sensors


Filling in the periphery

Focusing in on a person’s central vision doesn’t cover the entire picture, though. Raj’s work incorporates peripheral vision with what the tongue display’s camera sees to give a person a wider field of view.

“If we’re reading the menu at the restaurant, we can notice when a waiter approaches us,” explains Raj. “We’re not startled when they come up, whereas with something like the BrainPort for a blind person, they would have to be zoomed in tightly enough on the menu to be able to read the text, and with that setting they wouldn’t necessarily be able to perceive that someone walked up.”

To accomplish this, Raj is using a series of infrared emitters that use a frequency-modulated beam to detect a person’s surroundings, like doors and hallways. The sensors work the same way as a television remote does, accepting only a modulated signal and ignoring other light sources.

Raj uses 24 pairs of these sensors in an array around the head, which, through an algorithm the institute created, can tell how far away objects are by measuring the echoes of infrared beams.

“But our real interesting part is our software algorithm that more or less creates this streaming map of the environment via multiple sources and sensors.”

The software filters out errors, like strange reflections, so it can work in nearly any environment.
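IHMC’s algorithm is not published, but the general idea the article describes, turning the array’s range readings into a cleaned-up, head-centered map while rejecting spurious reflections, can be sketched as follows. The sensor count matches the article’s 24; the function names and jump threshold are illustrative assumptions.

```python
# Illustrative sketch only, not IHMC's actual software: clean a ring of
# range readings, then place each one at its sensor's bearing to form a
# simple head-centered 2D map of the surroundings.
import math
import statistics

def filter_ranges(readings, max_jump_m=1.0):
    """Suppress one-off outliers (e.g. strange reflections) by replacing any
    reading that jumps far from the median of its immediate neighbors."""
    cleaned = []
    for i, r in enumerate(readings):
        med = statistics.median(readings[max(0, i - 1): i + 2])
        cleaned.append(med if abs(r - med) > max_jump_m else r)
    return cleaned

def to_local_map(readings, n_sensors=24):
    """Place each filtered range at its sensor's bearing around the head,
    yielding (x, y) points in a head-centered frame."""
    pts = []
    for i, r in enumerate(filter_ranges(readings)):
        theta = 2 * math.pi * i / n_sensors
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts
```

A streaming version would fold successive scans together over time; this shows only the per-scan cleaning and mapping step.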

The algorithm also has another task because of the infrared array’s location — canceling out any unintentional head motion.

“So if you’re walking, your head is bobbing around a little bit, or if you just turn your neck around, it doesn’t really change how things are relative to your body in the world around you. So we don’t necessarily want to reflect every single one of those changes.”

To do this, Raj takes other measurements from accelerometers and gyroscopes and cancels out updates that are strictly related to head movements.
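A minimal sketch of that cancellation, assuming a single yaw axis for simplicity (the real system fuses accelerometers and gyroscopes in more dimensions): integrate the gyro rate into a heading estimate, then remove it from each sensor bearing so detections stay fixed relative to the body.

```python
# Hedged single-axis sketch; function names and the one-axis model are
# illustrative assumptions, not IHMC's implementation.
import math

def integrate_yaw(yaw_rad, gyro_rate_rad_s, dt_s):
    """Update the head-yaw estimate from a gyroscope's angular rate."""
    return (yaw_rad + gyro_rate_rad_s * dt_s) % (2 * math.pi)

def body_bearing(sensor_bearing_rad, head_yaw_rad):
    """Bearing of a detection relative to the torso: a detection seen at a
    head-relative bearing is shifted by the head's yaw in the body frame."""
    return (sensor_bearing_rad + head_yaw_rad) % (2 * math.pi)
```

With this correction, turning the head 90 degrees leaves an object dead ahead of the torso reported at the same body-frame bearing, so the tactile display doesn’t churn with every glance.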

“The focus is to improve the sensory interactions between robotic systems, for example, and the humans that are interacting with those systems.”

This kind of focus is the essence of Raj’s work at IHMC, he says.

“Robots don’t go off and do things by themselves,” he says. “… They do what we ask them to do or what we command them to do, and what we’re working on is when we can make that level of interaction more interactive and more functional with less cognitive effort. And that goes both ways.”

The BrainPort sensor that IHMC is using to reroute sight to the tongue for the blind. AUVSI photo.

End Users — continued from Page 43

Page 47: Mission Critical: Sensors

Walter E. Washington Convention Center • Washington D.C.

Save the Date: 12-15 August

Conference from 12 - 15 August
Tradeshow from 13 - 15 August

550+ Exhibiting Companies
40+ Countries Represented
8,000+ Attendees

auvsishow.org

Promoting and Supporting Unmanned Systems and Robotics Across the Globe