
Kelpie: A ROS-Based Multi-Robot Simulator for Water Surface and Aerial Vehicles

Ricardo Mendonça¹, Pedro Santana¹,², Francisco Marques¹, André Lourenço¹, João Silva¹, José Barata¹

¹ CTS-UNINOVA, New University of Lisbon, Portugal
² Instituto Universitário de Lisboa (ISCTE-IUL), Portugal

{r.mendonca, facmarques, p110251}@campus.fct.unl.pt, [email protected], [email protected]

Abstract: Testing and debugging real hardware is a time-consuming task, in particular for aquatic robots, for which it is necessary to transport and deploy the robots on the water. Performing waterborne and airborne field experiments with expensive hardware embedded in not yet fully functional prototypes is a highly risky endeavour. In this sense, physics-based 3D simulators are key for a fast-paced and affordable development of such robotic systems. This paper contributes a modular, open-source, and soon to be freely available online, ROS-based multi-robot simulator specially focused on aerial and water surface vehicles. The simulator is being developed as part of the RIVERWATCH experiment in the ECHORD European FP7 project. This experiment aims at demonstrating a multi-robot system for remote monitoring of riverine environments.

Index Terms: Physics-Based Simulation, Multi-Robots, Unmanned Surface Vehicles, Unmanned Aerial Vehicles, ROS

I. INTRODUCTION

Field robotics is rapidly expanding into the aerial [15], [19], [22] and the aquatic [1], [3], [5], [11], [17], [23] domains. A great deal of the current success of Unmanned Aerial Vehicles (UAV) comes from the adoption of multi-rotor configurations, which allow vertical take-off and landing, high stability, and maneuverability. These characteristics are considerably useful for tasks like environmental monitoring, wildlife tracking, and search & rescue missions. The limited energetic autonomy of multi-rotor UAVs makes them of limited reach in the aquatic domain. Conversely, Unmanned Surface Vehicles (USV) are able to provide long-lasting operation in the aquatic domain (refer to [2] for a survey on USVs). This complementary nature of UAVs and USVs can be exploited in the context of heterogeneous robot teamwork [12], [16], [18].

Although appealing, field robotics, in particular aquatic and aerial, suffers from a rather challenging development process, mostly due to the remoteness and harshness of the environments in which these robots are to operate. In addition to the involving and time-consuming process of taking the hardware out to the field, there is also the risk of damaging expensive equipment throughout the debugging process. The use of simulation environments is key to mitigate these problems. Moreover, simulators allow testing and tuning the system to perform in exceptional situations, i.e., situations that are barely replicated in actual field trials. Testing in exceptional situations means checking how the system responds to hardware failures, power shortage, or extremely odd environment configurations.

    Fig. 1. Snapshots of some environments simulated with Kelpie.

Another key advantage of simulation in multi-robot systems is the ability to assess the scalability and robustness of the system in the face of varying team cardinality.

Simulation tools for USV-UAV teams require physics-based 3D modelling, advanced rendering capabilities, real-time capabilities, flexibility, the ability to run on a set of distributed computational units, and seamless integration with the robot control system. The last requirement grants a rapid and interchangeable link between the control system and either the simulated or the physical devices (e.g., sensors, actuators). To our knowledge, no previous simulator has managed to cope with all these requirements simultaneously. Conversely, this paper presents Kelpie, a novel open-source, soon to be online, multi-robot simulator capable of coping with all these requirements (see Fig. 1). Kelpie is being developed for the RIVERWATCH experiment in the ECHORD¹ European FP7 project. This experiment aims at demonstrating a multi-robot system for remote monitoring of riverine environments.

This paper is organised as follows. Section II surveys previous simulators for aerial and aquatic robotics. Then, Section III describes the Kelpie simulator architecture. Several application cases for the simulator are presented in Section IV. Finally, some conclusions and future work avenues are drawn in Section V.

¹ ECHORD homepage: http://www.echord.info/

II. RELATED WORK

Most simulators used as research tools perform numerical simulation and generate visualisation data mostly in the form of 2D or 3D plots (e.g., Simulink®). While this kind of simulator is useful to support the design and development of low-level control architectures, it fails to scale towards 3D physics-based simulation with photo-realistic rendering capabilities.

Good rendering capabilities are essential to enable the simulation of vision and other perceptual stimuli upon which perceptual algorithms can be reasonably debugged and validated. As a result, simulators based on advanced 3D computer graphics engines are preferable to support the development of high-level functionalities in robotic systems, such as: object detection and tracking, environmental exploration and interaction, path planning and obstacle avoidance. An emerging trend in this direction is to build simulators from existing rendering engines, such as game engines [7].

Previous simulators for aquatic robots are split into two main categories: underwater robots and surface robots. The vast majority of simulators, such as the open-source SubSim [4], Neptune [20], IGW [6], MVS [10], and the commercial DeepWorks [8], lie in the underwater category. These simulators include hydrodynamic models and implement some useful sensors, such as underwater sonars and vision cameras. As a representative of surface robot simulators there is WaveSim [21], which is also able to simulate underwater vehicles. A limitation of WaveSim is that it relies on basic geometric primitives as approximations to represent vehicle and object models in the physics engine. For instance, surface vehicles are represented as boxes and underwater robots as cylinders.

Kelpie, the simulator presented herein, shares some of the concepts underlying WaveSim and goes beyond it in several directions. First, Kelpie is fully compliant with the Robot Operating System (ROS), a free and open-source software framework that is becoming a de facto standard in robotics. ROS provides standard operating system services such as hardware abstraction, low-level device control, message passing between processes, and commonly used functionalities, enabling a distributed computing development framework. Second, Kelpie provides more accurate physics simulation and rendering quality by using the geometric meshes of the vehicle models instead of simple geometric shapes. Finally, Kelpie also allows the simulation of UAVs.

Kelpie's core architecture is based on Gazebo [9], a robotics simulator distributed with ROS. However, Kelpie distinguishes itself from Gazebo by supporting the rendering of water regions and the simulation of vessel dynamics (e.g., drag forces caused by water resistance and buoyancy), as well as flight dynamics for airborne vehicles. Note that Kelpie is not necessarily constrained to these two types of vehicles, as it also supports the simulation of terrestrial and underwater robots.

Please refer to [7] for a survey on unmanned vehicle simulators and to [14] for the specific case of underwater robotics.

    III. THE KELPIE SIMULATOR

The Kelpie simulator is able to account for several aerial, surface, underwater, and ground virtual robots and their typical sensors. In Kelpie, these robots are simulated according to the laws of physics and immersed in high-quality rendered 3D heterogeneous environments (e.g., water and terrain regions). To ensure full integration with contemporary autonomous robot control systems, Kelpie is fully compliant with ROS. Each of these properties of Kelpie is dissected next.

    A. System Architecture

In the ROS framework, nodes interact with each other through a publish-subscribe messaging and service mechanism. Messages are associated to topics, which are containers with unique identifiers to and from which messages are pushed and pulled. In a ROS network, Kelpie is just a node with a reconfigurable set of interfaces, i.e., topics. Hence, several other ROS nodes are able to interact with Kelpie, i.e., with the virtual robots' actuators and sensors, as if they were interacting with the actual robots. Other ROS nodes may be collecting data from the simulation and storing them for offline analysis, or setting simulation control parameters, such as those required to change time, season, or weather.
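As a concrete illustration, the sketch below shows how an external ROS node could interact with Kelpie through topics alone. It is written in Python with the standard rospy client library; the topic names /kelpie/usv/laser_scan and /kelpie/usv/cmd_vel are hypothetical placeholders, since Kelpie's topic set is reconfigurable and not fixed by the paper.

    #!/usr/bin/env python
    # Minimal sketch of a node talking to Kelpie via topics only (topic names are hypothetical).
    import rospy
    from sensor_msgs.msg import LaserScan
    from geometry_msgs.msg import Twist

    def on_scan(scan):
        # React to the simulated laser scanner exactly as one would to the real sensor.
        closest = min((r for r in scan.ranges if r > scan.range_min), default=scan.range_max)
        cmd = Twist()
        cmd.linear.x = 0.0 if closest < 2.0 else 1.0  # stop when an obstacle is near
        cmd_pub.publish(cmd)

    rospy.init_node("kelpie_client_example")
    cmd_pub = rospy.Publisher("/kelpie/usv/cmd_vel", Twist, queue_size=1)
    rospy.Subscriber("/kelpie/usv/laser_scan", LaserScan, on_scan)
    rospy.spin()

Because the same topics can be remapped to the real USV drivers, a node like this one would run unchanged against the physical platform.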

Fig. 2 illustrates the major components of the ROS node responsible for Kelpie: a ROS interface, a computer graphics renderer, a physics engine, and an XML parser.

    Fig. 2. System architecture.

The ROS client interface provides access to core functionalities and implements the concept of topic-based communications, providing the compliance needed to integrate the simulator with the ROS framework. For instance, the simulator uses the ROS client interface to store and make publicly available some simulation parameters (e.g., direction and speed of water currents) in the ROS system-wide shared parameter server. These simulation parameters can then be modified at run-time by other ROS nodes.
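A minimal sketch of such a run-time modification, assuming hypothetical parameter names /kelpie/water_current/speed and /kelpie/water_current/direction (the actual parameter names are not specified in the paper):

    import math
    import rospy

    rospy.init_node("current_tuner")
    # Read the water current currently advertised on the parameter server.
    speed = rospy.get_param("/kelpie/water_current/speed", 0.0)        # m/s
    heading = rospy.get_param("/kelpie/water_current/direction", 0.0)  # rad
    # Double the current and rotate it by 90 degrees for the next test run.
    rospy.set_param("/kelpie/water_current/speed", 2.0 * speed)
    rospy.set_param("/kelpie/water_current/direction", heading + math.pi / 2.0)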

Kelpie also exploits the ROS diagnostic toolchain, via its client interface, to ease the development of diagnostic nodes for robotic systems. This toolchain provides the means for collecting, publishing, troubleshooting, and viewing diagnostic-related data. For instance, Kelpie can publish diagnostic messages related to several robot model parameters (e.g., battery level and actuator temperature).
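For illustration, a node publishing such a diagnostic message with the standard diagnostic_msgs types could look as follows (the battery values and the hardware identifier are invented for the example):

    import rospy
    from diagnostic_msgs.msg import DiagnosticArray, DiagnosticStatus, KeyValue

    rospy.init_node("usv_diagnostics_example")
    pub = rospy.Publisher("/diagnostics", DiagnosticArray, queue_size=1)
    rate = rospy.Rate(1)  # diagnostics are customarily published at about 1 Hz
    while not rospy.is_shutdown():
        status = DiagnosticStatus(name="usv/battery", hardware_id="usv_0",
                                  level=DiagnosticStatus.OK, message="nominal")
        status.values = [KeyValue("voltage", "24.1"), KeyValue("charge", "0.83")]
        msg = DiagnosticArray()
        msg.header.stamp = rospy.Time.now()
        msg.status = [status]
        pub.publish(msg)
        rate.sleep()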

Real-time computer graphics rendering is managed by OpenSceneGraph (OSG)², an open-source graphics toolkit based on the scene graph concept and written in standard C++ and OpenGL³. Furthermore, the open-source osgOcean⁴ library is used to generate the above- and below-water rendering effects, as depicted in Fig. 3.

Fig. 3. Water effects in Kelpie, as seen from above (left) and from below (right) the surface.

The physics simulation is supported by the BulletPhysics⁵ library, an open-source physics engine featuring 3D discrete and continuous collision detection with diverse collision shapes, as well as soft and rigid body dynamics. To integrate the BulletPhysics engine with OSG-based models, the osgBullet⁶ library is used. Concretely, this library maps the state of a BulletPhysics dynamical model to the parameters of the corresponding OSG geometric model of the object, given external stimuli (e.g., forces and torques).

Finally, an XML parser is used to save and load Simulation Description Format (SDF) files describing the scene, the virtual robots, and their sensors. This gives Kelpie the ability to import existing Gazebo SDF world files.

    B. Buoyancy Simulation

Kelpie uses the Archimedes principle to simulate buoyancy in water. Briefly, this law of physics states that a fluid exerts an upward force (i.e., the buoyant force) that equals the weight of the fluid displaced by an immersed body. For instance, the surface vehicle depicted in Fig. 4 suffers a buoyant force F_b from the surrounding water. This force is directed upward and has a magnitude equal to the weight of the water displaced by the submerged hull. Formally, the magnitude of the buoyant force vector F_b is given by

||F_b|| = ρ v_f g,  (1)

where g is the gravitational acceleration constant, ρ is the water's density, and v_f is the volume of the water displaced.

When the surface vehicle is floating, the buoyant force, F_b, and the gravitational force, F_g, share the same magnitude, though with opposite directions (see Fig. 4). If the buoyant force's magnitude becomes lower than the magnitude of the gravitational force, the vehicle sinks.

² OSG homepage: http://www.openscenegraph.org
³ OpenGL homepage: http://www.opengl.org
⁴ osgOcean homepage: code.google.com/p/osgocean/
⁵ BulletPhysics homepage: http://bulletphysics.org/wordpress/
⁶ osgBullet homepage: osgbullet.googlecode.com/

    Fig. 4. Buoyancy simulation.

Hence, to assess buoyancy, the simulator needs to compute the buoyant and the gravitational forces that the physics engine must exert on the vehicle at each time step. The gravity force is easy to compute as it only requires the vehicle's mass, which is given a priori. The buoyant force is computed with Eq. 1 given the water's density, which is also known a priori, and the volume of the displaced water, v_f, which needs to be computed at each simulation step. The volume of displaced water corresponds to the volume of the vehicle's immersed region, which is hard to determine exactly. Currently, this quantity is approximated by the volume of a bounding box.
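A minimal numerical sketch of this bounding-box approximation follows (symbols as in Eq. 1; the hull dimensions and draft are illustrative values, not taken from the paper):

    RHO_WATER = 1000.0  # water density, kg/m^3
    G = 9.81            # gravitational acceleration, m/s^2

    def buoyant_force(box_x, box_y, box_z, draft):
        """Magnitude of F_b for a hull approximated by a box of size
        box_x * box_y * box_z metres, submerged down to 'draft' metres."""
        submerged_height = max(0.0, min(draft, box_z))
        v_f = box_x * box_y * submerged_height  # displaced volume, as in Eq. 1
        return RHO_WATER * v_f * G

    # A 2.0 x 1.0 x 0.6 m hull with 0.15 m of draft displaces 0.3 m^3 of water,
    # yielding roughly 2943 N of upward force.
    print(buoyant_force(2.0, 1.0, 0.6, 0.15))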

    C. Wind and Water Currents

The vehicle's motion and attitude on the water's surface are a function of its propellers' actuation and of forces caused by external factors, such as winds, water currents, and waves. In its current version, Kelpie does not consider sailing vessels. Thus, wind forces are only considered indirectly, via changes applied to the amplitude and frequency of water waves. For instance, if a surface vehicle receives lateral winds it will suffer an oscillating roll torque (see Fig. 5), whereas a frontal wind will cause an oscillating pitch torque.

Formally, Kelpie approximates the water-wind-vehicle interaction dynamics by first computing the amplitude of the waves as a function of time t and wind speed ν, as follows:

h(ν, t) = h_m(ν) sin(ω(ν) t),  (2)

where h_m(ν) and ω(ν) are the maximum amplitude of the wave and its frequency, respectively, given the wind speed ν. These two functions were implemented according to the Beaufort scale, which relates wind speed to observed conditions at sea and on land. This approximation disregards the complex interactions among waves of different frequencies and amplitudes, which are key for photorealistic simulation but of little value for the purpose of debugging perceptual and control algorithms of unmanned vehicles. The surface vehicle's pitch and roll angles, θ_pitch and θ_roll, are then modified as follows:

θ_pitch(p_m^y, p_w, ν, t) = (p_m^y · p_w) h(ν, t) α_pitch,  (3)

θ_roll(p_m^x, p_w, ν, t) = (p_m^x · p_w) h(ν, t) α_roll,  (4)

where the inner product is used to verify the alignment between the vehicle's directional vectors, p_m^x and p_m^y (see Fig. 6(a)), and the wind directional vector, p_w. For instance, if both vectors are strictly parallel, the inner product returns 1.0 or -1.0, depending on whether they point in the same direction or not, respectively. If both vectors are perpendicular, it returns zero. Otherwise, the returned value ranges between zero and one in magnitude. This allows the surface vehicle's attitude to change along its pitch and roll axes according to the vehicle's orientation relative to the travel direction of the water waves over time. The proportional factors, α_pitch and α_roll, are defined by the user according to the vehicle's kinodynamic properties. For instance, if the vehicle is very lightweight, its attitude can change in the full range of the waves' amplitude; if the surface vehicle is heavy, its attitude changes within a narrow range of the water wave amplitude.
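The sketch below condenses Eqs. 2-4; the Beaufort-scale lookups h_m(ν) and ω(ν) are replaced by simple linear stand-ins, since the paper does not give their exact form, and all numeric values are illustrative:

    import math

    def wave_height(nu, t):
        """Eq. 2: h(nu, t) = h_m(nu) * sin(omega(nu) * t).
        h_m and omega below are illustrative stand-ins for the Beaufort-based lookup."""
        h_m = 0.05 * nu         # maximum wave amplitude grows with wind speed (assumption)
        omega = 0.5 + 0.1 * nu  # wave frequency grows with wind speed (assumption)
        return h_m * math.sin(omega * t)

    def dot2(a, b):
        return a[0] * b[0] + a[1] * b[1]

    def wave_attitude(p_m_x, p_m_y, p_w, nu, t, alpha_pitch, alpha_roll):
        """Eqs. 3-4: pitch/roll offsets from the alignment between the vehicle's
        directional vectors and the wind direction, scaled by the wave height."""
        theta_pitch = dot2(p_m_y, p_w) * wave_height(nu, t) * alpha_pitch
        theta_roll = dot2(p_m_x, p_w) * wave_height(nu, t) * alpha_roll
        return theta_pitch, theta_roll

    # A frontal wind (aligned with the vehicle's y axis) mostly excites pitch:
    print(wave_attitude((1.0, 0.0), (0.0, 1.0), (0.0, 1.0), nu=10.0, t=3.0,
                        alpha_pitch=0.2, alpha_roll=0.2))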

Fig. 5. Roll torque applied by wind to a surface vehicle, at three successive wave phases: (a) h(ν, t), (b) h(ν, t+1), (c) h(ν, t+2).

Concerning water currents, they can exert linear and rotational forces (e.g., yaw torques) on the surface vehicle, depending on its orientation relative to them. Formally, the yaw torque, τ, applied to the surface vehicle, given the current's force applied to the vehicle's keel, F_c, is τ = r × F_c, where r is the displacement vector from the point about which the torque is measured (the vehicle's center of mass) to the point where the force is applied (see Fig. 6(b)).

Fig. 6. (a) 2D directional vectors of the model, p_m^x and p_m^y, and a hypothetical wind vector p_w. (b) Effect of a water current on the vehicle's yaw rotation (represented by the red arrow).
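A small numerical sketch of the yaw-torque expression τ = r × F_c (the lever arm and current force below are illustrative values):

    import numpy as np

    def yaw_torque(r, f_c):
        """tau = r x F_c, with r the displacement from the vehicle's center of
        mass to the point on the keel where the current force F_c is applied."""
        return np.cross(r, f_c)

    r = np.array([1.2, 0.0, -0.3])    # keel application point w.r.t. center of mass, m
    f_c = np.array([0.0, 40.0, 0.0])  # lateral force exerted by the water current, N
    print(yaw_torque(r, f_c))         # the z component is the yaw torque on the hull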

Aerial robots are also affected by wind forces, but in a different way compared to the aquatic models. Aerial models do not display an oscillating behaviour. Instead, a steady force with the magnitude and direction of the wind is applied to them. Depending on the direction of the aerial vehicle's motion, this force can behave as a thrust/drag force F_y and as a lateral force F_x, as follows:

F_y(p_m^y, p_w) = (p_m^y · p_w) ||p_w||,  (5)

F_x(p_m^x, p_w) = (p_m^x · p_w) ||p_w||,  (6)

where ||p_w|| is the magnitude of the wind force. Hence, if both vectors are strictly parallel and share the same direction (positive sign from the inner product), the aerial vehicle will suffer a thrust force; otherwise it will feel a drag force. For the time being, only this simple physics is implemented in the simulator.
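A sketch of Eqs. 5-6 follows, reading p_w as a unit wind direction paired with a separate wind-force magnitude; this decomposition, and the numeric values, are assumptions made for the example:

    import numpy as np

    def wind_forces(p_m_x, p_m_y, wind_dir, wind_mag):
        """Eqs. 5-6: thrust/drag along the heading axis and a lateral force,
        both proportional to the magnitude of the wind force."""
        f_y = np.dot(p_m_y, wind_dir) * wind_mag  # thrust (>0) or drag (<0)
        f_x = np.dot(p_m_x, wind_dir) * wind_mag  # lateral force
        return f_x, f_y

    # A tail wind aligned with the UAV heading produces pure thrust:
    heading = np.array([0.0, 1.0])
    lateral = np.array([1.0, 0.0])
    print(wind_forces(lateral, heading, wind_dir=np.array([0.0, 1.0]), wind_mag=3.0))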

    D. Simulated Sensory Data

Kelpie provides several sensor models, such as vision cameras, underwater sonars, tilting laser scanners, inertial measurement units (IMU), and global positioning systems (GPS). These models generate sensory data in a compliant format ready to be published in the ROS network (e.g., sensor_msgs::Imu, sensor_msgs::LaserScan, sensor_msgs::NavSatFix, etc.). Therefore, changing from a simulated sensor to its real counterpart is a seamless operation. To account for sensor noise (excluding vision cameras), only a Gaussian model is currently used. Several ROS nodes can then access and process these data in the context of high-level functionalities (e.g., surface obstacle avoidance from laser scanners, bathymetry from underwater sonars, navigation from positioning and attitude sensors). Fig. 7 illustrates the output of the tilting underwater sonar currently available in Kelpie. Sonar and laser beams are simulated using the ray-casting and ray-geometry collision detection techniques available through OSG.

    Fig. 7. Simulated underwater sonar beams.
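As an illustration of the Gaussian noise model, a simulated range scan could be corrupted before publication as in the sketch below (the standard deviation and range values are illustrative, not taken from the paper):

    import numpy as np

    def add_gaussian_noise(ranges, sigma, range_min, range_max):
        """Corrupt ideal ray-cast ranges with zero-mean Gaussian noise and clamp
        the result to the sensor's valid interval."""
        noisy = np.asarray(ranges) + np.random.normal(0.0, sigma, size=len(ranges))
        return np.clip(noisy, range_min, range_max)

    ideal = [4.0, 4.1, 12.0, 2.5]  # metres, straight from ray casting
    print(add_gaussian_noise(ideal, sigma=0.02, range_min=0.1, range_max=30.0))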

IV. APPLICATION CASES

As mentioned, Kelpie is being developed for the RIVERWATCH experiment in the ECHORD European FP7 project. This experiment aims at contributing a multi-robot solution for the remote monitoring of riverine environments (e.g., assessing the water quality, finding pollution sources, and tracking underwater biodiversity). Concretely, one of the robots is an Unmanned Surface Vehicle (USV), whereas the other is an Unmanned Aerial Vehicle (UAV), a hexacopter. These innovative multi-rotor flying platforms have significant advantages over other types of aerial robots, namely the capability of vertical take-off and landing, high stability, and manoeuvrability.

The two robots have complementary properties that can only be fully exploited if coordinated. While the USV is able to assess the water body, the UAV is able to visually inspect the riverbanks. While the USV is able to transport a solar panel to perform energy harvesting, the UAV is not, and so must recharge itself by docking on the USV. The higher speed of the UAV means that the USV can keep moving downriver while the UAV performs its inspection for as long as its energy supply allows. Furthermore, the UAV can help the USV with path planning by providing a better vantage point through its aerial perspective. To speed up the analysis, these robotic pairs can be multiplied, which in turn requires their coordination at both the USV and UAV levels.

To show the usefulness of Kelpie in the context of developing and debugging a project such as RIVERWATCH, the following sections present a set of application cases, namely obstacle detection and avoidance, and USV-UAV cooperative environment perception.

A. Obstacle Detection and Avoidance

Short-range obstacle detection on the USV relies on 3-D data provided by a stereo vision head, a tilting laser scanner, and an underwater tilting sonar. The data produced by these sensors are integrated into a volumetric map, from which a bi-dimensional cost map is produced. Cost is high in the presence of objects on the water's surface and in the presence of shallow bathymetry.
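The projection from the volumetric map to the cost map can be sketched as follows; the grid resolution, thresholds, and cost values are illustrative, since the actual RIVERWATCH mapping pipeline is not detailed in the paper:

    import numpy as np

    def build_cost_map(points, resolution=0.5, extent=50.0,
                       obstacle_height=0.1, safe_depth=1.0):
        """Project 3D points (x, y, z in the USV frame, z = 0 at the water surface)
        onto a 2D grid: cells containing surface obstacles or shallow-bottom
        returns receive maximal cost."""
        size = int(2 * extent / resolution)
        cost = np.zeros((size, size), dtype=np.uint8)
        for x, y, z in points:
            i = int((x + extent) / resolution)
            j = int((y + extent) / resolution)
            if not (0 <= i < size and 0 <= j < size):
                continue
            above_surface = z > obstacle_height      # e.g., buoys, riverbanks
            shallow_bottom = -safe_depth < z <= 0.0  # sonar return too close to the surface
            if above_surface or shallow_bottom:
                cost[i, j] = 255                     # lethal cost
        return cost

    # Example: a buoy at (3, 1, 0.4) and a shallow-bottom return at (8, -2, -0.3).
    print(build_cost_map([(3.0, 1.0, 0.4), (8.0, -2.0, -0.3)]).max())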

Fig. 8 illustrates the volumetric map generated with Kelpie's simulated tilting laser scanner. These data were gathered while the simulated USV was moving on the water's surface. The registration of the range data in a common volumetric map is done using Kelpie's simulated GPS and IMU, filtered with an Extended Kalman Filter (EKF).


    Fig. 8. Set of virtual buoys in (a) and corresponding volumetric map in (b).

Fig. 9 depicts the successful autonomous behaviour of the USV in an environment composed of natural obstacles when asked to move towards a given waypoint. This time the cost map is fed by the tilting laser scanner operating above the water surface. To navigate safely, the USV uses a path planner on top of a local obstacle avoidance algorithm, both running on the cost map. Again, the localisation of the USV is estimated by filtering the simulated GPS and IMU sensors with an EKF.


Fig. 9. Laser-based autonomous navigation among natural obstacles towards a given waypoint. (a) Perspective over the simulated environment. (b) Cost map built online with the tilting laser scanner. Green points correspond to mapped obstacle points. The red lines in (a) and (b) correspond to the path executed by the USV from its initial position to the goal waypoint.

    B. USV-UAV Cooperation

To enable robust and safe navigation in aquatic environments, long-range water/land segmentation of the environment is essential, which is considerably hampered by the low vantage point of the USV. Alternatively, we can use the UAV's high vantage point to extend the perceptual range of the USV and, consequently, enable truly long-range obstacle detection. For instance, compare the perspective of the environment as seen slightly above the water level and as seen from a considerably higher vantage point in Fig. 8(a) and Fig. 10, respectively. These figures are representative of the USV and UAV perspectives over the environment. This USV-UAV cooperation considerably facilitates the hard problem of detecting sand banks and distant shorelines, which are key for safe navigation and difficult to detect from the USV's perspective.

Fig. 10. Aerial perspective of the environment. The simulated UAV is represented in the upper-left corner of the figure.

A key element in this cooperative process is the ability of the UAV to register the images acquired with its downward-looking camera in the USV's frame of reference. One solution to the problem is to detect the USV in the images themselves. Fig. 11 presents a set of images acquired from the simulated UAV's onboard camera while approaching a simulated USV. These images feed a ROS node responsible for actually detecting the USV, based on a computer vision algorithm tuned to detect an H-shaped helipad (see Fig. 11(d)). This experiment relies on the set of ROS nodes abstracting both the USV and the UAV and on ROS messages for their interactions. These protocols are exactly the same as the ones used on the real robotic platforms, thus enabling a seamless transition from simulated to real-world experiments.
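The paper does not detail the helipad detector itself; purely as an illustration, a shape-matching approach over the simulated camera images could look like the sketch below (the thresholds, the reference contour, and all parameter values are assumptions, not the RIVERWATCH algorithm):

    import cv2

    def detect_helipad(bgr_image, template_contour, max_score=0.15):
        """Return the pixel centroid of the contour that best matches a reference
        'H' contour, or None if no candidate is similar enough."""
        gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)  # bright helipad marking
        # OpenCV 4 signature: findContours returns (contours, hierarchy).
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        best, best_score = None, max_score
        for c in contours:
            if cv2.contourArea(c) < 100:  # ignore small speckles
                continue
            score = cv2.matchShapes(c, template_contour, cv2.CONTOURS_MATCH_I1, 0.0)
            if score < best_score:        # lower score means a better shape match
                best, best_score = c, score
        if best is None:
            return None
        m = cv2.moments(best)
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])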

V. CONCLUSIONS AND FUTURE WORK

A ROS-based multi-robot simulator, named Kelpie, specially tuned to enable fast development and debugging of systems composed of cooperating unmanned surface and aerial vehicles, was presented. A set of practical cases in remote environmental monitoring was provided in order to demonstrate the usefulness and reach of the simulator. These cases were extracted from the requirements of the RIVERWATCH experiment included in the EU FP7 ECHORD project.

Kelpie, being freely available and open source, is expected to contribute towards the affordable development of field robotics, in particular in the context of environmental monitoring in aquatic environments. Kelpie is deeply inspired by the Gazebo simulator, though it stands out from it by being able to render


Fig. 11. (a)-(c) Images acquired from the simulated UAV's onboard camera as it approaches the USV. (d) Output of the USV detection and tracking ROS node, given the input image in (c).

marine environments and providing context-related dynamics, such as buoyancy and feedback from environmental forces (e.g., winds and water currents).

Kelpie is under active development. Bugs are being fixed and improved functionality is being included. For instance, Kelpie is being extended to include day-night cycles and dynamic weather changes, which will help assess the robustness of control systems in extreme situations (e.g., assessing the robot's behaviour in contexts where vision-based obstacle detection is hampered by the presence of heavy fog).

An interesting extension would be to include improved wind models, so as to enable the simulation of sailing vessels and of the effects caused by wind on the appearance of water waves; in particular, the addition of visual effects in the water (e.g., the density of white foam crests and airborne spray) according to the Beaufort scale.

To increase the simulator's efficiency in the simulation of marine environments, a migration from the osgOcean library to the faster and novel GPU-based rendering method proposed in [13] is planned.

Finally, improvements to the graphical user interface will be carried out. The interface will provide several features to the user, such as: a drag-and-drop mechanism for online materialisation of ready-to-use models (e.g., deployment of buoys into the virtual environment as aquatic obstacles); point-and-click waypoint creation for navigation; mouse-based selection of models for easy on-screen access to internal variables; and notifications of simulation events using the heads-up display (HUD) system (e.g., successful docking procedure, collision detected, etc.). This opens the door to Kelpie as a front-end for multi-robot teleoperation and diagnostic systems.

ACKNOWLEDGMENTS

This work was developed within the scope of the RIVERWATCH experiment from the European FP7 Large Scale Integrating Project ECHORD, and it was also partially supported by CTS multi-annual funding through the PIDDAC Program funds.

REFERENCES

[1] Arrichiello, F., et al., Cooperative caging using autonomous aquatic surface vehicles. In Proc. of the Intl. Conf. on Robotics and Automation (ICRA), IEEE, 2010.
[2] Bertram, V., Unmanned surface vehicles - a survey. Skibsteknisk Selskab, Copenhagen, Denmark, 2008.
[3] Bhattacharya, S., et al., Cooperative control of autonomous surface vehicles for oil skimming and cleanup. In Proc. of the Intl. Conf. on Robotics and Automation (ICRA), IEEE, 2011.
[4] Boeing, A. and Bräunl, T., SubSim: An autonomous underwater vehicle simulation package. In Proc. of the Intl. Symp. on Autonomous Minirobots for Research and Edutainment, Springer, 2006.
[5] Caccia, M., Bono, R., Bruzzone, G., Spirandelli, E., Veruggio, G., Stortini, A., and Capodaglio, G., Sampling sea surface with SESAMO. IEEE Robotics and Automation Magazine, vol. 12, no. 3, pp. 95-105, 2005.
[6] Choi, S., Yuh, J., and Takashige, G. Y., Development of the Omni Directional Intelligent Navigator. IEEE Robotics and Automation Magazine, vol. 2, no. 1, pp. 44-53, 1995.
[7] Craighead, J., et al., A survey of commercial and open source unmanned vehicle simulators. In Proc. of the Intl. Conf. on Robotics and Automation (ICRA), IEEE, 2007.
[8] DeepWorks [Online]. Available: http://www.fugrogrl.com/software/
[9] Koenig, N. and Howard, A., Design and use paradigms for Gazebo, an open-source multi-robot simulator. In Proc. of the Intl. Conf. on Intelligent Robots and Systems (IROS), vol. 3, pp. 2149-2154, 2004.
[10] Kuroda, Y., Ura, T., and Aramaki, K., Vehicle control architecture for operating multiple vehicles. In Proc. of the Symp. on Autonomous Underwater Vehicle Technology, pp. 323-329, 1994.
[11] Leedekerken, J., Fallon, M., and Leonard, J., Mapping complex marine environments with autonomous surface craft. In Proc. of the Intl. Symp. on Experimental Robotics (ISER), 2010.
[12] Lindemuth, M., Murphy, R., Steimle, E., Armitage, W., Dreger, K., Elliot, T., Hall, M., Kalyadin, D., Kramer, J., Palankar, M., Pratt, K., and Griffin, C., Sea Robot-Assisted Inspection. IEEE Robotics and Automation Magazine, vol. 18, no. 2, pp. 96-107, 2011.
[13] Liu, S. and Xiong, Y., Fast and stable simulation of virtual water scenes with interactions. Virtual Reality, vol. 17, no. 1, pp. 77-88, Springer-Verlag, 2013.
[14] Matsebe, O., Kumile, C., and Tlale, N., A review of virtual simulators for autonomous underwater vehicles (AUVs). NGCUV, Killaloe, Ireland, 2008.
[15] Merino, L., et al., Cooperative fire detection using unmanned aerial vehicles. In Proc. of the Intl. Conf. on Robotics and Automation (ICRA), IEEE, 2005.
[16] Murphy, R., et al., Cooperative use of unmanned sea surface and micro aerial vehicles at Hurricane Wilma. Journal of Field Robotics, 25(3), pp. 164-180, 2008.
[17] Pascoal, A., Oliveira, P., Silvestre, C., Bjerrum, A., Ishoy, A., Pignon, J., Ayela, G., and Petzelt, C., MARIUS: an autonomous underwater vehicle for coastal oceanography. IEEE Robotics and Automation Magazine, pp. 46-59, 1997.
[18] Pinto, E., Santana, P., and Barata, J., On Collaborative Aerial and Surface Robots for Environmental Monitoring of Water Bodies. In Technological Innovation for the Internet of Things, pp. 183-191, Springer-Verlag, 2013.
[19] Rafi, F., et al., Autonomous target following by unmanned aerial vehicles. Defense and Security Symposium, International Society for Optics and Photonics, 2006.
[20] Ridao, P., et al., NEPTUNE: A HIL simulator for multiple UUVs. In Proc. of OCEANS, vol. 1, pp. 524-531, 2004.
[21] Santos, A., WaVeSim - Ambiente de simulação para veículos aquáticos. PhD dissertation, Universidade do Porto, 2006.
[22] Spry, S., Girard, A., and Hedrick, J., Convoy protection using multiple unmanned aerial vehicles: organization and coordination. In Proc. of the American Control Conference, IEEE, 2005.
[23] Thomas, P. and Djapic, V., Improving autonomy and control of autonomous surface vehicles in port protection and mine countermeasure scenarios. Journal of Field Robotics, 27(6), pp. 903-914, 2010.