
    CONTROLO 2006, 7th Portuguese Conference on Automatic Control
    Instituto Superior Técnico, Lisboa, Portugal, September 11-13, 2006

    MODERN COMPUTER GAMES TECHNOLOGY IN SYSTEMS AND CONTROL EDUCATION

    Bruno T. Vigário, António Pessoa de Magalhães and Francisco T. Freitas

    IDMEC - FEUP, Secção de Automação, Instrumentação e Controlo, R. Dr. Roberto Frias, 4200 Porto, PORTUGAL

    Email: {mai04009, a.p.magalhaes, ffreitas}@fe.up.pt

    Abstract: The notion that virtual environments controlled by human operators or external hardware such as PLCs or PID controllers are valuable tools for control education and training is not new. Yet, powerful but low-cost modern computer games technology is enabling very realistic visual simulation of complex 3D virtual systems, in which the consequences of control actions can be observed in real time and from different angles. The paper discusses the development of such virtual systems, showing that whilst modern graphics and physics engines are central to the design of the virtual environments, the immersion of the controller in the synthetic environment is greatly enhanced and simplified when an instrumentation engine abstractly supports the virtual I/O devices and the data exchange between these and the external controller. The specification, design and application of an instrumentation engine are the main themes of the paper.

    Keywords: Computer Simulation, Computer Graphics, Control Education, Virtual Reality, Real-Time Systems, Instrumentation, System Integration.

    1. INTRODUCTION

    Although simulation packages based on numerical techniques have existed since digital computers were introduced, only recently have simulation tools become standard training tools in systems and control education. This is mainly due to the low cost and wide dissemination of personal computers.

    Technology is indeed introducing major changes in the educational process. For example, the potential of Virtual Reality (VR) for many education and training applications is ever increasing due to its key attribute of providing cost-effective access to high-fidelity computer simulations (Kalawsky 2005).

    Flight simulators are classical examples of VR-based systems dedicated to training (Menendez and Bernard 2001), where a trainee pilot using a real cockpit interface interacts with a computer-based system representing an external world. Yet, until recently, man-in-the-loop simulation systems (Innocenti and Pollini 1999) were only common in nuclear plants, aerospace, military, medicine, fire fighting and other training scenarios where risk and cost are major concerns. Currently, 3D VR-based training systems are spreading to industry, services, education and even the arts (Houston 2004).

    Modern 3D computer games are interesting cases of man-in-the-loop simulation systems. As computer games become more sophisticated, players feel increasingly immersed in a synthetic environment where the conventional laws of physics are present and with which they interact in a very natural and intuitive manner. Head-mounted displays, force-feedback (haptic) devices and other specific peripheral technologies are common in man-in-the-loop simulation scenarios.

    Recent advances in the ever-evolving computer game technology are contributing to this deeper immersion (Zyda 2005). Whilst modern GPUs (Graphics Processing Units) are the central hardware support for 3D animation display, graphics and physics engines are the main modern software support for the design of VR scenarios. A graphics engine (Michal 2001) is a library that abstracts the drawing of 3D graphics by gathering events from keyboard, mouse or other input devices and using lower-level methods to render a new scene to the screen. A physics engine (Hecker 2000) is a software resource that supports real-time physics simulation, greatly improving the visual quality of a virtual scenario and making predefined graphics animations obsolete (Rosenbloom 2000).

    Having a system controlled by a hardware device instead of a human operator is a common goal in control engineering. Hardware-in-the-loop, i.e., a virtual computer environment controlled by a real device such as a PLC or a PID controller, reflects this notion and is becoming quite common in control systems education and training. Advanced graphics front ends rendering very sophisticated visual 3D scenarios are not usually mandatory when hardware is in the loop. Yet, high visual performance levels are becoming common, as they provide a very realistic and detailed representation of the impact of control actions upon the controlled system, thus also contributing to a better education on machines and control systems.

    A key element for a successful hardware-in-the-loop application is the real-time data exchange between the synthetic (software) plant and the external (hardware) controller. Data types and streams consistent with real-world data exchange and interfaces are also central to the proper immersion of the controller in the synthetic environment. This is to say that the run-time production and handling of data from the virtual sensors and actuators, as well as the data exchange with the external controller via I/O boards, serial ports, TCP/IP, fieldbusses, data servers and other communication protocols, should be supported by a proper, feasible and easy-to-integrate software system. That is where instrumentation engines come into play. The engineering requirements, design and application of an instrumentation engine are the main topic of this paper, which is organised as follows:

    Section 2 provides a short background for hardware-in-the-loop applications. It presents some cases developed by the authors and informally introduces the notion of an instrumentation engine. Section 3 describes the general engineering requirements of an instrumentation engine, Section 4 briefly describes the design of such an engine and Section 5 presents some preliminary applications. Section 6 concludes the paper, summarising the most important conclusions and pointing out directions for future research and developments.

    2. BACKGROUND

    For nearly a decade the authors of this paper have been working on real-time visual simulation systems for control training of undergraduate engineering students. PLC programming education has until now been the major concern. PLC control of physical electromechanical systems is very interesting and mandatory in mechatronic systems education, but it is necessarily limited to a small number of small devices and applications in any engineering school. Common PLC applications are very expensive, hazardous or simply too large to implement in a didactic lab. Manufacturing, storage, filling, transportation, palletizing and painting are just a few examples of industrial applications to which these constraints apply. Yet, these are very common in industry and must be included in every PLC programming educational curriculum.

    To bridge this gap, more than a dozen 2D visual simulation scenarios were first developed, simulating machining operations, bottle filling, parts sorting systems, palletizing, painting, washing and cross train control, among others. The result was a very flexible software package communicating with a PLC via a digital I/O board.

    Even at the present time, these applications deeply motivate students towards PLC programming (especially those who grew up playing computer games) and, as a bonus, towards graphics simulation, as is usually noticed in similar applications (Rosenbloom 2003). However, 3D scenarios provide a much better representation of the industrial machinery and processes of interest, as the third dimension allows a deeper immersion into the simulated world. A 3D dynamic simulation of a PLC-controlled system may be displayed on a screen in real time, while users walk or fly around looking at a complex application from different angles and distances, appreciating the whole system or important design and control details. As such, naive 2D environments are now evolving into detailed and realistic 3D spaces, where objects spread over a virtual world that includes a floor, walls and everything else that mimics a bounding scenario (Magalhães et al. 2005). A significant modelling effort, and a run-time support that guarantees a real-time, cyclic rendering of a given controlled system consistent with the laws of physics, is the price to pay for this evolution.

    Computer and video games are providing increasingly realistic and lifelike 3D visual environments, thus driving demand for home 3D entertainment. But non-entertainment applications are also becoming more immersive, benefiting greatly from computer games technology such as 3D modelling packages, GPUs, graphics engines, physics engines and, last but not least, architectures and development methods.

    Although cutting-edge development software for computer games has a prohibitive price for non-entertainment applications, it is possible to develop 3D virtual environments for PLC programming education and training using low-cost or even free software packages. The systems represented in Figures 1 and 2 were developed using low-cost solutions. The first one simulates a prototype warehouse, serving educational purposes, which exists at our facilities. The second one is a demonstration of a very large industrial warehouse; it has been used for demonstration and marketing purposes.


    Fig. 1. A 3D Model of a prototype warehouse.

    Fig. 2. A 3D Model of an industrial warehouse.

    Ironically, the hardware-in-the-loop training approach may have its major weakness precisely in the notion that it is possible to acquire knowledge of a system and the corresponding control without being in actual contact with it! It is important to be aware that a necessary condition for a valid hardware-in-the-loop simulation is that the controller senses and acts upon the virtual environment as it would in a real-world application. This means that the virtual environment must somehow mimic the logic and timed behaviour of the (industrial) sensors and actuators that are part of the application, while input and output data are exchanged in real time with the external controller via I/O boards, serial ports, TCP/IP channels, industrial fieldbusses, data servers or other means. Additionally, but less critically, if sensing information is represented on alphanumeric displays, gauges or other visualisation devices existing in the virtual scenario, then the graphics representation of these devices must be rendered accordingly and in real time. These requirements introduce the need for a specific software system for hardware-in-the-loop applications: an instrumentation engine.

    3. REQUIREMENTS FOR AN INSTRUMENTATION ENGINE

    When virtual environments are not too complex, the visual simulation of the controlled system may be based on pre-defined animations. All the 2D simulation scenarios developed by the authors, as well as the warehouse represented in Figure 1, were based on pre-defined animations.

    Physics engines introduce much more realism and credibility to simulations, as they lead to animations whose content is defined on-line according to the actual system state and the laws of physics; e.g., once a ball has been thrown, it will decelerate, fall to the ground and bounce according to the actual conditions of the experiment. Yet, physics-based animations make it more difficult to simulate and generate in real time the data that would be provided by the (virtual) sensors. For instance, if a predefined animation is used at run-time, then it is very easy to recognize when a given proximity sensor is supposed to be activated, issuing a digital signal accordingly and at that time. Physics-based animation introduces the need for an extra processing job aimed at defining the data provided by sensors according to their behavioural rules and the actual state of the interfering objects.

    This post-processing is a job for an instrumentation engine. A second on-line job for this engine is the transmission of the sensing information through the desired communication channels. For instance, if an application includes a (virtual) barcode reader, then the information from this device must be transmitted through a serial port, as this is the standard physical interface for this kind of device. Yet, since this is a programmable device, the instrumentation engine must also handle the configuration information sent to it and change its dynamic behaviour accordingly.

    It is possible that some sensing information is strictly internal, in the sense that there is no need to send it to the controlling device. For instance, an application may contain a gauge displaying the actual pressure of a pneumatic system. In this case, the instrumentation engine must somehow pass the sensing information to the graphics engine so that the position of the gauge pointer is represented accordingly.

    These and other considerations allow us to state the general but real-time requirements for an instrumentation engine (a minimal interface sketch follows the list):

    1. Estimation of the data produced by each virtual sensing device according to its behavioural properties and interfering objects;

    2. Passing the required sensing data to the hardware controller using predefined communication channels and protocols;

    3. Passing to the graphics engine the sensing data whose visual representation is somehow required;

    4. Introduction of state changes on actuators according to data coming from the hardware controller (using the appropriate communication channels and protocols), so that the consequences of control actions may be dynamically evaluated by the physics engine and thus represented by the graphics engine;

    5. Introduction of behavioural changes on the sensing devices as related reconfiguration data is received.
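
    As an illustration of these five jobs, the following C# fragment sketches how a per-frame instrumentation cycle could be organised. It is a minimal, hypothetical outline written for this discussion; all type and member names (IVirtualSensor, IChannel, InstrumentationCycle, and so on) are assumptions for the example, not the interfaces of the engine described later in the paper.

    // Hypothetical per-frame instrumentation cycle covering the five jobs above.
    using System.Collections.Generic;

    public interface IWorldState { }                    // whatever the physics/graphics engines expose

    public interface IGraphicsBridge                    // job 3: visual representation of internal data
    {
        void Show(string sensorName, byte[] data);
    }

    public interface IChannel                           // serial port, TCP/IP socket, I/O board, fieldbus...
    {
        void Write(byte[] data);
        bool TryRead(out byte[] data);
    }

    public interface IVirtualSensor
    {
        string Name { get; }
        bool IsInternal { get; }                        // true if the data only feeds the graphics engine
        byte[] Estimate(IWorldState world);             // job 1
        void Reconfigure(byte[] configuration);         // job 5
    }

    public interface IVirtualActuator
    {
        void Apply(byte[] command);                     // job 4
    }

    public sealed class InstrumentationCycle
    {
        private readonly List<(IVirtualSensor Sensor, IChannel Channel)> sensors =
            new List<(IVirtualSensor, IChannel)>();
        private readonly List<(IVirtualActuator Actuator, IChannel Channel)> actuators =
            new List<(IVirtualActuator, IChannel)>();

        public void Register(IVirtualSensor sensor, IChannel channel) => sensors.Add((sensor, channel));
        public void Register(IVirtualActuator actuator, IChannel channel) => actuators.Add((actuator, channel));

        // Called once per simulation frame, alongside the physics and graphics updates.
        public void Step(IWorldState world, IGraphicsBridge graphics)
        {
            foreach (var (sensor, channel) in sensors)
            {
                byte[] data = sensor.Estimate(world);              // job 1: estimate sensor output
                if (sensor.IsInternal)
                    graphics.Show(sensor.Name, data);              // job 3: pass internal data to the graphics engine
                else
                    channel.Write(data);                           // job 2: send data to the hardware controller
                if (channel.TryRead(out byte[] configuration))
                    sensor.Reconfigure(configuration);             // job 5: apply reconfiguration data
            }
            foreach (var (actuator, channel) in actuators)
            {
                if (channel.TryRead(out byte[] command))
                    actuator.Apply(command);                       // job 4: state changes handed to the physics engine
            }
        }
    }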

    It is worth noting that the usage of an instrumentation engine also has implications for the modelling task of the virtual system. In fact, modelling must consider the included sensors, actuators and possible interacting objects as particular sets of items, so that they can later be efficiently processed in real time by the instrumentation engine. Additionally, for the sake of software development and reusability, sensors, actuators, protocols and communication channels must be organised in libraries and understood as central software objects. Following this approach, an instrumentation engine may greatly assist in the design, configuration, integration and run-time support of the (virtual) instrumentation included in a hardware-in-the-loop application, in exactly the same way that graphics and physics engines aid in the design, programming and real-time execution of visual simulations.

    4. DESIGNING AN INSTRUMENTATION ENGINE

    Reflecting its major real-time jobs, an instrumentation engine can be seen as consisting of a sensing layer and an underlying communication layer. The sensing layer supports the simulation of the behavioural models of sensors and actuators. The communication layer implements the instrumentation data exchange with the external controller and with other software resources, namely the physics and graphics engines.

    The communication layer is the interface between the virtual sensing devices and real-world controllers. It is based on a client/server architecture where the virtual instruments play the role of clients. The server is a collection of abstract devices, each of which provides a cached representation of some memory area in the external controller. The synchronisation with the corresponding physical device is guaranteed by a multithreaded, periodic polling task.

    From the point of view of an abstract device, this interaction is accomplished through one or more channels, which are an abstract representation of a real communication support such as a serial port, a TCP/IP socket, an I/O card, etc. This representation is organised as a simple tree graph, where a node represents a group of points existing in the real device (sharing the same data type) and leafs represent rows of binary data. Instrumentation data can be exchanged with a cached device through a node or a leaf: a leaf allows read and write operations on the binary data it represents, while a node allows read and write operations on a leaf or a group of leafs.
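
    A minimal C# sketch of this channel representation is given below. It assumes a simple node/leaf class hierarchy and a cached device whose synchronisation task is only stubbed out; it illustrates the idea rather than reproducing the authors' implementation.

    // Illustrative node/leaf tree for a channel (not the authors' implementation).
    using System.Collections.Generic;

    public abstract class ChannelItem
    {
        public string Name { get; }
        protected ChannelItem(string name) => Name = name;
    }

    // A leaf is a row of binary data that can be read or written directly.
    public sealed class ChannelLeaf : ChannelItem
    {
        public byte[] Data { get; set; }
        public ChannelLeaf(string name, int sizeInBytes) : base(name) => Data = new byte[sizeInBytes];
    }

    // A node groups points of the same data type; reads and writes are forwarded to its leafs.
    public sealed class ChannelNode : ChannelItem
    {
        public List<ChannelItem> Children { get; } = new List<ChannelItem>();
        public ChannelNode(string name) : base(name) { }
    }

    // A cached abstract device: the tree mirrors a memory area of the external controller.
    public sealed class CachedDevice
    {
        public ChannelNode Root { get; } = new ChannelNode("root");

        // In the engine described above this would run in its own thread at a fixed period,
        // polling the physical channel (serial port, TCP/IP socket, I/O card, ...) and
        // refreshing the leaf buffers; here it is only a stub.
        public void Synchronise() { }
    }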

    The sensing layer simulates the behaviour of sensing devices by producing data accordingly and passing it to the communication layer. Conversely, it introduces state changes on virtual actuators according to the data it gets from the communication layer.

    It is important to note that virtual sensing does not mean simulating the specific physical principle on which a transducer is based. It just means real-time production of data as it would be output by a sensing device, according to its behavioural properties and the state of its environment.

    This means that the inclusion of virtual sensing devices in virtual environments driven by physics and graphics engines may be surprisingly simple. This is because most information about the state of the moveable objects, such as position, orientation, velocity, acceleration, mass, inertial force or torque, is already handled by the physics or graphics engine. Thus, if the actual acceleration of an object must be passed to an external controller, then the sensing layer just needs to request it from the physics engine, format it accordingly (e.g., convert a double-precision floating-point variable to a 16-bit integer) and pass it to the communication layer. There, it can be passed to a D/A converter so that analogue information is transmitted to the controller or, alternatively, the communication layer may take the arriving data word and convert it to an ASCII string to be transmitted via a serial port according to a given protocol, just as a remote telemetry unit would.
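
    The following sketch illustrates this formatting step, assuming an arbitrary full-scale range of 50 m/s^2 and a signed 16-bit target word; the resulting bytes are what the sensing layer would hand to the communication layer.

    // Hypothetical formatting of an acceleration value for the external controller.
    using System;

    public static class AccelerationSensor
    {
        // Map an acceleration in m/s^2 onto a signed 16-bit word; the 50 m/s^2 full scale
        // is an arbitrary choice for the example.
        public static short ToControllerWord(double acceleration, double fullScale = 50.0)
        {
            double clipped = Math.Max(-fullScale, Math.Min(fullScale, acceleration));
            return (short)Math.Round(clipped / fullScale * short.MaxValue);
        }

        // Bytes as the communication layer would receive them, ready to be written to a
        // D/A card register or wrapped into an ASCII telemetry frame for a serial port.
        public static byte[] ToWireBytes(double acceleration) =>
            BitConverter.GetBytes(ToControllerWord(acceleration));
    }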

    Ironically, whilst a sophisticated GPS or a laser displacement measurement system is trivial to simulate, the very common micro switches, optical sensors, variable reluctance and Hall-effect proximity sensors are not. Several approaches can support the simulation of the latter sensing devices, ranging from very simple mathematical principles to much more complex algorithms based on collision detection (Matthias et al. 2003).

    Fig. 3. Acting on a micro switch, as viewed by the physics, graphics and instrumentation engines.

    The distance between two points is a very simple and interesting approach for simulating the data provided by different kinds of mechanical and electronic switches. This is illustrated in Figure 3. The three frames in the top image show, from left to right, the representation of the open switch for the physics engine, the graphics engine and the instrumentation engine, respectively. Points A and B are, respectively, the pivots of the movable and fixed parts of the switch. If an object acts on the switch as shown in the bottom frame set, its moving parts change their positions as computed by the physics engine. This leads to a new graphics representation, where points A and B are now closer to each other. Therefore, knowing the distance from A to B is sufficient to simulate the output signal of the micro switch according to a transfer function of interest.

    This fairly simple technique can also be used to simulate displacement transducers such as mechanical dial test indicators, LVDTs, linear potentiometers or dynamometers. Rotational sensing devices can be similarly simulated by computing the angle formed by two lines.
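
    A minimal sketch of this technique is shown below, assuming that the pivot positions are supplied by the physics engine every frame and using a simple distance threshold as the transfer function.

    // Micro switch simulated from the distance between its two pivots (see Fig. 3).
    using System.Numerics;

    public sealed class MicroSwitch
    {
        private readonly float closedDistance;   // distance from A to B below which the contact is closed

        public MicroSwitch(float closedDistance) => this.closedDistance = closedDistance;

        // Pivot A belongs to the movable part of the switch, pivot B to the fixed part;
        // both positions are queried from the physics engine every frame.
        public bool Output(Vector3 pivotA, Vector3 pivotB) =>
            Vector3.Distance(pivotA, pivotB) <= closedDistance;
    }

    // The same distance can feed an analogue transfer function to mimic a dial test indicator,
    // an LVDT or a linear potentiometer; the angle between two lines plays the same role for
    // rotational devices.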

    Detection of moveable objects may also be accomplished by measuring the distance between two points, but this is sometimes tricky. For instance, for variable reluctance and Hall-effect sensing devices the distance between two points is important, but so is the material of the interfering objects. It is also important to measure the distance of interest along a given orientation: a variable reluctance sensor detects a ferrous object if it is in front of it, but not if it is behind it!

    Typically, materials are seen as object parameters that have some visual impact. In a broader sense, materials are defined according to a given set of properties of the objects in a given context. In particular, a material may represent the colour of an object, its transparency, the combination of both, the substance of which the object is made, or the last code written on an RFID (transponder) device. Materials are fundamental for defining the output data of a sensor when it senses a particular object.

    Figure 4 shows an interesting material: a barcode. Only barcode readers detect objects having this material. Yet, to read this code, a barcode reader must first be configured accordingly. Configuration data, i.e., information that introduces behavioural changes on sensors, comes from the hardware controller, is handled by the communication layer, and is then passed to the sensing layer.

    Fig. 4. A barcode as a material.
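
    One possible way of representing materials as property bags, and of letting only a (virtual) barcode reader react to a Barcode property, is sketched below; the property name, the default symbology and the reader behaviour are assumptions made for the example.

    // A material as a bag of properties; only barcode readers react to the "Barcode" entry.
    using System.Collections.Generic;

    public sealed class Material
    {
        private readonly Dictionary<string, object> properties = new Dictionary<string, object>();

        public void Set(string key, object value) => properties[key] = value;

        public bool TryGet<T>(string key, out T value)
        {
            if (properties.TryGetValue(key, out object raw) && raw is T typed)
            {
                value = typed;
                return true;
            }
            value = default(T);
            return false;
        }
    }

    public sealed class VirtualBarcodeReader
    {
        private string expectedSymbology = "EAN-13";   // changed by configuration frames from the controller

        public void Configure(string symbology) => expectedSymbology = symbology;

        // Returns the code carried by the object's material, or null when the object has no
        // barcode material (i.e. the reader simply does not "see" it).
        public string Read(Material material) =>
            material.TryGet("Barcode", out string code) ? expectedSymbology + ":" + code : null;
    }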

    Ray-cast based techniques are an interesting alternative to the elementary distance-between-two-points approach. The technique is to find whether a ray starting at a sensor, and oriented as the sensor is, intersects an object within a given distance. Yet, the geometry of an object can make it very difficult to detect rigorously with a proximity sensor, even one based on ray casting. This is because casting rays against the exact geometry tends to be time consuming, and thus inefficient in real-time applications, so rays are usually tested against bounding volumes, which becomes inaccurate when the geometry of the interfering object differs a lot from its bounding box.
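
    A minimal sketch of such a ray-cast proximity sensor is given below; the IRayCaster interface is a placeholder for whatever ray query the underlying physics engine provides, not the Newton engine's API.

    // Ray-cast proximity sensor: detection only along the sensor's own facing direction.
    using System.Numerics;

    public interface IRayCaster
    {
        // True if a ray from origin along direction hits an object within maxDistance.
        bool Cast(Vector3 origin, Vector3 direction, float maxDistance, out float hitDistance);
    }

    public sealed class RayCastProximitySensor
    {
        private readonly IRayCaster world;
        private readonly float range;

        public RayCastProximitySensor(IRayCaster world, float range)
        {
            this.world = world;
            this.range = range;
        }

        // Orientation matters: a variable reluctance sensor, for instance, only detects a
        // ferrous object placed in front of it, never one behind it.
        public bool Detects(Vector3 position, Vector3 facing) =>
            world.Cast(position, Vector3.Normalize(facing), range, out _);
    }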

    Fig. 5 shows an example where the bounding volumes do not provide a fair representation of the target object, as the ferrous teeth (a main component of the measuring system) are completely eliminated when the geometry of the object is reduced to its bounding box. In a case like this, a polygon collision detection algorithm is required. Various approaches based on distance fields and spatial partitioning algorithms, as well as third-party collision detection libraries, are presently being analysed (Bærentzen and Aanæs 2001). Once again, technology, algorithms and techniques used in computer games are guiding the research (Tremblay 2004).

    Fig. 5. Example of an application where object detection based on ray casting is not possible.

    5. PRELIMINARY APPLICATIONS

    An instrumentation engine based on the design concepts presented in the previous section has already been implemented. C# was used for the development. This instrumentation engine was integrated with the Axiom graphics engine and the Newton physics engine, thus providing a powerful but low-cost run-time environment. Preliminary experiments aimed at obtaining a qualitative figure of merit for the instrumentation engine are in progress.

    The developed scenario is basically a transportation system, consisting of two roller conveyors and a transfer table (Fig. 6). Using a PLC as the external controller, several objects transported on pallets (Fig. 7) must be routed from the first conveyor to the second via the transfer table, which is driven by an electrical motor. A micro switch is activated when a pallet reaches the transfer table.

    Using this scenario, it was possible to verify that a micro switch is indeed perfectly simulated using the distance-between-two-points approach. Yet, also as expected, detecting the objects on the pallets using a proximity sensor based on ray casting is far from rigorous, except for the case of the box, as bounding boxes do not closely match the geometry of the transported objects.

    Fig. 6. A view of the experimental virtual controlled system.

    Fig. 7. Another view of the experimental virtual controlled system.

    A barcode reader (Omron V550-A20HD-X) was also simulated, together with its communication protocol. In this case, the importance of the communication layer was demonstrated, as it greatly contributed to an easier and faster development; additionally, it provided valuable run-time support to the application, namely in error handling and data exchange with the controller.

    In conclusion, preliminary applications are showing that the design principles stated in the paper are promising and easy to integrate with graphics and physics engines. Evaluation testing with a target group of students and relevant simulation scenarios will be conducted in the near future.

    6. CONCLUSIONS

    Virtual environments simulating processes that are not possible to implement in an engineering school laboratory are becoming frequent in systems and control courses. The paper presented the authors' experience in designing these training systems, showing that computer games technologies are enabling increasingly realistic applications. Visual simulations are now supported by graphics and physics engines, while designers are adopting development methods and software architectures similar to those found in computer game applications.

    However, computer games differ considerably from hardware-in-the-loop applications, as the immersion of the physical controller in the synthetic environment is a major concern in the latter case. An instrumentation engine is the software resource required to bridge this gap. Its main mission is the on-line, real-time estimation of the data produced by each virtual sensing device, as well as the input and output data exchange with the hardware controller according to predefined communication channels and protocols.

    The preliminary design and application of an instrumentation engine were discussed in the paper. It became clear that development methods for computer games and the associated technologies are central to the development of instrumentation engines, which are definitely a very valuable software resource. Additional developments concerning a wider class of sensors and communication protocols are under way, while a modern framework intended to aid the management and incorporation of sensors during the modelling job is also in progress.

    REFERENCES

    Bærentzen, J. A. and Aanæs, H. (2001). Generating Signed Distance Fields From Triangle Meshes. IMM Technical Report 2002-21.

    Hecker, C. (2000). Physics in Computer Games. Communications of the ACM, Vol. 43, No. 7, pp. 34-39, July 2000.

    Houston, M. (2004). Visualizing Dynamic Architectural Environments. Communications of the ACM, Vol. 47, No. 8, pp. 54-59, August 2004.

    Innocenti, M. and Pollini, L. (1999). A Synthetic Environment for Simulation and Visualization of Dynamic Systems. Proc. American Control Conference 1999, pp. 1769-1774, 1999.

    Kalawsky, R. S. (2005). Exploiting Virtual Reality Techniques in Education and Training. www.agocg.ac.uk/reports/virtual/vrtech/title.htm

    Magalhães, Vigário and Freitas (2005). 3D Virtual Environments for PLC Programming Education and Training. Proc. of the 2005 European Simulation and Modelling Conference, pp. 349-353, October 2005.

    Matthias, T. et al. (2003). Optimized Spatial Hashing for Collision Detection of Deformable Objects. Vision, Modeling, and Visualization, 2003.

    Menendez, R. and Bernard, J. (2001). Flight Simulation in Synthetic Environments. IEEE AESS Magazine, pp. 19-23, September 2001.

    Michal, V. (2001). 3D Engines in Games - Introduction. www.dimension3.sk, April 2001.

    Rosenbloom, A. (2000). Physically Based Computer Animations. Communications of the ACM, Vol. 43, No. 7, pp. 30-33, July 2000.

    Rosenbloom, A. (2003). A Game Experience in Every Application. Communications of the ACM, Vol. 46, No. 7, pp. 28-32, July 2003.

    Tremblay, C. (2004). Mathematics for Game Developers. Premier Press.

    Zyda, M. (2005). From Visual Simulation to Virtual Reality to Games. IEEE Computer, pp. 25-32, September 2005.