Grey Box Testing


Transcript of Grey Box Testing

Page 1: Grey Box Testing

International Conference On Software Testing Analysis & Review

May 1-5, 2000, Orlando, FL, USA

P R E S E N T A T I O N

Wednesday, May 3, 2000, 3:15 PM

GRAYBOX SW TESTING IN THE REAL WORLD IN REAL-TIME

André Coulter, Lockheed Martin

W14


Page 2: Grey Box Testing

Lockheed Martin – Missiles and Fire Control - Orlando

Graybox Testing

Graybox Software Testing in the Real World in Real-Time

André Coulter

Page 3: Grey Box Testing


Graybox – Data Capture

• The Graybox methodology involves software/hardware testing using a-priori system behavioral modeling data to determine the expected results of the system.

• Given a system with a known state and input data, it should be possible to predict the next system state and its output data.

• A system with 5 modes (off, standby, norm-op, test, calibrate) can be tested to verify proper state transitions (a minimal predictor is sketched below).
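The slides stop short of code, so the following Python fragment is only an illustration of the a-priori idea; all names are invented here, not taken from the Graybox toolset. A behavioral model maps a known state and input to the predicted next state and output, so expected results exist before the test runs.

```python
# Behavioral model: (current_state, input_command) -> (next_state, output).
# The states and commands below are illustrative placeholders.
MODEL = {
    ("off", "power_on"): ("standby", "ACK_POWER_ON"),
    ("standby", "select_test"): ("test", "ACK_TEST_MODE"),
    ("test", "go_standby"): ("standby", "ACK_STANDBY"),
}

def predict(state, command):
    """Return the expected (next_state, output) for a state/input pair;
    unknown pairs leave the state unchanged and are flagged rejected."""
    return MODEL.get((state, command), (state, "REJECTED"))

# The expected result for a test case is computed before the test is run:
assert predict("off", "power_on") == ("standby", "ACK_POWER_ON")
```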

Page 4: Grey Box Testing


Graybox Testing – AutoTester

[Diagram: the Automated Module Tester drives the Module Under Test (MUT) through a Test Harness.]

• Test harness created from the MUT's specification

• Software Test Expert domain knowledge is captured
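The slides do not show what a generated harness looks like, so the following is only a hedged sketch of spec-driven driver generation; the SPEC layout and the generate_driver helper are hypothetical, not the AutoTester's actual format.

```python
# A hypothetical module specification: name, typed inputs, output type.
SPEC = {
    "module": "compute_mode",
    "inputs": [("command", "int")],
    "output": "int",
}

def generate_driver(spec):
    """Emit Python source for a trivial test driver around the MUT."""
    args = ", ".join(name for name, _ in spec["inputs"])
    return (
        f"def drive({args}, expected):\n"
        f"    actual = {spec['module']}({args})\n"
        f"    return 'PASS' if actual == expected else 'FAIL'\n"
    )

print(generate_driver(SPEC))   # the emitted driver wraps the MUT call
```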

Page 5: Grey Box Testing


Graybox Testing - Modal

• The valid state transitions are:
– Off => Standby

– Standby => Norm-Op, Test, Calibrate, Off

– Norm-Op, Test, Calibrate => Standby

• The System in the Off mode can only transition to the Standby mode.

• The System in any mode must first transition to the Standby mode before going to a new mode (the full transition table is sketched below).
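A minimal sketch of the transition rules above, encoded as a lookup table; Python is used here purely for illustration, and a commanded transition not in the table is rejected, leaving the mode unchanged.

```python
# Valid transitions taken directly from the slide.
VALID = {
    "Off": {"Standby"},
    "Standby": {"Norm-Op", "Test", "Calibrate", "Off"},
    "Norm-Op": {"Standby"},
    "Test": {"Standby"},
    "Calibrate": {"Standby"},
}

def transition(mode, commanded):
    """Return the new mode, or the old mode if the command is illegal."""
    return commanded if commanded in VALID[mode] else mode

assert transition("Off", "Standby") == "Standby"
assert transition("Off", "Test") == "Off"     # must go through Standby first
assert transition("Test", "Off") == "Test"    # illegal: stays in Test
```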

Page 6: Grey Box Testing


Graybox Testing – Module_Tester

[Diagram: the Module_Tester reads inputs and expected results from the MTIF and passes them to a Module_Driver, which sends inputs to the Module_Under_Test and collects its outputs into the MTOF.]

• Software under test executes under the control of a Test Executive

• Actuals are compared to expected results and scored

• MTIF/MTOF: Module Test input/output files
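A hedged sketch of the Module_Tester loop follows. The slides only name the MTIF/MTOF files; the layout assumed below (one "input expected" pair per MTIF line, scored results written to the MTOF) is an illustration, not the real format.

```python
def module_under_test(x: float) -> float:
    """Stand-in for the real MUT, used so the sketch runs."""
    return 2.0 * x

def run_module_test(mtif_path: str, mtof_path: str) -> None:
    """Drive the MUT from MTIF input/expected pairs; score into the MTOF."""
    with open(mtif_path) as mtif, open(mtof_path, "w") as mtof:
        for line in mtif:
            inp, expected = (float(tok) for tok in line.split())
            actual = module_under_test(inp)        # driver invokes the MUT
            score = "PASS" if abs(actual - expected) < 1e-6 else "FAIL"
            mtof.write(f"{inp} {expected} {actual} {score}\n")
```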

Page 7: Grey Box Testing


Graybox – System Moding

• Discrete state data
– Off Mode (0)
– Standby Mode (1)
– Normal Op Mode (2)
– Test Mode (3)
– Calibrate Mode (4)

• System transitions to a commanded mode and remains in that mode until a new mode command is received.

[Figure: discrete state value (0-4) plotted against time]

Page 8: Grey Box Testing


Graybox Testing - Modal

• A Graybox test consists of setting up an input condition and verifying the system correctly responded.

• During a modal test, the system will normally remain in the commanded mode; therefore time is not a factor in testing.

Page 9: Grey Box Testing


Graybox – Modal Test

• Modal Test
– Verify state = Off
– Transmit state = Standby
– Verify state = Standby
– Transmit state = Test
– Verify state = Test
– Transmit state = Off
– Verify state = Test (the illegal Test => Off command is rejected, so the system remains in Test; see the sketch below)
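The sequence above, written as a runnable sketch. The transmit/verify calls stand in for real system messages, and the transition table repeats the earlier sketch so this example is self-contained.

```python
# Valid transitions from the earlier slide.
VALID = {
    "Off": {"Standby"},
    "Standby": {"Norm-Op", "Test", "Calibrate", "Off"},
    "Norm-Op": {"Standby"},
    "Test": {"Standby"},
    "Calibrate": {"Standby"},
}

state = "Off"

def transmit(commanded):
    """Send a mode command; illegal commands leave the mode unchanged."""
    global state
    if commanded in VALID[state]:
        state = commanded

def verify(expected):
    assert state == expected, f"expected {expected}, got {state}"

verify("Off")
transmit("Standby"); verify("Standby")
transmit("Test");    verify("Test")
transmit("Off");     verify("Test")   # Test => Off rejected; stays in Test
print("modal test passed")
```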

Page 10: Grey Box Testing


Graybox Testing – Non Real Time vs Real Time

• During Modal testing, time is not important.

• During Operational Data testing, time is important. Data is constantly changing over time.

• Operational data captured during normal system operation requires a definite sense of time. A radar system tracking targets must calculate the aircraft (a/c) position in real time.

Page 11: Grey Box Testing


Graybox Testing – Non Real Time

• The original concept of Graybox testing provided a method to verify embedded software during the Code and Unit Test phases of a project.

• The methodology utilized no concept of time during testing since the software was executing in a debugger.

• To expand the Graybox methodology into the formal test arena a definite concept of time was needed.

Page 12: Grey Box Testing


Graybox Testing – Time Domain

• Time Domain
– Non Real-Time (NT): passage of time is not important; event driven system.
– Near Real-Time (NRT): passage of time is important but simulated by a frame clock. Used when performing Real Time simulations.
– Real-Time (RT): passage of time is important. Time is real. Execution of the actual system.
– Soft Real-Time (SRT), Hard Real-Time (HRT)

Page 13: Grey Box Testing


Graybox Testing – Real Time

• Real Time
– Local Time
  • Internal CPU clock time
  • Local Frame Time (1 tic = 100 ms)
  • Object Level Time (OLT) starts at the object's birth!
– Master System clock time
  • ZULU time
  • IRIG time
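A small sketch of the local timing references named above. TimedObject and its methods are illustrative names only; the slide's 1 tic = 100 ms local frame time is assumed.

```python
import time

TIC_SECONDS = 0.100                     # local frame time: 1 tic = 100 ms

class TimedObject:
    """Object Level Time (OLT) is measured from the object's birth."""

    def __init__(self):
        self.birth = time.monotonic()   # the clock only anchors the birth

    def olt_seconds(self) -> float:
        return time.monotonic() - self.birth

    def olt_tics(self) -> int:
        return int(self.olt_seconds() / TIC_SECONDS)

obj = TimedObject()
time.sleep(0.25)
print(obj.olt_tics())                   # roughly 2 tics after 250 ms
```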

Page 14: Grey Box Testing


Graybox Testing – Operational Data

• Operational Data is collected and time stamped at the object level.

• Object level time stamping involves dating an object based on the amount of time the object has been alive.

• Data extraction and verification can now be based on the lifetime of the object vs actual wall clock time.

• Object Level Time stamping allows tests to be repeated in real time.

Page 15: Grey Box Testing


Graybox Testing – Object Level Time

• Object Level Time in the Graybox sense is fuzzy time.

• To verify an object has the correct value at the correct time, the object is first requested to return its lifetime value, then its data value, followed by the lifetime value again.

• The object is then tested to verify that the returned value is valid somewhere between the starting OLT and the ending OLT (a sketch of this bracketing read follows).
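A sketch of this bracketing read and fuzzy check. The obj.olt() and obj.value() calls are assumed method names, expected_at_olt is a hypothetical lookup of the a-priori value at a given OLT tic, and OLTs are assumed to be whole tics.

```python
def bracketed_read(obj):
    """Read OLT, then the value, then OLT again; the value's true
    timestamp is only known to lie somewhere inside the bracket."""
    start_olt = obj.olt()
    value = obj.value()
    end_olt = obj.olt()
    return start_olt, value, end_olt

def fuzzy_verify(start_olt, value, end_olt, expected_at_olt):
    """Pass if the value matches the expectation at ANY tic in the
    bracket; 'fuzzy time' means the read cannot be pinned to one instant."""
    return any(expected_at_olt(t) == value
               for t in range(start_olt, end_olt + 1))

olt_table = {3: 0.707, 4: 0.866, 5: 1.0}
print(fuzzy_verify(3, 0.866, 5, olt_table.get))   # True: matches at OLT 4
```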

Page 16: Grey Box Testing


Graybox Testing in Real Time

• Capturing data in real time requires a knowledge of time.

• Each data point must be time stamped.

• Each data point must be compared to the expected data point in the time domain.

[Figure: Sine of X over Time; Sin(X) plotted on a scale from -1 to 1]

Page 17: Grey Box Testing


Graybox Testing in Real Time

• System's a-priori data is held in a table (below)
• Captured Data:
– Object Start_Lifetime = 3
– Object Value = 0.866
– Object Ending_Lifetime = 5
• Data is validated if the Object Value falls within the Beginning and Ending OLT times.
• All three values are acceptable: (0.7071, 0.866, 1.0)

FrmTm  OLT  Value
170    7    0.707
160    6    0.866
150    5    1.0
140    4    0.866
130    3    0.707
120    2    0.5
110    1    0
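Validating the captured triple against the table can be sketched as below; this is a Python illustration, with the APRIORI dictionary mirroring the OLT/Value columns of the table above.

```python
# A-priori expected values indexed by OLT tic, from the slide's table.
APRIORI = {1: 0.0, 2: 0.5, 3: 0.707, 4: 0.866, 5: 1.0, 6: 0.866, 7: 0.707}

def validate(start_olt, value, end_olt, tol=1e-3):
    """Pass if the captured value appears anywhere in the OLT bracket."""
    acceptable = [APRIORI[t] for t in range(start_olt, end_olt + 1)]
    return any(abs(value - v) <= tol for v in acceptable)

# Captured data from the slide: start=3, value=0.866, end=5.
assert validate(3, 0.866, 5)       # 0.707, 0.866, 1.0 are all acceptable
assert not validate(3, 0.5, 5)     # 0.5 lies outside the OLT window
```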

Page 18: Grey Box Testing


Graybox Testing - ToolKit

• Identify a module to be tested
• Identify the filename of the module
• Instrument the module for coverage
• Stub incomplete modules
• Generate a test driver
• Execute the module in the tester
• View results after execution
• Generate a test case file
• Generate a regression test file

Testing as easy as point and click.

Page 19: Grey Box Testing

Graybox Software Testing in the Real World in Real-Time
André C. Coulter

Lockheed Martin Missiles and Fire Control - Orlando

Abstract

The Graybox Testing Methodology is a software testing method used to test embedded systems. The methodology is platform and language independent. The current implementation of the Graybox methodology is heavily dependent on the use of a host platform debugger to execute and validate the software under test. Recent studies have confirmed that the Graybox method can be applied in real-time using software executing on the target platform. This now expands the capabilities of the Graybox method to include not only path coverage verification but also worst-case / best-case path timing. The Graybox toolset can now be called upon to perform and verify performance requirements. It is now possible to verify/validate CSCI/CSU/Module/Path timing, functional and structural requirements in a single test case with the same format used to verify/validate functional requirements. The Graybox methodology is a full life-cycle testing methodology that enables software developers to test embedded systems in non real-time or in real-time. This paper will present the Graybox methodology and how this method has been applied in a real-time environment to validate mission critical embedded software systems.

1.0 Introduction

The computer industry is changing at a very rapid pace. In just a few short years we have seen the industry move from one where access to a computer was limited to a select few individuals in white lab coats to one where almost every employee in an office complex has a computer on their desktop. This explosive trend has been fueled by both hardware and software advances. Computers are becoming an indispensable item for every product being developed.

In the aerospace industry, computers and their corresponding software control every aspect of a product from its concept and design, to its development, manufacture, and test. "Software as a product item refers to any computer program, whether it is an operating system, a single subroutine, or a microprogram". [Spencer85] In order to keep pace with a rapidly changing computer industry, software testing must develop methods to verify and validate software for all aspects of a product lifecycle. Software engineers must have tools that will enable the rapid and accurate testing of both functional and performance requirements.

2.0 Scope

Graybox software testing in real-time seeks to provide a method of testing software that will be both easy to implement and easy to understand, using Commercial Off The Shelf (COTS) software and hardware. The Graybox methodology is defined as "Blackbox testing + Whitebox testing + Regression testing + Mutation testing". [Coulter99] The modified Graybox method will address real-time performance testing.

The original Graybox methodology did not address real-time or system level testing. The major concentration of the methodology was to provide a system that allowed test engineers to create ample test suites that thoroughly tested a software product. The emphasis of this paper will be to include real-time performance verification and validation in the methodology without altering the intent of a Graybox test. As "recently as 1998, in one Fortune 100 company, performance testing was conducted while one test engineer sat with a stopwatch, timing the functionality that another test engineer was executing manually". [Elfriede99]

Page 20: Grey Box Testing

The Graybox method enabled the engineer to state the inputs and the expected results in terms of system requirements. The engineer can state the system performance requirements along with the functional and structural requirements to form a complete and thorough test of the deliverable system. "NASA stated that on-board software reliability must be such that, during the entire shuttle program, not one vehicle is lost or damaged beyond repair because of software problems". [Jelinski72] In order to achieve this level of reliability the software must be thoroughly tested at both the functional and performance levels.

Applying performance testing earlier in the project testing lifecycle will give confidence that the correct product will be delivered and operates correctly at the performance stated in the requirements. Testing of performance characteristics must be moved closer to the beginning of the development cycle. "If equal emphasis was given up front to testing, then I believe that good code would follow, because the things that we recognize as good in coding are the same things that contribute to easy testing". [Beizer84]

3.0 Computer Systems

Computer systems can be placed in one of many categories. The categories are normally based on the type of processing required and the speed at which the user expects a response. Batch or non real-time systems require a minimal amount of human intervention and are heavily dependent on the amount of time spent determining the answer to a problem. As the requirement for human interaction increases or the amount of time spent on solving a problem is reduced, computer systems move into a realm of real-time processing. "Man machine interactive analysis and processing are preferred to batch processing for more efficient analysis". [Hanaki81]

As the systems engineer moves into the real-time realm, new and interesting problems arise. The computer system must not only produce the correct answer, but also must meet strict timing constraints that do not exist in a batch-oriented non real-time system. "Real-Time processing implies dedicated, special-purpose computers often running many processing elements in parallel". [Onoe81]

This multi-tasking, multi-computer environment presents special challenges when testing computer software that must meet specific timing requirements in this hybrid environment. "A single CPU cannot afford to deal with various kinds of image data or various processing phases in an interactive multi-user environment". [Isaka81] The original Graybox technique requires the stopping and starting of the computer system to verify internal code coverage and functional requirements. "It is usually desirable that a real-time system should operate without stopping". [Martin67]

4.0 Graybox Software Testing

The Graybox methodology is a ten step process for testing embedded computer software (refer to Table 1. Ten Step Graybox Methodology). The methodology starts by identifying all the input and output requirements to a computer system. This information is captured in the software requirements documentation.

Page 21: Grey Box Testing

Table 1. Ten Step Graybox Methodology

Step  Description
1     Identify Inputs
2     Identify Outputs
3     Identify Major Paths
4     Identify Sub-function (SF) X
5     Develop Inputs for SF X
6     Develop Outputs for SF X
7     Execute Test Case for SF X
8     Verify Correct Result for SF X
9     Repeat Steps 4-8 for other SF
10    Repeat Steps 7 & 8 for Regression

The Graybox methodology utilizes automated software testing tools to facilitate the generation of test unique software. Module drivers and stubs are created by the toolset to relieve the software test engineer from having to manually generate this code. The toolset also verifies code coverage by instrumenting the test code. "Instrumentation tools help with the insertion of instrumentation code without incurring the bugs that would occur from manual instrumentation". [Beizer84]

By operating in a debugger or target emulator, the Graybox toolset controlled the operation of the test software. The Graybox methodology has since moved out of the debugger into the real world and into real-time. The methodology can be applied in real-time by modifying the basic premise: inputs are sent to the test software via normal system messages and outputs are then verified using the system output messages.

In real-time, data must be captured, logged, time-tagged and then compared to a-priori data generated by a simulation that has knowledge of timed data. "The timed entry call is one place where an upper bound is placed on the time duration for some action to occur". [Volz88] Specifying the time at which an event must occur and verifying that the output is correct with respect to time verifies the performance requirement.

The software test system must have some concept of time. There are several timing systems that are available in a real-time system. The test engineer could use the CPU clock as a timing source. The frame counter is another source for timing data. The last alternative is an off-CPU chip system such as an IRIG timer. Once a timing system is adopted, the data must be time tagged and compared to the expected results in the time domain.

An alternative to any absolute timing generated by an on-board or off-board system would be an Object Level Timing (OLT) system. OLT involves generating a relative time from the start of an object's existence. Using an OLT reference allows repeatability in the testing of real-time systems. OLT is "a methodology for the statement of timing requirements". [Coolahan88]

5.0 System Simulations

Automated test case generation has always been the desire of software test engineers. "There is no simple way to automatically generate test cases". [Beizer83] The best and most efficient method of software test case generation can be realized by inserting specialized test data capture code into an existing system simulation.

Page 22: Grey Box Testing

In the aerospace industry, specialized computer simulations are generated to develop algorithms and software requirements. Existing simulations can be interactive or batch systems, but the timing information is based on a frame counter that is used to simulate time. Generating test case data from the simulation gives the test engineer access to precise output data timed to a specific event. The "speed and accuracy of performing various tasks" can be verified by analyzing the time tagged output data. [Spencer85]

Any product developed must meet two sets of expectations. The first expectation is that of the developing company. If a company has internal standards that must be met, these expectations will be tested by the test organization long before the customer ever sees the delivered product. The other set of expectations belongs to the customer. "The successful product must be satisfactory both to the user/customer and to the producing company". [Spencer85]

Performance requirements must be stated in quantitative terms. The output of a message must be generated before some elapsed time interval based on a measurable/observable event. The measurable event can be the receipt of an input message or the transition to a particular state. Performance requirements stated in terms of events and interval timing can be verified by observing the OLT of a specific message object. "Performance measures and requirements are quantitative – that is, numbers. A performance specification consists of a set of specified numbers. The system's actual performance is measured and compared to the specification". [Beizer84]

Care must be exercised in the generation of test case data. In a real-time system it may not be possible to verify the entire contents of a program as is the case when executing in a debugger. The test engineer must selectively determine the best candidates for data capture to achieve the best performance and requirements coverage. "We should discipline ourselves not to collect data aimlessly; rather, we should collect data by planned experiments". [Grenander72]

The Graybox methodology involves software proof of correctness. Once test case data is obtained and the system is exercised using the simulation generated test case data, the output is verified against the expected results. Care must be exercised in the dependence on the simulated results. A computer system verified with a simulation with corrupted algorithms will only verify that the algorithms in the simulation have been faithfully implemented in the real-time system. "It is only too easy to take the impressive output of a simulation as fact, when the input does not justify this. The possible error of input must be translated into terms of possible error of results". [Martin67]

6.0 Graybox Testing in Real-Time

Applying the Graybox methodology in real-time is simply adding a time component to the expected output data value. A normal Graybox test for a system that generates sines based on angular displacements would consist of the input angle in degrees and the expected output sine value. The test "y = Sin(x) for x = 30" would be 0.5. This test has no time component associated with the answer. Therefore whenever the response is generated it would be compared to the expected result of 0.5. This could be 1 millisecond or 10 seconds from the start of the test. In a real-time system this test would be unacceptable.

Page 23: Grey Box Testing

A real-time test would be stated as "y(t) = sin(x,t)". The added time element "t" assures the expected value will be compared in the OLT domain. Using Figure 1. Sin of X, the major object is X. A component of object X is x.sine. The OLT values of x.sine are read from the curve. The OLTs are obtained from the X-axis. Therefore at the creation of the X object, the value of x.sine should be 0 at X's OLT (x.olt) of 1. At an x.olt of 10 the value of x.sine should be -0.5.

During the execution of the simulation, one "step was to examine the raw data with the intention of determining trends. An additional step for examining raw data was to display them using a plot package". [Feeley72] These steps will help the engineer visualize the data being collected and can be used as a method of reducing the amount of data being extracted in real-time. "Software instrumentation while giving precise and copious information, cannot be inserted into a program wholesale lest the system spend all its time processing artifact and instrumentation rather than doing honest work". [Beizer84]

During a real-time test, the system is examined for the values of x.olt, x.sine, x.olt. This time-stamp/value/time-stamp group is logged and then post-processed immediately after the test to verify that the time-stamped value lies on the curve between the beginning and ending time stamps. "The processes of logging, counting, timing and sampling can be done by software: by inserting code". [Beizer84]

Figure 1. Sin of X strip chart [sin(x) plotted on a scale of -2 to 2 against OLT values 1 through 16]
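A sketch of that post-processing step. The log format (start OLT, value, end OLT) comes from the paragraph above; the 30-degrees-per-tic sampling of the expected curve is an assumption made only so the example runs.

```python
import math

def expected_sine(olt: int) -> float:
    """Expected curve at whole OLT tics (30 degrees per tic assumed)."""
    return math.sin(math.radians(30 * (olt - 1)))

def verify_log(log, tol=1e-3):
    """log: iterable of (start_olt, value, end_olt) groups; each value
    must lie on the expected curve somewhere inside its OLT bracket."""
    for start_olt, value, end_olt in log:
        on_curve = any(abs(expected_sine(t) - value) <= tol
                       for t in range(start_olt, end_olt + 1))
        if not on_curve:
            return False
    return True

# A value of 1.0 bracketed by OLTs 3 and 5 matches the curve at OLT 4.
assert verify_log([(3, 1.0, 5)])
```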

In real-time the Graybox methodology consists of the five steps presented in Table 2. Five Step Graybox Real-Time Methodology.

Table 2. Five Step Graybox Real-Time Methodology

Step  Description
1     Identify all System Performance Requirements
2     Instrument System Simulation for Object Level Time data
3     Execute System Simulation to Obtain Test Case data
4     Monitor Behavior of Real-Time Program
5     Verify Correct Values based on Time-Stamps
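Steps 2 and 3 of Table 2 might look like the following sketch, where a simulation is instrumented to emit (frame time, OLT, value) records in the style of the earlier slide's table; the function names and the sine sampling are illustrative assumptions, not the toolset's actual interface.

```python
import math

def simulate(frames, record):
    """Run a toy simulation, invoking the instrumentation hook `record`
    once per frame with a (frame_time, olt, value) tuple."""
    olt = 0
    for frame in range(110, 110 + 10 * frames, 10):
        olt += 1                                 # object alive one more tic
        value = round(math.sin(math.radians(30 * (olt - 1))), 3)
        record((frame, olt, value))              # instrumentation hook

records = []
simulate(7, records.append)
for frm, olt, value in records:
    print(frm, olt, value)   # e.g. 110 1 0.0, then 120 2 0.5, ...
```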

To perform a real-time Graybox test all test data must be prepared beforehand from the system simulation and fed to the real-time program as input system messages. "The online load generator takes transactions from the scenario file and attempts to issue them to the tested system at the time marked for that transaction". [Beizer84] Because the precise time for object creation cannot be duplicated for each run, the OLT will guarantee that the object data is consistent from run to run.

Page 24: Grey Box Testing

7.0 Conclusion

Depending on the nature of the application, a real-time system falls in one of two major groups. "There are two types of real-time systems, namely, soft real-time and hard real-time systems". [Cheng88]

Soft real-time is defined as a real-time system where it is desirable that system processes occur rapidly, but few if any real time constraints are placed on the system. A requirement might state that the system respond within a reasonable time limit. This could be interpreted by one system engineer as one second and by another system engineer as one millisecond.

A hard real-time system places response times and process time constraints on a majority of the system processes, especially at the system interfaces. "Hard real-time systems are characterized by the presence of tasks that have timing constraints". [Zhao88]

To verify real-time software, a system level simulation is required. The simulation is used to generate the test case data that is fed to the real-time software. The output messages are extracted, time-tagged, logged and then compared to the expected results in the time domain. When the expected results are successfully compared in the time domain, the real-time performance characteristics can be verified. "Similarly this technique will be useful for verification and validation of programs". [Enomoto81]

By adding the element of time, a non real-time Graybox test can be converted into a real-time test capable of verifying and validating real-time embedded applications.

Author Biography

André Coulter is the Tools and Technology Leader at Lockheed Martin Missiles and Fire Control - Orlando. Mr. Coulter has over 22 years of software development experience and has spent over 12 years investigating and developing automated testing tools in support of Lockheed Martin embedded systems applications in Ada, Java, Perl, Fortran, C, and assembly language. Mr. Coulter is a graduate of Bowie State University, Bowie, Md. in 1978 with a BS in Business Administration. He also taught computer programming at Drake State College in Huntsville, Ala. Email [email protected]

Bibliography

[Beizer83] Boris Beizer, Software Testing Techniques, 1983, Van Nostrand Reinhold Company Inc., New York, pg 67

[Beizer84] Boris Beizer, Software System Testing and Quality Assurance, 1984, Van Nostrand Reinhold Company Inc., New York, pgs (238, 258, 263, 309, 312)

[Cheng88] Sheng-Chang Cheng, John A. Stankovic (ed), "Scheduling Algorithms for Hard Real-Time Systems – A Brief History", Hard Real-Time Systems, 1988, Computer Society Press of the IEEE, Washington, D.C., pg 151

[Coolahan88] James E. Coolahan, John A. Stankovic (ed), "Timing Requirements for Time-Driven Systems Using Augmented Petri Nets", Hard Real-Time Systems, 1988, Computer Society Press of the IEEE, Washington, D.C., pg 78

Page 25: Grey Box Testing

[Coulter99] André Coulter, "Graybox Software Testing Methodology – Embedded Software Testing Technique", 18th Digital Avionics Systems Conference Proceedings, 1999, IEEE, pg 10.A.5-2

[Elfriede99] Elfriede Dustin, Automated Software Testing, 1999, Addison Wesley Longman, pg 39

[Enomoto81] H. Enomoto, Morio Onoe (ed), "Image Data Modeling and Language for Parallel Processing", Real-Time/Parallel Computing Image Analysis, 1981, Plenum Press, New York, pg 101

[Feeley72] John W. Feeley, Walter Freiberger (ed), "A Computer Performance Monitor and Markov Analysis for Multiprocessor System Evaluation", Statistical Computer Performance Evaluation, 1972, Academic Press Inc., London, pg 177

[Grenander72] U. Grenander, Walter Freiberger (ed), "Quantitative Methods for Evaluating Computer System Performance: A Review and Proposals", Statistical Computer Performance Evaluation, 1972, Academic Press Inc., London, pg 15

[Hanaki81] S. Hanaki, Morio Onoe (ed), "An Interactive Image Processing and Analysis System", Real-Time/Parallel Computing Image Analysis, 1981, Plenum Press, New York, pg 219

[Isaka81] J. Isaka, Morio Onoe (ed), "A Compound Computer System for Image Data Processing", Real-Time/Parallel Computing Image Analysis, 1981, Plenum Press, New York, pg 257

[Jelinski72] Z. Jelinski, Walter Freiberger (ed), "Software Reliability Research", Statistical Computer Performance Evaluation, 1972, Academic Press Inc., London, pg 467

[Martin67] James Martin, Design of Real-Time Computer Systems, 1967, Prentice-Hall Inc., New Jersey, pgs (257, 373)

[Onoe81] Morio Onoe (ed), Real-Time/Parallel Computing Image Analysis, 1981, Plenum Press, New York, pg vii

[Spencer85] Richard H. Spencer, Computer Usability Testing and Evaluation, 1985, Prentice-Hall Inc., New Jersey, pgs (13, 50, 98)

[Volz88] Richard Volz, John A. Stankovic (ed), "Timing Issues in the Distributed Execution of Ada Programs", Hard Real-Time Systems, 1988, Computer Society Press of the IEEE, Washington, D.C., pg 130

[Zhao88] Wei Zhao, John A. Stankovic (ed), "Preemptive Scheduling Under Time and Resource Constraints", Hard Real-Time Systems, 1988, Computer Society Press of the IEEE, Washington, D.C., pg 225

Page 26: Grey Box Testing

André Coulter

André Coulter is the Tools and Technology Leader at Lockheed Martin Missiles and Fire Control - Orlando. Mr. Coulter has over 22 years of software development experience and has spent over 12 years investigating and developing automated testing tools in support of Lockheed Martin embedded systems applications in Ada, Java, Perl, Fortran, C, and assembly language.

Mr. Coulter is a graduate of Bowie State University, Bowie, Md. in 1978 with a BS in Business Administration. He also taught computer programming at Drake State College in Huntsville, Ala. Email [email protected]