How Can a Simple Model Test a Complex System? Model Based Testing of Large-Scale Software
Victor Kuliamin, [email protected], ISP RAS, Moscow
Real Software Systems
System              Year  Size (MLOC)
Windows 3.1         1992    3
Windows NT 3.1      1993    6
Windows 95          1995   15
Windows NT 4.0      1996   16.5
Red Hat Linux 5.2   1998   12
Debian Linux 2.0    1998   25
Windows 2000        1999   29
Red Hat Linux 6.2   2000   17
Sun StarOffice 5.2  2000    7.6
Debian Linux 2.2    2000   59
Red Hat Linux 7.1   2001   30
Windows XP          2001   45
Red Hat Linux 8.0   2002   50
Debian Linux 3.0    2002  105
They are huge and have a lot of functions
They have very complex interfaces
They are developed by hundreds of people
They are distributed and concurrent

System            Year  Dev Team Size
Windows NT 3.1    1993   200
Windows NT 3.5    1994   300
Windows NT 4.0    1996   800
Debian Linux 1.2  1996   120 *
Debian Linux 2.0  1998   400 *
Windows 2000      1999  1400
Debian Linux 2.2  2000   450 *
Debian Linux 3.0  2002  1000 *
Quality of Real Software Systems
System          Year  Test Team Size
Windows NT 3.1  1993   140
Windows NT 3.5  1994   230
Windows NT 4.0  1996   700 (0.9)
Windows 2000    1999  1700 (1.2)

System      Test Cases, K
MS Word XP     35
Oracle 10i    100
Windows XP  >2000 (?)
They are tested a lot. But:
Details of their behavior are not well defined
And they still have serious bugs
Model Based Testing – a Solution?
Potential to test very large systems with high adequacy
Parallelization of work on the system and its tests
Googling for "model based testing" "case study" gives ~630 links on ~230 sites
~60 separate industrial case studies since 1990
Most MBT case studies are small: fewer than 10 concern systems larger than 30 KLOC
Most MBT techniques are based on state models and are therefore prone to the state explosion problem
Fighting Complexity
There is no simple way to test a complex system adequately
But a manageable way exists: use of general engineering principles
Abstraction
Separation of concerns
Modularization
Reuse
UniTesK Solutions
Modularize the system under test – contract specifications of components
Modularize the test system – flexible test system architecture:
Adapters – binding the test system and the SUT
Contract oracles – checking the SUT's behavior
Test coverage goals based on contracts
Test data generators for a single operation
Testing models (test scenarios) – test sequence composition
Abstract contract → more abstract testing model
Reusability of contracts, testing models, and test data generators
Software Contracts
[Diagram: Components A–D, each with its own contract (Contracts A–D), grouped into Subsystems I and II, each with a subsystem-level contract (Contracts I and II)]
Contracts (preconditions, postconditions, and data integrity constraints) help to describe components at different abstraction levels
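As a minimal sketch of how such a contract can act as a test oracle (the square-root example and all function names are illustrative assumptions, not UniTesK's specification language):

```python
import math

# Illustrative sketch: a contract (precondition + postcondition) used as
# a test oracle for a square-root operation. Names are hypothetical.

def sqrt_pre(x):
    """Precondition: the operation is only defined for non-negative input."""
    return x >= 0

def sqrt_post(x, result):
    """Postcondition: result is a non-negative approximate square root."""
    return result >= 0 and abs(result * result - x) < 1e-6

def check_call(operation, x):
    """Oracle: run the operation and verify its contract."""
    if not sqrt_pre(x):
        return "precondition violated: call is outside the contract's domain"
    result = operation(x)
    if not sqrt_post(x, result):
        return "postcondition violated: implementation bug"
    return "ok"

print(check_call(math.sqrt, 2.0))   # ok
print(check_call(math.sqrt, -1.0))  # precondition violated: ...
```

The same oracle can check any implementation of the operation, which is what makes contracts reusable across SUT versions.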
Test Coverage Goals
post {
    if ( f(a, b) || g(a) ) ...
    else if ( h(a, c) && !g(b) ) ...
    else ...
}

The last branch is reached exactly when
!f(a, b) && !g(a) && !h(a, c)
|| !f(a, b) && !g(a) && g(b)
and each disjunct becomes a separate coverage goal.
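The branch structure above can be turned into an executable coverage-goal classifier. A sketch in Python, with placeholder definitions for f, g, and h (all illustrative, not from the original contract):

```python
# Sketch: deriving branch coverage goals from a contract's structure.
# f, g, h are placeholder predicates standing in for the ones in the contract.

def f(a, b): return a > b
def g(x):    return x == 0
def h(a, c): return a + c > 10

def branch_goal(a, b, c):
    """Return which coverage goal (branch / DNF disjunct) the input hits."""
    if f(a, b) or g(a):
        return "branch 1"
    if h(a, c) and not g(b):
        return "branch 2"
    # Else-branch, split into the two disjuncts from the slide:
    if not h(a, c):
        return "branch 3, goal: !f && !g(a) && !h"
    return "branch 3, goal: !f && !g(a) && g(b)"
```

Classifying generated inputs this way tells the test engine which coverage goals are still unmet.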
Testing Model
[Diagram: each model state is linked to the operations applicable in it; each operation's parameter domain is partitioned into coverage goals 1, 2, 3]
Test Data Generation
Computation of single-call arguments
[Diagram: from the current state, candidate parameter values for an operation are generated and mapped to coverage goals 1, 2, 3]
Test data generation is based on simple generators and coverage filtering
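A minimal sketch of this scheme, assuming a naive integer generator and an illustrative coverage model (all names hypothetical):

```python
import itertools

# Sketch: test data generation from simple generators plus coverage filtering.
# A naive generator enumerates candidate arguments; a filter keeps only the
# first call that hits each coverage goal and drops redundant ones.

def simple_generator():
    """Simple generator: small integer combinations for (a, b, c)."""
    vals = range(-2, 3)
    return itertools.product(vals, vals, vals)

def classify(a, b, c):
    """Coverage model: map a call to the goal it exercises (illustrative)."""
    if a > b or a == 0:
        return "goal-1"
    if a + c > 1 and b != 0:
        return "goal-2"
    return "goal-3"

def filtered_test_data():
    """Coverage filtering: keep one representative call per goal."""
    covered = {}
    for a, b, c in simple_generator():
        goal = classify(a, b, c)
        if goal not in covered:
            covered[goal] = (a, b, c)
    return covered
```

The point is that the generators stay trivial; the coverage model does the work of selecting which generated calls are worth executing.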
The Whole Picture
[Diagram: System under Test, Behavior Model, Testing Model, and Coverage Model; the testing model drives on-the-fly test sequence generation, while the behavior model performs single input checking]
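The on-the-fly test sequence generation step can be sketched with a toy testing model, assuming a trivial bounded-counter SUT (the model, the SUT, and all names are illustrative, not UniTesK's API):

```python
# Sketch: on-the-fly test sequence generation over a testing model.
# The testing model is a tiny FSM for a counter bounded to [0, 2]; each
# transition both drives the SUT and checks it against the model.

class CounterSUT:
    """Toy system under test: a counter bounded to [0, 2]."""
    def __init__(self):
        self.value = 0
    def inc(self):
        if self.value < 2:
            self.value += 1
    def dec(self):
        if self.value > 0:
            self.value -= 1

def generate_sequence(sut, max_steps=50):
    """Walk the model on the fly until all (state, op) pairs are covered."""
    ops = {"inc": (lambda s: min(s + 1, 2)),
           "dec": (lambda s: max(s - 1, 0))}
    covered, trace, state = set(), [], 0
    for _ in range(max_steps):
        pending = [op for op in ops if (state, op) not in covered]
        if not pending:
            remaining = {(s, op) for s in (0, 1, 2) for op in ops} - covered
            if not remaining:
                break          # full coverage reached
            pending = list(ops)  # keep moving toward uncovered pairs
        op = pending[0]
        covered.add((state, op))
        getattr(sut, op)()          # drive the SUT
        state = ops[op](state)      # advance the model
        assert sut.value == state, "oracle: SUT diverged from model"
        trace.append(op)
    return trace, covered
```

The sequence is never written down in advance: each next stimulus is chosen from the current model state, which is what "on-the-fly" means here.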
Testing Concurrency
A multisequence is used instead of a sequence of stimuli
Stimuli and reactions form a partially ordered set
[Diagram: stimuli s11, s21, s12, s31 are applied to the target system in parallel; reactions r11, r21, r12, r22 are observed over time, partially ordered]
Plain concurrency: the behavior of the system is equivalent to the behavior of some sequential ordering of the actions
Checking Composed Behavior
Plain concurrency axiom
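Checking the plain concurrency axiom can be sketched as a brute-force search over interleavings: the observed reactions are acceptable if some sequential ordering of the stimuli, replayed against a model, reproduces them (the shared-counter model is an illustrative assumption):

```python
import itertools

# Sketch: checking "plain concurrency" - observed concurrent behavior must
# equal the behavior of SOME sequential ordering of the stimuli.
# Model: a shared counter; stimulus (id, n) returns the counter value
# right after adding n.

def replay(order):
    """Replay stimuli sequentially against the model; return reactions."""
    total, reactions = 0, {}
    for sid, n in order:
        total += n
        reactions[sid] = total
    return reactions

def plainly_concurrent(stimuli, observed):
    """True if some interleaving of the stimuli explains the reactions."""
    return any(replay(p) == observed
               for p in itertools.permutations(stimuli))

stimuli = [("s1", 1), ("s2", 2)]
ok  = plainly_concurrent(stimuli, {"s1": 3, "s2": 2})  # s2 before s1 fits
bad = plainly_concurrent(stimuli, {"s1": 2, "s2": 2})  # no ordering fits
```

Real checkers prune this search using the partial order of stimuli and reactions; the exhaustive permutation here is only to make the axiom concrete.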
The Case Study
1994–1996: ISP RAS – Nortel Networks project on functional test suite development for a switch operating system kernel
Size of the SUT: ~250 KLOC, ~530 interface operations
44 components were identified; ~60 KLOC of specifications
~40 KLOC of test scenarios developed in 1.5 years by 6 people
Many bugs were found in the SUT, which had been in use for 10 years; several of them cause cold restart
~30% of the specifications are reused to test other components
3 versions of the SUT were tested by 2000 (~500 KLOC); changes in the test suite were <5%
Other Case Studies
IPv6 implementations, 2001–2003: Microsoft Research Mobile IPv6 (in Windows CE 4.1), Oktet
Intel compilers, 2001–2003
Web-based banking client management system
Enterprise application development framework
Billing system
Components of TinyOS
http://www.unitesk.com
UniTesK Tools
Tool          Year  Target language / environment
J@T           2001  Java / NetBeans, Eclipse (planned)
J@T-C++ Link  2003  C++ / NetBeans + MS Visual Studio
CTesK         2002  C / Visual Studio 6.0, gcc
Ch@se         2003  C# / Visual Studio .NET 7.1
OTK           2003  Specialized tool for compiler testing
References
1. V. Kuliamin, A. Petrenko, I. Bourdonov, and A. Kossatchev. UniTesK Test Suite Architecture. Proc. of FME 2002. LNCS 2391, pp. 77-88, Springer-Verlag, 2002.
2. V. Kuliamin, A. Petrenko, N. Pakoulin, I. Bourdonov, and A. Kossatchev. Integration of Functional and Timed Testing of Real-time and Concurrent Systems. Proc. of PSI 2003. LNCS 2890, pp. 450-461, Springer-Verlag, 2003.
3. V. Kuliamin, A. Petrenko. Applying Model Based Testing in Different Contexts. Proceedings of seminar on Perspectives of Model Based Testing, Dagstuhl, Germany, September 2004.
4. A. Kossatchev, A. Petrenko, S. Zelenov, S. Zelenova. Using Model-Based Approach for Automated Testing of Optimizing Compilers. Proc. Intl. Workshop on Program Understanding, Gorno-Altaisk, 2003.
5. V. Kuliamin, A. Petrenko, A. Kossatchev, and I. Burdonov. The UniTesK Approach to Designing Test Suites. Programming and Computer Software, Vol. 29, No. 6 , 2003, pp. 310-322. (Translation from Russian)
6. S. Zelenov, S. Zelenova, A. Kossatchev, A. Petrenko. Test Generation for Compilers and Other Formal Text Processors. Programming and Computer Software, Vol. 29, No. 2 , 2003, pp. 104-111. (Translation from Russian)
Contacts
Victor V. Kuliamin
109004, B. Kommunisticheskaya, 25
Moscow, Russia
Web: http://www.ispras.ru/groups/rv/rv.html
Phone: +7-095-9125317
Fax: +7-095-9121524
Thank you!