

How Can a Simple Model Test a Complex System?

Model-Based Testing of Large-Scale Software

Victor Kuliamin
kuliamin@ispras.ru
ISP RAS, Moscow

Real Software Systems

System              Year   Size (MLOC)
Windows 3.1         1992   3
Windows NT 3.1      1993   6
Windows 95          1995   15
Windows NT 4.0      1996   16.5
Red Hat Linux 5.2   1998   12
Debian Linux 2.0    1998   25
Windows 2000        1999   29
Red Hat Linux 6.2   2000   17
Sun StarOffice 5.2  2000   7.6
Debian Linux 2.2    2000   59
Red Hat Linux 7.1   2001   30
Windows XP          2001   45
Red Hat Linux 8.0   2002   50
Debian Linux 3.0    2002   105


- They are huge and have a lot of functions
- They have very complex interfaces
- They are developed by hundreds of people
- They are distributed and concurrent

System            Year   Dev Team Size
Windows NT 3.1    1993   200
Windows NT 3.5    1994   300
Windows NT 4.0    1996   800
Debian Linux 1.2  1996   120 *
Debian Linux 2.0  1998   400 *
Windows 2000      1999   1400
Debian Linux 2.2  2000   450 *
Debian Linux 3.0  2002   1000 *

Quality of Real Software Systems

System            Year   Test Team Size
Windows NT 3.1    1993   140
Windows NT 3.5    1994   230
Windows NT 4.0    1996   700 (0.9)
Windows 2000      1999   1700 (1.2)

System       Test Cases (thousands)
MS Word XP   35
Oracle 10i   100
Windows XP   >2000 (?)


- They are tested a lot

But:

- Details of their behavior are not well defined
- And they still have serious bugs


Model-Based Testing: a Solution?

- Potential to test very large systems with high adequacy
- Parallelization of work on the system and its tests
- Googling "model based testing" "case study" gives:
  - ~630 links on ~230 sites
  - ~60 separate industrial case studies since 1990
- Most MBT case studies are small:
  - fewer than 10 case studies concern systems larger than 30 KLOC
- Most MBT techniques are based on state models and are hence prone to the state explosion problem

Fighting Complexity

- There is no simple way to test a complex system adequately
- But a manageable way exists: apply general engineering principles
  - Abstraction
  - Separation of concerns
  - Modularization
  - Reuse

UniTesK Solutions

- Modularize the system under test: contract specifications of components
- Modularize the test system: flexible test system architecture
  - Adapters: bind the test system to the SUT
  - Contract oracles: check the SUT's behavior
  - Test coverage goals based on contracts
  - Test data generators for a single operation
  - Testing models (test scenarios): test sequence composition
- The more abstract the contract, the more abstract the testing model
- Reusability of contracts, testing models, and test data generators



Software Contracts

[Diagram: components A-D, each with its own contract, grouped into subsystems I and II, each with a subsystem-level contract]

Contracts (preconditions, postconditions, data integrity constraints) help to describe components on different abstraction levels.
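Such a contract can be sketched as executable checks. The toy component and names below are hypothetical; the UniTesK tools express contracts in specification extensions of Java, C, and C#, not in Python:

```python
class BoundedStack:
    """Toy component with a contract: preconditions, postconditions,
    and a data integrity constraint (invariant). Hypothetical example,
    not UniTesK syntax."""

    def __init__(self, capacity):
        assert capacity > 0                        # precondition
        self.capacity = capacity
        self.items = []
        self._check_integrity()

    def _check_integrity(self):
        # Data integrity constraint: the size never exceeds the capacity.
        assert 0 <= len(self.items) <= self.capacity

    def push(self, x):
        assert len(self.items) < self.capacity     # precondition: not full
        old_size = len(self.items)
        self.items.append(x)
        assert len(self.items) == old_size + 1     # postcondition: size grew
        assert self.items[-1] == x                 # postcondition: x is on top
        self._check_integrity()
```

An oracle derived from such a contract checks every call to the real component against the same pre- and postconditions.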

Test Coverage Goals

post {
    if ( f(a, b) || g(a) ) ...
    else if ( h(a, c) && !g(b) ) ...
    else ...
}

The coverage goals derived for the else branch:

!f(a, b) && !g(a) && !h(a, c)
||
!f(a, b) && !g(a) && g(b)
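Because the branch guards are boolean combinations of the predicates, the claim that these two disjoined goals are reached exactly when the else branch is taken can be checked exhaustively. A small sketch, treating the predicate values as free booleans:

```python
from itertools import product

# f_ab, g_a, h_ac, g_b stand for the truth values of f(a, b), g(a),
# h(a, c), and g(b) in the postcondition above.
for f_ab, g_a, h_ac, g_b in product([False, True], repeat=4):
    # The else branch is taken when neither guard holds.
    else_taken = not (f_ab or g_a) and not (h_ac and not g_b)
    # Disjunction of the two coverage goals for the else branch.
    goals = (not f_ab and not g_a and not h_ac) or \
            (not f_ab and not g_a and g_b)
    assert else_taken == goals
```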

Testing Model

[Diagram: the operation domain over states and parameters, partitioned into coverage goals 1-3]

Test Data Generation

Computation of single-call arguments.

[Diagram: in the current state, generated parameter values cover goals 1-3]

Test data generation is based on simple generators and coverage filtering.
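The combination of simple generators and coverage filtering can be sketched as follows. The domain and the coverage model are hypothetical toys; here the goals simply partition argument pairs by sign pattern:

```python
def simple_generator():
    # Naive generator: enumerate small integer argument pairs.
    for a in range(-3, 4):
        for b in range(-3, 4):
            yield (a, b)

def coverage_goal(a, b):
    # Toy coverage model: goals partition the domain by sign pattern.
    return (a >= 0, b >= 0)

def filtered_test_data():
    # Coverage filtering: keep only arguments that hit a goal
    # not yet covered, discarding redundant generated data.
    covered = set()
    for args in simple_generator():
        goal = coverage_goal(*args)
        if goal not in covered:
            covered.add(goal)
            yield args

tests = list(filtered_test_data())   # one test case per coverage goal
```

The generator stays trivial; all the test-selection intelligence lives in the coverage model, which is derived from the contract.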

The Whole Picture

[Diagram: the System under Test, Behavior Model, Testing Model, and Coverage Model, connected by on-the-fly test sequence generation and single input checking]
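On-the-fly test sequence generation can be illustrated with a short sketch: the testing model's state graph is explored while the test runs, so it never has to be built in advance. The FSM and all names below are hypothetical:

```python
from collections import deque

def generate_sequence(initial, actions, apply_action):
    """On-the-fly breadth-first traversal of the testing model:
    returns the (state, action) pairs exercised, visiting every
    reachable state once. apply_action would also drive the SUT."""
    visited = {initial}
    queue = deque([initial])
    sequence = []
    while queue:
        state = queue.popleft()
        for action in actions:
            nxt = apply_action(state, action)
            sequence.append((state, action))
            if nxt not in visited:
                visited.add(nxt)
                queue.append(nxt)
    return sequence

# Toy testing model: a counter mod 3 with "inc" and "reset" actions.
def step(state, action):
    return (state + 1) % 3 if action == "inc" else 0

seq = generate_sequence(0, ["inc", "reset"], step)
```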

Testing Concurrency

- A multisequence is used instead of a sequence of stimuli
- Stimuli and reactions form a partially ordered set

[Diagram: stimuli s11, s12, s21, s31 sent to the target system; reactions r11, r12, r21, r22 observed along the time axis]

Plain concurrency: the behavior of the system is equivalent to some sequence of the actions.
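This check can be sketched directly: search for a total order of the actions, consistent with the observed partial order, whose sequential replay on the model reproduces the observed reactions. The read/write register model and all names below are hypothetical:

```python
from itertools import permutations

def sequential_run(state, actions):
    # Replay actions one by one on a model register; collect reactions.
    reactions = []
    for op, arg in actions:
        if op == "write":
            state = arg
            reactions.append(None)
        else:  # "read"
            reactions.append(state)
    return reactions

def plain_concurrency_holds(initial, actions, observed, precedes):
    # `precedes` lists pairs (i, j): action i finished before j started.
    n = len(actions)
    for order in permutations(range(n)):
        pos = {k: idx for idx, k in enumerate(order)}
        if any(pos[i] > pos[j] for i, j in precedes):
            continue                  # violates the observed partial order
        reactions = sequential_run(initial, [actions[k] for k in order])
        if all(observed[k] == reactions[pos[k]] for k in range(n)):
            return True               # an equivalent sequence exists
    return False
```

Brute-force enumeration is only for illustration; it shows why partial order of stimuli and reactions, not wall-clock order, is what the oracle must respect.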

Checking Composed Behavior

Plain concurrency axiom: the composed behavior of the system must be equivalent to some sequential ordering of the individual actions that respects the observed partial order of stimuli and reactions.
The Case Study

1994-1996, ISP RAS: Nortel Networks project on functional test suite development for the Switch Operating System kernel.

- Size of the SUT: ~250 KLOC, ~530 interface operations
- 44 components were identified
- ~60 KLOC of specifications and ~40 KLOC of test scenarios, developed in 1.5 years by 6 people
- A lot of bugs were found in the SUT, which had been in use for 10 years; several of them caused cold restarts
- ~30% of the specifications are reused to test other components
- 3 versions of the SUT (~500 KLOC) were tested by 2000; changes in the test suite were <5%

Other Case Studies

- IPv6 implementations, 2001-2003
  - Microsoft Research: Mobile IPv6 (in Windows CE 4.1)
  - Oktet
- Intel compilers, 2001-2003
- Web-based banking client management system
- Enterprise application development framework
- Billing system
- Components of TinyOS

http://www.unitesk.com

UniTesK Tools

- J@T (2001): Java / NetBeans, Eclipse (planned)
- J@T-C++ Link (2003): C++ / NetBeans + MS Visual Studio
- CTesK (2002): C / Visual Studio 6.0, gcc
- Ch@se (2003): C# / Visual Studio .NET 7.1
- OTK (2003): specialized tool for compiler testing

References

1. V. Kuliamin, A. Petrenko, I. Bourdonov, and A. Kossatchev. UniTesK Test Suite Architecture. Proc. of FME 2002. LNCS 2391, pp. 77-88, Springer-Verlag, 2002.

2. V. Kuliamin, A. Petrenko, N. Pakoulin, I. Bourdonov, and A. Kossatchev. Integration of Functional and Timed Testing of Real-time and Concurrent Systems. Proc. of PSI 2003. LNCS 2890, pp. 450-461, Springer-Verlag, 2003.

3. V. Kuliamin, A. Petrenko. Applying Model Based Testing in Different Contexts. Proceedings of the seminar on Perspectives of Model Based Testing, Dagstuhl, Germany, September 2004.

4. A. Kossatchev, A. Petrenko, S. Zelenov, S. Zelenova. Using Model-Based Approach for Automated Testing of Optimizing Compilers. Proc. Intl. Workshop on Program Understanding, Gorno-Altaisk, 2003.

5. V. Kuliamin, A. Petrenko, A. Kossatchev, and I. Burdonov. The UniTesK Approach to Designing Test Suites. Programming and Computer Software, Vol. 29, No. 6, 2003, pp. 310-322. (Translated from Russian)

6. S. Zelenov, S. Zelenova, A. Kossatchev, A. Petrenko. Test Generation for Compilers and Other Formal Text Processors. Programming and Computer Software, Vol. 29, No. 2, 2003, pp. 104-111. (Translated from Russian)


Contacts

Victor V. Kuliamin
kuliamin@ispras.ru
109004, B. Kommunisticheskaya, 25
Moscow, Russia
Web: http://www.ispras.ru/groups/rv/rv.html
Phone: +7-095-9125317
Fax: +7-095-9121524

Thank you!