Nov 5, 2013

Reusability and Effective
Test Automation in
Telecommunication System
Testing

Mikael Mattas


Supervisor: Professor Sven-Gustav Häggman

Instructor: B.Sc. Jouni Haara

Contents

> Research Problem and Objectives
> Testing
> Automation of Testing
> Tellabs 8600 Managed Edge System
> Testing Practices Used for the Tellabs 8600 Managed Edge System
> Automating System Testing of the Tellabs 8600 Managed Edge System
> Conclusions and Recommendations
> Further Research


Research Problem and Objectives

> Design and implement an effective solution for automation of system testing of the Tellabs 8600 Managed Edge System
> Analyze the automation practices used in integration testing to determine what could be reused in automation of system testing


Testing

> The process of exercising software to verify that it satisfies specified requirements and to detect errors
> An important part of a telecommunication system development project
> Customers are dissatisfied if the system does not work as expected
> It is costly to fix errors after the system has been shipped to the customer
> Testing levels
> Unit/module testing
> Integration testing
> System testing
> Acceptance testing

Testing

> V-model
> A frequently used software development model

[V-model diagram: Requirements → Functional Specification → Architectural Design → Detailed Design → Implementation on the development side, paired with Unit/Module Testing → Integration Testing → System Testing → Acceptance Testing on the verification side; each design stage is reviewed, and test design and verification of test results link the two sides.]

Automation of Testing

> Writing software to test other software
> Makes testing more effective
> Existing tests can be run easily and quickly on a new version of the software
> Better use of resources
> Testers are more motivated and have time for other tasks such as planning
> Machines and workspace can be used during nights and weekends
> Tests can be performed that would be difficult or impossible to carry out manually, e.g. simulating many users
> Consistency and repeatability of tests: the tests are repeated exactly the same way every time


Automation of Testing

> There are also many pitfalls and limitations
> Poor testing practice
> It is better to improve the effectiveness of testing first than to improve the efficiency of poor testing
> Maintenance of automated tests
> Automated tests should be designed so that they are easy to maintain and can be re-run successfully on a new version of the software
> Trying to automate too much
> It is neither possible nor desirable to automate all testing activities. One should not automate tests that are run only rarely, tests without predictable results, tests whose result is easily verified by a human but difficult if not impossible to verify automatically, tests of very unstable software, or usability tests.



Automation of Testing


> Bad test automation
> Capture/replay method: scripts generated through recording contain hard-coded data, which makes the scripts difficult to maintain
> Good test automation
> Based on reusable modules and one-point system maintenance
> Script reuse means elimination of duplication, faster implementation of new tests, and savings on maintenance costs
> Automated tests vs. automated testing
> The pre- and post-processing tasks surrounding the execution of a test case also have to be automated if unattended testing during nights and weekends is wanted
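
The maintainability difference between the two approaches can be sketched as follows. The GUI calls here are simple stand-ins (not WinRunner's API) that just record the action, and all window names and data are invented for illustration:

```python
# Stand-in GUI primitives that record what they would do.
actions = []

def click_button(name):
    actions.append(("click", name))

def type_text(field, value):
    actions.append(("type", field, value))

# Capture/replay style: window names and test data are baked into every
# recorded script, so one GUI change breaks them all.
def recorded_login_test():
    click_button("Login")
    type_text("username", "tester1")
    type_text("password", "secret")
    click_button("OK")

# Reusable-module style: one parameterized function, maintained at a
# single point; test cases only supply data.
def login(username, password):
    click_button("Login")
    type_text("username", username)
    type_text("password", password)
    click_button("OK")

def test_login_as_admin():
    login("admin", "admin-pw")
```

A GUI change (say, the "OK" button is renamed) is fixed once inside `login`, instead of in every recorded script.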

Tellabs 8600 Managed Edge
System

> An IP/MPLS based system used by service providers to build access networks providing Intranet, Extranet, Internet access, and corporate voice services to business customers
> Connections are based on layer 2 or layer 3 MPLS VPNs
> The system consists of different network elements (NEs) and a network management system (NMS) with a graphical user interface (GUI)

[Network diagram: CE devices at customer premises connect over IP/MPLS access networks to U-PE and N-PE edge elements, which hold the VRFs and peer over BGP/iBGP across P routers in the IP/MPLS core.]

Testing Practices Used for the Tellabs
8600 Managed Edge System

> Enhanced V-model based on internal releases
> Implementation testing (unit/module testing), integration testing, and system testing
> Implementation testing and integration testing are divided into separate network element (NE) testing and network management system (NMS) testing, whereas system testing combines both parts

Testing Practices Used for the Tellabs
8600 Managed Edge System

> No automated test cases in system testing, but some automation is used in NMS integration testing and NE integration testing
> NMS integration testing uses a functional testing tool called WinRunner for automation of NMS GUI operations
> NMS integration testing uses so-called user functions to build up a test case. A user function is reusable and can e.g. handle an NMS window. User functions are built up from so-called reusable system functions, which carry out actions common to many applications within the NMS, e.g. clicking different types of buttons.
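
The layering just described can be sketched in Python, with all names invented for illustration (the real functions are implemented in WinRunner): a reusable user function handles one NMS window by composing reusable system functions.

```python
# Records the GUI actions so the call structure is visible.
log = []

# System function level: actions common to many NMS applications.
def sys_set_field(window, field, value):
    log.append(f"{window}: set {field}={value}")

def sys_click_button(window, button):
    log.append(f"{window}: click {button}")

# User function level: one reusable function per NMS window,
# built up entirely from system functions.
def user_create_vpn(vpn_name):
    window = "VPN Creation"
    sys_set_field(window, "Name", vpn_name)
    sys_click_button(window, "Apply")
    sys_click_button(window, "Close")
```

Because button clicking lives only in `sys_click_button`, a change in how buttons work across the NMS is fixed in one place.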


Testing Practices Used for the Tellabs
8600 Managed Edge System

> The structure of a test case in NMS integration testing

[Diagram: a start-up script (Start-up Level) calls a script that runs the modules (Module Level); module files for Modules 1–3, driven by an XML file, form the Execution Level and call reusable functions at the User Function Level, System Function Level, and WinRunner Function Level.]

Testing Practices Used for the Tellabs
8600 Managed Edge System

> NE integration testing uses an in-house test automation environment called TestNET and Tcl scripts to configure network elements and measurement equipment
> A typical test case in system testing consists of both configuring the NEs with the help of the NMS to set up a VPN, and configuring the measurement equipment to generate the data sent through the VPN to test it → techniques used in both NMS integration testing and NE integration testing are needed in automation of system testing → need for integration of WinRunner and TestNET


Automating System Testing of the
Tellabs 8600 Managed Edge System

> Automation of GUI operations in the NMS
> A basic VPN test case was chosen. An attempt was made to build the VPN using the same structure and functions as in NMS integration testing, to check whether the structure could also be used in system testing. A further purpose was to find shortcomings and improvement needs.

> The structure was good in general and could be reused. However, for improved maintainability more reusability was needed than just reusable user and system functions → the concept of higher level functions was introduced. By moving logical sequences of user function calls from the module files into higher level functions, a new level of reusable functions was reached.
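
The higher level function idea can be sketched as follows, again with invented names: a logical sequence of user function calls that used to be repeated in several module files is moved into one reusable function.

```python
# Records the calls so the sequence is visible.
log = []

# User function level (one NMS window each).
def user_open_vpn_wizard():
    log.append("open wizard")

def user_select_endpoints(pe1, pe2):
    log.append(f"endpoints {pe1},{pe2}")

def user_commit_vpn(name):
    log.append(f"commit {name}")

# Higher level function: the whole "create a basic VPN" sequence,
# called from module files instead of copy-pasting the three user
# function calls into each of them.
def higher_create_basic_vpn(name, pe1, pe2):
    user_open_vpn_wizard()
    user_select_endpoints(pe1, pe2)
    user_commit_vpn(name)
```

New test cases then become one-line calls with different data, and a change to the VPN creation sequence is made in one place.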

Automating System Testing of the
Tellabs 8600 Managed Edge System

> The suggested structure of a test case in system testing

[Diagram: as in NMS integration testing, a start-up script (Start-up Level) calls a script that runs the modules (Module Level), and module files for Modules 1–3, driven by an XML file, form the Execution Level; the reusable functions now include a Higher Function Level above the User Function Level, System Function Level, and WinRunner Function Level.]
Automating System Testing of the
Tellabs 8600 Managed Edge System

> Automating the configuration of the measurement equipment
> New measurement equipment (N2X) was procured for system testing purposes → no scripts could be reused from NE integration testing
> Generic Tcl procedures were developed, and TestNET was used for the execution of the Tcl scripts. NE integration testing can also utilize the procedures if it starts using the N2X.
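
The procedures in the thesis are written in Tcl; this Python stand-in only illustrates what "generic" means here: one parameterized procedure builds the equipment command, so every test case (and, later, NE integration testing) reuses it instead of hard-coding commands. The command syntax and parameter names are invented, not the N2X API.

```python
# Hypothetical generic procedure: any traffic stream for any test case
# is configured through this one parameterized entry point.
def stream_config_command(port, dst_ip, frame_size, rate_pps):
    """Build the plain-text command that would be sent to the equipment."""
    return (f"stream create -port {port} -dst {dst_ip} "
            f"-size {frame_size} -rate {rate_pps}")
```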


Automating System Testing of the
Tellabs 8600 Managed Edge System

> Integration of the techniques
> A communication channel was needed in order to command the measurement equipment from WinRunner via TestNET
> A TCP/IP socket was created between WinRunner and TestNET, and plain-text commands and responses were sent between the parties
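
A minimal sketch of this mechanism: a TCP/IP socket carrying plain-text commands and responses. Both ends run in one process here for illustration only; in the thesis WinRunner is on one side and TestNET on the other, and the command syntax below is invented.

```python
import socket
import threading

def serve_once(server):
    # "TestNET" side: receive one plain-text command, answer in plain text
    # (the real server would run the Tcl procedure the command names).
    conn, _ = server.accept()
    with conn:
        cmd = conn.recv(1024).decode().strip()
        conn.sendall(f"OK {cmd}\n".encode())

server = socket.socket()
server.bind(("127.0.0.1", 0))   # ephemeral port for the example
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

# "WinRunner" side: send a command toward the equipment, read the reply.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"start_traffic port1\n")
response = client.recv(1024).decode().strip()
client.close()
```

Plain text keeps the protocol readable on both sides, which matters when one end is a GUI test tool and the other a Tcl environment.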


Conclusions and
Recommendations

> Evaluation of the implemented structure for automated test cases
> The integration of the techniques was successful and the first automated system test case could be created
> Using higher level functions, it was found that creating new test cases is easy and does not take long
> There was no need for copy-paste actions as in the structure used in NMS integration testing
> Maintenance costs were minimized: if something changed in the GUI, changes had to be made in only one place to update all test cases
> The test cases are also easy to run in different environments
> The structure is recommended for use in subsequent automation of system testing

Further Research

> Before more test cases are automated, it has to be studied whether the software is stable enough
> Automate pre- and post-processing tasks so that a set of tests can be run unattended during nights and weekends
> Integrate the WinRunner scripts with the test management tool in use, TestDirector