
Automated Testing for Mobility Management Entity of Long Term Evolution System



12/13/2013

Xi Chen




Supervisor: Prof. Jyri Hämäläinen

Instructor: M.Sc. Risto Nissinen (Nokia Siemens Networks Oy)



Acknowledgement









Outline

Background

Test System Overview

ATCA MME

Tektronix G35

Robot Framework & Test Suite

Agile Methodology

Test suite & Test cases

Results









Background

The work was done at Nokia Siemens Networks Oy in Espoo.

It is research on test automation of the Mobility Management Entity (MME) of the LTE core network, using the Tektronix G35 tester.

The purpose is to track the performance of the ATCA hardware platform by implementing a test suite that collects statistics from MME counters.








High-level Overview of Test System

[Diagram] Three components: the Robot Framework, the Tektronix G35 framework and the ATCA (MME) framework. The G35 sends network traffic to the ATCA MME, and performance data from both the ATCA (MME) framework and the G35 framework flow back to the Robot Framework.


ATCA hardware

Advanced Telecom Computing Architecture (ATCA)

Open standard specification

16 slots for computer units

322.25 mm high

280 mm deep










ATCA hardware

Modularity, scalability and flexibility

Standardized rack size & power supply

Service providers get smaller equipment & significant energy savings









ATCA based MME

Five key functional units:

Control Plane Processing Unit (CPPU)

Signaling & Mobility Management Unit (SMMU)

IP Director Unit (IPDU)

Marker & Charging Unit (MCHU)

Operation & Maintenance Unit (OMU)









ATCA based MME

CPPU: transaction-based mobility management

SMMU: stores information about visiting subscribers in the visiting subscriber database (Home Subscriber Server)

IPDU: balances loads & provides connectivity

MCHU: offers the statistics function

OMU: handles operation & maintenance functions









Tektronix G35 - Traffic Procedure

G35 generates the network traffic

A traffic profile is defined in G35

Start scenario: initialization (e.g. set up eNBs)

Call scenario: periodically attach & detach subscribers

Stop scenario: does nothing in our work


Tektronix G35 remote control

The test suite is developed on a remote client workstation.

The test suite remotely controls the G35 by invoking operations that the G35 exposes through one interface.

E.g. remotely configure the G35, start traffic, stop traffic

In practice (a minimal sketch follows below):
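The slides do not show the G35's actual remote interface, so the following Python sketch only illustrates the idea: the remotely exposed operations are wrapped as keywords of a Robot Framework test library. The class name, keyword names and the injected send_command transport are hypothetical, not the tester's real API.

# Hypothetical wrapper: the transport used to reach the G35 is injected as a
# callable, so nothing about the tester's real remote interface is assumed.
class G35RemoteLibrary:

    def __init__(self, send_command):
        self._send = send_command  # e.g. a function forwarding a command string to the G35

    def configure_g35(self, profile_name):
        """Keyword: select a predefined traffic profile on the tester."""
        return self._send("LOAD_PROFILE " + profile_name)

    def start_traffic(self):
        """Keyword: start the attach/detach call scenario."""
        return self._send("START_SCENARIO")

    def stop_traffic(self):
        """Keyword: stop the running scenario."""
        return self._send("STOP_SCENARIO")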









Robot framework

Generic test automation framework

Open source software

Implemented in Python

Can be extended with Python, Java or other languages









Architecture of Robot framework based test

Interacts with the System Under Test through a Test Library








Robot framework test case

Tabular syntax

Constructed with keywords

Keywords:

Built-in keywords

Keywords imported from test libraries

User keywords

One row is one step (executed row by row)

A test case does not need to know what happens underneath the keywords

Each keyword is a function call which accepts arguments

E.g. user keyword: Add Two Numbers
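As an illustration of how a user keyword such as Add Two Numbers can be backed by a Python test library: Robot Framework derives the keyword name from the method name, so add_two_numbers becomes the keyword "Add Two Numbers". The implementation below is only a minimal sketch, not the code behind the slide's example.

# Minimal Robot Framework test library sketch (hypothetical implementation).
class CalculatorLibrary:

    def add_two_numbers(self, first, second):
        """Usable in test cases as the keyword 'Add Two Numbers'."""
        return int(first) + int(second)

In a test case table the keyword then appears as one step, e.g. | ${sum} = | Add Two Numbers | 2 | 3 |.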









Robot framework test suite

All test cases are encapsulated in one test suite

A test suite has Setup & Teardown phases:

Setup: initialization actions (e.g. configure G35)

Teardown: final actions

These keep the actual tests in between focused

Each test case can also have its own Setup & Teardown









Agile methodology - Scrum

Aims at flexibility, adaptability & productivity

Assumes that requirements and the schedule will probably change during development

The development cycle is a sprint

One sprint = e.g. 2 weeks

Daily Scrum

Planning meeting

Review meeting









Test suite - overview

Three main phases:

1. Initialization

Organize the directory structure for test results

Configure the G35 working environment

Configure the SSH feature on ATCA

Set the traffic profile on G35

2. Statistics Collection (ATCA & Tektronix)

Details are explained on the following slides

3. Results Generation


Test cases

G35 & MME preparation

Initial statistics collection

Start the traffic

Periodically collect statistics (G35 & ATCA) during the traffic period

Stop the traffic

Wait & collect statistics (G35 & ATCA) of the plus period

Final step

A sketch of this flow is shown below.
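A rough Python sketch of this flow, just to make the timing structure concrete. The real suite is written as Robot Framework keywords; the g35 object and the collect_statistics callable are placeholders, and the preparation and final step (MME counter timer handling) are left out for brevity.

import time

def run_test_case(g35, collect_statistics, traffic_period, plus_period, record_duration):
    # g35: object exposing start_traffic()/stop_traffic() (hypothetical)
    # collect_statistics: callable gathering G35 & ATCA counter values
    collect_statistics()                        # initial statistics collection

    g35.start_traffic()
    rounds = int(traffic_period // record_duration)
    for _ in range(rounds):                     # periodic collection during the traffic period
        round_start = time.time()
        collect_statistics()
        # wait until the user-defined record duration of this round is over
        time.sleep(max(0.0, record_duration - (time.time() - round_start)))

    g35.stop_traffic()
    time.sleep(plus_period)                     # plus period after the traffic is stopped
    collect_statistics()                        # last statistics collection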


The essential part of the test sutie is counters’
statistics collection for both G35 & ATCA


Counters’ statistics collection for G35:


Python function on remote client gets a list of
counters defined in G35 & access the values


Counters’ statistics collection for ATCA:


Test case establish SSH connection to the computer
units on ATCA



Send Man Machine Language (MML) commands
to get counters value


Test suite


statistics collection
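For the ATCA side, a minimal sketch using the paramiko SSH library is shown below. The host, credentials and the concrete MML command are assumptions; the slides only state that MML commands are sent over SSH to read the counter values.

import paramiko

def read_counters_over_ssh(host, username, password, mml_command):
    # Open an SSH connection to an ATCA computer unit, run one MML command
    # (e.g. a counter interrogation) and return the raw printout as text.
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=username, password=password)
    try:
        _stdin, stdout, _stderr = client.exec_command(mml_command)
        return stdout.read().decode(errors="replace")
    finally:
        client.close()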









Test suite - ATCA statistics collection

Computer units on ATCA:

OMU: Operation & Maintenance Unit

CPPU: Control Plane Processing Unit

MCHU: Marker & Charging Unit

SMMU: Signaling & Mobility Management Unit

The test case establishes a telnet connection to ATCA

It starts a remote session to the computer unit

It sends MML commands
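A minimal sketch of the same idea over telnet, using Python's standard telnetlib module (current at the time of this work, removed in Python 3.13). The session command, MML command and prompt string are placeholders.

from telnetlib import Telnet

def run_mml_over_telnet(host, unit_session_command, mml_command, prompt=b">"):
    # Connect to the ATCA, start a remote session to the wanted computer unit
    # (e.g. CPPU), send one MML command and return its printout.
    with Telnet(host, timeout=10) as connection:
        connection.write(unit_session_command.encode("ascii") + b"\r\n")
        connection.read_until(prompt, timeout=10)
        connection.write(mml_command.encode("ascii") + b"\r\n")
        return connection.read_until(prompt, timeout=30).decode(errors="replace")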









Test suite - ATCA CPU loads tracking

Tracks the CPU loads of all computer units on ATCA

Makes sure the CPU loads stay at an acceptable level

The CPU loads are tracked simultaneously by running a Python script which uses multithreading.
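A sketch of the multithreaded tracking, assuming a read_cpu_load(unit) helper that queries one computer unit (for example over the MML session shown earlier):

import threading

def track_cpu_loads(units, read_cpu_load):
    # One thread per computer unit, so all CPU loads are sampled simultaneously
    # instead of one unit after another.
    results = {}

    def worker(unit):
        results[unit] = read_cpu_load(unit)

    threads = [threading.Thread(target=worker, args=(unit,)) for unit in units]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()
    return results

# e.g. track_cpu_loads(["OMU", "CPPU", "MCHU", "SMMU"], read_cpu_load=query_unit_load)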









Test suite - Statistics recording & graph generating

Step 1: Data extraction

Plain text -> list of strings

Step 2: String formatting

Each string in the list -> Start Time Period, Counter Number, Counter Name, Counter Value -> one record

Step 3: Recording

Records are written to a *.dat file

Step 4: Graph generating

Use Gnuplot to generate a statistics graph from the DAT file
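A condensed Python sketch of steps 2-4, with the record layout taken from the slide. The DAT column chosen for plotting and the Gnuplot options are assumptions.

import subprocess

def record_and_plot(records, dat_path, png_path):
    # records: iterable of (start_time, counter_number, counter_name, counter_value)
    with open(dat_path, "w") as dat_file:
        for start_time, number, name, value in records:
            dat_file.write(f"{start_time}\t{number}\t{name}\t{value}\n")

    # Generate the statistics graph from the DAT file with Gnuplot
    # (here: counter value against the record index, pseudo-column 0).
    gnuplot_script = (
        f'set terminal png; set output "{png_path}"; '
        f'plot "{dat_path}" using 0:4 with linespoints title "counter value"'
    )
    subprocess.run(["gnuplot", "-e", gnuplot_script], check=True)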









Time domain - user inputs for test run

Three user input values for the test suite run:

Traffic period & Plus period & Record duration

Traffic period: the total time spent on periodic counter collection after the traffic is started

Plus period: the time spent on counter collection after the traffic is stopped

Record duration: the length of each round of counter collection
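A quick worked example of how these inputs relate to the number of collection rounds (the nrOfRound term used on the timing slides); the concrete values are illustrative only.

traffic_period = 60 * 60   # e.g. 1 hour of periodic collection while traffic runs (seconds)
plus_period = 10 * 60      # e.g. 10 minutes of collection after the traffic is stopped
record_duration = 5 * 60   # each collection round lasts 5 minutes

nr_of_rounds = traffic_period // record_duration   # -> 12 rounds during the traffic period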









Time domain - all test steps

[Timeline] G35 & MME preparation (~22s) -> initial statistics collection (~44s) -> start traffic (0s) -> periodical statistics collection during the user-defined traffic period (~nrOfRound * (44s + wait)) -> stop traffic (0s) -> last statistics collection of the plus period (~plusPeriod + 44s) -> final step, stop the measurement (~22s)

Time domain - counters collection part

[Timeline] One round lasts one user-defined record duration and consists of: counters collection, statistics recording (data parsing, organization & writing to files), and waiting until the duration is over.









Results directory









Result graph - ATCA counters









Result graph - G35 counters









Result graph - CPU loads of CPPU








Results of time measurements

G35 & MME preparation: ~22s (prepare the G35 counters ~2s; set the timer for counter measurement from MME memory ~20s)

Initial statistics collection: ~44s

Start the traffic: 0s

Periodic statistics collection (G35 & ATCA) of the traffic period: ~44s per round, including a TOPTEN collection (~22s)

Per-unit times measured for one collection round: Tektronix ~15s (SE) / <2s (classic), SMMU ~0s, MME ~0s, CPPU ~1s, TOPTEN ~22s; data organization & writing to files ~6s per round

Stop the traffic: 0s

Wait & collect statistics (G35 & ATCA) of the plus period: ~plusPeriod + 44s, including a TOPTEN collection (~22s)

Final step: stop measuring counters from MME memory, ~20s









Conclusion

Automated testing for the MME is found to be very important.

It provides an efficient way to get a clear picture of the performance of the MME.

It is helpful for improving the quality of the MME.








Thank you! & Questions?