
Enciso Servicios Cooperatives

Enciso Information System

Master Test Plan

Version: 1.0
Date: 06/04/2010

Confidential





Revision History

Date       | Version | Description                          | Author
28/04/2010 | 0.1     | First Revision                       | Anshuman Mehta
29/04/2010 | 0.2     | Additions for Elaboration 1 delivery | Anshuman Mehta
06/04/2010 | 1.0     | Addition and Modification            | Anshuman Mehta








Table of Contents

1. Introduction
   1.1 Purpose
   1.2 Scope
   1.3 Intended Audience
   1.4 References
2. Governing Evaluation Mission
   2.1 Project Context and Background
   2.2 Evaluation Missions Applicable to this Project/Phase
   2.3 Sources of Test Motivators
3. Target Test Items
4. Overview of Planned Tests
   4.1 Overview of Test Inclusions
   4.2 Overview of Other Candidates for Potential Inclusion
5. Test Approach
   5.1 Measuring the Extent of Testing
   5.2 Identifying, Justifying, and Conducting Tests
       5.2.1 System Testing [Use Case Based Testing, User Interface Testing, Functional Testing]
       5.2.2 Performance Testing
       5.2.3 Security and Access Control Testing
       5.2.4 Installation and Release Testing
6. Entry and Exit Criteria
   6.1 Project/Phase Master Test Plan
       6.1.1 Master Test Plan Entry Criteria
       6.1.2 Master Test Plan Exit Criteria
       6.1.3 Suspension and Resumption Criteria
7. Deliverables
   7.1 Test Evaluation Summaries
   7.2 Reporting on Test Coverage
   7.3 Perceived Quality Reports
   7.4 Incident Logs and Change Requests
   7.5 Smoke Test Suite and Supporting Test Scripts
   7.6 Additional Work Products
       7.6.1 Detailed Test Results
       7.6.2 Additional Automated Functional Test Scripts
       7.6.3 Test Guidelines
       7.6.4 Traceability Matrices
8. Testing Workflow
9. Environmental Needs
   9.1 Base System Hardware
   9.2 Base Software Elements in the Test Environment
   9.3 Productivity and Support Tools
   9.4 Test Environment Configurations
10. Responsibilities, Staffing, and Training Needs
    10.1 People and Roles
    10.2 Staffing and Training Needs
11. Key Project/Phase Milestones
12. Master Plan Risks, Dependencies, Assumptions, and Constraints



Master Test Plan

1. Introduction

1.1 Purpose

The purpose of the Master Test Plan for the complete lifecycle of the Enciso Information System is to:



- Provide a central artifact to govern the planning and control of the test effort. It defines the general approach that will be employed to test the software and to evaluate the results of that testing, and is the top-level plan that will be used by managers to govern and direct the detailed testing work.
- Provide visibility to stakeholders in the testing effort that adequate consideration has been given to various aspects of governing the testing effort and, where appropriate, to have those stakeholders approve the plan.

This Master Test Plan also supports the following specific objectives:

- Identify existing project information and the software components that should be tested
- List the recommended requirements for test (high level)
- Recommend and describe the testing strategies to be employed
- Identify the required resources and provide an estimate of the test effort
- List the deliverable elements of the test project


1.2 Scope

The Enciso Information System will be tested according to the following test stages:

1. Unit Testing
2. Integration Testing
3. System Testing
4. User Acceptance Testing


Unit tests will address the functional quality of each module. The responsibility for the execution of unit tests lies with the developers. During the Unit Testing stage, the developer will primarily focus on code coverage, adopting white-box testing. The developers will use mock objects to test their individually developed code.
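
To make the mock-object approach concrete, here is a minimal JUnit 4 sketch of a unit test that isolates a hypothetical EventService from its data layer with a hand-rolled mock; EventService, EventRepository, and isFacilityBusy are illustrative assumptions, not actual Enciso Information System classes.

    // Illustrative sketch only: EventService and EventRepository are assumed
    // names for this example, not actual Enciso Information System classes.
    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class EventServiceTest {

        // Minimal interface the unit under test depends on.
        interface EventRepository {
            int countEventsFor(String facility);
        }

        // The unit under test: pure logic, delegating data access to the repository.
        static class EventService {
            private final EventRepository repository;

            EventService(EventRepository repository) {
                this.repository = repository;
            }

            boolean isFacilityBusy(String facility) {
                return repository.countEventsFor(facility) > 0;
            }
        }

        @Test
        public void facilityWithScheduledEventsIsReportedBusy() {
            // Hand-rolled mock: returns a canned value instead of hitting a database.
            EventRepository mockRepository = new EventRepository() {
                public int countEventsFor(String facility) {
                    return 3;
                }
            };
            EventService service = new EventService(mockRepository);
            assertTrue(service.isFacilityBusy("Conference Hall A"));
        }
    }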

Integration tests address functional requirements. The responsibility for the execution of the integration tests lies with the developers, and the Architects will be responsible for the final quality of the integrated components. During the Integration Test stage, the implementer will focus on code coverage and functional testing, adopting white-box and black-box testing.

System tests address the functional quality of end-to-end scenarios and will address issues of scalability and performance as well. The responsibility for the execution of the system test cases lies with the Testers, and the Architect will be responsible for the final quality assessment of the system. During the System Test stage, the testers will primarily focus on the requirement coverage of the system, adopting black-box testing. Use case based testing, load testing, and performance testing will form the main test strategy.



Acceptance tests address the functional quality of the end-to-end scenarios and will address issues of scalability and performance as well. The responsibility for the execution of the acceptance tests lies with the Testers and a selected group from the Stakeholders, and the Architect will be responsible for the final quality of the deployed product. During the Acceptance Test stage, the implementer will primarily focus on requirement coverage of the system, which will be validated by the customer, who may give a detailed identification of test cases to be verified.

The most critical testing will be the load and performance testing. This will be addressed as follows:

- A test scenario will generate increasing numbers of requests, up to 10,000.
- Simulation of increasing concurrent users' load, up to 1,000.

1.3 Intended Audience

This document will serve to outline and communicate the intent of the testing effort for a given schedule to the stakeholders.

1.4 References

- Use Case Specifications
- Supplementary Specifications
- Software Development Plan
- Software Architecture Document
- Glossary

2. Governing Evaluation Mission

2.1 Project Context and Background

The Enciso Information System project will be developed as an event management product. This product will provide users the ability to search for facilities, activities, supplies, etc. to create an event, and options to search for and join other events. In addition, the product will have a platform for providers of the facilities, activities, supplies, etc. to store text/multimedia descriptions in the product.

2.2 Evaluation Missions Applicable to this Project/Phase

The mission for the test and evaluation effort will incorporate the following concerns:

- Find as many bugs as possible
- Certify to programming standards
- Verify specifications
- Advise about testing

2.3 Sources of Test Motivators

Testing will be motivated by the project plan, the use cases, and the supplementary specifications. The use case based test strategy will be adopted in this project to analyze, design, and implement functional test cases.

3. Target Test Items

The listing below identifies the test items (software, hardware, and supporting product elements) that have been identified as targets for testing; it represents what will be tested.

- The Enciso Information System web application
- The Amazon EC2 cloud infrastructure, with a Linux instance, JRE 1.6, Spring 2.5.x, and PostgreSQL



4. Overview of Planned Tests

4.1 Overview of Test Inclusions

The following test strategies will be included:

- Functional Testing
- Use Case Based Testing
- User Interface Testing
- Performance Testing
- Load Testing
- Security and Access Control Testing
- Configuration Testing
- Installation Testing


4.2 Overview of Other Candidates for Potential Inclusion

The following test strategies are candidates for potential inclusion:

- Volume Testing
- Failover/Recovery Testing
- Stress Testing

5. Test Approach

This section presents the recommended strategy for analyzing, designing, implementing, and executing the required tests for the Enciso Information System. Sections 3, Target Test Items, and 4, Overview of Planned Tests, identified what items will be tested and what types of tests will be performed; this section describes how the tests will be realized.

The main considerations for the test strategy are the techniques to be used and the criteria for knowing when the testing is complete.

In addition to the considerations provided for each test below, testing should only be executed using known, controlled data, in secured environments.

5.1 Measuring the Extent of Testing

To measure the progress of the testing effort, the following dimensions will be considered:

- Test plan coverage
- Requirements coverage
- Testing quality risks
- Historical trend across iterations

5.2 Identifying, Justifying, and Conducting Tests

5.2.1 System Testing [Use Case Based Testing, User Interface Testing, Functional Testing]

Testing of the application should focus on any target requirements that can be traced directly to use cases and business rules. The goals of these tests are to verify proper data acceptance, processing, and retrieval, and the appropriate implementation of the business rules. This type of testing is based upon black-box techniques, that is, verifying the application (and its internal processes) by interacting with the application via the GUI and analyzing the outputs (results). Identified below is an outline of the testing recommended:


Test Objective: Ensure proper application navigation, data entry, processing, and retrieval.

Technique: Execute each use case, use case flow, or function, using valid and invalid data, to verify that:

1. The expected results occur when valid data is used.
2. The appropriate error/warning messages are displayed when invalid data is used.
3. Each business rule is properly applied.


Completion Criteria: All planned tests have been executed and all identified defects have been addressed.
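
As a sketch of the valid/invalid data technique described above, the JUnit test below drives a hypothetical business rule with both kinds of input; EventValidator and its rule are assumptions made for illustration, not actual system classes.

    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class CreateEventUseCaseTest {

        // Illustrative stand-in for a business rule: an event needs a non-empty
        // name and a positive capacity. Not an actual Enciso class.
        static class EventValidator {
            boolean isValid(String name, int capacity) {
                return name != null && name.trim().length() > 0 && capacity > 0;
            }
        }

        @Test
        public void validDataIsAccepted() {
            // Expected results occur when valid data is used.
            assertTrue(new EventValidator().isValid("Team Offsite", 25));
        }

        @Test
        public void invalidDataIsRejected() {
            // Invalid input must be rejected so the UI can show an error message.
            assertFalse(new EventValidator().isValid("", -1));
        }
    }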


5.2.2 Performance Testing

Performance testing measures response times, transaction rates, and other time-sensitive requirements. The goal of performance testing is to verify and validate that the performance requirements have been achieved. Performance testing is usually executed several times, each using a different "background load" on the system. The initial test should be performed with a "nominal" load, similar to the normal load experienced (or anticipated) on the target system. A second performance test is run using a peak load.

Additionally, performance tests can be used to profile and tune a system's performance as a function of conditions such as workload or hardware configuration.


Test Objective: Validate system response time for designated transactions or business functions under the following two conditions:

- Normal anticipated volume
- Anticipated worst-case volume

Technique: Use the test scripts developed for system testing and modify the data files (to increase the number of transactions) or modify the scripts to increase the number of iterations of each transaction. Scripts should be run on one machine (best case to benchmark single user, single transaction) and be repeated with multiple clients (virtual or actual; see special considerations below).




Completion Criteria:

- Single transaction / single user: successful completion of the test scripts without any failures and within the expected/required time allocation (per transaction).
- Multiple transactions / multiple users: successful completion of the test scripts without any failures and within an acceptable time allocation.
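
For the multiple-client condition, a load driver along the following lines could be used; this is a minimal sketch in plain Java (era-appropriate for JRE 1.6), where the target URL and the client/request counts are placeholder assumptions to be scaled toward the 1,000 concurrent users and 10,000 requests stated in section 1.2.

    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class SimpleLoadDriver {

        public static void main(String[] args) throws Exception {
            // Placeholder endpoint, not the actual deployment URL.
            final String target = "http://localhost:8080/enciso/search";
            int concurrentUsers = 50;       // scale toward 1,000 per the plan
            final int requestsPerUser = 20; // scale toward 10,000 total requests

            ExecutorService pool = Executors.newFixedThreadPool(concurrentUsers);
            for (int u = 0; u < concurrentUsers; u++) {
                pool.submit(new Runnable() {
                    public void run() {
                        for (int r = 0; r < requestsPerUser; r++) {
                            long start = System.currentTimeMillis();
                            try {
                                HttpURLConnection conn =
                                    (HttpURLConnection) new URL(target).openConnection();
                                InputStream in = conn.getInputStream();
                                while (in.read() != -1) { /* drain the response */ }
                                in.close();
                                long elapsed = System.currentTimeMillis() - start;
                                System.out.println("status=" + conn.getResponseCode()
                                    + " time=" + elapsed + "ms");
                            } catch (Exception e) {
                                System.out.println("request failed: " + e.getMessage());
                            }
                        }
                    }
                });
            }
            pool.shutdown();
            pool.awaitTermination(10, TimeUnit.MINUTES);
        }
    }

In practice the per-request timings would be aggregated into averages and percentiles and compared against the required time allocations, rather than printed line by line.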


5.2.3 Security and Access Control Testing

Security and access control testing focuses on two key areas of security:

1. Application security, including access to the data or business functions.
2. System security, including logging in and remote access to the system.

Application security ensures that, based upon the desired security, users are restricted to specific functions or are limited in the data that is available to them. For example, everyone may be permitted to enter data and create new accounts, but only managers can delete them. If there is security at the data level, testing ensures that user "type" one can see all customer information, including financial data, whereas user type two only sees the demographic data for the same client. System security ensures that only those users granted access to the system are capable of accessing the applications, and only through the appropriate gateways.


Test Objective:

- Function/data security: verify that users can access only those functions/data for which their user type has been granted permissions.
- System security: verify that only those users with access to the system and application(s) are permitted to access them.



Technique: For function/data security, identify and list each user type and the functions/data each type has permissions for. Specifically for the Enciso Information System's content management system, access to the system must be reviewed and discussed with the appropriate network or systems administrator. This testing may not be required, as it may be a function of network or systems administration.



Completion Criteria: For each known user type, the appropriate functions/data are available, and all transactions function as expected and as run in prior application function tests.
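
A minimal sketch of the function/data security check, assuming a hypothetical mapping from user types to permissions (UserType and Permission below are illustrative, not actual Enciso types):

    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;

    import java.util.EnumSet;
    import java.util.Set;

    import org.junit.Test;

    public class AccessControlTest {

        enum Permission { CREATE_ACCOUNT, DELETE_ACCOUNT, VIEW_FINANCIAL_DATA }

        // Illustrative role model: every user type maps to a fixed permission set.
        enum UserType {
            CLERK(EnumSet.of(Permission.CREATE_ACCOUNT)),
            MANAGER(EnumSet.allOf(Permission.class));

            private final Set<Permission> granted;

            UserType(Set<Permission> granted) {
                this.granted = granted;
            }

            boolean may(Permission p) {
                return granted.contains(p);
            }
        }

        @Test
        public void clerkCannotDeleteAccounts() {
            // Everyone may create accounts, but only managers may delete them.
            assertTrue(UserType.CLERK.may(Permission.CREATE_ACCOUNT));
            assertFalse(UserType.CLERK.may(Permission.DELETE_ACCOUNT));
        }

        @Test
        public void managerSeesFinancialData() {
            assertTrue(UserType.MANAGER.may(Permission.VIEW_FINANCIAL_DATA));
        }
    }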


5.2.4 Installation and Release Testing

Installation and release testing has two purposes. The first is to ensure that the software can be installed under all possible configurations, such as a new installation, an upgrade, a complete installation or a custom installation, and under normal and abnormal conditions. Abnormal conditions include insufficient disk space, lack of privilege to create directories, etc. The second purpose is to verify that, once installed, the software operates correctly. This usually means running a number of the tests that were developed for function testing.


Test Objective: Verify and validate that the bundled software properly installs and deploys on the cloud infrastructure under the following conditions:

1. New installation: a new instance, never installed.
2. Update: a machine previously installed with the same version.
3. Update: a machine previously installed with an older version.

Technique:

- Develop automated scripts to validate the condition of the target machine (new and never installed, same version already installed, or older version already installed).
- Launch/perform the installation.
- Using a predetermined subset of the integration or system test scripts, run the transactions.

Completion Criteria: Transactions execute successfully without failure.
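
The condition check performed by the automated scripts could look roughly like the sketch below, which classifies the target machine by reading an assumed version marker file (the path and file format are illustrative placeholders, not the actual install layout):

    import java.io.BufferedReader;
    import java.io.File;
    import java.io.FileReader;

    public class InstallConditionCheck {

        // Assumed marker file written by a previous install; purely illustrative.
        private static final String VERSION_FILE = "/opt/enciso/VERSION";

        public static void main(String[] args) throws Exception {
            String newVersion = "1.0";
            File marker = new File(VERSION_FILE);

            if (!marker.exists()) {
                System.out.println("Condition 1: new instance, never installed");
                return;
            }

            BufferedReader reader = new BufferedReader(new FileReader(marker));
            String installed = reader.readLine();
            reader.close();

            if (installed != null && installed.trim().equals(newVersion)) {
                System.out.println("Condition 2: same version already installed");
            } else {
                System.out.println("Condition 3: older version installed: " + installed);
            }
        }
    }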



6. Entry and Exit Criteria

6.1 Project/Phase Master Test Plan

6.1.1 Master Test Plan Entry Criteria

The criteria that will be used to determine whether execution of the test plan can begin are:

1. The test motivators are identified, described, and detailed.
2. The components, services, and other deliverables are available according to the Software Development Plan.

6.1.2 Master Test Plan Exit Criteria

The test cases are executed for the specified iteration.

6.1.3 Suspension and Resumption Criteria

If the test cases executed against any of the architectural components elicit issues that imply system failures, then all test activities should be suspended before the plan has been completely executed. Once the reported defects are resolved, the testing activities can be resumed.







7. Deliverables

7.1 Test Evaluation Summaries

The Test Evaluation Summary collects, organizes, and presents the test results and key measures of testing to enable objective quality evaluation and assessment. The Test Evaluation Summary also presents an interim evaluation from the test team, indicating their assessment of the software.

The information provided will include:

- Test coverage (indicating the ratio of test cases performed and successful)
- Code coverage (indicating the ratio of lines of code executed)
- Perceived quality reports (indicating the number of defects by severity level)


7.2 Reporting on Test Coverage

Report the ratio of test cases performed and successful. For example:

Ratio of test cases performed = 53/75 = 70.67%
Ratio of test cases successful = 43/75 = 57.33%
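
The ratios can be produced mechanically from the raw counts; a trivial sketch using the sample figures above:

    public class CoverageReport {
        public static void main(String[] args) {
            int planned = 75;     // total test cases planned
            int performed = 53;   // test cases executed
            int successful = 43;  // test cases that passed
            // Matches the sample report above: 70.67% and 57.33%.
            System.out.printf("Performed:  %.2f%%%n", 100.0 * performed / planned);
            System.out.printf("Successful: %.2f%%%n", 100.0 * successful / planned);
        }
    }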


7.3 Perceived Quality Reports

Report the number of defects by severity level. The following levels are identified:

- Low
- Medium
- High
- Critical

7.4 Incident Logs and Change Requests

BugZilla will be used to record, track, and manage test incidents. The outline defect tracking process will be as follows:

- Defect Submitted
- Assigned to Resolve
- Opened for Resolving
- Resolved
- Validated
- Closed

7.5 Smoke Test Suite and Supporting Test Scripts

- A JUnit test suite will be used for unit-testing the code (a minimal suite sketch follows this list).
- Ant and shell scripts will be built to automate the build and release process.
- TeamCity will be used as the tool for continuous integration and testing.
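
A minimal JUnit 4 suite sketch is shown below; the member classes are the illustrative tests sketched earlier in this plan, not actual Enciso test classes, and the point is the grouping mechanism rather than the names.

    import org.junit.runner.RunWith;
    import org.junit.runners.Suite;

    // Smoke suite: bundles fast, broad tests so the continuous integration
    // server can run them on every check-in.
    @RunWith(Suite.class)
    @Suite.SuiteClasses({
        EventServiceTest.class,
        CreateEventUseCaseTest.class,
        AccessControlTest.class
    })
    public class SmokeTestSuite {
        // Intentionally empty: the annotations carry the configuration.
    }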




7.6 Additional Work Products




7.6.1 Detailed Test Results

The test cases for the use cases will be listed in separate files, in the format specified in section 7.6.3. Test cases will pertain to a particular use case and will be performed with both valid and invalid data where possible.

Test Identifier:  TT-UUUU-N
Test Category:    XXX-Y
Test Description: Tests the abcd of xyz
Pre-Condition:    abc with xyz exists in rst
Entry Point:      abc navigates to def

Inputs | Outputs Expected | Pass/Fail
1.     |                  |
2.     |                  |
3.     |                  |
4.     |                  |
5.     |                  |

Expected Result: lmn

The details of the inputs and outputs will be specified, and the specific cause of any failure shall be mentioned in the test case document.

7.6.2 Additional Automated Functional Test Scripts

JUnit tests will be written for all the code. These JUnit tests will be automated to run via the continuous integration server, TeamCity, in a distributed environment. TeamCity will build the checked-in code and run the test cases automatically every hour. In the case of a build failure, an email will be sent to the developer to rectify the change.

On top of the JUnit tests, an automated integration testing suite will be built, which will also be run via TeamCity.

7.6.3 Test Guidelines

Each test case will carry the following format.

Test Identifier: TT-UUUU-N

TT (test type):
  ST   System Testing
  PT   Performance Testing
  SA   Security and Access Control Testing
  IR   Installation and Release Testing

UUUU  The first four characters of the use case
N     The numerical identifier for the test case

Test Category: XXX-Y

XXX (testing technique):
  U    Use Case Based Testing
  G    Graphical Based Testing
  F    Functionality Testing

Y (data validity):
  V    Valid Data
  INV  Invalid Data
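
These naming rules lend themselves to mechanical checking; the sketch below validates identifiers against an assumed regular-expression rendering of the format above (the sample identifiers are invented):

    import java.util.regex.Pattern;

    public class TestIdentifierCheck {

        // TT-UUUU-N per the guidelines above: a test type prefix, four
        // use case characters, and a numerical identifier.
        private static final Pattern IDENTIFIER =
            Pattern.compile("(ST|PT|SA|IR)-[A-Za-z]{4}-\\d+");

        // XXX-Y: a technique letter followed by the data-validity code.
        private static final Pattern CATEGORY =
            Pattern.compile("[UGF]-(V|INV)");

        public static void main(String[] args) {
            System.out.println(IDENTIFIER.matcher("ST-Crea-1").matches());   // true
            System.out.println(CATEGORY.matcher("U-INV").matches());         // true
            System.out.println(IDENTIFIER.matcher("XX-Create-1").matches()); // false
        }
    }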





7.6.4 Traceability Matrices

For every test case that fails, a bug will be reported in Bugzilla, with the exact steps for replicating the bug, and the bug will be associated with the corresponding test identifier.

8. Testing Workflow







9. Environmental Needs

9.1 Base System Hardware




The following table sets forth the system resources for the test effort presented in this Master Test Plan.

System Resources

Resource               | Quantity | Name and Type
Client                 | 4        | Intel-based workstation [Windows/Ubuntu]; mobile client [iPhone, Android]
Amazon server instance | 2        | Linux Amazon EC2 instance
Internet               |          |







9.2 Base Software Elements in the Test Environment

The following base software elements are required in the test environment for this Master Test Plan.

Software Element Name       | Version             | Type and Other Notes
Microsoft Internet Explorer | 7+                  |
Google Chrome               | 4.1                 |
Mozilla Firefox             | 3.5                 |
Java Runtime Environment    | 6+                  |
Spring                      | 2.5+                |
jQuery                      | 1.3+                |
Microsoft Windows           | XP SP2, Vista, Win7 |
Linux                       | Ubuntu 9.10, 10.04  |



9.3 Productivity and Support Tools

The following tools will be employed to support the test process for this Master Test Plan.

Tool Category or Type            | Tool Brand Name | Vendor or In-house | Version
Test Management                  | BugZilla        |                    | 3.6
Automated Continuous Integration | TeamCity        |                    | 5.1
Automated Unit Testing           | Apache Ant      |                    | 1.8






9.4 Test Environment Configurations

The following test environment configurations need to be provided and supported for this project.

Configuration Name | Description                  | Implemented in Physical Configuration
Web Browser        | Web terminal access          |
Terminal Shell     | Deployment, package, release |












10. Responsibilities, Staffing, and Training Needs

10.1 People and Roles

This table shows the staffing assumptions for the test effort.

Human Resources

Role: Test Manager
Minimum resources recommended (full-time roles allocated): 1
Provides management oversight. Responsibilities include:
- planning and logistics
- agreeing the mission
- identifying motivators
- acquiring appropriate resources
- presenting management reporting
- advocating the interests of test
- evaluating the effectiveness of the test effort

Role: Test Analyst & Tester
Minimum resources recommended: 2
The Test Analyst identifies and defines the specific tests to be conducted. Responsibilities include:
- defining the test approach and test automation
- defining test details
- documenting change requests
- evaluating product quality
The Tester implements and executes the tests. Responsibilities include:
- implementing tests and test suites
- executing test suites
- logging results
- analyzing and recovering from test failures
- documenting incidents

Role: Designer
Minimum resources recommended: 1
Identifies and defines the operations, attributes, and associations of the test classes. Responsibilities include:
- defining the test classes required to support testability requirements as defined by the test team

Role: Implementer
Minimum resources recommended: 1
Implements and unit tests the test classes and test packages. Responsibilities include:
- creating the test components required to support testability requirements as defined by the designer






10.2 Staffing and Training Needs

This section outlines how to approach staffing and training the test roles for the project.

Two developers and two testers will be additionally staffed to the project. The developers will be in charge of designing and developing the unit test cases. The analysts will help in setting up the TeamCity environment and in the creation of the automatic build scripts. The testers will assist in the functional, use case based, and graphical testing of the system.

11. Key Project/Phase Milestones

Milestone                            | Planned Start Date | Actual Start Date | Planned End Date | Actual End Date
Project/phase starts                 | 05/03/2010         | 05/03/2010        | 07/10/2010       |
Master Test Plan agreed              | 30/04/2010         |                   |                  |
Testing resources requisitioned      |                    |                   |                  |
Testing team first training complete |                    |                   |                  |
Phase 1 exit milestone               | 06/04/2010         | 06/04/2010        | 06/04/2010       |
Requirements baselined               | 06/04/2010         |                   |                  |
Architecture baselined               | 03/05/2010         |                   | 04/06/2010       |
User interface baselined             | 06/04/2010         | 10/04/2010        | 30/04/2010       |
Phase 2 exit milestone               | 04/06/2010         | 04/06/2010        | 04/06/2010       |
Test Process Audit conducted         | 16/07/2010         |                   |                  |
System Performance Test starts       | 30/08/2010         |                   |                  |
Customer Acceptance Testing starts   | 27/08/2010         |                   |                  |
Project Status Assessment review     | 17/09/2010         |                   |                  |
Project/phase ends                   | 07/10/2010         |                   |                  |





12. Master Plan Risks, Dependencies, Assumptions, and Constraints

Risk                                          | Mitigation Strategy                                                                                              | Contingency (risk is realized)
Environment not ready                         | Monitor and drive the acquisition of Amazon EC2 licenses and assign responsibilities to prepare the environment. |
Poor understanding of requirements by testers | Training and meetings for the testing team to be updated on the requirements.                                    |