Quality Assurance Plan


2009


The Jazz Rockers QA plan


Version 1.5


Nan Li, Abhishek Minde, Yuki Saito, Jeff Salk, Bhanu Sistla


QUALITY ASSURANCE PLAN









This document describes the quality assurance plan of the team The Jazz Rockers and how we intend to assure a quality product from our project. It includes a brief context of our project, followed by our quality goals, quality assurance and test strategies, and the organization of our team to support the goals.


The Jazz Rockers QA Plan
Due: 04/24/09

THE JAZZ ROCKERS QA PLAN

CONTENTS

Context
Context Diagram
Project Goals
Quality Definition
Quality Measures
Quality Goals
Quality Metrics
Process Metrics
Product Metrics
Quality Attributes
Functional Requirements
QA Strategies
High Level Summary and QA Strategies
Profiling (Business Drivers)
Traceability Matrix (Business Drivers)
Application of Static Analysis (Defects)
Defect Classification (Defects)
Tools and Techniques
Reviews / Inspection of Artifacts
Review Process
Self Review
Peer Review
External Review
Causal Analysis
Software Quality Assurance
Types of Inspections
Test Strategy and Approach
System Test
Performance Test
Security Test
Regression Test
Recovery Test
Documentation Test
Beta Test
User Acceptance Test (UAT)
Integration Test
Unit Test
Control Procedures
Reviews
Bug Review Meetings
Change Request
Defect Reporting
Functions To Be Tested
Suspension / Exit Criteria
Resumption Criteria
Dependencies
Personnel Dependencies
Software Dependencies
Hardware Dependencies
Test Data & Database
Risks
Schedule
Technical
Client
Personnel
Requirements
Documentation
Test Completion Criteria
Quality Assurance Process Organization
Time and Human Resources
Team Organization
Resources and Responsibilities
Resources
Responsibilities
Conclusion
Glossary
Appendix
TSP Default Quality Metrics
SRS Checklist
Architecture Document Checklist
Coding Standard Checklist
Inspection Standard
References



Revision History

Version  Date      Author          Comments
1.0      04/17/09  Bhanu Sistla    Draft
1.1      04/20/09  Abhishek Minde  Added test details for the system test, UAT and other tests
1.2      04/20/09  Nan Li          Changed the estimated hours; changed the estimated time spent for QA from 20% to 35%
1.3      04/20/09  Yuki Saito      Added details of the various tools to be used for testing and the rationale behind them
1.4      04/21/09  Bhanu Sistla    Restructured and rewrote various parts to suit the requirements of the checklist provided
1.5      04/21/09  Nan Li          Review and update

Purpose of the document

The aim of this document is to describe our quality assurance plan and how we intend to assure a quality product from our project. It includes a brief context of our project, followed by our quality goals, quality assurance and test strategies, and the organization of our team to support the goals.





CONTEXT

Our client has several globally distributed business units. Different business units use different tools to collect and manage software related data. The client has identified the following key issues:

- Different business units have different processes for collecting software engineering data (metrics, artifacts etc.).
- Making meaningful observations about the collected data is hard.
- Data collected in one phase of a software engineering process is not always used in downstream processes.

The vision of the project is to develop an abstracted representation of data coming from different tools and software processes. In addition, multi-modal views of real-time data are provided to make project teams more aware of the project and the processes. The tool can also maintain relationships between multiple project data elements, illustrating how they are related to each other and their impact on each other. It enables project management teams to monitor and track different tasks in projects. It would also provide a basis for organizing historical data for future developments.

CONTEXT DIAGRAM

The following figure represents the context of our project.


As shown in the diagram, the primary function of the GSD tool is to extend the capabilities of the Jazz Server. The tool accesses data from the Jazz server. The system will primarily have two types of users, process manager and normal users. The tool will allow a process manager to define an abstraction for data elements in the Jazz repository on the Jazz server. This abstraction of a Jazz element is called an "abstract data element". The user can also define a relationship, called traceability, between two abstract data elements, and it can be uni-directional or bi-directional.


Furthermore, traceability can also be established between two defined traceability relationships, which would allow a process manager to define a chain of traceability (i.e., transitivity).

In order to provide compatibility with existing tools in use at the client company, the GSD tool can also import data from files (CSV, Word or Excel) in the defined format and put the data into the Jazz repository.

PROJECT GOALS

1. To develop an extension to Jazz / Rational Team Concert (RTC)
2. To abstract data elements that are managed by the IBM Jazz Platform
3. To establish traceability between different abstracted data elements
4. To provide traceability views to different stakeholders
5. To import fixed-format client company data into the Jazz repository


Note: By data we mean the entire project related information that is being created,

updated,
retrieved or deleted in Jazz repository during various phases of the project.

Please follow the below link for more information

on requirements of the project
:


https://msesrv4a
-
vm.scs.ad.cs.cmu.edu:8443/svn/GSD/trunk/Requirements/

QUALITY DEFINITION

The meaning of quality with respect to our project is to develop a product that meets the quality goals mentioned below, both for the product and for the process that we follow to build it.

We realize that we may not be able to meet all the quality goals defined, but TSPi, which we are following, recommends setting ambitious goals to start with and then changing them, if necessary, in every cycle, based on the cycle postmortem and reflection on the data collected. For example, one of our quality goals is to have our SPI > 0.9 at all points of time in all phases. Though it may seem unrealistic to start with, TSP suggests that it is good to aim for such a goal and to adjust the measures during cycle postmortems if we find that it truly is unrealistic. We have had two cycle postmortems this semester and we find ourselves meeting this goal, so we aim to continue with it for the summer semester as well.

QUALITY MEASURES

The following are the means we will use to measure the quality of our product and our process:

a) Process Quality: Measuring process quality means measuring whether we are meeting our process goals (as described in the next section). We are using Process Dashboard (a team software process tool developed by Tuma Solutions) that helps us plan our project, track time and defects, define quality measures, and evaluate whether we are meeting those measures. For example, it helps us analyze our defect injection rate in various phases, defect removal rate, process yield, phase yield, SPI, CPI, schedule variance etc., thus helping us measure our compliance with the goals.


b) Product Goals: We will measure our product goals on a weekly basis using the data that we gather in each cycle. The quality manager (a role described in the team organization section) will track the data (defect logs, time logs, etc.) weekly and will use the review data to analyze the quality of the product (quality will be measured in terms of defects/KLOC in various phases).

QUALITY GOALS

As we are following TSPi, we will start with the quality goals defined by Watts Humphrey in the TSPi book and tailor them as we see fit. The following are the specific goals that we have set:

Goal 1: Produce a quality product
  - Percent of defects found before the first integration testing > 75%
  - Number of defects found in system test < 10/KLOC
  - 'Must have' requirements functions included at project completion: 100%
  - All artifacts must adhere to the standards attached in the appendix as checklists

Goal 2: Run a productive and well-managed project
  - Error in estimated product size < 20%
  - Error in estimated development hours < 20%
  - Percent of data recorded and entered in the project notebook: 100%

Goal 3: Finish on time
  - Days early or late in completing the development cycle < 4
  - SPI > 0.9
  - CPI > 0.85
  - Schedule variance < 50 hours
  - Baseline deviation < 20%

The following are the details of the above-mentioned goals:

a) We will measure the quality of our process based on the schedule performance index, cost performance index and schedule variance. At any given point in time of any phase, our SPI > 0.9, CPI > 0.85 and schedule variance < 50 hours.

b) We have also devised a goal to keep our baseline deviation, described in the next section, always < 20%.

c) A maximum difference of four days is allowed in the schedule completion of a cycle.


d) The error in estimated product size and development hours < 20%. The data collected in Process Dashboard will form the base of historical data that will help us estimate better in the next semester. Process Dashboard also allows us to use the PROBE estimation method for estimating product size and schedule completion, and it forecasts the time required to complete a given task based on the historical data.

e) The percentage of defects that we will find before the integration testing > 75%, and the number of defects found during system test < 10 defects/KLOC.

In addition, the Appendix contains the default standards as prescribed by TSPi that we plan to adhere to.
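The SPI, CPI and schedule variance thresholds in goal 3 can be checked mechanically from earned-value data. As a minimal sketch, assuming hours-based planned/earned/actual values like those Process Dashboard tracks (the function and the sample numbers below are illustrative, not the tool's API):

```python
# Earned-value style checks for goal 3 (sketch; values are task hours).
def schedule_checks(planned_hours, earned_hours, actual_hours):
    spi = earned_hours / planned_hours       # schedule performance index
    cpi = earned_hours / actual_hours        # cost performance index
    variance = planned_hours - earned_hours  # schedule variance, in hours
    return {
        "SPI ok (> 0.9)": spi > 0.9,
        "CPI ok (> 0.85)": cpi > 0.85,
        "variance ok (< 50 h)": abs(variance) < 50,
    }

checks = schedule_checks(planned_hours=400, earned_hours=380, actual_hours=420)
print(checks)
```

Running such a check weekly, at the same time the quality manager pulls the logs, keeps goal 3 visible without any manual arithmetic.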

QUALITY METRICS

This section specifies the metrics that will be collected. The Goal-Question-Metric approach is used to identify the metrics for the project.

PROCESS METRICS


SCHEDULE DEVIATION

Goal
  Purpose: To ensure the achievement of project deadlines
  Issue:   Project deadlines must be met as scheduled
Process:     Schedule
Metric Name: Schedule Deviation
Assessment:  Are you achieving the goal of schedule compliance?
Metrics:     Baseline deviation + current iteration deviation = new baseline deviation;
             baseline deviation < 20%
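Read as a running total, the metric accumulates each iteration's deviation into the baseline deviation, which must stay under 20%. A sketch of that bookkeeping (names and sample values are illustrative, not part of Process Dashboard):

```python
# Running baseline-deviation check (sketch). Deviations are fractions of
# the baselined schedule, e.g. 0.05 == 5%; the goal is to stay under 20%.
def baseline_deviation(iteration_deviations):
    baseline = 0.0
    for dev in iteration_deviations:
        baseline += dev  # baseline deviation + current iteration deviation
    return baseline

total = baseline_deviation([0.04, 0.06, 0.03])
assert total < 0.20  # quality goal: baseline deviation < 20%
print(f"baseline deviation: {total:.0%}")
```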


REVIEW EFFECTIVENESS

Goal
  Purpose: To measure and improve the effectiveness of reviews
  Issue:   The review process should be measured and improved
Process:     Reviews
Metric Name: Review Effectiveness
Assessment:  Are your reviews effective?
Metrics:     DSR - DPR > 0
             DPR - DER > 0

  DSR - self review defect count
  DPR - peer review defect count
  DER - external review defect count
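The two inequalities say that each earlier review tier should find strictly more defects than the next one, i.e. fewer and fewer defects survive to later reviews. A one-line predicate captures both checks (the counts below are illustrative):

```python
# Review-effectiveness check from the GQM table (sketch).
# d_sr, d_pr, d_er are defect counts found in self review, peer review
# and external review of the same artifact.
def reviews_effective(d_sr, d_pr, d_er):
    return (d_sr - d_pr > 0) and (d_pr - d_er > 0)

print(reviews_effective(d_sr=12, d_pr=5, d_er=2))  # defects taper off tier by tier
print(reviews_effective(d_sr=3, d_pr=7, d_er=1))   # peer review found more than self review
```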



PRODUCT METRICS




DEFECT DENSITY

Goal
  Purpose: To measure and improve the quality of the source code
  Issue:   The number of defects in the source code should be measured, evaluated,
           fixed, and used to prevent future defects
Product:     Source Code
Metric Name: Defect Density
Assessment:  Is quality code being developed?
Metrics:     Actual defects per KLOC <= estimated defects per KLOC*

* The initial estimate will be 40 defects per KLOC. This is based on intuition. At the end of each iteration, this estimate will be refined based on the actual value.
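The comparison against the 40 defects/KLOC starting estimate, and the end-of-iteration refinement, can be sketched as follows (the function name and sample counts are illustrative; the real numbers come from the defect and size logs):

```python
# Defect-density check (sketch). The estimate starts at 40 defects/KLOC
# and is refined from actuals at the end of each iteration.
def defect_density(defects, loc):
    return defects / (loc / 1000.0)  # defects per KLOC

estimated_per_kloc = 40.0
actual = defect_density(defects=68, loc=2000)
print(actual, actual <= estimated_per_kloc)

# Refine the estimate from the actual value for the next iteration.
estimated_per_kloc = actual
```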

Scope of the plan: We intend to assure all the quality goals mentioned in this document as of now. In the next semester, we will revisit all the goals in every cycle and revise them if they seem unachievable. We are currently following a cycle-based process and will continue to follow it in the next semester, wherein at the end of every cycle we have postmortems and cycle launches, where we revisit the goals and change them as required.

QUALITY ATTRIBUTES

The following are the high-priority quality attributes for our project. Please refer to Jazz Rockers QA Scenarios.docx for more QA scenarios.

Scenario Priority 1: Scenario Refinement for Scenario 5

Scenario(s):         The BU has a new analysis to be visualized using the subset of data elements existing in the repository after the system is deployed. The new analysis feature should be implemented, tested, and ready for use within '10' person days.
Business Goals:
Relevant Quality Attributes: Modifiability

Scenario Components
Stimulus:            The BU has a new analysis to be visualized using the subset of data elements existing in the repository.
Stimulus Source:     The BU
Environment:         The system is deployed.
Artifact (If Known): The new analysis feature requested by the BU
Response:            The new analysis feature should be implemented, tested, and ready for use within '10' person days.
Response Measure:    A person day
Questions:           What is the predefined set of data?
Issues:              Complexity of new analysis, data changes, data model changes


Scenario Priority 2: Scenario Refinement for Scenario 4

Scenario(s):         The BU wants to integrate the GSD tool with a new data source. The system is deployed. The new connector is fully implemented and tested within '10' person days. The implementer/integrator is familiar with the connector of the GSD tool.
Business Goals:
Relevant Quality Attributes: Modifiability

Scenario Components
Stimulus:            The BU wants to integrate the GSD tool with a new data source
Stimulus Source:     The BU
Environment:         The system is deployed
Artifact (If Known): The GSD tool
Response:            The new connector is fully implemented and tested within '10' person days.
Response Measure:    A person day
Questions:
Issues:              The implementer/integrator's familiarity with the connector of the GSD tool


Scenario Priority 3: Scenario Refinement for Scenario 17

Scenario(s):         A new release of the tool is available. The system is deployed. A clearly defined process exists for performing the upgrade. The upgrade is deployed without any loss of existing data.
Business Goals:
Relevant Quality Attributes: Upgradability

Scenario Components
Stimulus:            A new release of the tool is available
Stimulus Source:     A new release of the tool
Environment:         The system is deployed. A clearly defined process exists for performing the upgrade.
Artifact (If Known): The GSD tool
Response:            The upgrade is deployed without any loss of existing data
Response Measure:    The amount of data loss
Questions:
Issues:



Scenario Priority 4: Scenario Refinement for Scenario 3

Scenario(s):         The GSD tool experiences an internal error and crashes. The system is operational. The state of the previously applied operation is restored within '30' seconds, the last saved data should be retained, and the impact on the Jazz client is limited to the GSD tool.
Business Goals:
Relevant Quality Attributes: Availability

Scenario Components
Stimulus:            The GSD tool experiences an internal error and crashes. The system is operational
Stimulus Source:     The GSD tool
Environment:         The system is operational
Artifact (If Known): The GSD tool
Response:            The state of the previously applied operation is restored within '30' seconds. The last saved data should be retained, and the impact on the Jazz client is limited to the GSD tool.
Response Measure:    Restoration time, data integrity, and significance of the impact
Questions:           What happens if the GSD tool is connected to other external repositories and it experiences an internal error? Can we guarantee that there is no data loss in external repositories?
Issues:


FUNCTIONAL REQUIREMENTS

The functional requirements of this project have been classified into four categories:

- Authorization: allow a process manager to assign roles to the defined users
- Data abstraction: allows defining abstract data types for the underlying data representation in the IBM Jazz Platform (Rational Team Concert)
- Traceability views: allows creating traceabilities (relationships) between different abstracted data types (abstracted from concrete data elements stored in the IBM Jazz platform)
- Importing data: import a fixed-format comma-separated file into the Jazz repository



QA STRATEGIES

HIGH LEVEL SUMMARY AND QA STRATEGIES

Our QA and test strategy consists of a series of different tests that will fully exercise the GSD system. We are following a cyclical development process. Unit tests and integration tests will be conducted at the end of each cycle for the modules developed in that cycle, whereas the system test, user acceptance test, beta test, recovery test, documentation test and other tests mentioned below will occur during the testing phase (the plan for which is described in subsequent sections). Our quality assurance strategies can be classified by quality aspect into general approaches, business driver-oriented approaches, and defect-oriented approaches.

We have also planned to spend 35% of our available time in the coming semester on quality assurance. Though the number might look ambitious, we arrived at this estimate of 35% after discussing it with one of our instructors, Eduardo Miranda, who is an estimation expert, and only after carefully evaluating the requirements and demands of our project. We agreed to spend this much time on QA because our project requires extensive testing due to its complexity.


PROFILING (BUSINESS DRIVERS)

We will use dynamic analysis tools to evaluate the system against the performance requirements. We will use an application server profiling tool to analyze the GSD server. We will also use the VTune Performance Analyzer to analyze different system parameters when the GSD system (client and server) is deployed. We will discuss this in detail under the Testing section.

TRACEABILITY MATRIX (BUSINESS DRIVERS)

A traceability matrix from requirements will be maintained. This will show each requirement, the architecture element that handles it, the detailed design, and the code and test case that implement it. This allows the team to verify that the design satisfies the requirements and vice versa. It will also help us ensure that we don't miss out on requirements.

An example of a requirement traceability matrix is as follows:

Requirement Satisfied | Design Decision Element | Relationship | Design Artifact Reference | Comments and Description

We are using the traceability matrix from the Architecture Centric Design Method (ACDM). For more details, please refer to Table 10.3 in [Lat08].
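One way to keep such a matrix machine-checkable is a simple table keyed by requirement. This sketch (our own structure and sample entries, not anything prescribed by ACDM) flags requirements that no design element satisfies yet:

```python
# Minimal requirement-traceability table (sketch). Each requirement maps to
# the design decision element, artifact reference and comments that satisfy it.
matrix = {
    "REQ-1": {"design_element": "AbstractionService",
              "artifact": "design.doc#3.1",
              "comments": "abstract data element CRUD"},
    "REQ-2": {"design_element": None, "artifact": None, "comments": ""},
}

# Flag requirements that nothing in the design satisfies yet.
uncovered = [req for req, row in matrix.items() if not row["design_element"]]
print(uncovered)
```

A check like this, run before each cycle postmortem, makes "don't miss out on requirements" an automated gate rather than a manual inspection.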



APPLICATION OF STATIC ANALYSIS (DEFECTS)

We will use
PMD
to scan source code and look for potential problems. We will discuss this
approach further in the section on code review.

Many problems that can be detected with dataflow
analysis (such as unhandled null pointers exceptions) can be caught by
PMD

and inspection.


We will use the following static analysis tools:

• FindBugs
• PMD

We chose FindBugs for the following reasons:

• It has more than 300 bug patterns, so it can capture typical bugs that we might introduce.
• It catches defects that cannot easily be identified at compile time, such as null pointer exceptions.
• It allows the user to filter out irrelevant bugs by configuring its settings.

We chose PMD for the following reasons:

• Since we have already tried it on assignment 10, we can start using its basic functionality immediately.
• It catches violations of the coding standard, which can reduce code inspection time.
• It has a meaningful description for each error in the report.
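As an illustration of the filter configuration mentioned above, a FindBugs exclude filter file might look like the following sketch; the package name and the suppressed bug pattern are assumptions chosen only for illustration, not part of our actual settings:

```xml
<!-- Hypothetical exclude filter: suppresses one bug pattern in one package -->
<FindBugsFilter>
  <Match>
    <!-- Package name is an assumption for illustration -->
    <Package name="org.jazzrockers.gsd.generated"/>
    <Bug pattern="EI_EXPOSE_REP"/>
  </Match>
</FindBugsFilter>
```

A filter like this keeps the report focused on bugs the team considers relevant.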

DEFECT CLASSIFICATION (DEFECTS)

The Process Dashboard tool provides defect logging and tracking. It also helps us classify defects according to the classification types listed below. While logging a defect, the phase in which the defect was detected and the phase in which it was removed are also logged, which gives us the information needed to calculate and measure our review yield at different phases.

The TSPi defect classification matches the ODC classification very closely, though it does not include the 'impact' attribute present in ODC. We are not adopting full-fledged ODC for two reasons: first, as mentioned above, these types are built into the TSP tool; second, maintaining impact and the other unavailable attributes would add overhead.

The following are the categories of defects that we are using and will continue to use:

Type Number | Type Name | Description
10 | Documentation | Comments, messages
20 | Syntax | Spelling, punctuation, typos, instruction formats
30 | Build, package | Change management, library, version control
40 | Assignment | Declaration, duplicate names, scope, limits
50 | Interface | Procedure calls and references, I/O, user formats
60 | Checking | Error messages, inadequate checks
70 | Data | Structure, content
80 | Function | Logic, pointers, loops, recursion, computation, function defects
90 | System | Configuration, timing, memory
100 | Environment | Design, compile, test, or other support system problems

Purpose: We will use the data collected under these types to analyze our problems and then improve our process and product accordingly. For example, if we find that the rate of Interface-type defects is high, we will specifically look for key interfaces during code inspection. This is similar to ODC analysis.
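The per-type analysis described above (for example, watching for a high rate of Interface-type defects) can be sketched as a simple tally over logged type numbers. The class and the sample data below are hypothetical; only the type numbers come from the table above:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch: count logged defects per TSPi type number and
// compute the share of one type, as in the Interface-rate example.
public class DefectTally {
    static double shareOfType(int[] loggedTypes, int type) {
        Map<Integer, Integer> counts = new HashMap<>();
        for (int t : loggedTypes) {
            counts.merge(t, 1, Integer::sum);
        }
        return counts.getOrDefault(type, 0) / (double) loggedTypes.length;
    }

    public static void main(String[] args) {
        // Hypothetical defect log: type numbers from the table above
        int[] logged = {50, 80, 50, 20, 50, 80, 10, 50};
        // Share of type 50 (Interface) = 4 of 8 entries
        System.out.println(shareOfType(logged, 50));
    }
}
```

A high share for one type would then steer the inspection checklists, as described above.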

TOOLS AND TECHNIQUES

In this section, we name the tools and techniques that we will use to assure that quality attributes and functional requirements are met. The details of these tests, tools, and techniques are provided in the Test Strategy section. Unit tests using JUnit, integration tests, and system tests will be conducted for the whole GSD system as part of the testing phase. These tests will verify the stability and functionality of the system. Beyond these general-purpose tests, the following special test techniques will be used to check that each quality attribute and functional requirement is being met.

1. QA for scenario priorities 1 & 2: Modifiability
Since both of the first two high-priority quality attributes represent modifiability, we will use the same type of test to ensure both. To assure this quality attribute, we will use regression analysis, which ensures that the working functionality (the baseline) does not break when system functions are modified or added.
For example, to test the traceability framework, we will develop some traceability view plug-ins. Similarly, for import plug-ins, we will write a plug-in for CSV import support. These exercises are more a kind of extensibility than pure modifiability.

2. QA for scenario priorities 3 & 4: Upgradability and Availability
For upgradability we must ensure that there is no loss of data, and for availability we must ensure that after an internal failure the system can return to its previous state (which also relates to data integrity). We will therefore use data recovery testing and performance testing (to ensure the system reverts to its previous state within 30 seconds). We chose these two techniques because data recovery testing provides data integrity and ensures no loss of data, whereas performance testing verifies the 30-second response measure. Please refer to the Test Strategy section for further details.



3. Functional Requirement: Authorization
We will perform security tests to ensure that the right authorizations are granted to the right roles and that the system does not entertain any unauthorized access by non-authorized roles. Most of these will be manual tests. We will also analyze the risk of compromise at the component level. We will have tests which ensure that our GSD system does not allow any unauthorized user or attacker to gain unauthorized access to Jazz server data.
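The role checks that these security tests will exercise can be illustrated with a minimal sketch. The role names and the access rule below are assumptions for illustration, not the GSD system's actual authorization model:

```java
import java.util.Set;

// Minimal sketch of a role-based check like the ones the security
// tests will exercise: only listed roles may read a definition.
public class AuthCheck {
    static boolean canReadDefinition(String role, Set<String> authorizedRoles) {
        return authorizedRoles.contains(role);
    }

    public static void main(String[] args) {
        // Hypothetical role names, chosen only for this example
        Set<String> allowed = Set.of("ProcessManager", "Architect");
        System.out.println(canReadDefinition("ProcessManager", allowed)); // authorized role
        System.out.println(canReadDefinition("Guest", allowed));          // unauthorized role
    }
}
```

The manual security tests described above would probe exactly this boundary: an authorized role succeeds and any other role is rejected.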


4. Functional Requirement: Data Abstraction
Testing this functionality involves testing the internal behavior of the GSD system, and since it would be difficult to use any external tests or techniques, we have decided to test this functionality with JUnit test cases, integration tests, system tests, and usability tests. In each of these steps, the specific functionality of the components will be tested against the requirements. The beta test and the user acceptance test will also validate these functions.

5. Functional Requirement: Import Data
Since testing this functionality involves testing the import of a fixed data format into the GSD Jazz repository, in addition to JUnit tests and other routine tests we will also perform load testing to check whether a large volume of data can be imported without causing bottlenecks. We will spend most of our time testing and ensuring the correctness of this component, and we will conduct code inspection and rigorous testing on this part.
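To give a flavor of the unit checks planned for the import component, the following self-contained sketch parses one CSV row; the row contents and the field layout are assumptions for illustration (a real JUnit test would wrap the same check in assertions):

```java
// Sketch of the kind of unit check planned for CSV import:
// split one row and verify the expected number of fields.
public class CsvRowCheck {
    static String[] parseRow(String row) {
        // Naive split; a real importer must also handle quoting and escapes.
        return row.split(",", -1);
    }

    public static void main(String[] args) {
        // Hypothetical work-item row with a trailing empty field
        String[] fields = parseRow("WI-101,Defect,Open,");
        System.out.println(fields.length); // the trailing empty field is kept
    }
}
```

Edge cases like the trailing empty field above are exactly what the rigorous testing of this component should pin down.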

Tools

Phase/Purpose | Tool to be used
Unit test and regression testing | JUnit
Coverage test | Code coverage testing tool
Regression test | JUnit
Performance analysis | RAD Server Profiler, VTune analyzer
Automated test | Rational Robot
Random test case generation | T2 random test generator tool
Code optimization, standards adherence, design bugs | FindBugs and PMD


We have decided not

to go with tools
and techniques

for protocol verification,

like ESCJava

and
Plural
,

because all team members are

not

comfortable with it and it requires a great learning curve
to be able to ch
eck the correctness of the code, though there are some good reasons to be inclined
towards using it that include pre condition and post condition verification; but after having don
e
cost benefit analysis, we decided against this technique keeping in mind the limitations of our
understanding and given time constraints.


REVIEWS / INSPECTION OF ARTIFACTS

We have devised an elaborate inspection and review process for all key artifacts to be delivered to the client. We will use reviews during self review, peer review, and external review, as explained above under the review process. We will use formal inspections as a follow-up to the self and peer reviews for the following artifacts:



• SRS
• SAD
• HLD
• DLD
• STP
• Code
• User Manual
• Prototypes

We have decided to use formal Fagan-style inspection for all the artifacts except the Software Architecture Document, for which we are following the ACDM review approach: the author goes through the SAD and the QA scenarios, maps them to the requirements, and the team discusses potential issues and concerns. The author resolves the solvable issues after the meeting and updates the architecture, whereas the irresolvable issues are recorded in an issue disposition document and form the basis for experiments that provide further clarification. We have devised our own inspection process based on TSP inspection and Fagan formal inspection. Our inspection process provides:

o A mechanism for ensuring quality is built into the software.
o A means for assuring the quality of the process.
o A means for producing and supporting a software inspection process and the quality assurance aspects of that process for a project.
o A common uniform format and content for a software inspection process across The Jazz Rockers artifacts and processes followed.
o A software inspection process standard tailored to the MSE studio's environment.

REVIEW PROCESS

Following is our review process. Its effectiveness will be measured as per the process review metric.

SELF REVIEW

All project files (artifacts and source code to be delivered to the customer) should be reviewed by their creators. A self review log should be maintained.



(Review flow: Self Review → Peer Review → External Review → Causal Analysis)


PEER REVIEW

All project files (artifacts and source code) should be reviewed by all the team members other than the creator. A peer review log should be maintained. Peer review is not formal inspection: the other team members review the artifact individually, which in turn prepares the whole team for the formal inspection that follows peer review.

EXTERNAL REVIEW

All project files (artifacts and source code) should be reviewed by the team's External Quality Analyst. (Note: our clients, mentors, and other experts from the MSE faculty will serve as external reviewers.) The External Quality Analyst should review the files primarily to identify logical fallacies, standards mismatches, and erroneous representations.

CAUSAL ANALYSIS

A causal analysis will be done at the end of each cycle to inquire into the cause of the defects and to propose solutions that reduce the recurrence of similar defects in future cycles.

SOFTWARE QUALITY ASSURANCE

Software Quality Assurance (SQA) will assure compliance with process requirements by working with the other manager roles in defining the inspection procedures and records. SQA (which consists of the quality manager and the team lead) will assure compliance with the documented inspection procedures by:

a) Verifying that the required data has been collected. We will collect the data specified by the Team Software Process (TSP).

b) Selectively reviewing inspection packages for the required inspection materials.

c) Participating in inspection meetings to whatever extent SQA deems necessary, including fulfilling any of the inspection roles except author.

d) Performing, participating in, and/or assuring the analysis in "Process Evaluation," thereby providing an independent evaluation of the effectiveness of the inspection process and of product quality.

SQA will assure that the reports of inspection process evaluation/analysis are:

• Defined and scheduled.
• Provided as needed to: a) validate positive trends in the inspection process; b) address adverse trends in the inspection process.
• Reviewed with appropriate management and/or technical personnel.
• Considered in inspection process improvements (through PIP forms).

SQA will also assure that all inspection process improvements are documented and tracked for analysis and incorporation, and that inspection anomalies are documented and tracked for analysis and correction.

TYPES OF INSPECTIONS

Since the artifacts produced in different phases differ, we need to prepare for their inspections in slightly different ways. We have prepared a list of activities relevant and specific to each type of formal inspection. Since inspections are expensive, we decided to be elaborate and descriptive in maintaining checklists and exit criteria for each type of inspection we conduct.

These types include system requirements inspection, software requirements inspection, architectural design inspection, source code inspection, test plan inspection, and test procedure inspection. Other than source code inspection, the work products of these inspections are documents, so we will inspect the entire artifact in the inspection meeting. Because it is not possible to inspect all of the source code, we will rely on the judgment of the Development Manager to choose the work products and inspection package based on the complexity and criticality of each module. We will select for inspection the modules that embody the core functionality and logic of the system.

Please refer to our inspection standard, which presents a thorough and descriptive inspection process that can be customized to the circumstances at hand.

Our inspection standard is based on the code inspection standards prescribed by TSPi, which in turn follows the Fagan formal inspection style. The following link gives the specifics of our inspection types and process elements, including the customization details:

<<https://msesrv4a-vm.scs.ad.cs.cmu.edu:8443/svn/GSD/trunk/Quality Standards/Jazz Rockers Inspection Types and Process Elements.docx>>

TEST STRATEGY AND APPROACH

The primary purpose of these tests is to uncover the system's limitations and measure its full capabilities.

Test Scaffolding:

We will use the test scaffolding technique to test components (primarily during unit testing) that rely on other components that are not yet available. We will prepare proxy components for the unavailable components during testing; later, these proxies will be replaced by the corresponding tested components (mostly this integration will be tested using integration test scripts). In this way, based on our release plan, we will plan our integration tests and the scaffolding required.
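The proxy-component idea above can be sketched as coding the consumer against an interface and supplying a stand-in until the real component exists. The interface and names below are hypothetical:

```java
// Scaffolding sketch: the consumer depends on an interface, so a
// proxy (stub) can stand in until the real repository component exists.
interface WorkItemRepository {
    int countWorkItems();
}

// Stand-in used during unit testing; later replaced by the real component.
class StubWorkItemRepository implements WorkItemRepository {
    public int countWorkItems() {
        return 3; // canned answer for the test
    }
}

public class ScaffoldingDemo {
    static String summarize(WorkItemRepository repo) {
        return repo.countWorkItems() + " work items";
    }

    public static void main(String[] args) {
        System.out.println(summarize(new StubWorkItemRepository()));
    }
}
```

Swapping the stub for the real implementation later requires no change to the consumer, which is what makes the later integration step a drop-in replacement.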

Tests:

A list of the planned tests, with brief explanations, follows.

a) System Test
b) Performance Test
c) Security Test
d) Regression Test
e) Recovery Test
f) Documentation Test
g) Beta Test
h) User Acceptance Test
i) Integration Test
j) Unit Test

SYSTEM TEST

Purpose: The system tests will focus on the overall functional behavior of the GSD system. The most important objective is to test all the functions with all system components integrated. User scenarios will be executed against the system, as well as screen mapping and error message testing. Overall, the system tests will exercise the integrated system and verify that it meets the requirements defined in the requirements document.

Phase: System tests will be performed after integration testing.

Test description: System tests will primarily cover the functions defined in the SRS and the user scenarios. Most will be manual tests, as they require complex user interactions with the system. We are also considering Rational Robot for certain automated system tests; Rational Robot provides tool sets for UI testing of client and server applications.

PERFORMANCE TEST

Purpose: The main objective of this test is to analyze the system's behavior under different load conditions. These tests will also let us analyze performance measures of the system, which will aid in predicting its scalability.

Phase: Most of the performance tests will be performed after the system tests have passed. However, certain performance tests will be executed as part of regression and unit testing to ensure that we meet certain performance criteria from the beginning.

Test description: Performance tests will be conducted to ensure that the GSD system's response times meet user expectations and the specified performance quality goals.


We will define different hardware and load profiles for the performance tests.

Hardware profiles:
We will define a basic set of hardware profiles, covering server machine, main memory, CPU, etc. We will test the system on the defined hardware profiles.

Load profiles:
We will test our GSD client against three load profiles, and we plan to test each load profile on every hardware profile.



1. High Load Profile: We will load the Jazz server with more than 50,000 work items and more than 2,000 version-controlled files. The GSD server (distinct from the Jazz server) is loaded with more than 100 abstract data type definitions (specific to our project) and traceability definitions. Some abstract data type definitions will have a heavy transformation filter chain (10+ filters). CSV files with 20,000 rows will be prepared for import into the system. More than 10 users will access the system at the same time.

2. Medium Load Profile: We will load the Jazz server with more than 10,000 work items and more than 500 version-controlled files. The GSD server is loaded with more than 50 abstract data type definitions (specific to our project) and traceability definitions. Some abstract data type definitions will have a medium transformation filter chain (5 filters). More than 5 users will access the system at the same time.

3. Low Load Profile: We will load the Jazz server with more than 1,000 work items and more than 50 version-controlled files. The GSD server is loaded with more than 10 abstract data type definitions (specific to our project) and traceability definitions. Some abstract data type definitions will have a light transformation filter chain (2 filters). One or two users will access the system at the same time.

We will capture different analyses (memory, performance, CPU, latency time) for these profiles.

Latency time: We will use the VTune performance analyzer to measure the latency of the functions that fetch data from the GSD server and the Jazz server. Our system has a client-server architecture: at runtime, the GSD client fetches data from two different servers, the GSD server and the Rational Jazz server.

CPU and memory: We will measure how much CPU and memory (main and secondary storage) is required at the different load profiles, and we will analyze how the GSD client handles memory under different loads. Analysis of the GSD server will differ, as it will be deployed on a WebSphere Application Server (WAS); we will analyze the GSD server with WAS server profiling tools.
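Alongside the profiler measurements, a rough latency figure can also be taken in code. This sketch times a placeholder operation with System.nanoTime; the fetch itself is simulated with a sleep, so it is only illustrative of the measurement shape, not of actual server latency:

```java
// Rough latency measurement sketch to complement profiler data:
// time a (simulated) fetch call and report elapsed milliseconds.
public class LatencyProbe {
    static long timeMillis(Runnable fetch) {
        long start = System.nanoTime();
        fetch.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        long elapsed = timeMillis(() -> {
            try {
                Thread.sleep(20); // stands in for a GSD/Jazz server fetch
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        System.out.println("elapsed ms: " + elapsed);
    }
}
```

Such in-code timings are coarse; VTune and the WAS profiling tools remain the primary measurement instruments.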

SECURITY TEST

Purpose: Security tests will determine how well the GSD system restricts access to different system functions. We have a requirement that the traceability views and abstract data type definitions may be accessed only by authorized user roles.

Phase: Security tests will be executed during system testing and regression testing.

Test description: The key high-level security tests are:

• Traceability reports show only data for which the user has authorization.
• Users cannot access abstract data type definitions or traceability definitions to which they do not have access.

REGRESSION TEST

Purpose: The main objective of this test category is to ensure that working functionality (the baseline) does not break when system functions are modified or added. A suite of tests will be developed to exercise the basic functionality of the GSD system, and it will run automatically whenever a new build is prepared. Regression testing is useful for finding problems in the areas of the system that underwent recent defect fixes.

Phase: Certain regression tests will be executed on every build; heavy, time-consuming regression suites will be executed during integration and system testing. We will not be able to run all regression tests on each build, as that would be very costly.

Test description: Regression tests will cover a variety of tests, including some security tests and some performance tests. Some regression tests will be built using an automated test case/data generator; such tests have an element of random testing.
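The automated test data generation mentioned above can be sketched as seeded random row generation, so that any failing generated case is reproducible from the seed. The row format and field names are assumptions for illustration:

```java
import java.util.Random;

// Sketch of a seeded random data generator for regression runs:
// a fixed seed makes any failing generated case reproducible.
public class RandomRowGen {
    static String randomRow(Random rng) {
        int id = rng.nextInt(10_000);
        String[] states = {"Open", "Resolved", "Closed"}; // hypothetical states
        return "WI-" + id + "," + states[rng.nextInt(states.length)];
    }

    public static void main(String[] args) {
        Random rng = new Random(42); // fixed seed for reproducibility
        for (int i = 0; i < 3; i++) {
            System.out.println(randomRow(rng));
        }
    }
}
```

Logging the seed alongside a regression failure is what turns a random test into a repeatable one.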

RECOVERY TEST

Purpose: Recovery tests will force the GSD system (both client and server) to fail in various ways and verify that recovery is properly performed. It is vitally important that all GSD data is recovered after a system failure and that no corruption of the data occurs.

Phase: We will plan our recovery tests during the system, performance, and beta tests.

Test description: Recovery tests will primarily cover the following scenarios:

1. The GSD server crashes and, upon reboot, the data is intact.

2. The GSD client crashes in the midst of a long operation. Upon restart, it recovers the temporary data that it had retrieved from the servers instead of restarting from the beginning (checkpoint method).
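The checkpoint method in scenario 2 can be sketched as periodically persisting progress so a restarted client resumes where it left off. The file layout and state format here are assumptions, not the GSD client's actual checkpoint design:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Checkpoint sketch: persist how far a long operation has gotten so
// a restarted client can resume from there instead of from zero.
public class Checkpoint {
    static void save(Path file, int itemsFetched) throws IOException {
        Files.writeString(file, Integer.toString(itemsFetched));
    }

    static int restore(Path file) throws IOException {
        // No checkpoint file yet means start from the beginning.
        return Files.exists(file) ? Integer.parseInt(Files.readString(file).trim()) : 0;
    }

    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("gsd-checkpoint", ".txt");
        save(file, 250);                   // crash happens after item 250
        System.out.println(restore(file)); // restarted client resumes here
        Files.delete(file);
    }
}
```

The recovery tests would then simulate a crash between save and restore and assert that the resumed position matches the last checkpoint.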

DOCUMENTATION TEST

Purpose: Documentation tests will be conducted to check the accuracy of the user documentation (user manual).

Phase: We will conduct the documentation tests in the Fall '09 semester, after the user acceptance test.

Test description: Documentation tests will cover:

• Consistency between the user manual and the actual user interactions required
• Complexity of the user manual
• Completeness of the user documentation with respect to the system functions

BETA TEST

Purpose: This test will reveal defects that go undetected in our test environment. The primary objective is to identify defects in the field.

Phase: We will conduct the beta tests in the Fall '09 semester, after the user acceptance and documentation tests.

Test description: Selected users will use the GSD system and report any defects they find in the system or its documentation. This subjects the system to tests that could not be performed in our test environment.

USER ACCEPTANCE TEST (UAT)

Purpose: The purpose of these tests is to confirm that the system has been developed according to the specified user requirements and is ready for operational use.

Phase: UAT will be conducted after successful system and performance tests.

Test description: Once the GSD system is ready for implementation, the GSD users will perform user acceptance testing. UAT will be performed in our test environment.

INTEGRATION TEST

Purpose: The purpose of the integration test is to test the interaction between multiple components.

Phase: Integration tests will be conducted after unit tests, during the build. A build is prepared only if all integration tests pass.

Test description: Integration tests will be designed on black-box testing principles. System components will be developed in isolation, and testing such components will make use of different scaffolding techniques; during integration testing, the proxies used in scaffolding will be replaced with the actual components.

UNIT TEST

Purpose: The purpose of unit testing is to test components at a low level.

Phase: Unit tests will be conducted during the build, before the integration tests; a build is prepared only if all tests pass.

Test description: This includes several white-box testing methods. We will use code coverage (path coverage) to make sure that we have tests for most of the code base. We will use equivalence classes to identify different test areas, which will maximize the effectiveness of the tests.
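The equivalence-class idea can be sketched with a simple validator: each input class (below range, in range, above range) gets one representative test instead of exhaustive inputs. The validator and its bounds below are hypothetical:

```java
// Equivalence-class sketch: a hypothetical validator with three input
// classes (too low, valid, too high); one representative tests each class.
public class FilterCountCheck {
    // Suppose, for illustration, a filter chain allows 1 to 10 filters.
    static boolean isValidFilterCount(int n) {
        return n >= 1 && n <= 10;
    }

    public static void main(String[] args) {
        System.out.println(isValidFilterCount(0));  // representative of "below range"
        System.out.println(isValidFilterCount(5));  // representative of "in range"
        System.out.println(isValidFilterCount(11)); // representative of "above range"
    }
}
```

Picking one representative per class keeps the unit test suite small while still covering each distinct behavior.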


CONTROL PROCEDURES

REVIEWS

As mentioned above in the review process section, reviews are different from inspections. Reviews will be performed by each team member individually in order to prepare for inspection, and a self review is performed before committing a task.

BUG REVIEW MEETINGS

A regular weekly meeting will be held to discuss reported defects. The development department will provide status updates on all reported defects, and the test department will provide additional defect information if needed. All members of the project team will participate.

CHANGE REQUEST

Once testing begins, changes to the GSD system are discouraged. We will conduct a CCB meeting bi-weekly (6 times total) to discuss any proposed functional changes. The CCB will determine the impact of each change and if/when it should be implemented.



DEFECT REPORTING

When defects are found, the testers will complete a defect report in the defect tracking system. The defect tracking system is accessible by testers, developers, and all members of the project team. When a defect has been fixed or more information is needed, the developer will change the status of the defect to indicate its current state. Once a defect is verified and marked as FIXED by the testers, the testers will close the corresponding defect report.




FUNCTIONS TO BE TESTED

The following is a list of functions, based on our functional requirements, that will be tested:



• Add/update work item
• Add/update data element
• Define traceability
• View role reports
• Check authorization
• Import data from external system
• View bi-directional traceability between different data elements
• Define filter criteria for traceability view
• Define sorting criteria for traceability view
• Define association of a transformation filter to a data abstract element
• Allow the process manager to assign roles to the defined users
• Create a new data abstract element in the data repository

Test cases will be written for all the functions to be tested and, as described above, a Requirements Validation Matrix will "map" the test cases back to the requirements.


SUSPENSION / EXIT CRITERIA

If any defects are found that seriously impact test progress, the QA manager may choose to suspend testing. Criteria that will justify test suspension are:

• Hardware/software is not available at the times indicated in the project schedule.
• The source code contains one or more critical defects that seriously prevent or limit testing progress.
• Assigned test resources are not available when needed by the test team.

RESUMPTION CRITERIA

If testing is suspended, it will resume only when the problem(s) that caused the suspension have been resolved. When a critical defect caused the suspension, the fix must be verified by the test department before testing resumes.




DEPENDENCIES

PERSONNEL DEPENDENCIES

The test team requires experienced testers to develop, perform, and validate tests. The test team will also need these resources to be available: application developers and GSD users.

SOFTWARE DEPENDENCIES

The source code must be unit tested and delivered within the time outlined in the project schedule.

HARDWARE DEPENDENCIES

The GSD server and the Jazz server should be up and running, and the individual computers/laptop PCs (with the specified hardware/software) as well as the LAN environment need to be available during normal working hours. Any downtime will affect the test schedule.

TEST DATA & DATABASE

Test data (mock employee information, organization information, role and project-related information, etc.) and the database should also be made available to the testers for use during testing.

RISKS

SCHEDULE

The schedule for each phase is very aggressive and could affect testing. A slip in the schedule of one of the other phases could result in a subsequent slip in the test phase. Close project management is crucial to meeting the forecasted completion date.


TECHNICAL

Since this is a new GSD system, in the event of a failure the old system (which already exists) can be used. We will run our tests in parallel with the production system so that there is no downtime of the current system.

CLIENT

Client support is required so that when the project falls behind, the test schedule does not get squeezed to make up for the delay. The client can reduce the risk of delays by supporting the test team throughout the testing phase and assigning people with the required skill sets (testers who are the actual end users) to this project. We consider this a real possibility because, in June, another team will be established at the client's site to work on extensions to our tool.

PERSONNEL

All 5 team members should be available for their required number of hours; any absence or leave will cause schedule delays.


REQUIREMENTS

The test plan and test schedule are based on the current Requirements Document. Any changes to the requirements could affect the test schedule and will need to be approved by the CCB.

TOOLS

Presently we are using an academic license for the RTC tool and do not have the real (commercial) license.

DOCUMENTATION

The following documentation will be available at the end of the test phase:

• Test Plan
• Test Cases
• Test Case Results
• Requirements Validation Matrix
• Defect Reports
• Final Test Summary Report

TEST COMPLETION CRITERIA

The overall completion criterion for our testing is to have passed all the tests written in the test procedure plan and to have rectified all the defects and errors found in the process. Completion criteria for individual tests are defined as exit criteria in the following section.

QUALITY ASSURANCE PROCESS ORGANIZATION

TIME AND HUMAN RESOURCES

The total amount of resources available in the summer semester is 2880 hours (48 hours/week * 5 people * 12 weeks). We intend to devote 35% of that time to quality assurance. The total expected time for QA tasks for the team is estimated as follows.


Task: Detailed Design Inspection
Exit Criteria:
• Checklist for design inspection met
• Inspection status of the document has been finalized (approved, approved with minor modifications, re-inspection required, etc.)
Estimated Time / Total Hours: 5 people * 3 sessions * 3 hours/session = 45 hours


Task: Unit Testing
Exit Criteria:
• Unit tests for each module are written and run
• We will analyze the defect trends over time; we will stop unit testing a component when its defects are diminishing or we run out of the defined budget
• For critical components, 80% code coverage is achieved by unit tests (we still need to pick the critical components)
Estimated Time / Total Hours: 15% * 1300 [1] hours = 195 hours

Task: Code Inspection
Exit Criteria:
• For less complex code segments: review by at least one peer
• For complex code: formal inspection including preparation, review session, and follow-up with the author
• All critical code is reviewed (we will define critical code when the design is prepared)
Estimated Time / Total Hours: 5000 LOC / 100 [2] LOC/hour * 5 people = 250 hours

Task: Static Analysis
Exit Criteria:
• FindBugs and PMD analysis applied to all code before check-in to version control
• Resolution/documentation of potential problems
Estimated Time / Total Hours: 5% * 1300 [3] hours = 65 hours

Task: CCB Preparation
Exit Criteria:
• Every defect to be reviewed in the meeting is analyzed and made ready for discussion
Estimated Time / Total Hours: 5 people * 6 sessions * 2 hours/session = 60 hours




[1] Planned effort for coding = 1300 hours. (We estimate the final product will have about 10-13 KLOC, and the number of LOC we will modify to fix bugs is about 7 KLOC. Considering a coding rate of 15 LOC/hour/person, we plan 1300 hours in total.)

[2] 1 person-hour per 100 LOC.

[3] Planned effort for coding = 1300 hours, as in note [1].


CCB Meetings

Exit criteria:
- Every defect decided in the meeting is assigned to be rectified or re-analyzed.

Estimated time: 5 people * 6 sessions * 1 hour/session = 30 hours

Integration Test

Exit criteria:
- Tests run on all code units being integrated.
- All specifications met through satisfaction of test criteria.
- Significant drop in the rate of defect discovery.

Estimated time: 5% * 1300 hours [3] = 65 hours

Regression Test

Exit criteria:
- All regression tests have been applied to the updated code.
- All specifications met through satisfaction of test criteria.
- All tests must pass, since they were working before.

Estimated time: 60 hours

System Test

Exit criteria:
- The test passes the user scenarios mapped to the system.
- Requirements described in the SRS are met.

Estimated time: 150 hours

Documentation Test

Exit criteria:
- All the features are covered.
- Contents can be easily understood.
- The client has approved the documentation.

Estimated time: 20 hours

Acceptance Test

Exit criteria:
- Specification fully implemented in the form of tests.
- No additional specifications remain unaddressed.

Estimated time: 45 hours [4]

Beta Test

Exit criteria:
- Product deployed and tested by users at the client's company.

Estimated time: 12 hours [4]




[4] The users will conduct these tests and we only implement fixes for the failed cases, so the time spent on both tests is excluded from our planning; the corresponding effort is added to coding and regression testing.


Planned: 940 hours
Buffer: 68 hours
Budget (2880 * 35%): 1008 hours


TEAM ORGANIZATION


It is the quality manager's responsibility to review the deliverables and ensure their compliance with the set quality standards, but it is everyone's responsibility to comply with quality assurance standards. We have decided to have everyone on the team involved in the QA tasks. External tests, such as the UAT, beta test, and usability test, will be performed by our clients and the end users. Following is our team organization plan for QA.


RESOURCES AND RESPONSIBILITIES

The Quality Manager and Development Manager will determine when the system test will start and end. The Quality Manager is also responsible for coordinating schedules, equipment, and tools for the testers, as well as writing and updating the Test Plan, the weekly test status reports, and the final test summary report. The testers will be responsible for writing the test cases and executing the tests. With the help of the Quality Manager, the GSD tool clients Marcelo Cataldo and Charles Shelton will be responsible for the Beta and User Acceptance tests.

RESOURCES

The test team will consist of:

- A Team Lead
- A Quality Manager
- 5 testers (including the quality manager, development manager, and team lead)
- The Development Manager
- 5 GSD tool users




RESPONSIBILITIES

Team Lead: Serves as liaison between the clients and the project team. Helps coordinate the Beta and User Acceptance testing efforts. Participates in CCB meetings.

Planning Manager: Responsible for project schedules and progress reports. Participates in CCB meetings.

Development Manager: Serves as the primary contact/liaison between the development department and the project team. Participates in CCB meetings.

Quality Manager: Ensures the overall success of the test cycles. Coordinates weekly meetings and communicates the testing status to the project team. Participates in CCB meetings.

Support Manager: Ensures the team has suitable tools and methods to support its work, configuration control, and risk and issue tracking and reporting. Leads the CCB meetings.

Testers: Responsible for performing the actual system testing.

GSD tool users: Will assist in performing the Beta and User Acceptance testing.




Role: Quality Manager

Conduct, moderate, and follow up all inspections:
- Serves as facilitator for inspection meetings
- Assigns inspection roles and acts as moderator
- Follows up the inspections with the author

Static Analysis:
- Monitors the static analysis methodologies used by the team and improves the practice based on results

Test:
- Performs system test, usability test, regression test, and integration test
- Maintains defect logs
- Runs the PMD tool on the integrated code
- Prepares the acceptance test
- Registers found defects

Role: Author

Static Analysis:
- Runs the PMD tool on his/her own code
- Fixes code according to the results

Test:
- Develops unit test code using JUnit
- Fixes code according to the results

Inspections:
- Fixes the defects from the inspection and reviews the fixed product with the QA manager for approval

Role: Support Manager

Static Analysis:
- Sets up the environment for PMD

Test:
- Sets up the environment for JUnit
- Sets up the environment for EclEmma

CCB:
- Facilitates CCB meetings
- Assigns responsibilities before the CCB meeting
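As a concrete illustration of the Author's unit-testing duty above, the sketch below shows the style of check intended. The `IdRangeTest` class and its `inRange` method are hypothetical examples, not project code; in the project itself such checks would be written as JUnit test methods, with coverage measured by EclEmma and the code run through PMD before check-in. Plain `main`-method assertions are used here only so the sketch compiles without the JUnit library.

```java
// Hedged sketch of a unit test; IdRangeTest and inRange are invented
// for illustration. Real project tests would use JUnit + EclEmma.
public class IdRangeTest {

    // Hypothetical unit under test: is id within [lo, hi] inclusive?
    static boolean inRange(int id, int lo, int hi) {
        return id >= lo && id <= hi;
    }

    public static void main(String[] args) {
        // Exercise the boundary cases first, then the rejection cases.
        check(inRange(1, 1, 10), "lower bound is inclusive");
        check(inRange(10, 1, 10), "upper bound is inclusive");
        check(!inRange(0, 1, 10), "value below range is rejected");
        check(!inRange(11, 1, 10), "value above range is rejected");
        System.out.println("all unit checks passed"); // prints "all unit checks passed"
    }

    // Minimal assertion helper standing in for JUnit's assertTrue.
    static void check(boolean cond, String name) {
        if (!cond) throw new AssertionError("failed: " + name);
    }
}
```

Compile and run with `javac IdRangeTest.java && java IdRangeTest`.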



CONCLUSION

All the techniques, test and QA strategies, and processes written in this document are within the scope of our QA plan. Our initial approach is to follow the plan as written and tailor it with improvements in the Process Improvement Proposal meeting that takes place after every cycle postmortem. This will not only help us evaluate our plan and strategy, but will also allow us to incorporate changes whenever required.























GLOSSARY

ACDM: Architecture Centric Development Method
CCB: Change Control Board
CPI: Cost Performance Index
DLD: Detailed Design Document
HLD: High Level Design Document
PIP: Process Improvement Proposal
QA: Quality Assurance
SAD: System Architecture Document
SPI: Schedule Performance Index
SRS: System Requirement Specification
STP: System Test Plan
TSP: Team Software Process



















APPENDIX

TSP DEFAULT QUALITY METRICS

Quality Profile Parameters:
- Design time as a % of coding time: 100%
- Code review time as a % of code time: 50%
- Compile defects/KLOC: 10
- Unit test defects/KLOC: 5
- Design review time as a % of design time: 50%

Estimated Phase Yields:
- Detailed Planning: 0%
- System Requirements Review: 70%
- System Requirements Inspection: 70%
- System Design Review: 70%
- System Design Inspection: 70%
- Software Requirements Review: 70%
- Software Requirements Inspection: 70%
- Software Design Review: 70%
- Software Design Inspection: 70%
- Detailed Design Review: 70%
- Detailed Design Inspection: 70%
- Code Review: 70%
- Compile: 50%
- Code Inspection: 70%
- Unit Test: 50%
- Unit Integration: 50%
- Software Qualification Testing: 50%
- System Integration: 50%
- System Qualification Testing: 50%
- Software Use and Transition: 100%

Estimated Defect Injection Rates:
- Detailed Planning: 0
- System Requirements Analysis: 0.25
- System Design: 0.25
- Detailed Design: 2
- Code: 4
- Compile: 0.3
- Unit Test: 0.2

SRS CHECKLIST

<https://msesrv4a-vm.scs.ad.cs.cmu.edu:8443/svn/GSD/trunk/Quality%20Standards/Jazz%20Rockers%20SRS%20Review%20Checklist.pdf>

ARCHITECTURE DOCUMENT CHECKLIST

<https://msesrv4a-vm.scs.ad.cs.cmu.edu:8443/svn/GSD/trunk/Quality%20Standards/Jazz%20Rockers%20Architecture%20Document%20Checklist.pdf>

CODING STANDARD CHECKLIST

We have decided to follow the Coding Conventions for Java per the Sun standards:
http://java.sun.com/docs/codeconv/CodeConventions.pdf

INSPECTION STANDARD

<https://msesrv4a-vm.scs.ad.cs.cmu.edu:8443/svn/GSD/trunk/Quality%20Standards/Jazz%20Rockers%20Software%20Formal%20Inspection%20Standard.pdf>

QUALITY ATTRIBUTE SCENARIOS

<https://msesrv4a-vm.scs.ad.cs.cmu.edu:8443/svn/GSD/trunk/QAW/Jazz%20Rockers%20QA%20Scenarios.docx>














REFERENCES

[1] Jazz Rockers STP
[2] Team Hermes QA Plan
[3] Team Pangea QA Plan
[Lat08] Anthony J. Lattanze, Architecting Software Intensive Systems: A Practitioner's Guide, Taylor and Francis/Auerbach, 2008.