SENG 530: Software Verification and Validation

V&V Processes and Techniques


Prof. Bojan Cukic

Lane Department of Computer Science and Electrical Engineering

West Virginia University




2

Overview


Software Inspections: 02/14/2002

Software Metrics: Today

Software Reliability Engineering: 02/28/2002


3

Agenda


Software Engineering Measurements.


Measurement Theory.


A Goal-Based Framework for Software Measurement.


Verification and Validation Metrics.

4

Measure? Why?


Developer’s angle.


Completeness of requirements, quality of design,
testing readiness.


Manager’s angle.


Delivery readiness, budget and scheduling issues.


Customer’s angle.


Compliance with requirements, quality.


Maintainer’s angle.


Planning for upgrades and improvements.

5

Measurement


A process by which numbers (or symbols) are assigned to attributes of entities in the real world in such a way as to describe them according to clearly defined rules.


Measurement process is difficult to define.


Measuring colors or intelligence is difficult.


Measurement accuracy, margins of error.


Measurement units, scales.


Drawing conclusions from measurements is
difficult.

6

Measurement (2)


“What is not measurable make measurable” [Galileo, 1564-1642].


Increased visibility, understanding, control.


Measurement:

Direct quantification of an attribute.


Calculation:

Indirect; a combination of measurements used to understand some attribute. (Ex.: the overall score in the decathlon.)
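
A minimal sketch of the distinction, assuming hypothetical data (the module sizes and fault count are invented for illustration): lines of code is a direct measurement of size, while fault density is an indirect measure calculated from two direct ones.

    # Direct measurement: quantify one attribute (size in LOC) by itself.
    modules = {"parser.c": 1200, "lexer.c": 800}      # hypothetical LOC counts
    total_loc = sum(modules.values())

    # Indirect measurement (calculation): combine measurements to
    # characterize another attribute, here fault density per KLOC.
    faults_found = 14                                 # hypothetical fault count
    fault_density = faults_found / (total_loc / 1000.0)
    print(f"{total_loc} LOC, {fault_density:.1f} faults/KLOC")  # 2000 LOC, 7.0 faults/KLOC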


7

Measurement in Software
Engineering


Applicable to managing, costing, planning,
modeling, analyzing, specifying, designing,
implementing, verifying, validating,
maintaining.


Engineering implies understanding and control.


Computer science provides the theoretical foundations for building software; software engineering focuses on a controlled and scientifically sound implementation process.


8

Measurement in Software
Engineering


Considered somewhat of a luxury?!


Weakly defined targets: “Product will be user-friendly, reliable, maintainable”.


Gilb’s Principle of Fuzzy Targets: “Projects without clear goals will not achieve their goals clearly.”


Estimation of costs.


Cost of design, cost of testing, cost of coding…


Predicting product quality.


Considering technology impacts.

9

Software Measurement
Objectives


“You cannot control what you cannot measure.”
[DeMarco, 1982]


Manager’s angle.


Cost: Measure the time and effort of various processes (elicitation, design, coding, test).


Staff productivity: Measure staff time and the size of artifacts; use these for predicting the impact of a change.


Product quality: Record faults, failures, and changes as they occur; cross-compare different projects.


User satisfaction: Response time, functionality.


Potential for improvement.

10

Software Measurement
Objectives (2)


Engineer’s angle.


Are requirements testable?


Instead of requiring “reliable operation”, state the expected
mean time to failure.


Have all faults been found?


Use models of expected detection rates.


Meeting product or process goals.


<20 failures per beta-test site in a month.


No module contains more than x lines (standards).


What does the future hold?


Predict product size from specification size.

11

The scope of software metrics


Cost and effort estimation.


COCOMO 1, Function points model, etc.


Effort is a function of size (LOC, function points),
developer’s capability, level of reuse, etc.


Productivity models and measures.


Simplistic approach: Size/Effort.


Can be quite misleading, even dangerous.
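
A minimal sketch of why the ratio can mislead, using invented numbers: a developer who writes more lines for the same functionality scores as “more productive”.

    # Simplistic productivity = size / effort, with hypothetical data.
    # Two developers implement the same feature in one person-month each:
    dev_a = {"loc": 400, "months": 1.0}    # concise solution
    dev_b = {"loc": 1200, "months": 1.0}   # verbose solution, same functionality

    for name, d in (("A", dev_a), ("B", dev_b)):
        print(name, d["loc"] / d["months"], "LOC/person-month")
    # Developer B appears three times more productive for delivering
    # the same functionality -- the danger noted above.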


12

A productivity model

13

The Scope of Metrics (2)


Data collection


Easier said than done.


Must be planned and executed carefully.


Use simple graphs and charts to present
collected data (see next slide).


Good experiments, surveys, case studies
are essential.


14

Presenting collected data

15

The Scope of Metrics (3)


Quality models and measurements.


Quality and productivity models are usually
combined.


Advanced COCOMO (COCOMO II), McCall’s model.


Usually constructed in a tree-like fashion.


At a high level are indirect factors, at a low level
are directly measurable factors.


17

The Scope of Metrics (4)


Reliability models.


Performance evaluation and modeling.


Structural and complexity metrics.


Readily available structural properties of code (design) serve as surrogates for quality assessment, control, and prediction.


Management by metrics.


Many companies define standard tracking and
project monitoring/reporting systems.


Capability maturity assessment.

18

Agenda


Software Engineering Measurements.


Measurement Theory.


A Goal-Based Framework for Software Measurement.


Verification and Validation Metrics.

19

The basics of measurement


Some pervasive measurement techniques are taken for granted.


A rising column of mercury for temperature
measurement was not so obvious 150 years ago.


But we have developed a measurement framework for temperature.


How well do we understand software
attributes we want to measure?


What is program “complexity”, for example?


20

The basics of measurement
(2)


Are we really measuring the attribute we
want to measure?


Is the number of “bugs” in system testing a
measure of quality?


What statements can be made about an attribute?


Can we “double design quality”?


What operations can be applied to
measurements?


What is “average productivity” of the group?


What is “average quality” of software modules?
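
A small sketch of the scale issue behind these questions, with invented ratings: if module quality is recorded on an ordinal 1-5 scale, the mean assumes equal spacing the scale does not have, while the median uses only ordering.

    # Hypothetical ordinal quality ratings (1 = poor ... 5 = excellent).
    import statistics

    ratings = [1, 1, 2, 5, 5]

    # The mean treats the ordinal codes as if the gaps were equal-sized:
    print(statistics.mean(ratings))    # 2.8 -- not meaningful on an ordinal scale

    # The median relies only on ordering, which the scale does support:
    print(statistics.median(ratings))  # 2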

21

Empirical relations

22

Empirical relations (2)


“Taller than” is an empirical relation.


Binary relation (x is taller than y).


Unary relation (x is tall).


Empirical relations need to be mapped from
real world into mathematics.


In this mapping, the real world is the domain, the mathematical world is the range.


The range can be the set of integers, real numbers, or even non-numeric symbols.

23

Representation condition


The mapping should preserve real world
relations.


A is taller than B iff M(A) > M(B).


The binary empirical relation “taller than” is replaced by the numerical relation >.


So, “x is much taller than y” may mean M(x) > M(y) + 15.
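
A minimal sketch of the representation condition, assuming invented height data: the measurement mapping M (height in centimeters) must preserve the empirical “taller than” relation.

    # Hypothetical height measurements (cm): the mapping M.
    M = {"Ana": 181, "Bo": 166, "Cy": 192}

    def taller_than(x: str, y: str) -> bool:
        # The empirical relation is answered via the numerical relation >.
        return M[x] > M[y]

    # Representation condition: "A is taller than B" iff M(A) > M(B).
    assert taller_than("Ana", "Bo") == (M["Ana"] > M["Bo"])

    # A stronger empirical relation maps to a stronger numerical one:
    def much_taller_than(x: str, y: str) -> bool:
        return M[x] > M[y] + 15   # one interpretation, as on the slide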

24

Stages of formal measurement

25

Agenda


Software Engineering Measurements.


Measurement Theory.


A Goal-Based Framework for Software Measurement.


Verification and Validation Metrics.

26

The framework


Classifying the entities to be examined


Determining relevant measurement
goals


Identifying the level of maturity reached
by the organization

27

Classifying software measures


Processes are collections of software-related activities.


Associated with time, schedule.


Products are artifacts, deliverables or
documents that result from a process
activity.


Resources are entities required by a process activity.
process activity.

28

Classifying software measures
(2)


For each entity, we distinguish:


Internal attributes


Those that can be measured purely in terms of
the product, process or the resource itself.


Size, complexity measures, dependencies.


External attributes


Those that can be measured in terms of how the product, process, or resource relates to its environment.


Experienced failures, timing and performance.

29

Process


Measures include:


The duration of the process or one of its
activities.


The effort associated with the process or its activities.


The number of incidents of a specific type arising during the process or one of its activities.


Average cost of error = cost / #errors_found.
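
For illustration (invented numbers): if an inspection effort costs $12,000 and finds 40 errors, the average cost of error is 12,000 / 40 = $300 per error.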


31

Products


External attributes:


Reliability, maintainability,
understandability (of documentation),
usability, integrity, efficiency, reusability,
portability, interoperability…


Internal attributes:


Size, effort, cost, functionality, modularity,
syntactic correctness.

32

Product measurements


Direct measure example:


Entity: Module design document (D1)


Attribute: Size


Measure: No. of bubbles (in flow diagram).


Indirect measure example:


Entity: Module design documents (D1, D2, …)


Attribute: Average module size


Measure: Average no. of bubbles (in flow
diagram).

33

Resources


Personnel (individual or team), materials
(including office supplies), tools, methods.


Resource measurement may show which resource to blame for poor quality.


Cost is measured across all types of resources.


Productivity: amount_of_output / effort_input.


Combines resource measure (input) with the
product measure (output).

34

GQM Paradigm


Goal-Question-Metric.


Steps:


List the major goals of the development
effort.


Derive from each goal questions that must
be answered to determine if goals are
being met.


Decide what must be measured to answer
the questions adequately.
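
A minimal sketch of the three steps as a data structure; the goal, questions, and metrics below are invented for illustration.

    # GQM: goal -> questions -> metrics, built top-down (hypothetical example).
    gqm = {
        "goal": "Improve the timeliness of change processing",
        "questions": {
            "What is the current change-processing speed?": [
                "average cycle time per change",
                "percentage of changes outside the target cycle time",
            ],
            "Is the process actually improving?": [
                "current average cycle time / baseline average cycle time",
            ],
        },
    }

    # Each metric exists only because some question needs it answered.
    for question, metrics in gqm["questions"].items():
        print(question, "->", ", ".join(metrics))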

35

Generic GQM Example

36

AT&T GQM Example

37

Goal definition templates


Purpose:


To (characterize, evaluate, predict…) the (process,
model, metric…) in order to (understand, assess,
manage, learn, improve…)


Perspective


Examine the (cost, correctness, defects,
changes…) from the viewpoint of (developer,
manager, customer…)


Environment


The environment consists of process factors,
people factors, methods, tools, etc.
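
For illustration, a hypothetical instance of the template: “Characterize the system test process in order to assess it; examine the defects found from the viewpoint of the test manager; the environment is a small team using nightly regression builds.”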

38

Process improvement


Measurement is useful for
understanding, establishing the
baseline, assessing and predicting.


But the larger context is improvement.


SEI proposed five maturity levels,
ranging from the least to the most
predictable and controllable.

39

Process maturity levels

40

Maturity and measurement
overview


Level        Characteristic                          Metrics
Initial      Ad hoc                                  Baseline
Repeatable   Process depends on individuals          Project management
Defined      Process defined & institutionalized     Product
Managed      Measured process                        Process + feedback
Optimizing   Improvement fed back to the process     Process + feedback for changing the process


41

Repeatable process

42

A defined process

43

A managed process

44

Applying the framework


Cost and effort estimation


E = a * S^b


E is effort (person-months), S is size (thousands of delivered source statements).


a, b are environment-specific constants.
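
For illustration, with hypothetical constants a = 3.0 and b = 1.12, a product of size S = 50 gives E = 3.0 * 50^1.12 ≈ 240 person-months.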


Data collection


Orthogonal defect classification

45

Applying the framework (2)


Reliability models


JM model: MTTF_i = a / (N - i + 1).


N: total no. of faults; 1/a is the “fault size”.
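
A minimal sketch of the formula, with invented parameter values: as faults are found and removed, the expected time to the next failure grows.

    # Jelinski-Moranda MTTF per the slide: MTTF_i = a / (N - i + 1),
    # with hypothetical a = 10.0 time units and N = 100 total faults.
    def jm_mttf(a: float, n_total: int, i: int) -> float:
        return a / (n_total - i + 1)

    for i in (1, 50, 100):
        print(f"MTTF_{i} = {jm_mttf(10.0, 100, i):.2f}")
    # 0.10, 0.20, 10.00 -- reliability grows as faults are removed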


Capability maturity assessment


The maturity attribute is also viewed as an attribute of the contractor’s process.

46

Applying the framework (3)


Evaluation of methods and tools

47

Agenda


Software Engineering Measurements.


Measurement Theory.


A Goal-Based Framework for Software Measurement.


Verification and Validation Metrics.

48

V&V application of metrics


Applicable throughout the lifecycle.


Should be condensed for small projects.


Used to assess product, process, resources.


V&V metric characteristics:


Simplicity


Objectivity


Ease of collection


Robustness (insensitive to changes)


Validity

V&V specific metrics (requirements and design)