A Day in the Life of a Tester

Nov 5, 2013

1

A Day in the Life of a Tester


Irinel Crivat

SDET Technical Lead


ASP.NET

http://www.asp.net

2

Overview


The purpose of testing a program is to
find problems in it



What does a tester do?


Design test cases (automated & manual)


Finding problems is the core of a tester's work


Write problem reports

3

What are we testing?


Writing dynamic, high-performance Web
applications has never been easier




ASP.NET helps you
deliver real world Web
applications in record time.




ASP.NET combines
developer productivity with
performance, reliability,
and deployment.






4

What are we testing?


Developer Productivity

Easy Programming Model

Flexible Language Options

Great Tool Support

Rich Class Framework

Enhanced Reliability

Memory leak, deadlock, and crash protection

Easy Deployment

"No touch" application deployment

Dynamic update of running applications

Easy Migration Path

New Application Models

XML Web Services

Mobile Web Device Support

Improved Performance and Scalability

Compiled execution

Rich output caching

Web-Farm Session State

5

Feature Crews


Feature Crew: "A small self-organizing, cross-discipline group that works together to completely deliver a discrete Feature."

The crew has flexibility to decide how to work and how to meet the quality bar

Feature: independently testable functionality taking 1-6 weeks of elapsed time to complete

Crew: 1 PM, 1-5 Dev, 1-5 Test, other disciplines as appropriate

No remaining work is left to do on a feature when it gets checked in

Interim builds of the product are higher quality

Tasks are done earlier in the development cycle


6

Feature Crew Tenets


Small, agile work units - the smaller the better

Dedicated resources to reduce context switching

Autonomy and decision making pushed to lowest levels - you decide how to work, who does what

All members "in it together" - begin and end together

Feature Crew continues until all work is "done" - feature complete, no bugs, tests in place, etc.

7

Key Concepts


Quality Gates


A set of quality metrics and deliverables that must accompany a feature when it is first submitted to the product.

Feature

A discrete piece of independently testable functionality that integrates at the same time into the PU Branch.

Feature Branch

A lightweight source branch dedicated to the development of a single feature. The goal is to allow the Feature Crew to fully stabilize in isolation before integrating.

SCRUM

A project management methodology that involves regular check-in meetings to map progress.



8

What do Feature Crews
offer?


Increased Efficiency


More parallel involvement of all disciplines (better Dev and Test
integration)


Less context switching in the core phase of building a product


Design issues and feature bugs are found earlier when they are
cheaper to fix


More effective and regular communication with cross discipline
counterparts


Fewer intra-version breaking changes


Enhanced Agility and Predictability


More honest schedules


More consistent definition of “done” with quality gates


More confidence in release dates


Spend more time building and less time stabilizing


Improved Quality


Higher quality new features


More stable builds to allow for better customer feedback

9

Feature Crew Life Cycle

10

Discipline Activities

Phase flow: Design Phase -> Checkpoint 1 (Design Review) -> Implementation Phase (Coding, Testing) -> Checkpoint 2 (Status Review) -> Integration Phase -> Feature complete, ready to integrate (RI)

Activities by discipline and phase:

Dev: Design Phase: feature design, dev doc, prototyping | Implementation: coding, unit tests, bug fixing | Integration: finalizing quality gates, testing, bug fixing

Test: Design Phase: feature design, test plans, test libraries, automation | Implementation: testing | Integration: finalizing quality gates, testing

PM: Design Phase: feature design, functional spec, threat model, bug triage | Implementation: feature DCRs, bug triage, writing samples, quickstarts | Integration: finalizing quality gates, testing

UE: Design Phase: feature design, content plans | Implementation: writing content | Integration: finalizing quality gates

11

Responsibility Areas (Feature Crew Members)

PM: lead the feature crew; status tracking/reporting; FC schedule; functional specs; bug triage; write samples, quickstarts; app building; review UE docs; test features (scenario testing)

Dev: design docs; code features, F1; code unit tests; fix bugs; test features; review UE docs

Test: test plan; automation tests; test features; review UE docs

UE: content plan; write documentation; write samples, quickstarts; review FC plans, designs; review UI strings; test features (scenario testing)

Architects: review FC plans; review dev designs; review code

Loc, UX, Fundamentals, Partners: review FC plans, specs; test features as appropriate

Leads, Mgmt: review FC plans, specs, status; manage change, people

12

SCRUM


Agile programming methodology


Small, dedicated groups


Daily sync up meetings among crew members


Anyone can attend, but only crew members can talk


Led by a Scrum Master who tracks progress and drives
problem solving


Ideally a short ~10 minute meeting where crew
members answer only three questions:

What did you do since we last met?


What are you doing next?


Are you blocked? Yes/No


Problem solving and discussion happens after the
SCRUM meeting

13

Phase-by-Phase Breakdown


FC Kickoff


Design Phase


Checkpoint #1


Design Review


Implementation Phase


Checkpoint #2


Progress Review


Integration Phase


Feature Complete/Integration Verification

14

FC Design Phase


Activities - agreeing what to do, how to do it


Create functional specs, dev design plans, test plans, UE plans


All disciplines document plans from their perspective in parallel


Enumerate Quality Gates, Dependencies, Fundamentals


Build into FC plans, decide when/who/how


Identify risks and issues with any of them


Schedule Quality Gate requirements


Updated costing of features, updates to schedule if needed


Define Checkpoint process for FC (depends on feature complexity)


Deliverables - functional spec, dev design, test plans, UE plan


FC Schedule solidified, Work items identified, Feature list updated with appropriate
info


Feature Branch created, builds setup


Tools in place


How it works


FC works together to define designs, all participate in all aspects of design, iterative
brainstorming meetings


Interactive, iterative discussions between PM, Dev, Test, UE


Design phase completion triggers Checkpoint #1


FC decides when/how to schedule checkpoint with mgmt team

15

Testing - Design Phase


Characteristics of a good test


It has a reasonable probability of catching
an error


It is not redundant


It’s the best of its breed


It’s neither too simple nor too complex


It makes program failures obvious

16

Testing - Design Phase


Techniques to come up with powerful
test cases


Equivalence class analysis


Boundary analysis


Testing state transitions


Testing race conditions and other time
dependencies


Doing error guessing

17

Test case design

Equivalence class analysis


If you expect the same result from two
tests, you consider them equivalent


A group of tests forms an equivalence
class when


They involve the same input variables


They result in similar operations in the
program


They affect the same output variables


None force the program to do error
handling or all of them do
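These criteria can be sketched in code. Below is a minimal Python illustration (not tied to any ASP.NET API); `validate_age` is a hypothetical function under test that accepts integer ages from 18 to 65, and one representative value stands in for each equivalence class.

```python
def validate_age(age):
    """Hypothetical function under test: True iff 18 <= age <= 65."""
    return 18 <= age <= 65

# Each class is (description, representative value, expected result).
# All members of a class exercise the same input variable, the same
# program logic, and the same error-handling path, so one value each
# is enough.
equivalence_classes = [
    ("below valid range", 10, False),   # invalid class: age < 18
    ("inside valid range", 30, True),   # valid class: 18..65
    ("above valid range", 80, False),   # invalid class: age > 65
]

for name, value, expected in equivalence_classes:
    assert validate_age(value) == expected, name
```

Three tests cover the whole input domain at the class level; running more values from the same class would be redundant by the definition above.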

18

Test case design

Boundary analysis


The boundary values are the biggest,
smallest, soonest, shortest, loudest,
fastest, ugliest members of the class,
i.e. the most extreme values



Many testers include a mid-range value
in their boundary tests. This is a good
practice.
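A sketch of boundary analysis for an inclusive integer range, again using a hypothetical `validate_age` as the function under test. The helper produces the classic picks: just below, on, and just above each boundary, plus a mid-range value.

```python
def boundary_values(lo, hi):
    """Boundary picks for an inclusive integer range [lo, hi]:
    just below, on, and just above each end, plus a mid-range value."""
    return [lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1]

def validate_age(age):
    """Hypothetical function under test: True iff 18 <= age <= 65."""
    return 18 <= age <= 65

# For [18, 65]: 17 and 66 must be rejected; 18, 19, 41, 64, 65 accepted.
results = [validate_age(a) for a in boundary_values(18, 65)]
assert results == [False, True, True, True, True, True, False]
```

Off-by-one bugs (e.g. `18 < age` instead of `18 <= age`) are caught exactly at these extreme values, which is why boundaries earn their own tests beyond one-per-class.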

19

Test case design


Testing state transitions


Every interactive program moves from one visible
state to another. If you do something that changes
the range of available choices or makes the
program display something different on the screen,
you’ve changed the program state


Advice:


Test all paths that you think people are particularly likely to
follow


If you suspect that choices at one menu level or data entry
screen can affect the presentation of choices elsewhere,
test the effects of those choices or entries


Try a few random paths through the program.
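The advice above can be illustrated with a toy state table. Everything here is hypothetical (a minimal editor with viewing/editing/saving states), but the pattern is the one described: walk a likely user path, then probe a transition that should be illegal.

```python
# Transition table for a hypothetical editor: (state, action) -> next state.
TRANSITIONS = {
    ("viewing", "edit"): "editing",
    ("editing", "save"): "saving",
    ("saving", "done"): "viewing",
    ("editing", "cancel"): "viewing",
}

def step(state, action):
    """Return the next state, or raise if the action is illegal here."""
    try:
        return TRANSITIONS[(state, action)]
    except KeyError:
        raise ValueError(f"{action!r} not allowed in state {state!r}")

# Test a path users are particularly likely to follow...
state = "viewing"
for action in ["edit", "save", "done"]:
    state = step(state, action)
assert state == "viewing"

# ...and verify an illegal transition fails loudly instead of silently.
try:
    step("viewing", "save")
    raise AssertionError("illegal transition was accepted")
except ValueError:
    pass
```

Random-path testing is then just feeding randomly chosen legal actions through `step` and checking the program never reaches an undefined state.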


20

FC Implementation

Phase


Activities - getting it done


Coding, creating unit tests,


Test automation


Testing and bug fixing


Quality work: API Reviews, Threat models, etc.


Fundamentals (perf, stress, etc.)


Deliverables


Daily SCRUM meetings (minimally quick hallway/email sync-ups among FC)


Feature code and unit tests complete


Automation tests in place


Documentation, including tech reviews of content, samples, app building


All FC bugs addressed


How it works


Dev and Test work in tandem - iterative, parallel progress made throughout


PM facilitates, tracks/communicates status, drives bug triage


All FC members help out as appropriate for load balancing - no discipline boundaries


Checkpoint #2 scheduled by FC during Implementation phase when main
coding is complete

21

Testing - Implementation Phase


Activities - getting it done


Find bugs


File problem reports


Test automation


Run automated tests and perform analysis on the
failures


Quality work: API Reviews, Threat models, etc.


Fundamentals (perf, stress, etc.)



Deliverables


70% automation tests in place


100% passing rate on the automated suite


All problem reports (bugs) are verified/closed
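As a minimal sketch of "run automated tests and analyze the failures", here is a small Python `unittest` example; the two tests are hypothetical stand-ins for real feature tests, and the pass-rate calculation mirrors the "100% passing" deliverable.

```python
import unittest

class FeatureTests(unittest.TestCase):
    """Hypothetical stand-ins for a feature crew's automated suite."""

    def test_valid_input_accepted(self):
        self.assertTrue(18 <= 30 <= 65)

    def test_invalid_input_rejected(self):
        self.assertFalse(18 <= 80 <= 65)

# Load and run the suite programmatically, collecting results.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(FeatureTests)
result = unittest.TestResult()
suite.run(result)

# Failure analysis starts from the failing test ids and tracebacks.
passed = result.testsRun - len(result.failures) - len(result.errors)
print(f"{passed}/{result.testsRun} tests passed")
for test, traceback in result.failures + result.errors:
    print("FAILED:", test.id())
```

In practice the same loop feeds an auto-analysis step: each failing id is matched against known issues before a new problem report is filed.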

22

Fundamentals Details


Perf and Scalability


Stress and Capacity


Globalization/Localization


Security


Compatibility


Side-by-Side


Accessibility


User Experience


Hosting


64-bit


Servicing


23

Problem report

A good report is written down


You need to track the problem so you
must describe it in writing. Otherwise,
some details (or the whole problem) will
be forgotten.


You also need a report for testing the fix
later

24

Problem report

A good report is numbered


Track Problem Reports numerically.


Assign a unique number to each report.


If you use a computerized database, the
report number will serve as a key field.


This is the one piece of information that
always distinguishes one report from all
the rest.

25

Problem report

A good report is simple - not compound

Describe only one problem on one report

Why are multiple bugs on a single report a
problem?


The programmer can fix only some of them and
pass the report back as fixed, even though
some bugs have not been fixed. This wastes
time and leads to bad feelings. Remaining
problems often stay unfixed for a long time…

26

Problem report

A good report is understandable

You must describe the program’s
problematic behavior clearly.


Keep all unnecessary steps out of the list of
steps required to reproduce the problem.


27

Problem report

A good report is legible

Think of the person reading it. Unless you
are reporting a disaster, the programmer
will toss an illegible report onto his/her
pile of things to look at next year.


28

Problem report

A good report is non-judgmental

Nobody likes being told that what they did
was wrong. As a tester, that’s what you
tell people every day.

Do not express personal judgments in
your reports

The programmer and the tester are a team
that makes the product better


29

Problem report

A good report is reproducible

How can I make the bug reproducible?


You can describe how to get the program into a known
state, and from there describe an exact series of steps


Find the simplest, shortest and most general
conditions that will trigger the bug


Find related problems


Find alternate paths to the same problem


Find the most serious consequences


Look for the critical step
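Finding the simplest, shortest conditions can itself be partly automated. Below is a minimal greedy input-minimization sketch in Python; `fails()` is a hypothetical oracle standing in for "run the program and check whether the bug appears".

```python
def fails(items):
    """Hypothetical oracle: the stand-in bug is triggered
    whenever 0 and 7 both appear in the input."""
    return 0 in items and 7 in items

def minimize(items):
    """Greedily drop elements that are not needed to reproduce
    the failure, yielding a shorter repro case."""
    assert fails(items), "start from a known failing input"
    for item in list(items):
        trial = [x for x in items if x != item]
        if fails(trial):        # still fails without this element: drop it
            items = trial
    return items

print(minimize([3, 0, 5, 7, 9]))   # -> [0, 7]
```

The shrunken input makes the critical step stand out: whatever the program does with the two surviving values is where to look for the bug.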

30

Test Automation


Automation framework (in-house)


Building Maintainable Test Suites


Automation Results Reporting


Automation auto-analysis tools


Code coverage


Keeping test suites reliable


Gauntlet and Code Reviews

31

FC Integration Phase


Activities - making sure it works


Integrate from PU branch


Run all tests, verify feature works in integrated world


Several functional runs


Stress


Performance


Code Coverage Run


Fix and verify bugs (includes Product bugs)


Quality Gates final verification


Integrate into PU branch, final verification in PU branch


Deliverables


Feature List data updated


Quality Gates complete, verified


Final integration testing complete


How it works


FC tests feature with latest integrated code from PU Branch


All FC members verify feature meets requirements and works as
expected


Team members move onto next Feature Crew



32

Resources:


Books


How to Break Software: A Practical Guide to
Testing, by James A. Whittaker


Lessons Learned in Software Testing, by Cem
Kaner


Effective Software Testing: 50 Specific Ways to
Improve Your Testing, by Elfriede Dustin

33

Comments, Q & A