Test Automation Report

















May 15, 2003









Submitted To:

Prof. Gao

CmpE 196H


Submitted By:

Valerie Stanton

Hatsey Frezghi

Todd Wilkinson

Nick Monge


VHTN Software Development



Table of Contents

1 Introduction
1.1 Objective
1.2 Focuses
1.3 Strategy
2 Unit Test Bed
2.1 Unit Testing
2.1.1 White Box Testing - Basis Path Testing of Tree Repository Module
2.1.2 Black Box Testing - Equivalence Partitioning
3 System Testing
3.1 System-Level Testing - Equivalence Partitioning
3.2 System-Level Testing - Boundary Value Testing
3.3 System-Level Testing - Performance Testing
4 Problem Reporting
5 Tool Usage
5.1 Junit
5.2 Custom Automation Tool
5.3 eValid
5.4 Element Tool
6 Experience and Lessons Learned
6.1 Overall Experience
6.2 Junit
6.3 Custom Automation Tool
6.4 eValid
6.5 Element Tool
7 Comparison of Manual vs. Automated Testing
8 References



1 Introduction

This document includes the objective, focuses, and strategy used in our test automation. In addition, it includes the test bed structure and tools used for unit white box testing, and the test environment and supporting tools for the system-level testing. The results and planning involved in our automation testing will be compared with those of the manual testing. The actual code written for the automation and the results themselves are included in accompanying documentation.



1.1 Objective

The primary objective for our test automation was to become familiar with the process of automating test cases and generating test data. Through this process, we were able to compare the automation with manual testing. In determining how much automation would be accomplished, we took into account the given period of time as well as the availability of tools to assist in the automation. Since all of the testing had previously been performed manually, there was no need to develop test cases; we were able to reuse the test cases used in manual testing. We kept track of the number of hours spent developing the automated tests for the selected features, and in our conclusion a comparison to the number of hours spent testing the same features manually is presented. These time studies will provide two measures that can be used for setting future goals and predicting the percentage of coverage from test automation:

• Time to automate per feature
• Time to automate as compared to time to manually test

We anticipate, however, that subsequent projects would take less time because of experience.



1.2 Focuses

Because none of our team had any experience in developing automation tools, determining a focus for our testing was difficult. We spent some time just deciding what to automate and agreed to perform the following automation:

• Perform test data generation
• Automate test case execution (using both an existing tool and our own routine)
• System-level testing using eValid
• Problem report generation using an existing tool (Element Tool)

We then had to plan the test automation and determine how we would automate the test cases and generate test case data. The ability to control test data in the test environment is important in both manual and automated testing to ensure the validity of expected results. When tests are automated, it is important to have the ability to reset the data prior to test execution and to simulate feeding the data. We decided to prepare a separate set of data for each test case to ensure that the result of one test case did not impact the result of another. In addition, we decided to use random inputs that were generated using a specific set of rules for each required input. In generating the test case functions, we developed and enforced a set of coding standards and naming conventions to promote efficiency so the results of each test case could be easily evaluated.
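The rule-driven generation described above can be sketched as follows. This is a hypothetical illustration rather than the routine we actually wrote; the unique-key rule and the value range are assumptions based on the input constraints described later in this report.

```java
import java.util.HashSet;
import java.util.Random;
import java.util.Set;

// Hypothetical sketch of rule-based test data generation: each input is
// drawn at random under its own rule, and duplicates are rejected so the
// result of one test case cannot depend on another.
public class TestDataGenerator {
    private final Random random = new Random();
    private final Set<Integer> used = new HashSet<>();

    // Rule: keys are unique integers within the given range.
    public int nextKey(int min, int max) {
        while (true) {
            int candidate = min + random.nextInt(max - min + 1);
            if (used.add(candidate)) {
                return candidate;
            }
        }
    }

    public static void main(String[] args) {
        TestDataGenerator gen = new TestDataGenerator();
        for (int i = 0; i < 25; i++) {
            System.out.println("generated key: " + gen.nextKey(-999, 999));
        }
    }
}
```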


1.3 Strategy

Our strategy was to provide automated coverage for as many areas as possible where manual testing had already taken place. This allowed us to perform a direct comparison of the two.


2 Unit Test Bed

This section addresses the tools used to automate the white box testing for the repository portion of the program.

2.1 Unit Testing

Unit testing is done at the source or code level to catch language-specific programming errors such as bad syntax and logic errors, and to test particular functions or code modules. The unit test cases were designed to verify the program's correctness.


2.1.1 White Box Testing - Basis Path Testing of Tree Repository Module

In white box testing, the user interface is bypassed. Inputs and outputs are tested directly at the code level and the results are compared against specifications. This form of testing ignores the function of the program under test and focuses only on its code and the structure of that code. The test cases that have been generated shall cause each condition to be executed at least once.

Each function of the binary tree repository is executed independently; therefore, a program flow graph for each function was derived from the code. Using the program flow graph for each function in our tree repository module, we were able to determine all of the paths that needed to be tested and developed the corresponding test cases. In order to test the success of each path, we used Junit to create a test suite, which included all of our white box test cases. Any preconditions needed to exercise a path were created upon execution of each test case. The test cases were all executed independently, so the test data was reloaded with each test case.

When the complete suite is run, a random number generator is used at initialization to set the input data. It sets the variables according to the rules associated with the respective variable (i.e., root, right child, left child, etc.). We also created a version of the Junit test suite that uses the specific values called out in the test cases. Both sets of results are shown in the Test Automation Results. To make the internal tests visible in the program, assertion checking was used. If the path executed correctly, the test was set up to return true. The test cases included in the Junit test suite are shown in the following tables.
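As an illustration of what one basis-path test looks like in code, the sketch below pairs a minimal binary-tree insert with an assertion-checked test in the spirit of case W-1.2 (root 28 exists and key 27 belongs in the empty left subtree). It is a hypothetical stand-in for the repository code; the node references in the comments are only suggestive of a flow graph, not the report's actual numbering.

```java
// Minimal BST insert plus a basis-path style check, mirroring the idea
// behind test case W-1.2. Illustrative sketch, not the project's code.
public class InsertPathSketch {
    static final class Node {
        int key;
        Node left, right;
        Node(int key) { this.key = key; }
    }

    // Returns true when the key is inserted; duplicates are rejected.
    static boolean insert(Node root, int key) {
        if (key == root.key) {
            return false;                       // duplicate: reject
        }
        if (key < root.key) {
            if (root.left != null) {
                return insert(root.left, key);  // loop back through the graph
            }
            root.left = new Node(key);          // attach as left child
        } else {
            if (root.right != null) {
                return insert(root.right, key);
            }
            root.right = new Node(key);         // attach as right child
        }
        return true;
    }

    public static void main(String[] args) {
        Node root = new Node(28);               // precondition of W-1.2
        boolean inserted = insert(root, 27);
        // Assertion checking makes the internal result visible, as in Junit.
        assert inserted && root.left.key == 27 : "path 1.2 failed";
        System.out.println("key 27 inserted: " + inserted);
    }
}
```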


Insert

Path ID   Path
1.1       1, 2, 3, 5, 1, 2, 3, 6, 7, 11, 12
1.2       1, 2, 3, 6, 7, 11, 12
1.3       1, 2, 4, 8, 10, 11, 12
1.4       1, 2, 4, 9, 1, 2, 4, 8, 10, 11, 12


Test Cases

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-1.1
Test Case Precondition: root (28) and left child (27) exist and the key to be inserted is less than 27
Item(s) to be tested: Repository Module: Insert - Path 1.1
Specifications:
Input: 25
Expected Output/Result: key 25 inserted: return true


Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-1.2
Test Case Precondition: root (28) exists and key to be inserted is less than 28
Item(s) to be tested: Repository Module: Insert - Path 1.2
Specifications:
Input: 27
Expected Output/Result: key 27 inserted: return true

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-1.3
Test Case Precondition: root (28) exists and key to be inserted is greater than 28
Item(s) to be tested: Repository Module: Insert - Path 1.3
Specifications:
Input: 50
Expected Output/Result: key 50 inserted: return true



Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-1.4
Test Case Precondition: root (28) and right child (50) exist and the key to be inserted is greater than 50
Item(s) to be tested: Repository Module: Insert - Path 1.4
Specifications:
Input: 55
Expected Output/Result: key 55 inserted: return true



Delete

Path ID   Path
2.1       1, 2, 3, 4, 5, 7, 17, 18, 19, 21, 22, 24, 25, 35, 36
2.2       1, 2, 3, 4, 5, 7, 17, 18, 19, 21, 23, 24, 25, 35, 36
2.3       1, 2, 4, 5, 6, 7, 17, 26, 27, 29, 30, 32, 25, 35, 36
2.4       1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 35, 36
2.5       1, 2, 3, 4, 5, 6, 7, 8, 9, 13, 14, 16, 35, 36
2.6       1, 2, 4, 5, 7, 33, 34, 1, 2, 3, 4, 5, 6, 7, 8, 9, 13, 15, 16, 35, 36



Test Cases

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-2.1
Test Case Precondition: tree contains root (28), left child (27)
Item(s) to be tested: Repository Module: Delete - Path 2.1
Specifications:
Input: 28
Expected Output/Result: 28 deleted - return true

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-2.2
Test Case Precondition: tree contains root (28), right child (50)
Item(s) to be tested: Repository Module: Delete - Path 2.2
Specifications:
Input: 28
Expected Output/Result: 28 deleted - return true

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-2.3
Test Case Precondition: tree contains root (28), right child (50), right child (55)
Item(s) to be tested: Repository Module: Delete - Path 2.3
Specifications:
Input: 28
Expected Output/Result: 28 deleted - return true



Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-2.4
Test Case Precondition: tree contains root (28)
Item(s) to be tested: Repository Module: Delete - Path 2.4
Specifications:
Input: 28
Expected Output/Result: return null

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-2.5
Test Case Precondition: tree contains root (28), right child (50)
Item(s) to be tested: Repository Module: Delete - Path 2.5
Specifications:
Input: 50
Expected Output/Result: 50 deleted - return true

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-2.6
Test Case Precondition: tree contains root (28), left child (27)
Item(s) to be tested: Repository Module: Delete - Path 2.6
Specifications:
Input: 27
Expected Output/Result: 27 deleted - return true





Search

Path ID   Path
3.1       1, 2, 3, 10
3.2       1, 2, 4, 5, 1, 2, 3, 10
3.3       1, 2, 4, 6, 10
3.4       1, 2, 7, 8, 10
3.5       1, 2, 7, 9, 1, 2, 3, 10


Test Cases

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-3.1
Test Case Precondition: tree contains root (28)
Item(s) to be tested: Repository Module: Search - Path 3.1
Specifications:
Input: 28
Expected Output/Result: 28 found - return true




Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-3.2
Test Case Precondition: tree contains root (28), left child (27)
Item(s) to be tested: Repository Module: Search - Path 3.2
Specifications:
Input: 27
Expected Output/Result: 27 found - return found true

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-3.3
Test Case Precondition: tree contains root (28), left child (27), right child (50)
Item(s) to be tested: Repository Module: Search - Path 3.3
Specifications:
Input: 5
Expected Output/Result: 5 not found - return not found true

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-3.4
Test Case Precondition: tree contains root (28), left child (27), right child (50)
Item(s) to be tested: Repository Module: Search - Path 3.4
Specifications:
Input: 60
Expected Output/Result: 60 not found - return not found true



Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-3.5
Test Case Precondition: tree contains root (28), right child (50)
Item(s) to be tested: Repository Module: Search - Path 3.5
Specifications:
Input: 50
Expected Output/Result: 50 found - return found true



List - Ascending

Path ID   Path
4.1       1, 2, 5
4.2       1, 2, 3, 1, 2, 5
4.3       1, 2, 3, 4, 5
4.4       1, 2, 3, 4, 1, 2, 5




Test Cases

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-4.1
Test Case Precondition: there are no nodes in the tree
Item(s) to be tested: Repository Module: List Ascending - Path 4.1
Specifications:
Input: execute ascend, passing null root
Expected Output/Result: return tree traversed true

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-4.2
Test Case Precondition: tree contains root (28)
Item(s) to be tested: Repository Module: List Ascending - Path 4.2
Specifications:
Input: execute ascend, passing root value of 28
Expected Output/Result: return tree traversed true

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-4.3
Test Case Precondition: tree contains root (28), left child (27), right child (50)
Item(s) to be tested: Repository Module: List Ascending - Path 4.3
Specifications:
Input: execute ascend, passing root value of 28
Expected Output/Result: return tree traversed true



Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-4.4
Test Case Precondition: tree contains root (28), right child (50)
Item(s) to be tested: Repository Module: List Ascending - Path 4.4
Specifications:
Input: execute ascend, passing root value of 28
Expected Output/Result: return tree traversed true


List - Descending

Path ID   Path
5.1       1, 2, 5
5.2       1, 2, 3, 1, 2, 5
5.3       1, 2, 3, 4, 5
5.4       1, 2, 3, 4, 1, 2, 5



Test Cases

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-5.1
Test Case Precondition: there are no nodes in the tree
Item(s) to be tested: Repository Module: List Descending - Path 5.1
Specifications:
Input: execute descend, passing null root
Expected Output/Result: return tree traversed true

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-5.2
Test Case Precondition: tree contains root (28)
Item(s) to be tested: Repository Module: List Descending - Path 5.2
Specifications:
Input: execute descend, passing root value of 28
Expected Output/Result: return tree traversed true

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-5.3
Test Case Precondition: tree contains root (28), left child (27), right child (50)
Item(s) to be tested: Repository Module: List Descending - Path 5.3
Specifications:
Input: execute descend, passing root value of 28
Expected Output/Result: return tree traversed true



Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-5.4
Test Case Precondition: tree contains root (28), left child (27)
Item(s) to be tested: Repository Module: List Descending - Path 5.4
Specifications:
Input: execute descend, passing root value of 28
Expected Output/Result: return tree traversed true


Read

Path ID   Path
6.1       1, 2, 6
6.2       1, 2, 3, 4, 5, 2, 6



Test Cases

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-6.1
Test Case Precondition: no items in the file
Item(s) to be tested: Repository Module: Read - Path 6.1
Specifications:
Input: empty file: test.txt
Expected Output/Result: return file read true

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-6.2
Test Case Precondition: 2 items in the file
Item(s) to be tested: Repository Module: Read - Path 6.2
Specifications:
Input: file test.txt containing: 20, 10
Expected Output/Result: return file read true




Store

Path ID   Path
7.1       1, 2, 6
7.2       1, 2, 3, 4, 1, 2, 6
7.3       1, 2, 3, 4, 5, 1, 2, 6


Test Cases

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-7.1
Test Case Precondition: there are no nodes in the tree
Item(s) to be tested: Repository Module: Store - Path 7.1
Specifications:
Input: execute store, passing null root
Expected Output/Result: return tree traversed true





Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-7.2
Test Case Precondition: tree contains root (28), left child (27)
Item(s) to be tested: Repository Module: Store - Path 7.2
Specifications:
Input: execute store, passing root value of 28
Expected Output/Result: return tree traversed true

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-7.3
Test Case Precondition: tree contains root (28), left child (27), right child (50)
Item(s) to be tested: Repository Module: Store - Path 7.3
Specifications:
Input: execute store, passing root value of 28
Expected Output/Result: return tree traversed true



Write

Path ID   Path
8.1       1, 2, 3, 4
8.2       1, 2, 3, 2, 3, 4



Test Cases

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-8.1
Test Case Precondition: create vector containing 1 integer: root (28)
Item(s) to be tested: Repository Module: Write - Path 8.1
Specifications:
Input: execute write
Expected Output/Result: return file written true

Test to be Performed By: Executed by Junit
Test Type: White Box - Basis Path
Test Case Number: W-8.2
Test Case Precondition: create vector containing 2 integers (28, 50)
Item(s) to be tested: Repository Module: Write - Path 8.2
Specifications:
Input: execute write
Expected Output/Result: return file written true


2.1.2 Black Box Testing - Equivalence Partitioning

The following table represents the automated equivalence classes, both valid and invalid, for the repository. There are many other equivalence classes at the system level that were not within the scope of this automation.

Input/Output Event: Input maximum number of allowed values
Valid Equivalence Classes: 1: 25 values
Invalid Equivalence Classes: 2: > 25 values

In addition to testing official test cases, we added some other miscellaneous function testing to this tool. Since a number of values were loaded (25 integers), we decided to test the functionality of the ascending and descending list functions as well as search, delete, and clear.
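The two equivalence classes in the table above (at most 25 values accepted, more than 25 rejected) can be illustrated with a small sketch. The repository stand-in below is hypothetical, with the message text borrowed from the test cases that follow.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the two capacity equivalence classes: up to 25 values are
// accepted (valid class 1), and any further insert is refused (invalid
// class 2). Hypothetical stand-in for the applet's repository.
public class CapacityCheck {
    static final int MAX_VALUES = 25;
    private final List<Integer> values = new ArrayList<>();

    // Returns a status string mimicking the applet's display message.
    public String insert(int value) {
        if (values.size() >= MAX_VALUES) {
            return "The tree is full";
        }
        values.add(value);
        return "inserted " + value;
    }

    public static void main(String[] args) {
        CapacityCheck tree = new CapacityCheck();
        String last = "";
        for (int i = 1; i <= 26; i++) {
            last = tree.insert(i);  // the 26th insert is rejected
        }
        System.out.println(last);
    }
}
```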


Test Cases

Test to be Performed By: Custom Tool
Test Type: Black Box - Equivalence Partitioning
Test Case Number: B-1
Test Case Precondition: Tree is empty; input the maximum number of allowed values (25)
Item(s) to be tested: Unit Level - Input maximum number of allowed values (Valid Case 1)
Specifications:
Input: Insert 80, 24, 91, 25, 88, 54, 97, 22, 45, 101, 82, 99, 12, 15, 87, 27, 65, 32, 94, 105, 72, 46, 34, 8, 77
Expected Output/Result: display message: “The tree is full”
Procedural Steps:
1. Enter unique valid integer and click insert button
2. Repeat 24 times
3. Halt entry and view displayed tree

Test to be Performed By: Custom Tool
Test Type: Black Box - Equivalence Partitioning
Test Case Number: B-2
Test Case Precondition: Tree is empty; input greater than the maximum number of allowed values (30)
Item(s) to be tested: Unit Level - Input maximum number of allowed values (Invalid Case 2)
Specifications:
Input: Insert 80, 24, 91, 25, 88, 54, 97, 22, 45, 101, 82, 99, 12, 15, 87, 27, 65, 32, 94, 105, 72, 46, 34, 8, 77, 20, 200, 1, 150, 7
Expected Output/Result: display message: “The tree is full”
Procedural Steps:
1. Enter unique valid integer and click insert button
2. Repeat 29 times
3. Halt entry and view displayed tree


VHTN Software Development


Test Automation Report

22

3 System Testing

The goals of system testing are to detect faults that can only be exposed by testing the entire integrated system or some major part of it. Generally, system testing is mainly concerned with areas such as performance, security, validation, load/stress, and configuration sensitivity. We will perform the system-level testing allowed by eValid.


3.1 System-Level Testing - Equivalence Partitioning

The valid and invalid classes are shown below along with the corresponding valid and invalid test values. By using eValid to test the equivalence classes, we can verify that using invalid equivalence class values will not cause any unexpected problems within the applet.

Input/Output Event: Input integers
Valid Equivalence Classes:
3: Integers between -999 and 999
Invalid Equivalence Classes:
4: Integers > 999
5: Integers < -999
6: Non-integers (characters)
7: Non-integers (decimal values)
8: Duplicate integer
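Invalid classes 6 and 7 both come down to integer parsing, which a sketch like the following can make concrete. This is a hypothetical illustration, not the applet's code; the error message is taken from the test cases that follow.

```java
// Non-integer inputs (classes 6 and 7) reduce to a parse check:
// Integer.parseInt rejects characters and decimal strings alike.
// Hypothetical sketch; the message text is an assumption drawn from
// the report's test cases.
public class InputParseCheck {
    static String classify(String raw) {
        try {
            Integer.parseInt(raw.trim());
            return "integer";
        } catch (NumberFormatException e) {
            return "Input error - non-integer value";
        }
    }

    public static void main(String[] args) {
        System.out.println(classify("25"));   // valid class 3
        System.out.println(classify("a"));    // invalid class 6
        System.out.println(classify("1.5"));  // invalid class 7
    }
}
```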



Test Cases

Test to be Performed By: eValid
Test Type: Black Box - Equivalence Partitioning
Test Case Number: B-3
Test Case Description: Try to insert integer between -999 and 999
Item(s) to be tested: System Level - Input integers (Valid Case 3)
Specifications:
Input: 25
Expected Output/Result: 25 displayed in tree



Test to be Performed By: eValid
Test Type: Black Box - Equivalence Partitioning
Test Case Number: B-4
Test Case Description: Try to insert integer > 999
Item(s) to be tested: System Level - Input integers (Invalid Case 4)
Specifications:
Input: 9999
Expected Output/Result: display message: “Integer is out of range”

Test to be Performed By: eValid
Test Type: Black Box - Equivalence Partitioning
Test Case Number: B-5
Test Case Description: Try to insert integer < -999
Item(s) to be tested: System Level - Input integers (Invalid Case 5)
Specifications:
Input: -9999
Expected Output/Result: display message: “Integer is out of range”

Test to be Performed By: eValid
Test Type: Black Box - Equivalence Partitioning
Test Case Number: B-6
Test Case Description: Try to insert non-integer value (character)
Item(s) to be tested: System Level - Input integers (Invalid Case 6)
Specifications:
Input: a
Expected Output/Result: display message: “Input error - non-integer value”




Test to be Performed By: eValid
Test Type: Black Box - Equivalence Partitioning
Test Case Number: B-7
Test Case Description: Try to insert non-integer value (decimal)
Item(s) to be tested: System Level - Input integers (Invalid Case 7)
Specifications:
Input: 1.5
Expected Output/Result: display message: “Input error - non-integer value”

Test to be Performed By: eValid
Test Type: Black Box - Equivalence Partitioning
Test Case Number: B-8
Test Case Description: Tree contains the value 5; try to reinsert 5
Item(s) to be tested: System Level - Input integers (Invalid Case 8)
Specifications:
Input: 5
Expected Output/Result: display message: “The integer has already been inserted”


3.2 System-Level Testing - Boundary Value Testing

The acceptable range of values for this application was set by the development team. Due to the limitations of the GUI, the developers also limited the size of the input values to three-digit integers. The valid and invalid ranges are shown below along with the corresponding valid and invalid boundary test values. By using eValid to test the boundary values, we can verify that using values outside of the allowed boundaries will not cause any unexpected problems within the applet.

Acceptable Range: -999 ≤ x ≤ 999
Invalid Range: -∞ < x < -999 and 999 < x < +∞




Valid Boundary Tests:
Boundary 1: x = -999
Boundary 2: x = 0
Boundary 3: x = 999

Invalid Boundary Tests:
Boundary 4: x = 1000
Boundary 5: x = -1000
Boundary 6: x = 999999
Boundary 7: x = -999999
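The range check these boundary cases exercise can be sketched as follows. The class and method names are hypothetical, with the out-of-range message taken from the test cases below.

```java
// Range validation behind the boundary-value cases: integers outside
// [-999, 999] are rejected with the message the test cases expect.
// A hypothetical sketch of the applet's input check, not its real code.
public class RangeCheck {
    static final int MIN = -999;
    static final int MAX = 999;

    static String validate(int x) {
        if (x < MIN || x > MAX) {
            return "Integer is out of range";
        }
        return "ok";
    }

    public static void main(String[] args) {
        // Probe each valid and invalid boundary value from the lists above.
        int[] probes = { -999, 0, 999, 1000, -1000, 999999, -999999 };
        for (int x : probes) {
            System.out.println(x + " -> " + validate(x));
        }
    }
}
```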


Test Cases

Test to be Performed By: eValid
Test Type: Black Box - Boundary Value Analysis
Test Case Number: B-9
Test Case Description: Try to insert valid boundary integer value (-999)
Item(s) to be tested: System Level - Valid Boundary
Specifications:
Input: -999
Expected Output/Result: -999 displayed in tree



Test to be Performed By: eValid
Test Type: Black Box - Boundary Value Analysis
Test Case Number: B-10
Test Case Description: Try to insert valid mid-boundary integer value (0)
Item(s) to be tested: System Level - Valid Boundary
Specifications:
Input: 0
Expected Output/Result: 0 displayed in tree



Test to be Performed By: eValid
Test Type: Black Box - Boundary Value Analysis
Test Case Number: B-11
Test Case Description: Try to insert valid boundary integer value (999)
Item(s) to be tested: System Level - Valid Boundary
Specifications:
Input: 999
Expected Output/Result: 999 displayed in tree

Test to be Performed By: eValid
Test Type: Black Box - Boundary Value Analysis
Test Case Number: B-12
Test Case Description: Try to insert invalid boundary integer (1000)
Item(s) to be tested: System Level - Invalid Boundary
Specifications:
Input: 1000
Expected Output/Result: display message: “Integer is out of range”

Test to be Performed By: eValid
Test Type: Black Box - Boundary Value Analysis
Test Case Number: B-13
Test Case Description: Try to insert invalid boundary integer (-1000)
Item(s) to be tested: System Level - Invalid Boundary
Specifications:
Input: -1000
Expected Output/Result: display message: “Integer is out of range”




Test to be Performed By: eValid
Test Type: Black Box - Boundary Value Analysis
Test Case Number: B-14
Test Case Description: Try to insert distant invalid boundary integer (999999)
Item(s) to be tested: System Level - Invalid Boundary
Specifications:
Input: 999999
Expected Output/Result: display message: “Integer is out of range”

Test to be Performed By: eValid
Test Type: Black Box - Boundary Value Analysis
Test Case Number: B-15
Test Case Description: Try to insert distant invalid boundary integer (-999999)
Item(s) to be tested: System Level - Invalid Boundary
Specifications:
Input: -999999
Expected Output/Result: display message: “Integer is out of range”



3.3 System-Level Testing - Performance Testing

This test will be conducted to evaluate the fulfillment of the system's specified performance requirements. It will be done using the eValid test tool and will be performed on every button/text field in the GUI. eValid will help verify that there are no missing links and that all buttons are functioning properly. It will also display the results, which will allow us to test the functionality of our system. (All functional specifications are included in the Software Specification.)





• Load a file
• Insert an integer
  o Text field
  o Insert button
• Ascending list
  o Ascending button
• Descending list
  o Descending button
• Delete an integer in the tree and try to delete an integer not in the tree
  o Text field
  o Delete button
  o Text field
  o Delete button
• Search an integer in the tree and try to search an integer not in the tree
  o Text field
  o Search button
  o Text field
  o Search button
• Store
  o Store button
• Clear
  o Clear button


4 Problem Reporting

The online freeware tool Element Tool was used for problem reporting. Since all of our bugs had been fixed by the time we did the automated testing, we entered our bugs from the manual testing.


5 Tool Usage

We were able to apply tools to all aspects of testing: white box, black box, and system-level testing.


5.1 Junit

Junit was used to perform white-box testing on the Binary Search Tree repository code. We were able to create a test suite that would test every path identified using the Basis-Path Testing approach. We tested a total of 31 test cases. We were also able to implement automated test data generation within this tool.


5.2 Custom Automation Tool

We used our custom automation tool to perform black-box equivalence partitioning testing for the Binary Search Tree repository code. In addition, we were able to do some unit-level functional testing to see how each of the separate pieces worked together. We included detailed comments that were both written to a file and displayed on the screen to verify every step in the automation routine.
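A minimal sketch of what such a driver might look like is below. The class names, the partitions shown, and the log file name are assumptions for illustration, not the tool's actual source; it demonstrates the pattern of running one representative input per equivalence class and logging each step both to the screen and to a file.

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

// Sketch of a custom black-box equivalence-partitioning driver.
// ASSUMPTIONS: the accepts() stand-in, the partitions chosen, and the
// log file name are invented for this example.
public class EquivalenceDriver {

    // Stand-in for the repository's input validation under test.
    static boolean accepts(int value) {
        return value >= -99999 && value <= 99999;
    }

    public static void main(String[] args) throws IOException {
        // One representative per partition: valid, below range, above range.
        int[] inputs     = {   42, -999999, 999999 };
        boolean[] expect = { true,   false,  false };

        try (PrintWriter log = new PrintWriter(new FileWriter("automation.log"))) {
            for (int i = 0; i < inputs.length; i++) {
                boolean actual = accepts(inputs[i]);
                String line = "input=" + inputs[i]
                        + " expected=" + expect[i] + " actual=" + actual
                        + (actual == expect[i] ? " PASS" : " FAIL");
                System.out.println(line); // display on the screen
                log.println(line);        // and write to the file
            }
        }
    }
}
```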


5.3 eValid

The eValid tool was used at the system level to test the functionality of the applet. It can confirm that there are no broken links and that each button is working properly. The advantage of using eValid is that it can record a string of operations that can be performed over and over using the playback feature. This is useful if changes are made to the code, since it records the operations.


5.4 Element Tool

Element Tool was used for bug tracking. All of the bugs from the manual testing phase were entered into Element Tool for tracking.


6 Experience and Lessons Learned

6.1 Overall Experience

Upon completion of our automation project, there were many lessons learned about the automation process, as well as some different pros and cons of working within an automation paradigm. One of the biggest challenges of automation is setting up a specific application to integrate with an automation infrastructure. Because all software is different, third-party automation software vendors can only make their best predictions about how an application might be structured in order to automate it successfully. This can cause problems within any automation project because most software is created differently and will not always use standard objects. In fact, software engineers will sometimes use rather clever methodologies in creating certain modules that a third-party vendor never thought of designing their automation tool around. This problem affected our own binary search application while trying to automate within eValid. It was apparent that eValid could not find a certain class within Java, and so it would not allow any automation scripts to run. It took weeks of communication with the vendor before we were given a solution to the problem.



Other problems with automation can occur while writing the scripts themselves. If a script is not written properly or is not robust enough, it may work correctly, but often only temporarily. As a software product evolves over time, scripts that once worked will cease to work and may require either modification or abandonment. This creates a constant maintenance requirement for all automation scripts produced, past, present, and future.



Although automation can be difficult to set up and maintain, it offers definite benefits. During the manual testing phase of our project, it took many hours to determine requirements and write each black-box and white-box test case. Furthermore, after the test cases were completed, it took even longer to go through each one and manually test each part of the product. Once each test case was automated, this process became effortless: what initially took hours to complete was done in a matter of seconds. On account of this, these tests could be repeated over and over again. This would be a great benefit, as when the product changes over time, the entire test suite could easily be rerun as many times as management requires. Furthermore, test generation could be truly random, and the product tested more thoroughly. Overall, the difference in time is so significant that testing the legacy areas of a product becomes faster and easier.


6.2 Junit

Junit was our favorite tool, as it was easy to use and provided just the right features to perform white-box testing in an organized, simple manner. Since white-box testing can sometimes be difficult to perform, this tool is especially useful. It only took a few minutes to figure out how to apply it, and the coding of the test cases was very straightforward. It would also be a useful tool if changes were to be made to the code, since the complete suite of white-box tests can be run in one step. In addition, by implementing the random number generation, all of the test cases could be run using completely different sets of values in a very short time. This provided even more assurance in the correctness of the code.


6.3 Custom Automation Tool

The custom-generated test tool was more of an experiment to see how difficult it would be to develop our own tool. Since the white-box testing was already covered by the Junit tool, we decided to implement this tool for black-box testing. With the understanding of how to develop a tool gained from using Junit, the custom tool was not too difficult to create. It turned out to be a great way not only to perform two of the black-box test cases, but also to test how all of the functions in the tree repository work together.


6.4 eValid

The eValid tool turned out to be by far the most frustrating. Not only did we have to convert our application into an applet just to use the tool, we also spent two weeks emailing back and forth with eValid to get the tool to recognize our applet. Then, even after we got the tool to recognize the applet, it was difficult to get the system configured so it would operate properly. It would initially not recognize our mouse clicks or text entry, and we finally received word from eValid that we had to use the absolute mode for it to record. Because it took so long to get everything worked out with eValid, we ended up using another tool that would allow our application to be recognized by eValid. We found a tool called WebCream that converted the applet/application into JSP/HTML pages. Once we did this, we were finally able to create and play back a script. Automation tools are only useful if they make testing easier or provide better coverage; this tool really did neither. The documentation was not adequate to solve the problems on our own, and relying on the support team resulted in numerous delays in our testing. Now that we have the tool working, future testing would be simplified, as we could just play back our recorded series of mouse clicks to retest the whole set of system tests. Unfortunately, we’re not sure it was worth all the trouble.


6.5 Element Tool

Our team did not feel that there was any value added by using this tool. Since we were only using the freeware version, it did not provide enough detail or flexibility to effectively manage our problem reports. It was, however, very easy to use and would probably be more worthwhile if we were able to use one of the more robust versions.



7 Comparison of Manual vs. Automated Testing

Since this was our first experience with test automation, we weren’t as efficient as we could have been. It took us a while to understand how to use Junit, we had numerous problems with eValid that increased testing time, and we also needed to spend time figuring out where automation could be used. All of these tasks would have taken less time if we had had some previous experience. With the automation in place, however, subsequent testing will be much quicker and easier. We could make changes to the software and easily rerun all of the unit tests. We can also change the input data and rerun the test suite. Based on our best estimation, the following timetable was produced to provide a direct comparison. The times below do not reflect the time spent developing the test cases, which took up much more time than the actual testing.


Item Tested                                  Time for Manual Testing   Time for Automated Testing
Basis-Path Testing - Repository              6 hours                   4 hours
Equivalence Partitioning - Repository        30 minutes                2 hours
Functional Testing - System-Level            1 hour                    no tool available
Equivalence Partitioning - System-Level      15 minutes                15 minutes*
Boundary Value - System-Level                15 minutes                15 minutes*
Performance Testing - System-Level           no tool available         15 minutes*
Element Tool                                 1 hour                    30 minutes

* does not include the 2-30 hours spent working out the eValid problems



8 References

Pressman, Roger S. Software Engineering - A Practitioner's Approach. Fifth edition. The McGraw-Hill Companies, Inc.

Kaner, C., Falk, J., Nguyen, H.-Q. Testing Computer Software. Wiley Computer Publishing, 1999.