
Hierarchical Keyword Test Automation

(Why we did it that way)


By John Crunk

Purpose


It is the purpose of this document to discuss some of the decisions that have been
made over the years as they pertain to the WRSAFS. Often, some history will be
necessary, and direct quotes will be used for discussions that took place prior to my
involvement in the project. To capture accurate information, I will use the actual
documentation that was produced at the time and discuss how it evolved into what we
have today.


What we have


At present, we have an approach that includes three levels of test tables: Cycle, Suite,
and Step. The reason for this, as narrated by Carl, is that many of those involved early
on discussed and researched many of the possibilities for implementing a data-driven
approach and determined that this was the best way to implement such a solution.
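
To make the three levels concrete, here is a hypothetical sketch of how such tables
might chain together. The file names and record syntax below are invented for
illustration only; they are not the actual SAFS table format:

    Cycle table (regression.cdd) - one record per Suite to run:
        T, suites/login.std
        T, suites/reports.std

    Suite table (login.std) - one record per Step table to run:
        T, steps/open_application.sdd
        T, steps/valid_login.sdd

    Step table (valid_login.sdd) - low-level commands on components:
        T, LoginWindow, UserField, SetTextValue, "jcrunk"
        T, LoginWindow, PassField, SetTextValue, "secret"
        T, LoginWindow, OKButton,  Click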


In part, their discussion was:


Figure 4 is a diagram representing the design of our automation framework. It is
followed by a description of each of the elements within the framework and how
they interact. Some readers may recognize portions of this design. It is a
compilation of keyword driven automation concepts from several sources. These
include Linda Hayes with WorkSoft, Ed Kit from Software Development
Technologies, Hans Buwalda from CMG Corp, myself, and a few others.



[Figure 4: diagram of the automation framework]


In brief, the framework itself is really defined by the Core Data Driven Engine,
the Component Functions, and the Support Libraries. While the Support Libraries
provide generic routines useful even outside the context of a keyword driven
framework, the core engine and Component Functions are highly dependent on
the existence of all three elements.

The test execution starts with the LAUNCH TEST (1) script. This script invokes the
Core Data Driven Engine by providing one or more High-Level Test Tables to
CycleDriver (2). CycleDriver processes these test tables, invoking SuiteDriver (3)
for each Intermediate-Level Test Table it encounters. SuiteDriver processes these
intermediate-level tables, invoking StepDriver (4) for each Low-Level Test Table
it encounters. As StepDriver processes these low-level tables it attempts to keep
the application in synch with the test. When StepDriver encounters a low-level
command for a specific component, it determines what type of component is
involved and invokes the corresponding Component Function (5) module to
handle the task.
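
To summarize the flow just quoted, here is a minimal Java sketch of the three-level
dispatch. The class names, record layout, and method signatures are assumptions made
for illustration; this is not the actual SAFS implementation:

    import java.util.List;
    import java.util.Map;

    // Hypothetical sketch of the CycleDriver -> SuiteDriver -> StepDriver dispatch.
    class CoreDataDrivenEngine {
        private final Map<String, ComponentFunction> componentFunctions; // (5) handlers by type
        private final TableLoader loader;

        CoreDataDrivenEngine(Map<String, ComponentFunction> componentFunctions, TableLoader loader) {
            this.componentFunctions = componentFunctions;
            this.loader = loader;
        }

        // (1)-(2): LAUNCH TEST hands one or more high-level tables to CycleDriver.
        void cycleDriver(List<List<String>> cycleTable) {
            for (List<String> record : cycleTable) {
                suiteDriver(loader.load(record.get(1))); // (3) each record names a suite table
            }
        }

        void suiteDriver(List<List<String>> suiteTable) {
            for (List<String> record : suiteTable) {
                stepDriver(loader.load(record.get(1))); // (4) each record names a step table
            }
        }

        void stepDriver(List<List<String>> stepTable) {
            for (List<String> record : stepTable) {
                // record layout assumed: type, window, component, action, params...
                String componentType = lookupComponentType(record.get(2));
                componentFunctions.get(componentType).process(record); // (5) dispatch
            }
        }

        private String lookupComponentType(String component) {
            return "Generic"; // a real engine would consult the application map here
        }
    }

    interface ComponentFunction { void process(List<String> record); }
    interface TableLoader { List<List<String>> load(String tableName); }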


This adds to the reusability of the engine, and was subsequently carried over to other
engines as they were developed. In later years, common and shared functions were
translated into Visual Basic, C++, and most recently Java.


Sharing these functions further adds to the reusability of the engines by creating
independent, common shared functionality. This gives us one of the largest advantages
of OOD, reusability, on several levels.


We are able to reuse Step-level tests among Suite tests in many different projects, and
similarly to reuse Suite-level tests among Cycle tests in different projects or tests.
Additionally, component libraries are reused at a lower level among different
applications and even at different organizations. Further still, shared functions like
string manipulation, logging, application map handling, and variable storage are reused
among all the engines at all companies currently using SAFS.
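
As an illustration of the kind of shared functionality meant here, a support library
might expose utilities along these lines. This is a minimal hypothetical sketch, not
the actual SAFS API:

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical shared support library: variable storage and logging
    // reusable by every engine, independent of any one test project.
    final class SupportLibrary {
        private static final Map<String, String> variables = new HashMap<>();

        // variable storage shared across Cycle, Suite, and Step tables
        static void setVariable(String name, String value) {
            variables.put(name, value);
        }

        static String getVariable(String name) {
            return variables.getOrDefault(name, "");
        }

        // simple shared logging facility
        static void log(String message) {
            System.out.println("[LOG] " + message);
        }

        private SupportLibrary() {} // static utility class; no instances
    }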


So in the end, the real reason that we use test tables in a hierarchical manner is to give
reusability to the user when writing their tests. An additional, and equally important,
reason is that it allows us to put common functions together in a way that makes
sense. As an example, the following real scenario is offered that may help to further
illustrate this hierarchical advantage.


One project that I worked on was a reporting application for financial
information. This application had hundreds of screens, each representing adds,
modifies, and deletes for many various functions. Each time the user wanted
something, it was always in this form, so much so that the developers had a
template that did much of the work for them.


One of the challenges of the test automation was that each screen was distinctly
different enough that you could not code the functions that were the same in a
shared library. Using this hierarchical approach, though, made things very different:
where I would have had to write three functions for each screen (add, modify, and
delete), I was able instead to write three functions for the whole application and
use high-level test tables to pass in the information that made the screens different.


In the above scenario, the tests were complete before the developers even had the specs
for the next screen, so testing was able to stay ahead of development in an extremely
aggressive timeframe. This would not have been possible were it not for this
hierarchical approach. This approach allows you to write more efficient scripts and
group common functions together. The biggest challenge is not writing the tests, but
grouping the functions that are similar together.
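
To make the scenario concrete, a hypothetical sketch of one such shared function
follows. Instead of one add routine per screen, a single routine takes the screen name
and field values from the high-level test table; all names here are invented for
illustration:

    import java.util.Map;

    // Hypothetical application-level functions: one add/modify/delete trio
    // for the whole application, parameterized by table data.
    class RecordScreens {
        // 'screen' and 'fieldValues' come from the high-level test table.
        void add(String screen, Map<String, String> fieldValues) {
            navigateTo(screen);                  // open the template-generated screen
            for (Map.Entry<String, String> e : fieldValues.entrySet()) {
                setField(screen, e.getKey(), e.getValue());
            }
            clickButton(screen, "Save");         // assumed common to all screens
        }

        private void navigateTo(String screen) { /* tool-specific navigation */ }
        private void setField(String screen, String field, String value) { /* tool-specific input */ }
        private void clickButton(String screen, String button) { /* tool-specific click */ }
    }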

References:

1. Nagle, C. "Test Automation Frameworks." 2004.

2. Kit, E. & Prince, S. "A Roadmap for Automating Software Testing." Tutorial presented at STAR'99 East Conference, Orlando, Florida, May 10, 1999.

3. Hayes, L. "Establishing an Automated Testing Framework." Tutorial presented at STAR'99 East Conference, Orlando, Florida, May 11, 1999.

4. Kit, E. "The Third Generation -- Integrated Test Design and Automation." Guest presentation at STAR'99 East Conference, Orlando, Florida, May 12, 1999.

5. Mosley, D. & Posey, B. Just Enough Software Test Automation. New Jersey: Prentice Hall PTR, 2002.

6. Wust, G. "A Model for Successful Software Testing Automation." Paper presented at STAR'99 East Conference, Orlando, Florida, May 12, 1999.

7. Dustin, E. Automated Software Testing: Introduction, Management, and Performance. New York: Addison Wesley, 1999.

8. Fewster, M. & Graham, D. Software Test Automation: Effective Use of Test Execution Tools. New York: Addison Wesley, 1999.

9. Dustin, E. "Automated Testing Lifecycle Methodology (ATLM)." Paper presented at STAR EAST 2000 Conference, Orlando, Florida, May 3, 2000.

10. Kit, E. & Buwalda, H. "Testing and Test Automation: Establishing Effective Architectures." Presentation at STAR EAST 2000 Conference, Orlando, Florida, May 4, 2000.

11. Sweeney, M. "Automation Testing Using Visual Basic." Paper presented at STAR EAST 2000 Conference, Orlando, Florida, May 4, 2000.

12. Buwalda, H. "Soap Opera Testing." Guest presentation at STAR EAST 2000 Conference, Orlando, Florida, May 5, 2000.

13. Pollner, A. "Advanced Techniques in Test Automation." Paper presented at STAR EAST 2000 Conference, Orlando, Florida, May 5, 2000.

14. Kaner, C. http://www.kaner.com

15. Zambelich, K. "Totally Data-Driven Automated Testing." 1998. http://www.sqa-test.com/w_paper1.html

16. SQA Suite Users, Discussions and Archives, 1999-2000. http://www.dundee.net/sqa/

17. Nagle, C. Data Driven Test Automation: For Rational Robot V2000. 1999-2000. DDE Doc Index.