Final Report - Senior Design - Iowa State University


Attack Tool Repository and Player for ISEAGE

Final Report

Team May06-11

Client
Information Assurance Center

Faculty Advisor
Dr. Doug Jacobson

Team Members
Jeremy Brotherton
Timothy Hilby
Brett Mastbergen
Jasen Stoeker


REPORT DISCLAIMER NOTICE

DISCLAIMER: This document was developed as a part of the requirements of an electrical and computer engineering course at Iowa State University, Ames, Iowa. This document does not constitute a professional engineering design or a professional land surveying document. Although the information is intended to be accurate, the associated students, faculty, and Iowa State University make no claims, promises, or guarantees about the accuracy, completeness, quality, or adequacy of the information. The user of this document shall ensure that any such use does not violate any laws with regard to professional licensing and certification requirements. This use includes any work resulting from this student-prepared document that is required to be under the responsible charge of a licensed engineer or surveyor. This document is copyrighted by the students who produced this document and the associated faculty advisors. No part may be reproduced without the written permission of the senior design course coordinator.

Date Submitted
May 3, 2006

Table of Contents

List of Figures
List of Tables
List of Definitions
1 Introductory Materials
  1.1 Executive Summary
  1.2 Acknowledgements
  1.3 Problem Statement and Solution
    1.3.1 Problem Statement
    1.3.2 Problem Solution
  1.4 Operating Environment
  1.5 Intended Users and Uses
    1.5.1 Intended Users
    1.5.2 Intended Uses
  1.6 Assumptions and Limitations
    1.6.1 Assumptions
    1.6.2 Limitations
  1.7 Expected End Product and Deliverables
    1.7.1 Software Application
    1.7.2 Documentation
2 Project Approach and Results
  2.1 Functional Requirements
  2.2 Design Constraints
  2.3 Approach Considered and Used
    2.3.1 Technologies and Approaches Considered
    2.3.2 Technologies and Approach Used
  2.4 Detailed Design
    2.4.1 User Interface
    2.4.2 Web Server
    2.4.3 Attack Systems
  2.5 Implementation Process
  2.6 End Product Testing
  2.7 Project End Results
3 Resources and Schedule
  3.1 Resources
    3.1.1 Personnel Effort Requirements
    3.1.2 Other Resource Requirements
    3.1.3 Financial Requirements
  3.2 Schedules
    3.2.1 Project Schedule
    3.2.2 Deliverables Schedule
4 Closure Material
  4.1 Project Evaluation
  4.2 Commercialization
  4.3 Recommendations for Additional Work
  4.4 Lessons Learned
  4.5 Risk and Risk Management
    4.5.1 Anticipated Risks and Planned Management
    4.5.2 Anticipated Risks Encountered and Management
    4.5.3 Unanticipated Risks Encountered and Management
    4.5.4 Changes in Risk Management Due to Unanticipated Risks
  4.6 Project Team Information
    4.6.1 Client Information
    4.6.2 Faculty Advisor Information
    4.6.3 Team Member Information
  4.7 Closing Summary
Appendix A

List of Figures

Figure 2-1. Attack Tool Repository and Player System Layout
Figure 2-2. System Use Cases
Figure 2-3. Initial Search Screen
Figure 2-4. Search Results Table
Figure 2-5. Attack Launch Page
Figure 2-6. Attack Output Window
Figure 2-7. Attack Table
Figure 2-8. Attack Table cont.
Figure 3-1. Project Schedule
Figure 3-2. Deliverables Schedule


List of Tables

Table 2-1. Important Pros and Cons of Databases
Table 2-2. Important Pros and Cons of Programming Languages
Table 2-3. Project Testing Results
Table 2-4. Project Component Status Results
Table 3-1. Original Personnel Effort Resource Requirements
Table 3-2. Revised Personnel Effort Resource Requirements
Table 3-3. Final Personnel Effort Resource Requirements
Table 3-4. Original Estimate of Other Resource Requirements
Table 3-5. Revised Estimate of Other Resource Requirements
Table 3-6. Final Costs of Other Resource Requirements
Table 3-7. Original Estimated Project Costs
Table 3-8. Revised Estimated Project Costs
Table 3-9. Final Project Costs
Table 4-1. Project Milestones and Relative Importance
Table 4-2. Milestone Evaluation Criteria
Table 4-3. Project Results

List of Definitions

Attack - An assault against a computer system or network that is deliberately executed.

Database - A set of related files and groups of information that are managed by a database management system.

Database management system - Software that manages a database, allowing the search and retrieval of its contents.

Exploit - An attack on a computer system that takes advantage of a particular vulnerability in the system.

ISEAGE - Internet Scale Event and Attack Generation Environment. A network dedicated to creating a virtual Internet for the purpose of researching, designing, and testing cyber defense mechanisms.

Network - A series of computers and devices interconnected by communication paths. May also include smaller interconnected subnets.

SSH - Secure Shell. A protocol that allows users to interact with a computer remotely.

Virus - A piece of software that "infects" a computer by attaching itself to other files on a system and behaving maliciously.

Vulnerability - A weakness in a system due to security procedures, implementation, or other means that could be exploited.

Web application - An application interface that resides on a web server, which is accessed and used through a web browser.


1 Introductory Materials

This section includes the executive summary, acknowledgements, problem statement and solution, operating environment, intended user(s) and use(s), assumptions and limitations, and expected end-product and deliverables.

1.1 Executive Summary

Project Need

Today's world is changing shape as it increases its dependency on computer technology. As society moves further into the digital world, there has been growing concern for the security of the information stored on computers. Finding exploits to evaluate the security of a given system can be a daunting task. Those individuals wishing to test system security need a way to quickly locate relevant exploits and execute them.


Approach Used

The May06-11 team's approach developed a solution that provides a web-based user interface to a central repository of exploits. This interface provides users with the ability to search for, and then execute, specific exploits based on their characteristics. Standardized documentation about how to use each exploit is also available to users. In addition, administrators have the ability to maintain the repository. Finally, this system has been given to the Information Assurance Center to be deployed for ISEAGE (Internet Scale Event and Attack Generation Environment), meaning that users can evaluate security solutions by carrying out real attacks on real equipment.


Project Activities

The May06-11 team completed several activities over the course of the project. The team first developed a basic set of requirements and assumptions, which formed the basis for the project plan. The team then researched existing technologies to determine the best solutions for the project, which were MySQL (database), PHP (programming language), and Apache (web server). A design document was created to include the technology decisions, as well as a prototype of the user interface in the form of screenshots. The implementation phase included both website and database development. The testing phase verified that the end product met the project requirements as defined by the client. Additionally, user and administrative help manuals were created as supplements to the end product. Finally, the team produced this document to summarize the project and its results.


Project Results

The team's solution was comprised of a web application and accompanying database. The application was made up of a series of simple web pages which served as the user interface. An overall description of the software and the project objectives is presented on the homepage. The search page contains a text box for the search name and a series of drop-down menus to select attack search characteristics. Upon conducting a search through the database, a table of relevant results is generated for the user. The user may then click on the name of an attack to take him/her to the launch page. At this point, desired attack parameters may be entered into a text box, and once the 'Launch' button is pressed, the attack will be executed with the specified parameters.


The team also developed a set of supplements to the end product. An administrative website was provided to allow attacks to be added to or removed from the database. Where possible, the team created standardized documentation for each exploit. Finally, user and administrative help manuals were delivered to the client.


Follow-on Work

One aspect of the end product that could be extended is the launch attack options screen. Options could be added to allow users to target specific machines via a network diagram and set timeouts for the attacks. Of course, new attacks can always be added to the database. In addition, it could also be valuable to enhance the launch mechanism to include a web-based, interactive shell for the users to launch attacks from.

1.2 Acknowledgements

The team would like to thank Dr. Doug Jacobson, the team's faculty advisor, for his time and expertise.

1.3 Problem Statement and Solution

The problem that was solved is described in the following two sections. First, the team describes the general problem area. Second, the team discusses the approach for the solution.

1.3.1 Problem Statement

The team developed a tool that would allow researchers to carry out specific network attacks against computer hosts. End-users needed the ability to locate and launch relevant attacks as quickly and easily as possible. In addition, users should also have the ability to search for a specific attack based on a variety of criteria. They also needed to have a way to easily interact with this tool, such as a web-based graphical interface.

1.3.2 Problem Solution

To solve this problem, the team developed an application that connects to a database of network attacks. The application allows users to search the database and launch attacks from other dedicated machines. Restricted, administrative access allows the database to be updated or modified. The solution also allows multiple users access to it simultaneously. The application is a web interface, which allows platform-independent access.

An end-user machine sends a search request to the web server. The web server then executes a PHP script that connects to the database and performs the desired query. Results from the query are then returned to the user's machine. The attack is launched in a similar fashion by using another PHP script that actually executes the desired attack.
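The search-then-launch flow described above can be sketched in outline. The project's actual implementation used PHP scripts against a MySQL database; the sketch below is a Python stand-in using an in-memory SQLite database, and the table schema, column names, and sample attack entry are all hypothetical.

```python
import sqlite3
import subprocess

# Hypothetical schema standing in for the team's MySQL attack repository.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE attacks (name TEXT, target_platform TEXT, command TEXT)")
conn.execute("INSERT INTO attacks VALUES ('synflood', 'Linux', 'echo launching synflood')")

def search_attacks(target_platform):
    # The search script's role: a parameterized query keeps user
    # input out of the SQL text while filtering on characteristics.
    cur = conn.execute(
        "SELECT name, command FROM attacks WHERE target_platform = ?",
        (target_platform,))
    return cur.fetchall()

def launch_attack(command, params=""):
    # The launch script's role: run the stored command and capture
    # its output so it can be returned to the user's machine.
    result = subprocess.run(f"{command} {params}".strip(), shell=True,
                            capture_output=True, text=True)
    return result.stdout

rows = search_attacks("Linux")
print(rows)
print(launch_attack(rows[0][1]))
```

In the deployed system the two roles above belong to two separate PHP scripts, as described in the text; they are combined here only to keep the sketch self-contained.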

1.4 Operating Environment

This application runs on computers on the ISEAGE network. A temperature-controlled environment of 60-90 degrees Fahrenheit is necessary to ensure operation. High levels of moisture could cause hardware failures, which could render the database or other machines inoperable. Computers using the application or those running the database could also be affected by computer viruses. Users can access the application with any operating system that has a web browser. The application itself can be run on any platform.

1.5 Intended Users and Uses

The sections below describe the intended users and uses for the end-product.

1.5.1 Intended Users

This application is intended for researchers, vendors, and computer professionals. Most end-users have a strong background in information technology and a familiarity with information security. It may also be used for training purposes, so it also caters to those with lesser skills. Most users expected to see details about search results to evaluate the threat level and scope of a particular exploit.

1.5.2 Intended Uses

This application provides a mechanism to evaluate weaknesses in computer systems and network architectures. It allows users to search and launch attacks. It may also be used for training purposes. It does not fix vulnerabilities or pinpoint the cause of failure. This solution does not contain all possible exploits, and its database relies on administrators for updating its contents.

1.6 Assumptions and Limitations

The assumptions and limitations of this project are listed in the following sections.

1.6.1 Assumptions

The assumptions for this project are:

- Maximum number of simultaneous users is twenty
- Maximum query response time is two seconds
- The application is coded using PHP and MySQL
- Database updates require administrator action
- The application runs on any system with a web browser
- Any attack can only execute for 60 seconds
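The 60-second execution cap in the last assumption could be enforced at launch time by killing any attack process that overruns. A minimal sketch, in Python rather than the team's PHP, and illustrative only (the report does not describe how the cap was actually implemented):

```python
import subprocess

ATTACK_TIMEOUT = 60  # seconds, per the project assumption

def run_attack(command, timeout=ATTACK_TIMEOUT):
    """Run an attack command, terminating it if it exceeds the time cap."""
    try:
        result = subprocess.run(command, shell=True, capture_output=True,
                                text=True, timeout=timeout)
        return result.stdout
    except subprocess.TimeoutExpired:
        # The process is killed once the timeout elapses.
        return f"[attack terminated after {timeout} seconds]"

print(run_attack("echo probe complete"))
```

A shorter timeout can be passed in for testing, which is why the cap is a parameter rather than hard-coded.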


1.6.2 Limitations

The limitations for this project are:

- The database does not include all possible attacks or all known attacks
- Launching attacks is done at the click of a button
- The application is only used in the ISEAGE environment
- Not all attacks in the repository are tested and verified
- This system does not fix vulnerabilities or pinpoint the cause of failure

1.7 Expected End Product and Deliverables

The team delivered the end software product and associated documentation to the client.

1.7.1 Software Application

The team provided ISEAGE with a small database of malicious software from both the past and the present. In addition, the team also delivered an end-product that interfaces to this database.

1.7.2 Documentation

The team delivered documentation of setup and software use to the client. A user's guide and administrator troubleshooting guide were developed. In addition, the team turned over commented source code and testing results.

2 Project Approach and Results

This section includes the functional requirements, design constraints, approach considered and selected, detailed design, the implementation process, testing results, and project end results.

2.1 Functional Requirements

The final product meets the following requirements:

1. Web-based interface - An online interface suitable for users with different technical abilities is provided.
2. Detailed exploit classification and search - Database is searchable based upon the following characteristics: exploit name, target platform, source platform, source IP, attack type, service attacked, documentation availability, confirmation of the exploit's claims, and whether the exploit is runnable.
3. One-click launch - Allows users to launch attacks with the click of a button. Allows optional parameters to be entered and used in the launch. After the launch, the program will open up a web browser that will be dynamically refreshed and return results from the launch in semi real-time.
4. Administrative access - Administrative users have the ability to add, remove, or update items from the database using phpMyAdmin.
5. Online documentation - Users are supplied with documentation about each exploit's usage, if available.
6. Exploit homepage - Users are provided with links to the exploit homepage, if applicable.
7. Exploit download - Gives users the ability to download exploit code in order to launch the attack manually.
8. Program help - Help information is provided to users via a Help link.
9. Support contact - Contact information is provided via a Contact link.
10. Program functionality - The purpose of the program is explained via an About link.
11. No authentication - Authentication to the system was not developed because the product will be stored in a secure environment on an isolated subnet.
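The semi-real-time output in requirement 3 could be produced by reading the attack process's output incrementally and flushing each line to the refreshing page as it arrives, rather than waiting for the attack to finish. A sketch of that pattern, as a Python stand-in for the team's PHP launch page:

```python
import subprocess

def stream_attack_output(command):
    """Yield attack output line by line, so a page can be refreshed
    with partial results while the attack is still running."""
    proc = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT, text=True)
    for line in proc.stdout:
        # Each line could be flushed to the browser here instead of yielded.
        yield line.rstrip("\n")
    proc.wait()

for line in stream_attack_output("echo step 1 && echo step 2"):
    print(line)
```

The key point is reading the pipe as the process writes, which is what lets the dynamically refreshed browser window show results before the attack completes.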

2.2 Design Constraints

The constraints on the end-product are the following:

1. Platform independent - The system was designed to operate on a variety of architectures and operating systems.
2. Web based - The system was designed to work with any computer using a modern web browser.
3. Open source - The software was designed with open source technologies to avoid commercial licenses.
4. Database power - The database is powerful enough to hold a large variety of attacks and return results in less than two seconds.
5. Vulnerability enumeration - The system was only designed to be an aid in discovering vulnerabilities. It does not fix vulnerabilities, nor is it likely to find all vulnerabilities that exist in a system.
6. Administrator control - The project was designed to be integrated with phpMyAdmin, which allows administrators to add, delete, and update exploits in the database.
7. Completeness - Due to time constraints, the database does not include all known or unknown exploits.
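Constraint 4's two-second bound is the kind of requirement that can be checked directly against a populated table. A sketch using Python's sqlite3 as a stand-in for the project's MySQL database; the table name, row count, and query pattern are hypothetical:

```python
import sqlite3
import time

# Hypothetical attacks table populated with 10,000 rows as a load stand-in.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE attacks (name TEXT)")
conn.executemany("INSERT INTO attacks VALUES (?)",
                 [(f"attack{i}",) for i in range(10000)])

# Time a wildcard search and compare against the two-second requirement.
start = time.perf_counter()
rows = conn.execute(
    "SELECT name FROM attacks WHERE name LIKE 'attack99%'").fetchall()
elapsed = time.perf_counter() - start

print(f"{len(rows)} rows in {elapsed:.4f}s (limit: 2s)")
```

In practice the team's testing would have run such queries against the real MySQL repository over the web interface, where network and PHP overhead also count toward the two seconds.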

2.3 Approach Considered and Used

This section details the technical approach the team used for developing the design of the end product.

2.3.1 Technologies and Approaches Considered

Before deciding on the technologies to use for this project, the team carefully examined the options currently available. This project required a powerful and fast database solution. There were a number of systems that would have delivered the functionality required, both commercial and open source. The team decided to consider solutions from Oracle, PostgreSQL, MySQL, and SQL Server 2005. Table 2-1 at the end of this section highlights the pros and cons the team considered for the database technologies.

Next, the team had to decide on a programming language for creating the web interface. The main contenders were PHP and ASP.NET 2005. Table 2-2 at the end of this section covers the important pros and cons of each language. Both options could have provided an effective solution.

After that, the team had to decide on a web server. The choices were Apache and Microsoft IIS. Because the database and language choices were significantly more important to our development, the team only looked at these to ensure that either would be able to meet the project needs. This is because the databases and languages were typically packaged with the web server, and the team only had to make sure that neither server would prove a massive stumbling block that might impact the decisions about the other technologies.

Finally, the team had to decide on a development approach. The options were to have all team members work together on each aspect of the project, or to assign various parts of the project to different team members and integrate them at the end. Either of these methods would have worked, but the latter did have an advantage. By having each team member be responsible for a core area (i.e., PHP, web page design, database, etc.), the team member could get more experience with a technology and could develop expertise in this area.


Table 2-1. Important Pros and Cons of Databases

PostgreSQL
  Pros:
  - Relatively common.
  - Mature and well tested.
  - No licensing issues.
  - More features than MySQL.
  Cons:
  - Complex.
  - Not as intuitive to learn as MySQL or SQL Server 2005.
  - Smaller user base than other products listed.
  - Fewer online examples than other products listed.
  - Additional features over MySQL are not ones the team needs.

Oracle
  Pros:
  - Heavily used by enterprise.
  - Very large user base.
  - Well established in industry.
  - Lots of documentation.
  - Mature and well tested.
  Cons:
  - Extremely complex, more than any of the other products.
  - Proprietary licensing issues.
  - Additional enterprise features are not ones the team would use.

SQL Server 2005
  Pros:
  - Best integration of any solution.
  - Most extensive tools.
  - Fast and scalable.
  - Large amount of prewritten functions and objects.
  - Interacts well with the Visual Studio IDE.
  Cons:
  - Microsoft licensing issues.
  - Future release date.
  - Newness means additional likelihood of major bugs.
  - Database tied to Microsoft platforms.

MySQL
  Pros:
  - Large amount of online examples and a large online user community.
  - Current version mature and well tested.
  - Open source license will be easiest to manage.
  - Able to be used on most major platforms.
  - Relatively easy to learn.
  Cons:
  - Not as much code comes with the database as with SQL Server 2005.
  - Not as well integrated with other products as SQL Server 2005 is with .NET products.







Table 2-2. Important Pros and Cons of Programming Languages

ASP.NET 2005
  Pros:
  - Extremely well integrated with SQL Server 2005.
  - Ability to drag and drop graphical web interface.
  - Large MSDN documentation library.
  - Very large amount of built-in objects and functions.
  - Excellent IDE and graphical debugger.
  Cons:
  - Microsoft licensing issues.
  - Future release date.
  - Newness means additional likelihood of major bugs.
  - A server for ASP.NET would be tied to Microsoft platforms.
  - Already decided against SQL Server 2005.

PHP
  Pros:
  - Current version well tested.
  - Many online examples.
  - No licensing issues.
  - Cross platform.
  - Fast code execution.
  - Easy to learn.
  Cons:
  - Not as well integrated as ASP.NET.
  - No graphical debugger/IDE.
  - No ability to drag and drop interfaces.

2.3.2 Technologies and Approach Used

The first technology the team decided on was the database, largely because it forms the core of the system. Due to complexity and lack of exclusive features that would benefit this particular project, Oracle and PostgreSQL were quickly removed from contention. MySQL and SQL Server 2005 were the two remaining solutions to be evaluated. Both appeared to have similar feature sets, and SQL Server 2005 appeared to be the solution that would have been the simplest to learn and would have had the best integration with other products, such as ASP.NET and Microsoft IIS. However, several factors contributed to outweigh those advantages.

First, the team would have had to contend with licensing issues that would have introduced hassle and stressed the budget. Second, the release date for SQL Server 2005 was November 7th, 2005 (which was still two weeks into the future at the time the technology was being considered). The team did not want to take chances on the release date being pushed back, and knew the documentation available for the product would be limited due to its upcoming release. Additionally, previous versions of this product would not have worked because they lacked some core features that made it a viable option.

The current version of MySQL has a very large user base around the globe, is mature and tested, and is unlikely to suffer from the type of bugs that might exist in a newly released product. Additionally, this product was open source, and would therefore fit within the budget constraints. This product also had a simple set of features that met the needs for this project. Because of these factors, it was decided that MySQL was the best database solution.

Next, the team had to decide on a programming language for creating the web interface. PHP is a popular option in the global community, and as a result, a large amount of documentation and open-source examples are readily available. On the other hand, the code may have been easier to write using ASP.NET (due to the amount of built-in functionality provided by Microsoft, and the ability to choose from a variety of languages, allowing the programmer to use what he/she feels is most comfortable).

However, while ASP.NET certainly had the potential to be an incredible help on this project, it fell victim to some of the same issues that were involved in deciding against SQL Server 2005. Its release date was also November 7th, 2005, and it would have had the same licensing issues as the server. In addition, while any computer could view ASP.NET-generated web pages, this would have required the use of a Windows-based server. Using PHP gave more flexibility in choosing the platform, which was important because platform independence was one of the constraints the team decided should be applied to the end-product. As a result, PHP was selected for the programming language.

After that, the team had to decide on a web server. Given the fact that the team was aiming for platform independence and had already decided against using the other Microsoft products, the decision to use Apache was a relatively easy one.

Finally, the team decided on the technical approach to development. Because of the advantage of having team members with various areas of expertise, it was decided that the project would be divided up into sections for each member and then integrated at the end.

2.4 Detailed Design

The ISEAGE attack tool repository and player is made up of three separate parts: the user interface, the web server, and the attack systems. These three parts are described in detail in the following sections. Figure 2-1 illustrates the basic layout of the system, and Figure 2-2 depicts the basic system use cases.





[Figure 2-1 diagram: a User Machine connects to the Web Server (PHP Script and Database), which reaches the Windows, Macintosh, and Linux attack systems aimed at a Target Machine.]

Figure 2-1. Attack Tool Repository and Player System Layout



Figure 2-2. System Use Cases

2.4.1 User Interface

The user interface is a PHP-based web application that allows the user to search for and launch attacks and to read attack documentation. In addition to regular user functionality, the administrator also has the ability to update the attack database through this web interface using an existing tool called phpMyAdmin.


Figure 2-3 below is the initial screen the user sees when connecting to the attack repository website.



Figure 2-3. Initial Search Screen


The menu pane on the left allows the user to navigate to the other parts of the webpage listed below.

About - gives a simple description of the tool's purpose and intent.
Contact info - provides contact information for the administrators of the repository.
Help - gives basic instructions for using the repository and launch features.
Search - brings the user back to the initial search screen.
Admin - allows the user access to phpMyAdmin, a tool for modifying the attack database.


The search pane allows the user to search for attacks based on attack name and/or a number of attack parameters:

Name - allows for searching for exact names or using SQL wildcards
Target platform - the type of platform to be attacked
Source platform - the type of platform on which the attack can be executed
Attack type - the category into which the attack falls
Service attacked - the service which the attack is meant to disable or exploit
Document available - defines whether there is documentation for the attack
Confirmed exploit - defines whether the attack has been confirmed to work by the system administrator
Runnable - defines whether the attack has been confirmed to at least execute (an attack may be runnable, but may not be confirmed)
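For illustration, a sketch of how a wildcard-aware search such as this might be assembled into a query is shown below. The function, table, and column names are assumptions made for the example; the report does not reproduce the project's actual code.

```php
<?php
// Hypothetical sketch of the search pane's query building. The table and
// column names (attacks, Name, TargetPlatform, ...) are illustrative
// assumptions, not the project's actual schema.
function build_search_query($params)
{
    $clauses = array();
    foreach ($params as $column => $value) {
        if ($value === '' || $value === null) {
            continue;  // empty search fields are ignored
        }
        // The Name field may contain the SQL wildcards '%' and '_'
        $op = (strpos($value, '%') !== false || strpos($value, '_') !== false)
            ? 'LIKE' : '=';
        $clauses[] = "$column $op '" . addslashes($value) . "'";
    }
    $sql = 'SELECT * FROM attacks';
    if (count($clauses) > 0) {
        $sql .= ' WHERE ' . implode(' AND ', $clauses);
    }
    return $sql;
}

echo build_search_query(array('Name' => 'smurf%', 'TargetPlatform' => 'Linux'));
// SELECT * FROM attacks WHERE Name LIKE 'smurf%' AND TargetPlatform = 'Linux'
```

A production script would also validate the column names; the addslashes() escaping shown reflects the PHP 4/5 era the report describes.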


After entering the desired search parameters and hitting search, a table of
results will appear below the search pane, as in
Figure
2
-
4
.



Figure 2-4. Search Results Table


The results table displays all the same fields available from the search pane as well as four additional fields:

Version - the version of the attack
Location - the hostname or IP address of the machine on which the attack is installed
Homepage - the website of the attack creator
Download - a link to the source code of the attack


Clicking on the column names above the name, target platform, source platform, attack type, or service attacked columns will sort the table based on the selected parameter. Also, if documentation is available for the attack, clicking on the 'Yes' in the documentation column will display any documentation available in a new browser window.


To launch an attack, the user clicks the name of the attack, which loads the attack page. As seen in Figure 2-5 below, the table entry for the selected attack will remain. Below it is a text box where any extra parameters for the attack can be entered.



Figure 2-5. Attack Launch Page


When all desired parameters are added, selecting the 'Launch Attack' button will open a new browser window for attack output and execute the attack. Figure 2-6 illustrates what the output of an attack looks like to the user.




Figure 2-6. Attack Output Window

2.4.2 Web Server

The web server is the central part of the system that serves the web front-end to the user, houses the database of attack entries, and communicates with attack systems to remotely execute attacks.


The user interface is served using the Apache web server, which supports PHP script execution. It is also linked to a MySQL database that contains a table with entries for all attacks available in the repository. Figure 2-7 and Figure 2-8 below show the database table.


Each table entry corresponds to an attack hosted on an attack system located at the address in the 'Location' field; the attack is executed using the command in the 'ExecCommand' field. The 'Download' field stores the path to the corresponding attack source code, which is also stored on the attack system.
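To make those fields concrete, one possible shape for the attack table is sketched below. This is inferred from the fields described in this report; the project's real schema may differ.

```php
<?php
// Hypothetical reconstruction of the attack table's structure, inferred from
// the fields named in this report. Column names and types are assumptions.
$sql = "CREATE TABLE attacks (
    Name            VARCHAR(64) NOT NULL,
    Version         VARCHAR(16),
    TargetPlatform  VARCHAR(32),
    SourcePlatform  VARCHAR(32),
    AttackType      VARCHAR(32),
    ServiceAttacked VARCHAR(32),
    Location        VARCHAR(64),
    ExecCommand     VARCHAR(255),
    Homepage        VARCHAR(128),
    Download        VARCHAR(255),
    DocAvailable    TINYINT(1),
    Confirmed       TINYINT(1),
    Runnable        TINYINT(1)
)";
// In the real system this would be issued once against the MySQL database,
// for example through phpMyAdmin.
echo $sql;
```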




Figure 2-7. Attack Table

Figure 2-8. Attack Table cont.


When a user selects the 'Launch Attack' button in the user interface, the web server executes a PHP script that connects to the appropriate attack system at the address in the location field of the attack's database entry. It connects to the attack system via an SSH connection, executes the attack, and receives any output the attack generates so that it can be displayed to the user via the attack output window.
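A minimal sketch of such a launch script is shown below, using the PECL SSH2 extension that section 2.5 says was installed. Credentials, error handling, and the timeout logic are omitted, and all names here are assumptions rather than the project's actual code.

```php
<?php
// Hypothetical sketch of the attack launch script. $location and $execCommand
// would come from the attack's database entry; $extra holds the parameters the
// user typed on the attack page. Requires the PECL ssh2 extension (libssh2).
function launch_attack($location, $execCommand, $extra, $user, $pass)
{
    $conn = ssh2_connect($location, 22);
    if (!$conn || !ssh2_auth_password($conn, $user, $pass)) {
        return false;  // could not reach or log in to the attack system
    }
    $stream = ssh2_exec($conn, $execCommand . ' ' . $extra);
    stream_set_blocking($stream, true);   // wait for the attack to produce output
    return stream_get_contents($stream);  // shown in the attack output window
}
```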

2.4.3 Attack Systems

The attack systems are a group of computers where the actual attack code and executables are stored. These computers are network accessible by the web server and may run any operating system or architecture required for a particular attack or group of attacks. These machines run an SSH server that allows connections to be made to them by the web server. This allows the web server to execute the attacks stored on the system, receive any output the attacks may generate, and download the attack source code. Any number of attack systems can be added to the repository system to support as many different types of attacks and platforms as desired by the system's users. This allows for a very flexible and scalable solution.

2.5 Implementation Process

The implementation was carried out in a fairly simple incremental process. First, a web server and development system were set up. The server runs Debian Linux as well as the XAMPP server package. This server package includes the Apache web server, PHP 4 and 5, the MySQL database, and the phpMyAdmin tool for database management. This package made getting the web server and development tools up and running quick and simple.

At this point a simple HTML web page was made that had the general layout that the final product would have. Although it had no actual functionality built in, it served as a framework on which to build the PHP scripts that would be responsible for the search dropdown box population, the search results table generation, database connectivity, SSH connectivity, attack execution, and displaying attack output. Along with the basic HTML page, the database table was also created at this point with a basic set of fields and a dozen sample attack entries. These sample entries would be used during the development process to represent attacks that weren't actually going to be installed until later in the implementation process.

Functionality was then added to the HTML webpage in logical order. First, PHP scripts were added that populated the search parameter dropdown boxes with the fields in the database table. After this, scripts were added for SQL query generation and to display query results in the search results table.
Next, the remote execution script was written. The remote attack execution script connects to the remote attack system using an SSH connection. It was necessary to install the libssh2 libraries as well as the PHP SSH2 extension. Once these were installed, the script was relatively simple. This script is also responsible for receiving any output from the remote system and displaying it to the user in the attack output window.

With all the main functionality complete, it was time to begin populating the repository with actual attacks by installing them on the development machine and adding them to the attack database. This was the final step in the implementation process.

2.6 End Product Testing

The end product was continuously tested both during and after the implementation to maximize the likelihood of finding and correcting bugs. Testing implicitly occurred any time a team member accessed the website. In addition, more formalized use-case testing was also performed against the project requirements. CVS was used to ensure code could be rolled back to a known stable state if necessary. For additional verification, the end product was also inspected by the client. The team also relied on input from students working on the ISEAGE project to verify it would meet their needs.





Use-case testing and black-box testing were the primary methods employed. Test results were assessed on a pass-fail basis. In addition, each test was given a priority based on the importance of the aspect being tested. The tests were categorized into priorities as follows:

High - A high-priority test covers a feature that is critical to meeting the product requirements. A high-priority failure may significantly impede the client, or even completely prevent the client from using the tool effectively. A high-priority failure usually directly corresponds to product failure. All high-priority tests must pass before the product can be considered to meet all requirements.

Medium - A medium-priority test covers a feature that supplements the product in meeting the product requirements. A medium-priority failure may impede the client, especially if the client is not experienced with the product. A medium-priority failure usually can be overcome or circumvented, but it may keep inexperienced users from attaining their goals, which is effectively the same as product failure. All medium-priority tests must pass before the product can be considered a true success.

Low - A low-priority test covers a feature that is not essential to the product. A low-priority failure may appear unprofessional or briefly confuse the client. A low-priority failure usually can be ignored, even by inexperienced users. Low-priority tests are important to measuring the quality of the program, but not all low-priority issues need to be resolved before closing the project.


Aspects tested included site navigation, search requests, results table links, and the launch attack process. A brief summary of the test results is provided below:



Table 2-3. Project Testing Results

Priority   # of Tests   # Passed   # Fixed and Retested   % Complete
High       9            8          1                      100%
Medium     4            4          0                      100%
Low        3            0          1                      33%
Total      16           12         2                      88%


The product testing phase was considered successful when all hig
h
-
priority and
medium
-
priority tests passed.

Appendix A contains a complete listing of each test

and its results
.

2.7 Project End Results

The team was able to accomplish the goals established at the beginning of the project. The team created a successful foundation for the project by gaining a thorough understanding of the requirements. The team built a solid design document based on those requirements. The implementation proceeded in conformance with the design. The testing procedure was successfully conducted concurrently with the implementation phase. Table 2-4 below summarizes the status of the major components for this project. The project completed successfully and on schedule.



Table 2-4. Project Component Status Results

Component                         End Result
Website software                  Completed
User documentation                Completed
Administrative guide              Completed
Database structure and contents   Completed
Commented source code             Completed


3 Resources and Schedule

This section details the resources used for this project and the schedule associated with its completion.

3.1 Resources

This section details the personnel effort requirements, other resource requirements, and total financial requirements.

3.1.1 Personnel Effort Requirements

This section identifies the tasks that were defined for this project and the personnel effort necessary to complete each task.

3.1.1.1 Task Definitions

Task 1 - Problem Definition
    Subtask 1a - Problem Definition Completion
    Subtask 1b - End-Users and End-Use Identification
    Subtask 1c - Constraint Identification

Task 2 - Technology Considerations and Selection
    Subtask 2a - Identification of Possible Technologies
    Subtask 2b - Identification of Selection Criteria
    Subtask 2c - Technology Research
    Subtask 2d - Technology Selection

Task 3 - End-Product Design
    Subtask 3a - Identification of Design Requirements
    Subtask 3b - Design Process
    Subtask 3c - Documentation of Design

Task 4 - End-Product Prototype Implementation
    Subtask 4a - Identification of Prototype Limitations and Substitutions
    Subtask 4b - Implementation of Prototype End-Product

Task 5 - End-Product Testing
    Subtask 5a - Test Planning
    Subtask 5b - Test Development
    Subtask 5c - Test Execution
    Subtask 5d - Test Evaluation
    Subtask 5e - Documentation of Testing

Task 6 - End-Product Documentation
    Subtask 6a - Development of End-User Documentation
    Subtask 6b - Maintenance/Support Documentation

Task 7 - End-Product Demonstration
    Subtask 7a - Demonstration Planning
    Subtask 7b - Faculty Advisor Demonstration
    Subtask 7c - Client Demonstration
    Subtask 7d - Industrial Review Panel Demonstration

Task 8 - Project Reporting
    Subtask 8a - Project Plan Development
    Subtask 8b - Project Poster Development
    Subtask 8c - End-Product Design Report Development
    Subtask 8d - Development of Project Final Report
    Subtask 8e - Weekly Email Reporting

3.1.1.2 Personnel Effort Requirements

Table 3-1 includes the original personnel effort estimates from the project plan. Table 3-2 holds the revised personnel resources from the project's design report. At that time, Tasks 1 and 2 were completed and Task 3 resources were updated based on the actual design plan. Table 3-3 details the final personnel resources needed to complete the entire project.


As the tables show, the actual personnel effort required by each team member is significantly less than the original estimates. There are several reasons for these changes. The first task took about half the time originally estimated because Dr. Jacobson was able to clearly define his requirements. Task 2 took longer because of the number of technologies the team considered and the depth at which each was considered. Task 5 also took approximately half of the anticipated time because the design turned out to be much less complicated than originally expected. As a result, testing proved to be rather straightforward. The remaining tasks, in most cases, remained close to the original estimates. Overall, the effort required for the project was less than expected because the design and its implementation turned out to be simple but functional, which is what the client desired. Differences between each team member's efforts arose because the team decided to utilize the abilities of the individuals with the best skills for each task. For example, user documentation was worked on more heavily by the individuals who had previous experience developing similar documents.


Table 3-1. Original Personnel Effort Resource Requirements

Team Members        Task 1  Task 2  Task 3  Task 4  Task 5  Task 6  Task 7  Task 8  Totals
Jeremy Brotherton   9       4       50      31      16      11      16      40      177
Tim Hilby           5       5       42      40      20      14      13      35      174
Brett Mastbergen    8       6       45      37      15      12      11      42      176
Jasen Stoeker       7       3       43      35      22      16      12      45      183
Total               29      18      180     143     73      53      52      162     710


Table 3-2. Revised Personnel Effort Resource Requirements

Team Members        Task 1  Task 2  Task 3  Task 4  Task 5  Task 6  Task 7  Task 8  Totals
Jeremy Brotherton   4       10      17      31      16      11      16      40      145
Tim Hilby           2       7       12      40      20      14      13      35      143
Brett Mastbergen    4       7       9       37      15      12      11      42      137
Jasen Stoeker       2       8       12      35      22      16      12      45      152
Total               12      32      50      143     73      53      52      162     577


Table 3-3. Final Personnel Effort Resource Requirements

Team Members        Task 1  Task 2  Task 3  Task 4  Task 5  Task 6  Task 7  Task 8  Totals
Jeremy Brotherton   4       10      16      25      8       2       12      28      105
Tim Hilby           2       7       17      14      7       2       10      13      72
Brett Mastbergen    4       7       10      18      5       10      10      25      89
Jasen Stoeker       2       8       14      14      2       7       8       15      70
Total               12      32      57      71      22      21      40      81      336

3.1.2 Other Resource Requirements

As Table 3-4 shows, the team initially did not expect this project to have any additional resource requirements. However, when preparing the design document, costs for bound documentation were included, as shown in Table 3-5. Finally, Table 3-6 includes the costs for lamination and foam board for the project poster.



Table 3-4. Original Estimate of Other Resource Requirements

Item     Costs
-        $ -
Total    $ -



Table 3-5. Revised Estimate of Other Resource Requirements

Item                    Costs
Project Plan Binding    $6.00
Design Report Binding   $6.00
Final Report Binding    $6.00
Total                   $18.00






Table 3-6. Final Costs of Other Resource Requirements

Item                                  Costs
Project Plan Binding                  $6.00
Design Report Binding                 $6.00
Final Report Binding                  $6.00
Project Poster Lamination and Board   $25.00
Total                                 $43.00

3.1.3 Financial Requirements

This section details the total financial resources necessary to complete this project.

3.1.3.1 Estimated Costs

The total costs for this project are based on the resources identified in sections 3.1.1 and 3.1.2. Table 3-7 represents the original estimate of the project costs. Table 3-8 details the revised estimate of the project costs. Finally, Table 3-9 lists the final costs for this project. The differences between the original and revised costs are primarily because the team did not know that senior design covered poster printing costs. In addition, the team reduced the personnel effort estimates, which reduced project costs. The actual costs reflect preparing the project poster, which was laminated and mounted on foam board, and a further reduction in the personnel effort necessary. Additionally, the team only needed to use two computers from ISEAGE, which reduced the donated costs.


Table 3-7. Original Estimated Project Costs

Item                          W/O labor   With labor   Donated costs
Project poster                $65.00      $65.00
Bound project documentation   $20.00      $20.00
4 Donated computers (ISEAGE)                           $1,600.00
Labor at $11.00 per hour:
  Jeremy Brotherton                       $1,947.00
  Tim Hilby                               $1,914.00
  Brett Mastbergen                        $1,936.00
  Jasen Stoeker                           $2,013.00
Total costs                   $85.00      $7,895.00    $1,600.00






Table 3-8. Revised Estimated Project Costs

Item                          W/O labor   With labor   Donated costs
Project poster printing                                $40.00
Bound project documentation   $18.00      $18.00
4 Donated computers (ISEAGE)                           $1,600.00
Labor at $11.00 per hour:
  Jeremy Brotherton                       $1,595.00
  Tim Hilby                               $1,573.00
  Brett Mastbergen                        $1,507.00
  Jasen Stoeker                           $1,672.00
Total costs                   $18.00      $6,365.00    $1,640.00



Table 3-9. Final Project Costs

Item                          W/O labor   With labor   Donated costs
Project poster printing                                $40.00
Bound project documentation   $18.00      $18.00
Poster lamination and board   $25.00      $25.00
PHP book                                               $40.00
2 Donated computers (ISEAGE)                           $800.00
Labor at $11.00 per hour:
  Jeremy Brotherton                       $1,155.00
  Tim Hilby                               $792.00
  Brett Mastbergen                        $979.00
  Jasen Stoeker                           $770.00
Total costs                   $43.00      $3,739.00    $880.00

3.2 Schedules

This section details the project schedule and the deliverables schedule for the team.

3.2.1 Project Schedule

Figure 3-1 below shows the schedule for this project. The original estimates are shown in blue, revised estimates in red, and the actual schedule in green. The revised timeline reflects the completion of Tasks 1 and 2. The actual timeline remains close to the team's revised estimates. Actual implementation took longer than expected but did not affect the success of the project. As a result, testing was delayed, but continuing some of the subtasks in parallel kept the completion date close to the estimate. End-product documentation was started earlier than anticipated, allowing the team to complete it sooner. The project poster was also started and completed earlier than originally estimated. Finally, client and advisor demonstrations were completed much sooner than anticipated. Overall, the team managed to remain very close to the estimated schedule, in some cases finishing tasks early and at other times slightly late.

Figure 3-1. Project Schedule




3.2.2 Deliverables Schedule

Figure 3-2 below shows the timeline for project deliverables. The dates for deliverables have not changed.

Figure 3-2. Deliverables Schedule

4 Closure Material

This section addresses project evaluation, commercialization, recommendations for additional work, lessons learned, risk and risk management, team information, and the closing summary.

4.1 Project Evaluation

The milestones for this project are shown below in Table 4-1, along with the relative importance of each to the project. These milestones were evaluated based on the criteria in Table 4-2 and weighted by importance. A total project score of 90% was considered successful. The overall project evaluation is detailed in Table 4-3. With a score of 97%, the team deemed this project successful. The client also evaluated the end-product against his requirements and determined that the project was completed and successful.


Table 4-1. Project Milestones and Relative Importance

Number   Milestone                    Importance
1        Project plan development     20%
2        Design research              5%
3        Technology selection         5%
4        Initial product design       20%
5        Framework implemented        10%
6        End-product testing          15%
7        End-product documentation    15%
8        End-product demonstration    10%
Total                                 100%


Table 4-2. Milestone Evaluation Criteria

Criteria           Score
Greatly exceeded   110%
Exceeded           105%
Fully met          100%
Partially met      80%
Not met            30%
Not attempted      0%







Table 4-3. Project Results

Milestone   Evaluation      Resultant Percentage
1           Fully Met       20% * 100 = 20%
2           Fully Met       5% * 100 = 5%
3           Fully Met       5% * 100 = 5%
4           Fully Met       20% * 100 = 20%
5           Fully Met       10% * 100 = 10%
6           Partially Met   15% * 80 = 12%
7           Fully Met       15% * 100 = 15%
8           Fully Met       10% * 100 = 10%
Total                       97%
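The 97% figure is simply the importance-weighted sum of the milestone scores from Tables 4-1 and 4-2; the short check below reproduces the arithmetic.

```php
<?php
// Weighted project score: importances from Table 4-1, evaluation scores from
// Table 4-3 (Fully Met = 100%, Partially Met = 80%).
$importance = array(0.20, 0.05, 0.05, 0.20, 0.10, 0.15, 0.15, 0.10);
$score      = array(1.00, 1.00, 1.00, 1.00, 1.00, 0.80, 1.00, 1.00);

$total = 0.0;
for ($i = 0; $i < count($importance); $i++) {
    $total += $importance[$i] * $score[$i];
}
echo round($total * 100) . '%';  // 97%
```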


4.2 Commercialization

The team does not plan to commercialize this product. Dr. Jacobson, the team's faculty advisor, has also indicated that this product is intended to only be used within the ISEAGE environment at Iowa State University.

4.3 Recommendations for Additional Work

The team recommends that any future work be aimed at producing an enhanced prototype. More specifically, the team recommends developing a method that allows the user to interact with attacks that are launched from the website. Currently, the website only launches the attack and returns any output that is generated within a predetermined time period. It is recommended that the launch mechanism allow the user to open a new browser window with an interactive command line much like a Linux terminal window. Another feature could allow users to specify target machines for each attack simply by clicking on a live network map of computers. Additionally, it is recommended that more exploits be added to the repository to enhance the functionality of the product.

4.4 Lessons Learned

What went well:

Implementation - All of the team members were eager to develop the code for this application. As a result, individuals took on extra work without having to be asked to do so.

Client demonstration - The team's faculty advisor (also the client) was very pleased with the progress over the semester and had no recommendations for any changes when the team demonstrated the prototype.

Teamwork - The group was very willing to share ideas and be supportive of one another, which created a very productive atmosphere and a successful project.

What did not go well:

Equipment setup - The team experienced problems installing the operating system onto the testing machines. It required a switch to a different operating system to overcome this problem. In addition, the team had to move the machines out of Coover during the second semester.

Project plan - During the first document the team experienced some conflicts because of differences in team members' work habits.


Technical knowledge gained:

PHP - The team learned the PHP scripting language to develop this product.

MySQL - The team learned how to use MySQL to create a database used by a website.

XAMPP - The team learned how to use this all-in-one package for developers, which included PHP, MySQL, and Apache.


Non-technical knowledge gained:

Communication skills - The team developed both written and oral communication skills. All of the documents associated with this project helped to develop good writing skills. Working with a group required team members to learn to communicate better and more effectively with one another to complete this project successfully.

Long-term planning - For this project, the team was required to develop a plan for successfully implementing a project over two semesters. Never before had any of the team members been involved in a project with such a long development time. As a result, the team learned how to plan and manage a long-term project.


What would be done differently:

Implementation - The team was happy with the approach used for implementation but thinks it would have been more effective to start development near the end of the first semester. That would have resulted in more time to deal with unexpected problems and allowed time to add more features to the end-product.

Hardware setup - It would have been best to set up the machines for development as soon as possible to better manage any problems or difficulties.


4.5 Risk and Risk Management

This section addresses anticipated risks and planned management, anticipated and unanticipated risks encountered and their management, and changes in risk management due to unanticipated risks.

4.5.1 Anticipated Risks and Planned Management

Potential risks were as follows:

- Loss of a team member due to sickness or other unexpected circumstances
- Missed deadlines
- Faulty product
- Poor communication among team members may halt the project
- Data loss


Management of the associated risks:

- All team members were required to thoroughly understand the product and associated technology as it was developed, to ensure that any one person's tasks could be taken over by another team member.
- The team attempted to stay ahead of schedule and worked to establish the most feasible deadlines.
- All stages of production were reviewed to ensure that functional requirements were met, and code development was well documented to provide aid in debugging.
- Team members worked to create an environment where it was comfortable to ask questions. All members were held accountable for their own tasks. Task delegations were well documented to guarantee all team members understood their responsibilities.
- Concurrent Versions System (CVS) was used to keep backups of all documents and to track changes each team member made.

4.5.2 Anticipated Risks Encountered and Management

Anticipated risks encountered:

- Loss of a team member due to sickness

Management of the encountered risks:

- Documentation of the team member's activities and good communication with the team leader allowed him to still contribute to the group and helped other team members continue his efforts with few problems.

4.5.3 Unanticipated Risks Encountered and Management

Unanticipated risks encountered:

- Dead-on-arrival hardware

Management of the encountered risks:

- The team was able to troubleshoot the computer and reconfigure its settings to allow for the installation of the operating system.

4.5.4 Changes in Risk Management Due to Unanticipated Risks

As a result of the problems encountered with setting up hardware, the team moved deadlines ahead to allow for more time if problems were encountered. The team also tried to approach development with the expectation that problems would be encountered. In other words, the team assumed each task would prove more difficult than expected, ensuring the team was not caught off guard by any problems that arose.

4.6 Project Team Information

Contact information for team members, the client, and the faculty advisor is provided in the following sections.

4.6.1 Client Information

Information Assurance Center
Contact: Doug Jacobson
2419 Coover
Ames, IA 50011-3060
Office Phone: 515-294-8307
Fax: 515-294-8432
Email: dougj@iastate.edu

4.6.2 Faculty Advisor Information

Doug Jacobson
2419 Coover Hall
Ames, IA 50011-3060
Office Phone: 515-294-8307
Fax: 515-294-8432
Email: dougj@iastate.edu

4.6.3 Team Member Information

Jeremy Brotherton
Major: Computer Engineering
905 Dickinson #113
Ames, IA 50014
Phone: 515-292-9704
Email: bigjermb@iastate.edu

Timothy Hilby
Major: Computer Engineering
252 Barton Anders
Ames, IA 50013
Phone: 515-572-0890
Email: steiner@iastate.edu

Jasen Stoeker
Major: Computer Engineering
2310 Martin Lovelace
Ames, IA 50012
Phone: 515-572-6079
Email: jasen@iastate.edu

Brett Mastbergen
Major: Computer Engineering
4510 Steinbeck St. #4
Ames, IA 50014
Phone: 515-451-5700
Email: siver94@iastate.edu

4.7 Closing Summary

With today's rapid increase in computer technology, the problem of computer security is growing. The ability to create defenses against potential security threats begins with gaining an understanding of how computer networks and computer technologies can be attacked and exploited. The Attack Tool Repository and Player, in conjunction with ISEAGE, provides the ability to quickly locate and execute a large number of attacks and exploits. The team's solution included a web-based search engine capable of searching the attack database. Each attack entry contains relevant attack information and documentation and is tied to a repository that allows all attacks to be executed from their native platform.





Appendix A


Test name or objective:
Tester: Brett Mastbergen
Test date: 2 / 23 / 2006
Tester: Brett Mastbergen
Test date: 2 / 23 / 2006
Test type: Test or Retest
Test time: 2 : 47 ( PM )
Test type: Test or Retest
Test time: 2 : 55 ( PM )
Priority:
High
Priority:
High
Test description:
Test description:
Anticipated results:
Anticipated results:
Actual results:
Pass
or
Fail
Actual results:
Pass
or
Fail
Reason for failure:
Reason for failure:
Tester: Brett Mastbergen
Test date: 2 / 26 / 2006
Tester: Brett Mastbergen
Test date: 2 / 26 / 2006
Test type: Test or Retest
Test time: 4 : 18 ( PM )
Test type: Test or Retest
Test time: 5 : 04 ( PM )
Priority:
High
Priority:
Medium
Test description:
Test description:
Anticipated results:
Anticipated results:
Actual results:
Pass
or
Fail
Actual results:
Pass
or
Fail
Reason for failure:
Reason for failure:
Comments (or recommended fix if known):
Comments (or recommended fix if known):
–––––––––––––––––––––––––––––––––––––––––––––––
–––––––––––––––––––––––––––––––––––––––––––––––
Comments (or recommended fix if known):
Comments (or recommended fix if known):
May06-11 Scheduled Test Reporting Form
May06-11 Scheduled Test Reporting Form
–––––––––––––––––––––––––––––––––––––––––––––––
–––––––––––––––––––––––––––––––––––––––––––––––
May06-11 Scheduled Test Reporting Form
Launch Attack Button
May06-11 Scheduled Test Reporting Form
Test name or objective:
User Feedback
User Feedback (Again)
Attack Timeout
Test name or objective:
Test name or objective:
This test checks that attacks are initiated and basic
feedback is provided to the user
Launching an attack opens a new browser window. The
user feedback will appear in the window.
The window is blank.
Unknown
It seems that the SSH command is not working, but it will
take more investigation to find the root cause of the
problem.
This test checks that the Launch Attack button correctly
prepares to initiate the attack
Clicking the “Launch Attack” button opens a new window.
The command to execute is properly built, including the
attack name and command-line parameters.
Figured out that the filename had to include the fully-
qualified path instead of the relative path. Looks great!
This test checks that attacks are timed out after a specified
amount of time
A launched attack is killed after twenty seconds
This test checks that attacks are initiated and basic
feedback is provided to the user
Launching an attack opens a new browser window. The
user feedback will appear in the window.
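The Attack Timeout test expects a launched attack to be killed after twenty seconds. The player itself was web-based (PHP); the following is only a minimal Python sketch of the same watchdog idea, with illustrative commands and a short timeout for demonstration:

```python
import subprocess

def run_with_timeout(cmd, timeout_seconds):
    """Launch an external command and kill it if it exceeds the
    timeout, mirroring the player's twenty-second attack timeout."""
    proc = subprocess.Popen(cmd)
    try:
        proc.wait(timeout=timeout_seconds)
        return "completed"
    except subprocess.TimeoutExpired:
        proc.kill()   # attack ran too long: terminate it
        proc.wait()   # reap the killed process
        return "killed"

# Illustrative commands and a short timeout; the player used 20 seconds.
print(run_with_timeout(["sleep", "5"], 1))  # long-running attack
print(run_with_timeout(["true"], 5))        # attack that finishes quickly
```

The key design point is that the watchdog both kills and reaps the child process, so a timed-out attack cannot linger as a zombie.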



Test name or objective: SQL Parameter Search
Tester: Jeremy Brotherton    Test date: 2/24/2006    Test time: 2:15 PM
Test type: Test    Priority: High
Test description: This test ensures the correct SQL statement is built if one or more parameters (other than the name) are specified.
Anticipated results: The search statement should assume an empty name and only have search restrictions on the parameters specified by the user.
Actual results: Pass

–––––––––––––––––––––––––––––––––––––––––––––––

Test name or objective: SQL Name and Parameter Search
Tester: Jeremy Brotherton    Test date: 2/24/2006    Test time: 2:25 PM
Test type: Test    Priority: High
Test description: This test ensures the correct SQL statement is built if one or more parameters (including the name) are specified.
Anticipated results: The search statement should include the name and only have search restrictions on the parameters specified by the user.
Actual results: Pass

–––––––––––––––––––––––––––––––––––––––––––––––

Test name or objective: Table Sorting
Tester: Jeremy Brotherton    Test date: 2/24/2006    Test time: 2:33 PM
Test type: Test    Priority: Medium
Test description: This test verifies the search results table can be sorted by field.
Anticipated results: Fields that are sortable have their name as a link. Clicking the link sorts the table according to that field.
Actual results: Pass
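The SQL search tests above all exercise one assembly rule: restrict only on the fields the user actually supplied, and treat an empty name as no restriction at all. A hedged sketch of that logic in Python (the site itself was PHP; the table and column names here are hypothetical):

```python
def build_search_sql(name="", params=None):
    """Assemble the attack-search statement. An empty name adds no
    restriction; each user-supplied parameter adds exactly one clause.
    %s placeholders are left for the database driver to bind safely."""
    clauses = []
    if name:
        clauses.append("name LIKE %s")
    for field in sorted(params or {}):
        clauses.append(field + " = %s")
    sql = "SELECT * FROM attacks"
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return sql

print(build_search_sql())                              # default search
print(build_search_sql(name="smurf"))                  # name search
print(build_search_sql(params={"platform": "linux"}))  # parameter search
```

A default search therefore produces an unrestricted statement, which is why the Default SQL Statement Search test below treats it as the way to view all possible attacks.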




Test name or objective: Default SQL Statement Search
Tester: Jasen Stoeker    Test date: 2/24/2006    Test time: 10:10 AM
Test type: Test    Priority: High
Test description: This test ensures the correct SQL statement is built to execute a default search.
Anticipated results: The search statement should assume an empty name and have no other search restrictions.
Actual results: Pass
Comments (or recommended fix if known): The priority for this test was high because a default search should be an effective way to view all possible attacks.

–––––––––––––––––––––––––––––––––––––––––––––––

Test name or objective: SQL Name Search
Tester: Jasen Stoeker    Test date: 2/24/2006    Test time: 10:18 AM
Test type: Test    Priority: High
Test description: This test ensures the correct SQL statement is built if an attack name is specified.
Anticipated results: The search statement should include the name and have no other search restrictions.
Actual results: Pass

–––––––––––––––––––––––––––––––––––––––––––––––

Test name or objective: SQL Injection Prevention
Tester: Jasen Stoeker    Test date: 2/24/2006    Test time: 10:30 AM
Test type: Test    Priority: Low
Test description: This test ensures that the SQL database is not vulnerable to an SQL injection attack.
Anticipated results: The website should not build more than one search statement, and the built search statement should be valid.
Actual results: Fail (it was possible to build two separate search statements)
Reason for failure: The HTTP GET method doesn’t account for SQL injection, and no other steps have been taken to prevent it.
Comments (or recommended fix if known): This will require some investigation to fix, if it is fixed at all (it is a low priority). Another possibility may be to make sure the attacker could do no more than conduct searches with the website account.
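The SQL Injection Prevention test failed because values taken from the HTTP GET request were concatenated directly into the statement, letting an attacker terminate it and append a second one. The standard fix is parameter binding, sketched here with Python’s sqlite3 module (the site itself was PHP with a different database; the schema and data are illustrative only):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE attacks (name TEXT)")
conn.execute("INSERT INTO attacks VALUES ('smurf')")

# A classic payload of the kind that let the failed test build two
# separate search statements out of one search box.
malicious = "x'; DROP TABLE attacks; --"

# With parameter binding the payload is treated purely as data: it
# matches nothing, and no second statement is ever executed.
rows = conn.execute(
    "SELECT name FROM attacks WHERE name = ?", (malicious,)
).fetchall()
print(rows)                                                # no matches
print(conn.execute("SELECT COUNT(*) FROM attacks").fetchone()[0])
```

The table survives the query intact, which is exactly the property the anticipated results demand: one statement built, and that statement valid. The fallback noted in the comments (a database account limited to searching) is defense in depth, not a substitute for binding.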



Test name or objective: Table Homepage Links
Tester: Tim Hilby    Test date: 2/22/2006    Test time: 1:35 PM
Test type: Test    Priority: Low
Test description: This test verifies the homepage links under the search results table work as designed.
Anticipated results: Items without homepages display as “N/A” (in normal text). Items with homepages display as “Download” (in a link). The link opens the homepage in a new window.
Actual results: Fail (the “N/A” text showed up as a broken link)
Reason for failure: Bug in the PHP code.
Comments (or recommended fix if known): Should be an easy fix. Also, this particular problem is a low priority because it causes minimal trouble for the user.

–––––––––––––––––––––––––––––––––––––––––––––––

Test name or objective: Table Download Links
Tester: Tim Hilby    Test date: 2/22/2006    Test time: 1:45 PM
Test type: Test    Priority: Medium
Test description: This test verifies the download links under the search results table work as designed.
Anticipated results: Items without downloads display as “N/A” (in normal text). Items with downloads display as “Download” (in a link). The link opens the download window.
Actual results: Pass

–––––––––––––––––––––––––––––––––––––––––––––––

Test name or objective: Table Name Links
Tester: Tim Hilby    Test date: 2/22/2006    Test time: 1:55 PM
Test type: Test    Priority: High
Test description: This test verifies the name links under the search results table work as designed.
Anticipated results: Item names display as links. Clicking on the name selects the attack. The launch attack options appear for the selected attack.
Actual results: Pass

–––––––––––––––––––––––––––––––––––––––––––––––

Test name or objective: Table Homepage Links (Attempt II)
Tester: Tim Hilby    Test date: 2/22/2006    Test time: 3:45 PM
Test type: Retest    Priority: Low
Test description: This test verifies the homepage links under the search results table work as designed.
Anticipated results: Items without homepages display as “N/A” (in normal text). Items with homepages display as “Download” (in a link). The link opens the homepage in a new window.
Actual results: Pass
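The three table-link tests above share one rendering rule: show plain “N/A” when an item has nothing to link to, and an anchor otherwise. The Table Homepage Links failure was exactly this rule going wrong in the PHP. A minimal sketch of the intended behavior (Python for illustration; the function name and labels are hypothetical):

```python
import html

def render_link_cell(url, label):
    """Render one search-results cell: plain text 'N/A' when there is
    nothing to link to, otherwise an anchor opening a new window."""
    if not url:
        return "N/A"   # must be plain text, never a (broken) link
    return '<a href="%s" target="_blank">%s</a>' % (
        html.escape(url, quote=True), html.escape(label))

print(render_link_cell(None, "Download"))
print(render_link_cell("http://example.com/tool.tgz", "Download"))
```

Escaping the URL and label before emitting HTML is a small extra safeguard the sketch adds; the essential fix is simply checking for a missing URL before wrapping the text in an anchor tag.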



Test name or objective: Homepage / Page Layout (Part I)
Tester: Tim Hilby    Test date: 2/22/2006    Test time: 1:10 PM
Test type: Test    Priority: High
Test description: This test verifies the homepage loads and displays correctly in Internet Explorer and Mozilla Firefox.
Anticipated results: The homepage loads and is presentable in all browsers.
Actual results: Pass

–––––––––––––––––––––––––––––––––––––––––––––––

Test name or objective: Homepage / Page Layout (Part II)
Tester: Tim Hilby    Test date: 2/22/2006    Test time: 1:15 PM
Test type: Test    Priority: Low
Test description: This test verifies the homepage loads and displays correctly in Opera and Konqueror.
Anticipated results: The homepage loads and is presentable in all browsers.
Actual results: Fail (the webpage was presentable in Opera; some fields were cut off using the Konqueror browser)
Reason for failure: Basic organization, use of frames, use of “Transitional” HTML as opposed to “Strict” HTML.
Comments (or recommended fix if known): Since the webpage is presentable in the more commonly used web browsers from Part I, this issue is considered a low priority. Also, the required changes will have broad effects and may break other aspects of the website.

–––––––––––––––––––––––––––––––––––––––––––––––

Test name or objective: Site Navigation
Tester: Tim Hilby    Test date: 2/22/2006    Test time: 1:25 PM
Test type: Test    Priority: High
Test description: This test verifies that the site navigation works as intended.
Anticipated results: Navigation bar links work.
Actual results: Pass (all links worked as anticipated)

–––––––––––––––––––––––––––––––––––––––––––––––

Test name or objective: Table Documentation Links
Tester: Tim Hilby    Test date: 2/22/2006    Test time: 1:30 PM
Test type: Test    Priority: Medium
Test description: This test verifies the documentation links under the search results table work as designed.
Anticipated results: Items without documentation display as “No” (normal text) and items with documentation display as “Yes” (as a link). Clicking “Yes” opens the documentation in a new window.
Actual results: Pass