Application Development Standards V1.0


APPLICATION DEVELOPMENT STANDARDS

Information Technology Division
Initial
October 2009














Revision Chart
Please provide a description of the change as it relates to the new release of the document.
Attach any identifying release notes and functionality descriptions relevant to the new issue
and document serial number.
The naming scheme for the document serial number consists of the initials of the document
purpose (e.g. SRS), the two-digit month, the four-digit year, the letter "v", and a two-digit
version number.
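As a sketch of the scheme just described, the serial number can be assembled mechanically. The helper name below is hypothetical, not part of the standard:

```python
# Hypothetical helper illustrating the document serial number scheme described
# above: document-purpose initials (e.g. "SRS"), two-digit month, four-digit
# year, the letter "v", and a two-digit version number.
def document_serial(purpose: str, month: int, year: int, version: int) -> str:
    return f"{purpose}{month:02d}{year}v{version:02d}"

# October 2009, version 1 of a Software Requirements Specification:
print(document_serial("SRS", 10, 2009, 1))  # SRS102009v01
```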
Version Number | Primary Author(s) | Description of Version | Date Completed
1.0            |                   | Initial Version        |

Initial Release - Page 2 of 67



Table of Contents

1. INTRODUCTION
   1.1 Background
   1.2 Purpose
   1.3 Scope
   1.4 References
   1.5 Recommended Versioning and/or Change Management
2. GENERAL STANDARDS
   2.1 General Overview
   2.2 Requirements Analysis Definition (Requirements Standards)
   2.3 Architecture Design Guidelines and Patterns Overview (Design Standards -
       Architecture Design Guidelines)
   2.4 Coding (Coding Standards)
   2.5 Testing Standardization (Testing Standards)
   2.6 Stabilization, Deployment & Software Distribution (Deployment Standards)
   2.7 Maintenance & System Replacement (Maintenance Standards)
   2.8 SOS Web Publishing Standards
3. REQUIREMENTS STANDARDS
   3.1 Objectives of Requirements Definition
   3.2 Scale
   3.3 Requirements Gathering, Analysis and Definition
   3.4 Verifying the Requirements Specification
   3.5 Acceptance of Project Requirements Specification Document
   3.6 Template: Standard Project Requirements Specification for Application
       Software Development
4. DESIGN STANDARDS
   4.1 Design Goals
   4.2 Design Constraints
   4.3 Design Approach
   4.4 Design Process
   4.5 Design Objectives
   4.6 Data Validation
5. CODING STANDARDS
   5.1 ASP.NET Coding Standards
   5.2 Architecture
   5.3 Security
   5.4 Reporting
   5.5 Microsoft SQL Server Coding Standards
6. TESTING STANDARDS
   6.1 Testing Objectives
   6.2 Types of Testing
   6.3 Testing Strategy
   6.4 Test Environment
   6.5 Requirements Traceability Matrix
   6.6 Automated Test Execution
   6.7 Templates
7. ARCHITECTURE DESIGN GUIDELINES
   7.1 Authentication/Authorization (ASP.NET including Web Services)
   7.2 Web Application Architecture
   7.3 Reporting
   7.4 Reverse Proxy (for Network Infrastructure)
   7.5 Windows Application Architecture (for Network Infrastructure and
       Database Administrator)
8. DEPLOYMENT STANDARDS
   8.1 Introduction
   8.2 Deployment Pre-requisites
   8.3 Training Plan (Optional)
   8.4 Deployment Scope
   8.5 Cut Over Activities
   8.6 Transition to Operations
9. SOFTWARE MAINTENANCE STANDARDS
   9.1 Introduction
   9.2 Scope and Purpose
   9.3 Applicable Documents
   9.4 Maintenance Process Activities
   9.5 Software Unit Maintenance Procedures
   9.6 User Interfaces
10. GLOSSARY
APPENDIX 1: DEPLOYMENT PLAN TEMPLATE SAMPLE
APPENDIX 2: SYSTEM DOCUMENTATION TEMPLATE SAMPLE
APPENDIX 3: REPORTING TOOLS



Application Development Standards 
1. Introduction
1.1 Background
The Information Technology Division (ITD) has been supporting a large number of
applications created using a range of technologies at the Office of the Secretary of State
(SOS). Application Development Standards (ADS) are required to set the processes and
guidelines for new IT application development as well as major upgrades to existing
applications at the SOS. These standards will be used to help define which technologies are
preferred, acceptable, supported and prohibited.
Application development refers to a software development process used by an application
developer to build application systems. This process is commonly known as the Software
Development Lifecycle (SDLC) methodology and encompasses all activities to develop an
application system and put it into production, including requirements gathering, analysis,
design, build, testing, deployment, and maintenance stages.
1.2 Purpose
The purpose of this document is to establish design guidelines, standards and conventions for
developing applications at the SOS.
The intent of the Application Development Standards is to describe the standards that
apply when developing applications at the SOS. This document is not intended to be an
all-inclusive methodology for application development or the software development lifecycle,
but it outlines specific standards that shall be followed when building applications. The
ADS document is a living document that will evolve over time. It is intended for
application developers who will be designing, developing, and maintaining applications,
as well as for external contractors, consultants, business partners, and other SOS
employees.
1.3 Scope
This document applies to all projects developed in the Information Technology Division.
However, the Coding Standards focus on Microsoft's .NET Framework (with a C# emphasis)
and SQL Server databases. These two elements compose the preferred base platform for
current and future projects.

This ADS document defines:
o  Application/system development lifecycle common methodology and activities,
including requirements gathering, analysis, design, build, testing, deployment, and
maintenance
o  Appropriate application development languages for use at SOS and corresponding coding
standards
o  Guidelines for user authentication
o  Appropriate application platforms
o Data and database standards
The ADS document interprets current industry standards and recommends an application
development standard for adoption in the SOS for the software/application development
lifecycle, consistent with SOS architecture standards, principles, and best practices.
1.4 References
o Project Management Methodology
o Architecture Standard
1.5 Recommended Versioning and/or Change Management
Application Development Standards is a living document and the ITD development
management team is responsible for maintaining these standards. Modifications during the
life of these standards must be approved by the organizational owner of the document.
The organizational owner of the ADS is the Chief of IT Division and the application
development management team.



2. General Standards
2.1 General Overview
2.1.1 Services Provided
As an agency, the SOS maintains computer technologies to facilitate critical services for the
people of California, including on-line Business Filings, Political Reform Filings, Election
Results, and several other important services.
2.1.2 IT Support Team
To provide these services the agency maintains a very lean IT shop of approximately thirty
staff, including the Help Desk, managerial and support staff, server administrators,
network administrators, and the application development teams.
2.1.3 Legacy Technologies
Legacy systems at SOS span the gamut of technologies from the last fifteen years:
multiple platforms and languages ranging from mainframe Natural and COBOL applications
to web sites built with PHP, Perl, Java applets, Tcl, C++, Java servlets, Classic ASP,
PowerBuilder, and other technologies. This diversification has left a small IT shop in
the difficult position of finding and retaining a limited number of developers with the
breadth of knowledge needed to maintain and enhance these aging systems.
Mature applications at SOS currently utilize a wide array of databases, such as ADABAS,
VSAM file systems, MySQL, Oracle, and other legacy data storage solutions.
2.1.4 Standardized Computer Technologies
The SOS has selected the Microsoft .NET Framework as the standardized base for all new
application development work. The .NET Framework (.NET) was selected for a number of
reasons, chief among them the large number of development options it makes available to
architecture and development teams. The .NET Framework allows for a development
standardization that maximizes the flexibility and utilization of a limited development
staff within a very small IT shop.
.NET allows SOS to be proactive in designing new applications to meet current needs
within the agency by providing a comprehensive suite of products that enable developers
to design and implement custom solutions. From the Internet (ASP.NET) platform to web
services and Windows Forms applications, .NET has allowed the SOS IT shop to deliver a
number of successful projects across multiple platforms.
Standardizing on .NET has also allowed the IT shop to select from a wider pool of
available developers. With one .NET development language (C#) and one common interface
(Visual Studio), developers can readily co-develop and cross-maintain .NET applications
created by other developers.
Database standardization on Microsoft's SQL Server and ongoing consolidation of existing
legacy database systems has brought an increased flexibility and lower maintenance cost.



2.1.5 Standardized Software Development
Standardizing on a common development platform, language, and developer interface has
enabled SOS IT staff to respond more quickly to customer needs, maximize the use of
staff resources, and lower overall maintenance costs through consolidation, and it has
resulted in an increased number of successfully implemented projects for the agency.
The following document details the standards and approaches used by the SOS IT staff in
successfully implementing previous projects within the agency for both internal and external
customers. The specifics provided are the standards for current and future development
projects, and legacy system conversions.
2.2 Requirements Analysis Definition (Requirements Standards)
This section reviews the gathering and analyzing of preliminary information including the
use of tools such as modeling notations, use cases, and usage scenarios.
2.3 Architecture Design Guidelines and Patterns Overview (Design Standards -
Architecture Design Guidelines)
This section covers the phase of a project where requirements are transformed into a solution.
Vision, scope, and risk analysis documents from earlier project phases guide the design of the
emerging application structure.
Design processes and components such as patterns are also covered. Conceptual, logical, and
physical design steps are reviewed. User interface design is discussed, including the design
of user process components. Data layer and security design specifications are also included.
2.4 Coding (Coding Standards)
Coding standards are presented with the concept of decreasing future maintenance issues and
increasing development productivity.
2.5 Testing Standardization (Testing Standards)
This section presents standards for testing objectives and best practices. The use of
technical specifications in the process of creating and reviewing test plans is also covered.
2.6 Stabilization, Deployment & Software Distribution (Deployment Standards)
Stabilization of the application and creation of the deployment plan are reviewed. System
deployment techniques and software distribution standards are discussed.

2.7 Maintenance & System Replacement (Maintenance Standards)
System maintenance issues, resource allocation, change control, and cost requirements are
reviewed. System hardware performance and software functionality enhancements to extend
system life and finally system replacement are covered in detail.
2.8 SOS Web Publishing Standards
The SOS web publishing standards can be found at http://webdev.sos.ca.gov/standards/.
These standards encompass look and feel, accessibility, security, etc. They are the current
standards used by the business divisions' web developers, but they are subject to change
depending on decisions made by the Web Governance Group (WGG). A project to develop the
full set of web policies and standards is underway and is scheduled to be completed by
June 30, 2010.
The look and feel of a .NET web application needs to be consistent with the current SOS
websites. Any web application must get the sign-off of the ITD representative on the
Web Governance Group before it goes to the business user for user acceptance testing.
The standard web browser for new Intranet applications is Microsoft Internet Explorer 8.
However, any Internet web application shall be designed and tested against different web
browsers (e.g., IE6, IE7, IE8, Firefox, etc.).
The links below are provided as resources to developers who want to learn more about web
development, server side includes, style sheets, and accessibility.
Server Side Includes
http://en.wikipedia.org/wiki/Server_Side_Includes
http://www.smartwebby.com/web_site_design/server_side_includes.asp 
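As a minimal illustration of the technique the links above describe, an Apache-style SSI directive pulls a shared fragment into a page at serve time. The include paths below are hypothetical, not actual SOS paths:

```html
<!-- Hypothetical page using Apache-style Server Side Includes to share a
     common header and footer across pages. -->
<!--#include virtual="/includes/header.html" -->
<p>Page-specific content goes here.</p>
<!--#include virtual="/includes/footer.html" -->
```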
Cascading Style Sheets
CSS Help 
CSS2 Specifications (W3C) 
A List Apart 
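One common way to keep look and feel consistent across pages, in the spirit of the CSS resources above, is a single shared stylesheet linked from every page. The file name here is hypothetical:

```html
<!-- Hypothetical shared stylesheet giving every page a consistent look and feel -->
<link rel="stylesheet" type="text/css" href="/css/sos-common.css" />
```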
Accessibility / ADA Compliance
Web Developer Meeting (1/27/2009) 
W3C Web Accessibility Initiative 
Web Accessibility in Mind (WebAIM) 
State of California Webtools 
Other Sites
State of California eServices
W3C Validators/Tools



3. Requirements Standards
3.1 Objectives of Requirements Definition
The objectives of requirements definition are:
o  Discovery
   - To find out what the stakeholders want and need
   - To find out what automation opportunities are available
o  Communication
   - To provide a common terminology in order to facilitate group understanding
   - To describe complex problems in a simpler framework that aids understanding
   - To facilitate negotiation of tradeoffs where requirements and/or objectives are in
     conflict
o  Agreement and Documentation
   - To enable consensus, by facilitating communication and negotiation, and to
     document what has been agreed upon
o  Guidance
   - To provide the foundation upon which to organize subsequent project activities
     (design, build, testing, acceptance, training)
o  Success
   - To build and deploy a software solution:
     - that is agreed upon by the project team
     - that meets the needs of the sponsor/owner, as perceived during the project
     - that meets the needs of the sponsor/owner, as understood after the software has
       been in use for a period of time

3.2 Scale
Project scale usually determines the scale of the requirements definition effort and the
resulting Project Requirements Specification document. The effort should fit the need.
Requirements precede design, for efficiency and effectiveness.
In larger projects there tends to be greater formality and greater separation between
Requirements and Design. In smaller assignments, requirements and design may occur
simultaneously and more informally, for example, as a series of prototype meetings. A less
formal approach works well when both customer and IT staff understand the distinction
between requirements and design.
When abbreviated requirements definition is sufficient to serve a small assignment, it is the
responsibility of IT staff to ensure that all involved in the project understand the distinction
between customer needs, specified requirements, and design specifications.
The Project Requirements Specification template includes more topics than may be needed
for any particular project. Topics which do not involve changes from the existing
environment, and which do not need clarification for the sake of the project team, may be
omitted from the final document. The list of required use cases (identifying functional
interfaces between the system and users) should be included in the requirements document
for any project, regardless of scale.

Table 1. Guide to project scale.

Very Large
   Typical project: development of a new application system, or a major revision to an
   existing one.
   Typical structure: formal project, with a Feasibility Study Report (FSR), Request For
   Proposals (RFP), bids, selection, formal award, formal project life cycle and
   oversight, and a post-project review.

Large
   Typical project: a mini-project or Service Request (SR) that involves the basic
   functionality of one part of an application system, including building new components
   or revising existing ones.
   Typical structure: structured in-house project, with a project plan, led by a project
   manager directing a project team with structured roles and a management chain of
   authority, overseen by high-level customer and ITD managers.

Medium
   Typical project: an SR that affects several components of an application system.
   Typical structure: project or assignment involving at least one customer manager, a
   customer lead staff, and possibly several additional customer staff working closely
   with a lead IT staff person and additional IT staff.

Small
   Typical project: a Trouble Report (TR) or SR that affects one or two components of an
   application system.
   Typical structure: assignment led by one IT staff person, working with one or two
   customer staff, with assistance from additional IT staff.


3.3 Requirements Gathering, Analysis and Definition

Requirements analysis and definition consists of four interwoven group activities: eliciting,
analyzing, documenting and negotiating. These activities usually take place simultaneously.
o  Eliciting requirements
The task of communicating with customers and users to determine what their
requirements are. Sometimes called requirements gathering, but “gathering” implies a
passive activity, and it is now recognized that specialized skills and techniques are
needed to actively lead group activities that guide customers and IT professionals in
exploration of business needs and automation opportunities, as well as project team
building.

o  Documenting requirements
Documenting is an essential task during requirements analysis and definition. It should
be noted that some documents may support the activities involved in gathering, analyzing
and defining. These documents might not be included in the final Project Requirements
Specification package. Supporting documents may include: lists, worksheets, notes, and
diagrams of various kinds, which were used to support the activities of elicitation,
analysis and negotiation.




o  Analyzing requirements
The task of determining whether the stated requirements are unclear, incomplete,
ambiguous, or contradictory. This analysis should be conducted both individually and in
groups. The lists, diagrams, and worksheets provide structure for exploring, analyzing,
and recording progress.

o  Negotiating requirements
Project facilitators realize the need to resolve differing opinions and conflicting needs by
identifying trade-offs and then negotiating consensus. Negotiation of requirements is
central to resolving out-of-sync understanding between team members, tradeoffs between
alternative or conflicting needs, conflicting preferences among customers, and other
issues that often arise when working to specify clear and consensual business and
software requirements.

Tools and techniques to support eliciting, analyzing, documenting, negotiating and defining
requirements may include:
o  Interviews
o  JADs (Joint Application Design, or Joint Application Development) and JRDs (Joint
Requirements Definition)
o  Worksheets
o  Business-oriented Data Models: Entity Relationship Diagram, Data Flow Diagram
o  Use Cases
o  Templates

3.4 Verifying the Requirements Specification
Tools for testing requirements:
o  Requirements Traceability Matrix
o  Transforming Requirements Specifications into Test Plans
o  Criteria for Acceptance
Depending on the scale of the project, the Project Requirements Specification (PRS)
document should include a Traceability Matrix, which links requirements, design
specifications, issues (if any), and validation. The matrix acts as a cross-reference
between requirements, design cases, tests, and acceptance criteria, and provides the
project with a tool to evaluate whether requirements are met.
Sample Requirements Traceability Matrix

Requirement       | Design             | Test Case               | Test Result    | Issue
Specification     | Specification      |                         |                |
------------------+--------------------+-------------------------+----------------+------------------
1.1 Certificate   | Certificate Layout | A.1.7 batch print       | A.1.7          |
                  |                    | A.1.8 online print      | A.1.10         |
                  |                    | B.3.6 stored image      | not tested     |
                  |                    | B.3.7 retrieved image   | D.3.6          |
2.1.1 Transaction | program pg0409;    | F.2.03 Applicant        | F.2.03.01-17   |
UseCase           | Submit app screen  | submits application     | accepted       |
                  | sc0412             |                         |                |
2.1.2 Transaction |                    |                         |                |
UseCase           |                    |                         |                |
2.2.1 Transaction |                    |                         |                |
UseCase           |                    |                         |                |
2.3.1 Transaction | program pg0502;    | F.3.08 Reviewer         | F.3.08.01-06   | #107 Clarify
UseCase           | screen sc0502      | rejects application     |                | reject criteria.
                  |                    |                         |                | Resolved.

* The sample entries are completely fictional. Any numbering scheme may be used,
provided it is well organized and each entry is uniquely identified.
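The cross-referencing role of the matrix can also be sketched in code. This is an illustrative sketch in Python (not part of the standard), with hypothetical entries echoing the fictional sample above:

```python
# Illustrative sketch of a traceability matrix as a data structure.
# Requirement IDs, design artifacts, and test cases are hypothetical.
matrix = {
    "1.1 Certificate": {
        "design": ["Certificate Layout"],
        "tests": ["A.1.7 batch print", "A.1.8 online print"],
    },
    "2.1.1 Transaction": {
        "design": ["program pg0409", "screen sc0412"],
        "tests": ["F.2.03 Applicant submits application"],
    },
    "2.2.1 Transaction": {"design": [], "tests": []},
}

# The matrix supports coverage questions such as: which requirements
# have no test case yet?
untested = sorted(req for req, links in matrix.items() if not links["tests"])
print(untested)  # ['2.2.1 Transaction']
```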
3.5 Acceptance of Project Requirements Specification Document
To be accepted, the Project Requirements Specification document should be:
o Encompassing of all specified requirements
o Unambiguous
o Quantifiable, measurable, and testable
o Compliant with standards

3.6 Template: Standard Project Requirements Specification for Application Software
Development
The following outline should be used to draft a Project Requirements Specification
document. Those segments which fit the scale and circumstances of a given
project/assignment should be included in the specifications document. It is not necessary to
include non-applicable elements.
It is optional to include texts, diagrams and worksheets used for requirements definition
activities, unless they are listed under this set of standard elements.

Cover Page
Include: Title, Application Name, Impetus (SR#, legislation bill number, etc.), Owner
Division (optional), Date, Status (Draft or Final), Revision Log.

Revision Log
Include: Title or description of revision, version number (optional), date approved. If
short, place on Cover Page. If lengthy, place on separate page.

Sample Revision Log:

Version Number | Primary Author(s) | Description of Version | Date Adopted
1.0            | Susan Johnson     | Initial Version        | 5/1/20XX


Table of Contents

1.0  INTRODUCTION
1.1  Background
Describe the business problem or needs and existing automation tools if any.
Describe the context leading to authorization of the project.

1.2  Purpose
Describe the envisioned solution.

1.3  Scope
Describe what falls within the project, and also what falls outside the project.

1.4  Definitions and Abbreviations
Define the terms used in the SRS and any other key terms used by the project.
List and explain the acronyms used.

1.5  Assumptions and Constraints
Describe any assumptions held in conjunction with this requirements specification.
Mention any portion of the requirements template intentionally omitted, whether due to
the scale of the project or because the issue is already handled by the current
environment.

2.0  SPECIFIC REQUIREMENTS

2.1  Data Requirements
Describe any new data elements and their Records Management status, if applicable.
Any new data element that is likely to be covered by the agency's policy for Records
Management should be identified. Describe any known or anticipated requirements for
retention, confidentiality, public access, and disposition of such data, as indicated
by that policy.
Data elements covered by this requirement include:
o  an official public record (e.g. certificate, license, etc.)
o  data that will be made accessible to (viewable by) the public (e.g. official
notice, regulatory report, etc.)
o  records or reports containing confidential data (e.g. name, address, birth date,
social security number, etc.)

Describe the structure and composition of an anticipated new database, or of any
significant modifications or enhancements to an existing database.
Describe any existing production data that will become inactive, archived,
converted, migrated or removed from production use by the project.
Identify any data elements that are known to be essential entries in the data
dictionary.
Include conceptual data modeling diagrams, as appropriate, to provide clarity and
context.

2.2 Functional Requirements
o  Describe functionalities necessary to meet the business needs
o  Describe functionalities necessary to perform administrative functions needed to
maintain the new functionality
o  Include a list of required business transactions that will form the basis for Use
Cases developed in the design and test phases:
   - interactions essential to meet the business need
   - interactions to control user identification and permissions
   - interactions with administrative and support users
   - interactions with other automated systems
The use cases identified in this list will be developed and described more fully
during the design and test phases.

2.3 Interface Requirements

2.3.1 Process Interfaces
o  Provide a complete overview of the boundary between the project and all other
business and software processes in terms of interfaces.
o  Describe each interface as a business rule (or set of business rules).
o  Specify every feature (in software-neutral or software-specific terms, as appropriate
to the scale of the project) that must be provided by the project for each interface in
order to implement the business rule for that interface.


2.3.2 User Interfaces

Include a list of user interactions with the system to be developed that are needed to
provide the functionality required of the system. Business analysts (from IT or
customer staff) are usually best-positioned to supply basic information, and to lead
the discovery and negotiation of these requirements with customers.
Include this section in the PRS, based on the audience for this document and
suitability for the project.
The transactions identified in this list will be developed and described more fully as
Use Cases during the design and test phases.

2.3.3 Hardware Interfaces
Include known requirements for equipment and interfaces with the new system. IT
architects and infrastructure managers are usually best-positioned to supply basic
information, and to lead the discovery and negotiation of these requirements with
customers.

2.3.4 Software Interfaces
List software constraints, including existing software with which the new system
must be compatible, including: operating systems, security systems, email systems,
etc.
IT analysts are usually best-positioned to supply basic information, and to lead the
discovery and negotiation of these requirements with customers.

2.3.5 Network Interfaces
Include known constraints regarding data transmission. IT architects and
infrastructure managers are usually best-positioned to supply basic information, and
to lead the discovery and negotiation of these requirements with customers.

2.4 Operational Requirements
Describe any requirements for the ongoing operation of the software to be
developed that are not already met by existing procedures, staff, equipment and
software.
If data must be imported into or exported out of the system on a regular basis,
describe the nature of the data, source, destination, frequency and other essential or
expected requirements.
If ongoing operations will require the development of extensive supporting
procedures, outline the highlights of the procedures that will be needed.
Identify any known or anticipated constraints that could affect the ongoing
operation of the software to be developed.

2.5 Software System Attributes (PASSMADE)

Specify levels for the following standard attributes (PASSMADE) of software
systems that should be met for this project:

P - Performance 
A - Availability 
S - Security 
S - Scalability 
M - Maintainability 
A - Accessibility 
D - Deployability 
E - Extensibility 

2.5.1 Performance
Any performance levels essential for meeting business objectives should be
included. This may include: volume of output, volume and speed of data
transmission, automatic transaction auditing. Where possible, include measurable
parameters, for which data can be captured automatically.

2.5.2 Availability
Availability is the amount of time a system or component is online, ready to
perform its specified function. Availability is closely related to reliability, as
availability is reliability plus the time required to bring a system back to normal
operations after it goes offline. A consideration of the availability requirements at
this stage is key.
A reliable system functions successfully during the agreed hours of operation,
without unexpected down times. Requirements should address:
o acceptable levels of both planned and unexpected downtimes
o running-time periods of greater or lesser criticality to the business mission
o services to be provided during unexpected down times and trouble situations
o critical indicators of system integrity after failures and before recovery,
especially for data status and database synchronicity. System failures should be
accompanied by error messages that: have a unique id, describe what failure
was detected, and confirm that data rollbacks have been completed (if
applicable).

A list of all error messages should be prepared, containing for each error message: unique
id, description, possible cause, suggested resolution, relevant integrity indicators.


2.5.3 Security
In the requirements-gathering phase the focus is on recognizing current and future
threats and developing security measures that counteract these real or implied
threats.

There are many security topics. Include specifications for all issues presented by
the project:
o providing authorized access
o preventing unauthorized access
o protecting data integrity during system functioning, data transmission, storage,
and maintenance
o assuring confidentiality of protected data elements
o protecting functional integrity
Security issues already addressed in the existing environment, which require neither
changes, nor clarification for the sake of the project team, may be omitted from the
Project Requirements Specification document.
2.5.4 Scalability
Scalability is the ability to increase system capacity by upgrading hardware and
software without extensive modifications to the application software. Scalability is
not an issue for standalone applications, but it is for distributed applications. Most
scalability issues are resolved during the design phase and initial implementation.
There are still many issues that should be considered when gathering requirements.
For example, an application's scalability requires a balance between hardware and
software; poor choices regarding scalability in one can adversely affect scalability
in the other.
Key factors in order of increasing impact on scalability are:
o Hardware tuning
o Product tuning
o Code tuning
o Requirements and design

2.5.5 Maintainability
An application should be designed in such a way that it can be maintained and
repaired easily. Maintainability is defined as the ease with which a software system
or component can be modified to correct faults or errors, improve performance or
be adaptable to accept new functionality. Maintainability is a measure of how easy it
is to keep the system functioning. Maintainability requirements define how the
system is to be maintained at an operational level after it has been put into
production.
Maintainability requirements should be concerned only with modifications made
between major releases of an application and should be specified quantitatively.
Specify requirements and issues, as applicable, regarding maintenance of:
o configuration of the application,
o configuration of the users and their permissions,
o data integrity and cleanup,
o supporting infrastructure: software and hardware.

2.5.6 Accessibility
Accessibility requirements focus on making the application flexible enough to
accommodate a wide range of users' needs, including users with disabilities or other
impairments that would otherwise limit their use of the application.
For a web-based application, it is estimated that about 8% of all web users are
disabled, so making a website that is accessibility-enabled can reach many more
users than a non-accessibility-enabled one.

2.5.7 Deployability
Deployment requirements focus on the users. Who are they, and how will they get
the application? How many sites is the application deploying to? Which
users require solutions in different languages?
Users can be broken down into three categories:
o  Standalone Users - users who are not connected to a LAN (Local Area
Network).
o  Remote Access Users - users who gain access to the network through
remote access modems.
o  Local Area Network Users - users who are directly connected to the network.
The important thing to remember is that we need to understand the needs of each of
these users to have a successful deployment. Collect information from the project
teams, staff, and user groups. Surveys and interviews are also good supplements
for determining how best to deploy a solution.

2.5.8 Extensibility
Extensibility is the degree to which an application can be enhanced in the future to
meet changing requirements or goals. Extensibility is also known as adaptability,
changeability, expandability, extendibility, and flexibility.
When creating extensibility requirements it is helpful to have a full understanding
of the environment the application will reside in and to try to predict in which
directions the application may need to evolve in the future.
Extensibility can be tightly coupled with scalability, so extensibility requirements
should include the ability to enhance data, hardware and software components.

2.6 Other Essential Features
Include a description of any additional feature, issue or requirement considered
essential to meeting the objectives of the project that is not addressed elsewhere in
the document.

3.0 ADDITIONAL MATERIALS
The Project Requirements Specification document may optionally include other
texts, charts, diagrams or worksheets, which may be useful for clarification or
illustration.





4. Design Standards

This section defines the design standards for software projects used by the ITD of the
California Secretary of State. During the life cycle of a typical development project there are
key requirements and system constraints that have a significant bearing on the design of the
application. They are:
4.1 Design Goals
o  Maximize the opportunities for code reusability
o  Separate business logic from UI/presentation logic
o  Easily identify the responsibility of each object (what the object represents and what it
must accomplish)
o  Divide development effort along skill lines
4.2 Design Constraints
o  The Microsoft SQL Server database is the current standard for new applications
o  The default platform is an ASP.NET application that runs in a browser
o  Conformance with existing SOS User Interface (UI) guidelines

ASP.NET provides the easiest deployment and security control. Factors that may override the
default and indicate the use of a Windows application are primarily complex user interface
requirements, such as:

o  Long scrolling grids
o  Tab controls, and tabs within tabs
o  Pop-up dialog forms that the user cannot “close” without completing a task, and closing
causes a refresh of the parent form.

Other reasons to choose a Windows application include the need to access or create
local files, and the need to support long processes that would time out a browser session.

4.3 Design Approach
This section covers the approaches for creating project application designs and the resulting
design deliverables. The goal of the design is to show how the system will be realized in the
implementation phase. The application design serves as an abstraction of the source code;
that is, the application design acts as a 'blueprint' of how the source code is structured and
written.
The SOS development staff and contractors/consultants doing development work for SOS
will use SOS IT and industry-standard Object Oriented technologies and concepts during the
design phase of a development project. The main deliverable resulting from the design phase
is a Software Design Document (SDD).
The SDD is used to document the architecture and design of the system. It is used to gather
and communicate all of the architectural decisions made during the design phase. The SDD
provides a comprehensive architectural overview of the system, using a number of different
views to depict different aspects of the solution. It is intended to capture and convey the
significant architectural decisions which have been made regarding the system. Included in
the SDD are the following models:
o Use Case Model
A Use Case Model describes a system’s functional requirements in terms of use cases. A
Use Case Model is a summary of the Use Cases, with the goal of identifying key actors
and key functional requirements. Sample Use Case Models are attached below.

Figures 1 & 2 ~ Use case relationships & Actor inheritance
(Ref: http://en.wikipedia.org/wiki/Use_case_diagram)

o Analysis Model
The analysis model provides an initial component structure for the system, abstracting
away low-level design details. The Analysis Model depicts the translation of application
requirements (use cases) into a high level technical representation of the components
required to build the system. Components used in the Analysis Model include: Boundary
Objects (web pages, external interfaces, web services, etc.), Entity Objects (User,
Account, License, etc.), and Control Objects (Data Access Objects, Validation Objects,
etc.).
The Analysis Model also provides a starting point for identifying high level packages and
the grouping of components into these packages. Further refinement of the packages
occurs during the object modeling phase of the design process.
o Object Model
An object model is a collection of descriptions of classes or interfaces, together with their
state (variables), member functions (methods), and class-static operations. The object
model illustrates the major concepts identified in the Analysis Model. These concepts are
modeled as objects and identify the relationships between the objects.
o Data Model
A data model is a collection of descriptions of data structures (entities), their contained
fields, and field characteristics (attributes).
4.4 Design Process
The first step of design is to partition the system into its major components:

System Components Diagram
(Chapter Numbers refer to “Systems Analysis and Design in a Changing World - 2007”)

The design phase is the process of specifying the system requirements in detail sufficient to
code the application. The level of detail is subjective and varies with each project. This
generally includes:

o  Develop additional use cases to define system function, or user interaction with the
system.
o  Mock up user interfaces and navigation, perhaps creating a functional prototype.
o  Define the reports, perhaps mocking up some sample reports.
o  Define the database schema. This could be the entire schema or just the core data entities
and relations.
o  Define the system components and the external interfaces among those components.
o  Identify common functionality that would otherwise be duplicated in multiple areas of the
code. This includes validation, conversion, error handling, and business rules.
o  Specify the third-party libraries that the system will use.
o Identify hardware and software needs.
o Identify any manual operations.
o Identify documentation of the existing system.
o Prepare data flow diagrams, use cases, class diagrams, and interaction diagrams.

The above artifacts are then documented in the SDD. The document may undergo internal peer
reviews and walkthroughs, and an architecture checkpoint review.

Design features need to be traceable to the requirements that they support. This may be
documented in a “traceability matrix.”

As appropriate, the design document will also address related design activities by answering
the key questions for each activity.



4.5 Design Objectives
4.5.1 Application tiers

Development uses three tiers or layers:

o Presentation
o Business (only if “business rules” are involved.)
o Data Access

The design phase should identify and document the user interface screens that comprise the
presentation layer, the rules that comprise the business layer, and the database interfaces that
define the data access layer.

The presentation layer consists of standard ASP.NET Web forms (.aspx files), documents,
and Windows Forms, etc. This layer works with the results/output of the business logic layer
and transforms the results into something usable and readable by the end user. The
presentation layer should only contain client-side validation (ASP.NET validation controls),
user interface elements, and ASP.NET controls.

However, through controls such as ObjectDataSource, the presentation layer can access data
directly from the data layer when no business transformation is necessary. Additionally, the
.aspx "code behind" may call the data layer directly when no business rules are involved,
such as performing an insert/update call from a "submit" button.

The business layer contains classes which implement business functionality. These classes do
not access data directly (only through the data layer), do not contain inline SQL statements,
and do not display or present data to the user. Their sole concern is the complexity of the
business itself and validating that logic.

Simple web applications, such as a “contact form,” have minimal business rules and may be
implemented without a “business layer.”

The data access layer may be implemented as query and table adapters, object data sources, or
other data controls on a dataset, as appropriate. Use of such controls helps abstract the
database from the client, supporting reusability.

4.5.2 User Interface

To most users, the user interface is the most important part of the system because, to them, it
is the system. A well-designed interface helps to ensure the success and acceptance of a
business application.

UI Design Goals
SIMPLE: Simplicity is a key factor when designing a user interface. If a user interface looks
crowded with controls, then there may be a steeper learning curve for the novice user.
Simplicity means the user interface should allow the user to complete all tasks required
by the program quickly and easily. Also, program flow and execution should be kept in mind
while designing. Try to avoid the use of flashy and unnecessary images that distract the user.

STRUCTURED: Positioning of controls should reflect their importance. For example, if you
are designing an application that has a data-entry form with textboxes, buttons, and radio
buttons, the controls should be positioned in such a way that they are easy to locate and match
the program flow. The last item on the page would then be a submit button to allow the user
to send the form information for processing.

When possible, place all related information and input controls on the same screen. If this is
not possible, consider using tabs or wizards.

CONSISTENT: The user interface should have a consistent look throughout the application.
The key to consistency lies in the design process. Before developing an application, plan
and decide a consistent visual scheme for the application that falls in line with the look and
feel of the department or with the pre-established design of other SOS developed .NET Web
applications.

The above objectives are supported by using:
o Master Pages
o Cascading Style Sheets (CSS)
o Themes and skins

Standard Web Browser
The standard web browser for new intranet applications is Microsoft Internet Explorer 8.
However, any public Internet web applications shall be designed and tested against
different web browsers (e.g., IE6, IE7, IE8, Firefox, etc.).
Standard Web Page Layout Design
The user interface implements a templating mechanism to provide a consistent look and
feel for all pages. Refer to 2.8 for the current SOS Web Publishing Standards. The SOS
web publishing standards can be found at http://webdev.sos.ca.gov/standards/. The look
and feel of an Internet .Net web application needs to be consistent with the current SOS
websites.
The UI template consists of the following components:
o a common banner
o a navigation menu
o a content body
o a footer
The content body is a template which consists of:
o a page title
o an optional page navigation bar
o a data section

Getting users to accept an interface requires that the users’ needs and workflow be given
primary consideration during its design and development. However, do not compromise
functionality when meeting these design standards.

A graphic representation or page mockups of the application should be created. The goal of
this deliverable is to gauge user acceptance of the proposed graphical layout, information
architecture, and functionality of the site as well as to refine the design of the system. The UI
design can include screenshots of forms.

The design should also capture the navigation and flow.

4.5.3 Database Design

Create the core tables using the classes identified in the class diagrams of the data modeling
section. Identify and create lookup and other supporting tables from the user interface
diagrams.

SQL Server and other database platforms include diagramming tools that can document the
schema directly from the database as it is created.
4.6  Data Validation
Data validation is accomplished by performing two separate validation processes. The first
validation process is directed at maintaining data integrity by performing data type
validation. The second validation process is directed at maintaining data integrity by
performing business rule validation. Data is owned and maintained by the end user.
Ongoing procedural operations to monitor data integrity, such as reports, are another
validation component.

4.6.1 Data Type Validation
Data type validation is the process of ensuring that the correct data types are entered in all
text box entries. Data type validation occurs whenever the Save, Next, or Back buttons are
pressed. Data type validation checks for the following:
o Numeric fields contain numeric data only
o Date fields contain valid dates
o  “Block” data is complete (all address information is complete; all phone information is
complete; etc.)
o Correct data lengths are entered
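As a hedged illustration of the client-side checks above, a minimal sketch using standard ASP.NET validation controls; the control IDs and regular expression are hypothetical, chosen to follow the control-type naming convention in section 5.1.1:

```aspx
<%-- Illustrative only: verifies that a numeric field contains numeric
     data of the correct length before Save/Next/Back postbacks. --%>
<asp:TextBox ID="TextBox_ZipCode" runat="server" />
<asp:RequiredFieldValidator ID="RequiredFieldValidator_ZipCode" runat="server"
    ControlToValidate="TextBox_ZipCode"
    ErrorMessage="Zip code is required." />
<asp:RegularExpressionValidator ID="RegularExpressionValidator_ZipCode" runat="server"
    ControlToValidate="TextBox_ZipCode"
    ValidationExpression="^\d{5}$"
    ErrorMessage="Zip code must be exactly 5 numeric digits." />
```

Server-side button handlers should still check Page.IsValid before acting on the postback (see section 5.2).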

4.6.2  Business Rule Validation
Business rule validation is the process of ensuring that all business rules have been met when
data is persisted. Business rule validation occurs on the server, whereas data type validation
occurs on the client.
Enforcement of business rules often requires additional data which may or may not be
available on the client. Performing business rule validation on the server enables the
validation process to retrieve the additional data more conveniently, and more importantly,
allows business rules to be enforced even when JavaScript is disabled on the client.
Business rule validation is the responsibility of a business delegate class, either directly, or
through use of a helper/validator class working on behalf of the business delegate.
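The helper/validator pattern described above can be sketched as follows; the entity, class, and rule names are hypothetical (the real business delegates and rules are project-specific):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical entity used only for illustration.
public class License
{
    public DateTime EffectiveDate { get; set; }
    public DateTime ExpirationDate { get; set; }
}

// Illustrative only: a server-side validator invoked by a business
// delegate before data is persisted. Rules run even when JavaScript
// is disabled on the client.
public class LicenseValidator
{
    public List<string> Errors { get; private set; }

    public LicenseValidator()
    {
        Errors = new List<string>();
    }

    public bool Validate(License license)
    {
        Errors.Clear();
        if (license.ExpirationDate <= license.EffectiveDate)
            Errors.Add("Expiration date must follow the effective date.");
        // Additional data not available on the client can be retrieved
        // here through the data access layer.
        return Errors.Count == 0;
    }
}
```

The business delegate would call Validate before its save operation and surface any Errors back to the presentation layer.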

5. Coding Standards
5.1 ASP.NET Coding Standards
This document defines the Web Application coding standards for the California Secretary of
State, Information Technology Division. This document focuses on Microsoft’s .Net
Framework (C# emphasis) and SQL Server Database. These two elements compose the
preferred base platform for current and future projects.
Coding standards will help accomplish the goal of increasing the maintainability of existing
and new applications. The goal of this document is to provide a development environment
that allows team members to easily learn and maintain existing applications and to rapidly
develop new applications to meet the increasing needs of this division’s customers.
The ability to easily maintain existing applications will decrease the resources required for
that purpose and allow those same resources to be better utilized in meeting the agency’s
needs in the form of new development.
5.1.1 Naming Convention
Use Microsoft’s Visual Studio default naming convention. 
Example: Label_HelloWorld or DropDownList_PickCountyName
Justification:
The desired control can be isolated in Intellisense by typing the control type, e.g. “TextBox_” 
brings up all of the textbox elements on the form. 
Use Pascal casing for Class names
public class HelloWorld
{ 
... 
} 
Use Pascal casing for Method names
void SayHello(string name)
{ 
... 
} 
Use Camel casing for variables and method parameters
int totalCount = 0;
void SayHello(string name)
{
    // StringBuilder exposes Append, not Add
    Label_Name.Text = new StringBuilder("Hello ").Append(name).ToString();
    Label_Name2.Text = "#" + totalCount + " names received.";
    totalCount++;  // camelCase matches the declared variable name
    ... 
}



5.1.2 Database Access

The database is accessed exclusively via stored procedures (no embedded SQL).

Justification:
Design: The stored procedure names are descriptive of the business function, whereas an 
embedded query usually is not. 

Maintenance: DB schema changes can be made without impacting the application. 

Security: external access rights are granted only to execute stored procedures, and access 
exclusively via procedures is a key means of protecting against SQL injection attacks. 


5.1.3 Error Handling

All expected exceptions are caught locally. If the user can work around the problem, display
the error to the user.

All errors and exceptions are identified by a unique 4-digit number. 

Justification: 
This makes it easy to pinpoint the source of an error via search. 

Unexpected exceptions are trapped globally in global.asax. 

All global exceptions are logged to the Windows Event Log. 

All local exceptions are either converted to an error message on the screen, or logged to the 
Windows Event Log (appropriate if the user cannot work around the error, or if the error may 
result in a Trouble Report [TR] being issued) or both.

Justification:
Sorting the Event Log alphabetically groups all of the errors thrown by our web applications.
The following example function shows the option of displaying the error to the user.

public static void LogException(int errNum, string pubMsg,
    string privMsg, Exception ex, bool showMsgBox)
{
    if (showMsgBox)
        System.Windows.Forms.MessageBox.Show(
            "ERROR # " + errNum.ToString() + "\n\n" + pubMsg +
            "\n\nPlease Notify ITD", "Safe at Home");

    UtilQueriesTableAdapter ta = new UtilQueriesTableAdapter();
    ta.LogError(UsersAndRoles.GetShortUsername(), errNum, 0, 0,
        pubMsg, 0, ex.ToString(), privMsg);
}


5.1.4 Configuration

Recommend breaking the web.config file into 3 separate files:

<configuration>
<connectionStrings configSource="ConnectionStringsDev.xml"/>
<appSettings configSource="AppSettingsDev.xml"/>
<system.web>


Justification:



Web.config and app settings changes can be deployed to production without having to
know the connection string for the production server. In this way Infrastructure can be
responsible for maintaining the connection string information.
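As a hedged example of the configSource split above, an external connection-strings file might look like the following; the server and database names are placeholders, not actual SOS values:

```xml
<!-- ConnectionStringsDev.xml: referenced by configSource in web.config.
     A production copy containing the production connection string can be
     swapped in by Infrastructure at deployment time. -->
<connectionStrings>
  <add name="ExampleDb"
       connectionString="Data Source=DEVSQL01;Initial Catalog=ExampleDb;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```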
5.1.5 Style
Use themes to set the layout of components like GridViews. Themes are found under the
/App_Themes folder, and the name of the subfolder defines the name of the theme:
/App_Themes/Theme1/GridView.skin
The theme is specified in the web.config file:
. . . 
<pages theme="Theme1"/>
</system.web> 
</configuration> 
Styles for DetailsView, GridView, and FormView should not be hard-coded within the
control.
Justification:
Themes provide a global means of controlling style, and are easily reused by other web
applications.
Use style sheets to control the look and feel of the pages. Never specify font names and
font sizes in the pages themselves; use an appropriate style class instead.
Justification:
This makes it easy to change the UI of the application in the future.
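A minimal sketch of a GridView skin file (/App_Themes/Theme1/GridView.skin); the CSS class names are illustrative and would come from the application's style sheet:

```aspx
<%-- Illustrative only: the style is set once in the theme instead of
     being hard-coded on each GridView control. --%>
<asp:GridView runat="server"
    CssClass="GridStandard"
    GridLines="None"
    HeaderStyle-CssClass="GridHeader"
    AlternatingRowStyle-CssClass="GridAltRow" />
```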
5.1.6 MasterPage
Use a masterpage to define the applications’ formatting and layout. ITD’s Intranet ASP.NET
2.0 uses a common design:
The navigation menu content is controlled by a web.sitemap file.

Note: Labels on the masterpage can serve as a cross-page “viewstate.” The controls can be
accessed from code behind:

// Hide the Welcome label from the Create page
Label Label_Welcome2 = (Label)Master.FindControl("Label_Welcome");
if (Label_Welcome2 != null)
    Label_Welcome2.Visible = false; 

5.2 Architecture

Use the Infrastructure Guidelines for reference. Based on the requirements, special user
needs, and data sensitivity, an architecture design review needs to be conducted with the ITD
Architect.

Here are some general rules:

o Implement pages with multiple functions as a component (.ascx) for each function.

Justification:
Complexity: This reduces the complexity that results when several functions are 
implemented on the same form. 

Maintainability: Maintenance can be isolated to the affected component.

Reuse: Components allow for reuse of functionality on other web pages, perhaps just
tweaked by a different property setting.
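As an illustration of the component approach, a page can register and host a user control; the file name, tag prefix, and control names below are hypothetical:

```aspx
<%-- Illustrative only: one function of a multi-function page factored
     into a reusable .ascx component. --%>
<%@ Register Src="~/Controls/PersonSearch.ascx"
    TagPrefix="sos" TagName="PersonSearch" %>

<sos:PersonSearch ID="PersonSearch_Main" runat="server" />
```

Behavior can be tweaked per page through public properties exposed by the component's code-behind.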


o  Use constants to specify the name portion of query string parameters, and read them in
this manner:

private const string QS_REPORT_FORMAT = "ReportFormat";

string name = Request.QueryString[QS_REPORT_FORMAT];

if (!String.IsNullOrEmpty(name))
hiddenReportFormatCode.Value = name;

o  Check Page.IsValid before processing button events from forms that include a
validation control.

Justification: 
This blocks execution if the post bypasses the JavaScript client validations, which could 
indicate a malicious attack. 

protected void bnCreateNewUser_Click(object sender, EventArgs e)
{
if (Page.IsValid)
{ 
if (InsertNewUser() == 0)
Response.Redirect(ADMIN_LIST_URL);
}
} 

o WinForm versus WebForm
By default all intranet applications will be WebForm. Use of WinForm will be justified on
an as-needed basis; there are otherwise no restrictions on using WinForm when
appropriate.

5.3 Security

As a generic example, place functions to be restricted to a specific user role within a sub-
folder, and control authorization via a web.config file in that folder. Security approaches
will be dictated by system requirements.

<?xml version="1.0" encoding="utf-8"?> 
<configuration
xmlns="http://schemas.microsoft.com/.NetConfiguration/v2.0"> 
<system.web>
<authorization> 
<allow roles="staff" />
<deny users="*" />
</authorization> 
</system.web> 
</configuration> 

Justification: 
This makes the app simpler to understand, enhance and maintain. 

Windows Authentication

By default all intranet applications use Windows Authentication. This allows authorized
users to run the Web applications without ever logging in, or having to remember an
additional password.

5.4 Reporting
Informal reports can be generated using standard .NET controls like GridView, with an export
option for users to export the data into an Excel or Word document.

Depending on report volume and accessibility needs, small numbers of formal reports/forms
can be created using Dynamic PDF.

Currently, Reporting Services local reports are used in several projects but are not
recommended at this time due to compatibility issues between SQL Server 2008 and Visual
Studio 2008.

Complex reporting may be implemented as a separate "Reporting Services" project. Such a
report is made to appear as part of the web application by embedding it within an
HTML control.

To embed a Reporting Services local report or server report in the project, the Report
Viewer control must be downloaded and installed.

See Appendix #3 for reporting tools references. Choice of specific reporting tools should be
based on specific needs of the users and application requirements.

5.5 Microsoft SQL Server Coding Standards
5.5.1 General Rules

o Access should be through stored procedures
o Execute permissions to stored procedures should only be granted to SQL Server database
roles
o User accounts should be added to the SQL Server database roles
o Authentication should be through Windows accounts
o All primary and foreign key columns should generally be indexed
o Queries should use ANSI join syntax
o Try to access resources (e.g., tables, views, stored procedures, etc.) in the same order
every time
o Avoid interleaving DDL and DML in stored procedures
o Pre-grow databases and logs to avoid the performance impact of automatic growth and
fragmentation
o Minimize cursor use
o Tracking data changes 
x EditedBy - nvarchar(100) 
x EditedOn - datetime -- default getdate() 
x Deleted - bit 
o  Don't delete rows directly; the delete stored procedure should set Deleted = 1. If the
row must actually be deleted, do it in the trigger
o  Update the history table by a trigger

The general rule is that only raw data should be stored in the database (e.g., 123121234 not 123-
12-1234 for a social security number.)
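The soft-delete and change-tracking rules above can be sketched as follows; the table, column, and procedure names are illustrative, not taken from an actual SOS schema:

```sql
-- Illustrative only: the "delete" stored procedure flags the row rather
-- than removing it, and stamps the tracking columns.
CREATE PROCEDURE [dbo].[usp_DeletePerson]
    @PersonID int,
    @UserName nvarchar(100)
AS
BEGIN
    UPDATE [Person]
    SET [Deleted]  = 1,
        [EditedBy] = @UserName,
        [EditedOn] = GETDATE()
    WHERE [PersonID] = @PersonID;
    -- A trigger on [Person] would copy the prior row image to the
    -- history table, and could perform a physical delete if required.
END
```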

5.5.2 Naming Convention

o  Table and column names should be singular (Person, Employee, Processing Log, etc.)
o  Names may include spaces, underscores and other allowed characters.
o  Tables may use primary or surrogate primary keys based on design requirements.
o  Avoid abbreviations.
o  Avoid acronyms.
o  Keep database object names short, simple, meaningful, and readable.
o  Use database 'System Tables' and 'System Views' to obtain relevant information instead
of using name prefixes.
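
The last rule above can be illustrated with a query against the catalog views. This is a
minimal sketch, tied to no particular schema; it lists user tables and their columns without
any reliance on name prefixes (such as "tbl"):

```sql
-- Obtain object metadata from the catalog views rather than
-- encoding object type or ownership into a name prefix.
SELECT t.name  AS TableName,
       c.name  AS ColumnName,
       ty.name AS DataType
FROM sys.tables  t
JOIN sys.columns c  ON c.object_id     = t.object_id
JOIN sys.types   ty ON ty.user_type_id = c.user_type_id
ORDER BY t.name, c.column_id;
```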








5.5.3 Security

o Access should be through stored procedures.
o Create roles to control execute rights to all stored procedures. Package appropriate
execute rights for stored procedures into roles based on role types. Execute permissions
on stored procedures should only be granted to SQL Server database roles.
o User accounts should be added to the SQL Server database roles.
o Authentication should be through Windows accounts.
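
A minimal sketch of these security rules follows. The role, procedure, domain, and account
names are hypothetical:

```sql
-- Create a database role for one role type.
CREATE ROLE [db_CaseWorker];
GO

-- Grant execute only on the stored procedures this role type needs;
-- no permissions are granted directly on tables or to individual users.
GRANT EXECUTE ON [dbo].[usp_GetCase]    TO [db_CaseWorker];
GRANT EXECUTE ON [dbo].[usp_UpdateCase] TO [db_CaseWorker];
GO

-- Map a Windows-authenticated login to a database user,
-- then add the user to the role.
CREATE USER [DOMAIN\jdoe] FOR LOGIN [DOMAIN\jdoe];
EXEC sp_addrolemember 'db_CaseWorker', 'DOMAIN\jdoe';
GO
```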





5.5.4 Exception Handling

If the application is a WinForms application that cannot log exceptions to the Windows Event
Log, the database should have an ErrorLog table (see “SafeAtHome” as an example).

Errors should be caught and logged directly from the database stored procedures. Pass the
“username” in to stored procedures that will trap and log exceptions or other errors, so that
the username can be captured in the log.

BEGIN TRY

    INSERT INTO [CourtActions]
    (
        [CaseUID], . . .
    )
    VALUES
    (
        @CaseUID, . . .
    )

END TRY

BEGIN CATCH
    -- Log the exception, capturing the caller's username, the key values
    -- involved, and the error text.
    DECLARE @Comment varchar(500)
    SET @Comment = 'Login:' + @UserName +
        ' CaseUID:' + CONVERT(varchar(50), @CaseUID) +
        ' CourtName:' + @CourtName +
        ' Error:' + ERROR_MESSAGE()
    EXECUTE [dbo].[usp_LogError] @Comment; -- See SafeAtHome
    SET @ReturnValue = 0
END CATCH
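
The usp_LogError procedure itself lives in SafeAtHome and is not reproduced in this
standard. The sketch below is a hypothetical illustration of an ErrorLog table and logging
procedure consistent with the guidance above; column names and sizes are assumptions:

```sql
-- Hypothetical ErrorLog table.
CREATE TABLE [ErrorLog]
(
    [ErrorLogUID]  int IDENTITY(1,1) PRIMARY KEY,
    [Comment]      varchar(500)   NOT NULL,
    [ErrorMessage] nvarchar(4000) NULL,
    [LoggedOn]     datetime       NOT NULL DEFAULT GETDATE()
);
GO

-- Hypothetical logging procedure. ERROR_MESSAGE() remains in scope
-- when this procedure is executed from within a CATCH block.
CREATE PROCEDURE [dbo].[usp_LogError]
    @Comment varchar(500)
AS
BEGIN
    INSERT INTO [ErrorLog] ([Comment], [ErrorMessage])
    VALUES (@Comment, ERROR_MESSAGE());
END
GO
```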







6.  Testing Standards
6.1  Testing Objectives

The basic objective of testing application software is to gain a reliable understanding of
how the software actually works. This is achieved by executing it under controlled conditions
and documenting the actual results alongside the expected results.

Specialized objectives for testing application software vary depending on context:

o Application Development.
   x To verify that software meets design specifications.
   x To verify that software fulfills functional requirements (business requirements).
   x To assist in creating code, enabling coders to compare actual vs. intended results.
o Application Acceptance.
   x To provide an objective, measurable, documented method for assessing whether
     software should be accepted and approved for promotion from development to
     production status.
o Application Support.
   x To debug software.
   x To enable programmers to learn how an unexpected or unwanted result is being
     produced.
   x To assist in revising code to produce the desired result.
o Application Documentation.
   x To establish a documented baseline of the functionality of existing software.
   x To assist in decisions about whether to change, replace, or purchase software.
   x To assist in creating procedures, user guides, performance baselines, etc.

Best practices are recommended for all contexts and forms of testing. This set of standards
and templates should be followed in situations where the objective of testing is:
o To verify functionality (positive and negative testing).
o To assess whether acceptance criteria have been met.

6.2  Types of Testing

To obtain reliable and objective understanding about software functionality, it must be tested
in controlled conditions, and the results must be documented in a systematic way.

6.2.1  There are two basic types of testing applicable to all application software: positive
testing and negative testing.

o Positive Testing. Testing designed to verify that the software provides the expected,
  designed, required functionality: the software provides the desired processing, produces
  the desired outputs, and meets the business needs. Also called “Happy Path” testing, to
  indicate that all intended inputs are processed as expected/desired.




o Negative Testing. Testing designed to verify that the software can handle invalid and
  unexpected inputs and combinations of execution conditions. When the software is
  subjected to invalid data or unexpected conditions:
   x It should detect invalid inputs and the known range of common risk conditions.
   x It should produce useful warning or explanatory messages.
   x It should not fail.
   x It should bring processing to a controlled stop rather than proceeding to an
     uncontrolled failure.
   x It should not produce invalid, inaccurate, or partial results.
   x It should roll back partial updates when a transaction cannot be fully completed.
   x It should provide the user with an option to bypass problematic processing, if
     possible.

Every test plan should include both positive and negative testing.


Each business process should undergo both positive and negative testing, especially for
input and output processing, where most negative risks tend to occur. Ideally, the software
will be designed, coded, and required to handle the known range of common risks, as well
as provide functionality to meet business needs, and both positive and negative conditions
will be tested thoroughly.

General types of application software testing include:

6.2.2 Development Testing

o Unit Testing. Tests software units: programs, routines, modules. Emphases are on data
  design, data interfaces, and graphical interfaces.
   x Data Design: adequate data elements and formats to facilitate the business processes;
     filters for missing data and invalid formats and values.
   x Data Interfaces: reliable input and output connectivity, data integrity features,
     detection and handling of potential corruption or failure of input and output
     processes.
   x Graphical Interfaces: input screens, forms, windows, and pages collect data
     efficiently, intuitively, and correctly, provide useful messages, and handle abnormal
     conditions before failure occurs.
o Integration Testing. Tests combinations of programs, routines, and modules that must be
  executed serially or concurrently to complete a business transaction, produce
  synchronized outputs, navigate around the application, or complete any multi-step
  process.
o System Testing. Tests integration of all units and multi-step processes.
o Stress Testing. Tests performance of the application under high-volume conditions:
  concurrent users, data inputs, data transmission, etc.

6.2.3 Acceptance Testing

o Positive Testing. Verify required functionality is provided.
o Negative Testing. Verify that unexpected conditions do not cause failure. Verify that
  unintended, unwanted results do not occur.




o Sampling. Writing representative test cases to reveal whether requirements have been
  met.
o Acceptance Criteria. Objective, measurable criteria to evaluate test results, to provide
  a basis for making the decision to accept or reject the software.

6.2.4 Baseline Testing

o Design test cases to document the actual functionality of existing software.

6.2.5 Debugging

o Locate the source(s) of unwanted results.
o Rule out non-contributing factors.

6.2.6 Change Control Testing

o Positive Testing. Verify required functionality is provided.
o Negative Testing. Verify that unexpected conditions do not cause failure. Verify that
  unintended, unwanted results do not occur.
o Regression Testing. Verify that desired pre-existing functionality is not damaged or
  lost. Verify that unintended consequences do not result from program changes.

6.2.7 Beta Testing

“Beta Testing” refers to production use by a small group of users. “Beta release” is perhaps a
more accurate term than “beta testing”. It can be a good strategy, especially in larger
deployments. Beta releases are not covered by this standard for application software testing.

6.3 Testing Strategy
6.3.1 Purpose and Organization
The test strategy describes the testing objectives, the types of software testing that will
be conducted, and the overall organization of testing scenarios and test cases. For
larger projects, the testing strategy must take into account testing resources such as tester
availability and time frames.
Test scenarios may be arranged into different stages and groupings, depending on project
scale. For acceptance testing of a large project, test scenarios may be grouped by
application functionality, guided by menu options, for example. It might be useful to place
negative test cases in a group completely separate from the positive “happy path” cases. Or
it might be more useful to combine positive and negative cases for each business function, in
which case business functions would be grouped separately.
It is important for the project team to understand who will approve the results of each testing
stage and how the approval will take place. This affects testing strategy and should be
incorporated into the test plan. It also affects the procedures for handling test rejects.
The Testing Strategy guides the creation of test cases and the Test Plan.



6.3.2 Test Cases

The standard format for a Test Case consists of an unambiguous description of these
controlled components:
o Test Number. The test number uniquely identifies the test case. It should be entered
  into the appropriate cell of the Requirements Traceability Matrix.
o Test Conditions:
   x Software: names/numbers of the programs/modules and screens/windows/pages that will
     execute.
   x Pre-existing data: description of the state and values of pre-existing test data. A
     baseline backup should be kept so test conditions can be refreshed for re-testing.
o Inputs: Test data. Each test case should have its own test data.
o Action: Test procedures that activate the execution of the software.
o Desired Results: There are two types of objective, measurable results: data and
  presentation.
   x Data results include files output and rows added, deleted, or changed. Some data
     results do not have a graphical user interface at the unit level (such as CICS
     “commareas”). These kinds of data results can be tested either by using developer
     testing tools at the unit level, or by using application user interfaces at the
     integration or system levels.
   x Presentation results include messages, displays, and reports.
o Actual Results. Actual data and presentation results should be recorded with the same
  level of detail as the Desired Results.

See 6.7.1 TEMPLATE: Standard Test Case

6.3.3 Control and Disambiguation

Attention to detail is essential to achieve the control and disambiguation needed for
successful testing. Disambiguation is more than mere clarity; it means identifying important
distinctions, and revealing the structure of key processes.

6.3.4 Acceptance Criteria

Acceptance criteria must be negotiated and agreed to by the sponsor/user and the IT team.
Objective criteria are essential. Preparation of a thorough test plan, based on the testing
standards and using the Standard Test Case format, will facilitate objective acceptance
criteria. The accepter (sponsor or user) may specify that all test cases must pass before
the software will be accepted. A less rigorous approach might include accepting the software
“as is” for production release, with conditions such as fixing the software responsible for
certain failed test cases and releasing a patch by a later, specified date.

6.3.5 Test Plans




The Test Plan organizes test cases into strategic stages and groups, documents each test case,
and schedules test resources (testing staff, test stages, testing dates, locations, support staff,
support tasks).

Larger projects will require more extensive, formal and structured test plans. Test cases
should be arranged into test suites, grouped by functionality, or user type, or other grouping
appropriate for the project.

Smaller assignments will need fewer test cases, but no less attention to completeness and
attention to detail.

See 6.7.2 TEMPLATE: Standard Test Plan

6.4 Test Environment

6.4.1 Separate and Congruent Software Environment

Ideally, the test environment will be separate from both the development environment and the
production environment. This separation helps provide the control needed to protect the
production environment and to preserve the carefully monitored conditions of the ideal test
environment. The larger the scale of the project, the greater the importance of the risk
avoidance that may be achieved by separating these environments.

Separation of test and production environments is only as useful as the congruence between
them. The test environment cannot actually “mirror” production, because the test
environment must contain new software and test data. Conditions are controlled only to the
extent that the Testing Team (or the developers, if there is no Testing Team) understands
what is the same and what is different.


6.4.2 Test Data

Manufacture, or select from live data, sufficient test data to supply:
o all positive, “happy path” test cases
o all negative test cases (unexpected conditions)
o stress testing

Analysis of live data is an essential part of preparing test data. It aids preparation of 
representative test data. However, even after careful analysis, live data often reveals 
surprises – unexpected data conditions. 

Since live data often contains more attributes than a development team can discover, it is
important to select randomly from live data sufficient test data to ensure the test cases
are exposed to:
o unexpected data conditions
o unexpected volumes




The full set of live data is still the best “sample size” for thoroughly testing how software
handles volumes and unexpected data.
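
Random selection from live data can be sketched in T-SQL. The table names and sample sizes
below are hypothetical:

```sql
-- Copy an approximate 1% random sample of live rows into a test table.
-- TABLESAMPLE is fast but samples whole data pages, so it is approximate.
SELECT *
INTO [TestData_Applications]
FROM [Applications] TABLESAMPLE (1 PERCENT);

-- Row-level alternative: a fixed-size, truly random sample,
-- at the cost of sorting the full table.
SELECT TOP (1000) *
FROM [Applications]
ORDER BY NEWID();
```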

6.4.3 Testing Procedures

Procedures are needed for several testing activities.

o Test Cases. The interface actions (key presses, mouse clicks, etc.) must be included in
  each test case.
o Re-testing. Procedures for reporting, handling, and tracking failed test cases must be
  prepared. For a large project, a special development recycling process, dedicated to the
  project, might be needed to handle failed test cases, distinguish them from change
  requests, return them to development, and then route them to subsequent re-testing.

6.5 Requirements Traceability Matrix
The Requirements Traceability Matrix is an essential resource for creating the test plan for a
software development project. The use cases developed during requirements definition
and/or design must serve as the basis for test cases. The matrix links each requirement
specification with subsequent design specs, test cases, and test results.
Requirements Traceability Matrix, sample

Requirement      Design              Test Case              Test Result   Issue
Specification    Specification

1.1 Certificate  Certificate Layout  A.1.7 batch print      A.1.7
                                     A.1.8 online print     A.1.10
                                     B.3.6 stored image     not tested
                                     B.3.7 retrieved image  D.3.6

2.1.1 UseCase    program pg0409      F.2.03 Customer        F.2.03.01-17
Submit app       screen sc0412       submits application    accepted

2.1.2 UseCase
2.2.1 UseCase

2.3.1 UseCase    program pg0502      F.3.08 Reviewer        F.3.08.01-06  #107 Clarify
                 screen sc0502       rejects application                  rejection
                                                                          criteria.
                                                                          Resolved.
2.4.1 UseCase
2.4.2 UseCase




* These sample entries are completely fictional. Any numbering scheme may be used,
provided it is well-organized and each entry is uniquely identified.

6.6 Automated Test Execution

Mainframe software frameworks have some automated testing products.

A testing framework for client-server software, Microsoft “Team Edition”, is installed
in this shop.

Several validation software products have been custom developed in-shop, targeting data
and functionality for specific application software.

There are some testing products for web software, especially for ADA compliance, such as
“Bobby”.

Microsoft Visual Studio 2008 has an application testing capability for W3C compliance.

FANG in Firefox converts a page to text format, so a screen-reader view of the page can
be previewed.

6.7 Templates

6.7.1 Standard Test Case

The standard format for Test Cases is very similar to the standard format for Use Cases,
developed during requirements definition and/or design activities.

Standard Test Case, sample

Column               Contents                             Supplied by

Test Case Number     (unique number)                      Test Preparer
Environment          Software; Pre-existing Data          Test Preparer
Inputs               Test Data; Test Procedures           Test Preparer
Desired Results      Data Results; Presentation Results   Test Preparer
Actual Results       Data Results; Presentation Results   Tester
Disposition & Notes  (pass/fail); (accept/reject)         Accepter


6.7.2 Standard Test Plan Template





The following template should be used to draft an Application Software Test Plan. The scale
and composition of a test plan should be determined by the scale of the project. Elements
that fit the scale and circumstances of a given project or assignment should be included in the