
Internet History
--

One Page Summary


The conceptual foundation for creation of the Internet was largely created by three
individuals and a research conference, each of which changed the way we thought about
technology by accurately predicting its future:

Vannevar Bush wrote the first visionary description of the potential uses for
information technology with his description of the "memex" automated library system.

Norbert Wiener invented the field of Cybernetics, inspiring future researchers to
focus on the use of technology to extend human capabilities.

The 1956 Dartmouth Artificial Intelligence conference crystallized the concept that
technology was improving at an exponential rate, and provided the first serious
consideration of the consequences.

Marshall McLuhan made the idea of a global village interconnected by an electronic
nervous system part of our popular culture.

In 1957, the Soviet Union launched the first satellite, Sputnik I, triggering US President
Dwight Eisenhower to create the ARPA agency to regain the technological lead in the arms
race. ARPA appointed J.C.R. Licklider to head the new IPTO organization with a mandate to
further the research of the SAGE program and help protect the US against a space-based
nuclear attack. Licklider evangelized within the IPTO about the potential benefits of a
country-wide communications network, influencing his successors to hire Lawrence Roberts
to implement his vision.

Roberts led development of the network, based on the new idea of packet switching
invented by Paul Baran at RAND, and a few years later by Donald Davies at the UK National
Physical Laboratory. A special computer called an Interface Message Processor was
developed to realize the design, and the ARPANET went live in early October, 1969. The first
communications were between Leonard Kleinrock's research center at the University of
California at Los Angeles, and Douglas Engelbart's center at the Stanford Research Institute.

The first networking protocol used on the ARPANET was the Network Control Program.
In 1983, it was replaced with the TCP/IP protocol invented by Robert Kahn, Vinton Cerf, and
others, which quickly became the most widely used network protocol in the world.

In 1990, the ARPANET was retired and its role transferred to the NSFNET. The NSFNET was
soon connected to the CSNET, which linked universities around North America, and then to
the EUnet, which connected research facilities in Europe. Thanks in part to the NSF's
enlightened management, and fueled by the popularity of the web, the use of the Internet
exploded after 1990, causing the US Government to transfer management to independent
organizations starting in 1995.




Project objectives can be formulated as S.M.A.R.T.:

o Specific
o Measurable
o Achievable
o Realistic
o Time-bounded

Approaches, Styles, Or Philosophies

Formal methods

Formal methods are mathematically-based techniques for the specification, development,
and verification of software and hardware systems. Due to the high cost of using formal
methods, they are usually only used in the development of high-integrity systems, where
safety or security is important.

Programming paradigm

A programming paradigm is a fundamental style of programming regarding how
solutions to problems are to be formulated in a programming language. For example:
Aspect-oriented, Rule-based, Functional decomposition, Procedural, and Object-oriented.
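
To make the contrast concrete, here is a minimal illustrative sketch in Python (the
function and class names are invented for this example) of the same small problem solved
in a procedural style and in an object-oriented style:

    # Procedural style: data and behavior are kept separate; the program
    # is a sequence of function calls operating on plain data.
    def total_area(rectangles):
        total = 0.0
        for width, height in rectangles:
            total += width * height
        return total

    # Object-oriented style: data and behavior are bundled into objects.
    class Rectangle:
        def __init__(self, width, height):
            self.width = width
            self.height = height

        def area(self):
            return self.width * self.height

    rects = [(2, 3), (4, 5)]
    print(total_area(rects))                              # procedural: 26.0
    print(sum(Rectangle(w, h).area() for w, h in rects))  # OO: 26

Most paradigms differ along exactly this kind of line: where the data lives and how
behavior is attached to it.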

Methodology

Methodology is a style of solving specific software engineering problems; a codified set
of practices (sometimes accompanied by training materials, formal educational
programs, worksheets, and diagramming tools) that may be repeatably carried out to
produce software.

Anti-patterns

Anti-patterns are specific repeated practices that appear initially to be beneficial, but
ultimately result in bad consequences that outweigh the hoped-for advantages.



Top-down and bottom-up. Top-down and bottom-up are strategies of information
processing and knowledge ordering, mostly involving software, but also other humanistic and
scientific theories (see systemics). In practice, they can be seen as a style of thinking and
teaching. In many cases top-down is used as a synonym of analysis or decomposition, and
bottom-up of synthesis.
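
As an illustrative sketch (all names here are invented), a top-down design starts from the
overall goal and decomposes it into stubs that are refined later, while a bottom-up design
builds and tests the small concrete pieces first and then composes them:

    # Top-down: start from the overall goal and decompose it into smaller
    # steps, which are stubbed out and refined later.
    def generate_report(records):
        cleaned = clean(records)      # refine later
        stats = summarize(cleaned)    # refine later
        return format_output(stats)   # refine later

    # Bottom-up: build and test the small, concrete pieces first...
    def clean(records):
        return [r.strip() for r in records if r.strip()]

    def summarize(records):
        return {"count": len(records)}

    def format_output(stats):
        return f"{stats['count']} records processed"

    # ...then compose them into the larger whole.
    print(generate_report(["  a  ", "", "b"]))  # -> "2 records processed"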



Design by Contract (DBC). The central idea of DBC is a metaphor on how elements of a
software system collaborate with each other, on the basis of mutual obligations and
benefits; for each element one asks (see the sketch after this list):

o What does it expect?
o What does it guarantee?
o What does it maintain?
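
A minimal illustrative sketch of these three questions in Python, using plain assertions
(the account example and its names are invented; languages built around DBC, such as
Eiffel, have dedicated syntax for this):

    class Account:
        """A bank account with a simple contract enforced by assertions."""

        def __init__(self, balance=0):
            self.balance = balance
            assert self._invariant()  # what the class maintains

        def _invariant(self):
            # Invariant: the balance never goes negative.
            return self.balance >= 0

        def withdraw(self, amount):
            # Precondition: what the method expects from its caller.
            assert 0 < amount <= self.balance, "invalid withdrawal"
            old_balance = self.balance
            self.balance -= amount
            # Postcondition: what the method guarantees to its caller.
            assert self.balance == old_balance - amount
            assert self._invariant()
            return amount

    acct = Account(100)
    acct.withdraw(30)
    print(acct.balance)  # 70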



Test-Driven Development (TDD). Test-Driven Development is a software development
technique consisting of short iterations where new test cases covering the desired improvement
or new functionality are written first, then the production code necessary to pass the tests is
implemented, and finally the software is refactored to accommodate the changes (see the
sketch after this list).

o Requires test automation
o "Keep It Simple, Stupid" (KISS) principle
o "You Ain't Gonna Need It" (YAGNI) principle
o "Fake it, till you make it" principle
o Requires techniques, such as discussion and code reviews for example, to ensure not
only that the correct features are implemented, but that the resulting code uses
acceptable styles
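
A minimal test-first sketch using Python's standard unittest module (the slugify function
is an invented example). In TDD the test class below would be written first and would fail;
the production function is then the simplest code that makes it pass, refactored afterwards:

    import unittest

    # Step 2: the simplest production code that makes the tests pass.
    # (Written only after the tests below existed and failed.)
    def slugify(title):
        return title.strip().lower().replace(" ", "-")

    # Step 1: the test cases, written first to pin down the desired behavior.
    class TestSlugify(unittest.TestCase):
        def test_lowercases_and_hyphenates(self):
            self.assertEqual(slugify("Hello World"), "hello-world")

        def test_strips_surrounding_whitespace(self):
            self.assertEqual(slugify("  Internet History "), "internet-history")

    if __name__ == "__main__":
        unittest.main()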



Model-Driven Development (MDD). Model-Driven Development refers to a range of
development approaches that are based on the use of software modeling as a primary form of
expression. Sometimes models are constructed to a certain level of detail, and then code is
written by hand in a separate step. Sometimes complete models are built including executable
actions.
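
As a toy illustration of the model-as-primary-artifact idea (the model format and
generator below are invented, not a real MDD tool), a small declarative model can be
transformed into code in a separate generation step:

    # A tiny declarative "model" of two entities and their fields.
    model = {
        "Customer": ["name", "email"],
        "Order": ["customer_id", "total"],
    }

    # A trivial generator that turns the model into Python class source.
    def generate_classes(model):
        lines = []
        for entity, fields in model.items():
            lines.append(f"class {entity}:")
            args = ", ".join(fields)
            lines.append(f"    def __init__(self, {args}):")
            for f in fields:
                lines.append(f"        self.{f} = {f}")
            lines.append("")
        return "\n".join(lines)

    print(generate_classes(model))  # emits class Customer / class Order source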



Feature-Driven Development (FDD). FDD is a model-driven short-iteration process that
consists of five basic activities. For accurate state reporting and keeping track of the software
development project, milestones that mark the progress made on each feature are defined.

o Develop Overall Model
o Build Feature List
o Plan By Feature
o Design By Feature
o Build By Feature
o Milestones



User Experience Development (UX). User experience is a term used to describe the overall
experience and satisfaction a user has when using a product or system.

o Reducing excessive features which miss the needs of the user
o Improving the overall usability of the system
o Expediting design and development through detailed and properly conceived guidelines
o Incorporating business and marketing goals while catering to the user



Cowboy coding. Cowboy coding is a software development methodology without an actual
defined method: team members do whatever they feel is right. Typical cowboy coding will
involve no initial definition of the purpose or scope of the project, no formal description of the
project, and will often involve one programmer.

o Well known and commonplace. Developers need no training to quickly ramp up and be
productive.
o Facilitates rapid project completion.
o Allows for quick solutions for small problems. Often a problem is small enough and well
understood enough that documentation and planning are overkill. Typically when the
job is going to take a day or two, and involve only a single developer.
o Can allow a spike to see if a programming idea is valid before embarking on a larger
project that depends on the idea. A spike is where you write a small proof of concept
application to prove that a method is valid. These applications generally do not form
part of a real finished application.



Chaos Model

Prototyping

The most common software development models applied for the development process are:



Waterfall model. This model is mainly apt for small and relatively easy software projects.
Software companies working according to this model complete each stage in consecutive order
and review its results before proceeding to another stage, which renders the waterfall model
inflexible and unsuitable for complex long-term software projects.




Spiral model. The essence of this model is in the underscored importance of a risk analysis
during the development process. The spiral model presupposes that each stage of the classical
waterfall model is divided into several iterations, and each iteration undergoes planning and risk
analysis. As a result this model allows a software company to produce working software after
each iterative stage, while evaluating the risks on an ongoing basis. However, adopting the spiral
model may result in notably higher costs.



V-shaped model. This model is similar to the waterfall model, though the main emphasis is
placed on the verification stage and testing, which overlap all the other stages of the software
development lifecycle. Tests are planned starting from the documentation stage, then
throughout integration and coding, and after the actual implementation of a software product
testing itself is initiated. Therefore, the V-shape is formed due to the upward direction of
testing, i.e. test execution.




Iterative model. This model allows a software company to spot and mend problems at the
earlier stages of the software development lifecycle, which makes the development process
more flexible. This aim is achieved by breaking down the whole lifecycle into several iterations,
thus handling the process in smaller portions. The iterative model allows creating the initial
version of a software product straight after the first iteration.




Agile development. This development model adopts the iterative model as a baseline, while
putting an emphasis on the human factor, which is achieved through feedback from the
software team throughout the ongoing development process.

Hence, software professionals have several options to organise their work efficiently in order to
build cutting-edge software solutions and avoid extra costs.

Unified Modeling Language

From Wikipedia, the free encyclopedia

[Figure: A collage of UML diagrams.]

Unified Modeling Language (UML) is a standardized general-purpose modeling language in
the field of object-oriented software engineering. The standard is managed, and was created,
by the Object Management Group. It was first added to the list of OMG adopted technologies
in 1997, and has since become the industry standard for modeling software-intensive
systems.[1]

UML includes a set of graphic notation techniques to create visual models of object-oriented
software-intensive systems.

History

[Figure: History of object-oriented methods and notation.]

Before UML 1.x

After Rational Software Corporation hired James Rumbaugh from General Electric in 1994, the
company became the source for the two most popular object-oriented modeling approaches of
the day: Rumbaugh's Object-modeling technique (OMT), which was better for object-oriented
analysis (OOA), and Grady Booch's Booch method, which was better for object-oriented design
(OOD). They were soon assisted in their efforts by Ivar Jacobson, the creator of the
object-oriented software engineering (OOSE) method. Jacobson joined Rational in 1995, after
his company, Objectory AB,[5] was acquired by Rational. The three methodologists were
collectively referred to as the Three Amigos.

In 1996, Rational concluded that the abundance of modeling languages was slowing the
adoption of object technology, so, repositioning the work on a unified method, they tasked the
Three Amigos with the development of a non-proprietary Unified Modeling Language.
Representatives of competing object technology companies were consulted during OOPSLA
'96; they chose boxes for representing classes rather than the cloud symbols that were used in
Booch's notation.

Under the technical leadership of the Three Amigos, an international consortium called the
UML Partners was organized in 1996 to complete the Unified Modeling Language (UML)
specification, and propose it as a response to the OMG RFP. The UML Partners' UML 1.0
specification draft was proposed to the OMG in January 1997. During the same month the UML
Partners formed a Semantics Task Force, chaired by Cris Kobryn and administered by Ed
Eykholt, to finalize the semantics of the specification and integrate it with other standardization
efforts. The result of this work, UML 1.1, was submitted to the OMG in August 1997 and
adopted by the OMG in November 1997.[6]

UML 1.x

As a modeling notation, the influence of the OMT notation dominates (e.g., using rectangles for
classes and objects). Though the Booch "cloud" notation was dropped, the Booch capability to
specify lower-level design detail was embraced. The use case notation from Objectory and the
component notation from Booch were integrated with the rest of the notation, but the semantic
integration was relatively weak in UML 1.1, and was not really fixed until the UML 2.0 major
revision.

Concepts from many other OO methods were also loosely integrated with UML with the intent
that UML would support all OO methods. Many others also contributed, with their approaches
flavouring the many models of the day, including: Tony Wasserman and Peter Pircher with the
"Object-Oriented Structured Design (OOSD)" notation (not a method), Ray Buhr's "Systems
Design with Ada", Archie Bowen's use case and timing analysis, Paul Ward's data analysis and
David Harel's "Statecharts", as the group tried to ensure broad coverage in the real-time
systems domain. As a result, UML is useful in a variety of engineering problems, from single
process, single user applications to concurrent, distributed systems, making UML rich but also
large.

The Unified Modeling Language is an international standard: ISO/IEC 19501:2005 Information
technology -- Open Distributed Processing -- Unified Modeling Language (UML) Version 1.4.2

UML 2.x

UML has matured significantly since UML 1.1. Several minor revisions (UML 1.3, 1.4, and 1.5)
fixed shortcomings and bugs with the first version of UML, followed by the UML 2.0 major
revision that was adopted by the OMG in 2005.[7]

Although UML 2.1 was never released as a formal specification, versions 2.1.1 and 2.1.2
appeared in 2007, followed by UML 2.2 in February 2009. UML 2.3 was formally released in
May 2010.[8] UML 2.4 is in the beta stage as of March 2011.[9]

There are four parts to the UML 2.x specification:

1. The Superstructure that defines the notation and semantics for diagrams and their model
elements
2. The Infrastructure that defines the core metamodel on which the Superstructure is based
3. The Object Constraint Language (OCL) for defining rules for model elements
4. The UML Diagram Interchange that defines how UML 2 diagram layouts are exchanged

The current versions of these standards follow: UML Superstructure version 2.3, UML
Infrastructure version 2.3, OCL version 2.2, and UML Diagram Interchange version 1.0.[10]

Although many UML tools support some of the new features of UML 2.x, the OMG provides no
test suite to objectively test compliance with its specifications.

Topics

Software development methods

UML is not a development method by itself;[11] however, it was designed to be compatible
with the leading object-oriented software development methods of its time (for example OMT,
Booch method, Objectory). Since UML has evolved, some of these methods have been recast
to take advantage of the new notations (for example OMT), and new methods have been
created based on UML, such as the IBM Rational Unified Process (RUP). Others include
Abstraction Method and Dynamic Systems Development Method.


Modeling

It is important to distinguish between the UML model and the set of diagrams of a system. A
diagram is a partial graphic representation of a system's model. The model also contains
documentation that drives the model elements and diagrams (such as written use cases).

UML diagrams represent two different views of a system model:[12]

o Static (or structural) view: emphasizes the static structure of the system using objects,
attributes, operations and relationships. The structural view includes class diagrams and
composite structure diagrams.

o Dynamic (or behavioral) view: emphasizes the dynamic behavior of the system by showing
collaborations among objects and changes to the internal states of objects. This view includes
sequence diagrams, activity diagrams and state machine diagrams.

UML models can be exchanged among UML tools by using the XMI interchange format.

Diagrams overview

UML 2.2 has 14 types of diagrams divided into two categories.[13] Seven diagram types
represent structural information, and the other seven represent general types of behavior,
including four that represent different aspects of interactions. These diagrams can be
categorized hierarchically as shown in the following class diagram:


UML does not restrict UML element types to a certain diagram type. In general, every UML
element may appear on almost all types of diagrams; this flexibility has been partially restricted
in UML 2.0. UML profiles may define additional diagram types or extend existing diagrams with
additional notations.

In keeping with the tradition of engineering drawings, a comment or note explaining
usage, constraint, or intent is allowed in a UML diagram.

Structure diagrams

Structure diagrams emphasize the things that must be present in the system being modeled.
Since structure diagrams represent the structure, they are used extensively in documenting the
software architecture of software systems.



o Class diagram: describes the structure of a system by showing the system's classes, their
attributes, and the relationships among the classes (see the sketch after this list).

o Component diagram: describes how a software system is split up into components and shows
the dependencies among these components.

o Composite structure diagram: describes the internal structure of a class and the
collaborations that this structure makes possible.

o Deployment diagram: describes the hardware used in system implementations and the
execution environments and artifacts deployed on the hardware.

o Object diagram: shows a complete or partial view of the structure of an example modeled
system at a specific time.

o Package diagram: describes how a system is split up into logical groupings by showing the
dependencies among these groupings.

o Profile diagram: operates at the metamodel level to show stereotypes as classes with the
<<stereotype>> stereotype, and profiles as packages with the <<profile>> stereotype. The
extension relation (solid line with closed, filled arrowhead) indicates what metamodel element a
given stereotype is extending.
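
To ground the class diagram idea in code, here is a minimal invented example: the two
Python classes below carry exactly the kind of information a class diagram depicts, namely
attributes, operations, and an association between classes:

    class Customer:
        """In a class diagram: a class box with attributes and operations."""

        def __init__(self, name: str, email: str):
            self.name = name      # attribute: name : str
            self.email = email    # attribute: email : str
            self.orders = []      # association: Customer "1" --> "*" Order

        def place_order(self, total: float) -> "Order":
            # operation: place_order(total) : Order
            order = Order(self, total)
            self.orders.append(order)
            return order

    class Order:
        """Associated with Customer; the diagram draws a line between them."""

        def __init__(self, customer: Customer, total: float):
            self.customer = customer  # navigable association back to Customer
            self.total = total        # attribute: total : float

    ada = Customer("Ada", "ada@example.com")
    print(ada.place_order(9.99).total)  # 9.99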





Behaviour diagrams

Behaviour diagrams emphasize what must happen in the system being modelled. Since
behaviour diagrams illustrate the behavior of a system, they are used extensively to describe
the functionality of software systems.

o Activity diagram: describes the business and operational step-by-step workflows of
components in a system. An activity diagram shows the overall flow of control.

o UML state machine diagram: describes the states and state transitions of the system (see
the sketch after this list).

o Use case diagram: describes the functionality provided by a system in terms of actors, their
goals represented as use cases, and any dependencies among those use cases.
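
As a minimal invented illustration of what a state machine diagram captures, the transition
table below encodes the states, events, and transitions that a diagram would draw as
rounded boxes and arrows:

    # States and transitions for a toy order-handling state machine.
    TRANSITIONS = {
        ("created", "pay"): "paid",
        ("paid", "ship"): "shipped",
        ("shipped", "deliver"): "delivered",
        ("created", "cancel"): "cancelled",
        ("paid", "cancel"): "cancelled",
    }

    def next_state(state, event):
        """Return the new state, or raise if the event is not allowed here."""
        try:
            return TRANSITIONS[(state, event)]
        except KeyError:
            raise ValueError(f"event '{event}' not allowed in state '{state}'")

    state = "created"
    for event in ["pay", "ship", "deliver"]:
        state = next_state(state, event)
    print(state)  # delivered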





Interaction diagrams

Interaction diagrams, a subset of behaviour diagrams, emphasize the flow of control and data
among the things in the system being modeled:

o Communication diagram: shows the interactions between objects or parts in terms of
sequenced messages. They represent a combination of information taken from Class,
Sequence, and Use Case Diagrams describing both the static structure and dynamic behavior
of a system.

o Interaction overview diagram: provides an overview in which the nodes represent
communication diagrams.

o Sequence diagram: shows how objects communicate with each other in terms of a sequence
of messages. Also indicates the lifespans of objects relative to those messages.

o Timing diagrams: a specific type of interaction diagram where the focus is on timing
constraints.





The Protocol State Machine is a sub-variant of the State Machine. It may be used to model
network communication protocols.

Meta modeling

[Figure: Illustration of the Meta-Object Facility.]

The Object Management Group (OMG) has developed a metamodeling architecture to define
the Unified Modeling Language (UML), called the Meta-Object Facility (MOF). The Meta-Object
Facility is a standard for model-driven engineering, designed as a four-layered architecture, as
shown in the image at right. It provides a meta-meta model at the top layer, called the M3
layer. This M3-model is the language used by the Meta-Object Facility to build metamodels,
called M2-models. The most prominent example of a Layer 2 Meta-Object Facility model is the
UML metamodel, the model that describes the UML itself. These M2-models describe elements
of the M1-layer, and thus M1-models. These would be, for example, models written in UML.
The last layer is the M0-layer or data layer. It is used to describe runtime instances of the
system.

Beyond the M3-model, the Meta-Object Facility describes the means to create and manipulate
models and metamodels by defining CORBA interfaces that describe those operations. Because
of the similarities between the Meta-Object Facility M0-model and UML structure models,
Meta-Object Facility metamodels are usually modeled as UML class diagrams. A supporting
standard of the Meta-Object Facility is XMI, which defines an XML-based exchange format for
models on the M3-, M2-, or M1-layer.
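
A rough analogy for the four layers in ordinary Python terms (this is only an illustration of
the layering idea, not the actual MOF API; the Customer example is invented):

    # M3 (meta-metamodel): the machinery for defining modeling languages.
    # In Python the rough analogue is `type`, which all classes are instances of.

    # M2 (metamodel): a modeling language, e.g. the UML metamodel.
    class Class:               # stands in for the UML "Class" metamodel element
        def __init__(self, name, attributes):
            self.name = name
            self.attributes = attributes

    # M1 (model): a user model written in the M2 language.
    customer_model = Class("Customer", ["name", "email"])

    # M0 (data layer): runtime instances described by the M1 model.
    customer_instance = {"name": "Ada", "email": "ada@example.com"}

    print(type(Class))               # <class 'type'>  (M3-ish)
    print(customer_model.name)       # Customer        (M1)
    print(customer_instance["name"]) # Ada             (M0)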


Criticisms



Although UML is a widely recognized and used modeling standard, it is frequently
criticized for the following:

Standards bloat

Bertrand Meyer, in a satirical essay framed as a student's request for a grade change,
apparently criticized UML as of 1997 for being unrelated to object-oriented software
development; a disclaimer was added later pointing out that his company nevertheless supports
UML.[14] Ivar Jacobson, a co-architect of UML, said that objections to UML 2.0's size were
valid enough to consider the application of intelligent agents to the problem.[15] It contains
many diagrams and constructs that are redundant or infrequently used.

Problems in learning and adopting

The problems cited in this section make learning and adopting UML problematic, especially
when required of engineers lacking the prerequisite skills.[16] In practice, people often draw
diagrams with the symbols provided by their CASE tool, but without the meanings those
symbols are intended to provide. Simple user narratives, e.g. "what I do at work...", have been
shown to be much simpler to record and more immediately useful.

Linguistic incoherence

The standards have been cited as being ambiguous and inconsistent.[17][18][19] The UML 2.0
standard still suffers from many issues.[20][21]

Capabilities of UML and implementation language mismatch

Typical of other notational systems, UML is able to represent some systems more concisely or
efficiently than others. Thus a developer gravitates toward solutions that reside at the
intersection of the capabilities of UML and the implementation language. This problem is
particularly pronounced if the implementation language does not adhere to orthodox
object-oriented doctrine, since the intersection set between UML and implementation language
may be that much smaller.

Dysfunctional interchange format

While the XMI (XML Metadata Interchange) standard is designed to facilitate the interchange
of UML models, it has been largely ineffective in the practical interchange of UML 2.x models.
This interoperability ineffectiveness is attributable to several reasons. Firstly, XMI 2.x is large
and complex in its own right, since it purports to address a technical problem more ambitious
than exchanging UML 2.x models. In particular, it attempts to provide a mechanism for
facilitating the exchange of any arbitrary modeling language defined by the OMG's Meta-Object
Facility (MOF). Secondly, the UML 2.x Diagram Interchange specification lacks sufficient detail
to facilitate reliable interchange of UML 2.x notations between modeling tools. Since UML is a
visual modeling language, this shortcoming is substantial for modelers who don't want to
redraw their diagrams.[1] This shortcoming is being addressed by the Diagram Definition OMG
project for which a proposed standard is already available.[22]

Cardinality Notation

As with database ER diagrams, class models are specified to use "look-across" cardinalities,
even though several authors (Merise,[23] Elmasri & Navathe,[24] amongst others[25]) prefer
same-side or "look-here" for roles and both minimum and maximum cardinalities. Recent
researchers (Feinerer,[26] Dullea et al.[27]) have shown that the "look-across" technique used
by UML and ER diagrams is less effective and less coherent when applied to n-ary relationships
of order >2.

In Feinerer it says "Problems arise if we operate under the look-across semantics as used for
UML associations. Hartmann[28] investigates this situation and shows how and why different
transformations fail." (Although the "reduction" mentioned is spurious, as the two diagrams 3.4
and 3.5 are in fact the same.) It also says "As we will see on the next few pages, the
look-across interpretation introduces several difficulties which prevent the extension of simple
mechanisms from binary to n-ary associations."

Exclusive

The term "Unified" applies only to the unification of the many prior existing and competing
object-oriented languages. Important, well-known, and popular techniques, almost universally
used in industry, such as Data Flow Diagrams and Structure charts, were not included in the
specification.

Modeling experts have written criticisms of UML, including Brian Henderson-Sellers and Cesar
Gonzalez-Perez in "Uses and Abuses of the Stereotype Mechanism in UML 1.x and 2.0".[29]

UML modelling tools

Main article: List of Unified Modeling Language tools

The most well-known UML modelling tool is IBM Rational Rose. Other tools include, in
alphabetical order, ArgoUML, BOUML, Dia, Enterprise Architect, MagicDraw UML,
PowerDesigner, Rational Rhapsody, Rational Software Architect, StarUML, Software Ideas
Modeler and Umbrello. Some popular development environments also offer UML modelling
tools, e.g. Eclipse, NetBeans, and Visual Studio.


Vannevar Bush and Memex

Consider a future device for individual use, which is a sort of
mechanized private file and library. It needs a name, and to coin
one at random, "memex" will do. A memex is a device in which an
individual stores all his books, records, and communications, and
which is mechanized so that it may be consulted with exceeding
speed and flexibility. It is an enlarged intimate supplement to his
memory.

It consists of a desk, and while it can presumably be operated from
a distance, it is primarily the piece of furniture at which he works.
On the top are slanting translucent screens, on which material can
be projected for convenient reading. There is a keyboard, and sets
of buttons and levers. Otherwise it looks like an ordinary desk.

- Vannevar Bush; As We May Think; Atlantic Monthly; July 1945.

Vannevar Bush established the U.S. military / university research partnership that later
invented the ARPANET, and wrote the first visionary description of the potential use for
information technology, inspiring many of the Internet's creators.

Vannevar Bush was born on March 11, 1890, in Everett, Massachusetts. He taught at Tufts
University from 1914 to 1917, carried out submarine-detection research for the US Navy,
and then joined the faculty of the Massachusetts Institute of Technology (MIT) at the age of
twenty-nine. At MIT, Bush worked with a team of researchers to build an automated
network analyzer to solve mathematical differential equations, and in the 1930's helped
build the first analog computers.

President Roosevelt appointed Bush Chairman of the National Defense Research
Committee in 1940 to help with World War II. In 1941, Bush was appointed Director of the
newly created Office of Scientific Research and Development (OSRD), established to
coordinate weapons development research. The organization employed more than 6000
scientists by the end of the war, and supervised development of the atom bomb. From 1946
to 1947, he served as chairman of the Joint Research and Development Board.

Bush brought together the U.S. Military and universities with a level of research funding not
previously deployed, providing the universities with large, new revenue streams for
establishment of laboratories, acquisition of equipment, and the conduct of pure and applied
research. In return, the military obtained the advantages of rapidly improving technology.

Thanks in part to Bush's initial setup, the three lead universities in this partnership for
several decades were Harvard University, the Massachusetts Institute of Technology, and
the University of California at Berkeley. Through the influence of projects like SAGE and
organizations like the IPTO, the university / military partnership established by Bush
naturally laid the foundation for subsequent development of the ARPANET by DARPA.

However, Vannevar Bush's most direct influence on the development of the Internet comes
from his visionary description of an information system he called "memex", in an article
titled As We May Think published in the Atlantic Monthly in July, 1945, in which he
describes the first automated information management system (see the excerpt at the top
of this page).

Bush's memex was a breakthrough revelation, an information-centric application of
electronic technology not previously considered. The vision stamped by memex strongly
inspired succeeding generations of scientists and engineers who built the Internet, notably
J.C.R. Licklider and Douglas Engelbart. Many leading researchers realized that a memex
type system would eventually be built, and worked to help realize it. Only now, more than
50 years later, is Bush's dream becoming fully realized with the development of personal
computers, the web, and search engines.

In the private sector, Vannevar Bush was a cofounder of Raytheon, one of the United
States' largest defense contractors. He was also president of the Carnegie Institute of
Washington research organization from 1939 to 1955.

Resources. The following sites provide additional information on Vannevar Bush.

o Bush, Vannevar; Science The Endless Frontier; A Report to the President by
Vannevar Bush, Director of the Office of Scientific Research and Development;
United States Government Printing Office; July, 1945.
o Memex and Beyond -- historical and current research in hypermedia inspired by
Vannevar Bush's memex vision.
o MIT Vannevar Bush Symposium -- 1995.
o Book review of "Endless Frontier: Vannevar Bush, Engineer Of The American
Century", by G. Pascal Zachary.

Norbert Wiener Invents Cybernetics

Since Leibniz there has perhaps been no man who has had a full
command of all the intellectual activity of his day. Since that time,
science has been increasingly the task of specialists, in fields
which show a tendency to grow progressively narrower. A century
ago there may have been no Leibniz, but there was a Gauss, a
Faraday, and a Darwin. Today there are few scholars who can call
themselves mathematicians or physicists or biologists without
restriction.

A man may be a topologist or an acoustician or a coleopterist. He
will be filled with the jargon of his field, and will know all its
literature and all its ramifications, but, more frequently than not, he
will regard the next subject as something belonging to his
colleague three doors down the corridor, and will consider any
interest in it on his own part as an unwarrantable breach of privacy.

- Wiener, Norbert; Cybernetics; 1948.

Norbert Wiener invented the field of cybernetics, inspiring a generation of scientists to think
of computer technology as a means to extend human capabilities.

Norbert Wiener was born on November 26, 1894, and received his Ph.D. in Mathematics
from Harvard University at the age of 18 for a thesis on mathematical logic. He
subsequently studied under Bertrand Russell in Cambridge, England, and David Hilbert in
Göttingen, Germany. After working as a journalist, university teacher, engineer, and writer,
Wiener was hired by MIT in 1919, coincidentally the same year as Vannevar Bush. In
1933, Wiener won the Bôcher Prize for his brilliant work on Tauberian theorems and
generalized harmonic analysis.

During World War II, Wiener worked on guided missile technology, and studied how
sophisticated electronics used the feedback principle, as when a missile changes its flight
in response to its current position and direction. He noticed that the feedback principle is
also a key feature of life forms from the simplest plants to the most complex animals, which
change their actions in response to their environment. Wiener developed this concept into
the field of cybernetics, concerning the combination of man and electronics, which he first
published in 1948 in the book Cybernetics.

Wiener's vision of cybernetics had a powerful influence on later generations of scientists,
and inspired research into the potential to extend human capabilities with interfaces to
sophisticated electronics, such as the user interface studies conducted by the SAGE
program. Wiener changed the way everyone thought about computer technology,
influencing several later developers of the Internet, most notably J.C.R. Licklider.

In 1964, Norbert Wiener won the US National Medal of Science. In the same year, he
published one of his last books, "God and Golem, Inc.: A Comment on Certain Points
Where Cybernetics Impinges on Religion".

Resources. The following sites related to Norbert Wiener and cybernetics have been
established for several years.

o Principia Cybernetica has more than a thousand pages, including a list of influential
Cybernetics and Systems Thinkers.
o The Research Committee on Sociocybernetics is a member of the International
Sociological Association, and operates under the auspices of UNESCO to "promote
the development of (socio)cybernetic theory and research within the social sciences".
o The American Society for Cybernetics, whose objective is to "develop a
metadisciplinary language with which we may better understand and modify our
world."
o The Max Planck Institute for Biological Cybernetics.
o The University of Reading Department of Cybernetics.
o The Bacterial Cybernetics Group collected evidence of cybernetic sophistication by
bacteria, including advanced computation, learning, and creativity.

Lists of sites:

o Google - Cybernetics.

Semi-Automatic Ground Environment (SAGE) - Charles Babbage Institute

The SAGE program significantly advanced the state of the art in human-computer
interaction, influenced the thinking of J.C.R. Licklider, caused the establishment of the MIT
Lincoln Laboratory where Lawrence Roberts later worked, and established one of the first
wide-area networks.

Thanks in part to Vannevar Bush's funding, the Massachusetts Institute of Technology (MIT)
became a hotbed of advanced research. One of their most influential projects was the
Semi-Automatic Ground Environment (SAGE) program, established in 1954 by the US Air
Force to develop a continental air defense system to protect against a nuclear bomber attack
from the Soviet Union.

MIT established the Lincoln Laboratory in Lexington, Massachusetts, to produce the SAGE
system design. Lincoln Labs went on to do a great deal of useful later research, and gave
their best network scientist, Lawrence Roberts, to the IPTO to help create the ARPANET. In
1958, the MITRE Corporation was formed from the Computer System Division of the
Massachusetts Institute of Technology Lincoln Laboratories, and conducted the software
development of SAGE's digital computer system. The System Development Corporation in
Santa Monica was created to develop the software. The program drove the discovery of
several seminal software engineering concepts, such as multi-user interaction, advanced
data structures, structured program modules, and global data definitions.

MIT and IBM developed the IBM AN/FSQ-7 computer to run the SAGE centers. The final
model was the largest computer ever built: it weighed 250 tons, took up twenty thousand
square feet of space, and was delivered in eighteen large vans. It had 50,000 vacuum
tubes, more than 150 CRT monitors, needed more than a million watts of power, and took
up two stories of a building. The vacuum tubes generated so much heat that human beings
couldn't stand close to the computer for more than a few seconds, and the whole computer
would melt and self-destruct within sixty seconds if the air conditioning ever failed. The US
Air Force bought twenty-seven.

Operators accessed the SAGE system through the first "cathode ray displays" or monitors,
and used a light pen to select tracks of potential incoming hostile aircraft and manage their
status. When SAGE was deployed in 1963, it consisted of 24 Direction Centers and 3
Combat Centers, each linked by long-distance telephone lines to more than 100 radar
defense sites across the country, thereby establishing one of the first large-scale wide-area
computer networks. This had a great influence on a lot of people who worked on the
program, including Licklider, who later became the first Director of the IPTO and initiated
the research that led to creation of the ARPANET. SAGE remained in continuous operation
until 1983.

Resources. The following sites provide more information on SAGE:

o Fact sheet -- SAGE computer backup, the Burroughs Corporation AN/GSA-51 Radar
Course Directing Group for the Back Up Interceptor Control System (BUIC)
o Federation of American Scientists - Sage
o Jim Ray - SAGE
o MITRE - SAGE Beginnings
o MITRE - SAGE Photos
o Online Air Defense Radar Museum
o The Rise of Air Defense
o SAGE Air Defence
o SAGE Computers
o Williamson Labs -- SAGE Computer
o Whirlwind Computer.

Dartmouth Artificial Intelligence (AI) Conference

We propose that a 2 month, 10 man study of artificial intelligence
be carried out during the summer of 1956 at Dartmouth College in
Hanover, New Hampshire.

The study is to proceed on the basis of the conjecture that every
aspect of learning or any other feature of intelligence can in
principle be so precisely described that a machine can be made to
simulate it. An attempt will be made to find how to make machines
use language, form abstractions and concepts, solve kinds of
problems now reserved for humans, and improve themselves.

- Dartmouth AI Project Proposal; J. McCarthy et al.; Aug. 31, 1955.

The 1956 Dartmouth Artificial Intelligence (AI) conference gave birth to the field of AI, and
gave succeeding generations of scientists their first sense of the potential for information
technology to be of benefit to human beings in a profound way.

In 1956, John McCarthy invited many of the leading researchers of the time, in a wide range
of advanced research topics such as complexity theory, language simulation, neuron nets,
abstraction of content from sensory inputs, relationship of randomness to creative thinking,
and learning machines, to Dartmouth in New Hampshire to discuss a subject so new to the
human imagination that he had to coin a new term for it: Artificial Intelligence.

This conference was the largest gathering on the topic that had yet taken place, and laid the
foundation for an ambitious vision that has affected research and development in
engineering, mathematics, computer science, psychology, and many other fields ever since.
It was no coincidence that, as early as 1956, evidence indicated that electronic capacity and
functionality were doubling approximately every eighteen months, and the rate of
improvement showed no signs of slowing down. The conference was one of the first serious
attempts to consider the consequences of this exponential curve. Many participants came
away from the discussions convinced that continued progress in electronic speed, capacity,
and software programming would lead to the point where computers would someday have
the resources to be as intelligent as human beings; the only real question was when and
how it would happen.

This conference and the concepts it crystallized gave birth to the field of AI as a vibrant area
of interdisciplinary research, and provided an intellectual backdrop to all subsequent
computer research and development efforts, not to mention many books and movies. This
new field's revolutionary vision was a significant influence on several of the people that
helped create the Internet, perhaps most notably J.C.R. Licklider with his concept of a
universal network that produces power greater than the sum of its parts.

Resources. The following references provide more information on artificial intelligence:

o Google - Artificial Intelligence
o Yahoo - Artificial Intelligence
o ACM Crossroads Fall 1996
o ACM Special Interest Group on Artificial Intelligence
o Artificial Intelligence: A Modern Approach
o American Association for Artificial Intelligence
o Bibliographies on Artificial Intelligence
o International Joint Conference on Artificial Intelligence
o The Loebner Prize -- Turing Test
o Premise.org



DARPA / ARPA -- Defense / Advanced Research Project Agency

DARPA's ability to adapt rapidly to changing environments and to
seek and embrace opportunities in both technology and in
processes, while maintaining the historically proven principles of
the Agency, makes DARPA the crown jewel in Defense R&D and a
unique R&D organization in the world.

- DARPA Over the Years, August 1997.

o ARPA (later DARPA) is the innovative R&D organization that funded the development
of the ARPANET.



o In 1957, only twelve years after publication of Arthur C. Clarke's seminal paper
describing the idea of satellites, the Soviet Union launched the first satellite, Sputnik I,
beating the United States into space. This meant that the USSR could theoretically
launch bombs into space and then drop them down anywhere on earth. The
American military became highly alarmed.



o In 1958, President Dwight Eisenhower appointed MIT President James Killian as
Presidential Assistant for Science and created the Advanced Research Projects
Agency (ARPA) to jump-start U.S. technology and find safeguards against a
space-based missile attack. The US military was particularly concerned about the
effects of a nuclear attack on their communications infrastructure, because if they
couldn't communicate, they wouldn't be able to regroup or respond, thereby making
the threat of a first strike by the Soviet Union more likely.



o To meet this need, ARPA established the IPTO in 1962 with a mandate to build a
survivable computer network to interconnect the DoD's main computers at the
Pentagon, Cheyenne Mountain, and SAC HQ. As described in the following pages,
this initiative led to the development of the ARPANET seven years later, and then to
the NSFNET and the Internet we know today. ARPA also funded some of the early
networking research done by Lawrence Roberts, who later became the ARPANET
Program Manager.



o ARPA had unique authorization and direction to make quantum jumps in technology
using any means they believed appropriate. For example, they had the unusual
mandate to use research before it had been peer-reviewed, since the peer-review
process prevented mistakes but slowed down progress. It worked: within 18
months of its creation ARPA developed and deployed the first US satellite.



o From its inception ARPA significantly funded many US university research labs, and
as early as 1968 had a close relationship with Carnegie-Mellon University, Harvard
University, MIT, Stanford University, UCB, UCLA, UCSB, University of Illinois, and
the University of Utah, as well as leading industry labs including Bolt Beranek and
Newman, Computer Corporation of America, Rand, SRI, and Systems Development
Corporation. Most of these labs were connected to the ARPANET soon after it was
created in order to enable cross-fertilization of research activity.



o In the early 1970's the word "Defense" was prefixed to the name, and ARPA became
known as DARPA. By the late 1990's, DARPA reported to the Director for Defense
Research and Engineering and had about 250 staff and a budget of US$2 billion. A
typical project was funded with between ten and forty million dollars over a period of
four years, and drew support from several consultants and one or two universities.
An excerpt from a 1997 description of the organization is provided below:

DARPA's mission has been to assure that the U.S. maintains a
lead in applying state-of-the-art technology for military capabilities
and to prevent technological surprise from her adversaries.

The DARPA organization was as unique as its role, reporting
directly to the Secretary of Defense and operating in coordination
with but completely independent of the military research and
development (R&D) establishment.

Strong support from the senior DoD management has always been
essential since DARPA was designed to be an anathema to the
conventional military and R&D structure and, in fact, to be a
deliberate counterpoint to traditional thinking and approaches.

- DARPA Over the Years, August 1997.



o DARPA program managers have always had complete control over program funding,
unprecedented flexibility in management capabilities, and direct responsibility for
making their program a success. A description of the role of a DARPA program
manager from 1977 is provided below. Send in your application today.

The DARPA environment is one of the most demanding and
electric in the government. It is where people who want to make a
difference come to invest 4 years in public service as a program
manager.

The ideal program manager is technically deep, with excellent but
eclectic technology taste, usually seasoned by five or more years
of accomplishment in industry, the military, or academia. An
outstanding technical foundation is needed to triumph over
unforeseen problems or to pounce on opportunities at the frontiers
of knowledge.

The program manager must be able to integrate, innovate, and
readily accept new ideas proposed by others. The program
manager formulates a vision for the program, positions and
advocates the program within the context of DARPA's overall
mission, charts a course for the near- and long-term
accomplishments necessary to reach the program objectives, and
manages all technical, procurement, and financial aspects of the
program. The ideal program manager must complement technical
excellence with management and leadership skills, including
people skills, public speaking skills, project management
experience, careful financial management skills, the ability to make
timely decisions, and a sense of controlled urgency.

No one in government has more constructive power than a DARPA
program manager. Spend four years at DARPA as part of your
career. It will change the way you view the world. It will be a
service to your technical community and to the Nation. You can
move the world, if you stand in the right place.

- Working As A DARPA Manager, Original from August 1977.

J.C.R. Licklider And The Universal Network

It seems reasonable to envision, for a time 10 or 15 years hence, a
'thinking center' that will incorporate the functions of present-day
libraries together with anticipated advances in information storage
and retrieval.

The picture readily enlarges itself into a network of such centers,
connected to one another by wide-band communication lines and
to individual users by leased-wire services. In such a system, the
speed of the computers would be balanced, and the cost of the
gigantic memories and the sophisticated programs would be
divided by the number of users.

- J.C.R. Licklider, Man-Computer Symbiosis, 1960.

Joseph Carl Robnett "Lick" Licklider created the idea of a universal network, spread his
vision throughout the IPTO, and inspired his successors to realize his dream by inventing
the ARPANET, which led to the Internet. He also developed the concepts that led to the idea
of the Netizen.

Licklider started his scientific career as an experimental psychologist and professor at MIT
interested in psychoacoustics, the study of how the human ear and brain convert air
vibrations into the perception of sound. At MIT he also worked on the SAGE project as a
human factors expert, which helped convince him of the great potential for human /
computer interfaces.

Licklider's psychoacoustics research at MIT took an enormous amount of data analysis,
requiring construction of several types of mathematical graphs based on the data collected
by his research. By the late 1950's he realized that his mathematical models of pitch
perception had grown too complex to solve, even with the analog computers then available,
and that he wouldn't be able to build a working mechanical model and advance the theory
of psychoacoustics as he had wished.

In response to this revelation, in 1957 Licklider spent a day measuring the amount of time it
took him to perform the individual tasks of collecting, sorting, and analyzing information,
and then measured the time it took him to make decisions based on the data once it was
collected. He discovered that the preparatory work took about 85% of the time, and that
the decision could then be made almost immediately once the background data was
available. This exercise had a powerful effect, and convinced him that one of the most
useful long-term contributions of computer technology would be to provide automated,
extremely fast support systems for human decision making.

After Licklider was awarded tenure at MIT, he joined the company Bolt, Beranek & Newman
to pursue his psychoacoustic research, where he was given access to one of the first
minicomputers, a PDP-1 from Digital Equipment Company. The PDP-1 had computing power
comparable to a mainframe computer of the time, at a cost of $250K, and only took up as
much space as a couple of household refrigerators. Most importantly, instead of having to
hand over punched cards to an operator and wait days for a printed response from the
computer, Licklider could program the PDP-1 directly on paper tape, even stopping it and
changing the tape when required, and view the results directly on a display screen in
real-time. The PDP-1 was the first interactive computer.

Licklider quickly realized that minicomputers were getting to be powerful enough to support
the type of automated library systems that Vannevar Bush had described. In 1959, Licklider
wrote his first influential book, titled "Libraries of the Future", about how a computer could
provide an automated library with simultaneous remote use by many different people
through access to a common database.

Licklider also realized that interactive computers could provide more than a library function,
and could provide great value as automated assistants. He captured his ideas in a seminal
paper in 1960 called Man-Computer Symbiosis, in which he described a computer assistant
that could answer questions, perform simulation modeling, graphically display results, and
extrapolate solutions for new situations from past experience. Like Norbert Wiener, Licklider
foresaw a close symbiotic relationship between computer and human, including
sophisticated computerized interfaces with the brain.

Licklider also quickly appreciated the power of computer networks, and predicted the effects
of technological distribution, describing how the spread of computers, programs, and
information among a large number of computers connected by a network would create a
system more powerful than could be built by any one organization. In August, 1962,
Licklider and Welden Clark elaborated on these ideas in the paper "On-Line Man
Computer Communication", one of the first conceptions of the future Internet.

In October, 1962, Licklider was hired to be Director of the IPTO newly established by
DARPA. His mandate was to find a way to realize his networking vision and interconnect the
DoD's main computers at the Pentagon, Cheyenne Mountain, and SAC HQ. He started by
writing a series of memos to the other members of the office describing the benefits of
creation of a global, distributed network, addressing some memos to "Members and
Affiliates of the Intergalactic Computer Network". Licklider's vision of a universal network
had a powerful influence on his successors at the IPTO, and provided the original intellectual
push that led to the realization of the ARPANET only seven years later.

In 1964, Licklider left the IPTO and went to work at IBM. In 1968, he went back to MIT to
lead Project MAC. In 1973, he returned again to lead the IPTO for two years. In 1979, he
was one of the founding members of Infocom.

Netizen. In April, 1968, Licklider and Robert Taylor published a ground-breaking paper, The
Computer as a Communication Device, in Science and Technology, portraying the
forthcoming universal network as more than a service to provide transmission of data, but
also as a tool whose value came from the generation of new information through interaction
with its users. In other words, the old golden rule applied to an as yet unbuilt network
world, where each netizen contributes more to the virtual community than they receive,
producing something more powerful and useful than anyone could create by themselves.

Michael Hauben, a widely read Internet pioneer, encountered this spirit still going strong in
his studies of online Internet communities in the 1990's, leading to his coinage of the term
"net citizen" or "netizen". Newcomers to the Internet usually experience the same benefit of
participating in a larger virtual world, and adopt the spirit of the netizen as it is handed
down the generations. It cannot be a coincidence that so many Internet technologies are
built specifically to leverage the power of community information sharing, such as the
Usenet newsgroups, IRC, MUDs, and mailing lists. The concept of the netizen is also the
foundation for the motivation of netiquette.

Marshall McLuhan Foresees The Global Village

Today, after more than a century of electric technology, we have
extended our central nervous system itself in a global embrace,
abolishing both space and time as far as our planet is concerned.

- Marshall McLuhan, Understanding Media, 1964.

Marshall McLuhan's insights made the concept of a global village, interconnected by an
electronic nervous system, part of our popular culture well before it actually happened.

Marshall McLuhan was the first person to popularize the concept of a global village and to
consider its social effects. His insights were revolutionary at the time, and fundamentally
changed how everyone has thought about media, technology, and communications ever
since. McLuhan chose the insightful phrase "global village" to highlight his observation that
an electronic nervous system (the media) was rapidly integrating the planet: events in one
part of the world could be experienced from other parts in real-time, which is what human
experience was like when we lived in small villages.

While McLuhan popularized this concept, he was not the first to think about the unifying effects of communication technology. One of the earliest thinkers along this line was Nikola Tesla, who in an interview with Colliers magazine in 1926 stated: "When wireless is perfectly applied the whole earth will be converted into a huge brain, which in fact it is, all things being particles of a real and rhythmic whole. We shall be able to communicate with one another instantly, irrespective of distance. Not only this, but through television and telephony we shall see and hear one another as perfectly as though we were face to face, despite intervening distances of thousands of miles; and the instruments through which we shall be able to do this will be amazingly simple compared with our present telephone. A man will be able to carry one in his vest pocket."

McLuhan's second best known insight is summarized in the expression "the medium is the message", which means that the qualities of a medium have as much effect as the information it transmits. For example, reading a description of a scene in a newspaper has a very different effect on someone than hearing about it, or seeing a picture of it, or watching a black and white video, or watching a colour video. McLuhan was particularly fascinated by the medium of television, calling it a "cool" medium, noting its soporific effect on viewers. He took great satisfaction years later when medical studies showed that TV does in fact cause people to settle into passive brain wave patterns.

One wonders what McLuhan would have made of the Internet.

Like Norbert Wiener and J.C.R. Licklider, McLuhan made a study of the extrapolation of current trends in technology, and specialized in the effects on human communications. He generally felt that the developments he described would be positive, but particularly worried about the potential for very sophisticated, manipulative advertising.

McLuhan's ideas have permeated the way we in the global village think about technology and media to such an extent that we are generally no longer aware of the revolutionary effect his concepts had when they were first introduced. McLuhan made the idea of an integrated planetary nervous system a part of our popular culture, so that when the Internet finally arrived in the global village it seemed no less amazing, but still somehow in the natural order of things.

Resources. Two of McLuhan's best known books are The Gutenberg Galaxy, published in 1962, and Understanding Media, published in 1964. The following references provide more information about Marshall McLuhan the man and his work:

Google - Marshall McLuhan

Yahoo - Marshall McLuhan

McLuhan Global Research Network

McLuhan Studies

MCS McLuhan References

VideoMcLuhan - uncertain availability

Wired Magazine - Marshall McLuhan page.



Paul Baran Invents Packet Switching

The expenditure milestone points that can be ear-marked for system evaluation would occur at about the $1.25-million level (during the Study and Research Phase), after the $5-million point (at the conclusion of the entire Study and Research Phase), at the $11.6-million level (at the end of the Design Phase), at the $15.7-million mark (at the end of the Test Phase), at the $21.7-million level (at the end of the Development Phase), and at the $23.7-million point (at the end of the Final Test Phase). Thus, there are many early opportunities to reevaluate and redirect this program upon discovery of unforeseen difficulties or better alternative approaches.

- Paul Baran, On Distributed Communications, Volume XI, 1964.



Paul Baran invented packet switching while conducting research at the historic RAND organization, a concept embedded in the design of the ARPANET and the standard TCP/IP protocol used on the Internet today.



Paul Baran's packet switching story starts at the Research And Development (RAND) research organization. RAND was founded in Santa Monica, California, soon after the second world war to help maintain the unique system analysis and operations research skills developed by the US military to manage the unprecedented scale of planning and logistics during that global conflict. RAND still maintains a high proportion of research staff with advanced degrees, and provides an extensive research capability able to tackle a wide range of problems for governments and industry.



In 1959, a young electrical engineer named Paul Baran joined RAND from Hughes Aircraft's systems group. The US Air Force had recently established one of the first wide area computer networks for the SAGE radar defence system, and had an increasing interest in survivable, wide area communications networks so they could reorganize and respond after a nuclear attack, diminishing the attractiveness of a first strike option by the Soviet Union.



Baran began an investigation into development of survivable communications networks, the results of which were first presented to the Air Force in the summer of 1961 as briefing B-265, then as paper P-2626, and then as a series of eleven comprehensive papers titled On Distributed Communications in 1964.



Baran's study describes a remarkably detailed architecture for a distributed, survivable, packet switched communications network. The network is designed to withstand almost any degree of destruction to individual components without loss of end-to-end communications. Since each computer could be connected to one or more other computers, Baran assumed that any link of the network could fail at any time, and the network therefore had no central control or administration.
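
As a minimal sketch of why such a topology survives damage (illustrative only, not drawn from Baran's papers: the mesh, the node numbers, and the route function below are invented for this example), consider a toy network in which a packet can take any surviving path to its destination:

    # Illustrative sketch only: a toy mesh where packets are re-routed
    # around failed links. Topology and node numbers are invented here.
    from collections import deque

    def route(links, src, dst):
        """Breadth-first search for any surviving path from src to dst."""
        graph = {}
        for a, b in links:
            graph.setdefault(a, set()).add(b)
            graph.setdefault(b, set()).add(a)
        frontier, seen = deque([[src]]), {src}
        while frontier:
            path = frontier.popleft()
            if path[-1] == dst:
                return path
            for nxt in graph.get(path[-1], ()):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(path + [nxt])
        return None  # no surviving path: the network is partitioned

    # A small mesh: every node has several neighbours, and no central hub.
    mesh = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (2, 4), (3, 4)]
    print(route(mesh, 0, 4))     # one shortest path, e.g. [0, 2, 4]

    # Destroy two links; traffic still gets through on surviving paths.
    damaged = [l for l in mesh if l not in {(0, 2), (2, 4)}]
    print(route(damaged, 0, 4))  # re-routed, e.g. [0, 1, 3, 4]

Baran's actual design was far richer, with redundant links and adaptive routing, but the principle the sketch shows is the same: with enough alternate paths and no central hub, no single failure can cut off communication.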



Baran's architecture was well designed to survive a nuclear conflict, and helped to convince the US Military that wide area digital computer networks were a promising technology. Baran also talked to Bob Taylor and J.C.R. Licklider at the IPTO about his work, since they were also working to build a wide area communications network. Baran's 1964 series of papers then influenced Roberts and Kleinrock to adopt the technology for development of the ARPANET network a few years later, laying the groundwork that led to its continued use by TCP/IP on the Internet today.



In another of those scientific synchronicities, Baran's packet switching work was strikingly similar to the work performed independently a few years later by Donald Davies at the National Physical Laboratory, including common details like a packet size of 1024 bits. This idea was almost waiting to be discovered.



Baran later left RAND to become an entrepreneur and private investor in the early 1970s, and founded Metricom, co-founded Com21, and co-founded the Institute for the Future.



Paul Baran has also received numerous awards, including the IEEE Alexander Graham Bell Medal, and the Marconi International Fellowship Award.

UK National Physical Laboratory (NPL) & Donald Davies



Donald Davies and his colleagues at the UK National Physical Laboratory independently discovered the idea of packet switching, and later created a smaller scale packet-switched version of the ARPANET.

In 1965, while Donald Davies was Superintendent of the National Physical Lab (NPL) in Britain, he proposed development of a country wide packet communications network. He gave a talk on the proposal in 1966, and afterwards a person from the Ministry of Defence told him about Paul Baran's work for RAND being done independently. In one of those coincidences so pervasive in the history of science, Davies had independently decided upon some of the same parameters for his network design as Baran, such as a packet size of 1024 bits.
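
To make that parameter concrete, here is a minimal sketch of splitting a message into fixed-size 1024-bit packets (illustrative only: the packetize and reassemble functions, header fields, and addresses below are invented for this example, not the NPL or RAND formats):

    # Illustrative sketch: chop a message into 1024-bit (128-byte) packets,
    # the size both Davies and Baran independently chose. Header fields
    # here are invented for the example, not a historical format.
    PACKET_BITS = 1024
    PAYLOAD_BYTES = PACKET_BITS // 8  # 128 bytes of payload per packet

    def packetize(message: bytes, src: int, dst: int):
        """Yield (header, payload) pairs for each fixed-size chunk."""
        chunks = [message[i:i + PAYLOAD_BYTES]
                  for i in range(0, len(message), PAYLOAD_BYTES)]
        for seq, chunk in enumerate(chunks):
            yield {"src": src, "dst": dst, "seq": seq, "total": len(chunks)}, chunk

    def reassemble(packets):
        """Packets may arrive out of order; sequence numbers restore it."""
        ordered = sorted(packets, key=lambda p: p[0]["seq"])
        return b"".join(payload for _, payload in ordered)

    message = b"A" * 300                      # 300 bytes -> 3 packets
    packets = list(packetize(message, src=1, dst=4))
    assert reassemble(reversed(packets)) == message
    print(len(packets), "packets of at most", PAYLOAD_BYTES, "bytes each")

Because each packet carries its own addressing and sequence information, the network can route packets independently, even along different paths, and the destination can still reassemble the original message.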

At the 1967 ACM Symposium on Operating System Principles, Lawrence Roberts met Davies and Roger Scantlebury from the NPL, who had published a paper at the conference titled A Digital Communications Network For Computers. As Roberts continued his planning at ARPA to build a wide area communications network, the terms "packet" and "packet switching" were taken from Davies's work.

In 1970, Davies helped build a packet switched network called the Mark I to serve the NPL. The Mark I had only a few nodes within the NPL, and operated at a speed of 768 kbps. It was replaced with an improved network called the Mark II in 1973, and remained in operation until 1986, but it never had the funding to develop on the scale of the ARPANET.

Resources. The following paper provides more information on the NPL:

Kirstein, Peter; Early Experiences with the ARPANET and INTERNET in the UK.

IPTO -- Information Processing Techniques Office



The IPTO funded the research that led to the development of the ARPANET. It was variously headed over the years by Licklider, Sutherland, Taylor, and Roberts as described below.

Licklider. In 1962, Jack Ruina, Director of ARPA, hired J.C.R. Licklider to be the first Director of the new Information Processing Techniques Office (IPTO). The original mandate of the office was to extend the research carried out into computerization of air defense by the SAGE program to other military command and control systems. In particular, the IPTO was to build on SAGE's development of one of the first wide area computer networks for the cross country radar defense system, and build a survivable electronic network to interconnect the key DoD sites at the Pentagon, Cheyenne Mountain, and SAC HQ.

The IPTO funded research into advanced computer and network technologies, and commissioned thirteen research groups to perform research into technologies related to human computer interaction and distributed systems. Each group was given a budget thirty to forty times as large as a normal research grant, complete discretion as to its use, and access to state-of-the-art technology, at the following institutions:

Carnegie-Mellon University

MIT

RAND Corporation

Stanford Research Institute

System Development Corporation

University of California at Berkeley, Santa Barbara, and Los Angeles

University of Southern California

University of Utah

In 1963, Licklider funded a research project through the IPTO, headed by Robert Fano at MIT, called Project MAC, which explored the potential for establishment of communities on time-sharing computers. The project monitored the interactions between a community of users using time-shared communication, and found that the technology encouraged the establishment of real, if somewhat unique, electronic relationships among people across distances. This example had a lasting effect on the IPTO and wider research community as a prototype of the benefits of widespread networking.

Licklider's vision of a universal network greatly influenced his successors at the IPTO, Ivan Sutherland, Bob Taylor, and MIT researcher Lawrence Roberts, and shaped the subsequent research that led to development of the Internet.

Sutherland. In July, 1964, Licklider went to work for IBM, and handed over directorship of the IPTO to its second Director, Ivan Sutherland, who had created the revolutionary Sketchpad program for storing computer displays in memory where they could be modified, processed, copied, and redrawn. Sutherland's program enabled the creation of the field of computer graphics that has led to all the graphical displays available today.

In 1965, Sutherland gave Lawrence Roberts at MIT an IPTO contract to further develop the technology of computer networking. Roberts and Thomas Marill then implemented the first packet exchange by dial-up telephone connection, between a TX-2 computer at MIT and a Q-32 computer in California.

Taylor. In 1966, Sutherland handed over directorship of the IPTO to its third Director, Robert Taylor, who had been powerfully influenced by Licklider, and was coincidentally (like Licklider) a researcher in psychoacoustics.

Taylor had three different terminals in his IPTO office to connect to three different research sites, and realized this architecture would severely limit his ability to scale access to multiple sites. He dreamed of connecting a single terminal to a network with access to multiple sites, and from his position in the Pentagon he could lobby for the funding to implement the vision.

In 1966, DARPA head Charlie Herzfeld promised Taylor a million dollars for the IPTO to build a distributed communications network if he could get it organized. Taylor was impressed by Roberts's work, and asked him to come on board to lead the effort. When Roberts resisted, Taylor asked Herzfeld to get the Director of Lincoln Labs to pressure Roberts to reconsider, which eventually caused Roberts to relent and join the IPTO as Chief Scientist in December, 1966.

When Roberts gave the report titled Resource Sharing Computer Networks describing the plan to build the ARPANET to Taylor on June 3, 1968, Taylor approved it 18 days later on June 21, leading directly to the ARPANET's creation fourteen months later.

Roberts. Three years later when the ARPANET was well on its way, Taylor handed over directorship of the IPTO to Roberts in September, 1969.

Licklider. When Roberts left ARPA to become CEO of Telenet, Licklider returned again as IPTO Director in October, 1973, to complete the organization's life cycle.

Leonard Kleinrock Helps Build The ARPANET

It all began with a comic book! At the age of 6, Leonard Kleinrock was reading a Superman comic at his apartment in Manhattan, when, in the centerfold, he found plans for building a crystal radio... Kleinrock built the crystal radio and was totally hooked when 'free' music came through the earphones -- no batteries, no power, all free! An engineer was born.

- Kleinrock, Leonard; The Birth of the Internet.

Leonard Kleinrock is one of the pioneers of digital network communications, and helped build the ARPANET.

Leonard Kleinrock received his BEE degree from CCNY in 1957, then went to MIT, where he was a Ph.D. classmate of Lawrence Roberts. Kleinrock published his first paper on digital network communications, Information Flow in Large Communication Nets, in the RLE Quarterly Progress Report, in July, 1961. He developed his ideas further in his 1963 Ph.D. thesis, and then published a comprehensive analytical treatment of digital networks in his book Communication Nets in 1964.
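
To give a flavor of that analytical style (an illustrative aside, not an excerpt from the book): store-and-forward links are commonly modeled as queues, and for the textbook M/M/1 queue the average time a packet spends at a link is 1/(mu - lambda), where mu is the link's service rate and lambda the packet arrival rate. The function name and the line speed and loads below are hypothetical:

    # Illustrative only: the classic M/M/1 average-delay formula, the kind
    # of queueing analysis applied to store-and-forward links. The numbers
    # below are hypothetical, not measurements from any real network.
    def mm1_delay(arrival_rate: float, service_rate: float) -> float:
        """Average time in system (queueing + service), T = 1/(mu - lambda)."""
        if arrival_rate >= service_rate:
            raise ValueError("unstable queue: arrivals must be slower than service")
        return 1.0 / (service_rate - arrival_rate)

    # A hypothetical 50 kbps line carrying 1000-bit packets can serve
    # mu = 50 packets/second; offer it lambda = 40 packets/second.
    print(mm1_delay(40.0, 50.0))  # 0.1 s average delay per packet
    print(mm1_delay(49.0, 50.0))  # 1.0 s -- delay blows up near saturation

The second call shows why this kind of analysis mattered: delay grows sharply as a link approaches saturation, which is exactly the behavior network designers needed to predict.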

After completing his thesis in 1962, Kleinrock moved to UCLA, and later established the Network Measurement Center (NMC), which he led, consisting of a group of graduate students working in the area of digital networks.

In 1966, Roberts joined the IPTO with a mandate to develop the ARPANET, and used Kleinrock's Communication Nets to help convince his colleagues that a wide area digital communication network was possible. In October, 1968, Roberts gave a contract to Kleinrock's NMC as the ideal group to perform ARPANET performance measurement and find areas for improvement.

On a historic day in early September, 1969, a team at Kleinrock's NMC connected one of their SDS Sigma 7 computers to an Interface Message Processor, thereby becoming the first node on the ARPANET, and the first computer ever on the Internet.

As the ARPANET grew in the early 1970s, Kleinrock's group stressed the system to work out the detailed design and performance issues involved with the world's first packet switched network, including routing, loading, deadlocks, and latency. The UCLA Netwatch program later performed functions similar to those of Kleinrock's Network Measurement Center from the ARPANET years, until it was outsourced to other organizations in 2003.

Kleinrock has continued to be active in the research community, and has published more than 200 papers and authored six books. In August, 1989, he organized and chaired a symposium commemorating the 20th anniversary of the ARPANET, which later produced the document RFC 1121, titled "Act One -- The Poems".

Kleinrock has also been active in federal policy making with the National Research Council's Computer Science and Telecommunications Board (CSTB) committee. He led the CSTB work in 1988 to lay out the framework for today's Gigabit Internet networks, and led the CSTB committee which produced the influential 1994 report Realizing the Information Future: The Internet and Beyond.

Kleinrock was a cofounder of the original Linkabit, and founder and chairman of Nomadix and the Technology Transfer Institute. He is a member of the National Academy of Engineering, an IEEE fellow, and an ACM fellow. He is the recipient of the CCNY Townsend Harris Medal, the CCNY Electrical Engineering Award, the Marconi Award, the L.M. Ericsson Prize, the UCLA Outstanding Teacher Award, the Lanchester Prize, the ACM SIGCOMM Award, the Sigma Xi Monie Ferst Award, the INFORMS Presidents Award, and the IEEE Harry Goode Award. He shared the Charles Stark Draper Prize for 2001 with Vinton Cerf, Robert Kahn, and Lawrence Roberts for their work on the ARPANET and Internet.

Lawrence Roberts Manages The ARPANET Program

Just as time-shared computer systems have permitted groups of hundreds of individual users to share hardware and software resources with one another, networks connecting dozens of such systems will permit resource sharing between thousands of users.

- Larry Roberts, ARPA Program Plan 723, 3 June 1968.


Lawrence (Larry) Roberts was the ARPANET program manager, and led the overall system design.

Roberts obtained his B.S., M.S., and Ph.D. degrees from MIT, and then joined the Lincoln Laboratory, where he carried out research into computer networks. In a pivotal meeting in November, 1964, Roberts met with J.C.R. Licklider, who inspired Roberts with his dream to build a wide area communications network.

In February, 1965, the director of the IPTO, Ivan Sutherland, gave a contract to Roberts to develop a computer network, and, in July, to Thomas Marill (who had also been inspired by Licklider) to program the network. In October, 1965, the Lincoln Labs TX-2 computer talked to Marill's Q-32 computer at SDC in one of the world's first digital network communications.

In October, 1966, Roberts and Marill published a paper titled Toward a Cooperative Network of Time-Shared Computers at the Fall AFIPS Conference, documenting their networking experiments.

Also in 1966, DARPA head Charlie Herzfeld promised IPTO Director Bob Taylor a million dollars to build a distributed communications network if he could get it organized. Taylor was greatly impressed by Lawrence Roberts's work, and asked him to come on board to lead the effort. Roberts resisted, but finally joined as ARPA IPTO Chief Scientist in December, 1966, when Taylor got Herzfeld to twist the arm of the head of Lincoln Lab to put pressure on Roberts. Roberts immediately started working on the system design for a wide area digital communications network that would come to be called the ARPANET.

In April, 1967, Roberts held an "ARPANET Design Session" at the IPTO Principal Investigator
meeting in Ann Arbor, Michigan. The standards for identification and authentication of users,