
Neural Networks

LaTonya Allen

Indiana State University

Project Advisors:

Dr. Henjin Chi

Dr. David Hutchison

Dr. Torsten Alvager

Dr. George Graham

Project Goals and Objectives

To help the audience understand what
artificial neural networks are, how to use
them, and where they are currently being
used.

To successfully simulate an artificial neural
network.

To demonstrate data compression in the
generated sequences of a recirculation
network.

Technologies Used

The technologies used for this project are
NeuralWorks Professional II/PLUS from NeuralWare
and Windows 98.

System Requirements:

Intel x86 Architecture

Windows 95/98/ME/NT/2000

Microsoft Excel 97/2000 (For MS Excel interface usage)

Pentium class processor

64MB memory

10MB disk

What is a Neural Network?

A cognitive information processing structure
based upon models of brain function.

An interconnected network of simple
processing elements (PEs). It is a powerful
data modeling tool that is able to capture and
represent complex input/output relationships.

Brief History of Neural Networks

The 1940’s

Warren McCulloch and Walter Pitts introduced the
first neural network computing model.


The 1950's

The Perceptron, the first artificial neural network, is invented
by Frank Rosenblatt.


The 1980's

John Hopfield presented a paper to the National
Academy of Sciences that focused on using neural networks not
simply to model brains but to create useful devices.

Late 1980’s

Researchers showed renewed interest in neural
networks. Studies included Boltzmann machines, Hopfield
networks, competitive learning models, multilayer networks, and
(ART) adaptive resonance theory models.

Neural Network Architecture

At its most basic level, a neural network
consists of several "layers" of neurons: an
input layer, one or more hidden layers, and an
output layer.

Each layer consists of one or more nodes,
represented by the small circles or dots. The
lines between the nodes indicate the flow of
information from one node to the next.
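The flow of information from layer to layer can be sketched as a simple forward pass. This is only a minimal illustration: the 2-2-1 shape, the sigmoid activation, and the weight values below are made up for the example, not taken from NeuralWorks.

```python
import math

def sigmoid(x):
    # Squashing activation applied at each processing element.
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, hidden_weights, output_weights):
    # Each hidden node sums its weighted inputs, then applies the activation.
    hidden = [sigmoid(sum(w * v for w, v in zip(ws, inputs)))
              for ws in hidden_weights]
    # The output layer repeats the same step on the hidden activations.
    return [sigmoid(sum(w * h for w, h in zip(ws, hidden)))
            for ws in output_weights]

# Hypothetical 2-input, 2-hidden, 1-output network with arbitrary weights.
hidden_w = [[0.5, -0.4], [0.3, 0.8]]
output_w = [[1.0, -1.0]]
print(forward([1.0, 0.0], hidden_w, output_w))
```

Each line between two nodes in the diagram corresponds to one weight in these lists; information only moves along those lines, from one layer to the next.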


A Simple Neural Network Structure

Creating a Network Using NeuralWorks

Title Screen for NeuralWorks Professional II/Plus Software

Creating a Back Propagation
Network Using NeuralWorks

About Back Propagation Networks

Back Propagation is an example of a feed-forward
network, which means that the data
flows only in a forward direction.

Deciding how the PEs in a network are
connected, how the PEs process their
information, and how the connection
strengths are modified all go into creating a
neural network.
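A single back propagation training step for such a feed-forward network can be sketched as follows. This is a generic sketch, not the NeuralWorks implementation: the sigmoid PEs, the 2-2-1 shape, the learning rate, and the weight values are all illustrative assumptions.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(x, target, w_hid, w_out, lr=0.5):
    # Forward pass: data flows strictly input -> hidden -> output.
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x))) for ws in w_hid]
    o = [sigmoid(sum(w * hi for w, hi in zip(ws, h))) for ws in w_out]
    # Backward pass: propagate the output error back through the layers.
    d_out = [(t - ok) * ok * (1 - ok) for t, ok in zip(target, o)]
    d_hid = [hj * (1 - hj) * sum(d * w_out[k][j] for k, d in enumerate(d_out))
             for j, hj in enumerate(h)]
    # Modify the connection strengths in proportion to the errors.
    for k, ws in enumerate(w_out):
        for j in range(len(ws)):
            ws[j] += lr * d_out[k] * h[j]
    for j, ws in enumerate(w_hid):
        for i in range(len(ws)):
            ws[i] += lr * d_hid[j] * x[i]
    return o

# Illustrative 2-2-1 network trained on a single pattern.
w_hid = [[0.2, -0.1], [0.4, 0.7]]
w_out = [[0.3, -0.5]]
for _ in range(100):
    out = train_step([1.0, 0.0], [1.0], w_hid, w_out)
print(out)
```

Repeating the step moves the output toward the desired target, which is exactly the "modifying connection strengths" part of building the network.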

Types of Neural Networks

Prediction
Network Type: Back Propagation, Delta Bar Delta, Extended Delta Bar Delta, Directed Random Search, Higher Order Neural Networks
Use for Network: Use input values to predict some output (e.g. pick the best stocks in the market, predict weather, identify people with cancer risks, etc.)

Classification
Network Type: Learning Vector Quantization, Counter Propagation, Probabilistic Neural Networks
Use for Network: Use input values to determine the classification (e.g. is the input the letter A; is the blob of video data a plane, and what kind of plane is it?)

Data Association
Network Type: Boltzmann Machine, Hamming Network, Bidirectional Associative Memory, Temporal Pattern Recognition
Use for Network: Like classification, but it also recognizes data that contains errors (e.g. not only identify the characters that were scanned but identify when the scanner isn't working properly)

Data Conceptualization
Network Type: Adaptive Resonance Network, Self Organizing Map
Use for Network: Analyze the inputs so that grouping relationships can be inferred (e.g. extract from a database the names of those most likely to buy a particular product)

Data Filtering
Network Type: Recirculation
Use for Network: Smooth an input signal (e.g. take the noise out of a telephone signal)

Building A Network


The RMS error plots the
error of the output.

The network weights show
the weights going into the
output layer.

The classification rate is the
percentage of correct
matches.
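The two numeric readouts above can be computed as follows. This is a generic sketch of the metrics rather than NeuralWare's exact code, and the 0.5 decision threshold is an assumption for the example.

```python
import math

def rms_error(targets, outputs):
    # Root-mean-square error of the network outputs over a set of examples.
    return math.sqrt(sum((t - o) ** 2 for t, o in zip(targets, outputs))
                     / len(targets))

def classification_rate(targets, outputs, threshold=0.5):
    # Fraction of outputs that fall on the same side of the threshold
    # as their targets, i.e. the percentage of correct matches.
    hits = sum((o >= threshold) == (t >= threshold)
               for t, o in zip(targets, outputs))
    return hits / len(targets)

print(rms_error([1.0, 0.0], [0.9, 0.2]))
print(classification_rate([1, 0, 1, 0], [0.9, 0.2, 0.4, 0.1]))
```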


How Do Neural Networks Learn?

Once a network has been structured for a particular
application, that network is ready to be trained. To
start this process the initial weights are chosen
randomly. Then, the training, or learning, begins.

There are two approaches to training: supervised
and unsupervised.

Supervised training involves a mechanism of providing the
network with the desired output either by manually
"grading" the network's performance or by providing the
desired outputs with the inputs.

Unsupervised training is where the network has to make
sense of the inputs without outside help.
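The two approaches can be contrasted in a small sketch. The supervised update below is a plain delta rule driven by a provided target, and the unsupervised one is a bare-bones competitive-learning step with no target at all; the learning rates and values are made up for illustration.

```python
def supervised_update(w, x, target, lr=0.1):
    # Supervised: the desired output is provided, and the difference
    # between it and the actual output drives the weight change.
    out = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + lr * (target - out) * xi for wi, xi in zip(w, x)]

def unsupervised_update(prototypes, x, lr=0.1):
    # Unsupervised: no desired output; the network makes sense of the
    # input itself by pulling the closest prototype toward it.
    def dist(p):
        return sum((pi - xi) ** 2 for pi, xi in zip(p, x))
    winner = min(range(len(prototypes)), key=lambda i: dist(prototypes[i]))
    prototypes[winner] = [pi + lr * (xi - pi)
                          for pi, xi in zip(prototypes[winner], x)]
    return winner

print(supervised_update([0.0, 0.0], [1.0, 0.0], 1.0))
```

Note that only the supervised function ever sees a `target`; the unsupervised one organizes its prototypes from the inputs alone.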

Training A Network

As each training
example is presented to
the network, the
network produces an
output.

Training Output for a Network

Results of Network


Creating a Recirculation
Network Using NeuralWorks

Building A Network


Training the Network

About Recirculation Networks

In a recirculation, or recurrent, network, the units in
the input and hidden layers are fully connected in
both directions. When the data for training the
system are presented at the input layer, they are
first filtered to the hidden layer through a set of
constant weight factors. The processed data are
then recirculated back and filtered to the input
level through a second set of constant weight
factors. Finally, the data are sent a second time to
the hidden layer through a third set of factors. The
learning occurs after the second pass through the
network.
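The multi-pass flow described above can be sketched as follows, in the spirit of Hinton and McClelland's recirculation algorithm. The linear units, layer sizes, weight values, and learning-rule details are all simplifying assumptions for illustration, not the NeuralWorks implementation.

```python
def recirculate(x, w_in_to_hid, w_hid_to_in, lr=0.02):
    # First pass: input layer -> hidden layer (first set of weights).
    h1 = [sum(w * v for w, v in zip(ws, x)) for ws in w_in_to_hid]
    # Recirculation: hidden -> input level (second set of weights).
    v2 = [sum(w * h for w, h in zip(ws, h1)) for ws in w_hid_to_in]
    # Second pass: reconstructed input -> hidden layer again.
    h2 = [sum(w * v for w, v in zip(ws, v2)) for ws in w_in_to_hid]
    # Learning after the second pass: pull the reconstruction toward the
    # original input, and the second hidden state toward the first.
    for j, ws in enumerate(w_hid_to_in):
        for k in range(len(ws)):
            ws[k] += lr * (x[j] - v2[j]) * h1[k]
    for j, ws in enumerate(w_in_to_hid):
        for k in range(len(ws)):
            ws[k] += lr * (h1[j] - h2[j]) * v2[k]
    return v2

# Three input units squeezed through two hidden units, so the hidden
# layer carries a compressed representation of the input.
x = [1.0, 0.0, 0.5]
w1 = [[0.1, 0.2, 0.3], [0.3, 0.1, 0.2]]
w2 = [[0.2, 0.1], [0.1, 0.3], [0.3, 0.2]]
for _ in range(300):
    v2 = recirculate(x, w1, w2)
print(v2)
```

Because the hidden layer is smaller than the input layer, a well-trained reconstruction demonstrates the data compression the project set out to show.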

Testing the Network


The results given by the recirculation tests were
inconclusive. However, from the learning set it was
clearly visible that the data were compressed
between the hidden and visible layers.
Further study is necessary.

Joint studies with the mathematics, computer
science, engineering, physics, and biology
departments can also aid in getting better results
and a general understanding of neural networks.

The Future of Neural Networks

Where are neural networks going?

We have only begun to scratch the surface in the
development and implementation of neural networks in
commercial applications. Because neural networks are
such a marketable technology, it is projected that there will
be a lot of development in this area in the years to come.

Currently, a great deal of research is going on in the field of
neural networks worldwide. This ranges from basic
research into new and more efficient learning algorithms, to
networks which can respond to varying patterns.

Thank you!