Neural Nets in Forecasting


Oct 19, 2013


Neural Networks and Their Applications

By Dr. Surya Chitra

OUTLINE


- Introduction & Software
- Basic Neural Network & Processing
- Software Exercise Problem/Project
- Complementary Technologies
  - Genetic Algorithms
  - Fuzzy Logic
- Examples of Applications
  - Manufacturing
  - R&D
  - Sales & Marketing
  - Financial

Introduction


"A computing system made up of a number of highly interconnected processing elements, which processes information by its dynamic state response to external inputs."

Dr. Robert Hecht-Nielsen

What is a Neural Network?

A parallel information processing system, based on the human nervous system, consisting of a large number of neurons that operate in parallel.

Biological Neuron & Its Function

Information is processed in the neuron cell body and transferred to the next neuron via the synaptic terminal.

Processing in Biological Neuron

Neurotransmitters carry information to the next neuron, where it is further processed in that neuron's cell body.

Artificial Neuron & Its Function

Biological neuron → artificial equivalent:

- Neuron → Processing Element
- Dendrites → Inputs
- Axon → Outputs
Processing Steps Inside a Neuron

Electronic Implementation

A processing element combines its inputs into a single value, then transforms that value into its output:

- Sum the inputs (combination functions: Sum, Min, Max, Mean, OR/AND)
- Add a bias weight
- Transform the result (transfer functions: Sigmoid, Hyperbolic Tangent, Sine, Linear)

Sigmoid Transfer Function

Transfer Function = 1 / (1 + e^(-sum))

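The transfer function above can be checked with a few lines of Python (a minimal sketch; the function name `sigmoid` is ours):

```python
import math

def sigmoid(weighted_sum):
    """Sigmoid transfer function: 1 / (1 + e^(-sum)), squashing any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-weighted_sum))

print(sigmoid(0.0))   # a summed input of 0 gives exactly 0.5
print(sigmoid(5.0))   # large positive sums approach 1
print(sigmoid(-5.0))  # large negative sums approach 0
```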

Basic Neural Network & Its Elements

- Input Neurons
- Hidden Neurons
- Output Neurons
- Bias Neurons
- Clustering of Neurons

Back-Propagation Network

Forward Output Flow

- A random set of weights is generated
- Send inputs to the neurons
- Each neuron computes its output:
  - Calculate the weighted sum:  I_j = Σ_i ( W_i,j-1 * X_i,j-1 ) + B_j
  - Transform the weighted sum:  X_j = f(I_j) = 1 / (1 + e^(-(I_j + T)))
- Repeat for all the neurons
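As a concrete illustration, the forward flow above can be sketched in a few lines of Python (the name `forward_layer` is ours, and the threshold T is folded into the bias for simplicity):

```python
import math
import random

def forward_layer(inputs, weights, biases):
    """One forward step: I_j = sum_i(W_ij * X_i) + B_j, then X_j = 1 / (1 + e^(-I_j))."""
    outputs = []
    for j, bias in enumerate(biases):
        i_j = sum(weights[i][j] * x for i, x in enumerate(inputs)) + bias
        outputs.append(1.0 / (1.0 + math.exp(-i_j)))
    return outputs

# Step 1: generate a random set of weights (2 inputs feeding 3 neurons).
random.seed(0)
weights = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
biases = [random.uniform(-1, 1) for _ in range(3)]

# Steps 2-3: send inputs in; each neuron computes its output.
print(forward_layer([0.5, -0.2], weights, biases))  # three values, each in (0, 1)
```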

Back-Propagation Network

Backward Error Propagation

- Errors are propagated backwards
- Update the network weights
- Gradient descent algorithm:

  ΔW_ji(n) = η * δ_j * X_i

  W_ji(n+1) = W_ji(n) + ΔW_ji(n)

- Add momentum for convergence:

  ΔW_ji(n) = η * δ_j * X_i + α * ΔW_ji(n-1)

Where:

  n = Iteration Number
  η = Learning Rate
  α = Rate of Momentum (0 to 1)
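The two update rules above translate directly into code (a sketch; η and α become `eta` and `alpha`, and the default values are our illustration):

```python
def update_weight(w, delta_j, x_i, prev_dw, eta=0.1, alpha=0.9):
    """Gradient-descent weight update with momentum:
       dW(n)  = eta * delta_j * x_i + alpha * dW(n-1)
       W(n+1) = W(n) + dW(n)
    Returns the new weight and dW(n), to be reused on the next iteration."""
    dw = eta * delta_j * x_i + alpha * prev_dw
    return w + dw, dw

# First iteration (no previous step): dW = 0.1 * 0.2 * 1.0, i.e. about 0.02
w, dw = update_weight(0.5, 0.2, 1.0, prev_dw=0.0)
print(w, dw)  # w near 0.52; the momentum term kicks in from the next step on
```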


Back-Propagation Network

Backward Error Propagation

- Gradient descent algorithm
- Minimization of mean squared error
- Shape of the error surface:
  - Complex
  - Multidimensional
  - Bowl-shaped
  - Hills and valleys
- Training by iterations
- Finding the global minimum is challenging
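Putting the forward and backward passes together, training by iterations looks roughly like this single-neuron sketch, which descends the mean-squared-error surface (the function name and the logical-OR task are our illustration, not from the slides):

```python
import math
import random

def train_neuron(samples, epochs=2000, eta=0.5):
    """Iteratively descend the error surface for one sigmoid neuron."""
    random.seed(1)
    w = [random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5)]
    b = 0.0
    for _ in range(epochs):
        for (x0, x1), target in samples:
            out = 1.0 / (1.0 + math.exp(-(w[0] * x0 + w[1] * x1 + b)))
            # Error gradient for squared error through a sigmoid output.
            delta = (target - out) * out * (1.0 - out)
            w[0] += eta * delta * x0
            w[1] += eta * delta * x1
            b += eta * delta
    return w, b

# Logical OR: linearly separable, so a single neuron converges.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_neuron(data)
for (x0, x1), target in data:
    out = 1.0 / (1.0 + math.exp(-(w[0] * x0 + w[1] * x1 + b)))
    print((x0, x1), round(out))  # rounded outputs match the OR target
```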

Simple Transfer Functions

[Diagram: Recurrent Neural Network — input units, bias unit, computation nodes, context units]

[Diagram: Time Delay Neural Network — input units, bias unit, computation nodes, higher-order units]

Training - Supervised

- Both inputs & outputs are provided
- Designer can manipulate:
  - Number of layers
  - Neurons per layer
  - Connections between layers
  - The summation & transform functions
  - Initial weights
- Rules of training:
  - Back propagation
  - Adaptive feedback algorithm

Training - Unsupervised

- Only inputs are provided
- The system has to figure out:
  - Self-organization
  - Adaptation to input changes/patterns
  - Grouping of neurons into fields
  - Topological order
- Based on the mammalian brain
- Rules of training:
  - Adaptive feedback algorithm (Kohonen)

Topology: map one space to another without changing the geometric configuration.
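A minimal sketch of the Kohonen idea, with only inputs, a winner-take-all step, and a small neighborhood, might look like this; the 1-D map, the parameter values, and all names are our simplification:

```python
import random

def train_som(data, n_units=5, epochs=50, eta=0.3):
    """1-D Kohonen map: with no target outputs, each input pulls its
    closest unit (and that unit's immediate neighbors) toward itself."""
    random.seed(0)
    units = [random.random() for _ in range(n_units)]
    for _ in range(epochs):
        for x in data:
            # Winner-take-all: the unit nearest the input wins.
            winner = min(range(n_units), key=lambda k: abs(units[k] - x))
            for k in range(n_units):
                if abs(k - winner) <= 1:  # neighborhood of radius 1
                    h = 1.0 if k == winner else 0.5
                    units[k] += eta * h * (x - units[k])
    return units

# Inputs fall into two clusters; the units self-organize to cover both.
data = [0.1, 0.12, 0.9, 0.88, 0.11, 0.91]
print(train_som(data))
```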

Traditional Computing Vs. NN Technology

CHARACTERISTIC   | TRADITIONAL COMPUTING | ARTIFICIAL NEURAL NETWORKS
PROCESSING STYLE | Sequential | Parallel
FUNCTIONS        | Logical, via rules, concepts, calculations | Mapping, via images, pictures, and controls
LEARNING METHOD  | By rules | By example
APPLICATIONS     | Accounting, word processing, communications, computing | Sensor processing, speech recognition, pattern recognition, text recognition

Traditional Computing Vs. NN Technology

CHARACTERISTIC  | TRADITIONAL COMPUTING | ARTIFICIAL NEURAL NETWORKS
PROCESSORS      | VLSI (traditional) | ANN; other technologies
APPROACH        | One rule at a time; sequential | Multiple processing; simultaneous
CONNECTIONS     | Externally programmable | Dynamically self-programmable
LEARNING        | Algorithmic | Adaptable continuously
FAULT TOLERANCE | None | Significant, via neurons
PROGRAMMING     | Rule-based | Self-learning
ABILITY TO TEST | Need big processors | Require multiple custom-built chips

HISTORY OF NEURAL NETWORKS

TIME PERIOD | Neural Network Activity
Early 1950s | IBM tries to simulate the human thought process; the effort fails while traditional computing progresses rapidly
1956        | Dartmouth research project on AI
1959        | Stanford: Bernard Widrow's ADALINE/MADALINE, the first NN applied to a real-world problem
1960s       | PERCEPTRON, by Cornell neurobiologist Rosenblatt
1982        | Hopfield (CalTech) models the brain for devices; Japanese 5th-generation computing effort
1985        | NN conference by IEEE, prompted by the Japanese threat
1989        | US Defense sponsors several projects
Today       | Several commercial applications; still processing limitations; chips (digital, analog, & optical)