Neural Networks Architecture - IPM

AI and Robotics

Oct 19, 2013


Neural Networks

Baktash Babadi


Fall 2004

The Neuron Model

Architectures (1)

Feed Forward Networks

The neurons are arranged in
separate layers

There is no connection between
the neurons in the same layer

The neurons in one layer receive
inputs from the previous layer

The neurons in one layer deliver their
outputs to the next layer

The connections are unidirectional


Architectures (2)

Recurrent Networks

Some connections feed back
from a layer to
the previous layers

Architectures (3)

Associative networks

There is no hierarchical arrangement

The connections can be bidirectional

Why Feed Forward?

Why Recurrent/Associative?

An Example of Associative
Networks: Hopfield Network

John Hopfield (1982)

Associative memory via artificial neural networks

Solution for optimization problems

Statistical mechanics

Neurons in Hopfield Network

The neurons are binary units

They are either active (1) or passive (0)

Alternatively +1 or -1

The network contains N neurons


The state of the network is described as a
vector of 0s and 1s:

S = (s_1, s_2, ..., s_N), with s_i in {0, 1}

The architecture of the Hopfield network

The network is fully interconnected

All the neurons are connected to each other

The connections are bidirectional and symmetric

The setting of the weights depends on the
patterns to be stored
Updating the Hopfield Network

The state of the network changes at each time
step. There are four updating modes:

Asynchronous:

The state of a randomly chosen single neuron will be
updated at each time step

Sequential:

The state of a single neuron will be updated at each time
step, in a fixed sequence

Parallel Synchronous:

All the neurons will be updated at each time step

Parallel Asynchronous:

The neurons that are not in refractoriness will be updated at
the same time
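The two parallel modes contrast with single-neuron updating in how much of the state changes per step. A minimal sketch of the synchronous and single-neuron asynchronous modes, with an illustrative two-neuron weight matrix (the helper names and weights are assumptions, not from the slides):

```python
import numpy as np

def synchronous_update(s, W):
    """Parallel synchronous mode: all neurons update at once,
    each computing its input from the previous state vector."""
    h = W @ s                      # weighted input to every neuron
    return (h > 0).astype(int)     # active (1) if input positive, else passive (0)

def asynchronous_update(s, W, rng):
    """Asynchronous mode: one randomly chosen neuron updates per step."""
    i = rng.integers(len(s))
    s = s.copy()
    s[i] = 1 if W[i] @ s > 0 else 0
    return s

# Two neurons with mutual excitation (illustrative symmetric weights)
W2 = np.array([[0., 1.],
               [1., 0.]])
s0 = np.array([1, 0])
s1 = synchronous_update(s0, W2)    # both neurons read s0 simultaneously
```

Note that in the synchronous mode every neuron sees the old state, so the whole vector can change in one step; in the asynchronous mode at most one component changes.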

The Updating Rule (1)

Here we assume that updating is serial

Updating will be continued until a stable state is
reached

Each neuron receives a weighted sum of the inputs
from other neurons:

h_i = sum over j != i of w_ij s_j

If the input is positive the state of the neuron will
be 1, otherwise 0:

s_i = 1 if h_i > 0, otherwise s_i = 0
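The serial procedure above can be sketched as a loop that sweeps over the neurons and stops once a full sweep produces no change; the weight values below are illustrative, not from the slides:

```python
import numpy as np

def run_to_stability(s, W, max_sweeps=100):
    """Serial updating: visit neurons one at a time; stop when a full
    sweep changes nothing, i.e. a stable state has been reached."""
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(s)):
            new = 1 if W[i] @ s > 0 else 0   # the threshold rule from the slide
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            break
    return s

# Symmetric weights with no self-connections (illustrative values)
W = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])
stable = run_to_stability(np.array([1, 1, 0]), W)   # settles at [1, 1, 1]
```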

The Updating Rule (2)

Convergence of the Hopfield
Network (1)

Does the network eventually reach a stable
state (convergence)?

To evaluate this, an 'energy' value will be
associated with the network:

The system will have converged when the energy is
minimized
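The slide's energy formula did not survive extraction; the standard Hopfield energy for binary states s_i with symmetric weights w_ij (assuming no external thresholds) is:

```latex
E = -\frac{1}{2}\sum_{i}\sum_{j \ne i} w_{ij}\, s_i\, s_j
```

The factor 1/2 compensates for each symmetric pair (i, j) being counted twice in the double sum.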

Convergence of the Hopfield
Network (2)

Why energy?

An analogy with spin-glass models of
ferromagnetism (the Ising model):

The system is stable if the energy is minimized

Convergence of the Hopfield
Network (3)

Why convergence?

Convergence of the Hopfield
Network (4)

The changes of E with updating:

In each case the energy will decrease or remain constant, thus the system tends to
a stable state of minimum energy
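The slide's calculation was lost in extraction; with the energy above, a serial update of neuron i (input h_i = Σ_{j≠i} w_ij s_j) changes the energy by:

```latex
\Delta E = -\,\Delta s_i \sum_{j \ne i} w_{ij}\, s_j = -\,\Delta s_i\, h_i
```

If h_i > 0 the rule sets s_i = 1, so Δs_i ≥ 0; if h_i ≤ 0 it sets s_i = 0, so Δs_i ≤ 0. In both cases Δs_i and h_i have the same sign, hence ΔE ≤ 0 at every step.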


The Energy Function:

The energy function is similar to an
N-dimensional terrain (an energy landscape)

(Figure: energy landscape with a global minimum and two local minima)

Hopfield network as a model for
associative memory

Associative memory

Associates different features with each other







Recall with partial cues

Neural Network Model of
associative memory

Neurons are arranged like a grid:

Setting the weights

Each pattern can be denoted by a vector of
+1s and -1s:

xi^u = (xi_1^u, ..., xi_N^u), with xi_i^u in {-1, +1}

If the number of patterns is m then:

w_ij = (1/N) sum over u = 1..m of xi_i^u xi_j^u, with w_ii = 0

Hebbian Learning:

The neurons that fire together, wire together
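The Hebbian weight setting can be sketched as follows; the 1/N normalization is the standard convention, and the sample pattern is illustrative:

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian rule: w_ij grows when units i and j take the same value
    across the stored patterns ("fire together, wire together")."""
    P = np.asarray(patterns, dtype=float)   # shape (m, N), entries +1/-1
    N = P.shape[1]
    W = P.T @ P / N                         # (1/N) * sum of outer products
    np.fill_diagonal(W, 0.0)                # no self-connections
    return W

xi = np.array([1, -1, 1, -1])
W = hebbian_weights([xi])                   # symmetric, zero-diagonal matrix
```

Units that agree across a pattern (both +1 or both -1) get a positive weight; units that disagree get a negative one, which is what later pulls a corrupted state back toward the stored pattern.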

Limitations of Hopfield Associative Memory

1) The evoked pattern is sometimes not
necessarily the most similar pattern to the
input cue

2) Some patterns will be recalled more often than
others

3) Spurious states: stable states that are not among the
original patterns

The storage capacity is limited to about 0.15 N patterns
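Recall with a partial cue, as described earlier, can be demonstrated end to end; the pattern, the two flipped bits, and the use of deterministic sequential sweeps are illustrative choices, not from the slides:

```python
import numpy as np

def hebbian_weights(patterns):
    """w_ij = (1/N) * sum over patterns of xi_i * xi_j, zero diagonal."""
    P = np.asarray(patterns, dtype=float)
    W = P.T @ P / P.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, cue, sweeps=10):
    """Deterministic sequential updating with +1/-1 states."""
    s = cue.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s > 0 else -1
    return s

pattern = np.array([1, 1, -1, -1, 1, -1, 1, -1])
W = hebbian_weights([pattern])
cue = pattern.copy()
cue[0], cue[3] = -cue[0], -cue[3]    # corrupt two of the eight bits
restored = recall(W, cue)            # dynamics restore the stored pattern
```

With many stored patterns the same dynamics can instead land in a spurious state or a wrong pattern, which is exactly limitation 3 above.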

Hopfield network and the brain (1):

In a real neuron, synapses are distributed
along the dendritic tree and their distance
from the soma changes the synaptic weight

In the Hopfield network there is no dendritic
geometry

If the synapses are distributed uniformly, the geometry is
not important

In the brain, Dale's principle holds and
the connections are not symmetric

A Hopfield network with asymmetric
weights that obeys Dale's principle still works properly

Hopfield network and the brain (2):

The brain is insensitive to noise and to local
damage

The Hopfield network can tolerate noise in the
input and partial loss of synapses

Hopfield network and the brain (3):

In the brain, neurons are not binary
devices; they generate continuous values
of firing rates

A Hopfield network with a sigmoid transfer
function is even more powerful than the
binary version

Hopfield network and the brain (4):

In the brain most of the neurons are silent
or firing at low rates, but in the Hopfield
network many of the neurons are active

In a sparse Hopfield network the capacity is
even higher

Hopfield network and the brain (5):

In the Hopfield network updating is serial,
which is far from biological reality

With parallel updating, the
associative memories can be recalled as
well
Hopfield network and the brain (6):

When the number of learned patterns in the
Hopfield network exceeds its capacity, the
performance of the network falls
abruptly for all the stored patterns

But in the real brain an overload of memories
affects only some memories while the rest of
them remain intact

This is known as catastrophic interference

Hopfield network and the brain (7):

In the Hopfield network the useful information
appears only when the system is in a
stable state

The brain does not fall into stable states and
remains dynamic

Hopfield network and the brain (8):

The connectivity in the brain is much sparser
than in the Hopfield network

The diluted Hopfield network still works well

Hopfield network and the brain (9):