Ubiquitous Computing Lab


Soft Computing

Colloquium 2

Selection of neural networks,

Hybrid neural networks.


14.11.2005


Objectives


Why are there so many models of neural
networks (NNs)?


Classes of tasks and classes of NNs


Hybrid neural networks


Hybrid model based on MLP and ART-2


Paths to improvement of neural networks


Submit your questions for discussion




Paths to improvement of neural networks:


Development of growing neural networks with
feedback and delays


Development of the theory of spiking neurons and
building of associative memory based on them


Development of neural networks in which,
during learning, logical (verbal) inference
would emerge from associative memory


Why are there so many models of neural
networks (NNs)?

Models of neural networks simulate separate
aspects of the working of the brain (e.g.
associative memory), but how the brain works
as a whole is unknown to us.

Questions:

1) What is consciousness?

2) What is the role of emotions?

3) How are different areas of the brain
coordinated?

4) How are associative links transformed and
used in logical inference and calculations?


Classes of tasks:


prediction


classification


data association


data conceptualization


data filtering


Neuromathematics



Classes of Neural Networks:


Multi-layer networks


Multi-Layer Perceptron (MLP):
supervised learning


Radial Basis Functions (RBF networks):
supervised learning


Recurrent neural networks (Elman, Jordan):
supervised learning, reinforcement learning


Counterpropagation network:
supervised learning


One-layer networks


Self-organizing map (SOM):
unsupervised learning


Adaptive Resonance Theory (ART):
unsupervised learning


Hamming network:
supervised learning


Fully interconnected networks


Hopfield network:
supervised learning


Boltzmann machine:
supervised learning


Bi-directional associative memory:
supervised learning


Spiking networks:
supervised, unsupervised, and reinforcement learning



Counterpropagation network


Network Selector Table

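The selector table itself was shown as an image and is not preserved here. As a rough stand-in, a commonly cited pairing of the task classes listed earlier with suitable networks could look as follows; this mapping is an illustrative assumption, not the slide's exact table:

```python
# Hypothetical reconstruction of a network-selector mapping. The actual table
# on the slide is an image, so this pairing is an illustrative assumption.
network_selector = {
    "prediction": ["MLP", "recurrent networks (Elman, Jordan)"],
    "classification": ["MLP", "RBF networks", "counterpropagation"],
    "data association": ["Hopfield network", "bi-directional associative memory"],
    "data conceptualization": ["ART", "SOM"],
    "data filtering": ["SOM"],
}

for task, nets in network_selector.items():
    print(f"{task}: {', '.join(nets)}")
```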

Hybrid Neural Networks


Include:


Main neural network


Other neural network(s)


Preprocessing


Postprocessing


Some models of neural networks consist of
several layers working in different manners, and so
such neural networks may be viewed as hybrid
neural networks (composed of more elementary
networks).


Some authors call hybrid neural networks those
models which combine the paradigms of neural
networks and knowledge engineering.



Hybrid Neural Network based on models of
Multi-Layer Perceptron and Adaptive
Resonance Theory (A. Gavrilov, 2005)


Aims to keep the capabilities of ART
(plasticity and stability)


Includes in ART the capability of an MLP,
during learning, to obtain complex secondary
features from primary features (to
approximate any function)


Disadvantages of the ART-2 model for recognition of images


It uses a metric over the primary features of
images to recognize a class or to create a
new class,


Transformations of graphic images (shift,
rotation, or others) essentially influence
the distance between input vectors


So it is unsuitable for the control system of a
mobile robot
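The sensitivity to shifts can be seen with a tiny numeric example (not from the slides): the Euclidean distance between a binary image and a shifted copy of itself can equal the distance to a genuinely different pattern.

```python
import numpy as np

img = np.zeros((10, 10))
img[4, 2:5] = 1.0                  # a short horizontal bar (3 pixels)

shifted = np.roll(img, 4, axis=1)  # the same bar, shifted 4 pixels right
other = np.zeros((10, 10))
other[1, 2:5] = 1.0                # a similar bar in a different row

d_shift = float(np.linalg.norm(img - shifted))
d_other = float(np.linalg.norm(img - other))
print(d_shift, d_other)            # both equal sqrt(6): the shifted image looks
                                   # as "different" as a truly different image
```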


Architecture of the hybrid neural network

(diagram) The input vector x1 … xn enters the input layer (input variables)
of the perceptron; a hidden layer of the perceptron follows; the output layer
of the perceptron serves simultaneously as the input layer of ART-2; the
output layer of ART-2 (clusters) produces the output vector y1 … ym.
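The forward pass through this architecture can be sketched as below. This is a minimal illustration, not the author's code: the layer sizes n, h, m are hypothetical, and the rational sigmoid is assumed to be f(x) = x / (a + |x|).

```python
import numpy as np

rng = np.random.default_rng(0)
n, h, m = 16, 8, 4            # input, hidden, perceptron-output sizes (illustrative)

# Perceptron weights: input -> hidden -> output layer of the perceptron,
# whose output layer is simultaneously the input layer of ART-2.
W1 = rng.normal(scale=0.1, size=(h, n))
W2 = rng.normal(scale=0.1, size=(m, h))

def rational_sigmoid(x, a=1.0):
    # Assumed form of the rational sigmoid: f(x) = x / (a + |x|)
    return x / (a + np.abs(x))

def perceptron_forward(x):
    hidden = rational_sigmoid(W1 @ x)
    return rational_sigmoid(W2 @ hidden)   # secondary features fed to ART-2

clusters = []                  # ART-2 weight vectors, one per cluster neuron

def nearest_cluster(v):
    # Return (index, distance) of the closest cluster prototype, or (None, inf).
    if not clusters:
        return None, float("inf")
    d = [np.linalg.norm(v - c) for c in clusters]
    i = int(np.argmin(d))
    return i, d[i]

x = rng.normal(size=n)
v = perceptron_forward(x)
idx, dist = nearest_cluster(v)
print(idx, dist)               # no clusters formed yet: prints None inf
```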


Algorithm of learning without a
teacher


Set the initial weights of the neurons; N_out := 0;


Input an image-example and calculate the outputs of the
perceptron;


If N_out = 0, then form a new cluster (output neuron);


If N_out > 0, then calculate the distances between the
weight vectors of ART-2 and the output vector of the
perceptron, select the minimum of them (selection of the
winner output neuron) and decide whether or not to create a
new cluster;


If a new cluster is not created, then calculate new
values of the weights of the winner output neuron and
calculate new weights of the perceptron with the
error back-propagation algorithm.
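The steps above can be sketched roughly as follows. This is an illustrative reconstruction under stated simplifications, not the author's implementation: the perceptron is reduced to a single layer, the vigilance test uses a fixed radius R, and the winner update is a simple moving average instead of full back-propagation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 16, 4                 # input and perceptron-output sizes (illustrative)
W = rng.normal(scale=0.1, size=(m, n))   # single-layer "perceptron" for brevity
R = 1.0                      # cluster radius (vigilance threshold), assumed fixed
lr = 0.1
clusters = []                # ART-2 weight vectors; N_out = len(clusters)

def forward(x):
    v = W @ x
    return v / (1.0 + np.abs(v))         # rational sigmoid with a = 1

def learn(x):
    """One presentation of an image-example."""
    v = forward(x)
    if not clusters:                     # N_out = 0: form the first cluster
        clusters.append(v.copy())
        return 0
    d = [np.linalg.norm(v - c) for c in clusters]
    win = int(np.argmin(d))
    if d[win] > R:                       # too far from every cluster: new one
        clusters.append(v.copy())
        return len(clusters) - 1
    # Otherwise move the winner toward v; a full implementation would also
    # update the perceptron weights by error back-propagation here.
    clusters[win] += lr * (v - clusters[win])
    return win

for _ in range(5):
    learn(rng.normal(size=n))
print(len(clusters))
```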




The illustration of the algorithm

(figure: example points 1–5 in feature space and a cluster radius R1)


Images and parameters used in
experiments

Number of input neurons (pixels): 10000 (100×100),

Number of neurons in the hidden layer of the perceptron: 20,

Number of output neurons of the perceptron (in the input layer of ART-2),
N_out: 10,

The radius of a cluster R was used in the experiments in different manners:

1) adapted and then fixed,

2) calculated for every image by the formula R = S/(2*N_out),
where S is the average input signal and N_out is the number of output
neurons of the perceptron,

3) calculated as R = 2*D_min,
where D_min is the minimal distance between the input vector of ART-2 and
the weight vectors for the previous image.

The activation function of the neurons of the perceptron is the rational sigmoid with parameter a = 1,

The learning step of the perceptron is 1,

The number of iterations of recalculation of the weights of the perceptron is from 1 to 10.
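The three radius schemes can be written out as a small sketch (illustrative only; the fixed radius R0 and the distance d_min are hypothetical given values here):

```python
import numpy as np

N_out = 10                          # output neurons of the perceptron

def radius_fixed(R0=0.5):
    # Scheme 1: the radius is adapted during early learning, then fixed
    # (here just a hypothetical constant).
    return R0

def radius_from_signal(x):
    # Scheme 2: R = S / (2 * N_out), where S is the average input signal.
    S = float(np.mean(x))
    return S / (2 * N_out)

def radius_from_distance(d_min):
    # Scheme 3: R = 2 * D_min, where D_min is the minimal distance between
    # the ART-2 input vector and the weight vectors for the previous image.
    return 2.0 * d_min

x = np.full(10000, 0.8)             # a flat 100x100 image with intensity 0.8
print(radius_from_signal(x))        # 0.8 / 20 = 0.04
print(radius_from_distance(0.3))    # 0.6
```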




Series of images 1



Program for experiments


For the sequence of images of series 1, 2, 1, 2 (dark
points correspond to the 2nd kind of calculation
of the vigilance, light points to the 1st one).


For the sequence of images of series 1 at different
numbers of iterations of the EBP algorithm: 1, 3, 5, 7, 9.



Paths to improvement of neural networks


Development of growing neural networks
with feedback and delays


Development of the theory of spiking neural
networks and building of associative
memory based on them


Development of neural networks in which,
during learning, logical (verbal) inference
would emerge from associative
memory