Neural network architectures and learning algorithms



Author: Bogdan M. Wilamowski
Source: IEEE Industrial Electronics Magazine
Date: 2011/11/22
Presenter: 林哲緯

Outline

- Neural Architectures
- Parity-N Problem
- Suitable Architectures
- Use Minimum Network Size
- Conclusion

Neural Architectures

[Figures: Lecture Notes for E. Alpaydın, Introduction to Machine Learning, 2e, © 2010 The MIT Press]

Error back propagation (EBP) algorithm

Multilayer perceptron (MLP)

[Figure: Lecture Notes for E. Alpaydın, Introduction to Machine Learning, 2e, © 2010 The MIT Press]
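For reference, EBP adjusts each weight by steepest descent on the training error. A standard form of the update (notation mine, not taken from the slides) is

\Delta w = -\alpha \frac{\partial E}{\partial w}

where E is the total error and \alpha is the learning constant.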

Multilayer perceptron (MLP)

[Figure (Wilamowski, B.M.): MLP-type architecture 3-3-4-1 (without connections across layers)]

Neuron by neuron (NBN) algorithm

Bridged multilayer perceptron (BMLP)

Fully connected cascade (FCC)

[Figure (Wilamowski, B.M.): arbitrarily connected network]

Neuron by neuron (NBN) algorithm

- Levenberg-Marquardt (LM) algorithm: an improved method for nonlinear least-squares optimization (update rule sketched below)
- Forward & backward computation: Jacobian matrix
- Forward-only computation
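NBN is built on the standard LM update; as a reference sketch (notation mine, not from the slides), each iteration computes

w_{k+1} = w_k - (J^T J + \mu I)^{-1} J^T e

where J is the Jacobian of the error vector e with respect to the weights and \mu is the damping parameter: small \mu approaches Gauss-Newton steps, large \mu approaches small gradient-descent steps.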

Bridged multilayer perceptron (BMLP)

[Figure (Wilamowski, B.M.): BMLP architecture 3=3=4=1 (with connections across layers marked by dotted lines)]

Fully connected cascade (FCC)

[Figure (Wilamowski, B.M.): bipolar neural network for the parity-8 problem in an FCC architecture]


Parity-8 problem

- MLP: 8*9 + 9 = 81 weights
- BMLP: 4*9 + 8 + 4 + 1 = 49 weights
- FCC: 9 + 10 + 11 + 12 = 42 weights (a count check is sketched below)

[Figures (Wilamowski, B.M.)]
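A minimal Python sketch (function names mine) that reproduces the weight counts above, assuming every neuron has one bias weight and, in the FCC case, each neuron also receives the outputs of all earlier neurons:

def mlp_weights(n_inputs, hidden, outputs=1):
    # single hidden layer, no connections across layers; +1 per neuron for bias
    return hidden * (n_inputs + 1) + outputs * (hidden + 1)

def bmlp_weights(n_inputs, hidden, outputs=1):
    # one bridged hidden layer: the output neuron also sees the inputs directly
    return hidden * (n_inputs + 1) + outputs * (n_inputs + hidden + 1)

def fcc_weights(n_inputs, neurons):
    # neuron k receives the inputs, all k previous neuron outputs, and a bias
    return sum(n_inputs + k + 1 for k in range(neurons))

print(mlp_weights(8, 8))   # 81
print(bmlp_weights(8, 4))  # 49
print(fcc_weights(8, 4))   # 42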

Parity-17 problem

- MLP architecture needs 18 neurons
- BMLP architecture with connections across hidden layers needs 9 neurons
- FCC architecture needs only 5 neurons

Parity-N problem

- MLP architectures
- BMLP architectures
- FCC architectures

(nn = number of neurons, nw = number of weights; a neuron-count sketch follows below)

[Figure (Wilamowski, B.M.): formulas for nn and nw of each architecture]
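The formulas on this slide appear only as figures, so they are not reproduced here. The following Python sketch (function names and formulas mine, chosen so that they match the parity-8 and parity-17 counts above) gives the minimum number of neurons for each architecture:

import math

def mlp_neurons(N):
    # single-hidden-layer MLP: N hidden neurons plus one output neuron
    return N + 1

def bmlp_neurons(N):
    # BMLP with one bridged hidden layer: ceil((N - 1) / 2) hidden neurons plus one output
    return math.ceil((N - 1) / 2) + 1

def fcc_neurons(N):
    # FCC: a cascade of nn neurons can handle up to parity-(2**nn - 1)
    return math.ceil(math.log2(N + 1))

for N in (8, 17):
    print(N, mlp_neurons(N), bmlp_neurons(N), fcc_neurons(N))
# parity-8  -> MLP 9,  BMLP 5, FCC 4
# parity-17 -> MLP 18, BMLP 9, FCC 5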


Suitable architectures

For a limited number of neurons, FCC neural networks are the most powerful architectures, but this does not mean that they are the only suitable architectures.

Suitable architectures

With the two weights marked by red dotted lines, the signal has to be propagated through fewer layers.

[Figure (Wilamowski, B.M.)]


Use Minimum Network Size

The goal is to receive a close-to-optimum answer for all patterns that were never used in training, i.e., good generalization ability.

Case Study

[Figure (Wilamowski, B.M.): TSK fuzzy controller: (a) required control surface; (b) 8*6 = 48 defuzzification rules]

[Figure (Wilamowski, B.M.): TSK fuzzy controller: (a) trapezoidal membership functions; (b) triangular membership functions]

Case Study

[Figures (Wilamowski, B.M.)]
- 3 neurons in cascade (12 weights), training error = 0.21049
- 4 neurons in cascade (18 weights), training error = 0.049061
- 5 neurons in cascade (25 weights), training error = 0.023973
- 8 neurons in cascade (52 weights), training error = 1.118E-005

Time complexity

The NBN algorithm can train neural networks 1,000 times faster than the EBP algorithm.

[Figure (Wilamowski, B.M.)]
(a) EBP algorithm: average solution time 4.2 s, average 4188.3 iterations
(b) NBN algorithm: average solution time 2.4 ms, average 5.73 iterations

Two-spiral problem

[Figure (Wilamowski, B.M.)]
- NBN algorithm using FCC architecture: 244 iterations and 0.913 s
- EBP algorithm using FCC architecture: 308,225 iterations and 342.7 s


Conclusions

- FCC or BMLP architectures are not only more powerful but also easier to train.
- Use networks with a minimum number of neurons.
- NBN has to invert an nw * nw matrix (roughly O(nw^3) work per iteration), which currently limits it to networks of about 500 weights.