Pattern Recognition:
Statistical and Neural
Lonnie C. Ludeman
Lecture 21
Oct 28, 2005
Nanjing University of Science & Technology
Lecture 21 Topics
1. Example: Analysis of a simple Neural Network
2. Example: Synthesis of special forms of Artificial Neural Networks
3. General concepts of Training an Artificial Neural Network: supervised and unsupervised training, and training sets
4. Neural Networks Nomenclature and Notation
5. Derivation and Description of the Backpropagation Algorithm for Feedforward Neural Networks
Example: Analyze the following Neural Network
[Figure: the network diagram, with its weights and thresholds, is given on the slide.]
Solution: Outputs of layer 1 ANEs
Output of layer 2 ANE is given by the expression on the slide. Thus from layer 1 we have a two-case condition: one output when the layer-2 net input is ≥ 0 and another when it is < 0.
Final Solution: Output Function for the Given Neural Network
Example: Synthesize a Neural Network
Given the following decision regions, build a neural network to perform the classification process.
Solution: Use a Hyperplane / AND / OR structure.
Each g_k(x) specifies a hyperplane boundary.
Solution: a Hyperplane Layer, an AND Layer, and an OR Layer, with all nonlinearities f(·) = μ(·), as sketched below.
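To make the Hyperplane / AND / OR construction concrete, here is a minimal sketch of such a network. The particular hyperplanes and region groupings passed in are hypothetical placeholders, since the slide's decision regions are shown only in a figure; the function and parameter names are likewise assumptions made for this example.

```python
import numpy as np

def step(v):
    """Unit step mu(.): 1 where v >= 0, else 0."""
    return (np.asarray(v) >= 0).astype(float)

def hyperplane_and_or(x, G, and_groups, or_groups):
    """Sketch of a Hyperplane / AND / OR network for region classification.

    G          : list of (w, b) pairs; g_k(x) = w.x + b defines hyperplane k
    and_groups : for each convex region, the (hyperplane index, side) pairs
                 that must all be satisfied
    or_groups  : for each class, the indices of the AND units whose union
                 forms that class's decision region
    """
    # hyperplane layer: one step unit per boundary g_k(x)
    h = np.array([step(np.dot(w, x) + b) for w, b in G])

    # AND layer: a unit fires only when every required half-space condition holds
    a = []
    for group in and_groups:
        conds = [h[k] if side > 0 else 1 - h[k] for k, side in group]
        a.append(step(sum(conds) - len(conds)))

    # OR layer: a unit fires when any of its AND units fires
    return [float(step(sum(a[i] for i in idx) - 0.5)) for idx in or_groups]
```

Each OR-layer output indicates whether x falls in the corresponding class's region.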
Training a Neural Network
“With a teacher”
“Without a teacher”
Training Set: x_j are the training samples and d_j is the class assigned to training sample x_j.
Example of a training set:
{ (x_1 = [0, 1, 2]^T, d_1 = C_1), (x_2 = [0, 1, 0]^T, d_2 = C_1), (x_3 = [0, 1, 1]^T, d_3 = C_1),
  (x_4 = [1, 0, 2]^T, d_4 = C_2), (x_5 = [1, 0, 3]^T, d_5 = C_2),
  (x_6 = [0, 0, 1]^T, d_6 = C_3), (x_7 = [0, 0, 2]^T, d_7 = C_3), (x_8 = [0, 0, 3]^T, d_8 = C_3), (x_9 = [0, 0, 3]^T, d_9 = C_3),
  (x_10 = [1, 1, 0]^T, d_10 = C_4), (x_11 = [2, 2, 0]^T, d_11 = C_4),
  (x_12 = [2, 2, 2]^T, d_12 = C_5),
  (x_13 = [3, 2, 2]^T, d_13 = C_6) }
General Weight Update Algorithm
x(k) is the training sample for the kth iteration.
d(k) is the class assigned to training sample x(k).
y(k) is the output vector for the kth training sample.
Training with a Teacher (Supervised)
1. Given a set of N ordered samples with their known class assignments.
2. Randomly select all weights in the neural network.
3. For each successive sample in the total set of samples, evaluate the output.
4. Use these outputs and the input sample to update the weights.
5. Stop at some predetermined number of iterations or when a given performance measure is satisfied. If not stopped, go to step 3 (a minimal code sketch of this loop is given below).
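As an illustration of steps 1-5, the following is a minimal sketch of a supervised training loop. The perceptron-style correction used in step 4, the two-class labels, and the names (train_supervised, eta, n_epochs) are assumptions made for this example; the slides do not fix a particular update rule.

```python
import numpy as np

def train_supervised(samples, labels, n_epochs=100, eta=0.1):
    """Minimal sketch of training "with a teacher".

    samples : (N, d) array of training vectors x_j
    labels  : (N,) array of desired outputs d_j in {+1, -1}
    """
    n, d = samples.shape
    w = np.random.uniform(-0.5, 0.5, size=d)        # step 2: random initial weights
    for epoch in range(n_epochs):                   # step 5: fixed iteration budget
        errors = 0
        for x, target in zip(samples, labels):      # step 3: evaluate each sample
            y = np.sign(w @ x)
            if y != target:                         # step 4: update from output and input
                w += eta * target * x
                errors += 1
        if errors == 0:                             # performance measure satisfied
            break
    return w
```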
Training without a Teacher (Unsupervised)
1. Given a set of N ordered samples with unknown class assignments.
2. Randomly select all weights in the neural network.
3. For each successive sample in the total set of samples, evaluate the outputs.
4. Use these outputs and the inputs to update the weights.
5. If the weights do not change significantly, stop with that result. If the weights change, return to step 3 (a minimal code sketch is given below).
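For illustration, here is a minimal sketch of an unsupervised loop using a simple winner-take-all (competitive) update; this specific rule, the stopping tolerance tol, and all names are assumptions made for the example, since the slides do not prescribe a particular unsupervised rule.

```python
import numpy as np

def train_unsupervised(samples, n_units=3, n_epochs=100, eta=0.05, tol=1e-4):
    """Minimal sketch of training "without a teacher" via competitive learning."""
    n, d = samples.shape
    w = np.random.uniform(-0.5, 0.5, size=(n_units, d))     # step 2: random weights
    for epoch in range(n_epochs):
        w_old = w.copy()
        for x in samples:                                    # step 3: visit each sample
            winner = np.argmin(np.linalg.norm(w - x, axis=1))
            w[winner] += eta * (x - w[winner])               # step 4: move winner toward x
        if np.max(np.abs(w - w_old)) < tol:                  # step 5: weights stable, stop
            break
    return w
```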
Supervised Training of a
Feedforward Neural Network
Nomenclature
[Figure: a feedforward network labeling the output vector of layer m, the output vector of layer L, and the node numbering within layer m and layer L.]
Weight Matrix for layer m: one row of weights per node (Node 1, Node 2, ..., Node N_m), where N_m is the number of nodes in layer m.
Layers, Nets, Outputs, Nonlinearities
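To make the layer/net/output notation concrete, here is a minimal sketch of the layer-by-layer forward computation. The sigmoid choice of f(·) and the names are assumptions for the example; the slides keep the nonlinearity general.

```python
import numpy as np

def sigmoid(net):
    """One common choice of nonlinearity f(.)."""
    return 1.0 / (1.0 + np.exp(-net))

def forward(x, weights):
    """Compute each layer's output vector for a feedforward network.

    weights : list [W_1, ..., W_L]; row k of W_m holds the weights of node k
              in layer m, so net_m = W_m @ y_{m-1} and y_m = f(net_m).
    """
    outputs = []
    y = x
    for W in weights:
        net = W @ y          # net vector of layer m
        y = sigmoid(net)     # output vector of layer m
        outputs.append(y)
    return outputs           # [y_1, ..., y_L]
```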
Define the performance E_p for sample x(p) as [the error expression given on the slide].
We wish to select the weights so that E_p is minimized: use a gradient algorithm.
Gradient Algorithm for Updating the Weights
At iteration p, the weight vector w(p) is adjusted using the training sample x(p), in the direction of the negative gradient of E_p (the update equation is given on the slide).
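For reference, the standard form of such a gradient update is written out below; the learning-rate symbol η is an assumption introduced here, since the slide's own equation is not in the extracted text.

\[
  w(p+1) \;=\; w(p) \;-\; \eta \,\nabla_{w} E_p \Big|_{\,w = w(p)}
\]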
Derivation of the weight update equation for the Last Layer (Rule #1), Backpropagation Algorithm
The partial derivative of y_m^(L) with respect to w_kj^(L) is derived on the slide.
General Rule #1 for Weight Update
Therefore, the weight update for the last layer follows (the expression is given on the slide).
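The standard statement of this output-layer ("Rule #1") update is given below for reference; the learning rate η, the local error symbol δ, and the indexing convention are assumptions here, since the slide's equation is an image.

\[
  w_{kj}^{(L)}(p+1) \;=\; w_{kj}^{(L)}(p) + \eta\,\delta_k^{(L)}\,y_j^{(L-1)},
  \qquad
  \delta_k^{(L)} \;=\; \bigl(d_k - y_k^{(L)}\bigr)\, f'\!\bigl(\mathrm{net}_k^{(L)}\bigr)
\]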
Derivation of the weight update equation for the Next-to-Last Layer (L - 1), Backpropagation Algorithm
General Rule #2 for Weight Update, Layer L - 1, Backpropagation Algorithm
Therefore, the weight correction for the layer L - 1 weights is as follows.
The weight correction (general Rule #2) for w^(L-1) is given on the slide.
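The standard form of this hidden-layer ("Rule #2") correction is given below for reference; as with Rule #1, η, δ, and the indexing are conventional assumptions rather than the slide's own notation.

\[
  w_{ji}^{(L-1)}(p+1) \;=\; w_{ji}^{(L-1)}(p) + \eta\,\delta_j^{(L-1)}\,y_i^{(L-2)},
  \qquad
  \delta_j^{(L-1)} \;=\; f'\!\bigl(\mathrm{net}_j^{(L-1)}\bigr) \sum_{k}\delta_k^{(L)}\,w_{kj}^{(L)}
\]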
Backpropagation Training Algorithm for Feedforward Neural Networks
Input pattern sample x_k
Calculate Outputs: First Layer
Calculate Outputs: Second Layer
Calculate Outputs: Last Layer
Check Performance
The error over all N_s samples is

\[
  E_{\mathrm{TOTAL}}(p) \;=\; \tfrac{1}{2} \sum_{i=0}^{N_s - 1} \Bigl( d[\,x(p-i)\,] \;-\; f\bigl(w^{T}(p-i)\,x(p-i)\bigr) \Bigr)^{2}
\]

and it can be computed recursively from the single-sample errors E_p:

\[
  E_{\mathrm{TOTAL}}(p+1) \;=\; E_{\mathrm{TOTAL}}(p) \;+\; E_{p+1}(p+1) \;-\; E_{p-N_s}(p - N_s)
\]
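A minimal sketch of this recursive bookkeeping follows; the deque-based buffer and the names (make_error_tracker, window_Ns) are assumptions made for the example.

```python
from collections import deque

def make_error_tracker(window_Ns):
    """Maintain E_TOTAL over the last Ns single-sample errors recursively:
    E_TOTAL(p+1) = E_TOTAL(p) + E_{p+1} - E_{p-Ns}."""
    buffer = deque(maxlen=window_Ns)    # the last Ns single-sample errors
    total = 0.0

    def update(single_sample_error):
        nonlocal total
        if len(buffer) == buffer.maxlen:
            total -= buffer[0]          # drop the oldest error E_{p-Ns}
        buffer.append(single_sample_error)
        total += single_sample_error    # add the newest error E_{p+1}
        return total

    return update
```

Calling update(E_p) after each new sample keeps the windowed total error without resumming all N_s terms.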
Change Weights Last
Layer using Rule #1
Change Weights previous
Layer using Rule #2
Change Weights previous
Layer using Modified Rule #2
Input pattern sample x_{k+1}
Continue iterations until:
Repeat the process until the performance measure is satisfied or the maximum number of iterations is reached.
If the performance measure is not satisfied at the maximum number of iterations, the algorithm stops and NO design is obtained.
If the performance measure is satisfied, then the current weights and structure provide the required design.
Freeze Weights to get Acceptable
Neural Net Design
Backpropagation Algorithm for Training Feedforward Artificial Neural Networks
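Pulling the preceding steps together (forward pass, Rule #1 at the last layer, Rule #2 at earlier layers), here is a minimal end-to-end sketch of backpropagation training. The sigmoid nonlinearity, the squared-error measure, the learning rate eta, and all names are illustrative assumptions rather than the slides' exact formulation.

```python
import numpy as np

def sigmoid(net):
    return 1.0 / (1.0 + np.exp(-net))

def backprop_train(samples, targets, layer_sizes, eta=0.1, n_epochs=1000):
    """Sketch of backpropagation for a feedforward net with sigmoid units.

    samples     : (N, d) array of input vectors
    targets     : (N, c) array of desired output vectors
    layer_sizes : e.g. [d, n_hidden, c]
    """
    rng = np.random.default_rng(0)
    W = [rng.uniform(-0.5, 0.5, size=(n_out, n_in))
         for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]

    for _ in range(n_epochs):
        for x, d_vec in zip(samples, targets):
            # forward pass: compute and store each layer's output
            ys = [x]
            for Wm in W:
                ys.append(sigmoid(Wm @ ys[-1]))

            # backward pass: Rule #1 delta at the output layer ...
            delta = (d_vec - ys[-1]) * ys[-1] * (1.0 - ys[-1])
            for m in range(len(W) - 1, -1, -1):
                W_old = W[m].copy()
                W[m] += eta * np.outer(delta, ys[m])                   # weight correction
                if m > 0:                                              # ... Rule #2 for earlier layers
                    delta = (W_old.T @ delta) * ys[m] * (1.0 - ys[m])
    return W
```

For example, backprop_train(samples, targets, [3, 5, 2]) would train a 3-input, 5-hidden-unit, 2-output network on the given sample/target pairs.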
Summary Lecture 21
1. Example: Analysis of a simple Neural Network
2. Example: Synthesis of special forms of Artificial Neural Networks
3. General concepts of Training an Artificial Neural Network: supervised and unsupervised training, and description of training sets
4. Neural Networks Nomenclature and Notation
5. Derivation and Description of the Backpropagation Algorithm for Feedforward Neural Networks
End of Lecture 21