Supervised learning in neural networks: Perceptrons and ...

Oct 15, 2013

© N. Kasabov Foundations of Neural Networks, Fuzzy Systems, and Knowledge Engineering, MIT Press, 1996


INFO331


Machine learning. Neural networks. Supervised learning in neural networks. MLP and BP


(Textbook: section 2.11, pp. 146-155; section 3.7.3, pp. 218-221; section 4.2, pp. 267-282; catch-up reading: pp. 251-266)




Machine learning


Issues in machine learning


Learning from static versus learning from
dynamic data


Incremental learning


On-line learning, adaptive learning


Life-long learning


Cognitive learning processes in humans




Inductive learning


learning from examples


Inductive decision trees and the ID3
algorithm


Information gain evaluation
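The information-gain measure used by ID3 can be sketched in a few lines of Python (a minimal illustration, not the book's exact notation; the toy weather data below is invented for the example):

```python
import math

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

def information_gain(examples, attribute, label):
    """Entropy of the label minus the weighted entropy of the
    subsets produced by splitting on the given attribute."""
    total = entropy([e[label] for e in examples])
    remainder = 0.0
    for value in set(e[attribute] for e in examples):
        subset = [e[label] for e in examples if e[attribute] == value]
        remainder += len(subset) / len(examples) * entropy(subset)
    return total - remainder

# Toy data: whether to play outside given the weather.
data = [
    {"outlook": "sunny", "play": "no"},
    {"outlook": "sunny", "play": "no"},
    {"outlook": "rain",  "play": "yes"},
    {"outlook": "rain",  "play": "yes"},
]
print(information_gain(data, "outlook", "play"))  # 1.0: the split is perfect
```

ID3 evaluates this quantity for every candidate attribute and places the one with the highest gain at the root of the (sub)tree.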


Other methods of machine
learning


Learning by doing


Learning from advice


Learning by analogy


Case-based learning and reasoning


Template-based learning (Kasabov and Clarke): the Iris example



Learning fuzzy rules from data


Cluster-based methods


Fuzzy template-based method (Kasabov, 96), pp. 218-219


Wang’s method (pp. 220-221)


Advantages and disadvantages


Supervised learning in neural
networks


Supervised learning in neural networks


Perceptrons


Multilayer perceptrons (MLP) and the
backpropagation algorithm


MLP as universal approximators


Problems and features of the MLP


Supervised learning in neural
networks


The learning principle is to provide the input
values and the desired output values for each
of the training examples.


The neural network changes its connection
weights during training.


Calculate the error:


training error: how well a NN has learned the data


test error: how well a trained NN generalises over new input data.
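A common choice for the error is the mean squared difference between desired and actual outputs; a minimal sketch (the specific output values are made up for illustration):

```python
def mean_squared_error(desired, actual):
    """Average squared difference between desired and actual outputs."""
    return sum((d - a) ** 2 for d, a in zip(desired, actual)) / len(desired)

# Training error: measured on the examples the network was trained on.
train_err = mean_squared_error([1.0, 0.0], [0.9, 0.2])   # 0.025

# Test error: measured on held-out examples, so it reflects how well
# the trained network generalises to new input data.
test_err = mean_squared_error([1.0, 0.0], [0.6, 0.4])    # 0.16
```

A low training error with a much higher test error is the signature of overfitting, discussed later in this lecture.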


Perceptrons


fig.4.8
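The classical perceptron learning rule behind the figure can be sketched as follows (a minimal Python illustration, assuming a step activation and using the logical AND function as invented toy data; this is not the book's exact notation):

```python
def predict(w, b, x):
    """Step-activation output of a perceptron with weights w and bias b."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train_perceptron(examples, lr=0.1, epochs=20):
    """Classical perceptron rule: w <- w + lr * (d - y) * x."""
    n = len(examples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, d in examples:
            err = d - predict(w, b, x)       # desired minus actual output
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Linearly separable toy problem: logical AND.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(data)
```

The rule is guaranteed to converge only when the classes are linearly separable, which is why a single perceptron cannot learn XOR.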


Perceptrons


fig.4.9


Perceptrons


fig.4.10


MLP and the backpropagation
algorithm


fig.4.11


MLP and the backpropagation
algorithm


fig.4.12


MLP and the backpropagation
algorithm


fig.4.13
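One training step of backpropagation for a small 2-2-1 network with sigmoid units can be sketched as follows (a minimal illustration; the initial weights and the training example are arbitrary values chosen for the demo, not taken from the text, and biases are omitted for brevity):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def backprop_step(x, d, w_h, w_o, lr=0.5):
    """One forward/backward pass through a 2-2-1 sigmoid network.
    Returns updated weights and the squared error before the update."""
    # Forward pass: hidden activations, then the output.
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_h]
    y = sigmoid(sum(w * hi for w, hi in zip(w_o, h)))
    # Output delta: (d - y) * f'(net), where f'(net) = y * (1 - y).
    delta_o = (d - y) * y * (1 - y)
    # Hidden deltas: propagate delta_o back through the output weights.
    delta_h = [delta_o * w * hi * (1 - hi) for w, hi in zip(w_o, h)]
    # Gradient-descent weight updates.
    w_o = [w + lr * delta_o * hi for w, hi in zip(w_o, h)]
    w_h = [[w + lr * dh * xi for w, xi in zip(row, x)]
           for row, dh in zip(w_h, delta_h)]
    return w_h, w_o, (d - y) ** 2

w_h = [[0.1, 0.2], [0.3, 0.4]]   # hidden-layer weights (2 units, 2 inputs)
w_o = [0.5, -0.5]                # output-layer weights
w_h, w_o, e1 = backprop_step([1.0, 0.0], 1.0, w_h, w_o)
w_h, w_o, e2 = backprop_step([1.0, 0.0], 1.0, w_h, w_o)
# e2 < e1: the squared error shrinks after each weight update
```

Repeating such steps over all training examples, usually for many epochs, is what gradually drives the training error down.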


MLPs as statistical tools


An MLP with one hidden layer can approximate any continuous function to any desired accuracy (Hornik et al., 1989)


MLPs are multivariate non-linear regression models


MLPs can learn conditional probabilities
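The Hornik et al. result can be made concrete: a one-hidden-layer network computes a weighted sum of sigmoids, and steep sigmoids act as soft step functions that can be stacked into any continuous shape. A hand-picked sketch (all weights below are invented for illustration):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def mlp(x, weights):
    """One-hidden-layer network: f(x) = sum_i c_i * sigmoid(w_i*x + b_i)."""
    return sum(c * sigmoid(w * x + b) for w, b, c in weights)

# Two steep hidden units approximating a "bump" between x = 1 and x = 2:
# a soft step up at x = 1 and a soft step down at x = 2.
bump = [(20.0, -20.0, 1.0),    # step up around x = 1
        (20.0, -40.0, -1.0)]   # step down around x = 2
print(round(mlp(0.0, bump), 3))  # ~0: before the bump
print(round(mlp(1.5, bump), 3))  # ~1: inside the bump
print(round(mlp(3.0, bump), 3))  # ~0: after the bump
```

Adding more such bumps, scaled and shifted, is the intuition for why one hidden layer suffices in principle; the theorem says nothing about how many hidden units are needed or how to learn the weights.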


Problems and features of the MLP


How to choose the number of hidden nodes


Catastrophic forgetting


Introducing hints in neural networks


Overfitting (overlearning)


Problems and features of the MLP


Catastrophic forgetting


fig. 4.14


Problems and features of the MLP


Introducing hints


fig.4.15


Problems and features of the MLP


Overfitting


fig. 4.16