# 16 November 2012

AI and Robotics


COSC 4V82

Michael Samborski


Artificial Neural Network Review

The First to Try It

Developmental Approaches

Good Ideas Always Come Back

Comparison to Other Evolutionary ANN Techniques

A neuron has a summation of incoming links, an activation function, and an output

The network takes inputs and gives outputs

It learns by changing the link weights that the inputs are passed through

Hidden layers between the input and output layers are necessary to handle non-linearly separable problems
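As a quick illustration of the bullets above, here is a minimal sketch of a neuron and a small 2-2-1 forward pass; the sigmoid activation and every weight value are illustrative assumptions, not anything from these slides.

```python
import math

def neuron(inputs, weights, activation=lambda s: 1 / (1 + math.exp(-s))):
    """A neuron: weighted summation of the incoming links, then an activation."""
    s = sum(x * w for x, w in zip(inputs, weights))
    return activation(s)

def forward(x, hidden_weights, output_weights):
    """A 2-2-1 network: one hidden layer between the input and output layers."""
    hidden = [neuron(x, w) for w in hidden_weights]
    return neuron(hidden, output_weights)

# Learning would adjust these link weights (e.g. by backpropagation);
# the values here are arbitrary placeholders.
print(forward([0.0, 1.0],
              hidden_weights=[[4.0, 4.0], [-4.0, -4.0]],
              output_weights=[4.0, 4.0]))
```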

A wide variety of problems leads to a wide variety of network configurations

No good way to know what network configuration to use

Trial and error to find a good configuration is a lengthy process

Never sure if you have the optimal network setup

What is the better way?

GAs were used with some success, but a new player had just arrived on the AI scene

Surprise, surprise: Koza was one of the first to try it

Used a direct representation of an ANN in GP language form

The tree would organize itself into a single-root tree that could be seen as a single-output, arbitrary-shape ANN

No regular layering of the tree

Nodes could be anywhere in his tree, and there was no guaranteed organized structure of layers except for the output layer

F = {P, W, +, -, *, /}

P is the linear activation function of a neuron

W multiplies all the branches coming into it

Restrictions

Root of the tree must be P

All branches coming into a P must be Ws

Any subtree below an arithmetic operation must contain only more arithmetic operations or terminals

A LIST function could be used to give multiple outputs

It was only used as the root and could only have P branches coming into it

T = {D0, D1, R}

The two inputs and an ephemeral random constant
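As a sketch of how these restrictions could be checked mechanically — the tuple encoding and the assumption that W takes a weight expression plus one signal are mine, not Koza's:

```python
ARITH = {'+', '-', '*', '/'}

def is_terminal(node):
    # Terminals: D0, D1, or an ephemeral random constant R
    return isinstance(node, (int, float)) or node in ('D0', 'D1')

def arith_only(node):
    """A subtree below an arithmetic operation may contain only
    more arithmetic operations or terminals."""
    if is_terminal(node):
        return True
    op, *args = node
    return op in ARITH and all(arith_only(a) for a in args)

def valid_w(node):
    """Assumed W arity: a weight expression and one signal, where the
    signal is an input terminal or a nested P subtree."""
    _, weight, signal = node
    return arith_only(weight) and (signal in ('D0', 'D1') or valid_p(signal))

def valid_p(node):
    """Every branch coming into a P must be a W."""
    return (isinstance(node, tuple) and node[0] == 'P'
            and all(isinstance(a, tuple) and a[0] == 'W' and valid_w(a)
                    for a in node[1:]))

def valid_tree(root):
    return valid_p(root)  # the root of the tree must be P

print(valid_tree(('P', ('W', 1.0, 'D0'), ('W', ('+', 0.5, 0.5), 'D1'))))  # True
print(valid_tree(('W', 1.0, 'D0')))  # False: root is not P
```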

When Koza applied this to XOR, the resulting tree worked 100% of the time

It translated to a 2-2-1 network

Easy problem = easy tree
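To make the translation concrete, here is a sketch of an interpreter for this language (tuple encoding as in the sketch above) plus a hand-built, not evolved, XOR tree that reads directly as a 2-2-1 network; the threshold value and all the weights are illustrative assumptions, since the slides only say P is linear.

```python
def evaluate(node, inputs):
    """Evaluate a GPANN tree on a dict of inputs (sketch)."""
    if isinstance(node, (int, float)):           # ephemeral random constant R
        return node
    if node in ('D0', 'D1'):                     # the two input terminals
        return inputs[node]
    op, *args = node
    if op == 'P':                                # neuron: sum the W branches,
        total = sum(evaluate(a, inputs) for a in args)
        return 1.0 if total >= 1.0 else 0.0      # threshold of 1.0 is assumed
    if op == 'W':                                # weight expression * signal
        weight, signal = args
        return evaluate(weight, inputs) * evaluate(signal, inputs)
    a, b = (evaluate(x, inputs) for x in args)   # arithmetic on weights
    return {'+': a + b, '-': a - b, '*': a * b,
            '/': a / b if b else 1.0}[op]        # protected division

# A hand-built XOR tree that is literally a 2-2-1 network:
h_or  = ('P', ('W', 1.0, 'D0'), ('W', 1.0, 'D1'))    # fires on either input
h_and = ('P', ('W', 0.6, 'D0'), ('W', 0.6, 'D1'))    # fires only on both
xor   = ('P', ('W', 1.0, h_or), ('W', -1.0, h_and))  # OR and not AND

for d0 in (0.0, 1.0):
    for d1 in (0.0, 1.0):
        print(d0, d1, '->', evaluate(xor, {'D0': d0, 'D1': d1}))
# prints 0.0 for (0,0) and (1,1), 1.0 for (0,1) and (1,0)
```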

What about a complex problem?

The evolved full-adder tree performed at 100% as well

Over 102 runs with popsize = 500, a 100% solution was found 86% of the time after 51 generations

Koza's idea seemed to have promise, and as Koza's ideas typically do, GP for ANNs snowballed

Frédéric Gruau used developmental GP, performing operations on the nodes of an ANN to create new ones

This led to highly connected graphs, and there were few ways to change individual edges

Sean Luke and Lee Spector took Gruau's idea but applied it to edges instead of nodes

This led to less connected graphs whose edges and nodes could be changed more easily
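A toy contrast of the two styles, heavily simplified and not either paper's actual operators: a node operation copies every link of the parent cell, while an edge operation touches a single link, which is one intuition for why edge encoding yields sparser, easier-to-edit graphs.

```python
def split_node(graph, node, new):
    """Node-style operation (Gruau-like, simplified): the new node inherits
    copies of all of the parent's incoming and outgoing links."""
    graph[new] = set(graph[node])            # duplicate outgoing edges
    for outs in graph.values():
        if node in outs:
            outs.add(new)                    # duplicate incoming edges
    return graph

def split_edge(graph, src, dst, new):
    """Edge-style operation (Luke and Spector-like, simplified): rewrite one
    edge src->dst into src->new->dst; only a single link is touched."""
    graph[src].discard(dst)
    graph.setdefault(new, set()).add(dst)
    graph[src].add(new)
    return graph

g = {'a': {'b'}, 'b': set()}
print(split_node({k: set(v) for k, v in g.items()}, 'a', 'a2'))
# {'a': {'b'}, 'b': set(), 'a2': {'b'}}  -- a2 copies every link a had
print(split_edge({k: set(v) for k, v in g.items()}, 'a', 'b', 'n'))
# {'a': {'n'}, 'b': set(), 'n': {'b'}}   -- only the a->b edge changed
```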

Unfortunately, Gruau's report was inaccessible behind a paywall, and Luke and Spector only released a preliminary report, so no data on how well these really performed was available

D. Rivero et al., upon reading the Gruau and Luke and Spector papers, thought they had a better way

Have a language where there is a function to represent a neuron, a function to represent an input neuron, and a function to act as the tree root that takes in all the output neurons and lists them

Sounds familiar, doesn’t it?
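As a sketch of what such a tree might look like — the function names and tuple shape below are guesses from this bullet, not the paper's actual grammar:

```python
# Hypothetical encoding matching the bullet above: a neuron function, an
# input-neuron function, and a root that lists all the output neurons.
x0, x1 = ('input', 0), ('input', 1)      # input-neuron functions
n1 = ('neuron', (0.7, x0), (-0.3, x1))   # hidden neuron with weighted links in
n2 = ('neuron', (0.5, n1), (0.2, x1))    # output neuron
tree = ('list', n2)                      # root lists the output neurons
```

Structurally this is the same P/W/LIST shape as Koza's trees, which is the point of the next bullet.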

Koza's paper was never referenced in Rivero et al., 2005

Likely they came upon the idea organically, just like Koza did

While Koza's tests of his GPANN could be considered toy problems, the approach was tested much more rigorously this time

While not explicitly stated, their activation function was likely more complex than the linear one Koza used

They attempted four different problems from the UCI database

D. Rivero et al. didn't compare themselves to Gruau, Luke, and Spector, but did compare their ANN to the results found by Cantú-Paz and Kamath in a paper that compared many different ANNs with evolutionary algorithms

On these problems, their method performed at least as well as the others, and better in most cases

All the other algorithms evaluated individuals separately from the design process

Rivero et al.'s combination of the two led to shorter training times and much less computational power used

Koza, John R., and James P. Rice. "Genetic generation of both the weights and architecture for a neural network." In IJCNN-91-Seattle International Joint Conference on Neural Networks, vol. 2, pp. 397-404. IEEE, 1991.

Gruau, Frederic. "Genetic micro programming of neural networks." In Kinnear, Jr., K. E., ed., Advances in Genetic Programming, chapter 24, pp. 495-518. MIT Press, 1994.

Luke, Sean, and Lee Spector. "Evolving graphs and networks with edge encoding: Preliminary report." In Late Breaking Papers at the Genetic Programming 1996 Conference, pp. 117-124. Stanford, CA: Stanford University, 1996.

Rivero, Daniel, Julián