
University of Otago
Te Whare Wananga O Otago
Dunedin, New Zealand




FuNN/2 – A Fuzzy Neural Network
Architecture for Adaptive Learning
and Knowledge Acquisition


Nikola K. Kasabov
Jaesoo Kim
Michael J. Watts
Andrew R. Gray




The Information Science
Discussion Paper Series

Number 96/23
December 1996
ISSN 1172-455X


University of Otago

Department of Information Science

The Department of Information Science is one of six departments that make up the Division of Commerce at the University of Otago. The department offers courses of study leading to a major in Information Science within the BCom, BA and BSc degrees. In addition to undergraduate teaching, the department is also strongly involved in postgraduate programmes leading to the MBA, MCom and PhD degrees. Research projects in software engineering and software development, information engineering and database, artificial intelligence/expert systems, geographic information systems, advanced information systems management and data communications are particularly well supported at present.


Discussion Paper Series Editors

Every paper appearing in this Series has undergone editorial review within the Department of Information
Science. Current members of the Editorial Board are:

Mr Martin Anderson Dr George Benwell
Dr Nikola Kasabov Dr Geoff Kennedy
Dr Martin Purvis Professor Philip Sallis
Dr Hank Wolfe

The views expressed in this paper are not necessarily the same as those held by members of the editorial
board. The accuracy of the information presented in this paper is the sole responsibility of the authors.


Copyright

Copyright remains with the authors. Permission to copy for research or teaching purposes is granted on the condition that the authors and the Series are given due acknowledgment. Reproduction in any form for purposes other than research or teaching is forbidden unless prior written permission has been obtained from the authors.


Correspondence

This paper represents work to date and may not necessarily form the basis for the authors' final conclusions relating to this topic. It is likely, however, that the paper will appear in some form in a journal or in conference proceedings in the near future. The authors would be pleased to receive correspondence in connection with any of the issues raised in this paper. Please write to the authors at the address provided at the foot of the first page.

Any other correspondence concerning the Series should be sent to:

DPS Co-ordinator
Department of Information Science
University of Otago
P O Box 56
Dunedin
NEW ZEALAND
Fax: +64 3 479 8311
email: workpapers@commerce.otago.ac.nz

FuNN/2 – A Fuzzy Neural Network Architecture for Adaptive Learning and Knowledge Acquisition

Nikola K. Kasabov, Jaesoo Kim, Michael J. Watts, Andrew R. Gray

Department of Information Science, University of Otago, P.O. Box 56, Dunedin, New Zealand
Phone: +64 3 479 8319, Fax: +64 3 479 8311, email: nkasabov@otago.ac.nz
Abstract

Fuzzy neural networks have several features that make them well suited to a wide range of knowledge engineering applications. These strengths include fast and accurate learning, good generalisation capabilities, excellent explanation facilities in the form of semantically meaningful fuzzy rules, and the ability to accommodate both data and existing expert knowledge about the problem under consideration. This paper investigates adaptive learning, rule extraction and insertion, and neural/fuzzy reasoning for a particular model of a fuzzy neural network called FuNN. As well as providing for representing a fuzzy system with an adaptable neural architecture, FuNN also incorporates a genetic algorithm in one of its adaptation modes. A version of FuNN, FuNN/2, which employs triangular membership functions and correspondingly modified learning and adaptation algorithms, is also presented in the paper.

1. Introduction
Different architectures of fuzzy neural networks (FNN) have been proposed and used as a knowledge engineering technique for various applications [1-7]. FNN systems have been successfully used for learning and tuning fuzzy rules as well as for solving classification, prediction and control problems. Some recent publications suggest methods for training FNNs in order to adjust to new data and dynamically changing situations [4, 5, 13].
In several publications [3,4,5] the architecture of a general FNN called FuNN, which stands for Fuzzy Neural Network, was introduced along with algorithms for learning, adaptation and rules extraction, and the first version of its implementation. Here further development of the FuNN principles and one particular implementation called FuNN/2 are presented. The paper investigates some learning and adaptation strategies associated with FuNN/2, which include rule insertion and rule extraction algorithms. One of the unique aspects of the FuNN architecture is that it combines several paradigms in one system, i.e. neural networks, fuzzy logic and genetic algorithms. This multi-paradigm approach has produced good results for a wide range of problems.
2. The Architecture of FuNN and FuNN/2

The FuNN model is designed to be used in a distributed, and eventually agent-based, environment. The architecture facilitates learning from data and approximate reasoning, as well as fuzzy rule extraction and insertion. It allows for the combination of both data and rules into one system, thus producing the synergistic benefits associated with the two sources. In addition, it allows for several methods of adaptation (adaptive learning in a dynamically changing environment). Considerable experimentation can be carried out using this system to evaluate different adaptation strategies.
FuNN uses a multi-layer perceptron (MLP) network and a modified backpropagation training algorithm [3,4,5]. The general FuNN architecture consists of 5 layers of neurons with partial feed-forward connections, as shown in figure 1. It is an adaptable FNN where the membership functions of the fuzzy predicates, as well as the fuzzy rules inserted before training or adaptation, may adapt and change according to new data. Below a brief description of the components of the FuNN architecture and the philosophy behind this architecture are given.
2.1 Input Layers

The input layer of neurons represents the input variables as crisp values. These values are fed to the condition element layer, which performs fuzzification. This is implemented in FuNN/2 using three-point triangular membership functions, with the centers represented as the weights into this condition element layer. The triangles are completed with the minimum and maximum points attached to adjacent centers, or shouldered in the case of the first and last membership functions.

The triangular membership functions are allowed to be non-symmetrical, and any input value will belong to at maximum two membership functions with degrees differing from zero (it will always involve two, unless the input value falls exactly on a membership function center, in which case a single membership will be activated, but this equality is unlikely given floating point variables). These membership degrees for any given input will always sum up to one, ensuring that some rules will be given the opportunity to fire for all points in the input space.
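To make this center-based fuzzification concrete, here is a minimal sketch in Python (not part of the original paper; the function name and array layout are illustrative assumptions):

```python
import numpy as np

def fuzzify(x, centers):
    """Degrees of membership of crisp input x to triangular MFs whose
    centers are given in ascending order; the end functions are shouldered."""
    degrees = np.zeros(len(centers))
    if x <= centers[0]:                       # left shoulder
        degrees[0] = 1.0
    elif x >= centers[-1]:                    # right shoulder
        degrees[-1] = 1.0
    else:
        i = np.searchsorted(centers, x)       # centers[i-1] < x <= centers[i]
        span = centers[i] - centers[i - 1]
        degrees[i] = (x - centers[i - 1]) / span
        degrees[i - 1] = 1.0 - degrees[i]     # the two degrees sum to one
    return degrees

# At most two neighbouring degrees are non-zero and they sum to one:
print(fuzzify(0.35, np.array([0.0, 0.5, 1.0])))  # [0.3 0.7 0. ]
```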
Using triangular membership functions makes the fuzzification and the defuzzification procedures in FuNN fast without compromising the accuracy of the solution. This center-based membership approach taken by FuNN/2 avoids the problems of uncovered regions in the input space that can exist with more flexible membership representation strategies.


[Figure 1: the general FuNN architecture. Real-value inputs enter the input layer, followed by the condition element layer, rule layer, action element layer and output layer producing real values; DI connections feed the rule layer and CF connections feed the action layer; pre-set connections with initial value zero and optional connections are marked.]

Figure 1. A FuNN structure for two initial fuzzy rules: R1: IF x1 is A1 (DI1,1) and x2 is B1 (DI2,1) THEN y is C1 (CF1); R2: IF x1 is A2 (DI1,2) and x2 is B2 (DI2,2) THEN y is C2 (CF2), where DIs are degrees of importance attached to the condition elements and CFs are confidence factors attached to the consequent parts of the rules (adopted from [3,4,5]). The triplets (SC, aC, oC), (SR, aR, oR), (SA, aA, oA) and (SO, aO, oO) represent the summation, activation and output functions for the specific layers.
These strategies do not always limit centers and widths in such a way as to ensure complete coverage. While algorithms could be formulated and used in such cases to force the memberships to cover the input space, the center-based approach seems both simpler and more efficient, and more natural, with fewer arbitrary restrictions. It should be noted that there are no bias connections necessary for this representation in FuNN/2.
The weights from the input to the condition element layers of neurons can take values in [0,1] only, since the data are assumed to be normalised to this range. This normalisation is normally carried out as part of the FuNN preprocessing and reverse operations, and can be performed transparently in applications.
Initially the membership functions are spaced equally over the weight space, although if any expert knowledge is available this can be used for initialisation. In order to maintain the semantic meaning of the memberships contained in this layer of connections, some restrictions are placed on adaptation. Under the FuNN/2 architecture labels can be attached to weights when the network is constructed. When adaptation is taking place the centers are spatially constrained according to some rules. One constraining possibility is that the weights remain within certain equally sized partitions of the weight space. Another alternative constraint is a monotonic ordering within the set of membership function centers, such that the weight representing the membership function 'low' will always have a center less than 'medium', which will always be less than 'high'.

The condition element layer of neurons is potentially expandable during the adaptation phase, with more nodes representing more membership functions for the input variables. Simple activation functions are used in the condition element nodes to perform fuzzification (see appendix A for the full algorithm of the FuNN/2 implementation).
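As an illustration of the monotonic ordering constraint just described, here is a hedged sketch (illustrative Python, not the FuNN/2 source; clamping each center above its left neighbour is one possible way to enforce the ordering):

```python
import numpy as np

def constrain_centers(centers, eps=1e-6):
    """Enforce monotonic ordering of MF centers: 'low' < 'medium' < 'high'.
    Each center is clamped to stay strictly above its left neighbour."""
    c = np.array(centers, dtype=float)
    for i in range(1, len(c)):
        c[i] = max(c[i], c[i - 1] + eps)
    return c

print(constrain_centers([0.2, 0.15, 0.9]))  # [0.2      0.200001 0.9     ]
```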
An important aspect of this layer is that different inputs can have differing numbers of membership functions. The same principle applies to the output membership functions. This allows for very different types of inputs to be used together. As a simple example, temperature may be divided into seven different membership functions representing the range from cold to hot, while holiday (which is a binary variable to indicate whether it is a public holiday or not) can be represented using just two, for yes and no.
2.2 Rule Layer

In the rule layer each node represents a single fuzzy rule. The layer is also potentially expandable, in that nodes can be added to represent more rules as the network adapts, or potentially shrinkable. The activation function is the sigmoidal logistic function with a variable gain coefficient: a default gain value of 1 is used (giving the standard sigmoidal activation function). Large values of the gain coefficient will make the function close to the hard limited thresholding function. A value of 2.19722 would ensure that a rule node would provide activation values from 0.1 to 0.9 when the net input values are between -1 and +1. These values may be desirable as part of the architecture's fuzziness.
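As a quick numerical check of the gain value quoted above (a sketch, not FuNN/2 code): 2.19722 is ln 9, for which net inputs of -1 and +1 map to activations of exactly 0.1 and 0.9.

```python
import math

def logistic(net, gain=1.0):
    """Sigmoidal logistic activation with a variable gain coefficient."""
    return 1.0 / (1.0 + math.exp(-gain * net))

g = math.log(9)            # 2.19722..., the value cited in the text
print(round(g, 5))         # 2.19722
print(logistic(+1.0, g))   # 0.9 (up to floating point rounding)
print(logistic(-1.0, g))   # 0.1 (up to floating point rounding)
```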
The semantic meaning of the activation of a node is that it represents the degree to which the input data matches the antecedent component of an associated fuzzy rule. However, the synergistic nature of rules in a fuzzy-neural architecture must be remembered when interpreting such rules.
The connection weights from the condition element layer (also called the membership functions layer) to the rule layer semantically represent the degrees of importance of the corresponding condition elements for the activation of this node.

The values of the connection weights to and from the rule layer can be limited during training to be within a certain interval, say [-1,1], thus introducing non-linearity into the synaptic weights. This option mimics a biologically plausible phenomenon [12], but should be implemented in accordance with an appropriate gain factor for the activation function. For example, if the interval is [-1,1] a suitable value for the gain factor may be 2.19722, as described above. The weight limitation would ensure that inputs into the rules remain within [-1,1] (since the input membership functions are all between 0 and 1), and the gain factor would allow the rules to output only values in [0.1,0.9]. This further enhances the meaning of the rules, and weight saturation will not occur. As an example of the problems of rule interpretation with unrestricted weights, it is difficult to interpret a rule that has input weights with very high values, say 16, without some form of normalisation. With this weight limiting option, the necessity for such normalisation is removed.
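A minimal sketch of this weight-limiting option (illustrative Python; clipping after each update is an assumption about one way to realise the interval constraint):

```python
import numpy as np

def limit_weights(w, low=-1.0, high=1.0):
    """Constrain rule-layer connection weights to [low, high] after an update,
    keeping net inputs bounded and the rules interpretable."""
    return np.clip(w, low, high)

w = np.array([0.4, -1.7, 16.0])   # a weight of 16.0 would be hard to interpret
print(limit_weights(w))           # [ 0.4 -1.   1. ]
```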
A bias connection to the rule nodes is optional. A bias would be interpreted as a weight shifting parameter for the net input signal to a rule neuron.
2.3 Output Layers

In the action element layer a node represents a fuzzy label from the fuzzy quantisation space of an output variable, for example 'small', 'medium', 'large' for the output variable 'required change in the velocity'. The activation of the node represents the degree to which this membership function is supported by the current data used for recall. So this is the level to which the membership function for this fuzzy linguistic label is cut according to the current facts. The connections from the rule layer to the action element layer conceptually represent the confidence factors of the corresponding rules when inferring fuzzy output values. They are subject to constraints that require them to remain in specified intervals or specified orderings, as for the previous layer, with the same advantages of semantic interpretability. The activation function for the nodes of this layer is the sigmoidal logistic function with the same (variable) gain factor as in the previous layer. Again, this gain factor should be adjusted appropriately given the size of the weight boundary.
The output layer performs a modified centre of gravity defuzzification. Singletons, representing the centers of triangular membership functions, as was the case for the input variables, are attached to the connections from the action to the output layer. Linear activation functions are used here.
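A sketch of this singleton-based modified centre of gravity (illustrative Python; the function and argument names are assumptions): the crisp output is the activation-weighted mean of the singleton centers attached to the action-to-output connections.

```python
import numpy as np

def defuzzify(action_activations, singleton_centers):
    """Modified centre-of-gravity defuzzification with singletons:
    the mean of the centers, weighted by the action-layer activations."""
    acts = np.asarray(action_activations, dtype=float)
    centers = np.asarray(singleton_centers, dtype=float)
    return np.dot(acts, centers) / np.sum(acts)

# 'small', 'medium', 'large' at centers 0.0, 0.5, 1.0 over a normalised range:
print(defuzzify([0.1, 0.8, 0.3], [0.0, 0.5, 1.0]))  # ~0.583
```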
As an example, 'small', 'medium' and 'large' can be represented as connection weights having values of 0, 0.5 and 1.0 respectively, if normalised outputs from the output range [0,1] are considered. Adapting the output membership functions would mean moving the centers, but the requirement that the membership degrees to which a particular output value belongs to the various fuzzy labels must always sum up to one is satisfied. Constraining, through either partitioning or ordering, is used in the same way as for the input membership function center restrictions. More than one output variable can be used in a FuNN structure, and the different output variables can have different numbers of membership functions.
One of the advantages of the FuNN architecture is that it manages to provide a fuzzy logic system without having to unnecessarily extend the traditional MLP. Since standard transfer functions, linear and sigmoidal, are used, along with a slightly modified back-propagation algorithm, the main departure being the constraining of the rules, much of the large body of theory regarding such networks is still applicable. For those results not immediately applicable to FuNN/2, the modifications are made much simpler given FuNN/2's natural structure and algorithm.
3. Adaptation of FuNN

There are three versions of weight updating in the FuNN according to the mode of training and adaptation [3,4,5]. These are not mutually exclusive versions but are all provided within the same environment, and the versions can be switched between as needed. These methods of adaptation are:
(a) a partially adaptive version, where the membership functions (MF) of the input and the output variables do not change during training and a modified backpropagation algorithm is used for the purpose of rule adaptation. This mode can be suitable for systems where the membership functions to be used are known in advance, or where the implementation is constrained in some way by the problem.
(b) a fully adaptive version with an extended backpropagation algorithm, as explained in appendix A. This version allows changes to be made to both rules and membership functions, subject to constraints necessary for retaining semantic meaning.
(c) a partially adaptive version with the use of a genetic algorithm for adapting the membership functions. This mode does not alter the rules. The constraining limitations are only partially imposed with this version, with the [0,1] limit used but not the intermediate limits. The algorithm used is described in section 5.
These modes are not alternatives as such to be chosen between, but can be used together in whatever combination is most appropriate for the given problem at a certain time. It may be useful to use several different adaptation modes in an iterative manner, with each version of the algorithm best suited to some part of the adaptation task.
4. Extended Backpropagation Algorithm for Adaptive Training of FuNN/2
Here the fuzzification layer and the defuzzification layer change their input connections based on simple and intuitive formulas. These changes reflect the concepts represented by the layers and must satisfy the constraints imposed on the membership functions. The same principles apply for the two layers, but different formulas are used to calculate the change of the weights (see appendix A). Figure 2 shows the initial membership functions of a variable x (either input or output) and the membership functions after adaptation. The amount of change has been exaggerated in order to demonstrate the concept involved. In the normal course of training, changes to membership functions are limited to small, gradual movements, with the majority of weight changes occurring with the weights into and out of the rules.
5. Using a Genetic Algorithm for Adaptation of MFs in FuNN

Genetic algorithms (GAs) are highly effective search algorithms that are based on the principles of natural genetics and natural selection [9]. By iteratively building on the success of previous attempts at a solution, GAs are able to quickly investigate and search large spaces to find approximate solutions to difficult combinatorial problems. Given that the optimisation of fuzzy membership functions may involve many changes to many different functions, and that a change to one function may effect a large change in the possible solution space for the others, this problem is a natural candidate for a GA-based approach. This has already been investigated in [10], and has been shown to be more effective than manual alteration. The application of these principles to a FuNN can be seen as a logical extension of this previous work.
The work carried out in [10] focused on the use of small changes to the width and centre positions of the membership functions of the system.
[Figure 2: membership degree plotted against x, showing 'Small', 'Medium' and 'Large' triangular membership functions with centres Cs, Cm and Cl.]

Figure 2. Initial membership functions (solid lines) of a variable x (either input or output) represented in a FuNN, and the membership functions after adaptation (dotted lines). In this case rigid partitioning is used as a constraining rule. The boundaries, up to which each centre can move but not cross, are also indicated.
These delta values may be easily encoded within a chromosome structure. They also provide a convenient means of restricting the amount of movement allowed by a single application of the GA. By measuring the performance of the system resulting from these changes, the fitness of the chromosome may be calculated. A similar approach has been taken in the GA module of FuNN/2. In this case only the centers need to be represented by the chromosome strings, speeding up the adaptation process and possibly reducing the spurious local minima of the center and width approach used by [10].
The GA module for FuNN/2 is designed as a stand-alone system for adapting fuzzy neural networks that have rules inserted or have rules that have been adapted. The system will optimise the membership functions of both the condition and action layers. Although some of the parameters, such as the length of the chromosomes, are determined automatically, the rest of the parameters, such as the size of the population, the mutation rate and the use of fitness normalisation, are user configurable. The GA used is, in essence, the same as Goldberg's Simple Genetic Algorithm [9], with the important exception that the chromosomes are represented as strings of floating point numbers rather than as strings of bits. Also, mutation of a gene is implemented as a reinitialisation of the gene, rather than an alteration of the existing allele. The interface to the GA adaptation module for FuNN/2 is shown in fig. 3.

The operating process for the GA system will be described briefly here. After the user has specified the FuNN to be optimised, the system calculates the length of the chromosome necessary to represent the changes to be applied to the input and output membership functions. A copy is made of the original FuNN (with all zero deltas) to ensure, along with elitism, that the error function is monotonically decreasing for GA training.
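A minimal sketch of the chromosome handling described above (illustrative Python; the delta range and the RMS evaluation stub are assumptions, not FuNN/2 internals): chromosomes are strings of floating point deltas over the MF centers, mutation reinitialises a gene, and fitness is the inverse of the RMS error of the modified network.

```python
import random

def random_chromosome(length, delta_range=0.05):
    """A chromosome is a string of floating point deltas, one per MF center."""
    return [random.uniform(-delta_range, delta_range) for _ in range(length)]

def mutate(chrom, rate, delta_range=0.05):
    """Mutation reinitialises a gene rather than altering the existing allele."""
    return [random.uniform(-delta_range, delta_range) if random.random() < rate else g
            for g in chrom]

def fitness(chrom, evaluate_rms):
    """Fitness is the inverse of the RMS error of the FuNN copy with the deltas
    applied; 'evaluate_rms' stands in for running the user-defined test file."""
    return 1.0 / evaluate_rms(chrom)
```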
Figure 3. The interface of the GA adaptation module for FuNN/2.
[Figure 4: a plot of RMS error (approximately 0.0101 to 0.0109) against generation number.]

Figure 4. The decrease of RMS with the increase of generations for phoneme /e/ after 100 epochs of training with FuNN/2 in full-adaptation mode.
After the random initialisation of the population, in which the current FuNN is present, each individual is evaluated. To evaluate an individual, the delta values encoded within the chromosome are applied to a copy of the original FuNN. A user defined test data file is then submitted to the modified FuNN, and the Root Mean Square (RMS) error is calculated. The fitness of the modified FuNN is then calculated as the inverse of the RMS error, since those individuals with the lower RMS errors are more fit than those with higher errors. An example of GA adaptation of a FuNN/2 system trained on speech data (the English phoneme /e/) is given in fig. 4.
After normalising the fitness values and optionally applying elitism (which ensures that the most fit network is retained without modification in the next population), a breeding population is selected using either tournament or roulette wheel selection. Finally, after an optional mutation phase, a new population is created. One point crossover is used for the reproduction of chromosomes.
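For completeness, a sketch of the selection and reproduction step just described (illustrative Python): roulette wheel selection over fitness values followed by one point crossover.

```python
import random

def roulette_select(population, fitnesses):
    """Pick one parent with probability proportional to its fitness."""
    return random.choices(population, weights=fitnesses, k=1)[0]

def one_point_crossover(parent_a, parent_b):
    """One point crossover over floating point chromosome strings."""
    point = random.randrange(1, len(parent_a))
    return parent_a[:point] + parent_b[point:]
```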
6. Training and Experimenting with Different Adaptation Strategies for FuNN/2

Fig. 5 shows the interface of the FuNN/2 implementation for a UNIX environment. A MS Windows version of FuNN/2, which is part of an integrated hybrid development tool called FuzzyCOPE/2, has been made available free from the WWW site: http://divcom.otago.ac.nz:800/COM/INFOSCI/KEL/fuzzycop.htm

FuNN/2 allows for different training and adaptation strategies to be tested before the most suitable is selected for a certain application. Some of the issues involved in this adaptation are discussed below.
Figure 5. The FuNN/2 interface.
Initialisation: uniformly distributed triangular MFs can be used as initial values for the input variables, and uniformly distributed singletons can be used as initial values for the output variables. These are the defaults that are used in the absence of other information.

Membership function insertion: if some expert knowledge is available then this can be used to initialise the memberships, or at least initialise those for which knowledge exists, with the remainder being initialised using the default method.
Rule insertion: if an initial set of rules is available, it is used for initialisation of the FuNN structure through the rules insertion mode. The rules are represented as weights, so as well as the existence of a rule, the relative importance of that rule and its sensitivity to input variables can be provided.
Training: the FuNN can be trained either for the inner two rule weight layers, in which case the system adapts its fuzzy rules but does not adapt the membership functions, or for all four weight layers, in which case the system adapts both the rules and the membership functions. The only difference between these two options is that the connections in the fuzzification and defuzzification layers are frozen in the former case and they are subject to change in the latter case.
Adaptation through a GA: with this option only the membership functions change. Since the rules will need some small adjustments for the new membership functions, the best network can then be subjected to a short period of training by the membership-restricted version of the FuNN/2 backpropagation algorithm.
When training the FuNN/2, the opportunity arises for selecting several training and adaptation parameters. For the backpropagation training option these include gain factors for each layer, learning rates and momentum rates for each layer, and boundaries for the two sets of rule connection weights, as explained in the previous sections. For the genetic algorithm training the parameters are population size, number of generations, mutation rate, optional fitness normalisation, optional elitism, and the type of selection mechanism used.
Two main areas of applications arise for a FuNN system. These are either for data approximation (as it is the case with the standard, feed-forward neural networks [4]), or for fuzzy rule interpretation and adaptation.
Since FuNN implements a fuzzy system, the initial insertion of rules and membership functions should allow the network to learn both faster and more accurately. The first point is not regarded as too important, for reasons to be discussed below, but the second is an important aspect of the FuNN system. This greater accuracy of learning, through avoiding some local minima by placing the initial weights closer to the global minima, can also be an important feature when dealing with small, not necessarily fully-representative, data sets.
The greater number of weights used in a FuNN compared to a corresponding MLP will slow training to some degree. This is not regarded as problematic, since the learning rates and momentum terms are usually set to lower values for FuNN than for traditional MLPs. Still, some investigations into optimal training periods and speeds to maximise the network generalisation capabilities are necessary.
When new data becomes available, decisions must be made regarding how to adapt to new circumstances and, potentially, relationships expressed in this data. Depending on the size of this new data segment, two general learning strategies can be distinguished and used [13]:
- aggressive: a section of new data is used for further training and adaptation without using any of the old, previously used, data.

- conservative: new data is added to all of the old data and training is performed on the entire set.
Obviously, these concepts of aggressiveness and conservatism in training are fuzzy, since some proportion of old data can be combined with new data to produce a retraining data set. In fact, some compromise of using the new data with a percentage of old data tends to be most efficient. The amount of old data retained depends on operating requirements for on-line adaptation (since using a large data set may not be feasible), the stationarity of the relationships, and the length of time that changes tend to persist for until returning, if at all, to the original relationships. An example here is that of the stock market. This tends to exhibit long term trends with occasional departures from that trend. However, except in unusual circumstances, such as a large financial market crash, the long-term pattern will eventually be restored.
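A sketch of the compromise between the two strategies (illustrative Python; the retained fraction is an assumed, tunable parameter, with 0.0 corresponding to fully aggressive and 1.0 to fully conservative retraining):

```python
import random

def retraining_set(new_data, old_data, old_fraction=0.3):
    """Combine all new examples with a random sample of the old ones."""
    retained = random.sample(old_data, int(len(old_data) * old_fraction))
    return new_data + retained
```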
7. Rule Insertion to Initialise FuNN and Rule Extraction from a Trained FuNN
Several different methods for rules extraction are applicable to the FuNN architecture. Two of them have been implemented and investigated on test cases. The first one, called REFuNN [4], is based on the idea of simple thresholding of connection weights according to a preset limit. The weights which are above the threshold are represented as condition or action elements in the extracted fuzzy rules, with their associated weights representing degrees of importance and confidence factors. One rule node in the FuNN architecture is represented by several fuzzy rules, each of them representing a combination of the condition elements which would activate that node. A simplified version of the rules without degrees of importance can also be extracted and later interpreted in a classical fuzzy inference engine [3,4]. A procedure called zeroing is also implemented in FuNN/2, in which only the connection weights which are beyond a given interval [-T,T] for a threshold T are kept. This procedure is useful when used as part of the whole training procedure [13].
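A sketch of the zeroing procedure as described (illustrative Python): connection weights inside the interval [-T, T] are set to zero, so only the weights beyond the threshold survive into the extracted rules.

```python
import numpy as np

def zeroing(weights, threshold):
    """Keep only connection weights outside [-T, T]; zero the rest."""
    w = np.asarray(weights, dtype=float)
    return np.where(np.abs(w) > threshold, w, 0.0)

print(zeroing([0.2, -1.4, 0.9, -0.1], threshold=0.8))  # [ 0.  -1.4  0.9  0. ]
```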
A second algorithm, for interpreting a FuNN structure in terms of aggregated fuzzy rules, is also implemented. Each rule node is represented as one fuzzy rule. The strongest connection from a condition element node for an input variable to the rule node, along with the connections of the neighbouring condition element nodes, are represented in the rule. The weights of these connections are interpreted as degrees of importance attached to the corresponding condition elements. In order to keep only these connection weights in a FuNN structure, a masking procedure has been implemented in FuNN/2. One rule has in general as many consequent elements as the number of the nodes in the action element layer, each action (class) inferred with a certainty degree (confidence factor) defined by the connection weights from this rule node to that class node. An example of a set of weighted rules extracted from a FuNN/2 is given in fig. 6. Fig. 7 shows the same rules in a simple format which is appropriate for explanation purposes.
The extracted rules from a FuNN can be inserted in other FuNN/2 modules through the rules insertion mode. This makes it possible to use FuNN/2 in a hybrid environment along with other connectionist and fuzzy inference modules [11].
8. Conclusions and directions for further research

Here a fuzzy neural network architecture called FuNN [3,4,5] is further developed and its implementation FuNN/2 is introduced.
[Figure 6 residue removed; it lists two exemplar weighted rules, with degrees of importance attached to the condition elements and confidence factors attached to the consequents.]

Figure 6. Two exemplar weighted rules extracted from a masked FuNN/2.

[Figure 7 residue removed; it lists the same two rules in simplified form, of the kind 'if <A> and <B> then <C>, else if <B> and <C> then <A>'.]

Figure 7. Two simple exemplar fuzzy rules extracted from a masked FuNN/2.
The adaptive learning algorithms used, along with the rule extraction and rule insertion techniques, show that this is a promising approach to building adaptive intelligent information processing systems, which suits many applications. These application areas include signal processing (particularly speech recognition), time-series prediction, adaptive control, data mining and knowledge acquisition, image processing and modelling.
Further development of FuNN is planned in the following directions: optimising FuNN structures through learning with forgetting and pruning, and through the alternative use of GAs; adding, as an alternative choice, radial basis membership functions with a larger overlap between the membership functions; development of chaotic FuNN-based systems; and using modular FuNN systems for speech recognition, adaptive control, time-series data analysis and mining, and adaptive image and object recognition, as discussed in [14].
Acknowledgments

This research is supported by a research grant UOO 606 funded by the Public Good Science Fund of the Foundation of Research, Science and Technology (FRST) in New Zealand. The following colleagues took part in the discussions during the FuNN/2 design and partly in its implementation and testing: Richard Kilgour, Dr Robert Kozma, Dr Martin Purvis, Dr Feng Zhang, Brendan Hallet and Steve Israel. FuNN/2 for MS Windows is available free from the WWW site: http://divcom.otago.ac.nz:800/COM/INFOSCI/KEL/fuzzycop.htm.
References

[1] Yamakawa, T., Kusanagi, H., Uchino, E. and Miki, T., A new Effective Algorithm for Neo Fuzzy Neuron Model, in: Proceedings of Fifth IFSA World Congress, (1993) 1017-1020

[2] Hashiyama, T., Furuhashi, T., Uchikawa, Y., A Decision Making Model Using a Fuzzy Neural Network, in: Proceedings of the 2nd International Conference on Fuzzy Logic & Neural Networks, Iizuka, Japan, (1992) 1057-1060

[3] Kasabov, N., Learning fuzzy rules and approximate reasoning in fuzzy neural networks and hybrid systems, Fuzzy Sets and Systems, 82 (2), 1996, 135-149

[4] Kasabov, N., Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering, The MIT Press, Cambridge, MA, 1996

[5] Kasabov, N., Adaptable connectionist production systems, Neurocomputing, 13 (2-4), 1996, 95-117

[6] Hauptmann, W., Heesche, K., A Neural Net Topology for Bidirectional Fuzzy-Neuro Transformation, in: Proceedings of the FUZZ-IEEE/IFES, Yokohama, Japan, (1995) 1511-1518

[7] Jang, R., ANFIS: adaptive network-based fuzzy inference system, IEEE Trans. on Syst., Man, Cybernetics, 23(3), May-June 1993, 665-685

[8] Lin, C-T., Lin, C-J., Lee, T-L., Fuzzy adaptive learning control network with on-line learning, Fuzzy Sets and Systems, 71 (1), 1995, 25-45

[9] Goldberg, D., Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, 1989

[10] Mang, G., Lan, H. and Zhang, A Genetic-based Method of Generating Fuzzy Rules and Membership Functions by Learning from Examples, in: Proceedings of the International Conference on Neural Information Processing (ICONIP'95), Volume One, 1995, 335-338

[11] Kasabov, N., Hybrid Connectionist Fuzzy Production Systems: Towards Building Comprehensive AI, Intelligent Automation and Soft Computing, 1:4 (1995) 351-360

[12] Carpenter, G., Distributed Learning, Recognition, and Prediction by ART and ARTMAP Neural Networks, in: Proceedings of ICNN'96, IEEE Press, Plenary, Panel and Special Sessions Volume, 1996, 244-249

[13] Kasabov, N., Investigating the adaptation and forgetting in fuzzy neural networks by using the method of training and zeroing, in: Proceedings of the International Conference on Neural Networks ICNN'96, Plenary, Panel and Special Sessions volume, 1996, 118-123

[14] Kasabov, N., Advanced Neuro-Fuzzy Engineering for Building Intelligent Adaptive Information Systems, in: L. Reznik, V. Dimitrov, J. Kacprzyk (eds), Fuzzy Systems Design: Social and Engineering Applications, Physica-Verlag (Springer Verlag), to appear in 1997
Appendix A. A modified backpropagation algorithm for learning and adaptation of membership functions in FuNN/2

This section explains the algorithm used for the FuNN/2 architecture, both the feed-forward phase and the backpropagation of errors. The algorithm is discussed in terms of layers of neurons rather than the weights, since the neurons are the focus of a fuzzy architecture as such. Where layers of weights are intended this should be evident from the context.

Forward pass

This phase computes the activation values of all the nodes in the network from the first to the fifth layers. In this section a superscript indicates the layer and a subscript describes connection weights between layers.
• Layer 1 (Input Layer): The nodes in this layer only transmit input values (crisp values) directly to the next layer without modification.
• Layer 2 (Condition Layer): The output function of a node in this layer is the degree to which the input belongs to the given membership function. The weight represents the center for that particular membership function, with the minimum and maximum determined using the adjacent centers. In the case of the first and last membership function for a particular variable a shoulder is used instead. Hence, this layer acts as the fuzzifier. Each input signal activates only two neighbouring triangular membership functions simultaneously, and the sum of the membership grades of these two neighbouring membership functions for any given input is always equal to 1. For a triangle-shaped membership function as in the FuNN, the activation functions for a node i are:
$$\mathrm{if}\ a_{i-1} \le x \le a_i\ \mathrm{then}\ Act^c_i = \frac{x - a_{i-1}}{a_i - a_{i-1}}$$

$$\mathrm{if}\ a_i \le x \le a_{i+1}\ \mathrm{then}\ Act^c_i = \frac{a_{i+1} - x}{a_{i+1} - a_i}$$

$$\mathrm{if}\ x = a_i\ \mathrm{then}\ Act^c_i = 1$$

where $a_i$ is the centre of the triangular membership function.
• Layer 3 (Rule Layer): The connections from the condition layer to the rule layer are used to perform pre-condition matching of fuzzy rules. The connection weights in this layer may be set either randomly and then trained, or according to a set of rules (rule insertion). The net inputs and activations are respectively:

$$Net^r = \sum_i w_i \times Act^c_i$$

$$Act^r = \frac{1}{1 + e^{-g \cdot Net^r}}$$

where $g$ is a gain factor.
• Layer 4 (Action Layer): The nodes in this layer and the connection weights function as those in layer 3 for net input and activation:

$$Net^a = \sum_j w_j \times Act^r_j$$

$$Act^a = \frac{1}{1 + e^{-g \cdot Net^a}}$$
• Layer 5 (Output Layer): This layer performs defuzzification to produce a crisp output value. Among the commonly used defuzzification strategies, the Centre of Gravity (COG) method yielded the best result. In this layer, a linear output activation function is used:

$$Net^o = \sum_k w_k \times Act^a_k$$

$$Act^o = \frac{Net^o}{\sum_k Act^a_k}$$
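Putting layers 2-5 together, here is a compact sketch of the forward pass under the equations above (illustrative Python; the weight matrices and the fuzzified input vector are assumed given, and a single gain is used for both sigmoidal layers, though the text allows a gain per layer):

```python
import numpy as np

def logistic(net, gain):
    """Sigmoidal logistic activation with gain g."""
    return 1.0 / (1.0 + np.exp(-gain * net))

def forward_pass(x_fuzzy, w_rule, w_action, singletons, gain=1.0):
    """Layers 2-5 of FuNN/2: fuzzified inputs -> rules -> actions -> crisp output."""
    act_r = logistic(w_rule @ x_fuzzy, gain)        # layer 3: rule activations
    act_a = logistic(w_action @ act_r, gain)        # layer 4: action activations
    return np.dot(singletons, act_a) / act_a.sum()  # layer 5: COG with singletons
```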
Backward pass

The goal for this phase is to minimise the error function:

$$E = \frac{1}{2} \sum (y_d - y)^2$$

where $y_d$ is the desired output and $y$ is the current output. Hence the general learning rule (gradient descent) used is:

$$\Delta w \propto -\frac{\partial E}{\partial w}$$

$$w(t+1) = w(t) + \eta \left(-\frac{\partial E}{\partial w}\right) + \alpha \, \Delta w(t)$$

where $\eta$ is the learning rate and $\alpha$ is the momentum coefficient, and

$$\frac{\partial E}{\partial w} = \frac{\partial E}{\partial Net} \times \frac{\partial Net}{\partial w} = -\delta \times Act$$

Hence the weight update rule is:

$$\Delta w(t+1) = \eta \, \delta \times Act + \alpha \, \Delta w(t)$$
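This update rule translates directly into code; a minimal sketch (illustrative Python, with the error signal δ and the presynaptic activation assumed computed as in the layer formulas that follow):

```python
def update_weight(w, delta, act, prev_dw, lr=0.1, momentum=0.9):
    """Gradient-descent update with momentum:
    dw(t+1) = lr * delta * act + momentum * dw(t)."""
    dw = lr * delta * act + momentum * prev_dw
    return w + dw, dw   # updated weight, plus dw to carry into the next step
```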
• Layer 5 (Output Layer):

$$\delta^o = -\frac{\partial E}{\partial Net^o} = -\frac{\partial E}{\partial Act^o} \times \frac{\partial Act^o}{\partial Net^o} = y_d - y$$

(for the linear output function). When the weights are adapted, the constraining rule is taken into account, which imposes restrictions on the change of the centres of the membership functions.
• Layer 4 (Action Layer): The error for each node in the action layer is calculated individually, based on the output error and on the activation of this node, having in mind the type of the membership functions (triangular) used in the defuzzification layer and the type of defuzzification:

$$\mathrm{if}\ a_{i-1} < y < a_i\ \mathrm{then}\ d^a = \frac{y - a_{i-1}}{a_i - a_{i-1}}$$

$$\mathrm{if}\ a_i < y < a_{i+1}\ \mathrm{then}\ d^a = \frac{a_{i+1} - y}{a_{i+1} - a_i}$$

$$\mathrm{if}\ y = a_i\ \mathrm{then}\ d^a = 1$$

Hence,

$$\delta^a = (d^a - Act^a) \times Act^a \times (1 - Act^a)$$
• Layer 3 (Rule Layer):

$$\delta^r = Act^r \times (1 - Act^r) \times \sum_j \delta^a_j \times w_j$$
• Layer 2 (Condition Layer): The weight change is assigned as follows. If x lies in the fuzzy segment, then the corresponding weight should be increased directly proportional to the error propagated from the previous layer, because the error is caused by the weight. This proposition can be represented by the following equation:

$$\delta^c = Act^c \times \sum_j \delta^r_j \times w_j$$

Thus the weight-updating rule of this layer is:

$$w(t+1) = w(t) + \eta \, \delta^c \times Act + \alpha \, \Delta w(t)$$

The new centres of the input membership function triangles are also adjusted, according to a partition range, as for the output layer.