
CS-485: Capstone in Computer Science

Artificial Neural Networks and their
application in Intelligent Image
Processing

Spring 2010


1

Organizational Details


Class Meeting:

12:25-3:45pm Tuesday, SCIT213


Class webpage:
http://www.eagle.tamut.edu/faculty/igor/CS-485.htm


Instructor: Dr. Igor Aizenberg

Office: Science and Technology Building, 104C

Phone: (903) 334-6654

E-mail: igor.aizenberg@tamut.edu


Office hours:

Monday, Thursday 10:30-6:30

Tuesday, Wednesday 4:30-6:30

2

Textbook


1) I. Aizenberg, “Advances in Neural Networks”, University of Dortmund, 2005 - class notes (available from the class webpage)

2) Additional materials will also be
available from the class webpage

3



Applied Problems:

Image, Sound, and Pattern recognition

Decision making

Knowledge discovery

Context-Dependent Analysis




Artificial Intelligence: who is stronger and why?


NEUROINFORMATICS - the modern theory of the principles and new mathematical models of information processing, which are based on the biological prototypes and mechanisms of human brain activity

Introduction to Neural Networks


4

Natural language understanding (translation of texts)

Recognition of Images

Decision Making


Knowledge Discovery


Learning and Adaptation

Team behavior


Fuzzy Logic

Reasoning and
Prediction


Cognitive analysis

Applied Problems

5

1943 - McCulloch and Pitts introduce the fundamental ideas of analyzing neural activity via thresholds and weighted sums

1948 - Wiener's notion of the key role of connectionism and feedback loops as a model for learning in neural networks

1949 - Hebb's hypothesis that human and animal long-term memory is mediated by permanent alterations in the synapses

1951 - Minsky builds the first actual neural network learning system

1952 - Ashby puts forward the idea that intelligence could be created by the use of "homeostatic" devices which learn through a kind of exhaustive search

1957 - Frank Rosenblatt invents the modern "perceptron" style of NN, composed of trainable threshold units

1969 - End of the Perceptron era: the book "Perceptrons" by Minsky and Papert

1982 - Renaissance of connectionism following Hopfield's papers and the popularization of the back-propagation algorithm for multilayer feed-forward networks

The History of Neuroscience


6

NN as a Model of a Brain-Like Computer



An artificial neural network (ANN) is a massively parallel distributed processor that has a natural propensity for storing experiential knowledge and making it available for use. This means that:




Knowledge is acquired by the network
through a learning (training) process;



The strength of the interconnections
between neurons is implemented by means
of the synaptic weights used to store the
knowledge.


The learning process is a procedure of adapting the weights with a learning algorithm in order to capture the knowledge. More mathematically, the aim of the learning process is to map a given relation between the inputs and the output(s) of the network.
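
As a minimal sketch of this idea (a toy example of mine, not taken from the class notes), the single threshold neuron below stores its knowledge in two synaptic weights and a bias, and a simple perceptron-style learning rule adapts them until the neuron reproduces a given input-output relation (the logical OR, chosen purely for illustration):

import numpy as np

# Toy input-output relation to be captured by the weights (logical OR).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])   # input patterns
d = np.array([0., 1., 1., 1.])                           # desired outputs

w = np.zeros(2)   # synaptic weights: this is where the knowledge is stored
b = 0.0           # bias (threshold)
lr = 0.1          # learning rate

for epoch in range(50):
    for x, target in zip(X, d):
        y = 1.0 if w @ x + b >= 0 else 0.0   # weighted sum passed through a threshold
        w += lr * (target - y) * x           # adapt the weights toward the desired output
        b += lr * (target - y)

print(w, b)  # the trained weights now encode the OR relation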

Brain



The human brain is still not well
understood and indeed its
behavior is very complex!


There are about 10 billion neurons in the human cortex and 60 trillion synapses or connections

The brain is a highly complex, nonlinear and parallel computer (information-processing system)


ANN as a Brain-Like Computer

7

[Diagram: Data Acquisition -> Data Analysis -> Interpretation and Decision Making, producing in turn Signals & parameters, Characteristics & Estimations, and Rules & Knowledge Productions; adaptive machine learning via a neural network links Data Acquisition, Data Analysis, Decision Making, and a Knowledge Base]

Intelligent Data Analysis in Engineering Experiment


8

[Diagram: the pattern space is quantized into p decision classes m_1, m_2, m_3, ..., m_p; an input pattern x_i produces the response y_i]

1. Quantization of the pattern space into p decision classes

Input Patterns -> Response

2. Mathematical model of quantization: "Learning by Examples"

Mathematical Interpretation of Classification in Decision Making
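
As a concrete sketch of this "learning by examples" model (an illustration of mine; the data and function names below are invented, not taken from the course materials), the labeled examples (x_i, y_i) quantize a 2-D pattern space into p = 2 decision classes by storing one prototype per class, and a new pattern is assigned to the class of the nearest prototype:

import numpy as np

def fit_prototypes(X, y):
    # One mean vector m_k per decision class k, estimated from the labeled examples.
    return {k: X[y == k].mean(axis=0) for k in np.unique(y)}

def classify(x, prototypes):
    # Response y for an input pattern x: the label of the closest prototype.
    return min(prototypes, key=lambda k: np.linalg.norm(x - prototypes[k]))

X = np.array([[0.0, 0.1], [0.2, 0.0], [1.0, 1.1], [0.9, 1.0]])  # example patterns x_i
y = np.array([1, 1, 2, 2])                                       # their class labels y_i
protos = fit_prototypes(X, y)
print(classify(np.array([0.1, 0.05]), protos))   # -> 1
print(classify(np.array([1.05, 0.95]), protos))  # -> 2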


9


Self-organization - the basic principle of learning: structure reconstruction

[Diagram: Input Images -> Neuroprocessor -> Response, with a Teacher and a Learning Rule guiding the adaptation]

The learning involves a change of structure

Learning via the Self-Organization Principle


10

Artificial Intelligence with Neural Networks:

Intelligent Control

Technical Diagnostics

Intelligent Data Analysis and Signal Processing

Advanced Robotics

Machine Vision

Image & Pattern Recognition

Intelligent Security Systems

Intelligent Medical Devices

Intelligent Expert Systems

Applications of Artificial Neural Networks


11

Theory - Practice - Self-Paced Work

Artificial Neural Networks and Their Applications


You will learn:


Contemporary theoretical principles and paradigms of Neuroinformatics,

Mathematical models and algorithms of neural network techniques for experimentation,

Applications of Neuroinformatics to engineering and science problems,

Computer-Aided Technology for Instrumentation


What will we learn and do?

12

What will we learn and do?


General principles of artificial neural networks


General principles of learning algorithms


Feedforward neural network and backpropagation learning (see the sketch after this list)


Multi-valued neurons and a feedforward neural network based on multi-valued neurons


Basic ideas of image processing


Edge detection on noisy images using a neural
network
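
Since the feedforward network trained with backpropagation is central to the course, here is a minimal sketch of the technique (a toy example of mine with an assumed sigmoid activation, not code from the class notes): a network with one hidden layer learns the XOR mapping by propagating the output error backwards and adjusting the weights by gradient descent.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR training set: 4 patterns, 2 inputs, 1 target output
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros((1, 4))   # input -> hidden weights
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros((1, 1))   # hidden -> output weights
lr = 0.5

for epoch in range(10000):
    # Forward pass
    H = sigmoid(X @ W1 + b1)       # hidden layer activations
    Y = sigmoid(H @ W2 + b2)       # network outputs

    # Backward pass: error deltas at the output and hidden layers
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)

    # Gradient-descent weight updates
    W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0, keepdims=True)

print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))  # close to [0, 1, 1, 0]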

13

Symbol manipulation

Pattern recognition

Which way of thinking is best for you?




Dove flies


Lion goes


Tortoise crawls


Donkey sits


Shark swims

Ill-Formalizable Tasks:

Sound and Pattern recognition

Decision making

Knowledge discovery

Context-Dependent Analysis


What is the difference between the human brain and a traditional computer in their specific approaches to the solution of ill-formalizable tasks (those tasks that cannot be formalized directly)?


Symbol Manipulation or Pattern
Recognition ?


14

Massive parallelism


The brain, as an information- or signal-processing system, is composed of a large number of simple processing elements called neurons. These neurons are interconnected by numerous direct links, called connections, and cooperate with each other to perform parallel distributed processing (PDP) in order to solve a desired computational task.


Connectionism


The brain is a system of highly interconnected neurons, in which the state of one neuron affects the potential of a large number of other neurons connected to it through weights (connection strengths). The key idea of this principle is that the functional capacity of biological neural nets is determined mostly not by a single neuron but by its connections.


Associative distributed
memory

Storage of information in the brain is supposed to be concentrated in the synaptic connections of the brain's neural network or, more precisely, in the pattern of these connections and in the strengths (weights) of the synaptic connections.


How does our brain manipulate patterns? The process of pattern recognition and pattern manipulation is based on these principles.

Principles of Brain Processing


15

Brain-Like Computer

A brain-like computer is a mathematical model of the human brain's principles of computation. This computer consists of elements that can be called biological neuron prototypes, which are interconnected by direct links called connections and which cooperate to perform parallel distributed processing (PDP) in order to solve a desired computational task.



Neurons and Neural Net

The new paradigm of computing mathematics consists of the combination of such artificial neurons into an artificial neural net.

Artificial Neural Network


Mathematical Paradigms of a Brain-Like Computer

Brain-like Computer

16

Connectionism

An NN is a highly interconnected structure, in which the state of one neuron affects the potential of a large number of other neurons to which it is connected according to the weights of the connections

Not Programming but Training

An NN is trained rather than programmed to perform a given task, since it is difficult to separate the hardware and software in its structure. We program not the solution of tasks but the ability to learn to solve the tasks

Distributed Memory

An NN presents a distributed memory, so that changes (adaptation) of synapses can take place everywhere in the structure of the network.


Principles of Neurocomputing

17

Learning and Adaptation

NNs are capable of adapting themselves (the synaptic connections between units) to specific environmental conditions by changing their structure or connection strengths.

Non-Linear Functionality

Every new state of a neuron is a nonlinear function of the input pattern created by the firing activity of the other neurons.
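
As a hedged formalization of this statement (the notation is mine, not from the slides), the new state of neuron i can be written as

s_i(new) = f( sum_j w_ij * s_j ),

where s_j are the states of the neurons connected to neuron i, w_ij are the connection weights, and f is a nonlinear activation function such as a threshold or a sigmoid.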

Robustness of Associativity

NN states are characterized by high robustness, or insensitivity to noisy and fuzzy input data, owing to the use of a highly redundant distributed structure

Principles of Neurocomputing

18