Principal Components Analysis & Eigenpictures - ElderLAB


Principal Components Analysis


Vida Movahedi

December 2006

Outline


What is PCA?


PCA for images


Eigenfaces


Recognition


Training Set


Test Set


Summary

Principal Component Analysis


Eigenvectors show the directions of the axes of a fitted ellipsoid


Eigenvalues show the significance of the corresponding axis


The larger the eigenvalue, the more separation between the mapped data


For high-dimensional data, only a few of the eigenvalues are significant
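A minimal sketch of these points, using synthetic 2-D data (all names and values here are illustrative, not from the slides): the eigenvectors of the data covariance matrix give the ellipsoid axes, and the eigenvalues measure how much variance lies along each axis.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data, deliberately stretched along one direction
data = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

cov = np.cov(data, rowvar=False)        # 2x2 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

# The largest eigenvalue dominates: most variance lies along its eigenvector
print(eigvals)                          # roughly [0.25, 9.0]
print(eigvals[-1] / eigvals.sum())      # fraction of variance on the main axis
```

With a strong stretch like this, the main axis carries nearly all the variance, which is exactly the "only a few eigenvalues are significant" situation.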

What is PCA?


Finding eigenvalues and eigenvectors


Deciding which are significant


Forming a new coordinate system defined by the significant eigenvectors (fewer dimensions for the new coordinates)


Mapping data to the new space => Compressed data
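The steps above can be sketched as follows, on a small synthetic data matrix (names such as X and k are illustrative assumptions, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic correlated data: 100 samples, 10 dimensions
X = rng.normal(size=(100, 10)) @ rng.normal(size=(10, 10))

# 1. Find eigenvalues/eigenvectors of the covariance matrix
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))

# 2. Decide which are significant: keep the k largest eigenvalues
k = 3
top = eigvecs[:, np.argsort(eigvals)[::-1][:k]]    # 10 x k new basis

# 3-4. Map the data into the new k-dimensional coordinate system
compressed = Xc @ top                              # 100 x k "compressed data"
reconstructed = compressed @ top.T + X.mean(axis=0)
print(compressed.shape)                            # (100, 3)
```

Projecting onto the top-k eigenvectors is the best rank-k linear approximation of the centered data, which is why the reconstruction stays close to the original.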


How is PCA used in Recognition?


A training set is used for the LEARNING phase


Apply PCA to the training data to form a new coordinate system defined by the significant eigenvectors


Represent each data point in the PCA coordinate system (as weights of the eigenvectors)


A test set is used for the TESTING phase


The same PCA coordinate system is used


Each new data point is represented in PCA coordinates


New data is recognized as the closest training data point (Euclidean distance)
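The two phases above can be sketched like this, with synthetic stand-in "training vectors" (the data, sizes, and the `recognize` helper are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
train = rng.normal(size=(10, 50))                  # 10 training vectors

# LEARNING: build the PCA coordinate system from the training set
mean = train.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(train - mean, rowvar=False))
basis = eigvecs[:, np.argsort(eigvals)[::-1][:5]]  # 5 significant eigenvectors

train_w = (train - mean) @ basis                   # training weights

# TESTING: project a new sample into the same coordinate system and
# recognize it as the nearest training sample (Euclidean distance)
def recognize(x):
    """Return the index of the closest training sample in PCA space."""
    w = (x - mean) @ basis
    return int(np.argmin(np.linalg.norm(train_w - w, axis=1)))

# A slightly perturbed copy of training sample 3 should map back to it
print(recognize(train[3] + 0.01 * rng.normal(size=50)))
```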

PCA for images


Each image is represented as a 1-D data vector $\Gamma_i$
Finding Eigen values/vectors is expensive


Turk/Pentland Trick:

Average picture: $\Psi = \frac{1}{M}\sum_{i=1}^{M}\Gamma_i$

Each image: $\Phi_i = \Gamma_i - \Psi$

Covariance matrix: $C = \frac{1}{M}\sum_{i=1}^{M}\Phi_i\Phi_i^T = \frac{1}{M}AA^T$, where $A = [\Phi_1\ \Phi_2\ \cdots\ \Phi_M]$ is $N^2 \times M$ (the factor $\frac{1}{M}$ only scales the eigenvalues, so it suffices to study $AA^T$)

But $AA^T$ is $N^2 \times N^2$, far too large to decompose directly

Trick: find the eigenvectors $v_i$ of the much smaller $M \times M$ matrix $A^TA$:

$A^TA\,v_i = \mu_i v_i \;\Rightarrow\; AA^T(Av_i) = \mu_i(Av_i)$

so $u_i = Av_i$ is an eigenvector of $AA^T$ with the same eigenvalue $\mu_i$
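The trick can be verified numerically; this sketch uses tiny synthetic "images" (M = 5 images of 64 pixels, purely illustrative sizes):

```python
import numpy as np

rng = np.random.default_rng(3)
faces = rng.normal(size=(5, 64))        # M images, N^2 pixels each

psi = faces.mean(axis=0)                # average picture Psi
A = (faces - psi).T                     # N^2 x M matrix with columns Phi_i

small = A.T @ A                         # M x M instead of N^2 x N^2
mu, v = np.linalg.eigh(small)           # eigenpairs of A^T A (ascending)

u1 = A @ v[:, -1]                       # map v_i -> A v_i (largest mu here)
u1 /= np.linalg.norm(u1)                # normalize to unit length

C = A @ A.T                             # the big matrix, built only to verify
print(np.allclose(C @ u1, mu[-1] * u1)) # True: u1 is an eigenvector of A A^T
```

Only the small M x M decomposition would be done in practice; building C here is just a check of the identity $AA^T(Av_i) = \mu_i(Av_i)$.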
What are Eigenfaces?


Turk and Pentland used the PCA method for face images


All faces are about the same size


Each face image is a data vector


Each eigenvector is itself an image, called an eigenface

Average image

Eigenfaces

Training set (before preprocessing)

Training Set

Eigen Pictures

Significant Components

[Plot: fraction of the total eigenvalue sum captured by the first m eigenvalues, for m = 1 to 10; y-axis from 0.1 to 1]
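The quantity behind a plot like this can be computed directly: the cumulative eigenvalue sum divided by the total, on synthetic data with decaying variances (an illustrative stand-in for the slide's data):

```python
import numpy as np

rng = np.random.default_rng(4)
# 40 samples, 10 dimensions, with variance decaying across dimensions
X = rng.normal(size=(40, 10)) * np.arange(10, 0, -1)

eigvals = np.linalg.eigvalsh(np.cov(X - X.mean(axis=0), rowvar=False))
eigvals = np.sort(eigvals)[::-1]        # descending

# g[m-1] = (sum of the m largest eigenvalues) / (sum of all eigenvalues)
g = np.cumsum(eigvals) / eigvals.sum()
print(np.round(g, 2))                   # climbs toward 1.0 as m grows
```

Picking the smallest m where g exceeds a threshold (say 0.9) is a common way to choose the number of significant components.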

Recognition of Training set

L: No. of eigenvectors    Recognition Rate
10                        10 of 10
9                         10 of 10
8                         10 of 10
7                         9 of 10
6                         8 of 10
5                         8 of 10

Test set: Noisy images

[Example noisy test images at noise levels $P_n = 0.1$, $P_n = 0.2$, and $P_n = 0.3$]
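One plausible way to generate such test images (the slides do not specify the noise model, so this salt-and-pepper-style corruption on synthetic data is an assumption): each pixel is replaced, with probability P_n, by a random extreme value.

```python
import numpy as np

rng = np.random.default_rng(5)

def add_noise(image, p_n):
    """Replace each pixel with 0 or 1 with probability p_n (assumed model)."""
    noisy = image.copy()
    mask = rng.random(image.shape) < p_n          # which pixels to corrupt
    noisy[mask] = rng.integers(0, 2, size=mask.sum()).astype(image.dtype)
    return noisy

img = rng.random((8, 8))                          # tiny synthetic "image"
noisy = add_noise(img, 0.3)
print(np.mean(noisy != img))                      # roughly 0.3 of pixels changed
```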

Recognition of Noisy images

P_n: Probability of Noise    Recognition Rate
0.10                         10 of 10
0.20                         9 of 10
0.30                         3 of 10

Summary


PCA gives a high compression rate


Performance is good when noise is present


Performance is very bad if the scale of the image is changed

References

1) Smith, L.I. (2002), "A tutorial on Principal Components Analysis", http://csnet.otago.ac.nz/cosc453/student_tutorials/principal_components.pdf.

2) Zhao, W., Chellappa, R., Rosenfeld, A., Phillips, P.J. (2000), "Face Recognition: A literature survey", UMD CfAR Technical Report CAR-TR-948, http://citeseer.ist.psu.edu/zhao00face.html.

3) Turk, M. and Pentland, A. (1991), "Eigenfaces for recognition", Journal of Cognitive Neuroscience, vol. 3, no. 1, pp. 71-86.

4) "Principal components analysis", http://en.wikipedia.org/wiki/Principal_component_analysis.

5) "Eigenface", http://en.wikipedia.org/wiki/Eigenface.

6) Dailey, M. (2005), "Matt's Matlab Tutorial Source Code Page", http://ai.ucsd.edu/Tutorial/matlab.html.