Lecture 13-14: Face Recognition and Subspace/Manifold Learning

AI and Robotics

Nov 17, 2013

EE4-62 MLCV

Lecture 13-14: Face Recognition and Subspace/Manifold Learning

Tae-Kyun Kim


Face Image Tagging and Retrieval

- Face tagging at commercial weblogs
- Key issues:
  - User interaction for face tags
  - Representation of data accumulated over a long time
  - Online and efficient learning
- An active research area in the Face Recognition Test and MPEG-7, for face image retrieval and automatic passport control
- Our proposal was promoted to the MPEG-7 ISO/IEC standard


Principal Component Analysis (PCA)

- Maximum variance formulation of PCA
- Minimum-error formulation of PCA
- Probabilistic PCA


Maximum Variance Formulation of PCA
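The slides for this formulation did not survive extraction; in outline, the standard derivation (as in Bishop's PRML, which this treatment appears to follow) is:

```latex
% Maximize the variance of the projection onto a unit vector u_1:
\max_{u_1}\; u_1^{\top} S u_1
\quad\text{s.t.}\quad u_1^{\top} u_1 = 1,
\qquad
S = \frac{1}{N}\sum_{n=1}^{N} (x_n - \bar{x})(x_n - \bar{x})^{\top} .
% Introduce a Lagrange multiplier \lambda_1 and set the gradient to zero:
\nabla_{u_1}\!\left[\, u_1^{\top} S u_1 + \lambda_1 \left(1 - u_1^{\top} u_1\right) \right] = 0
\;\Longrightarrow\; S u_1 = \lambda_1 u_1 .
```

So u1 is an eigenvector of S, and the attained variance u1ᵀ S u1 = λ1 is largest for the eigenvector with the largest eigenvalue; subsequent components maximize variance subject to orthogonality to the earlier ones.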


Minimum-error formulation of PCA
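The slides here are likewise lost; the standard minimum-error view (equivalent to the maximum-variance one) is, in outline:

```latex
% Approximate each x_n in an M-dimensional subspace spanned by orthonormal u_i:
\tilde{x}_n = \sum_{i=1}^{M} z_{ni}\, u_i + \sum_{i=M+1}^{D} b_i\, u_i ,
\qquad
J = \frac{1}{N}\sum_{n=1}^{N} \left\| x_n - \tilde{x}_n \right\|^2 .
% At the optimum, the residual error equals the sum of the discarded eigenvalues of S:
J_{\min} = \sum_{i=M+1}^{D} \lambda_i .
```

Minimizing J therefore discards the D − M directions with the smallest eigenvalues, selecting the same leading-eigenvector subspace as the maximum-variance formulation.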


Applications of PCA to Face Recognition


(Recap) Geometrical interpretation of PCA


Principal components are the vectors in the direction of the
maximum variance of the projection samples.


Each two
-
dimensional data point is transformed to a single
variable z1 representing the projection of the data point onto
the eigenvector u1.


The data points projected onto u1 has the max variance.


Infer the inherent structure of high dimensional data.


The intrinsic dimensionality of data is much smaller.



For given 2D data
points, u1 and u2 are
found as PCs
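The recap above can be reproduced numerically; the following is a minimal NumPy sketch (the synthetic 2D data and the names u1, u2, z1 are illustrative, chosen to mirror the slide's notation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2D data with one dominant direction of variation
A = np.array([[3.0, 1.0], [1.0, 0.5]])
X = rng.standard_normal((200, 2)) @ A.T        # 200 samples, 2 dims

Xc = X - X.mean(axis=0)                        # center the data
S = Xc.T @ Xc / len(Xc)                        # 2x2 covariance matrix

evals, evecs = np.linalg.eigh(S)               # eigenvalues in ascending order
order = np.argsort(evals)[::-1]                # sort descending
u1, u2 = evecs[:, order[0]], evecs[:, order[1]]

z1 = Xc @ u1                                   # each 2D point -> one variable z1

# The projection onto u1 attains the maximum variance
assert z1.var() >= (Xc @ u2).var()
```

Here `z1.var()` equals the largest eigenvalue of S, which is exactly the maximum-variance property the recap states.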


Eigenfaces


- Collect a set of face images.
- Normalize for scale and orientation (using eye locations).
- Construct the covariance matrix and obtain its eigenvectors.

For images of width w and height h, each face is a vector of dimension D = wh:

X = [x_1, ..., x_N] ∈ R^(D×N)
S = (1/N) X X^T
S U = U Λ,  U ∈ R^(D×M)
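A NumPy sketch of this construction, with random arrays standing in for the N normalized face images (all sizes here are illustrative assumptions). Since D = wh is typically much larger than N, the sketch eigendecomposes the small N×N matrix XᵀX/N instead of the D×D covariance S, a standard trick that yields the same leading eigenvectors:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for N normalized face images of size h x w (real faces assumed elsewhere)
N, h, w = 50, 16, 16
D = w * h                                      # D = wh, per the slide
faces = rng.random((N, h, w))

X = faces.reshape(N, D).T                      # X in R^(D x N), one face per column
mean_face = X.mean(axis=1, keepdims=True)
Xc = X - mean_face                             # center on the mean face

# For D >> N, eigendecompose the small N x N matrix (1/N) X^T X
# and map its eigenvectors back through X ("snapshot" trick)
M = 10                                         # number of eigenfaces kept
small = Xc.T @ Xc / N                          # N x N
evals, V = np.linalg.eigh(small)
order = np.argsort(evals)[::-1][:M]            # top-M eigenvalues
U = Xc @ V[:, order]                           # D x M; same nonzero eigenvalues as S
U /= np.linalg.norm(U, axis=0)                 # orthonormal basis: S U = U Lambda
```

If (1/N)XᵀX v = λv, then (1/N)XXᵀ(Xv) = λ(Xv), so the columns of U are eigenvectors of S after normalization.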


Eigenfaces


- Project data onto the subspace:
  Z = U^T X,  Z ∈ R^(M×N)
- Reconstruction is obtained as:
  x̃ = Σ_{i=1}^{M} z_i u_i = U z,  X̃ = U Z
- Use the distance to the subspace, ||x − x̃||, for face recognition.
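The projection, reconstruction, and distance steps can be sketched as follows; a random orthonormal basis U stands in for the eigenfaces of the previous slide, and all names and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
D, N, M = 256, 40, 10

# Stand-in orthonormal basis U (in practice: the M leading eigenfaces)
U, _ = np.linalg.qr(rng.standard_normal((D, M)))
X = rng.standard_normal((D, N))                # centered face vectors, one per column

Z = U.T @ X                                    # project: Z in R^(M x N)
X_rec = U @ Z                                  # reconstruct: x~ = sum_i z_i u_i = U z

# Distance to the subspace, used as the recognition score
dist = np.linalg.norm(X - X_rec, axis=0)       # ||x - x~|| per face
```

Because U@Z is an orthogonal projection, the residual X − X_rec is orthogonal to the subspace, so ||x||² = ||x̃||² + ||x − x̃||² holds per column.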


Matlab Demos: Face Recognition by PCA



- Face images
- Eigenvector and eigenvalue plots
- Face image reconstruction
- Projection coefficients (visualisation of high-dimensional data)
- Face recognition


Probabilistic PCA


- A subspace is spanned by the orthonormal basis (the eigenvectors computed from the covariance matrix).
- Each observation can be interpreted with a generative model.
- Estimate (approximately) the probability of generating each observation under a Gaussian distribution.
- PCA assumes a uniform prior on the subspace; PPCA places a Gaussian distribution on it.
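A minimal NumPy sketch, assuming the standard maximum-likelihood PPCA solution of Tipping and Bishop (which this treatment presumably follows): each observation is modelled as Gaussian with covariance C = WWᵀ + σ²I, with σ² the average of the discarded eigenvalues. The data and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
D, N, M = 5, 500, 2

# Illustrative data: N correlated D-dimensional observations
X = rng.standard_normal((N, D)) @ rng.standard_normal((D, D))
mu = X.mean(axis=0)
S = np.cov(X, rowvar=False, bias=True)         # D x D sample covariance

evals, evecs = np.linalg.eigh(S)
order = np.argsort(evals)[::-1]
lam, U = evals[order], evecs[:, order]         # eigenvalues descending

# ML PPCA: noise variance = mean of the D - M discarded eigenvalues,
# W = U_M (Lambda_M - sigma2 I)^(1/2)  (rotation R taken as identity)
sigma2 = lam[M:].mean()
W = U[:, :M] * np.sqrt(lam[:M] - sigma2)

# Each observation x ~ N(mu, C) with C = W W^T + sigma2 I
C = W @ W.T + sigma2 * np.eye(D)

def log_likelihood(x):
    """Gaussian log-density of one observation under the PPCA model."""
    d = x - mu
    return -0.5 * (D * np.log(2 * np.pi)
                   + np.linalg.slogdet(C)[1]
                   + d @ np.linalg.solve(C, d))
```

By construction, C reproduces the top M sample eigenvalues exactly and replaces the remaining ones with the single noise level σ², which is how PPCA interpolates between the subspace and an isotropic Gaussian.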


Continuous Latent Variables


Probabilistic PCA


Maximum likelihood PCA


Limitations of PCA


PCA vs. LDA (Linear Discriminant Analysis)

PCA is unsupervised learning: it finds the directions of maximum variance without using class labels. LDA is supervised: it uses class labels to find directions that separate the classes.


PCA vs. Kernel PCA

A linear model such as PCA captures a linear manifold, i.e. a subspace; kernel PCA can capture a nonlinear manifold.


PCA vs. ICA (Independent Component Analysis)

PCA rests on a Gaussian distribution assumption; ICA instead seeks statistically independent (non-Gaussian) components. (Figure: independent components IC1, IC2 vs. principal components PC1, PC2.)