Object Recognition 2: Appearance-based, Tracking

Τεχνίτη Νοημοσύνη και Ρομποτική (Artificial Intelligence and Robotics)
16 Oct 2013

E-mail: hogijung@hanyang.ac.kr
http://web.yonsei.ac.kr/hgjung
Contour-based Object Recognition

1. SVM (Support Vector Machine)
2. Classifier-based Object Recognition
3. Viola-Jones Method: AdaBoost-based Face Detection
4. Kalman Filter-based Tracking, A Posteriori Probability
Support Vector Machine

Optimal Separating Hyperplanes [6]
Optimal Separating Hyperplanes [6]

Distance between a plane and a point
Optimal Separating Hyperplanes [6]
Optimal Separating Hyperplanes [6]
Lagrange Multipliers [9]

Consider the two-dimensional optimization problem: find x and y to maximize f(x, y) subject to a constraint (shown in red) g(x, y) = c. We can visualize contours of f, given by f(x, y) = d, together with the constraint curve g(x, y) = c.
Lagrange Multipliers [9]

When f(x, y) reaches a maximum along the path g(x, y) = c, the contour line for g = c meets contour lines of f tangentially. Since the gradient of a function is perpendicular to its contour lines, this is the same as saying that the gradients of f and g are parallel.

Contour map: the red line shows the constraint g(x, y) = c, and the blue lines are contours of f(x, y). The point where the red line tangentially touches a blue contour is our solution.
Lagrange Multipliers [9]

To incorporate these conditions into one equation, we introduce an auxiliary function

Λ(x, y, λ) = f(x, y) + λ (g(x, y) − c)

and solve

∇_{x,y,λ} Λ(x, y, λ) = 0.
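The Lagrange-multiplier recipe above can be checked mechanically with sympy. This is a minimal sketch on an illustrative problem not taken from the slides: maximize f(x, y) = x + y subject to g(x, y) = x² + y² = 1.

```python
# Maximize f = x + y subject to x^2 + y^2 = 1 via the auxiliary function
# Lambda = f + lam * (g - c), solving grad Lambda = 0.
import sympy as sp

x, y, lam = sp.symbols("x y lam", real=True)
f = x + y
g = x**2 + y**2          # constraint g(x, y) = c with c = 1

Lam = f + lam * (g - 1)  # auxiliary function

# Solve grad Lambda = 0 in (x, y, lam); gives both stationary points.
sols = sp.solve([sp.diff(Lam, v) for v in (x, y, lam)], [x, y, lam], dict=True)

# Pick the solution that maximizes f: x = y = sqrt(2)/2
best = max(sols, key=lambda s: f.subs(s))
print(best[x], best[y])
```

The tangency condition of the previous slide appears here as 1 + 2λx = 0 and 1 + 2λy = 0, i.e. ∇f parallel to ∇g.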
Kuhn-Tucker Theorem [6]
The Lagrangian Dual Problem [6]
Dual Problem [10]

Minimize (in w, b)

(1/2)‖w‖²

subject to (for any i = 1, …, n)

y_i (w·x_i − b) ≥ 1.

One could be tempted to express the previous problem by means of non-negative Lagrange multipliers α_i as

min over (w, b, α ≥ 0) of { (1/2)‖w‖² − Σ_{i=1..n} α_i [y_i (w·x_i − b) − 1] },

but this would be wrong: whenever some constraint holds strictly, we could find the minimum by sending the corresponding α_i to ∞. Nevertheless, the previous constrained problem can be expressed as

min over (w, b) of max over (α ≥ 0) of { (1/2)‖w‖² − Σ_{i=1..n} α_i [y_i (w·x_i − b) − 1] };

that is, we look for a saddle point.
The Lagrangian Dual Problem [6]
The Lagrangian Dual Problem [6]
Support Vectors [6]
Non-separable Case [6]
Non-separable Case [6]
Non-separable Case [6]
Non-separable Case [6]
Non-linear SVMs [6]
Non-linear SVMs [6]
Non-linear SVMs [6]
Implicit Mappings: An Example [6]
Kernel Methods [6]
Kernel Methods [6]
Kernel Methods [6]
Kernel Methods [6]

Kernel Functions
Architecture of an SVM [6]
Case Study: XOR [6]
References

1. Frank Masci, "An Introduction to Principal Component Analysis," http://web.ipac.caltech.edu/staff/fmasci/home/statistics_refs/PrincipalComponentAnalysis.pdf
2. Richard O. Duda, Peter E. Hart, David G. Stork, Pattern Classification, second edition, John Wiley & Sons, Inc., 2001.
3. Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2007.
4. Sergios Theodoridis, Konstantinos Koutroumbas, Pattern Recognition, Academic Press, 2006.
5. Ho Gi Jung, Yun Hee Lee, Pal Joo Yoon, In Yong Hwang, and Jaihie Kim, "Sensor Fusion Based Obstacle Detection/Classification for Active Pedestrian Protection System," Lecture Notes in Computer Science, Vol. 4292, 294-305.
6. Ricardo Gutierrez-Osuna, "Pattern Recognition, Lecture Notes," available at http://research.cs.tamu.edu/prism/lectures.htm
7. Paul Viola, Michael Jones, "Robust real-time object detection," International Journal of Computer Vision, 57(2), 2004, 137-154.
8. Wikipedia, "Principal component analysis," available at http://en.wikipedia.org/wiki/Principal_component_analysis
9. Wikipedia, "Lagrange multipliers," http://en.wikipedia.org/wiki/Lagrange_multipliers
10. Wikipedia, "Support vector machine," http://en.wikipedia.org/wiki/Support_vector_machine
Classifier-Based Object Recognition
Two Phases of Object Recognition

1. Hypothesis Generation (HG) Phase: it should include all possibilities, and it is expected to be relatively light to implement.
2. Hypothesis Verification (HV) Phase: it should be able to distinguish the real objects, and it is expected to be heavier to implement; it is based on a classifier.

HV consists of a feature stage and a classification stage.
• Features: PCA, ICA, Local Receptive Field (LRF), Histogram of Oriented Gradients (HOG), Gabor Filter Bank (GFB)
• Classifier: LDA, Neural Network, Support Vector Machine (SVM)
GFB-SVM-Based Pedestrian Recognition [1][2][3]
Gabor Filter

The Gabor filter is the product of a 2D Gaussian function and a 2D sine wave. It can detect the orientation-specific, frequency-specific components of an image at a specific spatial location.

The Gabor filter in the spatial domain is defined as in Eq. (1):

g(x, y) = (1 / (2π σ_x σ_y)) exp[ −(1/2)(x²/σ_x² + y²/σ_y²) + 2πjWx ]   (1)

Its definition in the frequency domain is as in Eq. (2):

G(u, v) = exp[ −(1/2)((u − W)²/σ_u² + v²/σ_v²) ]   (2)

The Gabor wavelet family can be defined by rotating and scaling this basic form:

g_{s,k}(x, y) = a⁻ˢ g(x̄, ȳ),  where  x̄ = x cos(kπ/K) + y sin(kπ/K),  ȳ = −x sin(kπ/K) + y cos(kπ/K),

with s the scale index, k the orientation index, and K the number of orientations.
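Eq. (1) can be sampled directly on a pixel grid. Below is a minimal numpy sketch; the parameter values (kernel size, σ_x, σ_y, W) are illustrative assumptions, not taken from the slides.

```python
# Sampling the spatial-domain Gabor filter of Eq. (1):
# g(x, y) = Gaussian envelope * complex sinusoid along x.
import numpy as np

def gabor_kernel(size=15, sigma_x=3.0, sigma_y=3.0, W=0.25):
    """Complex Gabor kernel on a size x size grid centered at (0, 0)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    envelope = np.exp(-0.5 * (x**2 / sigma_x**2 + y**2 / sigma_y**2))
    envelope /= 2.0 * np.pi * sigma_x * sigma_y
    return envelope * np.exp(2j * np.pi * W * x)  # 2*pi*j*W*x term

g = gabor_kernel()
# |g| equals the Gaussian envelope, so the magnitude peaks at the center.
print(g.shape, np.abs(g).argmax())
```

The rotated bank members g_{s,k} are obtained by substituting the rotated coordinates (x̄, ȳ) into the same function.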
Gabor Filter Bank
Support Vector Machine

SVM is a classification method based on a hyperplane. The decision boundary is defined by an N-dimensional vector w and a scalar b, as in equation (1). The decision function is defined by the sign of the projection of the input vector x onto the hyperplane normal, as in equation (2).

w·x + b = 0,  where w ∈ ℝᴺ, b ∈ ℝ   (1)

f(x) = sign(w·x + b)   (2)

SVM assumes that the separating hyperplane with maximum margin between the two classes is optimal. Given a data set {x_1, …, x_n} with known class labels y_i ∈ {−1, 1} for each x_i, the decision boundary must satisfy y_i (w·x_i + b) ≥ 0 for all data. SVM training is the procedure that finds the w maximizing the margin, as depicted in Figure 2. This can be achieved by minimizing (1/2)‖w‖².

SVM training is converted into a constrained optimization problem by introducing Lagrange multipliers α_i, as shown in equation (3):

maximize W(α) = Σ_{i=1..n} α_i − (1/2) Σ_{i=1..n} Σ_{j=1..n} α_i α_j y_i y_j x_iᵀx_j
subject to α_i ≥ 0,  Σ_{i=1..n} α_i y_i = 0   (3)
Support Vector Machine

Equation (3) cannot be applied to non-separable data. For that case, the only difference from the optimal hyperplane formulation is that the α_i now have an upper bound of C:

maximize W(α) = Σ_{i=1..n} α_i − (1/2) Σ_{i=1..n} Σ_{j=1..n} α_i α_j y_i y_j x_iᵀx_j
subject to 0 ≤ α_i ≤ C,  Σ_{i=1..n} α_i y_i = 0

Therefore, the SVM learning problem using an RBF kernel function for the non-separable case can be expressed as in equation (6):

maximize W(α) = Σ_{i=1..n} α_i − (1/2) Σ_{i=1..n} Σ_{j=1..n} α_i α_j y_i y_j exp(−‖x_i − x_j‖² / (2σ²))
subject to 0 ≤ α_i ≤ C,  Σ_{i=1..n} α_i y_i = 0   (6)
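The boxed dual above can be solved by simple projected gradient ascent. This is a minimal sketch, not the method used in the cited papers: it assumes a bias-free decision function f(x) = sign(Σ_i α_i y_i K(x_i, x)), which lets us drop the equality constraint Σ_i α_i y_i = 0, and uses a linear kernel and made-up toy data.

```python
# Projected gradient ascent on the box-constrained SVM dual (bias-free form):
# maximize sum(alpha) - 0.5 * alpha' Q alpha, with 0 <= alpha_i <= C.
import numpy as np

X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
C = 10.0

K = X @ X.T                  # linear kernel matrix K_ij = x_i . x_j
Q = np.outer(y, y) * K       # Q_ij = y_i y_j K_ij from W(alpha)

alpha = np.zeros(len(y))
for _ in range(2000):
    grad = 1.0 - Q @ alpha                        # gradient of W(alpha)
    alpha = np.clip(alpha + 0.01 * grad, 0.0, C)  # ascend, project to [0, C]

def predict(x):
    """f(x) = sign(sum_i alpha_i y_i K(x_i, x))."""
    return np.sign((alpha * y) @ (X @ x))

print([predict(x) for x in X])
```

At the optimum the α_i concentrate on the points closest to the boundary, i.e. the support vectors; swapping `K` for the RBF matrix exp(−‖x_i − x_j‖²/(2σ²)) gives the kernelized version of Eq. (6).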
Support Vector Machine [2]
References

1. Ho Gi Jung, Pal Joo Yoon, and Jaihie Kim, "Genetic Algorithm-Based Optimization of SVM-Based Pedestrian Classifier," The 22nd International Technical Conference on Circuits/Systems, Computers and Communications (ITC-CSCC 2007), Busan, Korea, Jul. 8-11, 2007, pp. 783-784.
2. Ho Gi Jung, Jaihie Kim, "Constructing a pedestrian recognition system with a public open database, without the necessity of re-training: an experimental study," Pattern Analysis and Applications, published online 7 Apr. 2009.
3. Ho Gi Jung, Yun Hee Lee, Pal Joo Yoon, In Yong Hwang, and Jaihie Kim, "Sensor Fusion Based Obstacle Detection/Classification for Active Pedestrian Protection System," LNCS Vol. 4292, Part II, 294-305, November 2006.
4. David Lowe, "The Viola/Jones Face Detector," UBC lecture material of computer vision, Spring 2007.
Viola-Jones Method: AdaBoost-based Face Detection
Face Detector by AdaBoost [4]
AdaBoost [4]

• Given a set of weak classifiers
  – None much better than random
• Iteratively combine classifiers
  – Form a linear combination
  – Training error converges to 0 quickly
  – Test error is related to training margin

Originally, each weak classifier outputs h_j(x) ∈ {−1, 1}, and the combined classifier is the linear combination

C(x) = Σ_t α_t h_t(x) + b
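The boosting loop in the bullets above can be sketched compactly. This is a minimal AdaBoost with one-dimensional decision stumps on made-up toy data, not the Viola-Jones rectangle-feature learner itself.

```python
# AdaBoost with decision stumps: reweight examples each round, combine
# the stumps linearly with weights alpha_t = 0.5 * ln((1 - err) / err).
import numpy as np

X = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1, 1, 1, -1, -1, -1])

def best_stump(w):
    """Pick the threshold/polarity minimizing weighted error on 1-D data."""
    best = None
    for thr in X + 0.5:
        for pol in (1, -1):
            pred = np.where(X < thr, pol, -pol)
            err = w[pred != y].sum()
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best

w = np.ones(len(y)) / len(y)              # uniform example weights
stumps = []
for _ in range(3):                        # T rounds of boosting
    err, thr, pol = best_stump(w)
    err = np.clip(err, 1e-10, 1 - 1e-10)  # guard log against err = 0
    alpha = 0.5 * np.log((1 - err) / err)
    pred = np.where(X < thr, pol, -pol)
    w *= np.exp(-alpha * y * pred)        # misclassified weights increased
    w /= w.sum()
    stumps.append((alpha, thr, pol))

def C(x):
    """Final classifier: sign of the linear combination of weak stumps."""
    s = sum(a * (p if x < t else -p) for a, t, p in stumps)
    return 1 if s > 0 else -1

print([C(x) for x in X])
```

In Viola-Jones, the weak classifiers h_t are thresholds on single rectangle-feature values rather than on a raw coordinate.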
Face Detector by AdaBoost [4]

(Diagram: weak classifier 1 → weights of misclassified examples increased → weak classifier 2 → weak classifier 3 → …; the final classifier is a linear combination of the weak classifiers.)
Haar-like Feature [7]

The simple features used are reminiscent of Haar basis functions, which have been used by Papageorgiou et al. (1998).

Three kinds of features: the two-rectangle feature, three-rectangle feature, and four-rectangle feature.

Given that the base resolution of the detector is 24×24, the exhaustive set of rectangle features is quite large: 160,000.
Haar-like Feature: Integral Image [7]

Rectangle features can be computed very rapidly using an intermediate representation for the image which we call the integral image. The integral image at location x, y contains the sum of the pixels above and to the left of x, y, inclusive:

ii(x, y) = Σ_{x′ ≤ x, y′ ≤ y} i(x′, y′)

where ii(x, y) is the integral image and i(x, y) is the original image (see Fig. 2). Using the following pair of recurrences:

s(x, y) = s(x, y − 1) + i(x, y)
ii(x, y) = ii(x − 1, y) + s(x, y)

(where s(x, y) is the cumulative row sum, s(x, −1) = 0, and ii(−1, y) = 0) the integral image can be computed in one pass over the original image.
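The pair of recurrences above translates directly into code. A small sketch in numpy, with an illustrative sample image; `rect_sum` is the four-array-reference lookup used in the next slide:

```python
# One-pass integral image via the cumulative-row-sum recurrence, plus a
# rectangle sum evaluated with four array references.
import numpy as np

def integral_image(img):
    """ii(x, y) = sum of img over rows 0..y, cols 0..x, inclusive."""
    h, w = img.shape
    ii = np.zeros((h, w), dtype=np.int64)
    for yy in range(h):
        s = 0                                   # cumulative row sum s(x, y)
        for xx in range(w):
            s += img[yy, xx]                    # s(x, y) = s(x-1, y) + i(x, y)
            above = ii[yy - 1, xx] if yy > 0 else 0
            ii[yy, xx] = above + s              # ii(x, y) = ii(x, y-1) + s(x, y)
    return ii

def rect_sum(ii, top, left, bottom, right):
    """Sum over rows top..bottom, cols left..right: D - B - C + A."""
    A = ii[top - 1, left - 1] if top > 0 and left > 0 else 0
    B = ii[top - 1, right] if top > 0 else 0
    C = ii[bottom, left - 1] if left > 0 else 0
    D = ii[bottom, right]
    return D - B - C + A

img = np.arange(16).reshape(4, 4)
ii = integral_image(img)
print(rect_sum(ii, 1, 1, 2, 2))   # 5 + 6 + 9 + 10 = 30
```

A two-rectangle Haar feature is then just the difference of two such `rect_sum` calls, so each feature costs a constant number of lookups regardless of its size.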
Haar-like Feature: Integral Image [7]

Using the integral image, any rectangular sum can be computed in four array references (see Fig. 3).

Our hypothesis, which is borne out by experiment, is that a very small number of these features can be combined to form an effective classifier. The main challenge is to find these features.
Feature Selection [4]

• For each round of boosting:
  – Evaluate each rectangle filter on each example
  – Sort examples by filter values
  – Select the best threshold for each filter (minimum Z)
  – Select the best filter/threshold combination (= feature)
  – Reweight the examples
• With M filters, T thresholds, N examples, and learning time L:
  – O(MT · L(MTN)) for the naïve wrapper method
  – O(MN) for the AdaBoost feature selector
Cascaded Classifier [4]
Cascaded Classifier [4]
References

1. Frank Masci, "An Introduction to Principal Component Analysis," http://web.ipac.caltech.edu/staff/fmasci/home/statistics_refs/PrincipalComponentAnalysis.pdf
2. Richard O. Duda, Peter E. Hart, David G. Stork, Pattern Classification, second edition, John Wiley & Sons, Inc., 2001.
3. Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2007.
4. Sergios Theodoridis, Konstantinos Koutroumbas, Pattern Recognition, Academic Press, 2006.
5. Ho Gi Jung, Yun Hee Lee, Pal Joo Yoon, In Yong Hwang, and Jaihie Kim, "Sensor Fusion Based Obstacle Detection/Classification for Active Pedestrian Protection System," Lecture Notes in Computer Science, Vol. 4292, 294-305.
6. Ricardo Gutierrez-Osuna, "Pattern Recognition, Lecture Notes," available at http://research.cs.tamu.edu/prism/lectures.htm
7. Paul Viola, Michael Jones, "Robust real-time object detection," International Journal of Computer Vision, 57(2), 2004, 137-154.
8. Wikipedia, "Principal component analysis," available at http://en.wikipedia.org/wiki/Principal_component_analysis
9. Wikipedia, "Lagrange multipliers," http://en.wikipedia.org/wiki/Lagrange_multipliers
10. Wikipedia, "Support vector machine," http://en.wikipedia.org/wiki/Support_vector_machine
Kalman Filter-based Tracking, A Posteriori Probability
2D Kalman Filtering [1]

- State vector: the x coordinate, y coordinate, x-axis velocity, and y-axis velocity of the light blob's center of mass.
- A Kalman filter using an LTI (Linear Time-Invariant) model of the 2D image plane.

State vector and state transition matrix (defining the system dynamics):

x = (v_x, v_y, x, y)ᵀ

A = [[1, 0, 0, 0],
     [0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1]]   (state transition matrix)

Measurement matrix (only the position is observed):

H = [[0, 0, 1, 0],
     [0, 0, 0, 1]]

The state covariance Q and the measurement covariance R are diagonal matrices defining the system noise level.

Applying x_k = A x_{k−1} componentwise:

v_x(k) = v_x(k−1)
v_y(k) = v_y(k−1)
x(k) = v_x(k−1) + x(k−1)
y(k) = v_y(k−1) + y(k−1)
1) Prediction step

x̂_k⁻ = A x̂_{k−1}
P_k⁻ = A P_{k−1} Aᵀ + Q

2) Matching step

- Search for the measurement z_k nearest to the predicted position.
- If the distance to that object is smaller than a threshold based on P_k⁻, the correspondence is established and we proceed to the correction step; if there is no correspondence, the track is terminated.

3) Correction step

K_k = P_k⁻ Hᵀ (H P_k⁻ Hᵀ + R)⁻¹
x̂_k = x̂_k⁻ + K_k (z_k − H x̂_k⁻)
P_k = P_k⁻ − K_k H P_k⁻

4) Creation step: create a new track for each z_k that was not matched.
2D Kalman Filtering [1]

x̂_k⁻ holds the predicted x, y coordinates and z_k the detected x, y coordinates; the corrected estimate is their weighted combination:

x̂_k = (I − K_k H) x̂_k⁻ + K_k z_k
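The model and the predict/correct steps above can be sketched in a few lines of numpy, with state x = (v_x, v_y, x, y). The numeric values of Q and R here are illustrative assumptions.

```python
# Constant-velocity 2D Kalman filter: predict with x = A x, correct with
# the Kalman gain K; only the (x, y) position is measured through H.
import numpy as np

A = np.array([[1.0, 0.0, 0.0, 0.0],   # v_x(k) = v_x(k-1)
              [0.0, 1.0, 0.0, 0.0],   # v_y(k) = v_y(k-1)
              [1.0, 0.0, 1.0, 0.0],   # x(k) = v_x(k-1) + x(k-1)
              [0.0, 1.0, 0.0, 1.0]])  # y(k) = v_y(k-1) + y(k-1)
H = np.array([[0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
Q = 0.1 * np.eye(4)                   # assumed state (process) covariance
R = 0.05 * np.eye(2)                  # assumed measurement covariance

def predict(x, P):
    return A @ x, A @ P @ A.T + Q

def correct(x_pred, P_pred, z):
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x = x_pred + K @ (z - H @ x_pred)
    P = P_pred - K @ H @ P_pred
    return x, P

# Track a blob moving one pixel per frame along each axis.
x, P = np.zeros(4), np.eye(4)
for k in range(1, 6):
    x, P = predict(x, P)
    x, P = correct(x, P, z=np.array([float(k), float(k)]))
print(np.round(x, 2))   # estimates approach (1, 1, 5, 5)
```

The matching and creation steps of the slide sit outside this loop: each existing track runs its own predict/correct pair, and unmatched measurements seed new (x, P) pairs.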
Classification of Tracked Object [2]

Maximum A Posteriori Probability

O is the output obtained by applying a C-class classifier to an object tracked N times. That is, the class c of an object satisfies

1 ≤ c ≤ C,

the classifier output o(n) for the object at a specific time n satisfies

1 ≤ o(n) ≤ C,

and the classifier output O for an object tracked N times is

O = o_1 o_2 … o_N.

After applying the classifier to an object tracked N times and obtaining the output O, the object's class is found as

ĉ = argmax_c P(c | O).

P(O | c) is the result obtained by applying the classifier to samples of class c; it is a quantity that can be measured in advance.
Classification of Tracked Object [2]

Using Bayes' rule,

P(c | O) = P(O | c) P(c) / P(O) = P(O | c) P(c) / Σ_{c′=1..C} P(O | c′) P(c′).

If the classifier outputs are mutually independent,

P(O | c) = P(o_n | c) P(o_{n−1} | c) … P(o_2 | c) P(o_1 | c),

so it can be obtained as a cumulative product.
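The MAP decision above amounts to multiplying per-frame likelihoods and normalizing. A minimal sketch with two classes; the confusion matrix P(o | c) ("measured in advance") and the tracked output sequence are made-up numbers.

```python
# MAP classification of a tracked object: P(c | O) proportional to
# P(c) * prod_n P(o_n | c), with the classifier outputs assumed independent.
import numpy as np

# P(o | c): row = true class c, column = classifier output o (C = 2 classes)
P_o_given_c = np.array([[0.7, 0.3],    # c = 0, e.g. pedestrian
                        [0.2, 0.8]])   # c = 1, e.g. non-pedestrian
prior = np.array([0.5, 0.5])           # P(c)

outputs = [0, 0, 1, 0]                 # o_1 ... o_N observed while tracking

# P(O | c) accumulated as a cumulative product over the N frames
likelihood = np.array([np.prod([P_o_given_c[c, o] for o in outputs])
                       for c in range(2)])
posterior = likelihood * prior
posterior /= posterior.sum()           # Bayes' rule denominator

print(posterior.argmax())              # MAP class
```

In practice the products are usually accumulated as sums of logarithms, which avoids underflow when N grows large.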
References

1. Seong-Pil Kim, Understanding the Kalman Filter with MATLAB (in Korean), Ajin Publishing, 2010.
2. Ho Gi Jung, Jaihie Kim, "Constructing a pedestrian recognition system with a public open database, without the necessity of re-training: an experimental study," Pattern Analysis and Applications (2010) 12: 223-233.