Mean-shift and its application for object tracking


References:

Dorin Comaniciu, Visvanathan Ramesh, and Peter Meer, "Kernel-Based Object Tracking," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 25, no. 5, May 2003.

Dorin Comaniciu and Peter Meer, "Mean Shift: A Robust Approach Toward Feature Space Analysis," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 5, May 2002.

Dorin Comaniciu and Peter Meer, "Mean Shift Analysis and Applications," Proceedings of the Seventh IEEE International Conference on Computer Vision, vol. 2, pp. 1197-1203, September 1999.

Andy {andrey.korea@gmail.com}
Intelligent Systems Lab.

What is mean-shift?

A tool for finding modes in a set of data samples, manifesting an underlying probability density function (PDF) in R^N.

- Data → discrete PDF representation (non-parametric density estimation)
- Discrete PDF representation → PDF analysis (non-parametric density GRADIENT estimation = mean shift)


Intuitive description

Objective: find the densest region.

Starting from a region of interest (search window), compute the center of mass of the samples inside the window, shift the window by the mean-shift vector (from the window center to the center of mass), and repeat until the window stops moving.
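The figures for this procedure are not reproduced here; as a stand-in, the following minimal Python sketch (illustrative, not the papers' code) implements the window / center-of-mass iteration described above with a flat window of radius h.

```python
# Minimal sketch of the intuitive procedure above (assumed, not from the slides):
# repeatedly replace the window center by the center of mass of the samples
# that fall inside a radius-h window, until the shift is negligible.
import numpy as np

def mean_shift_mode(points, start, h=1.0, eps=1e-3, max_iter=100):
    y = np.asarray(start, dtype=float)
    for _ in range(max_iter):
        in_window = np.linalg.norm(points - y, axis=1) < h   # region of interest
        if not np.any(in_window):
            break
        center_of_mass = points[in_window].mean(axis=0)
        shift = center_of_mass - y                           # mean-shift vector
        y = center_of_mass
        if np.linalg.norm(shift) < eps:
            break
    return y

# toy example: samples drawn around a single mode at (2, 2)
rng = np.random.default_rng(0)
samples = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(500, 2))
print(mean_shift_mode(samples, start=[1.0, 1.0], h=1.5))  # converges near the mode at (2, 2)
```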


Modality analysis

- Tessellate the space with windows and run the procedure in parallel.
- The blue data points were traversed by the windows toward the mode.
- The window tracks signify the steepest-ascent directions.

Mean shift for data clustering


Data clustering example (figures): simple modal structures vs. complex modal structures.


Data clustering example (figures): initial window centers → modes found → modes after pruning → final clusters.
Feature space: L*u*v color representation.
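To make the clustering use of the procedure concrete, here is a hedged sketch (the helper names and the mode-merging rule are assumptions, not the papers' algorithm) that runs the mode-seeking iteration from every sample and merges nearby converged modes into clusters.

```python
# Hedged sketch of mean-shift clustering: seek a mode from every sample,
# then merge converged locations that fall close together ("pruning" of modes).
import numpy as np

def mean_shift_cluster(points, h=1.0, eps=1e-3, max_iter=100):
    modes = []
    for x in points:
        y = x.astype(float)
        for _ in range(max_iter):
            in_win = np.linalg.norm(points - y, axis=1) < h
            if not np.any(in_win):
                break
            new_y = points[in_win].mean(axis=0)
            if np.linalg.norm(new_y - y) < eps:
                break
            y = new_y
        modes.append(y)
    modes = np.array(modes)

    # merge nearby modes and assign cluster labels
    centers, labels = [], np.empty(len(points), dtype=int)
    for i, m in enumerate(modes):
        for j, c in enumerate(centers):
            if np.linalg.norm(m - c) < h / 2:
                labels[i] = j
                break
        else:
            centers.append(m)
            labels[i] = len(centers) - 1
    return np.array(centers), labels

rng = np.random.default_rng(1)
data = np.vstack([rng.normal([0, 0], 0.3, (100, 2)),
                  rng.normal([3, 3], 0.3, (100, 2))])
centers, labels = mean_shift_cluster(data, h=1.0)
print(centers)   # approximately two cluster centers near (0,0) and (3,3)
```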


Image segmentation examples


General framework: target representation

1. Choose a reference model (target) in the current frame.
2. Choose a feature space.
3. Represent the model in the chosen feature space.

(Figure: the reference model marked in the current frame.)


General framework: target localization

1. Start from the position of the model in the current frame.
2. Search in the model's neighborhood in the next frame.
3. Find the best candidate by maximizing a similarity function.
4. Repeat the same process for the next pair of frames.

(Figure: model and candidate regions in the current frame.)


Target representation

1. Choose a reference target model.
2. Choose a feature space (e.g., a quantized color space).
3. Represent the model by its PDF in the feature space.

(Figure: model color histogram — probability vs. color bin 1, 2, 3, ..., m.)
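As an illustration of this representation, here is a minimal sketch (not from the cited papers) that quantizes RGB values into m bins and builds a normalized histogram, i.e., a discrete PDF, over a target patch; the 4x4x4 RGB quantization is an assumption.

```python
# Minimal sketch (assumed 4x4x4 RGB quantization) of representing a target
# patch by a normalized color histogram, i.e., its discrete PDF q over m bins.
import numpy as np

def color_bin(rgb, bins_per_channel=4):
    """b(x): map an RGB pixel (0..255 per channel) to a bin index in 0..m-1."""
    q = (np.asarray(rgb, dtype=int) * bins_per_channel) // 256
    return int(q[0] * bins_per_channel**2 + q[1] * bins_per_channel + q[2])

def color_pdf(patch, bins_per_channel=4):
    """patch: (H, W, 3) uint8 array; returns a histogram that sums to 1."""
    m = bins_per_channel ** 3
    idx = np.array([color_bin(px, bins_per_channel) for px in patch.reshape(-1, 3)])
    hist = np.bincount(idx, minlength=m).astype(float)
    return hist / hist.sum()

rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(16, 16, 3), dtype=np.uint8)
q = color_pdf(patch)
print(q.shape, q.sum())   # (64,) 1.0
```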

PDF representation

Similarity function: $f(y) = f[\hat{q}, \hat{p}(y)]$

Target model (centered at 0):
$\hat{q} = \{\hat{q}_u\}_{u=1\ldots m}, \qquad \sum_{u=1}^{m}\hat{q}_u = 1$

Target candidate (centered at y):
$\hat{p}(y) = \{\hat{p}_u(y)\}_{u=1\ldots m}, \qquad \sum_{u=1}^{m}\hat{p}_u = 1$

(Figures: model and candidate color histograms, probability vs. color bin 1..m.)

Smoothness of the similarity function

Similarity function: $f(y) = f[\hat{p}(y), \hat{q}]$

Problem: the target is represented by color information only; spatial information is lost.
- f is not smooth in y: large similarity variations occur for adjacent locations.
- Gradient-based optimizations are not robust.

Solution: mask the target with an isotropic kernel in the spatial domain.
- f(y) becomes smooth in y.


PDF with spatial weighting

- $\{x_i\}_{i=1\ldots n}$ : the target pixel locations.
- $k(x)$ : a differentiable, isotropic, convex, monotonically decreasing kernel profile. Peripheral pixels are affected by occlusion and background interference, so they receive smaller weights.
- $b(x)$ : the color bin index (1..m) of pixel $x$.

Probability of feature $u$ in the model (centered at 0):

$\hat{q}_u = C \sum_{i=1}^{n} k\!\left(\|x_i\|^2\right)\delta[b(x_i) - u]$

Probability of feature $u$ in the candidate (centered at $y$):

$\hat{p}_u(y) = C_h \sum_{i=1}^{n_h} k\!\left(\left\|\frac{y - x_i}{h}\right\|^2\right)\delta[b(x_i) - u]$

In both cases $C$ and $C_h$ are normalization factors and the kernel value is the pixel weight.

(Figures: model histogram centered at 0 and candidate histogram centered at y, probability vs. color bin 1..m.)
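A minimal sketch of these weighted histograms follows; the Epanechnikov profile for k and the helper names (`weighted_histogram`, `bin_index`) are assumptions for illustration, not the papers' code.

```python
# Hedged sketch of the weighted histogram above: pixels are weighted by an
# Epanechnikov profile k(x) = 1 - x for 0 <= x < 1, 0 otherwise.
import numpy as np

def k_profile(x):
    return np.where(x < 1.0, 1.0 - x, 0.0)

def weighted_histogram(pixels, bin_index, center, h, m):
    """pixels: (n,2) locations, bin_index: (n,) color bins in 0..m-1."""
    d2 = np.sum(((pixels - center) / h) ** 2, axis=1)   # ||(y - x_i)/h||^2
    w = k_profile(d2)                                   # pixel weights
    hist = np.bincount(bin_index, weights=w, minlength=m)
    s = hist.sum()
    return hist / s if s > 0 else hist                  # normalization (C_h)

# toy usage: a 20x20 patch with random color bins
rng = np.random.default_rng(2)
ys, xs = np.mgrid[0:20, 0:20]
pixels = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
bins = rng.integers(0, 8, size=pixels.shape[0])
q = weighted_histogram(pixels, bins, center=np.array([10.0, 10.0]), h=10.0, m=8)
print(q.sum())  # 1.0
```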


Similarity function

Target model: $\hat{q} = \{\hat{q}_1, \ldots, \hat{q}_m\}$

Target candidate: $\hat{p}(y) = \{\hat{p}_1(y), \ldots, \hat{p}_m(y)\}$

Similarity function: $f(y) = f[\hat{p}(y), \hat{q}] = \;?$

Treat $\left(\sqrt{\hat{p}_1(y)}, \ldots, \sqrt{\hat{p}_m(y)}\right)$ and $\left(\sqrt{\hat{q}_1}, \ldots, \sqrt{\hat{q}_m}\right)$ as unit vectors in $\mathbb{R}^m$; the cosine of the angle $\theta_y$ between them is the Bhattacharyya coefficient:

$f(y) = \rho[\hat{p}(y), \hat{q}] = \sum_{u=1}^{m}\sqrt{\hat{p}_u(y)\,\hat{q}_u} = \cos\theta_y$
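For concreteness, here is a tiny sketch of the Bhattacharyya coefficient between two normalized histograms (the helper name is illustrative, not from the papers):

```python
# Minimal sketch of the Bhattacharyya coefficient between two normalized histograms.
import numpy as np

def bhattacharyya(p, q):
    """rho[p, q] = sum_u sqrt(p_u * q_u); equals 1 iff the distributions match."""
    return float(np.sum(np.sqrt(np.asarray(p) * np.asarray(q))))

q = np.array([0.25, 0.25, 0.25, 0.25])
p = np.array([0.40, 0.30, 0.20, 0.10])
print(bhattacharyya(q, q))  # 1.0 (identical distributions)
print(bhattacharyya(p, q))  # < 1.0
```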

Target localization algorithm

Maximize $f[\hat{p}(y), \hat{q}]$:
1. Start from the position of the model ($\hat{q}$) in the current frame.
2. Search in the model's neighborhood in the next frame for candidates $\hat{p}(y)$.
3. Find the best candidate by maximizing the similarity function.


Approximating the similarity function

Model location: $y_0$. Candidate location: $y$.

Linear approximation (Taylor expansion around $y_0$):

$f(y) = \sum_{u=1}^{m}\sqrt{\hat{p}_u(y)\,\hat{q}_u} \;\approx\; \frac{1}{2}\sum_{u=1}^{m}\sqrt{\hat{p}_u(y_0)\,\hat{q}_u} \;+\; \frac{1}{2}\sum_{u=1}^{m}\hat{p}_u(y)\sqrt{\frac{\hat{q}_u}{\hat{p}_u(y_0)}}$

Substituting $\hat{p}_u(y) = C_h \sum_{i=1}^{n_h} k\!\left(\left\|\frac{y - x_i}{h}\right\|^2\right)\delta[b(x_i) - u]$ gives

$f(y) \approx \underbrace{\frac{1}{2}\sum_{u=1}^{m}\sqrt{\hat{p}_u(y_0)\,\hat{q}_u}}_{\text{independent of } y} + \underbrace{\frac{C_h}{2}\sum_{i=1}^{n_h} w_i\, k\!\left(\left\|\frac{y - x_i}{h}\right\|^2\right)}_{\text{density estimation (as a function of } y)}$

where the pixel weights are $w_i = \sum_{u=1}^{m}\sqrt{\dfrac{\hat{q}_u}{\hat{p}_u(y_0)}}\;\delta[b(x_i) - u]$.
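The weights w_i are what the localization algorithm recomputes at every step; here is a hedged sketch (illustrative helper name, assumed numerical guard eps):

```python
# Hedged sketch of the pixel weights above:
# w_i = sum_u sqrt(q_u / p_u(y0)) * delta[b(x_i) - u].
import numpy as np

def pixel_weights(bin_index, q, p_y0, eps=1e-12):
    """bin_index: (n,) color bin of each candidate pixel; q, p_y0: (m,) histograms."""
    ratio = np.sqrt(q / np.maximum(p_y0, eps))   # sqrt(q_u / p_u(y0)) per bin
    return ratio[bin_index]                      # the delta picks each pixel's own bin

q = np.array([0.5, 0.3, 0.2])
p_y0 = np.array([0.2, 0.3, 0.5])
bins = np.array([0, 0, 1, 2, 2, 2])
print(pixel_weights(bins, q, p_y0))
# pixels whose color is under-represented in the candidate get weight > 1
```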




Computing the gradient of the similarity function

Differentiating the approximation with respect to $y$ (using $g(x) = -k'(x)$):

$\nabla f(y) \approx \frac{C_h}{2}\sum_{i=1}^{n_h} w_i\,\nabla_y\, k\!\left(\left\|\frac{y - x_i}{h}\right\|^2\right) = \frac{C_h}{h^2}\sum_{i=1}^{n_h} w_i\,(x_i - y)\, g\!\left(\left\|\frac{y - x_i}{h}\right\|^2\right)$

$= \frac{C_h}{h^2}\left[\sum_{i=1}^{n_h} w_i\, g\!\left(\left\|\frac{y - x_i}{h}\right\|^2\right)\right]\left[\frac{\sum_{i=1}^{n_h} x_i\, w_i\, g\!\left(\left\|\frac{y - x_i}{h}\right\|^2\right)}{\sum_{i=1}^{n_h} w_i\, g\!\left(\left\|\frac{y - x_i}{h}\right\|^2\right)} - y\right]$

Setting the gradient to zero moves $y$ to the weighted mean of the pixel locations inside the window — exactly a mean-shift step.

Simple mean-shift procedure:
1. Compute the mean-shift vector
$m(x) = \frac{\sum_{i=1}^{n} x_i\, g\!\left(\left\|\frac{x - x_i}{h}\right\|^2\right)}{\sum_{i=1}^{n} g\!\left(\left\|\frac{x - x_i}{h}\right\|^2\right)} - x$
2. Translate the kernel window by $m(x)$.
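A minimal sketch of one such mean-shift step with pixel weights follows, assuming an Epanechnikov profile so that g is a flat window (names are illustrative):

```python
# Minimal sketch of one mean-shift step for target localization: the new
# location is the weighted mean of pixel positions, with weights
# w_i * g(||(y - x_i)/h||^2), and m(y) is the resulting shift.
import numpy as np

def g_profile(x):
    # with the Epanechnikov profile k(x) = 1 - x on [0,1), g = -k' is uniform
    return np.where(x < 1.0, 1.0, 0.0)

def mean_shift_step(pixels, w, y, h):
    d2 = np.sum(((pixels - y) / h) ** 2, axis=1)
    gw = w * g_profile(d2)
    if gw.sum() == 0:
        return y, np.zeros_like(y)
    y_new = (pixels * gw[:, None]).sum(axis=0) / gw.sum()
    return y_new, y_new - y        # new location and mean-shift vector m(y)

pixels = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
w = np.array([1.0, 1.0, 3.0, 3.0])          # heavier weights pull y to the right
y1, m = mean_shift_step(pixels, w, y=np.array([1.0, 0.0]), h=5.0)
print(y1, m)
```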


Kernels and kernel profiles

In the gradient of $f(y)$, the derivative of the kernel profile appears through $g(x) = -k'(x)$.

- $k$ : kernel profile
- $K(x) = c\,k\!\left(\|x\|^2\right)$ : the corresponding radially symmetric kernel

Epanechnikov kernel:
$K_E(x) = \begin{cases} c\,\left(1 - \|x\|^2\right) & \|x\| \le 1 \\ 0 & \text{otherwise} \end{cases}$

Uniform kernel:
$K_U(x) = \begin{cases} c & \|x\| \le 1 \\ 0 & \text{otherwise} \end{cases}$

Normal kernel:
$K_N(x) = c\,\exp\!\left(-\tfrac{1}{2}\|x\|^2\right)$
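A small sketch of these kernels via their profiles; leaving the normalization constants c at 1 is an assumption for illustration (the true constants make each kernel integrate to 1):

```python
# Hedged sketch of the three kernels, expressed through K(x) = c * k(||x||^2).
import numpy as np

def k_epanechnikov(r2):          # profile: 1 - r^2 on [0, 1]
    return np.where(r2 <= 1.0, 1.0 - r2, 0.0)

def k_uniform(r2):               # profile: 1 on [0, 1]
    return np.where(r2 <= 1.0, 1.0, 0.0)

def k_normal(r2):                # profile: exp(-r^2 / 2)
    return np.exp(-0.5 * r2)

def kernel(x, profile, c=1.0):   # K(x) = c * k(||x||^2)
    r2 = np.sum(np.square(x), axis=-1)
    return c * profile(r2)

x = np.array([[0.0, 0.0], [0.5, 0.5], [2.0, 0.0]])
for name, prof in [("Epanechnikov", k_epanechnikov), ("Uniform", k_uniform), ("Normal", k_normal)]:
    print(name, kernel(x, prof))
```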

Epanechnikov kernel

A special class of radially symmetric kernels: $K(x) = c\,k\!\left(\|x\|^2\right)$.

Epanechnikov profile:
$k(x) = \begin{cases} 1 - x & 0 \le x \le 1 \\ 0 & \text{otherwise} \end{cases}$

Uniform profile (its negative derivative):
$g(x) = -k'(x) = \begin{cases} 1 & 0 \le x \le 1 \\ 0 & \text{otherwise} \end{cases}$

With this choice, the general update

$y_1 = \frac{\sum_{i=1}^{n_h} x_i\, w_i\, g\!\left(\left\|\frac{y_0 - x_i}{h}\right\|^2\right)}{\sum_{i=1}^{n_h} w_i\, g\!\left(\left\|\frac{y_0 - x_i}{h}\right\|^2\right)}$

reduces to a weighted average of the pixel locations inside the window:

$y_1 = \frac{\sum_{i=1}^{n_h} x_i\, w_i}{\sum_{i=1}^{n_h} w_i}$
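A tiny illustration of this special case (toy data assumed): the update is just the weight-averaged position of the pixels that fall inside the radius-h window.

```python
# With the Epanechnikov kernel, g is 1 inside the window and 0 outside,
# so y1 is the weighted average of the in-window pixel positions.
import numpy as np

def epanechnikov_update(pixels, w, y0, h):
    inside = np.sum(((pixels - y0) / h) ** 2, axis=1) <= 1.0
    return (pixels[inside] * w[inside, None]).sum(axis=0) / w[inside].sum()

pixels = np.array([[0.0, 0.0], [1.0, 1.0], [4.0, 4.0]])
w = np.array([1.0, 2.0, 5.0])
print(epanechnikov_update(pixels, w, y0=np.array([0.5, 0.5]), h=2.0))
# the far pixel (4, 4) falls outside the window and does not contribute
```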


Tracking with mean-shift

1. Initialize the location of the target in the current frame ($y_0$) with the location of the model in the previous frame; compute the candidate model $\hat{p}(y_0)$ and evaluate
   $f[\hat{p}(y_0), \hat{q}] = \sum_{u=1}^{m}\sqrt{\hat{p}_u(y_0)\,\hat{q}_u}$
2. Compute the weights $\{w_i\}$.
3. Find the next location of the target candidate according to
   $y_1 = \frac{\sum_{i=1}^{n_h} x_i\, w_i\, g\!\left(\left\|\frac{y_0 - x_i}{h}\right\|^2\right)}{\sum_{i=1}^{n_h} w_i\, g\!\left(\left\|\frac{y_0 - x_i}{h}\right\|^2\right)}$
4. Compute the candidate model at the new position, $\hat{p}(y_1)$, and evaluate $f[\hat{p}(y_1), \hat{q}]$.
5. If $\|y_1 - y_0\| < \varepsilon$, stop. Otherwise set $y_0 \leftarrow y_1$ and go to step 2.
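Tying the pieces together, here is a hedged end-to-end sketch of the localization loop above on a synthetic frame; every helper name, parameter value, and the two-bin color model are illustrative assumptions, not the authors' implementation.

```python
# End-to-end sketch: kernel-weighted histograms, weights w_i, and the
# Epanechnikov update, iterated until the shift falls below eps.
import numpy as np

def k_profile(x):                       # Epanechnikov profile
    return np.where(x < 1.0, 1.0 - x, 0.0)

def histogram(pixels, bins, center, h, m):
    w = k_profile(np.sum(((pixels - center) / h) ** 2, axis=1))
    hist = np.bincount(bins, weights=w, minlength=m)
    return hist / max(hist.sum(), 1e-12)

def track(pixels, bins, q, y0, h, m, eps=0.1, max_iter=20):
    y0 = np.asarray(y0, dtype=float)
    for _ in range(max_iter):
        p0 = histogram(pixels, bins, y0, h, m)                 # candidate model p(y0)
        w = np.sqrt(q / np.maximum(p0, 1e-12))[bins]           # weights w_i
        inside = np.sum(((pixels - y0) / h) ** 2, axis=1) <= 1.0
        y1 = (pixels[inside] * w[inside, None]).sum(axis=0) / w[inside].sum()
        if np.linalg.norm(y1 - y0) < eps:                      # convergence test
            return y1
        y0 = y1
    return y0

# toy frame: "target"-colored pixels clustered around (30, 30), background elsewhere
ys, xs = np.mgrid[0:64, 0:64]
pixels = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
bins = np.zeros(pixels.shape[0], dtype=int)
bins[np.linalg.norm(pixels - [30, 30], axis=1) < 6] = 1        # target color bin
q = np.array([0.0, 1.0])                                       # model: all mass in bin 1
print(track(pixels, bins, q, y0=[22.0, 22.0], h=10.0, m=2))    # converges near (30, 30)
```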

Conclusions

Advantages:
- Application-independent tool.
- Suitable for real data analysis.
- Does not assume any prior shape (e.g., elliptical) on the data clusters.
- Can handle arbitrary feature spaces.
- Only ONE parameter to choose: h (the window size), which has a physical meaning, unlike K-Means.

Disadvantages:
- The window size (bandwidth selection) is not trivial.
- An inappropriate window size can cause modes to be merged or generate additional "shallow" modes; an adaptive window size can mitigate this.