
USING LINKING FEATURES IN LEARNING NON-PARAMETRIC PART MODELS *
Ammar Kamal Hattab


ENGN 2560 Mid Project Presentation
April 16, 2013


* Leonid Karlinsky, Shimon Ullman, ECCV (3), 2012

Project Goal

Project Goal: implement the Linking Features algorithm to detect a set of parts of a deformable object.


Examples:

detect human parts: head, torso, upper/lower limbs

detect facial landmarks: eyes, nose, mouth outlines, etc.

detect animal parts




[Figure: annotated stickman with part labels Nk, Torso, tll, trl, bll, brl]

Linking Features Method

The elbow appearance “links” the correct arm part candidates.

Linking Features

How do we choose the right lower arm candidate?

Use local features at strategic locations to provide evidence on the connectivity of the part candidates.

TRAINING STEPS

Steps Completed

Step 1: Generating the Training Dataset

Convert a video file (13 seconds) to a sequence of images (397).

Manually add annotations to each image using an SVG editor.

Read the annotations from the SVG image into a text file.

Step 2: Extracting SIFT Features

For each image, find SIFT descriptors on a dense grid in a specific ROI using the VLFeat open-source Matlab library, with a 20 × 20 pixel patch.

Grid step: 3 pixels (gives ~10,300 SIFT features in the following region); a sketch of the extraction follows below.
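A minimal sketch of this extraction with VLFeat's vl_dsift (assumes vl_setup has been run; the frame filename and ROI values are hypothetical):

% Dense SIFT over a 20 x 20 patch: binSize = 5, since a SIFT descriptor
% spans 4 spatial bins per side (4 * 5 = 20 pixels).
im  = single(rgb2gray(imread('frame001.png')));  % hypothetical frame name
roi = [100 50 400 350];                          % hypothetical ROI [xmin ymin xmax ymax]
[frames, descrs] = vl_dsift(im, 'Step', 3, 'Size', 5, 'Bounds', roi);
% frames(1:2,:) holds the (x, y) patch centers; descrs is 128 x N.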

Step 3: Connecting Features with Parts

Read and draw the stickman annotation file.

Enlarge each stick by 20 pixels.

For each part, assign the SIFT features that fall inside its enlarged rectangle (see the sketch after this list).

Store all feature info and the learned parts in an external file.
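A minimal sketch of the assignment, reading "enlarge by 20 pixels" as a band of total width 20 around each stick (sticks and frames are assumed variables carried over from the annotation and SIFT steps):

halfWidth = 10;                         % 20-pixel total band around each stick
for j = 1:numel(sticks)
    p1 = sticks{j}(1:2)'; p2 = sticks{j}(3:4)';   % stick endpoints
    d  = p2 - p1; len2 = sum(d.^2);
    % Project every feature center onto the stick segment.
    v  = bsxfun(@minus, frames(1:2,:), p1);
    t  = max(0, min(1, (d' * v) / len2));
    closest = bsxfun(@plus, p1, d * t);
    dist = sqrt(sum((frames(1:2,:) - closest).^2, 1));
    partIdx{j} = find(dist <= halfWidth);         % features assigned to part j
end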


Step 4: Finding Linking Features

Find linking features in the training images using a circle of radius 15 pixels centered between the two parts (different from the paper).

For each feature in the image, define a variable Ai = 1 if it is a linking feature and Ai = 0 otherwise, and store these as well (a labeling sketch follows below).
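A minimal sketch of the labeling for one pair of parts (j and k are hypothetical part indices; partCenter is an assumed per-part center taken from the annotations):

radius = 15;                                       % linking circle radius
mid  = (partCenter{j} + partCenter{k}) / 2;        % point between the two parts
dist = sqrt(sum(bsxfun(@minus, frames(1:2,:), mid).^2, 1));
A = double(dist <= radius);       % A(i) = 1 iff feature i is a linking feature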

TESTING STEPS

Step 1: Extract SIFT Features


For a new test image, extract all SIFT features on a
dense grid.

* Image from the referenced paper

Step 2: Building the Nearest Neighbor Tree

Load all training data from files.

Using the training data, build an ANN tree.

Given a test feature, efficiently find its nearest neighbor features in the training images.

The nearest neighbors are used for Kernel Density Estimation (KDE) of different variables.

Building the following ANN trees:

one ANNall for all features

several ANNj for the features assigned to the j-th part (e.g., an ANN for the lower left arm)

Using C++ code from Mount with a Matlab wrapper from Bagon (a stand-in sketch follows below).
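The exact Mount/Bagon wrapper calls are not shown in the slides; as a stand-in, the trees and queries can be sketched with MATLAB's built-in kd-tree (Statistics Toolbox):

% trainDescrs (128 x N) and partIdx{j} come from the training steps above.
annAll     = KDTreeSearcher(double(trainDescrs)');                 % ANNall over all features
annPart{j} = KDTreeSearcher(double(trainDescrs(:, partIdx{j}))');  % ANNj per part
% Given test descriptors (128 x M), find the 25 nearest training features:
[nnIdx, nnDist] = knnsearch(annAll, double(testDescrs)', 'K', 25);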

Step 3: Generalized Hough Transform

For each part (e.g., head):

For each feature in the test features:

Find the 25 nearest neighbors using the trained ANNs.

Accumulate the offsets to the part center to get the KDE Pj(Lj | Fi), the conditional probability of the center location given Fi.

Summing up these KDEs gives the GHT (each feature votes for the part center location using this probability); a voting sketch follows below.
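A minimal sketch of the voting for one part j, under the same assumed variables plus offs{j}, the stored offsets from each of part j's training features to the part center (the bandwidth sigma and the image size [H W] are hypothetical):

V = zeros(H, W);                            % voting matrix for part j
[nnIdx, nnDist] = knnsearch(annPart{j}, double(testDescrs)', 'K', 25);
sigma = 10;                                 % hypothetical KDE bandwidth
w = exp(-nnDist.^2 / (2 * sigma^2));        % kernel weight per neighbor
for i = 1:size(testFrames, 2)
    for n = 1:25
        % Each neighbor predicts a center: test location + training offset.
        c = round(testFrames(1:2, i) + offs{j}(:, nnIdx(i, n)));
        if c(1) >= 1 && c(1) <= W && c(2) >= 1 && c(2) <= H
            V(c(2), c(1)) = V(c(2), c(1)) + w(i, n);   % weighted vote
        end
    end
end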

Step 4: Part Candidates

Take the top 20 maxima locations of the voting matrix V, which gives candidate part locations (a maxima-picking sketch follows below).

Taking the top 3 orientations and lengths, we have 60 part candidates.

* Image from the referenced paper
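A minimal sketch of picking the top 20 maxima with a simple suppression disk (the slides do not say how nearby maxima are separated; suppRadius is a hypothetical choice):

suppRadius = 10;                       % hypothetical suppression radius
[cc, rr] = meshgrid(1:size(V, 2), 1:size(V, 1));
Vwork = V;
cand  = zeros(20, 2);
for m = 1:20
    [~, ind] = max(Vwork(:));
    [r, c] = ind2sub(size(Vwork), ind);
    cand(m, :) = [c r];                % (x, y) of the m-th candidate center
    % Zero out a disk around the pick so the next maximum is a new location.
    Vwork((rr - r).^2 + (cc - c).^2 <= suppRadius^2) = 0;
end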

Step 5: Finding Linking Features

Using the part candidates for a pair of parts:

For each pair of part candidates:

As in the previous step, for each feature in the image, compute the probability that the feature links the two parts, P(Fi | Lj, Lk) (find the 25 NNs, then KDE); a scoring sketch follows below.

* Image from the referenced paper
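A minimal sketch of scoring one candidate pair, reusing the radius-15 linking region from training (annLink, candCenter, and sigma are assumed variables in the spirit of the earlier sketches; this is one plausible reading of the KDE, not the paper's exact estimator):

mid = (candCenter{j} + candCenter{k}) / 2;       % point between the two candidates
d   = sqrt(sum(bsxfun(@minus, testFrames(1:2,:), mid).^2, 1));
sel = find(d <= 15);                             % features near the linking point
[~, nnDist] = knnsearch(annLink, double(testDescrs(:, sel))', 'K', 25);
P = mean(exp(-nnDist.^2 / (2 * sigma^2)), 2);    % KDE approximating P(Fi | Lj, Lk)
score = mean(P);                                 % evidence that this pair connects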


Step 6: Inference

By maximizing the previous probability (the linking features probability), we can infer the correct choice of part candidates.

Example: comparing the probabilities of two part candidates for the upper and lower left arm:

P = 0.0868

P = 0.0164

Next Steps


Completing the implementation:

use a greedy approach to find the optimal parts configuration, the one that maximizes the probability of the previous step (a generic sketch follows below).
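A generic greedy sketch of what this step could look like (pairScore is a hypothetical helper scoring candidate c of part j against the current configuration; this is not the paper's exact procedure):

config = zeros(0, 2);                  % rows of [part, chosen candidate]
remaining = 1:numParts;
while ~isempty(remaining)
    best = -inf;
    for j = remaining
        for c = 1:60                   % 60 candidates per part (previous step)
            s = pairScore(config, j, c);    % hypothetical linking-probability score
            if s > best, best = s; pick = [j c]; end
        end
    end
    config = [config; pick];           % greedily fix the best-scoring choice
    remaining(remaining == pick(1)) = [];
end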


Solving problems:

Computing the GHT and the spectral clustering takes too much time.

There is a problem in voting for orientations.

Evaluate the algorithm and test it on different datasets.

Missed Step: Spectral Clustering

Use a spectral clustering algorithm to cluster the features into 20 clusters, based on how much the features contribute to the maxima of the voting matrix (a sketch follows below).

Using code from: http://books.nips.cc/papers/files/nips14/AA35.pdf
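A minimal sketch of the Ng-Jordan-Weiss algorithm from the cited paper (S is an assumed n x n affinity matrix built from how features co-occur in the voting maxima):

k  = 20;
Dn = diag(1 ./ sqrt(sum(S, 2)));       % inverse square-root degree matrix
L  = Dn * S * Dn;                      % normalized affinity
[X, ~] = eigs(L, k);                   % top-k eigenvectors as columns
Y = bsxfun(@rdivide, X, sqrt(sum(X.^2, 2)));   % unit-normalize the rows
labels = kmeans(Y, k);                 % 20 feature clusters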