Random Walks for Image Segmentation


RANDOM WALKS FOR IMAGE SEGMENTATION

IEEE Transactions on Pattern Analysis and Machine Intelligence, November 2006

Leo Grady, Member, IEEE

Outline

Introduction
Algorithm
Dirichlet Problem
Behavioral Properties
Result
Demo

Introduction

K-way image segmentation: K objects are to be extracted.
User-defined seeds carry user-defined labels, one label per object.
How do we label the unseeded pixels?

Introduction

The algorithm answers the question: given a random walker starting at an unseeded pixel, what is the probability that it first reaches each of the K seed points?
Each unseeded pixel is therefore assigned a K-tuple vector holding, for every seed point, the probability that a random walker released at that pixel reaches it first.

Introduction

The probability that a random walker first reaches a given seed point is the solution to the Dirichlet problem with boundary conditions at the locations of the seed points: the seed point in question is fixed to unity while the others are set to zero.

Introduction

Goals:
1. Location of weak (or missing) boundaries
2. Noise robustness
3. Ability to identify multiple objects simultaneously
4. Fast computation (and editing)
5. Avoidance of small/trivial solutions

Algorithm

1. Generating the graph weights
2. Establishing the system of equations to solve the problem
3. The practical details of implementation

Defining a graph

A graph G = (V, E) consists of vertices V and edges E.
An edge, e, spanning two vertices, v_i and v_j, is denoted by e_ij.
The weight of an edge, e_ij, is denoted by w(e_ij) or w_ij.
The degree of a vertex is d_i = Σ_j w(e_ij), summed over all edges e_ij incident on v_i.
Assume this graph is connected and undirected.

Edge Weights

Gaussian weighting function: w_ij = exp(-β(g_i - g_j)²), where g_i indicates the image intensity at pixel i.
β is the only free parameter.
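A minimal sketch of this weighting step in Python/NumPy, assuming a 4-connected lattice; the function name, the intensity normalization to [0, 1], the default β, and the small ε floor are illustrative assumptions rather than part of the slide.

```python
import numpy as np

def gaussian_edge_weights(image, beta=90.0, eps=1e-6):
    """Gaussian weighting function w_ij = exp(-beta * (g_i - g_j)^2)
    on a 4-connected lattice; returns horizontal and vertical edge weights."""
    g = image.astype(float)
    g = (g - g.min()) / (g.max() - g.min() + eps)              # normalize intensities to [0, 1]
    w_h = np.exp(-beta * (g[:, :-1] - g[:, 1:]) ** 2) + eps    # each pixel and its right neighbor
    w_v = np.exp(-beta * (g[:-1, :] - g[1:, :]) ** 2) + eps    # each pixel and its lower neighbor
    return w_h, w_v
```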

Combinatorial Dirichlet Problem

The Dirichlet integral: D[u] = (1/2) ∫_Ω |∇u|² dΩ
A harmonic function is a function that satisfies the Laplace equation.
The Dirichlet problem: finding a harmonic function subject to its boundary values.

Combinatorial Dirichlet Problem (cont.)

The combinatorial Laplacian matrix:
L_ij = d_i if i = j; -w_ij if v_i and v_j are adjacent; 0 otherwise,
where L_ij is indexed by vertices v_i and v_j.

The m x n edge-node incidence matrix is defined as
A_(e_ij, v_k) = +1 if k = i; -1 if k = j; 0 otherwise.
The incidence matrix is indexed by edge e_ij and node v_k.
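A sketch of how the combinatorial Laplacian L = D − W (equivalently, AᵀCA for the incidence matrix A and the constitutive matrix C defined on the next slide) might be assembled as a sparse matrix from an edge list; the function name and the SciPy representation are my own choices for illustration.

```python
import numpy as np
from scipy import sparse

def combinatorial_laplacian(n_nodes, edges_i, edges_j, weights):
    """Build L with L_ii = d_i, L_ij = -w_ij for adjacent v_i, v_j, and 0 otherwise.
    edges_i, edges_j: endpoint indices of each undirected edge; weights: w_ij."""
    rows = np.concatenate([edges_i, edges_j])
    cols = np.concatenate([edges_j, edges_i])
    vals = np.concatenate([weights, weights])
    W = sparse.coo_matrix((vals, (rows, cols)), shape=(n_nodes, n_nodes)).tocsr()
    degrees = np.asarray(W.sum(axis=1)).ravel()     # d_i = sum of weights of incident edges
    return sparse.diags(degrees) - W                # L = D - W
```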




Combinatorial Dirichlet Problem (cont.)

A combinatorial formulation of the Dirichlet integral (written out below).

C is the m x m constitutive matrix (the diagonal matrix with the weights of each edge along the diagonal).
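In LaTeX, reconstructed from the definitions of A, C, and L above (the slide showed the formula as an image):

```latex
D[x] \;=\; \tfrac{1}{2}\,(A x)^{\mathsf T} C\,(A x)
     \;=\; \tfrac{1}{2}\, x^{\mathsf T} L\, x
     \;=\; \tfrac{1}{2} \sum_{e_{ij} \in E} w_{ij}\,(x_i - x_j)^2 .
```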

Combinatorial Dirichlet Problem (cont.)

Partition the vertices into two sets:
V_M (marked/seed nodes)
V_U (unseeded nodes)

Reorder the Laplacian to reflect this partition and expand the Dirichlet integral; finding the critical point yields a sparse linear system for the unseeded potentials (see the block form written out below).
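The block form and the resulting linear system, reconstructed in LaTeX from the partition described above:

```latex
D[x_U] \;=\; \tfrac{1}{2}
\begin{bmatrix} x_M^{\mathsf T} & x_U^{\mathsf T} \end{bmatrix}
\begin{bmatrix} L_M & B \\ B^{\mathsf T} & L_U \end{bmatrix}
\begin{bmatrix} x_M \\ x_U \end{bmatrix}
\;=\; \tfrac{1}{2}\bigl( x_M^{\mathsf T} L_M x_M
      + 2\, x_U^{\mathsf T} B^{\mathsf T} x_M
      + x_U^{\mathsf T} L_U x_U \bigr),
\qquad
\frac{\partial D[x_U]}{\partial x_U} = 0
\;\Longrightarrow\;
L_U\, x_U = -\,B^{\mathsf T} x_M .
```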

Combinatorial Dirichlet Problem (cont.)

Denote the probability (potential) assumed at node v_i for each label s by x_i^s.
Define the set of labels for the seed points as a function Q(v_j) = s, for all v_j in V_M, where s is an integer with 0 < s ≤ K.
Define the marked vector for each label s at node v_j in V_M as m_j^s = 1 if Q(v_j) = s, and m_j^s = 0 otherwise.

Solving the combinatorial Dirichlet problem

For one label: L_U x^s = -B^T m^s
For all labels: L_U X = -B^T M
X has K columns, one for each x^s, and M has columns given by each m^s.
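A sketch of this solve step using SciPy's sparse LU factorization; the function name and the details of partitioning and reassembly are assumptions for illustration. L is the full combinatorial Laplacian, seed_idx holds the flat indices of the marked pixels, and M is the |V_M| x K indicator matrix m_j^s defined above.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import splu

def solve_random_walker(L, seed_idx, M):
    """Solve L_U X = -B^T M for the unseeded potentials, one column per label."""
    L = sparse.csr_matrix(L)
    n = L.shape[0]
    seed_idx = np.asarray(seed_idx)
    unseeded = np.setdiff1d(np.arange(n), seed_idx)
    L_U = L[unseeded][:, unseeded].tocsc()      # Laplacian block over unseeded nodes
    B_T = L[unseeded][:, seed_idx]              # coupling block B^T between unseeded nodes and seeds
    rhs = -(B_T @ M)                            # right-hand side, shape (|V_U|, K)
    lu = splu(L_U)                              # one factorization serves all K right-hand sides
    X = np.column_stack([lu.solve(rhs[:, s]) for s in range(M.shape[1])])
    # Reassemble potentials for every node; seeds keep their indicator values.
    probs = np.zeros((n, M.shape[1]))
    probs[seed_idx] = M
    probs[unseeded] = X
    return probs
```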

Equivalences between random walks and electrical circuits

Three fundamental equations of circuit theory: Kirchhoff's current law, Kirchhoff's voltage law, and Ohm's law.
These three equations may be combined into a single linear system (written out below).
With f = 0 (no injected currents), it is equivalent to the combinatorial Laplace equation solved above.
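The three laws written out in LaTeX, using the incidence matrix A and constitutive matrix C introduced earlier, with x the node potentials, y the edge currents, p the potential drops across edges, and f the currents injected at the nodes; the exact symbols are my reconstruction of the slide's missing equations.

```latex
\underbrace{A^{\mathsf T} y = f}_{\text{Kirchhoff's current law}}
\qquad
\underbrace{C\,p = y}_{\text{Ohm's law}}
\qquad
\underbrace{A\,x = p}_{\text{Kirchhoff's voltage law}}
\quad\Longrightarrow\quad
A^{\mathsf T} C A\, x \;=\; L\, x \;=\; f .
```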

Algorithm Summary

1. Using w_ij = exp(-β(g_i - g_j)²), map the image intensities to edge weights in the lattice.
2. Obtain a set, V_M, of marked (labeled) pixels with K labels, either interactively or automatically.
3. Solve L_U X = -B^T M outright for the potentials, or solve L_U x^s = -B^T m^s for each label except the final one and set x^K = 1 - Σ_{s<K} x^s.
4. Obtain a final segmentation by assigning to each node, v_i, the label corresponding to max_s x_i^s.

A short usage sketch of these four steps follows the list.
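This usage sketch puts the four steps together and relies on the hypothetical helpers sketched earlier in these notes (gaussian_edge_weights, combinatorial_laplacian, solve_random_walker); the synthetic image, the seed placement, and the β value are purely illustrative.

```python
import numpy as np

# Synthetic two-region image: dark left half, bright right half, plus a little noise.
rng = np.random.default_rng(0)
image = np.hstack([np.zeros((64, 32)), np.ones((64, 32))]) + 0.1 * rng.standard_normal((64, 64))

# Step 1: Gaussian edge weights on the 4-connected lattice.
w_h, w_v = gaussian_edge_weights(image, beta=90.0)

# Build the combinatorial Laplacian from the lattice edge list.
idx = np.arange(image.size).reshape(image.shape)
edges_i = np.concatenate([idx[:, :-1].ravel(), idx[:-1, :].ravel()])
edges_j = np.concatenate([idx[:, 1:].ravel(), idx[1:, :].ravel()])
L = combinatorial_laplacian(image.size, edges_i, edges_j,
                            np.concatenate([w_h.ravel(), w_v.ravel()]))

# Step 2: K = 2 seeds, one placed in each region.
seed_idx = np.array([idx[32, 8], idx[32, 56]])
M = np.eye(2)                                   # indicator matrix m_j^s

# Steps 3-4: solve for the potentials and take the per-pixel argmax.
probs = solve_random_walker(L, seed_idx, M)
segmentation = probs.argmax(axis=1).reshape(image.shape)
```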

Overview of segmentation computation


Analogies

Ways to assign an unseeded pixel to a label, given a weighted graph:

If a random walker leaving the pixel is most likely to first reach a seed bearing label s, assign the pixel to label s.
If the seeds are alternately replaced by grounds/unit voltage sources, assign the pixel to the label for which its seeds being "on" produces the greatest electrical potential.
Assign the pixel to the label for which its seeds have the largest effective conductance.
If a 2-tree is drawn randomly from the graph, assign the pixel to the label to whose seeds it is most likely to remain connected.

Effective Conductance

The Dirichlet integral equals the effective conductance between the nodes labeled "1" ("on") and those labeled "0" ("off").
Here x is intended to include both x_M and x_U.

(Figure: a unit voltage is applied between nodes v_i and v_j; the effective conductance equals the resulting current flow.)

Effective Conductance (cont.)

The effective conductance between two nodes, v_i and v_j, is given by

C_eff(i, j) = ( Σ_T Π_{e∈T} w(e) ) / ( Σ_{TT(i,j)} Π_{e∈TT(i,j)} w(e) )

where T is a set of edges defining a connected (spanning) tree and the sum in the numerator is over all possible trees in the graph, and where TT(i, j) is used to represent the set of edges defining a 2-tree such that node v_i is in one component and v_j is in another.

2-tree

A 2-tree is defined to be a tree with one edge removed.
Here v_i and v_j are in different components, and v_t is in the same component as v_j.


2-tree (cont.)

The random-walk, electrical, and 2-tree expressions above are then equivalent.
The segmentation is computed from the potentials by assigning each pixel to the label for which it has the greatest potential (probability).

Behavioral Properties

1. Weak boundary detection
2. Noise robustness
3. Assignment of ambiguous regions

Weak Boundaries

Weak Boundaries - Comparison

Noise Robustness

Ambiguous Unseeded Regions

Demo Videos

http://www.cns.bu.edu/~lgrady/Random_Walker_Image_Segmentation.html

Brain
Lung tumor
Aorta - 3D



Definition of a harmonic function

Any real function u(x, y) with continuous second partial derivatives which satisfies Laplace's equation is called a harmonic function.
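For reference, Laplace's equation in two dimensions, written in LaTeX:

```latex
\nabla^2 u \;=\; \frac{\partial^2 u}{\partial x^2} + \frac{\partial^2 u}{\partial y^2} \;=\; 0 .
```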


Reference from MathWorld: http://mathworld.wolfram.com/HarmonicFunction.html
