Stream Processing Techniques for High Performance Optical Flow Approximation
David Dobson and Andrew Strelzoff, Ph.D.
The University of Southern Mississippi
Abstract
Optical flow in vision systems is the identification of apparent motion of objects or visual features in a scene observed from a fixed point. For scenes with either large numbers of objects or complex moving surfaces this problem is quite computationally challenging. In ocean modeling there is a need for optical flow approximations of wave motion, but current implementations are not able to produce real-time results with an acceptable level of accuracy. This paper describes early efforts to develop a stream-processing approach to this problem using graphics hardware.
CR Categories: D.1.3 [Software]: Programming Techniques—Parallel; I.7.m [Computer Graphics]: Image Processing and Computer Vision—Scene Analysis—Motion
Keywords: Graphics Processing, Optical Flow
1 Introduction
There is very broad interest in the application of optical flow to determine wave direction and intensity from prerecorded or live video streams, for a wide variety of purposes including ocean and water model input and evaluation [Tang, H.M. and Xu, S.R. 1999; Qiu, M. 2000; Jahne, B. 2003]. Figure 1 shows a sample video feed of surface ocean waves from a fixed motion-correcting buoy camera.
Figure 1: Real-time video capture of ocean surface wave motion
The generalized problem is to derive a vector field
representing wave intensity and direction from the video
frames. These derived values can then be used as input for a
wide variety of purposes such as ocean modeling as shown in
Figure 2.
Figure 2:
Example of Ocean Surface Wave Model
2 Optical Flow Calculation
Mathematically, optical flow is expressed by the motion constraint equation. Barron and Thacker [J.L. Barron and N.A. Thacker 2005] give the 2D motion constraint equation as follows:

I(x, y, t) = I(x + ∂x, y + ∂y, t + ∂t)    (1)
In other words, a pixel at position (x, y) and time t moves to a new position (x + ∂x, y + ∂y) at time t + ∂t. In Figure 3, for example, a yellow pixel at time t is in position (3, 3). The pixel moves to position (3, 5) at time t + ∂t.
Figure 3:
Movement implied by the change in position
between two frames of a video image
Formula 1 for Figure 3 is then as follows:

I(3, 3, 0) = I(3, 5, 1)    (2)
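The equality in (2) can be checked numerically. The following is a minimal sketch (not the paper's code) using a hypothetical pair of 8×8 synthetic grayscale frames, with rows indexing y and columns indexing x:

```python
import numpy as np

# Two synthetic 8x8 grayscale frames: a bright pixel at (x=3, y=3)
# in the frame at time 0 moves to (x=3, y=5) in the frame at time 1.
frame_t = np.zeros((8, 8), dtype=np.float32)
frame_t1 = np.zeros((8, 8), dtype=np.float32)
frame_t[3, 3] = 1.0   # I(3, 3, 0); array is indexed [y, x]
frame_t1[5, 3] = 1.0  # I(3, 5, 1)

# Brightness constancy: intensity is preserved along the motion.
assert frame_t[3, 3] == frame_t1[5, 3]
```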
Given a complete set of frames,

I(0..framewidth, 0..frameheight, time t) = I(0..framewidth, 0..frameheight, time t + 1)    (3)

a vector field can be derived to describe the motion of objects or surfaces in the captured video image, as shown in Figure 4.
Figure 4: Sample Optical Flow Vector Field
A number of different techniques attempt to solve optical flow by approximating Formula 1. Roughly speaking, there are four families of methods: differential methods, in which differences taken across one or more frames are used to compute velocity; region-based methods, in which a search is conducted to find the pixel in time t + 1 which best matches the pixel in time t; frequency-based methods, which rely upon Fourier filters to derive velocity patterns in the input; and phase-based methods, which apply band-pass filters to derive velocity from phase behavior [J.L. Barron and N.A. Thacker 2005; J.L. Barron, D.J. Fleet and S.S. Beauchemin 2004]. A simple naive-search region-based method was chosen for the current research because, although region-based methods have been shown to have the poorest generalized results, the primary purpose of this research is to demonstrate that GPU hardware can be used to solve optical flow problems faster than a traditional CPU implementation, approaching real time. The assumption is that more complex algorithms with greater mathematical intensity and the same level of parallelism will then perform even better in comparison with CPU implementations.
3 Region-Based Optical Flow
One simple region-based approach attempts to match a pixel at t to a pixel at t + ∂t. In this approach, a pixel is identified by its intensity value and the intensity values of a number of its neighbor pixels [Parekh et al. 2004]. The pixel and its neighbor pixels form a window of pixels. Optical flow is computed by finding the best match for the window (within specific limits) in the image at t + ∂t. Figure 5 shows how a window is formed around a target pixel.
Figure 5: A comparison window for region-based optical flow generation.
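Forming the window is a simple neighborhood slice. A minimal sketch (the function name and radius parameter are hypothetical, not from the paper):

```python
import numpy as np

def pixel_window(frame, x, y, radius=1):
    """Return the (2*radius+1) x (2*radius+1) neighborhood window
    centred on (x, y). Rows index y and columns index x; the caller
    must keep (x, y) at least `radius` pixels from the frame border."""
    return frame[y - radius:y + radius + 1, x - radius:x + radius + 1]

frame = np.arange(64, dtype=np.float32).reshape(8, 8)
win = pixel_window(frame, x=3, y=3, radius=1)
assert win.shape == (3, 3)
assert win[1, 1] == frame[3, 3]  # centre of the window is the target pixel
```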
The window is exhaustively compared to the image at t + ∂t in different test positions. The best match within the search area is the optical flow for that pixel.
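The exhaustive search described above can be sketched as a naive block-matching routine. This is an illustration of the general technique, not the paper's implementation; the sum-of-squared-differences match criterion and the parameter names are assumptions:

```python
import numpy as np

def block_match(frame_t, frame_t1, x, y, radius=1, search=2):
    """Find the displacement (dx, dy) whose window in frame_t1 best
    matches the window around (x, y) in frame_t, by exhaustive
    sum-of-squared-differences over a bounded search area."""
    win = frame_t[y - radius:y + radius + 1, x - radius:x + radius + 1]
    best, best_d = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cx, cy = x + dx, y + dy
            cand = frame_t1[cy - radius:cy + radius + 1,
                            cx - radius:cx + radius + 1]
            ssd = float(np.sum((win - cand) ** 2))
            if best is None or ssd < best:
                best, best_d = ssd, (dx, dy)
    return best_d

# Toy frames: a 3x3 bright block shifts down by two pixels.
a = np.zeros((9, 9), dtype=np.float32)
a[2:5, 2:5] = 1.0
b = np.roll(a, 2, axis=0)
assert block_match(a, b, x=3, y=3) == (0, 2)
```

Repeating this search for every pixel yields the vector field of Figure 4; the cost grows with both the window size and the search area, which is what makes the problem expensive on a CPU.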
The black box in Figure 5 is moved slightly between frames; the window search detects this displacement.
Figure 6: The window is compared to the image at time 1 in different test positions.
4 GPU Implementation
Figure 7
Figure 8
Figure 9
Figure 10
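The implementation details of this section survive only as the figure placeholders above. As a rough, hypothetical illustration of the stream-processing idea (not the authors' shader code), the exhaustive search can be phrased as a data-parallel map: for each candidate displacement an error map is computed over all pixels at once, and the per-pixel minimum across displacements picks each pixel's flow vector. A vectorized NumPy sketch standing in for the GPU passes:

```python
import numpy as np

def flow_field_ssd(frame_t, frame_t1, radius=1, search=1):
    """For each candidate displacement, compute an SSD error map over
    ALL pixels at once, then take the per-pixel argmin across the
    displacements. Every pixel's result is independent, which is the
    data-parallel structure a GPU stream processor exploits.
    (Borders wrap around via np.roll; kept simple for brevity.)"""
    errs, disps = [], []
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(frame_t1, -dy, axis=0), -dx, axis=1)
            diff2 = (frame_t - shifted) ** 2
            # Box-sum the squared differences over each pixel's window.
            err = np.zeros_like(frame_t)
            for wy in range(-radius, radius + 1):
                for wx in range(-radius, radius + 1):
                    err += np.roll(np.roll(diff2, -wy, axis=0), -wx, axis=1)
            errs.append(err)
            disps.append((dx, dy))
    best = np.argmin(np.stack(errs), axis=0)  # per-pixel winning displacement
    dxs = np.array([d[0] for d in disps])[best]
    dys = np.array([d[1] for d in disps])[best]
    return dxs, dys

# Toy frames: a 2x2 bright block shifts one pixel to the right.
f0 = np.zeros((8, 8), dtype=np.float32)
f0[3:5, 3:5] = 1.0
f1 = np.roll(f0, 1, axis=1)
dx, dy = flow_field_ssd(f0, f1)
assert dx[3, 3] == 1 and dy[3, 3] == 0
```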
Figure 11: The Hamburg Taxi optical flow using the more sophisticated Nagel second-order derivative method.
Figure 12
5 Future Work
References