SOFTWARE ENGINEERING IN VIRTUAL REALITY




Jeffrey E. Runde

Department of Software Engineering

University of Wisconsin Platteville

Platteville, WI 53818

rundeje@uwplatt.edu



Abstract


Virtual Reality is defined as an alternate environment presented to a user in such a way that it appears to be the user's real environment. This "virtual environment" provides sensory input to the user and reacts to the user's movement in real time. With continuing advancements in hardware and software design, generated virtual environments are becoming ever closer to the real environment.


My paper will cover the four main components used to create a virtual environment and the software interactions between them. The first component is motion tracking, which captures input from the movements of the user. The second and most important component of virtual reality is the simulation hardware and software, which generates the virtual environment for the user. The third component is the set of user input devices; depending on the objective of the virtual reality experience, these may include data gloves, joysticks, and other props. The final component of virtual reality is motion feedback, most commonly provided through a motion base that moves as the virtual environment moves.





Introduction


Virtual reality is a very useful tool consisting of hardware and software components. A large amount of the current development work in virtual reality is application development: nearly any real environment can be recreated virtually using software ranging from open source libraries to fully developed applications.




What is Virtual Reality?


A common definition of Virtual Reality is an interactive, computer-generated simulation environment or virtual environment [5]. A "virtual environment" provides sensory input to the user and reacts to the user's movement in real time. The most common use of virtual reality is immersive VR, in which a human user is placed in a virtual environment to evaluate that environment. Since virtual reality is such a loosely defined field, there are also many other uses.



Components of Immersive Virtual Reality


There are six main tools typically found in an Immersive Virtual Reality setup. The first is a motion tracking device, whose job is to track objects in the real environment and provide that data to the simulation for use in the virtual environment. These objects usually include a human user but can also include physical objects that exist in both environments. The motion tracking system communicates with the motion tracking software, whose job is to correlate the data from the motion tracking system and stream it to the simulation software. The simulation software is the central point of the system; it takes the tracking data and the stream from any data acquisition being done and generates the virtual environment. The virtual environment consists of a display view for the user to see and can also include other feedback such as motion. The display device's responsibility is to present the virtual environment to the user, as either a stereo or a mono view. A motion device or motion base provides motion feedback to the user; it is most often used in driving or flying simulations [1].
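To make the roles of these six components concrete, the sketch below lists them as software interfaces. This is a minimal illustration only; the class and method names are hypothetical and do not come from any particular VR package.

class MotionTrackingSystem:
    """Hardware layer: cameras or magnetic sensors sampling the real volume."""
    def capture(self): ...

class MotionTrackingSoftware:
    """Correlates raw samples into object poses and streams them onward."""
    def correlate_and_stream(self, raw_samples): ...

class DataAcquisition:
    """Reads joysticks, props, and any attached simulation hardware."""
    def read(self): ...

class SimulationSoftware:
    """Central point of the system: combines tracking and acquisition data."""
    def update(self, poses, inputs): ...
    def render(self): ...

class DisplayDevice:
    """Presents the generated view (mono or stereo) to the user."""
    def show(self, frames): ...

class MotionDevice:
    """Motion base providing physical feedback as the environment moves."""
    def move(self, cues): ...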



Figure 1: Immersive VR System (Motion Tracking System → Motion Tracking Software → Simulation Software → Display Device, with Data Acquisition feeding the Simulation Software and a Motion Device providing motion feedback)


Motion Tracking


A true Immersive Virtual Reality system does some sort of motion tracking of the user and of any other objects needed in the real and virtual environments. The motion tracking captures the user's movement in the tracking volume; it must be able to determine the position and orientation of each object. The user is limited to the volume of space where the motion tracking can operate. There are many different types of motion tracking systems available, ranging in price from a few hundred dollars to hundreds of thousands depending on the intended use of the system.


Motion tracking systems are also used in other fields, such as video game development. For example, most modern sports video games bring in real human subjects and record their body motion to be digitized and inserted into the game. The movie industry also uses motion tracking to capture actor movements, which are digitized and mapped onto computer-generated characters [4].


There are two very common setups for motion tracking systems. The first is an optical setup, which uses infrared cameras to track highly reflective markers placed on the user. The second common type generates a magnetic field inside the tracking volume, which is used along with a receiver to triangulate positions and orientations.


Generally an optical system costs more than a magnetic-based system, but it supports a much larger tracking volume, up to the size of a full basketball court. An optical system works by placing a series of cameras around the perimeter of the volume to be tracked. These need to be in fixed positions because the tracking software calculates where each camera is with respect to the other cameras. This allows the software to triangulate the position of each marker inside the tracking volume [4].
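The triangulation step can be illustrated with a small worked example. Given each camera's known, calibrated position and the direction toward a marker seen in its image, the marker position that best fits all of the rays can be found by least squares. This is a generic sketch with made-up camera positions, not the algorithm used by any specific tracking product.

import numpy as np

def triangulate_marker(origins, directions):
    """Estimate a marker's 3-D position from several camera rays.

    Each ray is a camera's position plus a unit direction toward the
    marker; the returned point is the one closest to all rays in a
    least-squares sense.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector perpendicular to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Hypothetical example: three cameras around the volume all see a marker
# near (1.0, 2.0, 1.5); these positions are illustrative only.
origins = [np.array([0.0, 0.0, 3.0]),
           np.array([5.0, 0.0, 3.0]),
           np.array([0.0, 5.0, 3.0])]
target = np.array([1.0, 2.0, 1.5])
directions = [target - o for o in origins]
print(triangulate_marker(origins, directions))   # approximately [1.0, 2.0, 1.5]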


A magnetic-based system is much more limited in its range due to the range of the magnetic field; it supports a maximum volume size of about 125 cubic feet. A magnetic motion tracking system works by generating a magnetic field which is read by a series of trackers fixed to the user. Each tracker can determine its location and orientation inside the magnetic field. There are two disadvantages to a magnetic-based system. The first is that each tracker requires a wire leading back to the main system. The second is the requirement that nothing in the tracking volume be made of metal, because metal will alter the magnetic field and shift the locations the trackers believe they are at.


The general optical motion tracking loop contains four steps. In the first step, the cameras each capture and stream an image back to the tracking software. The software then calculates the three-dimensional location of each marker in the volume based on the known location of each camera and the camera image; a minimum of three cameras must see a marker for its location to be determined. Once the location of each marker is known, the software determines which markers belong to which objects. Each trackable object is predefined for the motion tracking software and has a non-symmetrical pattern of markers attached to it. This allows the tracking software to recognize the object's three-dimensional marker pattern and then calculate the position and orientation of the object. Having non-symmetrical patterns is very important; otherwise the absolute position and orientation cannot be determined. Once each object is tracked, its position and orientation are streamed to the network to be picked up by the simulation software. This entire process runs at about 120 Hz and requires a very capable PC or workstation [4]. If the tracking slows down and gets choppy, it affects everything down to the user.
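The four-step loop can be sketched as follows. The callables and object structure are hypothetical stand-ins, not the interface of any real tracking system; only the overall flow and the roughly 120 Hz rate come from the description above.

import time

TRACKING_RATE_HZ = 120                    # rate quoted above, ~8.3 ms per pass

def tracking_loop(cameras, locate_markers, match_objects, stream):
    """One possible shape for the optical tracking loop."""
    period = 1.0 / TRACKING_RATE_HZ
    while True:
        start = time.monotonic()
        images = [cam.capture() for cam in cameras]   # 1. get camera images
        markers = locate_markers(images)              # 2. 3-D marker locations
        objects = match_objects(markers)              # 3. match markers to the
                                                      #    predefined rigid bodies
        stream(objects)                               # 4. stream object poses to
                                                      #    the simulation software
        # Hold the loop at ~120 Hz; if it overruns, the tracking gets choppy
        # and the user feels it downstream.
        time.sleep(max(0.0, period - (time.monotonic() - start)))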



Figure 2: Motion Tracking System (get camera images → determine locations of markers → determine location and orientation of objects → stream object data)



Virtual Environment Generation Software


The virtual environment generation software creates the virtual environment. It uses the user's tracking data streamed from the tracking system along with any other input from the real environment. The virtual environment consists of what the user will see, the visuals, and sometimes what the user will hear. It also contains the dynamics of other objects in the environment, for example a virtual machine being operated by the user or a virtual plane being flown. Finally, the virtual environment contains the motion feedback to the user.



There are many different applications available that handle a few or all of these tasks. Virtual Reality is still very much in the development stage, so most users create a customized application that accomplishes what they need to do. From driving simulations to visibility checks to entertainment systems, development is driven by the users [2].


Figure 3: Virtual Environment Generation (import environment objects, then loop: read tracking stream → read data acquisition/simulation stream → adjust location and orientation of objects → generate display view(s) and motion feedback)


Figure 3 details a typical immersive virtual reality application. First the virtual environment objects are loaded; this could include machine geometry, environment terrain, and anything else needed in the virtual environment. Next the main software loop starts executing by reading the tracking stream from the tracking software, which is already running. Then any data acquisition is done, including all the user input devices, such as joysticks or a steering wheel, and any simulation hardware added to the loop, such as a production machine ECU. Next the software takes all of the gathered data and generates the virtual scene for the user. This includes adjusting the location and orientation of the tracked objects and updating the environment with the latest simulation data. Finally the images are drawn and sent to the user via a display device. Any other output devices, such as a motion base or force feedback system, are also updated.
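As a rough sketch, the function below mirrors the loop in Figure 3. Every object it takes is a hypothetical stand-in for the application's tracking reader, data-acquisition layer, scene, display device, and motion base; none of the names refer to a real package.

def run_virtual_environment(tracking, acquisition, scene, display, motion_base):
    """One possible shape for the main loop of Figure 3."""
    scene.load_objects()                      # machine geometry, terrain, ...
    while scene.running:
        poses = tracking.read()               # stream from the tracking software
        inputs = acquisition.read()           # joysticks, steering wheel, ECU data
        scene.update(poses, inputs)           # move tracked objects, apply the
                                              # latest simulation data
        frames = scene.render()               # one image, or one per eye for stereo
        display.show(frames)                  # send to the display device
        motion_base.update(scene.motion_cues())   # motion base / force feedback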


Generally the most important part of the virtual environment for the user is the visual aspect. This is where frame rates and smoothness of the display become big factors in the user's perception of and experience with virtual reality. Ideally the display system would run in real time at faster than thirty frames per second, and optimally at around sixty frames per second. This requires that the software's main loop execute that fast.
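In concrete terms, sixty frames per second leaves roughly 16.7 milliseconds for the entire main loop, and thirty frames per second about 33 milliseconds. The pacing helper below is a generic illustration of that budget, not part of any package mentioned in this paper.

import time

def run_at(target_fps, frame_fn):
    """Call frame_fn repeatedly while trying to hold target_fps."""
    budget = 1.0 / target_fps                 # 1/60 s ~ 16.7 ms, 1/30 s ~ 33 ms
    while True:
        start = time.monotonic()
        frame_fn()                            # tracking read, acquisition,
                                              # scene update, rendering
        elapsed = time.monotonic() - start
        if elapsed > budget:
            # The frame took too long; the user will perceive choppiness.
            print(f"frame overran: {elapsed * 1000:.1f} ms > {budget * 1000:.1f} ms")
        else:
            time.sleep(budget - elapsed)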



Virtual Reality Uses


EDS Jack is an example of a commercially available virtual reality software package. It is mainly used for visibility and ergonomics studies, two areas where Virtual Reality provides real benefits. For example, when designing a large mechanical device such as a bulldozer or even a car, visibility and ergonomics are very important to the operators. Would you buy a car that was uncomfortable to drive or had poor visibility? Probably not. Many companies spend a large amount of money making their products interface better with the operators. Building prototypes is very expensive, upwards of a few million dollars for one machine in the bulldozer example. By using virtual reality, a company can check the visibility and ergonomics of its machine quickly and make changes without ever spending money on building hardware.


Another area where Virtual Reality is heavily used is driving and flying simulations. These give users a chance to gain expertise operating a vehicle without the real-world consequences of making a mistake.


MPI Vega Prime is an example of a software package that supports any type of driving simulation. The user builds the virtual environment within the software package. Its biggest advantage is its realistic physics engine, which supports collision detection.


Flight simulators are the most common type of machine simulation. Other examples include the US Army's use of simulators to train tank soldiers with virtual tank wars, and NASA's use of a virtual reality simulator to train its astronauts to land the space shuttle [1].



Display Devices


The visualization part of virtual reality is very important to convincing a user that they are in a virtual environment. The most common tool for this is a head mounted display, or HMD, which sits on the user's head and contains the display. The more expensive HMDs contain a display for each eye, creating stereo vision, or a three-dimensional view, for the user. The less expensive HMDs contain only one display, providing a mono view for the user; this can still be useful depending on the objective of the setup. HMDs also vary widely in performance. The three biggest factors in an HMD are its resolution, refresh rate, and field of view. A high resolution allows the user to see finely detailed objects in the virtual environment. A quick refresh rate gives the user a sense of fluidity when looking around rather than a choppy image. Finally, a large field of view keeps the user from feeling like they are looking through a tunnel to see the environment. Each of these parameters needs to be considered when choosing an HMD for a virtual reality system. The typical cost of an HMD ranges from a hundred dollars to forty or fifty thousand dollars for a stereo HMD with a high resolution, high refresh rate, and wide field of view [3].
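One way to see the trade-off between resolution and field of view is pixels per degree: the same pixel count spread over a wider field of view gives coarser angular detail. The values below are purely illustrative, not the specifications of any real HMD.

def pixels_per_degree(horizontal_pixels, horizontal_fov_deg):
    """Rough angular resolution: display pixels spread over its field of view."""
    return horizontal_pixels / horizontal_fov_deg

# Hypothetical displays with the same pixel count but different fields of view.
print(pixels_per_degree(1280, 45))   # narrow FOV: about 28 pixels per degree
print(pixels_per_degree(1280, 90))   # wide FOV: about 14 pixels per degree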


Another display tool is a cave. A cave usually consists of three walls and a floor, with projectors displaying an image on each. By using three-dimensional shutter glasses or polarized filters, a user standing inside a cave can experience a full virtual environment with a very wide field of view. Caves are mostly used for driving or flying simulators where the user won't be moving around a lot. Finally, a cheaper simulator can be built with a dome or stereo wall, which uses the same setup as a cave.


Conclusion


Virtual reality can be a very useful tool. It has many applications, from product development to entertainment. It is still very much in the development stage, with many users creating their own customized applications and setups to suit their needs.



References


1. Vertical Motion Simulator, retrieved April 4, 2005, from http://www.simlabs.arc.nasa.gov/vms/vms.html

2. Virtual Reality Applications Center, retrieved April 3, 2005, from http://www.vrac.iastate.edu/

3. Head Mounted Displays, retrieved April 6, 2005, from http://www.vrealities.com/hmd.html

4. Vicon Products, retrieved April 6, 2005, from http://www.vicon.com/products/index.htm

5. Virtual Reality Definition, retrieved April 4, 2005, from http://glossary.its.bldrdoc.gov/fs-1037/dir-039/_5809.htm