Augmented Reality Visualization of Outdoor Environmental Corrosion





Research Proposal

Augmented Reality Visualization of Outdoor Environmental Corrosion


James Walsh


100063615

WALJA008


for Bachelor of Computer Science (Honours)



Supervisor: Professor Bruce Thomas



13th June 2010



Wearable Computer Lab

School of Computer and Information Science

University of South Australia





Abstract

Augmented reality is the ability to enhance reality with additional virtual information and functionality. Mobile augmented reality provides the ability to apply augmented reality beyond fixed systems, such as when roaming indoor or outdoor environments. The application of situated visualization within mobile augmented reality has shown promise in aiding the understanding of our environment.

Current methods for the maintenance of large structures involve regular manual inspections by personnel to establish the condition of the structure. Previous attempts to semi-automate this process via the use of wireless environmental sensors still result in a loss of contextual data when reviewing the data logs. With previous work indicating the possibility of an increased understanding of our environment by visualizing sensor data in situated augmented reality, this research is motivated to develop visualization techniques to aid in the representation of outdoor environmental corrosion in mobile augmented reality. Preliminary results indicate a successful application of in-situ visualization of outdoor environmental corrosion in mobile augmented reality.




Contents

Abstract
List of Figures
List of Tables
Introduction
Motivation
Research Question
Literature Review
Augmented Reality
Definition of AR
Classification of AR
Applications of AR
Sensor Visualization
Limitations
Research Design
Initial Analysis
Research Methodology
Expected Outcomes
User Evaluations and Ethics
Proposed User Evaluation Form
Preliminary Results
Schedule
Proposed Thesis Table of Contents
Summary
References






List of Figures

Figure 1 The Reality-Virtuality Continuum (Milgram et al., 1995)
Figure 2 ARVino viewport
Figure 3 SiteLens viewport
Figure 4 SensAR visualizing temperature
Figure 5 Handheld humidity visualization
Figure 6 Proposed Tinmith shared library architecture
Figure 7 Proposed iterative development cycle
Figure 8 The Plasma effect shown at various degrees of severity
Figure 9 The '(rust) crystal' representation, showing three stages (left-to-right) of the corrosion (15%, 35% and 70%)
Figure 10 The revised Box (second revision) and Gauge (fourth revision) sensor visualizations
Figure 11 Initial visualization system prototype showing a semi-opaque transparent overlay (left) and dialog overlay (right)
Figure 12 Current visualization system running in the Tinmith emulator showing first person (left) and bird's eye (right) views






List of Tables

Table 1 Proposed schedule
Table 2 Proposed thesis Table of Contents




Introduction

Located to the left of the Reality-Virtuality Continuum (Milgram et al., 1995), Augmented Reality (AR) is the supplementing of the real world with virtual information and functionality (Azuma et al., 2001), with mobile AR being a classification of AR based on the type of AR system used. The information used to augment the world can be static (Feiner et al., 1993, Feiner et al., 1997) or dynamic (Gerhard et al., 2004, Thomas et al., 2000).

We can extend the definition of visualization (the use of computer-supported, interactive, visual representations to increase understanding and cognition (Card et al., 1999)) to situated (in-situ) visualization, where visualizations are relevant to the context in which they are displayed (White et al., 2007). The use of context-sensitive visualization aids the user's understanding of the environment. Following previous efforts examining the use of situated scientific visualization in AR (White et al., 2007, Belhumeur et al., 2009, White, 2009a, Rauhala et al., 2006), the possibility of increased cognitive perception is present (Rauhala et al., 2006).

Motivation

Today's urban structures contain an ever-increasing number of visible and invisible components, which may remain hidden from visitors without aids to increase observation (White et al., 2007). Current methods for the regular maintenance inspection of large structures involve a time-consuming, manual examination by a site inspector to evaluate the structure's condition and integrity. The inspection currently involves the inspector physically walking around a structure and examining each component as they move, making notes regarding the structure's condition at that position.

Previous work has sought to improve upon these manual inspections by automating the collection of environmental data that may impact the structure. Kong (2009) developed a sensor system that enables the positioning of numerous wireless sensors over a structure to monitor the environment at each position at regular intervals. Each sensor consists of a corrosion sensor, a humidity sensor, and internal and external thermometers. Currently, this data is available only in numerical form, losing the important contextual data associated with each sensor's real-world position. It also relies on the inspector's ability to translate the spreadsheet given to them into meaningful inferences regarding which areas of the structure have been affected, deducing possible relationships between sensors located near one another.

The application of visualization in AR to on-site inspections enables inspectors to see a wealth of previously unavailable information (White et al., 2007, White and Feiner, 2009).

Research Question

This research answers the question: "What are the most suitable techniques for visualizing outdoor environmental corrosion in mobile augmented reality?"

The scope of the research project is limited to the visualization of corrosion and its associated temperature and humidity data over time. The proposed techniques are designed only to enable a more intuitive, context-sensitive understanding of sensor data for on-site structure inspections. The proposed visualization system will work within the constraints imposed upon it by the Tinmith AR platform.

The remainder of this research proposal is as follows: a literature review of the areas of research relating to this project, concluding with an analysis of the literature that justifies the contribution of this research; this is followed by outlines of the research design and methodology that will be applied, along with the anticipated outcomes. An overview of the required user evaluations and associated ethics considerations is followed by preliminary results of the research and a proposed schedule of expected milestones.





Literature Review

This section provides a review of relevant literature. Given the research question, the literature review is broken into four sections: augmented reality definition, classification and applications, followed by sensor visualization, each providing an overview of research specific to that area. The section concludes with a review of the literature, identifying limitations and serving as a justification for the proposed research.

Augmented Reality

This section provides an overview of Augmented Reality research and current applications.

Definition of AR

Following Sutherland's (1968) pioneering work in mixed reality, Milgram et al.'s (1995) Reality-Virtuality continuum (Figure 1) represents the spectrum between the real and purely virtual worlds, and the cross-over between them. Four areas are identified: Real Environment, Augmented Reality, Augmented Virtuality, and Virtual Environment.

Figure 1 The Reality-Virtuality Continuum (Milgram et al., 1995)

One of the four positions on the continuum, Augmented Reality (AR), is the supplementing of the real world with additional, virtual information (Azuma, 1997). Located to the left of the continuum, augmented reality is more real than virtual, enabling the user to interact with the real world with additional knowledge. Given that the definition of reality is not limited to a single sense, the knowledge and interactions in AR can extend beyond the visual sense (Heidemann et al., 2004).

Classification of AR

The methods used in the implementation of AR can be separated into three areas: head-mounted display (HMD), handheld and projection-based AR (Zhou et al., 2008, Azuma et al., 2001).

Jun (1995) describes a method of AR where the world is augmented when looking through a specific device's viewport onto the world. Handheld AR systems apply this 'magic glass' technique, acting as a window to the augmented world. Unlike handheld AR, where users only experience AR in the limited area covered by the magic lens viewport, mobile AR is fully immersive, with users wearing a head-mounted display or similar device to interact with the augmented world (Milgram et al., 1995).

Applications of AR

Many areas have been identified as being able to benefit from AR (Azuma, 1997). Given mobile AR's ability to support a user's movement over large geographical areas, one application is the use of AR in large-scale, outdoor navigation. Previous work has examined the effectiveness and technological considerations of outdoor AR navigation (Feiner et al., 1997, Thomas et al., 1998, Gerhard et al., 2004).

The Touring Machine (Feiner et al., 1997) provided point-of-interest guidance for the exploration of outdoor environments. By augmenting the user's view with labelling for buildings and important structures, the system served as a virtual tour guide for those unfamiliar with the environment.

Similarly, the Tinmith mobile AR system (Piekarski and Thomas, 2003), originally developed for outdoor architectural design, could be used as a virtual compass (Thomas et al., 1998). By using a floating diamond to indicate the destination's direction and distance, the system could guide users to their destination.

There have been a number of extensions to Tinmith, including a 'first person shooter' game, ARQuake (Thomas et al., 2000), an outdoor weather simulator, ARWeather (Heinrich et al., 2008), x-ray vision (Piekarski, 2009), and terrain visualization for viticulture, ARVino (Piekarski et al., 2005). ARVino supported the visualization of geographical information system (GIS) data in mobile AR, enabling users to view geographic areas through the HMD, with additional information inferred from the colour scheme overlaid on top of the terrain (Figure 2).

Figure 2 ARVino viewport

Sensor Visualization

Taking the definition of visualization as the use of computer-supported, interactive, visual representations to increase understanding and cognition (Card et al., 1999), we define situated (in-situ) visualization as visual representations that are relevant to the context in which they are displayed, as a method to increase a user's understanding of the environment (White et al., 2007). When sensors are used to monitor the environment, the primary goal of sensor visualization is the conversion of collected data into visual representations that aid understanding (Yuxi et al., 2009).

The SiteLens system (White and Feiner, 2009, White, 2009a, White, 2009b) is an example of in-situ visualization in AR, consisting of a handheld AR system used to visualize carbon dioxide (CO2) levels for on-site building inspectors. Carbon dioxide levels are represented as floating spheres/cylinders whose height and size are indicative of the carbon dioxide level recorded at that physical position (Figure 3). Ambiguity regarding when the data was collected led to confusion among users, who were unable to understand the temporal meaning of the data. The use of a handheld magic-lens approach allowed for a more natural method of interaction; however, the system was limited to displaying only one type of sensor data, with no discussion of handling or viewing multiple dataset parameters and the possible relationships between them. An extension of this work also enabled object and shape recognition (Belhumeur et al., 2009), allowing the system to detect known objects when placed in the field of view.

Figure 3 SiteLens viewport

The Ubiquitous Augmented Reality (UAR) system (Xinyu, 2009) combines mobile AR with ARToolkit (http://www.hitl.washington.edu/artoolkit/) to enable the visualization of indoor environmental sensor data. Upon viewing a tracking marker, data from a sensor located at the marker's position is overlaid on top of the marker, indicating the sensor's real-time status. Similarly, Goldsmith et al. (2008) present SensAR, an ARToolkit visual tracking system that overlays sensor values in real time. The simple use of metaphors in visualization representations enables intuitive understanding of data, such as the use of thermometer representations for temperature (Figure 4).

Figure 4 SensAR visualizing temperature

Despite the project's focus as a proof of concept, a handheld system designed for humidity sensor visualization (Gunnarsson et al., 2006) demonstrated the feasibility and effectiveness of real-time, handheld AR visualization. Again, by using visual ARToolkit tracking, sensor values could be positioned on top of the user's viewport. Interpolation of the sensor data (however small in scale) was used to indicate approximate values for the surrounding areas (Figure 5), a point of difference from the previous systems, which displayed only individual values, despite all systems showing individual data values over distributed areas.

Figure 5 Handheld humidity visualization

The application of visualization to large-scale environmental monitoring has also been examined (Fan and Biagioni, 2004, Yuxi et al., 2009). Although not using AR, Yuxi et al. (2009) demonstrated the use of sensor visualization in monitoring remote wetland environments. The wireless network topology is visualized as a layer on top of a geographic map, using colour codes to indicate the relationships between nodes. However, despite the application of mapping to the visualization, the actual visualization of logged data is a traditional line graph. Similarly, Fan and Biagioni (2004) propose the GRASS GIS system, also for large-scale environmental monitoring. Using a wireless sensor network, environmental data is logged, with rainfall visualized as colour-coded Voronoi diagrams and temperature visualized as line graphs.

Whilst not visualizing sensor data, Malkawi and Choudhary (1999) demonstrate the visualization of simulated heat transfer within a structure over time. By adjusting the structure's physical parameters, the visualization adjusts to reflect the transfer of heat within the structure. Rad and Khosrowshahi (1997) extend this concept by visualizing the impact of parameters on a structure's maintenance over time. Following a set of mathematical models, the structure visibly decays, representing the maintenance required for its upkeep.

Limitations

This section provides an analysis of the research presented in the literature review. By analysing the limitations of the current systems and methodologies, we can identify areas of opportunity for innovation, development and further research. The section serves as justification for this research's contributions.

Current systems for the AR visualization of sensor data have only had the capability to visualize a single parameter. Despite certain conditions, such as environmental monitoring, containing a number of logged parameters, the visualizations have only enabled the inclusion of one parameter, thus obscuring any relationships that may be present between the parameters.

Even with a single parameter, only one AR system has included the ability to interpolate between values at fixed points. Given the possible application of in-situ visualization across large areas, it would be impossible to log data from every point, suggesting that interpolation would be a valuable asset in representing the collected data.
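
To make this concrete, the following minimal sketch (in C++, the language of the Tinmith platform) shows one way such interpolation could be performed; the inverse-distance weighting and the structure names are assumptions for illustration only, not a technique taken from any of the reviewed systems.

    #include <cmath>
    #include <vector>

    // A reading fixed at a known 2D position on the structure (illustrative).
    struct SensorReading {
        float x, y;    // position on the structure, in metres
        float value;   // e.g. corrosion severity recorded at that point
    };

    // Inverse-distance-weighted estimate of the value at (px, py), blending
    // the readings of the surrounding fixed sensors.
    float interpolate(const std::vector<SensorReading>& sensors, float px, float py) {
        float weightedSum = 0.0f, weightTotal = 0.0f;
        for (const SensorReading& s : sensors) {
            float dx = px - s.x, dy = py - s.y;
            float dist2 = dx * dx + dy * dy;
            if (dist2 < 1e-6f)        // query point sits on a sensor: return it directly
                return s.value;
            float w = 1.0f / dist2;   // closer sensors dominate the estimate
            weightedSum += w * s.value;
            weightTotal += w;
        }
        return weightTotal > 0.0f ? weightedSum / weightTotal : 0.0f;
    }

Such a scheme would allow a visualization to shade the regions between sensors rather than rendering isolated points only.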


Despite work relating to the visualization of building conditions, there currently exists a gap between the use of environmental sensor networks and the application of that data in building visualization.

Current systems do not inform the user of any abnormalities or rapid changes within a data set (no analysis is performed). For large data sets, one could assume that automatically detecting and identifying rapid changes and outliers for the user would be a desirable function. Similarly, the ability to retrieve exact data values in conjunction with their visualization does not seem to be readily supported by current tools.









Research Design

This section outlines the proposed design of the research and is split into three areas: Initial Analysis, Research Methodology and Expected Outcomes. The Initial Analysis provides the foundation for establishing the plan and research method that constitute the Research Methodology. The section concludes with the expected outcomes of the research.

According to Shaw (2002), the classification of this research is a 'method of development', with the expected result being a 'procedure or technique'; given the research, this will be a technique.

Initial Analysis

An initial review of the planned research has been conducted to determine the best approach and methodology for the desired outcome. The platform of development will be the Tinmith (Piekarski and Thomas, 2003) mobile AR system. Despite the integration of my research as part of an existing system, the development of a new 'plug-in' model for Tinmith will enable the construction of the visualization system with minimal interaction with Tinmith. A shared header interface (Figure 6) will enable the construction of a separate system without relying on detailed knowledge of the Tinmith system. This header will also enable the use of the visualization subsystem by any OpenGL application that implements the shared header's interface. The visualization system will contain no Tinmith-dependent code.


Figure 6 Proposed Tinmith shared library architecture
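
As an illustration of the kind of shared header this implies, the sketch below shows a minimal C-style interface that any OpenGL host could call; all names (vis_init, vis_draw, vis_menu_option, vis_shutdown) are hypothetical and do not reproduce the actual Tinmith interface.

    // vis_plugin.h -- hypothetical shared header between a host (Tinmith or any
    // other OpenGL application) and the visualization library.
    #pragma once

    #ifdef __cplusplus
    extern "C" {
    #endif

    // Called once after the library is loaded; returns non-zero on success.
    int vis_init(const char* sensor_dataset_path);

    // Called by the host every frame, after it has positioned the OpenGL
    // modelview matrix at an agreed origin and orientation.
    void vis_draw(void);

    // Menu selections forwarded from the host's input system; the library
    // tracks its own menu state (communication is one-way, host to library).
    void vis_menu_option(int option_id);

    // Called once before the host unloads the library.
    void vis_shutdown(void);

    #ifdef __cplusplus
    }
    #endif

Because the header carries no Tinmith types, any OpenGL application that implements these calls could host the visualization subsystem.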

Given the nature of the research in creating effective representations for outdoor environmental corrosion, preliminary research will be conducted to provide the background knowledge needed for the effective design and selection of the visualizations. This will serve as part of the first step in the analyse, design, implement and evaluate process. Following a review of the proposed system design, implementation will occur on the Tinmith plug-in, after which the process will be evaluated to determine the continued feasibility of the methodology.

Research Methodology

Following an initial review of the nature of the research, an iterative analyse, design, implement and evaluate development cycle is proposed (Figure 7). Given the timeline for the research, two iterations are proposed, with user evaluations performed at the conclusion of each iteration. These evaluations will provide an indication of progress and provide feedback as grounds for future improvement of the system. Both iterations will focus on the development of effective and intuitive visualizations for outdoor environmental corrosion, with each iteration resulting in a running system to ensure the system's availability for user evaluations.











Figure 7 Proposed iterative development cycle (Analysis → Design → Implementation → Evaluation)

Expected Outcomes

The outcome of this research is intended to be a set of visualizations that effectively represent outdoor environmental corrosion in mobile AR. For the visualization techniques, full descriptions of the development process will be provided, along with a functional visualization system prototype (as a plug-in to the Tinmith mobile AR system) to demonstrate the visualizations. The system will support the viewing of detailed numerical data in conjunction with its visual representation, along with automatically identifying 'areas of risk' on the structure and directing the user to them. The visualization system will be able to be implemented by any system that implements the system's 'plug-in' interface.

Full user evaluations of the visualizations will be available as justification for their development history, outlining why specific facets were designed as they were, along with additional comments relating to their effective implementation. The requirements for implementing the visualization system in other systems will be given, along with a list of the essential and non-essential hardware required to interact with the system. Specifications and working examples for the reading, transmission and storage of data from the wireless sensors will also be provided.

Full source code will be provided for the visualizations and the sensor visualization system, along with all associated utilities that have been developed. Although not a contribution in itself, a side outcome of the research will be a generic plug-in interface and template for the Tinmith system; the interface specification, along with a working example, will be made available.






User Evaluations and Ethics

This section outlines the involvement of user evaluations as part of the research, along with the associated ethics considerations. The requirement for user involvement in the evaluation of visualization techniques leads to the requirement for ethics approval by the university. Given the nature of the data required, it is expected that the ethics proposal will fall under the 'low risk' category. The research's ethics application is already in progress and is expected to be completed mid-June.

Given the reasoning behind using a mix of qualitative and quantitative methodologies for evaluating visualizations (North, 2006), the user survey will involve a range of questions serving both methods of measurement. The remainder of this section provides a sample of the expected user evaluation form that will be used to gauge the effectiveness of the proposed visualizations.

Proposed User Evaluation Form

Questionnaire
Visualization of Outdoor Environmental Corrosion Sensor Data in Mobile Augmented Reality

Name:
Age:
Gender: M / F (Please circle)

Have you used an augmented reality system before?
No / Yes / Extensively (Please circle)

Have you used a wearable computer system before?
No / Yes / Extensively (Please circle)

Please answer the following questions based on your interpretations of the system:

1. Which sensor had the most corrosion? (Please circle)
Left-Near / Left-Far / Right-Near / Right-Far

2. Which sensor was the first to draw your attention? (Please circle)
Left-Near / Left-Far / Right-Near / Right-Far

3. What was the inside temperature at the left-near sensor? ________ °C

4. What was the outside temperature at the left-far sensor? ________ °C

5. Which sensor requires the most urgent attention? (Please circle)
Left-Near / Left-Far / Right-Near / Right-Far

6. Did you prefer the 'plasma' or 'crystal' effect for showing corrosion? (Please circle)
Neither / Plasma / Crystal / Both

7. Why? ___________________________________________________________________________

8. Did you prefer the 'box' or 'gauge' sensor visualization for representing the sensor? (Please circle)
Neither / Box / Gauge / Both

9. Why? ___________________________________________________________________________

10. How intuitive was the 'box' sensor to understand? (Please draw a line to indicate your preference)
Not Intuitive _____________________ Very Intuitive

11. How intuitive was the 'gauge' sensor to understand? (Please draw a line to indicate your preference)
Not Intuitive _____________________ Very Intuitive

End of Questionnaire






Preliminary Results

This section reports on the preliminary results of the research to date, covering the development of the majority of the first iteration of the system.

A number of visualizations to represent the sensors' values were designed, initially prototyped in Photoshop and then in OpenGL (and since implemented in Tinmith). Two representations were created to represent corrosion, the Plasma (Figure 8) and Crystal (Figure 9) effects, with two other representations, the Box and Gauge (Figure 10), used to represent the values of the sensors' other attributes.


The centre hue of the Plasma's transparent gradient remains constant, with the size of the gradient varied to indicate the severity recorded at that position and to convey that severe corrosion at a single spot sensor is most likely to also be affecting the areas around it. Red was selected as the most obvious colour to indicate problematic areas.


Figure 8 The Plasma effect shown at various degrees of severity
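
As a minimal sketch of how such a gradient could be rendered (assuming the fixed-function OpenGL pipeline for brevity; this is illustrative and not the actual Tinmith drawing code):

    #include <GL/gl.h>
    #include <cmath>

    // Draw a flat, radially fading red disc at (x, y, z). The centre hue stays
    // constant; only the radius is scaled by the corrosion severity (0.0 - 1.0),
    // so severe corrosion visibly bleeds into the surrounding area.
    void drawPlasma(float x, float y, float z, float severity) {
        const int   segments  = 48;
        const float maxRadius = 2.0f;            // radius at 100% severity (illustrative)
        const float radius    = maxRadius * severity;

        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

        glBegin(GL_TRIANGLE_FAN);
        glColor4f(1.0f, 0.0f, 0.0f, 0.8f);       // strong red centre
        glVertex3f(x, y, z);
        glColor4f(1.0f, 0.0f, 0.0f, 0.0f);       // fully transparent rim
        for (int i = 0; i <= segments; ++i) {
            float a = 2.0f * 3.14159265f * i / segments;
            glVertex3f(x + radius * std::cos(a), y + radius * std::sin(a), z);
        }
        glEnd();

        glDisable(GL_BLEND);
    }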




Figure 9 The '(rust) crystal' representation, showing three stages (left-to-right) of the corrosion (15%, 35% and 70%)

It was envisioned that numerous crystals would semi-randomly 'grow' out of a sensor, with their size and complexity (and thus visibility) indicating the amount of corrosion present at that position.


Figure 10 The revised Box (second revision) and Gauge (fourth revision) sensor visualizations

Concerns were noted regarding the differing scales of the inner and outer semi-circular measures (inside temperature, outside temperature and humidity); however, informal evaluations have found that users can accurately read the attribute values despite the varying scale.



Initial informal reviews of the prototypes (Figure 11) have been promising, with feedback indicating the plasma effect is an effective and intuitive indicator for corrosion. Concerns about the crystal obscuring the sensor's readability, along with ambiguity as to its meaning, were also raised. Following feedback, the colour scheme of the Box has been changed to use a more traditional blue-red colour spectrum to represent inside and outside temperatures. The requirement for users to 'guess' a value based on the Box's colour scheme was a primary driver behind the development of the Gauge, which facilitates the reading of sensor values within an acceptable error range.


Figure 11 Initial visualization system prototype showing a semi-opaque transparent overlay (left) and dialog overlay (right)

Tinmith has been extended to support the loading and execution of a dynamic, shared library in which the visualization system resides. The visualization system is completely independent of Tinmith, with only a common interface used to ensure compatibility. Communication is one-way, with Tinmith passing menu options to the library, but the library unable to pass data back to the system. Upon rendering a scene, Tinmith sets up the OpenGL instance, performs its own rendering, moves to a known origin and orientation, and executes the dynamic library's draw method. This keeps the visualization system independent of Tinmith, with the ability for it to be easily ported to any other OpenGL application.
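
A minimal sketch of how a host could load and invoke such a library on a POSIX system is shown below; it reuses the hypothetical vis_* entry points sketched earlier and is not the actual Tinmith loader.

    #include <dlfcn.h>
    #include <cstdio>

    typedef int  (*InitFn)(const char*);
    typedef void (*DrawFn)(void);

    int main() {
        // Load the visualization plug-in at runtime.
        void* lib = dlopen("./libvis_plugin.so", RTLD_NOW);
        if (!lib) { std::fprintf(stderr, "dlopen failed: %s\n", dlerror()); return 1; }

        InitFn vis_init = reinterpret_cast<InitFn>(dlsym(lib, "vis_init"));
        DrawFn vis_draw = reinterpret_cast<DrawFn>(dlsym(lib, "vis_draw"));
        if (!vis_init || !vis_draw) { std::fprintf(stderr, "missing plug-in symbols\n"); return 1; }

        vis_init("sensors.csv");   // hypothetical dataset path

        // Per frame the host would set up its OpenGL state, render its own scene,
        // move the modelview matrix to the agreed origin and orientation, then:
        vis_draw();

        dlclose(lib);
        return 0;
    }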



Figure 12 Current visualization system running in the Tinmith emulator showing first person (left) and bird's eye (right) views

Tinmith input menu operations are passed to the library, which tracks its own menu state. Upon loading, the library dynamically loads sensor instances from a dataset, allowing the user to navigate the dataset via the system's inputs. Hardware inputs, via the use of Phidgets (http://www.phidgets.com/), have been implemented, providing a dedicated set of inputs for controlling data navigation.
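
A minimal sketch of the in-library data structures this implies is given below; the field names are assumptions based on the sensor attributes described earlier (corrosion, humidity, internal and external temperature), and the real dataset format is not specified here.

    #include <cstddef>
    #include <string>
    #include <vector>

    // One logged sample from a wireless corrosion sensor (cf. Kong, 2009).
    struct SensorSample {
        std::string timestamp;
        float corrosion;      // severity, 0.0 - 1.0
        float humidity;       // percent
        float insideTempC;
        float outsideTempC;
    };

    // A sensor fixed at a GPS position on the structure, holding its logged history.
    struct Sensor {
        double latitude, longitude, altitude;
        std::vector<SensorSample> history;
        std::size_t current = 0;   // index of the sample the user is viewing

        // Step forwards or backwards through the log in response to menu or
        // Phidget input events forwarded by the host.
        void navigate(int delta) {
            if (history.empty()) return;
            long next = static_cast<long>(current) + delta;
            if (next < 0) next = 0;
            if (next >= static_cast<long>(history.size()))
                next = static_cast<long>(history.size()) - 1;
            current = static_cast<std::size_t>(next);
        }
    };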



Semi-opaque structures have been implemented, allowing for 'x-ray vision' to see the position of other walls and sensors through walls (Figure 12). Currently, the system supports the visualization of two structures, displaying sensors on two corners of each building (Figure 12). When idle, these sensors use the Box representation; however, when the user focuses on them, they rotate and shrink to display a Gauge representation providing more detailed and exact values. Users can select groups of walls from the structures that share a common property, in this case the direction (north, south, etc.) they are facing.

A formal evaluation of the first iteration of development is pending ethics approval.




Schedule

This section outlines the proposed research schedule.

Period | Task | Completed Status | Comments
Jan-Feb | Initial project analysis | 100% |
Feb | Literature Review | 100% |
Feb-Mar | Task identification and breakdown | 100% |
Mar | Propose representations for data visualization | 100% | Iteration 1
Mar-Apr | Create representations in OpenGL | 100% |
Apr | Establish Tinmith plug-in template | 100% |
Apr | Integrate representations into Tinmith | 100% |
Apr-May | Establish input handling and rendering of objects at correct GPS positions | 100% |
May | Ability to render and navigate structure in AR | 100% |
May | Integrate Phidgets inputs into system | 75% |
May | Ethics approval | 50% |
May | Prepare research proposal | 100% |
May-Jun | | 0% |
Jun | User testing | 0% |
Jun-Jul | Review of user testing | 0% | Iteration 2
Jul | Implement changes based on feedback | 0% |
Jul-Aug | User testing | 0% |
Aug | Review of user testing | 0% |
Aug-Sep | Begin writing thesis | 0% |
Sep | Review completed thesis | 0% |
Oct | Thesis submission | 0% |

Table 1 Proposed schedule





Proposed Thesis Table of Contents

Following the conclusion of the research, the written thesis will be provided, detailing the work undertaken. This is the proposed table of contents for the final work.

Abstract
Chapter 1. Introduction
  1.1 Motivation
  1.2 The Problem
  1.3 Research question
  1.4 Contribution
  1.5 Dissertation structure
Chapter 2. Literature review
  2.1 Augmented Reality
    2.1.1 Definition of AR
    2.1.2 Classification of AR
    2.1.3 Tinmith and Mobile AR
    2.1.4 Applications of AR
  2.2 Visualization
    2.2.1 Data Visualization
    2.2.2 Structure Visualization
  2.3 Summary
Chapter 3. Research methodology
  3.1 Initial analysis
  3.2 Research methodology
  3.3 Expected outcome
  3.4 Summary
Chapter 4. Visualization techniques
  4.1 Motivation
    4.1.1 Goals
    4.1.2 Considerations
  4.2 Design and Implementation
    4.2.1 Initial aim and requirements
    4.2.2 Design reasoning
    4.2.3 Limitations
    4.2.4 Model refinement technique
  4.3 Evaluation and discussion
    4.3.1 Benefits
    4.3.2 User Evaluations
    4.3.3 Discussion
  4.4 Summary
Chapter 5. Sensor Visualization System
  5.1 Motivation
  5.2 Design and implementation
    5.2.1 Aim and constraints
    5.2.2 Sensors and input devices analysis for Tinmith
    5.2.3 System construction
  5.3 Evaluation
    5.3.1 Benefits
    5.3.2 Limitations
    5.3.3 Evaluation
  5.4 Summary
Chapter 6. Conclusion
  6.1 Corrosion visualization techniques
  6.2 Considerations regarding visualization techniques
  6.3 Future work
  6.4 Concluding remarks
Chapter 7. References

Table 2 Proposed thesis Table of Contents






Summary

This research project is motivated to develop techniques to assist in the visualization of outdoor environmental corrosion in mobile augmented reality. The integration of sensor networks with situated visualization in mobile augmented reality will provide benefits for the on-site inspection of large structures. The use of in-situ visualization of environmental data provides an intuitive method of understanding information that was previously difficult to interpret and from which it was hard to extract meaningful inferences.





References

AZUMA, R., BAILLOT, Y., BEHRINGER, R., FEINER, S., JULIER, S. & MACINTYRE, B. (2001) Recent Advances in Augmented Reality. IEEE Comput. Graph. Appl., vol. 21, pp. 34-47.

AZUMA, R. T. (1997) A Survey of Augmented Reality. Computer Graphics and Applications, vol. 21, pp. 34-47.

BELHUMEUR, P. N., CHEN, D., FEINER, S., JACOBS, D. W., KRESS, J., LING, H., LOPEZ, I., RAMAMOORTHI, R., WHITE, S. & ZHANG, L. (2009) Searching the World's Herbaria: A System for Visual Identification of Plant Species.

CARD, S. K., MACKINLAY, J. D. & SHNEIDERMAN, B. (Eds.) (1999) Readings in information visualization: using vision to think, Morgan Kaufmann Publishers Inc.

FAN, F. & BIAGIONI, E. S. (2004) An Approach to Data Visualization and Interpretation for Sensor Networks. Proceedings of the 37th Annual Hawaii International Conference on System Sciences (HICSS'04) - Track 3 - Volume 3. IEEE Computer Society.

FEINER, S., MACINTYRE, B., HOLLERER, T. & WEBSTER, A. (1997) A Touring Machine: Prototyping 3D Mobile Augmented Reality Systems for Exploring the Urban Environment. Proceedings of the 1st IEEE International Symposium on Wearable Computers. IEEE Computer Society.

FEINER, S., MACINTYRE, B. & SELIGMANN, D. (1993) Knowledge-based augmented reality. Commun. ACM, vol. 36, pp. 53-62.

GERHARD, I., REITMAYR, G. & SCHMALSTIEG, D. (2004) Collaborative Augmented Reality for Outdoor Navigation and Information Browsing. In Proceedings of the Symposium on Location Based Services and TeleCartography.

GOLDSMITH, D., LIAROKAPIS, F., MALONE, G. & KEMP, J. (2008) Augmented Reality Environmental Monitoring Using Wireless Sensor Networks. Proceedings of the 2008 12th International Conference Information Visualisation. IEEE Computer Society.

GUNNARSSON, A.-S., RAUHALA, M., HENRYSSON, A. & YNNERMAN, A. (2006) Visualization of sensor data using mobile phone augmented reality. Proceedings of the 5th IEEE and ACM International Symposium on Mixed and Augmented Reality. IEEE Computer Society.

HEIDEMANN, G., BAX, I. & BEKEL, H. (2004) Multimodal interaction in an augmented reality scenario. Proceedings of the 6th international conference on Multimodal interfaces. State College, PA, USA, ACM.

HEINRICH, M., THOMAS, B. H., MUELLER, S. & SANDOR, C. (2008) An augmented reality weather system. Proceedings of the 2008 International Conference on Advances in Computer Entertainment Technology. Yokohama, Japan, ACM.

JUN, R. (1995) The Magnifying Glass Approach to Augmented Reality Systems. International Conference on Artificial Reality and Telexistence. Makuhari, Chiba, Japan.

KONG, R. (2009) Corrosion Sensor Programming Guide. Adelaide, UniSA.

MALKAWI, A. & CHOUDHARY, R. (1999) Visualizing the Sensed Environment in the Real World. Journal of the Human-Environment Systems, vol. 3, pp. 61-99.

MILGRAM, P., TAKEMURA, H., UTSUMI, A. & KISHINO, F. (1995) Augmented reality: a class of displays on the reality-virtuality continuum. IN DAS, H. (Ed.) 1 ed. Boston, MA, USA, SPIE.

NORTH, C. (2006) Toward Measuring Visualization Insight. IEEE Comput. Graph. Appl., vol. 26, pp. 6-9.

PIEKARSKI, W. (2009) Through-Walls Collaboration. IEEE Pervasive Computing, vol. 8, pp. 42-49.

PIEKARSKI, W. & THOMAS, B. H. (2003) Interactive augmented reality techniques for construction at a distance of 3D geometry. Proceedings of the workshop on Virtual environments 2003. Zurich, Switzerland, ACM.

PIEKARSKI, W., THOMAS, B. H. & KING, G. R. (2005) ARVino: outdoor augmented reality visualisation of viticulture GIS data. IEEE Computer Society.

RAD, H. N. & KHOSROWSHAHI, F. (1997) Visualisation of building maintenance through time. Information Visualization, 1997. Proceedings., 1997 IEEE Conference on.

RAUHALA, M., GUNNARSSON, A.-S. & HENRYSSON, A. (2006) A novel interface to sensor networks using handheld augmented reality. Proceedings of the 8th conference on Human-computer interaction with mobile devices and services. Helsinki, Finland, ACM.

SHAW, M. (2002) What makes good research in software engineering? International Journal on Software Tools for Technology Transfer (STTT), vol. 4, pp. 1-7.

SUTHERLAND, I. E. (1968) A head-mounted three dimensional display. Proceedings of the December 9-11, 1968, fall joint computer conference, part I. San Francisco, California, ACM.

THOMAS, B., CLOSE, B., DONOGHUE, J., SQUIRES, J., BONDI, P. D., MORRIS, M. & PIEKARSKI, W. (2000) ARQuake: An Outdoor/Indoor Augmented Reality First Person Application. Proceedings of the 4th IEEE International Symposium on Wearable Computers. IEEE Computer Society.

THOMAS, B., DEMCZUK, V., PIEKARSKI, W., HEPWORTH, D. & GUNTHER, B. (1998) A wearable computer system with augmented reality to support terrestrial navigation. 2nd International Symposium on Wearable Computers. Pittsburgh, Pennsylvania, IEEE.

WHITE, S. (2009a) Interaction with the Environment - Sensor Data Visualization in Outdoor Augmented Reality. International Symposium on Mixed and Augmented Reality. Orlando, Florida, Columbia University.

WHITE, S. & FEINER, S. (2009) SiteLens: situated visualization techniques for urban site visits. Proceedings of the 27th international conference on Human factors in computing systems. Boston, MA, USA, ACM.

WHITE, S., MOROZOV, P. & FEINER, S. (2007) Imaging for Insight: Site Visit by Situated Visualization. ACM Computer/Human Interaction. San Jose, California.

WHITE, S. M. (2009b) Interaction and Presentation Techniques for Situated Visualization. Department of Computer Science. New York, Columbia University.

XINYU, L. (2009) Ubiquitous Augmented Reality System. IN DONGYI, C. & SHIJI, X. (Eds.).

YUXI, H., DESHI, L., XUEQIN, H., TAO, S. & YANYAN, H. (2009) The Implementation of Wireless Sensor Network Visualization Platform Based on Wetland Monitoring. Intelligent Networks and Intelligent Systems, 2009. ICINIS '09. Second International Conference on.

ZHOU, F., DUH, H. B.-L. & BILLINGHURST, M. (2008) Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR. Proceedings of the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality. IEEE Computer Society.