Semantic knowledge-based framework to improve the situation awareness of autonomous underwater vehicles

Emilio Migueláñez, Member, IEEE, Pedro Patrón, Member, IEEE, Keith E. Brown, Member, IEEE, Yvan R. Petillot, Member, IEEE, and David M. Lane, Member, IEEE
Abstract—This paper proposes a semantic world model framework for hierarchical distributed representation of knowledge in autonomous underwater systems. This framework aims to provide a more capable and holistic system, involving semantic interoperability among all involved information sources. This enhances interoperability, independence of operation, and situation awareness of the embedded service-oriented agents for autonomous platforms. The results obtained specifically impact on mission flexibility, robustness and autonomy. The presented framework builds on the idea that heterogeneous real-world data of very different types must be processed by (and run through) several different layers to become available in a suitable format, at the right place, to high-level decision-making agents. In this sense, the presented approach shows how to abstract away from the raw real-world data step by step by means of semantic technologies. The paper concludes by demonstrating the benefits of the framework in a real scenario. A hardware fault is simulated in a REMUS 100 AUV while performing a mission. This triggers a knowledge exchange between the status monitoring agent and the adaptive mission planner embedded agent. By using the proposed framework, both services can interchange information while remaining domain independent during their interaction with the platform. The results of this paper are readily applicable to land and air robotics.

Index Terms—Autonomous vehicles, ontology design, model-based diagnostics, mission planning.

1 INTRODUCTION
1.1 Motivation
With the growing use of autonomous and semi-autonomous platforms and the increasing data flows in modern maritime operations, it is critical that data is handled efficiently across multiple platforms and domains. At present, knowledge representation is embryonic and targets simple mono-platform and mono-domain applications, therefore limiting the potential for multiple coordinated actions between agents. Consequently, the main application for autonomous underwater vehicles is information gathering from sensor data. In a standard mission flow, data is collected during the mission and then post-processed off-line.
However,as decision making technologies evolve towards
providing higher levels of autonomy,embedded service-
oriented agents require access to higher levels of data repre-
sentation.These higher levels of information will be required
to provide knowledge representation for contextual awareness,
temporal awareness and behavioural awareness.
Two sources can provide this type of information:the
domain knowledge extracted from the original expert or the
inferred knowledge from the processed sensor data.In both
cases,it will be necessary for the information to be stored,
accessed and shared efficiently by the deliberative agents
while performing a mission.These agents,providing different
• Authors are with the Ocean Systems Laboratory,Heriot-Watt University,
Edinburgh,Scotland,UK,EH14 4AS
E-mail:e.miguelanez,p.patron,k.e.brown,y.r.petillot,d.m.lane@hw.ac.uk
capabilities,might be distributed among the different platforms
working in collaboration.
1.2 Contribution
This article focuses on the study of a semantic world model
framework for hierarchical distributed representation of knowl-
edge in autonomous underwater systems.The framework uses
a pool of hierarchical ontologies for representation of the
knowledge extracted from the expert and the processed sensor
data.Its major advantage is that service-oriented agents can
gain access to the different levels of information and might
also contribute to the enrichment of the knowledge.If the
required information is unavailable,the framework provides
the facility for requesting that information be generated by
other agents with the necessary capabilities.
At the decision level, an autonomous planner generates mission plans based on the representation of the mission goals in this semantic framework. As a consequence, we claim that the proposed framework enhances interoperability, independence of operation, and situation awareness of the embedded service-oriented agents for autonomous platforms. The results obtained specifically impact on mission flexibility, robustness and autonomy.
To the best of our knowledge,this is the first time that an
approach to goal-based planning using semantic representation
is applied to the adaptation of an underwater mission in order
to maintain the operability of the platform.Furthermore,we
have demonstrated on a set of sea trials the advantages of this
approach in a real scenario.
Fig. 1. Human and AUV situation awareness levels across the levels of autonomy.
This paper is structured as follows:Section 2 describes our
evolved view of the sensing-decision-acting loop for autonomy
for Autonomous Underwater Vehicles (AUVs).Section 3 pro-
vides an overview of previous related work in knowledge rep-
resentation in robotics systems,fault detection and diagnosis,
and mission plan adaptation.Section 4 presents the interaction
of the diagnosis and planner agents using the framework.
Section 5 presents an overview of the role of ontologies as
knowledge representation,including the main features behind
this approach.Section 6 describes the framework focusing
on the semantic representation of the vehicle’s knowledge.
Section 7 describes the different modules involved in the test
case scenario,considering the horizontal flow of information.
Section 8 demonstrates the benefits of the proposed framework
in a real scenario,where a hardware fault is simulated in a
REMUS 100 AUV while performing a mission.This paper
ends in Section 9 with the conclusions and future work.
2 SITUATION AWARENESS FOR AUTONOMY
The human capability of dealing with and understanding highly dynamic and complex environments is known as Situation Awareness (SA_H). SA_H breaks down into three separate levels: perception of the environment, comprehension of the situation and projection of the future status.
According to Boyd, decision making occurs in a cycle of observe-orient-decide-act, known as the OODA loop [1]. The Observation component corresponds to the perception level of SA_H. The Orientation component contains the previously acquired knowledge and understanding of the situation. The Decision component represents the SA_H levels of comprehension and projection. This last stage is the central mechanism enabling adaptation before closing the loop with the final Action stage. Note that it is possible to take decisions by looking only at orientation inputs, without making any use of observations.
Based on the autonomy levels and environmental characteristics, SA_H definitions can be directly applied to the notion of unmanned vehicle situation awareness SA_V [2]. The levels of situation awareness for individual unmanned vehicle systems (SA_S) span from full human control to fully autonomous unmanned capabilities (see Fig. 1).
In current implementations,the human operator constitutes
the decision making phase.When high-bandwidth communi-
cation links exist,the operator remains in the loop during
the mission execution.Examples of the implementation of
this architecture are existing Remote Operated Underwater
Vehicles (ROVs).However,when the communication is poor,
unreliable or not allowed,the operator tries,based only on
the initial orientation or expertise,to include all possible
behaviours to cope with execution alternatives.As a conse-
quence,unexpected and unpredictable situations can cause the
mission to abort and might even cause the loss of the vehicle,
as happened with Autosub2 which was lost under the Fimbul
ice sheet in the Antarctic [3].Examples of this architecture
are current implementations for AUVs.
In order to achieve an autonomous decision making loop,
two additional components are required:a status monitor and a
mission plan adapter.The status monitor reports any changes
detected during the execution of a plan. These modifications might change the SA_V perception. When the mission executive
is unable to handle the changes detected by the status monitor,
the mission planner is called to generate a new modified
mission plan that agrees with the updated knowledge of
the world (see Fig. 2). The RAX architecture was the first attempt at implementing this type of architecture on a real system [4]. However, the tight time deadlines, more restricted communications and different environmental constraints existing in general AUV applications have led us to research a new implementation of this approach based on ontological knowledge-based situation awareness.
3 RELATED WORK
3.1 Knowledge Representation
Current knowledge representation approaches are only able to provide the perception or observation level of SA_V. State-of-the-art embedded agents make use of different message transfer protocols in order to stay updated with the current status of the platform and the environment. Several approaches can be found in the literature implementing information transfer protocols for robotics.
For example,robotic libraries such as Player [5] and Yet
Another Robotic Platform (YARP) [6],support transmission
of data across various protocols – TCP,UDP,MCAST (multi-
cast),shared memory.They provide an interface to the robot’s
sensors and actuators.This allows agents to read data from
sensors,to write commands to actuators,and to configure
devices on the fly.These two approaches separate the agents
from the details of the network technology used but they
do not provide any standardisation about the meaning of the
information transferred.
The Mission Oriented Operating Suite (MOOS) [7],[8] uses
human readable ASCII messages for communication of data to
Fig. 2. Required OODA loop for SA_S = SA_V. The decision stage for adaptation takes place on-board, based on observations from the status monitor.
Fig. 3. World model architecture for the BAUUV MoD program (Battlespace access for unmanned underwater systems). Its database handles data inputs and queries from internal and external clients.
a centralised database. This centralised topology is vulnerable to bottlenecking at the server, as the ASCII messages can generate considerable parsing overheads.
The OceanSHELL libraries [9] implement a UDP broadcast protocol to pass data between agents. In the latest version, abstraction messages are able to carry semantic information. These libraries are the first attempt to standardise semantic information for unmanned underwater platforms in order to share knowledge between different multidisciplinary agents. However, the approach is still limited to the observation or perception level of SA_V.
A more detailed review of these and other robotic knowledge representation approaches was published in [10].
As an extension of these libraries, the Battlespace Access for Unmanned Underwater Vehicles (BAUUV) program, sponsored by the UK Ministry of Defence Directorate of Equipment Capability (Underwater Battlespace), showed the benefit of also including the mental or orientation model component of SA_V [11]. A diagram describing the BAUUV dynamic multi-layered world model architecture is shown in Fig. 3. The approach was capable of integrating the capabilities of different software components and providing them with a common picture of the environment. This picture contained processed sensor data and a priori knowledge. Thus it provided a full SA_V. However, the design was limited to Autonomous Target Recognition (ATR) applications in the underwater domain.
In order to provide a truly service-oriented architecture, an evolution of this approach is required, providing a generic framework independent of service capability and domain of application.
The Joint Architecture for Unmanned Systems (JAUS), originally developed for the Unmanned Ground Vehicles domain only, has recently been extended to all domains, trying to provide a common set of architecture elements and concepts [12]. In order to handle the orientation and observation phases, it classifies four different sets of Knowledge Stores: Status, World map, Library and Log.
Our experience has shown that overlap exists between these different sets of knowledge stores. This overlap stems from the strong interconnection existing between the orientation and observation phases of SA_V. The approach proposed in this project makes use of the concepts proposed by JAUS but enhances the Knowledge Store set by providing higher flexibility in the way the information can be handled and accessed.
Looking at related work on applying ontologies, recent publications [13], [14], [15] show that there is a growing inclination to use semantic knowledge in several areas of robotic systems, playing a key role in improving the interoperability of the different robotic agents. From mapping and localization using semantically meaningful structures, to human-robot interaction trying to make the robot understand the human environment of words, gestures and expressions, it has become clear that there are many ways in which the technology behind ontologies may be applied in robotics. However, there is still a lack of a well-defined architecture which abstracts from low-level real-world information to higher-level information that is enriched by semantics to output enhanced diagnostic and mission planning information.
3.2 Fault Detection and Diagnosis (FDD)
In the field of diagnostics, the gathering and processing of knowledge in AUVs, as in most robotic systems, is classified into two categories: (i) model free and (ii) model based.
Model-free methods, such as rule-based ones, use limit checking of sensors for the detection of faults. Rule-based diagnosis is the most intuitive form of diagnosis, where, through a set of mathematical rules, observed parameters are assessed for conformance to the anticipated system condition. Knowledge gained is thus explicit, as rules are either satisfied or not. Rule-based reasoning is an easy concept to employ and, if kept simple, requires little development time, provided that expert tacit knowledge (system behaviour awareness) can be straightforwardly transformed into explicit knowledge (rules). However, these rules use knowledge gained from outside observation of the system rather than a representation of any internal mechanisms. In other words, they represent only the relationship between symptoms and failures, and cannot provide a coherent explanation of the failure. Furthermore, they exhibit a lack of flexibility, as only faults that have been explicitly described can be diagnosed. The main advantage of a rule-based system is that execution time is generally much faster than for other methods using more sophisticated models.
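As an illustration only (this sketch is not the system described in this paper, and the signal names and thresholds are invented for the example), the limit-checking style of rule-based diagnosis can be captured in a few lines of Python:

    # Minimal sketch of model-free, rule-based limit checking.
    # Signal names and thresholds are illustrative, not taken from a real AUV.
    def limit_check_rules(readings):
        """Return a list of symptom labels for readings that violate simple limits."""
        symptoms = []
        # Rule: sidescan gain below an expected operating floor suggests a transducer fault.
        if readings.get("sidescan_starboard_gain", 1.0) < 0.1:
            symptoms.append("low_gain_starboard_transducer")
        # Rule: motor current above a safe ceiling suggests excess shaft friction.
        if readings.get("drive_motor_current_amps", 0.0) > 8.0:
            symptoms.append("drive_motor_overcurrent")
        return symptoms

    print(limit_check_rules({"sidescan_starboard_gain": 0.02,
                             "drive_motor_current_amps": 3.1}))
    # -> ['low_gain_starboard_transducer']

Such rules are fast to evaluate but, as noted above, they only encode the symptom-failure relationships that the expert anticipated.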
Model-based diagnosis systems rely on the development of a model constructed from detailed in-depth knowledge (preferably from first principles) of the system. There is a wide range of models available for diagnosis, including mathematical, functional and abstract [16]. The fault detection and isolation (FDI) community has tackled the diagnostic task by comparing simulated results to real results and detecting abnormalities accordingly, based mainly on analytical redundancy. The main advantage of this approach is that it can be adapted easily to changes in the system environment by changing inputs to the model being simulated. However, the numerical models are based on the behaviour of the system, with little knowledge of the structure and functionality of the components. Also, there is no mechanism to detect multiple faults, and the approach requires expensive computation.
Currently, there is an increasing move away from model-based FDD towards structural and data-driven methods, because complex dynamic systems are difficult to model based on
analytical redundancy alone. Uppal and Patton argue that an interesting combination of certain aspects of qualitative and quantitative modelling strategies can be made [17]. They further state that qualitative methods alone should be used if faults cause qualitative changes in system performance and when qualitative information is sufficient for diagnosis. Qualitative methods are essential if measurements are rough or if the system cannot be described by differential equations with known parameters.
3.3 Mission Planning
Declarative mission plan adaptation for unmanned vehicles leverages the research carried out on autonomous plan generation [18]. Space operations, Mars rovers in particular, have been the impetus behind autonomous plan adaptability for unmanned vehicles.
The Remote Agent Experiment (RAX) [4],which flew on
Deep Space-1,first demonstrated the ability to autonomously
control an active spacecraft.This project highlighted the
problem of using a batch planner on a reactive system.It took
hours to replan after updates had invalidated the original plan.
Another system handling a certain level of planning adaptation in a real scenario was the Automated Scheduling and Planning Environment (ASPEN) [19]. This system classified constraint violations and attempted to repair each by performing a planning operation. However, the set of repair methods was predefined, and the type of conflict determined the set of possible repair methods for any given conflict. The system could not guarantee that it would explore all possible combinations of plan modifications or that it would not retry unhelpful modifications.
Van der Krogt later formalised two levels of handling events
in what was called executive repair and planning repair [20].
His approach combined unrefinement and refinement stages
in order to provide faster performance than planning from
scratch.However,this approach might fail to produce an
optimal plan.This could be considered an issue in domains re-
quiring optimality.This is not generally the case in unmanned
vehicle mission plans where optimality can be sacrificed for
operability.
The approach was compared with the GPG and Sherpa systems. In GPG [21], based on the Graphplan planner [22], the unrefinement stage is performed only on the initial plan and never on any of the plans produced by a refinement step. Sherpa [23], based on the LPA* algorithm that we previously used in [24] for adaptive trajectory planning, could only be applied to problems in which actions have been removed from the domain knowledge.
In the underwater domain, several challenges have been identified that require adaptive mission planning solutions [25], [26]. The potential benefits of adaptive planning capabilities have been promoted by [27]. Together with Fox and other authors, they have started using sensor data to adapt the decision making on the vehicle's mission [28], [29].
4 FAULT TOLERANT ADAPTIVE MISSION PLANNING
In recent years, the emphasis for increasing AUV operability has been on increasing AUV survivability by reducing the susceptibility and vulnerability of the platform [30]. Recent approaches in rules of collision [31] and wave propagation techniques for obstacle avoidance [24], collision avoidance and escape scenarios [32] have focused on reducing susceptibility by looking at the adaptation of the vehicle's trajectory plan. However, when the vehicle faces unforeseen events, such as unexpected component failures or unplanned interactions with the surrounding environment, the focus of the mission should shift to reconfiguring the vehicle to use alternative combinations of the remaining resources. The underwater domain has scarce bandwidth and tight response constraints that make it hard to keep the operator in the loop. In such a challenging environment, autonomous embedded recoverability is a key capability for the vehicle's endurance. This can be achieved via adaptation of the vehicle's mission plan.
Adapting a mission on the fly in response to events is feasible with embedded planners. However, they are limited by the quality and scope of the available information. For them to be effective, the mission programmer must predict all possible situations, which is clearly impractical. Therefore, to adapt mission plans due to unforeseen and incipient faults, accurate information must be available to recognize that a fault has occurred and to identify the root cause of the failure. For example, if a propeller drive shaft developed too much shaft friction, the added current load might overburden the drive motor. Identification of such a condition before the shaft bearing fails would allow rectification of the mission before a major breakdown occurs.
AUVs are generally equipped with FDD systems based on damage control, which result in the vehicle resurfacing in the event of any fault in the system. However, future industrial and military AUVs may require systems that operate even while partially damaged. Hence, it is important to develop a system which not only detects failure in the underwater vehicle but also provides meaningful and reliable information to counterpart modules, such as the mission planner, to adapt and recover from the failures.
4.1 Mission Plan Adaptation
Following the classical planning problem representation, an instance of a vehicle mission problem can simply be defined as Π = {P, A, I, G}, where P is the set of propositions defining the available resources in the vehicle, A is the set of actions, I is the initial platform state and G is the set of possible mission-accomplished states. D = P ∪ A defines the mission domain and P = I ∪ G the mission problem. Given an instance Π, the mission generation problem consists in finding whether there exists a mission plan (mp), using actions a_i ∈ A, that satisfies some g ∈ G.
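As a minimal illustration of this formulation (the proposition and action names below are invented for the example and the data structures are not those of the implemented planner), a mission problem instance can be held in a small Python structure:

    # Sketch of a mission problem instance PI = {P, A, I, G}; names are illustrative.
    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class Action:
        name: str
        preconditions: frozenset   # propositions that must hold before execution
        postconditions: frozenset  # propositions that hold after execution

    @dataclass
    class MissionProblem:
        propositions: set                 # P: available resources / facts
        actions: list                     # A: the set of actions
        initial_state: frozenset          # I: initial platform state
        goal_states: list = field(default_factory=list)  # G: mission-accomplished states

    survey = Action(
        name="survey_sarea1",
        preconditions=frozenset({"at(remus, sarea1)", "on(remus, sidescan, OK)"}),
        postconditions=frozenset({"surveyed(sidescan, sarea1)"}),
    )

    problem = MissionProblem(
        propositions={"at(remus, wpStart)", "on(remus, sidescan, OK)"},
        actions=[survey],
        initial_state=frozenset({"at(remus, wpStart)"}),
        goal_states=[frozenset({"surveyed(sidescan, sarea1)"})],
    )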
Several approaches exist in the artificial intelligence (AI) literature capable of solving this problem. In a real environment where optimality can be sacrificed for operability, partial-order planning (pp) is seen as a suitable approach
because it produces a flexible structure capable of being adapted (see Fig. 4). The implemented approach can deal with extensions to the classical representation. It can handle durative actions, fluents, functions and different search metrics in order to minimise resource consumption, such as remaining vehicle battery or total distance travelled.
Figure 5 shows the difference between replanning and repairing a mission plan. At the initial loop, a partial-order plan pp_0 is generated satisfying the given mission domain and problem Π_0. The pp_0 is then grounded into the minimal mission plan mp_0 including all constraints in pp_0. At an iteration i, the knowledge base is updated by the diagnosis information d_i, providing a modified mission domain and problem Π′_{i+1}. The mission replan process generates a new partial plan pp′_{i+1}, as done at the first stage, based only on the Π′_{i+1} information. On the other hand, the mission plan repair process re-validates the original plan while ensuring minimal perturbation of it. Given an mp for Π_i, the mission repair problem produces a solution mission plan mp′ that solves the updated mission problem Π′_{i+1} by minimally modifying mp.
When a mission failure occurs during execution, two possible repair levels can be identified: mission execution repair and mission plan repair. Execution repair changes the instantiation mp such that either an action a_i that was previously instantiated by some execution α is no longer instantiated, or an action a_i that was previously instantiated is newly bound to an execution α already part of mp. Plan repair modifies the partial plan pp itself, so that it uses a different composition, though it still uses some of the same constraints between actions. It might also entail the elimination of actions which have already been instantiated.
Executive repair will be less expensive, and it is expected to be executed by the mission executive module. Plan repair, however, will be computationally more expensive and requires the action of the mission planner. The objective is to maximise the number of execution repairs over plan repairs and, at the plan repair level, to maximise the number of decisions reused from the old mission.
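The preference for the cheaper repair level can be expressed as a simple dispatch policy. The following sketch only illustrates that policy under the assumption of hypothetical executive and planner interfaces; it is not the implemented decision logic:

    # Sketch of the repair preference described above: try the cheap executive
    # repair first, fall back to plan repair, and replan only as a last resort.
    # All method names are hypothetical placeholders.
    def handle_failure(mission_plan, executive, planner, diagnosis):
        # 1. Cheapest: rebind or drop action executions without touching the plan.
        if executive.try_execution_repair(mission_plan, diagnosis):
            return mission_plan
        # 2. More expensive: unrefine/refine the partial plan, reusing old decisions.
        repaired = planner.repair(mission_plan, diagnosis)
        if repaired is not None:
            return repaired
        # 3. Last resort: generate a new plan from the updated domain and problem.
        return planner.replan(diagnosis.updated_domain, diagnosis.updated_problem)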
4.2 Mission Plan Refinement
A practical approach following the previously described concepts has recently raised interest in the AI community, providing a novel solution to the drawbacks identified during the replanning process. This set of methods is known as plan recovery. Plan recovery methods are based on plan-space searches and are able to adapt the existing plan to the new state of the world. They can be divided into two stages.
The first stage, known as plan diagnosis, analyses the effects of the updated platform status on the current mission. According to the new updated constraints received from the status monitor, it identifies the failures and gaps existing in the current mission plan. These plan gaps are causing the inconsistency between the existing plan and the current status of the platform and the environment. They are, therefore, preventing the correct execution of the mission. The approach developed at this stage is based on unrefinement planning
Fig. 4. Example of a partial-order plan representation of an autonomously generated AUV mission, for a scenario similar to the one described in the demonstration in Section 8. Ordering constraints are represented using the graph depth, interval preservation constraints with black arrows, point truth constraints with PTC-labelled arrows, and binding constraints in the top left box.
Fig. 5. Replan and repair processes for mission plan adaptation.
strategies and uses the potential of knowledge reasoning in order to identify the resources that remain available.
The second stage is known as plan repair. The strategy during this stage is to repair, with new partial plans, the gaps or failures identified during the plan diagnosis stage. The plan repair stage is based on refinement planning strategies for plan recovery.
In simple terms, when changes in the instances of the planning ontology are sensed (d) that affect the consistency of the current mission plan pp_i, the plan adaptability process is initiated. Based on the outputs of the planning reasoner, the plan diagnosis stage starts an unrefinement process that relaxes the constraints in the mission plan that are causing the mission plan to fail. The remaining temporal mission partial plan pp_t is now relaxed to be able to cope with the newly sensed constraints. This is the simplest stage of recovery necessary to continue with the execution of the plan, but it does not guarantee that all the mission goals will be achieved. The plan repair stage then executes a refinement process searching for a new mission plan pp′_{i+1} that is consistent with the new world status D′ and P′. By doing this, it can be seen that
Fig. 6. Knowledge representation system.
the new mission plan mp′ is not generated again from D′ and P′ (re-planned) but recycled from pp_i (repaired). This allows re-use of the parts of the plan pp_i that were still consistent with D′ and P′.
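The two stages can be summarised by the following Python sketch, in which a plan is reduced to a set of constraint strings purely for illustration (a real partial-order plan holds actions, orderings and bindings, and the refinement step is a plan-space search rather than a supplied function):

    # Sketch of plan recovery: unrefinement (plan diagnosis) followed by
    # refinement (plan repair). Names are borrowed from the demonstration
    # scenario for readability only.
    def recover_plan(pp_i, invalidated, refine):
        """Return pp_{i+1}: drop invalidated constraints, then refine the relaxed plan."""
        # Plan diagnosis / unrefinement: remove only what the diagnosis invalidated,
        # keeping as much of the original plan as possible.
        pp_t = {c for c in pp_i if c not in invalidated}
        # Plan repair / refinement: search for additions that restore consistency
        # with the updated domain D' and problem P'.
        return refine(pp_t)

    # Toy usage: the sidescan survey is invalidated, so the refinement step
    # substitutes a camera-based reacquisition action.
    pp_next = recover_plan(
        pp_i={"navigate(sarea1)", "survey(sidescan, sarea1)"},
        invalidated={"survey(sidescan, sarea1)"},
        refine=lambda pp: pp | {"reacquire(camera, rarea2)"},
    )
    print(pp_next)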
5 OVERVIEW OF THE ROLE OF ONTOLOGIES AS KNOWLEDGE REPRESENTATION
Many definitions and descriptions of ontologies can be found in the literature. From a philosophical point of view, the term ontology has been defined as the study of, or concern about, what kinds of things exist, that is, what entities or things there are in the universe [33]. From a practical point of view, ontologies are viewed as the working model of entities and interactions, either generically (e.g. the SUMO ontology) or in some particular domain of knowledge or practice, such as predictive maintenance or subsea operations.
A definition given by Gruber characterizes an ontology as the specification of conceptualisations, which is used to help programs and humans share knowledge [34]. The combination of these two terms embraces the knowledge about the world in terms of entities (things, the relationships they hold and the constraints between them), together with its representation in a concrete form. One step in this specification is the encoding of the conceptualisation in a knowledge representation language. Ultimately, the main goal behind the concept of ontology is to create an agreed-upon vocabulary and semantic structure for exchanging information about a given domain.
The main components of an ontology are concepts, axioms, instances and relationships. A concept represents a set or class of entities or things within a domain. A fault is an example of a concept within the domain of diagnostics. A finite set of definitions is called the TBox, which also includes the set of terminological axioms for every atomic concept. Axioms are used to constrain the range and domain of the concepts, for example: a father is a man that has a child.
Instances are the things represented by a concept; for example, FaultySensorXYZ is an instance of the concept Fault. The combination of an ontology with associated instances is what is known as a knowledge base (see Fig. 6). However, deciding whether something is a concept or an instance is difficult, and often depends on the application [35]. This finite set of assertions about individuals is called the ABox.
In the ABox,besides the concept assertions,one can also
specify role assertions or,in other words,the relations de-
scribing the interactions between individuals.For example,the
property isComponentOf might link the individual SensorX to
the individual PlatformY.
Following the definition and characterization of ontologies,
one of the main objectives for an ontology is that it should
be reliable and re-usable [36].However,an ontology is only
reusable when it is to be used for the same purpose for which
it was developed.Not all ontologies have the same intended
purpose and may have parts that are re-usable and other parts
that are not.They will also vary in their coverage and level
of detail.
Furthermore, one of the benefits of the ontology approach is the extended querying that it provides, even across heterogeneous data systems. The meta-knowledge within an ontology can assist an intelligent search engine with processing a query. Part of this intelligent processing is due to the reasoning capability, which makes possible the publication of machine-understandable meta-data, opening opportunities for automated information processing and analysis. For instance, a diagnostic system, using an ontology of the system, could automatically suggest the location of a fault in relation to the occurrence of symptoms and alarms in the system. The system may not even have a specific sensor in that location, and the fault may not even be categorized in a fault tree. The reasoning interactions with the ontology are provided by the reasoner, which is an application that enables the domain's logic to be specified with respect to the context model and executed against the corresponding knowledge, i.e. the instances of the model. A detailed description of how the reasoner works is outside the scope of this paper; its use is illustrated in the test scenario described later.
6 FRAMEWORK
SA_V consists in making the vehicle autonomously understand the big picture. This picture is composed of the experience acquired from previous missions (orientation) and the information obtained from the sensors while on mission (observation). The TBox and ABox that have already been introduced as the main components of any knowledge base can be assigned to the orientation and observation components of SA_V respectively. For each knowledge representation, its TBox-ABox pair will not only describe the relationships between concepts but also facilitate the decision-making process of the service-oriented agents. Reasoning capabilities allow concept consistency checks to ensure that SA_V remains stable throughout the evolution of the mission. Also, inference of concepts and relationships allows new knowledge to be extracted or inferred from the observed data.
In addition, a set of ontologies has been developed in order to represent the knowledge required for SA_V. A key part of the ontology engineering discipline is the construction and organization of these libraries of ontologies, which should be designed for maximum reusability [34], [37]. A major challenge in building these libraries is to define how these ontologies are constructed and organized, and what the relations between them should be. Existing approaches propose three discrete levels of vertical segmentation: (1) upper/foundation, (2) core/domain and (3) application (see Fig. 7).
Fig. 7. Levels of generality of a domain ontology.
The top layer in Fig. 7 refers to the foundational ontologies (FOs), or upper ontologies, which represent the most basic principles and meet the practical need for a model that has as much generality as possible, to ensure reusability across different domains. There are several standardized upper ontologies available for use, including Dublin Core, GFO, OpenCyc/ResearchCyc, SUMO, and DOLCE.
The second level of the ontology hierarchy represents the core domain ontology, which is arguably another of the building blocks of information integration. The goal of a core ontology is to provide a global and extensible model into which data originating from distinct sources can be mapped and integrated. This canonical form can then provide a single knowledge base for cross-domain tools and services (e.g. vehicle resource/capabilities discovery, vehicle physical breakdown, and vehicle status). A single model avoids the inevitable combinatorial explosion and application complexities that result from pair-wise mappings between individual metadata formats and/or ontologies.
At the bottom of the stack, an application ontology provides an underlying formal model for tools that integrate source data and perform a variety of extended functions. Here, application concepts are handled at the executive layer and are used to ground the abstract concepts managed by the software agents running in the system. As such, higher levels of complexity are tolerable, and the design should be motivated more by completeness and logical correctness than by human comprehension. In this paper, the target areas of these application ontologies are the status monitoring system of the vehicle and the planning of the mission, which, by making use of the proposed framework, allow the transition from the deliberative phase to the action phase of the OODA loop.
In the work described in this paper, ontologies are defined in the Web Ontology Language (OWL) [38]. The OWL language is a specialisation of the Resource Description Framework (RDF) standard [39], and therefore a specialisation of the XML standard. These standards enable the representation of information in a structured format where an abstraction model, known as an ontology, is combined with instance data in a repository.
6.1 Foundation and Core Ontology
To lay the foundation for the knowledge representation of unmanned vehicles, JAUS concepts have been considered. However, the core ontology developed in this work extends these concepts while remaining focused on the domain of unmanned systems. Some of the knowledge concepts identified as related to this domain are:
Fig. 8. SA_V concept representation (Core and Application ontologies), instance generation (adapter) and handling (reasoning, inference and decision-making agent).
Fig. 9. Snapshot of the Core Ontology representing the environment around the concept Platform.
- Platform: static or mobile (ground, air, underwater vehicles, ...),
- Payload: hardware with particular properties, sensors or modules,
- Module: software with specific capabilities,
- Sensor: a device that receives and responds to a signal or stimulus,
- Driver: module for interaction with a specific sensor/actuator,
- Waypoint: position in space with coordinate and tolerance,
- Coordinate: local frame, global frame, angular, ...
- Velocity: linear, angular, ...
- Attitude: roll, pitch, yaw, ...
Figure 9 represents a snapshot of the Core Ontology showing the key concepts involved in the test case scenario and the relationships associated with them. In this figure, and in all other ontology representations presented in this paper, concepts and the relationships between them have been depicted using the expressive nature of OWL, which leads to a much stronger specification for AUV integration. Essentially, the OWL language specifies that all information is represented in the form of a simple triple, where each component of that triple is represented by a Uniform Resource Identifier (URI). The URI is a unique identifier for the data that is held in a data store, and a triple represents a unit of information through the combination of a subject, a predicate and an
object. The representation of the triple with the associated URI reference can be observed in Fig. 9, as well as in all ontology representations throughout this paper, such as:

core:Software core:hasPlatform core:Platform
core:Software core:hasCapability core:Capability
core:Platform core:hasCapability core:Capability
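As an aside (not part of the original system), such triples can be asserted and inspected with a standard RDF library such as rdflib; the namespace URI below is a placeholder rather than the project's actual ontology URI:

    # Sketch of asserting the core-ontology triples listed above with rdflib.
    from rdflib import Graph, Namespace

    CORE = Namespace("http://example.org/core#")  # placeholder namespace

    g = Graph()
    g.bind("core", CORE)
    # Schema-level triples relating software, platforms and capabilities.
    g.add((CORE.Software, CORE.hasPlatform, CORE.Platform))
    g.add((CORE.Software, CORE.hasCapability, CORE.Capability))
    g.add((CORE.Platform, CORE.hasCapability, CORE.Capability))

    for subject, predicate, obj in g:
        print(subject, predicate, obj)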
To support generic context-aware concepts, this framework makes use of the SOUPA (Standard Ontology for Ubiquitous and Pervasive Applications) ontology [40], which is the core of the Context Broker Architecture (CoBrA) [41], a system for supporting context-aware computing. The contribution of the SOUPA ontology is its spatio-temporal representation, which allows the representation of time instants, intervals and temporal relations, as well as of space, enhanced by high-level concepts such as movable spatial thing, geographical entity or geometry.
6.2 Application Ontology
Each service-oriented agent has its own Application Ontology.
It represents the agent’s awareness of the situation by including
concepts that are specific to the expertise or service provided
by the SOA.
In the case study presented in this paper, these agents are the status monitor and the mission planner. Together, they provide the status monitor and mission adapter components described in Fig. 2, which are required for closing the OODA loop and providing on-board decision-making adaptation.
6.2.1 Status Monitor Application Ontology
The Status Monitor Agent considers all symptoms and ob-
servations from environmental and internal data in order to
identify and classify events according to their priority and
their nature (critical or incipient).Based on internal events and
context information,this agent is able to infer new knowledge
about the current condition of the vehicle with regard to
the availability for operation of its components (i.e.status).
In a similar way,environmental data is also considered for
detecting and classifying external events in order to keep the
situation awareness of the vehicle updated.
The Status Monitoring Application Ontology is used to express the SA_V of the status monitor agent. Recent developments in defining ontologies as a knowledge representation approach for a domain provide significant potential in model design, able to encapsulate the essence of the diagnostic semantics into concepts and to describe the key relationships between the components of the system being diagnosed.
To model the behaviour of all components and subsystems, considering everything from sensor data to possible model outputs, the Status Monitoring Application Ontology is designed and built based on ontology design patterns [42]. Ontology patterns facilitate the construction of the ontology and promote re-use and consistency when applied to different environments. In this work, the representation of the monitoring concepts is based on a system observation design pattern, which is shown in Fig. 10. Some of the most important concepts identified for status monitoring are:
• Data: all internal and external variables (gain levels, water current speed),
• Observation: patterns of data (sequences, outliers, residuals, ...),
• Symptom: individuals related to interesting patterns of observations (e.g., low gain levels, high average speed),
• Event: a series of correlated symptoms (low power consumption, position drift). Two subclasses of Event are defined: CriticalEvent for high-priority events and IncipientEvent for the remaining ones,
• Status: links the latest and most updated event information to the systems being monitored (e.g. sidescan transducer).
Please note how some of these concepts are related to concepts of the Core Ontology (e.g. an observation comes from a sensor). These Core Ontology elements are the enablers for the knowledge exchange between service-oriented agents. This will be shown in the demonstration scenario of Section 8.
Note that this knowledge representation of diagnostic concepts is linked to concepts already described in the core ontology, such as the status of the system, which is the key component in the handshaking of information between the status monitoring and mission planner agents.
6.2.2 Mission Planning Ontology
On the mission planning side,knowledge modelling is im-
plemented by using a language representation.This language
is then used to express the input to the planner.Language
vocabularies generally include the information concepts and
the grammars are used for describing the relationships and
constraints between these concepts.
The STRIPS language from [43] is generally acknowledged
as the basis for classical planning.Basic concepts in this
language are:an initial state,a goal state (or set of goal states)
and a set of actions which the planner can perform.Each action
consists of a set of preconditions,which must be true for the
action to be performed,and a set of postconditions,which will
be true after the action has been completed.A classical planner
then normally attempts to apply actions whose postconditions
satisfy the goal,recursively applying other actions to meet the
preconditions until a complete plan (if available),which can be
executed from the initial state and ends at the goal,is formed.
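The planning loop sketched in that description can be illustrated, under the simplifying assumption of add-only effects (STRIPS delete lists are omitted), by a naive forward state-space search; this is only an illustration and not the partial-order planner used in this work:

    # Naive breadth-first forward search over STRIPS-style actions.
    from collections import deque, namedtuple

    # Action with add-only effects; delete lists are omitted for brevity.
    Action = namedtuple("Action", ["name", "preconditions", "postconditions"])

    def forward_plan(initial_state, goal, actions):
        """Return a list of action names reaching `goal` from `initial_state`, or None."""
        start = frozenset(initial_state)
        frontier = deque([(start, [])])
        visited = {start}
        while frontier:
            state, plan = frontier.popleft()
            if goal <= state:                      # every goal proposition holds
                return plan
            for act in actions:
                if act.preconditions <= state:     # action is applicable
                    new_state = frozenset(state | act.postconditions)
                    if new_state not in visited:
                        visited.add(new_state)
                        frontier.append((new_state, plan + [act.name]))
        return None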
Fig. 10. Representation of the System Observation Pattern, relating diag:Fault, diag:Symptom, diag:Observation, core:Sensor, core:Status and core:ObservationData through properties such as diag:causedBySymptom, diag:hasObservation, diag:hasObservationData and core:isStatusOf.
The PDDL language was originally created by [44] and stands for Planning Domain Definition Language. It is the planning language used by the planning community during the biennial International Planning Competition that takes place during the ICAPS conference [45]. It can be considered as an extension of the original STRIPS language with extra functionality added. PDDL intends to express the physics of a planning domain, that is, what predicates there are, what actions are possible, what the structure of compound actions is and what the effects of actions are. In its latest version, it contains extensions for dealing with extended goals, where good-quality plans are as valid as optimal plans. It is a very complex language with complex syntactic features, such as the specification of safety constraints or hierarchical actions. State-of-the-art plan generators are not able to handle the entire set of PDDL language features. In consequence, several versions of the language have come out describing subsets of features, called requirements, that are extended every two years when the competition takes place.
The adaptive decision-making process requires concepts for generating a mission following a declarative goal-based approach instead of a classic procedural waypoint-based description. This can be achieved by looking at the concepts described by PDDL. However, in order to provide a solution to a mission failure, it also requires concepts capable of representing incidents or problems occurring during the mission. These concepts have been extracted from [20]. Some of the most important concepts identified for mission plan adaptability are:
- Resource: state of an object (physical or abstract) in the environment (vehicle, position, release, ...),
- Action: a collection of states of resources; state changes (calibrate, classify, explore, ...),
- Catalyst resource: resources that are not consumed by an action but are needed for its proper execution (sensor activation, ...),
- Plan gap: actions that may no longer be applicable; at least two ways of achieving a subgoal exist but a commitment has not yet been taken,
- Gap: a non-executable action,
- Execution: when an action is executed successfully,
- Mission failure: an unsuccessful execution.
The representation of the planning concepts related to the
mission plan and mission actions is shown in Fig.11.
Fig. 11. Representation of the mission plan description.
Fig. 12. Modules and interconnection messages implementing the three-tier architecture of the system.
Note that this knowledge representation of planning concepts is linked to concepts already described in the core ontology, such as the list of capabilities required to perform each of the mission actions.
7 SYSTEM ARCHITECTURE
As described in Section 2, the system implements the four stages of the OODA loop (see Fig. 12). The status monitor reports changes in the environment and the internal status of the platform to the world model. The world model stores the ontology-based knowledge provided by the a priori expert orientation and the observation of events received from the status monitor. A mission planner module generates and adapts mission plans based on the notifications and capabilities reported by the mission executive and world model. The mission executive module is in charge of executing the mission commands in the functional layer based on the actions received from the mission planner. In order to make the architecture independent of the vehicle's functional layer, an Abstract Layer Interface (ALI) has been developed.
The translation from the mission planner to the mission
execution is done via a sequence of instances of action
executions.An action execution is described by the domain
ontology TBox and gets instantiated by the action grounded
on mp.The instance contains the script of commands required
to perform the action.The action execution contains a timer,
an execution counter,a timeout register and a register of
the maximum number of executions.The success,failure or
timeout outputs control the robust execution of the mission
and the executive repair process.
Once mp is obtained and the list of α_i is generated for mp, the mission plan is transformed into a state machine of action execution instances. An action execution graph is generated that contains all the possible options for the new plan. This deals with the robustness of the execution and the execution repair process. It minimises the number of calls to the mission planner and therefore the response time for adaptation.
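A sketch of such an action execution record is given below; the field and method names are illustrative only and do not correspond to the actual implementation:

    # Sketch of an action execution instance: a script of commands plus a timer,
    # execution counter, timeout and retry limit whose outcome (success, failure
    # or timeout) drives the executive repair process.
    import time

    class ActionExecution:
        def __init__(self, name, commands, timeout_s, max_executions):
            self.name = name
            self.commands = commands            # script of vehicle commands
            self.timeout_s = timeout_s          # timeout register
            self.max_executions = max_executions
            self.execution_count = 0            # execution counter

        def run(self, send_command):
            """Execute the command script once; return 'success', 'failure' or 'timeout'."""
            self.execution_count += 1
            if self.execution_count > self.max_executions:
                return "failure"
            start = time.monotonic()            # timer
            for cmd in self.commands:
                if time.monotonic() - start > self.timeout_s:
                    return "timeout"
                if not send_command(cmd):
                    return "failure"
            return "success"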
8 TEST CASE SCENARIO
This section demonstrates the benefits of the approach within a real mine countermeasures (MCM) application. The demonstration shows the benefits of using an ontological
Fig. 13. Instances in the knowledge base representing the demonstration scenario. The diagram represents the main platform, its components and their capabilities.
representation to describe the SA_V and how the status monitoring and adaptive planner agents are capable of interchanging knowledge in order to dynamically adapt the mission to the changes occurring in the platform and the environment.
In this scenario, AUVs support and provide solutions for mine-hunting and neutralisation. The operation involves high levels of uncertainty and risk of damage to the vehicle. Navigating in such a hazardous environment is likely to compromise the platform. Additionally, if a vehicle is damaged or some of its components fail, mission adaptation will be required to cope with the newly restricted capabilities.
The performance of the system has been evaluated on a REMUS 100 AUV platform in a set of integrated in-water field trial demonstration days at Loch Earn, Scotland (56°23.1′N, 4°12.0′W). A PC/104 1.4 GHz payload running Linux has been installed in the vehicle. The payload system is capable of communicating with the vehicle's control module and taking control of it via the manufacturer's Remote Control protocol.
Figure 14 shows the original mission plan waypoints as described by the operator following the vehicle's procedural waypoint-based mission description specification. The mission plan consisted of a starting waypoint and two waypoints describing a simple North-to-South pattern at an approximately constant longitude (4°16.2′W). The mission leg was approximately 250 meters long and was followed by a loiter recovery waypoint. This mission plan was loaded into the vehicle control module. The track obtained after executing this mission using the vehicle control module is shown in Fig. 14 with a dark line. A small adjustment of the vehicle's location can be observed on the top trajectory after the aided navigation module corrects its solution using the signals received from the Long Baseline (LBL) transponders previously deployed in the area of operations.
On the other hand, the adaptive system in the payload computer was oriented by the operator using a-priori information about the area of operation and a declarative description of the mission plan. All this knowledge was represented using concepts from the core ontology and the planning ontology respectively. The original scenario is displayed in Fig. 13. Additionally, the a-priori information about the area was provided based on automatic computer-aided classification knowledge generated from previously existing data [46]. These areas are shown in Fig. 14. For the goal-oriented mission plan, the requirements of the mission were specified using the planning ontology in a way that can be summarised as: survey all known areas. The benefits of the approach and its performance in this scenario are described in the following sections.
8.1 Pre-mission Reasoning
One of the first contributions of the approach is that it allows some important questions to be answered automatically before starting the mission:
• Is this platform configuration suitable to successfully perform this mission?
In order to answer this question, new knowledge can be inferred from the original information provided by the operator. The core ontology rule engine is executed, providing
Fig. 14. Procedural mission uploaded to the vehicle control module and a-priori seabed classification information stored in the knowledge base.
additional knowledge. A set of predefined rules helps orient the knowledge base towards inferring new relationships between instances. An example of a rule dealing with the transfer of payload capabilities to the platform is represented in Eq. 1.

core:isCapabilityOf(?Capability, ?Payload) ∧ core:isPayloadOf(?Payload, ?Platform)
→ core:isCapabilityOf(?Capability, ?Platform)    (1)
Once all the possible knowledge has been extracted,it is
possible to query the knowledge base in order to extract the
list of capabilities of the platform (see Eq.2) and the list of
requirements of the mission (see Eq.3).
SELECT ?Platform ?Cap
WHERE { rdf:type(?Platform, core:Platform) ∧
        core:hasCapability(?Platform, ?Cap) }
(2)

SELECT ?Mission ?Req
WHERE { plan:hasAction(?Mission, ?Action) ∧
        plan:hasRequirement(?Action, ?Req) }
(3)
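In concrete SPARQL syntax, the queries of Eqs. 2 and 3 could be issued as follows; this is again only a sketch with placeholder prefixes, and it assumes the same kind of rdflib graph as the previous snippet:

# Hypothetical sketch: Eqs. 2 and 3 as standard SPARQL SELECT queries.
from rdflib import Graph

g = Graph()
g.parse("platform_kb.owl", format="xml")   # assumed knowledge-base file

PREFIXES = """
    PREFIX rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
    PREFIX core: <http://example.org/core#>
    PREFIX plan: <http://example.org/plan#>
"""

capabilities = g.query(PREFIXES + """
    SELECT ?platform ?cap
    WHERE { ?platform rdf:type core:Platform .
            ?platform core:hasCapability ?cap . }
""")

requirements = g.query(PREFIXES + """
    SELECT ?mission ?req
    WHERE { ?mission plan:hasAction ?action .
            ?action plan:hasRequirement ?req . }
""")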
This way, it is possible to autonomously extract that the requirements of the mission are:
• core:WaypointManeuverCapability ∈ jaus:ManeuverCapability
• core:ComputerAidedClassificationCapability ∈ jaus:AutonomousRSTA-ICapability
• core:ComputerAidedDetectionCapability ∈ jaus:AutonomousRSTA-ICapability
• core:SidescanSensorCapability ∈ jaus:EnvironmentalSensingCapability
which are a subset of the platform capabilities. Therefore, for this particular case, the platform configuration suits the mission requirements.
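In procedural terms, this suitability check amounts to a set-containment test over the results of Eqs. 2 and 3. The helper below is a hypothetical illustration of that test, not the authors' implementation:

# Hypothetical sketch: pre-mission suitability check as set containment.
def configuration_suits_mission(capabilities, requirements):
    """Return True if every mission requirement appears among the platform capabilities.

    `capabilities` and `requirements` are iterables of (subject, value) pairs,
    e.g. the rdflib result rows of Eqs. 2 and 3 in the previous sketch.
    """
    platform_caps = {cap for _platform, cap in capabilities}
    mission_reqs = {req for _mission, req in requirements}
    return mission_reqs <= platform_caps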
8.2 In Mission Adaptation
The payload system is given a location at which to take control of the host vehicle. At this point the mission planner agent generates the mission based on the knowledge available, including the mission requirements. The instantiation of the mission with the list of actions to be executed is described in Fig. 15 using the planning ontology elements. The mission is then passed to the executive agent, which takes control of the vehicle for its execution.
While the mission is being executed, the status monitor keeps the knowledge stored in the world model up to date. In this example, the status monitor reports the status of hardware components, such as batteries and sensors, and of external parameters, such as water currents.
When the executive is about to execute the next action in the mission plan, it looks at the knowledge stored in the world model in order to generate the appropriate list of commands for the vehicle. In this case, the list of waypoints generated for the lawnmower-pattern survey of the areas depends on the water current measured at the moment the survey is initiated. Under this framework, information and knowledge can be transferred between the status monitoring system and the adaptive mission planner while on mission. When the observations coming from the environment monitored by the status monitoring agent indicate that the mission under execution is affected, the mission planner is activated and the mission plan is adapted. The transfer of knowledge between agents is made possible by answering the following question:
• Are the observations coming from the environment affecting the mission currently under execution?
In order to explain the reasoning process involved, a component fault was simulated in the vehicle while performing the mission. In this case, the gains of the starboard transducer of the vehicle's sidescan sonar were modelled to drop to their minimum levels halfway through the execution of the seabed survey action.
The first step is to deal with the information coming from the sensor, which signals a low gain in the starboard transducer. This signal triggers a symptom instance, which has an associated event level. This event level, represented in the diagnostic ontology using a value partition pattern, plays a key role in classifying instances of the Fault concept as either critical or incipient. This classification is represented axiomatically in Eqs. 4 and 5.
diag:CriticalFault ≡ diag:Fault ⊓ ∃ diag:causedBySymptom.
    (diag:Symptom ⊓ ∃ diag:hasEventLevel.(diag:Level ⊓ diag:High))
(4)

diag:IncipientFault ≡ diag:Fault ⊓ ∃ diag:causedBySymptom.
    (diag:Symptom ⊓ ∃ diag:hasEventLevel.(diag:Level ⊓ diag:Med))
(5)
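Because Eqs. 4 and 5 are ordinary OWL class definitions, a DL reasoner can re-classify fault individuals automatically as symptoms arrive. A minimal sketch of such definitions, assuming an owlready2 encoding of the diagnostic ontology (the paper does not state which tooling it uses), is:

# Hypothetical sketch: Eqs. 4 and 5 as OWL equivalent-class definitions using
# owlready2. The ontology IRI and the reduced class hierarchy shown here are
# illustrative assumptions, not the published diagnostic ontology.
from owlready2 import Thing, ObjectProperty, get_ontology

onto = get_ontology("http://example.org/diag#")

with onto:
    class Fault(Thing): pass
    class Symptom(Thing): pass
    class Level(Thing): pass          # value partition: High / Med / Low
    class High(Level): pass
    class Med(Level): pass
    class Low(Level): pass

    class causedBySymptom(ObjectProperty):
        domain = [Fault]
        range = [Symptom]

    class hasEventLevel(ObjectProperty):
        domain = [Symptom]
        range = [Level]

    class CriticalFault(Fault):       # Eq. 4
        equivalent_to = [
            Fault & causedBySymptom.some(Symptom & hasEventLevel.some(High))
        ]

    class IncipientFault(Fault):      # Eq. 5
        equivalent_to = [
            Fault & causedBySymptom.some(Symptom & hasEventLevel.some(Med))
        ]

# Running a DL reasoner (e.g. owlready2's sync_reasoner()) would then move
# fault individuals into CriticalFault or IncipientFault as symptoms arrive.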
Once the fault individuals are re-classified, the status of the related system is instantiated using the most up-to-date fault
information, which is represented by an annotation property. Therefore, a critical status of the sidescan starboard transducer is caused by a critical fault. However, because the sidescan sonar is composed of two transducers, a single malfunctioning transducer only causes an incipient status of the sidescan sonar as a whole.

Fig. 15. Representation of the mission planning action execution parameters.
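The propagation just described can be pictured procedurally: a composite sensor is only critically degraded when all of its sub-components are. The function below is a hypothetical illustration of that aggregation, not the mechanism encoded in the ontology:

# Hypothetical sketch of the status propagation described above: a sensor
# composed of several transducers is only critically degraded when every
# transducer has a critical fault. Names and encodings are assumptions.
CRITICAL, INCIPIENT, NOMINAL = "critical", "incipient", "nominal"

def component_status(subcomponent_statuses):
    """Derive a composite component's status from its sub-components' statuses."""
    if all(s == CRITICAL for s in subcomponent_statuses):
        return CRITICAL    # e.g. both sidescan transducers down
    if any(s in (CRITICAL, INCIPIENT) for s in subcomponent_statuses):
        return INCIPIENT   # e.g. only the starboard transducer down
    return NOMINAL

# Example from the trial: starboard transducer critical, port nominal.
assert component_status([CRITICAL, NOMINAL]) == INCIPIENT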
The characteristics and re-usability of the Status concepts presented here support the transfer of knowledge between the two involved agents. The mission planner agent is now responsible for adapting the mission according to this new piece of information.
SELECT ?Mission ?Action ?Param ?Status
WHERE { plan:hasAction(?Mission, ?Action) ∧
        plan:hasExecParam(?Action, ?Param) ∧
        plan:hasStatus(?Param, ?Status) }
(6)
Equation 6 reports to the mission planner that the two survey actions in the mission are affected by the incipient status of the sidescan sonar. In this case, the sensor required by the action is reporting an incipient fault, so the action can be modified simply by adapting the way it is executed: an execution repair. If both transducers had failed and the status monitoring agent had reported a critical status of the sidescan sensor, a plan repair adaptation of the mission plan would have been required instead. In that case, it would have been necessary to look for redundant components, or platforms with similar capabilities, in order to perform the action. The same procedure is used once the transducer is reported as recovered, and the mission plan commands revert to the original pattern.
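The resulting decision logic, driven by the status values returned by Eq. 6, can be sketched as follows; the status encoding and the helper are hypothetical, and the actual planner performs execution and plan repair through its own adaptation machinery rather than a simple branch:

# Hypothetical sketch of the repair-type decision driven by the status of Eq. 6.
# Status values and helper names are illustrative assumptions.
CRITICAL, INCIPIENT, NOMINAL = "critical", "incipient", "nominal"

def adapt_action(action, sensor_status):
    """Choose how to adapt a mission action given the status of its required sensor."""
    if sensor_status == INCIPIENT:
        # Execution repair: keep the action, change how it is executed
        # (e.g. re-space the lawnmower legs to rely on the healthy transducer).
        return {"type": "execution_repair", "action": action}
    if sensor_status == CRITICAL:
        # Plan repair: the action can no longer be executed as planned; look for
        # redundant components or platforms with similar capabilities.
        return {"type": "plan_repair", "action": action}
    # Nominal status (or recovery): revert to the original execution pattern.
    return {"type": "no_change", "action": action}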
The mission performance timeline is described in Fig. 16, which shows the most representative subset of the variables being monitored by the status monitor agent. Each of the symbols marked on the aforementioned figure represents a point during the mission where an event occurred.
- The first marker represents the point where the payload system takes control of the vehicle. Note the change in the mission status flag, which becomes 0x05, i.e. mission active and payload in control.
- The second marker represents the point where the vehicle arrives to perform the survey of the area. At this point, the survey action is instantiated based on the properties of the internal elements and the external factors.
- The third marker (♦) represents the point when the diagnosis system reports an incipient status in the starboard transducer of the sidescan sonar. It can be seen how the lawnmower pattern was adapted to cope with the change, using the port transducer to cover the odd and even spacing of the survey and avoid gaps in the sidescan data (see Fig. 17). This adaptation maximises sensor coverage for the survey while the transducer is down.
- The fourth marker indicates the point where the diagnosis system reports that the starboard transducer is functioning correctly again. It can be observed how the commands executing the action are modified in order to optimise the survey pattern and minimise the distance travelled. Although also monitored, the power status does not report any critical condition during the mission that would require modification of the actions.
- The fifth marker shows the location where all the mission goals are considered achieved and control is given back to the vehicle.
9 Conclusion and Future Work
Nowadays, autonomous vehicles in the underwater domain employ a knowledge representation that is embryonic and targets simple mono-platform and mono-domain applications,
therefore limiting the potential of multiple coordinated actions between agents.

Fig. 16. Status monitoring (top to bottom): direction of water current (degrees), battery power (Wh), sidescan sensor port and starboard transducer availability (on/off), and mission status flag (0x0: idle, 0x1: active, 0x2: suspend, 0x3: over, 0x4: payload, 0x8: rehearsal), all plotted against mission time (s).

Fig. 17. Vehicle's track during the mission in the North-East local coordinate frame.
In order to enhance interoperability, independence of operation, and situation awareness, we have presented a semantic-based framework that provides the core architecture for knowledge representation for embedded service-oriented agents in autonomous vehicles. The framework combines the initial expert orientation and the observations acquired during the mission in order to improve the situation awareness of the vehicle. It has a direct impact on the knowledge distribution between embedded agents at all levels of representation, providing a more capable and holistic system and involving semantic interoperability among all involved information sources.
The fully integrated framework has achieved the following:
- Formal platform view: Three different ontology models have been produced, representing the concepts related to an underwater platform and to the diagnostic and mission planning domains. As a consequence, the ontological commitments made by these interrelated domains are made explicit.
- Interoperability: The proposed framework supports communication and cooperation between systems that normally work in isolation.
- Model-based knowledge acquisition: Before and during the mission, these ontology models are able to acquire knowledge of a real-world situation. Based on the data obtained during the mission, the service-oriented agents provide each other with the requested information through the execution of pre-defined queries, in order to manage unforeseen events.
We have tested and evaluated the proposed framework on the problem of fault-tolerant adaptive mission planning in an MCM scenario, where the ontological representation of knowledge, model-based diagnosis and adaptive mission plan repair techniques are combined. To demonstrate the benefit of this approach, the mission adaptation capability and its cooperation with the diagnosis unit have been shown during an in-water field trial demonstration. In this scenario, the approach has shown how the status monitoring and mission planner agents can collaborate in order to provide a solution to mission action failures when component faults are detected in the platform.
As part of our future work, we are planning to apply the approach to more complex scenarios involving other agents, such as agents for autonomous detection and classification of targets. We are also planning to extend the single-vehicle mission plan recovery to mission plan recovery for a group or team of vehicles performing a collaborative mission, improving local (agent-level) and global (system-level) situation awareness. In this scenario, the agents are distributed across the different platforms. We are currently working towards SA for a team of vehicles (SA_T), in which every team member possesses the SA required for its responsibilities. The main challenge will be to deal with the acoustic communication limitations associated with the underwater environment.
Acknowledgements
Our thanks to the members of the Ocean Systems Laboratory for their inputs and helpful critique. Thanks also to all at SeeByte Ltd, Edinburgh, for providing the necessary knowledge to run the experiments in the real world. The work reported in this paper is partly funded by the Project SEAS-DTC-AA-012 from the Systems Engineering for Autonomous Systems Defence Technology Centre and by the Project RT/COM/5/059 from the Competition of Ideas, both established by the UK Ministry of Defence.
References
[1] J.Boyd,“OODA loop,” Tech.Rep.,1995.
[2] J.A.Adams,“Unmanned vehicle situation awareness:A path forward,”
in Proceedings of the 2007 Human Systems Integration Symposium,
Annapolis,Maryland,USA,2007.
[3] J.E.Strutt,“Report of the inquiry into the loss of autosub2 under the
fimbulisen,” National Oceanography Centre Southampton,Southampton,
UK.,Tech.Rep.,2006.
[4] B.Pell,E.Gat,R.Keesing,N.Muscettola,and B.Smith,“Robust
periodic planning and execution for autonomous spacecraft,” in Proceed-
ings of the Fifteenth International Conference on Artificial Intelligence
(IJCAI’97),1997,pp.1234–1239.
[5] T.H.Collett,B.A.MacDonald,and B.P.Gerkey,“Player 2.0:
Toward a practical robot programming framework,” Proceedings of the
Australasian Conference on Robotics and Automation (ACRA 2005),
Sydney,Australia,December 2005.
[6] P.Fitzpatrick,G.Metta,and L.Natale,“YARP user manual,” Yet
Another Robotic Platform (YARP),Tech.Rep.,2008.
[7] P.Newman,“Introduction to programming with MOOS,” Oxford
Robotics Research Group,Tech.Rep.,2009.
[8] ——,“Under the hood of the MOOS communications API,” Oxford
Robotics Research Group,Tech.Rep.,2009.
[9] “OceanSHELL:An embedded library for distributed applications and
communications,” Ocean Systems Laboratory,Heriot-Watt University,
Tech.Rep.,August 2005.
[10] M.Somby,“A review of robotics software plat-
forms,” http://www.windowsfordevices.com/c/a/Windows-For-Devices-
Articles/A-review-of-robotics-software-platforms/,August 2007.[On-
line].Available:http://www.windowsfordevices.com/c/a/Windows-For-
Devices-Articles/A-review-of-robotics-software-platforms/
[11] M.Arredondo,S.Reed,and Y.R.Petillot,“Battlespace access for
unmanned underwater vehicles - dynamic multi-dymensional world
modelling - final report,” SeeByte Ltd.- Ministry of Defence,Tech.
Rep.,2006.
[12] “Society of automotive engineers AS-4 AIR5665 JAUS architecture
framework for unmanned systems,” SAE International Group,Tech.
Rep.,2008.[Online].Available:http://www.jauswg.org/
[13] A.Bouguerra,L.Karlsson,and A.Saffiotti,“Monitoring the
execution of robot plans using semantic knowledge,” Robotics
and Autonomous Systems,vol.56,no.11,pp.942 –
954,2008,semantic Knowledge in Robotics.[Online].Avail-
able:http://www.sciencedirect.com/science/article/B6V16-4T7XGSM-
1/2/04f9b80a1f7cdebb975141bc910cd594
[14] C.Galindo,J.-A.Fernndez-Madrigal,J.Gonzlez,and A.Saffiotti,
“Robot task planning using semantic maps,” Robotics and
Autonomous Systems,vol.56,no.11,pp.955 – 966,
2008,semantic Knowledge in Robotics.[Online].Avail-
able:http://www.sciencedirect.com/science/article/B6V16-4T9CCW6-
2/2/5bf373f40885a5995dbcf60b0a48ae80
[15] J.Hertzberg and A.Saffiotti,“Using semantic knowledge in robotics,”
Robotics and Autonomous Systems,vol.56,no.11,pp.875
– 877,2008,semantic Knowledge in Robotics.[Online].Avail-
able:http://www.sciencedirect.com/science/article/B6V16-4T72WWW-
1/2/9edd0eb7357cb93ab0a7f0285979c469
[16] M.Chantler,G.Cogill,Q.Shen,and R.Leitch,“Selecting tools
and techniques for model based diagnosis,” Artificial Intelligence in
Engineering,vol.12,pp.81–98,1998.
[17] F.J.Uppal and R.J.Patton,“Fault diagnosis of an electro-pneumatic
valve actuator using neural networks with fuzzy capabilities,” in Pro-
ceedings of European Symposium on Artificial Neural Networks,2002,
pp.501–506.
[18] M.Ghallab,D.Nau,and P.Traverso,Automated Planning:Theory and
Practice.Morgan Kaufmann,ISBN:1-55860-856-7,2004.
[19] G.Rabideau,R.Knight,S.Chien,A.Fukunaga,and A.Govindjee,
“Iterative repair planning for spacecraft operations in the ASPEN
systems,” in Proceedings of the Fifth International Symposium on
Artificial Intelligence Robotics and Automation in Space,1999,pp.99–
106.[Online].Available:citeseer.ist.psu.edu/rabideau99iterative.html
[20] R.van der Krogt,“Plan repair in single-agent and multi-agent systems,”
Ph.D.dissertation,Netherlands TRAIL Research School,2005.
[21] A.Gerevini and I.Serina,“Fast plan adaptation through planning graphs:
Local and systematic search techniques,” in AAAI Conf.on AI Planning
Systems,2000,pp.112–121.
[22] A.L.Blum and M.L.Furst,“Fast planning through planning graph
analysis,” in Artificial Intelligence,vol.90,1997,pp.281–300.
[23] S.Koenig,M.Likhachev,and D.Furcy,“Lifelong planning A*,”
Artificial Intelligence,vol.155,no.1-2,pp.93–146,May 2004.
[24] C. Pêtrès, Y. Pailhas, P. Patrón, Y. R. Petillot, J. Evans, and D. M. Lane, “Path planning for autonomous underwater vehicles,” IEEE Transactions on Robotics, vol. 23, no. 2, pp. 331–341, April 2007.
[25] R.Turner,“Intelligent mission planning and control of autonomous
underwater vehicles,” in Workshop on Planning under uncertainty for
autonomous systems,15th International Conference on Automated Plan-
ning and Scheduling (ICAPS’05),2005.
[26] J.Bellingham,B.Kirkwood,and K.Rajan,“Tutorial on issues in
underwater robotic applications,” in 16th International Conference on
Automated Planning and Scheduling (ICAPS’06),2006.
[27] K.Rajan,C.McGann,F.Py,and H.Thomas,“Robust mission planning
using deliberative autonomy for autonomous underwater vehicles,” in
Workshop in Robotics in challenging and hazardous environments,
International Conference on Robotics and Automation (ICRA’07),2007.
[28] M.Fox,D.Long,F.Py,K.Rajan,and J.Ryan,“In situ analysis for
intelligent control,” in Proceedings of the IEEE International Conference
Oceans (Oceans’07),Vancouver,Canada,September 2007.
[29] C.McGann,F.Py,K.Rajan,H.Thomas,R.Henthorn,and R.McEwen,
“T-REX:A model-based architecture for AUV control,” in Workshop
in Planning and Plan Execution for Real-World Systems:Principles
and Practices for Planning in Execution,International Conference of
Autonomous Planning and Scheduling (ICAPS’07),2007.
[30] J.Thornton,“Survivability - its importance in the maritime environ-
ment,” Journal of Defence Science,vol.10,no.2,pp.57–60,May 2005.
[31] M.Benjamin,J.Curcio,J.Leonard,and P.Newman,“Navigation of
unmanned marine vehicles in accordance with the rules of the road,”
International Conference on Robotics and Automation (ICRA’06),May
2006.
[32] J. Evans, P. Patrón, B. Smith, and D. Lane, “Design and evaluation of a reactive and deliberative collision avoidance and escape architecture for autonomous robots,” Journal of Autonomous Robots, vol. 3, pp. 247–266, Dec 2008.
[33] S.Blackburn,The Oxford Dictionary of Philosophy.Oxford,UK:
Oxford University Press,1996.
[34] T.R.Gruber,“Towards principles for the design of ontologies used for
knowledge sharing,” International Journal Human-Computer Studies,
vol.43,pp.907–928,1995.
[35] R.J.Brachman,D.L.McGuinness,P.F.Patel-Schneider,L.A.Resnick,
and A.Borgida,“Living with classic:When and how to use a KL-ONE-
like language,” Principles of Semantic Networks:Explorations in the
representation of knowledge,pp.401–456,1991.
[36] M.Uschold,M.King,S.Moralee,and Y.Zorgios,“The enterprise
ontology,” Knowledge Engineering Review,vol.13,no.1,pp.31–89,
1998.
[37] G.van Heijst,A.Schreiber,and B.Wielinga,“Using explicit ontologies
in kbs development,” International Journal of Human-Computer Studies,
vol.46,no.2-3,1996.
[38] M.K.Smith,C.Welty,and D.L.McGuinness,“Owl web ontology
language guide,w3c recommendation,” February.[Online].Available:
http://www.w3.org/TR/owl-guide/
[39] D.Becket and B.McBride,“Rdf syntax specification (revised),
resource description framework,” February 2004.[Online].Available:
http://www.w3.org/TR/rdf-syntax-grammar/.
[40] H.Chen,F.Perich,T.Finin,and A.Joshi,“Soupa:Standard ontology
for ubiquitous and pervasive applications,” in International Conference
on Mobile and Ubiquitous Systems:Networking and Services,Boston,
MA,2004.
[41] H.Chen,T.Finin,and A.Joshi,“Using owl in a pervasive computing
broker,” in Workshop on Ontologies in Open Agent Systems,Melbourne,
Australia,2003,pp.9–16.
[42] E.Blomqvist and K.Sandkuhl,“Patterns in ontology engineering
classification of ontology patterns,” in 7th International Conference on
Enterprise Information Systems,Miami,USA,2005.
[43] R.Fikes and N.Nilsson,“STRIPS:a new approach to the application
of theorem proving to problem solving,” Artificial Intelligence,vol.2,
pp.189–208,1971.
[44] M.Ghallab,A.Howe,C.Knoblock,D.McDermott,A.Ram,M.Veloso,
D.Weld,and D.Wilkins,“PDDL:The planning domain definition
language,” Yale Center for Computational Vision and Control,Tech.
Rep.,1998.
[45] “ICAPS,” 2008.[Online].Available:http://www.icaps-conference.org/
[46] S.Reed,I.Ruiz,C.Capus,and Y.Petillot,“The fusion of large scale
classified side-scan sonar image mosaics,” IEEE Transactions on Image
Processing,vol.15,no.7,pp.2049–2060,Jul 2006.
Emilio Miguelánez (M'01) received the MPhys degree in Physics from the University of Manchester in 2000. He then pursued an MSc degree in Information Technology (Systems) and a PhD degree on the application of evolutionary computation approaches to fault diagnosis in the semiconductor domain at Heriot-Watt University. He started his professional career working as a research engineer at Test Advantage Ltd, developing intelligent systems and knowledge mining solutions for the semiconductor manufacturing environment. Currently, Emilio Miguelánez is a Senior Development Engineer at SeeByte Ltd, responsible for the technical development of RECOVERY, an intelligent diagnostic tool-kit. Since joining SeeByte Ltd he has led the company's technical development for the European project InteGRail, tasked with developing and implementing several intelligent diagnostic on-board modules enabling a coherent information system in order to achieve higher levels of performance of the railway system. His current interests include knowledge engineering and intelligent systems for diagnostics and prognostics.
Pedro Patrón is a specialist in embedded and planning technologies for autonomous systems. He has been working on research and development projects for ROVs and AUVs at the Ocean Systems Laboratory at Heriot-Watt University since 2004. As a Research Engineer, he has worked on various successful research projects in the field of planning and system integration, such as (EU) AUTOTRACKER and (MoD) BAUUV, which involved collaboration between many academic and industry partners. He is the current investigator for mission planning, decision making and knowledge representation on the Semantic world modelling for distributed knowledge representation in cooperative (autonomous) platforms project inside the (MoD) Competition of Ideas Consortium. His interests include autonomous decision making processes such as real-time on-board mission planning, deliberative obstacle avoidance and multi-robot systems for autonomous vehicles.
Keith E. Brown received the B.Sc. degree in electrical and electronic engineering in 1984 and his Ph.D. on the application of knowledge-based techniques to telecoms equipment fault diagnosis in 1988 from the University of Edinburgh, Edinburgh, Scotland. He is currently a senior lecturer at Heriot-Watt University and part of the Edinburgh Research Partnerships Joint Research Institute for Signal and Image Processing. His research interests are in intelligent systems for diagnostics and prognostics and in bio-inspired wideband sonar systems.
Yvan R. Petillot (M'03) received the Engineering degree in telecommunications with a specialization in image and signal processing, the M.Sc. degree in optics and signal processing, and the Ph.D. degree in real-time pattern recognition using optical processors from the Université de Bretagne Occidentale, École Nationale Supérieure des Télécommunications de Bretagne (ENSTBr), Brest, France. He is a specialist in sonar data processing (including obstacle avoidance) and sensor fusion. He is currently a Lecturer at Heriot-Watt University, Edinburgh, U.K., where he leads the Sensor Processing Group of the Ocean Systems Laboratory, focusing on image interpretation and mine countermeasures. Dr. Petillot acts as a reviewer for various IEEE Transactions.
David M. Lane (M'92) received the B.Sc. degree in electrical and electronic engineering in 1980 and the Ph.D. degree for robotics work with unmanned underwater vehicles in 1986 from Heriot-Watt University, Edinburgh, U.K. He was a Visiting Professor with the Department of Ocean Engineering, Florida Atlantic University, Boca Raton, and is Cofounder/Director of SeeByte Ltd., Edinburgh, U.K. Previously, he was with the U.K. Defence and Offshore Industries. He currently is a Professor with the School of Engineering and Physical Sciences, Heriot-Watt University, where he also is the Director of the Ocean Systems Laboratory. He leads a multidisciplinary team that partners with U.K., European, and U.S. industrial and research groups supporting offshore, Navy, and marine science applications. Major sponsors include oil companies, the United States Navy, the European Union, and the U.K. Ministry of Defence. He has published over 150 journal and conference papers on tethered and autonomous underwater vehicles, subsea robotics, computer vision, image processing, and advanced control. He is an Associate Editor for the International Journal of Systems Science. Dr. Lane is a Member of the Institution of Engineering and Technology (IET), London, U.K.; the Professional Network on Robotics; and the U.K. Society for Underwater Technology Underwater Robotics Committee.