3: ontological reasoning

Joost Breuker

Leibniz Center for Law

University of Amsterdam

Ontology, ontologies and ontological reasoning


Overview


Semantic Web and OWL

Use of ontologies

Reasoning with ontologies

TRACS: testing the Dutch Traffic Regulation

HARNESS: DL-based legal assessment

Frameworks and the limits of DL-based reasoning

Problem solving and reasoning

What the Semantic Web is intended for

Dream, part 2


“In communicating between people using the Web, computers and networks have as their job to enable the information space, and otherwise get out of their way. But doesn’t it make sense to also bring computers more into the action, to put their analytic power to work? In part two of the dream, that is just what they do. The first step is putting data on the web in a form that machines can naturally understand, or converting it to that form. This creates what I call a Semantic Web -- a web of data that can be processed directly or indirectly by machines.” [p191]



A decade later (W3C): infrastructural standards for SW


Semantics are represented by ontologies

Ontologies are represented by a KR formalism

Note: ontologies were specifications in KE (using Ontolingua; CML, cf. UML in SE)

On top of a layer cake of data-handling formalisms

KR formalism is intended for reasoning

Even suitable for blind trust (OWL-DL is decidable)








Legal ontologies (from Nuria Casellas, 2008/9)


HOWEVER, in practice


Not one of these ontologies is used for reasoning

Use:

Information management (documents)

That is also what the current Semantic Web efforts are about (not only in legal domains)

Core ontologies (reuse?)




Why use OWL?

OWL: DL-based knowledge representation

OWL-DL is a unique result of 40 years of research in AI about KR

Semantic networks, KL-ONE, CLASSIC, LOOM, …

Concept-oriented representation

Very suitable for ontologies

vs. rule-based KR

End of the 80s: logical foundations

A KR formalism defines what can be correctly and completely inferred

On the basis of the semantics (model theory) of the formalism

Problem: finding a subset of predicate logic that is decidable (and computationally tractable)


History of KR (Hoekstra, 2009)

OWL’s semantic web context

OWL

OWL’s further requirements/problems

…besides the expressivity/decidability trade-off


The RDF layer (OO based) was a serious obstacle

Its expressiveness was incompatible with DL research

Ian Horrocks, Peter F. Patel-Schneider, and Frank van Harmelen. From SHIQ and RDF to OWL: The making of a web ontology language. Journal of Web Semantics, pages 7–26, 2003.

No unique naming assumption

USA/EU team: KR & KA community

NB: improved expressivity in OWL 2!

Still: OWL is not self-evident for novices…

Reasoning with OWL


Main inference: classification on the basis of properties of concepts

Reasoner (inference engine): `classifier’

Complete for DL-based KR

Rule-based reasoners are not complete

E.g. Prolog, unless under the `closed world assumption’

For the Web this assumption cannot hold!

T(erminology)-Box: ontology (knowledge)

Classes (concepts, universals) and properties (relations, attributes, features, …)

A(ssertion)-Box: some situation (information)

Individuals (instances) with properties

When is REASONING with ontologies indicated?

(...except for consistency checking etc.)
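The contrast between open-world DL reasoning and closed-world rule systems can be made concrete with a minimal sketch; the facts and predicate names below are invented for illustration, not taken from any real knowledge base.

```python
# Contrast of closed-world vs open-world query answering (illustrative only).

facts = {("instance_of", "car1", "Car")}

def closed_world_holds(fact):
    """Negation as failure (Prolog-style): anything not derivable is false."""
    return fact in facts

def open_world_holds(fact):
    """Open world (DL-style): absence of a fact only means 'unknown'."""
    if fact in facts:
        return True
    return None  # unknown, not false

query = ("instance_of", "car2", "Car")
print(closed_world_holds(query))  # False: assumed false because unstated
print(open_world_holds(query))    # None: the Web may simply not mention car2
```

This is why the closed-world assumption cannot hold for the Web: no single source can claim to list all facts, so an unstated fact may still be true elsewhere.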


In understanding/modeling situations

Situation = events and states of entities in space (and over time)

Three main modes

Text understanding (stories, cases)

NB: except expository discourse…

Scene understanding (robotics)

Problem solving (knowledge engineering)

Understanding the problem description


Modelling situations


Situations described as instances of concepts and relations: A-Box

E.g. mereological (topological) and/or dependency relations

Initial model

An ontology describing the concepts and properties is applied to infer the implied information:

= classifying the A-Box

Implied information is made explicit (`added’)

Static: description of a situation

Dynamic: explaining changes between situations



Situation 1 plus a `process’

Structural (topological) descriptions of objects in space

Should predict this

Inferred: positions, breaking, etc.

Or: inferring explanations


Given Situation 1 and Situation 2, we should be able to infer the processes involved

This is a causal explanation

NB: teleological explanation:

There must have been a Situation 0 with an intention to throw a ball (or worse: to get rid of an ugly teapot.. etc.)

Joost Breuker, SIKS-course, May 2006


A-Box: events & states of objects

[Diagram: objects desk, floor, teapot, ball; events move/fall, collide, break; time points T-2, T-1]



Identifying implied processes

[Diagram: as above, with `support’ relations added]



Identifying causation

[Diagram: as above, with causal links among the move/fall, collide and break events]


[Diagram repeated]

Why does the desk not move?

More detailed: even limiting causal effects of collisions

When is REASONING with ontologies required?

When all possible situations have to be modeled

Typical examples:

Model-based & qualitative reasoning systems

Testing system models

…legal case assessment

All possible combinations

Completeness & consistency

E.g. OWL-DL

NB: in knowledge systems, situations are usually modeled implicitly in user-system dialogues:

Asking the user (values of/presence of) parameters

Heuristics; human limitations in handling combinatorics

For instance: TRACS (1990–1994)

Testing a new Dutch traffic code (RVV-90)

art. 3 Vehicles should keep to the right

art. 6 Two bicycles may ride next to each other

art. 33 A trailer should have lights at the back

Questions

Consistent?

Complete?

In what respect different from RVV-66 (the old one)?

These can only be answered when we can model all possible situations distinguished by this code
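The exhaustive testing idea can be sketched as enumerating every combination of situation features and checking each against a norm. The features and the norm check below are invented for illustration; TRACS itself worked with a much richer ontology.

```python
# Toy sketch of exhaustive situation generation, loosely in the spirit of
# TRACS; participants, lanes and the norm check are invented for illustration.
from itertools import product

participants = ["car", "bicycle", "tram"]
lanes = ["right_lane", "left_lane", "bicycle_lane", "tramway"]

# Every combination is a candidate situation to test the code against.
situations = list(product(participants, lanes))

def keep_right_violated(participant, lane):
    # Invented, simplified reading of `vehicles should keep to the right'
    return participant in ("car", "bicycle") and lane == "left_lane"

violations = [s for s in situations if keep_right_violated(*s)]
print(len(situations))  # 12 generated situations
print(violations)
```

Even this toy grid shows why combinatorics explodes: each added feature multiplies the number of situations, which is why the real system reached roughly 10^5 combinations.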

Traffic participants: a part of the ontology (`world knowledge’)

traffic-participant
  pedestrian
  driver
    bicyclist
    autocyclist
    driver of motor vehicle
      car driver
      motorcycle driver
      lorry driver
      bus driver

Simple example of ontological reasoning

Ontology (T-Box)

Subsumes(Physical_object, Car)

Right_of(Physical_object, Physical_object)

Inv(Right_of, Left_of)

Case description (A-Box)

Is-a(car1, Car)

Is-a(car2, Car)

Right_of(car1, car2)

Classifier (e.g. Pellet) ⇒ Left_of(car2, car1) (A-Box)

…simple as that, but necessary
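The inverse-property inference on this slide can be mimicked with a few lines of hand-rolled code; a real system would of course delegate this to a DL reasoner such as Pellet. Everything below is an illustrative sketch of that single inference rule.

```python
# Minimal sketch of the slide's inference: the axiom Inv(Right_of, Left_of)
# applied to the A-Box fact Right_of(car1, car2).

inverses = {"Right_of": "Left_of", "Left_of": "Right_of"}

abox = {("Right_of", "car1", "car2")}

def materialize_inverses(abox, inverses):
    """Add the inverse of every property assertion (one OWL inference rule)."""
    inferred = {(inverses[p], o, s) for (p, s, o) in abox if p in inverses}
    return abox | inferred

abox = materialize_inverses(abox, inverses)
print(("Left_of", "car2", "car1") in abox)  # True: the inferred assertion
```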

Architecture of TRACS (Breuker & den Haan, 94)

[Architecture diagram. Components: WORLD KNOWLEDGE BASE, META-LEGAL KNOWLEDGE BASE, REGULATION KNOWLEDGE BASE, SITUATION GENERATOR, REGULATION APPLIER, CONFLICT RESOLVER, VALIDATOR. Intermediate results: SITUATION, APPLICABLE RULES, TRESPASSED/NON-TRESPASSED RULES, CONSISTENTLY APPLICABLE RULES]

Btw: some surprising results

Tram on tramway

Car on bicycle lane

Just a prototype…

About 10^5 possible combinations

Analysis of redundancy (symmetry)

Still: too many for humans to inspect!

But:

Differences with old regulation

Differences with foreign regulations (ontology the same?)

…political decisions…

HARNESS: OWL 2 DL also for normative reasoning

Normative reasoning simultaneously with ontological reasoning using OWL-DL

Estrella, 6th framework, 2006–2008

http://www.estrellaproject.org/

Saskia van de Ven, Joost Breuker, Rinke Hoekstra, Lars Wortel, and Abdallah El-Ali. Automated legal assessment in OWL 2. In Legal Knowledge and Information Systems. Jurix 2008: The 21st Annual Conference, Frontiers in Artificial Intelligence and Applications. IOS Press, December 2008.

András Förhécz and György Strausz. Legal Assessment Using Conjunctive Queries. Proceedings LOAIT 2009.

Representing norms in OWL 2 DL

Norm

Generic case description is a conjunction of generic situation (σ) descriptions

Generic case description is a class (Σ)

A deontic qualification (P, O, F) is associated with Σ

Case description is an individual (C)

Description is itself composed of classes/individuals as defined in the ontology!

Watch this….

What did you see?

Event 1 (Saskia entering, shows ID)

Event 2 (Joost entering, shows ID)

Event 3 (Radboud entering)

(NB: Radboud is president of Jurix)

JURIX 2009 Regulation (= set of norms)

1. For entering a U-building, identification is required

2. The President does not need an identification to enter a U-building


Generic situations and generic case

1. For (entering a U-building), (an identification) is required

2. The (President) does not need (an identification) to (enter a U-building)

Step 1: Modeled as:

(1) Σ1 ≡ σ1 ⊓ σ2 ⊓ σ3

(2) Σ2 ≡ σ4 ⊓ σ2 ⊓ ¬σ3
Generic situations and generic case

1. For (entering a U-building), (an identification) is required [for each person]

2. The (President) does not need (an identification) to (enter a U-building)

Step 1: Modeled as:

(1) Σ1 ≡ σ1 ⊓ σ2 ⊓ σ3

(2) Σ2 ≡ σ4 ⊓ σ2 ⊓ ¬σ3

Step 2: Adding deontic qualification to the norms

(1) Obliged(Σ1):

Permitted(Σ1) ≡ σ1 ⊓ σ2 ⊓ σ3

Forbidden(Σ1) ≡ σ1 ⊓ σ2 ⊓ ¬σ3

(this is a `design pattern’ which separates the conditions (person, entering and identity) from a forbidden generic case)

(2) Permitted(Σ2) ≡ σ4 ⊓ σ2 ⊓ ¬σ3    (σ4: President)

Step 3: Normative assessment: classifying C(ase)

Case: President Radboud enters U-building

C: {σ4, σ2}

Classifying C:

Σ1 subsumes Σ2 ⇒ exception

C is Disallowed-by Σ1

C is Allowed-by Σ2

Etc, etc…

This is not viewed as a logical conflict by Pellet, because this individual is classified by two different norms (classes)

HARNESS selects the subsumed norm (Σ2)
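The selection step (the subsumed, more specific norm wins, as in lex specialis) can be sketched outside OWL by encoding each generic case Σ as the set of situation conditions it requires; a norm whose conditions are a superset is subsumed by one whose conditions are a subset. This set encoding is an illustrative stand-in for real DL subsumption.

```python
# Sketch of selecting the subsumed (most specific) applicable norm, as
# HARNESS does; norms-as-condition-sets is an invented, illustrative encoding.

norms = {
    "Sigma1_forbidden": {"person", "entering"},              # entering, no ID
    "Sigma2_permitted": {"person", "entering", "president"}, # the exception
}

case = {"person", "entering", "president"}  # President Radboud enters

# A norm applies when all of its conditions hold in the case.
applicable = {name: conds for name, conds in norms.items() if conds <= case}

# Sigma1 subsumes Sigma2 (fewer conditions = more general class), so the
# exception is found by picking the applicable norm with the most conditions.
selected = max(applicable, key=lambda name: len(applicable[name]))
print(selected)
```

As on the slide, both norms classify the case without logical conflict; the conflict is resolved by preferring the subsumed class.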

Experimental user interface (Protégé plug-in)

Violation

Compliance


An important advantage

Three knowledge bases:

domain ontology (T-Box)

norms (T-Box)

case description (individuals & properties; A-Box)

OWL-DL reasoner (Pellet) `classifies’ the case in terms of concepts and of norms simultaneously, in an intertwined fashion

Hybrid or rule-based-only solutions cannot preserve all (inferred) information of the ontology as Pellet/OWL 2 does
Knowledge, ontology and meaning

There is more to knowledge than ontology

Ontology (terminology) provides the basic units for understanding

Regular combinations: patterns of concepts

Scripts & frames: experience, heuristics, associations

Meaningful experience can only be based upon understanding!

Synthetic learning vs. (further) abstraction

What’s further new

Monotonic and deductive:

Unique & against accepted wisdom

Exceptions do not lead to conflict

Advantage:

Reasoning is sound and complete (trust)

No rule formalism allows this with the same expressiveness

Full use of OWL 2 DL’s expressiveness

No loss in translation

Disadvantage:

Modeling in DL is found to be more intellectually demanding than modeling in rules anyway

Obligation design pattern is not very intuitive

A serious problem in the use of (OWL-)DL

DL representations are `variable free’

(Most) rule formalisms have variables

Moreover: in OWL, names of individuals are not taken as identifiers of individuals (no unique naming assumption)

It is not possible to track changes of a particular individual

A-Box: colour(block1, red); colour(block1, blue)

OWL: …there are (now) two block1’s!
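The block1 example can be made concrete: with a functional colour property, a rule system with unique names would flag the two assertions as a clash (or an update), while an OWL reasoner, lacking the unique naming assumption, instead infers that the two fillers denote the same individual. The hand-rolled code below only illustrates that contrast; it is not a real reasoner.

```python
# Illustrative sketch: how the two assertions
#   colour(block1, red); colour(block1, blue)
# are treated with and without the unique naming assumption,
# assuming colour is a functional (single-valued) property.

assertions = [("colour", "block1", "red"), ("colour", "block1", "blue")]

def una_check(assertions):
    """Unique names (rule-system style): two different values for one
    functional property on the same name is a contradiction."""
    seen = {}
    for p, s, o in assertions:
        if (p, s) in seen and seen[(p, s)] != o:
            return f"clash: {seen[(p, s)]} vs {o}"
        seen[(p, s)] = o
    return "consistent"

def owl_style(assertions):
    """No unique names: merge the fillers (infer sameAs) instead of a clash."""
    seen, same_as = {}, []
    for p, s, o in assertions:
        if (p, s) in seen and seen[(p, s)] != o:
            same_as.append((seen[(p, s)], o))
        else:
            seen[(p, s)] = o
    return same_as

print(una_check(assertions))  # clash: red vs blue
print(owl_style(assertions))  # [('red', 'blue')]: inferred identical
```

Neither behaviour expresses "block1 changed colour", which is exactly the tracking problem the slide points at.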

A serious problem in the use of (OWL-)DL (2)

Also: it is (almost) impossible to `enforce’ identity of individuals in OWL

Example: transaction

OWL’s restriction on the form of graphs

⇒ `diamond of individuals’

What OWL allows: trees

An approximate solution: a special design pattern

Constraining the identity:

Rinke Hoekstra and Joost Breuker. Polishing diamonds in OWL 2. In Aldo Gangemi and Jérôme Euzenat, editors, Proceedings of the 16th International Conference on Knowledge Engineering and Knowledge Management (EKAW 2008), LNAI/LNCS. Springer Verlag, October 2008.

Rinke Hoekstra. Ontology Representation - Design Patterns and Ontologies that Make Sense, volume 197 of Frontiers of Artificial Intelligence and Applications. IOS Press, Amsterdam, June 2009.

The DL view has a limited scope

Excellent for axiomatic grounding of the terms that form the lowest level of granularity of a knowledge base

More complex knowledge structures (frameworks) will also require rules

`Hybrid’ solution:

“In the hybrid approach there is a strict separation between the ordinary predicates, which are basic rule predicates, and ontology predicates, which are only used as constraints in rule antecedents. Reasoning is done by interfacing an existing rule reasoner with an existing ontology reasoner.”

Problem: rule formalism has to be `DL-safe’

OWL/rule combination is still a (W3C) research issue

Frameworks: complex knowledge structures

Stereotypical patterns of relationships

Dependency and/or part-of structures

(`causal’, `mereological’; `how’)

Ontology as background (`what’)

Learning by experience

Reoccurring events or structures

Justification

Plans <-> rituals

The 70s:

Scripts (Schank), Frames (Minsky)

Restaurant; House

She is the … ?

Three types of frameworks

Situational frameworks

Dependencies between events/actions

E.g. scripts, business processes, etc.

Mereological frameworks

Structures, configurations of objects

`topo-mereology’

Epistemological frameworks

Dependencies between roles in reasoning

Problem solving methods

Hypotheses → evidence → conclusion

Valente’s FOLaw

Reasoning with frameworks

As frameworks usually have lots of restrictions on the identity of objects (e.g. agents, roles), the approximate solutions may become very problematic, even impossible

So we will also need rules

Preferably in a hybrid architecture…

We can split up the reasoning in 2 steps

Static situation modeling by ontological reasoning (classification)

Inferring all properties of a collection of entities

Modeling change by frameworks

…and if necessary: iterate…
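The two-step loop above can be sketched as a tiny pipeline: step 1 classifies the entities in a static situation (a stand-in for a DL classifier), step 2 lets a framework rule infer a change, and the result can be fed back in. The mini-ontology and the single rule are invented for illustration.

```python
# Sketch of the proposed split: (1) static classification, (2) change
# modeling by a framework rule; illustrative ontology and rule only.

ontology = {"teapot": "Fragile_object", "ball": "Rigid_object"}

def classify(situation):
    """Step 1 (static): attach the class of each entity to its state."""
    return {e: (state, ontology.get(e, "Thing")) for e, state in situation.items()}

def apply_frameworks(classified):
    """Step 2 (change): a framework rule fires on the classified situation."""
    new = {}
    for entity, (state, cls) in classified.items():
        if state == "colliding" and cls == "Fragile_object":
            new[entity] = "broken"  # framework: fragile object + collision -> breaks
        else:
            new[entity] = state
    return new

situation = {"teapot": "colliding", "ball": "colliding"}
next_situation = apply_frameworks(classify(situation))
print(next_situation)  # {'teapot': 'broken', 'ball': 'colliding'}
```

Iterating means classifying `next_situation` again and looking for further applicable frameworks.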

Qualitative reasoning: ontologies and frameworks

Qualitative = quantities on an ordinal scale

E.g.: neg-max, negative, zero, positive, pos-max

Points, intervals

Also: rates (increase, decrease)

Special calculus

Prediction of behaviour of a system on the basis of a situation description

Structural description plus initial values

`scenario’

Behaviour is derived from

Processes (changes) triggered by a set of conditions

Model fragments (= a framework)
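The `special calculus' over such ordinal values can be sketched with standard sign algebra, here restricted to the three signs for simplicity; the full scale (neg-max … pos-max) works analogously but needs a larger table.

```python
# Sketch of a qualitative calculus over signs; a standard sign-algebra
# addition, where adding opposite signs is ambiguous.

SIGNS = ("negative", "zero", "positive")

def qual_add(a, b):
    """Qualitative addition of two values/rates; None encodes ambiguity."""
    if a == "zero":
        return b
    if b == "zero":
        return a
    if a == b:
        return a
    return None  # e.g. positive + negative: the sign cannot be decided

print(qual_add("positive", "zero"))      # 'positive'
print(qual_add("positive", "negative"))  # None
```

The `None` outcome is exactly the ambiguity due to the coarse grain size: the simulator must then branch, producing the behaviour graph mentioned below.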

QR-2

E.g. heat-exchange-1 (conduction)

Conditions

T_obj1 > T_obj2, share(surf_obj1, surf_obj2), phase(obj1, solid), etc.

Process

T_obj1 decreases, T_obj2 increases; max = (T_obj1 = T_obj2)

Heat-exchange-2 (convection; Boyle)

Conditions

… phase(obj1, liquid or gas) …

Simplified:

If conditions match situation: transition etc.

From state to next state

Ambiguities due to coarse grain size

Branching in causal chains (behaviour graph)
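A model fragment like heat-exchange-1 can be sketched as a condition check over a situation description plus the influences its process introduces; the object data and the encoding are invented for illustration.

```python
# Sketch of a model fragment in the style of heat-exchange-1 (conduction);
# illustrative encoding, not the GARP formalism.

objects = {
    "obj1": {"T": 80, "phase": "solid"},
    "obj2": {"T": 20, "phase": "solid"},
}
shared_surface = {("obj1", "obj2")}

def heat_exchange_1_applies(o1, o2):
    """Conditions: T_obj1 > T_obj2, shared surface, both solid."""
    return (objects[o1]["T"] > objects[o2]["T"]
            and (o1, o2) in shared_surface
            and objects[o1]["phase"] == "solid"
            and objects[o2]["phase"] == "solid")

influences = {}
if heat_exchange_1_applies("obj1", "obj2"):
    # Process: T_obj1 decreases, T_obj2 increases, until T_obj1 = T_obj2
    influences = {"T_obj1": "decrease", "T_obj2": "increase"}
print(influences)
```

When the conditions match, the process fires and the simulator moves the system from this state to the next, as the slide's simplified transition rule describes.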





GARP-3, Qualitative reasoning architecture

[Architecture diagram; input: situation description]

Where is the ontology?

(called: GARP-3, simple)

GARP and Ontologies

GARP is written in Prolog

No real distinction between ontology (entities) and model fragments (processes)

However:

As ontologies become largely available on the Web, cast in OWL, GARP now has OWL import/export

Clearer distinction

See Ken Forbus, Qualitative Modelling. In Frank van Harmelen, Vladimir Lifschitz and Bruce Porter (Eds), Handbook of Knowledge Representation, Amsterdam, Elsevier, 2008


And now for a movie

http://hcs.science.uva.nl/QRM/Garp3NNR.mov