3: ontological reasoning


Joost Breuker

Leibniz Center for Law

University of Amsterdam


Ontology, ontologies and ontological reasoning

3: ontological reasoning

Overview


Semantic Web and OWL


Use of ontologies


Reasoning with ontologies


TRACS: testing the Dutch Traffic Regulation


HARNESS: DL-based legal assessment


Frameworks and the limits of DL-based reasoning


Problem solving and reasoning

What the Semantic Web is intended for

Dream part 2


“In communicating between people using the Web, computers and networks have as their job to enable the information space, and otherwise get out of their way. But doesn’t it make sense to also bring computers more into the action, to put their analytic power to work? In part two of the dream, that is just what they do. The first step is putting data on the web in a form that machines can naturally understand, or converting it to that form. This creates what I call a Semantic Web -- a web of data that can be processed directly or indirectly by machines.” [p191]



A decade later (W3C): infrastructural standards for SW

Semantics are represented by ontologies

Ontologies are represented by a KR formalism

Note: ontologies were specifications in KE (using Ontolingua; CML, cf. UML in SE)

On top of a layer cake of data-handling formalisms

KR formalism is intended for reasoning

Even suitable for blind trust (OWL-DL is decidable)








Legal ontologies (from Nuria Casellas, 2008/9)

HOWEVER, in practice


Not one of these ontologies is used for reasoning


Use:


Information management (documents)


That is also what the current Semantic Web efforts are about (not only in legal domains)


Core ontologies (reuse?)




Why use OWL?

OWL: DL-based knowledge representation

OWL-DL is a unique result of 40 years of AI research on KR

Semantic networks, KL-ONE, CLASSIC, LOOM, ...

Concept-oriented representation

Very suitable for ontologies

vs. rule-based KR

End of the 1980s: logical foundations

A KR formalism defines what can be correctly and completely inferred

On the basis of the semantics (model theory) of the formalism

Problem: finding a subset of predicate logic that is decidable (and computationally tractable)


OWL’s semantic web context

OWL

OWL’s further requirements/problems


…besides the expressivity/decidability trade-off


The RDF layer (OO based) was a serious obstacle


Its expressiveness was incompatible with DL research


Ian Horrocks, Peter F. Patel-Schneider, and Frank van Harmelen. From SHIQ and RDF to OWL: the making of a web ontology language. Journal of Web Semantics, 1(1):7-26, 2003.


No unique naming assumption


USA/EU team: KR & KA community


NB: improved expressivity in OWL 2!


Still: OWL is not self-evident for novices…

Reasoning with OWL


Main inference: classification on the basis of properties of concepts

Reasoner (inference engine): a `classifier’

DL-based reasoners are complete

Rule-based reasoners are not complete

E.g. Prolog, unless a `closed world assumption’ is made

For the Web this assumption cannot hold!

T(erminology)-Box: ontology (knowledge)

Classes (concepts, universals) and properties (relations, attributes, features, ...)

A(ssertion)-Box: some situation (information)

Individuals (instances) with properties


When is REASONING with ontologies indicated


(...except for consistency checking etc.)


In understanding/modeling situations

Situation = events and states of entities in space (and over time)

Two main modes

Text understanding (stories, cases)

NB: except expository discourse…

Scene understanding (robotics)

When is REASONING with ontologies required



When all possible situations have to be modeled


Typical examples:


Model based & qualitative reasoning systems


Testing system models


…legal case assessment


All possible combinations → completeness & consistency

e.g. OWL-DL


NB: in knowledge systems, situations are usually modeled implicitly in user-system dialogues:

Asking the user for (values of / presence of) parameters

Heuristics; human limitations in handling combinatorics

For instance: TRACS (1990-1994)


Testing a new Dutch traffic code (RVV-90)



art. 3 Vehicles should keep to the right


art. 6 Two bicycles may ride next to each other


art. 33 A trailer should have lights at the back


Questions


Consistent?


Complete?


In what respect is it different from RVV-66 (the old one)?


These can only be answered when we can model all possible situations distinguished by this code
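To see why tool support is unavoidable here: the number of candidate situations grows as the Cartesian product of the ontology's dimensions. A toy sketch in Python (all dimension names are invented for illustration, not taken from TRACS):

```python
from itertools import product

# Hypothetical, highly simplified dimensions of a traffic situation.
# The real TRACS situation generator worked over a much richer ontology.
participants = ["pedestrian", "bicyclist", "car_driver", "lorry_driver", "tram_driver"]
locations    = ["roadway", "bicycle_lane", "sidewalk", "tramway"]
positions    = ["right_of", "left_of", "in_front_of", "behind"]
lights       = ["lights_on", "lights_off"]

# Every combination of two participants, a location, a relative position
# and a light state is a candidate situation to test the regulation on.
situations = list(product(participants, participants, locations, positions, lights))
print(len(situations))  # 5 * 5 * 4 * 4 * 2 = 800, already too many to inspect by hand
```

Even this toy model yields 800 combinations; with the dimensions of the real ontology the count quickly reaches the roughly 10⁵ situations mentioned below.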

Traffic participants: a part of the ontology (`world knowledge’)

traffic-participant
  pedestrian
  driver
    bicyclist
    autocyclist
    driver of motor vehicle
      car driver
      motorcycle driver
      lorry driver
      bus driver

Simple example of ontological reasoning


Ontology (T-Box)

Subsumes (Physical_object, Car)

Right_of (Physical_object, Physical_object)

Inv (Right_of, Left_of)

Case description (A-Box)

Is-a (car1, Car)

Is-a (car2, Car)

Right_of (car1, car2)

Classifier (e.g. Pellet) infers:

Left_of (car2, car1) (A-Box)

…simple as that, but necessary
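The same toy inference can be reproduced end to end; below is a minimal sketch assuming the Python owlready2 library (the ontology IRI is made up, and the names mirror the slide):

```python
from owlready2 import Thing, ObjectProperty, get_ontology

onto = get_ontology("http://example.org/traffic.owl")

with onto:
    # T-Box: Car is subsumed by Physical_object; Right_of and Left_of are inverses
    class Physical_object(Thing): pass
    class Car(Physical_object): pass
    class Right_of(ObjectProperty):
        domain = [Physical_object]
        range  = [Physical_object]
    class Left_of(ObjectProperty):
        inverse_property = Right_of

# A-Box: two cars, one asserted to be to the right of the other
car1 = Car("car1")
car2 = Car("car2")
car1.Right_of = [car2]

# owlready2 maintains declared inverse properties directly, so the
# new A-Box fact Left_of(car2, car1) is available immediately:
print(car2.Left_of)  # [traffic.car1]
```

For inferences beyond inverse properties (e.g. classifying individuals under defined classes), owlready2's sync_reasoner() hands the ontology to an external DL reasoner such as HermiT or Pellet.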

Architecture of TRACS (Breuker & den Haan, 94)

[Architecture diagram. Components: WORLD KNOWLEDGE BASE, META-LEGAL KNOWLEDGE BASE, REGULATION KNOWLEDGE BASE, SITUATION GENERATOR, REGULATION APPLIER, CONFLICT RESOLVER, VALIDATOR. Intermediate results: SITUATION, APPLICABLE RULES, TRESPASSED/NON-TRESPASSED RULES, CONSISTENTLY APPLICABLE RULES.]

Btw: some surprising results

Tram on tramway

Car on bicycle lane

Just a prototype…


About 10⁵ possible combinations


Analysis of redundancy (symmetry)


Still: too many for humans to inspect!


But:


Differences with old regulation


Differences with foreign regulations (→ is the ontology the same?)


…political decisions…

HARNESS: OWL 2 DL also for normative reasoning




Normative reasoning simultaneously with ontological reasoning using OWL-DL


Estrella, 6th Framework, 2006-2008


http://www.estrellaproject.org/


Saskia van de Ven, Joost Breuker, Rinke Hoekstra, Lars Wortel, and Abdallah El-Ali. Automated legal assessment in OWL 2. In Legal Knowledge and Information Systems. Jurix 2008: The 21st Annual Conference, Frontiers in Artificial Intelligence and Applications. IOS Press, December 2008.


András Förhécz and György Strausz. Legal assessment using conjunctive queries. In Proceedings of LOAIT 2009.

Representing norms in OWL 2 DL


Norm

Generic case description is a conjunction of generic situation (σ) descriptions

Generic case description is a class (Σ)

A deontic qualification (P, O, F) is associated with Σ

Case description is an individual (C)

Description is itself composed of classes/individuals as defined in the ontology!


Watch this….

What did you see?


Event 1 (Saskia entering, shows ID)


Event 2 (Joost entering, shows ID)


Event 3 (Radboud entering)



(NB: Radboud is president of Jurix)



JURIX 2009 Regulation



1. For entering a U-building, identification is required

2. The President does not need an identification to enter a U-building


Generic situations and generic case

1. For (entering a U-building), (an identification) is required

2. The (President) does not need (an identification) to (enter a U-building)


Step 1: Modeled as:

(1) Σ₁ ≡ σ₁ ⊓ σ₂ ⊓ σ₃

(2) Σ₂ ≡ σ₄ ⊓ σ₂ (⊓ ¬σ₃)

Generic situations and generic case

1. For (entering a U-building), (an identification) is required [for each person]

2. The (President) does not need (an identification) to (enter a U-building)

Step 1: Modeled as:

(1) Σ₁ ≡ σ₁ ⊓ σ₂ ⊓ σ₃

(2) Σ₂ ≡ σ₄ ⊓ σ₂ (⊓ ¬σ₃)

Step 2: Adding deontic qualification to the norms

(1) Obliged(Σ₁):

Permitted(Σ₁) ≡ σ₁ ⊓ σ₂ ⊓ σ₃

Forbidden(Σ₁) ≡ σ₁ ⊓ σ₂ ⊓ (¬σ₃)

(this is a `design pattern’ which separates the conditions (person, entering and identity) from a forbidden generic case)

(2) Permitted(Σ₂) ≡ σ₄ ⊓ σ₂ (⊓ ¬σ₃)

(σ₄ = President)
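To make the pattern concrete, here is one way these deontic qualifications could be written down as OWL class expressions, again assuming the Python owlready2 library (all class names are invented for the example; a sketch, not the HARNESS code):

```python
from owlready2 import Thing, Not, get_ontology

onto = get_ontology("http://example.org/jurix.owl")

with onto:
    # Generic situations (the sigmas) as named classes
    class Person(Thing): pass             # sigma_1
    class EnteringUBuilding(Thing): pass  # sigma_2
    class ShowsID(Thing): pass            # sigma_3
    class President(Person): pass         # sigma_4, subsumed by Person

    # Norm 1, obligation pattern: the permitted and the forbidden generic case
    class Permitted1(Thing):
        equivalent_to = [Person & EnteringUBuilding & ShowsID]
    class Forbidden1(Thing):
        equivalent_to = [Person & EnteringUBuilding & Not(ShowsID)]

    # Norm 2 (the exception): a President may enter without showing an ID
    class Permitted2(Thing):
        equivalent_to = [President & EnteringUBuilding & Not(ShowsID)]
```

Because President is subsumed by Person, a DL reasoner will infer Permitted2 ⊑ Forbidden1; this is exactly the subsumption between the two norms that the next step exploits.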

Step 3: Normative assessment: classifying C(ase)


Case: President Radboud enters a U-building

C: {σ₄, σ₂}

Classifying C:

Σ₁ subsumes Σ₂ → exception

C is Disallowed-by Σ₁

C is Allowed-by Σ₂

Etc, etc…

Pellet does not view this as a logical conflict, because the individual is simply classified by two different norms (classes)

HARNESS selects the subsumed norm (Σ₂)
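Continuing the sketch above, the case of the President entering without identification can be asserted as an individual and classified; preferring the subsumed norm then resolves the exception. (Hypothetical code; sync_reasoner() requires Java and uses the HermiT reasoner bundled with owlready2.)

```python
from owlready2 import Not, sync_reasoner

# A-Box: President Radboud enters a U-building without showing an ID
radboud = onto.President("radboud")
radboud.is_a.append(onto.EnteringUBuilding)
radboud.is_a.append(Not(onto.ShowsID))

with onto:
    sync_reasoner()  # classifies both the classes and the individual

# The case individual is an instance of both norm classes; no logical conflict:
print(onto.Forbidden1 in radboud.INDIRECT_is_a)  # True: disallowed by norm 1
print(onto.Permitted2 in radboud.INDIRECT_is_a)  # True: allowed by norm 2

# The reasoner has also inferred that Permitted2 is subsumed by Forbidden1;
# selecting the subsumed (more specific) norm, as HARNESS does, yields: permitted.
print(issubclass(onto.Permitted2, onto.Forbidden1))  # True
```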

Experimental user interface (Protégé plug-in)

[Screenshots: the plug-in flagging a case as a violation and as compliance]


An important advantage


Three knowledge bases:

domain ontology (T-Box)

norms (T-Box)

case description (individuals & properties; A-Box)

The OWL-DL reasoner (Pellet) `classifies’ the case in terms of concepts and of norms simultaneously, in an intertwined fashion

Hybrid or purely rule-based solutions cannot preserve all (inferred) information of the ontology as Pellet/OWL 2 does



Knowledge, ontology and meaning


There is more to knowledge than ontology

Ontology (terminology) provides the basic units for understanding

Regular combinations: patterns of concepts

Scripts & frames: experience, heuristics, associations

Meaningful experience can only be based upon understanding!

Synthetic learning vs (further) abstraction

What’s further new


Monotonic and deductive:


Unique & against accepted wisdom


Exceptions do not lead to conflict


Advantage:


Reasoning is sound and complete (trust)


No rule formalism allows this with the same expressiveness


Full use of OWL 2 DL’s expressiveness


No loss in translation


Disadvantage:


Modeling in DL is generally found to be more intellectually demanding than modeling in rules


Obligation design pattern is not very intuitive


A serious problem in the use of (OWL-)DL


DL representations are `variable free’


(most) rule formalisms have variables


Moreover: in OWL, names of individuals are not taken as identifiers of individuals (no unique naming assumption)

It is not possible to track changes of a particular individual

A-Box: colour(block1, red); colour(block1, blue)

OWL: … there are (now) two block1’s!
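A small demonstration of the missing unique naming assumption, sketched with the Python rdflib and owlrl libraries (the namespace is invented): when colour is declared functional, an OWL reasoner does not report the two assertions as a clash; it concludes instead that red and blue denote the same individual.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF
from owlrl import DeductiveClosure, OWLRL_Semantics

EX = Namespace("http://example.org/blocks#")
g = Graph()

# colour is functional: each thing has at most one colour
g.add((EX.colour, RDF.type, OWL.FunctionalProperty))
g.add((EX.block1, EX.colour, EX.red))
g.add((EX.block1, EX.colour, EX.blue))

# Compute the OWL 2 RL closure of the graph
DeductiveClosure(OWLRL_Semantics).expand(g)

# Without a unique naming assumption, `red' and `blue' are merged
# rather than reported as a contradiction:
print((EX.red, OWL.sameAs, EX.blue) in g)  # True
```

Only an explicit owl:differentFrom between red and blue would turn the example into an inconsistency.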

A serious problem in the use of (OWL-)DL (2)


Also: it is (almost) impossible to `enforce’ identity of individuals in OWL


Example: transaction


OWL’s restriction on the form of graphs: no `diamond of individuals’

What OWL allows: trees

An approximate solution: a special design pattern

Constraining the identity:









Rinke Hoekstra and Joost Breuker. Polishing diamonds in OWL 2. In Aldo Gangemi and Jérôme Euzenat, editors, Proceedings of the 16th International Conference on Knowledge Engineering and Knowledge Management (EKAW 2008), LNAI/LNCS. Springer Verlag, October 2008.

Rinke Hoekstra. Ontology Representation - Design Patterns and Ontologies that Make Sense, volume 197 of Frontiers of Artificial Intelligence and Applications. IOS Press, Amsterdam, June 2009.

The DL view has a limited scope


Excellent for the axiomatic grounding of the terms that form the lowest level of granularity of a knowledge base

More complex knowledge structures (frameworks) will also require rules


`Hybrid’ solution:

“In the hybrid approach there is a strict separation between the ordinary predicates, which are basic rule predicates and ontology predicates, which are only used as constraints in rule antecedents. Reasoning is done by interfacing an existing rule reasoner with an existing ontology reasoner”

Problem: the rule formalism has to be `DL-safe’

OWL/rule combination is still a (W3C) research issue
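For flavour, a minimal sketch of such a hybrid combination using owlready2's SWRL support (a stock uncle example, invented here; SWRL rules are DL-safe because they only bind named individuals):

```python
from owlready2 import Thing, ObjectProperty, Imp, get_ontology, sync_reasoner_pellet

onto = get_ontology("http://example.org/family.owl")

with onto:
    class Person(Thing): pass
    class hasParent(ObjectProperty):  domain = [Person]; range = [Person]
    class hasBrother(ObjectProperty): domain = [Person]; range = [Person]
    class hasUncle(ObjectProperty):   domain = [Person]; range = [Person]

    # A DL-safe SWRL rule on top of the ontology predicates
    rule = Imp()
    rule.set_as_rule(
        "Person(?p), hasParent(?p, ?q), hasBrother(?q, ?r) -> hasUncle(?p, ?r)")

kees = onto.Person("kees")
jan  = onto.Person("jan")
piet = onto.Person("piet")
kees.hasParent = [jan]
jan.hasBrother = [piet]

# Pellet evaluates the SWRL rules together with the DL axioms (requires Java)
sync_reasoner_pellet(infer_property_values=True)
print(kees.hasUncle)  # [family.piet]
```

Keeping the rule layer DL-safe is exactly what preserves the decidability of the underlying OWL reasoning.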


Frameworks