
ORMS TC Use-Cases, v0.1    10 Sep 2008
Copyright © OASIS® 2008. All Rights Reserved.



ORMS: Use-cases
Version 0.22
Working Draft, 27 October 2008

Specification URIs:

This Version:
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].html
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].doc
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].pdf

Previous Version:
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].html
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].doc
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].pdf

Latest Version:
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].html
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].doc
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].pdf

Latest Approved Version:
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].html
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].doc
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].pdf

Technical Committee:
OASIS Open Reputation Management Systems (ORMS) TC

Chair(s):
Anthony Nadalin
Nat Sakimura

Editor(s):
Mahalingam Mani

Related work:
This specification replaces or supersedes:
[specifications replaced by this standard]
This specification is related to:
[related specifications]

Declared XML Namespace(s):
[list namespaces here]

Abstract:

Toward a standard protocol for exchanging reputation information between reputation data providers and consumers, and a portable reputation data and metadata format, this document describes a reference model. The model is evaluated and validated with use-cases to arrive at requirements for a portable reputation provider data format that ensures openness in ownership, privacy and confidentiality protection, and management of reputation data.

Status:

This document was last revised or approved by the [TC name | membership of OASIS] on the above date. The level of approval is also listed above. Check the “Latest Version” or “Latest Approved Version” location noted above for possible later revisions of this document.


Technical Committee members should send comments on this specification to the Technical Committee’s email list. Others should send comments to the Technical Committee by using the “Send A Comment” button on the Technical Committee’s web page at http://www.oasis-open.org/committees/orms/.

For information on whether any patents have been disclosed that may be essential to implementing this specification, and any offers of patent licensing terms, please refer to the Intellectual Property Rights section of the Technical Committee web page (http://www.oasis-open.org/committees/orms/ipr.php).

The non-normative errata page for this specification is located at http://www.oasis-open.org/committees/orms/.


Notices

Copyright © OASIS® 2007. All Rights Reserved.

All capitalized terms in the following text have the meanings assigned to them in the OASIS Intellectual Property Rights Policy (the "OASIS IPR Policy"). The full Policy may be found at the OASIS website.

This document and translations of it may be copied and furnished to others, and derivative works that comment on or otherwise explain it or assist in its implementation may be prepared, copied, published, and distributed, in whole or in part, without restriction of any kind, provided that the above copyright notice and this section are included on all such copies and derivative works. However, this document itself may not be modified in any way, including by removing the copyright notice or references to OASIS, except as needed for the purpose of developing any document or deliverable produced by an OASIS Technical Committee (in which case the rules applicable to copyrights, as set forth in the OASIS IPR Policy, must be followed) or as required to translate it into languages other than English.

The limited permissions granted above are perpetual and will not be revoked by OASIS or its successors or assigns.

This document and the information contained herein is provided on an "AS IS" basis and OASIS DISCLAIMS ALL WARRANTIES, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY WARRANTY THAT THE USE OF THE INFORMATION HEREIN WILL NOT INFRINGE ANY OWNERSHIP RIGHTS OR ANY IMPLIED WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.

OASIS requests that any OASIS Party or any other party that believes it has patent claims that would necessarily be infringed by implementations of this OASIS Committee Specification or OASIS Standard, to notify OASIS TC Administrator and provide an indication of its willingness to grant patent licenses to such patent claims in a manner consistent with the IPR Mode of the OASIS Technical Committee that produced this specification.

OASIS invites any party to contact the OASIS TC Administrator if it is aware of a claim of ownership of any patent claims that would necessarily be infringed by implementations of this specification by a patent holder that is not willing to provide a license to such patent claims in a manner consistent with the IPR Mode of the OASIS Technical Committee that produced this specification. OASIS may include such claims on its website, but disclaims any obligation to do so.

OASIS takes no position regarding the validity or scope of any intellectual property or other rights that might be claimed to pertain to the implementation or use of the technology described in this document or the extent to which any license under such rights might or might not be available; neither does it represent that it has made any effort to identify any such rights. Information on OASIS' procedures with respect to rights in any document or deliverable produced by an OASIS Technical Committee can be found on the OASIS website. Copies of claims of rights made available for publication and any assurances of licenses to be made available, or the result of an attempt made to obtain a general license or permission for the use of such proprietary rights by implementers or users of this OASIS Committee Specification or OASIS Standard, can be obtained from the OASIS TC Administrator. OASIS makes no representation that any information or list of intellectual property rights will at any time be complete, or that any claims in such list are, in fact, Essential Claims.

The names "OASIS", [insert specific trademarked names and abbreviations here] are trademarks of OASIS, the owner and developer of this specification, and should be used only to refer to the organization and its official outputs. OASIS welcomes reference to, and implementation and use of, specifications, while reserving the right to enforce its marks against misleading uses. Please see http://www.oasis-open.org/who/trademark.php for above guidance.



Table of Contents

1 Introduction
  1.1 Terminology
    1.1.1 ORMS Definitions
  1.2 Normative References
  1.3 Non-Normative References
2 Overview
3 ORMS Reference Model
4 Use-cases
  4.1 OpenID in Trusted Exchange
    4.1.1 Actors
    4.1.2 Description
    4.1.3 Input
    4.1.4 Output
  4.2 IdP (Identity Provider) Reputation Service
    4.2.1 Actors
    4.2.2 Description
    4.2.3 Input
    4.2.4 Output
  4.3 Content Filtering
    4.3.1 Actors
    4.3.2 Description
    4.3.3 Input
    4.3.4 Output
  4.4 Second Life Avatars
    4.4.1 Actors
    4.4.2 Description
    4.4.3 Input
    4.4.4 Output
  4.5 Nodes in Second Life Grid
    4.5.1 Actors
    4.5.2 Description
    4.5.3 Input
    4.5.4 Output
  4.6 Social-network derived Peer reputation
    4.6.1 Actors
    4.6.2 Description
    4.6.3 Input
    4.6.4 Output
  4.7 Digital Signature (signing key) reputation
    4.7.1 Actors
    4.7.2 Description
    4.7.3 Input
    4.7.4 Output


  4.8 Peer Reputation in P2P Networks
    4.8.1 Actors
    4.8.2 Description
    4.8.3 Input
    4.8.4 Output
  4.9 Seller Reputation
    4.9.1 Actors
    4.9.2 Description
    4.9.3 Input
    4.9.4 Output
  4.10 Reputee Influence: Social & Professional Networks
    4.10.1 Actors
    4.10.2 Description
    4.10.3 Input
    4.10.4 Output
  4.11 Adaptive Trust: Enterprise unified communications (UC)
    4.11.1 Actors
    4.11.2 Description
    4.11.3 Input
    4.11.4 Output
  4.12 Federated Trust in UC
    4.12.1 Actors
    4.12.2 Description
    4.12.3 Input
    4.12.4 Output
  4.13 Peer-peer reputation (between actors)
    4.13.1 Actors
    4.13.2 Description
    4.13.3 Input
    4.13.4 Output
5 Security and Privacy considerations
A. Acknowledgements
B. Non-Normative Text
C. Revision History


1 Introduction

Social and corporate networking interactions in the Internet age have given rise to an exponential growth in real-time and asynchronous communications. The openness of these good-faith protocols and networks is now increasingly exposed to the threats and exploits of the community.

Moreover, corporate networks and social networks are required to deal with a range of users with roles and privileges varying dynamically in time and (network) domain, requiring corporations to adjust to the wired and wireless network, traditional and virtually-extended perimeters, extranets, federations and partner-portals involving a considerable degree of transitive trust.

A framework is required to

• identify and qualify accidental, well-behaved and malicious privilege/usage patterns, and
• quantify (or trust-score) the above patterns to facilitate (social and corporate network) services adapting trust levels and authorized accesses to resources.

An interoperable trust-scoring mechanism is required to relate these trust levels across domains.

This document describes use-cases in varying scenarios based on existing e-commerce transaction systems, social networks and converged communications scenarios ranging from corporate enterprise networks to peer-peer networks.

The use-case section is preceded by ORMS Terminology and Overview sections, and also a Reference Model, to aid the discussion of use-cases.

1.1 Terminology

The key words “MUST”, “MUST NOT”, “REQUIRED”, “SHALL”, “SHALL NOT”, “SHOULD”, “SHOULD NOT”, “RECOMMENDED”, “MAY”, and “OPTIONAL” in this document are to be interpreted as described in [RFC2119].

1.1.1 ORMS Definitions

Actor
Participating entities in a transaction. For example, in the Reputation Systems context: the reputation scoring-service (provider or reputer), the service using the Reputation Provider (relying party), and the client being evaluated (reputee).

Avatar
An Avatar is an incarnation, embodiment or virtual manifestation of an actor’s profile in a Social or Professional Network community.
Alternate definition: a computer user's representation of himself or herself, whether in the form of a three-dimensional model used in computer games, a 2-dimensional icon (picture) used on internet forums and other communities, or a text construct found on early systems [1].

[1] An example is MUD: Multi-User Domain.

Reputation Systems


Using the Internet’s bi-directional communication capabilities in order to artificially engineer large-scale word-of-mouth networks where actors share opinions and experiences on a wide range of topics, including companies, products, services, and even world events. (Dellarocas (2005))

Reputation
Reputation is a collective evaluation of the behavior of an entity based on factual and/or subjective data about it, and is used as one of the factors for establishing trust in that entity for a specific purpose.

Reputation Score
A Reputation Score of a Player (Reputee) on the Type (Criteria) by other players (Reputor) is the subjective probability assigned by the Reputor that the Reputee fulfils the Criteria. (Sakimura (2008))

Reputation domain
The encompassing domain where a reputation is defined (to be refined).

Reputation Compute-Engine
A reputation for an entity is computed using a Reputation Correlator, based on different sources of input data about the entity. The reputation correlator accepts one or more input data about the entity, and evaluates a reputation score based on its algorithms and contextual information available at the time of computation.

Contextual Information
Criteria used as functions in the RCE to evaluate Reputation output.

Reputation algorithm
A domain-specific algorithm for computing reputations. A reputation algorithm is designed taking into account the characteristics of the encompassing domain: topology (centralized or distributed reputation computation), entities to which the reputation is associated, entities that produce input data, entities that consume reputations, available types of input data, type of contextual information available, and desired properties of the computed reputation (robustness, fairness, etc.).

Reputation (input) data
The data upon which the reputation is computed.

Reputation Management System
(to be refined) A reputation management system may include mechanisms for: collecting data about entities (generating data inputs or integrating external data); computing reputations; making sure the system is fair (e.g. providing bootstrapping mechanisms for new entities); performing actions based on reputations (e.g. trust computations, automatic decisions); revoking reputations; allowing entities legitimate control over their reputation and allowing entities to challenge their reputations (governance); making sure the system is not abused (security); and making sure the privacy of entities is respected (i.e. that the association entity-reputation is only disclosed to authorized parties).

RP
Relying Party. (See Reputee in the context of OpenID.)

OP
OpenID Provider. The reputation Compute-Engine in the OpenID model.

Reputer
(See Actor: the reputation scoring-service.)

Reputee
(See Actor: the client being evaluated.)

RSP
Reputation Service Provider.

VoIP
Voice over Internet Protocol.


UC
Unified Communications, a term denoting all forms of call and multimedia/cross-media message-management functions controlled by an individual user for both business and social purposes [2].
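Several of the definitions above (Reputation Compute-Engine, Contextual Information, Reputation algorithm) fit together, and the following Python sketch shows one way they could compose. All names, the [0, 1] normalization, and the mean-based algorithm are illustrative assumptions, not anything the TC has defined.

```python
# Illustrative sketch only: the ORMS TC has not defined an API.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class InputDatum:
    entity: str   # the reputee this datum is about
    kind: str     # e.g. "subjective", "behavioral", "inferred"
    value: float  # normalized to [0, 1] for this sketch

def mean_algorithm(data: List[InputDatum]) -> float:
    """A deliberately simple reputation algorithm: average the inputs."""
    return sum(d.value for d in data) / len(data)

class ReputationComputeEngine:
    """Accepts one or more input data about an entity and evaluates a
    score using an algorithm selected by contextual information."""
    def __init__(self, algorithms: Dict[str, Callable[[List[InputDatum]], float]]):
        self.algorithms = algorithms  # one algorithm per reputation context

    def evaluate(self, context: str, data: List[InputDatum]) -> float:
        return self.algorithms[context](data)

rce = ReputationComputeEngine({"e-commerce": mean_algorithm})
score = rce.evaluate("e-commerce", [
    InputDatum("seller-42", "subjective", 0.9),
    InputDatum("seller-42", "behavioral", 0.7),
])  # mean of the two inputs, 0.8
```

A real engine would also weight inputs by recency and source trustworthiness; the point here is only the separation between input data, context, and algorithm that the definitions describe.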

1.2 Normative References

[RFC2119]    S. Bradner, Key words for use in RFCs to Indicate Requirement Levels, http://www.ietf.org/rfc/rfc2119.txt, IETF RFC 2119, March 1997.
[Reference]  [Full reference citation]

1.3 Non-Normative References

[OpenIDW]    OpenID Wikipedia Page.
[OpenID]     OpenID Community Website.
[Dellarocas] Dellarocas, C., 2005, "Reputation Mechanisms".
[Wilson]     Wilson, R., 1985, "Reputations in Games and Markets". In A. Roth, ed., Game-Theoretic Models of Bargaining, Cambridge University Press, Cambridge, UK, 27-62.
[Sakimura]   Sakimura, N., 2008, "What is Reputation?"
[veracite]   Veracite Research Project (IBM).
[enisasec]   "Reputation-based Systems: A Security Analysis", ENISA position paper, E. Carrara and Giles Hogben, October 2007.

[2] This definition, from the International Engineering Consortium, is the most generic among many minor industry variants.


2 Overview

The use of reputation systems has been proposed for various applications, such as:

• Validating the trustworthiness of sellers and buyers in online auctions (e-commerce websites have shown these scores can have a large influence on sellers)
• Detecting free riders in peer-to-peer networks
• Ensuring the authenticity of signature keys in a web of trust
• Smarter searching of web sites, blogs, events, products, companies and other individuals

Reputation is a metric (a score, a rank, a state, a multi-dimensional profile, etc.) associated to an entity (a person, a business, a digital identity, a website, a system, a device, a category of devices, a computing resource, etc.) or to a tuple [entity, attribute(s)] (e.g. [person, skill]) in a particular domain and at a particular moment in time.

Reputation in these examples refers to the opinions about an entity, from others. Reputation is one of the factors upon which trust can be based through the use of verifiable claims. Reputation changes with time and is used within a context. Trust and reputation are related to a context.

There are various methods for generating a user's reputation data or trustworthiness. Some methods are based on user feedback through appropriate feedback channels. Other methods include having viewers participate in the reputation-building process through the user's profile at specific sites and communities. Each method has its limitations in terms of its susceptibility to bad actors, manipulation of data for specific purposes, and spammers.

2.1 Reputation Input Data

This is the data upon which the reputation is computed. Input data can be of different types, for example:

• Subjective data about the entity: e.g. ratings and feedback from peers, claims and endorsements
• Static and dynamic characteristics of the entity: e.g. demographics, preferences
• Behavior data, stemming from measurements and observations within a system: e.g. logs of the entity’s past actions, history of interactions, parametric data
• “Real world” data about the entity: e.g. background checks, credit history
• Inferred data about an entity: e.g. text analytics
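The input-data types listed above can be modeled as a small tagged record. The type names and normalization rules below are illustrative assumptions, not part of the ORMS draft; the sketch only shows heterogeneous raw inputs being mapped onto a common form an engine could consume.

```python
# Sketch: tag and normalize heterogeneous raw inputs into [0, 1] data.
from enum import Enum

class InputType(Enum):
    SUBJECTIVE = "subjective"          # ratings, feedback, endorsements
    CHARACTERISTIC = "characteristic"  # demographics, preferences
    BEHAVIORAL = "behavioral"          # logs of past actions, history
    REAL_WORLD = "real_world"          # background checks, credit history
    INFERRED = "inferred"              # e.g. text analytics

def from_star_rating(stars: int) -> tuple:
    """Map a 1-5 star rating onto [0, 1] as subjective input data."""
    return (InputType.SUBJECTIVE, (stars - 1) / 4)

def from_interaction_history(successes: int, total: int) -> tuple:
    """Map an interaction history onto [0, 1] as behavior data."""
    return (InputType.BEHAVIORAL, successes / total if total else 0.0)

kind, value = from_star_rating(5)                 # a top rating maps to 1.0
kind2, value2 = from_interaction_history(9, 10)   # 9 of 10 successes -> 0.9
```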

3 ORMS Reference Model

The following figure represents a generalized reputation model.

Figure 1: Generalized reputation model [to be replaced with a regular diagram -ed]


Primary components of the reference model are:

1. Input sources: Reputation Input Data Collectors (RDC).
2. Reputation Data (RD): Portable reputation data generated by all input sources into a reputation computation engine (RCE, or reputation calculator).
3. Reputation Context (RC): This allows filtering and qualifying the right choice of algorithms to use and pre-process.
4. Reputation Score (RS): the outcome of the reputation evaluation of an entity (to be portable).
5. Reputation Consumer (Reputee): consumer of the reputation score, to use as a yardstick for computing the degree of trust for the entity it serves.

Thus, the primary objective and challenge is to make the reputation input data and reputation formats interoperable (portable) across vendor boundaries and domains of Reputation.
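Since portability is the stated objective, it may help to see what a portable Reputation Score record could carry across vendor boundaries. The field names and the use of JSON below are hypothetical; the TC has not yet defined the format.

```python
# Sketch of a portable reputation record; the schema is an assumption.
import json
from dataclasses import dataclass, asdict

@dataclass
class PortableReputation:
    entity: str       # reputee identifier
    context: str      # Reputation Context (RC) / domain
    score: float      # Reputation Score (RS)
    computed_at: str  # timestamp of computation (ISO 8601)
    engine: str       # identifies the RCE/algorithm that produced it

record = PortableReputation(
    entity="urn:example:seller-42",
    context="e-commerce",
    score=0.87,
    computed_at="2008-10-27T00:00:00Z",
    engine="example-rce/v1",
)
wire = json.dumps(asdict(record))                  # serialize for exchange
restored = PortableReputation(**json.loads(wire))  # consuming party's side
assert restored == record                          # round-trips losslessly
```

Carrying the context and the producing engine alongside the score is what lets a consuming party decide whether a score from another domain is meaningful for its own trust computation.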

[Figure (diagram text not reproduced): several input-data collection/generation sources (subjective, individual and demographic, observed, real-world, and inferred data about an entity E) feed a Reputation Computer together with context inside the Reputation System; the local reputation is published (pub-sub) as portable reputation data input, which external consuming parties use for computation of trust.]


Figure 2: Reputation Reference Model [to be edited as we go along -ed]


4 Use-cases

4.1 OpenID in Trusted Exchange

4.1.1 Actors

The identified actors in an OpenID reputation framework are:
1. OpenID Provider
2. OpenID Relying Party (Reputee)
3. Reputation Service (Reputer)

4.1.2 Description

Trusted Exchange is a secure protocol for data exchange between an OpenID Provider (OP) and a Relying Party (RP). The OP provides the RP access to user data based on the RP's reputation.

4.1.2.1 Basic Flows

[TBD: figure]

4.1.2.1.1 Pre-conditions

4.1.2.1.2 Post-conditions

4.1.3 Input

The following are the general inputs to the OpenID trusted exchange:
1. Numeric count of successful transactions
2. Numeric count of claims

4.1.4 Output

Score value accumulated to evaluate the RP's trustworthiness.

168
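As a non-normative illustration, the two inputs above might be accumulated into an RP score as follows. The ratio and the smoothing constant are assumptions, not part of the use-case:

```python
# Illustrative sketch only: one way an OP could turn the counts of
# successful transactions and claims into an accumulated RP score.
# The smoothing constant is an assumption.
def rp_trust_score(successful_transactions, claims, smoothing=10):
    """Fraction of claims that led to successful transactions,
    damped toward 0 while the RP has little history."""
    if successful_transactions < 0 or claims < 0:
        raise ValueError("counts must be non-negative")
    return successful_transactions / (claims + smoothing)

assert rp_trust_score(0, 0) == 0.0    # a brand-new RP starts at zero
assert rp_trust_score(90, 90) < 1.0   # even a perfect record stays damped
```

The OP would compare this score against a policy threshold before releasing user data.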

4.2 IdP (Identity Provider) Reputation Service

The identity provider and this use-case are closely analogous to the OpenID provider role in the previous use-case.

4.2.1 Actors

1. Identity Provider
2. User (service clients relying on the IdP)
3. Identity Provider Reputation Service (Reputer, providing the trustworthiness of the chosen IdP)

4.2.2 Description

The generic use case applies to all browser-redirect-based Web single-sign-on systems (e.g., OpenID, SAML Web Profile). This use case has received particular attention in the OpenID community as an alternative (or a supplement) to OpenID Relying Parties (RPs) having to maintain their own OpenID Provider whitelists/blacklists.

4.2.2.1 Basic Flows

[TBD: figure]

4.2.2.1.1 Pre-conditions

4.2.2.1.2 Post-conditions

4.2.3 Input

Options (not mutually exclusive):

- Votes from authenticated IdP users.
- Votes from registered IdPs.
- Votes from registered third parties.

4.2.4 Output

Score value accumulated to evaluate the IdP's trustworthiness.
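A minimal sketch of combining the three optional vote sources above into one IdP score follows. The per-class weights are assumptions; a real deployment would choose its own:

```python
# Non-normative sketch: weighted mean of votes (each in [0, 1]) from the
# three voter classes named in the inputs. Weights are assumptions.
def idp_score(user_votes, idp_votes, third_party_votes,
              weights=(1.0, 2.0, 1.5)):
    """Combine votes from authenticated users, registered IdPs,
    and registered third parties into one accumulated score."""
    groups = (user_votes, idp_votes, third_party_votes)
    num = sum(w * sum(v) for w, v in zip(weights, groups))
    den = sum(w * len(v) for w, v in zip(weights, groups))
    return num / den if den else 0.0

assert idp_score([1.0], [1.0], [1.0]) == 1.0  # unanimous approval
assert idp_score([], [], []) == 0.0           # no votes, no score
```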

4.3 Content Filtering

This use-case describes reputation as trust meta-data for content filtering. It references the (as-yet-unpublicized) Veracite research project.

4.3.1 Actors

1. Users of web content (producers, evaluators, consumers, etc.)
2. Veracite server(s)

4.3.2 Description

This scenario is based on the Veracite research project from IBM. A Veracite server provides a service for binding actor information to web content (where an actor can be a person, such as an author or evaluator, or an automated system), together with assertions from that actor about the specific content (actor information and web content are distributed; the server only provides the binding). This "trust metadata" is used by content consumers to filter the content according to their trust preferences. Actors in a Veracite system can have associated reputations, which become another parameter for content filtering.

4.3.2.1 Basic Flows

[TBD: figure]

4.3.2.1.1 Pre-conditions

4.3.2.1.2 Post-conditions

4.3.3 Input

1. Content-providing actor's assertions about content.
2. The (Veracite) service's binding (vouching) of the content to the content-providing actor's identity.

4.3.4 Output

The system does not produce reputation scores; it relies on portable reputations provided by third parties (the requirement is that the reputation information can be used for filtering and that the context to which it applies be well specified).

4.4 Second Life Avatars

4.4.1 Actors

1. SecondLife (SL) reputation service
2. Avatars

4.4.2 Description

Enabling portability of avatar reputations from one SL node to another (as the grid diversifies and specialized SL nodes emerge, this will require "translation" of avatar reputations between nodes).

4.4.2.1 Basic Flows

[TBD: figure]

4.4.2.1.1 Pre-conditions

4.4.2.1.2 Post-conditions

4.4.3 Input

Externalized description of the metadata that defines an avatar, peer ratings for an avatar in a given node, and historical data for an avatar in a given node.

4.4.4 Output

TBD

4.5 Nodes in the Second Life Grid

4.5.1 Actors

1. SecondLife reputation service
2. SL nodes
3. Avatars

4.5.2 Description

An SL grid is emerging in which different nodes can be controlled by different entities: SL servers are no longer under the sole control of Linden Labs, and anybody is able to put up an SL node and integrate it into the SL grid. This opens up the possibility of new business scenarios (e.g., "business-oriented" SL nodes) but also of malicious SL nodes; having a reputation associated with a node would help.

4.5.2.1 Basic Flows

[TBD: figure]

4.5.2.1.1 Pre-conditions

4.5.2.1.2 Post-conditions

4.5.3 Input

Per-node ratings submitted to a reputation service.

4.5.4 Output

[TBD]

4.6 Social-network-derived Peer Reputation

4.6.1 Actors

Members of a social network (reputees and reputers).

4.6.2 Description

Members of a social network who have a relationship with member A are randomly sampled and asked to vouch for or rate member A with respect to a specific criterion. The identities of the vouching/rating members are optionally kept anonymous, but in any case they are known to be members in good standing.

4.6.2.1 Basic Flows

[TBD: figure]

4.6.2.1.1 Pre-conditions

4.6.2.1.2 Post-conditions

4.6.3 Input

Scores stated by the vouching members, together with the frequency and recency of the vouching members' activity and interactions.

4.6.4 Output

Personal score value.
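One way to use the recency and frequency inputs above is to weight each vouching member's score by how recently and how often that member is active. The half-life and weighting function below are assumptions, shown only as a sketch:

```python
# Illustrative sketch: recency- and frequency-weighted personal score.
# Half-life and the log weighting are assumptions, not part of the use-case.
import math

def peer_score(vouches, half_life_days=30.0):
    """vouches: list of (score, days_since_last_activity, interactions).
    Recent, active members count more toward member A's personal score."""
    num = den = 0.0
    for score, age_days, interactions in vouches:
        w = math.exp(-age_days * math.log(2) / half_life_days)
        w *= math.log1p(interactions)   # frequent raters weigh more
        num += w * score
        den += w
    return num / den if den else 0.0

assert peer_score([]) == 0.0
assert peer_score([(1.0, 0, 10), (0.0, 365, 10)]) > 0.5  # stale vote fades
```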

4.7 Digital Signature (Signing Key) Reputation

4.7.1 Actors

Key holders

4.7.2 Description

Signers in a web of trust sign keys to express trust in the assertion that the key belongs to the holder's name (subjectname and/or subjaltname) contained in the digital certificate. The more people sign, the greater the trust in that assertion. Note that the only assertion subject to reputation is that the key belongs to the named individual; nothing is asserted about the trustworthiness of this individual.

4.7.2.1 Basic Flows

[TBD: figure]

4.7.2.1.1 Pre-conditions

4.7.2.1.2 Post-conditions

4.7.3 Input

Number of signing keys and their reputations; CRL for signing keys.

4.7.4 Output

Key trust value
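A non-normative sketch of deriving a key-trust value from the inputs above (signing keys, their reputations, and a CRL) follows. The saturation curve is an assumption; real web-of-trust metrics (e.g., PGP's) differ in detail:

```python
# Illustrative sketch: key trust grows with the number and reputation of
# unrevoked signers, saturating toward 1. The curve is an assumption.
def key_trust(signers, crl):
    """signers: {signer_key_id: signer_reputation in [0, 1]};
    crl: set of revoked signer key ids (their signatures are ignored)."""
    evidence = sum(rep for kid, rep in signers.items() if kid not in crl)
    return evidence / (evidence + 1.0)   # in [0, 1), grows with signers

assert key_trust({}, set()) == 0.0
assert key_trust({"a": 1.0}, {"a"}) == 0.0      # revoked signer ignored
assert key_trust({"a": 1.0, "b": 1.0}, set()) > key_trust({"a": 1.0}, set())
```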

4.8 Peer Reputation in P2P Networks

4.8.1 Actors

Nodes in a peer-to-peer network.

4.8.2 Description

Internet service providers may use the following reputation-based fairness criterion to regulate bandwidth allocation according to observed usage behavior: nodes in a P2P network gain download bandwidth according to their upload behavior. Essentially, a bandwidth economy is maintained.

4.8.2.1 Basic Flows

[TBD: figure]

4.8.2.1.1 Pre-conditions

4.8.2.1.2 Post-conditions

4.8.3 Input

Average upload bandwidth for a node, number of files uploaded, and similar upload metrics.

4.8.4 Output

Node download bandwidth.
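The "bandwidth economy" above can be sketched as follows. The floor, cap, and multipliers are assumptions chosen only to illustrate the shape of the policy:

```python
# Non-normative sketch: a node's download allowance tracks its observed
# upload behavior. Floor, cap, and ratios are assumptions.
def download_allowance(avg_upload_kbps, files_uploaded,
                       floor_kbps=64, cap_ratio=4.0):
    """Grant download bandwidth proportional to upload contribution,
    with a small floor for newcomers and a cap on the total grant."""
    contribution = avg_upload_kbps * min(1.0 + files_uploaded / 100.0, 2.0)
    return max(floor_kbps, min(contribution * cap_ratio, 10_000))

assert download_allowance(0, 0) == 64        # newcomer gets the floor
assert download_allowance(100, 0) == 400.0   # 100 kbps uploader earns 4x
```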

4.9 Seller Reputation

4.9.1 Actors

Sellers and buyers in an e-commerce system.

4.9.2 Description

Buyers vote on the trustworthiness/quality of sellers.

4.9.2.1 Basic Flows

[TBD: figure]

4.9.2.1.1 Pre-conditions

4.9.2.1.2 Post-conditions

4.9.3 Input

Buyers rate the sellers (potentially also the prices of items bought and the buyer-reputations of the voters).

4.9.4 Output

Seller reputation as a percentage.
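The percentage output above, optionally weighted by each voter's buyer-reputation as the inputs allow, can be sketched as follows; the rating representation is an assumption:

```python
# Illustrative sketch: seller reputation as a weighted percentage of
# positive buyer ratings. The (positive, weight) pairing is an assumption.
def seller_reputation(ratings):
    """ratings: list of (positive: bool, voter_buyer_reputation in [0, 1]).
    Returns the weighted percentage of positive ratings."""
    total = sum(w for _, w in ratings)
    if total == 0:
        return 0.0
    positive = sum(w for pos, w in ratings if pos)
    return 100.0 * positive / total

assert seller_reputation([]) == 0.0
assert seller_reputation([(True, 1.0), (False, 1.0)]) == 50.0
```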

4.10 Reputee Influence: Social & Professional Networks

This class of use-cases deals with reputee-influenced criteria in social and professional networks.

4.10.1 Actors

1. Customers or users relating to a professional and/or company (reputers)
2. Professional and/or company being evaluated (reputee)
3. Reputation Service Provider (RSP)

4.10.2 Description

A specific aspect is that reputers, reputees, and the reputation service provider may determine the criteria to be evaluated. Both reputers and reputees may apply their respective weightings, allowing the reputation service provider to calculate overall ratings and rankings of professionals and/or companies within a specific business segment.

4.10.2.1 Basic Flows

[TBD: figure]

4.10.2.1.1 Pre-conditions

4.10.2.1.2 Post-conditions

4.10.3 Input

Scores on specific criteria by reputers, processed by the reputation service provider to facilitate relevancy and avoid fraud.

4.10.4 Output

Reputer- as well as reputee-biased consolidated score.
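The two-sided weighting described above can be sketched as an RSP blending reputer and reputee weightings over agreed criteria. All criterion names and numbers are assumptions:

```python
# Non-normative sketch: RSP consolidates per-criterion scores under
# combined reputer and reputee weightings. The blend is an assumption.
def consolidated_score(criterion_scores, reputer_weights, reputee_weights):
    """criterion_scores: {criterion: mean reputer score in [0, 1]};
    the weight dicts map criteria to non-negative weights (default 1)."""
    num = den = 0.0
    for c, s in criterion_scores.items():
        w = reputer_weights.get(c, 1.0) + reputee_weights.get(c, 1.0)
        num += w * s
        den += w
    return num / den if den else 0.0

scores = {"punctuality": 0.9, "price": 0.5}
# Emphasizing the stronger criterion raises the consolidated score.
assert consolidated_score(scores, {"punctuality": 3.0}, {}) > \
       consolidated_score(scores, {}, {"price": 3.0})
```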

4.11 Adaptive Trust: Enterprise Unified Communications (UC)

4.11.1 Actors

1. Reputees: email/IM/VoIP/... UC clients
2. Reputers: Enterprise UC services.
3. Reputation Service Providers: Enterprise Policy Framework, through agents (XML or VoIP gateways) or enterprise UC servers.

4.11.2 Description

Intrusion- and SPAM-detection agents monitor authorized behavior and score down clients (reputees) based on patterns of policy violations. They score back up to the default level when behavior is in line with policy. Reputers (UC services) deliver service based on the level of the current trust-score. The services themselves (policy enforcement points) may not be directly involved in interpreting scores; the reputee's access privileges may be modulated by Policy Decision Points.

4.11.2.1 Basic Flows

[TBD: figure]

4.11.2.1.1 Pre-conditions

4.11.2.1.2 Post-conditions

4.11.3 Input

The enterprise policy against which to measure behavior: patterns of policy violation or compliance.

4.11.4 Output

Trust-levels (authorizations) mapped to a numeric scale or role.
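The score-down/recover behavior described above can be sketched as follows; the penalty and recovery step sizes are assumptions:

```python
# Illustrative sketch of adaptive trust: the score drops on policy
# violations and recovers toward (never past) the default level during
# compliant intervals. Step sizes are assumptions.
DEFAULT_LEVEL = 1.0

def update_trust(score, violations, compliant_intervals,
                 penalty=0.2, recovery=0.05):
    """Apply violation penalties, then recover toward the default level."""
    score = max(0.0, score - penalty * violations)
    score = min(DEFAULT_LEVEL, score + recovery * compliant_intervals)
    return score

s = update_trust(DEFAULT_LEVEL, violations=3, compliant_intervals=0)
assert abs(s - 0.4) < 1e-9                        # scored down
assert update_trust(s, 0, 100) == DEFAULT_LEVEL   # recovery is capped
```

A Policy Decision Point would then map such a score onto the authorization levels or roles named in the output.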

4.12 Federated Trust in UC

This is a more complex variant of the adaptive-trust UC use-case. There exists a two-tier reputation system: two RSPs are peered to exchange reputation-leading events. The RSPs' trust of each other may also be impacted.

4.12.1 Actors

1. Reputee (C): remote UC client (e.g., a Yahoo or Google client).
2. Reputer (D): the called/emailed/IMed/... enterprise-destination UC service.
3. RSP: Enterprise Policy Framework, through agents (XML or VoIP gateways) or enterprise UC servers.

4.12.2 Description

1. D detects a pattern of abuse (SPAM/SPIT/SPIM) and reports to the peer (e.g., DKIM/SIP) server S hosting C.
2. S may gather similar inputs on C and be its RSP.
3. D may provide a trust-score and be (one of C's) RSPs.
4. S may combine scores/reports and be a two-tier RSP. A possible hierarchical-RSP scenario is implicit in (3).
5. This may result in RSP action by S similar to the generic Adaptive Trust UC use-case.

4.12.2.1 Basic Flows

[TBD: figure]

4.12.2.1.1 Pre-conditions

4.12.2.1.2 Post-conditions

4.12.3 Input

S scores the trust value of C; optionally, D scores as in (3). S reacts to the score. D may act independently of S's scoring (it may rely on its own internal trust-score or input).

4.12.4 Output

The following are possible outputs besides the reputation trust-score:

- Step [2] leads to a report/alert of a trust-related event.
- Steps [3] and [4] provide data or a trust-score. There is a contractual or baselined trust-level between every S and D (federation).
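Steps (2) through (4) above, where S acts as a two-tier RSP, can be sketched as S blending its own observations of C with the score reported by the peer RSP D, discounted by S's trust in D. The linear blend is an assumption:

```python
# Non-normative sketch: two-tier score combination. S's local score for C
# is blended with peer RSP D's score, weighted by S's trust in D.
def two_tier_score(own_score, peer_score, trust_in_peer):
    """All values in [0, 1]; an untrusted peer contributes nothing."""
    return ((1.0 - trust_in_peer) * own_score
            + trust_in_peer * peer_score)

assert two_tier_score(0.8, 0.2, 0.0) == 0.8   # untrusted peer ignored
assert two_tier_score(0.8, 0.2, 1.0) == 0.2   # fully trusted peer dominates
```

The federation's contractual baseline trust-level would supply the initial value of trust_in_peer.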

4.13 Peer-peer Reputation (between actors)

4.13.1 Actors

Reputees: both participants in an electronic messaging exchange between people.
Reputer: messaging client or server.

4.13.2 Description

Two people communicate electronically (e.g., via email or IM).

4.13.2.1 Basic Flows

[TBD: figure]

4.13.2.1.1 Pre-conditions

4.13.2.1.2 Post-conditions

4.13.3 Input

Inputs to the reputation evaluation engine are the communication content itself (text/content analysis of the message's basic intent, e.g., request, offer, commitment/promise, question, answer, notice) as well as latency and frequency of interaction.

4.13.4 Output

Relative peer reputation and/or social capital each party has accumulated in the relationship.


5 Security and Privacy Considerations

As in any open system, there are threats and vulnerabilities to be analyzed in the Reputation Management System, both because of and notwithstanding the Reputation Management System itself being a service built on a web of trust.

5.1 Threat Taxonomy

Specifically, [enisasec] refers to a slew of threats; many of these are captured here for their possible relevance to some or all of the discussed use-cases.

5.1.1 Whitewashing Attack

The adversary resets a poor reputation by rejoining the system with a new identity. Systems that allow for easy change of identity and easy use of new pseudonyms are vulnerable to this attack.

5.1.2 Sybil Attack

The adversary creates multiple identities (sybils) and exploits them in order to manipulate a reputation score.

5.1.3 Impersonation and Reputation Theft

One entity acquires the identity of another entity (masquerades) and consequently steals her reputation.

5.1.4 Bootstrap Issues and Related Threats

The initial reputation value given to a newcomer may lay it open to threats such as Sybil and whitewashing attacks.

5.1.5 Extortion

Coordinated campaigns aimed at blackmail by damaging reputation for malicious motives.

5.1.6 Denial of Reputation

An attack designed to damage an entity's reputation (e.g., in combination with a Sybil attack or impersonation) and create an opportunity for blackmail in order to have the reputation cleaned.

5.1.7 Ballot-stuffing and Bad-mouthing

Reporting of a false reputation score: the attackers (distinct identities or sybils) collude to give positive/negative feedback in order to increase or lower a reputation.

5.1.8 Collusion

Multiple users conspire (collude) to influence a given reputation.

5.1.9 Repudiation of Data or Transactions

An entity can deny that a transaction happened, or deny the existence of data for which it was responsible.

5.1.10 Dishonest Reputer

The voter is not trustworthy in his/her scoring.


5.1.11 Privacy Threats for Voters and Reputation Owners

Reputers and reputation-system owners may be unwilling or unable to provide explicitly honest inputs for fear of reprisal or backlash from (an apparently powerful) reputee. Anonymity offers a safe haven for accurate voting under these circumstances; for example, anonymity improves the accuracy of votes.

5.1.12 Social Threats

Discriminatory behavior is possible when, for example, in a second-order reputation system, an entity can choose to co-operate only with peers who have a high reputation, so that their recommendations weigh more heavily. Other possible social threats include the risk of herd behaviour, the penalisation of innovative or controversial opinions, and the vocal-minority effect.

5.1.13 Threats to the Lower Network Layers

The reputation system can be attacked by targeting the underlying infrastructure: for example, reputation information can be manipulated, replayed, or disclosed both when stored and when transported, or may be made unavailable by a denial-of-service attack.

5.1.14 Trust Topology Threats

An attack targets certain links to have maximum effect, for example those entities with the highest reputation.

5.1.15 Threats to Ratings

There is a whole range of threats to reputation ratings which exploit features of the metrics used by the system to calculate the aggregate reputation rating from the individual scores.

5.2 Countermeasures

5.3 Privacy Considerations

5.3.1 Privacy of Reputee

Reputation data (input and score) SHOULD NOT include information (meta-data) that relates to Personal Information (PI) and Personally Identifiable Information (PII).

5.3.2 Privacy of Reputer

The Portable Reputation Format should provide for and preserve the anonymity, where desired or required, of the reputation provider from the reputation consumer and the reputee. The implication is that while the reputation calculator needs authentic information about the identity of the reputation input provider, audit and compliance requirements will still need to record the identity of the input source.

5.3.3 Privacy Protection between Reputers

The potential for reputers being influenced, in specific instances, by other reputers is also detrimental to the integrity and accuracy of the reputation input.


A. Acknowledgements

The following individuals have participated in the creation of this specification and are gratefully acknowledged:

Participants:

[Participant Name, Affiliation | Individual Member]
[Participant Name, Affiliation | Individual Member]


B. Non-Normative Text


C. Revision History

[optional; should not be included in OASIS Standards]

Revision | Date              | Editor          | Changes Made
0.1      | 17 September 2008 | Mahalingam Mani | Initial version.
0.2      | 30 September 2008 | Mahalingam Mani | Updates to use-case sections; introduction to the reference model based on initial TC discussions. Also introduced the security and privacy considerations section.
0.21     | 29 October 2008   | Mahalingam Mani | Expanded on the reference model and security considerations. Refined use-cases text.
0.22     | 10 November 2008  | Mahalingam Mani | Reference model and terminology changes based on discussion of 10 Nov 2008, AM.