ORMS: Use-cases Version 0.24b


ORMS TC Use-Cases, v0.1    21 Jan 2009
Copyright © OASIS® 2008. All Rights Reserved.



ORMS: Use-cases
Version 0.24b
Working Draft, 21 January 2009

Specification URIs:

This Version:
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].html
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].doc
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].pdf

Previous Version:
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].html
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].doc
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].pdf

Latest Version:
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].html
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].doc
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].pdf

Latest Approved Version:
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].html
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].doc
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].pdf

Technical Committee:
OASIS Open Reputation Management Systems (ORMS) TC

Chair(s):
Anthony Nadalin
Nat Sakimura

Editor(s):
Mahalingam Mani

Related work:

This specification replaces or supersedes:
[specifications replaced by this standard]

This specification is related to:
[related specifications]

Declared XML Namespace(s):
[list namespaces here]

Abstract:

Towards arriving at a standard protocol for exchanging reputation information between reputation data providers and consumers, and a portable reputation data and meta-data format, a reference model is described. The model is evaluated and validated with use-cases to arrive at requirements for a portable reputation provider data format that ensures openness in ownership, privacy and confidentiality protection, and management of reputation data.

Status:

This document was last revised or approved by the [TC name | membership of OASIS] on the above date. The level of approval is also listed above. Check the “Latest Version” or “Latest Approved Version” location noted above for possible later revisions of this document.



Technical Committee members should send comments on this specification to the Technical Committee’s email list. Others should send comments to the Technical Committee by using the “Send A Comment” button on the Technical Committee’s web page at http://www.oasis-open.org/committees/orms/.

For information on whether any patents have been disclosed that may be essential to implementing this specification, and any offers of patent licensing terms, please refer to the Intellectual Property Rights section of the Technical Committee web page (http://www.oasis-open.org/committees/orms/ipr.php).

The non-normative errata page for this specification is located at http://www.oasis-open.org/committees/orms/.



Notices

Copyright © OASIS® 2007. All Rights Reserved.

All capitalized terms in the following text have the meanings assigned to them in the OASIS Intellectual Property Rights Policy (the "OASIS IPR Policy"). The full Policy may be found at the OASIS website.

This document and translations of it may be copied and furnished to others, and derivative works that comment on or otherwise explain it or assist in its implementation may be prepared, copied, published, and distributed, in whole or in part, without restriction of any kind, provided that the above copyright notice and this section are included on all such copies and derivative works. However, this document itself may not be modified in any way, including by removing the copyright notice or references to OASIS, except as needed for the purpose of developing any document or deliverable produced by an OASIS Technical Committee (in which case the rules applicable to copyrights, as set forth in the OASIS IPR Policy, must be followed) or as required to translate it into languages other than English.

The limited permissions granted above are perpetual and will not be revoked by OASIS or its successors or assigns.

This document and the information contained herein is provided on an "AS IS" basis and OASIS DISCLAIMS ALL WARRANTIES, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY WARRANTY THAT THE USE OF THE INFORMATION HEREIN WILL NOT INFRINGE ANY OWNERSHIP RIGHTS OR ANY IMPLIED WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.

OASIS requests that any OASIS Party or any other party that believes it has patent claims that would necessarily be infringed by implementations of this OASIS Committee Specification or OASIS Standard, to notify OASIS TC Administrator and provide an indication of its willingness to grant patent licenses to such patent claims in a manner consistent with the IPR Mode of the OASIS Technical Committee that produced this specification.

OASIS invites any party to contact the OASIS TC Administrator if it is aware of a claim of ownership of any patent claims that would necessarily be infringed by implementations of this specification by a patent holder that is not willing to provide a license to such patent claims in a manner consistent with the IPR Mode of the OASIS Technical Committee that produced this specification. OASIS may include such claims on its website, but disclaims any obligation to do so.

OASIS takes no position regarding the validity or scope of any intellectual property or other rights that might be claimed to pertain to the implementation or use of the technology described in this document or the extent to which any license under such rights might or might not be available; neither does it represent that it has made any effort to identify any such rights. Information on OASIS' procedures with respect to rights in any document or deliverable produced by an OASIS Technical Committee can be found on the OASIS website. Copies of claims of rights made available for publication and any assurances of licenses to be made available, or the result of an attempt made to obtain a general license or permission for the use of such proprietary rights by implementers or users of this OASIS Committee Specification or OASIS Standard, can be obtained from the OASIS TC Administrator. OASIS makes no representation that any information or list of intellectual property rights will at any time be complete, or that any claims in such list are, in fact, Essential Claims.

The names "OASIS", [insert specific trademarked names and abbreviations here] are trademarks of OASIS, the owner and developer of this specification, and should be used only to refer to the organization and its official outputs. OASIS welcomes reference to, and implementation and use of, specifications, while reserving the right to enforce its marks against misleading uses. Please see http://www.oasis-open.org/who/trademark.php for above guidance.




Table of Contents

1 Introduction
  1.1 Terminology
    1.1.1 ORMS Definitions
  1.2 Normative References
  1.3 Non-Normative References
2 Overview
  2.1 Reputation Input Data
3 ORMS Reference Model
  3.1 A Mathematical Model for reputation Trust Score
    3.1.1 Reputation Calculation
4 ORMS Portable Format and Protocol Requirements
  4.1 Reputation Result Portable Format Requirements
  4.2 Protocol Requirements
5 Use-cases (each use-case below includes Actors, Description, Input and Output subsections)
  5.1 OpenID in Trusted Exchange
  5.2 IdP (Identity Provider) Reputation Service
  5.3 Content Filtering
  5.4 Second Life Avatars
  5.5 Nodes in Second Life Grid
  5.6 Social-network derived Peer reputation
  5.7 Digital Signature (signing key) reputation
  5.8 Peer Reputation in P2P Networks
  5.9 Seller Reputation
  5.10 Reputee Influence: Social & Professional Networks
  5.11 Adaptive Trust: Enterprise unified communications (UC)
  5.12 Federated Trust in UC
  5.13 Peer-peer reputation (between actors)
6 Security and Privacy considerations
  6.1 Threat taxonomy
    6.1.1 Whitewashing Attack
    6.1.2 Sybil Attack
    6.1.3 Impersonation and Reputation Theft
    6.1.4 Bootstrap issues and related threats
    6.1.5 Extortion
    6.1.6 Denial of Reputation
    6.1.7 Ballot-stuffing and bad-mouthing
    6.1.8 Collusion
    6.1.9 Repudiation - of data, transaction
    6.1.10 Dishonest Reputer
    6.1.11 Privacy threats for voters and reputation owners
    6.1.12 Social threats
    6.1.13 Threats to the lower network layers
    6.1.14 Trust topology threats
    6.1.15 Threats to ratings
  6.2 Countermeasures
  6.3 Privacy considerations
    6.3.1 Privacy of Reputee
    6.3.2 Privacy of Reputer
    6.3.3 Privacy protection between Reputers
A. Acknowledgements
B. Non-Normative Text
C. Revision History





1 Introduction

Social and corporate networking interactions in the Internet age have given rise to exponential growth in real-time and asynchronous communications. Open, good-faith protocols and networks are now increasingly exposed to the threats and exploits of the community.

Moreover, corporate networks and social networks are required to deal with a range of users whose roles and privileges vary dynamically in time and (network) domain, requiring corporations to adjust to wired and wireless networks, traditional and virtually-extended perimeters, extranets, federations and partner-portals involving a considerable degree of transitive trust.

A framework is required to

- identify and qualify accidental, well-behaved and malicious privilege/usage patterns, and
- quantify (or trust-score) the above patterns to facilitate (social and corporate network) services' adoption of trust levels and authorized access to resources.

An interoperable trust-scoring mechanism is required to relate the trust scores from multiple providers.
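To illustrate what relating trust scores from multiple providers can involve, a minimal sketch follows. It is illustrative only: the provider names, their score ranges, and the min-max normalization choice are assumptions for the example, not something ORMS specifies.

```python
# Sketch: relating trust scores from providers that use different scales.
# Providers and their score ranges are hypothetical examples.
PROVIDER_RANGES = {
    "auction-site": (0.0, 5.0),    # star-rating style score
    "p2p-network": (-1.0, 1.0),    # net up/down vote score
    "idp-service": (0.0, 100.0),   # percentage-style score
}

def normalize(provider: str, score: float) -> float:
    """Map a provider-specific score onto a common [0, 1] scale."""
    lo, hi = PROVIDER_RANGES[provider]
    return (score - lo) / (hi - lo)

# Scores reported on different scales for the same reputee become comparable.
scores = {"auction-site": 4.5, "p2p-network": 0.8, "idp-service": 90.0}
common = {p: normalize(p, s) for p, s in scores.items()}
```

Once on a common scale, the three scores above all land at 0.9 and can be compared or combined by a consumer; any real interoperability scheme would additionally have to agree on what the scale means.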

This document describes use-cases in varying scenarios based on existing e-commerce transaction systems, social networks, and converged communications scenarios ranging from corporate enterprise networks to peer-peer networks.

The use-case section is preceded by ORMS Terminology and Overview sections, as well as a Reference Model that aids the discussion of the use-cases.

1.1 Terminology

The key words “MUST”, “MUST NOT”, “REQUIRED”, “SHALL”, “SHALL NOT”, “SHOULD”, “SHOULD NOT”, “RECOMMENDED”, “MAY”, and “OPTIONAL” in this document are to be interpreted as described in [RFC2119].

1.1.1 ORMS Definitions

Actor

A participating entity in a transaction. For example, in the Reputation Systems context, the reputation scoring-service (Provider or reputer) is an actor, the service using the Reputation Provider (relying party) is an actor, and the client being evaluated (reputee) is an actor.

Avatar

An Avatar is an incarnation, embodiment or virtual manifestation of an actor’s profile in a Social or Professional Network community.

Alternate definition: a computer user's representation of himself or herself, whether in the form of a three-dimensional model used in computer games, a 2-dimensional icon (picture) used on internet forums and other communities, or a text construct found on early systems [1].

[1] An example is MUD: Multi-User Domain.

Reputation Systems

Using the Internet’s bi-directional communication capabilities in order to artificially engineer large-scale word-of-mouth networks where actors share opinions and experiences on a wide range of topics, including companies, products, services, and even world events. (Dellarocas (2005))

Reputation

Reputation is a subjective evaluation of the assertion about a subject being true, based on factual and/or subjective data about the subject, and is used as one of the factors for establishing trust in that subject for a specific purpose.

Reputation Score

A Reputation Score of a Player (Reputee) on the Type (Criteria) by other players (Reputor) is the subjective probability assigned by the Reputor that the Reputee fulfils the Criteria. (Sakimura (2008))
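As a concrete reading of this definition, one could estimate such a subjective probability by pooling the judgments of several Reputors. The sketch below is purely illustrative (the weighted-average pooling and the weights themselves are assumptions, not an ORMS-specified algorithm):

```python
# Sketch: estimating a reputation score as a subjective probability that
# a Reputee fulfils a Criteria, pooled from several Reputors' judgments.
# The weighted-average scheme is an illustrative assumption.

def reputation_score(judgments: list[tuple[float, float]]) -> float:
    """judgments: (probability, reputor_weight) pairs.

    probability    -- a Reputor's subjective probability, in [0, 1]
    reputor_weight -- how much that Reputor's judgment counts
    Returns the weighted average, itself a value in [0, 1].
    """
    total_weight = sum(w for _, w in judgments)
    if total_weight == 0:
        raise ValueError("at least one non-zero weight is required")
    return sum(p * w for p, w in judgments) / total_weight

# Three Reputors rate "this seller ships on time" with differing confidence.
score = reputation_score([(0.9, 2.0), (0.8, 1.0), (0.5, 1.0)])
```

Note that the output is still a subjective probability in the sense above: it reflects the pooled belief of the Reputors, not an objective fact about the Reputee.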

Reputation domain

The encompassing domain where a reputation is defined (to be refined).

Reputation Compute-Engine (or RCE)

A reputation for an entity is computed using a Reputation Correlator, based on different sources of input data about the entity. The reputation correlator accepts one or more input data about the entity, and evaluates a reputation score based on its algorithms and contextual information available at the time of computation.
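A minimal sketch of such a correlator follows. It is illustrative only: the input source names, the linear-combination algorithm, and the use of contextual weights are assumptions made for the example, not behavior mandated by ORMS:

```python
# Sketch of a Reputation Compute-Engine (RCE) correlator: accepts one or
# more input data about an entity and evaluates a score using its own
# algorithm plus contextual information available at computation time.

def correlate(inputs: dict[str, float],
              context_weights: dict[str, float]) -> float:
    """Combine normalized input-data values (each in [0, 1]) into one score.

    inputs          -- e.g. {"peer_feedback": 0.8, "transaction_history": 0.6}
    context_weights -- contextual information: how much each source counts
                       in the current domain (keys must cover `inputs`).
    """
    weighted = sum(inputs[k] * context_weights[k] for k in inputs)
    return weighted / sum(context_weights[k] for k in inputs)

# Example context: in e-commerce, transaction history outweighs feedback.
score = correlate(
    {"peer_feedback": 0.8, "transaction_history": 0.6},
    {"peer_feedback": 1.0, "transaction_history": 3.0},
)
```

The same inputs under a different set of contextual weights would yield a different score, which is exactly the point of the definition: the contextual information available at computation time is part of the result.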

Contextual Information

Criteria used as functions in the RCE to evaluate Reputation output.

Reputation algorithm

A domain-specific algorithm for computing reputations. A reputation algorithm is designed taking into account the characteristics of the encompassing domain: topology (centralized or distributed reputation computation), entities to which the reputation is associated, entities that produce input data, entities that consume reputations, available types of input data, the type of contextual information available, and the desired properties of the computed reputation (robustness, fairness, etc.).

Reputation (input) data

The data upon which the reputation is computed.

Reputation Management System

(to be refined) A reputation management system may include mechanisms for: collecting data about entities (generating data inputs or integrating external data); computing reputations; making sure the system is fair (e.g. providing bootstrapping mechanisms for new entities); performing actions based on reputations (e.g. trust computations, automatic decisions); revoking reputations, allowing entities legitimate control over their reputation and allowing entities to challenge their reputations (governance); making sure the system is not abused (security); and making sure the privacy of entities is respected (i.e. that the association between entity and reputation is only disclosed to authorized parties).

RP

Relying Party. (See Reputee in the context of OpenID.)

OP

OpenID Provider. The Reputation Compute-Engine in the OpenID model.

Reputer

The reputation-scoring service.

Reputee

The actor whose reputation score is being considered or calculated.

RSP

Reputation Service Provider.



VoIP

Voice over Internet Protocol.

UC

Unified Communications, a term denoting all forms of call and multimedia/cross-media message-management functions controlled by an individual user for both business and social purposes [2].

1.2 Normative References

[RFC2119]    S. Bradner, Key words for use in RFCs to Indicate Requirement Levels, http://www.ietf.org/rfc/rfc2119.txt, IETF RFC 2119, March 1997.

[Reference]  [Full reference citation]

1.3 Non-Normative References

[OpenIDW]    OpenID Wikipedia Page

[OpenID]     OpenID Community Website

[Dellarocas] Dellarocas, C., 2005, "Reputation Mechanisms".

[Wilson]     Wilson, R., 1985, Reputations in Games and Markets. A. Roth, ed. Game-Theoretic Models of Bargaining, Cambridge University Press, Cambridge, UK, 27-62.

[Sakimura]   Sakimura, N., 2008, "What is Reputation?"

[veracite]   Veracite Research Project (IBM)

[enisasec]   Reputation-based Systems: A security analysis: ENISA position paper, E. Carrara, Giles Hogben, October 2007










[2] This definition, from the International Engineering Consortium, is the most generic among many minor industry variants.



2 Overview

The use of reputation systems has been proposed for various applications, such as:

- Validating the trustworthiness of sellers and buyers in online auctions (e-commerce websites have proved this can have a large influence on sellers)
- Detecting free riders in peer-to-peer networks
- Ensuring the authenticity of signature keys in a web of trust
- Smarter searching of web sites, blogs, events, products, companies and other individuals

Reputation is a metric (a score, a rank, a state, a multi-dimensional profile, etc.) associated to an entity (a person, a business, a digital identity, a website, a system, a device, a category of devices, a computing resource, etc.) or to a tuple [entity, attribute(s)] (e.g. [person, skill]) in a particular domain and at a particular moment in time.

112

Reputation in

these examples
refers to the opinions about an entity, from others. Reputation is one of the
113

factors upon which trust can be based through the use of verifiable claims. Reputation changes with time
114

and i
s used within a context. Trust and reputation are related to a context.

115

There are various methods for generating user's reputation data or trustworthiness. Some methods are
116

based on user's feedback through appropriate feedback channels. Other methods inclu
de having viewers
117

participate in the reputation
-
building process through the user's profile at specific sites and communities.
118

Each method has its limitations in terms of its susceptibility to bad actors, manipulation of data for specific
119

purposes, and spa
mmers.

2.1 Reputation Input Data

This is the data upon which the reputation is computed. Input data can be of different types, for example:

• Subjective data about the entity: e.g. ratings and feedback from peers, claims and endorsements
• Static and dynamic characteristics of the entity: e.g. demographics, preferences
• Behavior data, stemming from measurements and observations within a system: e.g. logs of the entity's past actions, history of interactions, parametric data
• "Real world" data about the entity: e.g. background checks, credit history
• Inferred data about an entity: e.g. text analytics

130

ORMS TC Use
-
Cases, v0.1


21

Jan

200
9

Copyright
©

O
ASIS® 2008
. All Rights Reserved.


Page
11

of
29


3 ORMS Reference Model

The following figure represents a generalized reputation model.

The primary components of the reference model are:

1. Input sources: Reputation input Data Collectors (RDC).
2. Reputation Data (RD): portable reputation data fed by all input sources into a Reputation Computation Engine (RCE, or reputation calculator).
3. Reputation Context (RC): allows filtering and qualifying the right choice of algorithms to use and pre-process.
4. Reputation Score (RS): the outcome of the reputation evaluation of an entity (intended to be portable).
5. Reputation Consumer (Reputee): consumer of the reputation score, used as a yardstick for computing the degree of trust for the entity it serves.

Thus, the primary objective and challenge is to make the reputation input data and reputation formats interoperable (portable) across vendor boundaries and domains of reputation.

[Figure: multiple input data collection/generation sources - subjective data, individual & demographic data, observed data, real-world data, and inferred data about an entity E - feed a Reputation Computer together with a Context, inside the Reputation System boundary. The (local) reputation of input entities is published (pub-sub) as Portable Reputation, which external consuming parties take as data input for their own computation of trust.]

Figure 1: Generalized Reputation Model




Figure 2: Reputation Reference Model [to be edited as we go along - ed]

3.1 A Mathematical Model for Reputation Trust Score

149

3.1.1 Reputation Calculation

A reputation calculation made by a Reputation Authority i, on an assertion A about a subject s on criteria c being true, is a subjective mapping R_i(A(s,c)) of a set of input data x within an input information set X to a set y within a reputation representation set Y, i.e.,

    R_i(A(s,c)): X -> Y

where

    s           := subject identifier
    c           := criteria identifier
    A(s,c)      := an assertion about c on s being true
    X           := {x | set of all input data, including y in Y}
    Y           := {y | set of all possible reputation calculation results}
    R_i(A(s,c)) := Reputor i's mapping of (multiple) input data to Y on the assertion A(s,c)

As a result of the above, one may further consider Reputation to be mathematically defined as:

    Reputation is the result of the reputation calculation, i.e., y in Y, that can be used as one of the factors for establishing trust in that entity for a specific purpose.



At a conceptual level, the calculation can be any mapping of a set in the Domain to another set in the Range; in other words, it could be conceived as a projection of a set in a Domain hyperplane onto the Range hyperplane. However, interpreting the end result for an arbitrary Range would be extremely difficult. Thus, the following mathematical notation of a Reputation Score, as defined below, is proposed, with Y defined accordingly:

    A Reputation Score, z in Z, is a subjective probability assigned by the Reputation Engine to the assertion being true.

Thus, the Reputation Score will be in the range [0,1] or, in percentage terms, [0,100], i.e., Z := {z | real number between 0 and 100}. This has the property of being intuitively easy for the consumer: for two subjects m and n, if m is more likely to fit the assertion than n, then z_m > z_n, where z_m is the reputation score of m.

Then, a representation of Reputation, y, can be created as an information set around z, as described in Section [4.1] on format requirements.
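As an illustrative, non-normative sketch of this definition: if the published distribution is normal (as in the sample values tabulated in Section 4.1), the percentage-form display score is simply the cumulative probability of the raw score under that distribution. The function name and signature below are assumptions for illustration, not part of any ORMS format:

```python
import math

def display_score(raw_score: float, mean: float, std_dev: float) -> float:
    """Cumulative distribution of the raw score under an assumed normal
    distribution, expressed in percentage form [0, 100] - i.e., a subjective
    probability z in Z. Illustrative only, not normative."""
    z = (raw_score - mean) / std_dev
    # Standard normal CDF expressed via the error function.
    return 100.0 * 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# With the sample values from the portable-format table
# (Score 56.8, Mean 50, StandardDeviation 4.1):
print(round(display_score(56.8, 50.0, 4.1), 1))  # ~95.1
```

Note how this yields the monotonicity property above: if m's raw score exceeds n's under the same distribution, then z_m > z_n.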



4 ORMS Portable Format and Protocol Requirements

A reputation system may accept:

• Raw input: the format of the input is unspecified and is domain-specific. This input may be expressed in RDF and/or XDI.
• Portable formatted input

An XML encoding format SHALL be defined for the reputation format. An ASN.1 format MAY also be supported.

4.1 Reputation Result Portable Format Requirements

The reputation score described above requires, in itself, some statistical qualities as a portable reputation, so that the reputation consumer can make inferences based on it. There are a number of desired properties that should be transmitted with the score for it to be useful.

From the discussion, the resulting XML needs to have at least the following:

1. The reputation result XML needs an identifier of the party being scored. This may include PII (e.g., a Social Security Number), so it may be wise to mandate that this be a hash(identifier, salt). => Protocol consideration.
2. Likewise, an identifier of the party doing the scoring.
3. The criteria for which this reputation score was made.
4. The input data range.
5. For the reputation to be aggregatable, it has to have a distribution whose aggregated distribution is known (such as a normal distribution).
6. The information about the distribution - which distribution, its mean, and its standard deviation - must be published together with the score.
7. The display score should be intuitive for an average person.
8. The date that the score was made.
9. A signature by the score maker, so that the result is tamper-proof.

The above requirements, along with some others, are captured in [Table 1].

To be a meaningful Portable Reputation Data representation - i.e., the representation of y to be transmitted out of the local reputation scope - the file should contain at least the following.

Item                | Type         | Example                           | Remarks
ReputationID        | XRI/URI      | @myRS/=nat/+goodSeller/20081110   | Unique ID of this file.
SubjectID           | XRI/URI      | =nat                              | Identifier of the Subject being scored.
ReputationServiceID | XRI/URI      | @myRS                             | Identifier of the Reputation Authority/Engine.
AssertionID         | XRI/URI      | @myRS/+goodSeller                 | Identifier of the assertion template which forms an assertion (criteria) when SubjectID is combined with it.
RequesterID         | XRI/URI      | =john                             | Identifier of the requesting party. Included to make the source detectable in the case of leakage.
DisplayScore        | float        | 95.1                              | Cumulative distribution in percentage form.
Score               | float        | 56.8                              | Raw score.
ConfidenceInterval  | float, float | 92.8, 96.8                        | 5% confidence interval of DisplayScore.
Distribution        | XRI/URI      | +distribution/normal              | Identifier representing the distribution.
Mean                | float        | 50                                |
StandardDeviation   | float        | 4.1                               |
SampleSize          | integer      | 100                               |
SubjectPublicKey    | PEM          | -----BEGIN CERTIFICATE----- MIIDaDCCAlCgAwIBAgIBHTANBgkqhkiG ... -----END CERTIFICATE----- | PEM-format Public Key of the Subject. This can be used later, when the reputation consumer is making a transaction with the Subject, to validate that the party it is talking to is really the one this reputation file refers to.
ReputorPublicKey    | PEM          | -----BEGIN CERTIFICATE----- MIIDaDCCAlCgAwIBAgIBHTANBgkqhkiG ... -----END CERTIFICATE----- | PEM-format Public Key of the Reputation Authority.
Date                | XMLDATE      | 2008-11-11T14:34:00Z              | XML date of the calculation.
InputDataInfo       | XRI/URI      | @myRS/+goodSeller/=nat/+inputData | An identifier (URI, IRI, XRI) to the metadata about the input data.
StartDate           | XMLDATE      | 2008-01-01T00:00:00Z              | Start date of the input data.
EndDate             | XMLDATE      | 2008-11-11T14:30:00Z              | End date of the input data.
Signature           | String       | af8afsld92dfjdsla…                |

Table 1: Reputation Attributes & Properties
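As a purely illustrative, non-normative sketch - no ORMS schema or namespace has yet been defined, so every element name below is a hypothetical rendering of the attributes in Table 1 - such a portable reputation file might look like:

```xml
<!-- Hypothetical, non-normative sketch; element names mirror Table 1 -->
<Reputation id="@myRS/=nat/+goodSeller/20081110">
  <SubjectID>=nat</SubjectID>
  <ReputationServiceID>@myRS</ReputationServiceID>
  <AssertionID>@myRS/+goodSeller</AssertionID>
  <RequesterID>=john</RequesterID>
  <DisplayScore>95.1</DisplayScore>
  <Score>56.8</Score>
  <ConfidenceInterval low="92.8" high="96.8"/>
  <Distribution>+distribution/normal</Distribution>
  <Mean>50</Mean>
  <StandardDeviation>4.1</StandardDeviation>
  <SampleSize>100</SampleSize>
  <Date>2008-11-11T14:34:00Z</Date>
  <InputDataInfo start="2008-01-01T00:00:00Z" end="2008-11-11T14:30:00Z">
    @myRS/+goodSeller/=nat/+inputData
  </InputDataInfo>
  <Signature>af8afsld92dfjdsla…</Signature>
</Reputation>
```

A real encoding would additionally carry the two PEM public keys and would sign the whole element, e.g. with an enveloped XML signature.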

4.2 Protocol Requirements

PR1. The reputation consumer SHOULD be able to obtain the reputation file by specifying the assertion, including the subject identifier.

PR2. Since the reputation data itself is often sensitive data, including PII, it SHOULD be handled with the following security considerations:

• The SubjectID SHOULD be represented so that it cannot be traced back to the Subject, e.g., sha256(SubjectID, salt). This implies that the protocol should be a request-response protocol, since otherwise the receiver cannot map the file to the Subject.
• To make the source detectable in the case of leakage, the file should contain the requester ID.
• To make the request forgery-proof, the request should contain the digital signature of the requesting party.
• To protect against eavesdropping and MITM attacks, the response should be encrypted using a content encryption key (session key) which in turn is encrypted with the requesting party's public key.
• Considering that the mere fact that an entity is requesting a reputation representation of the subject may itself be a privacy risk, the request probably should be encrypted in the same manner as the response, with the reputation authority's public key.
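A minimal sketch of the SubjectID pseudonymization suggested above, assuming SHA-256 over the identifier with a per-file random salt; the helper name is illustrative, not normative:

```python
import hashlib
import os

def pseudonymize_subject(subject_id: str, salt: bytes) -> str:
    """Return sha256(SubjectID, salt) as hex, so the portable file never
    carries the raw (possibly PII-bearing) identifier. Illustrative only."""
    return hashlib.sha256(salt + subject_id.encode("utf-8")).hexdigest()

# The requester, who knows which SubjectID it asked about, can check that a
# response refers to that subject by recomputing the hash with the salt
# carried alongside it; third parties cannot reverse the digest.
salt = os.urandom(16)
digest = pseudonymize_subject("=nat", salt)
assert digest == pseudonymize_subject("=nat", salt)  # deterministic per salt
```

This also shows why a request-response protocol is implied: only the party that supplied the SubjectID (and sees the salt) can link the hashed file back to the Subject.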



5 Use-cases

5.1 OpenID in Trusted Exchange

5.1.1 Actors

The identified actors in an OpenID reputation framework are:

1. OpenID Provider
2. OpenID Relying Party (Reputee)
3. Reputation Service (Reputer)

5.1.2 Description

Trusted Exchange is a secure protocol for data exchange between an OpenID Provider (OP) and a Relying Party (RP). The OP provides the RP access to user data based on the RP's reputation.

5.1.2.1 Basic Flows

[TBD: figure]

5.1.2.1.1 Pre-conditions

5.1.2.1.2 Post-conditions

5.1.3 Input

The following are the general inputs to the OpenID trusted exchange:

1. Numeric count of successful transactions
2. Numeric count of claims

5.1.4 Output

Score value accumulated to evaluate the RP's trustworthiness.

5.2 IdP (Identity Provider) Reputation Service

The identity provider and the use-case are quite analogous to the OpenID provider role in the previous use-case.

5.2.1 Actors

1. Identity Provider
2. User (service clients relying on the IdP)
3. Identity Provider Reputation Service (Reputer - providing the trustworthiness of the chosen IdP)

5.2.2 Description

The generic use case applies to all browser-redirect-based Web single-sign-on systems (e.g., OpenID, SAML Web Profile, etc.). This use case has received particular attention in the OpenID community as an alternative (or a supplement) to OpenID Relying Parties (RPs) having to maintain their own OpenID Provider whitelists/blacklists.



5.2.2.1 Basic Flows

[TBD: figure]

5.2.2.1.1 Pre-conditions

5.2.2.1.2 Post-conditions

5.2.3 Input

Options (not mutually exclusive):

• Votes from authenticated IdP users
• Votes from registered IdPs
• Votes from registered third parties

5.2.4 Output

Score value accumulated to evaluate the IdP's trustworthiness.

5.3 Content Filtering

This use-case aims to describe reputation as trust metadata for content filtering. It references an (as yet unpublicized) Veracite research project.

5.3.1 Actors

1. Users of web content (producers, evaluators, consumers, etc.)
2. Veracite server(s)

5.3.2 Description

This scenario is based on the Veracite research project from IBM. A Veracite server provides a service for binding actor information to web content (where an actor can be a person - author, evaluator, etc. - or an automated system), together with assertions from that actor about the specific content (actor information and web content are distributed; the server only provides the binding). This "trust metadata" is used by content consumers to filter the content according to their trust preferences. Actors in a Veracite system can have associated reputations, which become another parameter for content filtering.

5.3.2.1 Basic Flows

[TBD: figure]

5.3.2.1.1 Pre-conditions

5.3.2.1.2 Post-conditions

5.3.3 Input

1. Content-providing actor's assertions about content.
2. The (Veracite) service's binding (vouching) of the content to the content-providing actor's identity.



5.3.4 Output

The system does not produce reputation scores; it relies on portable reputations provided by third parties. (The requirement is that the reputation information can be used for filtering and that the context to which it applies be well specified.)

5.4 Second Life Avatars

5.4.1 Actors

1. SecondLife (SL) reputation service
2. Avatars

5.4.2 Description

Enabling portability of avatar reputations from one SL node to another (as the grid diversifies and specialized SL nodes emerge, this will require "translation" of avatar reputations between nodes).

5.4.2.1 Basic Flows

[TBD: figure]

5.4.2.1.1 Pre-conditions

5.4.2.1.2 Post-conditions

5.4.3 Input

Externalized description of the metadata that defines an avatar, peer ratings for an avatar in a given node, and historical data for an avatar in a given node.

5.4.4 Output

TBD

5.5 Nodes in the Second Life Grid

5.5.1 Actors

1. SecondLife reputation service
2. SL nodes
3. Avatars

5.5.2 Description

An SL grid is emerging where different nodes can be controlled by different entities: SL servers are no longer under the sole control of Linden Labs, and anybody is able to put up an SL node and integrate it into the SL grid. This opens up the possibility of new business scenarios (e.g., "business-oriented" SL nodes) but also of malicious SL nodes; having a reputation associated with a node would help.

5.5.2.1 Basic Flows

[TBD: figure]



5.5.2.1.1 Pre-conditions

5.5.2.1.2 Post-conditions

5.5.3 Input

Per-node ratings submitted to a reputation service.

5.5.4 Output

[TBD]

5.6 Social-network-derived Peer Reputation

5.6.1 Actors

Members of a social network (reputees and reputers).

5.6.2 Description

Members of a social network who have a relationship with member A are randomly sampled and asked to vouch for or rate member A with respect to a specific criterion. The identities of the vouching/rating members are optionally kept anonymous, but in any case they are known to be members in good standing.

5.6.2.1 Basic Flows

[TBD: figure]

5.6.2.1.1 Pre-conditions

5.6.2.1.2 Post-conditions

5.6.3 Input

Scores stated by the vouching members, AND the frequency and recency of activity and interactions of the vouching members.

5.6.4 Output

Personal score value.
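One purely illustrative way to combine the inputs above - vouch scores weighted by how recently and how often each voucher has been active. The function name, weighting scheme, and decay constant are all assumptions for the sketch, not part of this use-case:

```python
import math

def personal_score(vouches, half_life_days=90.0):
    """Recency/activity-weighted mean of vouch scores (illustrative only).

    vouches: list of (score, days_since_last_activity, interaction_count).
    A vouch's weight decays exponentially with the voucher's inactivity
    and grows slowly with the voucher's interaction count.
    """
    num = den = 0.0
    for score, days_idle, interactions in vouches:
        weight = math.exp(-days_idle / half_life_days) * math.log1p(interactions)
        num += weight * score
        den += weight
    return num / den if den else 0.0

# A recent, active voucher dominates a long-dormant one:
print(round(personal_score([(90, 5, 40), (40, 300, 2)]), 1))
```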

5.7 Digital Signature (Signing Key) Reputation

5.7.1 Actors

Key holders

5.7.2 Description

Signers in a web of trust sign keys to express trust in the assertion that the key belongs to the holder's name (subjectname and/or subjaltname) contained in the digital certificate. The more people sign, the greater the trust in that assertion. Note that the only assertion subject to reputation is that the key belongs to the named individual - nothing about the trustworthiness of this individual.



5.7.2.1 Basic Flows

[TBD: figure]

5.7.2.1.1 Pre-conditions

5.7.2.1.2 Post-conditions

5.7.3 Input

Number of signing keys and their reputations; CRL for signing keys.

5.7.4 Output

Key trust value

5.8 Peer Reputation in P2P Networks

5.8.1 Actors

Nodes in a peer-to-peer network.

5.8.2 Description

Internet service providers may use the following fairness criterion, based on reputation, for regulating bandwidth allocation according to observed usage-behavior best practices: nodes in a P2P network gain download bandwidth according to their upload behavior. Essentially, a bandwidth economy is maintained.

5.8.2.1 Basic Flows

[TBD: figure]

5.8.2.1.1 Pre-conditions

5.8.2.1.2 Post-conditions

5.8.3 Input

Average upload bandwidth for a node, number of files uploaded, and similar upload metrics.

5.8.4 Output

Node download bandwidth.
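One illustrative reading of this bandwidth economy - the allocation rule, ratio, floor, and cap below are assumptions for the sketch, not part of the use-case:

```python
def download_allowance(avg_upload_kbps: float,
                       floor_kbps: float = 64.0,
                       cap_kbps: float = 8192.0,
                       ratio: float = 2.0) -> float:
    """Grant download bandwidth proportional to observed upload behavior,
    bounded by a floor (so newcomers can still participate) and a cap.
    All constants are illustrative only."""
    return min(cap_kbps, max(floor_kbps, ratio * avg_upload_kbps))

# A node uploading 500 kbps on average earns 1000 kbps of download:
print(download_allowance(500.0))  # 1000.0
```

The floor is one possible mitigation for the bootstrap issue noted in the threat taxonomy: a newcomer with no upload history still receives a minimal allocation.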

5.9 Seller Reputation

5.9.1 Actors

Sellers and buyers in an e-commerce system.

5.9.2 Description

Buyers vote on the trustworthiness/quality of sellers.

5.9.2.1 Basic Flows

[TBD: figure]



5.9.2.1.1 Pre-conditions

5.9.2.1.2 Post-conditions

5.9.3 Input

Buyer ratings of the sellers (potentially also prices of items bought, and buyer-reputations of voters).

5.9.4 Output

Seller reputation as a percentage.
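A minimal sketch of one way to obtain such a percentage, weighting each vote by transaction price and by the voter's own buyer-reputation, as the input list suggests. The weighting scheme itself is an assumption, not part of this use-case:

```python
def seller_reputation(votes) -> float:
    """Percentage of weighted positive feedback (illustrative only).

    votes: list of (positive: bool, price: float, buyer_reputation: float).
    Each vote counts in proportion to the transaction price and the
    voter's buyer-reputation, which dampens ballot-stuffing by
    low-reputation or low-value accounts.
    """
    total = sum(price * rep for _, price, rep in votes)
    if total == 0:
        return 0.0
    positive = sum(price * rep for ok, price, rep in votes if ok)
    return 100.0 * positive / total

votes = [(True, 100.0, 0.9), (True, 20.0, 0.5), (False, 50.0, 0.2)]
print(round(seller_reputation(votes), 1))  # 90.9
```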

5.10 Reputee Influence: Social & Professional Networks

This class of use-cases deals with reputee-influenced criteria in social and professional networks.

5.10.1 Actors

1. Customers or users relating to a professional and/or company (reputers)
2. Professional and/or company being evaluated (reputee)
3. Reputation Service Provider (RSP)

5.10.2 Description

A specific aspect is that reputers, reputees and the reputation service provider may all determine criteria to be evaluated. Both reputers and reputees may apply their respective weightings, allowing the reputation service provider to calculate overall ratings and rankings of professionals and/or companies within a specific business segment.

5.10.2.1 Basic Flows

[TBD: figure]

5.10.2.1.1 Pre-conditions

5.10.2.1.2 Post-conditions

5.10.3 Input

Scores on specific criteria by reputers, processed by the reputation service provider to facilitate relevancy and avoid fraud.

5.10.4 Output

A consolidated score, biased by both reputer and reputee weightings.

5.11 Adaptive Trust: Enterprise Unified Communications (UC)

5.11.1 Actors

1. Reputees: email/IM/VoIP/... UC clients
2. Reputers: enterprise UC services
3. Reputation Service Providers: enterprise Policy Framework, through agents (XML and VoIP gateways) or enterprise UC servers

5.11.2 Description

Intrusion- and SPAM-detection agents monitor authorized behavior and score down clients (reputees) based on patterns of policy violations. They score them back up to the default level when behavior is in line with policy. Reputers (UC services) deliver service based on the level of the current trust-score. The services themselves (policy enforcement points) may not be directly involved in interpreting scores; the reputee's access privileges may be modulated by Policy Decision Points.

5.11.2.1 Basic Flows

[TBD: figure]

5.11.2.1.1 Pre-conditions

5.11.2.1.2 Post-conditions

5.11.3 Input

The enterprise policy against which to measure behavior: patterns of policy violation or compliance.

5.11.4 Output

Trust-levels (authorizations) mapped to a numeric scale or role.
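The score-down/score-back-up behavior described in this use-case can be sketched as follows; the penalty and recovery constants, the default level, and the function name are assumptions for illustration only:

```python
DEFAULT_TRUST = 100.0

def update_trust(score: float, violations: int,
                 penalty: float = 15.0, recovery: float = 5.0) -> float:
    """Score a UC client down for each policy violation observed in an
    interval, and drift back toward the default level when compliant.
    Constants are illustrative, not normative."""
    if violations > 0:
        return max(0.0, score - penalty * violations)
    return min(DEFAULT_TRUST, score + recovery)

score = DEFAULT_TRUST
score = update_trust(score, violations=2)  # 70.0 after two violations
score = update_trust(score, violations=0)  # 75.0, recovering toward default
print(score)
```

A Policy Decision Point could then map such a numeric score onto the access privileges or roles mentioned above.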

5.12 Federated Trust in UC

This is a more complex variant of the adaptive-trust UC use-case. There exists a two-tier reputation system: two RSPs are peered to exchange reputation-leading events. The RSPs' trust of each other may also be impacted.

5.12.1 Actors

1. Reputee: remote UC client (e.g., a Yahoo or Google client) - (C)
2. Reputer: the called/emailed/IMed/... enterprise-destination UC service - (D)
3. RSP: enterprise Policy Framework, through agents (XML and VoIP gateways) or enterprise UC servers

5.12.2 Description

1. D detects a pattern of abuse (SPAM/SPIT/SPIM) and reports to the peer (e.g., DKIM/SIP) server S hosting C.
2. S may gather similar inputs on C and be its RSP.
3. D may provide a trust-score and be (one of) C's RSPs.
4. S may combine scores/reports and be a two-tier RSP. A possible hierarchical-RSP scenario is hidden in (3).
5. This may result in RSP action by S similar to the generic Adaptive Trust UC use-case.

5.12.2.1 Basic Flows

[TBD: figure]

5.12.2.1.1 Pre-conditions

5.12.2.1.2 Post-conditions

5.12.3 Input

S scores the trust value of C; optionally, D scores in (3). S reacts to the score. D may act independently of S's scoring (it may rely on its own internal trust-score or input).

5.12.4 Output

The following are possible outputs besides the trust-score of reputation:

• Step [2] leads to a report/alert of a trust-related event.
• Steps [3] and [4] provide data or a trust-score. There is a contractual or baselined trust-level between every S & D (Federation).

5.13 Peer-Peer Reputation (Between Actors)

5.13.1 Actors

Reputees: both participants in an electronic messaging exchange between people
Reputer: messaging client or server

5.13.2 Description

Two people communicate electronically (e.g., via email or IM).

5.13.2.1 Basic Flows

[TBD: figure]

5.13.2.1.1 Pre-conditions

5.13.2.1.2 Post-conditions

5.13.3 Input

Inputs to the reputation evaluation engine will be the communication content itself - text/content analysis of each message's basic intent (e.g., request, offer, commitment/promise, question, answer, notice) - as well as latency and frequency of interaction.

5.13.4 Output

Relative peer reputation and/or social capital each party has accumulated in the relationship.



6 Security and Privacy Considerations

As in any open system, there are threats and vulnerabilities to be analyzed in the Reputation Management System - both because of, and notwithstanding, the Reputation Management System itself being a service built on a web of trust.

6.1 Threat Taxonomy

Specifically, [enisasec] refers to a slew of threats, many of which are captured here for reference, given their possible relevance to some or all of the discussed use-cases.

6.1.1 Whitewashing Attack

The adversary resets a poor reputation by rejoining the system with a new identity. Systems that allow for easy change of identity and easy use of new pseudonyms are vulnerable to this attack.

6.1.2 Sybil Attack

The adversary creates multiple identities (sybils) and exploits them in order to manipulate a reputation score.

6.1.3 Impersonation and Reputation Theft

One entity acquires the identity of another entity (masquerades) and consequently steals her reputation.

6.1.4 Bootstrap Issues and Related Threats

The initial reputation value given to a newcomer may lay it open to threats such as Sybil and whitewashing attacks.

6.1.5 Extortion

Coordinated campaigns aimed at blackmail by damaging reputation for malicious motives.

6.1.6 Denial of Reputation

An attack designed to damage an entity's reputation (e.g., in combination with a Sybil attack or impersonation) and create an opportunity for blackmail in order to have the reputation cleaned.

6.1.7 Ballot-Stuffing and Bad-Mouthing

Reporting of a false reputation score; the attackers (distinct or sybils) collude to give positive/negative feedback in order to raise or lower a reputation.

6.1.8 Collusion

Multiple users conspire (collude) to influence a given reputation.

6.1.9 Repudiation - of Data, of Transactions

An entity can deny that a transaction happened, or deny the existence of data for which it was responsible.

6.1.10 Dishonest Reputer

The voter is not trustworthy in his/her scoring.



6.1.11 Privacy Threats for Voters and Reputation Owners

Reputers and reputation-system owners may be unwilling or unable to provide explicitly honest inputs for fear of reprisal or backlash from (an apparently powerful) reputee. Anonymity offers a safe haven for accurate voting under these circumstances; for example, anonymity improves the accuracy of votes.

6.1.12 Social Threats

Discriminatory behavior is possible when, for example, in a second-order reputation system, an entity can choose to co-operate only with peers who have a high reputation, so that their recommendations weigh more heavily. Other possible social threats include the risk of herd behaviour, the penalisation of innovative or controversial opinions, and the vocal-minority effect.

6.1.13 Threats to the Lower Network Layers

The reputation system can be attacked by targeting the underlying infrastructure; for example, the reputation information can be manipulated/replayed/disclosed both in storage and in transport, or may be made unavailable by a denial-of-service attack.

6.1.14 Trust Topology Threats

An attack targets certain links to have maximum effect - for example, those entities with the highest reputation.

6.1.15 Threats to Ratings

There is a whole range of threats to reputation ratings which exploit features of the metrics used by the system to calculate the aggregate reputation rating from the single scores.

6.2 Countermeasures

6.3 Privacy Considerations

6.3.1 Privacy of Reputee

Reputation data - input and score - SHOULD NOT include information (metadata) that relates to the Personal Information (PI) and Personally Identifiable Information (PII) of the reputee.

6.3.2 Privacy of Reputer

The Portable Reputation Format should provide for and preserve the anonymity, where desired or required, of the reputation provider from the reputation consumer and the reputee. The implication here is that while the reputation calculator needs authentic information about the identity of the reputation input provider, audit and compliance requirements will still need to record the identity of the input source.

6.3.3 Privacy Protection Between Reputers

The potential for reputers being influenced, in specific instances, by other reputers is also detrimental to the integrity and accuracy of the reputation input.



A. Acknowledgements

The following individuals have participated in the creation of this specification and are gratefully acknowledged:

Participants:
[Participant Name, Affiliation | Individual Member]
[Participant Name, Affiliation | Individual Member]

B. Non-Normative Text



C. Revision History

[optional; should not be included in OASIS Standards]

Revision | Date              | Editor          | Changes Made
0.1      | 17 September 2008 | Mahalingam Mani | Initial version.
0.2      | 30 September 2008 | Mahalingam Mani | Updates to use-case sections; introduction to the reference model based on initial TC discussions. Also introducing the security and privacy considerations section.
0.21     | 29 October 2008   | Mahalingam Mani | Expanded on the reference model and security considerations. Refined use-cases text.
0.22     | 10 Nov 2008       | Mahalingam Mani | Reference model and terminology changes based on discussion 10 Nov 2008, AM.
0.23     | 10 Dec 2008       | Mahalingam Mani | Incorporating the outcome of discussions 4-5 pm PT at the face-to-face on 10 Nov 2008. This includes a proposed statistical/empirical model for a normalized portable trust-score, a portable reputation exchange format, and requirements on a reputation exchange protocol; all text adopted from Nat's proposal. Also included are refinements of the definitions of reputation and reputation score.
0.24     | 19 Dec 2008       | Bruce Rich      | Minor editorial changes to enhance clarity.
0.24b    | 21 Jan 2009       | Mahalingam Mani | Integrating comments from the 1/7/2009 meeting, from Tony Nadalin and Nat, into Bruce's draft.