ORMS: Use-cases Version 0.21
Working Draft, 27 October 2008

Specification URIs:
This Version:
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].html
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].doc
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].pdf
Previous Version:
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].html
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].doc
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].pdf
Latest Version:
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].html
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].doc
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].pdf
Latest Approved Version:
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].html
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].doc
http://docs.oasis-open.org/[tc-short-name]/[additional path/filename].pdf

Technical Committee:
OASIS Open Reputation Management Systems (ORMS) TC

Chair(s):
Anthony Nadalin
Nat Sakimura

Editor(s):
Mahalingam Mani

Related work:
This specification replaces or supersedes:
[specifications replaced by this standard]
This specification is related to:
[related specifications]

Declared XML Namespace(s):
[list namespaces here]

Abstract:
This document describes a reference model as a step toward a standard protocol for exchanging reputation information between reputation data providers and consumers, and toward a portable format for reputation data and metadata. The model is evaluated and validated against use-cases to derive requirements for a portable reputation data format that ensures openness in ownership, privacy and confidentiality protection, and management of reputation data.


Status:
This document was last revised or approved by the [TC name | membership of OASIS] on the above date. The level of approval is also listed above. Check the “Latest Version” or “Latest Approved Version” location noted above for possible later revisions of this document.
Technical Committee members should send comments on this specification to the Technical Committee’s email list. Others should send comments to the Technical Committee by using the “Send A Comment” button on the Technical Committee’s web page at http://www.oasis-open.org/committees/orms/.
For information on whether any patents have been disclosed that may be essential to implementing this specification, and any offers of patent licensing terms, please refer to the Intellectual Property Rights section of the Technical Committee web page (http://www.oasis-open.org/committees/orms/ipr.php).
The non-normative errata page for this specification is located at http://www.oasis-open.org/committees/orms/.


Notices
Copyright © OASIS® 2007. All Rights Reserved.
All capitalized terms in the following text have the meanings assigned to them in the OASIS Intellectual Property Rights Policy (the "OASIS IPR Policy"). The full Policy may be found at the OASIS website.
This document and translations of it may be copied and furnished to others, and derivative works that comment on or otherwise explain it or assist in its implementation may be prepared, copied, published, and distributed, in whole or in part, without restriction of any kind, provided that the above copyright notice and this section are included on all such copies and derivative works. However, this document itself may not be modified in any way, including by removing the copyright notice or references to OASIS, except as needed for the purpose of developing any document or deliverable produced by an OASIS Technical Committee (in which case the rules applicable to copyrights, as set forth in the OASIS IPR Policy, must be followed) or as required to translate it into languages other than English.
The limited permissions granted above are perpetual and will not be revoked by OASIS or its successors or assigns.
This document and the information contained herein is provided on an "AS IS" basis and OASIS DISCLAIMS ALL WARRANTIES, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY WARRANTY THAT THE USE OF THE INFORMATION HEREIN WILL NOT INFRINGE ANY OWNERSHIP RIGHTS OR ANY IMPLIED WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.
OASIS requests that any OASIS Party or any other party that believes it has patent claims that would necessarily be infringed by implementations of this OASIS Committee Specification or OASIS Standard, to notify OASIS TC Administrator and provide an indication of its willingness to grant patent licenses to such patent claims in a manner consistent with the IPR Mode of the OASIS Technical Committee that produced this specification.
OASIS invites any party to contact the OASIS TC Administrator if it is aware of a claim of ownership of any patent claims that would necessarily be infringed by implementations of this specification by a patent holder that is not willing to provide a license to such patent claims in a manner consistent with the IPR Mode of the OASIS Technical Committee that produced this specification. OASIS may include such claims on its website, but disclaims any obligation to do so.
OASIS takes no position regarding the validity or scope of any intellectual property or other rights that might be claimed to pertain to the implementation or use of the technology described in this document or the extent to which any license under such rights might or might not be available; neither does it represent that it has made any effort to identify any such rights. Information on OASIS' procedures with respect to rights in any document or deliverable produced by an OASIS Technical Committee can be found on the OASIS website. Copies of claims of rights made available for publication and any assurances of licenses to be made available, or the result of an attempt made to obtain a general license or permission for the use of such proprietary rights by implementers or users of this OASIS Committee Specification or OASIS Standard, can be obtained from the OASIS TC Administrator.
OASIS makes no representation that any information or list of intellectual property rights will at any time be complete, or that any claims in such list are, in fact, Essential Claims.
The names "OASIS", [insert specific trademarked names and abbreviations here] are trademarks of OASIS, the owner and developer of this specification, and should be used only to refer to the organization and its official outputs. OASIS welcomes reference to, and implementation and use of, specifications, while reserving the right to enforce its marks against misleading uses. Please see http://www.oasis-open.org/who/trademark.php for above guidance.


Table of Contents
1 Introduction
   1.1 Terminology
      1.1.1 ORMS Definitions
   1.2 Normative References
   1.3 Non-Normative References
2 Overview
3 ORMS Reference Model
4 Use-cases
   4.1 OpenID in Trusted Exchange
      4.1.1 Actors
      4.1.2 Description
      4.1.3 Input
      4.1.4 Output
   4.2 IdP (Identity Provider) Reputation Service
      4.2.1 Actors
      4.2.2 Description
      4.2.3 Input
      4.2.4 Output
   4.3 Content Filtering
      4.3.1 Actors
      4.3.2 Description
      4.3.3 Input
      4.3.4 Output
   4.4 Second Life Avatars
      4.4.1 Actors
      4.4.2 Description
      4.4.3 Input
      4.4.4 Output
   4.5 Nodes in Second Life Grid
      4.5.1 Actors
      4.5.2 Description
      4.5.3 Input
      4.5.4 Output
   4.6 Social-network derived Peer reputation
      4.6.1 Actors
      4.6.2 Description
      4.6.3 Input
      4.6.4 Output
   4.7 Digital Signature (signing key) reputation
      4.7.1 Actors
      4.7.2 Description
      4.7.3 Input
      4.7.4 Output
   4.8 Peer Reputation in P2P Networks
      4.8.1 Actors
      4.8.2 Description
      4.8.3 Input
      4.8.4 Output
   4.9 Seller Reputation
      4.9.1 Actors
      4.9.2 Description
      4.9.3 Input
      4.9.4 Output
   4.10 Reputee Influence: Social & Professional Networks
      4.10.1 Actors
      4.10.2 Description
      4.10.3 Input
      4.10.4 Output
   4.11 Adaptive Trust: Enterprise unified communications (UC)
      4.11.1 Actors
      4.11.2 Description
      4.11.3 Input
      4.11.4 Output
   4.12 Federated Trust in UC
      4.12.1 Actors
      4.12.2 Description
      4.12.3 Input
      4.12.4 Output
   4.13 Peer-peer reputation (between actors)
      4.13.1 Actors
      4.13.2 Description
      4.13.3 Input
      4.13.4 Output
5 Security and Privacy considerations
A. Acknowledgements
B. Non-Normative Text
C. Revision History


1 Introduction
Social and corporate networking interactions in the Internet age have given rise to exponential growth in real-time and asynchronous communications. The openness of these good-faith protocols and networks is now increasingly exposed to threats and exploits from the community. Moreover, corporate and social networks must deal with a range of users whose roles and privileges vary dynamically in time and (network) domain, requiring corporations to adjust to wired and wireless networks, traditional and virtually-extended perimeters, extranets, federations and partner portals involving a considerable degree of transitive trust.

A framework is required to identify and qualify accidental, well-behaved and malicious privilege/usage patterns, and to quantify (or trust-score) those patterns so that (social and corporate network) services can adapt trust levels and authorized access to resources. An interoperable trust-scoring mechanism is required to relate reputation across such systems and domains.

This document describes use-cases in varying scenarios based on existing e-commerce transaction systems, social networks, and converged communications scenarios ranging from corporate enterprise networks to peer-to-peer networks. The use-case section is preceded by the ORMS Terminology and Overview sections, and by a Reference Model that aids the discussion of the use-cases.

1.1 Terminology
The key words “MUST”, “MUST NOT”, “REQUIRED”, “SHALL”, “SHALL NOT”, “SHOULD”, “SHOULD NOT”, “RECOMMENDED”, “MAY”, and “OPTIONAL” in this document are to be interpreted as described in [RFC2119].

1.1.1 ORMS Definitions
Actor
Participating entities in a transaction. For example, in the Reputation Systems context: the reputation scoring service (Provider or reputer), the service using the Reputation Provider (relying party), and the client being evaluated (reputee).

Avatar
An Avatar is an incarnation, embodiment or virtual manifestation of an actor’s profile in a Social or Professional Network Domain.
Alternate definition: a computer user's representation of himself or herself, whether in the form of a three-dimensional model used in computer games, a two-dimensional icon (picture) used on internet forums and other communities, or a text construct found on early systems.1
1 An example is MUD: Multi-User Domain.

UC
Unified Communications. A field that describes the convergence of IP-based multi-media including voice, video and data.

Online reputation mechanisms
Online reputation mechanisms, also known as reputation systems (Resnick et al., 2000; Dellarocas, 2003a), use the Internet’s bi-directional communication capabilities to artificially engineer large-scale word-of-mouth networks in which individuals share opinions and experiences on a wide range of topics, including companies, products, services, and even world events. (Dellarocas (2005))

Reputation Systems
See Online reputation mechanisms.

Reputation
Reputation is a concept that arises in repeated game settings when there is uncertainty about some property (the “type”) of one or more players in the mind of other players. (Wilson (1985))

Reputation Score
A Reputation Score of a Player (Reputee) on the Type (Criteria) by other players (Reputor) is the subjective probability assigned by the Reputor that the Reputee fulfils the Criteria. (Sakimura (2008))
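A hedged worked example of this definition (the estimator below is an illustrative assumption, not part of the cited work): if a Reputor has observed the Reputee satisfy the Criteria in n of n + m past interactions, a simple Beta-distribution (Laplace) estimate of that subjective probability is

\[ \mathrm{RS} = \frac{n + 1}{n + m + 2} \]

so that, for example, 8 positive and 2 negative observations give RS = 9/12 = 0.75.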

Reputation (alternate definition)
Reputation is a collective evaluation of an entity based on factual and/or subjective data about it, and is used as one of the factors for establishing trust in that entity for a specific purpose.
Reputation is a metric (a score, a rank, a state, a multi-dimensional profile, etc.) associated with an entity (a person, a business, a digital identity, a website, a system, a device, a category of devices, a computing resource, etc.) or with a tuple [entity, attribute(s)] (e.g. [person, skill]) in a particular domain and at a particular moment in time.

Reputation domain (or Reputation Name-space)
The encompassing domain where a reputation is defined (to be refined).

Reputation Compute-Engine
A reputation for an entity is computed using a reputation calculator, based on different types of input data about the entity (available within the domain or imported into the domain). The reputation calculator combines and weights one or more input data about the entity, according to a reputation algorithm and contextual information available at the time of computation.

Contextual information
(to be defined)

Reputation algorithm
A domain-specific algorithm for computing reputations. A reputation algorithm is designed taking into account the characteristics of the encompassing domain: topology (centralized or distributed reputation computation), entities to which the reputation is associated, entities that produce input data, entities that consume reputations, available types of input data, type of contextual information available, and desired properties of the computed reputation (robustness, fairness, etc.).

Reputation (input) data
The data upon which the reputation is computed. Input data can be of different types, for example:
Subjective data about the entity: e.g. ratings and feedback from peers, claims and endorsements
Static and dynamic characteristics of the entity: e.g. demographics, preferences
Behavior data, stemming from measurements and observations within a system: e.g. logs of the entity’s past actions, history of interactions, parametric data
“Real world” data about the entity: e.g. background checks, credit history
Inferred data about an entity: e.g. text analytics

Reputation Management System (to be refined)
A reputation management system may include mechanisms for: collecting data about entities (generating data inputs or integrating external data); computing reputations; making sure the system is fair (e.g. providing bootstrapping mechanisms for new entities); performing actions based on reputations (e.g. trust computations, automatic decisions); revoking reputations, allowing entities legitimate control over their reputation, and allowing entities to challenge their reputations (governance); making sure the system is not abused (security); and making sure the privacy of entities is respected (i.e. that the association between an entity and its reputation is only disclosed to authorized parties).
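The Reputation Compute-Engine, reputation algorithm and reputation (input) data definitions above can be illustrated with a minimal, non-normative sketch. The input types, the normalization to [0, 1] and the weights below are illustrative assumptions only and are not defined by this document.

# Non-normative sketch of a reputation calculator: it combines weighted,
# normalized input data of different types into a single score in [0, 1].
# Input types and weights below are illustrative assumptions only.
def compute_reputation(inputs, weights):
    """inputs:  mapping of input type -> value normalized to [0, 1]
    weights: mapping of input type -> relative weight (>= 0)"""
    total_weight = sum(weights.get(kind, 0.0) for kind in inputs)
    if total_weight == 0:
        return None  # bootstrap case: no usable input data yet
    weighted = sum(value * weights.get(kind, 0.0) for kind, value in inputs.items())
    return weighted / total_weight

# Example: subjective peer feedback, observed behaviour and "real world" data.
inputs = {"peer_feedback": 0.80, "behaviour_logs": 0.65, "background_check": 1.0}
weights = {"peer_feedback": 0.5, "behaviour_logs": 0.3, "background_check": 0.2}
print(compute_reputation(inputs, weights))  # 0.795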

RP
Relying Party. (See Reputee in the context of OpenID.)

OP
OpenID Provider. The reputation Compute-Engine in the OpenID model.

Reputer
The entity that evaluates a reputee and provides reputation input or scores (see Actor).

Reputee
The entity whose reputation is being evaluated (see Actor).

RSP
Reputation Service Provider.

VoIP
Voice over Internet Protocol.

UC
Unified Communications, a term denoting all forms of call and multimedia/cross-media message-management functions controlled by an individual user for both business and social purposes.2
2 This definition, from the International Engineering Consortium, is the most generic of the many minor industry variants.

1.2 Normative References
[RFC2119] S. Bradner, Key words for use in RFCs to Indicate Requirement Levels, http://www.ietf.org/rfc/rfc2119.txt, IETF RFC 2119, March 1997.
[Reference] [Full reference citation]

1.3 Non-Normative References
[OpenIDW] OpenID Wikipedia Page
[OpenID] OpenID Community Website
[Dellarocas] Dellarocas, C., 2005, "Reputation Mechanisms".
[Wilson] Wilson, R., 1985, "Reputations in Games and Markets", in A. Roth, ed., Game-Theoretic Models of Bargaining, Cambridge University Press, Cambridge, UK, 27-62.
[Sakimura] Sakimura, N., 2008, "What is Reputation?"
[veracite] Veracite Research Project (IBM)
[enisasec] Reputation-based Systems: A Security Analysis, ENISA Position Paper, E. Carrara, Giles Hogben, October 2007


2 Overview
The use of reputation systems has been proposed for various applications such as:

Validating the trustworthiness of sellers and buyers in online auctions (reputation on e-commerce websites has proved to have a large influence on sellers)

Detecting free riders in peer to peer networks

Ensuring the authenticity of signature keys in a web of trust.

Smarter searching of web sites, blogs, events, products, companies and other individuals.

Reputation in these examples refers to the opinions about an entity held by others. Reputation is one of the factors upon which trust can be based, through the use of verifiable claims. Reputation changes with time and is used within a context; trust and reputation are both related to a context.
There are various methods for generating a user's reputation data or trustworthiness. Some methods are based on users' feedback through appropriate feedback channels. Other methods include having viewers participate in the reputation-building process through the user's profile at specific sites and communities. Each method has its limitations in terms of its susceptibility to bad actors, manipulation of data for specific purposes, and spammers.


3 ORMS Reference Model
The following figure represents a generalized reputation model.

Figure 1 Generalized reputation model [to be replaced with a regular diagram - ed]

Primary components of the reference model are:
1. Input sources: Reputation Input Data Collectors (RDC).
2. Reputation Data (RD): portable reputation data generated by all input sources into a reputation computation engine (RCE, or reputation calculator).
3. Reputation Context (RC): allows filtering and qualifying the right choice of algorithms to use and pre-process.
4. Reputation Score (RS): the outcome of the reputation evaluation of an entity (to be portable).
5. Reputation Consumer (Reputee): consumer of the reputation score, to use as a yardstick for computing the degree of trust for the entity it serves.

Thus, the primary objective and challenge is to make the reputation input data and reputation formats interoperable (portable) across vendor-boundaries and domains of Reputation.
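A minimal, non-normative sketch of this data flow follows: RDCs produce portable Reputation Data records, the RCE evaluates them under a Reputation Context, and the resulting Reputation Score is handed to a consumer. The field names and the averaging rule are illustrative assumptions, not a proposed portable format.

from dataclasses import dataclass
from typing import List

@dataclass
class ReputationData:      # RD: portable record produced by a collector (RDC)
    entity: str            # the entity being described
    source: str            # the collector / reputer that produced the record
    context: str           # RC: the reputation domain the record applies to
    value: float           # normalized observation in [0, 1]

@dataclass
class ReputationScore:     # RS: portable outcome of the evaluation
    entity: str
    context: str
    score: float

def reputation_compute_engine(records: List[ReputationData], context: str) -> ReputationScore:
    # RCE: filter records by context (RC), then aggregate; averaging is an assumption
    relevant = [r.value for r in records if r.context == context]
    score = sum(relevant) / len(relevant) if relevant else 0.0
    return ReputationScore(entity=records[0].entity, context=context, score=score)

records = [
    ReputationData("seller-42", "marketplace-A", "e-commerce", 0.9),
    ReputationData("seller-42", "marketplace-B", "e-commerce", 0.7),
]
rs = reputation_compute_engine(records, "e-commerce")
print(rs.score)  # 0.8; a consumer compares this against its own trust policy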


Figure 2 Reputation Reference Model [to be edited as we go along - ed]


4 Use-cases

4.1 OpenID in Trusted Exchange

4.1.1 Actors
The identified actors in an OpenID reputation framework are:
1. OpenID Provider
2. OpenID Relying Party (Reputee)
3. Reputation Service (Reputer)

4.1.2 Description
Trusted Exchange is a secure protocol for data exchange between an OpenID Provider (OP) and a Relying Party (RP). The OP provides the RP access to user data based on the RP's reputation.

4.1.2.1 Basic Flows
[TBD: figure]

4.1.2.1.1 Pre-conditions

4.1.2.1.2 Post-conditions

4.1.3 Input
The following are the general inputs to the OpenID trusted exchange:
1. Numeric count of successful transactions
2. Numeric count of claims

4.1.4 Output
Score value accumulated to evaluate the RP's trustworthiness.
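A minimal, non-normative sketch of accumulating such a score from the two inputs listed above; treating the claim count as the denominator and the Laplace-style smoothing are illustrative assumptions.

# Non-normative sketch: derive an RP trust score from the counts listed in 4.1.3.
# The smoothing keeps an RP with little history near a neutral 0.5.
def rp_trust_score(successful_transactions: int, claims: int) -> float:
    return (successful_transactions + 1) / (claims + 2)

print(rp_trust_score(successful_transactions=48, claims=50))  # ~0.94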

4.2 IdP (Identity Provider) Reputation Service
This use-case, with the identity provider in place of the OpenID Provider, is quite analogous to the previous use-case.

4.2.1 Actors
1. Identity Provider
2. User (service clients relying on the IdP)
3. Identity Provider Reputation Service (Reputer, providing the trustworthiness of the chosen IdP)

4.2.2 Description
The generic use case applies to all browser-redirect-based Web single-sign-on systems (e.g., OpenID, SAML Web Profile, etc.). This use case has received particular attention in the OpenID community as an alternative (or a supplement) to OpenID Relying Parties (RPs) having to maintain their own OpenID Provider whitelists/blacklists.


4.2.2.1 Basic Flows
[TBD: figure]

4.2.2.1.1 Pre-conditions

4.2.2.1.2 Post-conditions

4.2.3 Input
Options (not mutually exclusive):
Vote from authenticated IdP users.
Vote from registered IdPs.
Vote from registered third parties.

4.2.4 Output
Score value accumulated to evaluate the IdP’s trustworthiness.

4.3 Content Filtering
This use-case describes reputation as trust metadata for content filtering. It references an (as yet unpublicized) Veracite research project.

4.3.1 Actors
1. Users of web content (producers, evaluators, consumers, etc.)
2. Veracite server(s)

4.3.2 Description
This scenario is based on the Veracite research project from IBM. A Veracite server provides a service for binding actor information to web content (where an actor can be a person - author, evaluator, etc. - or an automated system), together with assertions from that actor about the specific content (actor information and web content are distributed; the server only provides the binding). This "trust metadata" is used by content consumers to filter the content according to their trust preferences. Actors in a Veracite system can have associated reputations, which become another parameter for content filtering.

4.3.2.1 Basic Flows
[TBD: figure]

4.3.2.1.1 Pre-conditions

4.3.2.1.2 Post-conditions

4.3.3 Input
1. Content-providing actor’s assertions about content.
2. (Veracite) service’s binding (vouching) of the content to the content-providing actor’s identity.


4.3.4 Output
The system does not produce reputation scores; it relies on portable reputations provided by third parties (the requirement is that the reputation information can be used for filtering and that the context to which it applies be well specified).
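A minimal, non-normative sketch of filtering content with third-party ("portable") actor reputations used as trust metadata; the record fields, context handling and threshold are illustrative assumptions, not the Veracite design.

# Non-normative sketch: keep only content whose bound actor has a sufficient
# third-party reputation in the consumer's chosen context.
def filter_content(items, reputations, context, threshold=0.6):
    """items: list of dicts with 'actor' and 'content' keys
    reputations: mapping (actor, context) -> score in [0, 1]"""
    return [item["content"] for item in items
            if reputations.get((item["actor"], context), 0.0) >= threshold]

items = [{"actor": "alice", "content": "article-1"},
         {"actor": "mallory", "content": "article-2"}]
reps = {("alice", "news"): 0.9, ("mallory", "news"): 0.2}
print(filter_content(items, reps, context="news"))  # ['article-1']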

4.4 Second Life Avatars

4.4.1 Actors
1. SecondLife (SL) reputation service
2. Avatars

4.4.2 Description
Enabling portability of avatar reputations from one SL node to another (as the grid diversifies and specialized SL nodes emerge, this will require "translation" of avatar reputations between nodes).

4.4.2.1 Basic Flows
[TBD: figure]

4.4.2.1.1 Pre-conditions

4.4.2.1.2 Post-conditions

4.4.3 Input
Externalized description of the metadata that defines an avatar; peer ratings for an avatar in a given node; historical data for an avatar in a given node.

4.4.4 Output
TBD

4.5 Nodes in Second Life Grid

4.5.1 Actors
1. SecondLife reputation service
2. SL nodes
3. Avatars

4.5.2 Description
An SL grid is emerging where different nodes can be controlled by different entities: SL servers are no longer under the sole control of Linden Labs, and anybody is able to put up an SL node and integrate it into the SL grid. This opens up the possibility for new business scenarios (e.g. "business oriented" SL nodes) but also for malicious SL nodes; having a reputation associated with a node would help.

4.5.2.1 Basic Flows
[TBD: figure]


4.5.2.1.1 Pre-conditions

4.5.2.1.2 Post-conditions

4.5.3 Input
Per-node ratings submitted to a reputation service.

4.5.4 Output
[TBD]

4.6 Social-network derived Peer reputation

4.6.1 Actors
Members of a social network (reputees and reputers).

4.6.2 Description
Members of a social network who have a relationship with member A are randomly sampled and asked to vouch for or rate member A with respect to a specific criterion. The identities of the members vouching/rating are optionally kept anonymous, but in any case they are known to be members in good standing.

4.6.2.1 Basic Flows
[TBD: figure]

4.6.2.1.1 Pre-conditions

4.6.2.1.2 Post-conditions

4.6.3 Input
Scores stated by the vouching members, together with the frequency and recency of activity and interactions of the vouching members.

4.6.4 Output
Personal score value.
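A minimal, non-normative sketch of deriving the personal score: randomly sample members related to A, and weight each sampled vote by how frequently and how recently that member interacted with A. The sample size and weighting function are illustrative assumptions.

import random

# Non-normative sketch: vouching scores weighted by interaction frequency/recency.
def peer_score(relations, votes, interactions, sample_size=5):
    """relations:    list of member ids related to member A
    votes:        member id -> score in [0, 1] on the chosen criterion
    interactions: member id -> (interaction_count, days_since_last_interaction)"""
    sample = random.sample(relations, min(sample_size, len(relations)))
    weighted_sum, weight_total = 0.0, 0.0
    for member in sample:
        count, days_since = interactions.get(member, (0, 365))
        weight = count / (1.0 + days_since)   # frequent, recent raters count more
        weighted_sum += votes.get(member, 0.0) * weight
        weight_total += weight
    return weighted_sum / weight_total if weight_total else 0.0

relations = ["b", "c", "d"]
votes = {"b": 0.9, "c": 0.6, "d": 0.8}
interactions = {"b": (10, 2), "c": (1, 90), "d": (5, 10)}
print(round(peer_score(relations, votes, interactions), 2))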

4.7 Digital Signature (signing key) reputation

4.7.1 Actors
Key holders

4.7.2 Description
Signers in a web of trust sign keys to express trust in the assertion that the key belongs to the holder’s name (subjectname and/or subjaltname) contained in the digital certificate. The more people sign, the greater the trust in that assertion. Note the only assertion subject to reputation is that the key belongs to the named individual - nothing about the trustworthiness of this individual.


4.7.2.1 Basic Flows
[TBD: figure]

4.7.2.1.1 Pre-conditions

4.7.2.1.2 Post-conditions

4.7.3 Input
Number of signing keys and their reputation; CRL for signing keys.

4.7.4 Output
Key trust value.
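A minimal, non-normative sketch of deriving a key trust value from the inputs above: signatures made with revoked keys (per the CRL) are ignored, and both the number of valid signers and their own key reputations raise the trust value. The combining formula is an illustrative assumption.

# Non-normative sketch: key trust from signer reputations, minus revoked signers.
def key_trust(signer_reputations, revoked_keys):
    """signer_reputations: signing key id -> reputation in [0, 1]
    revoked_keys: set of key ids listed on the CRL"""
    valid = {k: rep for k, rep in signer_reputations.items() if k not in revoked_keys}
    if not valid:
        return 0.0
    # each additional well-reputed signer pushes the trust value toward 1
    return 1.0 - 0.5 ** sum(valid.values())

signers = {"keyA": 0.9, "keyB": 0.8, "keyC": 0.4}
print(round(key_trust(signers, revoked_keys={"keyC"}), 2))  # 0.69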

4.8 Peer Reputation in P2P Networks

4.8.1 Actors
Nodes in a peer-to-peer network.

4.8.2 Description
Internet service providers may use the following reputation-based fairness criterion for regulating bandwidth allocation according to observed usage-behavior best practices: nodes in a P2P network gain download bandwidth according to their upload behavior. Essentially, a bandwidth economy is maintained.

4.8.2.1 Basic Flows
[TBD: figure]

4.8.2.1.1 Pre-conditions

4.8.2.1.2 Post-conditions

4.8.3 Input
Average upload bandwidth for a node, number of files uploaded and similar upload metrics.

4.8.4 Output
Node download bandwidth.
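A minimal, non-normative sketch of the bandwidth economy described above: a node's download allowance grows with its observed upload metrics. The base allowance, multipliers and cap are illustrative assumptions.

# Non-normative sketch: download bandwidth earned from upload behaviour.
def download_allowance_kbps(avg_upload_kbps, files_uploaded):
    base = 64.0                            # floor so new nodes are not starved
    earned = avg_upload_kbps * 1.5 + files_uploaded * 8.0
    return min(base + earned, 4096.0)      # cap at the link capacity

print(download_allowance_kbps(avg_upload_kbps=200.0, files_uploaded=30))  # 604.0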

4.9 Seller Reputation

4.9.1 Actors
Sellers and buyers in an e-commerce system.

4.9.2 Description
Buyers vote on the trustworthiness/quality of sellers.


4.9.2.1 Basic Flows
[TBD: figure]

4.9.2.1.1 Pre-conditions

4.9.2.1.2 Post-conditions

4.9.3 Input
Buyers rate the sellers (potentially also the prices of items bought and the buyer-reputations of the voters).

4.9.4 Output
Seller reputation as a percentage.
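A minimal, non-normative sketch of the percentage output: the share of positive votes, optionally weighted by each voter's own buyer reputation as suggested in 4.9.3 (the weighting is an illustrative assumption; a plain percentage sets all weights to 1).

# Non-normative sketch: seller reputation as weighted percentage of positive votes.
def seller_reputation_percent(votes):
    """votes: list of (is_positive, buyer_reputation_weight) tuples"""
    total = sum(weight for _, weight in votes)
    if total == 0:
        return 0.0
    positive = sum(weight for is_positive, weight in votes if is_positive)
    return 100.0 * positive / total

votes = [(True, 1.0), (True, 1.0), (False, 1.0), (True, 0.5)]
print(round(seller_reputation_percent(votes), 1))  # 71.4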

4.10 Reputee Influence: Social & Professional Networks
This class of use-cases deals with reputee-influenced criteria in social and professional networks.

4.10.1 Actors
1. Customers or users relating to a professional and/or company (reputers)
2. Professional and/or company being evaluated (reputee)
3. Reputation Service Provider (RSP)

4.10.2 Description
A specific aspect is that reputers, reputees and the reputation service provider may determine the criteria to be evaluated. Both reputers and reputees may apply their respective weightings, allowing the reputation service provider to calculate overall ratings and rankings of professionals and/or companies within a specific business segment.

4.10.2.1 Basic Flows
[TBD: figure]

4.10.2.1.1 Pre-conditions

4.10.2.1.2 Post-conditions

4.10.3 Input
Scores on specific criteria by reputers, processed by the reputation service provider to facilitate relevancy and avoid fraud.

4.10.4 Output
Reputer- as well as reputee-biased consolidated score.
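A minimal, non-normative sketch of a consolidated score in which both reputer-side and reputee-side weightings are applied to per-criterion scores before the RSP normalizes the result; the combination rule is an illustrative assumption.

# Non-normative sketch: consolidate per-criterion scores under two weightings.
def consolidated_score(criterion_scores, reputer_weights, reputee_weights):
    """criterion_scores: criterion -> average reputer score in [0, 1]"""
    combined = {c: reputer_weights.get(c, 1.0) * reputee_weights.get(c, 1.0)
                for c in criterion_scores}
    norm = sum(combined.values())
    if norm == 0:
        return 0.0
    return sum(score * combined[c] for c, score in criterion_scores.items()) / norm

scores = {"timeliness": 0.9, "quality": 0.7}
print(round(consolidated_score(scores, {"timeliness": 0.4}, {"quality": 2.0}), 2))  # 0.73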

4.11 Adaptive Trust: Enterprise unified communications (UC)

4.11.1 Actors
1. Reputees: email/IM/VoIP/... UC clients
2. Reputers: Enterprise UC services.
3. Reputation Service Providers: Enterprise Policy Framework, through agents (gateways - XML, VoIP) or enterprise UC servers.

4.11.2 Description
Intrusion and SPAM detection agents monitor authorized behavior and score down clients (reputees) based on patterns of policy violations. They score back up to the default level when behavior is in line with policy. Reputers (UC services) deliver service based on the level of the current trust-score. The services themselves (policy enforcement points) may not be directly involved in interpreting scores; the reputee's access privileges may be modulated by Policy Decision Points.

4.11.2.1 Basic Flows
[TBD: figure]

4.11.2.1.1 Pre-conditions

4.11.2.1.2 Post-conditions

4.11.3 Input
The enterprise policy against which to measure behavior: patterns of policy violation or compliance.

4.11.4 Output
Trust-levels (authorizations) mapped to a numeric scale or role.
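A minimal, non-normative sketch of the adaptive behaviour described above: the trust level is lowered for each detected policy violation and drifts back toward the default level while behaviour stays compliant. The penalty and recovery rates, and the numeric scale, are illustrative assumptions.

# Non-normative sketch: trust score lowered on violations, recovering toward default.
DEFAULT_TRUST = 1.0

def update_trust(current, violations_this_period):
    if violations_this_period > 0:
        return max(0.0, current - 0.2 * violations_this_period)
    return min(DEFAULT_TRUST, current + 0.1)   # gradual recovery while compliant

trust = DEFAULT_TRUST
for violations in [0, 2, 1, 0, 0, 0]:
    trust = update_trust(trust, violations)
    print(round(trust, 2))   # 1.0, 0.6, 0.4, 0.5, 0.6, 0.7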

4.12 Federated Trust in UC
This is a more complex variant of the adaptive trust UC use-case. There exists a two-tier reputation system: two RSPs are peered to exchange reputation-leading events. The RSPs' trust of each other may also be impacted.

4.12.1 Actors
1. Reputee: Remote UC client (e.g., Yahoo, Google client) - (C)
2. Reputer: called/emailed/IMed/... Enterprise-destination UC service - (D)
3. RSP: Enterprise Policy Framework, through agents (gateways - XML, VoIP) or enterprise UC servers.

4.12.2 Description
1. D detects a pattern of abuse (SPAM/SPIT/SPIM) and reports to the peer (e.g., DKIM/SIP) server S hosting (C).
2. S may gather similar inputs on C and be its RSP.
3. D may provide a trust-score and be (one of C's) RSPs.
4. S may combine scores/reports and be a two-tier RSP. A possible hierarchical RSP scenario is hidden in (3).
5. This may result in RSP action by S similar to the generic Adaptive Trust UC use-case.

4.12.2.1 Basic Flows
[TBD: figure]


4.12.2.1.1 Pre-conditions

4.12.2.1.2 Post-conditions

4.12.3 Input
S scores the trust value of C; optionally D scores in (3). S reacts to the score. D may act independently of S's scoring (it may rely on its internal trust-score or input).

4.12.4 Output
The following are possible outputs, besides the reputation trust-score:
Step [2] leads to a Report / Alert of a trust-related event.
Steps [3] and [4] provide data or a trust-score.
There is a contractual or baselined trust-level between every S & D (Federation).

4.13 Peer-peer reputation (between actors)

4.13.1 Actors
Reputees: both participants in an electronic messaging exchange between people
Reputer: messaging client or server

4.13.2 Description
Two people communicate electronically (e.g., via email or IM).

4.13.2.1 Basic Flows
[TBD: figure]

4.13.2.1.1 Pre-conditions

4.13.2.1.2 Post-conditions

4.13.3 Input
Inputs to the reputation evaluation engine will be the communication content itself - text/content analysis of the message's basic intent (e.g. request, offer, commitment/promise, question, answer, notice) - as well as latency and frequency of interaction.

4.13.4 Output
Relative peer reputation and/or social capital each party has accumulated in the relationship.


5 Security and Privacy considerations
As in any open system, there are threats and vulnerabilities to be analyzed in the Reputation Management System, both because of and notwithstanding the fact that the Reputation Management System is itself a service built on a web of trust.

5.1 Threat taxonomy
[enisasec] describes a range of threats; many of these are captured here for reference, given their possible relevance to some or all of the discussed use-cases.

5.1.1 Whitewashing Attack
The adversary resets a poor reputation by rejoining the system with a new identity. Systems that allow for easy change of identity and easy use of new pseudonyms are vulnerable to this attack.

5.1.2 Sybil Attack
The adversary creates multiple identities (sybils) and exploits them in order to manipulate a reputation score.

5.1.3 Impersonation and Reputation Theft
One entity acquires the identity of another entity (masquerades) and consequently steals its reputation.

5.1.4 Bootstrap issues and related threats
The initial reputation value given to a newcomer may lay it open to threats such as Sybil and Whitewashing attacks.

5.1.5 Extortion
Coordinated campaigns aimed at blackmail by damaging reputation for malicious motives.

5.1.6 Denial of Reputation
Attack designed to damage an entity’s reputation (e.g. in combination with a Sybil attack or impersonation) and create an opportunity for blackmail in order to have the reputation cleaned.

5.1.7 Ballot-stuffing and bad-mouthing
Reporting of a false reputation score; the attackers (distinct or sybils) collude to give positive/negative feedback, to increase or lower a reputation.

5.1.8 Collusion
Multiple users conspire (collude) to influence a given reputation.

5.1.9 Repudiation of data or transaction
An entity can deny that a transaction happened, or the existence of data for which it was responsible.


5.1.10 Dishonest Reputer
The voter is not trustworthy in his/her scoring.

5.1.11 Privacy threats for voters and reputation owners
Reputers and reputation system owners may be unwilling or unable to provide explicitly honest inputs for fear of reprisal or backlash from (an apparently powerful) reputee. Anonymity offers a safe haven for accurate voting under these circumstances and improves the accuracy of votes.

5.1.12 Social threats
Discriminatory behavior is possible when, for example, in a second-order reputation system, an entity can choose to co-operate only with peers who have a high reputation, so that their recommendations weigh more heavily. Other possible social threats include the risk of herd behaviour, the penalisation of innovative, controversial opinions, and the vocal minority effect.

5.1.13 Threats to the lower network layers
The reputation system can be attacked by targeting the underlying infrastructure; for example, the reputation information can be manipulated/replayed/disclosed both when stored and when transported, or may be made unavailable by a denial of service attack.

5.1.14 Trust topology threats
An attack targets certain links to have maximum effect, for example those entities with the highest reputation.

5.1.15 Threats to ratings
There is a whole range of threats to reputation ratings which exploit features of the metrics used by the system to calculate the aggregate reputation rating from the single scores.

5.2 Countermeasures

5.3 Privacy considerations

5.3.1 Privacy of Reputee
Reputation data (input and score) SHOULD NOT include information (meta-data) that relates to Personal Information (PI) or Personally Identifiable Information (PII).

5.3.2 Privacy of Reputer
The portable reputation format should provide for and preserve anonymity, where desired or required, of the reputation provider from the reputation consumer and the reputee. This implies that while the reputation calculator needs authentic information about the identity of the reputation input provider, audit and compliance requirements will still need the identity of the input source to be recorded.

5.3.3 Privacy protection between Reputers
The potential for reputers being influenced, in specific instances, by other reputers is also detrimental to the integrity and accuracy of the reputation input.


A. Acknowledgements
The following individuals have participated in the creation of this specification and are gratefully acknowledged:
Participants:
[Participant Name, Affiliation | Individual Member]
[Participant Name, Affiliation | Individual Member]


B. Non-Normative Text


C. Revision History
[optional; should not be included in OASIS Standards]

Revision | Date | Editor | Changes Made
0.1 | 17 September 2008 | Mahalingam Mani | Initial version
0.2 | 30 September 2008 | Mahalingam Mani | Updates to use-case sections, introduction to the reference model based on initial TC discussions. Also introducing the security and privacy considerations section.
0.21 | 29 October 2008 | Mahalingam Mani | Expanded on the reference model and security considerations. Refined use-cases text.
