Framework for Managing the Assured Information Sharing Lifecycle
February 2009
Framework for Managing the
Assured Information Sharing Lifecycle
2008 MURI project with UMBC, Purdue, U. Texas Dallas, U. Illinois, U. Texas San Antonio, and U. Michigan
Objectives:
• Create a new framework for assured information sharing recognizing that sharable information has a lifecycle of production, release, advertising, discovery, acquisition and use
• Develop techniques grounded in this model to promote information sharing while maintaining appropriate security, privacy and accountability
• Evaluate, adapt and improve the AIS concepts and algorithms in relevant demonstration systems and test beds
See http://aisl.umbc.edu/ for papers and more information
AIS Lifecycle Approach
• Design a service-oriented architecture to support the assured information sharing lifecycle
• Create new policy models & languages to express and enforce AIS rules & constraints
• Develop new data mining techniques and algorithms to track provenance, increase quality and preserve privacy
• Model underlying organizational social networks to estimate trust and information novelty
• Design incentive structures to motivate sharing in organizations and coalitions
Information value chain: information has a lifecycle involving a web of producers and consumers. All aspects of the lifecycle are shaped by distributed information sharing policies. Integration and mining create new information that may be shared; access may involve negotiating policy-defined obligations.
Selected AISL Recent Results
① Progress on models, architectures, languages and mechanisms for trustworthiness-centric assured information sharing (UTSA, Purdue)
② Techniques for resolving conflicting facts extracted from different sources (UIUC)
③ Study of information sharing motivation and quality in online forums (Michigan)
④ Modeling incentives & trust in info. sharing (UTD)
⑤ Learning statistically sound trust metrics (UTD)
⑥ Inferring access policies from logs (UMBC)
⑦ Policies for privacy in mobile information systems (UMBC, Purdue)
Trustworthiness-centric AIS Framework
• Objective: create a trustworthiness-centric assured information sharing framework
• Approach: design models, architectures, language and mechanisms to realize it
• Key challenges:
– Trustworthiness and risk management for end-user decision making
– Usage management that extends access control
– Attack management, including trustworthiness of infrastructure services
– Identity management extending the current generation
– Provenance management for managing trustworthiness of data, software, and requests
Trustworthiness-centric assured information sharing framework
Trustworthiness management
Risk management
Usage management (of authorized activities)
Identity management (of people, organizations, and devices)
Attack management (of unauthorized activities)
Provenance management (of data, software, and requests)
Note: “trustworthiness risk” in general
Progress on Trustworthiness-centric AIS
• Initial framework will be published as: S. Xu, R. Sandhu & E. Bertino, “Trustworthiness-centric Assured Information Sharing” (invited paper), 3rd IFIP Int. Conf. on Trust Management, 2009
• Design for identity & provenance management underway
• Group-centric info sharing model extends the traditional dissemination model with new intuitive metaphors: secure meeting room and subscription service
• Developed a family of security models for the semantics of basic group operations (join, leave, add, remove) and proved security properties about them
• Results published in recent conference papers
Truth Discovery with Multiple Conflicting Information Providers [TKDE’08]
• Problem: Multiple information providers may provide conflicting facts about the same object, e.g., different author names for a book. Which is the true fact?
• Heuristic Rule 1: False facts on different web sites are less likely to be the same or similar, since false facts are often introduced by random factors
• Heuristic Rule 2: A web site that provides mostly true facts for many objects will likely provide true facts for other objects
[Figure: bipartite graph linking web sites w1–w4 to facts f1–f5 about objects o1 and o2]
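The two heuristic rules above suggest a simple fixed point: a site's trustworthiness depends on the confidence of its facts, and a fact's confidence depends on the trustworthiness of the sites asserting it. A minimal sketch in that spirit (the data, initial trust value, and update rules are illustrative simplifications, not the TKDE’08 algorithm):

```python
def truth_discovery(claims, n_iter=20):
    """claims: dict mapping web site -> set of facts it asserts."""
    sites = list(claims)
    facts = {f for fs in claims.values() for f in fs}
    trust = {w: 0.8 for w in sites}  # illustrative initial trustworthiness
    conf = {}
    for _ in range(n_iter):
        # Fact confidence: a fact backed by several trustworthy sites is
        # unlikely to be a coincidence (Heuristic Rule 1).
        for f in facts:
            p_false = 1.0
            for w in sites:
                if f in claims[w]:
                    p_false *= (1.0 - trust[w])
            conf[f] = 1.0 - p_false
        # Site trustworthiness: average confidence of the facts it
        # provides (Heuristic Rule 2).
        for w in sites:
            trust[w] = sum(conf[f] for f in claims[w]) / len(claims[w])
    return trust, conf

# Two sites agree on one author name; a third gives a variant.
claims = {
    "w1": {("book1", "J. Smith")},
    "w2": {("book1", "J. Smith"), ("book2", "A. Lee")},
    "w3": {("book1", "J. Smyth")},
}
trust, conf = truth_discovery(claims)
```

After iteration, the fact corroborated by two sites ends up with higher confidence than the uncorroborated variant, and the sites providing it are rated more trustworthy.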
Truth-Discovery: Framework Extension
• Multiple versions of truth: Democrats vs. Republicans may have different views
• Truth may change with time: a player may win first but then lose
• Truth is a relative, dynamically changing judgment: incremental updates with recent data in data streams
• Method: Veracity-Stream, dynamic information network mining for veracity analysis in multiple data streams
• Current testing data set: Google News, a dynamic news feed that provides functions and facilities to search and browse 4,500 news sources updated continuously
Motivation & quality in information sharing
• Analyzed online Q&A forums: 2.6M questions, 4.6M answers and interviews with 26 top answerers
• Motivations to contribute include altruism, learning, competition (via a point system) and as a hobby
• Users who contribute more often and less intermittently contribute higher quality information
• Users prefer to answer unanswered questions and to respond to incorrect answers
• See “Questions in, Knowledge iN? A Study of Naver's Question Answering Community”, Nam, Ackerman, Adamic, CHI 2009
Knowledge iN
FEARLESS engineering
Incentives & Trust in Assured Information Sharing
• Goal: Create means of encouraging desirable behavior within an environment that lacks or cannot support a central governing agent
• Approach: Combining intelligence through a loose alliance
– Bridges gaps due to sovereign boundaries
– Maximizes yield of resources
– Discovery of new information through correlation and analysis of the ‘big picture’
– Information exchanged privately between two participants
• Drawbacks to sharing include misinformation and freeloading
Our Model
• Players assumed to be rational
• The game of information trading
– Strategies: be truthful, lie, refuse to participate
– One game played for each possible pair of players, all games played simultaneously in a single round; game repeated ‘infinitely’
– Players may verify the information they received at some cost
• When to verify becomes an aspect of the game
– Always verifying works poorly in light of honest equilibrium behavior, but never verifying may yield the game to lying opponents
• Add EigenTrust to the game
– A distributed trust metric where each player asks others for their opinion of a third
– Based on known perfect information
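The EigenTrust aggregation step can be sketched as a power iteration over the matrix of local opinions: each player's global trust is the trust-weighted opinion of the others. This is a centralized sketch for clarity (the actual metric is computed distributedly, and the example matrix is made up):

```python
def eigentrust(local_trust, n_iter=50):
    """local_trust[i][j]: player i's normalized opinion of player j
    (each row sums to 1). Returns the global trust vector."""
    n = len(local_trust)
    t = [1.0 / n] * n  # start from uniform trust
    for _ in range(n_iter):
        # t <- C^T t: a player's global trust is others' opinion of it,
        # weighted by how trusted those others are themselves.
        t = [sum(local_trust[i][j] * t[i] for i in range(n))
             for j in range(n)]
    return t

# Three players: 0 and 1 vouch for each other; both largely distrust 2.
C = [
    [0.0, 0.9, 0.1],
    [0.9, 0.0, 0.1],
    [0.5, 0.5, 0.0],
]
t = eigentrust(C)
```

Because each row of C sums to 1, the total trust mass is conserved, and the iteration converges to the principal eigenvector: the distrusted player ends up with a much lower global score.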
Simulation Results
• We set δmin = 3, δmax = 7, CV = 2
• Lie threshold is set to 6.9
• Honest behavior wins 97% of the time if all behaviors exist
• Experiments show that without LivingAgent behavior, honest behavior cannot flourish
“Incentive and Trust Issues in Assured Information Sharing”, Ryan Layfield, Murat Kantarcioglu, and Bhavani Thuraisingham, International Conference on Collaborative Computing, 2008
Learning statistically sound trust scores
• Goal: Build a statistically sound trust-based scoring system for effective access control, applying ideas from credit scoring
• Approach: Find appropriate predictive variables using concepts and methodologies from credit scoring systems; incorporate a utility function into the scoring system to set up score-related access policies
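A credit-score-style trust score can be sketched as a weighted combination of predictive variables passed through a logistic function, with a score cutoff acting as the access policy. The variables, weights, and threshold below are illustrative assumptions, not the project's actual model:

```python
import math

def trust_score(past_good, past_bad, account_age_days):
    # Predictive variables, analogous to credit-scoring inputs.
    x = [
        past_good / (past_good + past_bad + 1),  # success ratio of past interactions
        math.log1p(account_age_days) / 10,       # tenure in the system
    ]
    w = [3.0, 1.0]  # illustrative weights (would be fitted statistically)
    b = -1.5
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))  # score in (0, 1)

def allow_access(score, threshold=0.5):
    # Score-related access policy: grant only above a cutoff. The cutoff
    # can be tuned via a utility function weighing the cost of wrongly
    # granting access against the cost of wrongly denying it.
    return score >= threshold
```

A long-standing user with a good track record scores well above the cutoff, while a new user with mostly bad interactions falls below it.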
[Figure: Trust-Based Access Control Processes]
Inferring RBAC Policies
• Problem: A system whose access policy is known is more vulnerable to attacks and insider threats. Attackers may infer likely policies from access observations, partial knowledge of subject attributes, and background knowledge
• Objective: Strengthen policies against discovery
• Approach: Explore techniques to propose policy theories via machine learning such as ILP
• Results: promising initial results for simple Role-Based Access Control policies
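To illustrate the attacker's side of the problem, here is a much simpler stand-in for the ILP approach: group users whose observed accesses coincide and propose each group's shared permission set as a candidate role. The log entries and names are made up for illustration:

```python
from collections import defaultdict

def propose_roles(access_log):
    """access_log: iterable of (user, permission) observations.
    Returns {frozenset_of_permissions: set_of_users} as candidate roles."""
    perms = defaultdict(set)
    for user, perm in access_log:
        perms[user].add(perm)
    roles = defaultdict(set)
    for user, ps in perms.items():
        roles[frozenset(ps)].add(user)  # same permission set -> same candidate role
    return dict(roles)

log = [
    ("alice", "read_payroll"), ("alice", "edit_payroll"),
    ("bob",   "read_payroll"), ("bob",   "edit_payroll"),
    ("carol", "read_wiki"),
]
roles = propose_roles(log)
```

Even this crude grouping recovers a plausible payroll role from the log, which is exactly why the project studies hardening policies against such inference.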
Privacy policies for mobile computing
• Problem: mobile devices collect and integrate sensitive private data about their users, which the users would like to selectively share with others
• Objective: Develop a policy-based system for information sharing with an interface enabling end users to write & adapt privacy policies
• Approach: prototype a component for iConnect on an iPhone and evaluate it in a university environment
• Example policy rules: share my exact location with my family; share current activity with my close friends, …
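The example rules above can be sketched as simple (group, attribute, precision) tuples evaluated with default deny. The rule format, group names, and requesters below are assumptions for illustration, not the AISL prototype's actual policy language:

```python
POLICY = [
    # (requester_group, attribute, precision at which sharing is allowed)
    ("family",        "location", "exact"),
    ("close_friends", "activity", "full"),
]

# Mapping from requesters to the group the user placed them in.
GROUPS = {"mom": "family", "dana": "close_friends", "stranger": None}

def share(requester, attribute):
    """Return the precision at which `attribute` may be shared with
    `requester`, or None to deny (default deny)."""
    group = GROUPS.get(requester)
    for g, attr, precision in POLICY:
        if g == group and attr == attribute:
            return precision
    return None
```

Default deny keeps the policy safe when no rule matches: a stranger gets nothing, and even a close friend only gets the attributes the rules explicitly grant.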