A Survivability Validation Framework for OASIS Program Technologies.

Transcript of A Survivability Validation Framework for OASIS Program Technologies.


Framework Objective

• Continue to organize projects in the OASIS program so that it is possible to
– Identify to DoD users and DARPA Management where particular technologies and projects can help improve system survivability
– Identify overall coverage of the set of OASIS projects as a whole, so that we can identify threats/vulnerabilities that are not being addressed

• Create a survivability validation framework that will allow PIs to validate their proposed means for achieving survivability

• Use terminology established in the DoD and in the dependable computing and fault tolerance community (IFIP WG 10.4) for better and wider understanding


Background

• At the summer 2000 PI meeting, PIs were asked to answer three questions:
– 1. What threats/attacks is your project considering?
– 2. What assumptions does your project make?
– 3. What policies can your project enforce?

• A fourth question that covered the set of projects was also asked:
– 4. What policies can the collection of projects enforce?

• Capsule summaries of the projects were developed from answers to these questions during the winter 2001 PI Meeting

• To characterize the nature of the survivability provided by particular OASIS technology development projects and by the collection of all OASIS projects, a survivability validation framework has been developed

• Some of the framework originated in the DARPA IA Program


Developing a Characterization under the Framework (1 of 3)

• 1. A system or more generally a technology has certain functional goals over a domain of application along with certain supporting survivability and security attributes for protection
– Examples of functional goals are to provide an application, a database, a mobile code platform, an operating system
– Domains of application are where the technology applies, i.e., to clients, servers, networks, storage, database, middleware, firmware, hardware, etc. and when the technology applies, i.e., at design phase, implementation phase, operational phase

– Survivability and security attributes are standard in the DoD: system availability*, integrity*, confidentiality*, authentication*, and non-repudiation*

[*See definitions on later slides]


Developing a Characterization under the Framework (2 of 3)

• 2. The system may not be able to achieve its functional goals because of certain impairments*, i.e., threats/attacks/vulnerabilities* (TAVs), from threat agents*


Developing a Characterization under the Framework (3 of 3)

• 3. However, the system may counter the impairments or TAVs by protection mechanisms/means that are intended to provide for its particular attributes and assure that it achieves its functional goals


Relations among Concepts

[Diagram relating the concepts of the preceding slides: a System or Technology, its Functional Goals, its Domain of Application, its Survivability and Security Attributes, the Impairments or Threats against it, and the Mechanisms/Means that counter them]


Comments (1 of 2)

• While survivability and security attributes have standard definitions, they may have different senses
– Availability has different aspects, e.g., availability over a system or specific system availability for special functions of a system

• TAVs
– Form a very large class, potentially infinite, which is growing daily
– Can be viewed according to when they arise: at design phase, at implementation phase, or at operational phase
– May be considered according to where they impair a system, how they impair a system, or what they impair in a system (see the sketch below)
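As an illustration only (not part of the original briefing), a TAV could be recorded with these when/where/how/what facets; the Python class and field names below are hypothetical:

from dataclasses import dataclass

@dataclass
class TAV:
    """A threat/attack/vulnerability, classified along the facets listed above."""
    name: str    # e.g., "DDoS"
    when: str    # when it arises: design, implementation, or operational phase
    where: str   # where it impairs the system, e.g., network, server, database
    how: str     # how it impairs the system, e.g., traffic flooding, malicious code
    what: str    # what it impairs, e.g., availability of a particular service

example = TAV(name="DDoS", when="operation", where="network",
              how="traffic flooding", what="service availability")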


Comments (2 of 2)

• TAVs
– Have been considered according to various taxonomies
• Krsul’s Thesis at https://www.cerias.purdue.edu/techreports-ssl/public/97-05.pdf
• Howard’s Thesis at http://www.cert.org/research/JHThesis/Word6/
• Lough’s Thesis at http://scholar.lib.vt.edu/theses/available/etd-04252001-234145/
– Have been enumerated in databases
• Common Vulnerabilities and Exposures at http://cve.mitre.org/
• ICAT Metabase at http://icat.nist.gov/icat.cfm
• CERIAS Cooperative Vulnerability Database at https://coopvdb.cerias.purdue.edu/main/index.html


OASIS 3-Space: Attributes x Domain x Impairments

[Diagram: the OASIS 3-space, whose axes are Attributes (properties desired: Conf, Int, Avail, Auth, NR), Domain of application (when/where: Design, Implementation, Operation), and Impairments (threats/attacks/vulnerabilities, e.g., Malicious Code, DDoS, Spoof/MITM, ???)]
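A sketch only (not from the slides) of how a project’s claimed coverage of this 3-space could be recorded as a set of (attribute, domain, impairment) points; the enum, variable, and function names are hypothetical, and the domain axis is simplified to its “when” dimension:

from enum import Enum, auto

class Attribute(Enum):  # properties desired: Conf, Int, Avail, Auth, NR
    CONFIDENTIALITY = auto()
    INTEGRITY = auto()
    AVAILABILITY = auto()
    AUTHENTICATION = auto()
    NON_REPUDIATION = auto()

class Phase(Enum):  # the "when" part of the domain of application
    DESIGN = auto()
    IMPLEMENTATION = auto()
    OPERATION = auto()

# Impairments (TAVs) are kept as free-form labels, since the class is open-ended.
Coverage = set[tuple[Attribute, Phase, str]]

# Hypothetical project covering two points of the 3-space.
project_coverage: Coverage = {
    (Attribute.AVAILABILITY, Phase.OPERATION, "DDoS"),
    (Attribute.INTEGRITY, Phase.IMPLEMENTATION, "Malicious Code"),
}

def covers(coverage: Coverage, attribute: Attribute, phase: Phase, tav: str) -> bool:
    """True if the project claims coverage at this point of the 3-space."""
    return (attribute, phase, tav) in coverage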


Outline of a Characterization

• Characterizations consist of 9 parts:

1. Technology Description and Survivability/Security Problem Addressed

2. Assumptions

3. TAVs

4. Survivability and Security Attributes

5. Comparison with other Systems (Optional)

6. Survivability and Security Mechanisms

7. Rationale

8. Residual risks, limitations, and caveats

9. Cost and benefit analysis
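A minimal sketch, not part of the original outline, of how the nine parts might be captured as one record per project; the field names and types are illustrative:

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Characterization:
    """One OASIS project characterization, following the nine-part outline above."""
    technology_description: str                  # 1. functionality, problem addressed, domain
    assumptions: list[str]                       # 2. system/user/network/environment assumptions
    tavs: list[str]                              # 3. threats/attacks/vulnerabilities addressed
    attributes: list[str]                        # 4. AV, I, C, AU, NR supported
    comparison: Optional[str] = None             # 5. comparison with other systems (optional)
    mechanisms: list[str] = field(default_factory=list)       # 6. survivability/security mechanisms
    rationale: str = ""                          # 7. rationale matrix plus footnotes
    residual_risks: list[str] = field(default_factory=list)   # 8. residual risks, limitations, caveats
    cost_benefit: str = ""                       # 9. cost and benefit analysis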


Building Characterizations (1 of 6)

• 1. Technology Description and Survivability/Security Problem Addressed
– What functionality is the technology trying to provide and what in brief are its survivability and security objectives? What is its domain of application?
– Can be extracted from project information: PI briefings, papers, discussions with PI, etc.
– Aims to provide a brief high-level description of functionality and survivability and security objectives
– Should provide the domain of application and explain limitations

• 2. Assumptions
– What are the assumptions upon which the technology depends?
– Provided in the project literature or from PI
– Can be divided into assumptions about system, user, network, environment, etc.


Building Characterizations (2 of 6)

• 3. Impairments = TAVs
– What are the impairments = threats/attacks/vulnerabilities that the technology is trying to address?
– Defined to include any circumstances with potential harm to the system in the form of destruction, disclosure, adverse data modification, and/or denial of service
– Provided in the project literature or from PI
– Can be grouped systematically according to design, implementation, and operation (when the TAV may have its effect)

• 4. Survivability and Security Attributes
– What attributes is the technology trying to support among system availability (AV), integrity (I), confidentiality (C), authentication (AU), and non-repudiation (NR)?


Building Characterizations (3 of 6)

• 5. Comparison with other Systems (Optional)
– How does this technology compare with others?
– Compare the OASIS technology to existing commercial systems/practices
– Provide rationale matrices and explanations for the commercial systems

• 6. Survivability and Security Mechanisms
– What techniques are used to mitigate given TAVs? Examples are:
• Damage assessment
• Containment
• Reconfiguration
• Repair
• Fault treatment
– Intended as support for the high-level survivability and security attributes


Building Characterizations (4 of 6)

• 7. Rationale
– How do the elements fit together? Provide a rationale matrix (later slide)
– Footnote for each mechanism/assumption cell of the matrix

• Descriptive paragraph showing that the assumptions and mechanisms counter the TAVs, thus supporting claims about achieving the high-level attributes

– Rationale matrix plus footnotes only outline the beginning of validation; a validation plan is needed


Building Characterizations (5 of 6)

– Recommended techniques for goal accomplishment verification and validation

• Red team testing and analysis
• Formal assurance argument
• Formal methods of proof
• Modeling and simulation
• Code inspection
• Cryptanalysis
• Other techniques

– Independent peer review
– Summary


Example of a Rationale Matrix

Columns: the attributes AV, I, C, AU, NR. Rows: the TAVs, grouped by the phase at which they arise. Cells: the mechanisms (M) and assumptions (A) countering each TAV in support of an attribute. Example row entries:
– Design: TAV-11; TAV-12 (M1, M3); TAV-13 (M2); …; TAV-1m
– Implementation: TAV-21 (A4); TAV-22; TAV-23 (A3); …; TAV-2n
– Operation: TAV-31 (M2, M8); TAV-32 (M5, M6); TAV-33; …; TAV-3p
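A sketch only (not in the slides) of how such a rationale matrix could be held as a mapping from (TAV, attribute) cells to the mechanisms/assumptions and their footnotes; the cell placements and footnote text below are hypothetical, reusing labels from the example matrix:

from collections import defaultdict

# Each cell (tav, attribute) maps to a list of (mechanism_or_assumption, footnote).
rationale_matrix: dict[tuple[str, str], list[tuple[str, str]]] = defaultdict(list)

rationale_matrix[("TAV-12", "AV")].append(
    ("M1", "Footnote: how M1 counters TAV-12 in support of availability."))
rationale_matrix[("TAV-12", "AV")].append(
    ("M3", "Footnote: how M3 covers the residual exposure left by M1."))
rationale_matrix[("TAV-21", "I")].append(
    ("A4", "Footnote: assumption A4 rules TAV-21 out of scope at implementation."))

def footnotes_for(tav: str, attribute: str) -> list[str]:
    """Collect the footnotes arguing that this TAV is countered for this attribute."""
    return [note for _, note in rationale_matrix[(tav, attribute)]]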


Building Characterizations (6 of 6)

• 8. Residual risks, limitations, and caveats
– What are the residual risks or gaps?
– These may be determined from the arguments

• 9. Cost and benefit analysis
– What are the costs with respect to the benefits?
– Cost metrics (quantified if possible)
• Performance degradation
• Functionality change
• Storage needs
• Network bandwidth requirements
• Cost as $
– Benefit metrics (quantified if possible)
• Probability of surviving an attack, loss of data, loss of confidentiality
• Length of time in successfully defending against attacker
– One-to-one correspondence of mechanisms to goals
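A sketch, under assumed units and field names, of how the cost and benefit metrics from part 9 might be recorded so projects can be compared:

from dataclasses import dataclass

@dataclass
class CostMetrics:
    # Cost metrics, quantified where possible (units are assumptions).
    performance_degradation_pct: float
    functionality_change: str
    storage_needs_mb: float
    bandwidth_kbps: float
    dollar_cost: float

@dataclass
class BenefitMetrics:
    # Benefit metrics (probabilities in [0, 1]; defense time in hours).
    p_survive_attack: float
    p_avoid_data_loss: float
    p_preserve_confidentiality: float
    hours_defended: float

def summary(cost: CostMetrics, benefit: BenefitMetrics) -> str:
    """One-line pairing of costs with benefits for a characterization."""
    return (f"survives attack with p={benefit.p_survive_attack:.2f} "
            f"at {cost.performance_degradation_pct:.1f}% performance cost "
            f"and ${cost.dollar_cost:,.0f}")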

Definitions


Terminology and Definitions (1 of 2)
NSA and IFIP WG 10.4

• Attack – An attempt to bypass security controls on a computer. The attack may alter, release, or deny data. Whether an attack will succeed depends on the vulnerability of the computer system and the effectiveness of existing countermeasures.

• Authentication – To positively verify the identity of a user, device, or other entity in a computer system, often as a prerequisite to allowing access to resources in a system.

• Availability – Assuring information and communications services will be ready for use when expected.

• Confidentiality – Assuring information will be kept secret, with access limited to appropriate persons.

• Impairment – [IFIP WG 10.4 definition]. Regarded in this program as TAV.

• Integrity – Assuring information will not be accidentally or maliciously altered or destroyed.


Terminology and Definitions (2 of 2)
NSA and IFIP WG 10.4

• Non-repudiation – Method by which the sender of data is provided with proof of delivery and the recipient is assured of the sender’s identity, so that neither can later deny having processed the data.

• Threat – The means through which the ability or intent of a threat agent to adversely affect an automated system, facility, or operation can be manifest. A potential violation of security.

• Threat agent – Methods and things used to exploit a vulnerability in an information system, operation, or facility; fire, natural disaster, and so forth.

• Vulnerability – Hardware, firmware, or software flaw that leaves an automated information system (AIS) open for potential exploitation. A weakness in automated system security procedures, administrative controls, physical layout, internal controls, and so forth, that could be exploited by a threat to gain unauthorized access to information or disrupt critical processing.