Transcript of September 14, 2006 Lecture 3

1

September 14, 2006, Lecture 3

IS 2150 / TEL 2810 Introduction to Security

2

What is a secure system? A simple definition

A secure system does not allow violations of a security policy.

Alternative view: based on the distribution of rights to the subjects.

Leakage of rights (unsafe with respect to a right r): assume that A, the access control matrix representing a secure state, does not contain right r in any element. Right r is said to be leaked if a sequence of operations/commands adds r to an element of A that did not previously contain r.
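
To make leakage concrete, here is a minimal Python sketch (not from the lecture; the matrix contents and the grant_read command are invented for illustration). It models the access control matrix A as a dictionary and shows a command adding the right "read" to a cell that did not contain it:

# Access control matrix A: (subject, object) -> set of rights.
# Initially no cell contains "read", so the state is assumed secure
# with respect to r = "read".
A = {
    ("alice", "file1"): {"own"},
    ("bob",   "file1"): set(),
}

def grant_read(granter, grantee, obj):
    """Hypothetical command: an owner enters 'read' into A[grantee, obj]."""
    if "own" in A[(granter, obj)]:
        A[(grantee, obj)].add("read")

grant_read("alice", "bob", "file1")

# "read" now appears in a cell that did not contain it: the right has
# leaked, so the system is unsafe with respect to "read".
assert "read" in A[("bob", "file1")]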

3

What is a secure system?

Safety of a system with initial protection state X0:

Safe with respect to r: the system is safe with respect to r if r can never be leaked.

Otherwise the system is called unsafe with respect to right r.

4

Safety Problem: formally

Given an initial state X0 = (S0, O0, A0), a set of primitive commands c, and a right r not in A0[s, o]:

Can we reach a state Xn where there exist s, o such that An[s, o] includes a right r not in A0[s, o]?

- If so, the system is not safe
- But is "safe" secure?
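
Restated in notation (my formalization of the slide's question; \vdash^{*} denotes reachability by a finite sequence of commands):

\text{The system with initial state } X_0 = (S_0, O_0, A_0) \text{ is safe with respect to } r \iff
\neg \exists\, X_n = (S_n, O_n, A_n),\ s \in S_n,\ o \in O_n \;:\;
X_0 \vdash^{*} X_n \;\wedge\; r \in A_n[s,o] \;\wedge\; r \notin A_0[s,o]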

5

Decidability Results (Harrison, Ruzzo, Ullman)

Theorem: Given a system where each command consists of a single primitive command (mono-operational), there exists an algorithm that will determine if a protection system with initial state X0 is safe with respect to right r.

6

Decidability Results (Harrison, Ruzzo, Ullman)

Proof sketch: determine the minimum number of commands k needed to leak.

Delete/destroy: cannot cause a leak (or be detected), so such commands can be removed.

Create/enter: new subjects/objects are "equal", so treat all new subjects as one.

There is no test for the absence of rights: tests on A[s1, o1] and A[s2, o2] have the same result as the same tests on A[s1, o1] and A[s1, o2]' = A[s1, o2] ∪ A[s2, o2].

If a leak of right r is possible, it must be possible within k = n(|S0|+1)(|O0|+1)+1 commands, where n is the number of generic rights.

Enumerate all command sequences up to this bound to decide.
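
A sketch of the resulting decision procedure in Python, under simplifying assumptions not in the slides: commands are taken as already reduced (per the proof) to "enter" operations guarded by presence tests, and names like is_unsafe are mine. It brute-forces all command sequences up to the bound k, which suffices for decidability even though it is exponential in practice:

from itertools import product

# A protection state maps (subject, object) -> frozenset of rights.
# Each reduced command is (tests, (right, subject, object)), where tests
# is a list of (right, subject, object) triples that must already hold.
def step(state, command):
    tests, (r, s, o) = command
    if all(tr in state.get((ts, to), frozenset()) for tr, ts, to in tests):
        new = dict(state)
        new[(s, o)] = new.get((s, o), frozenset()) | {r}
        return new
    return None   # guard failed; command not applicable

def is_unsafe(state0, commands, r, n_rights):
    """Decide safety w.r.t. r by checking all sequences up to the HRU bound."""
    subjects = {s for s, _ in state0}
    objects = {o for _, o in state0}
    k = n_rights * (len(subjects) + 1) * (len(objects) + 1) + 1
    for length in range(1, k + 1):
        for seq in product(commands, repeat=length):
            state = dict(state0)
            for cmd in seq:
                nxt = step(state, cmd)
                if nxt is None:
                    break
                state = nxt
            else:
                # r leaked if it appears in a cell that lacked it initially
                if any(r in rights and r not in state0.get(cell, frozenset())
                       for cell, rights in state.items()):
                    return True
    return False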

7

Decidability Results (Harrison, Ruzzo, Ullman)

Theorem: It is undecidable whether a given state of a given protection system is safe for a given generic right.

For the proof we need to know about Turing machines and the halting problem.

8

What is the implication?

Safety is decidable for some models, but are they practical?

Safety analysis only works if the maximum rights are known in advance: the policy must specify all the rights someone could ever get, not just what they have now.

Where might this make sense?

9

Back to HRU: Fundamental questions

How can we determine that a system is secure? We need to define what we mean by a system being "secure".

Is there a generic algorithm that allows us to determine whether a computer system is secure?

10

Turing Machine & halting problem

The halting problem: given a description of an algorithm and a description of its initial arguments, determine whether the algorithm, when executed with these arguments, ever halts (the alternative is that it runs forever without halting).
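
The classic diagonalization argument for undecidability can be sketched in a few lines of Python (a standard illustration, not part of the slides; halts is a hypothetical decider that cannot actually exist):

def halts(program, arg):
    """Assumed total decider for the halting problem (cannot exist)."""
    raise NotImplementedError("no such decider exists; that is the point")

def paradox(p):
    # If the decider says paradox(paradox) halts, loop forever;
    # if it says it loops, halt immediately. Either answer is wrong,
    # so no total 'halts' can exist.
    if halts(p, p):
        while True:
            pass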

11

Turing Machine & halting problem

Reduce the TM halting problem to the safety problem: if the safety problem were decidable, we could decide whether a given TM halts on a given input, showing that the halting problem is decidable (contradiction).

12

Turing Machine

A TM is an abstract model of a computer, introduced by Alan Turing in 1936. A TM consists of:

A tape divided into cells, infinite in one direction
A set of tape symbols M; M contains a special blank symbol b
A set of states K
A head that can read and write symbols
An action table that tells the machine what symbol to write, how to move the head ('L' for left, 'R' for right), and what the next state is

13

Turing Machine

The action table describes the transition function δ.

Transition function δ(k, m) = (k', m', L): in state k, the symbol m at the current tape location is replaced by symbol m', the head moves left one square, and the TM enters state k'.

The halting state is qf; the TM halts when it enters this state.
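
A minimal one-step TM simulator matching this definition (a sketch; the dictionary encoding of δ and the names below are my own, not from the slides):

# Transition function δ as a dict: (state, symbol) -> (state', symbol', move).
# The single rule below encodes δ(k, m) = (k', m', L) from the slide.
delta = {("k", "m"): ("k'", "m'", "L")}

BLANK = "b"

def step(state, tape, head):
    """Apply one TM transition; assumes the head never falls off the left end."""
    new_state, new_symbol, move = delta[(state, tape[head])]
    tape[head] = new_symbol
    head += -1 if move == "L" else 1
    if head == len(tape):          # tape is infinite in one direction:
        tape.append(BLANK)         # grow it on demand to the right
    return new_state, tape, head

# One step from state k with the head on "m" in cell 1:
state, tape, head = step("k", [BLANK, "m"], 1)
assert (state, tape, head) == ("k'", [BLANK, "m'"], 0)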

14

Turing Machine

[Figure: tape diagrams illustrating two moves]

The tape holds A B C D in cells 1-4; the head is on cell 3, the current state is k, and the current symbol is C.

Let δ(k, C) = (k1, X, R), where k1 is the next state: X replaces C in cell 3, and the head moves right to cell 4, which holds D.

Let δ(k1, D) = (k2, Y, L), where k2 is the next state: Y replaces D in cell 4, and the head moves back left to cell 3 (these were the '?' cells in the original slide animation).

15

General Safety Problem

Theorem: It is undecidable whether a given state of a given protection system is safe for a given generic right.

Proof: reduce the TM to the safety problem:

Symbols and states → rights
Tape cell → subject
Cell si holding symbol A → si has A rights on itself
Last cell sk → sk has end rights on itself
State p with head at si → si has p rights on itself
Distinguished right own: si owns si+1 for 1 ≤ i < k

16

Command Mapping (Left move)

δ(k, C) = (k1, X, L), if the head is not in the leftmost cell:

command c_k,C(si, si-1)
  if own in a[si-1, si] and k in a[si, si] and C in a[si, si]
  then
    delete k from a[si, si];
    delete C from a[si, si];
    enter X into a[si, si];
    enter k1 into a[si-1, si-1];
end
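
The same command rendered as a Python function over a dictionary-based matrix (a sketch of the construction; the right-move command on the later slide is symmetric, using si+1 in place of si-1; names like acm are mine):

def left_move(acm, s_prev, s_i, k, C, k1, X):
    """HRU command for δ(k, C) = (k1, X, L), head not in the leftmost cell.
    acm maps (subject, subject) -> set of rights."""
    if ("own" in acm[(s_prev, s_i)]
            and k in acm[(s_i, s_i)]
            and C in acm[(s_i, s_i)]):
        acm[(s_i, s_i)].discard(k)       # delete k from a[si, si]
        acm[(s_i, s_i)].discard(C)       # delete C from a[si, si]
        acm[(s_i, s_i)].add(X)           # enter X into a[si, si]
        acm[(s_prev, s_prev)].add(k1)    # enter k1 into a[si-1, si-1]

# Encoding of the slides' example: tape A B C D, state k, head on s3.
acm = {("s1", "s1"): {"A"}, ("s1", "s2"): {"own"},
       ("s2", "s2"): {"B"}, ("s2", "s3"): {"own"},
       ("s3", "s3"): {"C", "k"}, ("s3", "s4"): {"own"},
       ("s4", "s4"): {"D", "end"}}
left_move(acm, "s2", "s3", "k", "C", "k1", "X")
assert acm[("s2", "s2")] == {"B", "k1"} and acm[("s3", "s3")] == {"X"}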

17

Mapping

Current state is k; current symbol is C (tape A B C D, head on cell 3).

Access control matrix encoding:

        s1     s2     s3       s4
  s1    A      own
  s2           B      own
  s3                  C, k     own
  s4                           D, end

18

Mapping (Left Move)

After δ(k, C) = (k1, X, L), where k is the current state and k1 the next state (tape A B X D, head on cell 2):

        s1     s2       s3     s4
  s1    A      own
  s2           B, k1    own
  s3                    X      own
  s4                           D, end

If the head is in the leftmost cell, both si and si-1 are s1.

19

Command Mapping (Right move)

δ(k, C) = (k1, X, R):

command c_k,C(si, si+1)
  if own in a[si, si+1] and k in a[si, si] and C in a[si, si]
  then
    delete k from a[si, si];
    delete C from a[si, si];
    enter X into a[si, si];
    enter k1 into a[si+1, si+1];
end

20

Mapping

Current state is k; current symbol is C (tape A B C D, head on cell 3).

        s1     s2     s3       s4
  s1    A      own
  s2           B      own
  s3                  C, k     own
  s4                           D, end

21

Mapping

After δ(k, C) = (k1, X, R), where k is the current state and k1 the next state (tape A B X D, head on cell 4):

        s1     s2     s3     s4
  s1    A      own
  s2           B      own
  s3                  X      own
  s4                         D, k1, end

22

Command Mapping (Rightmost move)

δ(k1, D) = (k2, Y, R) at the end of the tape becomes:

command crightmost_k1,D(si, si+1)
  if end in a[si, si] and k1 in a[si, si] and D in a[si, si]
  then
    delete end from a[si, si];
    create subject si+1;
    enter own into a[si, si+1];
    enter end into a[si+1, si+1];
    delete k1 from a[si, si];
    delete D from a[si, si];
    enter Y into a[si, si];
    enter k2 into a[si+1, si+1];
end
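
The rightmost move differs from the others in that it must create a new subject to model the growth of the tape. Rendered in the same Python style as the left-move sketch (names mine; note that k2 is entered into the new cell, matching the next slide's figure):

def rightmost_move(acm, s_i, s_new, k1, D, k2, Y):
    """HRU command for δ(k1, D) = (k2, Y, R) when the head is on the last cell."""
    if ("end" in acm[(s_i, s_i)]
            and k1 in acm[(s_i, s_i)]
            and D in acm[(s_i, s_i)]):
        acm[(s_i, s_i)].discard("end")
        acm[(s_i, s_new)] = {"own"}      # create subject s_new; si owns it
        acm[(s_new, s_new)] = {"end"}    # the new cell is now the rightmost
        acm[(s_i, s_i)].discard(k1)
        acm[(s_i, s_i)].discard(D)
        acm[(s_i, s_i)].add(Y)           # write Y in the old cell
        acm[(s_new, s_new)].add(k2)      # head and state move to s_new

acm = {("s4", "s4"): {"D", "k1", "end"}}
rightmost_move(acm, "s4", "s5", "k1", "D", "k2", "Y")
assert acm[("s4", "s4")] == {"Y"} and acm[("s5", "s5")] == {"end", "k2"}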

23

Mapping

After δ(k1, D) = (k2, Y, R), where k1 is the current state and k2 the next state (tape A B X Y b, head on the newly created cell 5):

        s1     s2     s3     s4     s5
  s1    A      own
  s2           B      own
  s3                  X      own
  s4                         Y      own
  s5                                b, k2, end

24

Rest of Proof

The protection system exactly simulates a TM:
  Exactly one end right in the ACM
  Only one right corresponds to a state
  Thus, at most one command is applicable in each configuration of the TM

If the TM enters state qf, then the right qf has leaked.

If the safety question were decidable, then represent the TM as above and determine whether qf leaks: qf leaks ⇔ the halting state appears in the matrix ⇔ the halting state is reached.

Conclusion: the safety question is undecidable.

25

Security Policy

Defines what it means for a system to be secure.

Formally: partitions the states of a system into
  a set of secure (authorized) states, and
  a set of non-secure (unauthorized) states.

A secure system is one that starts in an authorized state and cannot enter an unauthorized state.

26

Secure System - Example

[Figure: a finite state machine with states A, B, C, D and transitions among them; the authorized states are separated from the unauthorized states.]

Is this finite state machine secure?
  If A is the start state?
  If B is the start state?
  If C is the start state?
  How can this be made secure if not?
  Suppose A, B, and C are authorized states?

A reachability sketch follows below.
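
Whether such a machine is secure from a given start state is just a reachability question. A small Python sketch, with a hypothetical transition relation (the slide's figure does not specify the edges, so the ones below are invented for illustration):

AUTHORIZED = {"A", "B", "C"}
# Hypothetical transitions; the actual edges are in the slide's figure.
TRANSITIONS = {"A": {"B"}, "B": {"C"}, "C": {"D"}, "D": set()}

def is_secure(start):
    """Secure iff no unauthorized state is reachable from start."""
    seen, stack = set(), [start]
    while stack:
        state = stack.pop()
        if state in seen:
            continue
        seen.add(state)
        stack.extend(TRANSITIONS[state])
    return seen <= AUTHORIZED

print(is_secure("A"))  # False with these edges: D is reachable via B and C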

27

Additional Definitions

Security breach: the system enters an unauthorized state.

Let X be a set of entities and I be information.

I has confidentiality with respect to X if no member of X can obtain information about I.

I has integrity with respect to X if all members of X trust I:
  they trust I, its conveyance, and its protection (data integrity);
  I may be origin information or an identity (authentication);
  if I is a resource, its integrity implies it functions as it should (assurance).

I has availability with respect to X if all members of X can access I, possibly within time limits (quality of service).

28

Confidentiality Policy

Also known as information flow policy. Concerns include:
  transfer of rights;
  transfer of information without transfer of rights;
  temporal context.

The model often depends on trust: which parts of the system information could flow through, and which trusted entities must participate to enable the flow.

Highly developed in the military/government world.

29

Integrity Policy

Defines how information can be altered:
  which entities are allowed to alter data;
  the conditions under which data can be altered;
  limits to changes of data.

Examples (a sketch of the second appears below):
  A purchase over $1000 requires a signature.
  A check over $10,000 must be approved by one person and cashed by another.
  Separation of duties: for preventing fraud.

Highly developed in the commercial world.
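
A minimal Python sketch of enforcing the separation-of-duties rule from the check example (the function and field names are illustrative, not from the lecture):

def can_cash(check, approver, casher):
    """Checks over $10,000 must be approved by one person and cashed by another."""
    if check["amount"] > 10_000 and approver == casher:
        return False  # separation of duties violated
    return True

assert can_cash({"amount": 25_000}, approver="alice", casher="bob")
assert not can_cash({"amount": 25_000}, approver="alice", casher="alice")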

30

Transaction-oriented Integrity

Begin in a consistent state ("consistent" is defined by a specification).

Perform a series of actions (a transaction):
  the actions cannot be interrupted;
  if the actions complete, the system is in a consistent state;
  if the actions do not complete, the system reverts to the beginning (consistent) state.
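
This all-or-nothing behavior is what databases call atomicity. A minimal sketch using SQLite's built-in transactions (Python standard library only; the table and transfer amounts are invented for illustration):

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 0)])
conn.commit()

def transfer(amount):
    """Either both updates happen or neither does; consistency is preserved."""
    try:
        with conn:  # opens a transaction; rolls back on any exception
            conn.execute("UPDATE accounts SET balance = balance - ? "
                         "WHERE name = 'alice'", (amount,))
            if amount > 100:
                raise ValueError("insufficient funds")  # abort mid-transaction
            conn.execute("UPDATE accounts SET balance = balance + ? "
                         "WHERE name = 'bob'", (amount,))
    except ValueError:
        pass  # the rollback restored the starting (consistent) state

transfer(500)  # fails partway through: balances revert to 100 / 0
rows = conn.execute("SELECT balance FROM accounts ORDER BY name").fetchall()
assert rows == [(100,), (0,)]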

31

Trust

Theories and mechanisms rest on some trust assumptions. When an administrator installs a patch, he or she:

1. Trusts the patch came from the vendor and was not tampered with in transit
2. Trusts the vendor tested the patch thoroughly
3. Trusts the vendor's test environment corresponds to the local environment
4. Trusts the patch is installed correctly

32

Trust in Formal Verification

Formal verification provides a formal mathematical proof that given input i, program P produces output o as specified

Suppose a security-related program S is formally verified to work with operating system O

What are the assumptions?

33

Trust in Formal Methods

1. The proof has no errors
   (bugs in automated theorem provers)

2. The preconditions hold in the environment in which S is to be used

3. S is transformed into an executable S' whose actions follow the source code
   (compiler bugs, linker/loader/library problems)

4. The hardware executes S' as intended
   (hardware bugs)

34

Security Mechanism

A policy describes what is allowed; a mechanism is an entity or procedure that enforces (part of) the policy.

Example:
  Policy: students should not copy homework.
  Mechanism: disallow access to files owned by other users.
  Does the mechanism enforce the policy?

35

Common Mechanisms: Access Control

Discretionary Access Control (DAC)
  The owner determines access rights
  Typically identity-based access control: the owner specifies the other users who have access

Mandatory Access Control (MAC)
  Rules specify the granting of access
  Also called rule-based access control

Originator Controlled Access Control (ORCON)
  The originator controls access
  The originator need not be the owner!

Role-Based Access Control (RBAC)
  Access is governed by the role a user assumes

A sketch contrasting the DAC and MAC checks follows.
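
A minimal Python sketch contrasting a DAC check (owner-granted reader lists) with a MAC check (a fixed system-wide rule over security levels). The files, users, and levels are invented for illustration:

# DAC: the owner of each object decides who may read it.
dac_acl = {"grades.txt": {"owner": "prof", "readers": {"prof", "ta"}}}

def dac_can_read(user, obj):
    entry = dac_acl[obj]
    return user == entry["owner"] or user in entry["readers"]

# MAC: a system-wide rule grants access; owners cannot override it.
LEVELS = {"unclassified": 0, "secret": 1, "topsecret": 2}
user_level = {"prof": "secret", "student": "unclassified"}
obj_level = {"grades.txt": "secret"}

def mac_can_read(user, obj):
    # "no read up": the subject's level must dominate the object's level
    return LEVELS[user_level[user]] >= LEVELS[obj_level[obj]]

assert dac_can_read("ta", "grades.txt")
assert not mac_can_read("student", "grades.txt")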