Spring 2011© 2000-2011, Richard A. Stanley
ECE579S/2 #1
ECE579S Computer & Network Security
2: Identification and Authentication; Access Control
Prof. Richard A. Stanley
Overview of Tonight’s Class
• Review of last class
• Identification and authentication
• Design principles
Last time...
• Block ciphers are widely used
• Most commonly used block cipher today is TDEA, operating in one of 4 modes
• TDEA is limited by 64-bit block and key size, and performs poorly in software
• AES chosen to replace TDEA
• Should be several years of coexistence
More of last time...
• Both symmetric and asymmetric crypto have their uses in communications
• Symmetric keys can be purely random, but asymmetric keys are mathematically related
• Symmetric crypto is much faster than asymmetric, which leads to combining the types in practical applications
Cryptosystems Compared
• Symmetric key
  – Same key both ends
  – Key management a problem; requires secure side channel
  – Fast
  – Message length < key length
• Asymmetric key
  – Two keys
    • Public key, known to all
    • Private key, known to owner alone
  – Key management less of a problem
  – Computationally intensive, so it is slow
Hashing: the Final Tool
• Encryption seeks to obscure plaintext with a key, so that the plaintext can be recovered
• Hash functions produce fixed-length output given variable-length input, such that the hash output will change substantially if even a single bit of the input is changed
  – Similar to checksum or CRC for data integrity
  – Depends on hash function being one-way
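The fixed-length and avalanche properties above can be checked directly with Python's standard `hashlib`; a small sketch (the example messages are invented):

```python
import hashlib

# Fixed-length output: 20 bytes (160 bits) for SHA-1, whatever the input size.
short = hashlib.sha1(b"hi").digest()
long_ = hashlib.sha1(b"x" * 1_000_000).digest()
print(len(short), len(long_))   # 20 20

# Avalanche: a one-character change flips many output bits.
a = int.from_bytes(hashlib.sha1(b"Transfer $100 to Alice").digest(), "big")
b = int.from_bytes(hashlib.sha1(b"Transfer $900 to Alice").digest(), "big")
diff = bin(a ^ b).count("1")    # number of output bits that differ
print(diff)                     # roughly half of the 160 bits, on average
```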
Hash Function Uses / Benefits
• Verify the integrity of a block of data
  – e.g. a message
• Faster to compute than encrypted version of input
• Always produces an output of known and fixed length
  – Useful in many applications
Hash Function Requirements
• Hash value h = H(x), where H is the hash function of some input x
  – Input x can be of any length
  – Output H(x) has fixed length
  – H(x) relatively easy to compute for any x
  – H(x) is one-way
  – H(x) is collision-free
Hashing Terms
• One-way
  – H(x) is one-way if, given h, it is computationally infeasible to find x such that H(x) = h
  – i.e. H(x) is hard to invert
• Collisions
  – Weakly collision-free: given x, computationally infeasible to find y ≠ x such that H(x) = H(y)
  – Strongly collision-free: computationally infeasible to find any two messages x and y such that H(x) = H(y)
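To see why output length matters for collision resistance, here is a sketch that deliberately truncates SHA-256 to 16 bits and finds a collision by brute force; the full 256-bit function has no known collisions:

```python
import hashlib
from itertools import count

def weak_hash(data: bytes) -> bytes:
    """A deliberately weakened hash: only the first 2 bytes (16 bits) of SHA-256."""
    return hashlib.sha256(data).digest()[:2]

# Birthday effect: with only 2**16 possible outputs, a collision
# appears after a few hundred inputs on average.
seen = {}
for i in count():                 # try inputs b"0", b"1", b"2", ...
    msg = str(i).encode()
    digest = weak_hash(msg)
    if digest in seen:            # two distinct messages, same 16-bit hash
        print("collision:", seen[digest], "and", msg)
        break
    seen[digest] = msg
```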
Hash Function Example
Source: RSA Laboratories, Inc.
Common Hash Functions

Algorithm       MD2        MD4        MD5        SHA-1
Output length   128 bits   128 bits   128 bits   160 bits
Block size      128 bits   512 bits   512 bits   512 bits
Specification   RFC 1319   RFC 1320   RFC 1321   FIPS 180-1
Types of Hash
• “Standard”
  – Message is input to the hash function
  – Hash calculated according to the standard
  – Same message always produces same hash
• Keyed, or secure, hash
  – Message is one input to the hash function
  – Secret key is another input
  – Output depends on both key and message
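The keyed-hash idea is standardized as HMAC; a minimal sketch using Python's stdlib `hmac` module (the key and message are invented):

```python
import hashlib
import hmac

key = b"shared-secret-key"     # hypothetical secret shared by sender and verifier
msg = b"pay 100 to alice"

tag = hmac.new(key, msg, hashlib.sha256).digest()

# The verifier recomputes the tag with the same key and compares in
# constant time; without the key, a forger cannot produce a valid tag.
print(hmac.compare_digest(tag, hmac.new(key, msg, hashlib.sha256).digest()))        # True
print(hmac.compare_digest(tag, hmac.new(b"wrong-key", msg, hashlib.sha256).digest()))  # False
```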
Broken Hashes
• The CRYPTO 2004 conference announced collisions in MD5 and other hash functions
• The impact is that two messages can be found that produce the same hash, although the hash cannot be chosen in advance
  – If a hash collision on two different messages is known, one could generate a signature for the first message but later claim the second message was the one signed. This is a repudiation attack.
  – If a hash collision on two different messages is known, a victim could be induced to sign one message; the signature would also be valid for the second message, which the victim did not intend to sign. This is a chosen-message forgery.
Impact?
• Broken hashes should be used with great caution in signatures
• MD4 & MD5 are the most commonly used hashes for which collisions have been found to date
• Further progress in finding collisions could lead to serious problems in the digital signature area
So What?
• Hash functions can be used to provide
  – Fast integrity check on data
• Asymmetric key cryptography can provide
  – Session key negotiation
  – User authentication (with some help)
• We now have all the cryptographic tools needed to provide confidentiality, integrity, and authentication
The Good Old Days
• Mainframe computers
  – Physically isolated from casual access by unauthorized personnel
  – Programs, data passed to/from computer by trusted staff
  – No authorization, no job
• So, no problem, right?

IBM 360 mainframe
The Good New Days
• Computers are everywhere
• Access can often be achieved by walking up to the keyboard/display and beginning to work
• What’s an authorization number?
• So, big problem, right?
Access Control
• Determines and monitors who can do what with what in the computer
• Is much more than establishing a physical perimeter around the computer
• Can’t happen without identification and authentication (about which, more later)
• Needs to be instantiated in a policy
Subjects and Objects
• Remember your English grammar
• Subjects act
• Objects are acted upon
• These roles are not graven in stone
  – If you hit the ball, you are the subject
  – If the ball hits you, you are the object
• It is just the same in computer science
Access Control Model
Subject → Request → Reference Monitor → Object
Any of these points is a vulnerability. How to protect?
Reference Monitor
• Makes access control work
• You can tell it
  – What a subject is allowed to do (privilege)
  – What may be done with an object (permission)
• In order to specify these things, you need to know all the possibilities, or you need to define things narrowly so that what you don’t know doesn’t become allowed
Access Operations (Example)
• Observe
  – Read
  – Write
• Alter
  – Write
  – Append
• How do you execute a program?
Unix Access Control
• Read: read a file
• Write: write to a file
• Execute: execute a file
• Interpreted according to where the access rights are to be granted
Access Control Types
• Discretionary: the file owner is in charge
• Mandatory: the system policy is in charge
• One can exist within the other, especially discretionary within a class of mandatory
Access Control Matrix
• A = set of access operations permitted
• S = set of subjects
• O = set of objects

M = (Mso), s ∈ S, o ∈ O, with Mso ⊆ A; the entry Mso lists the operations subject s may perform on object o
Access Control Matrix Example
         Bill.doc   Edit.exe   Fun.com
Bill     r,w        e          e,r,w
Alice               e          e,r
How easy is this to implement?
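The matrix above can be held as a sparse map from (subject, object) pairs to sets of rights; a reference-monitor check is then a single lookup. A sketch:

```python
# Sparse access control matrix: M[(subject, object)] = set of rights.
# Empty cells (e.g., Alice on Bill.doc) are simply absent.
M = {
    ("Bill",  "Bill.doc"): {"r", "w"},
    ("Bill",  "Edit.exe"): {"e"},
    ("Bill",  "Fun.com"):  {"e", "r", "w"},
    ("Alice", "Edit.exe"): {"e"},
    ("Alice", "Fun.com"):  {"e", "r"},
}

def allowed(subject: str, obj: str, right: str) -> bool:
    """Reference-monitor check: is `right` in Mso?"""
    return right in M.get((subject, obj), set())

print(allowed("Alice", "Fun.com", "r"))   # True
print(allowed("Alice", "Bill.doc", "r"))  # False: an empty cell grants nothing
```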
Access Control Lists
• Stores the access rights within the object
• Convenient, quick
  – This is the Unix approach
• Difficult to modify globally w.r.t. subjects, easy w.r.t. the object
• How to find out what a subject is able to do?
Intermediate Controls
• Groups
• Negative permissions
• Protection rings
• Abilities
• Privileges
• Role-based
Security Levels
• Linear
  – Top secret
  – Secret
  – Confidential
  – Unclassified
• Lattice
  – Security level
  – Compartment
Security Level Examples
• Linear
  – Marking contains the name of the level
  – Each higher level dominates those below it
• Lattice
  – Marking contains name of level + name of compartment (e.g. TOP SECRET PETUNIA)
  – Only those “read into” the compartment can read the information in that compartment, and then only at the level of their overall access
Who Can Read What?
• In a linear system?
• In a lattice system?
• What is dominance?
System High/Low
• System High is the highest security level in the system. It can be thought of as the apex of all lattice levels
• System Low is the lowest security level in the system. It can be thought of as that level which all system users can “see”
• Question
  – In a Unix system, what level should be assigned to the root directory?
Security Models Implement Access Control Policy
• Why?
  – If you can’t describe it, you can’t measure it, and you don’t know what it is
  – Policy requires a model
  – Security requires a policy
  – Q.E.D.
Security Model Types
• Formal (high-assurance computing)
  – Bell-LaPadula
  – Biba
  – Chinese Wall
• Informal (policy description)
  – Clark-Wilson
Bell-LaPadula
• Describes access policies and permissions in a state machine model of a computer
• S is the set of subjects
• O is the set of objects
• A is the set of access operations = {execute, read, append, write} = {e, r, a, w}
• L is the set of security levels with partial ordering
What’s a Partial Ordering?
• A partial ordering ≤ on a set L is a relation where:
  – for all a ∈ L, a ≤ a holds (reflexive)
  – for all a, b, c ∈ L, if a ≤ b and b ≤ c, then a ≤ c (transitive)
  – for all a, b ∈ L, if a ≤ b and b ≤ a, then a = b (antisymmetric)
• Two elements a, b ∈ L need not be comparable at all; a partial ordering does not require every pair to be ordered
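Lattice security labels of the form (level, compartment set) are a partial ordering in exactly this sense: some label pairs dominate one another, and some are incomparable. A sketch (level names follow the earlier slide; compartment names are invented):

```python
# A label is (level_name, set_of_compartments). Label a dominates label b
# when a's level is at least b's AND a's compartments include all of b's.
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def dominates(a, b) -> bool:
    a_level, a_comps = a
    b_level, b_comps = b
    return LEVELS[a_level] >= LEVELS[b_level] and a_comps >= b_comps

ts_petunia = ("TOP SECRET", {"PETUNIA"})
s_plain    = ("SECRET", set())
s_rose     = ("SECRET", {"ROSE"})

print(dominates(ts_petunia, s_plain))  # True: higher level, superset of compartments
print(dominates(ts_petunia, s_rose))   # False: ROSE not held
print(dominates(s_rose, ts_petunia))   # False either way: the labels are incomparable
```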
Bell-LaPadula Access Rights
• e: execute
• r: read
• a: append
• w: write
• Don’t assume anything when dealing with security!
State Machines
• BLP: security = property of states
• State is a representation of the system at an instant in time
• State transition occurs when the state changes
• State transitions may be constrained
• With a 2.5 GHz processor, what is the likely rate of state change?
• What are the chances that you can capture all the states of even a desktop computer? Why?
Bell-LaPadula
• Is a state machine model
• Utilizes the machine state to check security
  – All permissions must be captured
  – All subjects accessing objects must be captured
  – These are machine states
• Complicated state set results
• Defining state set is the major BLP problem
Access Control Model
Subject → Request → Reference Monitor → Object

The Reference Monitor validates all requests against permitted state functions
We have seen this before, and we will see it again
BLP Security Policies
• Mandatory security policies
• Simple security (ss) policy (no read up)
• Star (*) policy (no write down)
  – How to send messages from high to low?
  – Trusted subjects can violate policy
• Discretionary (ds) policy
• If all three properties are satisfied, a state is secure
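With linear levels, the ss- and *-properties reduce to two integer comparisons; a minimal sketch (higher number = more sensitive):

```python
def can_read(subject_level: int, object_level: int) -> bool:
    """ss-property (no read up): read only at or below the subject's level."""
    return subject_level >= object_level

def can_write(subject_level: int, object_level: int) -> bool:
    """*-property (no write down): write only at or above the subject's level."""
    return subject_level <= object_level

SECRET, CONFIDENTIAL = 2, 1
print(can_read(SECRET, CONFIDENTIAL))   # True: reading down is allowed
print(can_write(SECRET, CONFIDENTIAL))  # False: writing down could leak data
```

Note how the two rules together let information flow only upward, which is exactly why the slide asks how a high subject can ever send a legitimate message to a low one.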
Basic Security Theorem
• A state transition is secure if both the initial and the final states are secure, so
• If all state transitions are secure and the initial system state is secure, then every subsequent state will also be secure, regardless of which inputs occur. (Proof)
Tranquility
• Security levels and access rights are never changing = tranquility
• Operations that do not change access rights are tranquil
• Does BLP really capture security?
• What are your views of McLean’s criticism?
BLP Advantages
• Descriptive capabilities of the model
• Policies based on security levels -- easy to introduce other structures in their place
• Actual security policies
• Specific solution (e.g. Multics)
BLP Disadvantages
• Deals only with confidentiality, not integrity
• Does not address management of access control
• Contains covert channels
Covert Channel
• An information flow that is not controlled by a security mechanism
• Can occur by allowing low-level subjects to see names, results of comparisons, etc. of high-level objects
• Difficult to find, difficult to control, critical to success
Harrison-Ruzzo-Ullman Model
• Deals with BLP lack of procedures to change access rights
• Uses a structured programming approach to modify the access control matrix
• Provides a view of complex systems modeled by complex models
• The more complex a security model is, the more difficult it usually is to verify security properties
HRU Command Structure

command c(x1, ..., xk)
  if r1 in Ms1,o1 and ... and rm in Msm,om
  then
    op1
    ...
    opn
end
HRU Create File Example
command create_file(s, f)
  create f
  enter o into Ms,f
  enter r into Ms,f
  enter w into Ms,f
end
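The create_file command can be mirrored in code as a small function that applies the primitive operations to a sparse access matrix; a sketch (the matrix representation and right names follow the slides):

```python
# Access matrix as a dict of right sets; an HRU command is a guarded
# sequence of primitive operations that edits the matrix.
M = {}

def create_file(s: str, f: str) -> None:
    """HRU create_file: create f, then give s owner/read/write rights."""
    M[(s, f)] = set()             # "create f": a new, empty matrix cell
    for right in ("o", "r", "w"):
        M[(s, f)].add(right)      # "enter <right> into Ms,f"

create_file("alice", "notes.txt")
print(M[("alice", "notes.txt")])  # the set {'o', 'r', 'w'}
```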
Chinese Wall Model
• Rule: there must be no information flow that causes a conflict of interest
• Information concerning a single company forms the set of objects O
• Company dataset function y: O → C
• Conflict class function x: O → P(C)
• N is a Boolean matrix where
  – Ns,o = true, if s ever had access to o
  – Ns,o = false, if s never had access to o
Chinese Wall Properties
• Object security label: (x(o), y(o))
• Sanitized information is not subject to access restrictions: x(o) = ∅
• ss-property: s can access o iff, for all o' with Ns,o' = true, y(o) ∉ x(o') or y(o) = y(o')
• *-property: s can write to o iff s cannot read any o' with y(o') ≠ y(o) and x(o') ≠ ∅
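The history-dependent ss-property can be sketched with a simplified model in which each company belongs to one conflict class (the company and class names are invented for illustration):

```python
# Hypothetical setup: Bank-A and Bank-B compete; Oil-X has no competitor here.
conflict_class = {"Bank-A": "banks", "Bank-B": "banks", "Oil-X": "oil"}
accessed = {}   # subject -> set of company datasets already seen (the N matrix)

def may_access(subject: str, company: str) -> bool:
    """ss-property sketch: deny if the subject already saw a competitor."""
    for prior in accessed.get(subject, set()):
        if prior != company and conflict_class[prior] == conflict_class[company]:
            return False                      # would cross the wall
    accessed.setdefault(subject, set()).add(company)
    return True

print(may_access("alice", "Bank-A"))  # True: no history yet
print(may_access("alice", "Oil-X"))   # True: different conflict class
print(may_access("alice", "Bank-B"))  # False: conflicts with Bank-A
```

The key point the code makes concrete: the answer depends on the subject's access history, not just on static labels.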
What Does This Mean?
• Write access to an object is granted if, and only if, no other object can be read by that subject that is in a different company dataset and contains unsanitized info
• In BLP, access rights -- once stated -- were static. In the Chinese Wall Model, they must be evaluated at every state transition.
Biba Model - 1
• State machine model, like BLP
• Integrity levels assigned to subjects, objects
• Static integrity levels
  – Mirrors the BLP tranquility property
• Simple integrity property: if s can alter o, then fo(o) ≤ fs(s) (no write up)
• Integrity *-property: if s can observe o, then s can write to some other object p only if fo(p) ≤ fo(o)
Biba Model - 2
• Dynamic integrity levels
  – Automatically adjust integrity level of entities if they have come in contact with low-level info
• Subject low watermark property: s can observe o at any integrity level. The new subject integrity level is inf(fs(s), fo(o)), where fs(s), fo(o) are the integrity levels before the operation
• Object low watermark property: s can alter o at any integrity level. The new object integrity level is inf(fs(s), fo(o)), where fs(s), fo(o) are the integrity levels before the operation
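With integer integrity levels, inf is just min, so the subject low watermark rule can be sketched in a few lines:

```python
def observe(subject_level: int, object_level: int) -> int:
    """Subject low watermark: after observing o, the subject's integrity
    level becomes the lower of the two pre-operation levels."""
    return min(subject_level, object_level)

HIGH, LOW = 2, 0
s = HIGH
s = observe(s, LOW)   # subject reads low-integrity data...
print(s)              # 0: ...and is permanently demoted to the low level
```

This makes the demotion effect visible: once a high-integrity subject touches low-integrity data, everything it later writes is tainted.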
Biba Model Extensions
• Invoke property: s1 can invoke s2 iff fs(s2) ≤ fs(s1)
• Ring property: s1 can read objects at all integrity levels, can only modify objects o where fo(o) ≤ fs(s1), and can invoke a subject s2 only if fs(s1) ≤ fs(s2)
• Can these two properties coexist?
Clark-Wilson Model
• Focuses on data integrity
  – Internal consistency (system properties)
  – External consistency (enforced externally)
• Integrity enforced by
  – Well-formed transactions
  – Separation of user duties
• Argues that military and commercial systems differ with regard to confidentiality & integrity
Information Flow Model
• Seek to capture all possible information flows
  – explicit
  – implicit
• Components
  – Lattice of security levels (L, ≤)
  – Set of labeled objects
  – Security policy
Information Flow Policy
• Information flow permitted from c1 to c2 iff c1 ≤ c2
• Information flow violating the above is illegal
• System secure if there is no illegal info flow
• Static vs. dynamic modeling & enforcement
• Problem: exhaustive identification of all possible information flows
Computer Security
• Is about protection
• BUT…WHAT are we trying to protect?
What Is Vulnerable?
• The computer itself
  – Theft of the machine
  – Inappropriate access to the machine
• The information in the computer
  – Programs
  – Data
• Emanations from the computer
  – Electronic
  – Shoulder surfing
Anything Else?
• Connections to the computer
  – Modems
  – Networks
  – Peripherals
• Covert channels
• …and?
Security Perimeters
Where to provide protection? Against what?
How to secure the system?
• Track identities of all users
• Ensure all users do only what they are authorized to do on the system
• Keep records of who does what, when
• REMEMBER: 60-80% of the “bad guys” are inside the security perimeter (who has a better opportunity to compromise security?)
Identification
• Not as simple as it seems
  – Who are you?
  – Can you prove it?
  – How?
• Now--how can I prove you are who you purport to be?
• And how can you prove to what or whom you are talking?
Identification & Authentication
• Identification– A unique entity descriptor
• Authentication– Verifying the claimed identification
These are two sides of the same coin, but they are NOT the same thing
Authentication
• Validates you are who you claim to be
  – Something you know
  – Something you have
  – Who you are
  – What you do
  – Where you are
• An intruder who has the authentication keys looks just like the real user!
Something You Know
• Password
• PIN
• Some other piece of information (e.g. your mother’s maiden name -- very popular)
• NB: anyone who obtains this information is -- so far as the computer knows -- you. Is there a problem here?
Password
• Most commonly used
• Relatively easy to compromise or break
• Many threats
• Usability issues
• First line of defense, but not a very solid one
Good Passwords
• Are not dictionary words
• Have mixed case, numbers, characters
• Are regularly changed
• Are not cyclical
• Are unique (one per user)
• Can be tried only a limited number of times
• Are easy for the user to use (say what?)
Password Problems
• Security/sharing
• System is only as secure as the weakest link
• Vulnerable to brute force attack
  – Dictionary attacks easy, in any language
  – Other intelligent searches
  – Exhaustive attacks
• Password file vulnerable
• Spoofing, man-in-the-middle
Password Defenses
• Change the defaults
• Use good password design and management
• Age the passwords
• Limit login attempts
• Attack/scan your own system
• Combine with other, orthogonal identification measures
How Does the Computer Know a Good Password from a Bad One?
• Password file
  – can you see a problem here?
• How to attack
• How to protect
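One standard protection is for the password file to hold only salted, deliberately slow hashes, so a stolen file does not directly reveal credentials and identical passwords get different entries. A sketch using stdlib PBKDF2 (the iteration count and example passwords are illustrative):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, digest); store this pair instead of the password."""
    salt = salt or os.urandom(16)             # per-user random salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password, salt, stored):
    """Recompute with the stored salt and compare in constant time."""
    return hmac.compare_digest(hash_password(password, salt)[1], stored)

salt, stored = hash_password("correct horse")
print(verify("correct horse", salt, stored))  # True
print(verify("guess123", salt, stored))       # False
```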
Something You Have
• Physical token
  – Physical key
  – Magnetic card
  – Smart card
  – Calculator
• What if you lose it?
• What if it breaks?
Who You Are
• Biometrics
  – Fingerprints
  – Face geometry
  – Voiceprints
  – Retinal scanning
  – Hand geometry
• False positives, negatives
• User acceptance
What You Do
• Mechanical tasks
  – Signature (pressure, speed)
  – Joystick
• False positives, negatives
• Potential for forgery, etc.
Where You Are
• Limit use by user location
• Vet location by GPS, etc.
• Reliability, dependability, complexity
How Can You Break In?
• Spoofing
• Man in the middle
• Password crackers
• Communications lines
• Theft
• Exploit weaknesses
How To Break A System
• Do not attack the system head-on
• Attack the assumptions on which the system is based
• Why can this work?
What About Breaking the People?
• Do you know who works for you?
  – Majority of security breaches come from inside the organization
  – Background checking is a good idea
• What does it take to buy loyalty?
Policy -- Does it Have a Place?
• Passwords
• Authentication
• Proper use
• Penalties
• Prosecution / civil penalties
Audit Trails
• Without them, you know nothing about what has happened in your system
• Manual review is usually suboptimal, but automated review is not straightforward
• Needed to prove breaches, losses
• Logs must be protected; they are the first target of a savvy intruder
Insider or Outsider?
• What’s the difference?
  – Exploiting identification
  – Exploiting authentication
  – Spoofing, etc.
I See You, Can You See Me?
• Passwords and authentication, properly implemented, can do a decent job of identifying the user to the computer
• How does the user know to whom he/she is connected?
• Why does the user care?
Authentication
• Remember?
  – Authentication is proving that an entity is really who/what they purport to be
  – Common way is passwords and user IDs
• Hard to do locally
• How to do at a distance (i.e., over a network)?
Basic Cryptographic Authentication

Alice and Bob:
1. Alice selects a random n and sends it to Bob as a challenge
2. Bob encrypts n with his own private key and returns E(n)
3. Alice decrypts E(n) with Bob’s public key
4. If D[E(n)] = n, Bob is authenticated

This system is known as challenge / response authentication
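The exchange can be sketched with textbook RSA and deliberately tiny primes (p = 61, q = 53; utterly insecure, for illustration only):

```python
import secrets

# Toy textbook-RSA key: public (n, e), private d.  d*e ≡ 1 (mod φ(n)).
n, e, d = 3233, 17, 2753

# Alice -> Bob: a random challenge
challenge = secrets.randbelow(n)

# Bob -> Alice: "encrypt" the challenge with his private key
response = pow(challenge, d, n)

# Alice: decrypt with Bob's public key and compare with the challenge
print(pow(response, e, n) == challenge)  # True: only the private-key holder can do this
```

A fresh random challenge each time is what prevents simple replay of an old response.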
Challenge / Response Under the Microscope
• What does this exchange really tell Alice?
  – Does she know who Bob is?
  – Can she be sure she is really connected to Bob?
  – Why?
  – Why not?
• We need a better way to do this
RSA to the Rescue?
• Consider this possibility
  – Generate a message
  – When finished, encrypt the message with your private key
  – Send the message to the intended recipient
  – Recipient decrypts the message using your public key
  – If message decrypts, recipient assumes you sent it, and you are therefore authenticated
Flaws In This Scheme?
• What does it prove if you can decrypt a message encrypted with someone’s private key?
• What if Alice and Bob don’t know one another before this communication?
  – What is the basis for trust?
  – How is trust established?
• We’ll come back to this later
Remember the Hash Function
Source: RSA Laboratories, Inc.
Back to Our Message
• Suppose we take the message text and hash it, producing a message digest
• Now we encrypt the hash with our private key, and append this to the end of the message
• This is called a digital signature
• It is not necessary that the message body be encrypted for it to be digitally signed.
At the Other End...
• Recipient gets our message, and decrypts the message digest we encrypted with our private key, using our public key
• Recipient now has the message digest in cleartext
• Recipient computes the message digest over the text of the message
• If the two hashes match, the message has not been changed and it is authentic
Digital Signatures in General
• Signature creation: compute the message digest (MD), then encrypt the MD with the sender’s private key; the result is the digital signature
• Signature verification: decrypt the signature with the sender’s public key, compute the MD over the received message, and compare; if they match, the message is authentic, otherwise it is bogus
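The hash-then-sign flow can be sketched end to end with the same toy textbook-RSA key (p = 61, q = 53; insecure, illustration only; real schemes also pad the digest):

```python
import hashlib

# Toy textbook-RSA key: public (n, e), private d.
n, e, d = 3233, 17, 2753

def digest_int(message: bytes) -> int:
    """Hash the message, reduced into the toy key's tiny range."""
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    return pow(digest_int(message), d, n)      # encrypt digest with private key

def verify(message: bytes, signature: int) -> bool:
    return pow(signature, e, n) == digest_int(message)  # decrypt, recompute, compare

msg = b"I owe Alice $10"
sig = sign(msg)
print(verify(msg, sig))                    # True
print(verify(b"I owe Alice $1000", sig))   # almost surely False: digest changed
```

Note that only the digest is signed; the message body itself travels in the clear, matching the slide's point that signing does not require encryption.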
IF...
• We can be certain that the private key used to encrypt the message digest does, in fact, belong to the person we think it does
• This requires a trust relationship so that we can have some assurance of who owns that private key
• There are two types of trust mechanisms
Web of Trust
• Alice and Bob: existing trust relationship
• Alice and Carol: existing trust relationship
• Derived trust relationship: Bob trusts Alice, and Alice trusts Carol, so Bob trusts Alice to introduce him to Carol, and then Bob trusts Carol
Web of Trust Issues
• Peer-to-peer approach
• Does not deal nicely with third-level and higher unknowns
  – Does Bob trust Carol to introduce Don?
  – How about Earl, whom none of the above know?
• Is it usable?
  – Yes -- this is the model used by PGP
• Does it scale nicely?
  – No -- scales exponentially
Hierarchy of Trust
• Root: A
• B, C trust A
• D, E, F trust B
• F also trusts C
• So, ALL trust A
• BUT, D, E do not trust C
Issues
• How do we know A is trustworthy?
  – Because A says so!
• What are the criteria for establishing trust relationships?
• Is this useful for extending trust to entities previously unknown to you?
• Does it scale?
  – Yes, linearly
Hierarchy is the Basis for X.500 Directory Services
• X.500 begun as the answer to harmonizing telephone directories all over the world
  – At its root, X.500 is a database specification
  – Basic implementation is the Directory Access Protocol, DAP, which is rather “chatty”
  – This led, in turn, to the Lightweight Directory Access Protocol, LDAP
• X.509 developed as way of implementing hierarchical trust structures
Enter the Certification Authority
• A Certification Authority (CA) is a trusted third party who issues Digital Certificates that bind a user to that user’s public key
  – The CA digitally signs the digital certificate, so that any changes (such as substituting another public key) will be obvious
  – The CA has no knowledge of the user’s private key (some government CAs are exceptions)
Digital Certificates
• Clearly, for this to work, all parties must use the same format for the certificates
• The most popular (but not the only) standard in current use is X.509 v3
• An X.509 certificate has a fixed format, and contains certain mandatory items, in a prescribed order, so it is easy for a computer to scan and verify
X.509 Certificate Contents
• Version
• Serial number
• Signature algorithm identity
• Name of issuing CA
• Period of validity (not before - not after)
• Subject name to whom certificate refers
• Subject’s public key
• Subject distinguished name (X.500)
• Extensions (added in X.509 v3)
• Digital signature over the entire certificate
Uses for Certificates
• Anyone can obtain the digital certificate for anyone else with whom they wish to conduct secure communications, whether or not they have a previous relationship
• The CA attests that the public key in the certificate is really the public key of the subject named in the certificate. You know to whom you are talking! (Or do you?)
• This greatly facilitates electronic commerce
More Issues With CA’s
• How do you trust the CA?
• Who guarantees the “goodness” of the top of the hierarchy?
• What are the liability issues?
• Does this really guarantee you know who’s who in the digital world?
• The hierarchy of CAs is called the Public Key Infrastructure
Where Do the Certificates Come From?
• So-called root certificates are pre-loaded on web clients for use by average folks
• If you are presented with a certificate for which you have a root certificate on your machine, then that certificate is checked and you are told if there are any problems such as the certificate being expired, etc.
• You then choose what to do
Trust?
• Because the certificates are pre-loaded by the web client maker, you are actually trusting Microsoft or Netscape
• You can customize the set of certificates in any web client, provided that the client has not itself been customized to prevent that.
  – It is common to put custom browsers on desktops to prevent users taking liberties with system settings, adding certificates, etc.
Authentication of the Computer
• Public key infrastructure
• Digital certificates
  – Certification authority
  – Certificate revocation
• Is it foolproof?
• Is it legal?
Pretty Good Privacy (PGP)
• Arguably, the first quality crypto system, not developed by or for a government, that is available to non-government entities
• Developed by Phil Zimmermann
  – When developed, held to violate the munitions regulations which barred export of encryption
  – U.S. Government brought charges against Mr. Zimmermann, which were ultimately dropped
• Available worldwide, free over Internet
PGP
• Versions available for most OS’s
• Algorithms have passed extensive public review, considered extremely secure
• Wide applicability
• Developed outside of any governmental agencies
  – In fact, drew (and continues to draw) the wrath of many governments
Algorithms Supported
• Symmetric encryption
  – CAST-128
  – IDEA
  – TDEA (3 keys, 168 bits)
• Asymmetric encryption
  – RSA
  – DSS (Digital Signature Standard)
  – El Gamal
• Hashing
  – SHA-1
PGP Functions
• Digital signature
• Message confidentiality
• Data compression
• E-mail compatibility
  – Only sends ASCII characters
  – Cf. Kermit
• Internet / email message size compatibility
  – Segmentation of large messages
PGP Digital Signature
• Hash message using SHA-1
• Encrypt hash with RSA using sender’s private key
• Prepend encrypted hash to message
• Recipient decrypts hash with sender’s public key
• Recipient generates a new hash and compares it with the decrypted one
  – Message authentic if they match
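The sign/verify flow above can be sketched with textbook RSA over a SHA-1 digest. The tiny fixed key pair and the reduction of the digest mod n are toy simplifications for illustration only; real PGP uses padded RSA with full-size keys.

```python
import hashlib

# Hypothetical toy RSA key pair (p = 61, q = 53): n = 3233, e = 17, d = 2753
n, e, d = 3233, 17, 2753

def sign(message):
    # Hash the message, then "encrypt" the digest with the sender's private key
    h = int.from_bytes(hashlib.sha1(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message, signature):
    # Decrypt the signature with the sender's public key and compare digests
    h = int.from_bytes(hashlib.sha1(message).digest(), "big") % n
    return pow(signature, e, n) == h

sig = sign(b"attack at dawn")
assert verify(b"attack at dawn", sig)   # digests match: authentic
# A tampered message fails verification (barring a ~1/n fluke of this toy modulus):
assert not verify(b"attack at dusk", sig)
```

The recipient never needs the private key; only the public half (n, e) is required for the comparison step.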
PGP Digital Signatures
Creation:   Message → SHA-1 digest → encrypt digest with sender’s private key (RSA*) → digital signature

Verification:   Compute SHA-1 digest of the received message; decrypt the signature with sender’s public key (RSA*); digests equal? Yes → authentic, No → bogus

* Alternatively, DSS (FIPS PUB 186)
PGP Signature Features
• Choice of two signature algorithms
  – RSA
  – DSS (Digital Signature Standard, FIPS 186)
• Signatures can be detached from the message
  – Facilitates a separate signature log
  – Signature can serve as a virus check on an executable
  – Enables multiple signatures on a single message without nesting the signatures (e.g., contracts)
PGP Encryption Options
• Uses one of these symmetric systems
  – TDEA with three keys (you know this one)
  – CAST-128
  – IDEA
• Sender generates a session key
• RSA used to encrypt the session key, which is prepended to the encrypted message
IDEA
• International Data Encryption Algorithm
  – By Xuejia Lai and James Massey, Swiss Federal Institute of Technology, 1991
  – Lai–Massey structure (not a Feistel cipher), well-reviewed
• Eight rounds
• No S-boxes in the round function
  – XOR, addition mod 2^16, and multiplication mod 2^16+1 on 16-bit words
• Complex subkey generation using circular shifts
  – Six subkeys for each round
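The multiplication mentioned above is IDEA’s distinctive operation: multiplication modulo the prime 2^16 + 1, with the all-zero 16-bit word standing in for 2^16 so that every operand is invertible. A minimal sketch:

```python
def idea_mul(a, b):
    # IDEA multiplication: operands and result are 16-bit words; the
    # all-zero word encodes 2^16, so every value has a multiplicative
    # inverse mod the prime 65537 (this nonlinearity replaces S-boxes)
    a = a or 0x10000
    b = b or 0x10000
    return (a * b) % 0x10001 % 0x10000   # map 65536 back to the zero word

assert idea_mul(2, 3) == 6
assert idea_mul(0, 0) == 1        # 65536 * 65536 ≡ (-1)(-1) ≡ 1 (mod 65537)
assert idea_mul(5, 26215) == 1    # 26215 is 5's inverse mod 65537
```

Mixing this operation with XOR and addition mod 2^16, which obey incompatible algebraic laws, is what gives the round function its strength.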
CAST-128 Cipher
• Carlisle Adams & Stafford Tavares, 1997
  – Defined in IETF RFC 2144
  – Key size: 40, 48, 56, …, 128 bits
  – Extensive review; becoming fairly common
  – Feistel cipher
    • Fixed S-boxes, larger than those of DES
    • S-boxes designed to be nonlinear, resistant to cryptanalysis
    • Subkeys also generated by nonlinear processes
    • Round function varies from round to round
PGP Confidentiality
• Sender generates a random 128-bit number as the session key for this message only
• Message encrypted with the session key
• Session key encrypted with the recipient’s public key, and prepended to the message
• Recipient decrypts the session key with the recipient’s private key
• Uses the session key to decrypt the message
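This hybrid flow can be sketched end to end. The symmetric cipher below is a toy hash-derived XOR keystream standing in for CAST-128/IDEA/TDEA, and the recipient’s key is a hypothetical tiny RSA pair; both are illustrations only.

```python
import hashlib, secrets

n, e, d = 3233, 17, 2753   # toy RSA key pair belonging to the recipient

def stream_xor(key, data):
    # Toy symmetric cipher: XOR with a SHA-1-derived keystream (stand-in only)
    out, counter = bytearray(), 0
    while len(out) < len(data):
        out += hashlib.sha1(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Sender: random 128-bit session key for this message only
session_key = secrets.token_bytes(16)
ciphertext = stream_xor(session_key, b"meet at the bridge")
# Session key "encrypted" byte by byte with the recipient's public key (toy)
wrapped = [pow(b, e, n) for b in session_key]

# Recipient: recover the session key with the private key, then the message
recovered = bytes(pow(c, d, n) for c in wrapped)
assert recovered == session_key
assert stream_xor(recovered, ciphertext) == b"meet at the bridge"
```

Only the short session key ever meets the slow asymmetric algorithm; the bulk of the message sees only the fast symmetric one.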
Options
• Can use RSA or El Gamal to encrypt the session key
• Key sizes
  – 768 to 3072 bits
  – DSS signatures fixed at 1024 bits
• Confidentiality and authentication can be combined in a single message
  – A very good idea!
Authenticated Secure Message
• Sender signs message with own private key
• Sender generates session key and encrypts message with the session key
• Sender encrypts session key with recipient’s public key
Compression
• PGP was targeted at email, so compression is an important feature
• Compression (ZIP) applied after signature, but before encryption of the message body
  – Ensures the same signature despite compression
• The compressed message is encrypted
  – Less redundancy than the original, so cryptanalysis is harder
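The ordering above (sign, then compress, then encrypt) can be illustrated with zlib standing in for PGP’s ZIP step; a bare SHA-1 digest stands in for the signature input.

```python
import hashlib, zlib

message = b"PGP compresses after signing " * 20

# Signature is computed over the ORIGINAL text, so it is unaffected
# by whatever compression parameters are used in transit
digest = hashlib.sha1(message).hexdigest()

# Then compress: the redundancy removed here is redundancy a
# cryptanalyst can no longer exploit after encryption
compressed = zlib.compress(message)
assert len(compressed) < len(message)

# Recipient decrypts (not shown), decompresses, then verifies:
assert hashlib.sha1(zlib.decompress(compressed)).hexdigest() == digest
```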
Compatibility
• After encryption, the message contains a stream of arbitrary binary octets
• Some email systems permit transmission of only ASCII text
• PGP provides an option to convert the data stream to blocks of ASCII text for compatibility
  – 3 octets become 4 ASCII characters
  – Known as Radix-64 conversion
  – Expands size by 33% (because 3 octets become 4 characters)
Radix-64 Conversion
• Straightforward mapping of binary 6-bit values into printable ASCII characters, with a CRC
  – 0 = A, 1 = B, etc.
  – No hyphen, no control characters
• Expands input by 33%, but ZIP still provides about 33% overall compression
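Python’s base64 module uses the same 6-bits-to-character mapping, which makes the 3-octets-to-4-characters expansion easy to check (full PGP ASCII armor also adds header lines and a 24-bit CRC, not shown here):

```python
import base64

data = bytes(range(48))              # 48 arbitrary binary octets
armored = base64.b64encode(data)     # the Radix-64 character mapping

# 3 octets -> 4 printable characters: a 4/3, i.e. about 33%, expansion
assert len(armored) == len(data) * 4 // 3    # 48 octets -> 64 characters
# The conversion is lossless: decoding recovers the original octets
assert base64.b64decode(armored) == data
```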
Segmentation
• Some systems limit the size of messages, often to about 50K octets
• PGP provides a built-in service to segment messages into parts small enough to transit the system, and then to reassemble the message properly at the destination for presentation to the recipient
• Much like packet assembly / disassembly
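A minimal sketch of the segment/reassemble service, assuming the ~50K-octet limit mentioned above (the constant and function names are illustrative, not PGP’s own):

```python
MAX_SEGMENT = 50_000   # assumed per-message limit, in octets

def segment(message, limit=MAX_SEGMENT):
    # Split the message into consecutive chunks no larger than the limit
    return [message[i:i + limit] for i in range(0, len(message), limit)]

def reassemble(segments):
    # Destination side: concatenate the segments back in order
    return b"".join(segments)

big = bytes(120_000)   # a 120K-octet message
parts = segment(big)
assert [len(p) for p in parts] == [50_000, 50_000, 20_000]
assert reassemble(parts) == big
```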
PGP Keys
• Four types
  – One-time session keys
  – Public keys
  – Private keys
  – Passphrase-based keys
• User may desire multiple public/private key pairs
  – How does the recipient know which one was used?
Key Identifiers
• So that the recipient knows which public key was used, a key ID is transmitted with the message
  – Key ID = least significant 64 bits of the public key
• Avoids wasting bandwidth, as the entire public key need not be sent
• Very low probability of duplicate key IDs
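The key-ID rule above is just a 64-bit mask. The modulus value here is an arbitrary example number, not a real key:

```python
# Hypothetical 128-bit public-key modulus, for illustration only
public_key_modulus = 0xC3A9_5F21_7B44_90AB_12CD_34EF_5678_9A01

# Key ID = least significant 64 bits of the public key
key_id = public_key_modulus & ((1 << 64) - 1)

assert key_id == 0x12CD_34EF_5678_9A01
assert key_id.bit_length() <= 64
```

Because real moduli are effectively random in their low bits, two keys sharing the same 64-bit ID is extremely unlikely, which is what makes this short identifier safe to send instead of the whole key.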
Key Rings
• Simply tables of private and public keys, where each row represents one key pair
  – Can be indexed by user ID or key ID
• Private key not stored in the clear in the key ring
  – Encrypted using CAST-128, etc.
  – Passphrase used to access the private-key ring
    • Passphrase is hashed with SHA-1
    • First 128 bits of the hash used to encrypt the private key
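The passphrase-to-key derivation described above is a one-liner (the CAST-128 encryption of the private key itself is not shown):

```python
import hashlib

passphrase = b"correct horse battery staple"   # example passphrase

digest = hashlib.sha1(passphrase).digest()     # 160-bit SHA-1 output
cast_key = digest[:16]                         # first 128 bits -> CAST-128 key

assert len(cast_key) == 16   # 128 bits, the CAST-128 key size used here
```

Note that the key ring never stores the passphrase or the derived key; both are recomputed each time the user unlocks the private key.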
Key Management
• PGP uses the Web of Trust key management model
  – Therefore, you must trust someone else to sign a third party’s key
  – Provides for partial levels of trust
    • Difficult -- is someone 40% trusted or 62%?
    • What does this mean in practice?
  – How to deal with wholly unknown third parties?
• PGP proponents dismiss these concerns, but they are serious issues for e-commerce, etc.
PGP Web of Trust
[Diagram: a trust tree rooted at You, with B and C one level down and D, E, F below them]

• You trust C; you partially trust B
• C trusts F
• So, you trust C to sign for F
• BUT, you only partially trust B to sign for D, E, F
• What does this mean?
PGP In Summary
• Very good cryptosystem, providing confidentiality, authentication, and features to make it compatible with email
• Despised by governments worldwide, and subject to legal restrictions in many places
• Complex key management scheme that does not scale well to large systems where previously unknown parties need services
Summary - 1
• It’s all about protection
• Identification ties a physical entity to an abstract identity
• Authentication verifies the identity of both entities: the user and the computer
• Policy and audit trails are critical
• No shortage of folks trying to break in
Summary - 2
• Access controls limit the ability of subjects to act on objects within the computer
• It isn’t easy to establish and maintain a solid access control system
• Keeping track of who can do what is often the weak link in the chain
• Identification ties a physical entity to an abstract identity
• Authentication verifies the identity of both entities: the user and the computer
• Policy -- once again -- is a key element
Assignment for Next Class
• Read text, Chapters 38 & 39
• Using the text and whatever other reference material you choose, compare and contrast the security features of any version of Unix/Linux you wish with Windows XP/Vista. Which one do you believe to be more secure? Justify your choice with facts, not just opinions.
• Research “broken” hashes and evaluate their importance in the realm of authentication. Is there a problem here? How significant is it? How worried should we be?