Seminar in Foundations of Privacy
Transcript of Seminar in Foundations of Privacy
Seminar in Foundations of Privacy
Gil Segev
Message Authentication in the Manual Channel Model
Pairing of Wireless Devices
Scenario: you buy a new wireless camera and want to establish a secure channel with it for the first time
Idea: run the Diffie-Hellman key agreement protocol
Diffie-Hellman Key Agreement
Alice and Bob wish to agree on a secret key
Public parameters: a group G and a generator g ∈ G
Alice sends g^x; Bob sends g^y
Both parties compute K_A,B = g^xy
Security: even when given (G, g, g^x, g^y) it is still hard to compute g^xy
Diffie-Hellman Key Agreement
Decisional Diffie-Hellman assumption (DDH):
{(g, g^x, g^y, g^xy)} ≈_c {(g, g^x, g^y, g^c)}
for random x, y and c, where ≈_c denotes computational indistinguishability
Computational Diffie-Hellman assumption (CDH): for every probabilistic polynomial-time algorithm A, every polynomial p(n), and all sufficiently large n,
Pr[A(G_n, g_n, g_n^x, g_n^y) = g_n^xy] < 1/p(n)
The probability is taken over A's internal coin tosses and over the random choice of (x, y)
Diffie-Hellman Key Agreement (recap)
Alice sends g^x and Bob sends g^y; both parties compute K_A,B = g^xy
CDH assumption: K_A,B is hard to guess
DDH assumption: K_A,B is as good as a random secret
Secure against passive adversaries: Eve is only allowed to read the sent messages
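The key agreement above can be sketched in a few lines of Python. The modulus (a Mersenne prime) and the generator 5 are illustrative toy choices, not parameters from the slides, and offer no real security:

```python
# Toy sketch of Diffie-Hellman key agreement.
# The modulus and generator below are illustrative only (no real security).
import secrets

P = 2**127 - 1   # a Mersenne prime, used as a toy modulus
g = 5            # illustrative generator choice

def keygen():
    x = secrets.randbelow(P - 2) + 1   # secret exponent
    return x, pow(g, x, P)             # (x, g^x)

x, gx = keygen()           # Alice's secret and public value
y, gy = keygen()           # Bob's secret and public value
k_alice = pow(gy, x, P)    # Alice computes (g^y)^x
k_bob = pow(gx, y, P)      # Bob computes (g^x)^y
assert k_alice == k_bob    # both hold K_A,B = g^xy
```

A passive eavesdropper who sees only g^x and g^y must solve the computational Diffie-Hellman problem to recover the key.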
Pairing of Wireless Devices
Scenario: buy a new wireless camera and establish a secure channel for the first time by running the Diffie-Hellman key agreement protocol, exchanging g^x and g^y over the wireless channel
Pairing of Wireless Devices: Cable Pairing
“I thought this is a wireless camera…”
Cable pairing is simple and cheap, and provides an authenticated channel
Pairing of Wireless Devices: Wireless Pairing
Problem: active adversaries (“man-in-the-middle”)
Pairing of Wireless Devices: Wireless Pairing
Eve replaces g^x and g^y with her own values g^a and g^b
Problem: active adversaries (“man-in-the-middle”)
Diffie-Hellman Key Agreement
Suppose now that Eve is an active adversary: a “man-in-the-middle” attacker
Alice and Eve exchange g^x and g^a, computing K_A,E = g^xa
Eve and Bob exchange g^b and g^y, computing K_E,B = g^by
Completely insecure: when Alice sends ENC(K_A,E, m), Eve can decrypt m and then re-encrypt it as ENC(K_E,B, m) for Bob
Diffie-Hellman Key Agreement
The same man-in-the-middle attack establishes K_A,E = g^xa between Alice and Eve, and K_E,B = g^by between Eve and Bob
Solution: message authentication, in which Alice and Bob authenticate g^x and g^y
Message Authentication
Assure the receiver of a message that it has not been changed by an active adversary
Alice sends m; Eve may replace it with some m̂ on the way to Bob
Problem specification:
Completeness: with no interference, when Alice sends m, Bob accepts m (with high probability)
Soundness: for every m, Pr[ Bob accepts m̂ ≠ m ] ≤ ε
One-Time Authentication
H = {h | h: {0,1}^n → {0,1}^k} is a family of hash functions
The secret key enables a single authentication of a message m ∈ {0,1}^n
Alice and Bob share a random function h ∈ H; h is not known to Eve
To authenticate m ∈ {0,1}^n, Alice sends (m, h(m))
Upon receiving (m̂, ẑ): if ẑ = h(m̂), Bob outputs m̂ and halts; otherwise, Bob outputs ⊥ and halts
One-Time Authentication
What properties do we require from H?
Hard to guess h(m̂): success probability at most 2^-k, for any m̂
One-Time Authentication
What properties do we require from H?
Hard to guess h(m̂) even given h(m): success probability at most 2^-k, for any m and m̂ ≠ m
Short representation for h: the family must have small log|H|
Easy to compute h(m) given h and m
Universal Hash Functions
Given h: {0,1}^n → {0,1}^k, we can always guess a correct output with probability at least 2^-k
A family for which this bound is tight is called universal_2
Definition: a family H = {h | h: {0,1}^n → {0,1}^k} is called strongly universal_2, or pairwise independent, if
for all m1 ≠ m2 ∈ {0,1}^n and y1, y2 ∈ {0,1}^k we have
Pr[h(m1) = y1 and h(m2) = y2] = 2^-2k, where the probability is over a randomly chosen h ∈ H
In particular, Pr[h(m2) = y2 | h(m1) = y1] = 2^-k
Theorem: when a strongly universal_2 family is used in the protocol, Eve’s probability of cheating is at most 2^-k
Constructing Universal Hash Functions
The linear polynomial construction: fix a finite field F of size at least the message space 2^n
F could be either GF[2^n] or GF[P] for some prime P ≥ 2^n
The family H of functions h: F → F is defined as
H = {h_a,b(m) = a·m + b | a, b ∈ F}
Claim: the family above is strongly universal_2
Proof: for every m1 ≠ m2 and y1, y2 ∈ F there are unique a, b ∈ F such that
a·m1 + b = y1 and a·m2 + b = y2
Size: each h ∈ H is represented by 2n bits
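The construction can be sketched over a prime field. The prime P = 65537 (covering 16-bit messages) is an illustrative choice, not a parameter from the slides:

```python
# Sketch of the linear-polynomial family h_{a,b}(m) = a*m + b over GF[P].
# P is an illustrative prime >= 2^n for n = 16.
import random

P = 65537

def sample_h():
    """A uniformly random h in H, described by two field elements (the 2n-bit key)."""
    return random.randrange(P), random.randrange(P)

def h(key, m):
    a, b = key
    return (a * m + b) % P

def solve_ab(m1, y1, m2, y2):
    """The unique (a, b) with a*m1 + b = y1 and a*m2 + b = y2, for m1 != m2."""
    a = ((y1 - y2) * pow(m1 - m2, -1, P)) % P
    b = (y1 - a * m1) % P
    return a, b

# Uniqueness of (a, b) for every (m1, y1), (m2, y2) is exactly why a random h
# hits any pair of targets with probability 1/P^2: the family is strongly universal_2.
a, b = solve_ab(3, 10, 7, 42)
assert h((a, b), 3) == 10 and h((a, b), 7) == 42
```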
Lower Bound
Theorem: let H = {h | h: {0,1}^n → {0,1}} be a family of pairwise independent functions. Then |H| is Ω(2^n)
More precisely, to obtain a d-wise independent family, |H| must be Ω(2^(n⌊d/2⌋))
N. Alon and J. Spencer, The Probabilistic Method, Chapter 15 (derandomization), Proposition 2.3
More on Authentication
Reducing the length of the secret key: almost-pairwise independent hash functions; interaction
Using the same secret key to authenticate any polynomial number of messages: requires computational assumptions (pseudorandom functions)
Authentication in the public-key world
Much more to discuss…
Pairing of Wireless Devices: Wireless Pairing
With Eve replacing g^x and g^y by g^a and g^b, the parties would have to authenticate the transcripts m = g^x || g^a and m̂ = g^b || g^y
Impossible without additional setup
Pairing of Wireless Devices: Wireless Pairing
Solution: a manual channel
The Manual Channel
Alongside the wireless pairing (exchanging g^x and g^y), both devices display a short string (e.g., 141)
The user can compare the two short strings
Manual Channel Model
Insecure communication channel, over which m and the protocol messages are sent; the protocol may be interactive or non-interactive
Low-bandwidth auxiliary channel: enables Alice to “manually” authenticate one short string s
Adversarial power:
Chooses the input message m
Insecure channel: full control
Manual channel: can read, and can delay delivery (timing)
Manual Channel Model
Goal: minimize the length of the manually authenticated string s
Manual Channel Model
No trusted infrastructure, such as: a public-key infrastructure, a shared secret key, a common reference string, …
Suitable for ad hoc networks:
Pairing of wireless devices (Wireless USB, Bluetooth)
Secure phones (AT&T, PGP, Zfone)
Many more…
Why Is This Model Reasonable?
Implementing the manual channel:
Compare two strings displayed by the devices
Type a string, displayed by one device, into the other device
Visual hashing
Voice channel
The Naive Solution
H is a collision-resistant hash function (e.g., SHA-256): no efficient algorithm can find m ≠ m̂ such that H(m) = H(m̂) with noticeable probability
Alice sends m over the insecure channel and manually authenticates H(m)
Any adversary that forges a message can be used to find a collision for H
Are we done?
No. The output of SHA-256 is too long (256 bits): it cannot be easily compared or typed by humans
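The naive solution is easy to sketch with Python's standard hashlib (the messages below are placeholders):

```python
# Sketch of the naive solution: Alice sends m over the insecure channel
# and manually authenticates the collision-resistant hash H(m).
import hashlib

def alice(m: bytes):
    s = hashlib.sha256(m).hexdigest()   # the manually authenticated string
    return m, s

def bob(m_hat: bytes, s: str) -> bool:
    return hashlib.sha256(m_hat).hexdigest() == s

m, s = alice(b"g^x || g^y")
assert bob(m, s)                   # no interference: Bob accepts
assert not bob(b"g^a || g^b", s)   # a forgery would require a collision in H
```

The catch is exactly the slide's objection: s here is 64 hex characters (256 bits), far too long to compare or type by hand.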
Tight Bounds
Setting: an n-bit message m, an ℓ-bit manually authenticated string s, and forgery probability ε
Upper bound: a log* n-round protocol in which ℓ = 2log(1/ε) + O(1), with no setup or computational assumptions
Matching lower bound: for n ≥ 2log(1/ε), ℓ ≥ 2log(1/ε) - 2
One-way functions are necessary (and sufficient) for breaking the lower bound in the computational setting
Our Results - Tight Bounds
[Figure: the manually authenticated length ℓ versus log(1/ε). Above ℓ = 2log(1/ε): unconditional security. Between ℓ = log(1/ε) and ℓ = 2log(1/ε): computational security, assuming one-way functions. Below ℓ = log(1/ε): impossible.]
Outline
Security definition
Tight bounds
The protocol
Lower bound
Security Definition
Unconditionally secure (n, ℓ, k, ε)-authentication protocol:
n-bit input message, ℓ manually authenticated bits, k rounds
Completeness: with no interference, when Alice sends m, Bob accepts m (with high probability)
Unforgeability: for every m, Pr[ Bob accepts m̂ ≠ m ] ≤ ε
Outline
Security definition
Tight bounds
The protocol
Lower bound
The Protocol (simplified)
Preliminaries: for m = m1 … mk ∈ GF[Q]^k and x ∈ GF[Q], let m(x) = Σ_{i=1}^{k} m_i·x^i
Then, for any m ≠ m̂ and for any c, ĉ ∈ GF[Q],
Pr_{x ∈_R GF[Q]} [ m(x) + c = m̂(x) + ĉ ] ≤ k/Q
Based on the [GN93] hashing technique. In each round, the parties cooperatively choose a hash function and reduce the task to authenticating a shorter message; finally, a short message is manually authenticated
The Protocol (simplified)
We hash m to x || m(x) + c, where one party chooses x and the other party chooses c
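This hashing step can be sketched as follows; the prime Q and the message are toy choices, not the slides' parameters:

```python
# Reduce m in GF[Q]^k to the shorter string x || m(x) + c,
# where one party picks x and the other picks c.
import random

Q = 2**31 - 1   # a prime, so arithmetic mod Q is the field GF[Q]

def poly_eval(m, x):
    """m(x) = sum_{i=1}^{k} m_i * x^i over GF[Q]."""
    return sum(mi * pow(x, i, Q) for i, mi in enumerate(m, start=1)) % Q

m = [12, 99, 7, 2024]          # k = 4 field elements
x = random.randrange(Q)        # contributed by one party
c = random.randrange(Q)        # contributed by the other party
short = (x, (poly_eval(m, x) + c) % Q)   # the shorter message x || m(x) + c

# For m != m_hat, a collision needs x to be a root of the nonzero polynomial
# (m - m_hat)(x) + (c - c_hat) of degree <= k, hence probability <= k/Q.
```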
The Protocol (simplified)
Both parties set m0 = m; Alice sends m to Bob
Bob sends b1 ∈_R GF[Q1] and Alice sends a1 ∈_R GF[Q1]; both compute m1 = b1 || m0(b1) + a1
Bob sends b2 ∈_R GF[Q2]; Alice chooses a2 ∈_R GF[Q2] and manually authenticates m2 = a2 || m1(a2) + b2 (two GF[Q2] elements)
Bob accepts iff m2 is consistent with his view
With Q1 ≈ n/ε and Q2 ≈ log(n)/ε, this gives 2log(1/ε) + 2loglog(n) + O(1) manually authenticated bits
With k rounds, the 2loglog(n) term is reduced to 2log^(k-1)(n)
Security Analysis
Must consider all generic man-in-the-middle attacks; there are three attacks in our case
In each attack, Eve intercepts m, a1 from Alice and b1, b2 from Bob, and injects m̂, â1 toward Bob and b̂1, b̂2 toward Alice; the three attacks (Attack #1, Attack #2, Attack #3) differ in how Eve interleaves the two sessions before the manually authenticated m2 is delivered
Security Analysis – Attack #1
Alice computes: m0,A = m; m1,A = b̂1 || m0,A(b̂1) + a1; m2,A = a2 || m1,A(a2) + b̂2
Bob computes: m0,B = m̂; m1,B = b1 || m0,B(b1) + â1; m2,B = a2 || m1,B(a2) + b2
Eve succeeds only if m0,A ≠ m0,B and m2,A = m2,B, and
Pr[ m0,A ≠ m0,B and m1,A = m1,B ] + Pr[ m1,A ≠ m1,B and m2,A = m2,B ] ≤ ε/2 + ε/2
Security Analysis – Attack #1
Claim: Pr[ m0,A ≠ m0,B and m1,A = m1,B ] ≤ ε/2, where m1,A = b̂1 || m0,A(b̂1) + a1 and m1,B = b1 || m0,B(b1) + â1
If Eve chooses b̂1 ≠ b1, the first coordinates differ, so m1,A ≠ m1,B
If Eve chooses b̂1 = b1, then Pr[ m0,A(b1) + a1 = m0,B(b1) + â1 ] ≤ ε/2
Outline
Security definition
Tight bounds
The protocol
Lower bound
Lower Bound
Alice sends m, x1; Bob replies with x2; Alice manually authenticates s
With m ∈_R {0,1}^n, the values M, X1, X2, S are well-defined random variables
Lower Bound
Goal: H(S) ≥ 2log(1/ε)
Shannon Entropy
Let X be a random variable over domain 𝒳 with probability distribution P_X
The Shannon entropy of X is
H(X) = - Σ_{x ∈ 𝒳} P_X(x) log P_X(x)   (where 0·log 0 = 0)
Measures the amount of randomness in X on average, and how much we can compress X on average
0 ≤ H(X) ≤ log|𝒳|, with equality on the left iff X is constant and on the right iff X is uniform
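The two extreme cases can be checked numerically (base-2 logarithms, so entropy is measured in bits):

```python
# Shannon entropy H(X) = -sum_x p(x) log2 p(x), with the convention 0*log 0 = 0.
import math

def shannon_entropy(dist):
    return -sum(p * math.log2(p) for p in dist if p > 0)

assert shannon_entropy([1.0]) == 0.0        # constant X: H(X) = 0
assert shannon_entropy([0.25] * 4) == 2.0   # uniform over 4 values: H(X) = log|X|
```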
A Related Notion: Min-Entropy
Let X be a random variable over domain 𝒳 with probability distribution P_X
The min-entropy of X is
H_∞(X) = - log max_{x ∈ 𝒳} P_X(x)
Measures the amount of randomness in X in the worst case; determined by the most likely value(s)
0 ≤ H_∞(X) ≤ H(X) ≤ log|𝒳|, where the leftmost equality holds iff X is constant and the other equalities hold iff X is uniform
Conditional Shannon Entropy
Let X and Y be two random variables over domains 𝒳 and 𝒴 with probability distributions P_X and P_Y
The conditional Shannon entropy of X given Y is
H(X|Y) = Σ_{y ∈ 𝒴} P_Y(y) H(X|Y=y)
Observation (chain rule): H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)
Shannon Mutual Information
The mutual information between X and Y is
I(X;Y) = H(X) – H(X|Y)
Observation: I(X;Y) = I(Y;X)
Conditional mutual information:
I(X;Y|Z) = H(X|Z) – H(X|Y,Z)
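The symmetry I(X;Y) = I(Y;X) can be verified numerically on a small joint distribution (the probabilities below are illustrative):

```python
# Check I(X;Y) = H(X) - H(X|Y) and its symmetry on a toy joint distribution.
import math

def H(dist):
    return -sum(p * math.log2(p) for p in dist if p > 0)

P = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}  # joint P(X, Y)

def marginal(axis):
    out = {}
    for xy, p in P.items():
        out[xy[axis]] = out.get(xy[axis], 0.0) + p
    return out

def cond_entropy(axis):
    """Entropy of the other coordinate, given the coordinate on `axis`."""
    total = 0.0
    for k, pk in marginal(axis).items():
        total += pk * H([p / pk for xy, p in P.items() if xy[axis] == k])
    return total

I_xy = H(list(marginal(0).values())) - cond_entropy(1)  # I(X;Y) = H(X) - H(X|Y)
I_yx = H(list(marginal(1).values())) - cond_entropy(0)  # I(Y;X) = H(Y) - H(Y|X)
assert abs(I_xy - I_yx) < 1e-9                          # symmetry: I(X;Y) = I(Y;X)
```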
Lower Bound
Goal: H(S) ≥ 2log(1/ε)
Evolving intuition: the parties must use at least log(1/ε) random bits; in fact, each party must use at least log(1/ε) random bits and must independently reduce H(S) by log(1/ε) bits
H(S) = H(S) - H(S | M, X1)
  + H(S | M, X1) - H(S | M, X1, X2)
  + H(S | M, X1, X2)
= I(S ; M, X1)
  + I(S ; X2 | M, X1)
  + H(S | M, X1, X2)
Lower Bound
In the decomposition
H(S) = I(S ; M, X1) + I(S ; X2 | M, X1) + H(S | M, X1, X2)
the term I(S ; M, X1) is due to Alice’s randomness and I(S ; X2 | M, X1) is due to Bob’s randomness
Goal: H(S) ≥ 2log(1/ε)
Lower Bound
Lemma 1: I(S ; M, X1) + H(S | M, X1, X2) ≥ log(1/ε)
Lemma 2: I(S ; X2 | M, X1) ≥ log(1/ε)
Together with the decomposition of H(S), these give the goal H(S) ≥ 2log(1/ε)
Proof of Lemma 1
Consider the following attack, in which Eve wants Alice to manually authenticate a string s that also makes Bob accept a forged message. Eve acts as follows:
Chooses m̂ ∈_R {0,1}^n
If Pr[ s | m, x1 ] = 0, Eve quits; otherwise she samples x̂2 from the distribution of X2 given m, x1 and s
Forwards s and hopes that ŝ = s
Proof of Lemma 1
Claim: Pr[ ŝ = s ] ≥ 2^-( I(S ; M, X1) + H(S | M, X1, X2) )
By the protocol requirements:
ε ≥ Pr[ ŝ = s and m̂ ≠ m ] ≥ Pr[ ŝ = s ] - 2^-n
Since n ≥ log(1/ε), we get 2ε ≥ Pr[ ŝ = s ], which implies
I(S ; M, X1) + H(S | M, X1, X2) ≥ log(1/ε) - 1
Lower Bound
Lemma 1: I(S ; M, X1) + H(S | M, X1, X2) ≥ log(1/ε) - 1
Lemma 2: I(S ; X2 | M, X1) ≥ log(1/ε) - 1
Together with H(S) = I(S ; M, X1) + I(S ; X2 | M, X1) + H(S | M, X1, X2), this gives H(S) ≥ 2log(1/ε) - 2
References
Peter Gemmell and Moni Naor. Codes for Interactive Authentication. CRYPTO 1993.
Moni Naor, Gil Segev and Adam Smith. Tight Bounds for Unconditionally Secure Authentication Protocols in the Manual Channel and Shared Key Models. CRYPTO 2006.
Whitfield Diffie and Martin E. Hellman. New Directions in Cryptography. IEEE Transactions on Information Theory, 1976.
Thomas M. Cover and Joy A. Thomas. Elements of Information Theory.