Formal Models and Techniques for Analyzing Security Protocols: A Tutorial



Véronique Cortier
LORIA
[email protected]

Steve Kremer
INRIA Nancy

[email protected]

Boston — Delft


Foundations and Trends® in Programming Languages
Published, sold and distributed by:
now Publishers Inc.
PO Box 1024
Hanover, MA 02339
United States
[email protected]

Outside North America:
now Publishers Inc.
PO Box 179
2600 AD Delft
The Netherlands
Tel. +31-6-51115274

The preferred citation for this publication is

V. Cortier and S. Kremer. Formal Models and Techniques for Analyzing Security Protocols: A Tutorial. Foundations and Trends® in Programming Languages, vol. 1, no. 3, pp. 151–XXX, 2014.

This Foundations and Trends® issue was typeset in LaTeX using a class file designed by Neal Parikh. Printed on acid-free paper.

ISBN: 978-1-60198-902-4
© 2014 now Publishers Inc.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, mechanical, photocopying, recording or otherwise, without prior written permission of the publishers.

Photocopying. In the USA: This journal is registered at the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923. Authorization to photocopy items for internal or personal use, or the internal or personal use of specific clients, is granted by now Publishers Inc for users registered with the Copyright Clearance Center (CCC). The 'services' for users can be found on the internet at: www.copyright.com

For those organizations that have been granted a photocopy license, a separate system of payment has been arranged. Authorization does not extend to other kinds of copying, such as that for general distribution, for advertising or promotional purposes, for creating new collective works, or for resale. In the rest of the world: Permission to photocopy must be obtained from the copyright owner. Please apply to now Publishers Inc., PO Box 1024, Hanover, MA 02339, USA; Tel. +1 781 871 0245; www.nowpublishers.com; [email protected]

now Publishers Inc. has an exclusive license to publish this material worldwide. Permission to use this content must be obtained from the copyright license holder. Please apply to now Publishers, PO Box 179, 2600 AD Delft, The Netherlands, www.nowpublishers.com; e-mail: [email protected]


Foundations and Trends® in Programming Languages

Volume 1, Issue 3, 2014

Editorial Board

Editor-in-Chief

Mooly Sagiv (Tel Aviv University, Israel)

Editors

Martín Abadi (Microsoft Research & UC Santa Cruz)
Anindya Banerjee (IMDEA)
Patrick Cousot (ENS Paris & NYU)
Oege De Moor (University of Oxford)
Matthias Felleisen (Northeastern University)
John Field (Google)
Cormac Flanagan (UC Santa Cruz)
Philippa Gardner (Imperial College)
Andrew Gordon (Microsoft Research & University of Edinburgh)
Dan Grossman (University of Washington)
Robert Harper (CMU)
Tim Harris (Oracle)
Fritz Henglein (University of Copenhagen)
Rupak Majumdar (MPI-SWS & UCLA)
Kenneth McMillan (Microsoft Research)
J. Eliot B. Moss (UMass, Amherst)
Andrew C. Myers (Cornell University)
Hanne Riis Nielson (TU Denmark)
Peter O’Hearn (UCL)
Benjamin C. Pierce (UPenn)
Andrew Pitts (University of Cambridge)
Ganesan Ramalingam (Microsoft Research)
Mooly Sagiv (Tel Aviv University)
Davide Sangiorgi (University of Bologna)
David Schmidt (Kansas State University)
Peter Sewell (University of Cambridge)
Scott Stoller (Stony Brook University)
Peter Stuckey (University of Melbourne)
Jan Vitek (Purdue University)
Philip Wadler (University of Edinburgh)
David Walker (Princeton University)
Stephanie Weirich (UPenn)


Editorial Scope

Topics

Foundations and Trends® in Programming Languages publishes survey and tutorial articles on the following topics:

• Abstract interpretation

• Compilation and interpretation techniques

• Domain specific languages

• Formal semantics, including lambda calculi, process calculi, and process algebra

• Language paradigms

• Mechanical proof checking

• Memory management

• Partial evaluation

• Program logic

• Programming language implementation

• Programming language security

• Programming languages for concurrency

• Programming languages for parallelism

• Program synthesis

• Program transformations and optimizations

• Program verification

• Runtime techniques for programming languages

• Software model checking

• Static and dynamic program analysis

• Type theory and type systems

Information for Librarians

Foundations and Trends® in Programming Languages, 2014, Volume 1, 4 issues. ISSN paper version 2325-1107. ISSN online version 2325-1131. Also available as a combined paper and online subscription.


Foundations and Trends® in Programming Languages
Vol. 1, No. 3 (2014) 151–XXX
© 2014 now Publishers Inc.

DOI: 10.1561/2500000001

Formal Models and Techniques for Analyzing Security Protocols: A Tutorial

Véronique Cortier
LORIA
[email protected]

Steve Kremer
INRIA Nancy

[email protected]


Contents

1 Introduction

2 Running example
   2.1 The Needham Schroeder public key protocol
   2.2 Lowe's man in the middle attack

3 Messages and deduction
   3.1 Terms
   3.2 Message deduction
   3.3 An algorithm to decide message deduction
   3.4 Exercises

4 Equational theory and static equivalence
   4.1 Equational theories
   4.2 Static equivalence
   4.3 Exercises

5 A cryptographic process calculus
   5.1 Syntax and informal semantics
   5.2 Modelling protocols as processes
   5.3 Formal semantics
   5.4 Exercises

6 Security properties
   6.1 Events
   6.2 Secrecy
   6.3 Authentication
   6.4 Equivalence properties
   6.5 Exercises

7 Automated verification: bounded case
   7.1 From protocols to constraint systems
   7.2 Constraint solving
   7.3 Exercises

8 Automated verification: unbounded case
   8.1 Undecidability
   8.2 Analysis of protocols with Horn clauses
   8.3 Exercises

9 Further readings and conclusion

References


Abstract

Security protocols are distributed programs that aim at securing communications by means of cryptography. They are for instance used to secure electronic payments, home banking and, more recently, electronic elections. Given the financial and societal impact in case of failure, and the long history of design flaws in such protocols, formal verification is a necessity. A major difference from other safety critical systems is that the properties of security protocols must hold in the presence of an arbitrary adversary. The aim of this paper is to provide a tutorial on some modern approaches for formally modeling protocols and their goals, and for automatically verifying them.



1 Introduction

Security protocols are used to protect electronic transactions. Probably the most widely used security protocol is the SSL/TLS protocol, which underlies the https protocol in web browsers. It may be used for electronic commerce, or simply to encrypt web search queries on their way between the host and the search engine. There are of course many other protocols in use, e.g. to authenticate to providers on mobile phones or to withdraw cash at an ATM. Moreover, the digitalization of our modern society requires the use of security protocols in an increasing number of contexts, such as electronic passports that may include RFID chips, electronic elections that allow for Internet voting, etc.

We may think of security protocols as distributed programs that make use of cryptography, e.g. encryption, to achieve a security property, such as confidentiality of some data, e.g. your credit card number. Given the difficulty of designing correct distributed systems in general, it is not surprising that many flaws were discovered in security protocols, even without breaking the underlying cryptography. During the last 30 years many research efforts were spent on designing techniques and tools to analyze security protocols. One may trace this line of work back to the seminal work of Dolev and Yao [1981], who pioneered the idea of an attacker who completely controls the communication network and has unbounded computational power, but manipulates protocol messages according to some predefined rules, idealizing the protections offered by cryptography. These techniques not only allowed a better understanding of the principles underlying secure protocol design, but also resulted in mature tools for automated protocol analysis, and in the discovery of many attacks. For example, while designing a formal model of Google's Single Sign-On protocol, which allows a user to identify himself only once and then access various applications (such as Gmail or Google Calendar), Armando et al. [2008] discovered that a dishonest service provider could impersonate any of its users at another service provider. This flaw has since been corrected. Basin et al. [2012] identified flaws and proposed fixes for the ISO/IEC 9798 standard for entity authentication, using automated protocol verification tools. The standard has been revised to include their proposed amendments. Bortolozzo et al. [2010] designed a dedicated analysis tool for hardware security tokens that implement the PKCS#11 standard. The tool automatically reverse-engineers a token to extract its configuration, builds an abstract model to be analyzed, and verifies the attack on the token if an attack is found. They were able to find previously unknown attacks on more than 10 commercial tokens.

This paper proposes a tutorial presenting modern techniques to model and automatically analyze security protocols. Given the large body of work in this area we do not aim to be exhaustive and only present some selected methods and results. We expect that this tutorial could serve as a basis for a master's or graduate course, or allow researchers from different areas to get an overview of the kinds of techniques that are used. The outline of the tutorial is as follows.

• We first present an informal description of our running example, the Needham Schroeder public key protocol, which we use for illustration purposes in the remainder of the paper.

• Then, we explain how protocol messages can be modeled as first-order terms, and how adversary capabilities can be modeled by an inference system. We also provide a decision algorithm for deduction, i.e. the adversary's capability to construct new messages.


• Next, we introduce a more general model, based on equational theories. We revisit deduction and define a notion of message indistinguishability, called static equivalence. We again provide a decision procedure for static equivalence for a simple equational theory representing symmetric encryption.

• We continue by introducing a process calculus, the applied pi calculus, which we use to model protocols. One of the main differences with the original pi calculus is that the calculus allows communication of messages represented by terms, rather than only names. We illustrate how protocols can be conveniently modeled in this formalism.

• Next we discuss how we can express security properties of protocols modelled in the applied pi calculus. We cover different flavors of confidentiality and authentication, but also anonymity properties, expressed as behavioral equivalences of processes.

• We go on to discuss automated verification. We first consider the case where protocol participants only execute a bounded number of sessions. We present a decision procedure based on constraint solving which allows us to decide secrecy in this setting.

• Finally, we show that the general case, where the number of sessions is unbounded, is undecidable. We show that it is nevertheless possible to design tools that are able to analyze protocols, at the cost that termination is not guaranteed. In particular, we present an approach based on a representation of the protocol and the adversary as Horn clauses and describe a resolution-based procedure implemented in the ProVerif tool.

• We conclude the tutorial by briefly discussing some other approaches for automated verification and other directions in this research area.


2 Running example

We first introduce our running example, the Needham Schroeder public key protocol [Needham and Schroeder, 1978]. We will also describe the famous man in the middle attack, discovered by Lowe [1996], 17 years after the publication of the original paper.

2.1 The Needham Schroeder public key protocol

The Needham Schroeder public key protocol can be described by the following three message transmissions.

1. A → B : {|〈A, Na〉|}^a_pk(B)

2. B → A : {|〈Na, Nb〉|}^a_pk(A)

3. A → B : {|Nb|}^a_pk(B)

The notation A → B : m denotes that Alice (A) is sending the message m to Bob (B). We use the notation {|m|}^a_pk(B) to denote the asymmetric encryption of the message m with B's public key, and 〈m1, m2〉 denotes the pairing, i.e., the concatenation, of the messages m1 and m2.

In the first message Alice sends her identity A and a nonce Na, encrypted with Bob's public key, to Bob. A nonce (which stands for "number used once") is a fresh number that is randomly generated in each session.

When Bob receives this first message, he decrypts it and checks that it is well-formed. Bob also generates a nonce Nb and encrypts the nonces Na and Nb with Alice's public key. This mechanism of sending someone an encrypted nonce and waiting for the recipient to send back this nonce is often called a challenge-response mechanism. Here Alice's challenge to Bob is to be able to decrypt the message, and Bob's response consists in sending back the nonce.

When receiving the second message Alice decrypts it and verifies that it contains her previous nonce Na. This proves to Alice that Bob has indeed received the first message. Indeed, Bob is the only entity able to decrypt the first message, as it was encrypted with his public key pk(B). Then Alice retrieves the second nonce Nb and replies with the encryption of Nb under Bob's public key. Bob can decrypt this message and verify that it contains the nonce Nb he generated previously. We can observe that the second and third messages also form a challenge-response from Bob to Alice.

The aim of the protocol is to guarantee mutual authentication: at the end of a successful protocol run, Bob should be assured that he has been communicating with Alice, and Alice that she has been communicating with Bob. Moreover, the protocol should guarantee confidentiality of the nonces Na and Nb.

While the above presentation of the protocol is convenient to read, it lacks a formal semantics and many details are left implicit, e.g. which messages are freshly generated. It also only shows one possible execution of the protocol, namely the honest execution where everyone behaves as intended. However, the network used for communication is not reliable: an attacker may intercept messages, re-route and modify them. Moreover, this notation leaves implicit which parts of a message need to be verified as being well formed or to correspond to some known value.

A slightly more faithful representation of this protocol is given in Figure 2.1. In this figure it is explicit that the messages sent by A are not necessarily received by B: the empty space between outgoing and incoming arrows represents the untrusted network. We also note the use of variables in some messages. Variables are used whenever the value of part of a message is a priori unknown; these variables may then be used to construct other messages later. For example, the second message from A's point of view, {|〈Na, z〉|}^a_pk(A), makes explicit that A needs to check that the first component of the encrypted pair corresponds to the nonce Na, while the second component is unknown and will be bound to the variable z. In the third message A uses the variable z as the value to be encrypted. This representation is very useful when thinking about what could go wrong in the protocol.

A's view                           B's view
sends {|〈A, Na〉|}^a_pk(B)         expects {|〈x, y〉|}^a_pk(B)
expects {|〈Na, z〉|}^a_pk(A)       sends {|〈y, Nb〉|}^a_pk(x)
sends {|z|}^a_pk(B)                expects {|Nb|}^a_pk(B)

Figure 2.1: The Needham Schroeder public key protocol, with each role's view of the exchanged messages.

2.2 Lowe’s man in the middle attack

Lowe [1996] discovered an attack in the case where A initiates a session with a dishonest party C: C may fool B by making B believe he is A. Moreover, the nonce Nb, which B believes to share only with A, becomes known to C. The attack is displayed in Figure 2.2. We write C(A) whenever C is impersonating A. The first message from A to C is decrypted and re-encrypted with B's public key. As the second message is encrypted with A's public key, C cannot modify this message and forwards it to A. As A has no means to know that Nb was generated by B rather than C, he accepts this message and returns {|Nb|}^a_pk(C). One may say that A acts as a decryption oracle. The attacker may now learn Nb and successfully complete the protocol with B.

This execution violates the secrecy of Nb and authentication from B's point of view: B believes he successfully finished a session of the protocol with A while A did not authenticate to B.
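The attack can be replayed concretely in a few lines of code. The following is an illustrative sketch of ours, not part of the tutorial: terms are encoded as nested tuples, and the helpers aenc/adec are hypothetical names standing in for asymmetric encryption and decryption.

```python
# Symbolic replay of Lowe's attack (illustrative sketch; the tuple
# encoding of terms and the helper names are ours, not the tutorial's).

def aenc(m, agent):
    """Asymmetric encryption of m under pk(agent)."""
    return ("aenc", m, agent)

def adec(c, agent):
    """Decryption, which succeeds only for the intended recipient."""
    assert c[0] == "aenc" and c[2] == agent
    return c[1]

Na, Nb = "Na", "Nb"

msg1 = aenc(("A", Na), "C")            # A -> C: {|<A, Na>|}^a_pk(C)
msg1r = aenc(adec(msg1, "C"), "B")     # C(A) -> B: re-encrypted for B
ident, x = adec(msg1r, "B")            # B believes he talks to A ...
msg2 = aenc((x, Nb), "A")              # B -> C(A): {|<Na, Nb>|}^a_pk(A)
na, z = adec(msg2, "A")                # C forwards msg2 unchanged to A
assert na == Na                        # A sees her nonce and accepts
msg3 = aenc(z, "C")                    # A -> C: {|Nb|}^a_pk(C)
print("C learns:", adec(msg3, "C"))    # C now knows B's nonce Nb
```

Running the sketch, every decryption succeeds with the key its owner actually holds, yet C ends up with Nb, exactly the violation described above.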


1. A → C : {|〈A, Na〉|}^a_pk(C)
2. C(A) → B : {|〈A, Na〉|}^a_pk(B)
3. B → C(A) : {|〈Na, Nb〉|}^a_pk(A)
4. C → A : {|〈Na, Nb〉|}^a_pk(A)
5. A → C : {|Nb|}^a_pk(C)
6. C(A) → B : {|Nb|}^a_pk(B)

Figure 2.2: Lowe's attack on the Needham Schroeder public key protocol.

The crucial observation is that A is willing to start a session with a corrupted agent. This is actually in contradiction with the assumptions made by Needham and Schroeder [1978]:

We also assume that each principal has a secure environment in which to compute, such as is provided by a personal computer or would be by a secure shared operating system. Our viewpoint throughout is to provide authentication services to principals that choose to communicate securely. We have not considered the extra problems encountered when trying to force all communication to be performed in a secure fashion or when trying to prevent communication between particular principals in order to enforce restrictions on information flow.

It may therefore be debatable whether Lowe's finding should be considered an attack. Nevertheless, while the above assumption may have been reasonable in 1978, it certainly no longer holds today. Computers may run malware, and people may connect to a malicious server, e.g., because of a phishing attack.

Lowe also shows that the protocol can easily be fixed to avoid this attack by adding B's identity to the second message, i.e. by replacing the message {|〈Na, Nb〉|}^a_pk(A) with {|〈Na, 〈Nb, B〉〉|}^a_pk(A). Indeed, this altered message prevents the man in the middle attack, as A would expect the second message to be {|〈Na, 〈Nb, C〉〉|}^a_pk(A) while the intruder can only produce the message {|〈Na, 〈Nb, B〉〉|}^a_pk(A). We will refer to the fixed protocol as the Needham Schroeder Lowe (NSL) protocol.


3 Messages and deduction

Many different symbolic models are used to represent and reason about protocols. Examples of symbolic models are process algebras (e.g. CSP [Schneider, 1997], the applied pi calculus [Abadi and Fournet, 2001]), strand spaces [Thayer et al., 1999], constraint systems [Millen and Shmatikov, 2001, Comon-Lundh and Shmatikov, 2003], and Horn clauses [Blanchet, 2001]. These models have many differences, but they all have in common the fact that messages are represented by terms. Intuitively, the exact values of keys, identities, or nonces are abstracted away, but the structure of the message is kept and modeled as a special labelled graph, called a term.

3.1 Terms

Terms are a common object in computer science. We introduce here only the basic definitions of terms, substitutions, and unification, and we refer the reader to handbooks, e.g. [Baader and Nipkow, 1998, Baader and Snyder, 2001], for a deeper introduction to terms.


3.1.1 Terms

Cryptographic primitives such as encryption or signatures are simply represented by function symbols. A finite set of function symbols is called a signature, where each function symbol f has an associated integer, its arity.

Definition 3.1. Given a set X of variables and a set N of names (used to represent atomic data such as keys, nonces, or identities), the set of terms of the signature F, the variables X and the names N is denoted T(F, X, N) and is inductively defined as names, variables, and function symbols applied to other terms.

Variables are typically used to represent parts of messages that are left unspecified, such as components received from the outside. The set of variables occurring in a term t is denoted var(t), while the set of names of t is denoted n(t). Terms without variables are called ground or closed.

Example 3.1. In the context of security protocols, a standard signature is Fstd = {senc, aenc, pair, pk} where senc, aenc and pair are three symbols of arity 2, representing respectively symmetric encryption, asymmetric encryption, and concatenation, while pk is a symbol of arity 1, representing the public key associated to some private key. For example, the term t0 = aenc(pair(a, na), pk(ka)), where a, na, ka ∈ N, represents the encryption under the public key pk(ka) of the concatenation of the identity a together with the nonce na. This term t0 can be used to represent the first message sent by a in the Needham-Schroeder protocol.

For readability, we may write 〈t1, t2〉 instead of pair(t1, t2). We may also write {|t1|}^s_t2 instead of senc(t1, t2), and {|t1|}^a_t2 for aenc(t1, t2).

The set of positions of a term t is written pos(t) ⊆ N*. We use ε to denote the root position. Formally, the set pos(t) is inductively defined as follows.

pos(f(t1, . . . , tn)) = {ε} ∪ ⋃_{i=1}^{n} i · pos(ti)


The set of subterms of t is written st(t). The subterm of t at position p ∈ pos(t) is written t|p; in particular, t|ε = t. The term obtained by replacing t|p with a term u in t is denoted t[u]p. The set of subterms of a set of terms S is simply st(S) = ⋃_{t∈S} st(t).

Example 3.2. Let t0 = aenc(pair(a, na), pk(ka)) as defined in Example 3.1. The set of its positions is pos(t0) = {ε, 1, 1.1, 1.2, 2, 2.1}. The set of its subterms is st(t0) = {t0, pair(a, na), a, na, pk(ka), ka} and t0[aenc(na, pk(ka))]1 = aenc(aenc(na, pk(ka)), pk(ka)).
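Under the tuple encoding of terms sketched earlier (ours, not the tutorial's), pos(t), t|p and st(t) are direct recursions; positions become tuples of integers, with the empty tuple playing the role of ε.

```python
# pos(t), t|p and st(t) over the tuple encoding of terms
# (("f", t1, ..., tn) for f(t1, ..., tn); strings for atoms).

def pos(t):
    """Positions of t as tuples of integers; () stands for the root ε."""
    if isinstance(t, str):
        return {()}
    return {()} | {(i,) + p for i, ti in enumerate(t[1:], 1) for p in pos(ti)}

def at(t, p):
    """The subterm t|p."""
    for i in p:
        t = t[i]
    return t

def st(t):
    """The set of subterms of t."""
    return {at(t, p) for p in pos(t)}

t0 = ("aenc", ("pair", "a", "na"), ("pk", "ka"))
assert pos(t0) == {(), (1,), (1, 1), (1, 2), (2,), (2, 1)}  # Example 3.2
assert at(t0, (1, 2)) == "na"
assert len(st(t0)) == 6
```

The assertions reproduce the positions and the six subterms computed in Example 3.2.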

3.1.2 Substitutions

Terms with variables are used to represent partially specified messages. For example, in the Needham-Schroeder protocol, the agent B expects an encrypted message that contains A's identity and some unknown part X, which should be the nonce sent by A. However, B cannot control that X is indeed the nonce sent by A. The message expected by B will therefore be represented by the term aenc(pair(a, x), pk(ka)) where x is a variable. Such variables can then be instantiated depending on the actually received message, that is, they are replaced by a term. This is formally defined through substitutions.

Definition 3.2 (Substitution). A substitution is a function σ from a finite subset of the set of variables X, called its domain and denoted Dom(σ), to T(F, X, N). When applied to a term, a substitution replaces any variable x by the corresponding term σ(x). Formally:

σ(x) = x  if x ∉ Dom(σ)
σ(f(t1, . . . , tn)) = f(σ(t1), . . . , σ(tn))

We often write tσ instead of σ(t).

The question of whether two terms with variables may be made equal is called unification. Two terms u and v are said to be unifiable if there exists a substitution σ, called a unifier, such that uσ = vσ.

Proposition 3.1 ([Baader and Snyder, 2001]). If two terms u and v are unifiable then there exists a most general unifier mgu(u, v) such that any unifier is an instance of mgu(u, v); that is, for any σ such that uσ = vσ, there exists θ such that σ = mgu(u, v)θ.
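A small syntactic unification procedure over the tuple encoding can make this concrete. This is the textbook algorithm in a sketch of ours (not the tutorial's code); it returns an mgu as a dictionary of variable bindings in triangular form, or None when the terms are not unifiable.

```python
# Syntactic unification with occurs-check; `variables` is the set of
# strings treated as variables (everything else is a name or symbol).

def walk(t, sigma):
    while isinstance(t, str) and t in sigma:
        t = sigma[t]
    return t

def occurs(x, t, sigma):
    t = walk(t, sigma)
    return t == x or (isinstance(t, tuple)
                      and any(occurs(x, ti, sigma) for ti in t[1:]))

def unify(u, v, variables, sigma=None):
    sigma = {} if sigma is None else sigma
    u, v = walk(u, sigma), walk(v, sigma)
    if u == v:
        return sigma
    if isinstance(u, str) and u in variables:
        return None if occurs(u, v, sigma) else {**sigma, u: v}
    if isinstance(v, str) and v in variables:
        return unify(v, u, variables, sigma)
    if isinstance(u, tuple) and isinstance(v, tuple) \
            and len(u) == len(v) and u[0] == v[0]:
        for a, b in zip(u[1:], v[1:]):
            sigma = unify(a, b, variables, sigma)
            if sigma is None:
                return None
        return sigma
    return None

# aenc(pair(a, x), pk(ka)) unifies with aenc(pair(a, na), pk(ka)):
u = ("aenc", ("pair", "a", "x"), ("pk", "ka"))
v = ("aenc", ("pair", "a", "na"), ("pk", "ka"))
print(unify(u, v, {"x"}))  # {'x': 'na'}
```

Note that the bindings are triangular: a bound variable may map to a term that itself contains bound variables, so walk is used to resolve chains before comparing.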


3.2 Message deduction

Cryptographic primitives of course have particular properties. For example, anyone can decrypt a message with the corresponding decryption key. These properties are reflected through inference rules that define which messages can be computed from an a priori given set of messages.

3.2.1 Inference system

From the key k and the message senc(m, k), which represents the (symmetric) encryption of m under k, one can compute m. This can be represented by the rule

    senc(m, k)    k
    ───────────────
           m

This can be formalized as follows.

Definition 3.3. An inference rule is a rule of the form

    u1 · · · un
    ───────────
         u

where u1, . . . , un, u are terms (with variables).

An inference system is a set of inference rules.

The standard inference system corresponding to encryption and concatenation is presented in Figure 3.1. It is often called the Dolev-Yao system, in reference to the first symbolic formalisation of the deductive power of an attacker by Dolev and Yao [1981]. The first line of Figure 3.1 corresponds to the fact that anyone can concatenate terms and retrieve terms from a concatenation. The second line models that one can encrypt and decrypt symmetrically whenever he/she has the corresponding key. Similarly, the third line corresponds to asymmetric encryption. The first inference rule of each line of Figure 3.1 is called a composition rule while the other rules are called decomposition rules.

Of course, more inference rules may be considered, for example to represent signatures, hashes, etc. Other inference systems are presented in the exercises (§3.4).


The system IDY consists of the following rules, where the first rule of each line is the composition rule:

Pairing:                from x and y derive 〈x, y〉;  from 〈x, y〉 derive x;  from 〈x, y〉 derive y.
Symmetric encryption:   from x and y derive senc(x, y);  from senc(x, y) and y derive x.
Asymmetric encryption:  from x and y derive aenc(x, y);  from aenc(x, pk(y)) and y derive x.

Figure 3.1: Inference system IDY corresponding to a "Dolev-Yao adversary".

3.2.2 Derivability

Inference rules may be combined to compute or derive new messages. For example, from a set of messages S0 = {〈k1, k2〉, 〈k3, a〉, {|n|}ˢ〈k1,k3〉}, an attacker may learn k1 from 〈k1, k2〉 and k3 from 〈k3, a〉. This allows him to obtain 〈k1, k3〉, which in turn enables him to decrypt {|n|}ˢ〈k1,k3〉 to get n. He may then further append a to it, for example, yielding 〈n, a〉. We say that 〈n, a〉 is deducible from S0, and the corresponding deduction steps can be conveniently represented by the following proof tree.

                       〈k1, k2〉    〈k3, a〉
                       ────────    ───────
    {|n|}ˢ〈k1,k3〉         k1          k3
                       ───────────────────
                            〈k1, k3〉
    ────────────────────────────────────         〈k3, a〉
                     n                           ───────
                                                    a
    ──────────────────────────────────────────────────
                          〈n, a〉

Formally, the notion of deduction is defined as follows.

Definition 3.4 (deduction). Let I be an inference system. A term t is deducible in one step from a set of terms S for the inference system I, denoted S ⊢¹I t, if there exists an inference rule

    u1  · · ·  un
    ─────────────
          u

of I, terms t1, . . . , tn ∈ S, and a substitution θ such that

    ti = uiθ for 1 ≤ i ≤ n    and    t = uθ.

We say that

    t1  · · ·  tn
    ─────────────
          t

is an instance of

    u1  · · ·  un
    ─────────────
          u


A term t is deducible from a set of terms S, denoted S ⊢I t, if there exists a proof tree Π, that is, a tree such that

• the leaves of Π are labelled by terms in S;

• if a node of Π is labelled by tn+1 and has n child nodes labelled by t1, . . . , tn respectively, then

      t1  · · ·  tn
      ─────────────
           tn+1

  is an instance of some inference rule of I;

• the root of Π is labelled by t.

The set Terms(Π) is the set of all terms that appear in some node of Π. We may write S ⊢ t instead of S ⊢I t when I is clear from the context.

Example 3.3. Let S0 = {〈k1, k2〉, 〈k3, a〉, {|n|}ˢ〈k1,k3〉} and IDY as defined in Figure 3.1. The fact that S0 ⊢ 〈k1, k2〉 is witnessed by the proof tree consisting of a single node labelled 〈k1, k2〉. The terms k1 and k3 are derivable in one step from S0. We have S0 ⊢ 〈n, a〉 and a proof tree of it is the tree presented in Section 3.2.2.

3.3 An algorithm to decide message deduction

To analyse security protocols, one of the key methods is to design automatic procedures to detect whether a protocol is subject to attacks. In this context, a very basic question is to decide whether a term t is deducible from a (finite) set of terms S. This problem is called the intruder deduction problem.

Definition 3.5 (Intruder deduction problem). Let I be an inference system. The intruder deduction problem consists in deciding the following problem:

Input: a finite set of terms S and a term t

Output: whether S ⊢I t.

The intruder deduction problem is undecidable in general, that is, for arbitrary inference systems [Abadi and Cortier, 2006]. It is however easily decidable for local theories, which satisfy that whenever S ⊢ t, there exists a proof of it that uses only subterms of S and t.


Definition 3.6. An inference system I is local if for any finite set of terms S and for any term t such that S ⊢ t, there exists a proof tree Π of S ⊢ t such that any term labeling Π is in st(S ∪ {t}).

Theorem 3.1. Let I be a local inference system. The intruder deduction problem is decidable in polynomial time (PTIME).

Proof. Let I be a local inference system. Whether S ⊢ t can be decided by the following algorithm:

• Let S0 = S.

• Let Si+1 = Si ∪ ({u | Si ⊢¹ u} ∩ st(S ∪ {t})).

• If Si+1 = Si then stop.

• Check whether t ∈ Si.

Let N be the cardinality of st(S ∪ {t}). For any i, we have that Si ⊆ st(S ∪ {t}), therefore the procedure stops after at most N steps. Since each step can be computed in polynomial time, the overall procedure is polynomial. Its correctness is ensured by the locality property of I.
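For IDY, which is local (Proposition 3.2 below), this fixed-point computation can be sketched as follows. This is a toy implementation under an assumed encoding (names as strings, composed terms as tuples tagged with their head symbol), not the tutorial's own code:

```python
def subterms(t):
    """st(t): the term together with the subterms of its arguments."""
    st = {t}
    if isinstance(t, tuple):
        for arg in t[1:]:
            st |= subterms(arg)
    return st

def deducible(S, t):
    """Decide S |- t for IDY by saturating S with one-step consequences,
    keeping only subterms of S and t (complete by locality)."""
    relevant = set().union(*(subterms(u) for u in S)) | subterms(t)
    known = set(S)
    while True:
        new = set()
        for u in known:                          # decomposition rules
            if isinstance(u, tuple):
                if u[0] == "pair":
                    new |= {u[1], u[2]}
                elif u[0] == "senc" and u[2] in known:
                    new.add(u[1])
                elif (u[0] == "aenc" and isinstance(u[2], tuple)
                      and u[2][0] == "pk" and u[2][1] in known):
                    new.add(u[1])
        for u in known:                          # composition rules
            for v in known:
                for f in ("pair", "senc", "aenc"):
                    new.add((f, u, v))
        new = (new & relevant) - known           # locality: stay within st(S ∪ {t})
        if not new:
            return t in known
        known |= new
```

On the set S0 of §3.2.2, deducible(S0, 〈n, a〉) holds, matching Example 3.3.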

The intruder deduction problem is decidable in PTIME for the Dolev-Yao inference system, due to its locality property.

Proposition 3.2. The Dolev-Yao inference system IDY is local.

Proof. We say that a proof tree Π of S ⊢ t is minimal if its number of nodes is minimal.

We show a property slightly stronger than locality: for any S, t such that S ⊢ t and for any minimal proof Π of S ⊢ t, it holds that Terms(Π) ⊆ st(S ∪ {t}). Moreover, if Π is reduced to a leaf or ends with a decomposition rule, then Terms(Π) ⊆ st(S).

The proof proceeds by induction on the size of Π.

Base case. If Π is a leaf then by definition Terms(Π) ⊆ st(S).

Induction. Let Π be a minimal proof of S ⊢ t and consider the last applied inference rule. Assume first it is a composition rule:

           Π1      Π2
    Π =    t1      t2
           ──────────
               t

with t = f(t1, t2) for f ∈ {pair, aenc, senc}. By induction hypothesis, it must be the case that Terms(Π1) ⊆ st(S ∪ {t1}) ⊆ st(S ∪ {t}) and Terms(Π2) ⊆ st(S ∪ {t2}) ⊆ st(S ∪ {t}). Hence Terms(Π) ⊆ st(S ∪ {t}).

Consider now the case where the last applied inference rule is a decomposition, and assume first that it corresponds to asymmetric decryption, that is,

                   Π1             Π2
    Π =    aenc(t, pk(t2))        t2
           ─────────────────────────
                       t

The last applied rule in Π1 must be a decomposition. Indeed, if it were a composition, we would have

             Π′1       Π′′1
              t       pk(t2)
             ───────────────          Π2
    Π =      aenc(t, pk(t2))          t2
             ───────────────────────────
                          t

Therefore Π′1 would be a smaller proof of S ⊢ t, which contradicts the minimality of Π. Now, since the last applied rule of Π′1 is a decomposition, applying the induction hypothesis we deduce that Terms(Π1) ∪ {aenc(t, pk(t2))} ⊆ st(S) and Terms(Π2) ⊆ st(S ∪ {t2}). We deduce that aenc(t, pk(t2)), and therefore t and t2, are in st(S). Hence Terms(Π) ⊆ st(S ∪ {t}).

The other cases are left to the reader.

Corollary 3.2. The intruder deduction problem is decidable in PTIME for the Dolev-Yao inference system IDY.

This follows directly from Theorem 3.1 and Proposition 3.2.


3.4 Exercises

Exercise 1 (*). We consider the inference system IDY introduced in Figure 3.1. Let

    S = {{k2}〈k1,{k1}k3〉, 〈k1, k1〉, {{k1}k3}k1}.

1. Show that k1 and k2 are deducible from S, that is, S ⊢IDY k1 and S ⊢IDY k2.

2. Show that k3 is not deducible from S, that is, S ⊬IDY k3.

Exercise 2 (**). We consider two binary symbols sign and blind. Intuitively, the term sign(m, k) represents the message m signed by the key k, while blind(m, r) represents the message m blinded by the random factor r. Blind signatures have the following property: given the signature of a blinded message and the blinding factor, one can compute a valid signature of the original message. This is reflected by the following deduction rule:

    sign(blind(m, r), k)    r
    ─────────────────────────
           sign(m, k)

Such blind signatures are used for example in voting protocols [Fujioka et al., 1992].

The complete inference system corresponding to this primitive is Isign, defined as follows.

Isign :

     x    y            x    y             x    y
    ───────────       ─────────────      ───────────
    pair(x, y)        blind(x, y)        sign(x, y)

    pair(x, y)      pair(x, y)      sign(blind(x, y), z)    y      blind(x, y)    y
    ──────────      ──────────      ─────────────────────────      ────────────────
        x               y                  sign(x, z)                     x

1. Exhibit a set of messages S and a name n such that S ⊢Isign n but S ⊬IDY n, where IDY has been defined in Figure 3.1.

2. Show that Isign is not a local inference system.

3. Provide an algorithm to decide the intruder deduction problem for Isign.


Hint: Show that Isign is a “local” inference system for the following notion of subterm: the set stext(t) of extended subterms of t is the smallest set such that

• st(t) ⊆ stext(t), and

• if sign(blind(t1, t2), t3) ∈ stext(t) then sign(t1, t3) ∈ stext(t).

Exercise 3 (***). Exhibit an inference system I such that the intruder deduction problem associated to I is undecidable.

Exercise 4 (**). We propose an alternative procedure to decide the intruder deduction problem associated to IDY (defined in Figure 3.1).

On input S and t, do:

• Decompose the terms of S as much as possible, that is, compute a fixed point S∗ of S applying only decomposition inference rules.

• Check whether t can be obtained from S∗ applying only composition inference rules.

1. Show that this procedure always terminates.

2. Why is this procedure not complete? That is, exhibit S and t such that the procedure fails to detect that S ⊢IDY t.

3. Provide additional assumptions on the inputs such that the procedure becomes complete.

Exercise 5 (**). We add to the inference system IDY a rule corresponding to block encryption modes such as ECB (Electronic Codebook) or CBC (Cipher-Block Chaining):

    aenc(pair(x, y), z)
    ───────────────────
        aenc(x, z)

It reflects the fact that an attacker may compute the encryption of a prefix of an encrypted message (provided the length of the prefix is a multiple of the block length).

Show that the intruder deduction problem remains decidable.
Hint: you may introduce a different notion of subterm such that this inference system can be shown to be “local” w.r.t. this notion of subterm.


4 Equational theory and static equivalence

Inference systems model what an attacker can compute. This is however still not sufficient in cases where the attacker gains information not by learning new values but by observing the difference between two behaviors. Consider for example the case of an election where voters vote 0 or 1. An important security property is that votes remain confidential. However, the question is not whether the values 0 and 1 remain confidential, since they are both public. A vote remains confidential if an attacker cannot distinguish whether some agent A is voting 0 or 1.

We start this chapter by enriching term algebras with equational theories, in order to model more primitives. We then provide a formal notion of indistinguishability, called static equivalence.

4.1 Equational theories

A pure free algebra does not always suffice to accurately represent cryptographic primitives. Consider for example the bit-wise exclusive or operator ⊕. This operation has a cancellation property: m ⊕ m = 0. This holds even when the operation is nested inside other primitives; for example {m ⊕ m}k = {0}k. The exclusive or is also commutative


and associative. These properties cannot be accurately reflected by an inference system. Instead, it is convenient to quotient the term algebra by an equational theory.

4.1.1 Definitions

Let F be a signature. An equational theory E is simply a set of equations u = v, where u and v are terms in T (F, X, N).

The equivalence relation =E is defined by the equalities of E closed under reflexivity, transitivity, substitution and context. Formally, =E is the smallest equivalence relation such that:

• uθ =E vθ for any u = v ∈ E and any substitution θ,

• u1 =E v1, . . . , uk =E vk implies f(u1, . . . , uk) =E f(v1, . . . , vk).

4.1.2 Examples

Exclusive Or

The standard equational theory E⊕ for representing exclusive or is defined as follows:

    x ⊕ (y ⊕ z) = (x ⊕ y) ⊕ z        x ⊕ y = y ⊕ x
    x ⊕ x = 0                        x ⊕ 0 = x

where ⊕ is a function symbol of arity 2. The two equations of the first line model respectively associativity and commutativity, while those of the second line model the cancellation property and the identity element of the exclusive or.

Example 4.1. Let u = k1 ⊕ k2, v = k2 ⊕ k3, and w = k1 ⊕ k3. Then w =E⊕ u ⊕ v.
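Modulo E⊕, a term built from ⊕, 0 and names is determined by the names occurring an odd number of times in it, which gives a simple normal form. A minimal sketch (not from the tutorial) that checks Example 4.1 under a hypothetical representation of XOR sums as lists of names:

```python
def xor_normal_form(names):
    """Normal form of n1 ⊕ ... ⊕ nk modulo E⊕: associativity and
    commutativity let us forget the order, x ⊕ x = 0 cancels pairs,
    so only names occurring an odd number of times remain."""
    odd = set()
    for n in names:
        odd ^= {n}                  # symmetric difference cancels duplicates
    return frozenset(odd)

# Example 4.1: u = k1 ⊕ k2, v = k2 ⊕ k3, w = k1 ⊕ k3
u, v, w = ["k1", "k2"], ["k2", "k3"], ["k1", "k3"]
assert xor_normal_form(u + v) == xor_normal_form(w)    # w =E⊕ u ⊕ v
```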

Modular Exponentiation

Another standard primitive that cannot be modeled by an inference system is modular exponentiation. Modular exponentiation is used in many encryption schemes (such as RSA or El Gamal) and is at the heart of the Diffie-Hellman key exchange protocol [Diffie and Hellman, 1976]


for example. A minimal equational theory for modular exponentiation is Eexp, induced by the equation

    exp(exp(x, y), z) = exp(exp(x, z), y)

where exp is a function symbol of arity 2.

Of course, this is not the only equation that holds for modular exponentiation, but it suffices to reflect the property needed to execute the Diffie-Hellman protocol:

    A → B : exp(g, na)
    B → A : exp(g, nb)

At the end of an honest execution, the two agents A and B share the same key exp(exp(g, na), nb) =Eexp exp(exp(g, nb), na).
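Under Eexp, two nested exponentiations are equal exactly when they have the same base and the same multiset of exponents, so sorting the exponents yields a normal form. A small sketch under this assumed representation (not from the tutorial):

```python
def exp_normal_form(base, exponents):
    """Normal form modulo Eexp: exp(exp(x, y), z) = exp(exp(x, z), y)
    means only the multiset of exponents matters, so we sort it."""
    return (base, tuple(sorted(exponents)))

# A computes exp(exp(g, na), nb); B computes exp(exp(g, nb), na):
key_a = exp_normal_form("g", ["na", "nb"])
key_b = exp_normal_form("g", ["nb", "na"])
assert key_a == key_b          # the two agents share the same key
```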

Encryption

Encryption and concatenation can also be modeled by an equational theory, with the introduction of explicit destructor operators. Let

    Fdec = {sdec, adec, fst, snd}

be the signature corresponding to, respectively, symmetric decryption, asymmetric decryption, and the first and second projections. Let F0 be an arbitrary (finite) set of constant symbols. The properties of concatenation and of standard symmetric and asymmetric encryption are modeled by the following set of equations Eenc, over the term algebra T (Fstd ∪ Fdec ∪ F0, X):

    sdec(senc(x, y), y) = x        adec(aenc(x, pk(y)), y) = x
    fst(pair(x, y)) = x            snd(pair(x, y)) = y
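Oriented from left to right, the four equations of Eenc form a convergent rewriting system, so equality modulo Eenc can be decided by comparing normal forms. The sketch below assumes a tagged-tuple term encoding introduced for illustration (it is not the tutorial's code):

```python
def reduce_root(t):
    """Apply one Eenc equation, oriented left to right, at the root;
    return None when no rule applies."""
    if isinstance(t, tuple):
        tag, args = t[0], t[1:]
        if tag == "fst" and isinstance(args[0], tuple) and args[0][0] == "pair":
            return args[0][1]                     # fst(pair(x, y)) -> x
        if tag == "snd" and isinstance(args[0], tuple) and args[0][0] == "pair":
            return args[0][2]                     # snd(pair(x, y)) -> y
        if (tag == "sdec" and isinstance(args[0], tuple)
                and args[0][0] == "senc" and args[0][2] == args[1]):
            return args[0][1]                     # sdec(senc(x, y), y) -> x
        if (tag == "adec" and isinstance(args[0], tuple)
                and args[0][0] == "aenc" and args[0][2] == ("pk", args[1])):
            return args[0][1]                     # adec(aenc(x, pk(y)), y) -> x
    return None

def normalize(t):
    """Innermost normalization: two terms are equal modulo Eenc
    exactly when their normal forms coincide."""
    if isinstance(t, tuple):
        t = (t[0],) + tuple(normalize(a) for a in t[1:])
    r = reduce_root(t)
    return t if r is None else normalize(r)
```

For instance, normalize applied to fst(sdec(senc(pair(n, m), k), k)) yields n.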

4.1.3 Deduction

It is again possible to model what an attacker can deduce from a given set of messages. Actually, when cryptographic primitives are modeled by an equational theory, the notion of deduction can be defined in a uniform way: the intruder may deduce any term that he can obtain


by applying functions. Formally, given an equational theory E, the associated deduction system ⊢E is defined by the following rules:

    t1  · · ·  tn                 t
    ─────────────────           ────  if t =E t′
    f(t1, . . . , tn)            t′

Example 4.2. Consider the equational theory E⊕ ∪ Eenc of XOR and encryption, over terms in T (Fstd ∪ Fdec ∪ {⊕}, X). Let

    S = {senc(a, a ⊕ c), a ⊕ b, b ⊕ c}.

Then S ⊢E⊕∪Eenc a. Indeed,

                        a ⊕ b    b ⊕ c
                        ──────────────
    senc(a, a ⊕ c)          a ⊕ c
    ─────────────────────────────
                  a

Alternatively, deduction can be characterized as a variant of a unification problem.

Proposition 4.1. Let E be an equational theory. Let S be a set of terms and t a term. Then S ⊢E t if and only if there exist a context C such that n(C) = ∅ and terms t1, . . . , tn ∈ S such that t =E C[t1, . . . , tn].

A context is a term with holes (more formally, with variables x1, . . . , xn) and C[t1, . . . , tn] denotes C where the holes (more formally, the variables xi) are replaced by the ti.

Proof. The implication S ⊢E t ⇒ ∃C, t1, . . . , tn ∈ S s.t. n(C) = ∅ and t =E C[t1, . . . , tn] is proved by a simple induction on the size of the proof tree of S ⊢E t.

The converse implication is proved again by induction, on the size of the context C.

This property was first noticed by Abadi and Cortier [2004], and the existence of a context C such that t =E C[t1, . . . , tn] has later been called the cap unification problem [Anantharaman et al., 2007].

4.1.4 The case of explicit decryption

Back to the case of encryption, there are two ways of defining deduction: either through the inference system IDY introduced in §3.2.1 or


through the deduction system induced by the equational theory Eenc. Both definitions actually coincide.

Proposition 4.2. Let S be a set of terms and t be a term in the term algebra T (Fstd, X). Then

    S ⊢IDY t if and only if S ⊢Eenc t.

The proof is left as an exercise.

4.2 Static equivalence

Deduction does not always suffice to reflect the attacker's abilities. For example, the attacker can observe in which order messages are sent. Messages are therefore organized into frames.

Definition 4.1 (frame). A frame is an expression ϕ = νnθ = νn{M1/x1, . . . , Mn/xn} where θ is a substitution and n is a set of names that are restricted in ϕ. The terms M1, . . . , Mn represent the attacker's knowledge, while the names in n are initially unknown to the attacker. If k is a name, νkϕ denotes ν(n ∪ {k})θ. We define the domain of ϕ, Dom(ϕ), to be Dom(θ).

Example 4.3. νk{1/x1, 0/x2, senc(0, k)/x3} is a frame. It models the fact that the attacker has seen the two constants 1 and 0 in addition to the encryption senc(0, k), where the key k is initially unknown to the attacker.

4.2.1 Deduction and recipe

The notion of deduction can easily be extended to frames.

Definition 4.2 (deduction). A term t is deducible from a frame ϕ = νnθ if it can be deduced using ϕ and any name that does not occur in n. More formally, given an equational theory E and a frame ϕ = νn{M1/x1, . . . , Mn/xn}, we write ϕ ⊢E t if

    {M1, . . . , Mn} ∪ (N \ n) ⊢E t.


Consider for example the frame ϕ1 = νn, k θ1 where

    θ1 = {senc(pair(n, n), k)/x, k/y}

and consider the equational theory Eenc defined in §4.1.2. Then ϕ1 ⊢Eenc n. More precisely, n can be obtained from ϕ1 by first decrypting the message stored in x with the key in y and then projecting on the first component. Let M = fst(sdec(x, y)). We have that Mθ1 =Eenc n. Such a term M is called a recipe of n w.r.t. ϕ1.

Definition 4.3 (recipe). Let ϕ = νnθ be a frame and t a term such that ϕ ⊢E t. A term R is said to be free w.r.t. ϕ if n(R) ∩ n = ∅. A term R is a recipe of t in ϕ if R is free w.r.t. ϕ and Rθ =E t.

A term is deducible if and only if there exists a recipe for it.

Proposition 4.3. Let ϕ = νnθ be a frame and t a term. Then ϕ ⊢E t if and only if there exists a recipe R of t in ϕ.

The proof is very similar to the proof of Proposition 4.1.

4.2.2 Definition of static equivalence

Consider the two frames ϕ1 = {0/x, 1/y} and ϕ2 = {1/x, 0/y} where 0 and 1 are constants. Clearly, the same terms can be deduced from ϕ1 and ϕ2. However, the two frames model two different situations: in ϕ1, the attacker observed 0 then 1, while in ϕ2, the attacker observed 1 then 0. To reflect the ability of the attacker to compare messages, Abadi and Fournet [2001] introduced the notion of static equivalence. Before defining static equivalence we require the following auxiliary definition.

Definition 4.4. Given frames ϕ1, ϕ2 we write ϕ1 =α ϕ2 if ϕ1 is equal to ϕ2 up to α-conversion of restricted names.

We say that the equation M =E N holds in a frame ϕ, written (M =E N)ϕ, if and only if there exist n and θ such that ϕ =α νnθ, M, N are free w.r.t. νnθ, and Mθ =E Nθ.

Definition 4.5 (static equivalence). Two frames ϕ1 and ϕ2 are statically equivalent w.r.t. an equational theory E, denoted ϕ1 ∼E ϕ2, if Dom(ϕ1) = Dom(ϕ2) and for any two terms M, N we have that

    (M =E N)ϕ1 if and only if (M =E N)ϕ2.


We may write ϕ1 ∼ ϕ2 instead of ϕ1 ∼E ϕ2 when E is clear from the context.

Example 4.4. Let Eenc be the equational theory of encryption as defined in §4.1.2. Let ϕ1 = {0/x, 1/y} and ϕ2 = {1/x, 0/y}. Then ϕ1 ≁ ϕ2. Indeed, (x = 0)ϕ1 holds while (x = 0)ϕ2 does not.

Example 4.5. Let ϕ1 = νk{aenc(0, pk(k))/x, pk(k)/y} and ϕ2 = νk{aenc(1, pk(k))/x, pk(k)/y}, corresponding respectively to the asymmetric encryption of 0 and of 1. Then ϕ1 ≁ ϕ2. Indeed, (aenc(0, y) = x)ϕ1 holds while (aenc(0, y) = x)ϕ2 does not: an attacker may encrypt 0 himself and check for equality.

This is no longer the case if encryption is randomized. Let ϕ′1 = νk, n{aenc(pair(0, n), pk(k))/x, pk(k)/y} and ϕ′2 = νk, n{aenc(pair(1, n), pk(k))/x, pk(k)/y}. Then ϕ′1 ∼ ϕ′2.

Static equivalence is closed under restriction and composition.

Proposition 4.4. Let ϕ = νnσ, ϕ1 = νn1σ1, and ϕ2 = νn2σ2 be three frames such that ϕ1 ∼ ϕ2, Dom(σ) ∩ Dom(σi) = ∅ and n ∩ ni = ∅ (1 ≤ i ≤ 2). Then

1. νs ϕ1 ∼ νs ϕ2 for any name s, and

2. ψ1 = ν(n ∪ n1)(σ ∪ σ1) ∼ ν(n ∪ n2)(σ ∪ σ2) = ψ2.

Proof. Property 1 follows from the fact that n(νs ϕi) ⊆ n(ϕi). Property 2 is also simple to prove. Assume (M = N)ψ1. This can be rewritten as (Mσ = Nσ)ϕ1, assuming n(M, N) ∩ n = ∅, which can be enforced through α-renaming. Since ϕ1 ∼ ϕ2, this implies (Mσ = Nσ)ϕ2, and therefore (M = N)ψ2. The case where (M = N)ψ2 holds is symmetric.

4.2.3 Decision procedure for encryption

In this section, we consider the equational theory of symmetric encryption Esenc, obtained by restricting the theory Eenc (defined in §4.1.2) to pairing and symmetric encryption. Our goal is to present a decision procedure for static equivalence for the theory Esenc. The procedure is a special case of the procedure presented by Baudet [2005] for arbitrary subterm convergent equational theories.


For the sake of clarity of exposition, we consider only frames where all names are restricted (public names should be explicitly added in the frame). That is, we only consider frames of the form νnθ where n(θ) ⊆ n. Therefore, by abuse of notation, we write θ1 ∼Esenc θ2 whenever νn(θ1)θ1 ∼Esenc νn(θ2)θ2. We first need to introduce some vocabulary.

Definition 4.6 (extended frame). An extended frame is an expression {M1 ▷ u1, . . . , Mn ▷ un} where the ui and Mi are terms. The extended frame associated to a frame νn{u1/x1, . . . , un/xn} (where n(ui) ⊆ n) is simply defined as {x1 ▷ u1, . . . , xn ▷ un}.

Initialization  Given a substitution θ, the decision procedure starts by eliminating duplicated terms, replacing them by equations. Suppose the extended frame corresponding to θ is

    {x_1^1 ▷ t1, . . . , x_1^{k1} ▷ t1, . . . , x_n^1 ▷ tn, . . . , x_n^{kn} ▷ tn}

where the ti are pairwise distinct and x_i^j denotes the j-th variable pointing to the term ti. Then we define

    Init(θ) = ({x_1^1 ▷ t1, x_2^1 ▷ t2, . . . , x_n^1 ▷ tn}, {x_i^1 ⋈ x_i^j | 1 ≤ i ≤ n, 1 ≤ j ≤ ki})

Saturation  Let ϕ0 be an extended frame and E0 a set of equations M ⋈ N where M and N are terms. We define the saturation procedure for ϕ0 and E0 as follows.

Sat(ϕ0, E0) =

1. Let ϕ := ϕ0 and E := E0.

2. Repeat until reaching a fixed point.

   For any M1 ▷ t1, M2 ▷ t2 ∈ ϕ, f ∈ {senc, sdec, pair}, g ∈ {fst, snd}:

   • If f(t1, t2) =Esenc t for some term t that is a subterm of ϕ then
     – if ∃M. M ▷ t ∈ ϕ then E := E ∪ {f(M1, M2) ⋈ M};
     – else ϕ := ϕ ∪ {f(M1, M2) ▷ t}.

   • If g(t1) =Esenc t for some term t that is a subterm of ϕ then
     – if ∃M. M ▷ t ∈ ϕ then E := E ∪ {g(M1) ⋈ M};
     – else ϕ := ϕ ∪ {g(M1) ▷ t}.

3. Return E.

This procedure terminates and reaches a fixed point (the proof is left as an exercise).

Theorem 4.1. Let θ1 and θ2 be two substitutions. Then θ1 ∼Esenc θ2 if and only if

• for any (M ⋈ N) ∈ Sat(Init(θ1)) it holds that Mθ2 =Esenc Nθ2, and

• for any (M ⋈ N) ∈ Sat(Init(θ2)) it holds that Mθ1 =Esenc Nθ1.

This theorem is a particular case of the decision procedure developed by Baudet [2005]. Its proof will not be given here.

Example 4.6. Consider θ = {senc(pair(n, n), k)/x, k/y}. Then

    Init(θ) = ({x ▷ senc(pair(n, n), k), y ▷ k}, ∅)

and Sat(Init(θ)) = (ϕ, E) with

    ϕ = {x ▷ senc(pair(n, n), k), y ▷ k, sdec(x, y) ▷ pair(n, n), fst(sdec(x, y)) ▷ n}

and

    E = {snd(sdec(x, y)) ⋈ fst(sdec(x, y))}.
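The initialization and saturation steps, together with the check of Theorem 4.1, can be prototyped in a few dozen lines. The sketch below is a toy reimplementation under assumed conventions (terms as tagged tuples, a frame given as a duplicate-free mapping from variables to terms, recipes as terms over the frame's variables); it illustrates the saturation idea and is not a faithful rendering of Baudet's procedure:

```python
def norm(t):
    """Normal form modulo Esenc (equations oriented left to right)."""
    if not isinstance(t, tuple):
        return t
    t = (t[0],) + tuple(norm(a) for a in t[1:])
    if t[0] == "fst" and isinstance(t[1], tuple) and t[1][0] == "pair":
        return t[1][1]
    if t[0] == "snd" and isinstance(t[1], tuple) and t[1][0] == "pair":
        return t[1][2]
    if (t[0] == "sdec" and isinstance(t[1], tuple)
            and t[1][0] == "senc" and t[1][2] == t[2]):
        return t[1][1]
    return t

def subterms(t):
    s = {t}
    if isinstance(t, tuple):
        for a in t[1:]:
            s |= subterms(a)
    return s

def saturate(theta):
    """Sat(Init(theta)): returns the saturated extended frame (a dict
    recipe -> term) and the collected equations (pairs of recipes)."""
    phi = dict(theta)                   # duplicates assumed pre-merged
    eqs = []
    st = set().union(*(subterms(t) for t in phi.values()))
    changed = True
    while changed:
        changed = False
        items = list(phi.items())
        cands = [((g, M), norm((g, t))) for M, t in items for g in ("fst", "snd")]
        cands += [((f, M1, M2), norm((f, t1, t2)))
                  for M1, t1 in items for M2, t2 in items
                  for f in ("senc", "sdec", "pair")]
        for R, t in cands:
            if t not in st or R in phi:             # keep only subterms of phi
                continue
            same = [M for M, u in phi.items() if u == t]
            if same:                                # term already reachable
                if (R, same[0]) not in eqs:
                    eqs.append((R, same[0]))
            else:
                phi[R] = t
                changed = True
    return phi, eqs

def apply_recipe(R, theta):
    if isinstance(R, tuple):
        return (R[0],) + tuple(apply_recipe(a, theta) for a in R[1:])
    return theta.get(R, R)

def statically_equivalent(theta1, theta2):
    """Theorem 4.1: each side's equations must hold on the other side."""
    for a, b in ((theta1, theta2), (theta2, theta1)):
        _, eqs = saturate(a)
        for M, N in eqs:
            if norm(apply_recipe(M, b)) != norm(apply_recipe(N, b)):
                return False
    return True
```

On the frame of Example 4.6 the saturation finds, among other equations, snd(sdec(x, y)) ⋈ fst(sdec(x, y)).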

4.2.4 More decision procedures

A much more general decision procedure has been developed by Baudet [2005] for any convergent theory. It is guaranteed to terminate for the class of subterm convergent theories¹ and it also works in more general cases that encompass for example blind signatures [Baudet et al.]. The corresponding implementation is the tool YAPA. Another available tool for static equivalence is KISS by Ciobâcă et al. [2012]. This tool also

¹A convergent theory is an equational theory induced by a convergent rewriting system. The theory is subterm convergent if there is a corresponding (convergent) rewriting system such that for any rewrite rule ℓ → r, the term r is a subterm of ℓ or a constant.


handles a large class of equational theories that includes for example trapdoor commitments.

Decidability of static equivalence has also been obtained for the class of monoidal theories [Cortier and Delaune, 2012], which captures e.g. exclusive or and pure associativity and commutativity. It also encompasses homomorphic equations of the form h(x + y) = h(x) + h(y) where + is an associative and commutative symbol. Decidability results can be combined [Cortier and Delaune, 2012]: if static equivalence is decidable for two disjoint theories E1 and E2, then it is also decidable for E1 ∪ E2 (provided deduction is also decidable for E1 and E2).

Some more decidability results have been obtained for specific theories such as the theory of trapdoor commitment and that of reencryption [Berrima et al., 2011], as well as theories for Diffie-Hellman exponentiation [Kremer and Mazaré, 2007] and bilinear pairings [Kremer et al., 2012].


4.3 Exercises

Exercise 6 (**). Consider the equational theory E⊕ of exclusive or, defined in §4.1.2. Show that the following problem is decidable.

Input: a finite set of terms S and a term t over T ({⊕}, N)

Output: whether S ⊢E⊕ t.

Discuss the complexity of your algorithm (note: there exists a polynomial time algorithm).

Exercise 7 (**). Consider the equational theory EAC of associativity and commutativity, defined by the two following equations:

    (x + y) + z = x + (y + z)        x + y = y + x

Show that the following problem is decidable.

Input: a finite set of terms S and a term t over T ({+}, N)

Output: whether S ⊢EAC t.

Discuss the complexity of your algorithm (note: this problem is actually NP-complete).

Exercise 8 (***). Let S be a set of terms and t be a term in the term algebra T (Fstd, X). Show that

    S ⊢IDY t if and only if S ⊢Eenc t.

Exercise 9 (*). Let ϕ1 = νn1θ1 and ϕ2 = νn2θ2. We define ϕ1 ◦∼E ϕ2 if and only if

• Dom(ϕ1) = Dom(ϕ2), and

• for all M, N that are free in ϕ1 and ϕ2 we have that

    Mθ1 =E Nθ1 iff Mθ2 =E Nθ2.

Show that static equivalence (∼E) and ◦∼E do not coincide.

Exercise 10 (**). Show that the saturation procedure described in §4.2.3 terminates.


Exercise 11 (*). Let θ1 = {senc(pair(n, n), k)/x, k/y}, θ2 = {senc(pair(senc(n, k′), senc(n, k′)), k)/x, k/y}, and θ3 = {senc(pair(n, n′), k)/x, k/y}.

1. Show that θ1 ≁Esenc θ3.

2. Show that θ1 ∼Esenc θ2.
   Hint: you may use the decision procedure of §4.2.3.

Exercise 12 (***). Consider the theory Eenc defined in §4.1.2. Let h be a unary symbol (with no equation). Show that for any ground term t,

    νn. {h(n)/x} ∼Eenc νn. {h(pair(t, n))/x}


5 A cryptographic process calculus

The previous chapter describes how the messages exchanged in cryptographic protocols can be represented as terms. In this chapter, we discuss how the protocols themselves can be modelled. While the kind of “Alice - Bob” notation used in §2 is convenient for explaining protocols, such notations generally only describe an honest protocol execution, and they contain ambiguities. Fundamentally, security protocols are concurrent programs, and formalisms for representing such programs do exist. In particular, process algebras and calculi have been developed for this purpose. Some “general purpose” process algebras, e.g. CSP [Ryan et al., 2000], have indeed been used to reason about security protocols. There also exist dedicated calculi which integrate support for sending messages built with cryptographic primitives. Examples of such dedicated process calculi are CryptoSPA [Focardi and Martinelli, 1999], which extends the CCS process algebra, and the spi calculus [Abadi and Gordon, 1999] and the applied pi calculus [Abadi and Fournet, 2001], which both extend the pi calculus. We present the applied pi calculus here in more detail. In contrast to the pure pi calculus, it is not restricted to communicating names: processes may output terms that represent messages. One may also note that the problem of compiling “Alice - Bob” style notations to more formal models has been considered, e.g., [Jacquemard et al., 2000, Millen and Denker, 2002, Chevalier and Rusinowitch, 2010], but we will not discuss it here.

P, Q, R :=                        plain processes
    0
    P ‖ Q
    !P
    νn.P
    if t1 = t2 then P else Q
    in(u, x).P
    out(u, t).P

Figure 5.1: Syntax: plain processes

5.1 Syntax and informal semantics

We assume a set of names N, a set of variables X, and a signature F, which define the set of terms T (F, X, N), equipped with an equational theory E (see §4). The equational theory is left implicit throughout this chapter. Moreover, we rely on a simple sort system that distinguishes channels from basic messages. We distinguish the set Nch ⊂ N of channel names and partition X = Xb ⊎ Xch into the set of variables of base sort and the set of variables of channel sort. We suppose that function symbols cannot be applied to variables or names of channel sort, and cannot return terms of that sort. Hence, channels are always atomic: the only terms of channel sort are variables and names of that sort.

The applied pi calculus has two kinds of processes: plain and extended processes. Plain processes are generated by the grammar given in Figure 5.1, where t, t1, t2, . . . range over terms, n over names, x over variables, and u is a meta-variable that stands for either a name or a variable of channel sort. The 0 process is the process that does nothing. Parallel composition P ‖ Q models that the processes P and Q are executed in parallel. The replication of P, denoted !P, allows an unbounded number of copies of P to be spawned. New names can be

Page 47: Formal Models and Techniques for Analyzing Security ... · Formal Models and Techniques for Analyzing Security Protocols: A Tutorial. Foundations and TrendsR in Programming Languages,

5.1. Syntax and informal semantics 37

A, B, C :=                        extended processes
    P
    A ‖ B
    νn.A
    νx.A
    {t/x}

Figure 5.2: Syntax: extended processes

created using the new operator νn, which acts as a binder and generates a restricted name. The conditional if t1 = t2 then P else Q behaves as P whenever t1 =E t2 and as Q otherwise. Finally, in(u, x).P expects an input on channel u that is bound to the variable x in P, and out(u, t).P outputs the term t on channel u and then behaves as P.
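As a small illustration (not taken from the tutorial), the grammar of Figure 5.1 can be mirrored by an abstract syntax tree; the hypothetical process built below is νk.( out(c, senc(n, k)) ‖ in(c, x).if sdec(x, k) = n then 0 ):

```python
from dataclasses import dataclass

# One constructor per production of Figure 5.1 (the else branch of the
# conditional and trailing continuations default to the 0 process).
@dataclass
class Nil: pass
@dataclass
class Par: left: object; right: object
@dataclass
class Repl: proc: object
@dataclass
class New: name: str; proc: object
@dataclass
class Cond: t1: object; t2: object; then: object; els: object
@dataclass
class In: chan: str; var: str; proc: object
@dataclass
class Out: chan: str; term: object; proc: object

# νk.( out(c, senc(n, k)) ‖ in(c, x). if sdec(x, k) = n then 0 else 0 )
handshake = New("k", Par(
    Out("c", ("senc", "n", "k"), Nil()),
    In("c", "x", Cond(("sdec", "x", "k"), "n", Nil(), Nil()))))
```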

Extended processes are generated by the grammar given in Figure 5.2. They extend plain processes by active substitutions, and allow restrictions on both names and variables. An active substitution {t/x} allows processes to address a term by a variable. The scope of this access may be restricted using the ν operator on variables. This also allows us to define local variables: the construct let x = t in P is defined as νx.(P ‖ {t/x}). When the variable x is not restricted, the environment, which represents the attacker, may use x to access the term t. As exemplified in the description of the labelled semantics of the applied pi calculus, these active substitutions are typically used to record the terms output by the processes and represent the attacker's knowledge. Given several active substitutions in parallel {t1/x1} ‖ ... ‖ {tn/xn} we often regroup them into a single substitution {t1/x1, ..., tn/xn}. We suppose that these substitutions are cycle-free, that there is at most one substitution defined for each variable (and exactly one when this variable is restricted), and that substitutions only contain terms of base sort. Given an extended process A we denote by φ(A) the process obtained by replacing any plain process with 0. φ(A) is called the frame of the process A. We also note that extended processes must not appear under a replication, an input, an output, or a conditional.

For readability we often omit trailing 0 processes and else 0 branches, e.g. we write

if t1 = t2 then out(c, t)

instead of

if t1 = t2 then out(c, t).0 else 0

As usual we define free and bound names of processes, denoted fn(A) and bn(A); we write n(A) for the set of all names of A. Similarly we define free and bound variables, denoted fv(A) and bv(A). The set of bound variables contains all variables that have been bound by an input or restricted by the ν operator. For an active substitution {t/x} the set of free variables contains x in addition to the variables occurring in t.

Finally, we define the notion of a context. A context is a process with a "hole", often denoted _. Given a context C, we write C[A] for the process obtained by replacing _ with the process A. An evaluation context is a context whose hole is neither under a replication, nor under an input, output or conditional.

5.2 Modelling protocols as processes

Before defining the formal semantics of the applied pi calculus we illustrate how the calculus can be used for modelling security protocols. As an example we consider the Needham-Schroeder public key protocol, introduced in §2. The protocol can be informally described as follows.

1. A → B : aenc(〈a, na〉, pkb)
2. B → A : aenc(〈na, nb〉, pka)
3. A → B : aenc(nb, pkb)

We assume the previously defined equational theory Eenc on the signature Fdec ∪ Fstd and model each of the roles A and B by a process. It is important to distinguish between the role of the initiator A, modelled by a process, and the agent (a in the above informal description) who is executing the role. To make this distinction explicit we parametrize the processes representing the initiator and responder with the keys of the agents who execute the role.


PA(ski, pkr) = νna. out(c, aenc(〈pk(ski), na〉, pkr)).
               in(c, x).
               if fst(adec(x, ski)) = na then
               let xnb = snd(adec(x, ski)) in
               out(c, aenc(xnb, pkr))

The process first generates a fresh nonce na and then outputs the first message on channel c. Note that for simplicity we assume that the agent's identity is his public key. Next, the initiator waits for a message, which is bound to the variable x. Using a conditional, the initiator checks that the message contains the previously sent nonce na. For readability we then create a local variable xnb and store in this variable what is expected to be the nonce generated by the responder.

We can model the responder process PB similarly.

PB(skr) = in(c, y).
          let pki = fst(adec(y, skr)) in
          let yna = snd(adec(y, skr)) in
          νnb. out(c, aenc(〈yna, nb〉, pki)).
          in(c, z).
          if adec(z, skr) = nb then Q

One may note that PB only takes a single argument, the responder's private key. The initiator's public key is received during the execution. When the final test succeeds, we suppose that the responder continues to execute some task modelled by the process Q.

We can now put the processes together into a process that models the Needham-Schroeder public key protocol as a whole.

P^1_nspk = νska, skb.( PA(ska, pk(skb)) ‖ PB(skb) ‖
                       out(c, pk(ska)) ‖ out(c, pk(skb)) )

This first version models that a (or more precisely the agent identified by pk(ska)) is executing an instance of the role PA with b (identified by pk(skb)). We also output the public keys of a and b to make these available to the adversary.


However, one may notice that the above modeling would miss Lowe's man-in-the-middle attack, since this setting does not involve any dishonest agent c. To capture this attack one could explicitly include that a is willing to start a session with the intruder. We suppose that the intruder possesses a secret key skc, which formally is just a free name.

P^2_nspk = νska, skb.( PA(ska, pk(skb)) ‖ PA(ska, pk(skc)) ‖ PB(skb) ‖
                       out(c, pk(ska)) ‖ out(c, pk(skb)) )

This second version explicitly includes a session started by a with the intruder and indeed captures Lowe's man-in-the-middle attack. However, this situation is not satisfactory, as one does not know a priori with whom agents should start a session. One trick is to leave this choice to the attacker: we add an input that is used to define the public key given to the initiator role.

P^3_nspk = νska, skb.( in(c, xpk).PA(ska, xpk) ‖ PB(skb) ‖
                       out(c, pk(ska)) ‖ out(c, pk(skb)) )

Now the attacker can simply input the public key that suits him best to create an attack. Note that he may input one of the two regular public keys pk(ska) and pk(skb), or any other term, including in particular his own key pk(skc). Note also that the attacker may trigger the agent a to execute a protocol "with himself", i.e., with the public key pk(ska). There indeed exist attacks, sometimes called reflection attacks, that rely on this behavior.
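To make Lowe's man-in-the-middle attack concrete, the message flow can be replayed symbolically. The following Python sketch is ours (the tuple encoding of terms and all helper names are illustrative, not part of the calculus): the intruder, holding skc, relays a's session with him into a session with b.

```python
# Symbolic terms as nested tuples; aenc/adec model the equation
# adec(aenc(m, pk(sk)), sk) = m used in this chapter.
def aenc(m, pub): return ("aenc", m, pub)
def pk(sk):       return ("pk", sk)
def pair(a, b):   return ("pair", a, b)

def adec(c, sk):
    assert c[0] == "aenc" and c[2] == pk(sk), "decryption fails"
    return c[1]

ska, skb, skc = "ska", "skb", "skc"   # skc is the intruder's own key
na, nb = "na", "nb"

# a starts a session with the intruder: PA(ska, pk(skc)) sends message 1
m1 = aenc(pair(pk(ska), na), pk(skc))
# the intruder decrypts m1 and re-encrypts it for b, impersonating a
m1_forged = aenc(adec(m1, skc), pk(skb))
# b answers following PB(skb); [2] projects the second pair component (na)
m2 = aenc(pair(adec(m1_forged, skb)[2], nb), pk(ska))
# the intruder forwards m2 to a unchanged; a believes nb comes from c
m3 = aenc(adec(m2, ska)[2], pk(skc))
# the intruder now learns nb, which b believes is shared only with a
assert adec(m3, skc) == nb
```

The final assertion is exactly the failure of secrecy of nb that Lowe observed.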

This version still has a shortcoming: we only consider one session for each role. Many attacks however require several parallel sessions of the same role. Millen [1999] has even shown that there is a priori no upper bound on the number of parallel sessions that would avoid all attacks. We therefore add replication.

P^4_nspk = νska, skb.( !in(c, xpk).PA(ska, xpk) ‖ !PB(skb) ‖
                       out(c, pk(ska)) ‖ out(c, pk(skb)) )


This new version allows a and b to execute an arbitrary number of sessions. Note that a may execute several sessions with the same as well as with different responders. However, this modeling still misses that both the initiator and responder roles may be executed by the same agent. We therefore include explicitly that a and b may execute both roles. Moreover, the above process only allows two honest agents, while an attack may a priori require the presence of more honest agents. Therefore we add an additional replication that allows the creation of an arbitrary number of honest private keys, each of which can be used in an arbitrary number of sessions.

P^5_nspk = !νska, skb.( !in(c, xpk).PA(ska, xpk) ‖ !PB(ska) ‖
                        !in(c, xpk).PA(skb, xpk) ‖ !PB(skb) ‖
                        out(c, pk(ska)) ‖ out(c, pk(skb)) )

Observing the symmetric roles of a and b, this process can be written more succinctly and elegantly as

P^6_nspk = !νsk.( !in(c, xpk).PA(sk, xpk) ‖ !PB(sk) ‖ out(c, pk(sk)) )

This final modeling allows the adversary to spawn an arbitrary number of instances of PA and PB with either the same or different private keys.

5.3 Formal semantics

As the goal is to prove security properties of protocols modelled as processes, we need to define the semantics of the calculus in order to have a precise definition of how a process can be executed.

5.3.1 Operational semantics

We first define the notion of structural equivalence. Intuitively, structural equivalence relates identical processes that are simply written in a different way.

Formally, structural equivalence ≡ is the smallest equivalence relation closed under α-conversion of bound names and variables and application of evaluation contexts such that:


Par-0    A ‖ 0 ≡ A
Par-C    A ‖ B ≡ B ‖ A
Par-A    (A ‖ B) ‖ C ≡ A ‖ (B ‖ C)
Repl     !P ≡ P ‖ !P

New-0    νn.0 ≡ 0
New-Par  A ‖ νu.B ≡ νu.(A ‖ B)    when u ∉ fv(A) ∪ n(A)
New-C    νu.νv.A ≡ νv.νu.A

Alias    νx.{t/x} ≡ 0
Subst    {t/x} ‖ A ≡ {t/x} ‖ A{t/x}
Rewrite  {t1/x} ≡ {t2/x}          when t1 =E t2

While most of the above rules are standard, the last three rules may require some explanation. Alias allows the creation of a new local variable, Subst allows the application of an active substitution to a process, and Rewrite relates two active substitutions modulo the equational theory.

Example 5.1. Let us illustrate these rules by showing that out(c, t1) ≡ out(c, t2) when t1 =E t2.

out(c, t1) ≡ out(c, t1) ‖ 0                 by Par-0
           ≡ out(c, t1) ‖ νx.{t1/x}         by Alias
           ≡ νx.(out(c, t1) ‖ {t1/x})       by New-Par
           ≡ νx.({t1/x} ‖ out(c, t1))       by Par-C
           ≡ νx.({t1/x} ‖ out(c, x))        by Subst
           ≡ νx.({t2/x} ‖ out(c, x))        by Rewrite
           ≡ νx.({t2/x} ‖ out(c, t2))       by Subst
           ≡ νx.(out(c, t2) ‖ {t2/x})       by Par-C
           ≡ out(c, t2) ‖ νx.{t2/x}         by New-Par
           ≡ out(c, t2) ‖ 0                 by Alias
           ≡ out(c, t2)                     by Par-0

Note that we also implicitly used the fact that structural equivalence is closed under application of evaluation contexts, as we applied some of the rules directly under a context.


One may also note that for any extended process A, we have that φ(A) ≡ νn.σ for some sequence of names n and substitution σ. Therefore we can lift static equivalence to processes, and we write A ∼E B whenever φ(A) ≡ νnA.σA, φ(B) ≡ νnB.σB, and νnA.σA ∼ νnB.σB.

We can now define how processes interact. Internal reduction is the smallest relation on processes closed under structural equivalence and application of evaluation contexts such that

Comm   out(c, t).P1 ‖ in(c, x).P2 → P1 ‖ P2{t/x}

Then   if t = t then P else Q → P

Else   if t1 = t2 then P else Q → Q
       where t1, t2 are ground and t1 ≠E t2

The first rule (Comm) models communication: whenever a process is ready to output a term t on channel c, and another process, running in parallel, is ready to input on channel c, i.e., it starts with in(c, x), then a communication can take place and x is replaced by t. Rules Then and Else model the conditional. One may note that the Then rule requires syntactic equality of terms (if t = t). However, as internal reduction is closed under structural equivalence, this rule is equivalent to the rule

Then'  if t1 = t2 then P else Q → P
       where t1 =E t2

using structural equivalence in a similar way as in Example 5.1. One may also note that in the Else rule, contrary to the Then rule, we require t1, t2 to be ground. This is due to the fact that equality is closed under substitution while disequality is not: e.g., even though x ≠ y, we have that xσ = yσ for σ = {x/y}.

Example 5.2. We illustrate internal reduction by modelling the honest execution of the Needham-Schroeder public key protocol. For simplicity, we consider a naive model that only considers two honest participants:

νska, skb.( PA(ska, pk(skb)) ‖ PB(skb) )

→ νska, skb, na, nb.( in(c, x).
                      if fst(adec(x, ska)) = na then
                      let xnb = snd(adec(x, ska)) in
                      out(c, aenc(xnb, pk(skb)))
                    ‖ out(c, aenc(〈na, nb〉, pk(ska))).
                      in(c, z). if adec(z, skb) = nb then Q )

→ νska, skb, na, nb.( if na = na then out(c, aenc(nb, pk(skb)))
                    ‖ in(c, z). if adec(z, skb) = nb then Q )

→ νska, skb, na, nb.( out(c, aenc(nb, pk(skb)))
                    ‖ in(c, z). if adec(z, skb) = nb then Q )

→ νska, skb, na, nb.( if nb = nb then Q )

→ νska, skb, na, nb. Q
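The honest run reduced above can also be replayed concretely. In the following Python sketch (the tuple encoding of terms and all helper names are ours, purely illustrative), each assert corresponds to one of the conditionals discharged by the Then rule:

```python
# Terms as tuples; the destructors implement the equations used above.
def aenc(m, pub): return ("aenc", m, pub)
def pk(sk):       return ("pk", sk)
def pair(a, b):   return ("pair", a, b)
def fst(p):       return p[1]
def snd(p):       return p[2]

def adec(c, sk):
    assert c[0] == "aenc" and c[2] == pk(sk)
    return c[1]

ska, skb, na, nb = "ska", "skb", "na", "nb"

# Message 1, A -> B (first Comm step above)
x = aenc(pair(pk(ska), na), pk(skb))
yna = snd(adec(x, skb))                 # B's let-binding
# Message 2, B -> A (second Comm step)
x2 = aenc(pair(yna, nb), pk(ska))
assert fst(adec(x2, ska)) == na         # A's conditional: Then rule
xnb = snd(adec(x2, ska))
# Message 3, A -> B (third Comm step)
x3 = aenc(xnb, pk(skb))
assert adec(x3, skb) == nb              # B's conditional: Then rule
```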

5.3.2 Observational equivalence

In order to model security properties we often rely on the notion of observational equivalence. Examples of security properties relying on observational equivalence are provided in §6. Intuitively, two processes are observationally equivalent if they cannot be distinguished by an attacker. Here the attacker may be an arbitrary process written in the applied pi calculus. The formal definition of observational equivalence uses the concept of barbs: we write A ⇓ a if the process A is able to send a message on channel a, i.e., A →∗ C[out(a, t).P] for some evaluation context C[_] that does not bind a, some process P and term t.

Definition 5.1. Observational equivalence, denoted ≈, is the largest symmetric relation R on closed extended processes with the same domain such that if A R B then


1. if A ⇓ a then B ⇓ a;

2. if A→∗ A′ then there exists B′ such that B →∗ B′ and A′ R B′;

3. for all closing evaluation context C[_] we have that C[A] R C[B].

Intuitively, the aim of the attacker (modelled by the context C[_]) is to output on channel a whenever he believes that he interacts with A and not with B.

Example 5.3. Consider the following two processes A and B.

A = in(c, x). if x = 0 then out(c, 1)
B = in(c, x). if x = 0 then out(c, 0)

where 0 and 1 are constants. These two processes are not observationally equivalent, i.e., A ≉ B. A witness of the non-equivalence is for instance provided by the context

C[_] = out(c, 0). in(c, y). if y = 1 then out(a, 1) ‖ _

We indeed have that C[A] →∗ out(a, 1) and therefore C[A] ⇓ a, while C[B] can never emit on a.
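The interaction between the distinguishing context and each process can be simulated directly. In this minimal Python sketch (function names and the abstraction are ours), a process is reduced to the reply it gives to the context's input:

```python
# A process is abstracted as a function from the context's input on c
# to its reply on c (None when nothing is output).
def proc_A(x): return 1 if x == 0 else None
def proc_B(x): return 0 if x == 0 else None

def context(process):
    # the context outputs 0 on c, reads the reply y, and emits on
    # channel a only when y = 1 (i.e., only when facing A)
    y = process(0)
    return "barb on a" if y == 1 else None

assert context(proc_A) == "barb on a"   # C[A] exhibits the barb on a
assert context(proc_B) is None          # C[B] never emits on a
```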

Example 5.4. As another example consider the following processes A and B

A = in(c, x). νn. out(c, h(n))
B = in(c, x). νn. out(c, h(〈x, n〉))

where h is a free symbol, i.e., not appearing in the equational theory. One may think of h as modelling a hash function. These two processes are observationally equivalent, i.e., A ≈ B, even though this is not straightforward to prove.

5.3.3 Labelled semantics and labelled bisimulation

In the previous example we presented two processes A and B that are observationally equivalent. Showing observational equivalence formally is however tricky, as it requires showing that the processes behave in the same way for all contexts C[_]. This universal quantification over contexts is generally difficult to reason about and motivates the introduction of a labelled semantics, which allows processes to directly interact with the environment.


The labelled operational semantics defines the relation −α→, where α is either in(a, t) (a is a channel name and t is a term that can contain names and variables), νx.out(a, x) (x is a variable of base sort), out(a, c), or νc.out(a, c) (c is a channel name). The relation −α→ extends internal reduction (→) by the following rules:

In       in(a, x).P −in(a,t)→ P{t/x}

Out-Ch   out(a, c).P −out(a,c)→ P

Open-Ch  A −out(a,c)→ A′    c ≠ a
         ────────────────────────
         νc.A −νc.out(a,c)→ A′

Out-T    out(a, t).P −νx.out(a,x)→ P ‖ {t/x}    x ∉ fv(P) ∪ fv(t)

Scope    A −α→ A′    u does not occur in α
         ─────────────────────────────────
         νu.A −α→ νu.A′

Par      A −α→ A′    bv(α) ∩ fv(B) = bn(α) ∩ fn(B) = ∅
         ─────────────────────────────────────────────
         A ‖ B −α→ A′ ‖ B

Struct   A ≡ B    B −α→ B′    B′ ≡ A′
         ────────────────────────────
         A −α→ A′

The rule In allows the environment to input a term and annotates this transition with the label in(a, t). Intuitively, t is the recipe (as in §4.2.1) that allows one to deduce the term that is input. The fact that recipes do not contain restricted names is enforced by the rule Scope. We distinguish different ways of outputting terms. The distinction is due to the fact that channel names are handled differently from terms of base sort, to ensure that they do not appear in the frame. The rule Out-Ch simply outputs a public channel name. Open-Ch allows the output of a private channel name: this channel name is opened and becomes public by removing the "νc" specified in the label νc.out(a, c). The side-condition c ≠ a simply ensures that one cannot open a private channel by outputting it on itself. The rule Out-T allows the output of terms of base sort. Terms are output by reference, i.e., an output creates an entry {t/x} in the frame, allowing the environment to address t through the variable x. The label νx.out(c, x) indicates that x is a fresh variable that has been "opened" and can now be used by the environment. Scope and Par are used to close the labelled reduction under evaluation contexts, provided these contexts do not interfere with the bound names and variables. We provide examples below illustrating the importance of these side-conditions. Finally, the rule Struct states that labelled reduction is closed under structural equivalence.
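The by-reference discipline of Out-T can be sketched operationally as follows (a Python illustration; the frame-as-dictionary representation and all names are ours):

```python
frame = {}     # maps opened variables to the terms they stand for
counter = 0

def out_t(term):
    # Out-T: allocate a fresh variable x and record the active
    # substitution {term/x} in the frame; the label νx.out(c, x)
    # exposes only the handle x, never the term itself.
    global counter
    counter += 1
    x = f"x{counter}"
    frame[x] = term
    return x

h = out_t(("enc", "s", "k"))
assert h == "x1" and frame[h] == ("enc", "s", "k")
```

The environment then refers to the output term only through the handle, exactly as recipes refer to frame variables in the next example.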

Remark 5.1. Our definition of the labelled semantics slightly differs from the definition given in [Abadi and Fournet, 2001]. We prefer this presentation as it clarifies the distinction between channel names and terms of base sort. Delaune et al. [2010a] show that observational equivalence coincides for both semantics.

Example 5.5. Let A be the following process

A = νs. out(c, enc(s, k)). in(c, y). if y = s then P

where x, y ∉ fv(P). This models a simple challenge-response protocol: the protocol outputs a secret s encrypted with a key k and only proceeds if it receives s. However, the key k has not been declared private. Therefore an attacker can indeed provide s through the following transition sequence.

A −νx.out(c,x)→ νs. A1 −in(c,dec(x,k))→ νs. A2 → νs.(P ‖ {enc(s,k)/x})

where
A1 = in(c, y). if y = s then P ‖ {enc(s,k)/x}
A2 = if dec(x, k) = s then P ‖ {enc(s,k)/x}

The step νs. A1 −in(c,dec(x,k))→ νs. A2 can be shown to be a valid transition as follows.

In     in(c, y). if y = s then P −in(c,dec(x,k))→ if dec(x, k) = s then P

Par    A1 −in(c,dec(x,k))→ A2

Scope  νs. A1 −in(c,dec(x,k))→ νs. A2


We may now consider the process νk. A, which uses a private key k for the challenge-response. Then we have that

νk. A −νx.out(c,x)→ νk. νs. A1

However, the transition

νk. νs. A1 −in(c,dec(x,k))→ νk. νs. A2

is not valid. In particular, the above proof that νs. A1 −in(c,dec(x,k))→ νs. A2 cannot be extended using the rule Scope: the rule's side condition is violated because k does appear in the label in(c, dec(x, k)). In this way we ensure that the attacker may not directly use restricted names in the terms that are input. In other words, only deducible terms may be sent by the attacker.
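The restriction that recipes may not mention restricted names can be made executable. In the following Python sketch (the term encoding and all names are ours), a recipe is evaluated against the frame {enc(s,k)/x} of Example 5.5:

```python
# A frame νs.{enc(s,k)/x} with k free and s restricted, as in Example 5.5.
def enc(m, k): return ("enc", m, k)

def dec(c, k):
    # dec(enc(m, k), k) = m
    assert c[0] == "enc" and c[2] == k
    return c[1]

frame = {"x": enc("s", "k")}
restricted = {"s"}            # νs: the attacker cannot mention s itself

def eval_recipe(r):
    if r[0] == "var":         # a variable opened in the frame
        return frame[r[1]]
    if r[0] == "name":        # a name, allowed only when not restricted
        assert r[1] not in restricted, "recipe uses a restricted name"
        return r[1]
    if r[0] == "dec":
        return dec(eval_recipe(r[1]), eval_recipe(r[2]))

# k is free, so the recipe dec(x, k) deduces the secret s; under νk the
# name k would be added to `restricted` and the same recipe rejected.
assert eval_recipe(("dec", ("var", "x"), ("name", "k"))) == "s"
```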

As discussed above, proving observational equivalence is particularly tricky due to the universal quantification over all contexts. Therefore we now introduce a labelled bisimulation.

Definition 5.2. Labelled bisimilarity, denoted ≈ℓ, is the largest symmetric relation R on closed extended processes such that if A R B then we have

1. A ∼ B;

2. if A → A′ then there exists B′ such that B →∗ B′ and A′ R B′;

3. if A −α→ A′, fv(α) ⊆ Dom(A) and bn(α) ∩ n(B) = ∅, then there exists B′ such that B →∗ −α→ →∗ B′ and A′ R B′.

Conditions 2 and 3 correspond to the standard definition of (weak) bisimulation. The first condition requires that the processes are statically equivalent, i.e., the attacker cannot distinguish the sequences of terms output so far by the processes.

Abadi and Fournet [2001] show that labelled bisimulation and observational equivalence coincide.

Theorem 5.1 ([Abadi and Fournet, 2001]). Let A and B be two closed extended processes. Then A ≈ B if and only if A ≈ℓ B.


We can therefore use labelled bisimulation as a proof technique to show that two processes are (or are not) observationally equivalent. Labelled bisimulation indeed avoids the universal quantification over all possible contexts. One may nevertheless notice that the transition system defined by the labelled reduction relation is infinitely branching, as the input rule allows an infinite number of possible terms to be sent by the adversary.

We now revisit Examples 5.3 and 5.4.

Example 5.6. Recall the processes

A = in(c, x). if x = 0 then out(c, 1)
B = in(c, x). if x = 0 then out(c, 0)

defined in Example 5.3. To show that these processes are not observationally equivalent we provided a context that distinguishes them. We now give an alternative proof of this result using labelled bisimulation.

We proceed by contradiction. Suppose that A ≈ℓ B. We have that A −in(c,0)→ → −νx.out(c,x)→ A′ where A′ = {1/x}. Then, because A ≈ℓ B, there must exist B′ such that B −in(c,0)→ →∗ −νx.out(c,x)→ →∗ B′ and A′ ≈ℓ B′. In the given example the process B′ such that B −in(c,0)→ →∗ −νx.out(c,x)→ →∗ B′ is uniquely defined (up to ≡) as {0/x}. However, A′ ≁ B′, violating condition 1 of Definition 5.2, hence contradicting that A ≈ℓ B.

Example 5.7. In Example 5.4 we defined processes

A = in(c, x). νn. out(c, h(n))
B = in(c, x). νn. out(c, h(〈x, n〉))

which we claimed to be observationally equivalent. We now define the relation R on closed extended processes as

R = {(A, B)}
  ∪ {(A′, B′) | A′ ≡ νn. out(c, h(n)), B′ ≡ νn. out(c, h(〈t, n〉)), t ground}
  ∪ {(A′, B′) | A′ ≡ νn. {h(n)/x}, B′ ≡ νn. {h(〈t,n〉)/x}, t ground, x ∈ X}


Provided that for any ground term t we have that

νn. {h(n)/x} ∼ νn. {h(〈t,n〉)/x}

it is easy to verify that R satisfies the three conditions of Definition 5.2. Hence, as A R B, we have that A ≈ℓ B. We see that labelled bisimulation can be used to reduce the proof of observational equivalence to a proof of a (generally infinite) family of static equivalences; in this example the static equivalence must hold for all ground terms t.


5.4 Exercises

Exercise 13 (**). Consider the previously introduced process P^6_nspk, which models the Needham-Schroeder public key protocol.

• Give the labelled transition sequence that witnesses Lowe's man-in-the-middle attack.

• Provide the attacker context that allows one to witness this attack using only internal reduction.

Exercise 14 (***). The following message exchange describes the Woo-Lam mutual authentication protocol, which establishes a fresh symmetric key using a trusted server S.

1. A → B : A,Na

2. B → A : B,Nb

3. A → B : {A,B,Na, Nb}Kas

4. B → S : {A,B,Na, Nb}Kas , {A,B,Na, Nb}Kbs

5. S → B : {B,Na, Nb,Kab}Kas , {A,Na, Nb,Kab}Kbs

6. B → A : {B,Na, Nb,Kab}Kas , {Na, Nb}Kab

7. A → B : {Nb}Kab

For any agent C we assume Kcs to be a symmetric key whose value is initially known only by agent C and the server S.

• Provide a reasonable and general model of this protocol in the applied pi calculus.

• Show that the protocol admits an attack: the intruder I makes B believe that he is executing the protocol with A (while A did not send any message). Hints:

– The attack requires interleaving two sessions of the B role (and none for A and the server).

– Some ciphertexts may be replaced by random values.

– We suppose that A cannot distinguish the session key Kab from a nonce.


• Show that your model of the protocol includes this attack by providing a labelled reduction sequence as a witness.


6 Security properties

In this section we discuss how several important security properties can be modelled. Before defining the security properties we introduce the notion of events. Events are used to annotate processes and are useful to define properties about the executions. Then we discuss reachability properties: (a weak version of) confidentiality, or secrecy, and different flavors of authentication. Next we show how equivalence properties may be used to express a variety of security properties, including strong notions of secrecy and privacy-preserving properties, as well as more general properties expressed as indistinguishability from an ideal system.

6.1 Events

We first enrich the process calculus with events. Events are simply annotations to make statements about which part of the protocol has been reached. For this we add an event construct to the grammar of plain processes:

event e(t).P

where e is taken from the set of events, a set of unary function symbols distinct from the term algebra signature, and t is a term used to provide parameters to the event. We could easily consider events of arbitrary arity, but for notational simplicity we consider only unary ones in the definitions. Multiple parameters can be encoded using pairs in the equational theory. Similarly, an event of arity 0 can simply be given a constant for t. In what follows, by abuse of notation, we may use events with an arbitrary number of parameters, relying on this encoding.

As events are merely annotations, they should not interfere with the process execution. We therefore simply extend internal reduction by the rule

Event event e(t). P → P

We also define what it means for an event to occur.

Definition 6.1. A reduction sequence R = A0 −(α1)→ A1 ··· −(αn)→ An satisfies the event e(t) at step i if and only if Ai−1 ≡ C[event e(t).P] and Ai ≡ C[P] for some C[_] and P.

We say that R satisfies the event e(t) if there exists i such that R satisfies the event e(t) at step i.
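Over a recorded execution, Definition 6.1 amounts to a simple membership check. A minimal Python sketch (the trace-as-list representation is ours):

```python
# A reduction sequence is abstracted as the ordered list of events it
# raised; satisfaction of e(t) is membership at some step.
def satisfies(trace, event, arg):
    return any(step == (event, arg) for step in trace)

trace = [("begin", "a"), ("ded", "na")]
assert satisfies(trace, "ded", "na")
assert not satisfies(trace, "ded", "nb")
```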

6.2 Secrecy

In many approaches for symbolic protocol analysis, secrecy is expressed as a reachability property: the adversary may not reach a state where he knows a secret s, i.e., where the secret s is deducible from the current adversary knowledge. This form of secrecy is sometimes called weak secrecy. Later in this chapter we discuss stronger flavors of secrecy expressed as particular equivalence properties.
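Deducibility of a secret can be approximated by saturating the attacker's knowledge. The following naive Python sketch is ours (it covers only pairing and symmetric decryption, not a complete deduction system):

```python
# Close a set of observed terms under projection of pairs and symmetric
# decryption with known keys, then test deducibility of the secret.
def saturate(knowledge):
    kn = set(knowledge)
    changed = True
    while changed:
        changed = False
        for t in list(kn):
            derived = []
            if isinstance(t, tuple) and t[0] == "pair":
                derived += [t[1], t[2]]
            if isinstance(t, tuple) and t[0] == "enc" and t[2] in kn:
                derived.append(t[1])      # dec(enc(m, k), k) = m
            for d in derived:
                if d not in kn:
                    kn.add(d)
                    changed = True
    return kn

# If the key k leaks, the secret s becomes deducible: weak secrecy fails.
assert "s" in saturate([("enc", "s", "k"), "k"])
# With k kept secret, s stays out of the attacker's reach.
assert "s" not in saturate([("enc", "s", "k")])
```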

A first remark is that, given a process P, the secrecy of a term t is actually not well defined, as illustrated by the following example.

Example 6.1. Consider the process P = P1 ‖ P2 ‖ P3 where

P1 = νn. out(c, n)
P2 = νn. out(c, h(n))
P3 = out(c, h(n))

The question of whether n is secret in P is not well defined. Actually, formally, n may only refer to the n used in P3. However, we may wish to express that the n used in P1 is secret (which is not the case) and similarly for P2. However, n is bound in P1 and in P2 and may be α-converted at will.

To avoid the problem of being unable to address bound names, we add a monitor process that raises an event in case the secret is deducible by the adversary.

Definition 6.2. Let A be a closed extended process such that the event ded does not occur in A and s ∈ n(A). Let As = νs.(A ‖ in(c, x). if x = s then event ded(s)). We say that s is (weakly) secret in A if and only if no reduction sequence R = As −(α1)→ A1 ··· −(αn)→ An satisfies ded(s).

Note that for simplicity we restrict this definition to the secrecy of nonces that are restricted at the top level, i.e., the "νs" is placed in front of the process. This definition can be generalized to compound terms as long as all bound names appearing in the term are restricted at the top level. When restrictions appear under replication a general definition is more complicated, but in many particular cases the corresponding monitor process is straightforward to add.

We may sometimes abuse language and talk about the secrecy of a restricted name to mean the secrecy of the free name in the process where the restriction of this name has been removed. We will only do so when this is not ambiguous, i.e., the name is bound once and does not appear free.

Example 6.2. Consider again the Needham-Schroeder public key protocol and its modelling by the process P^6_nspk defined above. One may want to check whether the nonces na and nb are secret or not. To avoid referring to a nonce under a replication, we may note that by unrolling some replications we have that

P^6_nspk ≡ νsk.( in(c, xpk).P′A(sk, xpk) ‖ !in(c, xpk).PA(sk, xpk) ‖
                 !PB(sk) ‖ out(c, pk(sk)) ) ‖ P^6_nspk

where P′A(sk, xpk) is obtained from PA(sk, xpk) by α-converting na to n′a. We may regard P′A(sk, xpk) as a test session, which has been chosen without loss of generality, as it was obtained through structural equivalence.

We denote by A the process obtained from the above process by removing the restriction on n′a. We will also suppose that B is a similar construction, but defining a test session for the responder, which allows us to reason about the secrecy of n′b.

One might expect that n′a is secret in A while n′b is not secret in B due to the man-in-the-middle attack. However, as stated, both nonces may be divulged to the attacker. Indeed, if a starts a session with the attacker, the nonce n′a is trivially leaked.

More formally, we have that the reduction sequence

An′a = νn′a.( A ‖ in(c, x). if x = n′a then event ded(n′a) )

−in(c,pk(skc))→ −νx.out(c,x)→
    νn′a.νsk.( A′ ‖ in(c, x). if x = n′a then event ded(n′a) ‖
               {aenc(〈pk(sk),n′a〉,pk(skc))/x} )

−in(c,snd(adec(x,skc)))→ →
    νn′a.νsk.( A′ ‖ event ded(n′a) ‖ {aenc(〈pk(sk),n′a〉,pk(skc))/x} )

→ νn′a.νsk.( A′ ‖ {aenc(〈pk(sk),n′a〉,pk(skc))/x} )

witnesses the attack on the secrecy of n′a.This example shows that one has to be careful when stating the

security properties to be checked. One way to avoid the nonces to betrivially leaked in this way is to enforce the test session between honestparticipants and define

P^na_nspk = νsk.( P′A(sk, pk(sk)) ‖ !in(c, xpk).PA(sk, xpk) ‖ !PB(sk) ‖ out(c, pk(sk)) ) ‖ P^6_nspk

We can define P^nb_nspk similarly, but we have to alter P′B to additionally include the test

if pk(sk) = fst(adec(y, skr)) then

just before the creation of n′b (νn′b). This results in a process P′B that only accepts sessions with the honest participants.


Now we indeed have that P^na_nspk preserves the secrecy of n′a while P^nb_nspk does not preserve the secrecy of n′b because of the man-in-the-middle attack.
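Weak secrecy statements of this kind can be experimented with directly in ProVerif, whose `query attacker(...)` construct corresponds to the deduction-based notion of secrecy. The fragment below is an illustrative sketch only — all names are ours, and only the initiator's first message of a test session between the two honest participants is kept:

```proverif
free c: channel.

type skey.
fun pk(skey): bitstring.
fun aenc(bitstring, bitstring): bitstring.
reduc forall m: bitstring, k: skey; adec(aenc(m, pk(k)), k) = m.

(* Can the attacker deduce the nonce na of the test session? *)
query attacker(new na).

(* Test session between the honest initiator a and honest responder b,
   in the spirit of the process above; only the first message is modelled. *)
process
  new ska: skey; new skb: skey;
  out(c, pk(ska)); out(c, pk(skb));
  new na: bitstring;
  out(c, aenc((pk(ska), na), pk(skb)))
```

Since skb stays restricted, ProVerif can prove this query for the fragment; replacing pk(skb) in the output by an attacker-supplied key mirrors the trivial leak discussed above.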

6.3 Authentication

Authentication is among the most important security properties. One sometimes distinguishes between entity authentication and message authentication. We mostly focus on the former. Roughly, the aim is that an attacker is not able to impersonate an entity A when communicating with an entity B. The ISO/IEC-9798-1 standard defines the goal of (entity) authentication as follows:

Entity authentication mechanisms allow the verification, of an entity’s claimed identity, by another entity. The authenticity of the entity can be ascertained only for the instance of the authentication exchange.

This kind of natural language specification leaves however much room for interpretation. The interested reader may refer to Gollmann’s discussion [Gollmann, 1996] of different interpretations that may lead to “attacks”, depending on which interpretation is chosen. We define below several flavors of authentication, following Lowe’s hierarchy of authentication properties [Lowe, 1997a].

6.3.1 Correspondence properties

Woo and Lam [1992] introduced the use of correspondence properties to model authentication. Intuitively, a correspondence property states that if an event e has happened then an event e′ must have happened before. To model authentication the event e is typically a statement of the form “B accepted a run of the protocol” and the event e′ is a statement of the form “A started a run of the protocol”. Blanchet [2009] formalized correspondence properties in the applied pi calculus and proposed a method for automatically verifying these properties in the ProVerif tool. We present here a slightly simplified version of this formalization which is sufficient to express the different notions of authentication.

Correspondence properties are statements of the form

e(t) ⇝ e′(t′)

Intuitively, such a property holds on a process A if in any execution of A, whenever an instance of the event e(t), say e(tσ), occurred, then the corresponding instance of e′, i.e. e′(t′σ), occurred before. More formally we define the satisfaction of a correspondence property as follows.

Definition 6.3. A closed extended process A satisfies the correspondence property e(t) ⇝ e′(t′) if and only if for all reductions R = A —(α1)→ A1 … —(αn)→ An we have that if R satisfies e(tσ) for some σ with Dom(σ) = fv(t, t′) then R satisfies e′(t′σ).

One may note that in the definition we did not explicitly require that e′ occurred before e. This is however an immediate consequence: suppose that the definition holds and that there is a reduction where e′(t′σ) only occurs after e(tσ); as the definition quantifies over all reductions, the smallest prefix of this reduction satisfying e(tσ) would violate the definition.

We sometimes need to consider properties where every occurrence of the event e needs to match a different occurrence of e′. This requirement is formalized by the use of injective correspondence properties which are of the form

e(t) ⇝inj e′(t′)

and whose satisfaction is defined as follows.

Definition 6.4. A closed extended process A satisfies the injective correspondence property e(t) ⇝inj e′(t′) if and only if for all reductions R = A —(α1)→ A1 … —(αn)→ An, there exists a partial, injective function φ : {1, …, n} → {1, …, n} such that if R satisfies e(tσ) for some σ with Dom(σ) = fv(t, t′) at step i, then R satisfies e′(t′σ) at step φ(i).

Injective correspondence properties protect against replay attacks, as exemplified in the next section.


6.3.2 Authentication as correspondence properties

We review the different flavors of authentication identified in Lowe’s authentication hierarchy [Lowe, 1997a] and show how they can be modelled as correspondence properties. For each type of authentication we first recall the natural language definition given by Lowe (up to some minor rewording) and then illustrate how each of these definitions can be formalized as a correspondence property on the Needham Schroeder public key protocol.

Throughout this section we suppose that the processes PA and PB, introduced before for modeling the initiator and responder roles of the Needham Schroeder public key protocol, are annotated using events e^a_begin(t^a_begin), e^a_accept(t^a_accept), e^b_begin(t^b_begin), and e^b_accept(t^b_accept) that are placed as follows.

PA(ski, pkr) = νna. out(c, aenc(〈pk(ski), na〉, pkr)).
               event e^a_begin(t^a_begin).
               in(c, x). if fst(adec(x, ski)) = na then
               let xnb = snd(adec(x, ski)) in
               event e^a_accept(t^a_accept).
               out(c, aenc(xnb, pkr))

PB(skr) = in(c, y). let pki = fst(adec(y, skr)) in
          let yna = snd(adec(y, skr)) in
          νnb. event e^b_begin(t^b_begin).
          out(c, aenc(〈yna, nb〉, pki)).
          in(c, z). if adec(z, skr) = nb then
          event e^b_accept(t^b_accept). Q

Note that the terms t^a_begin, t^b_begin, t^a_accept, t^b_accept are left unspecified for the moment. They will be instantiated according to which flavor of authentication we wish to express.
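In ProVerif the same annotations are written as `event` declarations plus `event` statements placed in the role processes. The transcription below is a sketch under our own naming, with the event parameters instantiated to the pair of claimed identities (the weak-agreement choice discussed later; other instantiations only change the event arguments):

```proverif
free c: channel.

type skey.
fun pk(skey): bitstring.
fun aenc(bitstring, bitstring): bitstring.
reduc forall m: bitstring, k: skey; adec(aenc(m, pk(k)), k) = m.

(* Events carrying the claimed initiator and responder identities *)
event eaBegin(bitstring, bitstring).
event eaAccept(bitstring, bitstring).
event ebBegin(bitstring, bitstring).
event ebAccept(bitstring, bitstring).

let pA(ski: skey, pkr: bitstring) =
  new na: bitstring;
  out(c, aenc((pk(ski), na), pkr));
  event eaBegin(pk(ski), pkr);
  in(c, x: bitstring);
  (* accept only if the first component is our nonce na *)
  let (=na, xnb: bitstring) = adec(x, ski) in
  event eaAccept(pk(ski), pkr);
  out(c, aenc(xnb, pkr)).

let pB(skr: skey) =
  in(c, y: bitstring);
  let (pki: bitstring, yna: bitstring) = adec(y, skr) in
  new nb: bitstring;
  event ebBegin(pki, pk(skr));
  out(c, aenc((yna, nb), pki));
  in(c, z: bitstring);
  if adec(z, skr) = nb then
  event ebAccept(pki, pk(skr)).

(* For brevity only sessions with the honest partner are spawned here;
   a full model also lets pA receive an attacker-chosen key. *)
process
  new ska: skey; new skb: skey;
  out(c, pk(ska)); out(c, pk(skb));
  ( !pA(ska, pk(skb)) | !pB(skb) )
```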

We now review several variants of authentication properties.


Aliveness The weakest form of authentication is aliveness.

Definition 6.5 (Aliveness [Lowe, 1997a]). A protocol satisfies aliveness if, whenever an honest agent A completes a run of the protocol, apparently with another honest agent B, then B has previously run the protocol.

To specify aliveness for the initiator we let t^a_accept = pkr and t^b_begin = pk(skr). The process P^5_nspk guarantees aliveness to the initiator if the correspondence property

e^a_accept(pk(skb)) ⇝ e^b_begin(pk(skb))

is satisfied. This correspondence property indeed expresses that whenever the initiator successfully finishes his protocol with b (e^a_accept(pk(skb)) is satisfied) then b previously initiated a run of the protocol (e^b_begin(pk(skb)) is satisfied). This property is indeed satisfied by P^6_nspk.

We note that the more general correspondence property

e^a_accept(x) ⇝ e^b_begin(x)

would be trivially violated, as the attacker could act as the responder and would not execute the event e^b_begin, i.e., we can only expect authentication to hold between honest agents. This is similar to the use of test sessions when specifying secrecy.
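In ProVerif, correspondence properties of this shape are written with `==>`. A sketch of the query syntax, assuming events `eaAccept` and `ebBegin` carrying the claimed initiator and responder identities have been declared in the model (the injective variant is shown for comparison):

```proverif
(* Event declarations assumed to be part of the model *)
event eaAccept(bitstring, bitstring).
event ebBegin(bitstring, bitstring).

(* Non-injective correspondence  e^a_accept(...) ⇝ e^b_begin(...) *)
query xi: bitstring, xr: bitstring;
  event(eaAccept(xi, xr)) ==> event(ebBegin(xi, xr)).

(* Injective variant, ⇝inj, additionally ruling out replays *)
query xi: bitstring, xr: bitstring;
  inj-event(eaAccept(xi, xr)) ==> inj-event(ebBegin(xi, xr)).
```

As discussed above, the fully general query is trivially false when the attacker may play the responder; in practice the accept event is emitted only in sessions with honest partner identities.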

As the Needham Schroeder protocol is designed to guarantee mutual authentication we can also express that the protocol guarantees aliveness to the responder by defining t^b_accept = pki, t^a_begin = pk(ski) and requiring

e^b_accept(pk(ska)) ⇝ e^a_begin(pk(ska))

to hold. Again this property is satisfied by P^5_nspk. The fact that the property holds illustrates the weak nature of aliveness. It does not capture the man in the middle attack: indeed aliveness simply requires a to be alive, which is indeed the case in the man in the middle attack.


Weak agreement As we have seen, aliveness fails to capture some attacks. We therefore define the notion of agreement, stating that the initiator and responder should agree on their identities.

Definition 6.6 (Weak agreement [Lowe, 1997a]). A protocol guarantees weak agreement if, whenever an honest agent A completes a run of the protocol, apparently with another honest agent B, then B has previously been running the protocol, apparently with A.

Weak agreement for the initiator is expressed by including both identities as parameters to the events: we define t^a_accept = 〈pk(ski), pkr〉 and t^b_begin = 〈pki, pk(skr)〉 and require the property

e^a_accept(〈pk(ska), pk(skb)〉) ⇝ e^b_begin(〈pk(ska), pk(skb)〉)

to hold.

Similarly, to express weak agreement for the responder we let t^b_accept = 〈pki, pk(skr)〉 and t^a_begin = 〈pk(ski), pkr〉 and require that

e^b_accept(〈pk(ska), pk(skb)〉) ⇝ e^a_begin(〈pk(ska), pk(skb)〉)

hold.

The process P^5_nspk guarantees weak agreement to the initiator, but not to the responder: in the man in the middle attack the responder accepts a run between a and b while the initiator started a run between a and the attacker. We see that this notion of authentication successfully captures this attack.

Non-injective agreement Sometimes, it is not sufficient to authenticate the other user’s identity. We may also want to ensure that both parties agree on some other messages, e.g. the nonces na and nb in the Needham Schroeder public key protocol. Therefore we may want to parametrize the definition by a set of data which the parties should agree on.

Definition 6.7 (Non-injective agreement [Lowe, 1997a]). A protocol guarantees non-injective agreement if, whenever an honest agent A completes a run of the protocol, apparently with another honest agent B, on a set of data d, then B has previously been running the protocol, apparently with A, with the same set of data d.


Generally one expects the participants to agree on all atomic data. This property is sometimes called full agreement, or just agreement without specifying d. Agreement for the initiator could be expressed by defining t^a_accept = 〈pk(ski), 〈pkr, 〈na, xnb〉〉〉 and t^b_begin = 〈pki, 〈pk(skr), 〈yna, nb〉〉〉 and requiring the property

e^a_accept(〈pk(ska), 〈pk(skb), 〈x, y〉〉〉) ⇝ e^b_begin(〈pk(ska), 〈pk(skb), 〈x, y〉〉〉)

to hold. Agreement for the responder is modelled in a similar way.

Injective agreement Non-injective agreement may still not be sufficiently strong. Consider the following protocol for sharing a symmetric key:

A → B : sign(aenc(k, pk(B)), prv(A))

A generates a fresh key k, encrypts it with B’s public encryption key and signs the encryption with his own private signing key.

This protocol should guarantee full (non-injective) agreement, i.e., both A and B agree on the identities and the key. However, an attacker may replay A’s message and trick B into using a previously used key. This key might however have been compromised since its first usage, maybe because of a long brute force attack, or careless disposal of previous key material. Therefore, one may want to ensure that such replay attacks are not possible, which motivates the notion of injective agreement.

Definition 6.8 (Injective agreement [Lowe, 1997a]). A protocol guarantees injective agreement if, whenever an honest agent A completes a run of the protocol, apparently with another honest agent B, on a set of data d, then B has previously been running the protocol, apparently with A, with the same set of data d, and each such run of A corresponds to a unique run of B.

Injective agreement is modelled as non-injective agreement, but we replace ⇝ with ⇝inj. The above described protocol would obviously violate this property as the attacker could replay A’s message, resulting in several identical accept events for a single begin event. This kind of replay attack is avoided in the (Lowe-)Needham Schroeder public key protocol by the nonce handshake, which guarantees freshness of the messages.

6.4 Equivalence properties

6.4.1 Strong flavors of confidentiality

Expressing confidentiality properties in terms of deduction is often too weak. Indeed a term is secret unless the complete term may be known to the adversary. Consider for instance a function h that when applied to a pair returns the first element of the pair, modelled by the equation h(〈x, y〉) = x. Given h(t), the term t would be declared secret according to our previous definition. If the hash were applied to your credit card number, modelled as the nested pairs of each of the digits, this would certainly not be an acceptable notion of security, as the adversary would learn all but one digit.

This is why Blanchet introduced the notion of strong secrecy [Blanchet, 2004], which is inspired by the definitions of semantic security in cryptography. Strong secrecy requires that two executions of the protocol are indistinguishable, even if the adversary may choose the value of the secret in each of the two runs.

Definition 6.9 (strong secrecy). Let P be an extended process with fv(P) = {x}. We say that x is strongly secret in P if and only if

in(c, x1). in(c, x2). P{x1/x} ≈ in(c, x1). in(c, x2). P{x2/x}

The fact that the adversary chooses the secret value is expressed by the input of two possible values as a preamble to the protocol.

Example 6.3. Consider the protocol A that simply outputs the public key encryption of a secret.

A = νsk. out(c, aenc(x, pk(sk))) ‖ {pk(sk)/y}

The active substitution models that the public key is known to the adversary. It is easy to see that x is not strongly secret in A, i.e.,

in(c, x1). in(c, x2). A{x1/x} ≉ in(c, x1). in(c, x2). A{x2/x}


Indeed, it is sufficient for the attacker to provide two constants c1 and c2 as input values and compare the output with aenc(c1, y), the encryption of c1 with the public key. One may note that the process νs.A{s/x} does preserve the weak secrecy of s.

This example also illustrates that our modeling of encryption is too naive, as it does not reflect the randomness of (most) asymmetric encryption schemes. While this abstract view of encryption is generally sufficient for trace properties, it is no longer the case when considering indistinguishability properties. If we consider an updated model where aenc is a ternary function with an explicit argument for the randomness (and the expected, updated equational theory) we indeed have that

A = νsk. νr. out(c, aenc(x, r, pk(sk))) ‖ {pk(sk)/y}

satisfies strong secrecy of x. This property can be easily checked using a tool such as ProVerif.
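ProVerif has a dedicated `noninterf` declaration for strong secrecy, checking exactly this two-run indistinguishability. A minimal sketch under the randomized encryption model just described (all names are illustrative):

```proverif
free c: channel.

type skey.
type rand.
fun pk(skey): bitstring.
(* Randomized asymmetric encryption: explicit randomness argument *)
fun aenc(bitstring, rand, bitstring): bitstring.
reduc forall m: bitstring, r: rand, k: skey;
  adec(aenc(m, r, pk(k)), k) = m.

free x: bitstring [private].

(* Strong secrecy of x *)
noninterf x.

process
  new sk: skey;
  out(c, pk(sk));
  new r: rand;
  out(c, aenc(x, r, pk(sk)))
```

Dropping the randomness argument r reintroduces the comparison attack of Example 6.3, and ProVerif then reports that noninterference fails.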

Another flavor of secrecy of interest is real-or-random secrecy. In a first phase, the adversary is allowed to interact with the protocol at will. In a second phase the adversary is given either the real secret, or a fresh randomly generated secret. If the adversary is unable to distinguish these two scenarios, we say that the protocol satisfies real-or-random secrecy.

Definition 6.10 (real-or-random secrecy). Given a closed extended process A, such that s ∉ bn(A), we say that νs.A satisfies real-or-random secrecy of the name (of base type) s if and only if for all B and reduction sequences νs.A —α1→ νs.A1 —α2→ … —αn→ νs.An = νs.B such that s ∉ bn(Ai) for 1 ≤ i ≤ n and no α-conversion in this reduction sequence involved s, we have that

νs.(φ(B) ‖ {s/x}) ∼ νs.(φ(B) ‖ νs′.{s′/x})

Note that we need to suppose that no α-conversion involved the value s in the reductions in order to make sure we still refer to the right name. One may note that real-or-random secrecy is strictly weaker than strong secrecy.


Proposition 6.1. Let A be an extended process, such that fv(A) = {x}, x is of base type and s ∉ bn(A) ∪ n(A). If x is strongly secret in A then νs.A{s/x} satisfies real-or-random secrecy of s. The converse is not true in general.

The proof of this proposition relies on the following two lemmas. The first lemma states that any transition under a restriction is also valid without the restriction.

Lemma 6.1. Let s be a name of base type. If νs.A —α→ νs.B (where α may be empty, denoting an internal reduction) and s ∉ bn(A) ∪ bn(B) and no α-conversion in this sequence involved s, then A —α→ B.

Proof. The proof is done by induction on the proof tree of νs.A —α→ νs.B.

The second lemma is a result on static equivalence.

Lemma 6.2. For any frames ϕ, ϕ′ we have that if ϕ ∼ ϕ′{s′/s} then νs.(ϕ | {s/x}) ∼ νs.ϕ | νs′.{s′/x}.

Proof. We have that

   ϕ ∼ ϕ′{s′/s}
⇒ νs.ϕ ∼ νs.ϕ′{s′/s}                                 by Lemma 4.4
⇔ νs.ϕ ∼ ϕ′{s′/s}                  as s does not occur in ϕ′{s′/s}
⇔ νs.ϕ ∼ ϕ                           as, by hypothesis, ϕ ∼ ϕ′{s′/s}
⇔ νs′.ϕ{s′/s} ∼ ϕ                                   by α-conversion
⇒ νs.νs′.(ϕ{s′/s} | {s/x}) ∼ νs.(ϕ | {s/x})          by Lemma 4.4
⇔ νs′.ϕ{s′/s} | νs.{s/x} ∼ νs.(ϕ | {s/x})   as s does not occur in ϕ{s′/s}
⇔ νs.ϕ | νs′.{s′/x} ∼ νs.(ϕ | {s/x})                by α-conversion

We can now prove Proposition 6.1.

Proof. We will prove the contrapositive. Suppose that νs.A{s/x} does not satisfy real-or-random secrecy of s: there exist A1, …, An such that

νs.A{s/x} —α1→ νs.A1 … —αn→ νs.An


and

νs.(φ(An) ‖ {s/x}) ≁ νs.(φ(An) ‖ νs′.{s′/x})    (6.1)

Note that s may not occur in any label αi as s is a restricted name of base type. We have that

in(c, x1). in(c, x2). A{x1/x} —in(c, s)→ —in(c, s′)→ A{s/x}

and by Lemma 6.1 we also have that

A{s/x} —α1→ A1 … —αn→ An

Moreover, for any A′ such that

in(c, x1). in(c, x2). A{x2/x} —in(c, s)→ —in(c, s′)→ —α1→ ⋯ —αn→ A′

we have that s does not occur in A′, as s does not occur in A nor in any label αi and A inputs on x2 (therefore inputs s′ and not s). Hence, φ(A′) = φ(A′){s′/s}. Applying Lemma 6.2 to the static inequivalence (6.1) with ϕ = φ(An) and ϕ′ = φ(A′) we have that φ(An) ≁ φ(A′){s′/s}. As φ(A′) = φ(A′){s′/s}, we conclude

in(c, x1). in(c, x2). A{x1/x} ≉ℓ in(c, x1). in(c, x2). A{x2/x}

To see that the converse direction of the proposition does not hold, consider the following counter-example. Let c1 and c2 be two constants and

A = in(c, y). if y = x then out(c, c1) else out(c, c2)

The process A does not preserve strong secrecy of x, as the adversary may force execution of the then branch on one side and the else branch on the other side. However, the process A does preserve real-or-random secrecy of x.

Real-or-random secrecy is also useful to reason about password-based protocols. Password-based protocols may be subject to offline dictionary attacks. Offline dictionary attacks consist of two phases: in the first phase an attacker interacts with a protocol and collects some data; in the second phase the adversary tries all possible passwords in a dictionary and uses the collected data to verify whether the current entry is the correct one or not. When a protocol satisfies real-or-random secrecy of a password, it resists offline dictionary attacks. Indeed the first phase of an offline dictionary attack is captured in Definition 6.10 by universally quantifying over B. The second phase is captured by the static equivalence

νs.(φ(B) ‖ {s/x}) ∼ νs.(φ(B) ‖ νs′. {s′/x})

Intuitively, the variable x can be interpreted as the dictionary entry the adversary is currently checking and the equivalence can be read as “trying the right entry in the dictionary is indistinguishable from trying another entry”.

First definitions for offline dictionary attacks were proposed by Lowe [2004] and Delaune and Jacquemard [2006]. Modeling offline dictionary attacks using static equivalence was first proposed by Corin et al. [2005]. Automated verification of real-or-random secrecy was shown decidable by Baudet [2005] for a bounded number of sessions and its verification is also supported by the ProVerif tool for an unbounded number of sessions [Blanchet, 2004].
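ProVerif exposes this analysis through its `weaksecret` declaration: the declared name is treated as a guessable password, and ProVerif checks that the attacker cannot verify a guess offline. A deliberately flawed sketch (names are ours):

```proverif
free c: channel.

fun h(bitstring): bitstring.

(* The password, assumed to be weak (drawn from a dictionary) *)
free w: bitstring [private].

(* Resistance against offline dictionary attacks on w *)
weaksecret w.

(* A protocol that does NOT resist: it publishes a deterministic
   hash of the password, which lets a guess be tested offline
   by recomputing the hash. *)
process
  out(c, h(w))
```

On this model ProVerif should report an offline guessing attack; replacing the deterministic hash by, e.g., a keyed or randomized construction removes the verifier the attack relies on.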

6.4.2 Privacy properties

In the literature one may find many examples of privacy properties that can be expressed in terms of process equivalences. These definitions include anonymity (towards external parties) in authentication protocols [Abadi and Fournet, 2004], privacy in e-voting [Delaune et al., 2009b] and e-auction protocols [Dong and Pang, 2011, Dreier et al., 2013], unlinkability in RFID protocols [Arapinis et al., 2010, Brusò et al., 2010], etc. We do not aim to give a general definition here, as it would result in a vague and very abstract definition. Instead, we provide a few examples.

Private authentication The notion of private authentication was introduced by Abadi and Fournet [2004]. The idea is that A may try to authenticate to B. B only accepts authentication requests from a given set of users, and refuses the authentication request if A is not a member of this set. In addition to authentication this protocol should protect the anonymity of the users trying to connect and must hide whether the authentication request succeeded or not, i.e. hide whether a given identity is in the set or not.

We present here a simplified protocol proposed by Cheval [2012]. We define the following processes:

A(ska, pkb) = νna. out(c, aenc(〈na, pk(ska)〉, pkb)). in(c, x)

B(skb, pka) = νnb. in(c, y). let z = adec(y, skb) in
              if snd(z) = pka
              then out(c, aenc(〈fst(z), 〈nb, pk(skb)〉〉, pka))
              else out(c, aenc(nb, pka))

The process A(ska, pkb) models the initiator with identity a who wishes to authenticate to b. For this he sends a fresh nonce na and his public key pk(ska), encrypted with b’s public key, to b. The process B(skb, pka) models the responder b willing to get authentication requests from a. If the request contains a’s public key, b responds with the message aenc(〈fst(z), 〈nb, pk(skb)〉〉, pka). Otherwise b sends a decoy message aenc(nb, pka). This decoy message should be indistinguishable from a valid message to any party except a. In particular, an attacker sending an initial message with a’s public key should be unable to know whether a request from a is accepted by b or not. We can formally state this property as follows:

νska, ska′, skb. (ϕ ‖ A(ska, pk(skb)) ‖ B(skb, pk(ska)))
≈
νska, ska′, skb. (ϕ ‖ A(ska′, pk(skb)) ‖ B(skb, pk(ska′)))

where ϕ = {pk(ska)/xa, pk(ska′)/xa′, pk(skb)/xb} ensures that the public keys are known to the adversary. The equivalence states that the attacker cannot distinguish the case where a is allowed to authenticate from the case where a′ is. One may note that the attacker may spoof requests from honest parties, as they only require the knowledge of the public keys. For a complete modelling, one should however also consider semi-dishonest sessions where B is willing to answer requests from a dishonest agent. Interestingly, privacy is lost if the attacker can check message length. Indeed, aenc(〈na, 〈nb, pk(skb)〉〉, pk(ska)) is likely to be longer than aenc(nb, pk(ska)). This attack cannot be detected in a standard symbolic model: it is necessary to enhance the model with length. We refer the reader to [Cheval et al., 2013] for a model and a corresponding decision procedure for an adversary that compares the length of messages.

Privacy in e-voting protocols In electronic voting, anonymity or vote privacy is a fundamental property. Formally defining vote privacy may however be tricky. Simply changing the identity of a voter would indeed result in distinguishable processes, as in many protocols the identity of the participating voters is revealed, typically to ensure that only eligible voters did vote. Changing the vote also results in two distinguishable processes, as the election result is eventually published and the differing vote may yield an observable difference in the result.

One may also make the following observations about vote privacy. In order to achieve privacy in an election one needs to consider at least 2 honest voters. Otherwise, if all dishonest voters collude, it is generally easy for them to deduce the vote of the honest voter. Similarly, one needs to keep in mind that some information is inevitably leaked through the election outcome: for instance the extreme case of a unanimous vote leaks how each of the voters did vote.

The definition proposed in [Kremer and Ryan, 2005] therefore considers a voting protocol with two distinguished voters who swap their votes:

S[ V{a/id, v1/v} ‖ V{b/id, v2/v} ] ≈ S[ V{a/id, v2/v} ‖ V{b/id, v1/v} ]

Here V represents the voter process, and id and v are the variables referring to the identity and the vote. The context S represents the remaining parts of the system, such as the election administrators or other dishonest voters. The definition effectively expresses that it is not possible for an adversary to link a particular voter to a particular vote.


Unlinkability in RFID protocols Yet another kind of privacy property is unlinkability: it should not be possible to link several sessions, i.e., to infer that the sessions involve a same user. This property is of particular importance for RFID tags, to avoid that a person can be traced. Some implementations of the European electronic passport are vulnerable to an attack that allows tracing a person [Chothia and Smirnov, 2010]. Unlinkability in this context can again be expressed as an observational equivalence.

Suppose that the protocol P uses a secret key k which is specific to each tag and the only difference between tags. Then unlinkability can be expressed as

!νk.!P ≈ νk.!P

The left-hand side process allows several tags, each of which may execute several sessions. On the right-hand side there is only one tag that may execute several sessions. If these processes are observationally equivalent, an attacker cannot distinguish the case where the same tag executes the protocol several times from the case where the protocol may be executed by different tags.
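Recent versions of ProVerif (2.00 and later) can check equivalences between two distinct processes with the `equivalence` construct, so the statement above can be written down almost literally. The tag process below is a deliberately simplistic placeholder, not an actual passport model:

```proverif
free c: channel.

type key.
fun senc(bitstring, key): bitstring.
reduc forall m: bitstring, k: key; sdec(senc(m, k), k) = m.

free hello: bitstring.

(* One session of a tag holding key k: answer a challenge *)
let tag(k: key) =
  in(c, x: bitstring);
  out(c, senc((x, hello), k)).

(* Many tags, many sessions each  vs  one tag, many sessions *)
equivalence
  ( !new k: key; !tag(k) )
  ( new k: key; !tag(k) )
```

With this deterministic encryption the two systems are in fact distinguishable — replaying the same challenge to two sessions yields equal ciphertexts only if they come from the same tag — which is exactly the kind of traceability flaw the equivalence is meant to detect.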

6.4.3 Ideal systems

In software engineering one often starts from a specification and refines it into a concrete implementation. A similar idea may be applied to security protocols. Abadi and Gordon [1999] propose to specify a security protocol by an ideal system, i.e. a protocol which is trivially secure by design, and to show that the concrete protocol is indistinguishable from its specification. We illustrate this idea in the following example.

Example 6.4. We consider a simple protocol for message authentication P and a corresponding ideal system I.

P = νk.( νr.out(c, senc(m, k, r)) ‖
         in(c, x). if valid(x, k) = ⊤ then let y = sdec(x, k) in Q )

I = νk.( νr.out(c, senc(m, k, r)) ‖
         in(c, x). if valid(x, k) = ⊤ then let y = sdec(x, k) in Q{m/y} )


The protocol assumes that two entities share a key k and the sender outputs the encryption of the message m with k. We suppose that encryption is authentic, i.e., it is infeasible to construct a valid ciphertext without knowing the encryption key. This is modelled by enriching the equational theory for symmetric encryption with the equation

valid(senc(m, k, r), k) = ⊤

When the recipient receives a valid ciphertext it executes a process Q which may depend on the message. The ideal system I behaves as P, except that it “magically” passes the message m to Q, independently of the received ciphertext.

Authenticity of the message may be expressed by the fact that for any message m and any process Q, P and I behave in the same way, i.e.

∀m, Q. P ≈ I

Note that in this simple example the property only holds for a single session, as such a simple protocol would allow for replay attacks if we replicated the processes.

One may notice in the above example that the ideal system has been tailored towards the particular protocol. It may be more desirable to have one ideal protocol that can be implemented by several concrete protocols, even if these protocols significantly differ in the way the property is achieved. Simulation based security definitions, which achieve this kind of independence between the property and the specification, have been introduced by Canetti [2001] and Backes et al. [2007] in computational models. These ideas have also been adapted to symbolic models in [Delaune et al., 2009a, Böhl and Unruh, 2013]. We here only sketch the main idea of this approach: one requires the existence of a context S, called a simulator, which is in charge of making the ideal and the concrete protocol look the same, i.e.,

∃S. P ≈ S[I]

The context S is in charge of simulating network communications of the protocol in order to make the processes observationally equivalent.


The simulator must be able to do so without access to any restricted names of the ideal system I (this is syntactically enforced, as S cannot be under the scope of any restriction in I). The rationale behind this approach is that any attack process A on P can be combined with the context S into an attack against the ideal system I. As I is correct by construction, such an attack cannot exist.


6.5 Exercises

The following exercises require the use of the ProVerif tool available at

http://proverif.inria.fr

ProVerif’s input language is a variant of the applied pi calculus and property specification is similar to the material discussed in this chapter.

Exercise 15 (**). Model the Needham Schroeder public key protocol in ProVerif and use ProVerif to verify the following properties, respectively find attacks.

1. Verify that the protocol satisfies (weak) secrecy of nonce na, but admits an attack on the (weak) secrecy of nonce nb. (Check that the attack found corresponds to the man-in-the-middle attack.)

2. Verify that injective weak agreement is guaranteed to the initiator, while non-injective weak agreement is violated for the responder.

Exercise 16 (**). Correct the model of the previous exercise by applying Lowe’s fix (cf. Chapter 2). Use ProVerif to verify the following properties.

1. Verify that the protocol satisfies (weak) secrecy for both nonces na and nb.

2. Verify that full injective agreement is guaranteed to both the initiator and the responder.

3. Show that strong secrecy is not guaranteed for the nonce na (even if the encryption is probabilistic). Look at ProVerif’s output and explain the attack found.

4. Verify that real-or-random secrecy of nonce na is preserved.

Exercise 17 (**). Consider the following protocol, due to Kao and Chow (1995).

1. A → S : A, B, Na

2. S → B : {A, B, Na, Kab}Kas, {A, B, Na, Kab}Kbs

3. B → A : {A, B, Na, Kab}Kas, {Na}Kab, Nb

4. A → B : {Nb}Kab


The protocol involves three roles: the initiator A, the responder B and a trusted server S. Na and Nb are fresh nonces generated by A and B, respectively. Kas and Kbs are long-term keys shared between A, respectively B, and the server S. Kab is a fresh session key generated by S.

As usual we suppose that the intruder may take the role of A or B and has a shared key with the server S. Use ProVerif to verify that the protocol guarantees

1. secrecy of Kab;

2. full injective agreement for both A and B.

We now suppose that an attacker may be able to compromise an old session key Kab, i.e., the attacker may record, or interact with, a session and obtain the established session key.

Model key compromise of a previous session in ProVerif and show that this leads to an attack, i.e., secrecy of an uncompromised key may be violated. (Compromise of a key may be modelled by simply outputting the compromised key to the adversary.)


7 Automated verification: bounded case

As illustrated in the previous chapters, the design of protocols is error-prone and finding flaws is not an easy task. Therefore, the last two decades have seen the development of decision techniques and corresponding tools to check automatically whether a protocol can be attacked. Actually, even a simple property like secrecy is undecidable [Durgin et al., 1999]: no generic tool can check for secrecy without requiring some restrictions on the attacker model. In particular, allowing an unbounded number of protocol sessions to be spawned leads to undecidability. We provide a construction showing undecidability in the next chapter, in §8.1.

Therefore many techniques focus on the case where the number of sessions is bounded: assuming that the protocol is executed a limited number of times, can we check whether the secrecy of some data s is preserved? Note that even if the number of sessions is bounded, the system to verify is still infinite (due to the infinite number of messages that an attacker may construct and use when interacting with the protocol participants). One of the first decidability results was proposed by Amadio and Lugiez [2000], Amadio et al. [2002]. Rusinowitch and Turuani [2001, 2003] extend this result to a more general framework that includes in


particular composed keys. They also provide a complexity result: secrecy is shown to be (co-)NP-complete. Another algorithm based on constraint systems has been proposed by Millen and Shmatikov [2001] and extended by Comon-Lundh and Shmatikov [2003] to the exclusive-or. The algorithm is rather elegant, and amenable to extensions. This is the approach we have chosen to present in this chapter.

7.1 From protocols to constraint systems

Consider again the Needham-Schroeder public key protocol.

1. A → B : aenc(〈a, na〉, pkb)
2. B → A : aenc(〈na, nb〉, pka)
3. A → B : aenc(nb, pkb)

One session of the initiator instantiated by a willing to talk to b can be informally represented by the following two rules.

→ aenc(〈pk(ska), na〉, pk(skb)) (7.1)

aenc(〈na, x〉, pk(ska)) → aenc(x, pk(skb)) (7.2)

The agent a first sends her identity (represented by her public key pk(ska)), together with a fresh nonce na, encrypted by the public key of b. Then upon receiving a message of the form aenc(〈na, x〉, pk(ska)), the agent a replies with x encrypted by the public key of b. Similarly, one session of the initiator instantiated by a willing to talk to c can be informally represented by the following two rules.

→ aenc(〈pk(ska), n′a〉, pk(skc)) (7.3)

aenc(〈n′a, y〉, pk(ska)) → aenc(y, pk(skc)) (7.4)

Finally, one session of the responder, instantiated by b willing to answer a, can be informally represented by a single rule.

aenc(〈pk(ska), z〉, pk(skb)) → aenc(〈z, nb〉, pk(ska)) (7.5)

These rules may be triggered in any order, provided 7.1 is played before 7.2 and 7.3 is played before 7.4. Consider for example the following execution order: 7.3, 7.5, followed by 7.4. This is a possible interleaving


of the sessions of the protocol. There is an attack if a third party can learn nb, the nonce generated by b for a. To learn nb in this scenario, the attacker should be able to:

• build a message of the form aenc(〈pk(ska), z〉, pk(skb)) out of aenc(〈pk(ska), n′a〉, pk(skc)) and its initial knowledge S0. This requirement can be represented as follows:

S0, aenc(〈pk(ska), n′a〉, pk(skc)) ⊩ aenc(〈pk(ska), z〉, pk(skb))

In return, he would learn aenc(〈z, nb〉, pk(ska)) for some instantiation of z that satisfies the constraint above.

• build a message of the form aenc(〈n′a, y〉, pk(ska)) out of aenc(〈z, nb〉, pk(ska)) and the previous messages:

S0, aenc(〈pk(ska), n′a〉, pk(skc)), aenc(〈z, nb〉, pk(ska)) ⊩ aenc(〈n′a, y〉, pk(ska))

• deduce nb:

S0, aenc(〈pk(ska), n′a〉, pk(skc)), aenc(〈z, nb〉, pk(ska)), aenc(y, pk(skc)) ⊩ nb

The attacker can learn nb if there is an instantiation of the variables y and z that satisfies these three constraints. We formally define constraints in the next section. Note that, for simplicity, in the above example and in the remainder of the chapter we write S, t instead of S ∪ {t}.
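The deducibility checks underlying these constraints can be made concrete in code. The following Python sketch is our own illustration, not material from the tutorial: it decides S ⊢ u for the Dolev-Yao rules of Chapter 3, with terms represented as nested tuples such as ('aenc', m, ('pk', k)). For simplicity, decryption keys are assumed to occur directly in the decomposed knowledge, which suffices for the examples of this chapter.

```python
# Sketch of the Dolev-Yao deduction relation S |- u.
# Terms: ('pair', t1, t2), ('senc', m, k), ('aenc', m, ('pk', k));
# atoms (nonces, keys) are plain strings.

def analyze(knowledge):
    """Saturate a set of terms under the decomposition rules."""
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            new = set()
            if isinstance(t, tuple) and t[0] == 'pair':
                new = {t[1], t[2]}                     # both projections
            elif isinstance(t, tuple) and t[0] == 'senc' and t[2] in known:
                new = {t[1]}                           # symmetric decryption
            elif (isinstance(t, tuple) and t[0] == 'aenc'
                  and isinstance(t[2], tuple) and t[2][0] == 'pk'
                  and t[2][1] in known):
                new = {t[1]}                           # asymmetric decryption
            if not new <= known:
                known |= new
                changed = True
    return known

def deducible(knowledge, goal):
    """Check S |- u: first decompose S, then try to compose u."""
    known = analyze(knowledge)
    def synth(u):
        if u in known:
            return True
        if isinstance(u, tuple) and u[0] in ('pair', 'senc', 'aenc'):
            return synth(u[1]) and synth(u[2])
        if isinstance(u, tuple) and u[0] == 'pk':
            return synth(u[1])
        return False
    return synth(goal)
```

For instance, with skc in the knowledge, the nonce nb can be extracted from aenc(〈na, nb〉, pk(skc)); without skc it cannot.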

Remark 7.1. One may note that in this section we consider a formal model where messages from the attacker are pattern matched against the messages (with variables) expected by the protocol. This differs from the approach taken in the applied pi calculus, where inputs from the attacker are bound to a variable, and the message is explicitly verified to correspond to the expected message using conditionals. Constraint system approaches that follow the latter approach also exist [Delaune and Jacquemard, 2004, Baudet, 2005].


7.1.1 Constraint systems

In the remainder of this chapter, we assume the inference system IDY defined in Chapter 3. The associated deduction relation is denoted ⊢. Constraints are formally defined as follows.

Definition 7.1. A constraint is an expression of the form S ⊩ u, where S is a non-empty set of terms and u is a term.

A constraint system is a set of constraints C = ⋃_{i=1}^n Si ⊩ ui such that

1. Si ⊆ Si+1 for any 1 ≤ i ≤ n − 1;

2. if Si ⊩ ui ∈ C and x ∈ var(Si), then

Sj = min{S′ | S′ ⊩ v ∈ C, x ∈ var(v)}

is well-defined and is such that j < i.

Condition 1 reflects the fact that the knowledge of the attacker increases during the execution (an attacker never forgets). Condition 2 states that variables are first introduced on the right-hand side of a constraint. When modeling protocols, it will always be the case that variables used in outputs (the left side of constraints) are bound by the input messages (the right side of constraints).

Given some initial knowledge S0 (a set of terms), a secret data s, and an execution interleaving

u1 → v1
. . .
uk → vk

the corresponding constraint system is:

S0 ⊩ u1
S0, v1 ⊩ u2
. . .
S0, v1, . . . , vk−1 ⊩ uk
S0, v1, . . . , vk−1, vk ⊩ s

The last constraint encodes the security goal, i.e. the secrecy of s.
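This construction is entirely mechanical. The following Python sketch (our own notation, not from the tutorial) renders a constraint S ⊩ u as a pair (S, u) and assembles the system from the interleaving:

```python
# Build the constraint system of Section 7.1.1 from an initial
# knowledge s0, an interleaving [(u1, v1), ..., (uk, vk)] and a
# secret. A constraint (S, u) stands for S |- u.

def constraint_system(s0, interleaving, secret):
    constraints = []
    knowledge = list(s0)
    for (u, v) in interleaving:
        constraints.append((tuple(knowledge), u))   # attacker must build u
        knowledge.append(v)                         # ... and then learns v
    constraints.append((tuple(knowledge), secret))  # encodes secrecy of s
    return constraints
```

Each left-hand side extends the previous one, so Condition 1 of Definition 7.1 holds by construction.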


Example 7.1. We define the constraint system CNS associated to the interleaving of the rules 7.3, 7.5, 7.4 of the Needham-Schroeder protocol as follows.

S0, aenc(〈pk(ska), n′a〉, pk(skc)) ⊩ aenc(〈pk(ska), z〉, pk(skb))
S1, aenc(〈z, nb〉, pk(ska)) ⊩ aenc(〈n′a, y〉, pk(ska))
S2, aenc(y, pk(skc)) ⊩ nb

where S1 = S0, aenc(〈pk(ska), n′a〉, pk(skc)) and S2 = S1, aenc(〈z, nb〉, pk(ska)).

The set S0 representing the initial knowledge of the attacker has been left unspecified so far. A possible initial knowledge is:

S0 = {pk(ska), pk(skb), skc}.

The attacker knows the public keys of the honest agents a, b and the private key of the dishonest agent c.

7.1.2 Solutions

A solution to a constraint system is an instantiation of the variables that satisfies all the constraints. In particular, satisfaction of the last constraint represents the violation of secrecy, while satisfaction of the other constraints ensures the validity of the execution leading to the secrecy violation.

Definition 7.2. A substitution σ is a solution to a constraint system C if Sσ ⊢ uσ for any constraint S ⊩ u ∈ C.

The notation ⊥ represents a constraint system that has no solution.

Example 7.2. Consider the constraint system CNS defined in Example 7.1. A solution to CNS is the substitution:

σ = {n′a/z, nb/y}.

It corresponds to Lowe’s man-in-the-middle attack presented in Chapter 2 (Figure 2.1).


7.2 Constraint solving

Searching for attacks against a protocol, for a bounded number of sessions, reduces to searching for the existence of a solution of a constraint system. Indeed, a constraint system describes all possible executions once an interleaving has been fixed, and there are finitely many interleavings associated to a finite number of sessions.

We present an algorithm that not only detects whether a constraint system admits a solution but actually computes a finite representation of all possible solutions. This algorithm was developed by Millen and Shmatikov [2001], Comon-Lundh and Shmatikov [2003].

7.2.1 Algorithm

Checking for the existence of a solution is easy when all the right-hand sides of the constraints are variables.

S0 ⊩ x1
S0, v1 ⊩ x2
. . .
S0, v1, . . . , vk−1 ⊩ xk

Such a constraint system is said to be in solved form and obviously has a solution. Indeed, the substitution σ such that x1σ = x2σ = · · · = xkσ = t0 for some t0 ∈ S0 is a solution.
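This trivial solution can be checked mechanically. The sketch below (our own illustration, with a constraint rendered as a pair (S, x)) builds the substitution and verifies it; the check is specialized to solved forms, where membership of the image in the left-hand side suffices because every Si contains S0:

```python
# In a solved form every right-hand side is a variable, so mapping
# all of them to any fixed t0 in S0 satisfies every constraint.

def trivial_solution(s0, solved_constraints):
    """Map every right-hand-side variable to one fixed t0 in S0."""
    t0 = next(iter(s0))
    return {x: t0 for (_, x) in solved_constraints}

def satisfies(subst, solved_constraints):
    # Membership check only: valid for solved forms, where each
    # left-hand side S already contains S0 (no full deduction needed).
    return all(subst.get(x, x) in s for (s, x) in solved_constraints)
```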

Given an arbitrary constraint system, the constraint solving algorithm simplifies the constraints until it gets a system in solved form or a system that is unsatisfiable.

The rule Rred detects that a constraint T ⊩ u is redundant. It removes the constraint T ⊩ u in case u is already deducible from the set T increased by the variables that appear on the right-hand side of solved constraints, that is, variables that must be deducible. The rules Runif1 and Runif2 guess a possible instantiation of the variables by unifying two non-variable subterms of a constraint. R′unif2 is a variant of Runif2 that does not prohibit the unification of variables in the special case where one of the subterms to be unified is used as the secret key in


Rred    C ∪ {T ⊩ u} ⇝ C
        if T ∪ {x | T′ ⊩ x ∈ C, T′ ⊆ T, T′ ≠ T} ⊢ u

Runif1  C ∪ {T ⊩ u} ⇝σ Cσ ∪ {Tσ ⊩ uσ}
        if σ = mgu(t, u), t ∈ st(T), t and u are not variables

Runif2  C ∪ {T ⊩ u} ⇝σ Cσ ∪ {Tσ ⊩ uσ}
        if σ = mgu(t1, t2), t1, t2 ∈ st(T), t1 and t2 are not variables

R′unif2 C ∪ {T ⊩ u} ⇝σ Cσ ∪ {Tσ ⊩ uσ}
        if σ = mgu(t1, t2), aenc(t3, pk(t1)), t2 ∈ st(T)

Rf      C ∪ {T ⊩ f(u, v)} ⇝ C ∪ {T ⊩ u, T ⊩ v}
        if f ∈ {senc, aenc, pair}

Runsat  C ∪ {T ⊩ u} ⇝ ⊥
        if var(T) = var(u) = ∅ and T ⊬ u

Figure 7.1: Constraint solving algorithm

a public key encryption. Of course, unification with variables could also be considered. The procedure would still be correct but less efficient. The rule Rf guesses that the attacker has built f(u, v) out of u and v. Finally, Runsat detects that the constraint system is unsatisfiable: there is some ground constraint T ⊩ u such that u is not deducible from T. Since the constraint is ground, it will never be satisfiable.

For the sake of uniformity, we may write C ⇝ε C′ instead of C ⇝ C′, where ε is the identity substitution.
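As a small illustration of how one of these simplification rules operates, the following Python sketch (our own representation, not from the tutorial: a constraint is a pair (T, u), terms are nested tuples) applies the rule Rf of Figure 7.1 to a chosen constraint:

```python
# Rf replaces a constraint T |- f(u, v), for f in {senc, aenc, pair},
# by the two constraints T |- u and T |- v, guessing that the
# attacker built the composed term itself.

def apply_rf(constraints, index):
    t_set, u = constraints[index]
    if not (isinstance(u, tuple) and u[0] in ('senc', 'aenc', 'pair')):
        return None                      # Rf does not apply to this constraint
    rest = constraints[:index] + constraints[index + 1:]
    return rest + [(t_set, u[1]), (t_set, u[2])]
```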

Example 7.3. Consider again the constraint system CNS defined in Example 7.1.

S0, aenc(〈pk(ska), n′a〉, pk(skc)) ⊩ aenc(〈pk(ska), z〉, pk(skb))
S1, aenc(〈z, nb〉, pk(ska)) ⊩ aenc(〈n′a, y〉, pk(ska))
S2, aenc(y, pk(skc)) ⊩ nb


After two successive applications of rule Rf, with f = aenc and f = pair, we get that CNS ⇝² C¹NS, where C¹NS is:

S0, aenc(〈pk(ska), n′a〉, pk(skc)) ⊩ pk(skb)
S0, aenc(〈pk(ska), n′a〉, pk(skc)) ⊩ pk(ska)
S0, aenc(〈pk(ska), n′a〉, pk(skc)) ⊩ z
S1, aenc(〈z, nb〉, pk(ska)) ⊩ aenc(〈n′a, y〉, pk(ska))
S2, aenc(y, pk(skc)) ⊩ nb

Since pk(ska), pk(skb) ∈ S0, the first two constraints can be eliminated using the rule Rred, yielding C²NS:

S0, aenc(〈pk(ska), n′a〉, pk(skc)) ⊩ z
S1, aenc(〈z, nb〉, pk(ska)) ⊩ aenc(〈n′a, y〉, pk(ska))
S2, aenc(y, pk(skc)) ⊩ nb

We may then apply Runif1 to unify aenc(〈z, nb〉, pk(ska)) and aenc(〈n′a, y〉, pk(ska)), where σ = {n′a/z, nb/y}. The resulting constraint system is:

S0, aenc(〈pk(ska), n′a〉, pk(skc)) ⊩ n′a
S1, aenc(〈n′a, nb〉, pk(ska)) ⊩ aenc(〈n′a, nb〉, pk(ska))
S2, aenc(nb, pk(skc)) ⊩ nb

Then the three constraints can be eliminated using the rule Rred, yielding the empty (solved) system. Note that σ is precisely the solution we have exhibited in Example 7.2.

7.2.2 Correctness

Given a constraint system, many simplification rules may apply, yielding several possible “simplified” constraint systems. These constraint systems should be further simplified until a solved form is reached or a constraint system is unsatisfiable. The idea is that all paths should be explored, as illustrated in Figure 7.2. As soon as a path leads to a system in solved form then the initial system admits a solution. Conversely, if all paths lead to unsatisfiable systems then the initial system is unsatisfiable.


C = { S0 ⊩ u1,  S0, v1 ⊩ u2,  . . . ,  S0, v1, . . . , vn ⊩ un+1 }

[Flattened figure: a tree of simplification steps rooted at C, with successors C1, C2, C3; the branches end either in ⊥, in further systems such as C4, or in a solved system.]

Figure 7.2: Decision procedure.

Formally, this procedure is:

• Sound (Theorem 7.1): any solution found by the procedure is indeed a solution of the constraint system.

• Complete (Theorem 7.2): whenever there is a solution of the constraint system, there is a path in the tree of possible simplifications that leads to a solution.

• Terminating (Theorem 7.3): there is no infinite branch.

The three theorems together actually provide a stronger result: the set of solutions of the initial system can be retrieved by considering the union of the sets of solutions of the systems in solved form that appear at the leaves.

Theorem 7.1 (soundness). Let C be a constraint system. If C ⇝σ C′ then C′ is a constraint system and for any solution θ of C′, σθ is a solution of C.

Page 94: Formal Models and Techniques for Analyzing Security ... · Formal Models and Techniques for Analyzing Security Protocols: A Tutorial. Foundations and TrendsR in Programming Languages,

84 Automated verification: bounded case

The soundness of the procedure is easy to prove. The most technical point of the proof of Theorem 7.1 actually lies in proving that Condition 2 of the definition of a constraint system is still satisfied by C′.

Theorem 7.2 (completeness). Let C be a constraint system that is not in solved form. If σ is a solution of C then there exist a system C′ and substitutions θ, τ such that C ⇝θ C′ and σ = θτ.

Theorem 7.2 is of course the most technical result. Its proof can be found in [Millen and Shmatikov, 2001, Comon-Lundh and Shmatikov, 2003, Comon-Lundh et al., 2010] for several variants of the term algebra.

Theorem 7.3 (termination). There is no infinite sequence C1 ⇝σ1 C2 ⇝σ2 · · · .

The proof of termination is left as an exercise. This procedure actually does not immediately yield an NP-procedure. First, as for all other (NP) procedures, it is necessary to use a DAG representation of the terms. Otherwise, an attack in n steps may require doubling the size of the term at each step. Moreover, the length of a branch of the decision procedure is not polynomially bounded, contrary to what one could expect. While it is rather easy to bound polynomially the DAG-size of each constraint, a constraint may disappear due to the application of some rule and re-appear later in the same branch, yielding branches of exponential length, as exemplified by Comon-Lundh et al. [2010]. To obtain a polynomial bound, it is necessary to refine the procedure with strategies, as developed by Comon-Lundh et al. [2010].
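The blow-up avoided by DAGs is easy to observe on the family of terms t0 = a, ti+1 = 〈ti, ti〉: the tree size grows exponentially while the number of distinct subterms, i.e. the DAG-size, grows linearly. The following Python sketch (our own illustration) compares the two measures:

```python
# Tree size counts every node; DAG size counts distinct subterms,
# i.e. the nodes of the maximally shared representation.

def tree_size(t):
    if not isinstance(t, tuple):
        return 1
    return 1 + sum(tree_size(s) for s in t[1:])

def dag_size(t):
    subterms = set()
    def walk(u):
        if u in subterms:
            return
        subterms.add(u)
        if isinstance(u, tuple):
            for s in u[1:]:
                walk(s)
    walk(t)
    return len(subterms)

# t10 = <t9, t9>, ..., t1 = <a, a>: tree size 2^11 - 1, DAG size 11
t = 'a'
for _ in range(10):
    t = ('pair', t, t)
```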

7.2.3 Extensions

An enjoyable property of this decision procedure is that no solution is lost. Indeed, a consequence of Theorem 7.2 is that every solution of the initial constraint system can be found in one of the leaves (in solved form) that is reached when applying the procedure. This makes this algorithm more amenable to extensions, both in terms of security properties and equational theories. For example, the procedure of


Comon-Lundh et al. [2010] is used to decide a small logic of properties that encompasses, e.g., authentication, and also to decide whether an attacker can create key cycles on honest keys. Cheval et al. [2011] extended the procedure to cope with equivalence properties, the active counterpart of static equivalence (defined in Chapter 4). It has been implemented, yielding the APTE tool [Cheval, 2014]. This procedure has then been used when the attacker is enhanced with the ability to compare the length of messages [Cheval et al., 2013]. Regarding equational theories, [Bursuc et al., 2007] propose a procedure for constraint systems with an associative and commutative function symbol (and no other symbols). Delaune et al. [2012] decide the equivalence of constraint systems for several group theories.

7.2.4 Tools

The first tool to implement such a constraint solving algorithm was presented by Millen and Shmatikov [2001] and further extended by Corin and Etalle [2003], Corin et al. [2006]. The APTE tool [Cheval, 2014] was already mentioned above; it implements a decision procedure based on constraint systems to check equivalence properties.

There are however several other efficient tools, based on different techniques, that can check security properties of protocols for a bounded number of sessions.

• Avispa [Armando et al., 2005] is a platform that gathers several tools for the analysis of protocols. It currently offers four tools: CL-Atse, OFMC, SAT-MC, and TA4SP.

• Scyther [Cremers, 2008] has the originality of allowing security analysis for both a bounded and an unbounded number of sessions: the tool first tries to provide a proof of security for an unbounded number of sessions and falls back to the bounded mode in case it fails.


7.3 Exercises

Exercise 18 (**). Prove Theorem 7.3.
Hint: you may use a lexicographic order on (v, s), where v is the number of variables and s is the size of the constraint system, for a notion of size to be defined.

Exercise 19 (**). Prove Theorem 7.2.

Exercise 20 (*). Consider the Wide-Mouthed-Frog protocol.

A → S : {A,B,Kab}Kas

S → B : {A,B,Kab}Kbs

A sends to the server a session key Kab, encrypted with the key Kas she shares with the server. A also indicates her identity and the identity B of the destination. The server S transmits the session key to B under his shared key Kbs. The session key should remain secret between A and B (and S).

Consider a session where the initiator A wishes to talk to B, together with a session of the server that answers to A willing to talk to C, some dishonest agent.

1. Show that there is no attack in this configuration, that is, show that the following constraint system has no solution.

S0, {〈〈a, b〉, kab〉}kas ⊩ {〈〈a, c〉, x〉}kas
S0, {〈〈a, b〉, kab〉}kas, {〈〈a, c〉, x〉}kcs ⊩ kab

where S0 = {a, b, c, kcs}.

2. Consider now the following variant of the protocol.

A → S : A,B, {Kab}Kas

S → B : A,B, {Kab}Kbs

Adapt the constraint system and show that it admits a solution. Exhibit a solution using the constraint solving algorithm.

Exercise 21 (*). We consider the following protocol.

A → B : {A, Kab}^a_pk(B)

B → A : {s}Kab

Page 97: Formal Models and Techniques for Analyzing Security ... · Formal Models and Techniques for Analyzing Security Protocols: A Tutorial. Foundations and TrendsR in Programming Languages,

7.3. Exercises 87

1. Show that this protocol admits an attack.

2. Propose a constraint system that reflects the attack scenario.

3. Show, using the constraint solving algorithm, that it has a solution.

Exercise 22 (**). We enhance the inference system IDY defined in Chapter 3 by considering blind signatures (see Exercise 2 for more intuition on blind signatures). Formally, we add the three following inference rules.

x    y
-----------
blind(x, y)

sign(blind(x, y), z)    y
-------------------------
sign(x, z)

blind(x, y)    y
----------------
x

Accordingly, we add the following simplification rule to the constraint solving algorithm defined in Figure 7.1:

C ∪ {T ⊩ blind(u, v)} ⇝ C ∪ {T ⊩ u, T ⊩ v}

1. Show that this rule is correct (easy). More precisely, show that if C ⇝ C′ with this rule, then any solution θ of C′ is also a solution of C.

2. Show that the resulting constraint solving algorithm is no longer complete. That is, show that Theorem 7.2 is false in this context.


8 Automated verification: unbounded case

In the previous chapter we reported on techniques for the analysis of protocols when the number of sessions is bounded. This yields efficient tools to find attacks. However, when no attack is found, it is impossible to conclude whether the analyzed protocol is secure or not. Indeed, there might exist an attack that requires a few additional sessions. Moreover, in practice, tools can only analyse a small number of sessions (typically 2 or 3) in a reasonable amount of time. Therefore, if no attack is found then there is no proof that the protocol is secure. To overcome this limitation, it is necessary to analyse protocols for an unbounded number of sessions. However, even a simple property like secrecy is undecidable when we do not bound the number of sessions [Durgin et al., 1999]. We provide a construction for undecidability in §8.1.

Nevertheless, it is still possible to design verification tools in this case. Obviously, termination is then not guaranteed. There have been two main approaches.

• One approach is to perform backward search, relying on causality arguments. This backward search may not terminate, but user interaction with the tool or additional lemmas make it possible to prune


some branches and enforce termination. Examples of such tools are the NRL Protocol Analyzer [Meadows, 1996], and its reimplementation in Maude, Maude-NPA [Escobar et al., 2009], as well as the Athena [Song, 1999], Scyther [Cremers, 2008] and Tamarin [Schmidt et al., 2012] tools.

• The other approach is to use abstractions. Protocols may be encoded as (first-order) Horn clauses [Weidenbach, 1999, Blanchet, 2001] or tree automata [Monniaux, 2003, Goubault-Larrecq, 2000, Genet and Klay, 2000], over-approximating the intruder capabilities. These tools may allow for false attacks, and termination is generally still not guaranteed. The ProVerif tool is the most mature tool using this approach: its optimizations and a dedicated Horn clause resolution algorithm make the tool extremely efficient, and non-termination and false attacks (at least for weak secrecy properties) are rare on practical examples.

We focus on the modeling of protocols as Horn clauses, a modeling first suggested by Weidenbach [1999], and the verification algorithm proposed by Blanchet and implemented in ProVerif [Blanchet, 2001]. The ProVerif tool takes protocols written in a variant of the applied pi calculus as input, together with a security property to verify. The protocol is then automatically translated into a set of first-order Horn clauses and the properties are translated into derivability queries. The verification algorithm is based on a dedicated Horn clause resolution procedure. We describe in this chapter how protocols can be encoded as Horn clauses (without providing full details of the automatic translation from the full applied pi calculus). Then we present the first verification algorithm implemented in ProVerif to verify weak secrecy queries. The current version of ProVerif adds many optimizations to this algorithm and allows for verifying more advanced properties (injective and non-injective correspondences and some equivalence-based properties), which we do not discuss here.


8.1 Undecidability

Checking for secrecy is (at least) as difficult as checking for a solution to the Post Correspondence Problem (PCP). We show how to encode PCP in protocols.

We first recall the Post Correspondence Problem. Let Σ be a finite alphabet.

Input: (ui, vi)1≤i≤k, k ∈ N, ui, vi ∈ Σ∗

Output: Does there exist n ∈ N and i1, . . . , in ∈ N such that 1 ≤ ij ≤ k and

ui1 · · · uin = vi1 · · · vin ?

where N denotes the set of integers.

The Post Correspondence Problem is well known to be undecidable [Davis and Weyuker, 1983]. Given an input (ui, vi)1≤i≤k of PCP, we build the following protocol P.

→ {|〈u1, v1〉|}^s_Kab, . . . , {|〈uk, vk〉|}^s_Kab

{|〈x, y〉|}^s_Kab → {|〈x · u1, y · v1〉|}^s_Kab, {|s|}^s_{{|〈x·u1, x·u1〉|}^s_Kab}, . . . ,
                   {|〈x · uk, y · vk〉|}^s_Kab, {|s|}^s_{{|〈x·uk, x·uk〉|}^s_Kab}

where a1 · a2 · · · an denotes the term 〈a1, 〈a2, 〈· · · , an〉 · · · 〉〉 for a1, . . . , an ∈ Σ. The key Kab is a long-term key shared between a pair of agents a and b. The protocol describes one role. The first message simply outputs k encryptions of the pairs (ui, vi) that form the input of the PCP instance. The second rule offers the possibility to concatenate: when receiving an encrypted pair, the protocol replies with k

encryptions, appending ui and vi “under the encryption”. It is easy to see that if there is a solution

ui1 · · · uin = vi1 · · · vin

to the entry (ui, vi)1≤i≤k, then the corresponding protocol P admits an attack against the secrecy of s. Indeed, the attacker can select the message {|〈ui1, vi1〉|}^s_Kab in the first message from A and forward it to


B. B then replies with

{|〈ui1 · u1, vi1 · v1〉|}^s_Kab, {|s|}^s_{{|〈ui1·u1, ui1·u1〉|}^s_Kab}, . . . ,
{|〈ui1 · uk, vi1 · vk〉|}^s_Kab, {|s|}^s_{{|〈ui1·uk, ui1·uk〉|}^s_Kab}

The attacker selects {|〈ui1 · ui2, vi1 · vi2〉|}^s_Kab and proceeds recursively, building step by step the solution to the PCP problem. In the end, the attacker gets

{|〈ui1 · ui2 · · · uin, vi1 · vi2 · · · vin〉|}^s_Kab, {|s|}^s_{{|〈ui1·ui2···uin, ui1·ui2···uin〉|}^s_Kab}

Since ui1 · · · uin = vi1 · · · vin, the attacker easily deduces s.

The converse direction is more complex to prove and requires a precise model of execution. However, an interesting point to note is that this undecidability result is independent of the specificities of a particular model.
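The attacker's step-by-step construction is exactly a search for a PCP solution, extending a candidate index sequence one tile at a time and stopping as soon as the two concatenations coincide. The following Python sketch (our own illustration, with a bounded search since PCP itself is undecidable) mimics this strategy:

```python
# Bounded breadth-first search for a PCP solution over tiles
# [(u1, v1), ..., (uk, vk)]: each step appends one tile, just as the
# protocol appends (ui, vi) "under the encryption".

def pcp_solution(tiles, max_len):
    """Return an index sequence with equal concatenations, or None."""
    frontier = [((i,), u, v) for i, (u, v) in enumerate(tiles)]
    while frontier:
        nxt = []
        for seq, wu, wv in frontier:
            if wu == wv:
                return seq               # the two words coincide: attack
            if len(seq) >= max_len:
                continue
            # one word must be a prefix of the other to remain viable
            if not (wu.startswith(wv) or wv.startswith(wu)):
                continue
            for i, (u, v) in enumerate(tiles):
                nxt.append((seq + (i,), wu + u, wv + v))
        frontier = nxt
    return None
```

For instance, on the (hypothetical) instance with tiles (ab, a) and (b, bb), the sequence 1, 2 is a solution since ab·b = a·bb.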

It is also interesting to note that this construction does not use nonces, which means that secrecy is undecidable even for protocols that do not make use of nonces (or only use a bounded number of nonces). If additionally the size of messages is bounded, then secrecy is decidable [Durgin et al., 1999]. However, when only the message size is bounded, secrecy is again undecidable as soon as protocols are allowed to use an unbounded number of nonces [Amadio and Charatonik, 2002].

8.2 Analysis of protocols with Horn clauses

8.2.1 Horn clauses

A Horn clause is a clause with at most one positive literal. Following the tradition in logic programming, we write Horn clauses, also called rules, as an implication

H1 ∧ . . . ∧Hn → C

where H1, . . . , Hn are atomic formulas called the hypotheses and C is an atomic formula called the conclusion. Moreover, we sometimes consider the hypotheses as a set H = {H1, . . . , Hn} and simply write


H → C. We only consider closed formulas and all variables are implicitly universally quantified.

Atomic formulas, also called facts, are built by applying a predicate on terms. We consider a single, unary predicate, the attacker predicate, denoted att(t). Intuitively, when att(t) is true, the attacker has learnt the message modelled by the term t. The construction of the terms has one peculiarity: names are parametrized by terms. As we see below, this is useful to model freshness. One may see names as a particular sort of function symbol, and we use square brackets to avoid confusing them with “normal” function symbols.

t ::=                          term
    x, y, z                    variable in X
    a[t1, . . . , tn]          name in N
    f(t1, . . . , tn)          function application for f ∈ F

We now present how both the attacker capabilities and the protocol itself can be modelled as Horn clauses.

8.2.2 Attacker capabilities

Starting from an inference system, it is straightforward to model the attacker capabilities as Horn clauses. Recall the Dolev-Yao inference system IDY:

  x    y           〈x, y〉         〈x, y〉
 --------          -------         -------
 〈x, y〉              x               y

  x    y           senc(x, y)    y
 -----------      ------------------
 senc(x, y)              x

  x    y           aenc(x, pk(y))    y
 -----------      ----------------------
 aenc(x, y)                x

This inference system is modelled by the following set of Horn clauses:

att(x) ∧ att(y) → att(〈x, y〉)
att(〈x, y〉) → att(x)
att(〈x, y〉) → att(y)

att(x) ∧ att(y) → att(senc(x, y))
att(senc(x, y)) ∧ att(y) → att(x)

att(x) ∧ att(y) → att(aenc(x, y))
att(aenc(x, pk(y))) ∧ att(y) → att(x)


It is easy to show that {t1, . . . , tn} ⊢ t if and only if att(t) is entailed by {att(t1), . . . , att(tn)} together with the aforementioned set of Horn clauses representing the attacker capabilities.

More generally, an inference rule

  t1  · · ·  tn
 ---------------
        t

can be modelled by the Horn clause

att(t1) ∧ · · · ∧ att(tn) → att(t)

The initial knowledge of the attacker is also modelled using Horn clauses. For each term t in his initial knowledge we simply add the clause att(t). We suppose in particular that the attacker always knows at least one name a[], and hence include the clause att(a[]).
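For ground terms, the entailment above can also be checked directly, without Horn clause resolution. The following sketch is our own illustration (not code from the tutorial): it encodes terms as nested tuples, first closes the knowledge under the destructor rules of IDY, then checks whether the goal can be rebuilt using the constructor rules. It assumes decryption keys that need only be looked up, not themselves synthesized; with composed keys the two phases would have to be interleaved.

```python
def decompose(knowledge):
    """Close a set of ground terms under the destructor rules of IDY:
    projections, symmetric decryption, asymmetric decryption."""
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            parts = []
            if isinstance(t, tuple) and t[0] == "pair":
                parts = [t[1], t[2]]
            elif isinstance(t, tuple) and t[0] == "senc" and t[2] in known:
                parts = [t[1]]  # senc(x, y) with y known gives x
            elif (isinstance(t, tuple) and t[0] == "aenc"
                  and isinstance(t[2], tuple) and t[2][0] == "pk"
                  and t[2][1] in known):
                parts = [t[1]]  # aenc(x, pk(y)) with y known gives x
            for u in parts:
                if u not in known:
                    known.add(u)
                    changed = True
    return known

def synthesize(t, known):
    """Can t be built from `known` using the constructor rules?"""
    if t in known:
        return True
    if isinstance(t, tuple) and t[0] in ("pair", "senc", "aenc"):
        return synthesize(t[1], known) and synthesize(t[2], known)
    return False

def deducible(knowledge, t):
    return synthesize(t, decompose(knowledge))
```

For instance, `deducible({("senc", "s", "k"), "k"}, "s")` holds, while without the key k the secret s is not deducible.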

8.2.3 Protocols as Horn clauses

We now illustrate on the Needham-Schroeder public key protocol how a protocol can itself be modelled by a set of Horn clauses. The Needham-Schroeder protocol can be represented by the three following clauses:

att(xpkr) → att(aenc(〈pk(ska[]), na[xpkr]〉, xpkr))
att(aenc(〈xpki, xna〉, pk(skb[]))) → att(aenc(〈xna, nb[xna, xpki]〉, xpki))
att(xpkr) ∧ att(aenc(〈na[xpkr], xnb〉, xpki)) → att(aenc(xnb, xpkr))

The first message of the initiator is modelled by the first clause. The hypothesis att(xpkr) models that the attacker chooses the identity (in our modelling the public key) of the responder. The conclusion models the fact that when the initiator outputs the first protocol message then the attacker learns this message. One may note that we parametrize the nonce na with the variables that appear in the clause’s hypothesis. The aim is to ensure some kind of freshness. Indeed, when the attacker executes the protocol with different agents, the nonce na will be different as it will be parametrized with different public keys. One may already note that executing the protocol twice with the same responder would result in using twice the same nonce.

The second clause models the reception by the responder of the first message and his sending of the second message. One may note again that the nonce is parametrized by two variables xna and xpki, i.e., all variables that appear in the hypothesis.


The last clause models the reception of the second message and the sending of the third message by the initiator. Here one may note the repetition of the hypothesis att(xpkr) of the first clause in the last clause. There are several reasons for doing this. First, it guarantees that variables are bound correctly: as the hypotheses correspond to inputs, it is possible to reuse a variable bound in the first input in any later output. Second, it gives some guarantees that the protocol can actually reach this point, i.e. previous tests were satisfied. To better understand this point consider the following example. Suppose that a participant expects a secret s1 (not included in the attacker’s initial knowledge) and outputs a constant c. Next the protocol waits for the input of constant c and then outputs the secret nonce s2. Normally, as s1 can never be sent by the adversary in the first place, this protocol should never perform any output and s2 should remain secret. This protocol can be modelled using the following Horn clauses

att(s1[]) → att(c)
att(s1[]) ∧ att(c) → att(s2[])

It should be obvious that the repetition of the hypothesis att(s1[]) in the second clause is needed to avoid a false attack. Otherwise, the secret nonce could indeed be trivially learned by the attacker using only the second clause.
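The difference can be observed by running ground forward chaining over the two candidate clause sets. This sketch is our own illustration: facts are written as plain strings, the [] parameter brackets are dropped, and the public constant c is assumed to be part of the attacker's initial knowledge.

```python
def saturate_ground(clauses, facts):
    """Naive forward chaining for ground Horn clauses: fire any clause
    whose hypotheses are all known, until no new fact appears."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for hyps, concl in clauses:
            if concl not in facts and all(h in facts for h in hyps):
                facts.add(concl)
                changed = True
    return facts

# Faithful model: the second clause repeats the hypothesis att(s1).
faithful = [({"att(s1)"}, "att(c)"),
            ({"att(s1)", "att(c)"}, "att(s2)")]
# Sloppy model: the repetition is dropped.
sloppy = [({"att(s1)"}, "att(c)"),
          ({"att(c)"}, "att(s2)")]

init = {"att(a)", "att(c)"}  # the attacker knows c but not s1

assert "att(s2)" not in saturate_ground(faithful, init)
assert "att(s2)" in saturate_ground(sloppy, init)  # the false attack
```

With the repeated hypothesis, att(s2) stays underivable; without it, the second clause fires on the public constant alone.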

8.2.4 Approximations

While describing the modelling of protocols in Horn clauses we have already hinted at sources of approximations.

The first source of approximation is the freshness of names. Even though names are parametrized, this may not be sufficient to faithfully represent new names. For instance, when a protocol is executed several times with exactly the same inputs, the parameters will be the same and the generated nonces will be equal, while they actually should be distinct.

Another source of approximation comes from the fact that clauses may be “used” several times, and this in any order. This is best illustrated by an example. Consider the following protocol

P = νs, n1, n2, k.
      out(c, 〈senc(n1, k), senc(n2, k)〉).
      in(c, x). out(c, sdec(x, k)).
      in(c, x). if x = 〈n1, n2〉 then out(c, s)

It is easy to see that this protocol should preserve the secrecy of s, as the protocol can at most reveal one of the two nonces. A Horn clause modelling of this protocol would be

att(〈senc(n1[], k[]), senc(n2[], k[])〉)
att(senc(x, k[])) → att(x)
att(〈n1[], n2[]〉) → att(s[])

However, the second rule may be used twice, to learn both n1 and n2. The attacker can then use the pairing rule (from the attacker capabilities) and the third protocol rule to learn the secret s. The fact that a rule may be used any number of times is however important, as this is the key to modelling an unbounded number of sessions.
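This over-approximation can be observed by running forward chaining on a Horn clause encoding of the protocol. The sketch below is our own illustration, not the tutorial's code: facts are nested tuples, variables are strings starting with "?", names lose their [] parameters, and the pairing of n1 and n2 is inlined into the last clause to keep the fact space finite. Even though the process P never reveals both nonces, the clause model derives att(s).

```python
def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def match(pat, fact, sub):
    """One-way matching of a clause pattern against a ground fact."""
    if is_var(pat):
        if pat in sub:
            return sub if sub[pat] == fact else None
        new = dict(sub)
        new[pat] = fact
        return new
    if isinstance(pat, tuple) and isinstance(fact, tuple) \
            and len(pat) == len(fact) and pat[0] == fact[0]:
        for p, f in zip(pat[1:], fact[1:]):
            sub = match(p, f, sub)
            if sub is None:
                return None
        return sub
    return sub if pat == fact else None

def matches(hyps, facts, sub):
    """All substitutions making every hypothesis a known fact."""
    if not hyps:
        yield sub
        return
    for f in facts:
        s = match(hyps[0], f, sub)
        if s is not None:
            yield from matches(hyps[1:], facts, s)

def instantiate(t, sub):
    if is_var(t):
        return sub[t]
    if isinstance(t, tuple):
        return (t[0],) + tuple(instantiate(a, sub) for a in t[1:])
    return t

def chain(clauses, facts):
    """Forward chaining: every clause may fire any number of times."""
    facts = set(facts)
    while True:
        new = {instantiate(c, s)
               for hyps, c in clauses for s in matches(hyps, facts, {})}
        if new <= facts:
            return facts
        facts |= new

clauses = [
    ([("att", ("pair", "?x", "?y"))], ("att", "?x")),  # projections
    ([("att", ("pair", "?x", "?y"))], ("att", "?y")),
    ([("att", ("senc", "?x", "k"))], ("att", "?x")),   # decryption oracle
    ([("att", "n1"), ("att", "n2")], ("att", "s")),    # 3rd rule, pairing inlined
]
init = {("att", ("pair", ("senc", "n1", "k"), ("senc", "n2", "k")))}

assert ("att", "s") in chain(clauses, init)  # false attack: s is "derivable"
```

The decryption clause fires once per ciphertext, so both nonces end up in the attacker facts, and the last clause then yields att(s).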

8.2.5 Secrecy and derivability queries

In this formalism, verifying whether a term t is (weakly) secret amounts to checking whether the fact att(t) is derivable from a set of Horn clauses.

Before defining derivability we introduce the notion of subsumption.

Definition 8.1 (subsumption). We say that a clause H1 → C1 subsumes the clause H2 → C2, written H1 → C1 ⊒ H2 → C2, if and only if there exists a substitution σ such that C1σ = C2 and H1σ ⊆ H2.

Intuitively, a clause which subsumes another clause is more generaland requires fewer hypotheses.
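Definition 8.1 can be checked mechanically: match the conclusions to obtain a substitution, then try to extend it so that every hypothesis of the subsuming clause matches some hypothesis of the other clause. The sketch below is our own encoding (variables are strings starting with "?", facts are nested tuples), not code from the tutorial.

```python
def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def match(pat, term, sub):
    """One-way matching: extend sub so that pat instantiated by sub
    equals term, or return None if impossible."""
    if is_var(pat):
        if pat in sub:
            return sub if sub[pat] == term else None
        new = dict(sub)
        new[pat] = term
        return new
    if isinstance(pat, tuple) and isinstance(term, tuple) \
            and len(pat) == len(term) and pat[0] == term[0]:
        for p, t in zip(pat[1:], term[1:]):
            sub = match(p, t, sub)
            if sub is None:
                return None
        return sub
    return sub if pat == term else None

def subsumes(c1, c2):
    """H1 -> C1 subsumes H2 -> C2: some sigma gives C1.sigma = C2
    and H1.sigma a subset of H2."""
    (h1, concl1), (h2, concl2) = c1, c2

    def cover(hyps, sub):
        # Match each remaining hypothesis of c1 against some hypothesis
        # of c2, extending sub, with backtracking.
        if not hyps:
            return True
        return any(cover(hyps[1:], s)
                   for target in h2
                   for s in [match(hyps[0], target, sub)]
                   if s is not None)

    sigma = match(concl1, concl2, {})
    return sigma is not None and cover(list(h1), sigma)

# The general clause subsumes the more specific ground one, not vice versa.
general = ([("att", "?x")], ("att", ("pair", "?x", "?y")))
ground = ([("att", "a"), ("att", "b")], ("att", ("pair", "a", "b")))
assert subsumes(general, ground)
assert not subsumes(ground, general)
```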

Definition 8.2 (derivability). Given a set of Horn clauses R, a fact f is derivable from R if there exists a finite tree such that

• each node is labelled by a fact,

• the root is labelled by f


• if a node is labelled by a fact f0 and its child nodes are labelled by f1, . . . , fn then there exists R ∈ R such that {f1, . . . , fn} → f0 ⊑ R.

Example 8.1. We consider again the Needham-Schroeder public key protocol. The Horn clause representation of this protocol, together with the relevant attacker capabilities and initial intruder knowledge, is recalled in Figure 8.1.

To show that this protocol is insecure we give, in Figure 8.2, the tree witnessing that an instance of att(nb[x]) is derivable. We annotate each derivation step with the rules defined in Figure 8.1. One may note that the derivation tree mimics the man-in-the-middle attack. The fact that the tree has been split into subderivations is merely for layout purposes.

8.2.6 The analysis procedure

The analysis procedure was originally defined by Blanchet [2001]. It operates in two steps. The first step is a kind of forward search that performs resolution on the set of Horn clauses, combining some of the clauses into new, “optimized” clauses. The new clauses allow one to perform several derivation steps of the original set of rules as a single step. The resulting set of clauses is guaranteed to preserve derivability with respect to the original clauses.

The second step is a backward depth-first search on the “optimized” set of clauses, whose aim is to check whether a fact is derivable.

We now explain the two steps in more detail. Our presentation of the algorithm follows that of Blanchet [2011], where the interested reader may find additional details.

Resolution algorithm

We start by defining what a resolution step is. Resolution is a classical technique for combining rules in order to derive a new rule.

Definition 8.3 (Resolution). Let R = H → C and R′ = H′ → C′ be two clauses. If there exists F0 ∈ H′ such that F0 and C are unifiable


Attacker capabilities:

att(x) ∧ att(y) → att(〈x, y〉)                                (8.1)
att(〈x, y〉) → att(x)                                          (8.2)
att(〈x, y〉) → att(y)                                          (8.3)

att(x) ∧ att(y) → att(aenc(x, y))                             (8.4)
att(aenc(x, pk(y))) ∧ att(y) → att(x)                         (8.5)

Initial knowledge:

att(skc[])                                                    (8.6)
att(pk(ska[]))                                                (8.7)
att(pk(skb[]))                                                (8.8)
att(pk(skc[]))                                                (8.9)

att(a[])                                                      (8.10)

Protocol clauses:

att(xpkr) → att(aenc(〈pk(ska[]), na[xpkr]〉, xpkr))            (8.11)
att(aenc(〈xpki, xna〉, pk(skb[])))
    → att(aenc(〈xna, nb[xna, xpki]〉, xpki))                   (8.12)
att(xpkr) ∧ att(aenc(〈na[xpkr], xnb〉, xpki))
    → att(aenc(xnb, xpkr))                                    (8.13)

Figure 8.1: Horn clauses modelling the Needham-Schroeder public key protocol

and σ = mgu(F0, C) then the clause R ◦F0 R′, defined as

(H ∪ (H ′ \ {F0}))σ → C ′σ

is the resolvent obtained by resolution of R′ with R using F0.

Example 8.2. Consider again the clauses in Figure 8.1. We have that

(8.9) ◦att(xpkr) (8.11) = att(aenc(〈pk(ska[]), na[pk(skc[])]〉, pk(skc[])))

and

(8.4) ◦att(aenc(〈xpki,xna〉,pk(skb[]))) (8.12) =
    att(〈xpki, xna〉) ∧ att(pk(skb[])) → att(aenc(〈xna, nb[xna, xpki]〉, xpki))
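Definition 8.3 becomes executable once a syntactic unification procedure is available. The sketch below is our own encoding (variables are strings starting with "?", facts are nested tuples, and a name such as skb stands for the parametrized name skb[]); it computes the mgu and the resolvent (H ∪ (H′ \ {F0}))σ → C′σ, and reproduces the second resolution of Example 8.2.

```python
def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def walk(t, sub):
    # Follow variable bindings to their current value.
    while is_var(t) and t in sub:
        t = sub[t]
    return t

def occurs(v, t, sub):
    t = walk(t, sub)
    return t == v or (isinstance(t, tuple)
                      and any(occurs(v, a, sub) for a in t[1:]))

def unify(t1, t2, sub):
    """Return an mgu extending sub, or None if t1 and t2 do not unify."""
    t1, t2 = walk(t1, sub), walk(t2, sub)
    if t1 == t2:
        return sub
    if is_var(t1):
        if occurs(t1, t2, sub):
            return None
        new = dict(sub)
        new[t1] = t2
        return new
    if is_var(t2):
        return unify(t2, t1, sub)
    if isinstance(t1, tuple) and isinstance(t2, tuple) \
            and len(t1) == len(t2) and t1[0] == t2[0]:
        for a, b in zip(t1[1:], t2[1:]):
            sub = unify(a, b, sub)
            if sub is None:
                return None
        return sub
    return None

def subst(t, sub):
    t = walk(t, sub)
    if isinstance(t, tuple):
        return (t[0],) + tuple(subst(a, sub) for a in t[1:])
    return t

def resolve(r, r2, f0):
    """The resolvent of Definition 8.3: (H ∪ (H' \\ {F0}))σ → C'σ."""
    (h, c), (h2, c2) = r, r2
    sigma = unify(c, f0, {})
    if sigma is None:
        return None
    hyps = [subst(t, sigma) for t in h + [x for x in h2 if x != f0]]
    return (hyps, subst(c2, sigma))

# Clause (8.4) and clause (8.12), resolved on the hypothesis of (8.12).
r84 = ([("att", "?x"), ("att", "?y")], ("att", ("aenc", "?x", "?y")))
f0 = ("att", ("aenc", ("pair", "?xpki", "?xna"), ("pk", "skb")))
r812 = ([f0],
        ("att", ("aenc", ("pair", "?xna", ("nb", "?xna", "?xpki")), "?xpki")))
resolvent = resolve(r84, r812, f0)
assert resolvent == (
    [("att", ("pair", "?xpki", "?xna")), ("att", ("pk", "skb"))],
    ("att", ("aenc", ("pair", "?xna", ("nb", "?xna", "?xpki")), "?xpki")))
```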


Π1: from att(pk(skc[])) (8.9), clause (8.11) yields
    att(aenc(〈pk(ska[]), na[pk(skc[])]〉, pk(skc[]))).

Π2: from Π1 and att(skc[]) (8.6), clause (8.5) yields
    att(〈pk(ska[]), na[pk(skc[])]〉).

Π3: from Π2 and att(pk(skb[])) (8.8), clause (8.4) yields
    att(aenc(〈pk(ska[]), na[pk(skc[])]〉, pk(skb[]))),
    and clause (8.12) then yields
    att(aenc(〈na[pk(skc[])], nb[na[pk(skc[])], pk(ska[])]〉, pk(ska[]))).

Π4: from att(pk(skc[])) (8.9) and Π3, clause (8.13) yields
    att(aenc(nb[na[pk(skc[])], pk(ska[])], pk(skc[]))),
    and with att(skc[]) (8.6), clause (8.5) finally yields
    att(nb[na[pk(skc[])], pk(ska[])]).

Figure 8.2: Derivation tree (linearized) witnessing that an instance of att(nb[x]) is derivable

However, if resolution is applied to arbitrary pairs of rules, one quickly runs into non-termination issues.

Example 8.3. Consider the clause

R = att(senc(x, y)) ∧ att(y) → att(x)

introduced previously in the attacker capabilities. Already this clause on its own may be a source of non-termination: one may resolve this clause with itself. For clarity, consider two renamings of this clause

R1 = att(senc(x1, y1)) ∧ att(y1) → att(x1)
R2 = att(senc(x2, y2)) ∧ att(y2) → att(x2)

Then we have that R1 ◦att(senc(x2,y2)) R2 results in

R3 = att(senc(senc(x2, y2), y1)) ∧ att(y1) ∧ att(y2) → att(x2)


Applying resolution with R1 iteratively on the resolvent results in an infinite sequence of new rules of the form

att(senc(senc(. . . senc(xn, yn), . . . , y2), y1)) ∧ att(y1) ∧ att(y2) ∧ . . . ∧ att(yn) → att(xn)

This example illustrates the need to guide the resolution algorithm, which can be done by means of a selection function.

Definition 8.4. A selection function sel is a function from a clause H → C to a set of facts F, such that F ⊆ H.

Intuitively, when trying to perform resolution on a clause R we only resolve on facts that are in sel(R). In particular, the selection function should avoid facts of the form att(x) where x is a variable, as these facts unify with any other fact. Therefore Blanchet [2001] considers in particular basic selection functions. A selection function sel0 is basic if it respects the following criteria:

sel0(H → C) =
    ∅      if ∀F ∈ H. ∃x ∈ X. F = att(x)
    {F}    where F ∈ H and ∀x ∈ X. F ≠ att(x), otherwise
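A basic selection function is easy to implement. The sketch below is our own encoding (facts are tuples ("att", t), variables are strings starting with "?"); it picks the first eligible hypothesis, while the definition allows any choice.

```python
def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def sel0(clause):
    """A basic selection function: return one hypothesis that is not of
    the form att(x) for a variable x; the empty set if all of them are."""
    hyps, _conclusion = clause
    for h in hyps:
        if not (h[0] == "att" and is_var(h[1])):
            return {h}
    return set()

# The symmetric decryption clause: its senc hypothesis is selected.
decrypt = ([("att", ("senc", "?x", "?y")), ("att", "?y")], ("att", "?x"))
assert sel0(decrypt) == {("att", ("senc", "?x", "?y"))}

# The pairing clause: every hypothesis is att(x), so nothing is selected.
pairing = ([("att", "?x"), ("att", "?y")], ("att", ("pair", "?x", "?y")))
assert sel0(pairing) == set()
```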

The saturation algorithm is presented in Figure 8.3. It uses the function add(R, R) which computes the set of clauses resulting from adding the clause R to the set of clauses R, eliminating any subsumed clauses. The saturation algorithm itself proceeds in three steps. First, it eliminates any subsumed clauses in the initial set of clauses. Second, it performs resolution until a fixpoint is reached. Resolution is however only performed between selected facts and clauses for which no hypotheses are selected. Third, it returns the set of clauses that have no selected hypothesis.

To better understand the algorithm, let us consider a few possible selection functions.

• Let sel(H → C) = H. In that case the algorithm performs a simple forward search. Indeed, the algorithm uses clauses R with no hypotheses (as they verify the condition sel(R) = ∅) to get rid of an arbitrary hypothesis of another clause. At the end the algorithm returns only clauses with no hypotheses.


add(R, R) =
    R                              if ∃R′ ∈ R. R ⊑ R′
    {R} ∪ {R′ ∈ R | R′ ⋢ R}        otherwise

Saturate(R0) =

1. R := ∅
   For each R ∈ R0 do R := add(R, R)

2. Iterate until a fixed point is reached:
   For each R ∈ R such that sel(R) = ∅
     For each R′ ∈ R such that F0 ∈ sel(R′) and R ◦F0 R′ is defined
       R := add(R ◦F0 R′, R)

3. Return {(H → C) ∈ R | sel(H → C) = ∅}

Figure 8.3: First step: saturation.
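As an illustration of Figure 8.3, here is a small executable rendering, again under our own encoding (variables are strings starting with "?", clauses are hypothesis-list/conclusion pairs). Two simplifications are our assumptions, not part of the algorithm: subsumption is weakened to syntactic clause equality (enough for this toy input, unlike the real add function), and the selection function is a fixed basic one.

```python
def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def walk(t, sub):
    while is_var(t) and t in sub:
        t = sub[t]
    return t

def occurs(v, t, sub):
    t = walk(t, sub)
    return t == v or (isinstance(t, tuple)
                      and any(occurs(v, a, sub) for a in t[1:]))

def unify(t1, t2, sub):
    t1, t2 = walk(t1, sub), walk(t2, sub)
    if t1 == t2:
        return sub
    if is_var(t1):
        if occurs(t1, t2, sub):
            return None
        new = dict(sub)
        new[t1] = t2
        return new
    if is_var(t2):
        return unify(t2, t1, sub)
    if isinstance(t1, tuple) and isinstance(t2, tuple) \
            and len(t1) == len(t2) and t1[0] == t2[0]:
        for a, b in zip(t1[1:], t2[1:]):
            sub = unify(a, b, sub)
            if sub is None:
                return None
        return sub
    return None

def subst(t, sub):
    t = walk(t, sub)
    if isinstance(t, tuple):
        return (t[0],) + tuple(subst(a, sub) for a in t[1:])
    return t

def sel(clause):
    # Basic selection: one hypothesis not of the form att(x), x a variable.
    hyps, _ = clause
    for h in hyps:
        if not (h[0] == "att" and is_var(h[1])):
            return {h}
    return set()

def resolve(r, r2, f0):
    (h, c), (h2, c2) = r, r2
    sigma = unify(c, f0, {})
    if sigma is None:
        return None
    return ([subst(t, sigma) for t in h + [x for x in h2 if x != f0]],
            subst(c2, sigma))

def saturate(clauses):
    """Step 2 of Figure 8.3, with subsumption weakened to equality."""
    rules = list(clauses)
    changed = True
    while changed:
        changed = False
        for r in [x for x in rules if sel(x) == set()]:
            for r2 in list(rules):
                for f0 in sel(r2):
                    new = resolve(r, r2, f0)
                    if new is not None and new not in rules:
                        rules.append(new)
                        changed = True
    return [r for r in rules if sel(r) == set()]

# Toy input: attacker knows a and senc(s, a), plus symmetric decryption.
clauses = [
    ([], ("att", "a")),
    ([], ("att", ("senc", "s", "a"))),
    ([("att", ("senc", "?x", "?y")), ("att", "?y")], ("att", "?x")),
]
solved = saturate(clauses)
assert ([], ("att", "s")) in solved  # saturation exposes the secret s
```

Resolving the decryption clause with the two facts produces first the clause att(a) → att(s) and then the solved clause → att(s), exactly the kind of multi-step shortcut the saturation is meant to precompute.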

• Taking the other extreme case, we may define sel(R) = ∅. In that case the algorithm does not modify the set of clauses, except for removing subsumed clauses in the first step.

• Consider the case where the selection function is basic. As discussed above, we never select hypotheses of the form att(x) when applying resolution. This choice avoids many cases of non-termination due to the fact that att(x) unifies with any fact. Moreover, at the end we only keep clauses whose hypotheses are all of the form att(x) (or clauses with no hypotheses at all). These hypotheses can always be satisfied (as at least att(a[]) is initially true). A rule having a hypothesis of a different form has either been solved during resolution (and we keep only the solved form), or it has a hypothesis that is never satisfiable and hence can never occur in a derivation tree.

Blanchet has shown that for any selection function saturation preserves derivability. This is actually an adaptation of the fact that resolution with free selection is complete [de Nivelle, 1995].


deriv(R, R, R1) =
    ∅        if ∃R′ ∈ R. R ⊑ R′
    {R}      otherwise, if sel(R) = ∅
    ⋃ {deriv(R′ ◦F0 R, R ∪ {R}, R1) | R′ ∈ R1, F0 ∈ sel(R)
             and R′ ◦F0 R is defined}        otherwise

derivable(F, R1) = deriv(F → F, ∅, R1)

Figure 8.4: Second step: backward depth-first search.

Lemma 8.1 ([Blanchet, 2001]). Let F be a closed fact. F is derivable from R if and only if F is derivable from saturate(R).

It is obvious that steps 1 and 2 of Saturate preserve derivability. The technical point lies in showing that it is sufficient to consider only clauses R such that sel(R) = ∅ (step 3).

Backward search

The second step of the procedure is simply a backward depth-first search on the set of Horn clauses obtained from the saturation procedure. The procedure derivable(F, R1) is described in Figure 8.4.

The result of derivable(F, R1) is a set of clauses that provides a finite representation of the instances of F that are derivable. More precisely, for each clause R = H → C ∈ derivable(F, R1) we have that R is derivable from R1, C is an instance of F, and sel(R) = ∅. Moreover, for any instance F′ of F that is derivable from R1 there exist H → C ∈ derivable(F, R1) and σ such that F′ = Cσ and each element of Hσ is derivable from R1.

The set derivable(F, R1) is computed with the help of the auxiliary function deriv(R, R, R1). The first argument R = H → C is the current goal: C is an instance of the fact F we initially wish to derive; H are the facts needed to derive C. Initially, R = F → F. The second argument is the set of goals we have already tried. The third argument is the set of clauses R1, which stays invariant. The function is evaluated recursively.

The correctness of the backward search has been proven by Blanchet through the following lemma.


Lemma 8.2 ([Blanchet, 2001]). Let F′ be a ground instance of F. F′ is derivable from R1 if and only if there exist a clause H → C ∈ derivable(F, R1) and a substitution σ such that F′ = Cσ and any F′′ ∈ Hσ is derivable from R1.

We can now put the pieces together. If derivable(F, saturate(R)) = ∅ then F is not derivable from R, as a direct consequence of the two lemmas above. Moreover, if the selection function is basic and F is ground, F is derivable from R if and only if derivable(F, saturate(R)) ≠ ∅. This follows from the two following observations: if F is ground then for any H → C ∈ derivable(F, saturate(R)) we have that C = F, and as the selection function is basic all elements of H are of the form att(x) for some variable x, which can be derived using, e.g., att(a[]).


8.3 Exercises

Exercise 23 (**). Show that secrecy remains undecidable even when considering only protocols with atomic keys (that is, only constants are used as keys).

Exercise 24 (*). Consider the set of clauses RI corresponding to the rules for symmetric encryption.

RI = {I(x) ∧ I(y) → I({x}y), I({x}y) ∧ I(y) → I(x)}

1. Show that I({a}k2) is derivable from C1 = RI ∪ {I(k1), I({k2}k1), I(a)} using the procedure seen in this chapter with a basic selection function.

2. Show that I({a}k2) is not derivable from C2 = RI ∪ {I(k1), I(a)}. You may use the procedure seen in this chapter with a basic selection function.


9 Further readings and conclusion

The aim of our tutorial was to give an introduction to some selected topics in the area of security protocol verification. Without aiming to give an exhaustive bibliography, we conclude our tutorial with a few pointers towards other approaches and some new directions.

Other approaches for protocol verification  We have seen that bounding the number of sessions is sufficient to make the problem of protocol verification decidable (for several equational theories and properties). If one is willing to additionally bound the length of messages, the resulting system is finite and general purpose model checkers can be used to analyze protocols. Even though this approach does not provide strong security guarantees, it has been successfully used to detect flaws in protocols, see for instance the work by Mitchell et al. [1997] using Murφ and Lowe [1997b] using Casper and FDR.

Other approaches go in the opposite direction. Instead of simplifying the model to ensure automation, they sacrifice automation in order to keep the full adversary model. Paulson [1998] has for instance used the Isabelle theorem prover to prove protocols correct. The tamarin tool [Schmidt et al., 2012] is a more recent, dedicated tool that allows for interactive proof construction when the automatic mode fails.

It is also possible to use static analysis techniques, in particular type systems. This line of work goes back to Abadi’s result on secrecy by typing [Abadi, 1997]. The use of a type system is of course incomplete, i.e., some secure protocols may not type check, but the type system dictates a discipline on how a protocol should be programmed. These ideas have been generalized in many directions and we refer the reader to Focardi and Maffei’s chapter on types for security protocols [Focardi and Maffei, 2011]. One may in particular mention the F7 type checker [Bengtson et al., 2011] which allows one to verify a large variety of trace properties for protocols written in the F# functional language, by using refinement types and outsourcing the verification conditions to an SMT solver.

Extending the scope  Most protocol verification tools allow the verification of trace properties, such as weak secrecy and some forms of correspondence properties, but do not support the verification of equivalence properties. An exception is the ProVerif tool which offers some, albeit limited, support for equivalence properties [Blanchet et al., 2005]. This is due in particular to the fact that it verifies a more fine-grained property which is too strong in many examples, e.g. when verifying unlinkability in e-passports or privacy in voting protocols. Recently, a few dedicated tools have been developed for verifying trace equivalence for a bounded number of sessions [Tiu and Dawson, 2010, Cheval et al., 2011, Chadha et al., 2012]. However, existing tools have either limited support for equational theories or do not support non-trivial else branches. These tools are still at a prototype stage and not as mature as the ones for trace properties. An interesting direction for future work is to design interactive tools for proving equivalence properties for an unbounded number of sessions.

Other shortcomings of existing tools concern the types of protocols they can analyze. Some protocols require maintaining a global, mutable state, i.e. a memory that can be read and modified by parallel threads. A prominent example where modelling such a state is required are security hardware modules which can be accessed through a security API, such as the RSA PKCS#11 standard, IBM’s CCA or the Trusted Platform Module (TPM). These security tokens have been known to be vulnerable to logical attacks for some time [Longley and Rigby, 1992, Bond and Anderson, 2001] and formal analysis has shown to be a valuable tool to identify attacks and find secure configurations [Delaune et al., 2010b, Bortolozzo et al., 2010]. Apart from security APIs, many other protocols need to maintain databases: key servers need to store the status of keys, in optimistic contract signing protocols a trusted party maintains the status of a contract, RFID protocols maintain the status of tags, and more generally websites may need to store the current status of transactions.

Many existing tools, in particular those that can verify protocols for an unbounded number of sessions, do not handle this kind of state well. Limitations of an encoding of memory cells are for instance discussed explicitly in the ProVerif manual [Blanchet et al., 2013, Section 6.3.3]:

“Due to the abstractions performed by ProVerif, such a cell is treated in an approximate way: all values written in the cell are considered as a set, and when one reads the cell, ProVerif just guarantees that the obtained value is one of the written values (not necessarily the last one, and not necessarily one written before the read).”

Some works [Arapinis et al., 2011, Mödersheim, 2010, Delaune et al., 2011] have nevertheless used ingenious encodings of mutable state in Horn clauses, but these encodings have limitations. The most recent advances on the verification of such stateful protocols are based on the tamarin prover [Schmidt et al., 2013, Kremer and Künnemann, 2014]. These works overcome existing limitations but require user interaction.

Another kind of protocols for which only very limited tool support exists are group protocols, e.g. group key exchange protocols, where participants need to process lists or other recursive data structures. This kind of data structure also appears in web services that have to process XML documents. Some of the first results on this topic were obtained by Paulson [1997], who used the Isabelle theorem prover to prove a recursive authentication protocol. In the case of a bounded number of sessions, Truderung [2005] showed decidability of secrecy for recursive protocols, and Chridi et al. [2009] adapted the constraint solving approach to analyze protocols that manipulate unbounded lists. Recently, the ProVerif tool was also adapted to be able to analyze protocols with unbounded lists, and this for an unbounded number of sessions [Blanchet and Paiola, 2010]. Finally, the tamarin prover has also been used to analyze group key agreement protocols, providing support for recursive structures, loops, and equational theories for Diffie-Hellman exponentiation and bilinear pairings [Kremer and Künnemann, 2014].

Towards more realistic models  As in other areas of verification, the security proofs discussed in this tutorial are carried out in an abstract, mathematical model. In so-called Dolev-Yao models the cryptography is indeed considered as “perfect”, i.e. an intruder may only manipulate terms according to some deduction rules or equations. Moreover, nonces are unguessable. This contrasts with the computational models adopted in cryptography, see e.g. [Goldwasser and Micali, 1984], where adversaries are arbitrary probabilistic polynomial time Turing machines and security is expressed in terms of the probability of an adversary breaking a security property. In such models, proofs are done in a way similar to proofs in complexity theory, by reducing the problem of attacking the protocol to the problem of breaking the underlying cryptographic primitives, e.g. encryption. While these models are arguably more precise and provide better security guarantees, the proofs are more complicated, generally done by hand, and also more error-prone (the models being much more complex, writing very detailed proofs quickly becomes infeasible). In their seminal work, Abadi and Rogaway [2002] show that it is sometimes possible to prove computational soundness of Dolev-Yao models. Such soundness results guarantee that a proof in a Dolev-Yao model can be translated into a proof in the computational model, yielding strong guarantees while benefiting from tool support in the more abstract model. We refer the interested reader to a survey paper [Cortier et al., 2010] on this topic for an extensive description and bibliography. There have also been some attempts to formalize and (partially) automate proofs directly in computational models. The main successes in that direction are the CryptoVerif prover [Blanchet, 2006] and the EasyCrypt tool [Barthe et al., 2011], a dedicated, interactive theorem prover.

Another challenge is to directly analyze the source code rather than working with abstract models, be they symbolic, Dolev-Yao models or computational ones. The CSur tool [Goubault-Larrecq and Parrennes, 2005] relies on dedicated abstract interpretation techniques to translate security protocols written in C into a set of Horn clauses that can then be analyzed. The ASPIER tool [Chaki and Datta, 2009] also allows one to analyze C programs, combining software model checking techniques (in particular counter-example guided abstraction refinement) with symbolic protocol verification. Some works also build on general purpose program analysis tools to analyze protocols written in C [Dupressoir et al., 2011] and Java [Küsters et al., 2012]. Type systems, such as F7 [Bengtson et al., 2011], have also been used to analyze protocols written in functional languages. Yet another approach is to generate the protocol code from verified specifications, rather than trying to directly analyze existing code, see [Pironti and Sisto, 2010, Cadé and Blanchet, 2013].

Conclusion  To conclude, the formal analysis of security protocols is now a mature field that offers several powerful techniques to perform security proofs or to find flaws. Many challenges remain to be solved, such as obtaining security proofs in more accurate models, verifying implementations, or tackling new families of protocols such as e-voting or APIs for secure elements.


Acknowledgements

We would like to thank the reviewer for the careful reading and the helpful suggestions for improvement. We also thank the editors for their suggestion to write this tutorial and for their patience and support until it came to a final version.

This work has received support from the European Research Council under the European Union’s Seventh Framework Programme (FP7/2007-2013) / ERC grant agreement n° 258865 and the ANR project ProSe (decision ANR 2010-VERS-004).



References

M. Abadi. Secrecy by typing in security protocols. In Proc. 3rd International Symposium on Theoretical Aspects of Computer Software (TACS’97), volume 1281 of Lecture Notes in Computer Science, pages 611–638. Springer, 1997.

M. Abadi and V. Cortier. Deciding knowledge in security protocols under equational theories. In Proc. 31st International Colloquium on Automata, Languages, and Programming (ICALP’04), volume 3142 of Lecture Notes in Computer Science, pages 46–58. Springer, 2004.

M. Abadi and V. Cortier. Deciding knowledge in security protocols under equational theories. Theoretical Computer Science, 387(1-2):2–32, 2006.

M. Abadi and C. Fournet. Private authentication. Theoretical Computer Science, 322(3):427–476, 2004.

M. Abadi and C. Fournet. Mobile values, new names, and secure communication. In Proc. 28th ACM Symposium on Principles of Programming Languages (POPL’01), pages 104–115. ACM, 2001.

M. Abadi and A. D. Gordon. A calculus for cryptographic protocols: The spi calculus. Information and Computation, 148(1):1–70, 1999.

M. Abadi and P. Rogaway. Reconciling two views of cryptography (the computational soundness of formal encryption). Journal of Cryptology, 15(2):103–127, 2002.


R. Amadio and W. Charatonik. On name generation and set-based analysis in the Dolev-Yao model. In Proc. 13th International Conference on Concurrency Theory (CONCUR’02), Lecture Notes in Computer Science, pages 499–514. Springer, 2002.

R. Amadio and D. Lugiez. On the reachability problem in cryptographic protocols. In Proc. 12th International Conference on Concurrency Theory (CONCUR’00), volume 1877 of Lecture Notes in Computer Science, pages 380–394, 2000.

R. Amadio, D. Lugiez, and V. Vanackère. On the symbolic reduction of processes with cryptographic functions. Theoretical Computer Science, 290(1):695–740, 2002.

S. Anantharaman, P. Narendran, and M. Rusinowitch. Intruders with caps. In Proc. 18th International Conference on Term Rewriting and Applications (RTA’07), volume 4533 of Lecture Notes in Computer Science, pages 20–35. Springer, 2007.

M. Arapinis, T. Chothia, E. Ritter, and M. D. Ryan. Analysing unlinkability and anonymity using the applied pi calculus. In Proc. 23rd Computer Security Foundations Symposium (CSF’10), pages 107–121. IEEE Comp. Soc. Press, 2010.

M. Arapinis, E. Ritter, and M. D. Ryan. StatVerif: Verification of stateful processes. In Proc. 24th IEEE Computer Security Foundations Symposium (CSF’11), pages 33–47. IEEE Comp. Soc. Press, 2011.

A. Armando, D. Basin, Y. Boichut, Y. Chevalier, L. Compagna, J. Cuellar, P. Hankes Drielsma, P.-C. Héam, O. Kouchnarenko, J. Mantovani, S. Mödersheim, D. von Oheimb, M. Rusinowitch, J. Santiago, M. Turuani, L. Viganò, and L. Vigneron. The AVISPA tool for the automated validation of internet security protocols and applications. In K. Etessami and S. Rajamani, editors, Proc. 17th International Conference on Computer Aided Verification (CAV’2005), volume 3576 of Lecture Notes in Computer Science, pages 281–285, Edinburgh, Scotland, 2005. Springer.

A. Armando, R. Carbone, L. Compagna, J. Cuellar, and L. Tobarra Abad. Formal analysis of SAML 2.0 web browser single sign-on: Breaking the SAML-based single sign-on for Google Apps. In Proc. 6th ACM Workshop on Formal Methods in Security Engineering (FMSE 2008), pages 1–10, 2008.

F. Baader and T. Nipkow. Term Rewriting and All That. Cambridge University Press, 1998. ISBN 978-0-521-45520-6.

F. Baader and W. Snyder. Unification theory. In J. A. Robinson and A. Voronkov, editors, Handbook of Automated Reasoning, pages 445–532. Elsevier and MIT Press, 2001. ISBN 0-444-50813-9, 0-262-18223-8.


M. Backes, B. Pfitzmann, and M. Waidner. The reactive simulatability (RSIM) framework for asynchronous systems. Information and Computation, 205(12):1685–1720, 2007.

G. Barthe, B. Grégoire, S. Heraud, and S. Zanella Béguelin. Computer-aided security proofs for the working cryptographer. In Advances in Cryptology - CRYPTO 2011, volume 6841 of Lecture Notes in Computer Science, pages 71–90. Springer, 2011.

D. Basin, C. Cremers, and S. Meier. Provably repairing the ISO/IEC 9798 standard for entity authentication. In Proc. 1st Conference on Principles of Security and Trust (POST’12), volume 7215 of Lecture Notes in Computer Science, pages 129–148. Springer, 2012.

M. Baudet. Deciding security of protocols against off-line guessing attacks. In Proc. 12th ACM Conference on Computer and Communications Security (CCS’05), pages 16–25. ACM, November 2005.

M. Baudet, V. Cortier, and S. Delaune. YAPA: A generic tool for computing intruder knowledge. ACM Transactions on Computational Logic, 14(1), 2013.

J. Bengtson, K. Bhargavan, C. Fournet, A. D. Gordon, and S. Maffeis. Refinement types for secure implementations. ACM Transactions on Programming Languages and Systems, 33(2):8:1–8:45, 2011.

M. Berrima, N. Ben Rajeb, and V. Cortier. Deciding knowledge in security protocols under some e-voting theories. Theoretical Informatics and Applications (RAIRO-ITA), 45:269–299, 2011.

B. Blanchet. A computationally sound mechanized prover for security protocols. In Proc. IEEE Symposium on Security and Privacy (SP’06), pages 140–154. IEEE Comp. Soc. Press, 2006.

B. Blanchet. Automatic verification of correspondences for security protocols. Journal of Computer Security, 17(4):363–434, 2009.

B. Blanchet. Formal Models and Techniques for Analyzing Security Protocols, chapter Using Horn Clauses for Analyzing Security Protocols. Volume 5 of Cortier and Kremer [2011], 2011.

B. Blanchet. Automatic proof of strong secrecy for security protocols. In Proc. IEEE Symposium on Security and Privacy (SP’04), pages 86–100. IEEE Comp. Soc. Press, 2004.

B. Blanchet. An efficient cryptographic protocol verifier based on Prolog rules. In Proc. 14th Computer Security Foundations Workshop (CSFW’01). IEEE Comp. Soc. Press, 2001.


B. Blanchet and M. Paiola. Automatic verification of protocols with lists of unbounded length. In Proc. 17th ACM Conference on Computer and Communications Security (CCS’10), pages 573–584. ACM, 2010.

B. Blanchet, M. Abadi, and C. Fournet. Automated verification of selected equivalences for security protocols. In Proc. 20th Symposium on Logic in Computer Science (LICS’05), pages 331–340. IEEE Comp. Soc. Press, 2005.

B. Blanchet, B. Smyth, and V. Cheval. ProVerif 1.88: Automatic Cryptographic Protocol Verifier, User Manual and Tutorial, 2013.

F. Böhl and D. Unruh. Symbolic universal composability. In Proc. 26th Computer Security Foundations Symposium (CSF’13), pages 257–271, 2013.

M. Bond and R. Anderson. API level attacks on embedded systems. IEEE Computer Magazine, pages 67–75, October 2001.

M. Bortolozzo, M. Centenaro, R. Focardi, and G. Steel. Attacking and fixing PKCS#11 security tokens. In Proc. 17th ACM Conference on Computer and Communications Security (CCS’10), pages 260–269. ACM, 2010.

M. Brusò, K. Chatzikokolakis, and J. den Hartog. Formal verification of privacy for RFID systems. In Proc. 23rd Computer Security Foundations Symposium (CSF’10), pages 75–88. IEEE Comp. Soc. Press, 2010.

S. Bursuc, H. Comon-Lundh, and S. Delaune. Associative-commutative deducibility constraints. In Proc. 24th Annual Symposium on Theoretical Aspects of Computer Science (STACS’07), volume 4393 of Lecture Notes in Computer Science, pages 634–645. Springer, 2007.

D. Cadé and B. Blanchet. Proved generation of implementations from computationally-secure protocol specifications. In Proc. 2nd Conference on Principles of Security and Trust (POST’13), volume 7796 of Lecture Notes in Computer Science, pages 63–82. Springer, 2013.

R. Canetti. Universally composable security: A new paradigm for cryptographic protocols. In Proc. 42nd IEEE Symposium on Foundations of Computer Science (FOCS’01), pages 136–145. IEEE Comp. Soc. Press, 2001.

R. Chadha, Ș. Ciobâcă, and S. Kremer. Automated verification of equivalence properties of cryptographic protocols. In Programming Languages and Systems — Proc. 21st European Symposium on Programming (ESOP’12), volume 7211 of Lecture Notes in Computer Science, pages 108–127. Springer, 2012.

S. Chaki and A. Datta. ASPIER: An automated framework for verifying security protocol implementations. In Proc. 22nd Computer Security Foundations Symposium (CSF’09), pages 172–185. IEEE Comp. Soc. Press, 2009.


V. Cheval. Automatic verification of cryptographic protocols: privacy-type properties. Thèse de doctorat, Laboratoire Spécification et Vérification, ENS Cachan, France, December 2012.

V. Cheval. APTE: an algorithm for proving trace equivalence. In Proc. 20th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS’14), volume 8413 of Lecture Notes in Computer Science, pages 587–592. Springer, 2014.

V. Cheval, H. Comon-Lundh, and S. Delaune. Trace equivalence decision: Negative tests and non-determinism. In Proc. 18th ACM Conference on Computer and Communications Security (CCS’11), pages 321–330. ACM, 2011.

V. Cheval, V. Cortier, and A. Plet. Lengths may break privacy – or how to check for equivalences with length. In Proc. 25th International Conference on Computer Aided Verification (CAV’13), volume 8043 of Lecture Notes in Computer Science, pages 708–723. Springer, 2013.

Y. Chevalier and M. Rusinowitch. Compiling and securing cryptographic protocols. Information Processing Letters, 110(3):116–122, 2010.

T. Chothia and V. Smirnov. A traceability attack against e-passports. In Proc. 14th International Conference on Financial Cryptography and Data Security (FC’10), volume 6052 of Lecture Notes in Computer Science, pages 20–34. Springer, 2010.

N. Chridi, M. Turuani, and M. Rusinowitch. Decidable analysis for a class of cryptographic group protocols with unbounded lists. In Proc. 22nd Computer Security Foundations Symposium (CSF’09), pages 277–289. IEEE Comp. Soc. Press, 2009.

Ș. Ciobâcă, S. Delaune, and S. Kremer. Computing knowledge in security protocols under convergent equational theories. Journal of Automated Reasoning, 48(2):219–262, 2012.

H. Comon-Lundh and V. Shmatikov. Intruder deductions, constraint solving and insecurity decision in presence of Exclusive Or. In Proc. 18th Annual IEEE Symposium on Logic in Computer Science (LICS’03), pages 271–280, Los Alamitos, CA, 2003. IEEE Computer Society.

H. Comon-Lundh, V. Cortier, and E. Zălinescu. Deciding security properties for cryptographic protocols. Application to key cycles. ACM Transactions on Computational Logic, 11(2), 2010.


R. Corin and S. Etalle. An improved constraint-based system for the verification of security protocols. In Proc. 9th International Static Analysis Symposium (SAS’02), volume 2477 of Lecture Notes in Computer Science, pages 326–341. Springer, 2003.

R. Corin, J. Doumen, and S. Etalle. Analysing password protocol security against off-line dictionary attacks. ENTCS, 121:47–63, 2005.

R. Corin, S. Etalle, and A. Saptawijaya. A logic for constraint-based security protocol analysis. In Proc. IEEE Symposium on Security and Privacy (SP’06), pages 155–168. IEEE Comp. Soc. Press, 2006.

V. Cortier and S. Delaune. Decidability and combination results for two notions of knowledge in security protocols. Journal of Automated Reasoning, 48, 2012.

V. Cortier and S. Kremer, editors. Formal Models and Techniques for Analyzing Security Protocols, volume 5 of Cryptology and Information Security Series. IOS Press, 2011.

V. Cortier, S. Kremer, and B. Warinschi. A survey of symbolic methods in computational analysis of cryptographic systems. Journal of Automated Reasoning, 46(3-4):225–259, 2010.

C. Cremers. The Scyther Tool: Verification, falsification, and analysis of security protocols. In Proc. 20th International Conference on Computer Aided Verification (CAV’08), volume 5123 of Lecture Notes in Computer Science, pages 414–418. Springer, 2008.

M. D. Davis and E. J. Weyuker. Computability, Complexity and Languages, chapter 7, pages 128–132. Computer Science and Applied Mathematics. Academic Press, 1983.

H. de Nivelle. Ordering Refinements of Resolution. PhD thesis, Technische Universiteit Delft, 1995.

S. Delaune and F. Jacquemard. A decision procedure for the verification of security protocols with explicit destructors. In Proc. 11th ACM Conference on Computer and Communications Security (CCS’04), pages 278–287. ACM, 2004.

S. Delaune and F. Jacquemard. Decision procedures for the security of protocols with probabilistic encryption against offline dictionary attacks. Journal of Automated Reasoning, 36(1-2):85–124, 2006.


S. Delaune, S. Kremer, and O. Pereira. Simulation based security in the applied pi calculus. In Proc. 29th Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS’09), volume 4 of Leibniz International Proceedings in Informatics, pages 169–180. Leibniz-Zentrum für Informatik, 2009a.

S. Delaune, S. Kremer, and M. D. Ryan. Verifying privacy-type properties of electronic voting protocols. Journal of Computer Security, 17(4):435–487, 2009b.

S. Delaune, S. Kremer, and M. D. Ryan. Symbolic bisimulation for the applied pi calculus. Journal of Computer Security, 18(2):317–377, 2010a.

S. Delaune, S. Kremer, and G. Steel. Formal analysis of PKCS#11 and proprietary extensions. Journal of Computer Security, 18(6):1211–1245, 2010b.

S. Delaune, S. Kremer, M. D. Ryan, and G. Steel. Formal analysis of protocols based on TPM state registers. In Proc. 24th IEEE Computer Security Foundations Symposium (CSF’11), pages 66–82. IEEE Press, 2011.

S. Delaune, S. Kremer, and D. Pasaila. Security protocols, constraint systems, and group theories. In Proc. 6th International Joint Conference on Automated Reasoning (IJCAR’12), volume 7364 of Lecture Notes in Artificial Intelligence, pages 164–178. Springer, 2012.

W. Diffie and M. Hellman. New directions in cryptography. IEEE Transactions on Information Theory, 22(6):644–654, 1976.

D. Dolev and A. C. Yao. On the security of public key protocols. In Proc. 22nd Symposium on Foundations of Computer Science, pages 350–357. IEEE Comp. Soc. Press, 1981.

N. Dong, H. Jonker, and J. Pang. Analysis of a receipt-free auction protocol in the applied pi calculus. In Proc. International Workshop on Formal Aspects in Security and Trust (FAST’10), volume 6561 of Lecture Notes in Computer Science, pages 223–238. Springer, 2011.

J. Dreier, P. Lafourcade, and Y. Lakhnech. Formal verification of e-auction protocols. In Proc. 2nd Conference on Principles of Security and Trust (POST’13), volume 7796 of Lecture Notes in Computer Science, pages 247–266. Springer, 2013.

F. Dupressoir, A. D. Gordon, J. Jürjens, and D. A. Naumann. Guiding a general-purpose C verifier to prove cryptographic protocols. In Proc. 24th Computer Security Foundations Symposium (CSF’11), pages 3–17. IEEE Comp. Soc. Press, 2011.


N. Durgin, P. Lincoln, J. Mitchell, and A. Scedrov. Undecidability of bounded security protocols. In Proc. Workshop on Formal Methods and Security Protocols, 1999.

S. Escobar, C. Meadows, and J. Meseguer. Maude-NPA: Cryptographic protocol analysis modulo equational properties. In Foundations of Security Analysis and Design V, volume 5705 of Lecture Notes in Computer Science, pages 1–50. Springer, 2009.

R. Focardi and M. Maffei. Formal Models and Techniques for Analyzing Security Protocols, chapter Types for Security Protocols. Volume 5 of Cortier and Kremer [2011], 2011.

R. Focardi and F. Martinelli. A uniform approach for the definition of security properties. In Proc. World Congress on Formal Methods (FM’99), Lecture Notes in Computer Science, pages 794–813. Springer, 1999.

A. Fujioka, T. Okamoto, and K. Ohta. A practical secret voting scheme for large scale elections. In Advances in Cryptology - AUSCRYPT’92, volume 718 of Lecture Notes in Computer Science, pages 244–251. Springer-Verlag, 1992.

Th. Genet and F. Klay. Rewriting for cryptographic protocol verification. In Proc. 17th International Conference on Automated Deduction (CADE’00), volume 1831 of Lecture Notes in Computer Science, pages 271–290. Springer, 2000.

S. Goldwasser and S. Micali. Probabilistic encryption. Journal of Computer and System Sciences, 28, 1984.

D. Gollmann. What do we mean by entity authentication? In Proc. Symposium on Security and Privacy (SP’96), pages 46–54. IEEE Comp. Soc. Press, 1996.

J. Goubault-Larrecq. A method for automatic cryptographic protocol verification (extended abstract). In Proc. Workshops of the 15th International Parallel and Distributed Processing Symposium, volume 1800 of Lecture Notes in Computer Science, pages 977–984. Springer, 2000.

J. Goubault-Larrecq and F. Parrennes. Cryptographic protocol analysis on real C code. In Proc. 6th International Conference on Verification, Model Checking and Abstract Interpretation (VMCAI’05), volume 3385 of Lecture Notes in Computer Science, pages 363–379. Springer, 2005.

ISO/IEC 9798-1. Information technology - Security techniques - Entity authentication mechanisms; Part 1: General model. International Organization for Standardization, second edition, September 1991.


F. Jacquemard, M. Rusinowitch, and L. Vigneron. Compiling and verifying security protocols. In Proc. 7th International Conference on Logic for Programming and Automated Reasoning (LPAR’00), volume 1955 of Lecture Notes in Computer Science, pages 131–160. Springer, 2000.

S. Kremer and R. Künnemann. Automated analysis of security protocols with global state. In Proc. 35th IEEE Symposium on Security and Privacy (SP’14), pages 163–178. IEEE Comp. Soc. Press, 2014.

S. Kremer and L. Mazaré. Adaptive soundness of static equivalence. In Proc. 12th European Symposium on Research in Computer Security (ESORICS’07), volume 4734 of Lecture Notes in Computer Science, pages 610–625. Springer, 2007.

S. Kremer and M. D. Ryan. Analysis of an electronic voting protocol in the applied pi-calculus. In Programming Languages and Systems — Proc. 14th European Symposium on Programming (ESOP’05), volume 3444 of Lecture Notes in Computer Science, pages 186–200. Springer, 2005.

S. Kremer, A. Mercier, and R. Treinen. Reducing equational theories for the decision of static equivalence. Journal of Automated Reasoning, 48(2):197–217, 2012.

R. Küsters, T. Truderung, and J. Graf. A framework for the cryptographic verification of Java-like programs. In Proc. 25th Computer Security Foundations Symposium (CSF’12), pages 198–212. IEEE Comp. Soc. Press, 2012.

D. Longley and S. Rigby. An automatic search for security flaws in key management schemes. Computers and Security, 11(1):75–89, March 1992.

G. Lowe. Analysing protocols subject to guessing attacks. Journal of Computer Security, 12(1):83–98, 2004.

G. Lowe. A hierarchy of authentication specifications. In Proc. 10th Computer Security Foundations Workshop (CSFW’97), pages 31–44. IEEE Comp. Soc. Press, 1997a.

G. Lowe. Breaking and fixing the Needham-Schroeder public-key protocol using FDR. In Proc. 2nd International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS’96), volume 1055 of Lecture Notes in Computer Science, pages 147–166. Springer-Verlag, 1996.

G. Lowe. Casper: a compiler for the analysis of security protocols. In Proc. 10th Computer Security Foundations Workshop (CSFW’97), pages 18–30. IEEE Comp. Soc. Press, 1997b.

C. Meadows. The NRL protocol analyzer: An overview. Journal of Logic Programming, 26(2):113–131, 1996.

J. Millen. A necessarily parallel attack. In Proc. Workshop on Formal Methods and Security Protocols (FMSP’99), 1999.


J. Millen and G. Denker. CAPSL and MuCAPSL. Journal of Telecommunications and Information Technology, 4:16–27, 2002.

J. Millen and V. Shmatikov. Constraint solving for bounded-process cryptographic protocol analysis. In Proc. 8th ACM Conference on Computer and Communications Security (CCS’01), 2001.

J. C. Mitchell, M. Mitchell, and U. Stern. Automated analysis of cryptographic protocols using Murϕ. In Proc. IEEE Symposium on Security and Privacy (SP’97), pages 141–153, 1997.

S. Mödersheim. Abstraction by set-membership: verifying security protocols and web services with databases. In Proc. 17th ACM Conference on Computer and Communications Security (CCS’10), pages 351–360. ACM, 2010.

D. Monniaux. Abstracting cryptographic protocols with tree automata. Science of Computer Programming, 47(2-3):177–202, 2003.

R. M. Needham and M. D. Schroeder. Using encryption for authentication in large networks of computers. Communications of the ACM, 21(12):993–999, 1978.

L. C. Paulson. Mechanized proofs for a recursive authentication protocol. In Proc. 10th Computer Security Foundations Workshop (CSFW’97), pages 84–95. IEEE Comp. Soc. Press, 1997.

L. C. Paulson. The inductive approach to verifying cryptographic protocols. Journal of Computer Security, 6(1/2):85–128, 1998.

A. Pironti and R. Sisto. Provably correct Java implementations of Spi Calculus security protocols specifications. Computers & Security, 29:302–314, 2010.

M. Rusinowitch and M. Turuani. Protocol insecurity with finite number of sessions is NP-complete. In Proc. 14th Computer Security Foundations Workshop (CSFW’01), pages 174–190. IEEE Comp. Soc. Press, 2001.

M. Rusinowitch and M. Turuani. Protocol insecurity with finite number of sessions and composed keys is NP-complete. Theoretical Computer Science, 299:451–475, 2003.

P. Y. A. Ryan, S. A. Schneider, M. Goldsmith, G. Lowe, and A. W. Roscoe. Modelling and Analysis of Security Protocols. Addison Wesley, 2000.

B. Schmidt, S. Meier, C. Cremers, and D. Basin. Automated analysis of Diffie-Hellman protocols and advanced security properties. In Proc. 25th IEEE Computer Security Foundations Symposium (CSF’12), pages 78–94. IEEE Comp. Soc. Press, 2012.


B. Schmidt, S. Meier, C. Cremers, and D. Basin. The Tamarin prover for the symbolic analysis of security protocols. In Proc. 25th International Conference on Computer Aided Verification (CAV’13), volume 8044 of Lecture Notes in Computer Science, pages 696–701. Springer, 2013.

S. Schneider. Verifying authentication protocols with CSP. In Proc. 10th Computer Security Foundations Workshop (CSFW’97). IEEE Comp. Soc. Press, 1997.

D. Song. Athena, an automatic checker for security protocol analysis. In Proc. 12th IEEE Computer Security Foundations Workshop (CSFW’99). IEEE Comp. Soc. Press, 1999.

J. Thayer, J. Herzog, and J. Guttman. Strand spaces: proving security protocols correct. IEEE Journal of Computer Security, 7:191–230, 1999.

A. Tiu and J. Dawson. Automating open bisimulation checking for the spi-calculus. In Proc. 23rd Computer Security Foundations Symposium (CSF’10), pages 307–321. IEEE Comp. Soc. Press, 2010.

T. Truderung. Selecting theories and recursive protocols. In Proc. 16th International Conference on Concurrency Theory (CONCUR’05), volume 3653 of Lecture Notes in Computer Science, pages 217–232. Springer, 2005.

Ch. Weidenbach. Towards an automatic analysis of security protocols in first-order logic. In Proc. 16th International Conference on Automated Deduction (CADE’99), volume 1632 of Lecture Notes in Computer Science, pages 314–328. Springer, 1999.

T. Y. C. Woo and S. S. Lam. Authentication for distributed systems. In Proc. Symposium on Security and Privacy (SP’92), pages 178–194. IEEE Comp. Soc. Press, 1992.