Background on cryptography and security analysis
CIS800/003
Sept. 28, 2011
Carnegie Mellon
Portions copyright © by Lujo Bauer and Michael Reiter, 2006-2011

What could go wrong with BGP?
- Access to YouTube was cut off for a couple of hours on Feb. 24, 2008.
- Pakistan Telecom announced a more specific BGP route for the block of IP addresses that YouTube uses.
- How do I know that this path has not been altered by malicious ASes?


Topics
- Mechanisms to achieve security
  - Overview of cryptography
- Background knowledge on security analysis
  - Formal models
  - Logic for specifying security properties
  - Techniques for verification


What Are the Security Requirements?
- Confidentiality
- Integrity
- Authenticity
- Availability
- Auditability
- Access control
- Privacy


Integrity, Authentication
- Data integrity
  - Ensure data is "correct" (i.e., correct syntax and unchanged)
  - Prevents unauthorized or improper changes
  - E.g., Trent always verifies the integrity of his database after restoring a backup, to ensure that no incorrect records exist
- Entity authentication (identification)
  - Verify the identity of another protocol participant
  - E.g., Alice authenticates Bob each time they establish a secure connection
- Data authentication
  - Ensure that data originates from the claimed sender
  - E.g., for every message Bob sends, Alice authenticates it to ensure that it originates from Bob


Secrecy, Confidentiality, Privacy, Anonymity
- Secrecy
  - Keep data hidden
  - E.g., Alice kept the incriminating information secret
- Confidentiality
  - Keep (someone else's) data hidden from unauthorized entities
  - E.g., banks keep much account information confidential
- Privacy
  - Keep data about a person secret
  - E.g., to protect Alice's privacy, company XYZ did not disclose any personal information
- Anonymity
  - Keep the identity of a protocol participant secret
  - E.g., to hide her identity from the web server, Alice uses The Onion Router (Tor) to communicate


Temporal Properties
- Freshness
  - Prove that data was created after an event
  - Gives an upper bound on the duration of its existence
- Temporal order
  - Verify the ordering of a sequence of events


Cryptography
- Building blocks for enforcing security properties
- The study of techniques for communicating securely in the presence of an adversary
- Goal: a dedicated, private connection between Alice and Bob
- Reality: communication via an adversary

Adversary's Goals
1. Observe what Alice and Bob are communicating
   - Attacks on "confidentiality" or "secrecy"
2. Observe that Alice and Bob are communicating, or how much they are communicating
   - Called "traffic analysis"
3. Modify communication between Alice and Bob
   - Attacks on "integrity"
4. Impersonate Alice to Bob, or vice versa
5. Prevent Alice and Bob from communicating
   - Called "denial of service"

Cryptography traditionally focuses on preventing (1) and detecting (3) and (4).


Adversary's Goals in Perspective
- Detecting modification and impersonation attacks amounts to determining who could have sent a communication
  - Authentication
- Preventing observation attacks amounts to limiting who can possibly receive a communication
  - Confidentiality


Digital Signature
- Goal: a digital signature that
  - cannot be forged by the adversary
  - can be uniquely linked to the content of the message
  - can be uniquely linked to the sender
- Setting: Alice signs, Bob verifies, communicating via an adversary
- Provides message integrity and sender authentication


Digital Signatures (Informal Definition)
- A digital signature scheme is a triple <G, S, V> of efficiently computable algorithms
  - G outputs a "public key" K and a "private key" K⁻¹:  <K, K⁻¹> ← G()
  - S takes a "message" m and K⁻¹ as input and outputs a "signature" σ:  σ ← S_K⁻¹(m)
  - V takes a message m, a signature σ, and a public key K as input, and outputs a bit b:  b ← V_K(m, σ)
- If σ ← S_K⁻¹(m), then V_K(m, σ) outputs 1 ("valid")
- Given only K and message/signature pairs {<m_i, S_K⁻¹(m_i)>}_i, it is computationally infeasible to compute <m, σ> such that V_K(m, σ) = 1 for any new m ≠ m_i

Digital Signature
- Alice holds the private key K⁻¹; Bob holds the public key K
- Alice computes σ = S_K⁻¹(m) and sends (m, σ)
- Bob accepts m only if V_K(m, σ) = 1
- Provides message integrity and sender authentication
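To make the <G, S, V> interface concrete, here is a minimal sketch in Python using the Ed25519 signature primitives from the third-party `cryptography` package. The wrapper function and variable names mirror the definition above and are illustrative, not part of the lecture.

```python
# Minimal sketch of the <G, S, V> signature interface using Ed25519
# (requires the third-party "cryptography" package).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# G: generate <K, K^-1>
sk = Ed25519PrivateKey.generate()      # private key K^-1
pk = sk.public_key()                   # public key K

# S: sign a message m with K^-1
m = b"route advertisement from Alice"
sigma = sk.sign(m)

# V: verify (m, sigma) with K; returns 1 for valid, 0 otherwise
def verify(public_key, message, signature):
    try:
        public_key.verify(signature, message)
        return 1
    except InvalidSignature:
        return 0

assert verify(pk, m, sigma) == 1
assert verify(pk, m + b"tampered", sigma) == 0
```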

Informal Definition of a MAC
- A message authentication code (MAC) scheme is a triple <G, T, V> of efficiently computable functions
  - G outputs a "secret key" K:  K ← G()
  - T takes a key K and a "message" m as input, and outputs a "tag" t:  t ← T_K(m)
  - V takes a message m, a tag t, and a key K as input, and outputs a bit b:  b ← V_K(m, t)
- If t ← T_K(m), then V_K(m, t) outputs 1 ("valid")
- Given only message/tag pairs {<m_i, T_K(m_i)>}_i, it is computationally infeasible to compute <m, t> such that V_K(m, t) = 1 for any new m ≠ m_i

MAC
- Alice and Bob share the secret key K
- Alice computes t = T_K(m) and sends (m, t)
- Bob accepts m only if V_K(m, t) = 1
- Provides message integrity
- Sender authentication only in a qualified sense: anyone who holds K (here, Alice or Bob) could have produced the tag
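HMAC-SHA256 is a standard way to instantiate T and V in practice. The sketch below uses only the Python standard library; the function names G, T, V follow the slide's notation rather than any official API.

```python
# Sketch: instantiating <G, T, V> with HMAC-SHA256 (standard library only).
import hashlib
import hmac
import os

def G():
    """Generate a random 256-bit secret key K."""
    return os.urandom(32)

def T(K, m):
    """Tagging: t = HMAC-SHA256_K(m)."""
    return hmac.new(K, m, hashlib.sha256).digest()

def V(K, m, t):
    """Verification: 1 if t is a valid tag on m under K, else 0."""
    return 1 if hmac.compare_digest(T(K, m), t) else 0

K = G()
m = b"withdraw route to 10.0.0.0/8"
t = T(K, m)
assert V(K, m, t) == 1
assert V(K, m + b"!", t) == 0      # a modified message is rejected
```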

Informal Definition of Symmetric Encryption
- A symmetric encryption scheme is a triple <G, E, D> of efficiently computable functions
  - G outputs a "secret key" K:  K ← G()
  - E takes a key K and "plaintext" m as input, and outputs a ciphertext c:  c ← E_K(m)
  - D takes a ciphertext c and key K as input, and outputs ⊥ or a plaintext m:  m ← D_K(c)
- If c ← E_K(m), then m ← D_K(c)
- If c ← E_K(m), then c should reveal "no information" about m

Encryption
- Alice and Bob share the secret key K
- Alice computes c = E_K(m) and sends c; Bob recovers m = D_K(c)
- Provides confidentiality
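A minimal sketch of the <G, E, D> interface using the Fernet recipe (AES-based authenticated encryption) from the third-party `cryptography` package; the G/E/D wrappers are illustrative names taken from the slide, not from the library.

```python
# Sketch: <G, E, D> via the "cryptography" package's Fernet recipe.
from cryptography.fernet import Fernet, InvalidToken

def G():
    """Generate a fresh secret key K."""
    return Fernet.generate_key()

def E(K, m):
    """Encrypt plaintext m under K, returning ciphertext c."""
    return Fernet(K).encrypt(m)

def D(K, c):
    """Decrypt c under K; return the plaintext, or None on failure (the bottom output)."""
    try:
        return Fernet(K).decrypt(c)
    except InvalidToken:
        return None

K = G()
c = E(K, b"attack at dawn")
assert D(K, c) == b"attack at dawn"
assert D(G(), c) is None            # a different key cannot decrypt
```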

Informal Definition of Public Key Encryption
- A public key encryption scheme is a triple <G, E, D> of efficiently computable functions
  - G outputs a "public key" K and a "private key" K⁻¹:  <K, K⁻¹> ← G()
  - E takes a public key K and plaintext m as input, and outputs a ciphertext c:  c ← E_K(m)
  - D takes a ciphertext c and private key K⁻¹ as input, and outputs ⊥ or a plaintext m:  m ← D_K⁻¹(c)
- If c ← E_K(m), then m ← D_K⁻¹(c)
- If c ← E_K(m), then c and K should reveal "no information" about m
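The same interface can be sketched for the public-key case with RSA-OAEP from the `cryptography` package; the key size and hash choices below are illustrative defaults, not recommendations from the lecture.

```python
# Sketch: public-key <G, E, D> via RSA-OAEP ("cryptography" package).
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

OAEP = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# G: generate <K, K^-1>
sk = rsa.generate_private_key(public_exponent=65537, key_size=2048)  # K^-1
pk = sk.public_key()                                                 # K

# E: anyone holding the public key K can encrypt
c = pk.encrypt(b"shared session key", OAEP)

# D: only the holder of K^-1 can decrypt
m = sk.decrypt(c, OAEP)
assert m == b"shared session key"
```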


Summary
- Digital signatures
- MACs
- Encryption
- Hash functions
- Pseudo-random functions
- One-way functions
- One-way trapdoor functions


Back to BGP
- How do I know that this path has not been altered by malicious ASes?
- Digital signatures


Topics
- Mechanisms to achieve security
  - Overview of cryptography
- Background knowledge on security analysis
  - Formal models
  - Logic for specifying security properties
  - Techniques for verification

System Modeling
- A model M = (S, τ, S_0)
  - States: S
  - Initial states: S_0 ⊆ S
  - Transition rules: τ : S → S
- Traces of M: s_0 s_1 s_2 … s_n, where
  - s_0 ∈ S_0
  - ∀ i ∈ [0, n-1], τ(s_i) = s_{i+1}
- For the BGP model, a global state is the product of per-node states:
  - s = ⟨…, s_u, …⟩ with s_u = (RtTB_u, BestRtTb_u, Ads_u)
  - s_0 = ⟨…, s_u0, …⟩ with s_u0 giving node u's initial tables
- A transition τ(s_i) = s_{i+1}:
  - A node u removes a route advertisement from Ads_u in s_ui
  - u updates its tables
  - u advertises its new best path P
  - P is added to Ads_v in s_{i+1}, for every v that is u's neighbor
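As a concrete (and deliberately tiny) illustration of M = (S, τ, S_0) and its traces, the Python sketch below models nondeterministic route forwarding on a three-node toy topology and enumerates bounded traces by brute force; the topology and state encoding are invented for illustration and are far simpler than the BGP model above.

```python
# Toy transition system M = (S, tau, S0) with bounded trace enumeration.
# States are frozensets of (node, route) pairs; tau is modeled as a function
# returning *all* possible successor states (nondeterministic transitions).
NEIGHBORS = {"u": ["v", "w"], "v": ["w"], "w": []}   # illustrative topology

def tau(state):
    """Return every state reachable in one step: some node forwards a route."""
    successors = set()
    for (node, route) in state:
        for nbr in NEIGHBORS[node]:
            successors.add(frozenset(state | {(nbr, route)}))
    return successors or {state}          # stutter if nothing can change

def traces(s0, depth):
    """All traces s0 s1 ... s_depth of M, enumerated by brute force."""
    if depth == 0:
        return [[s0]]
    result = []
    for s1 in tau(s0):
        for rest in traces(s1, depth - 1):
            result.append([s0] + rest)
    return result

S0 = frozenset({("u", "d")})              # initially only u has a route to d
for trace in traces(S0, 2):
    print(" -> ".join(str(sorted(s)) for s in trace))
```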

Computational Tree
[Figure: the computational tree of M, rooted at the initial state S_0 and branching into successor states S_11, S_12, …]

Computational Tree
- Properties hold on the computational tree:
  - For all paths …
  - There exists a path …
  - For all states in a path …
  - There exists a state in a path …
  - For all states in a sub-tree …
  - There exists a state in a sub-tree …


Topics
- Mechanisms to achieve security
  - Overview of cryptography
- Background knowledge on security analysis
  - Formal models
  - Logic for specifying security properties
  - Techniques for verification


Logic
- An idealization of correct reasoning in natural language
- Syntax
  - What can be expressed
- Semantics
  - What is the meaning
- Inference rules
  - How to derive facts from assumptions


Logic: Syntax (Terms and Predicates)
- Terms
  - Constants: a, b, c
    - Concrete objects (1, 2, 3, Alice, Bob)
  - Variables: x, y, z
    - Unspecified objects
  - Functions: f, g
    - E.g., 1+2, father_of(Alice)
- Predicates: P, Q, R
  - Properties
  - E.g., true, false, "the sky is blue", isBlue(sky), lessThan(x, y)


Logic: Syntax (Formulas)
- Formulas: F
  - Atomic formulas: P(t_1, …, t_n)
  - Compound formulas
    - Conjunction: F_1 ∧ F_2
    - Disjunction: F_1 ∨ F_2
    - Implication: F_1 → F_2
    - Negation: ¬F
    - Universal quantification: ∀x. F
    - Existential quantification: ∃x. F

Logic: Inference Rules
- Judgment: Γ ⊢ F (F is derivable from the assumptions Γ)
- Γ ⊢ true                                      (trueI)
- If Γ ⊢ false, then Γ ⊢ F                      (falseE)
- If Γ ⊢ A ∧ B, then Γ ⊢ A                      (∧E1)
- If Γ ⊢ A ∧ B, then Γ ⊢ B                      (∧E2)
- If Γ ⊢ A and Γ ⊢ B, then Γ ⊢ A ∧ B            (∧I)
- Γ, F ⊢ F                                      (Init)

Logic: Inference Rules (continued)
- If Γ ⊢ ∀x. F and Γ, F[t/x] ⊢ B, then Γ ⊢ B    (∀E)
- If Γ, A ⊢ B, then Γ ⊢ A → B                   (→I)
- If Γ ⊢ A → B and Γ ⊢ A, then Γ ⊢ B            (→E)
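One way to see these rules operationally is as checks on explicit proof objects. The Python sketch below encodes formulas and proofs as nested tuples and checks a handful of the rules above (Init, ∧I, ∧E1, →E); the encoding is an illustrative choice, not from the lecture.

```python
# Minimal sketch: checking a few natural-deduction rules from this slide.
# Formulas are tuples: ("and", A, B), ("imp", A, B), or atoms like "p".
# A proof is a tuple naming the rule and its sub-proofs; check() returns the
# formula that the proof derives from context gamma, or raises ValueError.
def check(gamma, proof):
    rule = proof[0]
    if rule == "init":                      # Init: gamma, F |- F
        _, f = proof
        if f in gamma:
            return f
    elif rule == "andI":                    # from A and B, derive A /\ B
        _, pa, pb = proof
        return ("and", check(gamma, pa), check(gamma, pb))
    elif rule == "andE1":                   # from A /\ B, derive A
        _, p = proof
        f = check(gamma, p)
        if f[0] == "and":
            return f[1]
    elif rule == "impE":                    # from A -> B and A, derive B
        _, pimp, pa = proof
        fimp, fa = check(gamma, pimp), check(gamma, pa)
        if fimp[0] == "imp" and fimp[1] == fa:
            return fimp[2]
    raise ValueError("ill-formed proof")

gamma = ["p", ("imp", "p", "q")]
proof = ("impE", ("init", ("imp", "p", "q")), ("init", "p"))
assert check(gamma, proof) == "q"           # gamma |- q by ->E
```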

Example: Deriving path(a, c)
- Let
  - F1 = ∀x,y. link(x,y) → path(x,y)
  - F2 = ∀x,y,z. link(x,y) ∧ path(y,z) → path(x,z)
- Claim: link(a,b), link(b,c), F1, F2 ⊢ path(a,c)
- Derivation sketch (writing Γ for the assumption list):
  - Γ ⊢ F2 by Init
  - Apply ∀E to F2 with a, b, c for x, y, z: it suffices to show Γ, link(a,b) ∧ path(b,c) → path(a,c) ⊢ path(a,c)
  - By →E, it suffices to derive link(a,b) ∧ path(b,c)
  - link(a,b) holds by Init; path(b,c) follows from link(b,c) and F1 (by ∀E and →E)
  - Combine the two with ∧I, then conclude path(a,c) by →E



Horn Clauses and Logic Programming
- A Horn clause: ∀x1…xn. P1(t11, t12, …) ∧ P2(t21, t22, …) ∧ … ∧ Pn(tn1, tn2, …) → Q(s1, s2, …)
- Prolog/Datalog/NDlog notation:
  Q(s1, s2, …) :- P1(t11, t12, …), P2(t21, t22, …), …, Pn(tn1, tn2, …)
- Example program:
  path(x,y) :- link(x,y).
  path(x,z) :- link(x,y), path(y,z).
  link(a, b).
  link(b, c).
- Query: path(a, c)
  - Goal: path(a, c)
  - Sub-goal 1 (first rule): link(a, c), which fails
  - Sub-goal 2.1 (second rule): link(a, Y), solved with b/Y
  - Sub-goal 2.2: path(b, c)
    - Sub-goal 2.2.1: link(b, c), solved, so path(a, c) holds
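The resolution above is what a Prolog engine does top-down; the same two rules can also be evaluated bottom-up, Datalog style, by iterating to a fixed point. The Python sketch below does that for the link/path program; the representation of facts as tuples is purely illustrative.

```python
# Sketch: naive bottom-up (forward-chaining) evaluation of the link/path
# Datalog program from the slide.
def derive_paths(links):
    """Compute all path(x, y) facts from a set of link(x, y) facts."""
    paths = set(links)                      # rule 1: path(x,y) :- link(x,y)
    changed = True
    while changed:                          # iterate to a fixed point
        changed = False
        for (x, y) in links:                # rule 2: path(x,z) :- link(x,y), path(y,z)
            for (y2, z) in list(paths):
                if y == y2 and (x, z) not in paths:
                    paths.add((x, z))
                    changed = True
    return paths

links = {("a", "b"), ("b", "c")}
paths = derive_paths(links)
assert ("a", "c") in paths                  # the query path(a, c) succeeds
print(sorted(paths))                        # [('a','b'), ('a','c'), ('b','c')]
```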


Topics
- Mechanisms to achieve security
  - Overview of cryptography
- Background knowledge on security analysis
  - Formal models
  - Logic for specifying security properties
  - Techniques for verification

Models
- Properties hold on the computational tree:
  - For all paths …
  - There exists a path …
  - For all states in a path …
  - There exists a state in a path …
  - For all states in a sub-tree …
  - There exists a state in a sub-tree …


Linear Temporal Logic
- Expresses trace properties
- Syntax:
  F ::= P(t1, …, tn) | ¬F | F1 ∧ F2 | F1 ∨ F2 | F1 → F2 | ∀x. F | ∃x. F
      | □F | ◇F | ○F | □⁻F | ◇⁻F | F1 U F2 | F1 S F2
- Judgment: T, S ⊨ F means "on trace T, F is true at state S"
- Semantics on a trace S_0 … S_i … S_n:
  - T, S_i ⊨ □F iff F holds at S_i and at every later state of the trace
  - T, S_i ⊨ ◇F iff F holds at some state S_{i+j}, j ≥ 0
  - T, S_i ⊨ ◇⁻F iff F held at some earlier state S_{i-j}, j ≥ 0
  - T, S_i ⊨ ○F iff F holds at the next state S_{i+1}
  - T, S_i ⊨ F1 U F2 iff F2 holds at some state S_{i+j} and F1 holds at every state from S_i up to (but not including) S_{i+j}
  - T, S_i ⊨ F1 S F2 iff F2 held at some state S_{i-j} and F1 has held at every state after S_{i-j} up to and including S_i
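The future-time fragment of these semantics is easy to check over a single finite trace. The Python sketch below does so for □, ◇, ○, and U; the tuple encoding of formulas is an illustrative choice, and finite-trace semantics are only an approximation of the definitions above.

```python
# Sketch: checking a few LTL operators over a finite trace.
# A trace is a list of states; a state is a set of atomic propositions.
# Formulas: "p", ("not", F), ("and", F1, F2), ("always", F),
#           ("eventually", F), ("next", F), ("until", F1, F2).
def holds(trace, i, f):
    """Does formula f hold at position i of the (finite) trace?"""
    if isinstance(f, str):
        return f in trace[i]
    op = f[0]
    if op == "not":
        return not holds(trace, i, f[1])
    if op == "and":
        return holds(trace, i, f[1]) and holds(trace, i, f[2])
    if op == "always":                       # box: F at every j >= i
        return all(holds(trace, j, f[1]) for j in range(i, len(trace)))
    if op == "eventually":                   # diamond: F at some j >= i
        return any(holds(trace, j, f[1]) for j in range(i, len(trace)))
    if op == "next":
        return i + 1 < len(trace) and holds(trace, i + 1, f[1])
    if op == "until":                        # F2 at some j, F1 at all k < j
        return any(holds(trace, j, f[2]) and
                   all(holds(trace, k, f[1]) for k in range(i, j))
                   for j in range(i, len(trace)))
    raise ValueError(f"unknown operator {op!r}")

trace = [{"req"}, {"req"}, {"grant"}]
assert holds(trace, 0, ("until", "req", "grant"))        # req U grant
assert holds(trace, 0, ("eventually", "grant"))
assert not holds(trace, 0, ("always", "req"))
```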


Properties of BGP
- Path authenticity:
  □( ∀ path, v, u, p1, p2.  path ∈ Ads ∧ path = p1@[v, u]@p2
     → bestPath(u, [u]@p2) ∧ link(u, v) )
  (whenever an advertised path crosses the link between v and u, u really chose [u]@p2 as its best path and the link (u, v) really exists)
- Origin authenticity:
  □( ∀ v, path, d, u, p1.  path ∈ Ads_v ∧ path = p1@[u, d]
     → owns(u, d) )
  (the AS u that originates a route to destination d actually owns d)
[Figure: a path p1 @ [v, u] @ p2, where p1 runs v1, v2, …, v and p2 runs u1, u2, …, d]


Topics
- Mechanisms to achieve security
  - Overview of cryptography
- Background knowledge on security analysis
  - Formal models
  - Logic for specifying security properties
  - Techniques for verification

Security Analysis Techniques
- Model checking
  - Common problem to solve: given a model of the system, is it possible to reach a state in which property F holds?
  - More complicated problems: e.g., LTL model checking
  - Explores all possible states
  - Is often incomplete
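At its core, the reachability question ("can the system reach a state where F holds?") is a graph search. Below is a minimal explicit-state sketch in Python; the integer toy system is invented for illustration.

```python
# Sketch: explicit-state reachability checking by breadth-first search.
from collections import deque

def reachable(initial_states, successors, prop):
    """Return a witness trace to a state satisfying prop, or None."""
    frontier = deque((s, [s]) for s in initial_states)
    visited = set(initial_states)
    while frontier:
        state, trace = frontier.popleft()
        if prop(state):
            return trace                      # witness / counterexample trace
        for nxt in successors(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None                               # no reachable state satisfies prop

# Toy example: states are integers; a step either adds one or doubles.
succ = lambda s: {s + 1, 2 * s} if s < 50 else set()
trace = reachable({1}, succ, lambda s: s == 24)
print(trace)    # [1, 2, 3, 6, 12, 24]
```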


Security Analysis Techniques (continued)
- Theorem proving
  - Given an abstract description of the system, can we prove that the system exhibits certain properties?
  - Often relies on program logics that over-approximate the system
  - E.g., Hoare logic: the triple {P} C {Q} means that if precondition P holds before command C runs, then postcondition Q holds afterward
    - {y > 5} x := y - 5 {x > 0}
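A Hoare triple can be read operationally as an assertion check wrapped around the command. The small sketch below tests (it does not prove) the triple {y > 5} x := y - 5 {x > 0} on a range of inputs.

```python
# Sketch: testing (not proving) the Hoare triple {y > 5} x := y - 5 {x > 0}.
def run_triple(y):
    assert y > 5          # precondition P
    x = y - 5             # command C
    assert x > 0          # postcondition Q
    return x

for y in range(6, 100):   # every tested input satisfying P also satisfies Q
    run_triple(y)
```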


Summary
- Mechanisms to achieve security
  - Overview of cryptography
- Background knowledge on security analysis
  - Formal models
  - Logic for specifying security properties
  - Techniques for verification

Next Monday
- A survey of BGP security issues and solutions.


Relationship Between Integrity and Authentication
- Integrity is often a property of local or stored data
  - For example, we want to ensure integrity for a database stored on disk, which emphasizes that we want to prevent unauthorized changes
  - Integrity emphasizes that data has not been changed
- Authentication is used in a network context, where entities communicate across a network
  - Two communicating hosts want to achieve data authentication to ensure data was not changed by the network
  - Authentication emphasizes that data was created by a specific sender
    - Implies integrity: data is unchanged in transit
    - Implies that the identity of the sender is verified


Hash Functions
- A hash function is an efficiently computable function h that maps an input x of arbitrary bit length to an output y = h(x) of fixed bit length
- One-wayness (preimage resistance): given only y, it is computationally infeasible to find any x such that h(x) = y
- Weak collision resistance (2nd-preimage resistance): given x, it is computationally infeasible to find any x′ ≠ x such that h(x′) = h(x)
- Strong collision resistance: it is computationally infeasible to find any two distinct inputs x, x′ such that h(x) = h(x′)
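A quick illustration with SHA-256 from the Python standard library: the digest length is fixed regardless of input length, and nearby inputs produce unrelated-looking digests, which is the behavior the resistance properties above are about.

```python
# Sketch: basic hash-function behavior with SHA-256 (standard library).
import hashlib

h = lambda x: hashlib.sha256(x).hexdigest()

print(h(b"short"))                      # 64 hex chars (256 bits) ...
print(h(b"a much, much longer input"))  # ... regardless of input length

# Changing one character yields an unrelated-looking digest, which is what
# 2nd-preimage and collision resistance rely on in practice.
print(h(b"route: AS1 AS2 AS3"))
print(h(b"route: AS1 AS2 AS4"))
```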

One-Way Functions
- Previously called "preimage resistant"
- A one-way function is a function f : Domain → Range such that, given only y, it is computationally infeasible to find any x such that f(x) = y

Trapdoor One-Way Functions
- A trapdoor one-way function is a function f that is one-way, but for which there is an efficient algorithm B and a trapdoor t such that x ← B(t, f(x))
- Intuition: the trapdoor t permits f to be inverted efficiently, but otherwise f is one-way

Applications
- One-way functions
  - To make a public identifier for private information
- One-way trapdoor functions: digital signatures
  - To sign a message x, create σ = f⁻¹(x) using the trapdoor
  - A verifier checks the signature by checking that f(σ) = x
  - (This is just for illustration; it is not a secure signature scheme.)
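In the same just-for-illustration (and equally insecure) spirit as the slide, the sketch below shows the trapdoor idea with textbook RSA on tiny numbers: f(x) = x^e mod n is easy to compute, hard to invert without the trapdoor d, and signing is applying f⁻¹.

```python
# Toy "textbook RSA" numbers, solely to illustrate f, the trapdoor, and
# sign/verify as f^-1 / f.  Never use this for real security.
p, q = 61, 53
n = p * q                           # 3233, public
e = 17                              # public exponent; f(x) = x^e mod n
d = pow(e, -1, (p - 1) * (q - 1))   # 2753, the trapdoor (needs Python 3.8+)

f = lambda x: pow(x, e, n)          # easy to compute; one-way without d
f_inv = lambda y: pow(y, d, n)      # easy *with* the trapdoor

x = 65                              # "message"
sigma = f_inv(x)                    # sign: sigma = f^-1(x)
assert f(sigma) == x                # verify: f(sigma) = x
```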

Pseudorandom Functions
- Intuitively, a pseudorandom function is a function f : Keys × Domain → Range that is indistinguishable from a random function to anyone not knowing the key (the first input)
- A useful primitive for a range of "higher level" crypto functions
- Notation: let f_K(x) = f(K, x)
- To define this precisely, let F(Domain → Range) denote the set of all functions from Domain to Range
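The indistinguishability intuition can be pictured as two oracles: a keyed function and a lazily sampled truly random function. The sketch below sets both up, assuming (as is standard practice rather than something proved here) that HMAC-SHA256 with a secret key behaves as a PRF.

```python
# Sketch of the PRF intuition: a keyed function vs. a lazily sampled truly
# random function look alike to someone who only sees input/output pairs.
import hashlib
import hmac
import os
import secrets

K = os.urandom(32)
prf = lambda x: hmac.new(K, x, hashlib.sha256).digest()   # f_K (assumed PRF)

_table = {}
def random_function(x):
    """Lazily sample a uniformly random function Domain -> 32-byte strings."""
    if x not in _table:
        _table[x] = secrets.token_bytes(32)
    return _table[x]

# A distinguisher only gets oracle access; without K, telling these two
# oracles apart (better than guessing) is assumed to be infeasible.
for oracle in (prf, random_function):
    print([oracle(bytes([i])).hex()[:16] for i in range(3)])
```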

Pseudorandom Functions Make Good MACs
- Let f be a pseudorandom function (for an appropriate Domain and Range)
- Select K ←_R Keys (uniformly at random)
- Define T_K(m) = f_K(m) for m ∈ Domain
- Define V_K(m, t) = 1 if t = f_K(m), and 0 otherwise

Logic: Inference Rules (Quantifiers)
- If Γ ⊢ ∀x. F and Γ, F[t/x] ⊢ B, then Γ ⊢ B    (∀E)
- If Γ ⊢ F[a/x] where a is fresh, then Γ ⊢ ∀x. F    (∀I)