Quantum Cryptography Beyond QKD
CHRISTIAN SCHAFFNER
RESEARCH CENTER FOR QUANTUM SOFTWARE
INSTITUTE FOR LOGIC, LANGUAGE AND COMPUTATION (ILLC) UNIVERSITY OF AMSTERDAM
CENTRUM WISKUNDE & INFORMATICA
All material available on https://homepages.cwi.nl/~schaffne
QIP 2018 Tutorial in Delft Sunday, 14 January 2018
Quantum Cryptography Beyond QKD
- Survey article with Anne Broadbent
- Aimed at classical cryptographers
http://arxiv.org/abs/1510.06120, in Designs, Codes and Cryptography 2016
[Broadbent Schaffner 16, in Designs, Codes and Cryptography]

QCrypt Conference Series
- Started in 2011 by Christandl and Wehner
- Steadily growing since then: approx. 100 submissions, 30 accepted as contributed talks, 330 participants in Cambridge 2017. This year: Shanghai, China
- The goal of the conference is to represent the previous year’s best results on quantum cryptography, and to support the building of a research community
- Trying to keep a healthy balance between theory and experiment
- Half the program consists of 4 tutorials of 90 minutes and 6-8 invited talks
- Presents some statistical observations about the last 4 editions
[QCrypt charter, QCrypt 2017 business meeting slides]

Overview
[thanks to Serge Fehr, Stacey Jeffery, Chris Majenz, Florian Speelman, Ronald de Wolf]

MindMap: tools, quantum advantage, experiments, post-quantum crypto, quantum data, selection of open questions
- Fork me on GitHub!
[https://github.com/cschaffner/QCryptoMindmap]

Quantum Key Distribution (QKD)

Quantum Mechanics
- + basis: {|0⟩, |1⟩};  × basis: {|+⟩, |−⟩} = {H|0⟩, H|1⟩}
- Quantum measurements: measuring |1⟩ in the + basis yields outcome 1 with probability 1; measuring H|0⟩ = |+⟩ in the + basis yields 0 with probability ½ and 1 with probability ½
- Quantum operations: unitaries U

No-Cloning Theorem
- There is no quantum operation that maps an unknown state |ψ⟩ to |ψ⟩|ψ⟩
- Proof: copying is a non-linear operation

Quantum Key Distribution (QKD)
[Bennett Brassard 84]
Alice and Bob end up with a shared key k = 0101 1011; Eve learns only k = ?
- Offers a quantum solution to the key-exchange problem which does not rely on computational assumptions (such as factoring, discrete logarithms, or the security of AES, SHA-3, etc.)
- Caveat: the classical communication has to be authenticated to prevent man-in-the-middle attacks

Quantum Key Distribution (QKD)
0 1 1 1 0 0 0 1 1 0
k = 110 k = 110
[Bennett Brassard 84]

Quantum Key Distribution (QKD)
? ? ? ? ?
0 1 1 1 0 0 0 1 1 0
k = ?
k = 10    k = 10
- Quantum states are unknown to Eve; she cannot copy them.
- Honest players can test whether Eve interfered.
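The basis-sifting step of the protocol can be sketched as a purely classical simulation. This is a minimal sketch under simplifying assumptions (no Eve, no channel noise; the name bb84_sift and the basis encoding are illustrative): a qubit measured in the matching basis reproduces Alice’s bit, a mismatched basis yields a uniformly random bit, and only positions with matching bases are kept.

```python
import random

random.seed(1)  # deterministic for illustration

def bb84_sift(n):
    """Toy BB84 sketch (no eavesdropper, no noise): Alice sends n qubits,
    Bob measures in random bases, both keep positions where bases match."""
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice('+x') for _ in range(n)]
    bob_bases   = [random.choice('+x') for _ in range(n)]
    # Measurement rule: a matching basis reproduces Alice's bit with certainty;
    # a mismatched basis yields a uniformly random outcome.
    bob_bits = [a if ab == bb else random.randint(0, 1)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
    return [alice_bits[i] for i in keep], [bob_bits[i] for i in keep]

ka, kb = bb84_sift(10)
assert ka == kb  # without Eve, the sifted keys agree
```

On average half the positions survive sifting; the later error-estimation step (comparing a sample of the sifted key) is what detects Eve’s interference.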
[Bennett Brassard 84]

Quantum Key Distribution (QKD)
Alice
Bob
Eve
- Technically feasible: no quantum computer required, only quantum communication

Quantum Hacking
e.g. by the group of Vadim Makarov (University of Waterloo, Canada)

Quantum Key Distribution (QKD)
[Bennett Brassard 84]
Alice and Bob share k = 0101 1011; Eve learns only k = ?
- Three-party scenario: two honest players versus one dishonest eavesdropper
- Quantum advantage: information-theoretic security is provably impossible with only classical communication (Shannon’s theorem about perfect security)

Quantum Key Distribution (QKD)

Conjugate Coding & Quantum Money
[Wiesner 68] also known as quantum coding or quantum multiplexing
0 1 1 1 0
- Originally proposed for securing quantum banknotes (private-key quantum money)
- Adaptive attack if the money is returned after successful verification
- Publicly verifiable quantum money is still a topic of active research, e.g. a very recent preprint by Zhandry (2017)
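Wiesner’s verification idea can be sketched as a toy classical simulation, under the same simplifying measurement rule as before (a wrong-basis measurement returns a uniformly random bit; all function names here are illustrative): the bank records secret random bits and bases per banknote, and verification measures each position in the recorded basis.

```python
import random

random.seed(2)  # deterministic for illustration

def issue_note(n):
    """Bank secretly records a random bit and a random basis per qubit."""
    return ([random.randint(0, 1) for _ in range(n)],
            [random.choice('+x') for _ in range(n)])

def measure(bits, bases, measure_bases):
    """Toy rule: correct basis reproduces the bit, wrong basis is random."""
    return [b if sb == mb else random.randint(0, 1)
            for b, sb, mb in zip(bits, bases, measure_bases)]

def verify(note_bits, note_bases, secret_bits, secret_bases):
    """Bank measures each qubit in its recorded basis and checks outcomes."""
    outcomes = measure(note_bits, note_bases, secret_bases)
    return outcomes == secret_bits

bits, bases = issue_note(20)
assert verify(bits, bases, bits, bases)  # a genuine note always passes
```

A counterfeiter who does not know the secret bases and measures in random ones passes each position only with probability 3/4, so a forged n-qubit note verifies with probability (3/4)^n.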
[Molina Vidick Watrous 13, Brodutch Nagaj Sattath Unruh 14]

Computational Security of Quantum Encryption
http://arxiv.org/abs/1602.01441, presented at ICITS 2016
GORJAN ALAGIC, COPENHAGEN
ANNE BROADBENT, OTTAWA
BILL FEFFERMAN, MARYLAND
TOMMASO GAGLIARDONI, DARMSTADT
MICHAEL ST JULES, OTTAWA
CHRISTIAN SCHAFFNER, AMSTERDAM
FOQUS workshop, Paris, Saturday, 29 April 2017

Secure Encryption
Alice encrypts the plaintext message m into the ciphertext c = Enc_k(m); Bob decrypts m = Dec_k(c). Both hold the secret key k; Eve learns only m = ?

One-Time Pad:
- Classical: c = Enc_k(m) := m ⊕ k,  Dec_k(c) := c ⊕ k
- Quantum (QOTP): Enc_{k1,k2}(ρ) := X^{k1} Z^{k2} ρ Z^{k2} X^{k1},  Dec_{k1,k2}(σ) := Z^{k2} X^{k1} σ X^{k1} Z^{k2}
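The classical one-time pad is small enough to check exhaustively. A minimal sketch for 3-bit messages, verifying both correctness (Dec inverts Enc for every key) and perfect secrecy (over a uniform key, every ciphertext occurs exactly once, whatever the message):

```python
from itertools import product

def enc(m, k):
    """Enc_k(m) := m XOR k, bitwise on tuples of bits."""
    return tuple(mi ^ ki for mi, ki in zip(m, k))

dec = enc  # Dec_k(c) := c XOR k is the same operation

m = (1, 0, 1)
for k in product((0, 1), repeat=3):
    assert dec(enc(m, k), k) == m  # correctness for every key

# Perfect secrecy: for a uniform key, the ciphertext is uniform regardless
# of the message, so the ciphertext distribution C is independent of M.
for msg in [(0, 0, 0), (1, 0, 1)]:
    cts = sorted(enc(msg, k) for k in product((0, 1), repeat=3))
    assert cts == sorted(product((0, 1), repeat=3))  # all 8 ciphertexts, once each
```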
[Miller 1882, Vernam 1919, Ambainis Mosca Tapp de Wolf 00, Boykin Roychowdhury 03]

Information-Theoretic Security
Alice encrypts m into c = Enc_k(m); Bob decrypts m = Dec_k(c); Eve learns only m = ?  (QOTP)

Perfect / information-theoretic security:
The ciphertext distribution C is statistically independent of the message distribution M.
Theorem: The secret key has to be as large as the message. Highly impractical, e.g. for encrypting a video stream…
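For the QOTP, this independence can be checked directly on 2×2 density matrices: averaging Enc_{k1,k2}(ρ) = X^{k1} Z^{k2} ρ Z^{k2} X^{k1} over the uniform two-bit key yields the maximally mixed state I/2 for every trace-1 input ρ, so Eve’s view carries no information about the plaintext state. A minimal pure-Python sketch (no external libraries; matrices as nested lists):

```python
# 2x2 matrices as nested lists; enough to check the quantum one-time pad twirl.
X = [[0, 1], [1, 0]]
Z = [[1, 0], [0, -1]]

def mul(A, B):
    return [[sum(A[i][t] * B[t][j] for t in range(2)) for j in range(2)]
            for i in range(2)]

def dag(A):  # conjugate transpose (entries here are real numbers)
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def qotp_enc(rho, k1, k2):
    """Enc_{k1,k2}(rho) = X^k1 Z^k2 rho Z^k2 X^k1."""
    U = [[1, 0], [0, 1]]
    if k1: U = mul(U, X)
    if k2: U = mul(U, Z)
    return mul(mul(U, rho), dag(U))

def average_over_keys(rho):
    """Eve's view without the key: the uniform average over the two-bit key.
    For every density matrix (trace 1) this equals the maximally mixed I/2."""
    acc = [[0, 0], [0, 0]]
    for k1 in (0, 1):
        for k2 in (0, 1):
            e = qotp_enc(rho, k1, k2)
            acc = [[acc[i][j] + e[i][j] / 4 for j in range(2)] for i in range(2)]
    return acc

# |0><0| and |+><+| both encrypt, on average, to I/2:
for rho in ([[1, 0], [0, 0]], [[0.5, 0.5], [0.5, 0.5]]):
    assert average_over_keys(rho) == [[0.5, 0.0], [0.0, 0.5]]
```

The averaging identity (ρ + XρX + ZρZ + XZρZX)/4 = tr(ρ)·I/2 is exactly the Pauli twirl underlying the [Ambainis Mosca Tapp de Wolf 00, Boykin Roychowdhury 03] security proofs.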
[Shannon 48, Dodis 12, Ambainis Mosca Tapp de Wolf 00, Boykin Roychowdhury 03]

Computational Security
Alice encrypts m into c = Enc_k(m); Bob decrypts m = Dec_k(c); Eve learns only m = ?

Threat model → Security guarantee:
- Eve sees ciphertexts (eavesdropper) → c does not reveal the key k
- Eve knows plaintext/ciphertext pairs → c does not reveal the whole m
- Eve chooses the plaintexts to be encrypted → c does not reveal any bit of m
- Eve can decrypt ciphertexts → c does not reveal “anything” about m
Semantic Security

(Slide shows the Alice/Bob/Eve encryption diagram over an excerpt from Katz–Lindell, “Private-Key Encryption”, p. 59: the proof sketch of Theorem 3.11 and the definition of semantic security, reproduced below.)

DEFINITION 3.12  A private-key encryption scheme (Enc, Dec) is semantically secure in the presence of an eavesdropper if for every PPT algorithm A there exists a PPT algorithm A′ such that for any PPT algorithm Samp and polynomial-time computable functions f and h, the following is negligible:

| Pr[A(1^n, Enc_k(m), h(m)) = f(m)] − Pr[A′(1^n, |m|, h(m)) = f(m)] |,

where the first probability is taken over uniform k ∈ {0,1}^n, m output by Samp(1^n), the randomness of A, and the randomness of Enc, and the second probability is taken over m output by Samp(1^n) and the randomness of A′. The adversary A is given the ciphertext Enc_k(m) as well as the external information h(m), and attempts to guess the value of f(m). Algorithm A′ also attempts to guess the value of f(m), but is given only h(m) and |m|.

[Goldwasser Micali 84], leading to a Turing Award (the “Nobel Prize” of computer science)
Classical Semantic Security

(Same Katz–Lindell excerpt as on the previous slide, with the REAL-world diagram: message distribution ℳ, message m, adversary A, auxiliary information h(m), target f(m).)
IDEAL world: the simulator S receives only |m| and h(m), and also tries to output f(m).

Definition (SEM): ∀A ∃S ∀ (ℳ, h, f):
Pr[A(Enc_k(m), h(m)) = f(m)] ≈ Pr[S(|m|, h(m)) = f(m)]

[Goldwasser Micali 84], leading to a Turing Award (the “Nobel Prize” of computer science)

Classical Indistinguishability
The adversary A sends a message m to the challenger, who samples a uniform bit b ← {0,1} and a key k, and returns the challenge ciphertext
c = Enc_k(0…0) if b = 0,  c = Enc_k(m) if b = 1.
A outputs a guess b′ and wins iff b = b′.
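The game can be made concrete by exhaustive enumeration over keys and challenge bits. A minimal sketch (illustrative names; exact probabilities via fractions) computes the winning probability of the simple distinguisher “guess b′ = 1 iff c = m”, against the one-time pad and against a deliberately broken scheme that ignores its key:

```python
from fractions import Fraction
from itertools import product

def enc_otp(m, k):       # the one-time pad: bitwise XOR with the key
    return tuple(mi ^ ki for mi, ki in zip(m, k))

def enc_broken(m, k):    # a deliberately broken "encryption" that ignores k
    return m

def ind_win_probability(enc, m):
    """Exact winning probability in the indistinguishability game of the
    distinguisher "guess b' = 1 iff c = m", enumerated over all keys
    and both values of the challenge bit b."""
    n = len(m)
    zero = tuple(0 for _ in m)
    wins = total = 0
    for k in product((0, 1), repeat=n):
        for b in (0, 1):
            c = enc(m if b == 1 else zero, k)
            guess = 1 if c == m else 0
            wins += (guess == b)
            total += 1
    return Fraction(wins, total)

m = (1, 0, 1)
assert ind_win_probability(enc_otp, m) == Fraction(1, 2)  # OTP: no advantage
assert ind_win_probability(enc_broken, m) == 1            # broken: always wins
```

Against the one-time pad the ciphertext is uniform for either value of b, so no distinguisher can do better than the trivial 1/2; a computationally secure scheme relaxes this to “no efficient adversary wins with probability noticeably above 1/2”.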