Pseudo-Random Synthesizers,

Functions and Permutations

Thesis for the Degree of

DOCTOR of PHILOSOPHY

by

Omer Reingold

Department of Applied Mathematics and Computer Science

Weizmann Institute of Science

Submitted to the Scientific Council of

The Weizmann Institute of Science

Rehovot, Israel

November


Thesis Advisor

Professor

Department of Applied Mathematics and Computer Science

Weizmann Institute of Science

Rehovot, Israel

Acknowledgments

Writing these lines, I realize how many people took part in my initiation into the world of science. In the following few paragraphs I will try to acknowledge some of these many contributions. Naturally, my warmest thanks are to Moni Naor, my advisor and much more than that.

Thanking Moni for all he has done and been to me these last four years is extremely easy and extremely hard at the same time. Extremely easy, since I have so much I want to thank him for, and I'm grateful for this opportunity to do so (a cynic might say that this is the most important function of the dissertation). Extremely hard, since I have so much I want to thank him for; what chance is there to do a good enough job? I'm sure that, in the silent understanding between us, Moni is more aware of the extent of friendship, admiration and gratitude I have for him than I am able to express in words. But this is not only for his eyes, so let me do my best.

Thank you, Moni, for your close guidance in all different aspects of the scientific process. In all things, large and small, I always knew that I can count on you. Thank you for your constant encouragement, for treating me as a colleague from the first time I stepped into your office (when I deserved it even less than now), and at the same time for sheltering me in a parental manner. Thank you for pushing me forward when possible and for laying off in times work could not have been a high priority for me. Thank you for sharing with me, in hours and hours of conversation, your deep understanding and amazing knowledge of computer science, as well as your scientific philosophy and an abundance of ideas, all of which will surely find their way into my research for years to come. Last, but certainly not least, thank you for your friendship, which was most explicitly articulated by Yael but was very clear to me all along. The nest you have built for me is so warm that I hate leaving it. Therefore, at least in my mind, I will not.

Before picking up the pace, special thanks are due to one other friend, teacher and colleague of mine. Although my joint research with Ran Raz is not presented in this dissertation, I enjoyed it enormously and have learned much from it. I enjoyed viewing Ran's superb research capabilities and creativity in action (hope some of it has rubbed off on me too). I admire his relentless optimism and enthusiasm. I treasure his friendship so very deeply. I wish for many more years of all of that.

I thank my teachers at the Weizmann Institute for the invaluable lessons they have taught me, both inside and outside of the classroom. Special thanks to each member of the group: Oded Goldreich, Shafi Goldwasser, Moni Naor and Adi Shamir.

I take great pride and pleasure in the fact that I have learned from this incredible group of researchers, and even in being associated with them. I often toy with the idea of not graduating in order to continue learning in the presence of these experts. I also thank Itai Benjamini, Uri Feige, Yoram Moses and David Peleg for their enlightening courses.

A group of researchers to whom I have special sentiments are my teachers from undergraduate studies at Tel-Aviv University: Yossi Azar, Amos Fiat, Ron Shamir and Uri Zwick. I thank them for instilling in me the love for the foundations of computer science.

Over the years I have learned with and from many of my fellow students. I will list only a few, but I thank them all for many hours of work and fun. Special thanks are due to my close brothers at the house of Moni, namely, and Benny Pinkas. Having you guys around was priceless. Many thanks are also due to Avishai Wool and to the new recruit Yehuda Lindell. I also thank Tal Malkin for our joint work at the coffee shops of Manhattan, and Tal Yadid for an extremely valuable time we spent together in undergraduate school.

There are so many other people to whom I am in debt for sharing their knowledge and ideas with me. The partial list that comes to mind right now includes Eli Biham, Ran Canetti, Russell Impagliazzo, Daniele Micciancio, Steven Rudich, Amnon Ta-Shma and Shiyu Zhou. I apologize to anyone I have forgetfully omitted, and thank you all for many stimulating conversations. Special thanks to Cynthia, Russell and Steven for extending to me part of their sentiments for Moni. I would also like to thank Joan Feigenbaum, Charlie Rackoff and Umesh Vazirani for their warm hospitality during a few memorable visits.

I would like to acknowledge now the direct contributions to the work presented in this dissertation. Most of the work is based on my joint research with Moni. Section is based on joint work with Eli Biham and Dan Boneh. My research during this time was supported by a Clore Scholars award and a grant from the Israel Science Foundation, administered by the Israeli Academy of Sciences. We thank Shafi Goldwasser and Jon Sorenson, who brought to our attention, and Mihir Bellare for his observation described in Section. Special thanks to Victor Shoup for suggesting the improved proof of Lemma and for pointing out. We thank Ran Canetti, Oded Goldreich, Joe Kilian, Kobbi Nissim, Benny Pinkas, Amnon Ta-Shma and the anonymous referees of the relevant journals and conferences for many helpful comments and for their diligent reading of some of the work included here. It is difficult to overestimate Oded's contribution to the presentation of the work described in Chapter. I would also like to thank Oded Goldreich, Joe Kilian, Yehuda Lindell and Alon Rosen for their comments on several parts of the dissertation.

Finally, many thanks are due to my family and friends. However, they deserve their thanks in person, and hopefully I did not let them wait till now.

Abstract

The research reflected in this dissertation is a study of computational pseudo-randomness. More specifically, the main objective of this research is the efficient and simple construction of pseudo-random functions and permutations, where efficiency refers both to the sequential and parallel time complexity of the computation. Pseudo-random functions and permutations are fundamental cryptographic primitives with many applications in cryptography, and more generally in computational complexity.

Constructions of Pseudo-Random Functions

For our constructions of pseudo-random functions, we introduce and study a new cryptographic primitive which we call a pseudo-random synthesizer, and a generalization of this primitive which we call a k-dimensional pseudo-random synthesizer. These primitives are of independent interest as well. In addition, we consider various applications of our constructions and study some of the underlying cryptographic assumptions used in these constructions. The main results obtained by this research are:

- Introducing new cryptographic primitives called pseudo-random synthesizer and k-dimensional pseudo-random synthesizer.

- Using pseudo-random synthesizers for a parallel construction of a pseudo-random function (the depth of the functions is larger by a logarithmic factor than the depth of the synthesizers).

- Showing several NC implementations of synthesizers based on concrete intractability assumptions such as factoring and the computational Diffie-Hellman assumption.

- Showing a very simple parallel construction of synthesizers based on what we call weak pseudo-random functions, which implies simple constructions of synthesizers based on trapdoor one-way permutations and based on any hard-to-learn problem (under the definition of).

These results yield the first parallel pseudo-random functions based on computational intractability assumptions and the first alternative to the original construction of Goldreich, Goldwasser and Micali. In addition, we show two new constructions of pseudo-random functions that are related to the construction based on synthesizers. The pseudo-randomness of one construction is proven under the assumption that factoring is hard, while the other construction is pseudo-random if the decisional version of the Diffie-Hellman assumption holds. These functions have the following properties:

- They are much more efficient than previous proposals. Computing the value of our functions at any given point involves two subset products.

- They are in TC^0 (the class of functions computable by constant-depth circuits consisting of a polynomial number of threshold gates). This fact has several interesting applications.

- They have a simple algebraic structure that implies additional features. In particular, we show a zero-knowledge proof for statements of the form y = f_s(x) and y ≠ f_s(x), given a commitment to a key s of a pseudo-random function f_s.

We discuss some applications of our constructions in cryptography, including applications in public-key cryptography, as well as their consequences in computational complexity and in computational learning-theory.

Constructions of Pseudo-Random Permutations

Luby and Rackoff showed a method for constructing a pseudo-random permutation from a pseudo-random function. The method is based on composing four (or three, for weakened security) so-called Feistel permutations, each of which requires the evaluation of a pseudo-random function. We reduce somewhat the complexity of the construction and simplify its proof of security by showing that two Feistel permutations are sufficient, together with initial and final pair-wise independent permutations. The revised construction and proof provide a framework in which similar constructions may be designed and their security can be easily proved. We demonstrate this by presenting some additional adjustments of the construction that achieve the following:

- Reduce the success probability of the adversary.

- Provide a construction of pseudo-random permutations with large input-length using pseudo-random functions with small input-length.
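To illustrate the basic building block: a single Feistel permutation maps (L, R) to (R, L ⊕ f(R)) and is invertible for any round function f, which is what makes it suitable for turning pseudo-random functions into permutations. The sketch below is illustrative only: SHAKE-128 is a hypothetical stand-in for the pseudo-random function, and the pair-wise independent pre- and post-processing permutations of the revised construction are omitted.

```python
import hashlib

def prf(key: bytes, x: bytes, out_len: int) -> bytes:
    # Hypothetical stand-in for a pseudo-random function: SHAKE-128
    # keyed by simple concatenation. Illustration only, not a proven PRF.
    return hashlib.shake_128(key + x).digest(out_len)

def feistel(key: bytes, block: bytes) -> bytes:
    # One Feistel permutation: (L, R) -> (R, L xor f_key(R)).
    half = len(block) // 2
    left, right = block[:half], block[half:]
    return right + bytes(a ^ b for a, b in zip(left, prf(key, right, half)))

def feistel_inv(key: bytes, block: bytes) -> bytes:
    # Invert one round: recover (L, R) from (R, L xor f_key(R)).
    half = len(block) // 2
    right, masked = block[:half], block[half:]
    left = bytes(a ^ b for a, b in zip(masked, prf(key, right, half)))
    return left + right

def permute(keys, block: bytes) -> bytes:
    # Compose one Feistel round per key.
    for k in keys:
        block = feistel(k, block)
    return block

def permute_inv(keys, block: bytes) -> bytes:
    # Apply the inverse rounds in reverse order.
    for k in reversed(keys):
        block = feistel_inv(k, block)
    return block
```

Since each round is invertible regardless of the round function, the composition is a permutation on equal-length blocks; how many rounds are needed, and with what pre- and post-processing, is exactly what the Luby-Rackoff analysis and its revision here address.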

A Study of Some Number-Theoretical Assumptions

Our research includes a study of two number-theoretical assumptions that are related to the Diffie-Hellman key-exchange protocol and that are used in our constructions of pseudo-random functions. The first is the decisional version of the Diffie-Hellman assumption (DDH-Assumption). This assumption is relatively new, or more accurately, was explicitly considered only recently. We therefore survey some of the different applications of the assumption and the current knowledge on its security. Furthermore, we show a randomized reduction of the worst-case DDH-Assumption to its average case, based on the random-self-reducibility of the DDH-Problem that was previously used by Stadler. We consider our research of the DDH-Assumption to be of independent importance, given that the assumption was recently used in quite a few interesting applications.


The second assumption we study is the generalized Diffie-Hellman assumption (GDH-Assumption). This assumption was originally considered in the context of a generalization of the Diffie-Hellman key-exchange protocol to k parties. We prove that breaking this assumption modulo a so-called Blum-integer would imply an efficient algorithm for factoring Blum-integers. Therefore, both the generalized key-exchange protocol and our pseudo-random function that is based on the GDH-Assumption are secure as long as factoring Blum-integers is hard. Our reduction strengthens a previous worst-case reduction of Shmuely.


Contents

Acknowledgments iii

Abstract v

Introduction
Pseudo-Randomness and Computational Indistinguishability
Pseudo-Random Bit Generators
Pseudo-Random Function Ensembles
Pseudo-Random Permutation Ensembles
Overview of Research Objectives and Results
Constructions of Pseudo-Random Functions
A Study of Some Number-Theoretical Assumptions
A Study of the LR-Construction

Preliminaries
Notation
Pseudo-Randomness Definitions
k-Wise Independent Functions and Permutations

A Study of Some Number-Theoretical Assumptions
The Decisional Diffie-Hellman Assumption
Background
Formal Definition
A Randomized Reduction
The GDH-Assumption Modulo a Composite
The Assumptions
Conclusions

Constructions of Pseudo-Random Functions
Pseudo-Random Synthesizers and Functions
Organization
Notation
Pseudo-Random Synthesizers
A Parallel Construction of Pseudo-Random Functions
Security of the Construction
A Related Construction and Additional Properties
Synthesizers Based on General Cryptographic Primitives
Number-Theoretic Constructions of Pseudo-Random Synthesizers
Pseudo-Randomness and Learning-Theory
Concrete Constructions of Pseudo-Random Functions
Construction of Pseudo-Random Functions
Construction Based on Factoring or the GDH-Assumption
Additional Properties of the Pseudo-Random Functions
Further Research

Constructions of Pseudo-Random Permutations
Notation
Construction of PPE and SPPE
Intuition
Construction and Main Result
Proof of Security
The Framework
Relaxing the Construction
PPE and SPPE with a Single Pseudo-Random Function
Relaxing the Pair-Wise Independence Requirement
Reducing the Distinguishing Probability
SPPE on Many Blocks Using PFE or PPE on a Single Block
Relaxing the Construction
Related Work
Constructions of k-Wise Dependent Permutations
Conclusion and Further Work

Chapter 1

Introduction

The research reflected in this dissertation is a study of computational pseudo-randomness. More specifically, the main objective of this research is the efficient and simple construction of pseudo-random functions and permutations, where efficiency refers both to the sequential and parallel time complexity of the computation. Pseudo-random functions and permutations are fundamental cryptographic primitives with many applications in cryptography, and more generally in computational complexity.

Pseudo-Randomness and Computational Indistinguishability

Since pseudo-randomness is the main notion considered in this work, we start by reviewing some of the central ideas of this area. We also discuss the definitions, applications and constructions of the relevant pseudo-random objects, with a focus on pseudo-random functions and permutations. Good references for additional reading are Goldreich and Luby.

To understand the notion of computational pseudo-randomness, we must first consider the role of randomness in computations. A randomized algorithm is an algorithm that is allowed to flip coins. A coin-flip is modeled as a random (i.e., uniformly distributed) bit. Therefore, at each step of its computation, a randomized algorithm can obtain a bit which is 0 with probability half and 1 with probability half, and is independent of previous coin-flips. An important question, though less relevant to our discussion, is to determine the applicability of this model: What kind of random sources are available to computers? We note that this question is non-trivial even to those who believe that God (or Nature) does indeed flip coins.

Many algorithms and protocols use randomness to perform computational tasks. These algorithms are often faster and simpler than the corresponding, currently known, deterministic algorithms. For some tasks, such as those of achieving cryptographic security or simulating probabilistic events, randomness is essential. Thus, it is natural to view randomness as a computational resource and to study its relations with other computational resources like time and space. More specifically, a natural question is that of derandomization: studying ways of reducing or eliminating the amount of randomness used by algorithms without significantly enlarging their usage of other resources.

An important tool for derandomization is pseudo-randomness, though as we argue below, the role of pseudo-randomness is not limited to derandomization. Consider an algorithm A that uses a random string of length ℓ (i.e., A uses ℓ random bits). One way to reduce the amount of randomness used by A, without extensively changing its internal structure, is to replace its uniformly distributed random string with a string sampled from a different distribution D. That is, to use an algorithm A' that samples a string r from D and invokes A with r as its random string. For this technique to be useful, D must have the following properties:

1. Sampling a string from D should require significantly less than ℓ random bits (to achieve our original goal of derandomizing A).

2. D is efficiently samplable: For the derandomization of A using D not to be too costly, the resources required to sample a string from D should be comparable with those needed to execute A (otherwise, running A' is substantially more expensive than running A).

3. The behavior of A should be practically the same when using a uniformly distributed string and when using a string sampled from D. For example, if A solves some computational problem, its probability to answer correctly (on every input, or on most inputs) should be almost the same for both distributions of its random string. In this sense, A does not distinguish D from the uniform distribution; put differently, A cannot be used to distinguish the two distributions.

By Property 1, D has much less entropy than the uniform distribution and is therefore statistically very different from it. However, Property 3 means that for the specific computational task at hand, D is almost as good as the uniform distribution. This contrast between a large statistical difference and computational indistinguishability is in the essence of pseudo-randomness (see also for additional insight on the role of these properties).

To better understand the definition of computational indistinguishability, due to Goldwasser and Micali and to Yao, we can think of the algorithm A not as trying to perform just any computational task, but rather as trying to distinguish between two distributions, D_1 and D_2: With probability half, A gets an input sampled from D_1, and with probability half, an input sampled from D_2. The algorithm A tries to guess from which distribution its input was sampled. D_1 and D_2 are indistinguishable to A if its success probability is at most negligibly larger than half (which is the success probability A can easily achieve, for every D_1 and D_2).
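The distinguishing game just described can be phrased as a small experiment. The sketch below (all names and distributions are illustrative, not from the dissertation) empirically estimates a distinguisher's advantage over the trivial success probability of one half; the example D_2 has an obvious statistical defect, so a distinguisher that exploits it achieves a visibly non-negligible advantage.

```python
import random

def advantage(distinguisher, sample_d1, sample_d2, trials=2000):
    # The game: with probability half the distinguisher sees a sample
    # from D_1 (b = 0), otherwise from D_2 (b = 1); it guesses b.
    correct = 0
    for _ in range(trials):
        b = random.randrange(2)
        x = sample_d1() if b == 0 else sample_d2()
        if distinguisher(x) == b:
            correct += 1
    # Empirical success probability minus the trivial one half.
    return correct / trials - 0.5

def sample_uniform():
    # D_1: uniform 8-bit values.
    return random.getrandbits(8)

def sample_even():
    # D_2 (hypothetical, defective): last bit forced to 0.
    return random.getrandbits(8) & 0xFE

def guesses_even_is_d2(x):
    # A distinguisher exploiting D_2's defect.
    return 1 if x % 2 == 0 else 0
```

Here guesses_even_is_d2 succeeds with probability about 3/4 (always right on D_2, right half the time on D_1), i.e., an advantage of about one quarter, whereas a distinguisher that ignores its input gains no advantage.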

Usually, however, we would be interested in derandomizing a class of algorithms (or protocols) instead of a single algorithm A as in the discussion above. In such a case, we want the pseudo-random distribution to "fool" every algorithm from the class that we are trying to derandomize; that is, to be indistinguishable from random for this class. In cryptography, the class of algorithms we want to fool is that of all efficient algorithms, which should capture all possible adversaries. For convenience, efficient algorithms are usually modeled as probabilistic polynomial-time algorithms. However, the results presented in cryptographic literature are usually more concrete; i.e., they imply statements such as: if there is an algorithm that distinguishes these functions from random in time t and with advantage ε, then there is an algorithm that inverts that function in time t' and success probability ε' (as is the case for the work presented here).

Pseudo-Random Bit Generators

Pseudo-random bit generators were introduced by Blum and Micali and by Yao. A pseudo-random generator is an efficiently computable deterministic function such that: (1) its output is longer than its input, and (2) for a uniformly distributed input, its output is indistinguishable from uniform to any efficient algorithm. In other words, pseudo-random generators define a pseudo-random distribution of bit-sequences (their output distribution) which is efficiently samplable using a relatively small, truly random, bit-sequence (usually referred to as the seed).

Unfortunately, the existence of pseudo-random generators is unproven; in itself, such a proof would be a tremendous breakthrough in computational complexity since, in particular, it implies that P ≠ NP. However, the existence of pseudo-random generators can be reduced to other computational assumptions. Most notably, Hastad, Impagliazzo, Levin and Luby showed how to construct a pseudo-random generator from any one-way function (informally, a function is one-way if it is easy to compute its value on any input but hard to invert it on a random input). In fact, pseudo-random generators (as well as pseudo-random functions and permutations) exist iff one-way functions exist.

As suggested above, a pseudo-random generator can be used for derandomization. A trivial way of derandomizing an algorithm is going over all possible random strings (looking for a "good" point or taking majority over all possible outputs). By using a pseudo-random generator, the number of strings we have to go through can be substantially reduced. Thus, it was shown by Yao that if pseudo-random generators that fool non-uniform polynomial-size circuits exist, then

RP ⊆ BPP ⊆ ∩_{ε>0} DTIME(2^(n^ε)),

where RP (resp. BPP) is the class of languages that can be accepted by a probabilistic polynomial-time algorithm with one-sided (resp. two-sided) error. Stronger results in this direction were given in later works. A different line of work deals with the construction of bit-generators that fool an observer of restricted computational power (e.g., generators against polynomial-size constant-depth circuits, or with bounded space and read-once access to the random tape). Most of these constructions need no unproven assumptions.

In our discussion, we are mostly interested in the role of pseudo-randomness in cryptography. A representative scenario is the following: Assume that two parties, A and B, share a secret n-bit long random string r, and that A wants to communicate to B an n-bit long message m in a way that prevents an eavesdropper from learning anything about m. In this case, A can use r as a one-time pad and send m ⊕ r (the bitwise XOR of m and r) to B. From Shannon's work it is known that this solution is optimal in the number of random bits shared by A and B. Thus, in order to obtain perfect security in the information-theoretical sense, every two parties in a distributed network need to share in advance a random string of length proportional to the total length of the messages that will ever be communicated between them. This is rarely practical; hence, A and B should consider trading the requirement "impossible to learn anything about m" with the weaker requirement "computationally infeasible to learn anything about m". This can be achieved using a pseudo-random generator G with output-length ℓ as follows: the encryption of an ℓ-bit message m (that is, the value A sends to B) can be defined to be E_r(m) = m ⊕ G(r). For every message m, the encryption E_r(m) is indistinguishable from random. It follows that for any two messages m and m', the encryptions E_r(m) and E_r(m') are indistinguishable from each other. We therefore have a way to encrypt an ℓ-bit message m such that the content of m is concealed, while using less than ℓ shared random bits. In fact, for any constant c, if there exists any pseudo-random generator, then there also exists one that outputs n^c bits on an n-bit input. Thus, in the scenario discussed above, if A and B communicate n^c bits (where n is the security parameter), it is enough for them to share an n-bit truly random secret.
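The encryption scheme above, E_r(m) = m ⊕ G(r), can be sketched in a few lines. As a hypothetical stand-in for the pseudo-random generator G we use SHAKE-128, an extendable-output hash whose output can be longer than its input; this illustrates the interface only and is not a proof that the construction is a PRG.

```python
import hashlib

def G(seed: bytes, out_len: int) -> bytes:
    # Hypothetical PRG stand-in: stretches a short seed to out_len bytes.
    return hashlib.shake_128(seed).digest(out_len)

def encrypt(seed: bytes, message: bytes) -> bytes:
    # E_r(m) = m xor G(r): a "computational one-time pad".
    pad = G(seed, len(message))
    return bytes(a ^ b for a, b in zip(message, pad))

# XOR-ing with the same pad inverts the encryption.
decrypt = encrypt
```

Note that, exactly as in the text, the point is that the shared seed can be much shorter than the message, at the price of computational (rather than perfect) secrecy, and that a seed should not be reused across messages.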

The examples we saw show that pseudo-random generators can save random bits and can reduce the length of keys used in cryptographic settings. We note that there are many other, perhaps more striking, examples where tasks that are impossible in the information-theoretical sense have a computational analogue that can be performed using pseudo-random generators. One such example are bit-commitment schemes. Loosely speaking, these are two-phase protocols between A and B: At the first phase, A commits itself to a bit b, and at the second phase, A reveals its commitment. The requirements are that at the end of the first phase, a polynomial-time bounded B would not be able to distinguish a commitment to 0 from a commitment to 1, and at the second stage, even an all-powerful A would not be able to reveal its commitment both as a commitment to 0 and as a commitment to 1. Naor showed how to use the fact that the output of a pseudo-random generator and the uniform distribution are simultaneously computationally indistinguishable and statistically very different in order to obtain a bit-commitment scheme.
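Naor's scheme can be sketched as follows (a simplified sketch; SHAKE-128 is a hypothetical stand-in for a PRG G stretching n bits to 3n bits, and the function names are illustrative). Hiding holds because both G(s) and G(s) ⊕ R are pseudo-random; binding holds because opening both ways would require two seeds with G(s_0) ⊕ G(s_1) = R, which for a uniformly chosen 3n-bit R happens with probability at most 2^(2n)/2^(3n) = 2^(-n).

```python
import hashlib, secrets

N = 16  # seed length in bytes; the scheme uses G: n bits -> 3n bits

def G(seed: bytes) -> bytes:
    # Hypothetical PRG stand-in stretching N bytes to 3N bytes.
    return hashlib.shake_128(seed).digest(3 * N)

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Phase 1: the receiver first sends a uniformly random 3n-bit string R;
# the sender commits to bit b with G(s) if b = 0, and G(s) xor R if b = 1.
def commit(b: int, R: bytes):
    s = secrets.token_bytes(N)
    c = G(s) if b == 0 else xor(G(s), R)
    return c, s  # c goes to the receiver, s stays secret

# Phase 2: the sender reveals (b, s); the receiver recomputes and checks.
def verify(c: bytes, R: bytes, b: int, s: bytes) -> bool:
    return c == (G(s) if b == 0 else xor(G(s), R))
```

This is exactly the contrast the text points to: the commitment is computationally hiding (pseudo-randomness) yet statistically binding (the PRG's range is a tiny fraction of all 3n-bit strings).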

Pseudo-Random Function Ensembles

As mentioned above, the research presented herein is focused on the construction of pseudo-random functions and permutations. These are efficient pseudo-random distributions of functions (resp. permutations) that can replace uniformly chosen functions (resp. permutations) in many applications. They are the key component of private-key cryptography and have many additional applications in cryptography and in computational complexity in general. In this section, we discuss the definitions and original constructions of these powerful cryptographic primitives, as well as some of their applications.

Definition

Pseudo-random functions were introduced by Goldreich, Goldwasser and Micali. These are efficient distributions of functions that are indistinguishable from the uniform distribution to an efficient (i.e., polynomial-time bounded) observer. There are two subtleties in the definition of a pseudo-random distribution of functions, F = {f_s}, compared with the definition of a pseudo-random distribution of bit sequences. The first issue is the exact meaning of an "efficient distribution" in this context. Here the answer is rather straightforward: It should be easy to sample a key s from the distribution and, given such a key, it should be easy to evaluate the corresponding function f_s on any input. The second issue is what kind of access does the distinguisher have to the function. This is a more delicate question.

Assume that F = {f_s} is a distribution of functions in B_n, where B_n is the set of all {0,1}^n → {0,1} functions (i.e., the set of binary functions with domain {0,1}^n). Consider an algorithm A that tries to distinguish F from the uniform distribution over B_n. The algorithm A has some access to a function g ∈ B_n, which with probability half is sampled from F and with probability half is uniformly distributed. A distinguishes F from random if it has a non-negligible advantage over half in guessing from which of the distributions g was sampled. The question is what kind of access should A have. In the actual definition of pseudo-random functions, the distinguisher A has black-box access to g. This means that at each step, A can specify an input x to g and obtain the value g(x). One may be tempted to strengthen the definition by giving the distinguisher additional access to the function. A first idea might be to give the 2^n-long bit-sequence of all the outputs of g as an input to A. However, this is an exponentially long input, which makes A (which is polynomial in its input length) too powerful; for example, this might allow A to verify whether g ∈ F by going over all possible keys in F. Another idea is to give A a key s whenever g = f_s ∈ F. However, given a short description, g can no longer be confused to be random. We stress that pseudo-random functions are only guaranteed to look random when the distinguisher does not get the key of the function. This means that usually pseudo-random functions cannot be used in a scenario where all the parties (including the "bad" ones) should be able to compute the functions on their own.

It still remains to determine how much freedom the distinguisher A should have in selecting the values it asks to see. In the actual definition, the distinguisher has an adaptive black-box access to g. The meaning of "adaptive" here is that each input x that A specifies may depend on the value of g on previous queries. This adaptiveness is part of what makes pseudo-random functions so strong. In some cases it makes sense to consider weaker notions. For example, one might consider a static attack, where the distinguisher has to specify all its queries in advance. Another interesting example is a random attack, where the distinguisher gets the value of g on uniformly distributed inputs. We call such functions weak pseudo-random functions and further consider them in Section. In particular, we show a relatively efficient way of constructing full-fledged pseudo-random functions from weak pseudo-random functions. We note that pseudo-random functions, as well as the weaker notions mentioned above, exist iff one-way functions exist. However, weaker notions are still interesting since they may be easier to construct efficiently (more on definitions of function families that are weaker than pseudo-random functions can be found in the literature).

Still, it is a worthwhile goal to come up with interesting definitions for functions that look random to an algorithm that gets their key, and with distributions of functions that satisfy such definitions (see some of the recent relevant work).


Applications

An equivalent formulation of pseudo-random functions is as a distribution of exponentially long bit-sequences that is indistinguishable from random to an efficient observer that has direct access to the sequence. This formulation stresses some of the properties that make pseudo-random functions so strong and so easy to work with, compared to pseudo-random generators: sharing (or fixing) a key of a pseudo-random function is equivalent, in many scenarios, to sharing (or fixing) a huge amount of randomness. What makes pseudo-random functions so powerful is that we do not need to know in advance how many bits will be used, or the location of these bits. Indeed, pseudo-random functions have numerous applications; a few examples appear in the literature.

Probably the most notable applications of pseudo-random functions are in private-key cryptography. They provide parties who share a common key straightforward protocols for sending secret messages to each other, for identifying themselves and for authenticating messages. We now describe the basic schemes for performing these three most common tasks of private-key cryptography (more elaborate schemes that achieve better security also exist). Consider a group of parties that share a pseudo-random function f_s. They may perform:

Authentication: To authenticate a message m, append the authentication tag f_s(m) to the message.

From the definition of pseudo-random functions, we have that this authentication scheme is existentially unforgeable against a chosen message attack: no efficient adversary that adaptively queries for the tags of chosen messages m_1, m_2, ..., m_q can produce the tag of any new message m.

Identification: A member of the group, V, determines if A is also a member by issuing a random "challenge" r and verifying that the response of A is f_s(r).

No efficient adversary (that is even allowed to participate in executions of the protocol as the verifier) can impersonate a member of the group.

Encryption: The encryption of a message m is defined to be ⟨r, f_s(r) ⊕ m⟩, where r is a uniformly chosen input.

This scheme is semantically secure against an efficient adversary that performs a chosen plaintext attack in the preprocessing mode (see the relevant terminology on attacks (chosen plaintext; chosen ciphertext in the preprocessing and postprocessing modes) and notions of security (semantic security and non-malleability)). Intuitively, it means that an adversary that is allowed to ask for encryptions (or decryptions), and later gets an encryption of a new message m, cannot learn anything about m.

For any implementation of f_s, this scheme is malleable (i.e., a ciphertext of a related message can be generated by an adversary) and hence not secure against a chosen ciphertext attack in the postprocessing mode (i.e., when the adversary queries for decryptions after getting the challenge). A way to amend this problem was proposed in the full version of the paper.
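The three basic private-key schemes above can be sketched concretely. In this illustrative sketch, HMAC-SHA256 stands in (hypothetically) for the shared pseudo-random function f_s; it is a practical substitute, not one of the constructions of this dissertation, and messages to be encrypted are assumed to fit in the 32-byte PRF output.

```python
import hashlib, hmac, secrets

def f(key: bytes, x: bytes) -> bytes:
    # Stand-in for the shared pseudo-random function f_s.
    return hmac.new(key, x, hashlib.sha256).digest()

# Authentication: append the tag f_s(m) to the message.
def authenticate(key: bytes, m: bytes):
    return m, f(key, m)

def check_tag(key: bytes, m: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(f(key, m), tag)

# Identification: V issues a random challenge r; A responds with f_s(r).
def respond(key: bytes, challenge: bytes) -> bytes:
    return f(key, challenge)

# Encryption: <r, f_s(r) xor m> for a uniformly chosen r.
def encrypt(key: bytes, m: bytes):
    r = secrets.token_bytes(16)
    pad = f(key, r)[:len(m)]  # m assumed at most 32 bytes here
    return r, bytes(a ^ b for a, b in zip(m, pad))

def decrypt(key: bytes, r: bytes, c: bytes) -> bytes:
    pad = f(key, r)[:len(c)]
    return bytes(a ^ b for a, b in zip(c, pad))
```

Note how all three protocols consist of little more than one PRF evaluation per operation, which is what the text means by "straightforward protocols".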


Note that it may be possible to use pseudo-random bit generators directly to enable private-key cryptography between two parties, A and B. The idea is that at each interaction A and B can use a different part of a pseudo-random sequence they share for their private-key operations, such as authentication, identification and encryption. However, for such an idea to work, care should be taken to synchronize the pointers A and B have to the sequence, which might be non-trivial in the presence of an adversary. Even if such synchronization can be achieved for two parties, it is impossible in the case of multi-party private-key cryptography. In this case, pseudo-random functions are indeed essential.

We stress that pseudo-random functions have many additional applications, including in public-key cryptography. For example, Bellare and Goldwasser showed how to use pseudo-random functions and a non-interactive zero-knowledge proof of their values to construct digital signatures. Another interesting example was given by Goldreich, who showed how to eliminate the state in the Goldwasser-Micali-Rivest signature scheme (the technique is very general). In addition, the existence of pseudo-random functions computable in some complexity class has profound consequences for computational complexity and for computational learning theory:

As was observed by Valiant, if a concept class contains pseudo-random functions then it cannot be learned: there exists a distribution of concepts computable in this class that is hard for every efficient learning algorithm, for every non-trivial distribution on the instances, even when membership queries are allowed. Note that such an unlearnability result is very strong (see Section for more details).

As shown by Razborov and Rudich, if a circuit class contains pseudo-random functions that are secure against a sub-exponential-time adversary, then there are no (what they called) Natural Proofs, which include all known lower-bound techniques, for separating this class from P/poly.

As we discuss in Section , these results imply interesting consequences of our more efficient constructions of pseudo-random functions.

The GGM-Construction

In addition to introducing pseudo-random functions, Goldreich, Goldwasser and Micali have suggested a construction of such functions from pseudo-random generators. Let G be a pseudo-random generator that expands the input by a factor of two. Define G_0 and G_1 such that for any n-bit string x, both G_0(x) and G_1(x) are n-bit strings, and G(x) = <G_0(x), G_1(x)>. Under the GGM-Construction, the key of a pseudo-random function f_s : {0,1}^n → {0,1}^n is a uniformly chosen n-bit string s. For any n-bit input x = x_1 x_2 ... x_n, the function f_s is defined by

    f_s(x) = G_{x_n}( ... G_{x_2}(G_{x_1}(s)) ... ).

For roughly a decade this was the only known construction of pseudo-random functions based on standard computational assumptions, including concrete assumptions (such as "factoring is hard") as well as general assumptions (such as the existence of one-way functions).
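The tree structure of the definition can be made explicit in a short sketch. SHA-256 serves here only as a stand-in for a length-doubling generator G (an assumption of this sketch, not part of the construction); the walk down the GGM tree is exactly the nested application above.

```python
# Sketch of the GGM construction from a length-doubling generator G,
# where G(x) = <G_0(x), G_1(x)>.  SHA-256 is a stand-in for G.
import hashlib

N = 16  # length of the seed s and of each G_b(x), in bytes

def G(x: bytes) -> bytes:
    """Stand-in length-doubling generator: |G(x)| = 2|x|."""
    return (hashlib.sha256(b"0" + x).digest()[:N] +
            hashlib.sha256(b"1" + x).digest()[:N])

def f(s: bytes, bits) -> bytes:
    """f_s(x_1 ... x_n) = G_{x_n}( ... G_{x_2}(G_{x_1}(s)) ... )."""
    v = s
    for b in bits:                      # one level of the tree per input bit
        out = G(v)
        v = out[N:] if b else out[:N]   # pick G_1 or G_0 of the current value
    return v
```

The loop makes the sequential nature of the construction plain: the n invocations of G cannot be parallelized, since each one needs the previous value.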

CHAPTER INTRODUCTION

Their construction is sequential in nature, since computing f_s consists of n successive invocations of G. The goal of the research described in Chapter is to present alternative constructions of pseudo-random functions that are more efficient, either in the parallel-time complexity or the sequential-time complexity of evaluating f_s, and preferably in both.

Pseudo-Random Permutation Ensembles

Pseudo-random permutations were introduced by Luby and Rackoff. The main motivation to consider pseudo-random permutations is that they formalize the well-established cryptographic notion of block ciphers. These are length-preserving private-key encryption schemes: the private key of a block cipher determines a permutation E, such that the encryption of a message m is E(m) and the decryption of a ciphertext c is E^{-1}(c). A highly influential example of a block cipher is the DES.

When a block cipher E is indistinguishable from a random permutation, the only information that is leaked from a sequence of encryptions {E(m_1), E(m_2), ..., E(m_k)} on the corresponding plaintexts is whether or not m_i = m_j for pairs i ≠ j (still, when using pseudo-random functions for private-key encryption, even this information is concealed). However, using a length-preserving encryption scheme has the advantage that the plaintext and ciphertext are of the same length. This property saves memory and prevents waste of communication bandwidth. Furthermore, it enables the easy incorporation of the encryption scheme into existing protocols or hardware components.

The definition Luby and Rackoff gave to pseudo-random permutation ensembles is closely related to the definition of pseudo-random functions described above. Pseudo-random permutation ensembles are efficient distributions of permutations that are indistinguishable from the uniform distribution to an efficient observer that has adaptive black-box access to the permutations (i.e., the distinguisher can only access the permutation by specifying inputs and obtaining the value of the permutation on these inputs). In addition, Luby and Rackoff considered a stronger notion of pseudo-randomness, which they call super pseudo-random permutation generators; here, the distinguisher also has adaptive black-box access to the inverse permutation. Following later work, we use the term strong pseudo-random permutation ensembles instead. The two notions of security are analogous to the different attacks considered in the context of block ciphers:

Pseudo-random permutations can be interpreted as block ciphers that are secure against an adaptive chosen-plaintext attack. Informally, this means that an efficient adversary with access to the encryptions of messages of its choice cannot tell apart those encryptions from the values of a truly random permutation.

Strong pseudo-random permutations can be interpreted as block ciphers that are secure against an adaptive chosen plaintext and ciphertext attack. Here the adversary has the additional power to ask for the decryption of ciphertexts of its choice.

Block ciphers cannot always be directly used to encrypt the entire message, since often they have a fixed and relatively small input length. In such a case, the block cipher is used in some mode of operation that enables the encryption of longer messages. The standard modes of operation proposed in the context of DES are ECB, CBC, CFB and OFB. Unfortunately, all these modes reveal information on the message or on relations between different messages. In Section we propose a different way to use block ciphers that does not have this problem.

OVERVIEW OF RESEARCH OBJECTIVES AND RESULTS

The LR-Construction

Luby and Rackoff provided a construction of strong pseudo-random permutations (the LR-Construction), which was motivated by the structure of DES. The basic building block is the so-called Feistel permutation, based on a pseudo-random function. A Feistel permutation for a function f : {0,1}^n → {0,1}^n is a permutation on {0,1}^{2n} defined by D_f(L, R) = (R, L ⊕ f(R)), where |L| = |R| = n. Each of the 16 rounds of DES involves a Feistel permutation of a function determined by the key bits. The Luby-Rackoff design of pseudo-random permutations (resp. strong pseudo-random permutations) is D_{f_3} ∘ D_{f_2} ∘ D_{f_1} (resp. D_{f_4} ∘ D_{f_3} ∘ D_{f_2} ∘ D_{f_1}), where all the f_i's are independent pseudo-random functions (see Figure a for an illustration). This elegant construction inspired a considerable amount of research. The work described in Chapter provides a study of the LR-Construction.
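The round structure is easy to state in code. The sketch below uses HMAC-SHA256-derived round functions purely as stand-ins for the independent pseudo-random functions f_1, ..., f_4; the Feistel step itself, D_f(L, R) = (R, L ⊕ f(R)), is as defined in the text.

```python
# Sketch of Feistel permutations and the four-round LR-Construction.
import hmac, hashlib

N = 16  # half-block length in bytes, so the permutation is on 2N bytes

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def feistel(f, L: bytes, R: bytes):
    """D_f: (L, R) -> (R, L xor f(R))."""
    return R, xor(L, f(R))

def feistel_inv(f, L: bytes, R: bytes):
    """Inverse of D_f; note only forward evaluations of f are needed."""
    return xor(R, f(L)), L

def round_functions(keys):
    # stand-in round functions f_i, keyed by independent keys
    return [lambda x, k=k: hmac.new(k, x, hashlib.sha256).digest()[:N]
            for k in keys]

def lr(keys, block: bytes) -> bytes:
    """Four rounds with independent f_i: the strong-PRP design."""
    L, R = block[:N], block[N:]
    for f in round_functions(keys):
        L, R = feistel(f, L, R)
    return L + R

def lr_inv(keys, block: bytes) -> bytes:
    L, R = block[:N], block[N:]
    for f in reversed(round_functions(keys)):
        L, R = feistel_inv(f, L, R)
    return L + R
```

A Feistel permutation is invertible even when f itself is not, which is what lets a (possibly one-way) round function yield a permutation.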

Overview of Research Objectives and Results

The research included here is composed of two main parts: (1) several constructions of pseudo-random functions; (2) a study of the LR-Construction and resultant constructions of pseudo-random permutations. In Section we have described the importance of pseudo-random functions and permutations in cryptography, and part of their applications. We also described the original constructions of these primitives. The many applications of pseudo-random functions and permutations motivated our search for more efficient and simpler constructions, where efficiency refers both to the sequential and the parallel time complexity of the computation. In this section we describe some of our results in that direction, as well as some of the consequences of our constructions. In addition, we briefly describe our research of some of the underlying cryptographic assumptions used in our constructions.

Constructions of Pseudo-Random Functions

For our constructions of pseudo-random functions, described in Chapter , we introduce and study a new cryptographic primitive which we call a pseudo-random synthesizer, and a generalization of this primitive which we call a k-dimensional pseudo-random synthesizer. These primitives are of independent interest as well. In addition, we consider various applications of our constructions and study some of the underlying cryptographic assumptions used in these constructions. Motivated by the inherent sequentiality of the GGM-Construction, we show in Section a parallel construction of pseudo-random functions from a pseudo-random synthesizer, and parallel constructions of pseudo-random synthesizers based on several concrete and general intractability assumptions. In Section we show even more parallel constructions of pseudo-random functions based on concrete intractability assumptions, such as the assumption that factoring Blum-integers is hard. In addition, these constructions are more efficient and have a simple algebraic structure. The constructions of Section are motivated by the constructions of Section , and in particular by the notion of k-dimensional synthesizers. We now describe the main results of Chapter .

Pseudo-Random Synthesizers and Functions

We introduce in Section a new cryptographic primitive which we call a pseudo-random synthesizer. A pseudo-random synthesizer is a two-variable function S(·, ·), so that if many (but polynomially bounded) random assignments <x_1, ..., x_m> and <y_1, ..., y_m> are chosen to both variables, then the output of S on all the combinations of these assignments, {S(x_i, y_j)}_{i,j=1}^m, is indistinguishable from random to a polynomial-time observer. Our main results are:

1. A construction of pseudo-random functions based on pseudo-random synthesizers (described in Sections ). Evaluating such a function involves log n phases, where each phase consists of several parallel invocations of a synthesizer, with a total of n invocations altogether.

2. Constructions of parallel (NC) synthesizers based on standard number-theoretic assumptions such as factoring is hard, RSA (it is hard to extract roots modulo a composite) and Diffie-Hellman (described in Section ). In addition, a very simple construction based on a problem from learning (described in Section ). The key-generating algorithm of these constructions is sequential for RSA and factoring, non-uniformly parallel for Diffie-Hellman, and parallel for the learning problem.

3. An extremely simple parallel construction of synthesizers based on what we call a weak pseudo-random function (described in Section ). A weak pseudo-random function is indistinguishable from a truly random function to a polynomial-time-bounded observer who gets to see the value of the function on uniformly distributed inputs (instead of inputs of its choice). This construction almost immediately implies constructions of synthesizers based on trapdoor one-way permutations and based on any hard-to-learn problem (under the definition cited there).

Taking (1) and (2) together, we get a pseudo-random function that can be evaluated in NC.
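The phase structure of the construction in item (1) can be sketched as follows. S below is a keyless SHA-256 stand-in (an assumption of this sketch; the actual synthesizers are the number-theoretic ones above), and the input length is taken to be a power of two for simplicity.

```python
# Sketch of the tree construction of a PRF from a two-variable synthesizer S.
import hashlib, secrets

def S(x: bytes, y: bytes) -> bytes:
    """Stand-in pseudo-random synthesizer S(x, y)."""
    return hashlib.sha256(x + y).digest()[:16]

def keygen(n: int):
    """The key holds a pair of random strings (a_{i,0}, a_{i,1}) per input bit."""
    return [(secrets.token_bytes(16), secrets.token_bytes(16))
            for _ in range(n)]

def prf(key, bits) -> bytes:
    """Select one string per input bit, then combine pairs with S, log n times."""
    level = [key[i][b] for i, b in enumerate(bits)]   # phase 0: selection
    while len(level) > 1:                             # each phase halves the list
        level = [S(level[i], level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]
```

For n = 8 this makes 4 + 2 + 1 = 7 ≈ n synthesizer calls in log n = 3 phases, and the calls within a phase are independent of one another, hence parallelizable.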

Our constructions of pseudo-random functions have additional attractive properties. First, it is possible to obtain from the constructions a sharp time-space tradeoff. Loosely speaking, by keeping m strings as the key, we can reduce the amount of work for computing the functions from n invocations of the synthesizer to about n/log m invocations, in log n − log log m phases (thus also reducing the parallel-time complexity). In addition, the construction obtains a nice incremental property: for any y of Hamming distance one from x, given the computation of f(x), we can compute f(y) with only log n invocations of the synthesizer (we can also make this property hold for y = x + 1). We discuss both properties in Section .


Concrete Constructions of Pseudo-Random Functions

This part of the work, described in Section , studies the efficient construction of several fundamental cryptographic primitives. Our major results are two related constructions of pseudo-random functions based on number-theoretic assumptions. The first construction gives pseudo-random functions if and only if the decisional version of the Diffie-Hellman assumption (DDH-Assumption) holds. The second construction is at least as secure as the assumption that factoring the so-called Blum-integers is hard. Having efficient pseudo-random functions based on factoring is very desirable, since this is one of the most established concrete intractability assumptions used in cryptography. The construction based on the DDH-Assumption is also attractive, since these pseudo-random functions are even more efficient, in that they have a larger output size, and since the construction is more security-preserving, as described below. We study the DDH-Assumption in Section . The properties of our new pseudo-random functions are:

Efficiency: Computing the value of the function at a given point is comparable with two modular exponentiations, and is more efficient by a factor of n than any previous proposal that is proven to be as secure as some computational intractability assumption. This is essential for the efficiency of the many applications of pseudo-random functions.
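To make "comparable with two modular exponentiations" concrete, here is a toy sketch of a DDH-style function of the form f_a(x) = g^(a_0 · ∏_{x_i=1} a_i) over a prime-order subgroup. The exact construction and its proof are deferred to Chapter ; the specific form, the group, and the (far too small) parameters below are assumptions of this sketch only.

```python
# Toy sketch of a DDH-based PRF: f_a(x) = g^(a_0 * prod_{x_i=1} a_i) mod P,
# computed in the order-Q subgroup of Z_P*.  Parameters are illustrative only.
import secrets

Q = 1019            # a prime group order
P = 2 * Q + 1       # P = 2039 is also prime, so Z_P* has a subgroup of order Q
g = 4               # a square mod P, hence a generator of that subgroup

def keygen(n: int):
    """Key: exponents a_0, a_1, ..., a_n, each uniform in [1, Q-1]."""
    return [secrets.randbelow(Q - 1) + 1 for _ in range(n + 1)]

def prf(a, bits) -> int:
    e = a[0]
    for a_i, b in zip(a[1:], bits):
        if b:
            e = (e * a_i) % Q       # cheap: a product modulo the group order
    return pow(g, e, P)             # one modular exponentiation at the end
```

Almost all of the work is the single pow at the end, which is also why, given appropriate preprocessing of the key, such functions parallelize so well.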

Depth: Given appropriate preprocessing of the key, the value of the functions at any given point can be computed in TC^0 (compared with TC^1 for the concrete constructions in Section ). Therefore this construction:

Achieves reduced latency for computing the functions in parallel and in hardware implementations.

Has profound consequences for computational complexity (i.e., Natural Proofs) and for computational learning theory.

Simplicity: The simple algebraic structure of the functions implies additional desirable features. To demonstrate this, we show in Section a simple zero-knowledge proof for the value of the function, and other protocols. We suggest the task of designing additional protocols and improving the current ones as an interesting line for further research.

We note that our constructions, both in Section and Section , do not weaken the security of the underlying assumptions. Take, for instance, the constructions that are based on factoring. If there is an algorithm for breaking this construction in time t and success ε (success means that the observer has an advantage of at least ε in distinguishing the pseudo-random function from a random one), then there is an algorithm that works in time poly(t/ε) and factors Blum-integers with probability poly(ε/t). In the terminology of security-preserving reductions, such a reduction is called poly-preserving. In fact, most of our reductions are even more secure than that. In particular, the construction of pseudo-random functions based on the DDH-Assumption, described in Section , is linear-preserving: if there is an algorithm for breaking this construction in time t and success ε, then there is an algorithm that works in time t · poly(n), where n is the security parameter, and breaks the DDH-Assumption with probability comparable to ε. This better security is partly due to the fact that this construction does not rely on hard-core bits, which are not known to be linear-preserving.

In fact, we prove the security of the second construction based on a generalized version of the computational DH-Assumption (GDH-Assumption). However, breaking the GDH-Assumption modulo a composite would imply an efficient algorithm for factorization.

Some Consequences of the Functions

As mentioned above, the original motivation of our constructions was to overcome the inherent sequentiality of the GGM-Construction and to come up with pseudo-random functions that have shallow depth. In Section we show parallel constructions of pseudo-random functions based on several general and concrete intractability assumptions. The functions constructed under concrete intractability assumptions are computable in NC^2 (or actually in TC^1), given appropriate preprocessing of their key. The functions constructed in Section are in TC^0, which is the class of functions computable by constant-depth circuits consisting of a polynomial number of threshold gates, and therefore they are also in NC^1. These functions have the additional advantage of being efficient and of having a simple algebraic structure. We first discuss the motivation for having parallel pseudo-random functions, and then the motivation for the additional properties of our functions.

For some applications of pseudo-random functions, minimizing the latency of computing the functions is essential. Such an application is the encryption of messages on a network, where the latency of computing the function is added to the latency of the network. Having shallow-depth pseudo-random functions implies reduced latency in computing those functions in hardware and parallel implementations (note that pseudo-random functions are likely to be implemented in hardware, as is the case for DES).

Many of the applications of pseudo-random functions preserve the parallel-time complexity of the functions. An important example is the LR-Construction of pseudo-random permutations from pseudo-random functions, which is further discussed in Chapter . Therefore, our constructions yield parallel strong pseudo-random permutations as well.

There is a deep connection between pseudo-random functions and hardness results for learning. Since a random function cannot be learned, if a concept class is strong enough to contain pseudo-random functions then we cannot hope to learn it efficiently: there exists a distribution of concepts computable in this class that is hard for every efficient learning algorithm, for every non-trivial distribution on the instances, even when membership queries are allowed. Therefore, a typical result that can be obtained from our constructions is that if factoring is hard then TC^0 cannot be learned. Notice that the unlearnability result implied by the existence of pseudo-random functions is very strong. Since no construction of pseudo-random functions in NC was known, weaker unlearnability results for NC^1 and TC^0, based on cryptographic assumptions, were obtained in previous work (see Section for more details). It is also interesting

to compare with the result of Linial, Mansour and Nisan, who showed that AC^0 can be learned in time slightly super-polynomial under the uniform distribution on the examples.

Another application of pseudo-random functions in complexity was suggested by Razborov and Rudich. They showed that if a circuit class contains pseudo-random functions that are secure against a sub-exponential-time adversary, then there are no (what they called) Natural Proofs, which include all known lower-bound techniques, for separating this class from P/poly. Therefore, a typical result that can be obtained from our construction is that if factoring cannot be carried out in sub-exponential time, then there are no Natural Proofs for separating TC^0 from P/poly.

We note that one can extract a similar result (assuming the hardness of factoring) from the work of Kharitonov, which is based on the pseudo-random generator of Blum, Blum and Shub.

Besides being more parallelizable, the constructions of pseudo-random functions that are described in Section have two additional advantages over previous ones:

It is efficient: computing the value of the function at any given point is comparable with two exponentiations. This is the first construction that seems efficient enough to be implemented, and indeed these functions were implemented by Langberg. Given the many applications of pseudo-random functions, it is clear that having efficient pseudo-random functions is an important goal.

It has a simple algebraic structure. To see our main motivation here, consider the Bellare-Goldwasser signature scheme. The public key in this scheme contains a commitment to a key s of a pseudo-random function. The signature of a message m is composed of a value y and a non-interactive zero-knowledge proof that y = f_s(m). In order for this scheme to be attractive, we must have a simple non-interactive zero-knowledge proof for claims of the form y = f_s(m). In this and other scenarios we might wish to have additional properties for the functions, such as a simple function-sharing scheme. It seems that for such properties to be possible we need a simple construction of pseudo-random functions.

In Section we consider some desirable features of pseudo-random functions. We also present preliminary results in obtaining these features for our construction of pseudo-random functions: (1) a rather simple zero-knowledge proof for claims of the form y = f_s(m) and y ≠ f_s(m); (2) a way to distribute a pseudo-random function among a set of parties, such that only an authorized subset can compute the value of the function at any given point; (3) a protocol for oblivious evaluation of the value of the function: assume that a party A knows a key s of a pseudo-random function; then A and a second party B can perform a protocol during which B learns exactly one value f_s(x) of its choice, whereas A does not learn a thing, and in particular does not learn x. We consider the task of improving these protocols and designing additional ones to be an interesting line for further research.


Related Work

In addition to introducing pseudo-random functions, Goldreich, Goldwasser and Micali have suggested a construction of such functions from pseudo-random generators that expand the input by a factor of two (like the one described above). As mentioned above, the GGM construction is sequential in nature. An idea of Levin is to select some secret hash function h and apply the GGM construction to h(x) instead of x. If |h(x)| = log^2 n, then the depth of the GGM tree is only log^2 n, and presumably we get a pseudo-random function in NC. The problem with this idea is that we have decreased the security significantly: with probability n^{-log n}, the function can be broken, irrespective of the security guaranteed by the pseudo-random generator. To put this construction in the correct light, suppose that for security parameter k we have some problem whose solution requires time 2^k on instances of length polynomial in k. If we would like to have 2^k security for our pseudo-random function, then the Levin construction requires depth k, whereas our construction requires depth log k.

Impagliazzo and Naor have provided parallel constructions for several other cryptographic primitives, based on the hardness of subset sum and of factoring. The primitives include pseudo-random generators that expand the input by a constant factor, universal one-way hash functions, and strong bit-commitments.

Blum et al. proposed a way of constructing in parallel several cryptographic primitives based on problems that are hard to learn. We extend their result by showing that hard-to-learn problems can be used to obtain synthesizers, and thus pseudo-random functions.

A different line of work, more relevant to derandomization and saving random bits, is to construct bit-generators such that their output is indistinguishable from a truly random source to an observer of restricted computational power (e.g., generators against polynomial-size constant-depth circuits). Most of these constructions need no unproven assumptions.

It turns out that a number of researchers observed that the average-case DDH-Assumption yields pseudo-random generators with good expansion. One such construction was proposed by Rackoff (unpublished). A different construction is suggested by Gertner and Malkin. This construction is similar to the pseudo-random generator one gets by scaling down our pseudo-random functions.

A Study of Some Number-Theoretical Assumptions

The constructions of pseudo-random functions described in Section are based on two number-theoretical assumptions that are related to the Diffie-Hellman key-exchange protocol. The first is the decisional version of the Diffie-Hellman assumption (DDH-Assumption) and the second is the generalized version of the computational Diffie-Hellman assumption (GDH-Assumption). To better understand the security of our constructions, we include in Chapter a study of these assumptions.

The DDH-Assumption is relatively new or, more accurately, was explicitly considered only recently. In Section we survey some of the different applications of the assumption and the current knowledge on its security. Furthermore, we show a randomized reduction of the worst-case DDH-Assumption to its average case, based on the random-self-reducibility of the DDH-Problem (previously used by Stadler). We consider our research of the DDH-Assumption to be of independent importance, given that the assumption was recently used in quite a few interesting applications.

They also provided a construction of AC^0 pseudo-random generators with small expansion.

The GDH-Assumption was originally considered in the context of a generalization of the Diffie-Hellman key-exchange protocol to k parties. In Section we prove that breaking this assumption modulo a so-called Blum-integer would imply an efficient algorithm for factoring Blum-integers. Therefore, both the generalized key-exchange protocol and our pseudo-random functions that are based on the GDH-Assumption are secure as long as factoring Blum-integers is hard. Our reduction strengthens a previous worst-case reduction of Shmuely.

A Study of the LR-Construction

As described above, in addition to defining strong pseudo-random permutations, Luby and Rackoff have provided a construction of such permutations (the LR-Construction), which was motivated by the structure of DES. The basic building block is the so-called Feistel permutation, based on a pseudo-random function defined by the key. Their construction consists of four rounds of Feistel permutations (or three rounds for pseudo-random permutations), where each round involves an application of a different pseudo-random function (see Figure a for an illustration).

The part of our research described in Chapter is a study of the LR-Construction. We reduce somewhat the complexity of the construction, and simplify its proof of security, by showing that two Feistel permutations are sufficient together with initial and final pairwise independent permutations. Minimizing the number of invocations of pseudo-random functions makes sense, since all known constructions of pseudo-random functions involve non-trivial (though, of course, polynomial-time) computations. The revised construction and proof provide a framework (described in Section ) in which similar constructions may be designed and their security can be easily proved. We demonstrate this by presenting some additional adjustments of the construction.

Alongside cryptographic pseudo-randomness, the last two decades saw the development of the notion of limited independence, in various settings and formulations. For a family of functions F to have some sort of limited independence means that if we consider the value of a function f (chosen uniformly at random from F) at each point as a random variable, in the probability space defined by choosing f, then these random variables possess the promised independence property. Thus, a family of permutations on {0,1}^n is pairwise independent if for all x ≠ y the values of f(x) and f(y) are uniformly distributed over pairs of strings a, b ∈ {0,1}^n such that a ≠ b. Functions of limited independence are typically much simpler to construct and easier to compute than cryptographic pseudo-random functions.

A Feistel permutation for a function f : {0,1}^n → {0,1}^n is a permutation on {0,1}^{2n} defined by D_f(L, R) = (R, L ⊕ f(R)), where |L| = |R| = n. Each of the 16 rounds of DES involves a Feistel permutation of a function determined by the key bits.
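Pairwise independence is indeed cheap to realize. For example, over a prime field the affine maps h_{a,b}(x) = a·x + b mod p with a ≠ 0 form a pairwise independent family of permutations (the choice of this particular family, and the toy modulus, are assumptions of this sketch):

```python
# Sketch of a pairwise independent family of permutations on Z_p:
# h_{a,b}(x) = a*x + b mod p, with a != 0 and p prime.  For random (a, b)
# and any fixed x != y, the pair (h(x), h(y)) is uniform over distinct pairs.
import secrets

p = 257  # a toy prime modulus

def sample_permutation():
    a = secrets.randbelow(p - 1) + 1   # a != 0 guarantees a bijection
    b = secrets.randbelow(p)
    return lambda x: (a * x + b) % p
```

Two field operations per evaluation is all it takes, versus the non-trivial cost of evaluating a cryptographic pseudo-random function.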

New Results

The goal of this research is to provide a better understanding of the LR-Construction and, as a result, to improve the construction in several respects. Our main observation is that the different rounds of the LR-Construction serve significantly different roles. We show that the first and last rounds can be replaced by pairwise independent permutations, and use this in order to:

1. Simplify the proof of security of the construction (especially in the case of strong pseudo-random permutations) and provide a framework for proving the security of similar constructions.

2. Derive generalizations of the construction that are of practical and theoretical interest. The proof of security for each one of the constructions is practically free of charge, given the framework.

3. Achieve an improvement in the computational complexity of the pseudo-random permutations: two applications of a pseudo-random function on n bits suffice for computing the value of a pseudo-random permutation on 2n bits at a given point (vs. four applications in the original LR-Construction). This implies that the reduction is optimal.
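The resulting shape, a pairwise independent permutation, two Feistel rounds, and a final pairwise independent permutation, can be sketched on toy 8-bit blocks. The GF(2^8) affine family, and truly random 4-bit round functions standing in for pseudo-random ones, are illustrative assumptions of this sketch; the precise construction and its analysis appear in Chapter .

```python
# Toy sketch: pairwise independent permutation, two Feistel rounds,
# pairwise independent permutation, on 8-bit blocks.
import secrets

def gf_mul(a: int, b: int) -> int:
    """Multiplication in GF(2^8) modulo x^8 + x^4 + x^3 + x + 1."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
    return r

def pairwise_perm():
    """h(x) = a*x + b over GF(2^8) with a != 0: pairwise independent."""
    a = secrets.randbelow(255) + 1
    b = secrets.randbelow(256)
    return lambda x: gf_mul(a, x) ^ b

def feistel(f, x: int) -> int:
    """D_f on 8 bits: (L, R) -> (R, L xor f(R)) on 4-bit halves."""
    L, R = x >> 4, x & 0xF
    return (R << 4) | (L ^ f(R))

def two_round_sandwich():
    h1, h2 = pairwise_perm(), pairwise_perm()
    f1 = [secrets.randbelow(16) for _ in range(16)]  # stand-in for a PRF
    f2 = [secrets.randbelow(16) for _ in range(16)]  # stand-in for a PRF
    return lambda x: h2(feistel(lambda r: f2[r],
                                feistel(lambda r: f1[r], h1(x))))
```

Only the two round functions need to be pseudo-random; the outer layers are the cheap pairwise independent permutations, which is the source of the improvement in computational complexity.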

As discussed in Section , the new construction is in fact a generalization of the original LR-Construction. Thus, the proof of security we give (Theorem ) also applies to the original construction. We also show how the main construction can be relaxed by (1) using a single pseudo-random function instead of two, and (2) using weaker (and more efficient) permutations or functions instead of the pairwise independent permutations. The main generalizations of our construction, described in Sections , can briefly be described as follows:

1. Using t rounds of (generalized) Feistel permutations instead of two, the success probability of the distinguisher is reduced from approximately m^2/2^n to approximately m^2/2^{t·n} (where the permutation is on 2t·n bits and the distinguisher makes at most m queries); see Figure for an illustration.

2. Instead of applying the Feistel permutations on the entire outputs of the first and second rounds, Feistel permutations can be separately applied on each one of their sub-blocks. This is a construction of a strong pseudo-random permutation on many blocks using pseudo-random functions on a single block (see Figure for an illustration).

Finally, we analyze in Section the different constructions of this work as constructions of k-wise δ-dependent permutations.


Related Work

The LR-Construction inspired a considerable amount of research. We try to refer to the part of these directions most relevant to this work.

Several alternative proofs of the LR-Construction were presented over the years. Maurer gives a proof of the three-round construction. His proof concentrates on the non-adaptive case (i.e., when the distinguisher has to specify all its queries in advance). A point worth noticing is that indistinguishability under non-adaptive attacks does not necessarily imply indistinguishability under adaptive attacks. For example, a random involution (an involution is a permutation which is the inverse of itself) and a random permutation are indistinguishable under non-adaptive attacks, yet can be distinguished using a very simple adaptive attack. A different approach toward the proof was described by Patarin; this is the only published proof we are aware of for the LR-Construction of strong pseudo-random permutations (another proof was given by Koren).

Other papers consider the security of possible variants of the construction. A significant portion of this research deals with the construction of pseudo-random permutations and strong pseudo-random permutations from a single pseudo-random function. This line of work is described in Section .

Lucks shows that a hash function can replace the pseudo-random function in the first round of the three-round LR-Construction. His proof builds on an earlier one, and is motivated by his suggestion to use the LR-Construction when the input is divided into two unequal parts. Lucks left open the question of the construction of strong pseudo-random permutations.

Somewhat different questions were considered by Even and Mansour and by Kilian and Rogaway. Loosely speaking, the former construct several pseudorandom permutations from a single one, while the latter show how to make exhaustive key-search attacks more difficult. The construction itself amounts in both cases to XORing the input of the pseudorandom permutation with a random key and XORing the output of the permutation with a second random key. This construction is essentially DESX, which was invented by Ron Rivest (see details and references in ).

The background and related work concerning other relevant issues are discussed in the appropriate sections herein: definitions and constructions of efficient hash functions in Section , reducing the distinguishing probability in Section , and the construction of pseudorandom permutations (or functions) with large input-length from pseudorandom permutations (or functions) with small input-length in Section .

An even more striking example is obtained by comparing a random permutation P that satisfies P(P(x)) = x with a truly random permutation.


Chapter

Preliminaries

Notation

We start by describing the notation used in the subsequent chapters. Additional notation will be described in the relevant chapters.

- N denotes the set of all natural numbers.

- For any integer N ∈ N, the multiplicative group modulo N is denoted by Z*_N, and the additive group modulo N is denoted by Z_N.

- For any integer k, denote by [k] the set of integers {1, ..., k}. For any two integers k ≤ m, denote by [k, m] the set of integers {k, k + 1, ..., m}.

- I_n denotes the set of all n-bit strings, {0, 1}^n.

- U_n denotes the random variable uniformly distributed over I_n.

- Let x be any bit string; we denote by |x| its length (i.e., the number of bits in x). This should not be confused with the usage of |·| to denote absolute values.

- Let x and y be two bit strings of equal length; then x ⊕ y denotes their bit-by-bit exclusive-or and x ⊙ y denotes their inner product mod 2.

- Let x and y be any two bit strings; then x · y denotes the string x concatenated with y.

- For any two functions f and g such that the range of g is the domain of f, denote by f ∘ g their composition (i.e., f ∘ g(x) = f(g(x))).

Pseudo-Randomness Definitions

Pseudo-randomness is the main notion studied by this work. In Section we described the importance of pseudorandom functions and permutations in cryptography, part of their applications, and their original constructions. In this section we give the definitions of pseudorandom functions and permutations. A motivating discussion of these definitions also appears in Section .

We give all definitions in this section for some sequence of domains {A_n, B_n}_{n∈N}. Let F_n be the set of all A_n → B_n functions, and let P_n be the set of all permutations over A_n. We will often take A_n = B_n = I_n; in particular, we will often concentrate on length-preserving functions. In some places in this work we make the domains explicit; for example, in some places we use a phrase of the form "an {A_n → B_n} function ensemble".

A function ensemble is a sequence H = {H_n}_{n∈N} such that H_n is a distribution over F_n; H is the uniform function ensemble if H_n is uniformly distributed over F_n. A permutation ensemble is a sequence H = {H_n}_{n∈N} such that H_n is a distribution over P_n; H is the uniform permutation ensemble if H_n is uniformly distributed over P_n.

A function ensemble (or a permutation ensemble) H = {H_n}_{n∈N} is efficiently computable if the distribution H_n can be sampled efficiently and the functions in H_n can be computed efficiently. That is, there exist probabilistic polynomial-time Turing machines I and V, and a mapping φ from strings to functions, such that φ(I(1^n)) and H_n are identically distributed and V(i, x) = (φ(i))(x) (so, in fact, H_n ≡ φ(I(1^n))). We denote by f_i the function assigned to i (i.e., f_i def= φ(i)). We refer to i as the key of f_i and to I as the key-generating algorithm of F.

We would like to consider efficiently computable function (or permutation) ensembles that cannot be efficiently distinguished from the uniform ensemble. In our setting, the distinguisher is an oracle machine that on input 1^n can make queries to a function in F_n (or to a permutation, or permutations, in P_n) and outputs a single bit. We assume that on input 1^n the oracle machine makes only queries in A_n; therefore, n serves as the security parameter. An oracle machine has an interpretation both under the uniform complexity model and under the non-uniform model: in the former it is interpreted as a Turing machine with a special oracle-tape (in this case, "efficient" means probabilistic polynomial-time), and in the latter as a circuit-family with special oracle-gates (in this case, "efficient" means polynomial-size). The discussion of this work is independent of the chosen interpretation.

Let M be an oracle machine, let f be a function in F_n, and let H_n be a distribution over F_n. Denote by M^f(1^n) the distribution of M's output when its queries are answered by f, and denote by M^{H_n}(1^n) the distribution M^f(1^n) where f is distributed according to H_n. We would also like to consider oracle machines with access both to a permutation and to its inverse. Let M be such a machine, let f be a permutation in P_n, and let H_n be a distribution over P_n. Denote by M^{f, f^{-1}}(1^n) the distribution of M's output when its queries are answered by f and f^{-1}, and denote by M^{H_n, H_n^{-1}}(1^n) the distribution M^{f, f^{-1}}(1^n) where f is distributed according to H_n.

n

Definition (advantage). Let M be an oracle machine and let H = {H_n}_{n∈N} and H' = {H'_n}_{n∈N} be two function (or permutation) ensembles. We call the function

  | Pr[M^{H_n}(1^n) = 1] - Pr[M^{H'_n}(1^n) = 1] |

the advantage M achieves in distinguishing between H and H'.

Let M be an oracle machine and let H = {H_n}_{n∈N} and H' = {H'_n}_{n∈N} be two permutation ensembles. We call the function

  | Pr[M^{H_n, H_n^{-1}}(1^n) = 1] - Pr[M^{H'_n, H'_n^{-1}}(1^n) = 1] |

the advantage M achieves in distinguishing between ⟨H, H^{-1}⟩ and ⟨H', H'^{-1}⟩.

Definition (distinguish). We say that M ε-distinguishes between H and H' (resp. ⟨H, H^{-1}⟩ and ⟨H', H'^{-1}⟩) if for infinitely many n the advantage M achieves in distinguishing between H and H' (resp. ⟨H, H^{-1}⟩ and ⟨H', H'^{-1}⟩) is at least ε(n).

Definition (negligible functions). A function h : N → R is negligible if for every constant c ≥ 0 and all sufficiently large n,

  h(n) < 1/n^c.

Definition (PFE). Let H = {H_n}_{n∈N} be an efficiently computable function ensemble and let R = {R_n}_{n∈N} be the uniform function ensemble. H is a pseudorandom function ensemble if for every efficient oracle-machine M, the advantage M has in distinguishing between H and R is negligible.

Definition (PPE). Let H = {H_n}_{n∈N} be an efficiently computable permutation ensemble and let R = {R_n}_{n∈N} be the uniform permutation ensemble. H is a pseudorandom permutation ensemble if for every efficient oracle-machine M, the advantage M has in distinguishing between H and R is negligible.

Definition (SPPE). Let H = {H_n}_{n∈N} be an efficiently computable permutation ensemble and let R = {R_n}_{n∈N} be the uniform permutation ensemble. H is a strong pseudorandom permutation ensemble if for every efficient oracle-machine M, the advantage M has in distinguishing between ⟨H, H^{-1}⟩ and ⟨R, R^{-1}⟩ is negligible.

Remark . We use the phrase "f is a pseudorandom function" as an abbreviation for "f is distributed according to a pseudorandom function ensemble", and similarly for "f is a pseudorandom permutation" and "f is a strong pseudorandom permutation".

Remark . In the definitions above (and in the rest of this work) we interpret "efficient computation" as probabilistic polynomial-time and "negligible" as smaller than 1/poly. This is a rather standard choice and it significantly simplifies the presentation. However, all the proofs in this work easily imply more quantitative results (see, e.g., Remark ). Moreover, all the reductions of this work are security-preserving in the sense of , as is also discussed in the introduction.


k-Wise Independent Functions and Permutations

The notions of k-wise independent functions and k-wise almost-independent functions (under several different formulations) play a major role in contemporary computer science. These are distributions of functions such that their value on any given k inputs is uniformly (or almost uniformly) distributed. Several constructions of such functions and a large variety of applications have been suggested over the years.

We briefly review the definitions of k-wise independence and k-wise δ-dependence. The definitions of pairwise independence and pairwise δ-dependence can be derived by taking k = 2.

Definition (variation distance). Let D_1 and D_2 be two distributions defined over the same domain; the variation distance (or statistical distance) between D_1 and D_2 is

  ||D_1 - D_2|| = 1/2 · Σ_x |D_1(x) - D_2(x)|.
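The definition can be computed directly for small finite distributions; a minimal Python sketch, where distributions are represented as dictionaries mapping outcomes to probabilities (an illustrative choice):

```python
def variation_distance(d1, d2):
    # ||D1 - D2|| = (1/2) * sum_x |D1(x) - D2(x)|,
    # summed over the union of the two supports.
    support = set(d1) | set(d2)
    return 0.5 * sum(abs(d1.get(x, 0.0) - d2.get(x, 0.0)) for x in support)

uniform = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}
biased  = {0: 0.25, 1: 0.25, 2: 0.5}
assert variation_distance(uniform, uniform) == 0.0
assert variation_distance(uniform, biased) == 0.25
```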

Definition . Let A and B be two sets, let k be an integer (k ≤ |A|), let δ ≥ 0, and let F be a distribution of A → B functions. Let x_1, x_2, ..., x_k be k distinct members of A, and consider the following two distributions:

  1. ⟨f(x_1), f(x_2), ..., f(x_k)⟩, where f is distributed according to F.

  2. The uniform distribution over B^k.

F is k-wise independent if for all x_1, x_2, ..., x_k the two distributions are identical. F is k-wise δ-dependent if for all x_1, x_2, ..., x_k the two distributions are of variation distance at most δ.

These definitions are naturally extended to permutations:

Definition . Let A be a set, let k be an integer (k ≤ |A|), let δ ≥ 0, and let F be a distribution of permutations over A. Let x_1, x_2, ..., x_k be k distinct members of A, and consider the following two distributions:

  1. ⟨f(x_1), f(x_2), ..., f(x_k)⟩, where f is distributed according to F.

  2. The uniform distribution over sequences of k distinct elements of A.

F is k-wise independent if for all x_1, x_2, ..., x_k the two distributions are identical. F is k-wise δ-dependent if for all x_1, x_2, ..., x_k the two distributions are of variation distance at most δ.

The connection of this work to k-wise independence is bidirectional, as described in the following two paragraphs.

Pairwise independent permutations are used in several places in this work (e.g., in Section ), and in an especially substantial way in Chapter : as shown in Section , pairwise independent permutations can replace the first and fourth rounds of the LR-Construction. Let A be a finite field. The permutation f_{a,b}(x) def= a · x + b, where a ∈ A \ {0} and b ∈ A are uniformly distributed, is pairwise independent. Thus there are pairwise independent permutations over I_n (the permutations f_{a,b} with operations over GF(2^n)). In Section it is shown that we can use even more efficient functions and permutations in our construction; in particular, we define and consider the concept of AXU functions.
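The pairwise independence of f_{a,b} can be verified exhaustively in a toy field. The sketch below works over a small prime field Z_p rather than GF(2^n), purely to keep the arithmetic elementary; this substitution is an assumption of the illustration, not of the construction.

```python
import random

def pairwise_independent_permutation(p):
    # f_{a,b}(x) = a*x + b over Z_p (p prime), with a != 0 so that f is
    # a permutation.  The text uses GF(2^n); a prime field is used here
    # only to keep the arithmetic elementary.
    a = random.randrange(1, p)
    b = random.randrange(p)
    return lambda x: (a * x + b) % p

# Exhaustive pairwise-independence check over the whole family for p = 5:
p, x1, x2 = 5, 0, 3
counts = {}
for a in range(1, p):
    for b in range(p):
        pair = ((a * x1 + b) % p, (a * x2 + b) % p)
        counts[pair] = counts.get(pair, 0) + 1
# (f(x1), f(x2)) is uniform over pairs of *distinct* elements -- exactly
# the distribution produced by a truly random permutation on two points.
assert all(c == 1 for c in counts.values())
assert len(counts) == p * (p - 1)
```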

In contrast with the case of pairwise independent permutations, we are not aware of any good constructions of k-wise δ-dependent permutations for general k and δ. The LR-Construction offers a partial solution to this problem (partial because of the bounded value of δ that can be achieved): using k-wise δ-dependent functions on n bits instead of pseudorandom functions in the LR-Construction yields a k-wise δ'-dependent permutation on 2n bits for δ' = O(δ + k^2/2^n). In Section we analyze the different constructions of Chapter as constructions of k-wise δ-dependent permutations.


Chapter

A Study of Some Number-Theoretical Assumptions

The constructions of pseudorandom functions in Section are based on two number-theoretical assumptions that are related to the Diffie-Hellman key-exchange protocol. The first is the decisional version of the Diffie-Hellman assumption (the DDH-Assumption) and the second is the generalized version of the computational Diffie-Hellman assumption (the GDH-Assumption). To better understand the security of our constructions, we include in this section a study of these assumptions.

The Decisional Diffie-Hellman Assumption

The construction of pseudorandom functions that is described in Section is based on the DDH-Assumption. This assumption is relatively new (or, more accurately, was explicitly considered only recently). We therefore devote this section to a discussion of the DDH-Assumption: we describe and define the assumption and consider some of its different applications and the current knowledge on its security. Furthermore, we show in Section a randomized reduction of the worst-case DDH-Assumption to its average case.

Background

The DH-Assumption was introduced in the context of the Diffie and Hellman key-exchange protocol. Informally, a key-exchange protocol is a way for two parties, A and B, to agree on a common key K_{A,B} while communicating over an insecure (but authenticated) channel. Such a protocol is secure if any efficient third party C, with access to the communication between A and B (but not to their private random strings), cannot tell apart K_{A,B} from a random value (i.e., K_{A,B} is pseudorandom to C). This guarantees that it is computationally infeasible for an eavesdropper to gain any partial information on K_{A,B}.

Let P be a large prime, publicly known. All exponentiations in the rest of this section (Section ) are in Z*_P; to simplify the exposition, we omit the expression "mod P" from now on. Given a generator g of Z*_P, the Diffie-Hellman key-exchange protocol goes as follows: A chooses an integer a uniformly at random in [P - 1] and sends g^a to B. In return, B chooses an integer b uniformly at random in [P - 1] and sends g^b to A. Both A and B can now compute g^{ab}, and their common key K_{A,B} is defined by g^{ab} in some publicly known manner. For this protocol to be secure we must have, at the minimum, that the standard computational version of the Diffie-Hellman assumption (the DH-Assumption) holds:

  Given ⟨g, g^a, g^b⟩, it is hard to compute g^{ab}.

The reason is that if this assumption does not hold, then C (as above) can also compute K_{A,B}.
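The protocol just described can be sketched concretely. The toy parameters P = 23, g = 5 are illustrative assumptions only; a real instantiation uses a large prime (and, as discussed below, a prime-order subgroup).

```python
import random

# Toy Diffie-Hellman key exchange; P is tiny for illustration only.
P, g = 23, 5                        # 5 generates Z*_23

a = random.randrange(1, P - 1)      # A's secret exponent
b = random.randrange(1, P - 1)      # B's secret exponent
msg_a = pow(g, a, P)                # A -> B: g^a
msg_b = pow(g, b, P)                # B -> A: g^b

key_a = pow(msg_b, a, P)            # A computes (g^b)^a
key_b = pow(msg_a, b, P)            # B computes (g^a)^b
assert key_a == key_b == pow(g, a * b, P)
```

An eavesdropper sees only ⟨g, g^a, g^b⟩; the DH-Assumption states that computing g^{ab} from this view is hard.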

One method to produce the key K_{A,B} is to apply the Goldreich-Levin hard-core function to g^{ab} (an important improvement on the security of such an application was made by Shoup). If the DH-Assumption holds, then this method indeed gives a pseudorandom key. However, the proof in only implies the pseudorandomness of the key in case its length is at most logarithmic in the security parameter. A much more ambitious method is to take g^{ab} itself as the key. For instance, in the ElGamal cryptosystem, given the public key g^a, the encryption of a message m is ⟨g^b, g^{ab} · m⟩. The security of the key-exchange protocol now relies on the DDH-Assumption:

  Given ⟨g, g^a, g^b, z⟩, it is hard to decide whether or not z = g^{ab}.

However, when g is a generator of Z*_P, we have that g^a and g^b do give some information on g^{ab}. For example, if either g^a or g^b is a quadratic residue then so is g^{ab}. A standard solution for this problem is to take g to be a generator of the subgroup of Z*_P of order Q, where Q is a large prime divisor of P - 1. In fact, for most applications, using g of order Q is an advantage, since Q may be much smaller than P (say, bits long), which results in a substantial improvement in efficiency. The reason that Q may be as small is that all known subexponential algorithms for computing the discrete log are subexponential in the length of P (as long as P - 1 is not too smooth), even when applied to the subgroup of size Q generated by g (see for surveys on algorithms for the discrete log; the best known algorithm for general groups has running time roughly the square root of the size of the group).

How Much Confidence Can We Have in the DDH-Assumption?

It is clear that the computational DH-Problem is at most as hard as computing the discrete log (given ⟨g, g^a⟩, find a). Recent works by Maurer and Wolf and Boneh and Lipton show that in several settings these two problems are in fact equivalent. For example, Maurer and Wolf showed that, given some information which only depends on P and an efficient algorithm for computing the DH-Problem in Z*_P, one can efficiently compute the discrete log in Z*_P (so in some non-uniform sense these problems are equivalent). Shoup showed that there are no efficient generic algorithms for computing the discrete log or the DH-Problem (where a generic algorithm is one that does not exploit any special properties of the encoding of group elements). A bit more formally, a generic algorithm is one that works for a black-box group, where each element has a random encoding and, given the encodings of a and b, the algorithm can query for the encodings of a + b and -a.

For example, to get a key of one bit, we can define K_{A,B} to be the inner product mod 2 of g^{ab} and a random string r, chosen by one of the parties and sent to the other over the insecure channel.


Perhaps the best evidence for the validity of the DH-Assumption is the fact that it endured extensive research over the last two decades. This research does not seem to undermine the stronger, decisional, version of the DH-Assumption either. In addition, the DDH-Assumption did appear, both explicitly and implicitly, in several previous works. However, it seems that, given the many applications of the DDH-Assumption, a more extensive study of its security is in place.

To some extent, the DDH-Assumption is supported by the work of Shoup and the work of Boneh and Venkatesan. Shoup showed that the DDH-Problem is hard for any generic algorithm (as above). Boneh and Venkatesan showed that computing the k = √(log P) most significant bits of g^{ab}, given ⟨g, g^a, g^b⟩, is as hard as computing g^{ab} itself. A recent result with applications to the DDH-Assumption was shown by Canetti, Friedlander and Shparlinski.

In Section we prove an attractive feature of the DDH-Assumption: there is quite a simple randomized reduction between its worst-case and its average-case (for fixed P and Q). More specifically:

  For any primes P and Q such that Q divides P - 1, the following statements are equivalent:

  1. Given ⟨P, Q, g, g^a, g^b⟩, it is easy to distinguish, with non-negligible advantage, between g^{ab} and g^c, where g is a uniformly chosen element of order Q in Z*_P and a, b and c are uniformly chosen from Z_Q.

  2. It is easy to decide, with overwhelming success probability for ⟨P, Q, g, g^a, g^b, g^c⟩, whether or not c = a · b mod Q, where a, b and c are any three elements of Z_Q and g is any element of order Q in Z*_P.

This reduction is based on the random-self-reducibility of the DDH-Problem that was previously used by Stadler. The reduction may strengthen our confidence in the DDH-Assumption and in the security of its applications.

For most applications of the DDH-Assumption (including ours), there is no reason to insist on working in a subgroup of Z*_P, where P is a prime. Therefore, a natural question is how valid this assumption is for other groups. Specific groups that were considered in the context of the DH-Assumption are: Z*_N, where N is a composite (McCurley and Shmuely showed that for many of those groups breaking the DH-Assumption is at least as hard as factoring N), and elliptic-curve groups, for which, in some cases, no subexponential algorithms for the discrete log are currently known. We stress that the randomized reduction mentioned above relies on the primality of the order of g.

The Decisional DH-Assumption is Very Attractive

It turns out that the DDH-Assumption was assumed in several previous works, both explicitly and implicitly. In the following, we briefly refer to some of those works and describe some additional applications.

The most obvious application of the DDH-Assumption is to the Diffie-Hellman key-exchange protocol and to the related public-key cryptosystem, namely the ElGamal cryptosystem (given the public key g^a, the encryption of a message m is ⟨g^b, g^{ab} · m⟩). Assume that the message space is restricted to the subgroup generated by g. In this case, it is easy to see that the semantic security (see ) of the cryptosystem is equivalent to the DDH-Assumption. In the general case (without the restriction on the message space) we can use the following related cryptosystem: given the public key ⟨g^a, h⟩, the encryption of a message m is ⟨g^b, h(g^{ab}) ⊕ m⟩, where h is a pairwise independent hash function from n-bit strings to strings of approximately the length of Q (see Lemma for more details on the role of h). Therefore, given the DDH-Assumption, we get a probabilistic encryption of many bits for the price of one or two exponentiations. This is comparable with the Blum-Goldwasser cryptosystem.
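A toy sketch of this hashed variant follows. All parameters, and the one-bit pairwise-independent hash h, are illustrative assumptions chosen only to make the scheme runnable; the thesis takes h with output of approximately the length of Q.

```python
import random

# Toy instantiation of the hashed-ElGamal variant described above.
P, Q, g = 23, 11, 4          # g has order Q = 11 in Z*_23 (toy parameters)

def keygen():
    a = random.randrange(Q)
    alpha, beta = random.randrange(1, P), random.randrange(P)
    h = lambda x: ((alpha * x + beta) % P) & 1   # pairwise-independent, 1 bit
    return (pow(g, a, P), h), (a, h)             # public <g^a, h>, secret a

def encrypt(pk, m):                              # m is a single bit
    ga, h = pk
    b = random.randrange(Q)
    return pow(g, b, P), h(pow(ga, b, P)) ^ m    # <g^b, h(g^{ab}) xor m>

def decrypt(sk, ct):
    a, h = sk
    gb, c = ct
    return h(pow(gb, a, P)) ^ c                  # recompute h(g^{ab})

pk, sk = keygen()
for m in (0, 1):
    assert decrypt(sk, encrypt(pk, m)) == m
```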

Other applications that previously appeared are:

- Bellare and Micali showed an efficient non-interactive oblivious transfer of many bits that relies on the DDH-Assumption.

- Brands pointed out that several suggestions for undeniable signatures (as the one in , where this concept was introduced) implicitly rely on the DDH-Assumption. If this assumption does not hold, then such schemes are in fact digital signatures.

- Canetti gave a simple construction, based on the DDH-Assumption, for a new cryptographic primitive called "oracle hashing" (later renamed "perfectly one-way probabilistic hash functions"). Loosely, these are hash functions that hide all partial information on their input.

- Franklin and Haber showed a construction of a joint encryption scheme based on the DDH-Assumption modulo a composite. Using this scheme, they showed how to get an efficient protocol for secure circuit computation.

- Stadler presents verifiable secret sharing based on the DDH-Assumption.

- Steiner, Tsudik and Waidner showed how to extend the Diffie-Hellman protocol to a key-exchange protocol for a group of parties. They reduced the security of the extended protocol to the DDH-Assumption by showing that the DDH-Assumption implies the decisional GDH-Assumption.

A very attractive application of the DDH-Assumption was recently proposed by Cramer and Shoup. They have presented a new public-key cryptosystem that is secure against adaptive chosen-ciphertext attacks. Both encryption and decryption in this cryptosystem only require a few exponentiations (in addition to universal one-way hashing).

To all these applications we can add:

- A pseudorandom generator that (practically) doubles the input length. Essentially, the generator is defined by G_{P,Q,g,g^a}(b) = ⟨g^b, g^{ab}⟩. As mentioned in the introduction, several unpublished constructions of pseudorandom generators based on the DDH-Assumption were previously suggested.

  (In fact, the output of G_{P,Q,g,g^a} is a pseudorandom pair of values in the subgroup generated by g. In order to get a pseudorandom value in {0, 1}^ℓ, for ℓ of approximately twice the length of Q, one needs to hash the output of the generator; see Lemma . A similar observation holds for the constructions of pseudorandom synthesizers and pseudorandom functions.)

- A pseudorandom synthesizer (see the definition in Section ) whose output length is similar to the length of its arguments, essentially defined by S_{P,Q,g}(a, b) = g^{ab}.

Both these constructions are overshadowed by the construction of pseudorandom functions introduced in Section .
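Both constructions are a few lines in a toy group. The parameters are illustrative; a real instantiation also applies the hashing of the output discussed above.

```python
# Toy versions of the two DDH-based constructions just described:
# the generator  G_{P,Q,g,g^a}(b) = <g^b, g^{ab}>  and
# the synthesizer  S_{P,Q,g}(a, b) = g^{ab}.
P, Q, g = 23, 11, 4          # g of prime order Q in Z*_P (toy parameters)

def G(ga, b):
    # Stretches the seed b (about log Q bits) to two group elements.
    return pow(g, b, P), pow(ga, b, P)

def S(a, b):
    return pow(g, a * b, P)

ga = pow(g, 7, P)            # a = 7 plays the role of the fixed key of G
assert G(ga, 5) == (pow(g, 5, P), pow(g, 35, P))
assert S(3, 4) == pow(pow(g, 3, P), 4, P)
```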

Formal Definition

To formalize the DDH-Assumption, we first need to specify an efficiently samplable distribution for P, Q and g, where g is an element of order Q in Z*_P.

Let n be the security parameter. For some function ℓ : N → N, we want to choose an n-bit prime P with an ℓ(n)-bit prime Q that divides P - 1. A natural way to do this is to choose P and Q uniformly at random subject to those constraints. However, it is possible to consider different distributions. For example, it is not inconceivable that the assumption holds when for every n we have a single possible choice of P, Q and g. Another common example is letting P and Q satisfy P = 2Q + 1 (although choosing a smaller Q may increase the efficiency of most applications). In order to keep our results general, we let P, Q and g be generated by some polynomial-time algorithm IG (where IG stands for "instance generator").

Definition (IG). The Diffie-Hellman instance generator, IG, is a probabilistic polynomial-time algorithm such that, on input 1^n, the output of IG is distributed over triplets ⟨P, Q, g⟩, where P is an n-bit prime, Q a large prime divisor of P - 1, and g an element of order Q in Z*_P.

For the various applications of the DDH-Assumption, we need its average-case version: namely, when a and b are uniformly chosen and c is either a · b or uniformly chosen. In Section it is shown that a worst-case choice of a, b and c can be reduced to a uniform choice. Similarly, the assumption is not strengthened if g (generated by IG) is taken to be a uniformly chosen element of order Q in Z*_P.

Assumption (Decisional Diffie-Hellman). For every probabilistic polynomial-time algorithm A, every constant α > 0 and all sufficiently large n,

  | Pr[A(P, Q, g, g^a, g^b, g^{ab}) = 1] - Pr[A(P, Q, g, g^a, g^b, g^c) = 1] | < 1/n^α,

where the probabilities are taken over the random bits of A, the choice of ⟨P, Q, g⟩ according to the distribution IG(1^n), and the choice of a, b and c uniformly at random in Z_Q.

A Randomized Reduction

In this subsection we use a simple randomized reduction to show that, for every P, Q and g, the DDH-Problem is either very hard on the average or very easy in the worst case. Given the current knowledge of the DDH-Problem, such a result strengthens our belief in the DDH-Assumption. The main part of the reduction (Lemma ) was previously used by Stadler.

Definition . For any ⟨P, Q, g⟩ such that P is a prime, Q a prime divisor of P - 1, and g an element of order Q in Z*_P, the function DDH_{P,Q,g} is defined by

  DDH_{P,Q,g}(g^a, g^b, g^c) = 1 if c = a · b mod Q, and 0 otherwise,

for any three elements a, b, c in Z_Q.

Theorem . Let A be any probabilistic algorithm with running time t = t(n), and let ε = ε(n) be any positive function such that 1/ε is efficiently constructible. There exists a polynomial p = p(n) and a probabilistic algorithm A' with running time t(n) · p(n) · n/ε(n)^2 such that, for any choice of ⟨P, Q, g⟩ as in Definition , if

  Pr[A(P, Q, g, g^a, g^b, g^{ab}) = 1] - Pr[A(P, Q, g, g^a, g^b, g^c) = 1] ≥ ε(n),

where the probabilities are taken over the random bits of A and the choice of a, b and c uniformly at random in Z_Q, then for any a, b and c in Z_Q,

  Pr[A'(P, Q, g, g^a, g^b, g^c) = DDH_{P,Q,g}(g^a, g^b, g^c)] ≥ 1 - 2^{-n},

where the probability is only over the random bits of A'.

In particular, if A is probabilistic polynomial-time and 1/ε(n) ≤ poly(n), then A' is also probabilistic polynomial-time.

Blum and Micali introduced the concept of random-self-reducibility and randomized reductions. Informally, a problem is random-self-reducible if solving the problem on any instance x can be efficiently reduced to solving the problem on a random instance y (or on a polynomial number of random instances). I.e., for any instance x, a random instance y can be efficiently sampled (using a random string r) such that, given r and the solution of the problem on y, it is easy to compute the solution of the problem on x. A problem that is random-self-reducible can either be efficiently solved for every instance with overwhelming success probability, or it cannot be solved for a random instance with non-negligible success probability.

Our randomized reduction is closely related to other known reductions. Blum and Micali showed that for any specific prime P and generator g, the discrete log problem is random-self-reducible: given ⟨P, g, g^a⟩, for any a, it is easy to generate a random instance ⟨P, g, g^a · g^r⟩ = ⟨P, g, g^{a+r}⟩, where r is uniform in [P - 1]. Given the solution for the random instance (i.e., a + r), it is easy to compute the solution for the original instance (i.e., a). A similar property was shown for the DH-Problem: e.g., given ⟨P, g, g^a, g^b⟩, for any a and b, it is easy to generate a random instance ⟨P, g, g^{a+r}, g^{b+s}⟩, where r and s are uniform in [P - 1]. Given the solution for the random instance (i.e., z = g^{(a+r)(b+s)}), it is easy to compute the solution for the original instance (i.e., g^{ab} = z · (g^a)^{-s} · (g^b)^{-r} · g^{-sr}).
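The DH self-reduction just described can be checked mechanically in a toy group; P = 23 and g = 5 are illustrative assumptions.

```python
import random

# Random self-reduction of the DH-Problem, over a toy group.
P, g = 23, 5                              # 5 generates Z*_23

def randomize_dh_instance(ga, gb):
    # <P, g, g^a, g^b>  ->  <P, g, g^{a+r}, g^{b+s}> for uniform r, s.
    r = random.randrange(P - 1)
    s = random.randrange(P - 1)
    return (ga * pow(g, r, P)) % P, (gb * pow(g, s, P)) % P, r, s

def recover_solution(z, ga, gb, r, s):
    # g^{ab} = z * (g^a)^{-s} * (g^b)^{-r} * g^{-sr}
    inv = lambda x: pow(x, P - 2, P)      # inverses via Fermat (P prime)
    return z * inv(pow(ga, s, P)) * inv(pow(gb, r, P)) * inv(pow(g, s * r, P)) % P

a, b = 9, 13
ga, gb = pow(g, a, P), pow(g, b, P)
ga2, gb2, r, s = randomize_dh_instance(ga, gb)
z = pow(g, (a + r) * (b + s), P)          # solution of the random instance
assert recover_solution(z, ga, gb, r, s) == pow(g, a * b, P)
```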

However, in order to prove Theorem we need a somewhat different reduction. In particular, we need to use the fact that g is an element of prime order (Theorem is not true when g is a generator of Z*_P).

Lemma . There exists a probabilistic polynomial-time algorithm R such that, on any input

  ⟨P, Q, g, g^a, g^b, g^c⟩,

where P is a prime, Q a prime divisor of P - 1, g an element of order Q in Z*_P, and a, b, c are three elements of Z_Q, the output of R is

  ⟨P, Q, g, g^{a'}, g^{b'}, g^{c'}⟩,

where, if c = a · b mod Q, then a' and b' are uniform in Z_Q and c' = a' · b' mod Q; and if c ≠ a · b mod Q, then a', b' and c' are all uniform in Z_Q.

Proof. R chooses s_1, s_2 and r uniformly at random in Z_Q, computes

  g^{a'} = (g^a)^r · g^{s_1},
  g^{b'} = g^b · g^{s_2},
  g^{c'} = (g^c)^r · (g^a)^{r·s_2} · (g^b)^{s_1} · g^{s_1·s_2},

and outputs

  ⟨P, Q, g, g^{a'}, g^{b'}, g^{c'}⟩.

Let c = a · b + e mod Q for e in Z_Q; then

  a' = r · a + s_1 mod Q,   b' = b + s_2 mod Q,   c' = a' · b' + e · r mod Q.

If e = 0, we get that a' and b' are uniformly distributed in Z_Q and c' = a' · b' mod Q. If e ≠ 0, we get that a', b' and c' are all uniformly distributed in Z_Q (this is the place we use the fact that Q is a prime, which implies that e · r mod Q is uniformly distributed in Z_Q). Therefore, the output of R has the desired distribution.
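A toy run of R, checking that a DDH triple is mapped to a fresh DDH triple; the parameters are illustrative, and discrete logs are brute-forced, which only the tiny group size permits.

```python
import random

# The re-randomization R from the lemma's proof, over a toy group.
P, Q, g = 23, 11, 4                       # g of prime order Q in Z*_P

def R(ga, gb, gc):
    s1, s2, r = (random.randrange(Q) for _ in range(3))
    ga2 = pow(ga, r, P) * pow(g, s1, P) % P                # g^{a'} = (g^a)^r * g^{s1}
    gb2 = gb * pow(g, s2, P) % P                           # g^{b'} = g^b * g^{s2}
    gc2 = (pow(gc, r, P) * pow(ga, r * s2, P)
           * pow(gb, s1, P) * pow(g, s1 * s2, P)) % P      # g^{c'}
    return ga2, gb2, gc2

# When c = a*b mod Q, the output is again a DDH triple:
a, b = 3, 5
ga, gb, gc = pow(g, a, P), pow(g, b, P), pow(g, a * b % Q, P)
ga2, gb2, gc2 = R(ga, gb, gc)
# Recover a', b' by brute force (toy group only) and check c' = a'*b' mod Q.
log = {pow(g, x, P): x for x in range(Q)}
assert gc2 == pow(g, log[ga2] * log[gb2] % Q, P)
```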

Proof of Theorem . Let A be any probabilistic algorithm with running time t = t(n), let ε = ε(n) be any positive function such that 1/ε is efficiently constructible, and let ⟨P, Q, g⟩ be as in Definition . Assume that

  Pr[A(P, Q, g, g^a, g^b, g^c) = 1] + ε(n) ≤ Pr[A(P, Q, g, g^a, g^b, g^{ab}) = 1],

where the probabilities are taken over the random bits of A and the choice of a, b and c uniformly at random in Z_Q.

Let R be the probabilistic polynomial-time algorithm that is guaranteed to exist by Lemma . By the definition of R and our assumption, we get that for any a, b and c ≠ a · b mod Q in Z_Q,

  Pr[A(R(P, Q, g, g^a, g^b, g^{ab})) = 1] - Pr[A(R(P, Q, g, g^a, g^b, g^c)) = 1] ≥ ε(n).

Now the probabilities are only taken over the random bits of A and R. Therefore, by standard methods of amplification, we can define a probabilistic algorithm A' such that for any a, b and c ≠ a · b mod Q in Z_Q,

  Pr[A'(P, Q, g, g^a, g^b, g^{ab}) = 1] - Pr[A'(P, Q, g, g^a, g^b, g^c) = 1] ≥ 1 - 2^{-n}.

On any input ⟨P, Q, g, g^a, g^b, g^c⟩, the output of A' is (essentially) a threshold function of O(n/ε(n)^2) independent values A(R(P, Q, g, g^a, g^b, g^c)). It is clear that A' satisfies the conditions required in Theorem .

The GDH-Assumption Modulo a Composite

The Generalized Diffie-Hellman Assumption (the GDH-Assumption) was originally considered in the context of a key-exchange protocol for k > 2 parties (see, e.g., ). This protocol is an extension of the (extremely influential) Diffie-Hellman key-exchange protocol. Given a group G and an element g ∈ G, the high-level structure of the protocol is as follows: party i ∈ [k] (def= {1, ..., k}) chooses a secret value a_i; the parties exchange messages of the form g^{Π_{i∈I} a_i} for several proper subsets I ⊂ [k]; given these messages, each of the parties can compute g^{Π_{i∈[k]} a_i}, and this value defines their common key in some publicly known way. Since the parties use an insecure (though authenticated) channel, it is essential that the messages they exchange do not reveal g^{Π_{i∈[k]} a_i}. The GDH-Assumption is even stronger: informally, it states that it is hard to compute g^{Π_{i∈[k]} a_i} even for an algorithm that can query g^{Π_{i∈I} a_i} for any proper subset I ⊂ [k] of its choice. The precise statement of the assumption is given in Section .
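The message structure of the protocol is easy to see for k = 3 in a toy group; all concrete values below are illustrative assumptions.

```python
from itertools import combinations

# Toy k-party generalized Diffie-Hellman, k = 3.
P, g = 23, 5                              # illustrative group parameters
secrets = {1: 3, 2: 7, 3: 10}             # a_1, a_2, a_3

def msg(subset):
    # The exchanged value g^{prod_{i in I} a_i} for a subset I of parties.
    e = 1
    for i in subset:
        e *= secrets[i]
    return pow(g, e, P)

# Party 3, for example, derives the common key from the message for I = {1, 2}:
key_party3 = pow(msg((1, 2)), secrets[3], P)
common_key = msg((1, 2, 3))
assert key_party3 == common_key

# The GDH-Assumption says the key stays hard to compute even given
# every proper-subset message:
proper_subset_msgs = [msg(I) for r in (1, 2) for I in combinations((1, 2, 3), r)]
```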

In Section we propose another application of the GDH-Assumption: we show an attractive construction of pseudorandom functions that is secure as long as the GDH-Assumption holds. Motivated by this application, we provide in this section a proof that the GDH-Assumption modulo a Blum-integer follows from the assumption that factoring Blum-integers is hard. Similar reductions were previously described, in the context of the standard Diffie-Hellman assumption, by McCurley and Shmuely. In fact, Shmuely also provided a related reduction for the GDH-Assumption modulo a composite itself. However, her reduction works only when the algorithm that breaks the GDH-Assumption succeeds in computing g^{Π_{i∈[k]} a_i} for every choice of values ⟨a_1, a_2, ..., a_k⟩, which is not sufficient for the applications of the GDH-Assumption. In contrast, our reduction works even when the algorithm breaking the GDH-Assumption only succeeds for some non-negligible fraction of the ⟨a_1, ..., a_k⟩.

The Assumptions

In this section we define the GDH-Assumption in Z*_N, the multiplicative group modulo N, where N is a so-called Blum-integer. We also define the assumption that factoring Blum-integers is hard. The restriction to Blum-integers is quite standard, and it makes the reduction of factoring to the GDH-Problem much simpler. We therefore start by defining Blum-integers and describing some of their properties.

Blum-integers: An integer N is a Blum-integer if N = P · Q, where P and Q are two distinct primes of the same length and both P and Q are congruent to 3 mod 4 (i.e., P ≡ Q ≡ 3 (mod 4)). A very helpful property of a Blum-integer N is that squaring is a permutation over the subgroup of quadratic residues in Z*_N (an element x ∈ Z*_N is a quadratic residue if there exists a y ∈ Z*_N such that y^2 = x mod N). An important application, along with a proof of this property, was given by Blum, Blum and Shub. One way to verify this property is by considering the Chinese-remainder representation of a quadratic residue x: In Z*_P, as well as in Z*_Q, x has exactly two distinct square roots, ±y. Since −1 is not a quadratic residue in Z*_P (nor in Z*_Q), exactly one of the square roots ±y is a quadratic residue itself. Therefore, x has exactly four distinct square roots, ±y_1 and ±y_2, in Z*_N, of which exactly one is a quadratic residue itself. Another property of a Blum-integer N that we use in our proof is that the order of any quadratic residue x in Z*_N is odd.

(The work described here is joint work with Eli Biham and Dan Boneh.)

THE GDH-ASSUMPTION MODULO A COMPOSITE
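These properties are easy to verify exhaustively for a toy Blum-integer; the sketch below uses N = 21 = 3 · 7 (both primes are congruent to 3 mod 4):

```python
from math import gcd

N = 21                                   # toy Blum-integer: 3 * 7, both 3 mod 4
units = [x for x in range(1, N) if gcd(x, N) == 1]
qr = sorted({x * x % N for x in units})  # the quadratic residues in Z_N^*

# Squaring permutes the set of quadratic residues.
assert sorted(x * x % N for x in qr) == qr

# Each quadratic residue has exactly four square roots in Z_N^*,
# exactly one of which is itself a quadratic residue.
for x in qr:
    roots = [y for y in units if y * y % N == x]
    assert len(roots) == 4 and sum(1 for y in roots if y in qr) == 1

def order(x):
    """Multiplicative order of x modulo N."""
    e, y = 1, x
    while y != 1:
        y, e = y * x % N, e + 1
    return e

# The order of every quadratic residue is odd.
assert all(order(x) % 2 == 1 for x in qr)
```

For N = 21 the quadratic residues are {1, 4, 16}; for example, 4 has the four roots {2, 5, 16, 19}, of which only 16 is itself a residue.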

In order to keep our result general, we let N (in both assumptions) be generated by some polynomial-time algorithm FIG, where FIG stands for factoring-instance-generator.

Definition (FIG): The factoring-instance-generator FIG is a probabilistic polynomial-time algorithm such that on input 1^n its output, N = P · Q, is distributed over n-bit integers, where P and Q are two ⌈n/2⌉-bit primes and P ≡ Q ≡ 3 (mod 4) (such an N is known as a Blum-integer).

A natural way to define FIG is to let FIG(1^n) be uniformly distributed over n-bit Blum-integers. However, other choices were previously considered (e.g., letting P and Q obey some "safety" conditions).

The GDH-Assumption

To formalize the GDH-Assumption, which is described in the introduction, we use the following two definitions.

Definition: Let N be any possible output of FIG(1^n), let g be any quadratic residue in Z*_N, and let a = ⟨a_1, a_2, …, a_k⟩ be any sequence of k elements of [N]. Define the function h_{N,g,a} with domain {0,1}^k such that for any k-bit input x = x_1 x_2 … x_k,

    h_{N,g,a}(x) def= g^{∏_i a_i^{x_i}} mod N.

Define h^r_{N,g,a} to be the restriction of h_{N,g,a} to the set of all k-bit strings except 1^k (i.e., the restriction of h_{N,g,a} to {0,1}^k \ {1^k}).

Definition (solving the GDH_k-Problem): Let A be a probabilistic oracle machine, k = k(n) an integer-valued function, and ε = ε(n) a real-valued function. A solves the GDH_k-Problem with success ε if for infinitely many n,

    Pr[A^{h^r_{N,g,a}}(N, g) = h_{N,g,a}(1^{k(n)})] > ε(n),

where the probability is taken over the random bits of A, the choice of N according to the distribution FIG(1^n), the choice of g uniformly at random in the set of quadratic residues in Z*_N, and the choice of each of the values in a = ⟨a_1, a_2, …, a_{k(n)}⟩ uniformly at random in [N].

Informally, the GDH-Assumption is that there is no efficient oracle machine A that solves the GDH_k-Problem for non-negligible ε. We formalize this in the standard way of interpreting "efficient" as "probabilistic polynomial-time" and "non-negligible" as "larger than 1/poly". However, the reduction of the theorem below is more quantitative.

(We note that n-bit Blum-integers are a non-negligible fraction of all n-bit integers, and that it is easy to sample a uniformly distributed n-bit Blum-integer.)

CHAPTER: A STUDY OF SOME NUMBER-THEORETICAL ASSUMPTIONS

Assumption (The GDH-Assumption Modulo a Blum-Integer): Let A be any probabilistic polynomial-time oracle machine and k = k(n) any integer-valued function that is bounded by a polynomial and is efficiently-constructible. There is no positive constant α such that A solves the GDH_k-Problem with success ε(n) = n^{−α}.

The Factoring-Assumption

We formalize the assumption that factoring Blum-integers is hard in an analogous way:

Definition (factoring): Let A be a probabilistic Turing-machine and ε = ε(n) a real-valued function. A factors with success ε if for infinitely many n,

    Pr[A(P · Q) ∈ {P, Q}] > ε(n),

where the distribution of N = P · Q is FIG(1^n).

Assumption (Factoring Blum-Integers): Let A be any probabilistic polynomial-time Turing-machine. There is no positive constant α such that A factors with success ε(n) = n^{−α}.

Reducing Factoring to the GDH-Problem

Theorem: The GDH-Assumption is implied by the assumption that factoring Blum-integers is hard. Furthermore, assume that A is a probabilistic oracle machine with running-time t = t(n) such that A solves the GDH_k-Problem with success ε, where k = k(n) is an integer-valued function that is efficiently-constructible and ε = ε(n) is a real-valued function. Then there exists a probabilistic Turing-machine A′ with running time t′(n) = poly(n, k(n)) · t(n) that factors with success ε′, where ε′(n) = ε(n)/2 − O(k(n) · 2^{−n/2}).

As an intuition for the proof, let us first describe it under the assumption that A always computes h_{N,g,a}(1^k), for any sequence a = ⟨a_1, a_2, …, a_k⟩. The algorithm A′ can use such an A to extract square roots in Z*_N, and consequently A′ can factor N (square-root extraction modulo N is well known to be equivalent to factoring N). This is done as follows.

A′ first samples v uniformly at random in Z*_N and computes g = v^{2^k} mod N. Let λ be the order of g in Z*_N (note that λ is not known to A′). Since N is a Blum-integer and g is a quadratic residue, we have that λ is odd. This implies that 2 ∈ Z*_λ, and therefore 2^{−1} mod λ exists and is simply (λ + 1)/2. For simplicity of exposition, denote by g^{1/2} mod N the value g^{2^{−1} mod λ} mod N. Hence, g^{1/2} mod N is not just any square root of g in Z*_N, but is rather precisely the square root of g that is itself a quadratic residue. In the same way, denote by g^{1/2^i} mod N the value g^{(2^{−1})^i mod λ} mod N. Under this notation, we have for every i < k that g^{1/2^i} = v^{2^{k−i}} mod N.

Let a = ⟨a_1, …, a_k⟩ be the vector where for all i, a_i = 2^{−1} mod λ. It follows that algorithm A′ can easily compute h_{N,g,a}(x) for any x ≠ 1^k: if exactly i < k bits of x are 1, we have that h_{N,g,a}(x) = g^{1/2^i} mod N = v^{2^{k−i}} mod N. Hence, A′ can invoke A with input ⟨N, g⟩ and answer every query q of A with h^r_{N,g,a}(q). Eventually, A outputs the value u = h_{N,g,a}(1^k) = g^{1/2^k} mod N. We now have that u^2 = v^2 mod N and that Pr[u ≠ ±v] = 1/2. This implies that Pr[gcd(u − v, N) ∈ {P, Q}] = 1/2, which enables A′ to factor N. The complete proof follows the same lines, along with an additional randomization of the a_i's (achieved by taking a_i = r_i + 2^{−1}), which eliminates the assumption that A always succeeds.


Proof: Assuming A is as in the theorem, we define the algorithm A′ that is guaranteed to exist by the theorem. Let N = P · Q be any n-bit Blum-integer. Given N as its input, A′ performs the following basic steps (we later describe how these steps can be carried out in the required running time):

1. Sample v uniformly at random in Z*_N. Compute k = k(n) and g = v^{2^k} mod N. Denote by λ the order of g in Z*_N (note that λ is not known to A′). As mentioned above, since N is a Blum-integer and g is a quadratic residue, we have that λ is odd. This implies that 2 ∈ Z*_λ, and therefore 2^{−1} mod λ exists and is simply (λ + 1)/2.

2. Sample each one of the values in ⟨r_1, r_2, …, r_k⟩ uniformly at random in [N]. For 1 ≤ i ≤ k, denote by a_i the value r_i + 2^{−1} mod λ (again, note that a_i is not known to A′). Denote by a the sequence ⟨a_1, a_2, …, a_k⟩.

3. Invoke A with input ⟨N, g⟩ and answer each query q of A with the value h^r_{N,g,a}(q).

4. Given that A outputs the correct value h_{N,g,a}(1^k), compute u = g^{1/2^k} mod N. As noted below, u^2 = v^2 mod N. If u ≠ ±v mod N, output gcd(u − v, N), which is indeed in {P, Q}; otherwise, output "failed".

The Running-Time of A′. Steps 1 and 2 can easily be carried out in time poly(n, k(n)). For steps 3 and 4 to be carried out in time t′(n) = poly(n, k(n)) · t(n), it is enough to show that:

(a) For every query q ∈ {0,1}^k \ {1^k}, the value h^r_{N,g,a}(q) can be computed in time poly(n, k(n)).

(b) Given h_{N,g,a}(1^k), the value u = g^{1/2^k} mod N can be computed in time poly(n, k(n)).

Recall that whenever 1/2 appears in an exponent, it denotes the value 2^{−1} mod λ. Similarly, 1/2^i in an exponent denotes the value (2^{−1})^i mod λ = 2^{−i} mod λ. Therefore, under this notation, for every i the value g^{1/2^i} mod N is a quadratic residue (since g is a quadratic residue). The key observation for proving (a) and (b) is that for all 0 ≤ i < k we have that g^{1/2^i} = v^{2^{k−i}} mod N. For i = 1, this is implied by the fact that both g^{1/2} and v^{2^{k−1}} are square roots of g and that they are both quadratic residues. Since squaring is a permutation over the set of quadratic residues in Z*_N for any Blum-integer N, we must have that g^{1/2} and v^{2^{k−1}} are equal. By induction on i < k, we get that g^{1/2^i} = v^{2^{k−i}} mod N in the same way. Therefore, for every q = q_1 q_2 … q_k ≠ 1^k,

    h^r_{N,g,a}(q) = g^{∏_i a_i^{q_i}} = g^{∑_{j=0}^{k−1} β_j · 2^{−j}} = v^{∑_{j=0}^{k−1} 2^{k−j} · β_j} mod N,

where the values {β_j}_j are the integers for which the two polynomials in the variable x, ∏_{j: q_j=1}(r_j + x) and ∑_j β_j · x^j, are identical over Z. Since these values can easily be computed in time poly(n, k(n)), we get that (a) holds. Similarly,

    h_{N,g,a}(1^k) = g^{∏_{i=1}^{k} a_i} = g^{∑_{j=0}^{k} γ_j · 2^{−j}} = g^{2^{−k}} · g^{∑_{j=0}^{k−1} γ_j · 2^{−j}} = g^{1/2^k} · v^{∑_{j=0}^{k−1} 2^{k−j} · γ_j} mod N,

where the values {γ_j}_j (the coefficients of the monic polynomial ∏_{i=1}^{k}(r_i + x), so that γ_k = 1) can easily be computed in time poly(n, k(n)). We now get that (b) holds, since

    u = g^{1/2^k} = h_{N,g,a}(1^k) · v^{−∑_{j=0}^{k−1} 2^{k−j} · γ_j} mod N.

The Success-Probability of A′. It remains to show that A′ factors with success ε′(n) = ε(n)/2 − O(k(n) · 2^{−n/2}). Recall that u denotes the value g^{1/2^k} mod N. As shown above, v^2 = g^{1/2^{k−1}} = u^2 mod N. Therefore, it is not hard to verify that

    Pr[A′(N) ∈ {P, Q}] ≥ Pr[u ≠ ±v mod N and A^{h^r_{N,g,a}}(N, g) = h_{N,g,a}(1^k)].

Note that A^{h^r_{N,g,a}}(N, g) does not depend on v itself, but rather on v^2. Therefore, A^{h^r_{N,g,a}}(N, g) is equally distributed for any two assignments w and w′ of v as long as w^2 = w′^2 mod N. This follows from the fact that all the values of h^r_{N,g,a} (including g = v^{2^k}) only depend on v^2: for every q = q_1 q_2 … q_k ≠ 1^k, the value h^r_{N,g,a}(q) was shown above to be

    v^{∑_{j=0}^{k−1} 2^{k−j} · β_j} mod N = (v^2)^{∑_{j=0}^{k−1} 2^{k−j−1} · β_j} mod N,

where the β_j's only depend on q and the r_i's. Since, conditioned on v^2, the value v is uniform over the four square roots of v^2 and u = ±v for exactly two of them, we therefore get that

    Pr[A′(N) ∈ {P, Q}] ≥ (1/2) · Pr[A^{h^r_{N,g,a}}(N, g) = h_{N,g,a}(1^k)].

Let N be chosen from FIG(1^n). We need to show that, for infinitely many n,

    Pr[A′(N) ∈ {P, Q}] > ε′(n),

and so it suffices to show that, for infinitely many n,

    Pr[A^{h^r_{N,g,a}}(N, g) = h_{N,g,a}(1^k)] > 2ε′(n) = ε(n) − O(k(n) · 2^{−n/2}).

To do so, we need a couple of simple facts on the distribution of g and of each a_i mod λ. Let us first recall the definition of statistical distance:

Definition: Let X and Y be two random variables over D. The statistical difference between X and Y is defined to be

    (1/2) · ∑_{a∈D} |Pr[X = a] − Pr[Y = a]|.
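The definition, and the flavor of the facts that follow, can be checked numerically. The sketch below compares r mod λ (for r uniform in [N]) with the uniform distribution over Z_λ, using hypothetical toy values of N and λ:

```python
from fractions import Fraction

def stat_diff(px, py):
    """Statistical difference of two distributions given as dicts."""
    support = set(px) | set(py)
    return Fraction(1, 2) * sum(abs(px.get(a, 0) - py.get(a, 0))
                                for a in support)

# Toy check in the spirit of Fact 2 below: r mod lam, for r uniform in
# [N], is statistically close to uniform over Z_lam when the part of [N]
# not covered by full multiples of lam is small relative to N.
N, lam = 437, 99                # hypothetical toy values
pa = {}
for r in range(1, N + 1):
    pa[r % lam] = pa.get(r % lam, 0) + Fraction(1, N)
uniform = {a: Fraction(1, lam) for a in range(lam)}
assert stat_diff(pa, uniform) <= Fraction(lam, N)
```

With exact rational arithmetic the bound can be verified without any floating-point error.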

We now have that:

Fact 1: g is a uniformly distributed quadratic residue in Z*_N.

Reason: v^2 is a uniformly distributed quadratic residue in Z*_N, and squaring is a permutation over the set of quadratic residues in Z*_N (since N is a Blum-integer).

Fact 2: Let r and a′ be uniformly distributed in [N], and denote by a the value r + 2^{−1} mod λ. Then a and a′ mod λ are of statistical distance O(2^{−n/2}).

Reason: λ divides (Q − 1)(P − 1). Therefore, the distribution of a conditioned on the event that r ≤ (Q − 1)(P − 1) is the same as the distribution of a′ mod λ conditioned on the event that a′ ≤ (Q − 1)(P − 1), and in both cases it is simply the uniform distribution over Z_λ. It remains to notice that

    Pr[r > (Q − 1)(P − 1)] = Pr[a′ > (Q − 1)(P − 1)] = (N − (Q − 1)(P − 1))/N < (P + Q)/N = O(2^{−n/2}).

Let each value in a′ = ⟨a′_1, a′_2, …, a′_k⟩ be uniformly distributed in [N]. Since A solves the GDH_k-Problem with success ε, and given Fact 1, we have that

    Pr[A^{h^r_{N,g,a′}}(N, g) = h_{N,g,a′}(1^k)] > ε(n).

Given Fact 2, it is easy to verify that the two random variables h_{N,g,a} and h_{N,g,a′} are of statistical distance O(k(n) · 2^{−n/2}). Therefore, we can conclude that

    Pr[A^{h^r_{N,g,a}}(N, g) = h_{N,g,a}(1^k)] > ε(n) − O(k(n) · 2^{−n/2}),

which completes the proof of the theorem.

Conclusions

We showed that breaking the generalized Diffie-Hellman assumption modulo a Blum-integer is at least as hard as factoring Blum-integers. This implies that the security of the generalized Diffie-Hellman key-exchange protocol, which is mentioned in the introduction, can be based on the assumption that factoring is hard. In addition, as shown in the next chapter, it implies the existence of efficient pseudo-random functions that are at least as secure as factoring.

A possible line for further research is the study of the generalized Diffie-Hellman assumption in other groups, and of the relation between the generalized Diffie-Hellman assumption and the standard Diffie-Hellman assumption. It is interesting to note that the decisional version of the generalized Diffie-Hellman assumption is equivalent to the decisional version of the standard Diffie-Hellman assumption, as shown later in this thesis.


Chapter

Constructions of Pseudo-Random Functions

In this chapter we present several constructions of pseudo-random functions (see the introduction for a discussion of the definitions, applications and original construction of pseudo-random functions, and see the preliminaries for their actual definitions). For our constructions, we introduce and study a new cryptographic primitive, which we call a pseudo-random synthesizer, and a generalization of this primitive, which we call a k-dimensional pseudo-random synthesizer. These primitives are of independent interest as well. We first show a parallel construction of pseudo-random functions from a pseudo-random synthesizer, along with parallel constructions of pseudo-random synthesizers based on several concrete and general intractability assumptions. We then show even more parallel constructions of pseudo-random functions based on concrete intractability assumptions, such as the assumption that factoring Blum-integers is hard. In addition, these constructions are more efficient and have a simple algebraic structure. The latter constructions are motivated by the former and, in particular, by the notion of k-dimensional synthesizers. A more detailed description of the results obtained in the two stages of the research, and of their applications, appears in the introduction.

Pseudo-Random Synthesizers and Functions

Organization

We first define pseudo-random synthesizers and collections of pseudo-random synthesizers and discuss their properties. We then describe our parallel construction of pseudo-random functions from pseudo-random synthesizers and prove its security. Next, we describe a related construction of pseudo-random functions; in addition, we discuss the time-space tradeoff and the incremental property of our constructions. We then discuss the relations between pseudo-random synthesizers and other cryptographic primitives, and describe constructions of pseudo-random synthesizers based on several number-theoretic assumptions. Finally, we show how to construct pseudo-random synthesizers from hard-to-learn problems and consider a very simple concrete example; we also discuss the application of parallel pseudo-random functions to learning theory.

Notation

The notation and definitions used in this section are given in the preliminaries. Additional notation that is used in this section includes:

- For any random variable X, we denote by X^{k×k} the k × k matrix whose entries are independently identically distributed according to X. We denote by X^{k} the vector X^{1×k}.

- We identify functions of two variables and functions of one variable in the natural way, i.e., by letting f: I_n × I_n → I_k be equivalent to f: I_{2n} → I_k, and letting f(x, y) be the same value as f(x ◦ y), where x ◦ y stands for x concatenated with y.

Pseudo-Random Synthesizers

As mentioned above, we introduce in this work a new cryptographic primitive called a pseudo-random synthesizer. In this section we define pseudo-random synthesizers and describe their properties.

Motivation

Pseudo-random synthesizers are efficiently computable functions of two variables. The significant feature of such a function S is that, given polynomially-many uniformly distributed assignments ⟨x_1, …, x_m⟩ and ⟨y_1, …, y_m⟩ for both variables, the output of S on all the combinations of these assignments, {S(x_i, y_j)}_{i,j=1}^{m}, is pseudo-random (i.e., is indistinguishable from random to a polynomial-time observer). This is a strengthening of an important property of pseudo-random generators: the indistinguishability of a polynomial sample. A pseudo-random (bit) generator is a polynomial-time computable function G: {0,1}* → {0,1}* such that (1) ∀x ∈ I_n, |G(x)| = ℓ(n) > n, and (2) G(U_n) is pseudo-random (i.e., {G(U_n)}_{n∈N} and {U_{ℓ(n)}}_{n∈N} are computationally indistinguishable). It turns out that this definition implies the following: given polynomially-many uniformly distributed assignments ⟨z_1, …, z_m⟩, the sequence {G(z_i)}_{i=1}^{m} is pseudo-random.

The major idea behind the definition of pseudo-random synthesizers is to obtain a function S such that {S(z_i)}_{i=1}^{m} remains pseudo-random even when the z_i's are not completely independent. More specifically, pseudo-random synthesizers require that {S(z_i)}_{i=1}^{m} remains pseudo-random even when the z_i's are of the form {x_i ◦ y_j}_{i,j=1}^{m}. Our work shows that, under some standard intractability assumptions, it is possible to obtain such a function S, and that this property is indeed very powerful. As a demonstration of their strength, we note below that pseudo-random synthesizers are useful even when no restriction is made on their output length, which is very different from what we have for pseudo-random generators.

Remark: It is important to note that there exist pseudo-random generators that are not pseudo-random synthesizers. An immediate example is the generator defined by G(x ◦ y) def= y ◦ G′(x), where G′ is also a pseudo-random generator. A more natural example is the subset-sum generator G = G_{a_1,a_2,…}, which is defined by G(z) = ∑_i z_i · a_i (say, modulo 2^{n+1}). This is not a pseudo-random synthesizer for any fixed values of the a_i's, since for every four n-bit strings x_1, x_2, y_1 and y_2 we have that G(x_1 ◦ y_1) + G(x_2 ◦ y_2) = G(x_1 ◦ y_2) + G(x_2 ◦ y_1).
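The affine relation above is easy to check concretely (a sketch; the particular coefficients and the modulus are just one illustrative instantiation of the subset-sum generator):

```python
# The subset-sum generator satisfies an affine relation on combined
# inputs, so it cannot be a pseudo-random synthesizer.
n = 4
a = [3, 14, 15, 9, 2, 6, 5, 35]          # fixed coefficients for 2n input bits
M = 2 ** (n + 1)

def G(z):
    return sum(ai for zi, ai in zip(z, a) if zi) % M

x1, x2 = [1, 0, 1, 0], [0, 1, 1, 1]
y1, y2 = [1, 1, 0, 0], [0, 0, 0, 1]
assert (G(x1 + y1) + G(x2 + y2)) % M == (G(x1 + y2) + G(x2 + y1)) % M
```

The relation holds because G is additive over the two halves of its input, so a distinguisher can test it on any four matrix entries.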

Formal Definition

We first introduce an additional notation to formalize the phrase "all different combinations":

Notation: Let f be an I_{2n} → I_{ℓ(n)} function, and let X = {x_1, …, x_k} and Y = {y_1, …, y_m} be two sequences of n-bit strings. We define C_f(X, Y) to be the k × m matrix (f(x_i, y_j))_{i,j} (C stands for "combinations").

We can now define what a pseudo-random synthesizer is:

Definition (pseudo-random synthesizer): Let ℓ be any N → N function, and let S: {0,1}* × {0,1}* → {0,1}* be a polynomial-time computable function such that ∀x, y ∈ I_n, |S(x, y)| = ℓ(n). Then S is a pseudo-random synthesizer if for every probabilistic polynomial-time algorithm D, every two polynomials p and m, and all sufficiently large n,

    |Pr[D(C_S(X, Y)) = 1] − Pr[D(U_{ℓ(n)}^{m(n)×m(n)}) = 1]| < 1/p(n),

where X and Y are independently drawn from U_n^{m(n)}. That is, for random X and Y, the matrix C_S(X, Y) cannot be efficiently distinguished from a random matrix.
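In code, C_S(X, Y) is just a table of all pair-combinations (a sketch; the function `S_toy` below is an arbitrary stand-in, not a real synthesizer):

```python
# C_S(X, Y): the matrix of S evaluated on all combinations (x_i, y_j).
def combinations_matrix(S, X, Y):
    return [[S(x, y) for y in Y] for x in X]

def S_toy(x, y):                 # arbitrary stand-in, not a real synthesizer
    return (x * y + x + y) % 251

X, Y = [13, 200, 77], [5, 9]
C = combinations_matrix(S_toy, X, Y)
assert len(C) == 3 and all(len(row) == 2 for row in C)
assert C[1][0] == S_toy(200, 5)
```

The synthesizer property asserts that, for a genuine S and random X and Y, this table is indistinguishable from a table of independent random entries.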

Expanding the Output Length

In the definition above, no restriction was made on the output-length function ℓ of the pseudo-random synthesizer. However, our parallel construction of pseudo-random functions uses parallel pseudo-random synthesizers with linear output length, ℓ(n) = n. The following lemma shows that any synthesizer S can be used to construct another synthesizer S′ with large output-length, such that S and S′ have the same parallel time complexity. Therefore, for the construction of pseudo-random functions in NC, it is enough to show the existence of synthesizers with constant output length in NC.

Lemma: Let S be a pseudo-random synthesizer with arbitrary output-length function ℓ, in NC^i (resp. AC^i). Then for every constant α < 2 there exists a pseudo-random synthesizer S′ in NC^i (resp. AC^i) such that its output-length function ℓ′ satisfies ℓ′(n) ≥ n^α.

Proof: For every constant c ≥ 1, define S^c as follows. Let k_n def= max{k ∈ Z : k^{c+1} ≤ n}. On input x, y ∈ I_n, regard the first k_n^{c+1} bits of x and of y as two length-k_n^{c} sequences, X and Y, of k_n-bit strings. S^c(x, y) is defined to be C_S(X, Y), viewed as a single bit-string rather than a matrix. Notice that the following properties hold for S^c:

1. S^c is indeed a pseudo-random synthesizer: For any polynomial m, let X and Y be independently drawn from U_n^{m(n)}, and let X′ and Y′ be independently drawn from U_{k_n}^{m(n)·k_n^c}. By the definition of S^c, the distributions C_{S^c}(X, Y) and C_S(X′, Y′) are identical (up to the arrangement of the entries). Taking into account the fact that n is polynomial in k_n, we conclude that every polynomial-time distinguisher for S^c is also a polynomial-time distinguisher for S. Since S is a pseudo-random synthesizer, so is S^c.

2. Let ℓ_c denote the output-length function of S^c; then ℓ_c(n) = k_n^{2c} · ℓ(k_n). Since c is a constant and n < (k_n + 1)^{c+1} for every n, it holds that

    ℓ_c(n) = k_n^{2c} · ℓ(k_n) ≥ k_n^{2c} ≥ (n^{1/(c+1)} − 1)^{2c} = Ω(n^{2c/(c+1)}).

3. S^c is in NC^i (resp. AC^i): Immediate from the definition of S^c.

Thus, by taking S′ to be S^c for a large enough constant c (so that 2c/(c + 1) ≥ α), we obtain the lemma.
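The chopping argument in the proof can be sketched directly (with a toy one-bit function standing in for the synthesizer; k_n = max{k : k^{c+1} ≤ n} as above):

```python
# S^c: chop the two n-bit inputs into k_n-bit blocks and output all
# combinations under S (here a toy 1-bit function stands in for S).
def make_Sc(S, c):
    def Sc(xbits, ybits):
        n = len(xbits)
        k = 1
        while (k + 1) ** (c + 1) <= n:   # k_n = max{k : k^{c+1} <= n}
            k += 1
        X = [xbits[i * k:(i + 1) * k] for i in range(k ** c)]
        Y = [ybits[i * k:(i + 1) * k] for i in range(k ** c)]
        return [S(x, y) for x in X for y in Y]   # C_S(X, Y), flattened
    return Sc

S_toy = lambda x, y: (sum(x) + sum(y)) % 2       # toy 1-bit "synthesizer"
Sc = make_Sc(S_toy, 2)
out = Sc([0, 1] * 32, [1, 0] * 32)               # n = 64, so k_n = 4
assert len(out) == (4 ** 2) ** 2                 # k_n^{2c} = 256 output bits
```

With c = 2 and n = 64, the 64 input bits become 16 blocks of 4 bits on each side, and the output has 16 · 16 = 256 entries, illustrating the ℓ_c(n) = k_n^{2c} · ℓ(k_n) count.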

The construction of this lemma has the advantage that it is very simple and that the parallel time complexity of S and S′ is identical. Nevertheless, it has an obvious disadvantage: the security of S′ is related to the security of S on a much smaller input length. For example, if ℓ ≡ 1 and ℓ′(n) = n, then the security of S′ on k-bit strings is related to the security of S on O(√k)-bit strings. This results in a substantial increase in the time and space complexity of any construction that uses S′.

We now show an alternative construction, which is more security-preserving. The alternative construction uses a pseudo-random generator G that expands the input by a factor of two, and relies on the GGM-Construction:

Corollary (of the GGM-Construction): Let G be a pseudo-random generator in NC^i (resp. AC^i) such that ∀s, |G(s)| = 2|s|. Then for every polynomial p there exists a pseudo-random generator G′ in NC^{i+1} (resp. AC^{i+1}) such that ∀s, |G′(s)| = p(|s|) · |s|.

G′ is defined as follows: On input s, it computes G(s) = s_1 ◦ s_2, and recursively generates p(|s|) · |s|/2 bits from s_1 and p(|s|) · |s|/2 bits from s_2. The number of levels required is ⌈log(p(|s|))⌉ = O(log |s|).
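The recursion behind the corollary can be sketched as follows, with a toy (completely insecure) length-doubling function standing in for the generator G:

```python
def G(bits):
    """Toy, insecure stand-in for a length-doubling generator."""
    x = sum(b << i for i, b in enumerate(bits))
    x = (1103515245 * x + 12345) % (1 << (2 * len(bits)))
    return [(x >> i) & 1 for i in range(2 * len(bits))]

def expand(s, target):
    """Recursively generate `target` bits from seed s, GGM-style:
    half of the required bits from each half of G(s)."""
    if target <= len(s):
        return s[:target]
    out = G(s)
    s1, s2 = out[:len(s)], out[len(s):]
    half = (target + 1) // 2
    return expand(s1, half) + expand(s2, target - half)

seed_bits = [1, 0, 1, 1, 0, 0, 1, 0]
out = expand(seed_bits, 10 * len(seed_bits))     # p(n) = 10
assert len(out) == 80
```

The recursion depth is logarithmic in the expansion factor, which is what keeps the construction shallow enough for NC.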

Lemma: Let S be a pseudo-random synthesizer with arbitrary output-length function ℓ, in NC^i (resp. AC^i), and let G be a pseudo-random generator in NC^j (resp. AC^j) such that ∀s, |G(s)| = 2|s|. Let k denote max{i, j + 1}. Then for every positive constant c there exists a pseudo-random synthesizer S′ in NC^k (resp. AC^k) such that its output-length function ℓ′ satisfies ℓ′(n) = ⌈n^c⌉^2 · ℓ(n). Furthermore, the construction of S′ is linear-preserving (the exact meaning of this claim is described below).

Proof sketch: S′ is defined as follows. On input x, y ∈ I_n, compute X = G′(x) = {x_1, …, x_{⌈n^c⌉}} and Y = G′(y) = {y_1, …, y_{⌈n^c⌉}}, where G′ is the pseudo-random generator that is guaranteed to exist by the corollary above. S′(x, y) is defined to be C_S(X, Y).


It is immediate that S′ is in NC^k (resp. AC^k) and that ℓ′(n) = ⌈n^c⌉^2 · ℓ(n). It is also not hard to verify that S′ is indeed a pseudo-random synthesizer and, from the proof of the corollary, that the construction of S′ is linear-preserving in the following sense. Assume that there exists an algorithm that works in time t(n) and distinguishes C_{S′}(X, Y) from U_{ℓ′(n)}^{m(n)×m(n)} with bias ε(n), where X and Y are independently drawn from U_n^{m(n)}. Let m′(n) = m(n) · ⌈n^c⌉. Then one of the following holds:

1. The same algorithm distinguishes C_S(X′, Y′) from U_{ℓ(n)}^{m′(n)×m′(n)} with bias ε(n)/2, where X′ and Y′ are independently drawn from U_n^{m′(n)}.

2. There exists an algorithm that works in time t(n) + m′(n) · poly(n) and distinguishes G(U_n) from random with bias ε(n)/O(m(n)).

The construction of the second lemma is indeed more security-preserving than that of the first, since the security of S′ relates to the security of S and of G on the same input length. However, the time complexity of S′ is still substantially larger than the time complexity of S, and the parallel time complexity of S′ might also be larger. Given the drawbacks of both constructions, it seems that a direct construction of efficient and parallel synthesizers with linear output length is very desirable.

Collections of Pseudo-Random Synthesizers

A natural way to relax the definition of a pseudo-random synthesizer is to allow a distribution of functions for every input length, rather than a single function. To formalize this, we use the concept of an efficiently computable function ensemble.

Definition (collection of pseudo-random synthesizers): Let ℓ be any N → N function, and let S = {S_n}_{n∈N} be an efficiently computable I_{2n} → I_{ℓ(n)} function ensemble. S is a collection of I_{2n} → I_{ℓ(n)} pseudo-random synthesizers if for every probabilistic polynomial-time algorithm D, every two polynomials p and m, and all sufficiently large n,

    |Pr[D(C_{S_n}(X, Y)) = 1] − Pr[D(U_{ℓ(n)}^{m(n)×m(n)}) = 1]| < 1/p(n),

where X and Y are independently drawn from U_n^{m(n)}.

As shown below, a collection of pseudo-random synthesizers is sufficient for our construction of pseudo-random functions. Working with a collection of synthesizers, rather than a single synthesizer, enables us to move some of the computation into a preprocessing stage during the key-generation. This is especially useful if all other computations can be done in parallel. Note that both lemmas above easily extend to collections of synthesizers.


A Parallel Construction of Pseudo-Random Functions

This section describes the construction of pseudo-random functions using pseudo-random synthesizers as building blocks. The intuition of this construction is best explained through the concept of a k-dimensional pseudo-random synthesizer, which is a natural generalization of the regular (two-dimensional) synthesizer. Informally, an efficiently computable function of k variables, S^k, is a k-dimensional pseudo-random synthesizer if the following holds:

Given polynomially-many uniformly-chosen assignments {a_{1,i}}_{i=1}^{m}, …, {a_{k,i}}_{i=1}^{m} for each of its variables, the output of S^k on all the combinations, M(i_1, i_2, …, i_k) = S^k(a_{1,i_1}, a_{2,i_2}, …, a_{k,i_k}), cannot be efficiently distinguished from uniform by an algorithm that can access M at points of its choice.

Note that this definition is somewhat different from the two-dimensional case. For any constant k, and in particular for k = 2, the matrix M is of polynomial size, and we can give it as an input to the distinguisher. In general, M might be too large, and therefore we let the distinguisher access M at points of its choice.

Using this concept, the construction of pseudo-random functions can be described in two steps:

1. A parallel construction of an n-dimensional synthesizer S^n from a two-dimensional synthesizer S that has output length ℓ(n) = n. This is a recursive construction, where the 2k-dimensional synthesizer S^{2k} is defined using the k-dimensional synthesizer S^k:

    S^{2k}(x_1, x_2, …, x_{2k}) def= S^k(S(x_1, x_2), S(x_3, x_4), …, S(x_{2k−1}, x_{2k})).

2. An immediate construction of the pseudo-random function f from S^n:

    f_{⟨a_1^0, a_1^1, …, a_n^0, a_n^1⟩}(x) def= S^n(a_1^{x_1}, a_2^{x_2}, …, a_n^{x_n}).

In fact, pseudo-random functions can be constructed from a collection of synthesizers. In this case, for each level of the recursion a different synthesizer is sampled from the collection. As noted below, for some collections of synthesizers (such as those constructed in this work) it is enough to sample a single synthesizer for all levels.

Formal Definition

The following operation on sequences is used in the construction:

Definition: For every function S: I_{2n} → I_n and every sequence L = {l_1, l_2, …, l_k} of n-bit strings, define SQ_S(L) to be the sequence L′ = {l′_1, l′_2, …, l′_{⌈k/2⌉}}, where l′_i def= S(l_{2i−1}, l_{2i}) for 1 ≤ i ≤ ⌊k/2⌋ and, if k is odd, l′_{⌈k/2⌉} def= l_k. (SQ stands for "squeeze".)

We now turn to the construction itself.

[Figure: Computing the value of the pseudo-random function for n = 8. The input bits select the leaf labels a_i^{x_i}; the synthesizers s_{k_1}, s_{k_2}, s_{k_3} are then applied level by level, bottom-up, to produce the label of the root.]

Construction (Pseudo-Random Functions): Let S = {S_n}_{n∈N} be a collection of I_{2n} → I_n pseudo-random synthesizers, and let I_S be a probabilistic polynomial-time key-generating algorithm for S. For every possible value k of I_S(1^n), denote by s_k the corresponding I_{2n} → I_n function. The function ensemble F = {F_n}_{n∈N} is defined as follows:

Key-generation: On input 1^n, the probabilistic polynomial-time key-generating algorithm I_F outputs a pair ⟨a, k̄⟩, where a = {a_1^0, a_1^1, …, a_n^0, a_n^1} is sampled from U_n^{2n} and k̄ = {k_1, k_2, …, k_{⌈log n⌉}} is generated by ⌈log n⌉ independent executions of I_S on input 1^n (i.e., k̄ is sampled from (I_S(1^n))^{⌈log n⌉}).

Evaluation: For every possible value ⟨a, k̄⟩ of I_F(1^n), the function f_{a,k̄}: I_n → I_n is defined as follows. On an n-bit input x = x_1 x_2 … x_n, the function outputs the single value in

    SQ_{s_{k_{⌈log n⌉}}}(… SQ_{s_{k_2}}(SQ_{s_{k_1}}({a_1^{x_1}, a_2^{x_2}, …, a_n^{x_n}})) …).

Finally, F_n is defined to be the random variable that assumes as values the functions f_{a,k̄}, with the probability space induced by I_F(1^n).

F

The evaluation of f x can be thought of as a recursive lab eling pro cess of a binary

ak

th

tree with n leaves and depth dlog ne The i leaf has two p ossible lab els a and a The

i i

th

i input bit x selects one of these lab els a The lab el of each internal no de at depth

i ix

i

the lab el on the lab els of its children The value of f x is simply d is the value of s

k

d

ak

of the ro ot Figure illustrates the evaluation of f for n We note that this

ak

lab eling pro cess is very dierent than the one asso ciated with the GGMConstruction

First the binary tree is of depth dlog ne instead of depth n as in Secondly the lab eling

pro cess is b ottomup instead of topdown as in ie starting at leaves instead of the

ro ot Moreover here each input denes a dierent lab eling of the tree whereas in the

lab eling of the tree is fully determined by the key and the input only determines a leaf such

that its lab el is the value of the function on this input

Efficiency of the Construction

It is clear that F is efficiently computable, given that S is efficiently computable. Furthermore, the parallel time complexity of functions in F_n is larger by a factor of O(log n) than the parallel time complexity of functions in S_n. The parallel time complexity of I_F and I_S is identical.

We note that, for simplicity, the parameter n serves a double role: n is both the length of inputs to f_{a,k̄} ∈ F_n and the security parameter for such a function (the second role is expressed by the fact that the strings in a are n-bit long). In practice, however, these roles would be separated. The security parameter would be determined by the quality of the synthesizers, and the length of inputs to the pseudo-random functions would be determined by their application. In fact, one can usually use a pseudo-random function with a reasonably small input length and still prevent a birthday attack. This is implied by the suggestion of Levin to pairwise-independently hash the input before applying the pseudo-random function (this idea is described in more detail in the introduction).

Reducing the Key-Length

An apparent disadvantage of the construction above is the large key-length of a function f_{a,k̄} ∈ F_n. In particular, the sequence a is defined by 2n^2 bits. However, this is not truly a problem, since: (a) a related construction, described later in this chapter, uses a sequence a that consists of a constant number of strings and is therefore defined by O(n) bits; and (b) the truly random sequence a can be replaced by a pseudo-random sequence without increasing the depth of the construction by more than a constant factor. This is achieved as follows. Let G be a pseudo-random generator that expands the input by a factor of two, and let G′ be the pseudo-random generator that can be constructed from G according to the corollary above for p(n) = 2n (i.e., by using ⌈log 2n⌉ levels of the recursion). Then a can be replaced by G′(a′), where a′ is an n-bit seed.

In addition to a, the key of f ∈ F_n consists of ⌈log n⌉ keys of functions in S_n. It turns out that for some collections of synthesizers, such as those described in this work, this overhead can be eliminated as well. This is certainly true when using a single synthesizer instead of a collection. Moreover, from the proof of security of the construction, one can easily extract the following claim: if the collection of synthesizers remains secure even when it uses a public key (i.e., if C_{s_k}(X, Y) remains pseudo-random even when the distinguisher sees k), then the ⌈log n⌉ keys can be replaced with a single one (i.e., the same key can be used at all levels of the recursion).

Security of the Construction

Theorem: Let S and F be as in the construction above, and let R = {R_n}_{n∈N} be the uniform I_n → I_n function ensemble. Then F is an efficiently computable pseudo-random function ensemble. Furthermore, any efficient distinguisher M between F and R yields an efficient distinguisher D for S, such that the success probability of D is smaller by a factor of at most ⌈log n⌉ than the success probability of M.

To prove the theorem, we use a hybrid argument. We first define a sequence of ⌈log n⌉ + 1 function distributions such that the two extreme distributions are R_n and F_n. We then show that any distinguisher for two neighboring distributions can be transformed into a distinguisher for the pseudo-random synthesizers. For simplicity, we define these hybrid distributions for the case where n is a power of two; the definition easily extends to a general value of n such that the analysis still holds.

For any 0 ≤ j ≤ ℓ, denote by H_n^j the j-th hybrid distribution. The computation of functions in H_n^j may be described as a labeling process of a binary tree with n leaves and depth ℓ = log n (an analogous description for F_n appears above). Here the labeling process starts with the nodes at distance j from the leaves. The i-th such node has 2^{2^j} possible labels, {a_{i,s} : s ∈ I_{2^j}}, which are part of the key. The i-th (2^j)-bit substring x_i of the input x selects one of these labels, a_{i,x_i}. The rest of the labeling process is the same as it was for functions in F_n: the label of each node at distance d > j from the leaves is the value of s_{k_d} on the labels of its children. The value of the function on this input is simply the label of the root.

Another way to think of H_n^j is via the concept of a k-dimensional synthesizer (see the corresponding section). As was the case for F_n, the construction of functions in H_n^j can be described in two steps: (1) a recursive construction of a 2^{ℓ−j}-dimensional synthesizer S^{ℓ−j} from a two-dimensional synthesizer S; (2) an immediate construction of the pseudorandom function f from S^{ℓ−j}:

  f(x_1 x_2 … x_{2^{ℓ−j}}) = S^{ℓ−j}(a_{1,x_1}, a_{2,x_2}, …, a_{2^{ℓ−j},x_{2^{ℓ−j}}}),

where a = {a_{r,s} : 1 ≤ r ≤ 2^{ℓ−j}, s ∈ I_{2^j}}.

We turn to the formal definition of the hybrid distributions.

Definition. Let I_S be the key-generating algorithm of S. Let n, ℓ and j be three integers such that n = 2^ℓ and 0 ≤ j ≤ ℓ. For every sequence k = {k_{j+1}, k_{j+2}, …, k_ℓ} of possible values of I_S(1^n) and for every length-(2^{ℓ−j} · 2^{2^j}) sequence of n-bit strings a = {a_{r,s} : 1 ≤ r ≤ 2^{ℓ−j}, s ∈ I_{2^j}}, the function f_{a,k} : I_n → I_n is defined as follows: on input x = x_1 x_2 … x_{2^{ℓ−j}}, where ∀i, x_i ∈ I_{2^j}, the function outputs the single value in

  SQ_{s_{k_ℓ}}(SQ_{s_{k_{ℓ−1}}}(… SQ_{s_{k_{j+1}}}({a_{1,x_1}, a_{2,x_2}, …, a_{2^{ℓ−j},x_{2^{ℓ−j}}}}) …)).

H_n^j is the random variable that assumes as values the functions f_{a,k} defined above, where the k_i's are independently distributed according to I_S(1^n) and a is independently distributed according to U_n^{2^{ℓ−j}·2^{2^j}}.

This definition immediately implies:

Claim. H_n^0 and F_n are identically distributed, and H_n^{⌈log n⌉} and R_n are identically distributed.

The proof below shows that, for every 0 ≤ j < ⌈log n⌉, the two neighboring ensembles H_n^j and H_n^{j+1} are computationally indistinguishable. As shown below, this implies the theorem by a standard hybrid argument.

CHAPTER: CONSTRUCTIONS OF PSEUDORANDOM FUNCTIONS

Proof of the theorem. As mentioned above, it is obvious that F is an efficiently computable function ensemble. Assume that F is not pseudorandom. By the definition of pseudorandom function ensembles, there exists a polynomial-time oracle machine M and a polynomial p such that for infinitely many n,

  |Pr[M^{F_n}(1^n) = 1] − Pr[M^{R_n}(1^n) = 1]| > 1/p(n),

where R = {R_n}_{n∈N} is the uniform I_n → I_n function ensemble. Let t be a polynomial that bounds the number of queries that M makes on input 1^n.

Given M, we define the probabilistic polynomial-time algorithm D that distinguishes the output of S from random. Let m = m(n) be defined by m(n) = t(n) · n. For every n, the input of D is an m(n) × m(n) matrix B = (b_{i,j}) whose entries are n-bit strings. As part of its algorithm, D invokes M on input 1^n. The definition of m allows D to answer all the queries of M (which are bounded by t(n)). It is shown below that D distinguishes between the following two distributions of B: (a) C_S(X, Y), where X and Y are independently drawn from U_n^{m(n)}; (b) U_n^{m(n)×m(n)}.

For simplicity of presentation, we only define the algorithm that D performs for n = 2^ℓ. It is easy to extend this definition to a general value of n such that the two claims below still hold. On input B = (b_{i,j})_{i,j=1}^{m(n)}, the algorithm is defined as follows:

1. Choose J ∈ {0, 1, …, ℓ − 1} uniformly at random.

2. Generate k = {k_{J+2}, …, k_ℓ} by ℓ − J − 1 independent executions of I_S on input 1^n.

3. Extract submatrices of B: for 1 ≤ i ≤ 2^{ℓ−J−1}, denote by B^i = (b^i_{u,v})_{u,v=1}^{t(n)} the t(n) × t(n) diagonal submatrix of B defined by

  b^i_{u,v} = b_{(i−1)·t(n)+u, (i−1)·t(n)+v}.

4. Invoke M on input 1^n. Denote by q^r = q^r_1 q^r_2 … q^r_{2^{ℓ−J}} the r-th query M makes, where ∀i, q^r_i ∈ I_{2^J}. On each of these queries D answers as follows:

  (a) For every 1 ≤ i ≤ 2^{ℓ−J−1}, denote by a_{i, q^r_{2i−1} q^r_{2i}} the entry b^i_{u,v} of B^i, where u = min{j ≤ r : q^j_{2i−1} = q^r_{2i−1}} and v = min{j ≤ r : q^j_{2i} = q^r_{2i}}.

  (b) Answer the query with the single value in

    SQ_{s_{k_ℓ}}(SQ_{s_{k_{ℓ−1}}}(… SQ_{s_{k_{J+2}}}({a_{1, q^r_1 q^r_2}, …, a_{2^{ℓ−J−1}, q^r_{2^{ℓ−J}−1} q^r_{2^{ℓ−J}}}}) …)).

5. Output whatever M outputs.
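The diagonal-submatrix extraction of step 3 is mechanical; a small sketch (with a toy matrix of index pairs so the indexing is visible):

```python
def diagonal_submatrix(B, i, t):
    """B^i: the t x t diagonal submatrix with entries
    b^i_{u,v} = b_{(i-1)t+u, (i-1)t+v} (1-based indices, as in the text)."""
    off = (i - 1) * t
    return [[B[off + u][off + v] for v in range(t)] for u in range(t)]

# Toy 6x6 matrix whose entry (r, c) is just the pair of its own indices.
B = [[(r, c) for c in range(6)] for r in range(6)]
B2 = diagonal_submatrix(B, 2, 3)  # the second 3x3 diagonal block
```

For i = 2 and t = 3 the block covers rows and columns 3..5 of the 0-indexed toy matrix.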

It is obvious that D is a polynomial-time algorithm. To show that D is also a distinguisher for the pseudorandom synthesizers, we first state and prove the following two claims.

Claim. For every 0 ≤ j ≤ ⌈log n⌉ − 1,

  Pr[D(U_n^{m(n)×m(n)}) = 1 | J = j] = Pr[M^{H_n^{j+1}}(1^n) = 1].

Proof. As part of its algorithm, D denotes some of B's entries by names of the form a_{r,s}, where 1 ≤ r ≤ 2^{ℓ−J−1} and s ∈ I_{2^{J+1}}. Note that D never denotes an entry of B by two different names. Assume, for the sake of the proof, that any name a_{r,s} that was not used by D is assigned an independently and uniformly distributed n-bit string. Denote by a the sequence {a_{r,s} : 1 ≤ r ≤ 2^{ℓ−J−1}, s ∈ I_{2^{J+1}}}. It is easy to verify that:

1. When B is uniformly distributed, the distribution of a is identical to U_n^{2^{ℓ−J−1}·2^{2^{J+1}}}.

2. D answers every query q of M with the value f_{a,k}(q), where f_{a,k} is as in the definition of H_n^{J+1}.

The claim immediately follows from (1) and (2) and from the definition of k.

Claim. Let X and Y be independently drawn from U_n^{m(n)}. Then, for every 0 ≤ j ≤ ⌈log n⌉ − 1,

  Pr[D(C_S(X, Y)) = 1 | J = j] = Pr[M^{H_n^j}(1^n) = 1].

Proof. Let X = {x_1, x_2, …, x_{m(n)}} and Y = {y_1, y_2, …, y_{m(n)}} be independently drawn from U_n^{m(n)}, and let s_k be drawn from S_n. Assume that the input of D is B = C_{s_k}(X, Y). For the sake of the proof, define the vector a = {a_{i,s} : 1 ≤ i ≤ 2^{ℓ−J}, s ∈ I_{2^J}} as follows: if D denoted by a_{i, q^r_{2i−1} q^r_{2i}} the entry b^i_{u,v} of B^i, then define a_{2i−1, q^r_{2i−1}} to be x_{(i−1)·t(n)+u} and a_{2i, q^r_{2i}} to be y_{(i−1)·t(n)+v}. Note that a_{i, q^r_{2i−1} q^r_{2i}} = s_k(a_{2i−1, q^r_{2i−1}}, a_{2i, q^r_{2i}}). For all other values in a, assign an independently and uniformly distributed n-bit string.

It is easy to verify that the distribution of a is identical to U_n^{2^{ℓ−J}·2^{2^J}}. Let k' be the sequence {k, k_{J+2}, …, k_ℓ} and let f_{a,k'} be as in the definition of H_n^J. We now have that the answer D gives to the r-th query q^r of M is

  SQ_{s_{k_ℓ}}(… SQ_{s_{k_{J+2}}}({a_{1, q^r_1 q^r_2}, …, a_{2^{ℓ−J−1}, q^r_{2^{ℓ−J}−1} q^r_{2^{ℓ−J}}}}) …)
  = SQ_{s_{k_ℓ}}(… SQ_{s_{k_{J+2}}}(SQ_{s_k}({a_{1, q^r_1}, …, a_{2^{ℓ−J}, q^r_{2^{ℓ−J}}}})) …)
  = f_{a,k'}(q^r).

From this fact and from the definition of a and k', we immediately get the claim.

By the two claims we can now conclude the following. Let X and Y be independently drawn from U_n^{m(n)}; then, for infinitely many n,

  |Pr[D(C_S(X, Y)) = 1] − Pr[D(U_n^{m(n)×m(n)}) = 1]|

  = (1/⌈log n⌉) · |Σ_{j=0}^{⌈log n⌉−1} Pr[D(C_S(X, Y)) = 1 | J = j] − Σ_{j=0}^{⌈log n⌉−1} Pr[D(U_n^{m(n)×m(n)}) = 1 | J = j]|

  = (1/⌈log n⌉) · |Σ_{j=0}^{⌈log n⌉−1} (Pr[M^{H_n^j}(1^n) = 1] − Pr[M^{H_n^{j+1}}(1^n) = 1])|

  = (1/⌈log n⌉) · |Pr[M^{H_n^0}(1^n) = 1] − Pr[M^{H_n^{⌈log n⌉}}(1^n) = 1]|

  = (1/⌈log n⌉) · |Pr[M^{F_n}(1^n) = 1] − Pr[M^{R_n}(1^n) = 1]|

  > 1/(p(n) · ⌈log n⌉).

This contradicts the assumption that S is a collection of pseudorandom synthesizers and completes the proof of the theorem.

Corollary. For any collection of pseudorandom synthesizers S such that its functions are computable in NC^i, there exists an efficiently computable pseudorandom function ensemble F such that its functions are computable in NC^{i+1}. Furthermore, the corresponding key-generating algorithms I_S and I_F have the same parallel time complexity.

Proof. By the lemma above, we can construct from S a new collection of I_n → I_n pseudorandom synthesizers S' such that its functions are computable in NC^i. By the theorem above, we can construct from S' an efficiently computable pseudorandom function ensemble F such that its functions are computable in NC^{i+1}. Both constructions preserve the parallel time complexity of the key-generating algorithms.

A Related Construction and Additional Properties

Though designed to enable efficient computation in parallel, the construction above enjoys some additional useful properties. In this section we describe two such properties: a rather sharp time-space tradeoff and an incremental property. We also show how to adjust the construction in order to improve upon these properties.

Time-Space Tradeoff

The construction has the advantage of a sharp time-space tradeoff. In order to get an even sharper tradeoff, we describe an alternative construction of pseudorandom functions. The best way to understand the revised construction is by viewing the computation process backwards. Every function on n bits is defined by the length-2^n sequence of all its values. Assume that we could sample and store two length-2^{n/2} sequences X and Y of random strings as the key of a pseudorandom function. In this case, given a pseudorandom synthesizer S_n, we can define the values of the pseudorandom function to be the entries of the matrix C_{S_n}(X, Y). In order to reduce the key size, we can replace the random sequences X and Y with pseudorandom sequences: such sequences X and Y can be obtained together from C_{S_n}(X', Y'), where X' and Y' are two shorter random sequences of length approximately 2^{n/4}. By continuing this process of reducing the key size log n times, we get a key with a constant number of strings (see the figure for an illustration of the construction).

In order to understand where the original construction is wasteful in the size of the key, we can describe it in similar terms. The 2^n values of the function are still the values of C_S(X, Y) for two sequences X and Y (in the description of the computation as a tree-labeling process, these are all the possible labels of the root's children), but then we get X and Y separately, as C_S(X_1, Y_1) and C_S(X_2, Y_2). By the time the sequences have constant length, there are O(n) of those.

[Figure: Illustration of the Alternative Construction. The 2^n values of the function are derived from a key of m strings.]

Returning to the new construction, note that if we allow a key of m strings, we only need t = log n − log log m of the steps described above. Computing such functions requires t phases, and in each phase several parallel invocations of S_n. The total number of invocations of S_n is 2^t = n/log m. This seems to be a relatively sharp time-space tradeoff and, to the best of our knowledge, one that cannot be obtained by the GGM construction.

For some applications, like the protection of the data on a disk, we need pseudorandom functions with a reasonably small number of entries. In this case, by storing relatively few strings we can achieve a very easy-to-compute function: a modest number of random bit strings defines a pseudorandom function on a small domain that can be computed with only a few invocations of a pseudorandom synthesizer, in a small number of phases.
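The tradeoff arithmetic above can be checked directly; a small worked example (the parameter values n = 1024, m = 16 are illustrative, not from the text):

```python
import math

def phases_and_invocations(n: int, m: int):
    """t = log n - log log m phases; 2^t = n / log m synthesizer invocations."""
    t = math.log2(n) - math.log2(math.log2(m))
    return t, n / math.log2(m)

# With n = 1024 and a key of m = 16 strings:
# t = 10 - log2(4) = 8 phases, and 1024 / 4 = 256 invocations.
t, calls = phases_and_invocations(1024, 16)
```

Increasing m (more space) reduces both the number of phases and the total work, which is the tradeoff the text describes.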

We formalize the definition of the alternative construction:

Construction (Alternative Construction of Pseudo-Random Functions). Let S = {S_n}_{n∈N} be a collection of I_n → I_n pseudorandom synthesizers, and let I_S be a probabilistic polynomial-time key-generating algorithm for S. For every possible value k of I_S(1^n), denote by s_k the corresponding I_n → I_n function. The function ensemble F = {F_n}_{n∈N} is defined as follows:

Key generation: Let m_j denote the value 2^{2^j}, and let t denote the smallest integer such that m_t ≥ 2^n. On input 1^n, the probabilistic polynomial-time key-generating algorithm I_F outputs a pair ⟨a, k⟩, where a = {a_1, a_2, …, a_{m_0}} is generated according to U_n^{m_0} and k = {k_1, k_2, …, k_t} is generated by t independent executions of I_S on input 1^n (i.e., k is sampled from (I_S(1^n))^t).

Evaluation: For every possible value ⟨a, k⟩ of I_F(1^n) and every j such that 0 ≤ j ≤ t, define the function f^j_{a,k} : I_{log m_j} → I_n by recursion on j. For x ∈ I_{log m_0}, define f^0_{a,k}(x) to be a_x. For any j ≥ 1 and x = x_1 x_2 ∈ I_{log m_j} (where x_1 and x_2 are (log m_{j−1})-bit strings), define f^j_{a,k}(x) to be s_{k_j}(f^{j−1}_{a,k}(x_1), f^{j−1}_{a,k}(x_2)).

For every x ∈ I_n, the value of the function f_{a,k} : I_n → I_n on x is defined to be f^t_{a,k}(x'), where x' is obtained by padding x with log m_t − n zeros.

Finally, F_n is defined to be the random variable that assumes as values the functions f_{a,k}, with the probability space induced by I_F(1^n).
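The recursion of the alternative construction can be sketched as follows. As before, the synthesizer is a hypothetical hash-based stand-in; the recursive halving, the per-level keys k_j, and the zero padding mirror the definition:

```python
import hashlib
import os

def synth(k: bytes, a: bytes, b: bytes) -> bytes:
    # Toy synthesizer stand-in (hypothetical, for illustration only).
    return hashlib.sha256(k + a + b).digest()[:8]

def f(a, keys, x: str) -> bytes:
    """f^0(x) = a_x for a one-bit x; f^j(x1 x2) = s_{k_j}(f^{j-1}(x1), f^{j-1}(x2))."""
    if len(x) == 1:                        # base case: index into the key a
        return a[int(x, 2)]
    half = len(x) // 2
    k = keys[len(x).bit_length() - 2]      # key k_j for level j = log |x|
    return synth(k, f(a, keys, x[:half]), f(a, keys, x[half:]))

def evaluate(a, keys, x: str, n: int) -> bytes:
    # Pad x with log(m_t) - n = 2^t - n zeros, for the smallest t with 2^t >= n.
    t = 1
    while 2 ** t < n:
        t += 1
    return f(a, keys, x + "0" * (2 ** t - n))

a = [os.urandom(8), os.urandom(8)]         # m_0 = 2 key strings
keys = [os.urandom(8) for _ in range(3)]   # k_1 .. k_t for t = 3
out = evaluate(a, keys, "10110", 5)        # n = 5, padded to length 8
```

With m_0 = 2 key strings the evaluation uses 2^t − 1 synthesizer calls (7 here), matching the tradeoff count in the text.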

The proof of security for the alternative construction is omitted, since it is almost identical to the proof of security given above.

We can now state the exact form of the time-space tradeoff, under the notation of the construction: if a contains m_i strings instead of m_0, then we can define f^i_{a,k}(x) to be a distinct value in a for every x ∈ I_{log m_i}, and keep the recursive definition of f^j_{a,k} as before for j > i. In this case, computing f^t_{a,k}(x) can be done in t − i phases, with a total of 2^{t−i} − 1 invocations of the synthesizers. The next lemma follows (for simplicity, this lemma is stated in terms of a synthesizer instead of a collection of synthesizers):

Lemma. Let S be a pseudorandom synthesizer with output-length function ℓ(n) = n. Assume that S can be computed in parallel time D(n) and work W(n) on n-bit inputs. Then, for every m = m(n) such that m(n) ≤ 2^n, there exists an efficiently computable pseudorandom function ensemble F = {F_n}_{n∈N} such that the key of a function in F_n is a sequence of at most m(n) random n-bit strings, and this function can be computed in parallel time (log n − log log m(n)) · O(D(n)) and using work of (n/log m(n)) · O(W(n)).

Incremental Property

We now describe an observation of Mihir Bellare that gives rise to an interesting incremental property of our construction. For the formulation and treatment of incremental cryptography, see the work of Bellare, Goldreich and Goldwasser.

Let f be any function in F_n, where F = {F_n}_{n∈N} is the pseudorandom function ensemble defined by the main construction. Let x, y ∈ I_n be of Hamming distance one (x and y differ on exactly one bit). Then, given the computation of f(x), including all intermediate values, we only need log n additional invocations of the pseudorandom synthesizers (instead of n − 1) in order to evaluate f(y). The easiest way to see the correctness of this observation is to recall the description of the computation of f(x) as a labeling process on a depth-(log n) binary tree: the only labels that change as a result of flipping one bit of x are those of the nodes on the path from one leaf to the root (i.e., log n labels).

If a Gray-code representation of numbers is used, we get a similar observation for the computation of f(x) and f(x + 1). Given the computation of one of these values, computing the other requires only log n additional invocations of the pseudorandom synthesizers. It is not hard to imagine situations where one of these incremental properties is useful.

(A permutation P on I_n is called a Gray-code representation if, for every x, the Hamming distance between P(x) and P(x + 1 mod 2^n) is one. Such a P defines a Hamiltonian cycle on the n-dimensional cube. It is not hard to see that an easy-to-compute P can be defined.)

The observation regarding the computation of f(x) and f(y), for x and y of Hamming distance one, also holds for the functions of the alternative construction. The observation regarding the computation of f(x) and f(x + 1) holds if we use a different representation of numbers (this representation is similar to a Gray code, though a bit more complicated).
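One concrete, easy-to-compute Gray-code representation (the standard reflected binary Gray code, offered here as an example of such a P; the text does not fix a specific one) is P(x) = x XOR (x >> 1), and its defining property is easy to verify exhaustively for small n:

```python
def gray(x: int) -> int:
    # Reflected binary Gray code: P(x) and P(x+1 mod 2^n) differ in one bit.
    return x ^ (x >> 1)

n = 6
for x in range(2 ** n):
    d = gray(x) ^ gray((x + 1) % 2 ** n)
    assert bin(d).count("1") == 1  # Hamming distance exactly one, cycle included
```

This is the permutation that makes consecutive inputs f(x), f(x + 1) differ in a single tree path.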

Synthesizers Based on General Cryptographic Primitives

The preceding sections are mostly devoted to showing parallel constructions of pseudorandom synthesizers. In this section we provide a simple construction of pseudorandom synthesizers based on what we call weak pseudorandom functions. This construction immediately implies a construction of pseudorandom synthesizers based on trapdoor one-way permutations, and an additional construction based on any hard-to-learn problem, which is considered separately. An interesting line for further research is the parallel construction of pseudorandom synthesizers from other cryptographic primitives. In particular, we do not know of such a construction from pseudorandom generators, or directly from one-way functions.

Weak Pseudo-Random Functions

The reason pseudorandom functions are hard to construct is that they must endure a very powerful kind of attack: the adversary (the distinguisher) may query their values at every point, and may adapt its queries based on the answers it gets. We can weaken the opponent by letting the only access it has to the function be a polynomial-size sample of random points and the value of the function at these points (more on definitions of function families that are weaker than pseudorandom functions can be found in the literature). We call functions that look random to such an adversary weak pseudorandom functions. In this section it is shown that weak pseudorandom functions yield pseudorandom synthesizers in a straightforward manner. We therefore get a parallel construction of (standard) pseudorandom functions from weak pseudorandom functions.

For simplicity, we define weak pseudorandom functions as length-preserving. In their definition we use the following notation:

Notation. For every function f and every sequence X = {x_1, …, x_k} of values in the domain of f, denote by V(X, f) the sequence {x_1, f(x_1), x_2, f(x_2), …, x_k, f(x_k)} (V stands for "values").

Definition (collection of weak pseudorandom functions). An efficiently computable I_n → I_n function ensemble F = {F_n}_{n∈N} is a collection of weak pseudorandom functions if, for every probabilistic polynomial-time algorithm D, every two polynomials p and m, and all sufficiently large n,

  |Pr[D(V(U_n^{m(n)}, F_n)) = 1] − Pr[D(U_n^{2m(n)}) = 1]| < 1/p(n).

Let F be a collection of weak pseudorandom functions, and let I be the polynomial-time key-generating algorithm for F. The lemma below shows how to construct a pseudorandom synthesizer from F and I. Since the random bits of I can be replaced by pseudorandom bits, we can assume that I only uses n truly random bits on input 1^n. (In fact, this is only a simplifying assumption, which is not really required for the construction of pseudorandom synthesizers.) For every r ∈ I_n, denote by I_r the value of I(1^n) when I uses r as its random bits.

Lemma. Let F and I be as above, and define S : {0,1}^n × {0,1}^n → {0,1}^n such that ∀x, y ∈ I_n, S(x, y) = f_{I_y}(x). Then S is a pseudorandom synthesizer.
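The lemma's construction is a one-liner once a weak-PRF family is fixed. A sketch with a hypothetical hash-based family standing in for a real (proven) weak PRF:

```python
import hashlib

def I(y: bytes) -> bytes:
    # Hypothetical key-generating algorithm I_y: derive a key from seed y.
    return hashlib.sha256(b"keygen" + y).digest()

def f(k: bytes, x: bytes) -> bytes:
    # Hypothetical keyed function f_k(x); a stand-in, not a proven weak PRF.
    return hashlib.sha256(k + x).digest()[:8]

def S(x: bytes, y: bytes) -> bytes:
    # The synthesizer of the lemma: S(x, y) = f_{I_y}(x).
    return f(I(y), x)
```

Note why this should be a synthesizer: column v of C_S(X, Y) is exactly {f_{I_{y_v}}(x_u)}_u, i.e., the values of a single weak-PRF instance at the random points X, which is precisely the view a weak-PRF adversary is allowed.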

Proof. It is obvious that S is efficiently computable. Assume, in contradiction to the lemma, that S is not a pseudorandom synthesizer. Then there exists a probabilistic polynomial-time algorithm D and polynomials p and m such that, for infinitely many n,

  |Pr[D(C_S(X, Y)) = 1] − Pr[D(U_n^{m(n)×m(n)}) = 1]| > 1/p(n),

where X and Y are independently drawn from U_n^{m(n)}.

For every n and every 0 ≤ i ≤ m(n), define the i-th hybrid distribution H_n^i over m(n) × m(n) matrices as follows: the first i columns are distributed according to C_S(X, Y), where X is drawn from U_n^{m(n)} and Y is independently drawn from U_n^i; the last m(n) − i columns are independently distributed according to U_n^{m(n)·(m(n)−i)}. It is immediate that, for infinitely many n,

  |Pr[D(H_n^{m(n)}) = 1] − Pr[D(H_n^0) = 1]| > 1/p(n).

We now define a distinguisher D' for F. Given {x_1, z_1, x_2, z_2, …, x_{m(n)}, z_{m(n)}} as its input, D' performs the following algorithm:

1. Define X = {x_1, …, x_{m(n)}} and Z = {z_1, …, z_{m(n)}}.

2. Uniformly choose J ∈ {1, …, m(n)}.

3. Sample Y from U_n^{J−1} and generate an m(n) × m(n) matrix B whose first J − 1 columns are C_S(X, Y), whose J-th column is Z, and whose last m(n) − J columns are independently distributed according to U_n^{m(n)·(m(n)−J)}.

4. Output D(B).

It is obvious that D' is efficiently computable. It is also easy to verify that

  Pr[D'(V(U_n^{m(n)}, F_n)) = 1 | J = j] = Pr[D(H_n^j) = 1]

and that

  Pr[D'(U_n^{2m(n)}) = 1 | J = j] = Pr[D(H_n^{j−1}) = 1].

Thus, by a standard hybrid argument, we get that for infinitely many n,

  |Pr[D'(V(U_n^{m(n)}, F_n)) = 1] − Pr[D'(U_n^{2m(n)}) = 1]| > 1/(p(n) · m(n)),

in contradiction to the assumption that F is a collection of weak pseudorandom functions. We can therefore conclude the lemma.

Notice that S as in the lemma obeys an even more powerful requirement than is needed by the definition of a pseudorandom synthesizer: for random X and Y, the matrix C_S(X, Y) cannot be efficiently distinguished from a random matrix even if we allow the distinguisher access to X.

Corollary (of the lemma). If there exist weak pseudorandom functions that can be sampled and evaluated in NC, then there also exist a pseudorandom synthesizer in NC and standard pseudorandom functions that can be sampled and evaluated in NC.

Trapdoor One-Way Permutations

We now describe a rather simple construction of weak pseudorandom functions from a collection of trapdoor permutations. Therefore, given the lemma above, we get a construction of a pseudorandom synthesizer out of a collection of trapdoor permutations. This pseudorandom synthesizer is in NC if the trapdoor permutations can be sampled and inverted in NC (in fact, there is an additional requirement of a hard-core predicate in NC, but this is already satisfied by the Goldreich-Levin predicate). Since we have no concrete example of this sort, we only give a brief and informal description of the construction (for formal definitions of trapdoor one-way permutations and hard-core bits see, e.g., the standard references).

Let F = {F_n}_{n∈N} be a permutation ensemble such that every f_i ∈ F_n is a permutation over a domain D_n. Informally, F is a collection of trapdoor one-way permutations if the key-generating algorithm I_F of F outputs both a public key i and a trapdoor key t(i), and we have that:

1. Given i, the function f_i is easy to compute everywhere, but hard to invert on the average.

2. Given t(i), the function f_i is easy to compute and to invert everywhere.

Let F = {F_n : D_n → D_n}_{n∈N} be a collection of trapdoor one-way permutations. Assume that the collection is one-way for the uniform distribution over the inputs, i.e., it is hard to compute x given F_n(x) where x is uniformly distributed in D_n. Let the sequence of functions {b_n : D_n → I_1}_{n∈N} be a hard-core predicate for F. Informally, this means that, given F_n(x) for a uniformly distributed x, it is hard to guess b_n(x) with probability that is non-negligibly better than half. We can now define a collection of weak pseudorandom functions G = {G_n}_{n∈N} in the following way:

For every x ∈ I_n, denote by D_n(x) the element in D_n sampled using x as the random bits. For every key i of f_i ∈ F_n, define g_i : I_n → I_1 as follows:

  ∀x ∈ I_n, g_i(x) = b_n(f_i^{−1}(D_n(x))).

Note that computing g_i(x) requires knowledge of the trapdoor key t(i). Let G_n be the random variable that assumes as values the functions g_i, with the probability space induced by the distribution over the keys in F_n.

Claim (without proof). If F is a collection of trapdoor one-way permutations and {b_n}_{n∈N} is a hard-core predicate for F, then the function ensemble G defined above is a collection of weak pseudorandom functions.

Number-Theoretic Constructions of Pseudo-Random Synthesizers

In this section we present several NC (in fact, TC) constructions of pseudorandom synthesizers, based on concrete, frequently-used intractability assumptions. The first construction is at least as secure as the computational Diffie-Hellman assumption. Since the Diffie-Hellman assumption modulo a composite is not stronger than factoring (see also the discussion below), this implies a construction that is at least as secure as factoring. Finally, we show two constructions that are at least as secure as the RSA assumption. Although the RSA assumption is not weaker than factoring, the constructions based on RSA might have other advantages. For example, under the assumption that a linear number of least-significant bits are simultaneously hard for RSA, we get pseudorandom synthesizers with linear output length. In addition, the constructions based on RSA, and their proof of security, use several interesting ideas that might be useful elsewhere.

The constructions of NC pseudorandom synthesizers imply constructions of NC pseudorandom functions (in fact, we get TC functions). Later we present more efficient, direct constructions of pseudorandom functions that are, in particular, in TC. These constructions are either based on the decisional Diffie-Hellman assumption or on factoring. Some of the ideas and tools used there are similar to those used here.

We first address some issues that are common to all constructions presented here.

The evaluation of our pseudorandom synthesizers in NC relies on a preprocessing stage. This stage can be performed as part of the (sequential) key-generating algorithm. In this idea we follow the work of Kearns and Valiant; in their context the additional data is forced into the input, whereas in our context it is added to the key.

The analysis of the parallel-time complexity of the synthesizers uses previous results on the parallel-time complexity of arithmetic operations (see Karp and Ramachandran for a survey). In particular, we use the result of Beame, Cook and Hoover: they showed that iterated multiplication (multiplying n numbers of length n) and additional related operations can be performed by log-depth circuits (these circuits can be constructed efficiently, though sequentially). These results enable the computation of modular exponentiation in NC, given preprocessing that only depends on the base. This follows from the fact that computing b^e mod N is reduced to an iterated multiplication and an additional modular reduction, given the values b^{2^i} mod N for i up to the length of e.
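The reduction in the last sentence can be sketched directly: precompute the repeated squarings of the base (this is the preprocessing, depending only on b and N), then b^e mod N is the product of the precomputed powers selected by the bits of e. The sequential sketch below computes that product in a loop; in NC it would be a single iterated multiplication in the sense of Beame, Cook and Hoover:

```python
def preprocess(b: int, N: int, ebits: int):
    """Precompute b^(2^i) mod N for 0 <= i < ebits (depends only on the base)."""
    powers = []
    cur = b % N
    for _ in range(ebits):
        powers.append(cur)
        cur = cur * cur % N
    return powers

def modexp_with_table(powers, e: int, N: int) -> int:
    """b^e mod N as one product over the precomputed powers selected by e's bits."""
    factors = [p for i, p in enumerate(powers) if (e >> i) & 1]
    result = 1
    for p in factors:            # in NC: one iterated multiplication,
        result = result * p % N  # then one modular reduction
    return result

N, b, e = 1000003, 7, 123456
table = preprocess(b, N, e.bit_length())
assert modexp_with_table(table, e, N) == pow(b, e, N)
```

The same table serves every exponent of at most the precomputed length, which is exactly why the preprocessing can live in the key.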

The pseudorandom synthesizers constructed in this section are Boolean functions. Two methods for expanding the output length of pseudorandom synthesizers were shown earlier. The method of the relevant lemma requires a pseudorandom generator that expands the input by a factor of two. A natural choice for this purpose, in the case of the synthesizers described in this section, is the pseudorandom generator of Blum, Blum and Shub, or the one of Hastad, Schrift and Shamir. Given appropriate preprocessing, both generators can be computed in NC, and their security is based on the assumption that factoring integers (Blum integers in the former case) is hard.

We note that all the constructions of this section give collections of pseudorandom synthesizers. However, the security of these synthesizers does not rely on keeping their key private. As discussed above, this allows us to use a single synthesizer at all the levels of both constructions.

Common Tools

In our constructions we use the result of Goldreich and Levin, which gives a hard-core predicate for any one-way function.

Theorem (Goldreich-Levin). Let f be any one-way function. For every probabilistic polynomial-time algorithm A, for every polynomial p, and all sufficiently large n,

  Pr[A(f(x), r) = r ⊙ x] < 1/2 + 1/p(n),

where x and r are independently drawn from U_n (recall that r ⊙ x denotes the inner product mod 2 of r and x).

In fact, we use their result in a slightly different context. Loosely speaking, if given f(x) it is hard to compute g(x) with non-negligible probability, then given f(x) it is also hard to guess g(x) ⊙ r with non-negligible advantage over 1/2. To verify that the Goldreich-Levin result applies in this context, let us state a main technical lemma regarding the Goldreich-Levin-Rackoff reconstruction algorithm. (The Goldreich-Levin theorem is constructive: it enables the reconstruction of x given an algorithm for guessing r ⊙ x; the reconstruction algorithm is due to Rackoff.)

Lemma. There exists a probabilistic oracle machine M such that, for any x ∈ {0,1}^n, any 0 < ε < 1 and any probabilistic machine B_x, if

  Pr_{r∈U_n}[B_x(r) = r ⊙ x] ≥ 1/2 + ε,

then M^{B_x}(n, ε):

1. runs in time poly(n/ε);
2. makes O(n/ε²) queries to B_x; and
3. outputs a list L of O(n/ε²) values such that Pr[x ∈ L] ≥ 1/2.

(In fact, all the synthesizers of this section can be made to output a logarithmic number of bits. Furthermore, given stronger assumptions, they may output an even larger number of bits.)

Given this lemma, it is not hard to verify that if there exists an efficient machine that, on input f(x), guesses g(x) ⊙ r with non-negligible advantage over 1/2, then there also exists an efficient machine M' that, on input f(x), computes g(x) with non-negligible probability (by first producing polynomially many candidates for g(x) and then choosing among them at random). In general, M' may not be able to verify whether it got the right value g(x), unlike the case where g(x) = x (i.e., where f is a one-way function). However, in the context of the Diffie-Hellman assumption, Shoup has shown how to increase the certainty of a machine such as M', using the random self-reducibility of the problem.

In addition to the Goldreich-Levin result, the proof of security for all the constructions uses the next-bit prediction tests of Blum and Micali. The equivalence between pseudorandom ensembles and ensembles that pass all polynomial-time next-bit tests was shown by Yao.

The Diffie-Hellman Assumption

We now define a collection of pseudorandom synthesizers that are at least as secure as the computational Diffie-Hellman assumption (DH-Assumption). This assumption was introduced in the seminal paper of Diffie and Hellman, as a requirement for the security of their key-exchange protocol. The validity of the DH-Assumption was studied quite extensively over the last two decades; we mention a few notable representatives of this research. Maurer and Wolf, and Boneh and Lipton, have shown that in several settings the DH-Assumption is equivalent to the assumption that computing the discrete log is hard. In particular, for any specific prime P there is an efficient reduction (given some information that only depends on P) of the discrete-log problem in Z*_P to the DH-Problem in Z*_P. Shoup has shown that the DH-Assumption holds against what he calls generic algorithms. See below for more details on these results and on the DH-Assumption in general.

For concreteness, we state the DH-Assumption in the group Z*_P, where P is a prime. However, our construction works just as well given the DH-Assumption in other groups. We use this fact later to get pseudorandom synthesizers that are at least as secure as factoring. In order to formalize the DH-Assumption in Z*_P, we need to specify the distribution of P. One possible choice is to let P be a uniformly distributed prime of a given length. However, there are other possible choices; for example, it is not inconceivable that P can be fixed for any given length. As before, we keep our results general by letting P be generated by some polynomial-time algorithm IG_DH (where IG stands for "instance generator").

Definition (IG_DH). The Diffie-Hellman instance generator, IG_DH, is a probabilistic polynomial-time algorithm such that, on input 1^n, the output of IG_DH is distributed over n-bit primes.

In addition, we need to specify the distribution of a generator g of Z*_P. It can be shown that if the DH-Assumption holds for some distribution of g, then it also holds if we let g be a uniformly distributed generator of Z*_P (since there exists a simple randomized reduction of the DH-Problem for any g to the DH-Problem with a uniformly distributed g).

All exponentiations in the rest of this subsection are in Z*_P (the definition of P will be clear from the context). To simplify the notation, we omit the expression "mod P" from now on. We can now formally state the DH-Assumption for the given instance generator IG_DH.

Assumption (Diffie-Hellman). For every probabilistic polynomial-time algorithm A, for every polynomial q, and all sufficiently large n,

  Pr[A(P, g, g^a, g^b) = g^{ab}] < 1/q(n),

where the distribution of P is IG_DH(1^n), the distribution of g is uniform over the set of generators of Z*_P, and a and b are independently drawn from U_n.

n

Based on this assumption we dene a collection of I I pseudorandom synthesizers

S

DH

n

Denition For every nbit prime P every generator g of Z and every r I

P

n

dene s I I by

Pgr

def

n xy

x y I s x y g r

Pgr

Let S to be the random variable that assumes as values the functions s where the

n Pgr

n

distribution of P is IG the distribution of g is uniform over the set of generators of

DH

and the distribution of r is U The function ensemble S is dened to be fS g Z

n DH n nN

P

Note that in Section we sho w a direct and ecient construction of ndimensional

pseudorandom synthesizers based on the stronger decisional version of the DHAssumption

which gives very ecient pseudorandom functions
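The synthesizer s_{P,g,r} is easy to state in code. The parameters below (P = 11, g = 2) are tiny toy values chosen only to make the arithmetic visible; a real instance uses an n-bit prime from IG_DH:

```python
# Toy instantiation of s_{P,g,r}(x, y) = g^{x*y} mod P, inner-producted with r.
P = 11          # tiny illustrative prime; NOT secure
g = 2           # a generator of Z*_11 (2 has order 10 mod 11)
r = 0b0110      # the random string for the Goldreich-Levin bit

def ip2(u: int, v: int) -> int:
    # Inner product mod 2 of the bit strings u and v.
    return bin(u & v).count("1") % 2

def s(x: int, y: int) -> int:
    # One Boolean output: <g^{xy} mod P, r> mod 2.
    return ip2(pow(g, x * y, P), r)

bit = s(3, 5)  # 2^15 mod 11 = 10 = 0b1010; ip2(0b1010, 0b0110) = 1
```

Note that s is symmetric in x and y (since x·y = y·x), which is the fact the security reduction exploits: the value can be computed from either (g^x, y) or (g^y, x).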

Theorem. If the DH-Assumption holds, then S_DH is a collection of pseudorandom synthesizers.

Proof. It is obvious that S_DH = {S_n}_{n∈N} is efficiently computable. Assume that S_DH is not a collection of pseudorandom synthesizers. Then there exists a polynomial m such that the ensemble E = {E_n} is not pseudorandom, where E_n = C_{S_n}(X, Y) for X and Y that are independently drawn from U_n^{m(n)}. Therefore, there exists an efficient next-bit prediction test T and a polynomial q such that, for infinitely many n, it holds that:

  Given a prefix of E_n of uniformly chosen length, T succeeds in predicting the next bit with probability greater than 1/2 + 1/q(n).

We now show how to use T in order to define an efficient algorithm A such that, for infinitely many n,

  Pr[A(P, g, g^a, g^b, r) = g^{ab} ⊙ r] > 1/2 + 1/q(n),

where the distribution of P, g, a and b is as in the DH-Assumption, and r is drawn from U_n. By the Goldreich-Levin result (in the form discussed above), this means that g^{ab} can also be efficiently computed, which contradicts the DH-Assumption and completes the proof.

y x

xy x y

In the denition of A we use the fact that in order to compute g g g it is

x y

enough to either know g and y or g and x ie it is not required to know b oth x and y

This enables A to dene a matrix which is distributed according to E such that one of its

n

ab

entries is g r the value A tries to guess and all other entries can b e computed by A It

ab

is now p ossible for A to guess g r byinvoking T on the appropriate prex of this matrix

a b

In more details on input hP g g g ri the algorithm A is dened as follows

Uniformly cho ose i j mn

Dene X fx x g and Y fy y g by setting x a y b and

mn mn i j

indep endently drawing all other values from U

n

mn

Dene B b to b e C X Y

uv s

uv

Pgr

Note that A knows all the values of X and Y except x and y Therefore A can

i j

ab

compute all the entries of B except b g r

ij

Invoke T and feed it with all the entries of B up to b ie the rst i rows and

ij

th

the rst j entries of the i row

Output T s prediction of b

ij

It is obvious that A is ecient Furthermore since the distribution of B is exactly E it

n

h i

a b ab

is immediate that for innitely many nPr AP g g g r g r where the

q n

distribution of P g a b and r is as ab ove This contradicts the DHAssumption and proves

the lemma
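The key fact the reduction relies on — that every matrix entry other than b_{i,j} is computable from g^a, g^b and self-chosen exponents — can be sanity-checked in code. The following sketch uses hypothetical toy parameters and checks the computed entries against the "real" matrix.

```python
# Sanity check (toy, insecure parameters): every entry g^{x_u * y_v} of the
# matrix with (u, v) != (i, j) is computable from the public values g^a, g^b
# and the exponents the algorithm chose itself.
import random

P, g = 11, 2
random.seed(0)

m = 3                                   # matrix dimension m(n), tiny here
i, j = 1, 2                             # position of the embedded challenge
a = random.randrange(1, P - 1)
b = random.randrange(1, P - 1)
ga, gb = pow(g, a, P), pow(g, b, P)     # the DH challenge values

X = [random.randrange(1, P - 1) for _ in range(m)]
Y = [random.randrange(1, P - 1) for _ in range(m)]
# Implicitly x_i = a and y_j = b; below, a and b are used only through ga, gb.


def entry(u: int, v: int) -> int:
    if u == i and v == j:
        raise ValueError("the challenge entry g^{ab} cannot be computed directly")
    if u == i:                          # row i: use g^a and the known exponent y_v
        return pow(ga, Y[v], P)
    if v == j:                          # column j: use g^b and the known exponent x_u
        return pow(gb, X[u], P)
    return pow(g, X[u] * Y[v], P)


# Consistency with the real matrix on all computable entries:
for u in range(m):
    for v in range(m):
        if (u, v) != (i, j):
            xu = a if u == i else X[u]
            yv = b if v == j else Y[v]
            assert entry(u, v) == pow(g, xu * yv, P)
```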

Corollary: If the DH-Assumption holds, then there exist pseudo-random functions that are computable in NC, given a sequential precomputation which is part of the key-generating algorithm.

Proof: By the theorem above, given that the DH-Assumption holds, S_DH is a collection of pseudo-random synthesizers. If the key-generating algorithm precomputes g^{2^i} mod P for 0 ≤ i ≤ 2n, then the functions of S_DH can be evaluated in NC: this precomputation reduces any modular exponentiation with g as the base to an iterated multiplication and an additional modular reduction (see also the discussion at the beginning of this section). By the corollary on constructing pseudo-random functions from synthesizers, there exist pseudo-random functions in NC (the key-generating algorithm in both cases is sequential).
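The precomputation in the proof can be sketched as follows (toy parameters; the table of powers g^{2^i} is what the key-generating algorithm would store sequentially).

```python
# Sketch of the precomputation: once g^{2^i} mod P is stored for every bit
# position i, computing g^e mod P becomes one iterated product of precomputed
# values selected by the bits of e (toy, insecure parameters).

P, g, bits = 11, 2, 8
table = [pow(g, 2 ** i, P) for i in range(bits)]   # sequential precomputation


def exp_via_table(e: int) -> int:
    """g^e mod P as a product of precomputed powers selected by the bits of e."""
    result = 1
    for i in range(bits):
        if (e >> i) & 1:
            result = (result * table[i]) % P
    return result


assert all(exp_via_table(e) == pow(g, e, P) for e in range(2 ** bits))
```

The iterated product (and the final modular reduction) is exactly the part that can be carried out in shallow depth, which is why the exponentiation itself no longer forms a sequential bottleneck at evaluation time.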

Remark: Assume that IG_DH(1^n) has a single possible value P_n for every n. Then S_DH can be transformed into a synthesizer (rather than a collection of synthesizers). In this case, the key-generating algorithm of the pseudo-random functions we get is in non-uniform NC.

Composite Diffie-Hellman Assumption and Factoring

The collection of pseudo-random synthesizers S_DH is at least as secure as the DH-Assumption modulo a prime. As mentioned above, the DH-Assumption in any other group gives a corresponding construction of pseudo-random synthesizers with practically the same proof of security. We now consider the DH-Assumption modulo a Blum-integer (the composite DH-Assumption). McCurley and Shmuely have shown that the composite DH-Assumption is implied by the assumption that factoring Blum-integers is hard (see also the relevant section of this thesis for definitions and a proof that are more consistent with our setting). We therefore get a simple construction of pseudo-random synthesizers which is at least as secure as Factoring. In what follows, we give the relevant definitions and claims (for better readability, we repeat some earlier definitions); we omit the proofs, since they are practically the same as those given above.

To formalize the composite DH-Assumption, we let this composite be generated by some polynomial-time algorithm, FIG. We restrict the output of FIG to the set of Blum-integers. This restriction is quite standard, and it is meant to simplify the reduction of the composite DH-Assumption to factoring.

Definition (FIG): The factoring instance generator, FIG, is a probabilistic polynomial-time algorithm such that on input 1^n its output, N, is distributed over 2n-bit integers, where N = P·Q for two n-bit primes P and Q such that P ≡ Q ≡ 3 (mod 4) (such an integer is known as a Blum-integer).

We note that it is essential for FIG(1^n) to have a super-polynomial number of possible values, since otherwise factoring would be non-uniformly easy (in this respect it is very different from the case of the Diffie-Hellman instance generator, IG_DH). One natural distribution of FIG(1^n) that has this property is the uniform distribution over 2n-bit Blum-integers.
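A toy sketch of such a generator is given below. For readability it draws the primes from a small fixed range rather than among random n-bit primes, so it is illustrative only.

```python
# Toy sketch of FIG: output N = P*Q for primes P ≡ Q ≡ 3 (mod 4),
# i.e., a (small, insecure) Blum-integer.
import random


def is_prime(p: int) -> bool:
    return p > 1 and all(p % d for d in range(2, int(p ** 0.5) + 1))


def fig(seed: int = 0):
    """Return (P, Q, N) with N = P*Q a toy Blum-integer."""
    rng = random.Random(seed)
    candidates = [p for p in range(100, 1000) if is_prime(p) and p % 4 == 3]
    P = rng.choice(candidates)
    Q = rng.choice([q for q in candidates if q != P])
    return P, Q, P * Q


P, Q, N = fig()
assert P % 4 == 3 and Q % 4 == 3 and N == P * Q
```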

All exponentiations in the rest of this subsection are in Z*_N (the definition of N will be clear by the context). To simplify the notation, we omit the expression "mod N" from now on. We can now define both the composite DH-Assumption and the assumption that factoring Blum-integers is hard for the instance generator FIG:

Assumption (Composite Diffie-Hellman): For every probabilistic polynomial-time algorithm A, for every polynomial q(·), and all sufficiently large n,

    Pr[ A(N, g, g^a, g^b) = g^{ab} ] < 1/q(n),

where the distribution of N is FIG(1^n), the distribution of g is uniform over the set of quadratic-residues in Z*_N, and the distribution of ⟨a, b⟩ is U_{4n}.

Assumption (Factoring): For every probabilistic polynomial-time algorithm A, for every polynomial q(·), and all sufficiently large n,

    Pr[ A(P·Q) ∈ {P, Q} ] < 1/q(n),

where the distribution of N = P·Q is FIG(1^n).

We define a collection of I_n × I_n → I_1 pseudo-random synthesizers, S_F, in analogy to the definition of S_DH:

Definition: For every 2n-bit Blum-integer N, every quadratic-residue g in Z*_N, and every r ∈ I_{2n}, define s_{N,g,r}: I_n × I_n → I_1 by

    ∀x, y ∈ I_n,   s_{N,g,r}(x, y) := ⟨g^{x·y}, r⟩

Let S_n be the random variable that assumes as values the functions s_{N,g,r}, where the distribution of N is FIG(1^n), the distribution of g is uniform over the set of quadratic-residues in Z*_N, and the distribution of r is U_{2n}. The function ensemble S_F is defined to be {S_n}_{n∈N}.

In the same way the theorem for S_DH was proven, we get that:

Theorem: If the composite DH-Assumption holds, then S_F is a collection of I_n × I_n → I_1 pseudo-random synthesizers.

As shown by McCurley and Shmuely (and in a theorem of this thesis), the composite DH-Assumption is implied by the factoring assumption. We therefore get that:

Corollary (of the previous theorem): If factoring Blum-integers is hard, then S_F is a collection of I_n × I_n → I_1 pseudo-random synthesizers.

Finally, we can conclude that:

Corollary: If factoring Blum-integers is hard, then there exist pseudo-random functions that are computable in NC, given a sequential precomputation which is part of the key-generating algorithm.

The RSA Assumption

We now define two collections of pseudo-random synthesizers under the assumption that the RSA-permutations of Rivest, Shamir and Adleman are indeed one-way, i.e., under the assumption that it is hard to extract roots modulo a composite. This assumption is not weaker than the factoring assumption. However, the constructions based on RSA might have other advantages. For example, the second RSA-construction gives pseudo-random synthesizers with linear output length, under the assumption that n least-significant bits are simultaneously hard for RSA. Another reason to include these constructions is that they use several interesting techniques that might be useful elsewhere.

As was the case with the previous assumptions, we keep the definition of the RSA-Assumption general by using some polynomial-time instance generator, IG_RSA (our constructions of synthesizers, however, will impose further conditions on IG_RSA).

Definition (IG_RSA): The RSA instance generator, IG_RSA, is a probabilistic polynomial-time algorithm such that on input 1^n its output is distributed over pairs ⟨N, e⟩, where N = P·Q is a 2n-bit integer, P and Q are two n-bit primes, and e ∈ Z*_{φ(N)} (i.e., e is relatively prime to the order of Z*_N, which is denoted by φ(N) = |Z*_N|).

All exponentiations in the rest of this subsection are in Z*_N (the definition of N will be clear by the context). To simplify the notation, we omit the expression "mod N" from now on. We can now define the RSA-Assumption for the given instance generator, IG_RSA:

Assumption (RSA): For every probabilistic polynomial-time algorithm A, for every polynomial q(·), and all sufficiently large n,

    Pr[ A(N, e, m^e) = m ] < 1/q(n),

where the distribution of ⟨N, e⟩ is IG_RSA(1^n) and m is uniformly distributed in Z*_N.

The RSA-Assumption gives a collection of trapdoor one-way permutations: the public key is ⟨N, e⟩, the function f_{N,e} is defined by f_{N,e}(m) := m^e, and the trapdoor-key is ⟨N, e, d⟩, which enables efficient inversion by the formula m = (m^e)^d, where d = e^{-1} mod φ(N).
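The inversion formula can be checked on a toy instance (insecure parameters, for illustration only):

```python
# Toy RSA trapdoor permutation and its inversion with d = e^{-1} mod phi(N).

P, Q = 11, 19
N = P * Q
phi = (P - 1) * (Q - 1)        # the order of Z*_N
e = 7                          # gcd(7, 180) = 1, so e is a valid exponent
d = pow(e, -1, phi)            # trapdoor: inverse of e modulo phi(N) (Python 3.8+)

m = 42
c = pow(m, e, N)               # the public permutation f_{N,e}(m) = m^e mod N
assert pow(c, d, N) == m       # inversion: (m^e)^d = m
```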

In an earlier section we showed a general construction of pseudo-random synthesizers out of trapdoor one-way permutations. However, a straightforward application of this construction to the RSA collection gives very inefficient synthesizers. In the following few paragraphs we describe this construction, the reasons it is inefficient, and some of the ideas and tools that allow us to get more efficient synthesizers (which are also computable in NC).

Applying the general construction to the RSA collection (using the Goldreich-Levin hard-core predicate) gives the following collection of synthesizers: The key of each synthesizer is a uniformly distributed string r, and for every x, y ∈ I_n, s_r(x, y) := ⟨m^d, r⟩, where x samples the trapdoor-key ⟨N, e, d⟩ and y samples a uniformly chosen element m ∈ Z*_N. The most obvious drawback of this definition is that computing the value s_r(x, y) consists of sampling an RSA trapdoor-key. In particular, computing s_r(x, y) consists of sampling a Blum-integer N. This rather heavy operation might be acceptable as part of the key-generating algorithm of the pseudo-random synthesizers (or functions), but is extremely undesirable as part of their evaluation.

In the direct constructions of pseudo-random synthesizers based on RSA, we manage to "push" the composite N into the key of the synthesizers, thus overcoming the drawback described in the previous paragraph. Nevertheless, we are still left with the following problem: computing m^d in NC requires precomputation that depends on m. To enable this precomputation, it seems that m needs to be part of the key as well. However, in the construction described above, m depends on the input (and is uniformly distributed for a random input). In order to overcome this problem, we show a method of sampling m (almost) uniformly at random in a way that facilitates the necessary preprocessing. This method uses the subset product functions. We first define these functions and then describe the way they are used in our context.

Definition (subset product): Let G be a finite group and let ȳ = ⟨y_1, ..., y_n⟩ be an n-tuple of elements in G. For any n-bit string x = x_1 x_2 ⋯ x_n, define the subset product SP_{G,ȳ}(x) to be the product in G of the elements y_i such that x_i = 1.
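The definition translates directly into code. Below is a minimal sketch for G = Z*_N with a toy modulus; bit i of the input selects whether y_i enters the product.

```python
# The subset product function SP_{G, y}(x): multiply, in the group G, exactly
# those y_i whose input bit x_i is 1. Here G = Z*_35 (toy parameters).

N = 35
ys = [2, 3, 4, 9, 11, 13]      # an n-tuple of elements of Z*_35 (n = 6)


def subset_product(x: int) -> int:
    """SP(x) for an n-bit input x; bit i of x selects ys[i]."""
    result = 1
    for i, y in enumerate(ys):
        if (x >> i) & 1:
            result = (result * y) % N
    return result


assert subset_product(0) == 1                     # empty product is the identity
assert subset_product(0b000011) == (2 * 3) % 35   # selects y_1 and y_2
```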

The following lemma was shown by Impagliazzo and Naor, and is based on the leftover hash lemma:

Lemma: Let G be a finite group, let c > 1 be a constant, and let n = c·log|G|. Then for all but an exponentially small (in n) fraction of the choices of ȳ ∈ G^n, the distribution SP_{G,ȳ}(U_n) is statistically indistinguishable (within an exponentially small amount) from the uniform distribution over G.

Let N be a 2n-bit integer. The lemma above gives a way of defining a collection of functions {f_ḡ: I_{4n} → Z*_N} which solves the problem of sampling an (almost) uniformly distributed element m ∈ Z*_N for which m^d can be computed in NC. This collection is {f_ḡ} = {SP_{Z*_N,ḡ}}, where ḡ = ⟨g_1, ..., g_{4n}⟩ is a sequence of 4n elements in Z*_N. The functions {f_ḡ} have the following properties:

1. For almost all choices of the key ḡ, we have that f_ḡ(U_{4n}) is almost uniformly distributed in Z*_N.

2. Following preprocessing that depends only on the key ḡ, each value (f_ḡ(x))^y can be computed in NC. The values that need to be precomputed are g_i^{2^j}, where 1 ≤ i ≤ 4n and 1 ≤ j ≤ |y| (the length of y). With these values, the computation of (f_ḡ(x))^y is reduced to a single iterated multiplication and an additional modular reduction.
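Property 2 rests on the identity (∏_{x_i=1} g_i)^y = ∏_{x_i=1} ∏_{y_j=1} g_i^{2^j}, so once the powers g_i^{2^j} are stored, only one big product remains. A toy sketch:

```python
# Sketch of the preprocessing from property 2 (toy, insecure parameters):
# storing g_i^{2^j} mod N reduces (SP(x))^y to a single product of
# precomputed values.

N = 187                                   # 11 * 17, a toy modulus
gs = [2, 3, 5, 7]                         # the key: elements of Z*_N
ybits = 6                                 # length of the exponent y

# Sequential precomputation: g_i^(2^j) mod N for every i and bit position j.
table = [[pow(g, 2 ** j, N) for j in range(ybits)] for g in gs]


def sp_pow(x: int, y: int) -> int:
    """(SP_{Z*_N, gs}(x))^y mod N, using only table lookups and multiplication."""
    result = 1
    for i in range(len(gs)):
        if (x >> i) & 1:
            for j in range(ybits):
                if (y >> j) & 1:
                    result = (result * table[i][j]) % N
    return result


# Agreement with the direct computation:
x, y = 0b1011, 0b100101
sp = 1
for i in range(len(gs)):
    if (x >> i) & 1:
        sp = (sp * gs[i]) % N
assert sp_pow(x, y) == pow(sp, y, N)
```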

The First RSA Construction

For our first RSA construction, we need to assume that it is hard to extract the e-th root modulo a composite when e is a large prime. To formalize this, we assume that for every possible value ⟨N, e⟩ of IG_RSA(1^n), we have that e is a (2n+1)-bit prime (which in particular means that e ∈ Z*_{φ(N)}). Based on this version of the RSA-Assumption, we define a collection of I_{4n} × I_{4n} → I_1 pseudo-random synthesizers, S_RSA:

Definition: Let N be a 2n-bit integer, let ḡ = ⟨g_1, ..., g_{4n}⟩ be a sequence of 4n elements in Z*_N, and let r be a 2n-bit string. Define the function s_{N,ḡ,r}: I_{4n} × I_{4n} → I_1 by

    ∀x, y ∈ I_{4n},   s_{N,ḡ,r}(x, y) := ⟨(SP_{Z*_N,ḡ}(x))^y, r⟩

Let S_n be the random variable that assumes as values the functions s_{N,ḡ,r}, where the distribution of N is induced by IG_RSA(1^n) and ḡ and r are uniformly distributed in their range. The function ensemble S_RSA is defined to be {S_n}_{n∈N}.

Note that the only reason we let y be a 4n-bit number (rather than a shorter one) is to make both inputs of s_{N,ḡ,r} be of the same length, which is not really necessary for our constructions.
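A toy version of s_{N,ḡ,r} (insecure parameters, for illustration only) combines the subset product, the exponentiation, and the Goldreich-Levin bit:

```python
# Toy version of s_{N,g,r}(x, y) = <(SP(x))^y, r>: a subset product of the key
# elements, raised to the power y, fed to the GL inner-product bit.

N = 187
gs = [2, 3, 5, 7, 8, 9, 10, 12]           # key: a tuple of elements of Z*_187
r = 0b10110101                            # key: the Goldreich-Levin string


def gl_bit(z: int) -> int:
    return bin(z & r).count("1") % 2


def s(x: int, y: int) -> int:
    sp = 1
    for i, g in enumerate(gs):
        if (x >> i) & 1:
            sp = (sp * g) % N
    return gl_bit(pow(sp, y, N))


assert s(0b1010, 5) in (0, 1)
assert s(0, 9) == gl_bit(1)               # empty subset product is 1
```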

Theorem: If the RSA-Assumption holds when, for every possible value ⟨N, e⟩ of IG_RSA(1^n), we have that e is a (2n+1)-bit prime, then S_RSA is a collection of I_{4n} × I_{4n} → I_1 pseudo-random synthesizers.

Proof: It is obvious that S_RSA = {S_n}_{n∈N} is efficiently computable. Assume that S_RSA is not a collection of pseudo-random synthesizers. Then there exists a polynomial m(·) such that the ensemble E = {E_n} is not pseudo-random, where E_n = C_{S_n}(X, Y) for X and Y that are independently drawn from U_{m(n)·4n}. Therefore, there exists an efficient next-bit prediction test T and a polynomial q(·) such that for infinitely many n it holds that:

    Given a prefix of E_n of uniformly chosen length, T succeeds in predicting the next bit with probability greater than 1/2 + 1/q(n).

We now show how to use T in order to define an efficient algorithm A such that for infinitely many n,

    Pr[ A(N, e, m^e, z, r) = ⟨m^z, r⟩ ] > 1/2 + 1/q(n),

where the distribution of N, e and m is as in the RSA-Assumption (with the restriction that e is a (2n+1)-bit prime), r is drawn from U_{2n}, and z is uniformly distributed over the set of 4n-bit integers that are relatively prime to e. By the theorem of Goldreich and Levin, this means that m^z can also be efficiently computed (with non-negligible probability). Following Shamir, we note that given any z such that gcd(e, z) = 1 and given m^z, it is easy to compute m. The reason is that if gcd(e, z) = 1, then m can be computed by the formula m = (m^e)^a · (m^z)^b, where a, b ∈ Z satisfy ae + bz = 1 (and can be efficiently computed as well). Thus, the existence of such an algorithm A contradicts the RSA-Assumption and completes the proof of the lemma.

The algorithm A defines a matrix B which is almost identically distributed as E_n. One of the entries of B is ⟨m^z, r⟩ (the value A tries to guess) and all other entries can be computed by A. It is now possible for A to guess ⟨m^z, r⟩ by invoking T on the appropriate prefix of this matrix. In more detail, on input ⟨N, e, m^e, z, r⟩, the algorithm A is defined as follows:

1. Uniformly choose i, j ∈ {1, ..., m(n)}.

2. Define the values {h_1, ..., h_{m(n)}} and {d_1, ..., d_{m(n)}} by setting h_i = m^e; defining every other h_u to be m_u^e for an independently and uniformly drawn m_u ∈ Z*_N (since raising to the power e permutes Z*_N, each such h_u is uniformly distributed, and A knows its e-th root m_u); setting d_j = z·e^{-1} mod φ(N) (a value A never needs to compute explicitly); and drawing all other d_v's from U_{4n}.

3. Define B = (b_{u,v})_{u,v=1}^{m(n)} by setting b_{u,v} = ⟨h_u^{d_v}, r⟩.

   Note that A can compute every entry b_{u,v} except b_{i,j} = ⟨m^z, r⟩: if v ≠ j, then A knows both d_v and h_u; and if u ≠ i, then A can compute b_{u,j} = ⟨(h_u^{e^{-1}})^z, r⟩ = ⟨m_u^z, r⟩, since it knows both h_u^{e^{-1}} = m_u and z.

4. Invoke T and feed it with all the entries of B up to b_{i,j} (i.e., the first i−1 rows and the first j−1 entries of the i-th row).

5. Output T's prediction of b_{i,j}.

It is obvious that A is efficient. In order to complete the proof, we need to show that if ⟨N, e, m^e, z, r⟩ are distributed as above, then B and E_n are of exponentially small statistical distance. This would imply that, for infinitely many n, if we feed T with the bits of B up to b_{i,j}, it predicts b_{i,j} = ⟨m^z, r⟩ with probability greater than, say, 1/2 + 1/(2q(n)). As argued above, this would contradict the RSA-Assumption and would complete the proof.

To see that B and E_n are indeed statistically close, notice that:

1. Since e ∈ Z*_{φ(N)} and, for every 1 ≤ u ≤ m(n), the value m_u is uniformly distributed in Z*_N, we have that, for every 1 ≤ u ≤ m(n), the value m_u^e is also uniformly distributed in Z*_N. By the leftover-hash lemma above, we therefore have that the distribution of {m_u^e}_{1≤u≤m(n)} is statistically close to the distribution of {SP_{Z*_N,ḡ}(x_u)}_{1≤u≤m(n)} for uniformly distributed values {x_1, ..., x_{m(n)}} ⊂ I_{4n} and ḡ = ⟨g_1, ..., g_{4n}⟩ ∈ (Z*_N)^{4n}.

2. For z that is chosen from U_{4n}, the distributions of z·e^{-1} mod φ(N) and U_{4n} mod φ(N) are statistically close. Since e is a large prime, this remains true even given the restriction that z is relatively prime to e.

Given these two observations, it is easy to verify that B and E_n are indeed of exponentially small statistical distance.
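The gcd trick attributed to Shamir in the proof above is easy to verify concretely: from m^e and m^z with gcd(e, z) = 1, one recovers m as (m^e)^a · (m^z)^b for ae + bz = 1 (toy numbers below; negative exponents in `pow` invert modulo N, Python 3.8+).

```python
# Shamir's gcd trick: recover m from m^e and m^z when gcd(e, z) = 1.
from math import gcd

N, m = 187, 23
e, z = 7, 5
assert gcd(e, z) == 1


def ext_gcd(p: int, q: int):
    """Return (a, b) with a*p + b*q = gcd(p, q) (extended Euclid)."""
    if q == 0:
        return 1, 0
    a, b = ext_gcd(q, p % q)
    return b, a - (p // q) * b


a, b = ext_gcd(e, z)
assert a * e + b * z == 1

me, mz = pow(m, e, N), pow(m, z, N)
# pow with a negative exponent computes a modular inverse (Python 3.8+).
recovered = (pow(me, a, N) * pow(mz, b, N)) % N
assert recovered == m
```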

Claim: The functions in S_RSA can be evaluated in NC, given a sequential precomputation which is part of the key-generating algorithm.

Proof: Given that the key-generating algorithm precomputes g_i^{2^j} for 1 ≤ i, j ≤ 4n, the evaluation of functions in S_RSA is reduced to an iterated multiplication and an additional modular reduction.

The Second RSA Construction

The security of S_RSA depends on the assumption that it is hard to extract the e-th root modulo a composite, where e is a large prime. Here we define another collection of synthesizers, under the assumption that it is hard to extract the e-th root modulo a composite N without any restriction on the distribution of e ∈ Z*_{φ(N)}. However, we introduce a new restriction on the possible values of the composite N:

Definition: Let G_n be the set of 2n-bit integers N = P·Q such that P and Q are two n-bit primes and φ(N) has no odd factor smaller than 2^{n/4}.

It is easy to verify that if N ∈ G_n, then a sequence of 4n uniformly chosen odd values d̄ = ⟨d_1, ..., d_{4n}⟩ has a constant probability of being contained in Z*_{φ(N)} (indeed, this probability is 1 − o(1)). By the leftover-hash lemma above, given such a sequence it is easy to (almost) uniformly sample any polynomial number of values in Z*_{φ(N)}, even without knowledge of φ(N).¹ This can be done by using the subset product function SP_{Z*_{φ(N)},d̄}. Notice that here the subset product function serves an additional role to the one already described above.

Sieve theory shows that G_n is not too sparse. For example, denote by B(x) the number of primes p smaller than x such that (p−1)/2 is the product of two primes, each of which is larger than p^{1/4}. Then there exists a positive constant c such that B(x) ≥ c·x/log²x; the literature on sieve methods contains several results of this sort, which are more than sufficient for our purpose. As a result, we get that: (a) if the RSA-assumption holds for a uniformly distributed value of N, then it also holds under the restriction N ∈ G_n; and (b) the uniform distribution over G_n can be efficiently sampled using Bach's algorithm. Given (a) and (b), it seems that this restriction is rather reasonable.

¹Provided that we allow the representation of values in Z*_{φ(N)} as large integers. That is, we let any integer v represent u = v mod φ(N). We note that such a value v is just as good as u for our proof, as long as v is bounded by poly(N). In particular, using such a representation, it is possible to compute SP_{Z*_{φ(N)},d̄} even without knowledge of φ(N).

Based on the RSA-Assumption with the restriction that N ∈ G_n, we define a collection of I_{4n} × I_{4n} → I_1 pseudo-random synthesizers, S'_RSA. In the definition of S'_RSA we use the least-significant bit (LSB) instead of the Goldreich-Levin hard-core bit: Alexi et al. showed that LSB is a hard-core bit for RSA, and Fischlin and Schnorr have recently provided a stronger reduction for this bit.

Definition: Let N be a 2n-bit integer, let ḡ = ⟨g_1, ..., g_{4n}⟩ be a sequence of 4n elements in Z*_N, and let d̄ = ⟨d_1, ..., d_{4n}⟩ be a sequence of 4n elements in Z*_{φ(N)}. Define the function s_{N,ḡ,d̄}: I_{4n} × I_{4n} → I_1 by

    ∀x, y ∈ I_{4n},   s_{N,ḡ,d̄}(x, y) := LSB( (SP_{Z*_N,ḡ}(x))^{SP_{Z*_{φ(N)},d̄}(y)} )

Let S_n be the random variable that assumes as values the functions s_{N,ḡ,d̄}, where the distribution of N is induced by IG_RSA(1^n) and ḡ and d̄ are uniformly distributed in their range. The function ensemble S'_RSA is defined to be {S_n}_{n∈N}.

Theorem: If the RSA-Assumption holds when, for every possible value ⟨N, e⟩ of IG_RSA(1^n), we have that N ∈ G_n, then S'_RSA is a collection of I_{4n} × I_{4n} → I_1 pseudo-random synthesizers.

Proof: It is obvious that S'_RSA = {S_n}_{n∈N} is efficiently computable. Assume that S'_RSA is not a collection of pseudo-random synthesizers. Then there exists a polynomial m(·) such that the ensemble E = {E_n} is not pseudo-random, where E_n = C_{S_n}(X, Y) for X and Y that are independently drawn from U_{m(n)·4n}. Therefore, there exists an efficient next-bit prediction test T and a polynomial q(·) such that for infinitely many n it holds that:

    Given a prefix of E_n of uniformly chosen length, T succeeds in predicting the next bit with probability greater than 1/2 + 1/q(n).

We now show how to use T in order to define an efficient algorithm A such that for infinitely many n,

    Pr[ A(N, e, m^e) = LSB(m) ] > 1/2 + 1/q(n),

where the distribution of N, e and m is as in the RSA-Assumption, with the restriction that N ∈ G_n. By the result of Alexi et al., this contradicts the RSA-Assumption and completes the proof of the lemma.

The basic idea in the definition of the algorithm A is similar to the proof of the previous theorem: A defines a matrix B which is almost identically distributed as E_n. One of the entries of B is LSB(m) (the value A tries to guess) and all other entries can be computed by A. It is now possible for A to guess LSB(m) by invoking T on the appropriate prefix of this matrix. In more detail, on input ⟨N, e, m^e⟩ as above, the algorithm A is defined as follows:

1. Uniformly choose i, j ∈ {1, ..., m(n)}.

2. Define e′ to be e·d, where d is almost uniformly distributed in Z*_{φ(N)} (such a value d can be sampled because N ∈ G_n). Note that e′ is almost uniformly distributed in Z*_{φ(N)}.

3. Define the values {h_1, ..., h_{m(n)}} and {d_1, ..., d_{m(n)}} by setting h_i = m^{e′} (which A can compute as (m^e)^d); defining every other h_u to be m_u^{e′} for an independently and uniformly sampled m_u ∈ Z*_N (so that h_u is almost uniformly distributed and A knows its e′-th root m_u); setting d_j = e′^{-1} mod φ(N) (a value A never needs to compute explicitly); and sampling all other d_v's almost uniformly from Z*_{φ(N)}.

4. Define B = (b_{u,v})_{u,v=1}^{m(n)} by setting b_{u,v} = LSB(h_u^{d_v}).

   Note that A can compute every entry b_{u,v} except b_{i,j} = LSB(m): if v ≠ j, then A knows both d_v and h_u; and if u ≠ i, then A can compute b_{u,j} = LSB(h_u^{e′^{-1}}) = LSB(m_u), since it knows h_u^{e′^{-1}} = m_u.

5. Invoke T and feed it with all the entries of B up to b_{i,j} (i.e., the first i−1 rows and the first j−1 entries of the i-th row).

6. Output T's prediction of b_{i,j}.

It is obvious that A is efficient. It is also easy to verify that if ⟨N, e, m^e⟩ is distributed as above, then B and E_n are of exponentially small statistical distance. Therefore, for infinitely many n, if we feed T with the bits of B up to b_{i,j}, it predicts b_{i,j} = LSB(m) with probability greater than, say, 1/2 + 1/(2q(n)). As argued above, this contradicts the RSA-Assumption and completes the proof of the lemma.

Claim: The functions in S'_RSA can be evaluated in NC, given a sequential precomputation which is part of the key-generating algorithm.

Proof: Given that the key-generating algorithm precomputes g_i^{2^j} (for every 1 ≤ i ≤ 4n and for every j up to the maximal possible bit-length of SP_{Z*_{φ(N)},d̄}(y) as an integer), the evaluation of functions in S'_RSA is reduced to two iterated multiplications and two modular reductions.

Remark: Since Alexi et al. showed that the log n least-significant bits are simultaneously hard for RSA, we can adjust the functions in S'_RSA to output log n bits. If we make a (stronger) assumption that n bits are simultaneously hard for RSA, we get a direct construction of pseudo-random synthesizers with linear output size. Although the stronger assumption is not known to be equivalent to the RSA-Assumption, it is still quite standard.

Pseudo-Randomness and Learning Theory

Synthesizers Based on Hard-to-Learn Problems

The traditional connection between cryptography and learning theory is using cryptographic assumptions to deduce computational non-learnability results. Blum, Furst, Kearns and Lipton have suggested that the other direction is interesting as well. They have shown how to construct several cryptographic primitives out of hard-to-learn functions, in a way that preserves the degree of parallelism of the functions. A major motivation for presenting such constructions is the simplicity of function classes that are believed to be hard for efficient learning.

We show that, under their definitions, pseudo-random synthesizers can easily be constructed from distributions of functions that are hard to learn. Thus, by our constructions, two additional cryptographic primitives can be constructed in parallel out of hard-to-learn functions: pseudo-random generators with large expansion ratio (without assuming, as Blum et al. do, that the functions are hard to learn with membership queries) and pseudo-random functions.

There is a difference between standard learning-theory definitions and standard cryptographic definitions. Loosely speaking, a collection of concepts is hard to learn if for every efficient algorithm there exists a distribution over the concepts that is hard to learn for this specific algorithm. In cryptographic settings, the order of quantifiers is reversed: the hard distribution should be hard for every efficient algorithm. In order for hard-to-learn problems to be useful in cryptographic settings, an average-case learning model was introduced by Blum et al. Informally describing one of their definitions, we can say that a distribution ensemble of functions F = {F_n}_{n∈N} is not weakly predictable on the average, with respect to a distribution D on the inputs, if the following holds: there is no efficient algorithm that can predict f(x) with probability 1/2 + 1/poly(n), given x and a polynomial sequence {⟨x_i, f(x_i)⟩}_i, where f is distributed according to F_n and all the inputs are independently distributed according to D.

It is easy to verify that a distribution ensemble of functions F is not weakly predictable on the average with respect to the uniform distribution if and only if it is a collection of weak pseudo-random functions. Thus, by an earlier lemma, such a distribution defines an I_n × I_n → I_1 pseudo-random synthesizer S, where S(x, y) is simply f_y(x) (recall that f_y denotes the function that is sampled from F_n using y as random bits). Using S we can construct pseudo-random generators and pseudo-random functions. Moreover, the pseudo-random generator we construct may have a large expansion ratio (n^k for every constant k). The pseudo-random generator constructed by Blum et al. under the same assumption has expansion ratio bounded above by n.
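The observation above is purely structural: the synthesizer is nothing but function evaluation with one input used as the key. A minimal sketch, in which the hard-to-learn family is replaced by a hypothetical stand-in keyed function for illustration only:

```python
# Sketch: a family {f_y} yields a synthesizer via S(x, y) = f_y(x), reading the
# second input y as the random bits used to sample the function. The function f
# below is a stand-in (NOT a weak pseudo-random function) used only to show the
# plumbing.
import hashlib


def f(key: int, x: int) -> int:
    """Stand-in for a function sampled from F_n with random bits `key`."""
    digest = hashlib.sha256(f"{key}:{x}".encode()).digest()
    return digest[0] & 1


def S(x: int, y: int) -> int:
    return f(y, x)          # the synthesizer is just function evaluation


assert S(3, 7) in (0, 1)
assert S(3, 7) == S(3, 7)   # deterministic in (x, y)
```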

A Concrete Hard-to-Learn Problem

Consider the following distribution on functions, with parameters k and n: each function is defined by a uniformly distributed pair of disjoint sets A, B ⊆ {1, ..., n}, each of size k. Given an n-bit input, the output of the function is the exclusive-or of two values: the parity of the bits indexed by A, and the majority of the bits indexed by B. Blum et al. estimate that these functions (for k = log n) cannot be weakly predicted without using "profoundly new ideas". If indeed this distribution of functions is not weakly predictable on the average for any k, then it defines an extremely efficient synthesizer. Therefore, using our constructions, we get rather efficient parallel pseudo-random functions.
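The candidate functions themselves are almost trivially simple to evaluate, which is the point of the proposal. A toy instance (small n and k, a fixed seed for the key):

```python
# The concrete candidate: a function keyed by disjoint sets A, B of size k;
# the output is parity(bits at A) XOR majority(bits at B). Toy parameters.
import random

n, k = 16, 4
rng = random.Random(1)
positions = rng.sample(range(n), 2 * k)
A, B = positions[:k], positions[k:]          # uniformly chosen disjoint sets


def f(x: int) -> int:
    parity = sum((x >> i) & 1 for i in A) % 2
    majority = 1 if sum((x >> i) & 1 for i in B) > k // 2 else 0
    return parity ^ majority


assert f(0) == 0                             # all-zero input: parity 0, majority 0
assert all(f(x) in (0, 1) for x in range(256))
```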

The Application of Pseudo-Random Functions to Learning Theory

As observed by Valiant, if a concept class contains pseudo-random functions, then we can deduce a very strong unlearnability result for this class. Informally, it means that there exists a distribution of concepts in this class that is hard for every learning algorithm, for every non-trivial distribution on inputs, even when membership queries are allowed. Since no parallel pseudo-random functions were known before the current work, this observation could not have been applied to NC.

Nevertheless, other techniques (based on cryptographic assumptions) were previously used to show hardness results for NC and TC. For example, Kharitonov used the following fact: after preprocessing, a polynomial-length pseudo-random bit-string can be produced in TC (the length of the string can stay undetermined at the preprocessing stage). The existence of pseudo-random functions in NC, as shown in this chapter under several assumptions, is still of interest to computational learning theory, because the result it implies is stronger than those previous results. To briefly state the difference, we note that some of those results use a very specific distribution on the inputs that is hard to learn, while others strongly rely on the order of quantifiers in learning-theory models that was mentioned above (e.g., for any given learning algorithm they show a different hard concept, which can still be easily learned by an algorithm that has a somewhat larger running-time).

Concrete Constructions of Pseudo-Random Functions

In this section we describe two related constructions of pseudo-random functions based on number-theoretic assumptions. The first construction gives pseudo-random functions if the decisional version of the Diffie-Hellman assumption (the DDH-Assumption) holds. The second construction is at least as secure as the assumption that factoring the so-called Blum-integers is hard.¹ Having efficient pseudo-random functions based on factoring is very desirable, since this is one of the most established concrete intractability assumptions used in cryptography. The construction based on the DDH-Assumption is also attractive, since these pseudo-random functions are even more efficient, in that they have a larger output size, and since the construction is linear-preserving (see the remark below). To better understand the security of our constructions, we include elsewhere in this thesis a study of the underlying assumptions. As discussed above, the constructions of this section are motivated by the parallel construction of the previous sections and, in particular, by the notion of k-dimensional synthesizers.

The pseudo-random functions that are constructed in this section are efficient (computing the value of a function at a given point is comparable with two modular exponentiations, which is much more efficient than previous proposals), have shallow depth (given appropriate preprocessing of the key, the value of the functions at any given point can be computed in TC), and have a simple algebraic structure. The properties of our pseudo-random functions and their consequences are discussed at length in the introduction.

The notation and definitions used in this section were given earlier in this thesis; in particular, we use the definitions of pseudo-randomness from there. In addition, we rely on the definitions of the relevant assumptions, mainly on the definition of the DDH-Assumption.

¹In fact, we prove the security of the second construction based on a generalized version of the computational DH-Assumption (the GDH-Assumption). However, breaking the GDH-Assumption modulo a composite would imply an efficient algorithm for factorization, as shown later in this thesis.

Organization

We first describe a construction of pseudo-random functions based on the DDH-Assumption, prove its security, and consider its complexity. We then define the GDH-Assumption and show a related construction of pseudo-random functions based on this assumption. Finally, we consider some of the possible features of our pseudo-random functions.

Construction of Pseudo-Random Functions

In this section we describe a construction of pseudo-random functions based on the DDH-Assumption, prove its security, and consider its complexity. A related construction (based on a weaker assumption) is described in a subsequent section.

Construction and Main Result

Construction . We define the function ensemble F = {F_n}_{n∈ℕ}. For every n, a key of a function in F_n is a tuple ⟨P, Q, g, ā⟩, where P is an n-bit prime, Q a prime divisor of P − 1, g an element of order Q in ℤ*_P, and ā = ⟨a_0, a_1, ..., a_n⟩ a sequence of n + 1 elements of ℤ_Q. For any n-bit input x = x_1 x_2 ··· x_n, the function f_{P,Q,g,ā} is defined by

    f_{P,Q,g,ā}(x) ≝ g^{a_0 · ∏_{x_i = 1} a_i}

The distribution of functions in F_n is induced by the following distribution on their keys: ā is uniform in its range and the distribution of ⟨P, Q, g⟩ is IG(1ⁿ) (see Definition ).
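As a concrete illustration of the construction above, here is a minimal Python sketch of key generation and evaluation. The parameters P = 23, Q = 11, g = 4 are toy values chosen only so the example runs; they offer no security.

```python
import secrets

def keygen(P, Q, g, n):
    # key <P, Q, g, a> with a = <a_0, a_1, ..., a_n> uniform in Z_Q
    return (P, Q, g, [secrets.randbelow(Q) for _ in range(n + 1)])

def f(key, x_bits):
    """f_{P,Q,g,a}(x) = g^(a_0 * prod_{x_i = 1} a_i), exponent reduced mod Q."""
    P, Q, g, a = key
    y = a[0]
    for a_i, bit in zip(a[1:], x_bits):
        if bit:
            y = (y * a_i) % Q      # the single multiple product, mod Q
    return pow(g, y, P)            # the single modular exponentiation

# g = 4 has order Q = 11 in Z*_23 (11 divides 23 - 1 = 22):
key = (23, 11, 4, [3, 5, 7, 2, 9])
assert f(key, [1, 0, 1, 0]) == pow(4, (3 * 5 * 2) % 11, 23)
```

Note that the exponent may be reduced mod Q because g has order Q, which is what makes the two-step evaluation (one multiple product, one exponentiation) possible.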

It is clear that F is efficiently computable (since IG is efficient). The pseudo-randomness property of F is the following:

Theorem . Let F = {F_n}_{n∈ℕ} be as in Construction . If the DDH-Assumption (Assumption ) holds, then for every probabilistic polynomial-time oracle machine M, every constant α > 0 and all sufficiently large n,

    | Pr[M^{f_{P,Q,g,ā}}(P, Q, g) = 1] − Pr[M^{R_{P,Q,g}}(P, Q, g) = 1] | < 1/n^α

where in the first probability f_{P,Q,g,ā} is distributed according to F_n, and in the second probability the distribution of ⟨P, Q, g⟩ is IG(1ⁿ) and R_{P,Q,g} is uniformly chosen in the set of functions with domain {0,1}ⁿ and range ⟨g⟩ (the subgroup of ℤ*_P generated by g).

CHAPTER CONSTRUCTIONS OF PSEUDORANDOM FUNCTIONS

Remark . It is easy to verify from the proof of Theorem that the following, more quantitative, version of the theorem also holds: assume that there exists a probabilistic oracle machine with running time t = t(n) that distinguishes f_{P,Q,g,ā} from R_{P,Q,g} (as in Theorem ) with advantage ε = ε(n). Then there exists a probabilistic algorithm with running time poly(n) · t(n) that breaks the DDH-Assumption with advantage ε(n)/n. Therefore the reduction is linear-preserving (see ). This reduction is rather unique in that the security of the functions does not significantly decrease when the number of queries the distinguisher makes increases.

Given Theorem , we have that F is almost an efficiently computable pseudo-random function ensemble. There is one difference: a function f_{P,Q,g,ā} in F_n has domain {0,1}ⁿ and range ⟨g⟩. Therefore, different functions in F_n have different ranges, which deviates from the standard definition of pseudo-random functions. However, for many applications of pseudo-random functions this deviation does not present a problem (e.g., the applications of pseudo-random functions to private-key authentication and identification and their applications to digital signatures). In addition, it is rather easy to construct from F pseudo-random functions under the standard definition. In order to show this, we need the following lemma, which is a simple corollary of the leftover hash lemma:

Lemma . Let n and e be two positive integers such that e ≤ n. Let X ⊆ {0,1}ⁿ be a set of at least 2^{3e} elements and x uniformly distributed in X. Let H be a family of pairwise independent {0,1}ⁿ → {0,1}^e hash functions. Then, for all but a 2^{−e} fraction of h ∈ H, the uniform distribution over {0,1}^e and h(x) are of statistical distance of at most 2^{−e}.

Lemma suggests the following construction:

Construction . Let ℓ = ℓ(n) be an integer-valued function such that for any output ⟨P, Q, g⟩ of IG(1ⁿ) we have that Q ≥ 2^{3ℓ(n)}. Let F = {F_n}_{n∈ℕ} be as in Construction , and for every n let H_n be a family of pairwise independent {0,1}ⁿ → {0,1}^{ℓ(n)} hash functions. We define the function ensemble F′ = {F′_n}_{n∈ℕ}. For every n, a key of a function in F′_n is a pair ⟨k, h⟩, where k is a key of a function in F_n and h ∈ H_n. For any n-bit input x, the function f′_{k,h} is defined by

    f′_{k,h}(x) ≝ h(f_k(x))

The distribution of functions in F′_n is induced by the following distribution on their keys: h is uniform in H_n and the distribution of k is the same as the distribution of keys in F_n.

Note that choosing the range of the hash functions to be {0,1}^{ℓ(n)} is arbitrary: one can choose the range to be {0,1}^{e(n)} for any function e(n) ≤ ℓ(n) such that 2^{−e(n)} is negligible.
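The composition f′_{k,h}(x) = h(f_k(x)) can be sketched as follows. The pairwise independent family used here, h_{A,b}(y) = A·y ⊕ b over GF(2) (A a random bit matrix, b a random offset), is a standard choice made for illustration; the text does not fix a particular family.

```python
import secrets

def hash_keygen(n, e):
    # h_{A,b}(y) = A*y XOR b over GF(2): A is a random e-by-n bit matrix
    # (rows stored as n-bit ints), b a random e-bit offset.
    A = [secrets.randbits(n) for _ in range(e)]
    b = secrets.randbits(e)
    return A, b

def h(hkey, y):
    """Hash the n-bit integer y down to an e-bit integer."""
    A, b = hkey
    out = 0
    for i, row in enumerate(A):
        bit = bin(row & y).count("1") % 2    # inner product of row and y, mod 2
        out |= bit << i
    return out ^ b

# f'_{k,h}(x) = h(f_k(x)): hash the group element down to a short string.
assert h(([0b101, 0b011], 0b00), 0b110) == 3
```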

Using Theorem and Lemma we can easily conclude:

Theorem . If the DDH-Assumption (Assumption ) holds, then F′ = {F′_n}_{n∈ℕ} as in Construction is an efficiently computable pseudo-random function ensemble.

Remark . From Theorem and Lemma it follows that F′ = {F′_n}_{n∈ℕ} remains indistinguishable from the uniform function-ensemble even when the distinguisher has access to ⟨P, Q, g⟩ and to h (as in the definition of functions in F′_n).


Proof of Security

There are a few possible approaches to proving Theorem . One approach is related to Construction , and in particular to the concept of an n-dimensional synthesizer. Indeed, Construction has motivated the constructions of Section (the connection is described in Section ). However, the proof we give here for Theorem follows an analogous line to the proof of security for the GGM-Construction of pseudo-random functions. This may seem surprising, since the two constructions look very different. Nevertheless, in some sense, one may view our construction as a careful application (or a generalization) of the GGM-Construction. In the following few paragraphs we describe the similarities and differences between the two constructions.

Let G be a pseudo-random generator that doubles its input. Define G_0 and G_1 such that for any n-bit string x, both G_0(x) and G_1(x) are n-bit strings and G(x) = ⟨G_0(x), G_1(x)⟩. Under the GGM-Construction, the key of a pseudo-random function f_s : {0,1}ⁿ → {0,1}ⁿ is a uniformly chosen n-bit string s. For any n-bit input x = x_1 x_2 ··· x_n, the function f_s is defined by

    f_s(x) ≝ G_{x_n}(··· (G_{x_2}(G_{x_1}(s))) ···)

The definition of f_s can be thought of as a recursive labeling process of a depth-n binary tree: the key s is the label of the root and it induces a labeling of all the nodes in the tree. The labels of the 2ⁿ leaves correspond to the 2ⁿ different outputs of the function. In contrast, in our construction no tree appears in the design and no particular order is attached to the input bits. Nevertheless, we were able to relate the proofs of security of the two constructions.
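The recursive labeling process described above can be sketched as follows. Instantiating the length-doubling generator G with SHA-256 is purely illustrative (the GGM-Construction assumes any length-doubling pseudo-random generator).

```python
import hashlib

N = 16  # label length in bytes (toy choice)

def G(s: bytes) -> bytes:
    # length-doubling generator, instantiated with SHA-256 for illustration:
    # the 32-byte digest is split into G_0(s) || G_1(s)
    return hashlib.sha256(s).digest()

def ggm(s: bytes, bits) -> bytes:
    """f_s(x) = G_{x_n}( ... G_{x_2}(G_{x_1}(s)) ... )"""
    label = s                                     # label of the root
    for b in bits:                                # walk from the root to a leaf
        halves = G(label)
        label = halves[N:] if b else halves[:N]   # child label: G_1 or G_0
    return label

# different inputs reach different leaves, hence independent-looking labels
assert ggm(b"\x00" * N, [0, 1, 1]) != ggm(b"\x00" * N, [1, 1, 0])
```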

The DDH-Assumption implies a simple pseudo-random generator that (practically) doubles its input, G_{P,Q,g,g^a}(b) ≝ ⟨g^b, g^{a·b}⟩, whose output is a pseudo-random pair of values in the subgroup generated by g. It is tempting to use this generator for the GGM-Construction. However, a straightforward application of the GGM-Construction would give a rather inefficient function. We therefore suggest a slight change to the definition of the generator:

    G^a_{P,Q,g}(g^b) ≝ ⟨G^a_{P,Q,g,0}(g^b), G^a_{P,Q,g,1}(g^b)⟩ = ⟨g^b, g^{a·b}⟩

At a first look this seems absurd: G^a_{P,Q,g} is not efficiently computable unless the DH-Problem is easy (therefore, if G^a_{P,Q,g} is efficiently computable then it is not pseudo-random). However, G^a_{P,Q,g} has the following properties that allow us to use a generalization of the GGM-Construction:

1. G^a_{P,Q,g}(g^b) is efficiently computable if either a or b is known. A more general way to state this is: G^a_{P,Q,g} is efficiently computable on any input given the random bits that were used to sample it (in particular, given a).

2. For any G^a_{P,Q,g}, it is easy to generate the distribution of its output, G^a_{P,Q,g}(g^b), on a uniformly chosen input (this fact implies Lemma ).

We now obtain the pseudo-random functions of Construction using the GGM-Construction, where at each level of the construction we use a different value g^{a_i} for the generator:

    f_{P,Q,g,ā}(x) ≝ G^{a_n}_{P,Q,g,x_n}(··· (G^{a_2}_{P,Q,g,x_2}(G^{a_1}_{P,Q,g,x_1}(g^{a_0}))) ···)
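A quick numeric check (toy parameters, illustrative only) that the GGM-style iteration above collapses to the closed form of the construction: applying G^{a_i}_{x_i} either keeps the current exponent (x_i = 0) or multiplies it by a_i (x_i = 1).

```python
P, Q, g = 23, 11, 4   # toy parameters: g has order Q = 11 in Z*_23

def G_step(h, a_i, x_i):
    # one level of the iteration: G^{a_i}(h) = <h, h^{a_i}>, select component x_i
    return h if x_i == 0 else pow(h, a_i, P)

def f_iterated(a, x):
    h = pow(g, a[0], P)                 # start from g^{a_0}
    for a_i, x_i in zip(a[1:], x):
        h = G_step(h, a_i, x_i)
    return h

def f_closed(a, x):
    y = a[0]
    for a_i, x_i in zip(a[1:], x):
        if x_i:
            y = (y * a_i) % Q           # exponent arithmetic works mod Q
    return pow(g, y, P)

a = [3, 5, 7, 2]
for x in ([0, 0, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]):
    assert f_iterated(a, x) == f_closed(a, x)
```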


We turn to the formal proof of Theorem . First we show (in Lemma ) that a polynomial sample ⟨G^a_{P,Q,g}(g^{b_1}), ..., G^a_{P,Q,g}(g^{b_t})⟩ is pseudo-random iff a single sample G^a_{P,Q,g}(g^b) is pseudo-random. In preliminary versions of this work, the proof of Lemma used a hybrid argument based on Property (2) above (which is similar to the corresponding argument in ). However, Victor Shoup has pointed out that one can use the randomized reduction of the DDH-Problem (see Section ) for an alternative proof of the lemma. The new proof is both simpler and more security-preserving: given a distinguisher for the polynomial sample, we get a distinguisher for the single sample that achieves the same advantage. Based on this property, the security of the functions in our proof of Theorem does not significantly decrease when the number of queries the distinguisher makes increases, which is very different from the proofs of security for the functions in and in Section .

Definition . Let n and t be any pair of positive integers. Define the two distributions I_R^{n,t} and I_PR^{n,t} as follows:

    I_R^{n,t} ≝ ⟨P, Q, g, g^a, g^{b_1}, g^{c_1}, ..., g^{b_t}, g^{c_t}⟩

and

    I_PR^{n,t} ≝ ⟨P, Q, g, g^a, g^{b_1}, g^{a·b_1}, ..., g^{b_t}, g^{a·b_t}⟩

where ⟨P, Q, g⟩ is distributed according to IG(1ⁿ) and all the values in ⟨a, b_1, ..., b_t, c_1, ..., c_t⟩ are uniform in ℤ_Q.

Lemma (Indistinguishability of a Polynomial Sample). If the DDH-Assumption (Assumption ) holds, then for every probabilistic polynomial-time algorithm D, every polynomial t(·), every constant α > 0 and all sufficiently large n,

    | Pr[D(I_PR^{n,t(n)}) = 1] − Pr[D(I_R^{n,t(n)}) = 1] | < 1/n^α

Proof: Let ε = ε(n) be any positive real-valued function. Assume that there exists a probabilistic polynomial-time algorithm D and a polynomial t(·) such that for infinitely many n,

    Pr[D(I_PR^{n,t(n)}) = 1] − Pr[D(I_R^{n,t(n)}) = 1] ≥ ε(n)

We define a probabilistic polynomial-time algorithm A such that for infinitely many n,

    Pr[A(P, Q, g, g^a, g^b, g^{a·b}) = 1] − Pr[A(P, Q, g, g^a, g^b, g^c) = 1] ≥ ε(n)

where the probabilities are taken over the random bits of A, the choice of ⟨P, Q, g⟩ according to the distribution IG(1ⁿ), and the choice of a, b and c uniformly at random in ℤ_Q. For ε(n) = 1/n^α this contradicts the DDH-Assumption and completes the proof of the lemma.

Let the input of A be ⟨P, Q, g, g^a, g^b, g^c⟩, where P is n-bit long and c is either a·b mod Q or uniform in ℤ_Q. Using a randomized reduction similar to that in the proof of Lemma , A generates t(n) random pairs ⟨g^{b_i}, g^{c_i}⟩ such that if c = a·b mod Q then each c_i = a·b_i mod Q, and otherwise the values b_i and c_i are uniform and independent. A now invokes D on these values to distinguish between the two possible distributions of its own input. More formally, A executes the following algorithm:


1. Define t = t(n) and sample each one of the values in ⟨d_1, ..., d_t, e_1, ..., e_t⟩ uniformly at random in ℤ_Q.

2. Define the sequence I to be

    I = ⟨P, Q, g, g^a, R_{d_1,e_1}(g^a, g^b, g^c), ..., R_{d_t,e_t}(g^a, g^b, g^c)⟩

where

    R_{d_i,e_i}(g^a, g^b, g^c) ≝ ⟨(g^b)^{d_i} · g^{e_i}, (g^c)^{d_i} · (g^a)^{e_i}⟩ = ⟨g^{b·d_i + e_i}, g^{c·d_i + a·e_i}⟩

3. Output D(I).
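The re-randomization step used by A can be sketched as follows (toy parameters again, for illustration only); note that both components of each fresh pair are computed from the public values g^a, g^b, g^c alone.

```python
import secrets

P, Q, g = 23, 11, 4   # toy group: g of order Q = 11 in Z*_23

def rerandomize(ga, gb, gc):
    # fresh pair <g^{b_i}, g^{c_i}> with b_i = b*d + e and c_i = c*d + a*e,
    # computed from the public values g^a, g^b, g^c alone
    d, e = secrets.randbelow(Q), secrets.randbelow(Q)
    gbi = (pow(gb, d, P) * pow(g, e, P)) % P     # g^{b*d + e}
    gci = (pow(gc, d, P) * pow(ga, e, P)) % P    # g^{c*d + a*e}
    return gbi, gci

# if c = a*b mod Q then c*d + a*e = a*(b*d + e), so the new pair is DH-shaped:
a, b = 3, 7
ga, gb, gc = pow(g, a, P), pow(g, b, P), pow(g, (a * b) % Q, P)
gbi, gci = rerandomize(ga, gb, gc)
assert gci == pow(gbi, a, P)
```

If instead c ≠ a·b, the map (d, e) ↦ (b·d + e, c·d + a·e) is a bijection on ℤ_Q², so the derived pair is uniform and independent, which is exactly the property the proof needs.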

Denote by ⟨g^{b_i}, g^{c_i}⟩ the value R_{d_i,e_i}(g^a, g^b, g^c). By the same arguments used in the proof of Lemma , we have that:

1. If c = a·b mod Q, then b_1, ..., b_t are uniform in ℤ_Q and independent of each other and of a, and for every i, c_i = a·b_i mod Q.

2. If c ≠ a·b mod Q, then b_1, ..., b_t, c_1, ..., c_t are all uniform in ℤ_Q and independent of each other and of a.

Therefore, by the definitions of A, I_R^{n,t} and I_PR^{n,t}, it easily follows that

    Pr[A(P, Q, g, g^a, g^b, g^{a·b}) = 1] = Pr[D(I_PR^{n,t}) = 1]

and

    Pr[A(P, Q, g, g^a, g^b, g^c) = 1] = Pr[D(I_R^{n,t}) = 1]

It is now immediate that for infinitely many n,

    Pr[A(P, Q, g, g^a, g^b, g^{a·b}) = 1] − Pr[A(P, Q, g, g^a, g^b, g^c) = 1] ≥ ε(n)

where the probabilities are as above. □

The proof of Theorem given Lemma uses a hybrid argument, which is a proof technique for showing that two distributions are indistinguishable (see for details on hybrid arguments). Loosely, the method for showing that D and D′ are indistinguishable is to (1) define a polynomial-length sequence of efficient distributions D_0, D_1, ..., D_m with D_0 = D and D_m = D′, and (2) show that any two neighboring distributions D_j and D_{j+1} are indistinguishable. In fact, in the uniform version of this argument (e.g., in the proof of Theorem ), we usually show that it is hard to distinguish D_J and D_{J+1}, where J is uniformly chosen in {0, ..., m − 1}. Furthermore, in the proof of Theorem (as well as in the corresponding proofs in and in Section ), the n + 1 distributions that are implicitly defined are distributions of functions, and they are not efficiently samplable. For example, one of the two extreme distributions is the uniform distribution, which is certainly not efficiently samplable. Nevertheless, a uniform function can be efficiently simulated by an algorithm that answers each query at random (under the restriction of keeping consistency of its answers for repeating queries). Since all other intermediate function-distributions can be simulated in the same sense, we can still apply the hybrid argument. We now turn to the formal proof, where the arguments described above are implicit.


Proof of Theorem : Let ε = ε(n) be any positive real-valued function. Assume that there exists a probabilistic polynomial-time oracle machine M such that for infinitely many n,

    Pr[M^{f_{P,Q,g,ā}}(P, Q, g) = 1] − Pr[M^{R_{P,Q,g}}(P, Q, g) = 1] ≥ ε(n)

where the probabilities are as in Theorem . Let t(·) be a polynomial that bounds the running time of M. We define a probabilistic polynomial-time algorithm D such that for infinitely many n,

    Pr[D(I_PR^{n,t(n)}) = 1] − Pr[D(I_R^{n,t(n)}) = 1] ≥ ε(n)/n

By Lemma , for ε(n) = 1/n^α this contradicts the DDH-Assumption and completes the proof of the theorem.

On any input ⟨P, Q, g, g^a, g^{b_1}, g^{c_1}, ..., g^{b_t}, g^{c_t}⟩, where P is n bits long and either each c_i = a·b_i mod Q or each c_i is uniform in ℤ_Q, D executes the following algorithm:

1. Sample J uniformly at random in {1, ..., n}.

2. Sample each one of the values in ⟨a_{J+1}, a_{J+2}, ..., a_n⟩ uniformly at random in ℤ_Q.

3. Invoke M on input ⟨P, Q, g⟩ and answer its queries in the following way. Let the queries asked by M be ⟨x^1, x^2, ..., x^m⟩ (the i-th query, x^i, is an n-bit string). Denote x^i = x̄^i x^i_J x^i_{J+1} ··· x^i_n, where x̄^i is a (J − 1)-bit string and x^i_J, x^i_{J+1}, ..., x^i_n are single bits. To answer the i-th query, define i′ = min{ i″ : x̄^{i″} = x̄^i } and answer the query by

    g^{c_{i′} · ∏_{J < k ≤ n, x^i_k = 1} a_k}   if x^i_J = 1

    g^{b_{i′} · ∏_{J < k ≤ n, x^i_k = 1} a_k}   if x^i_J = 0

(These answers are well defined since m ≤ t.)

4. Output whatever M outputs.

From the definition of D we have that, for f_{P,Q,g,ā} and R_{P,Q,g} as in Theorem ,

    Pr[D(I_PR^{n,t}) = 1 | J = 1] = Pr[M^{f_{P,Q,g,ā}}(P, Q, g) = 1]

    Pr[D(I_R^{n,t}) = 1 | J = n] = Pr[M^{R_{P,Q,g}}(P, Q, g) = 1]

and for any 1 ≤ j < n,

    Pr[D(I_R^{n,t}) = 1 | J = j] = Pr[D(I_PR^{n,t}) = 1 | J = j + 1]

By the assumption, we get that for infinitely many n,

    Pr[D(I_PR^{n,t}) = 1] − Pr[D(I_R^{n,t}) = 1]
      = Σ_{j=1}^{n} (1/n) · Pr[D(I_PR^{n,t}) = 1 | J = j] − Σ_{j=1}^{n} (1/n) · Pr[D(I_R^{n,t}) = 1 | J = j]
      = (1/n) · ( Pr[D(I_PR^{n,t}) = 1 | J = 1] − Pr[D(I_R^{n,t}) = 1 | J = n] )
      = (1/n) · ( Pr[M^{f_{P,Q,g,ā}}(P, Q, g) = 1] − Pr[M^{R_{P,Q,g}}(P, Q, g) = 1] )
      ≥ ε(n)/n

This completes the proof of the theorem. □

Efficiency of the Pseudo-Random Functions

Consider a function f_{P,Q,g,ā} ∈ F_n, where ā = ⟨a_0, a_1, ..., a_n⟩, as in Construction . Computing the value of this function at any given point x involves one multiple product (a product of polynomially many numbers), y = a_0 · ∏_{x_i=1} a_i, which can be performed modulo Q, and one modular exponentiation, g^y mod P. This gives a pseudo-random function which is much more efficient than previous constructions. Furthermore, one can use preprocessing in order to get improved efficiency. The most obvious preprocessing is computing the values g^{2^i} for every positive integer i up to the length of Q. Now computing the value of the function requires two multiple products modulo a prime. (In the case that Q is much smaller than P, the first multiple product is much cheaper than the second.) Additional preprocessing can reduce the work by a factor of O(log n) (see Brickell et al. ). Actually, to compute the value of the pseudo-random function of Construction we also need one application of a pairwise independent hash function, but this operation is very cheap compared with a multiple product or a modular exponentiation.
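The preprocessing idea can be sketched as follows: with the table {g^{2^i} mod P} computed once, the exponentiation g^y becomes a single multiple product of table entries selected by the bits of y (toy parameters, illustrative only).

```python
P, Q, g = 23, 11, 4   # toy parameters

# done once, as preprocessing: the values g^{2^i} mod P
table = [pow(g, 2 ** i, P) for i in range(Q.bit_length())]

def exp_with_table(y):
    """g^y mod P as one multiple product over the precomputed table."""
    acc = 1
    for i, g_i in enumerate(table):
        if (y >> i) & 1:              # bit i of y selects the factor g^{2^i}
            acc = (acc * g_i) % P
    return acc

assert all(exp_with_table(y) == pow(g, y, P) for y in range(Q))
```

The point of this rewriting is parallelism: a product of precomputed values can be taken by a multiple-product circuit, rather than by a sequential square-and-multiply loop.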

As described in the introduction (in Section ), we are also interested in finding the parallel-time complexity of the pseudo-random functions. In order to do so, let us first recall the result of Beame, Cook and Hoover , who showed that division and related operations (including multiple product) are computable in NC¹. Based on this result, Reif and Tate showed that these operations are also computable in TC⁰. The exact depth required for these operations was considered in , where the constant depths of threshold circuits needed for multiple sum, for multiplication and division, and for multiple product were determined (recall that for every integer i, the class of functions computable by depth-i circuits consisting of a polynomial number of threshold gates is denoted by TC⁰_i).

By the results above we get that, after preprocessing (i.e., computing the values g^{2^i}), it is possible to evaluate the function f_{P,Q,g,ā} in TC⁰, since all the necessary operations can be performed in TC⁰:

Theorem . Let F = {F_n}_{n∈ℕ} be as in Construction . Then there exist a polynomial p(·) and an integer i such that for every n ∈ ℕ and every function f_k ∈ F_n there exists a depth-i threshold circuit of size bounded by p(n) that computes f_k.

The exact depth of the functions in Theorem can be obtained by a naive application of the results in . A more detailed analysis of the function f_{P,Q,g,ā} reveals that, using additional preprocessing, some further optimizations in the depth are possible; this function can in fact be evaluated in TC⁰ of smaller depth. Since the exact depth of the construction is peripheral to this work, and since a formal proof of this claim would repeat quite a lot of the analysis in , we only comment on a few facts regarding f_{P,Q,g,ā} that allow us to reduce the depth. First, note that in both multiple products we can assume any preprocessing of the values in the multiplication, since these values are taken from the sequence ⟨a_1, ..., a_n⟩ or from the set {g^{2^i}}. Second, we do not need the actual value of the first multiple product, y = a_0 · ∏_{x_i=1} a_i: computing values r_i obtained by the CRT representation, for which y = Σ_i m_i · r_i (where the values m_i are known in advance and can be preprocessed), is just as good. Finally, the value P is also known in advance; therefore, the depth of the final modular reduction can be reduced by precomputing the values 2^i mod P.

Remark . The same analysis holds for the efficiency and depth of the pseudo-random functions of Construction .

Construction Based on Factoring or the GDH-Assumption

In this section we show an additional construction of pseudo-random functions (Construction ) that is very similar to Construction . The security of Construction is reduced to the GDH-Assumption, which is a generalization of the computational DH-Assumption. This construction is interesting for two main reasons:

1. The GDH-Assumption is implied by the DDH-Assumption, but they are not known to be equivalent. Therefore, Construction may still be valid even if the DDH-Assumption does not hold. In addition, the GDH-Assumption modulo a so-called Blum-integer is not stronger than the assumption that factoring Blum-integers is hard. This gives an attractive construction of pseudo-random functions that is at least as secure as Factoring (which was recently improved in ).

2. Construction is based on a somewhat different methodology than Construction . It may be easier to apply this methodology in order to construct pseudo-random functions based on additional assumptions (in fact, Construction was obtained as a modification of Construction ).

The GDH-Assumption

The GDH-Assumption was previously considered in the context of a key-exchange protocol for a group of parties (see, e.g., ). In this protocol, party i (1 ≤ i ≤ n) chooses a secret value a_i. After executing the protocol, each of these parties can compute g^{∏_{1≤i≤n} a_i}, and this value defines their common key. While executing the protocol, an eavesdropper may learn values of the form g^{∏_{i∈I} a_i} for several proper subsets I ⊊ {1, ..., n}. It is essential to assume that even with this knowledge it is hard to compute g^{∏_{1≤i≤n} a_i}. The GDH-Assumption is even stronger: informally, this assumption says that it is hard to compute g^{∏_{1≤i≤n} a_i} for an algorithm that can query g^{∏_{i∈I} a_i} for any proper subset I ⊊ {1, ..., n} of its choice.

To remain consistent with the DDH-Assumption, we state the GDH-Assumption (Assumption ) in a subgroup of ℤ*_P of order Q, where P and Q are primes. In fact, the corresponding assumption in any other group implies a corresponding construction of pseudo-random functions. For example, since breaking the GDH-Assumption modulo a composite is at least as hard as factoring (as shown in Section and ), we obtain in Section a construction of pseudo-random functions which is at least as secure as Factoring. Furthermore, in contrast with the DDH-Assumption, one can consider the GDH-Assumption in ℤ*_P itself (i.e., when g is a generator of ℤ*_P).

P

In order to formalize the GDH-Assumption, we use the following definition:

Definition . Let ⟨P, Q, g⟩ be any possible output of IG(1ⁿ), and let ā = ⟨a_1, a_2, ..., a_n⟩ be any sequence of n elements of ℤ_Q. Define the function h_{P,Q,g,ā} with domain {0,1}ⁿ such that for any n-bit input x = x_1 x_2 ··· x_n,

    h_{P,Q,g,ā}(x) ≝ g^{∏_{x_i = 1} a_i}

Define h^r_{P,Q,g,ā} to be the restriction of h_{P,Q,g,ā} to inputs {0,1}ⁿ \ {1ⁿ}.

Assumption (Generalized Diffie-Hellman). For every probabilistic polynomial-time oracle machine A, every constant α > 0 and all sufficiently large n,

    Pr[A^{h^r_{P,Q,g,ā}}(P, Q, g) = h_{P,Q,g,ā}(1ⁿ)] < 1/n^α

where the probability is taken over the random bits of A, the choice of ⟨P, Q, g⟩ according to the distribution IG(1ⁿ) and the choice of each of the values in ā = ⟨a_1, a_2, ..., a_n⟩ uniformly at random in ℤ_Q.

As a corollary of Theorem , we have that if the DDH-Assumption holds then so does the GDH-Assumption. In fact, we get that the DDH-Assumption implies the decisional GDH-Assumption (this was also previously shown in ):

Corollary . If the DDH-Assumption (Assumption ) holds, then for every probabilistic polynomial-time oracle machine A, every constant α > 0 and all sufficiently large n,

    | Pr[A^{h^r_{P,Q,g,ā}}(P, Q, g, h_{P,Q,g,ā}(1ⁿ)) = 1] − Pr[A^{h^r_{P,Q,g,ā}}(P, Q, g, g^c) = 1] | < 1/n^α

where the probabilities are taken over the random bits of A, the choice of ⟨P, Q, g⟩ according to the distribution IG(1ⁿ), the choice of each of the values in ā = ⟨a_1, a_2, ..., a_n⟩ uniformly at random in ℤ_Q and the choice of c uniformly at random in ℤ_Q.

Motivation for the construction

Construction is motivated by the concept of pseudo-random synthesizers and by Construction (of pseudo-random functions using pseudo-random synthesizers as building blocks). To see the connection, let us recall some of the main ideas of Construction . Informally, a pseudo-random synthesizer S is:

An efficiently computable function of two arguments such that, given polynomially many uniformly chosen inputs for each argument, {x_i}_{i=1}^m and {y_j}_{j=1}^m, the output of S on all the combinations, {S(x_i, y_j)}_{i,j=1}^m, cannot be efficiently distinguished from uniform.

A natural generalization is a k-dimensional pseudo-random synthesizer. Informally, a k-dimensional pseudo-random synthesizer S may be defined to be:

An efficiently computable function of k arguments such that, given polynomially many uniformly chosen inputs for each argument, {x^j_i}_{i=1}^m for 1 ≤ j ≤ k, the output of S on all the combinations, M = {S(x^1_{i_1}, x^2_{i_2}, ..., x^k_{i_k})}_{1 ≤ i_1,...,i_k ≤ m}, cannot be efficiently distinguished from uniform by an algorithm that can access M at points of its choice.

Construction can be viewed as first recursively applying a 2-dimensional synthesizer to get an n-dimensional synthesizer, S, and then defining the pseudo-random function f by

    f_{⟨a_{1,0}, a_{1,1}, ..., a_{n,0}, a_{n,1}⟩}(x) ≝ S(a_{1,x_1}, a_{2,x_2}, ..., a_{n,x_n})

However, using this construction, the depth of the n-dimensional synthesizer (and of the pseudo-random functions) is larger by a logarithmic factor than the depth of the 2-dimensional synthesizer. Therefore, a natural problem is to come up with a direct construction of an n-dimensional synthesizer.

In this section it is shown that, under the GDH-Assumption, the function S_{P,Q,g,r} defined by

    S_{P,Q,g,r}(a_1, a_2, ..., a_n) ≝ (g^{∏_{1≤i≤n} a_i}) ⊙ r

(where r is an n-bit string and ⊙ denotes the inner product mod 2) is an n-dimensional synthesizer. Construction is then obtained as described above.

The Construction

We turn to the construction of the pseudo-random functions:

Construction . We define the function ensemble F = {F_n}_{n∈ℕ}. For every n, a key of a function in F_n is a tuple ⟨P, Q, g, ā, r⟩, where P is an n-bit prime, Q a prime divisor of P − 1, g an element of order Q in ℤ*_P, ā = ⟨a_{1,0}, a_{1,1}, ..., a_{n,0}, a_{n,1}⟩ a sequence of 2n elements of ℤ_Q, and r an n-bit string. For any n-bit input x = x_1 x_2 ··· x_n, the (binary) function f_{P,Q,g,ā,r} is defined by

    f_{P,Q,g,ā,r}(x) ≝ (g^{∏_{1≤i≤n} a_{i,x_i}}) ⊙ r

where ⊙ denotes the inner product mod 2. The distribution of functions in F_n is induced by the following distribution on their keys: ā and r are uniform in their range and the distribution of ⟨P, Q, g⟩ is IG(1ⁿ).
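A minimal Python sketch of this binary-output construction (toy parameters P = 23, Q = 11, g = 4 — illustrative only, not secure): the group element is mapped to a single bit via the inner product mod 2 with r.

```python
P, Q, g = 23, 11, 4   # toy parameters, not secure

def inner_product_bit(v, r):
    # <v, r> mod 2, with bit strings represented as ints
    return bin(v & r).count("1") % 2

def f(a, r, x_bits):
    """a: list of n pairs (a_{i,0}, a_{i,1}) in Z_Q; r: n-bit mask; output: one bit."""
    y = 1
    for (a0, a1), b in zip(a, x_bits):
        y = (y * (a1 if b else a0)) % Q    # select a_{i,x_i} at each position
    return inner_product_bit(pow(g, y, P), r)

a = [(3, 5), (7, 2), (4, 6), (1, 9)]
assert f(a, 0b1011, [0, 1, 1, 0]) == 1    # one pseudo-random bit
```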

Theorem . If the GDH-Assumption (Assumption ) holds, then F = {F_n}_{n∈ℕ} as in Construction is an efficiently computable pseudo-random function ensemble.


In order to prove Theorem , we need the following corollary of the Goldreich-Levin hard-core-bit theorem (more precisely, the setting of this corollary is somewhat different than the one considered in , but their result still applies):

Corollary . If the GDH-Assumption (Assumption ) holds, then for every probabilistic polynomial-time oracle machine A, every constant α > 0 and all sufficiently large n,

    | Pr[A^{h^r_{P,Q,g,ā}}(P, Q, g, r, h_{P,Q,g,ā}(1ⁿ) ⊙ r) = 1] − Pr[A^{h^r_{P,Q,g,ā}}(P, Q, g, r, σ) = 1] | < 1/n^α

where the probabilities are taken over the random bits of A, the choice of ⟨P, Q, g⟩ according to the distribution IG(1ⁿ), the choice of each of the values in ā = ⟨a_1, a_2, ..., a_n⟩ uniformly at random in ℤ_Q, the choice of r uniformly at random in {0,1}ⁿ and the choice of σ uniformly at random in {0,1}.

Proof of Theorem : Let F = {F_n}_{n∈ℕ} be as in Construction . It is clear that F is efficiently computable. Assume that F is not pseudo-random; then there exists a probabilistic polynomial-time oracle machine M and a constant α > 0 such that for infinitely many n,

    Pr[M^{f_{P,Q,g,ā,r}}(P, Q, g, r) = 1] − Pr[M^{R}(P, Q, g, r) = 1] > 1/n^α

where in the first probability f_{P,Q,g,ā,r} is distributed according to F_n, and in the second probability R is uniformly distributed over the set of {0,1}ⁿ → {0,1} functions; in both, ⟨P, Q, g⟩ is distributed according to IG(1ⁿ) and r is a uniformly chosen n-bit string.

Let t(·) be a polynomial that bounds the running time of M. We define a probabilistic polynomial-time oracle machine A such that for infinitely many n,

    Pr[A^{h^r_{P,Q,g,ā}}(P, Q, g, r, h_{P,Q,g,ā}(1ⁿ) ⊙ r) = 1] − Pr[A^{h^r_{P,Q,g,ā}}(P, Q, g, r, σ) = 1] > 1/(t(n) · n^α)

where the probabilities are as in Corollary . By Corollary , this would contradict the GDH-Assumption and would complete the proof of the theorem.

Given access to h^r_{P,Q,g,ā}, and on input ⟨P, Q, g, r, σ⟩, where σ is expected to either be uniformly chosen or to be h_{P,Q,g,ā}(1ⁿ) ⊙ r, A executes the following algorithm:

1. Define t = t(n) and sample J uniformly at random in {1, ..., t}.

2. Sample each one of ⟨b_1, b_2, ..., b_n⟩ uniformly at random in ℤ_Q.

3. Invoke M on input ⟨P, Q, g, r⟩ and answer its queries in the following way. Let the queries asked by M be ⟨x^1, x^2, ..., x^m⟩, and assume without loss of generality that all these queries are distinct:

(a) Answer each one of the first J − 1 queries with a uniformly chosen bit.

(b) Answer the J-th query with σ.


(c) Let x^i be the i-th query, for i > J, and define the n-bit string z^i = z^i_1 z^i_2 ··· z^i_n such that z^i_k = 1 if the k-th bit of x^i and the k-th bit of x^J are equal, and z^i_k = 0 otherwise. Since x^i ≠ x^J, we have that z^i ≠ 1ⁿ. Finally, answer the i-th query with

    ( h^r_{P,Q,g,ā}(z^i) )^{∏_{z^i_k = 0} b_k} ⊙ r

4. Output whatever M outputs.

Denote ā = ⟨a_1, a_2, ..., a_n⟩. From the definition of A, we have that all its answers to queries x^i, for i > J, are f_{P,Q,g,ā′,r}(x^i), where ā′ = ⟨a′_{1,0}, a′_{1,1}, ..., a′_{n,0}, a′_{n,1}⟩ depends on the J-th query x^J = x^J_1 x^J_2 ··· x^J_n as follows: for every 1 ≤ k ≤ n, if x^J_k = 1 then a′_{k,1} = a_k and a′_{k,0} = b_k, and if x^J_k = 0 then a′_{k,0} = a_k and a′_{k,1} = b_k. The first J − 1 queries are answered by A uniformly at random. The only answer that depends on σ is the J-th answer itself. This answer is of course uniformly distributed in case σ is uniform. It is also not hard to verify that the J-th answer is f_{P,Q,g,ā′,r}(x^J) in case σ = h_{P,Q,g,ā}(1ⁿ) ⊙ r. We can therefore conclude that

    Pr[A^{h^r_{P,Q,g,ā}}(P, Q, g, r, h_{P,Q,g,ā}(1ⁿ) ⊙ r) = 1 | J = 1] = Pr[M^{f_{P,Q,g,ā′,r}}(P, Q, g, r) = 1]

    Pr[A^{h^r_{P,Q,g,ā}}(P, Q, g, r, σ) = 1 | J = t(n)] = Pr[M^{R}(P, Q, g, r) = 1]

and, for any 1 < j ≤ t(n),

    Pr[A^{h^r_{P,Q,g,ā}}(P, Q, g, r, h_{P,Q,g,ā}(1ⁿ) ⊙ r) = 1 | J = j] = Pr[A^{h^r_{P,Q,g,ā}}(P, Q, g, r, σ) = 1 | J = j − 1]

where the probabilities are as above. Therefore, by the standard hybrid argument, we get from the assumption that for infinitely many n,

    Pr[A^{h^r_{P,Q,g,ā}}(P, Q, g, r, h_{P,Q,g,ā}(1ⁿ) ⊙ r) = 1] − Pr[A^{h^r_{P,Q,g,ā}}(P, Q, g, r, σ) = 1] > 1/(t(n) · n^α)

□

Remark . From the proof of Theorem , we get that F is pseudo-random even if the distinguisher (denoted by M in the proof) has access to P, Q, g and r.

Pseudo-Random Functions at Least as Secure as Factoring

The proof of Theorem does not rely on the specific group for which the GDH-Assumption is defined. Therefore, the corresponding assumption in any other group implies a corresponding construction of pseudo-random functions. An especially interesting example is taking the GDH-Assumption modulo a composite. Since breaking this assumption is at least as hard as factoring (as shown in Section and ), we obtain an attractive construction of pseudo-random functions which is at least as secure as Factoring. This construction was recently improved by Naor, Reingold and Rosen , where an efficient method is provided for expanding the one-bit output of our functions while paying only a small overhead in the complexity of the evaluation (i.e., one modular multiplication for each additional output bit). In particular, this implies a length-preserving pseudo-random function that is at least as secure as Factoring whose evaluation requires only a constant number of modular multiplications per output bit. In this subsection we repeat the definition of the GDH-Assumption and the construction of pseudo-random functions with the group set to ℤ*_N, where N is a Blum-integer (for better readability we also repeat some of the definitions from Section ). The proof of security is practically the same as the proof of Theorem and is therefore omitted.

Similarly to the case of the DDH-Assumption, we keep our results general by letting the composite N be generated by some polynomial-time algorithm FIG (where FIG stands for factoring-instance-generator).

Definition (FIG). The factoring-instance-generator, FIG, is a probabilistic polynomial-time algorithm such that on input 1ⁿ its output, N, is distributed over 2n-bit integers, where N = P · Q for two n-bit primes P and Q such that P ≡ Q ≡ 3 mod 4 (such an integer N is known as a Blum-integer).

The GDH-Assumption Modulo a Composite

Definition . Let N be any possible output of FIG(1ⁿ), let g be any quadratic-residue in ℤ*_N, and let ā = ⟨a_1, a_2, ..., a_n⟩ be any sequence of n elements of ℤ_N. Define the function h_{N,g,ā} with domain {0,1}ⁿ such that for any n-bit input x = x_1 x_2 ··· x_n,

    h_{N,g,ā}(x) ≝ g^{∏_{x_i = 1} a_i} mod N

Define h^r_{N,g,ā} to be the restriction of h_{N,g,ā} to inputs {0,1}ⁿ \ {1ⁿ}.

Assumption (Generalized Diffie-Hellman in ℤ*_N). For every probabilistic polynomial-time oracle machine A, every constant α > 0 and all sufficiently large n,

    Pr[A^{h^r_{N,g,ā}}(N, g) = h_{N,g,ā}(1ⁿ)] < 1/n^α

where the probability is taken over the random bits of A, the choice of N according to the distribution FIG(1ⁿ), the choice of g uniformly at random in the set of quadratic-residues in ℤ*_N and the choice of each of the values in ā = ⟨a_1, a_2, ..., a_n⟩ uniformly at random in ℤ_N.

The Construction and its Security

Construction . We define the function ensemble F = {F_n}_{n∈ℕ}. For every n, a key of a function in F_n is a tuple ⟨N, g, ā, r⟩, where N is a 2n-bit Blum-integer, g is a quadratic-residue in ℤ*_N, ā = ⟨a_{1,0}, a_{1,1}, ..., a_{n,0}, a_{n,1}⟩ is a sequence of 2n values in ℤ_N and r is a 2n-bit string. For any n-bit input x = x_1 x_2 ··· x_n, the (binary) function f_{N,g,ā,r} is defined by

    f_{N,g,ā,r}(x) ≝ (g^{∏_{1≤i≤n} a_{i,x_i}} mod N) ⊙ r

The distribution of functions in F_n is induced by the following distribution on their keys: g, ā and r are uniform in their range and the distribution of N is FIG(1ⁿ).
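A toy sketch of the variant modulo a Blum-integer. Here N = 21 = 3 · 7 (with 3 ≡ 7 ≡ 3 mod 4) is far too small to be secure and is used only so the example runs; since the order of g is unknown to the evaluator, the exponent product is not reduced.

```python
N = 21          # Blum integer: 3 * 7, with 3 = 7 = 3 (mod 4); toy size only
g = 4           # 4 = 2^2 is a quadratic residue mod 21

def inner_product_bit(v, r):
    return bin(v & r).count("1") % 2

def f(a, r, x_bits):
    """a: pairs (a_{i,0}, a_{i,1}) of exponents; r: bit mask; output: one bit."""
    y = 1
    for (a0, a1), b in zip(a, x_bits):
        y *= (a1 if b else a0)    # the order of g is unknown, so no reduction
    return inner_product_bit(pow(g, y, N), r)

bit = f([(2, 3), (5, 4)], 0b10110, [1, 0])
assert bit in (0, 1)
```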


In the same way that Theorem is proven, we get that:

Theorem . If the GDH-Assumption in ℤ*_N (Assumption ) holds, then F = {F_n}_{n∈ℕ} as in Construction is an efficiently computable pseudo-random function ensemble.

However, breaking the GDH-Assumption in ℤ*_N is at least as hard as factoring N (Theorem ). We can therefore deduce that:

Corollary (of Theorem and Theorem ). Let F = {F_n}_{n∈ℕ} be as in Construction , and assume that F is not an efficiently computable pseudo-random function ensemble. Then there exist a probabilistic polynomial-time algorithm A and a constant α > 0 such that for infinitely many n,

    Pr[A(P · Q) = ⟨P, Q⟩] > 1/n^α

where the distribution of N = P · Q is FIG(1ⁿ).

Additional Prop erties of the PseudoRandom Functions

The pseudo-random functions of Constructions and have a simple algebraic structure. We consider this to be an important advantage over all previous constructions, mainly since several attractive features seem more likely to exist for a simple construction of pseudo-random functions. An interesting example arises from the work of Bellare and Goldwasser: they suggest a way to design a digital-signature scheme that is very attractive, given efficient pseudo-random functions and an efficient non-interactive zero-knowledge proof for claims of the form y = f_s(m), when a commitment to a key s of a pseudo-random function f_s is available as part of the public key. Another very attractive scheme one may desire is a function-sharing scheme for pseudo-random functions, in an analogous meaning to function-sharing schemes for trapdoor one-way permutations. Two examples of applications of such schemes are efficient metering of web usage and the distribution of KDCs (key-distribution centers).

In this section we suggest preliminary designs for several protocols. Though there is much room for improving these designs, they are still a significant improvement over the protocols that are available for all previous constructions of pseudo-random functions (including commonly used block ciphers such as DES), and they serve as a demonstration of the potential of our construction. The main purpose of this section is to stimulate further research, both in improving our designs and in suggesting designs for other protocols (such as the non-interactive zero-knowledge proof and the function-sharing scheme mentioned above). We therefore do not insist on formal definitions and proofs for our protocols. In addition, we focus on the construction of Section ; using similar designs for the construction of Section may be problematic due to the way this construction uses the Goldreich-Levin hard-core bit.

CONCRETE CONSTRUCTIONS OF PSEUDORANDOM FUNCTIONS

A point worth noticing is that we describe our designs for the functions of Construction . That is, we ignore the pairwise independent hashing introduced in Construction . However, as mentioned in Section , for many applications (such as the undeniable signatures suggested below) the extra hashing in Construction is unnecessary. In addition, recall (Remark ) that the pseudo-random functions of Construction remain pseudo-random even if the hash function is public. Therefore, the protocols we design here for Construction imply similar protocols for Construction . For example, when distributing a pseudo-random function to a set of parties, we can distribute the function obtained by ignoring the hash function and supply each party with the value of this hash function.

Zero-Knowledge Proof for the Value of the Function

As mentioned above, a non-interactive zero-knowledge proof for claims of the form y = f_s(m) is required by the Bellare-Goldwasser digital-signature scheme. Similarly, a simple interactive zero-knowledge proof for claims of the form y = f_s(m) and y ≠ f_s(m) implies a simple construction of undeniable signatures. Informally, undeniable signatures, which were introduced by Chaum and van Antwerpen, are public-key schemes that allow a party to sign messages such that his participation is required in order to verify or deny a signature. We can let the public key for an undeniable signature be a commitment to a key s of a pseudo-random function f_s. A signature for a message m can simply be f_s(m) (or f_s(H(m)) for a collision-intractable hash function H). Now the confirmation protocol for a message m and a signature y is simply a zero-knowledge proof that y = f_s(m), whereas the denial protocol is a zero-knowledge proof that y ≠ f_s(m).

In this section we describe such zero-knowledge proofs for the values of the functions of Construction . To be a bit more accurate, both the commitment for a function and the zero-knowledge proofs do reveal some of the values of the function. Therefore, these are zero-knowledge proofs for a restriction of the function to a subset of its inputs (those with their last two bits set to 1), when the value of the function on the rest of the inputs is publicly available (this can be formalized by allowing the zero-knowledge simulator access to all other values of its choice).

The protocol for y = f_s(x) is essentially several parallel applications of a zero-knowledge proof for the result of the Diffie-Hellman protocol. These proofs are strongly related to Schnorr's identification protocol. In order to make this proof into a zero-knowledge proof, we use a strong-receiver commitment scheme in a rather standard way (using a strong-sender commitment scheme gives perfect zero-knowledge arguments). We denote the commit (resp. reveal) phase of this protocol by COMMIT (resp. REVEAL).

Construction (and Definition: a commitment for f_s). Let F = {F_n}_{n∈N} be as in Construction and let f_{P,Q,g,a} be some function in F_n, where a = ⟨a_1, a_2, ..., a_n⟩. A commitment for f_{P,Q,g,a} is

    ⟨P, Q, g, g^{a_1}, g^{a_2}, g^{a_3}, ..., g^{a_n}⟩

Loosely speaking, a commitment scheme is a protocol between a sender and a receiver that has two phases: the commit phase, in which the sender commits to a value p, and the reveal phase, in which the sender opens this commitment. Such a protocol should have two properties: (1) It is binding: a computationally bounded sender should not be able to open two distinct values p ≠ p' in the reveal phase. (2) It is semantically secure: for any p ≠ p', at the end of the commit phase the receiver should not be able to distinguish a commitment to p from a commitment to p'. See the literature for formal definitions and constructions. Two stronger variants of commitment schemes are strong-sender and strong-receiver: in strong-sender commitments, even a computationally unbounded sender should not be able to open two distinct values p ≠ p'; in strong-receiver commitments (also known as information-theoretic commitments), the commit phase leaks no information on p (in an information-theoretic sense).

Protocol (ZK-proof for y = f_s(x)).

The prover P knows the key s = ⟨P, Q, g, a⟩ and the verifier V knows the commitment for f_s. The common input is a pair ⟨x, y⟩, where x = x_1 x_2 ··· x_n ∈ {0,1}^n satisfies x_{n-1} = x_n = 1 and y is in the subgroup of Z*_P generated by g. Denote by g̃ the value g^{a_n}, and assume w.l.o.g. that for some k, x_k = x_{k+1} = ··· = x_n = 1 whereas x_1 = ··· = x_{k-1} = 0. The protocol for proving y = f_s(x) is defined as follows:

1. V chooses a value e uniformly from Z_Q and sends COMMIT(e) to P.

2. For each k ≤ i < n, P chooses r_i uniformly from Z_Q, computes c_i = ∏_{j=k}^{i} a_j mod Q, and sends ⟨y_i = g^{c_i}, g^{r_i}, y_i^{r_i}⟩ to V. Denote by y_n the value y.

3. V sends REVEAL(e) to P.

4. For each k ≤ i < n, P sends d_i = e · a_{i+1} + r_i mod Q to V.

5. V accepts if the following conditions hold: (1) y_k = g^{a_k}; (2) for each k ≤ i < n, g^{d_i} = (g^{a_{i+1}})^e · g^{r_i} and y_i^{d_i} = y_{i+1}^e · y_i^{r_i} (here g^{a_{i+1}} is taken from the commitment; for i = n − 1 it is g̃).
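The two verification equations of step 5 can be exercised concretely for a single link of the chain. The following Python sketch uses hypothetical toy group parameters (P = 23, Q = 11) and omits the COMMIT/REVEAL wrapper; it illustrates only the Schnorr-style check for one exponent, not the full protocol:

```python
import random

P, Q, g = 23, 11, 4                   # g = 4 has order Q = 11 in Z*_23 (toy parameters)

a_next = random.randrange(1, Q)       # the secret exponent a_{i+1}
A = pow(g, a_next, P)                 # its public value g^{a_{i+1}} from the commitment
y_i = pow(g, random.randrange(1, Q), P)   # current chain value y_i
y_next = pow(y_i, a_next, P)              # next chain value y_{i+1} = y_i^{a_{i+1}}

# Prover: per-link commitment ⟨g^{r_i}, y_i^{r_i}⟩
r = random.randrange(Q)
t1, t2 = pow(g, r, P), pow(y_i, r, P)

# Verifier: challenge
e = random.randrange(Q)

# Prover: response d_i = e * a_{i+1} + r_i mod Q
d = (e * a_next + r) % Q

# Verifier: the two step-5 checks for this link
assert pow(g, d, P) == (pow(A, e, P) * t1) % P
assert pow(y_i, d, P) == (pow(y_next, e, P) * t2) % P
```

Both equalities hold identically for an honest prover, since g and y_i have order Q, so exponents may be reduced mod Q.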

In addition to the ideas used by the protocol for y = f_s(x), the protocol for y ≠ f_s(x) uses the randomized reduction of Section in order to prove that a value is not the result of the Diffie-Hellman protocol.

Protocol (ZK-proof for y ≠ f_s(x)).

Let the setting and notation be as in Protocol . The protocol for proving y ≠ f_s(x) is defined as follows:

1. V chooses a value e uniformly from Z_Q and a uniform subset J of [n]. V sends COMMIT(e, J) to P.

2. For each k ≤ i ≤ n − 1, P computes c_i = ∏_{j=k}^{i} a_j mod Q and sends y_i = g^{c_i} to V; for each k ≤ i < n − 1, P also chooses r_i uniformly from Z_Q and sends ⟨g^{r_i}, y_i^{r_i}⟩ to V.

   Also, for every 1 ≤ i ≤ n, P chooses u_i and v_i uniformly from Z_Q and sends the pair ⟨g^{v_i} · g̃^{u_i}, y_{n−1}^{u_i·a_n + v_i}⟩ to V.

3. V sends REVEAL(e, J) to P.

4. For each k ≤ i < n − 1, P sends d_i = e · a_{i+1} + r_i mod Q to V.

   Also, for every i ∈ J, P sends u_i and v_i to V, and for every i ∈ [n] \ J, P sends u_i·a_n + v_i to V.

5. V accepts if the following conditions hold: (1) y_k = g^{a_k}, and for each k ≤ i < n − 1, g^{d_i} = (g^{a_{i+1}})^e · g^{r_i} and y_i^{d_i} = y_{i+1}^e · y_i^{r_i}; (2) for each i ∈ J, the i-th pair is of the form ⟨g^{v_i} · g̃^{u_i}, β_i⟩ with β_i ≠ y^{u_i} · y_{n−1}^{v_i} (i.e., P sent correctly formed values whose committed result differs from y); (3) for each i ∈ [n] \ J, the i-th pair equals ⟨g^{u_i·a_n + v_i}, y_{n−1}^{u_i·a_n + v_i}⟩ (i.e., P sent the correct value).


Properties. As mentioned above, we do not give formal proofs for the properties of Protocols and . Nevertheless, we now informally describe these properties:

Completeness: A prover that knows the key s can always convince the legal verifier of the correct statement, y = f_s(x) or y ≠ f_s(x).

Soundness: No prover can convince the legal verifier of an incorrect statement, y = f_s(x) or y ≠ f_s(x), with non-negligible probability. In case the commitment scheme is not a strong-receiver commitment scheme, soundness only holds against a computationally bounded (possibly cheating) prover.

Zero-Knowledge: The conversation of any (possibly cheating) V* with P that is trying to prove a correct statement, y = f_s(x) or y ≠ f_s(x), can be efficiently simulated by an oracle machine that has access to any value f_s(x') where at least one of the bits x'_{n-1} and x'_n is zero.

Function Sharing

For many applications (such as the undeniable-signature scheme described above), one would like a simple function-sharing scheme for pseudo-random functions. In recent years there has been considerable work on threshold public-key cryptography, and in particular on function-sharing schemes for trapdoor permutations. However, threshold cryptography is very desirable in the setting of private key as well. In particular, it is possible to define function-sharing schemes for pseudo-random functions in an analogous way to the definitions of De Santis et al. Informally, given some monotone access structure over a set of parties and a key s of a pseudo-random function f_s, we want a way to give party i a function Sh_i such that the following two conditions hold: (1) For any value x and any authorized subset of the parties J, it is easy to compute f_s(x) given {Sh_j(x)}_{j∈J}. (2) Let J be any unauthorized subset of the parties and A any efficient algorithm that is given the shares {Sh_j}_{j∈J} as input. Then, even after A adaptively queries all the shares {Sh_j}_{j∉J} at points ⟨x^1, ..., x^m⟩ of its choice, A cannot tell apart the restriction of f_s to all other inputs (apart from ⟨x^1, ..., x^m⟩) from a random function.

Unfortunately, we do not know of an efficient function-sharing scheme for the pseudo-random functions that are constructed in this section, nor for any other pseudo-random function (various approximations of function-sharing schemes for pseudo-random functions appear in the literature). We do, however, know how to distribute these functions in a weaker sense. The main difference is that now an authorized subset of the parties is required to engage in a protocol in order to compute f_s(x). Such a scheme has several disadvantages compared with a function-sharing scheme. In particular, the definition of security is more delicate, since it should consider executions of the protocol when an unauthorized subset is cheating. However, such a scheme might still be useful.

Consider a function f_s = f_{P,Q,g,a} ∈ F_n, where F = {F_n}_{n∈N} is as in Construction and a = ⟨a_1, a_2, ..., a_n⟩. In order to distribute f_s we can simply share its key s among the parties (in fact, since P, Q and g can be public, it is enough to share a). Whenever a legal subset wants to compute the function at a point x, it can engage in a secure function evaluation of f_s(x). This idea works for every function f_s, using standard secret-sharing schemes and universal techniques for secure function evaluation. However, the specific protocols that are possible in our case are much more efficient, given the simplicity of our functions. As an example, assume that the only authorized subset is [n] itself. Party i gets the key of the function Sh_i = f_{P,Q,g,a^i} ∈ F_n, where a^i = ⟨a^i_1, a^i_2, ..., a^i_n⟩ and the values {a^i_j} are uniformly chosen subject to the condition that for every j ∈ [n], ∏_i a^i_j = a_j mod Q. Now, for every input x, if Sh_i(x) = g^{c_i} and f_s(x) = g^c, then ∏_i c_i = c mod Q. Therefore, the computation of f_s(x) can be done in n steps: at step 1, the first party publishes g^{c_1}; at step i, party i can compute and publish g^{∏_{j≤i} c_j} (and perhaps also prove this value). Additional access structures for which we can distribute the function f_{P,Q,g,a}: (1) t-out-of-n (the authorized subsets are the ones with at least t elements); (2) access structures that can be described by a small monotone formula.
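The n-out-of-n case can be sketched in a few lines of Python (hypothetical toy parameters; a real instantiation would use a large prime Q and a generator g of order Q modulo P):

```python
import random

P, Q, g = 23, 11, 4          # g = 4 has order 11 mod 23; toy parameters only
n = 4                        # number of key elements = number of parties

def f(a, x_bits):
    # f_{P,Q,g,a}(x) = g^(product of a_j over bits x_j = 1) mod P
    c = 1
    for a_j, b in zip(a, x_bits):
        if b:
            c = (c * a_j) % Q
    return pow(g, c, P)

a = [random.randrange(1, Q) for _ in range(n)]

# Multiplicative sharing: party i holds a^i with prod_i a^i_j = a_j (mod Q).
shares = [[random.randrange(1, Q) for _ in range(n)] for _ in range(n - 1)]
last = []
for j in range(n):
    prod = 1
    for sh in shares:
        prod = (prod * sh[j]) % Q
    last.append((a[j] * pow(prod, -1, Q)) % Q)   # Q prime, so inverses exist
shares.append(last)

# Joint evaluation: party 1 publishes g^{c_1}; party i raises the previous
# value to its own exponent c_i, so the final value is g^{c_1 c_2 ... c_n}.
x = [1, 0, 1, 1]
val = g
for sh in shares:
    c_i = 1
    for s_j, b in zip(sh, x):
        if b:
            c_i = (c_i * s_j) % Q
    val = pow(val, c_i, P)
assert val == f(a, x)
```

Since the c_i multiply in the exponent, the published intermediate values are exactly the n-step computation described above.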

Oblivious Evaluation

We also suggest a new and attractive feature for a pseudo-random function: a protocol for oblivious evaluation of its value. Assume that a party A knows a key s of a pseudo-random function. Then A and a second party B can perform a protocol during which B learns exactly one value f_s(x) of its choice, whereas A does not learn a thing (and in particular does not learn x). One possible application of such a protocol is blind authentication. It is also interesting to compare with a protocol for oblivious transfer, during which B learns one of two possible values, whereas in oblivious evaluation B learns one of exponentially many possible values.

We now describe a preliminary design for oblivious evaluation of the functions of Construction . This is a rather inefficient protocol, since it requires n rounds; still, it is much more efficient than what we can show for previous constructions of pseudo-random functions. The main idea of the protocol is the following: let u_j = g^{r·∏_{i<j} a_i^{x_i}} and v_j = g^{r'·∏_{i<j} a_i^{x_i}} be the output of step j, for two values r and r'. Now B chooses t and t' uniformly at random, computes u_j^t and v_j^{t'}, and sends this pair to A in a uniformly chosen order (in fact, B also has to prove that he knows such values t and t'). If x_j = 1, B asks for u_j^{t·a_j}, and otherwise for v_j^{t'·a_j}.

We note that, subsequently to our work, two rather efficient oblivious-evaluation schemes were designed for our functions: by Benny Pinkas and Ronald Cramer, and by Benny Pinkas and Moni Naor.

Further Research

In Sections – we discussed the existence of pseudo-random synthesizers in NC. Additional work should be done in this area. The most obvious question is what general assumptions (in cryptography or in other fields) imply the existence of pseudo-random synthesizers in NC; in particular, whether there exist parallel constructions of pseudo-random synthesizers out of pseudo-random generators, or directly from one-way functions.

It is also of interest to find parallel constructions of pseudo-random synthesizers based on other concrete intractability assumptions. A task of practical importance is to derive more efficient concrete constructions of pseudo-random synthesizers, in order to get efficient constructions of pseudo-random functions. As described in Section , an important contribution to the efficiency of the pseudo-random functions would be a direct construction of synthesizers with linear output length.

An extensive research field deals with pseudo-random generators that fool algorithms performing space-bounded computations. This kind of generators can be constructed without any unproven assumptions. It is possible that the concept of pseudo-random synthesizers and the idea of our construction can be applied to the world of space-bounded computations. As a motivating remark, note that the construction there bears some resemblance to the GGM-construction.

In some sense, we can think of the inner-product function as a pseudo-random synthesizer for space-bounded computation: let IP(x, y) be the inner product of x and y mod 2, and let X and Y be random length-m sequences of n-bit strings. For some constant α > 0 and s = αn, it can be shown that C_IP(X, Y) is a pseudo-random generator for SPACE(s) with parameter m·2^{−s}, when C_IP(X, Y) is given row by row. The only fact we use is that approximating IP is hard in the communication-complexity model.
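The generator C_IP itself is just the matrix of all pairwise inner products; a minimal sketch:

```python
import secrets

def ip(x: int, y: int) -> int:
    # Inner product mod 2 of two n-bit strings represented as integers.
    return bin(x & y).count("1") % 2

def c_ip(X, Y):
    # The m x m output matrix: entry (i, j) is IP(x_i, y_j).
    return [[ip(x, y) for y in Y] for x in X]

n, m = 16, 4
X = [secrets.randbits(n) for _ in range(m)]
Y = [secrets.randbits(n) for _ in range(m)]
G = c_ip(X, Y)               # given to the space-bounded machine row by row
print(G)
```

From 2mn truly random bits the generator produces m² output bits, which is the stretch that makes the synthesizer view useful.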

One might also try to apply the concept of pseudo-random synthesizers to other classes of algorithms; for example, to construct pseudo-random generators for polynomial-size constant-depth circuits and, in general, for any class for which hard problems are known.

Our primary motivation for introducing pseudo-random synthesizers is the parallel construction of pseudo-random functions. The special characteristics of pseudo-random synthesizers lead us to believe that other desired applications may exist. For instance, pseudo-random synthesizers easily define a pseudo-random generator with large output length and the ability to directly compute subsequences of the output. This, and the properties discussed in Section , suggests that pseudo-random synthesizers may be useful for software implementations of pseudo-random generators or functions. Another possible application of the idea of Construction that should be examined is to convert encryption methods that are not immune to chosen-plaintext attacks into ones that are immune.

Section gives two very efficient constructions of pseudo-random functions. The first construction is based on the decisional DH-Assumption (Assumption ) and the second construction is based on a generalization of the computational DH-Assumption (Assumption ). We study these assumptions in Chapter . A natural line for further research is the additional study of the validity of these assumptions and of the relations between these assumptions and the standard computational DH-Assumption. Since our constructions can be based on the corresponding assumptions for other groups (e.g., elliptic-curve groups), it is interesting to study the validity of these assumptions as well.

As discussed in Section , the pseudo-random functions constructed in Section are motivated by the constructions of Section . In fact, Section can be described as giving direct and efficient constructions of n-dimensional pseudo-random synthesizers (see Section ). An interesting problem is to construct efficient n-dimensional synthesizers based on different intractability assumptions.


Another interesting line for further research is improving the protocols described in Section and designing additional protocols, such as a non-interactive zero-knowledge proof for the value of a pseudo-random function and a function-sharing scheme in the sense of De Santis et al.

An alternative direction for constructing parallel pseudo-random functions is to try and generalize the philosophy behind the Data Encryption Standard (DES) while maintaining its apparent efficiency. Some interesting ideas and results on the generalization of DES can be found in Cleve's work.

Chapter

Constructions of Pseudo-Random Permutations

The work described in this chapter is a study of the LR-Construction of pseudo-random permutations from pseudo-random functions (see the introduction for a discussion of pseudo-random permutations and their applications, as well as a description of the LR-Construction). Our goal is to provide a better understanding of the LR-Construction and, as a result, improve the construction in several respects. Our main observation is that the different rounds of the LR-Construction serve significantly different roles. We show that the first and last rounds can be replaced by pairwise independent permutations, and use this in order to:

1. Simplify the proof of security of the construction (especially in the case of strong pseudo-random permutations) and provide a framework for proving the security of similar constructions.

2. Derive generalizations of the construction that are of practical and theoretical interest. The proof of security for each one of the constructions is practically free of charge, given the framework.

3. Achieve an improvement in the computational complexity of the pseudo-random permutations: two applications of a pseudo-random function on n bits suffice for computing the value of a pseudo-random permutation on 2n bits at a given point (vs. four applications in the original LR-Construction). This implies that the reduction is optimal.

As discussed in Section , the new construction is in fact a generalization of the original LR-Construction. Thus, the proof of security (Theorem ) also applies to the original construction. The following is a brief and informal description of the main results and the organization of this chapter:

Section : Presents the main construction and proves its security: pairwise independent permutations can replace the first and fourth rounds of the LR-Construction (see Figure for an illustration).

Section : Highlights the high-level structure of the proof of security, which provides a framework that enables us to relax and generalize the main construction.

Section : Shows how the main construction can be relaxed by (1) using a single pseudo-random function instead of two, and (2) using weaker and more efficient permutations or functions instead of the pairwise independent permutations.

Section : Provides a simple generalization of the main construction using t rounds of generalized Feistel permutations instead of two; the success probability of the distinguisher is reduced from approximately m²/2^n to approximately m^t/2^{(t−1)·n}, where the permutation is on 2n bits and the distinguisher makes at most m queries (see Figure for an illustration).

Section : Provides a second generalization of the main construction: instead of applying Feistel permutations on the entire outputs of the first and second rounds, Feistel permutations can be separately applied on each one of their sub-blocks. This is a construction of a strong pseudo-random permutation on many blocks using pseudo-random functions on a single block (see Figure for an illustration).

Section : Analyzes the different constructions of the chapter as constructions of k-wise δ-dependent permutations.

Section : Suggests directions for further research.

Notation

We use in this chapter notation and definitions given in Chapter ; in particular, we use the definitions of pseudo-randomness and k-wise independence that appear there. Additional notation that is used in this chapter includes:

- I^n denotes the set of all n-bit strings, {0,1}^n.

- F_n denotes the set of all I^n → I^n functions, and P_n denotes the set of all such permutations (P_n ⊆ F_n). In this chapter we concentrate on pseudo-random functions in F_n and pseudo-random permutations in P_{2n}.

- For x ∈ I^{2n}, denote by x|_L the first (left) n bits of x and by x|_R the last (right) n bits of x.

Definition (Feistel Permutations). For any function f ∈ F_n, let D_f ∈ P_{2n} be the permutation defined by D_f(L, R) def= (R, L ⊕ f(R)), where |L| = |R| = n. (D stands for 'DES-like', another common term for these permutations.)

CONSTRUCTION OF PPE AND SPPE

L 0 R0 Input

f1 h1

LR11 L 0 R0

f2 f1

LR22 LR 11

f3 f2

LR33LR 22

f4 -1 h2

LR44 Output

(a) (b)

Figure Constructions of SPPE a The original LRConstruction b The revised

Construction In a and b i L R and R L f R In b

i i i i i i

hL R i h I nput and O utput h hL R i

Notice that Feistel permutations are as easy to invert as they are to compute, since the inverse permutation satisfies D_f^{−1}(L, R) = (R ⊕ f(L), L); that is, D_f^{−1}(L, R) = σ(D_f(σ(L, R))) for σ(L, R) def= (R, L). Therefore, the LR-Construction and its different variants, which are introduced in the following sections, are easy to invert.

Recall that the Luby and Rackoff design of PPE (resp. SPPE) is D_{f_3} ∘ D_{f_2} ∘ D_{f_1} (resp. D_{f_4} ∘ D_{f_3} ∘ D_{f_2} ∘ D_{f_1}), where all the f_i's are independent length-preserving pseudo-random functions and D_{f_i} is as in Definition (see Figure (a) for an illustration).
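A minimal Python sketch of a Feistel permutation and its inverse (the round function f below is an arbitrary non-cryptographic stand-in; any f: n bits → n bits works):

```python
N = 8                          # half-block size n, toy value
MASK = (1 << N) - 1

def feistel(f, L, R):
    # D_f(L, R) = (R, L xor f(R))
    return R, L ^ f(R)

def feistel_inv(f, L, R):
    # D_f^{-1}(L, R) = (R xor f(L), L)
    return R ^ f(L), L

f = lambda r: (r * 167 + 13) & MASK    # arbitrary (non-cryptographic) round function
L, R = 0x3A, 0xC5
assert feistel_inv(f, *feistel(f, L, R)) == (L, R)

# D_f^{-1} = sigma o D_f o sigma, where sigma swaps the two halves:
sigma = lambda L, R: (R, L)
assert feistel_inv(f, L, R) == sigma(*feistel(f, *sigma(L, R)))
```

Note that inverting never requires inverting f itself, which is why Feistel rounds turn any function into a permutation.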

Construction of PPE and SPPE

Intuition

As mentioned above, a principal observation of this work is that the different rounds of the LR-Construction serve significantly different roles. To illustrate this point, consider two rounds of the construction, namely E = D_{f_2} ∘ D_{f_1}, where f_1, f_2 ∈ F_n are two independently chosen pseudo-random functions. It is not hard to verify that E is computationally indistinguishable from a random permutation to any efficient algorithm that has access to pairs {⟨x_i, E(x_i)⟩}_{i=1}^{m}, where the sequence {x_i}_{i=1}^{m} is uniformly distributed. The intuition is as follows. Note that it is enough to prove the pseudo-randomness of E when f_1 and f_2 are truly random functions (instead of pseudo-random). Let L_i R_i = x_i and L'_i R'_i = E(x_i); by the definition of E we get that L'_i = L_i ⊕ f_1(R_i) and R'_i = R_i ⊕ f_2(L'_i). Since the sequence {x_i}_{i=1}^{m} is uniformly distributed, we have that with good probability (better than 1 − m²/2^n), R_i ≠ R_j for all i ≠ j. Conditioned on this event, the sequence {L'_i}_{i=1}^{m} is uniformly distributed and independent of the sequence {x_i}_{i=1}^{m} (since f_1 is random). We now have that with good probability, L'_i ≠ L'_j for all i ≠ j. Conditioned on this event, the sequence {R'_i}_{i=1}^{m} is uniformly distributed and independent of both {L'_i}_{i=1}^{m} and {x_i}_{i=1}^{m}. Notice that this argument still works if the sequence {x_i}_{i=1}^{m} is only pairwise independent.

Nevertheless, as Luby and Rackoff showed, E can be easily distinguished from a random permutation by an algorithm that gets to see the value of E (or E^{−1}) on inputs of its choice. The reason is that for any values L_1, L_2 and R such that L_1 ≠ L_2, we have that E(L_1, R)|_L ⊕ E(L_2, R)|_L = L_1 ⊕ L_2; in contrast, for a truly random permutation the probability of this event is 2^{−n}. This is the reason that the LR-Construction includes three (or four) rounds. If we think of the second and third rounds of the LR-Construction as the permutation E, then the discussion above implies that the role of the first and fourth rounds is to prevent the distinguisher from directly choosing the inputs of E and E^{−1}. We show that this goal can also be achieved with combinatorial constructions (e.g., pairwise independent permutations) rather than cryptographic (i.e., pseudo-random) functions. In particular, the LR-Construction remains secure when the first and fourth rounds are replaced with pairwise independent permutations (see Figure for an illustration).
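The two-round distinguisher is easy to demonstrate concretely: the left half of E(L, R) is L ⊕ f_1(R), so two chosen queries that share the same right half expose the XOR relation. A toy Python sketch:

```python
import random

N = 8                                   # half-block size n (toy value)

def rand_func():
    # A truly random function table I^n -> I^n.
    table = [random.randrange(1 << N) for _ in range(1 << N)]
    return lambda r: table[r]

def E(f1, f2, L, R):
    # Two Feistel rounds: E = D_{f2} o D_{f1}
    L, R = R, L ^ f1(R)
    L, R = R, L ^ f2(R)
    return L, R

f1, f2 = rand_func(), rand_func()
L1, L2, R = 0x12, 0x7F, 0x33

# The left output half of E is L xor f1(R), so for a common R the relation
# below always holds, while for a random permutation it would hold with
# probability only 2^-n:
assert E(f1, f2, L1, R)[0] ^ E(f1, f2, L2, R)[0] == L1 ^ L2
```

The relation holds for every choice of f_1, f_2, which is exactly why two rounds never suffice against chosen-input queries.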

Construction and Main Result

Definition. For any f_1, f_2 ∈ F_n and h_1, h_2 ∈ P_{2n}, define

    W(h_1, f_1, f_2)  def=  D_{f_2} ∘ D_{f_1} ∘ h_1

and

    S(h_1, f_1, f_2, h_2)  def=  h_2^{−1} ∘ D_{f_2} ∘ D_{f_1} ∘ h_1
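For toy block sizes, S can be assembled end-to-end. The sketch below uses affine maps x ↦ a·x ⊕ b over GF(2^8) as the pairwise independent permutations (one standard choice, assumed here for illustration; any pairwise independent permutation family would do) and explicit random function tables for f_1 and f_2:

```python
import random

N = 4                        # half-block size n; the permutation is on 2n = 8 bits
W = 2 * N

def gf_mul(a, b, poly=0x11B, w=8):
    # Carry-less multiplication in GF(2^8) modulo x^8 + x^4 + x^3 + x + 1.
    res = 0
    while b:
        if b & 1:
            res ^= a
        b >>= 1
        a <<= 1
        if a >> w:
            a ^= poly
    return res

def feistel(f, x, n=N):
    # One Feistel round D_f on a 2n-bit value packed into an int.
    L, R = x >> n, x & ((1 << n) - 1)
    L, R = R, L ^ f(R)
    return (L << n) | R

def rand_func(n=N):
    table = [random.randrange(1 << n) for _ in range(1 << n)]
    return lambda r: table[r]

def make_h():
    # Pairwise independent permutation h(x) = a*x + b over GF(2^8), a != 0.
    a = random.randrange(1, 1 << W)
    b = random.randrange(1 << W)
    a_inv = next(z for z in range(1, 1 << W) if gf_mul(a, z) == 1)  # brute force; toy sizes
    return (lambda x: gf_mul(a, x) ^ b), (lambda y: gf_mul(a_inv, y ^ b))

h1, h1_inv = make_h()
h2, h2_inv = make_h()
f1, f2 = rand_func(), rand_func()

def S(x):
    # S(h1, f1, f2, h2) = h2^{-1} o D_{f2} o D_{f1} o h1
    return h2_inv(feistel(f2, feistel(f1, h1(x))))

# S is a permutation on 2n-bit strings:
assert sorted(S(x) for x in range(1 << W)) == list(range(1 << W))
```

Only two applications of an n-bit function occur per evaluation; the affine layers cost a single field multiplication each.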

Theorem. Let h_1, h_2 ∈ P_{2n} be pairwise independent permutations (similarly to Remark , this is an abbreviation for "distributed according to a pairwise independent permutation ensemble"), and let f_1, f_2 ∈ F_n be pseudo-random functions (h_1, h_2, f_1 and f_2 are independently chosen). Then W = W(h_1, f_1, f_2) is a pseudo-random permutation and S = S(h_1, f_1, f_2, h_2) is a strong pseudo-random permutation (W and S as in Definition ).

Furthermore, assume that no efficient oracle-machine that makes at most m = m(n) queries distinguishes between the pseudo-random functions and random functions with advantage ε = ε(n) (see Definition ). Then no efficient oracle-machine that makes at most m queries to W (resp. S and S^{−1}) distinguishes W (resp. S) from a random permutation with advantage larger than 2ε + m²/2^n + m²/2^{2n}.


Remark. The conditions of Theorem are meant to simplify the exposition of the theorem and of its proof. These conditions can be relaxed, as discussed in Section . The main points are the following:

1. A single pseudo-random function f can replace both f_1 and f_2.

2. h_1 and h_2 may obey weaker requirements than pairwise independence. For example, it is enough that for every x ≠ y,

    Pr[h_1(x)|_R = h_1(y)|_R] ≤ 2^{−n}   and   Pr[h_2(x)|_L = h_2(y)|_L] ≤ 2^{−n}.

Proof of Security

We now prove the security of the SPPE-construction (the proof of security for the PPE-construction is very similar and, in fact, a bit simpler). As with the original LR-Construction, the main task is to prove that the permutations are pseudo-random when f_1 and f_2 are truly random (instead of pseudo-random):

Theorem. Let h_1, h_2 ∈ P_{2n} be pairwise independent permutations and let f_1, f_2 ∈ F_n be random functions. Define S = S(h_1, f_1, f_2, h_2) as in Definition , and let R ∈ P_{2n} be a random permutation. Then for any oracle machine M (not necessarily an efficient one) that makes at most m queries:

    |Pr[M^{S,S^{−1}}(1^{2n}) = 1] − Pr[M^{R,R^{−1}}(1^{2n}) = 1]| ≤ m²/2^n + m²/2^{2n}

Theorem follows easily from Theorem (see a proof sketch in the sequel). In order to prove Theorem , we introduce additional notation.

Let G denote the permutation that is accessible to the machine M (G is either S or R). There are two types of queries M can make: either (+, x), which denotes the query "what is G(x)?", or (−, y), which denotes the query "what is G^{−1}(y)?". For the i-th query M makes, define the query-answer pair ⟨x_i, y_i⟩ ∈ I^{2n} × I^{2n}, where either M's query was (+, x_i) and the answer it got was y_i, or M's query was (−, y_i) and the answer it got was x_i. We assume that M makes exactly m queries and refer to the sequence {⟨x_1, y_1⟩, ..., ⟨x_m, y_m⟩} of all these pairs as the transcript of M's computation.

Notice that no limitations were imposed on the computational power of M. Therefore, M can be assumed to be deterministic (we can always fix the random tape that maximizes the advantage M achieves). This assumption implies that for every 1 ≤ i ≤ m, the i-th query of M is fully determined by the first i − 1 query-answer pairs. Thus, for every i, it can be determined from the transcript whether the i-th query was (+, x_i) or (−, y_i). We also get that M's output is a deterministic function of its transcript. Denote by C_M[{⟨x_1, y_1⟩, ..., ⟨x_{i−1}, y_{i−1}⟩}] the i-th query of M as a function of the previous query-answer pairs, and denote by C_M[{⟨x_1, y_1⟩, ..., ⟨x_m, y_m⟩}] the output of M as a function of its transcript.

Definition. Let σ be a sequence {⟨x_1, y_1⟩, ..., ⟨x_m, y_m⟩}, where for 1 ≤ i ≤ m we have that ⟨x_i, y_i⟩ ∈ I^{2n} × I^{2n}. Then σ is a possible M-transcript if for every 1 ≤ i ≤ m,

    C_M[{⟨x_1, y_1⟩, ..., ⟨x_{i−1}, y_{i−1}⟩}] ∈ {(+, x_i), (−, y_i)}.

Let us consider yet another distribution on the answers to M's queries, which in turn induces another distribution on the possible M-transcripts. Consider a random process R̂ that on the i-th query of M answers as follows:

1. If M's query is (+, x) and for some 1 ≤ j < i the j-th query-answer pair is ⟨x, y⟩, then R̂'s answer is y (for an arbitrary such query-answer pair ⟨x, y⟩).

2. If M's query is (−, y) and for some 1 ≤ j < i the j-th query-answer pair is ⟨x, y⟩, then R̂'s answer is x (for an arbitrary such query-answer pair ⟨x, y⟩).

3. If neither 1 nor 2 holds, then R̂'s answer is a uniformly chosen 2n-bit string.

It is possible that R̂ provides answers that are not consistent with any permutation:

Definition. Let σ = {⟨x_1, y_1⟩, ..., ⟨x_m, y_m⟩} be any possible M-transcript. σ is inconsistent if for some 1 ≤ j < i ≤ m the corresponding query-answer pairs satisfy x_i = x_j and y_i ≠ y_j, or y_i = y_j and x_i ≠ x_j. Otherwise, σ is consistent.
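The process R̂ translates directly into code; the following sketch answers queries of the form ('+', x) or ('-', y) exactly by rules 1-3 above (toy block size, with random queries standing in for M):

```python
import random

W = 8  # block size 2n, toy value

def r_hat_answer(history, query):
    # history: list of (x, y) query-answer pairs; query: ('+', x) or ('-', y)
    sign, v = query
    for x, y in history:
        if sign == '+' and x == v:
            return y                    # rule 1: repeated forward query
        if sign == '-' and y == v:
            return x                    # rule 2: repeated backward query
    return random.randrange(1 << W)     # rule 3: fresh uniform answer

history = []
for _ in range(10):
    q = (random.choice('+-'), random.randrange(1 << W))
    ans = r_hat_answer(history, q)
    history.append((q[1], ans) if q[0] == '+' else (ans, q[1]))
```

An inconsistency arises exactly when a rule-3 uniform answer happens to collide with an earlier pair, which is the event bounded below.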

We first show (in Proposition ) that the advantage M might have in distinguishing between the process R̂ and the random permutation R is small. The reason is that as long as R̂ answers consistently (which happens with good probability), it behaves exactly as a random permutation. In order to formalize this, we consider the different distributions on the transcript of M induced by the different distributions on the answers it gets:

Definition. Let T_S, T_R and T_R̂ be the random variables such that T_S is the transcript of M when its queries are answered by S, T_R is the transcript of M when its queries are answered by R, and T_R̂ is the transcript of M when its queries are answered by R̂.

Notice that by these definitions and by our assumptions, M^{S,S^{−1}}(1^{2n}) and C_M[T_S] are the same random variables, and so are M^{R,R^{−1}}(1^{2n}) and C_M[T_R].

Proposition.

    |Pr[C_M[T_R] = 1] − Pr[C_M[T_R̂] = 1]| ≤ m²/2^{2n}

Proof. For any possible and consistent M-transcript σ we have that

    Pr[T_R = σ] = Pr[T_R̂ = σ | T_R̂ is consistent]

Therefore, the distribution of T_R̂ conditioned on T_R̂ being consistent is exactly the distribution of T_R. Furthermore, the probability that T_R̂ is inconsistent is small: T_R̂ is inconsistent if for some 1 ≤ j < i ≤ m the corresponding query-answer pairs satisfy x_i = x_j and y_i ≠ y_j, or y_i = y_j and x_i ≠ x_j. For a given i and j, this event happens with probability at most 2 · 2^{−2n}. Hence,

    Pr[T_R̂ is inconsistent] ≤ m(m−1) · 2^{−2n} ≤ m²/2^{2n}

The proposition follows:

    |Pr[C_M[T_R] = 1] − Pr[C_M[T_R̂] = 1]|
      = |Pr[C_M[T_R̂] = 1 | T_R̂ is consistent] − Pr[C_M[T_R̂] = 1]|
      = |Pr[C_M[T_R̂] = 1 | T_R̂ is consistent] − Pr[C_M[T_R̂] = 1 | T_R̂ is inconsistent]| · Pr[T_R̂ is inconsistent]
      ≤ Pr[T_R̂ is inconsistent]
      ≤ m²/2^{2n}

It remains to bound the advantage M might have in distinguishing between T_S and T_R̂. The intuition is that for every possible and consistent M-transcript σ, unless some "bad" and rare event on the choice of h_1 and h_2 (as in the definition of S) happens, the probability that T_S = σ is exactly the same as the probability that T_R̂ = σ. We now formally define this event (Definition ) and bound its probability (Proposition ).

Convention. For any possible M-transcript σ = {⟨x_1, y_1⟩, ..., ⟨x_m, y_m⟩} we can assume hereafter that if σ is consistent, then for i ≠ j both x_i ≠ x_j and y_i ≠ y_j (this means that M never asks a query if its answer is determined by a previous query-answer pair).

Definition: For every specific choice of pairwise independent permutations $h_1,h_2 \in P_{2n}$ in the definition of $S$, define $\mathrm{BAD}(h_1,h_2)$ to be the set of all possible and consistent $M$-transcripts $\{\langle x_1,y_1\rangle,\ldots,\langle x_m,y_m\rangle\}$ satisfying:
\[ \exists\, 1 \le i < j \le m \text{ such that } h_1(x_i)|_R = h_1(x_j)|_R \text{ or } h_2(y_i)|_L = h_2(y_j)|_L. \]

Proposition: Let $h_1,h_2 \in P_{2n}$ be pairwise independent permutations; then for any possible and consistent $M$-transcript $\sigma = \{\langle x_1,y_1\rangle,\ldots,\langle x_m,y_m\rangle\}$ we have that
\[ \Pr_{h_1,h_2}[\sigma \in \mathrm{BAD}(h_1,h_2)] \le \frac{m^2}{2^{n}}. \]

Proof: By definition, $\sigma \in \mathrm{BAD}(h_1,h_2)$ if there exist $1 \le i < j \le m$ such that either $h_1(x_i)|_R = h_1(x_j)|_R$ or $h_2(y_i)|_L = h_2(y_j)|_L$. For any given $i$ and $j$, both $\Pr_{h_1}[h_1(x_i)|_R = h_1(x_j)|_R]$ and $\Pr_{h_2}[h_2(y_i)|_L = h_2(y_j)|_L]$ are smaller than $2^{-n}$, since $h_1$ and $h_2$ are pairwise independent. Therefore,
\[ \Pr_{h_1,h_2}[\sigma \in \mathrm{BAD}(h_1,h_2)] \le \binom{m}{2}\cdot 2\cdot 2^{-n} \le \frac{m^2}{2^{n}}. \]

The key lemma for proving the theorem is:

CHAPTER CONSTRUCTIONS OF PSEUDORANDOM PERMUTATIONS

Lemma: Let $\sigma = \{\langle x_1,y_1\rangle,\ldots,\langle x_m,y_m\rangle\}$ be any possible and consistent $M$-transcript; then
\[ \Pr_S[T_S = \sigma \mid \sigma \notin \mathrm{BAD}(h_1,h_2)] = \Pr_{\tilde R}[T_{\tilde R} = \sigma]. \]

Proof: Since $\sigma$ is a possible $M$-transcript, we have that for all $1 \le i \le m$,
\[ C_M(\{\langle x_1,y_1\rangle,\ldots,\langle x_{i-1},y_{i-1}\rangle\}) \in \{x_i, y_i\}. \]
Therefore, $T_{\tilde R} = \sigma$ iff for all $1 \le i \le m$ the $i$-th answer $\tilde R$ gives is $y_i$ (in the case that $C_M(\{\langle x_1,y_1\rangle,\ldots,\langle x_{i-1},y_{i-1}\rangle\}) = x_i$), and otherwise its $i$-th answer is $x_i$. Assume that $\tilde R$ answered correctly (i.e., $y_i$ or $x_i$ as above) for each one of the first $i-1$ queries. Then, by the Convention and the definition of $\tilde R$, its $i$-th answer is an independent and uniform $2n$-bit string. Therefore,
\[ \Pr[T_{\tilde R} = \sigma] = 2^{-2nm}. \]
Since $\sigma$ is a possible $M$-transcript, we have that $T_S = \sigma$ iff for all $1 \le i \le m$, $y_i = S(x_i)$. Consider any specific choice of permutations $h_1$ and $h_2$ for which $S = h_2^{-1} \circ D_{f_2} \circ D_{f_1} \circ h_1$ such that $\sigma \notin \mathrm{BAD}(h_1,h_2)$. Let $L_i \cdot R_i = h_1(x_i)$ and $L'_i \cdot R'_i = h_2(y_i)$. By the definition of $S$ we get that
\[ \forall i,\ y_i = S(x_i) \iff f_1(R_i) = L_i \oplus L'_i \text{ and } f_2(L'_i) = R_i \oplus R'_i. \]
For every $i \ne j$, both $R_i \ne R_j$ and $L'_i \ne L'_j$ (otherwise $\sigma \in \mathrm{BAD}(h_1,h_2)$). Therefore, since $f_1$ and $f_2$ are random, we have that for every choice of $h_1$ and $h_2$ such that $\sigma \notin \mathrm{BAD}(h_1,h_2)$, the probability that $T_S = \sigma$ is exactly $2^{-2nm}$. We can conclude that
\[ \Pr_S[T_S = \sigma \mid \sigma \notin \mathrm{BAD}(h_1,h_2)] = 2^{-2nm}, \]
which completes the proof of the lemma.

Proof of the theorem: Let $\Gamma$ be the set of all possible and consistent $M$-transcripts $\sigma$ such that $C_M(\sigma) = 1$. Then
\[
\begin{aligned}
\bigl|\Pr[C_M(T_S)=1] - \Pr[C_M(T_{\tilde R})=1]\bigr|
&\le \Bigl|\sum_{\sigma\in\Gamma} \bigl(\Pr[T_S=\sigma] - \Pr[T_{\tilde R}=\sigma]\bigr)\Bigr| + \Pr[T_{\tilde R}\text{ is inconsistent}] \\
&\le \Bigl|\sum_{\sigma\in\Gamma} \bigl(\Pr[T_S=\sigma \mid \sigma\notin\mathrm{BAD}(h_1,h_2)] - \Pr[T_{\tilde R}=\sigma]\bigr)\cdot \Pr_{h_1,h_2}[\sigma\notin\mathrm{BAD}(h_1,h_2)]\Bigr| \\
&\quad + \Bigl|\sum_{\sigma\in\Gamma} \bigl(\Pr[T_S=\sigma \mid \sigma\in\mathrm{BAD}(h_1,h_2)] - \Pr[T_{\tilde R}=\sigma]\bigr)\cdot \Pr_{h_1,h_2}[\sigma\in\mathrm{BAD}(h_1,h_2)]\Bigr| \\
&\quad + \Pr[T_{\tilde R}\text{ is inconsistent}].
\end{aligned}
\]


We already showed, in the proof of the proposition, that the value of the third expression is smaller than $\frac{m^2}{2^{2n}}$; by the lemma, the value of the first expression is $0$. Therefore it remains to bound the value of the second expression. Assume, without loss of generality, that
\[ \sum_{\sigma\in\Gamma} \Pr[T_S=\sigma \mid \sigma\in\mathrm{BAD}(h_1,h_2)]\cdot \Pr_{h_1,h_2}[\sigma\in\mathrm{BAD}(h_1,h_2)] \;\le\; \sum_{\sigma\in\Gamma} \Pr[T_{\tilde R}=\sigma]\cdot \Pr_{h_1,h_2}[\sigma\in\mathrm{BAD}(h_1,h_2)]; \]
then, using the proposition, we get that
\[
\begin{aligned}
\Bigl|\sum_{\sigma\in\Gamma} \bigl(\Pr[T_S=\sigma \mid \sigma\in\mathrm{BAD}(h_1,h_2)] - \Pr[T_{\tilde R}=\sigma]\bigr)\cdot \Pr_{h_1,h_2}[\sigma\in\mathrm{BAD}(h_1,h_2)]\Bigr|
&\le \sum_{\sigma\in\Gamma} \Pr[T_{\tilde R}=\sigma]\cdot \Pr_{h_1,h_2}[\sigma\in\mathrm{BAD}(h_1,h_2)] \\
&\le \max_{\sigma}\, \Pr_{h_1,h_2}[\sigma\in\mathrm{BAD}(h_1,h_2)] \\
&\le \frac{m^2}{2^{n}}.
\end{aligned}
\]
Thus we can conclude that
\[ \bigl|\Pr[C_M(T_S)=1] - \Pr[C_M(T_{\tilde R})=1]\bigr| \le \frac{m^2}{2^{n}} + \frac{m^2}{2^{2n}}. \]
Using the first proposition, we complete the proof:
\[
\begin{aligned}
\bigl|\Pr[M^{S,S^{-1}}(1^{2n})=1] - \Pr[M^{R,R^{-1}}(1^{2n})=1]\bigr|
&= \bigl|\Pr[C_M(T_S)=1] - \Pr[C_M(T_R)=1]\bigr| \\
&\le \bigl|\Pr[C_M(T_S)=1] - \Pr[C_M(T_{\tilde R})=1]\bigr| + \bigl|\Pr[C_M(T_{\tilde R})=1] - \Pr[C_M(T_R)=1]\bigr| \\
&\le \frac{m^2}{2^{n}} + \frac{2m^2}{2^{2n}}.
\end{aligned}
\]
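To make the object of this proof concrete, here is a minimal Python sketch of the construction $S = h_2^{-1} \circ D_{f_2} \circ D_{f_1} \circ h_1$ with toy parameters ($n = 4$). The pairwise independent permutations are instantiated as $x \mapsto a \cdot x \oplus c$ over $GF(2^8)$ and the truly random functions as lookup tables; this is only a structural illustration under those assumptions, not a secure instantiation.

```python
import random

N = 4                      # half-block size n; S permutes 2n = 8-bit strings
W = 2 * N
POLY = 0x11b               # x^8 + x^4 + x^3 + x + 1, irreducible over GF(2)

def gf_mul(a, b):          # multiplication in GF(2^8)
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a >> 8:
            a ^= POLY
    return r

def feistel(f, x):         # Feistel permutation D_f(L, R) = (R, L xor f(R))
    L, R = x >> N, x & ((1 << N) - 1)
    return (R << N) | (L ^ f[R])

random.seed(1)
f1 = [random.randrange(1 << N) for _ in range(1 << N)]   # stand-in for random f1 in F_n
f2 = [random.randrange(1 << N) for _ in range(1 << N)]   # stand-in for random f2 in F_n

def pairwise_perm():       # h(x) = a*x xor c over GF(2^8), a != 0: pairwise independent family
    a = random.randrange(1, 1 << W)
    c = random.randrange(1 << W)
    fwd = [gf_mul(a, x) ^ c for x in range(1 << W)]
    inv = [0] * (1 << W)
    for x, y in enumerate(fwd):
        inv[y] = x
    return fwd, inv

h1, _ = pairwise_perm()
_, h2_inv = pairwise_perm()

def S(x):                  # S = h2^-1 o D_f2 o D_f1 o h1
    return h2_inv[feistel(f2, feistel(f1, h1[x]))]

# S is a permutation on 8-bit strings
assert sorted(S(x) for x in range(1 << W)) == list(range(1 << W))
```

Since every component is a bijection, $S$ is a permutation; the security analysis above concerns how close its query-answer behavior is to that of a random permutation.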

Given the theorem above, the proof of the pseudorandom-function version is essentially the same as the corresponding proof for the original LR-Construction (the proof of the corresponding theorem of Luby and Rackoff, given their main lemma). The proof idea is the following. Define three distributions:

$S^1 = h_2^{-1} \circ D_{f_2} \circ D_{f_1} \circ h_1$, where $h_1,h_2 \in P_{2n}$ are pairwise independent and $f_1,f_2 \in F_n$ are pseudorandom functions;

$S^2 = h_2^{-1} \circ D_{g} \circ D_{f} \circ h_1$, where $h_1,h_2 \in P_{2n}$ are pairwise independent, $f \in F_n$ is a pseudorandom function and $g \in F_n$ is a random function;

$S^3 = h_2^{-1} \circ D_{g_2} \circ D_{g_1} \circ h_1$, where $h_1,h_2 \in P_{2n}$ are pairwise independent and $g_1,g_2 \in F_n$ are random functions.


It is enough to show that for every efficient oracle machine $M$, every polynomial $p(\cdot)$, and all but a finite number of $n$'s,
\[ \bigl|\Pr[M^{S^1,(S^1)^{-1}}(1^{2n})=1] - \Pr[M^{S^2,(S^2)^{-1}}(1^{2n})=1]\bigr| < \frac{1}{2p(n)} \tag{1} \]
\[ \bigl|\Pr[M^{S^2,(S^2)^{-1}}(1^{2n})=1] - \Pr[M^{S^3,(S^3)^{-1}}(1^{2n})=1]\bigr| < \frac{1}{2p(n)} \tag{2} \]
If (1) or (2) does not hold, then we can construct an efficient oracle-machine $M'$ that distinguishes the pseudorandom functions from the random functions, in contradiction to the assumption. Assume, for example, that for infinitely many $n$,
\[ \bigl|\Pr[M^{S^1,(S^1)^{-1}}(1^{2n})=1] - \Pr[M^{S^2,(S^2)^{-1}}(1^{2n})=1]\bigr| \ge \frac{1}{2p(n)}. \]
The oracle-machine $M'$, on input $1^n$ and with access to a function $O \in F_n$, first samples pairwise independent permutations $h_1,h_2 \in P_{2n}$ and a pseudorandom function $f \in F_n$. $M'$ then invokes $M$ with input $1^{2n}$ and answers its queries with the values of $S$ and $S^{-1}$, for $S = h_2^{-1} \circ D_O \circ D_f \circ h_1$. When $M$ halts, so does $M'$, and $M'$ outputs whatever $M$ outputs. Notice that if $O$ is a pseudorandom function then the distribution of $S$ is $S^1$, whereas if $O$ is a truly random function then the distribution of $S$ is $S^2$. This is the reason that $M'$ distinguishes a pseudorandom function from a random one with advantage greater than $\frac{1}{2p(n)}$. Similar hybrid arguments apply to all the other constructions of this chapter.
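The wiring of the reduction can be sketched in Python. Everything here is a toy stand-in (the distinguisher `M`, the table-based permutations, the tiny parameters are all hypothetical choices); the only point illustrated is how $M'$ answers $M$'s queries with $S = h_2^{-1} \circ D_O \circ D_f \circ h_1$ built around its own oracle $O$.

```python
import random

random.seed(4)
n = 4
mask = (1 << n) - 1

def D(f, x):                                  # Feistel round D_f(L, R) = (R, L xor f(R))
    L, R = x >> n, x & mask
    return (R << n) | (L ^ f[R])

def D_inv(f, y):                              # invert: top bits are R, bottom are L xor f(R)
    R, T = y >> n, y & mask
    return ((T ^ f[R]) << n) | R

def build_M_prime(M):
    """Turn an SPPE-distinguisher M(S, S_inv) into a function-distinguisher M'(O)."""
    def M_prime(O):                           # O: a function table standing in for the oracle
        f = [random.randrange(1 << n) for _ in range(1 << n)]
        h1 = list(range(1 << (2 * n))); random.shuffle(h1)   # stand-in for h1
        h2 = list(range(1 << (2 * n))); random.shuffle(h2)   # stand-in for h2
        h2_inv = [0] * len(h2)
        for x, y in enumerate(h2):
            h2_inv[y] = x
        S = lambda x: h2_inv[D(O, D(f, h1[x]))]              # S = h2^-1 o D_O o D_f o h1
        S_inv = lambda y: h1.index(D_inv(f, D_inv(O, h2[y])))
        return M(S, S_inv)
    return M_prime

# a trivial (useless) distinguisher, just to exercise the wiring
M = lambda S, S_inv: S_inv(S(0)) == 0
assert build_M_prime(M)([0] * (1 << n)) is True
```

If $O$ is drawn from the pseudorandom family the simulated $S$ is distributed as $S^1$, and if $O$ is truly random it is distributed as $S^2$, which is exactly the hybrid step in the text.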

The Framework

As we shall see in the following sections, the construction above can be relaxed and generalized in several ways. The different pseudorandom permutations obtained share a similar structure and an almost identical proof of security. In this section we examine the proof of the theorem above in a more abstract manner. Our goal is to establish a framework for proving almost all the constructions of this chapter, and to suggest a way for designing and proving additional constructions.

Our framework deals with constructions of a pseudorandom permutation $S$ on $2n$ bits which is the composition of three permutations, $S = h_2^{-1} \circ E \circ h_1$ (see the figure below for an illustration). In general, $h_1$ and $h_2$ are lightweight, and $E$ is where most of the work is done. $E$ is constructed from pseudorandom functions, and for the purpose of the analysis we assume (as in the theorem above) that these functions are truly random. In the construction above, $h_1$ and $h_2$ are chosen as pairwise independent permutations and $E = D_{f_2} \circ D_{f_1}$ for random $f_1,f_2 \in F_n$.

The framework starts with $E$, which may be easily distinguished from a truly random permutation, and transforms it (via $h_1$ and $h_2$) into a pseudorandom permutation. The property $E$ should have is that for almost every sequence $\{\langle x_1,y_1\rangle,\ldots,\langle x_m,y_m\rangle\}$, the probability that $\forall i,\ y_i = E(x_i)$ is close to what we have for a truly random permutation.

Definition: A sequence $\{\langle x_1,y_1\rangle,\ldots,\langle x_m,y_m\rangle\}$ is $E$-Good if
\[ \Pr_E[\forall i,\ y_i = E(x_i)] = 2^{-2nm}. \]

Figure: The high-level structure of the different constructions of SPPE. The input is fed through $h_1$, then $E$, then $h_2^{-1}$ to produce the output.

We assume that, apart from some rare sequences, all others are $E$-Good. Loosely speaking, the role of $h_1$ and $h_2$ is to ensure that under any adaptive chosen plaintext and ciphertext attack on $S$, the inputs and outputs of $E$ form an $E$-Good sequence with a very high probability.

For the exact properties needed from the distributions on $h_1$, $h_2$ and $E$, we shall try to follow the statement and proof of the theorem above. The goal is to show that $S$ is indistinguishable from a truly random permutation $R$ on $2n$ bits; specifically, that for some small $\varepsilon$ (whose choice is explained hereafter), for any oracle machine $M$ (not necessarily an efficient one) that makes at most $m$ queries,
\[ \bigl|\Pr[M^{S,S^{-1}}=1] - \Pr[M^{R,R^{-1}}=1]\bigr| \le \varepsilon + \frac{2m^2}{2^{2n}}. \]
Let the notions of a query-answer pair, a transcript, the function $C_M$, a possible $M$-transcript, the random process $\tilde R$, a consistent transcript, and the random variables $T_S$, $T_R$ and $T_{\tilde R}$ be as in the proof of the theorem above. The proposition saying that the distance between $T_R$ and $T_{\tilde R}$ is bounded by the probability that $T_{\tilde R}$ is inconsistent, and that this probability is bounded by $\frac{m^2}{2^{2n}}$, still holds. The heart of applying the framework is in specifying the bad $M$-transcripts for given $h_1$ and $h_2$. This set, $\mathrm{BAD}_E(h_1,h_2)$, replaces $\mathrm{BAD}(h_1,h_2)$ in the definition above and in the rest of the proof. It contains possible and consistent $M$-transcripts, and should have the property that any $\{\langle x_1,y_1\rangle,\ldots,\langle x_m,y_m\rangle\}$ not in $\mathrm{BAD}_E(h_1,h_2)$ satisfies that $\{\langle h_1(x_1), h_2(y_1)\rangle,\ldots,\langle h_1(x_m), h_2(y_m)\rangle\}$ is $E$-Good. Note that the earlier definition of $\mathrm{BAD}(h_1,h_2)$ is indeed a special case of the above, and also that by this property
\[ \Pr_S[T_S = \sigma \mid \sigma \notin \mathrm{BAD}_E(h_1,h_2)] = 2^{-2nm}. \]
This implies that the key lemma, with $\mathrm{BAD}(h_1,h_2)$ replaced by $\mathrm{BAD}_E(h_1,h_2)$, is true:


Lemma: Let $\sigma = \{\langle x_1,y_1\rangle,\ldots,\langle x_m,y_m\rangle\}$ be any possible and consistent $M$-transcript; then
\[ \Pr_S[T_S = \sigma \mid \sigma \notin \mathrm{BAD}_E(h_1,h_2)] = \Pr_{\tilde R}[T_{\tilde R} = \sigma]. \]
For $\mathrm{BAD}_E(h_1,h_2)$ to be useful, we must have that
\[ \Pr_{h_1,h_2}[\sigma \in \mathrm{BAD}_E(h_1,h_2)] \le \varepsilon, \]
and this substitutes the proposition bounding $\Pr[\sigma \in \mathrm{BAD}(h_1,h_2)]$. This is the only place in the proof where we use the definition of $\varepsilon$ and the definition of the distributions of $h_1$ and $h_2$. As demonstrated in the following sections, there is actually a tradeoff between reducing the requirements from $h_1$ and $h_2$ and having a somewhat larger value of $\varepsilon$. Applying this bound and the lemma as in the proof of the theorem above, we conclude:

Theorem: Let $h_1$, $h_2$ and $E$ be distributed over permutations in $P_{2n}$, let $S = h_2^{-1} \circ E \circ h_1$, and let $R \in P_{2n}$ be a random permutation. Suppose that $\mathrm{BAD}_E(h_1,h_2)$ is as above and satisfies $\Pr_{h_1,h_2}[\sigma \in \mathrm{BAD}_E(h_1,h_2)] \le \varepsilon$. Then for any oracle machine $M$ (not necessarily an efficient one) that makes at most $m$ queries,
\[ \bigl|\Pr[M^{S,S^{-1}}=1] - \Pr[M^{R,R^{-1}}=1]\bigr| \le \varepsilon + \frac{2m^2}{2^{2n}}. \]

To summarize: the major point in proving the security of the different constructions is to define the set $\mathrm{BAD}_E(h_1,h_2)$ such that for any possible and consistent $M$-transcript $\sigma$, both $\Pr_S[T_S = \sigma \mid \sigma \notin \mathrm{BAD}_E(h_1,h_2)] = 2^{-2nm}$ and $\Pr_{h_1,h_2}[\sigma \in \mathrm{BAD}_E(h_1,h_2)] \le \varepsilon$, for the specific $\varepsilon$ in the claim we are proving. This suggests that the critical step for designing a pseudorandom permutation using the framework described in this section is to come up with a permutation $E$ such that the set of $E$-Good sequences is large enough and nice enough. Note that, to meet this end, one can use different (or more general) definitions of an $E$-Good sequence with only minor changes to the proof (as is the case for the permutation $S'$ considered later in this chapter).

Relaxing the Construction

PPE and SPPE with a Single Pseudo-Random Function

Since Luby and Rackoff introduced their construction, a considerable amount of research has focused on the following question:

Can we obtain a similar construction of PPE or SPPE such that every permutation will be constructed from a single pseudorandom function?

Apparently, this line of research originated in the work of Schnorr. Schnorr considered the LR-Construction, where the functions used are truly random, as a pseudorandom generator that is secure if not too many bits are accessible. The security of Schnorr's generator does not depend on any unproven assumption; this notion of local randomness is further treated in subsequent work. Since the key of a random function is huge, it makes sense to minimize the number of functions, and indeed Schnorr suggested $D_f \circ D_f \circ D_f$ as pseudorandom; the suggested permutation was later shown to be distinguishable from random.

Following is an informal description of some of these results. Let $f \in F_n$ be a random function. Then:

1. For all $i,j,k \ge 1$, the permutation $D_{f^k} \circ D_{f^j} \circ D_{f^i}$ is not pseudorandom.

2. For all $i,j,k,\ell \ge 1$, the permutation $D_{f^\ell} \circ D_{f^k} \circ D_{f^j} \circ D_{f^i}$ is not strongly pseudorandom.

3. $D_{f^2} \circ D_f \circ D_f \circ D_f$ is pseudorandom.

4. A six-round composition of $D_f$ and $D_I$ is strongly pseudorandom, where $I \in F_n$ is the identity function.

5. $D_f \circ D_f \circ D_{f\circ\xi}$ is pseudorandom and $D_f \circ D_f \circ D_{f\circ\xi} \circ D_f$ is strongly pseudorandom, where $\xi$ is, for example, a rotation of one bit.

A critique that has often been voiced is that using only one pseudorandom function does not seem too significant: a pseudorandom function on $2n$ bits can replace two pseudorandom functions on $n$ bits, or, alternatively, a small key can be used to pseudorandomly generate a larger key. It should also be noticed that the new constructions require additional invocations of the pseudorandom functions, which implies an increase in the computation time. Furthermore, these results involve detailed and nontrivial proofs, to a point where some papers claim to find inaccuracies in others.

The adjustment of the LR-Construction we suggest in this chapter can easily be converted into a construction of PPE and SPPE from a single pseudorandom function: simply replace both pseudorandom functions ($f_1$ and $f_2$) with a single pseudorandom function $f$. This solution does not suffer from the drawbacks of the previous ones: the construction and the proof remain as simple as before, and the pseudorandom function is invoked only twice at each computation of the permutation. The additional key-length for the pairwise independent permutations $h_1$ and $h_2$ is not substantial, especially compared to the length of a truly random function. Consider, for example, the construction of SPPE when we use a truly random function $f$:

Theorem: Let $h_1,h_2 \in P_{2n}$ be pairwise independent permutations and let $f \in F_n$ be a random function. Define $S = S(h_1,f,f,h_2)$ as in the definition above, and let $R \in P_{2n}$ be a random permutation. Then for any oracle machine $M$ (not necessarily an efficient one) that makes at most $m$ queries,
\[ \bigl|\Pr[M^{S,S^{-1}}(1^{2n})=1] - \Pr[M^{R,R^{-1}}(1^{2n})=1]\bigr| \le \frac{2m^2}{2^{n}} + \frac{2m^2}{2^{2n}}. \]

The proof follows the framework described above. The set $\mathrm{BAD}(h_1,h_2)$ is replaced with the set $\mathrm{BAD}'(h_1,h_2)$, defined to be:

The set of all possible and consistent $M$-transcripts $\{\langle x_1,y_1\rangle,\ldots,\langle x_m,y_m\rangle\}$ satisfying that there exist $1 \le i < j \le m$ such that either $h_1(x_i)|_R = h_1(x_j)|_R$ or $h_2(y_i)|_L = h_2(y_j)|_L$ (as before), or there exist $1 \le i,j \le m$ such that $h_1(x_i)|_R = h_2(y_j)|_L$.

In order to apply the framework theorem, it is enough to note that, by this definition, for any possible and consistent $M$-transcript $\sigma$ we get both $\Pr_S[T_S = \sigma \mid \sigma \notin \mathrm{BAD}'(h_1,h_2)] = 2^{-2nm}$ (hence it is a proper definition according to the framework) and $\Pr_{h_1,h_2}[\sigma \in \mathrm{BAD}'(h_1,h_2)] \le \frac{2m^2}{2^{n}}$.

Relaxing the Pair-Wise Independence Requirement

One might interpret the construction of this chapter in the following way: given the task of constructing efficient pseudorandom permutations, it is enough to concentrate on the efficient construction of pseudorandom functions. The assumption that supports such a claim is that the computation of pseudorandom functions is much more expensive than the computation of pairwise independent permutations; therefore, computing the value of the pseudorandom permutation constructed above on any input of $2n$ bits is essentially equivalent to two invocations of a pseudorandom function with $n$-bit inputs. In this section we show that we can use even weaker permutations instead of the pairwise independent ones, resulting in an even more efficient construction of pseudorandom permutations.

As mentioned earlier, the only place we use the fact that $h_1$ and $h_2$ are pairwise independent permutations is in the proof of the proposition bounding $\Pr[\sigma \in \mathrm{BAD}(h_1,h_2)]$. In fact, the exact requirement on $h_1$ and $h_2$ we use is that for every $x \ne y$,
\[ \Pr_{h_1}[h_1(x)|_R = h_1(y)|_R] \le 2^{-n} \quad\text{and}\quad \Pr_{h_2}[h_2(x)|_L = h_2(y)|_L] \le 2^{-n}. \]
Furthermore, we can replace $2^{-n}$ with any $\varepsilon$ and still get a construction of pseudorandom permutations, with a somewhat larger distinguishing probability. Consider, for example, the revised statement of the main theorem:

Theorem: Let $H_1$ and $H_2$ be distributions of permutations in $P_{2n}$ such that for every pair of distinct $2n$-bit strings $x \ne y$,
\[ \Pr_{h_1 \in H_1}[h_1(x)|_R = h_1(y)|_R] \le \varepsilon \quad\text{and}\quad \Pr_{h_2 \in H_2}[h_2(x)|_L = h_2(y)|_L] \le \varepsilon. \]
Let $h_1$ be distributed according to $H_1$, $h_2$ according to $H_2$, and let $f_1,f_2 \in F_n$ be random functions. Define $S = S(h_1,f_1,f_2,h_2)$ as in the definition above, and let $R \in P_{2n}$ be a random permutation. Then for any oracle machine $M$ (not necessarily an efficient one) that makes at most $m$ queries,
\[ \bigl|\Pr[M^{S,S^{-1}}(1^{2n})=1] - \Pr[M^{R,R^{-1}}(1^{2n})=1]\bigr| \le m^2\varepsilon + \frac{2m^2}{2^{2n}}. \]

The proof follows the framework described above. This time the definition of $\mathrm{BAD}(h_1,h_2)$ stays unchanged, and in order to apply the framework theorem we only need to note that for any possible and consistent $M$-transcript $\sigma$, $\Pr_{h_1,h_2}[\sigma \in \mathrm{BAD}(h_1,h_2)] \le m^2\varepsilon$.

The conditions on $H_1$ and $H_2$ in the theorem are somewhat nonstandard, since the requirements are on half the bits of the output. Nevertheless, these conditions are satisfied by more traditional requirements on function families. In particular, one can use the concept of $\varepsilon$-AXU$_2$ functions:

Definition: A distribution $H$ of $I^{2n} \to I^{2n}$ functions (or permutations) is $\varepsilon$-AXU$_2$ if for every $x \ne y$ and every $z$ ($x,y,z \in I^{2n}$),
\[ \Pr_{h \in H}[h(x) \oplus h(y) = z] \le \varepsilon. \]
This concept was originally defined by Carter and Wegman; we use the terminology of Rogaway.

It is easy to verify that the conditions on $H_1$ and $H_2$ in the theorem are satisfied (with $\varepsilon$ roughly $2^{-n}$) if both $H_1$ and $H_2$ are roughly $2^{-2n}$-AXU$_2$. Such a distribution of permutations over $I^{2n}$ is, for example,
\[ h_a(x) \stackrel{\mathrm{def}}{=} a \odot x, \]
where $a$ is uniform in $I^{2n} \setminus \{0^{2n}\}$ and the multiplication is in $GF(2^{2n})$.
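To see why multiplication by a uniform nonzero field element yields an AXU$_2$ family, one can verify the bound exhaustively in a small field. The following sketch does so in $GF(2^4)$ (with the assumed irreducible polynomial $x^4 + x + 1$), a toy stand-in for $GF(2^{2n})$.

```python
from fractions import Fraction
from itertools import product

def gf_mul(a, b):                       # multiplication in GF(2^4), modulo x^4 + x + 1
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a >> 4:
            a ^= 0x13                   # 0b10011 encodes x^4 + x + 1
    return r

# h_a(x) = a * x with a uniform nonzero: check Pr_a[h_a(x) ^ h_a(y) = z] <= 1/(2^4 - 1)
worst = Fraction(0)
for x, y, z in product(range(16), repeat=3):
    if x == y:
        continue
    hits = sum(1 for a in range(1, 16) if gf_mul(a, x) ^ gf_mul(a, y) == z)
    worst = max(worst, Fraction(hits, 15))

assert worst == Fraction(1, 15)
```

The check succeeds because $h_a(x) \oplus h_a(y) = a \odot (x \oplus y)$, and for $x \ne y$ the equation $a \odot (x \oplus y) = z$ has exactly one nonzero solution $a$ when $z \ne 0$ and none when $z = 0$.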

Another way to construct $H_1$ and $H_2$ is by using Feistel permutations with $\varepsilon$-AXU$_2$ functions. Let $H$ be a distribution of $\varepsilon$-AXU$_2$ functions on $n$-bit strings; then we can define $H_1$ to be $\{D_h\}_{h \in H}$ and $H_2$ to be $\{D_h^{-1}\}_{h \in H}$. The reason is that for every two distinct $2n$-bit strings $x = L_1 \cdot R_1$ and $y = L_2 \cdot R_2$, and every function $h \in F_n$, we have by definition that
\[ D_h(x)|_R = D_h(y)|_R \iff h(R_1) \oplus h(R_2) = L_1 \oplus L_2. \]
If $R_1 = R_2$ then $L_1 \ne L_2$, and therefore $D_h(x)|_R \ne D_h(y)|_R$; otherwise, by the definition of $\varepsilon$-AXU$_2$ functions,
\[ \Pr_{h \in H}[D_h(x)|_R = D_h(y)|_R] = \Pr_{h \in H}[h(R_1) \oplus h(R_2) = L_1 \oplus L_2] \le \varepsilon. \]
Thus $H_1$ satisfies its requirement, and similarly for $H_2$. By using Feistel permutations to construct $H_1$ and $H_2$, we get the original LR-Construction as a special case (since a random function is in particular $2^{-n}$-AXU$_2$). Thus, the proof of security in this chapter also holds for the original LR-Construction. The idea of using $\varepsilon$-AXU$_2$ functions instead of pseudorandom functions for the first round of the LR-Construction was previously suggested by Lucks.
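The equivalence underlying this argument (the right halves of $D_h(x)$ and $D_h(y)$ collide exactly when $h(R_1) \oplus h(R_2) = L_1 \oplus L_2$) can be checked exhaustively for a toy block size; `h` below is an arbitrary table standing in for a member of the function family.

```python
import random

random.seed(0)
n = 4
mask = (1 << n) - 1
h = [random.randrange(1 << n) for _ in range(1 << n)]   # arbitrary h in F_n

def D(x):                    # Feistel: D_h(L, R) = (R, L xor h(R))
    L, R = x >> n, x & mask
    return (R << n) | (L ^ h[R])

for x in range(1 << (2 * n)):
    for y in range(1 << (2 * n)):
        if x == y:
            continue
        L1, R1 = x >> n, x & mask
        L2, R2 = y >> n, y & mask
        # right halves collide  iff  h(R1) xor h(R2) == L1 xor L2
        assert (D(x) & mask == D(y) & mask) == (h[R1] ^ h[R2] == L1 ^ L2)
```

In particular, when $R_1 = R_2$ the right-hand condition forces $L_1 = L_2$, so two distinct inputs with equal right halves never collide, matching the case analysis in the text.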

Another advantage of this approach is that it allows us to use many efficient constructions of function families. An example of efficient $2^{-n}$-AXU$_2$ functions is Vazirani's shift-family: a key of such a function is a uniformly chosen string $a \in I^{2n-1}$, and the $j$-th bit of $f_a(x)$, $1 \le j \le n$, is defined to be
\[ \sum_{i=1}^{n} x_i \cdot a_{i+j-1} \bmod 2. \]
A substantial amount of research deals with the construction of efficient hash functions. This line of work contains constructions that obey weaker definitions on function families than pairwise independence, and in particular contains constructions of $\varepsilon$-AXU$_2$ functions. Unfortunately, these functions were designed to be especially efficient when their output is substantially smaller than their input, since they were mainly brought up in the context of authentication; this is not true in our case (but is relevant below). An additional objective is to reduce the size of the family of hash functions; in our setting, the purpose of this is to reduce the key-length of the pseudorandom permutations.

Reducing the Distinguishing Probability

There are various circumstances where it is desirable to have a pseudorandom permutation on relatively few bits. This is especially true when we want to minimize the size of the hardware circuit that implements the permutation, or the communication bandwidth with the hardware or software component that computes the permutation.

Let $F$ be a pseudorandom permutation on $2n$ bits, constructed from truly random functions on $n$ bits using the LR-Construction. As shown by Patarin, $F$ can be distinguished with constant probability from a random permutation using $O(2^{n/2})$ queries, which means that the analysis of the LR-Construction (where the distinguishing probability for $m$ queries is $O(\frac{m^2}{2^n})$) is tight. Therefore, the LR-Construction on $2n$ bits can only be used if $2^{n/2}$ is large enough to bound the number of queries in the attack on the block cipher.

In this section a simple generalization of the construction above is presented. Using this construction, the adversary's probability of distinguishing between the pseudorandom and random permutations can be reduced to roughly $\frac{m^2 t}{2^{2n(1-1/t)}}$ for every integer $t \ge 2$ (for $t = 2$ we get the original construction). To achieve this security, $t + 2$ permutations are composed: the initial and final are pairwise independent permutations; the rest are generalized Feistel permutations defined by $I^{2n(1-1/t)} \to I^{2n/t}$ random or pseudorandom functions (see the figure below for an illustration).

Patarin shows that if we take six rounds of the LR-Construction (instead of three or four), then the resulting permutation cannot be distinguished from a random permutation with an advantage that improves on the $\frac{m^2}{2^n}$ bound of the three- and four-round constructions. This means that distinguishing the six-round construction from a truly random permutation with constant probability requires substantially more than $2^{n/2}$ queries. The bound we achieve in this section is better for sufficiently large $t$. Note that our construction uses pseudorandom functions with larger input-length, which might be a disadvantage for some applications.

In order to describe our generalized constructions, we first extend Feistel permutations to deal with the case where the underlying functions have arbitrary input and output lengths (instead of the length-preserving functions used so far). We note that using such unbalanced Feistel permutations was previously suggested in earlier work.

Definition (Generalized Feistel Permutations): For any two positive integers $s \le \ell$ and any function $f : I^{\ell-s} \to I^{s}$, let $D_f \in P_\ell$ be the permutation defined by
\[ D_f(L \cdot R) \stackrel{\mathrm{def}}{=} R \cdot (L \oplus f(R)), \quad \text{where } |L| = s \text{ and } |R| = \ell - s. \]

We can now define the revised construction and consider its security. These are simple generalizations of the construction above and of its proof of security.

Figure: Construction of strong pseudorandom permutations with reduced distinguishing probability using $t$ rounds (here $t = 3$): the input passes through $h_1$, then the generalized Feistel rounds $f_1, f_2, f_3$, then $h_2^{-1}$. Recall that $f_i : I^{2n-2n/t} \to I^{2n/t}$.


Definition ($t$-Round Construction): For any integer $t \ge 2$, let $s$ and $r$ be integers such that $2n = s \cdot t + r$, where $0 \le r < t$. For any $h_1,h_2 \in P_{2n}$, $f_1,\ldots,f_r : I^{2n-s-1} \to I^{s+1}$ and $f_{r+1},\ldots,f_t : I^{2n-s} \to I^{s}$, define
\[ W(h_1,f_1,\ldots,f_t) \stackrel{\mathrm{def}}{=} D_{f_t} \circ \cdots \circ D_{f_2} \circ D_{f_1} \circ h_1 \]
and
\[ S(h_1,f_1,\ldots,f_t,h_2) \stackrel{\mathrm{def}}{=} h_2^{-1} \circ D_{f_t} \circ \cdots \circ D_{f_2} \circ D_{f_1} \circ h_1. \]
We get the construction of the earlier definition by choosing $t = 2$, $s = n$ and $r = 0$.
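Composing $t$ such rounds after an initial mixing permutation gives the permutation $W$ of this definition. A toy sketch (with $t = 3$, $s = 4$, $r = 0$; a truly random table stands in for the pairwise independent $h_1$, and the $f_i$ are arbitrary tables):

```python
import random

random.seed(5)
t, s = 3, 4                   # t rounds of s-bit generalized Feistel; width 2n = s*t
Wd = s * t                    # total width in bits
fs = [[random.randrange(1 << s) for _ in range(1 << (Wd - s))] for _ in range(t)]

def D(f, x):                  # generalized Feistel on Wd bits, |L| = s
    L, R = x >> (Wd - s), x & ((1 << (Wd - s)) - 1)
    return (R << s) | (L ^ f[R])

h1 = list(range(1 << Wd))
random.shuffle(h1)            # stand-in for a pairwise independent h1

def Wperm(x):                 # W = D_{f_t} o ... o D_{f_1} o h1
    x = h1[x]
    for f in fs:
        x = D(f, x)
    return x

assert sorted(Wperm(x) for x in range(1 << Wd)) == list(range(1 << Wd))
```

Each round rotates $s$ fresh bits into place, so after $t$ rounds every bit of the state has been masked once, which is the structural reason the $t$-round analysis below works out.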

Theorem: Let $W$ and $S$ be as in the definition above, where $h_1$ and $h_2$ are pairwise independent permutations and $f_1,\ldots,f_t$ are pseudorandom functions ($t$ is allowed to be a function of $n$; $h_1$, $h_2$ and $f_1,\ldots,f_t$ are independently chosen). Then $W$ is a pseudorandom permutation and $S$ a strong pseudorandom permutation.

Furthermore, assume that no efficient oracle-machine that makes at most $m \cdot t$ queries distinguishes between the pseudorandom functions and random functions with advantage $\varepsilon$. Then no efficient oracle-machine that makes at most $m$ queries to $W$ (resp. $S$ and $S^{-1}$) distinguishes $W$ (resp. $S$) from a random permutation with advantage better than $t\varepsilon + \frac{m^2 t}{2^{2n-\lceil 2n/t\rceil}} + \frac{2m^2}{2^{2n}}$.

dte

In case the middle functions are truly random this reduces to

Theorem Let S be as in Denition where h and h arepairwise independent

permutations and f f f are random functions and let R P be a random permuta

t

tion Then for any oracle machine M not necessarily an ecient one that makes at most

m queries

m t m

SS RR

PrM PrM

dte

The proof of the theorem follows the framework described earlier. Assume for simplicity that $s \cdot t = 2n$ (i.e., $r = 0$); the set $\mathrm{BAD}(h_1,h_2)$ is replaced with the set $\mathrm{BAD}_t(h_1,h_2)$, defined to be:

The set of all possible and consistent $M$-transcripts $\{\langle x_1,y_1\rangle,\ldots,\langle x_m,y_m\rangle\}$ satisfying that there exist $1 \le i < j \le m$ and $1 \le k \le t$ such that
\[ F_i^{k+1}\cdots F_i^{t}\cdot L_i^{1}\cdots L_i^{k-1} = F_j^{k+1}\cdots F_j^{t}\cdot L_j^{1}\cdots L_j^{k-1}, \]
where $F_i^{1}\cdot F_i^{2}\cdots F_i^{t} = h_1(x_i)$, $L_i^{1}\cdot L_i^{2}\cdots L_i^{t} = h_2(y_i)$ and $|F_i^{1}| = \cdots = |F_i^{t}| = |L_i^{1}| = \cdots = |L_i^{t}| = s$.

This guarantees that for any possible and consistent $M$-transcript $\sigma$ we have that $\Pr_S[T_S = \sigma \mid \sigma \notin \mathrm{BAD}_t(h_1,h_2)] = 2^{-2nm}$, and hence it is a proper definition according to the framework. The reason is that, under the notation above,
\[ \forall i,\ y_i = S(x_i) \iff \forall\, i \le m,\ k \le t:\ f_k\bigl(F_i^{k+1}\cdots F_i^{t}\cdot L_i^{1}\cdots L_i^{k-1}\bigr) = F_i^{k} \oplus L_i^{k}. \]
Therefore, given any specific choice of $h_1$ and $h_2$ in the definition of $S$ such that $\sigma \notin \mathrm{BAD}_t(h_1,h_2)$, the event $T_S = \sigma$ is composed of $m \cdot t$ independent events, each of which has probability $2^{-s}$ to happen. In order to apply the framework theorem, it remains to note that for any such $\sigma$ we have that
\[ \Pr_{h_1,h_2}[\sigma \in \mathrm{BAD}_t(h_1,h_2)] \le \binom{m}{2}\cdot t \cdot 2^{-(2n-s)} \le \frac{m^2\, t}{2^{2n-\lceil 2n/t\rceil}}. \]

Remark: The construction of this section achieves a substantial improvement in security over the basic construction even for a small constant $t$, that is, with only a few additional applications of the pseudorandom functions. Nevertheless, it might be useful for some applications to take a larger value of $t$. Choosing a large $t$ (say $t = 2n$, i.e., $s = 1$) reduces the advantage the distinguisher may achieve to roughly $\frac{m^2 t}{2^{2n-1}}$.

SPPE on Many Blocks Using PFE or PPE on a Single Block

Consider the application of pseudorandom permutations to encryption, i.e., using $f(M)$ in order to encrypt a message $M$, where $f$ is a pseudorandom permutation. Assume also that we want to use DES for this purpose. We now have the following problem: while DES works on fixed (and relatively small) length strings, we need a permutation on $|M|$-bit-long strings, where the length of the message $|M|$ may be large and may vary between different messages.

This problem is not restricted to the usage of DES (though the fact that DES was designed for hardware implementation contributes to it). Usually, a direct construction of pseudorandom permutations (or pseudorandom functions, if we want to employ the LR-Construction) with large input-length is expensive. Therefore, we would like a way to construct pseudorandom permutations (or functions) on many blocks from pseudorandom permutations (or functions) on a single block.

Several such constructions were suggested in the context of DES; see, e.g., the different modes of operation for DES. The simplest, known as the electronic-codebook mode (ECB-mode), is to divide the input into sub-blocks and to apply the pseudorandom permutation on each sub-block separately. This solution suffers from the obvious drawback that every sub-block of the output solely depends on a single sub-block of the input; in particular, the permutation on the complete input is not pseudorandom. This may leak information about the message being encrypted. See the Related Work subsection below for further discussion and additional related work.

In this section we consider a generalization of the construction of this chapter that uses pseudorandom functions or permutations on a single block to construct strong pseudorandom permutations on many blocks. The idea is as follows: apply a pairwise independent permutation on the entire input; divide the value you get into sub-blocks, and apply two rounds of Feistel permutations (or one round of a pseudorandom permutation) on each sub-block separately; finally, apply a second pairwise independent permutation on the entire value you get (see the figure below for an illustration).

Figure: Construction of a strong pseudorandom permutation on many (six in this case) blocks from a pseudorandom function on a single block: $h_1$, then two Feistel rounds ($f_1$ and $f_2$) on each sub-block, then $h_2^{-1}$.

This solution resembles the ECB-mode: it is almost as simple, and it is highly suitable for parallel implementation. Contrary to the ECB-mode, this construction does give a pseudorandom permutation on the entire message (though the security parameter is still relative to the length of a sub-block).

For simplicity, we only describe the construction using truly random functions or a truly random permutation; the analysis of the construction when pseudorandom functions are used follows easily. In addition, we restrict our attention to the construction of strong pseudorandom permutations.

Definition: For any two integers $b$ and $s$, and for any function $g \in F_s$, let $g^{(b)} \in F_{b\cdot s}$ be the function defined by
\[ g^{(b)}(x_1 \cdot x_2 \cdots x_b) \stackrel{\mathrm{def}}{=} g(x_1)\cdot g(x_2)\cdots g(x_b), \quad \text{where } \forall i,\ |x_i| = s. \]
For any $f_1,f_2 \in F_n$ and $h_1,h_2 \in P_{2nb}$, define
\[ S = S(h_1,f_1,f_2,h_2) \stackrel{\mathrm{def}}{=} h_2^{-1} \circ \bigl(D_{f_2}\bigr)^{(b)} \circ \bigl(D_{f_1}\bigr)^{(b)} \circ h_1. \]
For any $p \in P_{2n}$ and $h_1,h_2 \in P_{2nb}$, define
\[ S' = S'(h_1,p,h_2) \stackrel{\mathrm{def}}{=} h_2^{-1} \circ p^{(b)} \circ h_1. \]
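The block-wise operator $p^{(b)}$ and the permutation $S' = h_2^{-1} \circ p^{(b)} \circ h_1$ can be sketched as follows. Toy sizes only; truly random permutation tables stand in for both the inner permutation $p$ and the outer pairwise independent permutations.

```python
import random

random.seed(3)
n, b = 2, 3                  # b sub-blocks of 2n bits each; S' permutes 2nb = 12 bits
B, Wd = 2 * n, 2 * n * b

def blockwise(p, x):         # p^(b): apply the sub-block permutation p to every block
    return sum(p[(x >> (B * i)) & ((1 << B) - 1)] << (B * i) for i in range(b))

p = list(range(1 << B)); random.shuffle(p)       # random p in P_2n
h1 = list(range(1 << Wd)); random.shuffle(h1)    # stand-ins for the outer
h2 = list(range(1 << Wd)); random.shuffle(h2)    # pairwise independent permutations
h2_inv = [0] * (1 << Wd)
for x, y in enumerate(h2):
    h2_inv[y] = x

def S_prime(x):              # S' = h2^-1 o p^(b) o h1
    return h2_inv[blockwise(p, h1[x])]

assert sorted(S_prime(x) for x in range(1 << Wd)) == list(range(1 << Wd))
```

Without the outer mixing ($h_1$, $h_2$ set to the identity) this degenerates to the ECB-mode, which is exactly the weakness the two outer permutations are there to repair.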

Theorem: Let $h_1,h_2 \in P_{2nb}$ be pairwise independent permutations, let $f_1,f_2 \in F_n$ be random functions, and let $p \in P_{2n}$ be a random permutation. Define $S = S(h_1,f_1,f_2,h_2)$ and $S' = S'(h_1,p,h_2)$ as in the definition above, and let $R \in P_{2nb}$ be a random permutation. Then for any oracle machine $M$ (not necessarily an efficient one) that makes at most $m$ queries,
\[ \bigl|\Pr[M^{S,S^{-1}}(1^{2nb})=1] - \Pr[M^{R,R^{-1}}(1^{2nb})=1]\bigr| \le \frac{m^2 b^2}{2^{n}} + \frac{2m^2}{2^{2nb}} \]
and
\[ \bigl|\Pr[M^{S',S'^{-1}}(1^{2nb})=1] - \Pr[M^{R,R^{-1}}(1^{2nb})=1]\bigr| \le \frac{2m^2 b^2}{2^{2n}}. \]

The proof of the theorem for $S$ follows the framework described earlier. The set $\mathrm{BAD}(h_1,h_2)$ is replaced with the set $\mathrm{BAD}''(h_1,h_2)$, defined to be:

The set of all possible and consistent $M$-transcripts $\{\langle x_1,y_1\rangle,\ldots,\langle x_m,y_m\rangle\}$ such that either there are two equal values in $\{F_i^j|_R\}_{i \le m,\, j \le b}$ or there are two equal values in $\{L_i^j|_L\}_{i \le m,\, j \le b}$, where $F_i^1 \cdot F_i^2 \cdots F_i^b = h_1(x_i)$, $L_i^1 \cdot L_i^2 \cdots L_i^b = h_2(y_i)$ and $|F_i^j| = |L_i^j| = 2n$.

This guarantees that for any possible and consistent $M$-transcript $\sigma$ we have that
\[ \Pr_S[T_S = \sigma \mid \sigma \notin \mathrm{BAD}''(h_1,h_2)] = 2^{-2nbm}, \]
and hence it is a proper definition according to the framework. The reason is that, under the notation above,
\[ \forall i,\ y_i = S(x_i) \iff \forall\, i \le m,\ j \le b:\ f_1\bigl(F_i^j|_R\bigr) = F_i^j|_L \oplus L_i^j|_L \text{ and } f_2\bigl(L_i^j|_L\bigr) = F_i^j|_R \oplus L_i^j|_R. \]
Therefore, given any specific choice of $h_1$ and $h_2$ in the definition of $S$ such that $\sigma \notin \mathrm{BAD}''(h_1,h_2)$, the event $T_S = \sigma$ is composed of $2mb$ independent events, each of which has probability $2^{-n}$ to happen. In order to apply the framework theorem, it remains to note that for any such $\sigma$ we have that
\[ \Pr_{h_1,h_2}[\sigma \in \mathrm{BAD}''(h_1,h_2)] \le \binom{mb}{2}\cdot 2\cdot 2^{-n} \le \frac{m^2 b^2}{2^{n}}. \]

The proof of the theorem for $S'$ slightly deviates from the framework described earlier (providing yet more evidence for the claim that nobody is perfect). The set $\mathrm{BAD}(h_1,h_2)$ is replaced with the set $\mathrm{BAD}'''(h_1,h_2)$, defined to be:

The set of all possible and consistent $M$-transcripts $\{\langle x_1,y_1\rangle,\ldots,\langle x_m,y_m\rangle\}$ such that either there are two equal values in $\{F_i^j\}_{i \le m,\, j \le b}$ or there are two equal values in $\{L_i^j\}_{i \le m,\, j \le b}$, where $F_i^1 \cdot F_i^2 \cdots F_i^b = h_1(x_i)$, $L_i^1 \cdot L_i^2 \cdots L_i^b = h_2(y_i)$ and $|F_i^j| = |L_i^j| = 2n$.

Now we have that for any possible and consistent $M$-transcript $\sigma$,
\[ \Pr_{h_1,h_2}[\sigma \in \mathrm{BAD}'''(h_1,h_2)] \le \binom{mb}{2}\cdot 2\cdot 2^{-2n} \le \frac{m^2 b^2}{2^{2n}}, \]
but now, for any such $\sigma$,
\[ \Pr_{S'}[T_{S'} = \sigma \mid \sigma \notin \mathrm{BAD}'''(h_1,h_2)] \le \Bigl(\frac{1}{2^{2n}-mb}\Bigr)^{mb} \]
instead of $2^{-2nbm}$ as required by the framework. However, the difference in probabilities is rather small, which results in only a minor deviation from the proof of the theorem.

Relaxing the Construction

As before, we would like to reduce the requirements from $h_1$ and $h_2$ in the theorem above. Our main motivation in doing so is to decrease the key-length of the pseudorandom permutations: we would like the key-length to be of order $n$ (the length of the small sub-blocks) and not of order $n \cdot b$ (the length of the complete input); in some cases we may allow a small dependence on $b$.

We sketch a way to redefine the distributions on $h_1$ and $h_2$ in the definition of $S$ (almost the same ideas apply to the definition of $S'$). The requirement these distributions have to obey is that for any possible and consistent $M$-transcript $\sigma$, $\Pr_{h_1,h_2}[\sigma \in \mathrm{BAD}''(h_1,h_2)]$ is small. We use the following notation: for any $n\cdot b$-bit string $z = z_1 \cdot z_2 \cdots z_b$ such that $\forall j,\ |z_j| = n$, and for all $1 \le i \le b$, denote by $z|_i$ the $i$-th substring of $z$ (the substring $z_i$). The requirement above can be achieved by sampling $h_1$ and $h_2$ according to a permutation distribution $H$ such that, for some small $\varepsilon$, we have that:

1. For any $n\cdot b$-bit string $x$ and any $1 \le i < j \le b$, $\Pr_{h \in H}[h(x)|_i = h(x)|_j] \le \varepsilon$; and

2. For any two distinct $n\cdot b$-bit strings $x \ne x'$ and any $1 \le i,j \le b$, $\Pr_{h \in H}[h(x)|_i = h(x')|_j] \le \varepsilon$.

We start by defining a permutation distribution $\bar H$ that almost achieves this. A permutation $\bar h_{u_1,u_2}$ sampled from $\bar H$ is defined by two $\varepsilon$-AXU$_2$ functions, $u_1 : I^n \to I^n$ and $u_2 : I^{\lceil \log b\rceil} \to I^n$ (see the definition of $\varepsilon$-AXU$_2$ functions above). For any $z = z_1 \cdot z_2 \cdots z_b$ such that $\forall j,\ |z_j| = n$,
\[ \bar h_{u_1,u_2}(z) \stackrel{\mathrm{def}}{=} \bigl(z_1 \oplus u_1(z_b) \oplus u_2(1)\bigr)\cdot\bigl(z_2 \oplus u_1(z_b) \oplus u_2(2)\bigr)\cdots\bigl(z_{b-1} \oplus u_1(z_b) \oplus u_2(b-1)\bigr)\cdot\bigl(z_b \oplus u_2(b)\bigr). \]
It is not hard to verify that:

1. For any $n\cdot b$-bit string $x$ and any $1 \le i < j \le b$, $\Pr_{\bar h \in \bar H}[\bar h(x)|_i = \bar h(x)|_j] \le \varepsilon$; and

2. For any two $n\cdot b$-bit strings $x \ne x'$ such that $x_b \ne x'_b$, and for all $1 \le i,j \le b$, $\Pr_{\bar h \in \bar H}[\bar h(x)|_i = \bar h(x')|_j] \le \varepsilon$.

In order to eliminate the additional requirement in (2) that $x_b \ne x'_b$, we define the permutation distribution $H$ such that a permutation $h$ sampled from $H$ is defined to be $\bar h \circ D_g$ (a generalized Feistel permutation, as defined earlier), where $\bar h$ is sampled according to $\bar H$ and $g : I^{nb-n} \to I^{n}$ is an $\varepsilon$-AXU$_2$ function (see the figure below for an illustration). Using (1) and (2), together with the fact that for any two distinct $n\cdot b$-bit strings $x \ne x'$,
\[ \Pr_g[D_g(x)|_b = D_g(x')|_b] \le \varepsilon, \]
we get that $H$ satisfies (1) and (2) for $2\varepsilon$.
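One concrete reading of this masking permutation (an assumption of this sketch): XOR $u_1(z_b) \oplus u_2(i)$ into the $i$-th block for $i < b$, and $u_2(b)$ into the last block. Under that reading, the map is invertible because $z_b$ can be recovered first; `u1` and `u2` below are arbitrary tables standing in for the $\varepsilon$-AXU$_2$ functions.

```python
import random

random.seed(6)
n, b = 3, 4
u1 = [random.randrange(1 << n) for _ in range(1 << n)]   # stand-in for u1: I^n -> I^n
u2 = [random.randrange(1 << n) for _ in range(b + 1)]    # stand-in for u2: indices 1..b

def h(blocks):               # blocks = [z_1, ..., z_b], each an n-bit value
    zb = blocks[-1]
    out = [z ^ u1[zb] ^ u2[i + 1] for i, z in enumerate(blocks[:-1])]
    return out + [zb ^ u2[b]]

def h_inv(blocks):           # recover z_b from the last block, then unmask the rest
    zb = blocks[-1] ^ u2[b]
    return [y ^ u1[zb] ^ u2[i + 1] for i, y in enumerate(blocks[:-1])] + [zb]

z = [random.randrange(1 << n) for _ in range(b)]
assert h_inv(h(z)) == z
```

The cost per input matches the text: one evaluation of $u_1$ (plus, in the full construction, one evaluation of $g$ inside $D_g$) and a few XORs per block.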

Notice that the computation of a function $h \in H$ is essentially equivalent to one computation of an $\varepsilon$-AXU$_2$ function $g : I^{nb-n} \to I^n$ plus a few additional XOR operations per block. Using efficient constructions of $\varepsilon$-AXU$_2$ functions, we therefore get an efficient function $h$. Krawczyk shows a construction of $\varepsilon$-AXU$_2$ functions from $m$ bits to $\ell$ bits with short keys; using these functions, we can achieve the desired goal of reducing the key-length of $h$ to $O(n)$ bits.

Related Work

The construction presented in this section is certainly not the only solution to the problem at hand. We refer in brief to some additional solutions.

As mentioned above, DES modes of operation were suggested as a way of encrypting long messages. However, none of these modes constitutes a construction of a pseudorandom permutation. Note that when the encryption of a message M is f(M) for a pseudorandom permutation f, then the only information that is leaked on M is whether or not M is equal to a previously encrypted message. This is not true for DES modes of operation. For instance, when using the cipher-block-chaining mode (CBC-mode), the encryptions of two

(However, as shown by Bellare et al., the CBC-mode does define a construction of a pseudorandom function with small output length. A somewhat related solution to this problem is the so-called cascade construction that is considered by Bellare et al.)

CHAPTER CONSTRUCTIONS OF PSEUDORANDOM PERMUTATIONS

[Figure: The construction of h ∈ H. The input blocks x_1 x_2 ... x_{b−1} x_b are mapped to the output blocks y_1 y_2 ... y_{b−1} y_b; g is applied to x_1 ... x_{b−1} to form x'_b, u_1 is applied to x'_b, and each v_i denotes the string u_1(x'_b) ⊕ u_2(i).]

messages with an identical prefix will also have an identical prefix. The ECB-mode leaks even more information: the existence of two identical sub-blocks in two different encrypted messages (or in a single message). The reason that the ECB-mode leaks so much information is that every ciphertext-block solely depends on a single plaintext-block. Our construction implies that only very little (and non-cryptographic) diffusion, the permutations h_1 and h_2, is required in order to overcome this flaw of the ECB-mode.
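The ECB leakage described above is easy to demonstrate. The toy "block cipher" below is just a random permutation of single bytes, a hypothetical stand-in for a pseudorandom permutation on one block; equal plaintext blocks visibly produce equal ciphertext blocks.

```python
import random

rng = random.Random(1)

# A toy block cipher on 1-byte blocks: a uniformly chosen permutation of 0..255.
# This stands in for a pseudorandom permutation on a single block.
perm = list(range(256))
rng.shuffle(perm)

def ecb_encrypt(msg: bytes) -> bytes:
    # ECB: each ciphertext block depends solely on the matching plaintext block.
    return bytes(perm[b] for b in msg)

c = ecb_encrypt(b"abcabcxy")
# The repetition pattern of the message leaks into the ciphertext:
assert c[0:3] == c[3:6]
assert len(set(c)) == len(set(b"abcabcxy"))
```

A pseudorandom permutation applied to the whole message would leak only full-message equality, never this block-level structure.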

Bellare and Rogaway show how to convert the CBC-mode in order to construct a pseudorandom permutation with large input-length (this is the only place we are aware of that explicitly refers to the problem). The amount of work in their construction is comparable with two applications of the original CBC-mode (approximately twice the work of our construction, assuming that h_1 and h_2 are relatively efficient). The security of this construction is of similar order to the security of our construction. In contrast to our construction, it is also sequential in nature.

A different approach is to define a length-preserving pseudorandom function F on ℓ bits using a length-preserving pseudorandom function F' on n bits (where ℓ > n), and then to apply our version of the LR-construction using F in order to get a pseudorandom permutation on 2ℓ bits. The function F can be defined to be G ∘ F' ∘ h, where h is a pairwise independent hash function from ℓ bits to n bits and G is a pseudorandom bit generator from n bits to ℓ bits. This idea may be attributed in part to Carter and Wegman. Anderson and Biham, and Lucks, show how to directly apply similar ideas into the LR-construction. A comparison between this approach and our construction relies on the specific parameters


of the different primitives that are used, in particular the parameters of the pseudorandom function F' vs. the pseudorandom generator G. For instance, for this approach to be more efficient than our construction, we need that one application of G would be more efficient than ⌈ℓ/n⌉ applications of F'.
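The F = G ∘ F' ∘ h approach can be sketched in a few lines of Python. All three components below are illustrative stand-ins, not constructions from the thesis: an (approximately) pairwise independent hash modulo a prime for h, and SHA-256-based surrogates for the pseudorandom function F' and the generator G.

```python
import hashlib
import random

rng = random.Random(2)
P = (1 << 61) - 1                  # prime modulus for the hash h

a, b = rng.randrange(1, P), rng.randrange(P)
key = rng.randrange(1 << 64)

def h(x: bytes) -> int:
    # (a*x + b) mod P: pairwise independent on inputs below P,
    # only approximately so for longer inputs (illustrative).
    return (a * int.from_bytes(x, "big") + b) % P

def f_prime(v: int) -> bytes:
    # stand-in for a keyed length-preserving PRF on a single short block
    return hashlib.sha256(key.to_bytes(8, "big") + v.to_bytes(8, "big")).digest()[:8]

def G(seed: bytes, out_len: int) -> bytes:
    # stand-in for a pseudorandom generator expanding a short seed
    out = b""
    ctr = 0
    while len(out) < out_len:
        out += hashlib.sha256(seed + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:out_len]

def F(x: bytes) -> bytes:
    # F = G o F' o h : length-preserving on long inputs, one F' call per input
    return G(f_prime(h(x)), len(x))

assert len(F(b"x" * 100)) == 100
assert F(b"x" * 100) == F(b"x" * 100)   # deterministic under the fixed key
```

The efficiency trade-off in the text is visible here: one call to F costs one hash over the whole input, one F' call, and one run of G, versus roughly ⌈ℓ/n⌉ block-function calls in our construction.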

Reducing the Distinguishing Probability

All the constructions of a pseudorandom permutation on many blocks from a pseudorandom function or permutation on a single block that are described in this subsection (including ours) have the following weakness: if the length of a single block is too small (e.g., 64 bits), then the pseudorandom permutation on many blocks is very weak, even when the original pseudorandom function or permutation is very secure (e.g., completely random). In the following few paragraphs we discuss this problem and a way to overcome it.

Consider the permutation S = S(h_1, f_1, f_2, h_2) as in the definition above, where h_1, h_2 ∈ P_{nb} are pairwise independent permutations and f_1, f_2 ∈ F_n are random functions. Our analysis of the security of S (the theorem above) fails when the number of queries that the adversary makes is ≈ 2^{n/2}/b; in fact, this analysis is tight. Having 2^{n/2}/b large enough forces a significant restriction on n. Therefore, a natural question is whether we can improve the security of the construction. A simple information-theoretic argument implies that all such constructions can be distinguished from random using O(2^n/b) queries. This follows from the fact that with O(2^n/b) queries the adversary gets many more bits than the length of the permutation's secret key. Hence, the distribution of the answers to these queries is statistically very different from uniform, which allows an all-powerful adversary to distinguish the permutation from random.

In order to match this bound, we first note that the somewhat high distinguishing probability of S is due to its vulnerability to a birthday-attack on the length of a single block: an adversary that makes ≈ 2^{n/2}/b uniformly chosen queries to S will force a collision in the inputs to f_1 or f_2 with a constant probability. Such a collision foils our analysis and can indeed be used to distinguish S from uniform. The solution lies in the following observation: the problem of foiling birthday-attacks when constructing a pseudorandom permutation on many blocks can be reduced to the problem of foiling birthday-attacks when constructing a pseudorandom function or permutation on two blocks. We demonstrate this using the Aiello and Venkatesan construction of pseudorandom functions.

Let f_1 and f_2 be two independent copies of the pseudorandom functions on 2n bits we get when using truly random functions on n bits in the construction of Aiello and Venkatesan. By their results, distinguishing each f_i from a truly random function with constant probability requires ≈ 2^n queries. Let h_1, h_2 ∈ P_{nb} be pairwise independent permutations and let the permutation S = S(h_1, f_1, f_2, h_2) be as in the theorem above for the parameters 2n (in place of n) and b/2 (in place of b). We now get that distinguishing S from random with constant probability requires ≈ 2^n/b queries, which is optimal.
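The birthday effect underlying this subsection is easy to check numerically. The sketch below (illustrative parameters, not the thesis's) estimates how often a collision appears among m = 2^{n/2} uniform n-bit values for n = 16, where theory predicts a constant probability of roughly 1 − e^{−1/2} ≈ 0.39.

```python
import random

def has_collision(m, n_bits, rng):
    """Draw m uniform n_bits-bit values and report whether any two collide."""
    seen = set()
    for _ in range(m):
        v = rng.getrandbits(n_bits)
        if v in seen:
            return True
        seen.add(v)
    return False

rng = random.Random(3)
n = 16
# Around m = 2**(n/2) samples, a collision appears with constant probability:
# Pr ~ 1 - exp(-m**2 / 2**(n+1)), which is ~0.39 at m = 2**(n//2).
trials = 200
hits = sum(has_collision(2 ** (n // 2), n, rng) for _ in range(trials))
# The empirical rate should be bounded away from both 0 and 1.
assert 0.15 < hits / trials < 0.70
```

Doubling the block length to 2n, as in the Aiello-Venkatesan step above, pushes the collision threshold from 2^{n/2} out to 2^n.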


Constructions of k-Wise Dependent Permutations

In this section we summarize the connection between the various constructions of this chapter and the task of obtaining k-wise dependent permutations. As mentioned above, Schnorr suggested using the LR-construction with truly random functions in order to get a pseudorandom generator that is secure as long as not too many bits of its output are accessible to the adversary. This idea is further treated by Maurer and Massey. Maurer suggested to replace the truly random functions with what he calls locally random (or almost random) functions. In the terminology of k-wise independence, these ideas can be interpreted as a way of using the LR-construction in order to obtain k-wise dependent permutations from k-wise dependent functions (as long as k is not too large). The theorem referred to there implies that:

    when k-wise dependent functions are used instead of pseudorandom functions in the LR-construction, the result is a k-wise δ-dependent permutation for δ = O(k²/2^n).

Similar observations apply to the different constructions of this chapter, as discussed in this section.

Corollary (to the theorem above): Let h_1, h_2 ∈ P_{2n} be pairwise independent permutations and let f_1, f_2 ∈ F_n be k-wise δ-dependent functions. Then S = S(h_1, f_1, f_2, h_2), as in the definition above, is a k-wise δ'-dependent permutation for

    δ' ≝ 2δ + k²/2^n + k²/2^{2n}.

Proof: Let S_1, S_2 ∈ P_{2n} have the following distributions:

S_1 = S(h_1, g_1, f_2, h_2), where h_1, h_2 ∈ P_{2n} are pairwise independent, f_2 ∈ F_n is a k-wise δ-dependent function and g_1 ∈ F_n is a truly random function;

S_2 = S(h_1, g_1, g_2, h_2), where h_1, h_2 ∈ P_{2n} are pairwise independent and g_1, g_2 ∈ F_n are truly random functions;

and let R ∈ P_{2n} be a truly random permutation. It is enough to show that for every k strings of 2n bits, x_1, x_2, ..., x_k, we have

(1) ‖⟨S(x_1), S(x_2), ..., S(x_k)⟩ − ⟨S_1(x_1), S_1(x_2), ..., S_1(x_k)⟩‖ ≤ δ,

(2) ‖⟨S_1(x_1), S_1(x_2), ..., S_1(x_k)⟩ − ⟨S_2(x_1), S_2(x_2), ..., S_2(x_k)⟩‖ ≤ δ, and

(3) ‖⟨S_2(x_1), S_2(x_2), ..., S_2(x_k)⟩ − ⟨R(x_1), R(x_2), ..., R(x_k)⟩‖ ≤ k²/2^n + k²/2^{2n}.

The reason (3) holds is that if we define an oracle machine M such that its i-th query is always x_i and such that

    C_M ≝ { ⟨⟨x_1, y_1⟩, ⟨x_2, y_2⟩, ..., ⟨x_k, y_k⟩⟩ : Pr[⟨S_2(x_1), ..., S_2(x_k)⟩ = ⟨y_1, ..., y_k⟩] > Pr[⟨R(x_1), ..., R(x_k)⟩ = ⟨y_1, ..., y_k⟩] },

we get, by the definition of variation distance and from the theorem above, that

    ‖⟨S_2(x_1), ..., S_2(x_k)⟩ − ⟨R(x_1), ..., R(x_k)⟩‖ = Pr[M^{S_2} = 1] − Pr[M^{R} = 1] ≤ k²/2^n + k²/2^{2n}.

(1) and (2) hold by the definition of k-wise δ-dependent functions. For example, if

    ‖⟨S(x_1), ..., S(x_k)⟩ − ⟨S_1(x_1), ..., S_1(x_k)⟩‖ > δ,

then we can fix h_1, h_2 ∈ P_{2n} and f_2 ∈ F_n in the definition of both S and S_1 such that the inequality still holds. This defines k strings of n bits, z_1, z_2, ..., z_k (not necessarily all different), and a function V for which

    ⟨S(x_1), ..., S(x_k)⟩ = V(⟨f_1(z_1), ..., f_1(z_k)⟩)  and  ⟨S_1(x_1), ..., S_1(x_k)⟩ = V(⟨g_1(z_1), ..., g_1(z_k)⟩).

We get a contradiction since for any function V,

    ‖V(⟨f_1(z_1), ..., f_1(z_k)⟩) − V(⟨g_1(z_1), ..., g_1(z_k)⟩)‖ ≤ ‖⟨f_1(z_1), ..., f_1(z_k)⟩ − ⟨g_1(z_1), ..., g_1(z_k)⟩‖ ≤ δ,

where the last inequality follows from the k-wise δ-dependence of f_1.

In a similar way we get the following two corollaries from the constructions of the preceding sections.

Corollary (to the theorem above): Let S be as in the definition above, where h_1 and h_2 are pairwise independent permutations and f_1, f_2, ..., f_t are k-wise δ-dependent functions. Then S is a k-wise δ'-dependent permutation for

    δ' ≝ t · δ + (k²/2^n)^{⌈t/2⌉}.

Corollary (to the theorem above): Let h_1, h_2 ∈ P_{nb} be pairwise independent permutations, let f_1, f_2 ∈ F_n be (b·k)-wise δ-dependent functions and let p ∈ P_{2n} be a (b·k)-wise δ-dependent permutation. Define S = S(h_1, f_1, f_2, h_2) and S' = S'(h_1, p, h_2) as in the definition above. Then S is a k-wise δ'-dependent permutation for

    δ' ≝ 2δ + (bk)²/2^n + k²/2^{nb},

and S' is a k-wise δ''-dependent permutation for

    δ'' ≝ δ + (bk)²/2^n.


By taking a large enough t in the first corollary above, we get a simple construction of a k-wise δ-dependent permutation on 2n bits for δ as small as we wish. This construction requires t applications of k-wise dependent functions with a single bit of output. An interesting question is to find a simple construction of k-wise δ-dependent permutations for an arbitrarily small δ and an arbitrary k.

An old proposal by Moni Naor is to apply a card-shuffling procedure that requires only few rounds and is oblivious, in the sense that the location of a card after each round depends on only a few random decisions. The specific card shuffling for which the idea was described was suggested by Aldous and Diaconis. Unfortunately, to the best of our knowledge, this procedure was never proven to give (with few rounds) an almost uniform ordering of the cards. Nevertheless, we briefly describe it in order to demonstrate the concept of an oblivious card shuffling and the way that such a procedure can be used to construct a k-wise dependent permutation. Finally, we describe the main idea in the definition of another oblivious card shuffling for which we can prove that only few rounds are needed.

Each round (shuffle) in a card-shuffling procedure is a permutation on the locations of the N cards of a deck, i.e., a permutation on the set [N] ≝ {1, ..., N}. In the case of the Aldous and Diaconis card shuffling, each such permutation is defined by a uniformly chosen N/2-bit string r = r_1 r_2 ... r_{N/2}. Denote this permutation by π_r; then, for 1 ≤ i ≤ N/2,

    π_r(i) = 2i − 1  and  π_r(i + N/2) = 2i      if r_i = 0,
    π_r(i) = 2i      and  π_r(i + N/2) = 2i − 1  otherwise.

That is, the cards at locations i and i + N/2 move to locations 2i − 1 and 2i, and their internal order is uniformly chosen, independently of all other choices. Note that evaluating π_r(x) or π_r^{-1}(x) requires the knowledge of a single bit of r, and therefore this card shuffling is indeed oblivious.
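A 0-indexed Python variant of one round of this shuffle is sketched below (locations 0..N−1 and images 2i, 2i+1, a cosmetic shift from the 1-indexed text). The point to notice is that the image of any single location is computable from one bit of r, which is exactly the obliviousness property.

```python
import random

def shuffle_round(r, N):
    """One round of the Aldous-Diaconis shuffle on locations 0..N-1,
    driven by the N/2 bits in the list r."""
    pi = [None] * N
    half = N // 2
    for i in range(half):
        if r[i] == 0:
            pi[i], pi[i + half] = 2 * i, 2 * i + 1
        else:
            pi[i], pi[i + half] = 2 * i + 1, 2 * i
    return pi

def eval_point(r, N, i):
    """Oblivious evaluation: the image of location i needs only ONE bit of r."""
    half = N // 2
    j = i if i < half else i - half
    low = (i < half) == (r[j] == 0)       # does card i land on the even slot?
    return 2 * j if low else 2 * j + 1

rng = random.Random(4)
N = 8
r = [rng.randrange(2) for _ in range(N // 2)]
pi = shuffle_round(r, N)
assert sorted(pi) == list(range(N))                       # it is a permutation
assert all(eval_point(r, N, i) == pi[i] for i in range(N))
```

Composing s such rounds, as in the next paragraph, costs s single-bit lookups per evaluated point.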

Consider s rounds of the card shuffling described above:

    π^s_{r̄} ≝ π_{r^s} ∘ ... ∘ π_{r^2} ∘ π_{r^1},

where r̄ = {r^1, r^2, ..., r^s} are uniformly distributed and independent of each other. If π^s_{r̄} is of statistical distance at most δ from a uniform permutation, then we can construct a k-wise (δ + δ')-dependent permutation as follows: simply take the s-round permutation π^s_{r̄}, where the s · N/2 bits of r̄ = {r^1, ..., r^s} are the outputs of a (k · s)-wise δ'-dependent binary function f. Evaluating π^s_{r̄} or its inverse at a given point consists of s invocations of f. Therefore, an interesting problem is to show that π^s_{r̄} is of exponentially small statistical distance from a uniform permutation for a small value of s. It has been conjectured that this can be shown for s = O(log N). While this conjecture is, to the best of our knowledge, still open, we can show a different card-shuffling procedure for which it can be proven that O(log N) rounds are sufficient. This card shuffling is defined

in a recursive manner: split the deck into two halves, locations {1, ..., N/2} and locations {N/2 + 1, ..., N}; apply the card shuffling recursively on each half of the deck, and merge the two (now shuffled) halves in an almost uniform way. A permutation M on [N] is a merge if for every i and j such that 1 ≤ i < j ≤ N/2 or N/2 < i < j ≤ N, we have that M(i) < M(j). An oblivious (in the same meaning as above) merging procedure can also be defined recursively, but since the construction is rather cumbersome we omit its description. This direction may become attractive given an efficient and simple merging procedure.


A different direction to solving the problem of constructing k-wise dependent permutations is to try and generalize the algebraic construction of pairwise independent permutations. Leonard Schulman (private communication) suggested such a generalization that yields 3-wise independent permutations. His suggestion is to use sharply transitive permutation groups. A permutation group over the set [n] ≝ {1, ..., n} is a subgroup of the symmetric group S_n. A permutation group G over [n] is k-transitive if for every two k-tuples {a_1, ..., a_k} and {b_1, ..., b_k} of distinct elements of [n] there exists a permutation σ ∈ G such that ∀ 1 ≤ i ≤ k, σ(a_i) = b_i. A permutation group G over [n] is sharply k-transitive if for every two such tuples there exists exactly one permutation σ ∈ G such that ∀ 1 ≤ i ≤ k, σ(a_i) = b_i. A sharply k-transitive permutation group is in particular k-wise independent, and indeed the algebraic construction of pairwise independent permutations uses a sharply 2-transitive permutation group (containing all the linear permutations). Schulman suggested to use the fact that there are known constructions of sharply 3-transitive permutation groups. However, this approach cannot be generalized to larger values of k: from the classification of finite simple groups it follows that for k > 5 there are no k-transitive groups over [n] other than the symmetric group S_n and the alternating group A_n, and there are only few such groups for k = 4 and k = 5. One should be careful not to interpret this as implying that for k > 3 there are no efficient algebraic constructions of k-wise independent permutations. It is, however, justified to deduce that for k > 3 any small family of k-wise independent permutations is not a permutation group (i.e., is not closed under composition and inverse).
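One standard family of sharply 3-transitive groups, which Schulman's suggestion can be instantiated with, is PGL(2, q) acting on the projective line by Mobius maps x -> (ax + b)/(cx + d). The sketch below (an illustrative check, for a small prime q = 5) enumerates all invertible matrices and verifies sharp 3-transitivity: every ordered triple of distinct points is hit by exactly the same number of matrices.

```python
from collections import Counter

p = 5
INF = p  # represent the projective point at infinity as the value p

def mobius(a, b, c, d, x):
    """Action of the matrix [[a, b], [c, d]] on the projective line over GF(p)."""
    if x == INF:
        return (a * pow(c, p - 2, p)) % p if c % p else INF
    den = (c * x + d) % p
    if den == 0:
        return INF
    return ((a * x + b) * pow(den, p - 2, p)) % p   # Fermat inverse mod prime p

images = Counter()
for a in range(p):
    for b in range(p):
        for c in range(p):
            for d in range(p):
                if (a * d - b * c) % p:              # invertible matrices only
                    images[(mobius(a, b, c, d, 0),
                            mobius(a, b, c, d, 1),
                            mobius(a, b, c, d, INF))] += 1

# Sharply 3-transitive: all (p+1)*p*(p-1) ordered triples of distinct points
# occur, each for exactly the same number of matrices.
assert len(images) == (p + 1) * p * (p - 1)
assert len(set(images.values())) == 1
```

Since each group element corresponds to p − 1 scalar multiples of a matrix, uniformly sampling a matrix samples a uniform group element, so the images of any 3 distinct points are uniform over distinct triples, which is exactly 3-wise independence.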

Conclusion and Further Work

The constructions described in the previous sections are optimal in their cryptographic work, in the sense that the total number of bits on which the cryptographic functions are applied is exactly the number of bits in the input. Therefore, it seems that in order to achieve the goal of constructing efficient block ciphers it is sufficient to concentrate on the construction of efficient pseudorandom functions. The depth of the constructions, on the other hand, is twice the depth of the cryptographic functions. It is an interesting question whether there can be a construction of similar depth. The goal of reducing the depth is even more significant in the case of the t-round construction described above. A different question is finding a simple construction of k-wise δ-dependent permutations for an arbitrarily small δ and an arbitrary k. This question is discussed earlier in this chapter.


Bibliography

W. Aiello and R. Venkatesan, Foiling birthday attacks in length-doubling transformations, Advances in Cryptology (EUROCRYPT), Lecture Notes in Computer Science, Springer-Verlag.

M. Ajtai and A. Wigderson, Deterministic simulations of probabilistic constant depth circuits, Proc. IEEE Symp. on Foundations of Computer Science.

D. Aldous and P. Diaconis, Strong uniform times and finite random walks, Advances in Applied Mathematics.

W. B. Alexi, B. Chor, O. Goldreich and C. P. Schnorr, RSA and Rabin functions: certain parts are as hard as the whole, SIAM J. Comput.

N. Alon, L. Babai and A. Itai, A fast and simple randomized parallel algorithm for the maximal independent set problem, J. Algorithms.

N. Alon, O. Goldreich, J. Hastad and R. Peralta, Simple constructions for almost k-wise independent random variables, Random Structures and Algorithms.

R. Anderson and E. Biham, Two practical and provably secure block ciphers: BEAR and LION, Proc. Fast Software Encryption, Lecture Notes in Computer Science, Springer-Verlag.

D. Angluin and M. Kharitonov, When won't membership queries help?, J. Comput. System Sci.

R. Armoni, On the derandomization of space-bounded computations, RANDOM: International Workshop on Randomization and Approximation Techniques in Computer Science, LNCS.

L. Babai, N. Nisan and M. Szegedy, Multiparty protocols, pseudorandom generators for logspace, and time-space trade-offs, J. Comput. System Sci.

E. Bach, How to generate factored random numbers, SIAM J. Comput.

P. W. Beame, S. A. Cook and H. J. Hoover, Log depth circuits for division and related problems, SIAM J. Comput.

M. Bellare, A. Desai, D. Pointcheval and P. Rogaway, Relations among notions of security for public-key encryption schemes, Advances in Cryptology (CRYPTO), Lecture Notes in Computer Science.


M. Bellare, R. Canetti and H. Krawczyk, Pseudorandom functions revisited: the cascade construction, Proc. IEEE Symp. on Foundations of Computer Science.

M. Bellare, O. Goldreich and S. Goldwasser, Incremental cryptography: the case of hashing and signing, Advances in Cryptology (CRYPTO), Lecture Notes in Computer Science, Springer-Verlag.

M. Bellare, O. Goldreich and S. Goldwasser, Incremental cryptography with application to virus protection, Proc. ACM Symp. on Theory of Computing.

M. Bellare and S. Goldwasser, New paradigms for digital signatures and message authentication based on non-interactive zero knowledge proofs, Advances in Cryptology (CRYPTO), Lecture Notes in Computer Science, Springer-Verlag.

M. Bellare, J. Kilian and P. Rogaway, The security of cipher block chaining, Advances in Cryptology (CRYPTO), Lecture Notes in Computer Science, Springer-Verlag.

M. Bellare and S. Micali, Non-interactive oblivious transfer and applications, Advances in Cryptology (CRYPTO), LNCS, Springer.

M. Bellare and P. Rogaway, Block cipher mode of operation for secure length-preserving encryption, manuscript in preparation.

M. Ben-Or, S. Goldwasser and A. Wigderson, Completeness theorems for non-cryptographic fault-tolerant distributed computation, Proc. ACM Symp. on Theory of Computing.

E. Biham, D. Boneh and O. Reingold, Breaking generalized Diffie-Hellman modulo a composite is no easier than factoring, Information Processing Letters.

A. Blum, M. Furst, M. Kearns and R. J. Lipton, Cryptographic primitives based on hard learning problems, Advances in Cryptology (CRYPTO), Lecture Notes in Computer Science, Springer-Verlag.

L. Blum, M. Blum and M. Shub, A simple secure unpredictable pseudorandom number generator, SIAM J. Comput.

M. Blum, W. Evans, P. Gemmell, S. Kannan and M. Naor, Checking the correctness of memories, Algorithmica. Preliminary version: Proc. IEEE Symp. on Foundations of Computer Science.

M. Blum and S. Goldwasser, An efficient probabilistic public-key encryption scheme which hides all partial information, Advances in Cryptology (CRYPTO), LNCS, Springer.

M. Blum and S. Micali, How to generate cryptographically strong sequences of pseudorandom bits, SIAM J. Comput.

D. Boneh and R. Lipton, Algorithms for black-box fields and their application to cryptography, Advances in Cryptology (CRYPTO), LNCS, Springer.


D. Boneh and R. Venkatesan, Hardness of computing the most significant bits of secret keys in Diffie-Hellman and related schemes, Advances in Cryptology (CRYPTO), LNCS, Springer.

S. Brands, An efficient off-line electronic cash system based on the representation problem, CWI Technical Report.

G. Brassard, Modern cryptology, Lecture Notes in Computer Science, Springer-Verlag.

E. F. Brickell, D. M. Gordon, K. S. McCurley and D. B. Wilson, Fast exponentiation with precomputation, Advances in Cryptology (EUROCRYPT), LNCS, Springer.

P. J. Cameron, Finite permutation groups and finite simple groups, Bull. London Math. Soc.

R. Canetti, Towards realizing random oracles: hash functions that hide all partial information, Advances in Cryptology (CRYPTO), LNCS, Springer.

R. Canetti, J. Friedlander and I. Shparlinski, On certain exponential sums and the distribution of Diffie-Hellman triples, Research report, IBM T. J. Watson Research Center.

R. Canetti, O. Goldreich and S. Halevi, The random oracle model, revisited, to appear in Proc. ACM Symp. on Theory of Computing.

R. Canetti, D. Micciancio and O. Reingold, Perfectly one-way probabilistic hash functions, Proc. ACM Symp. on Theory of Computing.

L. Carter and M. Wegman, Universal hash functions, J. of Computer and System Sciences.

D. Chaum and H. van Antwerpen, Undeniable signatures, Advances in Cryptology (CRYPTO), LNCS, Springer.

B. Chor, A. Fiat and M. Naor, Tracing traitors, Advances in Cryptology (CRYPTO), Springer-Verlag.

B. Chor and O. Goldreich, Unbiased bits from sources of weak randomness and probabilistic communication complexity, SIAM J. Comput. Preliminary version in Proc. IEEE Symp. on Foundations of Computer Science.

B. Chor and O. Goldreich, On the power of two-point based sampling, J. Complexity.

R. Cleve, Methodologies for designing block ciphers and cryptographic protocols, Part I, PhD Thesis, University of Toronto.

R. Cleve, Complexity theoretic issues concerning block ciphers related to DES, Advances in Cryptology (CRYPTO), Lecture Notes in Computer Science, Springer-Verlag.


R. Cramer and V. Shoup, A practical public key cryptosystem provably secure against adaptive chosen ciphertext attack, Advances in Cryptology (CRYPTO), LNCS, Springer.

D. Chaum, C. Crepeau and I. Damgard, Multiparty unconditionally secure protocols, Proc. ACM Symp. on Theory of Computing.

A. De Santis, Y. Desmedt, Y. Frankel and M. Yung, How to share a function securely, Proc. ACM Symp. on Theory of Computing.

Y. Desmedt, Society and group oriented cryptography: a new concept, Advances in Cryptology (CRYPTO), LNCS, Springer.

Y. Desmedt and Y. Frankel, Threshold cryptosystems, Advances in Cryptology (CRYPTO), Springer-Verlag, LNCS.

W. Diffie and M. Hellman, New directions in cryptography, IEEE Trans. Inform. Theory.

D. Dolev, C. Dwork and M. Naor, Non-malleable cryptography, Proc. ACM Symp. on Theory of Computing. Full version available at http://www.wisdom.weizmann.ac.il/~naor.

T. ElGamal, A public-key cryptosystem and a signature scheme based on discrete logarithms, Advances in Cryptology (CRYPTO), LNCS, Springer.

S. Even and Y. Mansour, A construction of a cipher from a single pseudorandom permutation, J. of Cryptology. Preliminary version in Advances in Cryptology (ASIACRYPT), Lecture Notes in Computer Science, Springer-Verlag.

R. Fischlin and C. P. Schnorr, Stronger security proofs for RSA and Rabin bits, Advances in Cryptology (EUROCRYPT), Lecture Notes in Computer Science, Springer-Verlag.

M. Franklin and S. Haber, Joint encryption and message-efficient secure computation, J. of Cryptology.

Y. Gertner and T. Malkin, A PSRG based on the decision Diffie-Hellman assumption, preprint.

O. Goldreich, Two remarks concerning the Goldwasser-Micali-Rivest signature scheme, Advances in Cryptology (CRYPTO), Lecture Notes in Computer Science, Springer-Verlag.

O. Goldreich, Towards a theory of software protection, Proc. ACM Symp. on Theory of Computing.

O. Goldreich, A note on computational indistinguishability, Information Processing Letters.


O. Goldreich, Foundations of cryptography (fragments of a book), electronic publication, Electronic Colloquium on Computational Complexity, http://www.eccc.uni-trier.de/.

O. Goldreich, Modern cryptography, probabilistic proofs and pseudorandomness, Algorithms and Combinatorics, Springer-Verlag.

O. Goldreich, S. Goldwasser and S. Micali, How to construct random functions, J. of the ACM.

O. Goldreich, S. Goldwasser and S. Micali, On the cryptographic applications of random functions, Advances in Cryptology (CRYPTO), Lecture Notes in Computer Science, Springer-Verlag.

O. Goldreich and H. Krawczyk, On sparse pseudorandom ensembles, Random Structures and Algorithms.

O. Goldreich and L. Levin, A hard-core predicate for all one-way functions, Proc. ACM Symp. on Theory of Computing.

O. Goldreich, S. Micali and A. Wigderson, How to play any mental game, Proc. ACM Symp. on Theory of Computing.

O. Goldreich and A. Wigderson, Tiny families of functions with random properties: a quality-size trade-off for hashing, Proc. ACM Symp. on Theory of Computing.

S. Goldwasser and S. Micali, Probabilistic encryption, J. Comput. System Sci.

S. Halevi and H. Krawczyk, MMH: software message authentication in the Gbit/second rates, Proc. Fast Software Encryption, Lecture Notes in Computer Science, Springer-Verlag.

J. Hastad, A. W. Schrift and A. Shamir, The discrete logarithm modulo a composite hides O(n) bits, J. of Computer and System Sciences.

J. Hastad, R. Impagliazzo, L. A. Levin and M. Luby, Construction of a pseudorandom generator from any one-way function, SIAM Journal on Computing. Preliminary versions by Impagliazzo et al. in 21st STOC and Hastad in 22nd STOC.

A. Herzberg and M. Luby, Public randomness in cryptography, Advances in Cryptology (CRYPTO), Lecture Notes in Computer Science, Springer-Verlag.

R. Impagliazzo and L. Levin, No better ways to generate hard NP instances than picking uniformly at random, Proc. IEEE Symp. on Foundations of Computer Science.

R. Impagliazzo and M. Naor, Efficient cryptographic schemes provably secure as subset sum, J. of Cryptology.


R. Impagliazzo, N. Nisan and A. Wigderson, Pseudorandomness for network algorithms, Proc. ACM Symp. on Theory of Computing.

R. Impagliazzo and A. Wigderson, P = BPP if E requires exponential circuits: derandomizing the XOR lemma, Proc. ACM Symp. on Theory of Computing.

R. Impagliazzo and A. Wigderson, Randomness vs. time: de-randomization under a uniform assumption, to appear in Proc. IEEE Symp. on Foundations of Computer Science.

R. Impagliazzo and D. Zuckerman, Recycling random bits, Proc. IEEE Symposium on Foundations of Computer Science.

R. M. Karp and V. Ramachandran, Parallel algorithms for shared-memory machines, in J. van Leeuwen (ed.), Handbook of Theoretical Computer Science, vol. A, MIT Press.

M. Kearns and L. Valiant, Cryptographic limitations on learning Boolean formulae and finite automata, J. of the ACM.

M. Kharitonov, Cryptographic hardness of distribution-specific learning, Proc. ACM Symp. on Theory of Computing.

J. Kilian and P. Rogaway, How to protect DES against exhaustive key search, Advances in Cryptology (CRYPTO), Lecture Notes in Computer Science, Springer-Verlag.

T. Koren, On the construction of pseudorandom block ciphers, MSc Thesis (in Hebrew), CS Dept., Technion, Israel.

H. Krawczyk, LFSR-based hashing and authentication, Advances in Cryptology (CRYPTO), Lecture Notes in Computer Science, Springer-Verlag.

M. Langberg, An implementation of efficient pseudorandom functions, at http://www.wisdom.weizmann.ac.il/~naor/prfunc/abs/abs.html.

L. A. Levin, One-way functions and pseudorandom generators, Proc. ACM Symp. on Theory of Computing.

N. Linial, Y. Mansour and N. Nisan, Constant depth circuits, Fourier transform, and learnability, J. of the ACM.

M. Luby, A simple parallel algorithm for the maximal independent set problem, SIAM J. Comput.

M. Luby, Pseudorandomness and cryptographic applications, Princeton University Press.

M. Luby and C. Rackoff, How to construct pseudorandom permutations from pseudorandom functions, SIAM J. Comput.

S. Lucks, Faster Luby-Rackoff ciphers, Proc. Fast Software Encryption, Lecture Notes in Computer Science, Springer-Verlag.


U. M. Maurer, A simplified and generalized treatment of Luby-Rackoff pseudorandom permutation generators, Advances in Cryptology (EUROCRYPT), Lecture Notes in Computer Science, Springer-Verlag.

U. Maurer and S. Wolf, Towards the equivalence of breaking the Diffie-Hellman protocol and computing discrete logarithms, SIAM Journal on Computing. Preliminary version by U. Maurer in Advances in Cryptology (CRYPTO), LNCS, Springer.

U. M. Maurer and J. L. Massey, Local randomness in pseudorandom sequences, J. of Cryptology, Springer-Verlag.

K. McCurley, A key distribution system equivalent to factoring, J. of Cryptology.

K. McCurley, The discrete logarithm problem, Cryptography and Computational Number Theory, Proc. Symp. Appl. Math., AMS Lecture Notes.

J. Naor and M. Naor, Small-bias probability spaces: efficient constructions and applications, SIAM J. Comput.

M. Naor, Bit commitment using pseudorandomness, Journal of Cryptology.

M. Naor and B. Pinkas, Secure and efficient metering, Advances in Cryptology (EUROCRYPT), Lecture Notes in Computer Science, Springer-Verlag.

M. Naor, B. Pinkas and O. Reingold, Distributed pseudo-random functions and KDCs, Advances in Cryptology (EUROCRYPT) Proceedings, LNCS, Springer-Verlag.

M. Naor and O. Reingold, Synthesizers and their application to the parallel construction of pseudorandom functions, Proc. IEEE Symp. on Foundations of Computer Science.

M. Naor and O. Reingold, On the construction of pseudorandom permutations: Luby-Rackoff revisited, to appear in J. of Cryptology. Preliminary version in Proc. ACM Symp. on Theory of Computing.

M. Naor and O. Reingold, Number-theoretic constructions of efficient pseudorandom functions, Proc. IEEE Symp. on Foundations of Computer Science.

M. Naor and O. Reingold, From unpredictability to indistinguishability: a simple construction of pseudorandom functions from MACs, Advances in Cryptology (CRYPTO), Lecture Notes in Computer Science, Springer-Verlag.

M. Naor, O. Reingold and A. Rosen, Pseudorandom functions and factoring, manuscript.

National Bureau of Standards, Data encryption standard, Federal Information Processing Standard, US Department of Commerce, FIPS PUB, Washington, DC.


N. Nisan, Pseudorandom generators for space-bounded computation, Combinatorica.

N. Nisan, RL ⊆ SC, Proc. ACM Symp. on Theory of Computing.

N. Nisan and A. Wigderson, Hardness vs. randomness, J. Comput. System Sci. Preliminary version: Proc. IEEE Symp. on Foundations of Computer Science.

N. Nisan and D. Zuckerman, Randomness is linear in space, J. Comput. System Sci. Preliminary version: More deterministic simulations in logspace, Proc. ACM Symp. on Theory of Computing.

A. M. Odlyzko, Discrete logarithms and smooth polynomials, Contemporary Mathematics, AMS.

Y. Ohnishi, A study on data security, Master Thesis (in Japanese), Tohoku University, Japan.

R. Ostrovsky, An efficient software protection scheme, Proc. ACM Symp. on Theory of Computing.

J. Patarin, Pseudorandom permutations based on the DES scheme, Proc. of EUROCODE, Lecture Notes in Computer Science, Springer-Verlag.

J. Patarin, New results on pseudorandom permutation generators based on the DES scheme, Advances in Cryptology (CRYPTO), Lecture Notes in Computer Science, Springer-Verlag.

J. Patarin, How to construct pseudorandom and super pseudorandom permutations from one single pseudorandom function, Advances in Cryptology (EUROCRYPT), Lecture Notes in Computer Science, Springer-Verlag.

J. Patarin, Improved security bounds for pseudorandom permutations, ACM Conference on Computer and Communications Security.

J. Pieprzyk, How to construct pseudorandom permutations from single pseudorandom functions, Advances in Cryptology (EUROCRYPT), Lecture Notes in Computer Science, Springer-Verlag.

B. Pinkas, private communication.

M. O. Rabin, Digitalized signatures and public-key functions as intractable as factorization, Technical Report, MIT Laboratory for Computer Science.

M. Ram Murty, Artin's conjecture for primitive roots, The Mathematical Intelligencer, Springer-Verlag.

A. Razborov and S. Rudich, Natural proofs, J. of Computer and System Sciences. Preliminary version: Proc. ACM Symp. on Theory of Computing.


J Reif On threshold circuits and p olynomial computation Proc of the nd Conference on

Structure in Complexity Theory pp

J Reif and S Tate On threshold circuits and p olynomial computation SIAM J Comput

vol pp

J H Reif and J D Tygar Ecient parallel pseudorandom number generation SIAM J

Computvol pp

R L Rivest A Shamir and L M Adleman A metho d for obtaining and

public key cryptosystems Comma ACMvol pp

D. J. S. Robinson, A course in the theory of groups, 2nd ed., New York: Springer-Verlag.

P. Rogaway, Bucket hashing and its application to fast message authentication, Advances in Cryptology - CRYPTO, Lecture Notes in Computer Science, Springer-Verlag.

S. Rudich, Limits on the provable consequences of one-way functions, PhD Thesis, U.C. Berkeley.

R. A. Rueppel, On the security of Schnorr's pseudo-random generator, Advances in Cryptology - EUROCRYPT, Lecture Notes in Computer Science, Springer-Verlag.

B. Sadeghiyan and J. Pieprzyk, On necessary and sufficient conditions for the construction of super pseudorandom permutations, Abstracts of ASIACRYPT, Lecture Notes in Computer Science, Springer-Verlag.

B. Sadeghiyan and J. Pieprzyk, A construction for super pseudorandom permutations from a single pseudorandom function, Advances in Cryptology - EUROCRYPT, Lecture Notes in Computer Science, Springer-Verlag.

B. Schneier and J. Kelsey, Unbalanced Feistel networks and block cipher design, Proc. Fast Software Encryption, Lecture Notes in Computer Science, Springer-Verlag.

C. P. Schnorr, On the construction of random number generators and random function generators, Advances in Cryptology - EUROCRYPT, Lecture Notes in Computer Science, Springer-Verlag.

C. P. Schnorr, Efficient identification and signatures for smart cards, Proc. Advances in Cryptology - CRYPTO, Lecture Notes in Computer Science, Springer-Verlag.

A. Shamir, How to share a secret, Comm. ACM.

A. Shamir, On the generation of cryptographically strong pseudorandom number sequences, ACM Trans. Comput. Sys.

C. E. Shannon, Communication theory of secrecy systems, Bell System Technical Journal.

Z. Shmuely, Composite Diffie-Hellman public-key generating systems are hard to break, Technical Report, Computer Science Department, Technion, Israel.

V. Shoup, Lower bounds for discrete logarithms and related problems, Proc. Advances in Cryptology - EUROCRYPT, Lecture Notes in Computer Science, Springer-Verlag.

K.-Y. Siu, J. Bruck, T. Kailath and T. Hofmeister, Depth efficient neural networks for division and related problems, IEEE Trans. Inform. Theory.

K.-Y. Siu and V. P. Roychowdhury, On optimal depth threshold circuits for multiplication and related problems, SIAM J. Disc. Math.

M. Stadler, Publicly verifiable secret sharing, Proc. Advances in Cryptology - EUROCRYPT, Lecture Notes in Computer Science, Springer-Verlag.

M. Steiner, G. Tsudik and M. Waidner, Diffie-Hellman key distribution extended to group communication, Proc. ACM Conference on Computer and Communications Security.

D. Stinson, Universal hashing and authentication codes, Designs, Codes and Cryptography.

L. G. Valiant, A theory of the learnable, Comm. ACM.

U. V. Vazirani, Randomness, adversaries and computation, PhD Thesis, U.C. Berkeley.

U. V. Vazirani, Strong communication complexity or generating quasi-random sequences from two communicating semi-random sources, Combinatorica. Preliminary version in Proc. ACM Symp. on Theory of Computing.

M. Wegman and L. Carter, New hash functions and their use in authentication and set equality, J. of Computer and System Sciences.

A. C. Yao, Theory and applications of trapdoor functions, Proc. IEEE Symp. on Foundations of Computer Science.

Y. Zheng, T. Matsumoto and H. Imai, Impossibility and optimality results on constructing pseudorandom permutations, Advances in Cryptology - EUROCRYPT, Lecture Notes in Computer Science, Springer-Verlag.