
Synthesizers and Their Application to the Parallel Construction of Pseudo-Random Functions

Moni Naor†    Omer Reingold‡

Abstract

A pseudo-random function is a fundamental cryptographic primitive that is essential for encryption, identification and authentication. We present a new cryptographic primitive called pseudo-random synthesizer and show how to use it in order to get a parallel construction of a pseudo-random function. We show several NC^1 implementations of synthesizers based on concrete intractability assumptions such as factoring and the Diffie-Hellman assumption. This yields the first parallel pseudo-random functions based on standard intractability assumptions and the only alternative to the original construction of Goldreich, Goldwasser and Micali. In addition, we show parallel constructions of synthesizers based on other primitives such as weak pseudo-random functions or trapdoor one-way permutations. The security of all our constructions is similar to the security of the underlying assumptions. The connection with problems in computational learning theory is discussed.

∗ A preliminary version of this paper appeared at the Proc. 36th IEEE Symp. on Foundations of Computer Science, 1995.

† Incumbent of the Morris and Rose Goldman Career Development Chair. Dept. of Applied Mathematics and Computer Science, Weizmann Institute of Science, Rehovot, Israel. Research supported by a BSF grant and a grant from the Israel Science Foundation administered by the Israeli Academy of Sciences. Email: naor@wisdom.weizmann.ac.il.

‡ Dept. of Applied Mathematics and Computer Science, Weizmann Institute of Science, Rehovot, Israel. Research supported by a Clore Scholars award and by a grant from the Israel Science Foundation administered by the Israeli Academy of Sciences. Email: reingold@wisdom.weizmann.ac.il.

Introduction

A pseudo-random function, as defined by Goldreich, Goldwasser and Micali, is a function that is indistinguishable from a truly random function to a polynomial-time bounded observer who can access the function as a black-box (i.e., can provide inputs of his choice and gets to see the value of the function on these inputs). Pseudo-random functions are the key component of private-key cryptography. They allow parties who share a common key to send secret messages to each other, to identify themselves and to authenticate messages. In addition, they have many other applications, essentially in any setting that calls for a random function that is provided as a black-box.

Goldreich, Goldwasser and Micali provided a construction of such functions. For roughly a decade this was the only known construction, even under specific assumptions such as "factoring is hard". Their construction is sequential in nature and consists of n successive invocations of a pseudo-random generator, where n is the number of bits in the input to the function. Our goal in this paper is to present an alternative construction for pseudo-random functions that can be implemented in log n phases.

We introduce a new cryptographic primitive which we call a pseudo-random synthesizer. A pseudo-random synthesizer is a two-variable function, S(·,·), so that if many (but polynomially-bounded) random assignments, ⟨x_1, ..., x_m⟩ and ⟨y_1, ..., y_m⟩, are chosen to both variables, then the output of S on all the combinations of these assignments, {S(x_i, y_j)}_{i,j=1}^m, is indistinguishable from random to a polynomial-time observer. Our main results are:

1. A construction of pseudo-random functions based on pseudo-random synthesizers. Evaluating such a function involves log n phases, where each phase consists of several parallel invocations of a synthesizer, with a total of n − 1 invocations altogether.

2. Constructions of parallel (NC^1) synthesizers based on standard number-theoretic assumptions such as "factoring is hard", RSA (it is hard to extract roots modulo a composite) and Diffie-Hellman. In addition, a very simple construction based on a problem from learning. The key-generating algorithm of these constructions is sequential for RSA and factoring, non-uniformly parallel for Diffie-Hellman, and parallel for the learning problem.

3. An extremely simple (and also parallel) construction of synthesizers based on what we call a weak pseudo-random function. A weak pseudo-random function is indistinguishable from a truly random function to a (polynomial-time bounded) observer who gets to see the value of the function on uniformly distributed inputs, instead of any input of its choice. This construction almost immediately implies constructions of synthesizers based on trapdoor one-way permutations and based on any hard-to-learn problem, under the definition of hardness of learning used by Blum et al.

Taking (1) and (2) together, we get a pseudo-random function that can be evaluated in NC^2.

We note that our constructions do not weaken the security of the underlying assumptions. Take for instance the construction that is based on factoring. If there is an algorithm for breaking this construction in time t with success ε (success means that the observer has an advantage of at least ε in distinguishing the pseudo-random function from the random one), then there is an algorithm that works in time poly(t) and factors Blum-integers with probability poly(ε/t) (see Luby's treatment for a discussion of security-preserving reductions).

(Footnote: In that terminology, such a reduction is called poly-preserving. In fact, most of our reductions, such as the reduction from the security of the pseudo-random functions to the security of the pseudo-random synthesizers, are linear-preserving. The only place where our reductions are not linear-preserving is when they rely on hard-core bits.)

Our constructions of pseudo-random functions have additional attractive properties. First, it is possible to obtain from the constructions a sharp time-space tradeoff. Loosely speaking, by keeping m strings as the key we can reduce the amount of work for computing the functions from n − 1 invocations of the synthesizer to about n/log m invocations, in log n − log log m phases, thus also reducing the parallel-time complexity. In addition, the construction obtains a nice incremental property: for any y of Hamming distance one from x, given the computation of f(x), we can compute f(y) with only log n invocations of the synthesizer (we can also make this property hold for y = x + 1). We discuss both properties in Section 6.

Applications of NC-Computable Pseudo-Random Functions

The class NC has been criticized as a model for parallel computation, for two main reasons:

1. It ignores communication delays and other parameters that determine the execution time on an actual parallel machine.

2. It overemphasizes latency, rather than the speedup, of problems.

These criticisms seem less valid for the problem of constructing pseudo-random functions, since (a) it is likely that such functions will be implemented in a special-purpose circuit, as there are DES chips, and (b) for some applications of pseudo-random functions, minimizing the latency of computing the functions is essential. Such an application is the encryption of messages on a network, where the latency of computing the function is added to the latency of the network. Furthermore, if the complexity of evaluating a synthesizer on a given input is comparable to that of a pseudo-random generator, then the work performed by our construction is comparable to that of the GGM construction and we can get optimal speedup.

Note that many of the applications of pseudo-random functions preserve the parallel-time complexity of the functions. An important example is the Luby and Rackoff construction of pseudo-random permutations from pseudo-random functions. Their construction is very simple and involves four invocations of a pseudo-random function in order to evaluate the pseudo-random permutation at a given point (see also the subsequent optimal construction that requires only two invocations). Therefore, our constructions yield strong pseudo-random permutations in NC as well.

There is a deep connection between pseudo-random functions and hardness results for learning. Since a random function cannot be learned, if a concept class is strong enough to contain pseudo-random functions then we cannot hope to learn it efficiently. Since no construction of pseudo-random functions in NC was known, several ways of bypassing this were suggested. However, these yield weaker unlearnability results than the one obtained from pseudo-random functions. The existence of pseudo-random functions in a concept class implies that there exists a distribution of concepts in this class that is hard for every learning algorithm, for every nontrivial distribution on inputs, even when membership queries are allowed. Finding such a distribution of concepts is still of interest to learning theory. We discuss the connection between our work and learning theory in Section 9.

Another application of pseudo-random functions in complexity was suggested by the work of Razborov and Rudich on Natural Proofs. They showed that if a circuit class contains pseudo-random functions that are secure against a subexponential-time adversary, then there are no "Natural Proofs" (which include all known lower-bound techniques) for separating this class from P/poly. Given our constructions, the existence of Natural Proofs for separating NC from P/poly would imply that several well-established intractability assumptions are false.

The question of whether pseudo-random functions exist in NC is also interesting in contrast to the lower bound of Linial, Mansour and Nisan, which implies that there are no pseudo-random functions in AC^0.

Previous Work

In addition to introducing pseudo-random functions, Goldreich, Goldwasser and Micali suggested a construction of such functions from pseudo-random generators that expand the input by a factor of two. As mentioned above, the GGM construction is sequential in nature. An idea of Levin is to select some secret hash function h and apply the GGM construction to h(x) instead of x. If |h(x)| = log^2 n, then the depth of the GGM-tree is only log^2 n and presumably we get a pseudo-random function in NC. The problem with this idea is that we have decreased the security significantly: with probability n^{−log n} the function can be broken, irrespective of the security guaranteed by the pseudo-random generator. To put this construction in the correct light, suppose that for security parameter k we have some problem whose solution requires time 2^k on instances of length polynomial in k. If we would like to have security 2^k for our pseudo-random function, then the Levin construction requires depth k whereas our construction requires depth log k.

Impagliazzo and Naor have provided parallel constructions for several other cryptographic primitives, based on the hardness of subset sum and of factoring. The primitives include pseudo-random generators that expand the input by a constant factor, universal one-way hash functions, and strong bit-commitments.

Blum et al. proposed a way of constructing in parallel several cryptographic primitives based on problems that are hard to learn. We extend their result by showing that hard-to-learn problems can be used to obtain synthesizers and thus pseudo-random functions.

A different line of work, more relevant to derandomization and saving random bits, is to construct bit-generators such that their output is indistinguishable from a truly random source to an observer of restricted computational power (e.g., generators against polynomial-size constant-depth circuits). Most of these constructions need no unproven assumptions.

In a subsequent work, we describe constructions of pseudo-random functions and other cryptographic primitives that are at least as secure as the decisional version of the Diffie-Hellman assumption or as the assumption that factoring is hard. These functions can be computed in NC^1 (in fact, even in TC^0) and are much more efficient than the concrete constructions of this paper. It is interesting to note that that work is motivated by this paper and, in particular, by the concept of pseudo-random synthesizers.

Organization of the Paper

In Section 3 we define pseudo-random synthesizers (and collections of pseudo-random synthesizers) and discuss their properties. In Section 4 we describe our parallel construction of pseudo-random functions from pseudo-random synthesizers, and in Section 5 we prove its security. In Section 6 we describe a related construction of pseudo-random functions; in addition, we discuss the time-space tradeoff and the incremental property of our constructions. In Section 7 we discuss the relations between pseudo-random synthesizers and other cryptographic primitives. In Section 8 we describe constructions of pseudo-random synthesizers based on several number-theoretic assumptions. In Section 9 we show how to construct pseudo-random synthesizers from hard-to-learn problems and consider a very simple concrete example; we also discuss the application of parallel pseudo-random functions to learning theory. In Section 10 we suggest topics for further research.

(Footnote: They also provided a construction of AC^0 pseudo-random generators with small expansion.)

Preliminaries

Notation

- N denotes the set of all natural numbers.

- I_n denotes the set of all n-bit strings, {0,1}^n.

- U_n denotes the random variable uniformly distributed over I_n.

- Let X be any random variable; we denote by X^{k×k} the k × k matrix whose entries are independently identically distributed according to X. We denote by X^k the vector X^{1×k}.

- We identify functions of two variables and functions of one variable in the natural way, i.e., by letting f : I_n × I_n → I_k be equivalent to f' : I_{2n} → I_k and letting f'(x ◦ y) be the same value as f(x, y), where x ◦ y stands for x concatenated with y.

- Let x be any bit string; we denote by |x| its length (i.e., the number of bits in x). This should not be confused with the usage of |·| as absolute value.

- For any two bit strings of the same length, x and y, the inner product mod 2 of x and y is denoted by x ⊙ y.

Pseudo-Random Functions

For the sake of completeness and concreteness, we briefly review in this section the concept of pseudo-random functions, almost as it appears in the work of Goldreich, Goldwasser and Micali; Goldreich's Foundations of Cryptography is another good reference on pseudo-random functions. Informally, a pseudo-random function ensemble is an efficient distribution of functions that cannot be efficiently distinguished from the uniform distribution. That is, an efficient algorithm that gets a function as a black box cannot tell, with non-negligible success probability, from which of the two distributions it was sampled. To formalize this, we first define function ensembles and efficiently computable function ensembles.

Definition (function ensemble): Let ℓ and k be any two N → N functions. An (I_{ℓ(n)} → I_{k(n)}) function ensemble is a sequence F = {F_n}_{n∈N} of random variables such that the random variable F_n assumes values in the set of (I_{ℓ(n)} → I_{k(n)}) functions. The uniform (I_{ℓ(n)} → I_{k(n)}) function ensemble, R = {R_n}_{n∈N}, has R_n uniformly distributed over the set of (I_{ℓ(n)} → I_{k(n)}) functions.

Definition (efficiently computable function ensemble): A function ensemble, F = {F_n}_{n∈N}, is efficiently computable if there exist probabilistic polynomial-time algorithms, I and V, and a mapping from strings to functions, φ, such that φ(I(1^n)) and F_n are identically distributed and V(i, x) = (φ(i))(x). We denote by f_i the function assigned to i (i.e., f_i = φ(i)). We refer to i as the key of f_i and to I as the key-generating algorithm of F.

For simplicity, we concentrate in the definition of pseudo-random functions (and in their construction) on length-preserving functions. The distinguisher in our setting is defined to be an oracle machine that can make queries to a length-preserving function, which is either sampled from the pseudo-random function ensemble or from the uniform function ensemble. We assume that on input 1^n the oracle machine makes only n-bit queries. For any probabilistic oracle machine, M, and any I_n → I_n function, O, we denote by M^O(1^n) the distribution of M's output on input 1^n and with access to O.

Definition (efficiently computable pseudo-random function ensemble): An efficiently computable (I_n → I_n) function ensemble, F = {F_n}_{n∈N}, is pseudo-random if for every probabilistic polynomial-time oracle machine M, every polynomial p(·), and all sufficiently large n's,

    | Pr[M^{F_n}(1^n) = 1] − Pr[M^{R_n}(1^n) = 1] | < 1/p(n),

where R = {R_n}_{n∈N} is the uniform (I_n → I_n) function ensemble.

In the rest of this paper, the term "pseudo-random functions" is used as an abbreviation for "efficiently computable pseudo-random function ensemble".

Remark: In the definition above, and in the rest of the paper, we interpret "efficient computation" as probabilistic polynomial-time and "negligible" as smaller than 1/poly. This is a rather standard choice, and it significantly simplifies the presentation of the paper. However, from each one of the proofs in this paper one can easily extract a more quantitative version of the corresponding result. As mentioned in the introduction, the different reductions of this paper are security-preserving.
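To make the interface of this definition concrete, the distinguishing experiment can be phrased as a small sampling harness. Everything below is a toy sketch: the keyed SHA-256 family stands in for a sampled f_i (no pseudo-randomness claim is made for it), and the oracle machine M is a trivial collision test; the only point is the shape of the quantity | Pr[M^{F_n}(1^n) = 1] − Pr[M^{R_n}(1^n) = 1] |.

```python
import hashlib
import os

def pseudo_f(key):
    # toy stand-in for a sampled f_i: a keyed length-preserving function
    # (illustrative only -- no pseudo-randomness claim is made for it)
    return lambda x: hashlib.sha256(key + x).digest()[: len(x)]

def random_f(n_bytes):
    # a truly random function I_n -> I_n, sampled lazily
    table = {}
    def f(x):
        if x not in table:
            table[x] = os.urandom(n_bytes)
        return table[x]
    return f

def M(oracle, n_bytes, queries=32):
    # a (deliberately weak) oracle machine: outputs 1 iff some output repeats
    seen = set()
    for _ in range(queries):
        y = oracle(os.urandom(n_bytes))
        if y in seen:
            return 1
        seen.add(y)
    return 0

def advantage(trials=50, n_bytes=8):
    # estimate |Pr[M^F = 1] - Pr[M^R = 1]| by sampling
    p_f = sum(M(pseudo_f(os.urandom(16)), n_bytes) for _ in range(trials)) / trials
    p_r = sum(M(random_f(n_bytes), n_bytes) for _ in range(trials)) / trials
    return abs(p_f - p_r)
```

For a pseudo-random ensemble the definition demands that every such machine's advantage be negligible; this trivial M, of course, achieves essentially none.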

Pseudo-Random Synthesizers

As mentioned above, we introduce in this paper a new cryptographic primitive called a pseudo-random synthesizer. In this section we define pseudo-random synthesizers and describe their properties.

Motivation

Pseudo-random synthesizers are efficiently computable functions of two variables. The significant feature of such a function, S, is that given polynomially-many uniformly distributed assignments, ⟨x_1, ..., x_m⟩ and ⟨y_1, ..., y_m⟩, for both variables, the output of S on all the combinations of these assignments, {S(x_i, y_j)}_{i,j=1}^m, is pseudo-random (i.e., is indistinguishable from random to a polynomial-time observer). This is a strengthening of an important property of pseudo-random generators: the indistinguishability of a polynomial sample.

A pseudo-random (bit) generator is a polynomial-time computable function, G : {0,1}* → {0,1}*, such that (a) for all x ∈ I_n, |G(x)| = ℓ(n) > n, and (b) G(U_n) is pseudo-random, i.e., {G(U_n)}_{n∈N} and {U_{ℓ(n)}}_{n∈N} are computationally indistinguishable. It turns out that this definition implies the following: given polynomially-many uniformly distributed assignments, ⟨z_1, ..., z_m⟩, the sequence {G(z_i)}_{i=1}^m is pseudo-random.

The major idea behind the definition of pseudo-random synthesizers is to obtain a function, S, such that {S(z_i)}_i remains pseudo-random even when the z_i's are not completely independent. More specifically, pseudo-random synthesizers require that the outputs remain pseudo-random even when the inputs are of the form {x_i ◦ y_j}_{i,j=1}^m. This paper shows that, under some standard intractability assumptions, it is possible to obtain such a function S, and that this property is indeed very powerful. As a demonstration of their strength, we note below that pseudo-random synthesizers are useful even when no restriction is made on their output length, which is very different from what we have for pseudo-random generators.

Remark: It is important to note that there exist pseudo-random generators that are not pseudo-random synthesizers. An immediate example is a generator G defined by G(x ◦ y) = G'(x) ◦ y, where G' is also a pseudo-random generator. A more natural example is the subset-sum generator, G = G_{a_1,a_2,...,a_n}, defined by G(z) = Σ_{i : z_i = 1} a_i. This is not a pseudo-random synthesizer (for fixed values a_1, a_2, ..., a_n), since for every four n-bit strings x_1, x_2, y_1 and y_2 we have that G(x_1 ◦ y_1) + G(x_2 ◦ y_2) = G(x_1 ◦ y_2) + G(x_2 ◦ y_1).
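The failure can be checked mechanically: whatever the fixed coefficients are, the four combinations satisfy the linear relation above, which a distinguisher can test. A minimal sketch (toy parameters, sums taken over the integers):

```python
import random

n = 8  # x1, x2, y1, y2 are n-bit; G takes 2n input bits
# fixed subset-sum coefficients a_1, ..., a_{2n} (arbitrary toy values)
a = [random.randrange(1 << n) for _ in range(2 * n)]

def G(z):
    # subset-sum generator: sum the coefficients selected by the bits of z
    return sum(ai for ai, zi in zip(a, z) if zi)

def rand_bits(k):
    return [random.randrange(2) for _ in range(k)]

x1, x2, y1, y2 = (rand_bits(n) for _ in range(4))
# the four combinations always satisfy this linear relation, so the
# combinations matrix is easily distinguished from a random matrix
assert G(x1 + y1) + G(x2 + y2) == G(x1 + y2) + G(x2 + y1)
```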

Formal Definition

We first introduce an additional notation, to formalize the phrase "all different combinations":

Notation: Let f be an (I_{2n} → I_ℓ) function and let X = {x_1, ..., x_k} and Y = {y_1, ..., y_m} be two sequences of n-bit strings. We define C_f(X, Y) to be the k × m matrix whose (i, j) entry is f(x_i ◦ y_j). (C stands for "combinations".)
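In code, the combinations matrix is just an outer application of f; a minimal sketch (the stand-in f below is of course not a synthesizer):

```python
def combinations_matrix(f, X, Y):
    # C_f(X, Y): the k x m matrix whose (i, j) entry is f(x_i, y_j)
    return [[f(x, y) for y in Y] for x in X]

# toy stand-in for f (NOT a pseudo-random synthesizer)
f = lambda x, y: (x * y) % 257
M = combinations_matrix(f, [3, 5], [2, 4, 6])  # a 2 x 3 matrix
```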

We can now define what a pseudo-random synthesizer is:

Definition (pseudo-random synthesizer): Let ℓ be any N → N function and let S : {0,1}* × {0,1}* → {0,1}* be a polynomial-time computable function such that for all x, y ∈ I_n, |S(x, y)| = ℓ(n). Then S is a pseudo-random synthesizer if for every probabilistic polynomial-time algorithm D, every two polynomials p(·) and m(·), and all sufficiently large n's,

    | Pr[D(C_S(X, Y)) = 1] − Pr[D(U_{ℓ(n)}^{m(n)×m(n)}) = 1] | < 1/p(n),

where X and Y are independently drawn from U_n^{m(n)}. I.e., for random X and Y, the matrix C_S(X, Y) cannot be efficiently distinguished from a random matrix.

S

Expanding the Output Length

In the definition above, no restriction was made on the output-length function ℓ of the pseudo-random synthesizer. However, our parallel construction of pseudo-random functions uses parallel pseudo-random synthesizers with linear output length, ℓ(n) = n. The following lemma shows that any synthesizer S can be used to construct another synthesizer S' with large output length, such that S and S' have the same parallel time complexity. Therefore, for the construction of pseudo-random functions in NC, it is enough to show the existence of synthesizers with constant output length in NC.

Lemma: Let S be a pseudo-random synthesizer with an arbitrary output-length function ℓ in NC^i (resp. AC^i). Then for every constant 0 < δ < 2 there exists a pseudo-random synthesizer S' in NC^i (resp. AC^i) such that its output-length function ℓ' satisfies ℓ'(n) ≥ n^δ.

Proof: For every constant c, define S^c as follows. Let k_n = max{k ∈ Z : k^c ≤ n}. On input x, y ∈ I_n, regard the first (k_n)^c bits of x and of y as two length-k_n sequences, X and Y, of (k_n)^{c−1}-bit strings. S^c(x, y) is defined to be C_S(X, Y), viewed as a single bit string rather than a matrix.

Notice that the following properties hold for S^c:

1. S^c is indeed a pseudo-random synthesizer: For any polynomial m, let X' and Y' be independently drawn from U_n^{m(n)}, and let X and Y be independently drawn from U_{(k_n)^{c−1}}^{m(n)·k_n}. By the definition of S^c, the distributions C_{S^c}(X', Y') and C_S(X, Y) are identical. Taking into account the fact that n is polynomial in k_n, we conclude that every polynomial-time distinguisher for S^c is also a polynomial-time distinguisher for S. Since S is a pseudo-random synthesizer, so is S^c.

2. Let ℓ^c denote the output-length function of S^c. The matrix C_S(X, Y) has (k_n)^2 entries of at least one bit each, so ℓ^c(n) ≥ (k_n)^2. Since c is a constant and n < (k_n + 1)^c, it holds for every sufficiently large n that ℓ^c(n) ≥ (k_n)^2 = Ω(n^{2/c}).

3. S^c is in NC^i (resp. AC^i): immediate from the definition of S^c.

Thus, by taking S' to be S^c for some constant c ≤ 2/δ, we obtain the lemma. □
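The padding construction S^c from the proof can be sketched as follows, with an illustrative one-bit-output stand-in for S (a real synthesizer would take its place):

```python
def make_Sc(S, c):
    # S^c: regard the first (k_n)^c bits of each n-bit input as k_n strings
    # of (k_n)^(c-1) bits each, and output the combinations matrix C_S(X, Y)
    # as one long bit string
    def Sc(x, y):
        n = len(x)
        k = 1
        while (k + 1) ** c <= n:  # k_n = max{k : k^c <= n}
            k += 1
        block = k ** (c - 1)
        X = [x[i * block:(i + 1) * block] for i in range(k)]
        Y = [y[i * block:(i + 1) * block] for i in range(k)]
        return "".join(S(xi, yj) for xi in X for yj in Y)
    return Sc

# stand-in one-bit-output S: inner product mod 2 (illustrative only)
S = lambda x, y: str(sum(int(a) & int(b) for a, b in zip(x, y)) % 2)
Sc = make_Sc(S, 2)
out = Sc("1011010011100101", "0110111001010010")  # n = 16, so k_n = 4
```

With c = 2 and n = 16 the output has (k_n)^2 = 16 bits, matching the lower bound ℓ^c(n) ≥ (k_n)^2 from the proof.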

The construction of the lemma above has the advantage that it is very simple and that the parallel time complexity of S and S' is identical. Nevertheless, it has an obvious disadvantage: the security of S' is related to the security of S on a much smaller input length. For example, if ℓ(n) ≡ 1 and ℓ'(n) ≥ n, then the security of S' on k^2-bit strings is related to the security of S on k-bit strings. This results in a substantial increase in the time and space of any construction that uses S'.

We now show an alternative construction that is more security-preserving. The alternative construction uses a pseudo-random generator G that expands the input by a factor of two and relies on the GGM construction.

Corollary (of the GGM construction): Let G be a pseudo-random generator in NC^i (resp. AC^i) such that for all s, |G(s)| = 2|s|. Then for every polynomial p there exists a pseudo-random generator G' in NC^{i+1} (resp. AC^{i+1}) such that for all s, |G'(s)| = p(|s|)·|s|.

G' is defined as follows: On input s, it computes G(s) = s_1 ◦ s_2 (with |s_1| = |s_2| = |s|) and recursively generates half of the required p(|s|)·|s| bits from s_1 and half from s_2. The number of levels required is ⌈log p(|s|)⌉ = O(log |s|).
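The recursive doubling can be sketched as follows; the SHA-256-based length-doubler below is only a stand-in for a real pseudo-random generator G, chosen so the recursion can be exercised:

```python
import hashlib

def toy_double(s: bytes) -> bytes:
    # stand-in for G with |G(s)| = 2|s| (NOT a secure pseudo-random generator;
    # it only lets us exercise the recursion structure)
    h0 = hashlib.sha256(b"0" + s).digest()
    h1 = hashlib.sha256(b"1" + s).digest()
    return (h0 + h1)[: 2 * len(s)]

def expand(G, s: bytes, out_len: int) -> bytes:
    # G': produce out_len bytes from seed s; each level splits G(s) into two
    # seeds of length |s|, so O(log(out_len / |s|)) levels suffice
    if out_len <= len(s):
        return s[:out_len]
    doubled = G(s)
    left, right = doubled[: len(s)], doubled[len(s):]
    half = out_len // 2
    return expand(G, left, out_len - half) + expand(G, right, half)

stream = expand(toy_double, b"seed_bytes_here!", 100)
```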

Using this corollary, we get:

Lemma: Let S be a pseudo-random synthesizer with an arbitrary output-length function ℓ in NC^i (resp. AC^i). Let G be a pseudo-random generator in NC^j (resp. AC^j) such that for all s, |G(s)| = 2|s|. Let k denote max{i, j+1}. Then for every positive constant c there exists a pseudo-random synthesizer S' in NC^k (resp. AC^k) such that its output-length function ℓ' satisfies ℓ'(n) ≥ n^c. Furthermore, the construction of S' is linear-preserving; the exact meaning of this claim is described below.

Proof sketch: S' is defined as follows. On input x, y ∈ I_n, compute X = G'(x) = {x_1, ..., x_{⌈n^{c/2}⌉}} and Y = G'(y) = {y_1, ..., y_{⌈n^{c/2}⌉}}, viewing the output of G' as a sequence of n-bit strings, where G' is the pseudo-random generator that is guaranteed to exist by the corollary above. S'(x, y) is defined to be C_S(X, Y).

It is immediate that S' is in NC^k (resp. AC^k) and that ℓ'(n) ≥ (⌈n^{c/2}⌉)^2 ≥ n^c. It is also not hard to verify that S' is indeed a pseudo-random synthesizer, and (from the proof of the corollary) that the construction of S' is linear-preserving in the following sense. Assume that there exists an algorithm that works in time t(n) and distinguishes C_{S'}(X', Y') from U_{ℓ'(n)}^{m'(n)×m'(n)} with bias ε(n), where X' and Y' are independently drawn from U_n^{m'(n)}. Let m(n) = m'(n)·⌈n^{c/2}⌉. Then one of the following holds:

1. The same algorithm distinguishes C_S(X, Y) from U_{ℓ(n)}^{m(n)×m(n)} with bias ε(n)/2, where X and Y are independently drawn from U_n^{m(n)}.

2. There exists an algorithm that works in time t(n) + m'(n)·poly(n) and distinguishes G(U_n) from random with bias ε(n)/O(m'(n)).

□

The construction of this second lemma is indeed more security-preserving than the first one, since the security of S' relates to the security of S and of G on the same input length. However, the time complexity of S' is still substantially larger than the time complexity of S, and the parallel time complexity of S' might also be larger. Given the drawbacks of both constructions, it seems that a direct construction of efficient and parallel synthesizers with linear output length is very desirable.
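The expand-then-combine construction of this lemma can be sketched as follows, again with toy stand-ins for G (here, trivial doubling) and for S:

```python
def expand_to_seq(Gp, s, count):
    # view the output of a length-doubling generator Gp as `count` strings,
    # each of len(s) symbols, by repeated doubling and splitting
    seqs = [s]
    while len(seqs) < count:
        seqs = [half for t in seqs
                for half in (Gp(t)[: len(t)], Gp(t)[len(t):])]
    return seqs[:count]

def make_Sprime(S, Gp, count):
    # S'(x, y) = C_S(G'(x), G'(y)): the output grows to count^2 entries
    def Sp(x, y):
        X = expand_to_seq(Gp, x, count)
        Y = expand_to_seq(Gp, y, count)
        return [S(xi, yj) for xi in X for yj in Y]
    return Sp

# toy stand-ins (illustrative only, with no security properties)
Gp = lambda s: (s * 2)[: 2 * len(s)]   # trivially length-doubling
S = lambda x, y: x[0] ^ y[0]           # one-symbol output
Sp = make_Sprime(S, Gp, 4)
out = Sp(bytes([1, 2, 3, 4]), bytes([5, 6, 7, 8]))  # 4 * 4 = 16 entries
```

Unlike the padding construction, both stand-ins are applied at the same input length n, which is the source of the better security preservation.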

Collections of Pseudo-Random Synthesizers

A natural way to relax the definition of a pseudo-random synthesizer is to allow a distribution of functions for every input length, rather than a single function. To formalize this, we use the concept of an efficiently computable function ensemble, as defined above.

Definition (collection of pseudo-random synthesizers): Let ℓ be any N → N function and let S = {S_n}_{n∈N} be an efficiently computable (I_{2n} → I_{ℓ(n)}) function ensemble. S is a collection of (I_{2n} → I_{ℓ(n)}) pseudo-random synthesizers if for every probabilistic polynomial-time algorithm D, every two polynomials p(·) and m(·), and all sufficiently large n's,

    | Pr[D(C_{S_n}(X, Y)) = 1] − Pr[D(U_{ℓ(n)}^{m(n)×m(n)}) = 1] | < 1/p(n),

where X and Y are independently drawn from U_n^{m(n)}.

As shown below, a collection of pseudo-random synthesizers is sufficient for our construction of pseudo-random functions. Working with a collection of synthesizers, rather than a single synthesizer, enables us to move some of the computation into a preprocessing stage during the key generation. This is especially useful if all other computations can be done in parallel.

Note that both lemmas above easily extend to collections of synthesizers.

A Parallel Construction of Pseudo-Random Functions

This section describes the construction of pseudo-random functions using pseudo-random synthesizers as building blocks. The intuition of this construction is best explained through the concept of a k-dimensional pseudo-random synthesizer, a natural generalization of the (regular) two-dimensional synthesizer. Informally, an efficiently computable function of k variables, S^(k), is a k-dimensional pseudo-random synthesizer if:

    Given polynomially-many uniformly-chosen assignments {a_{j,i}}_{i=1}^m for each of the k variables, the output of S^(k) on all the combinations,
        M_{⟨i_1,...,i_k⟩} = S^(k)(a_{1,i_1}, a_{2,i_2}, ..., a_{k,i_k}),
    cannot be efficiently distinguished from uniform by an algorithm that can access M at points of its choice.

Note that this definition is somewhat different from the two-dimensional case. For any constant k, and in particular for k = 2, the matrix M is of polynomial size and we can give it as an input to the distinguisher. In general, M might be too large, and therefore we let the distinguisher access M at points of its choice.

Using this concept, the construction of pseudo-random functions can be described in two steps:

1. A parallel construction of an n-dimensional synthesizer, S^(n), from a two-dimensional synthesizer, S, that has output length ℓ(n) = n. This is a recursive construction, where the 2k-dimensional synthesizer, S^(2k), is defined using a k-dimensional synthesizer, S^(k):

    S^(2k)(x_1, x_2, ..., x_{2k}) = S^(k)(S(x_1, x_2), S(x_3, x_4), ..., S(x_{2k−1}, x_{2k})).

2. An immediate construction of the pseudo-random function, f, from S^(n):

    f_{⟨a_{1,0}, a_{1,1}, a_{2,0}, a_{2,1}, ..., a_{n,0}, a_{n,1}⟩}(x_1 x_2 ... x_n) = S^(n)(a_{1,x_1}, a_{2,x_2}, ..., a_{n,x_n}).

In fact, pseudo-random functions can be constructed from a collection of synthesizers. In this case, for each level of the recursion a different synthesizer is sampled from the collection. As will be noted below, for some collections of synthesizers (such as those constructed in this paper) it is enough to sample a single synthesizer for all levels.
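Step 1 can be transcribed directly: the recursion is just repeated pairwise squeezing. In the sketch below, S is an arbitrary placeholder two-variable function:

```python
def synth_nd(S, inputs):
    # S^(n): squeeze the inputs pairwise with S until a single value remains;
    # this takes ceil(log2(k)) levels of parallel S-invocations
    # (k - 1 invocations in total for k inputs)
    vals = list(inputs)
    while len(vals) > 1:
        nxt = [S(vals[i], vals[i + 1]) for i in range(0, len(vals) - 1, 2)]
        if len(vals) % 2:
            nxt.append(vals[-1])
        vals = nxt
    return vals[0]

# placeholder two-variable S (illustrative only, not pseudo-random)
S = lambda x, y: (3 * x + 5 * y + 1) % 64
val = synth_nd(S, [7, 12, 33, 2])
```

On four inputs this computes exactly S(S(x_1, x_2), S(x_3, x_4)), matching the recursive definition above.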

Formal Definition

The following operation on sequences is used in the construction:

Definition: For every function S : I_{2n} → I_n and every sequence L = {l_1, l_2, ..., l_k} of n-bit strings, define SQ_S(L) to be the sequence L' = {l'_1, l'_2, ..., l'_{⌈k/2⌉}}, where l'_i = S(l_{2i−1}, l_{2i}) for 1 ≤ i ≤ ⌊k/2⌋ and, if k is odd, l'_{⌈k/2⌉} = l_k. (SQ stands for "squeeze".)
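A direct transcription of the squeeze operation (with an arbitrary stand-in for the synthesizer S):

```python
def squeeze(S, L):
    # SQ_S(L): apply S to consecutive pairs (l_1, l_2), (l_3, l_4), ...;
    # if the length of L is odd, the last element is passed through unchanged
    out = [S(L[2 * i], L[2 * i + 1]) for i in range(len(L) // 2)]
    if len(L) % 2:
        out.append(L[-1])
    return out

pairsum = lambda x, y: x + y       # stand-in for a synthesizer
result = squeeze(pairsum, [1, 2, 3, 4, 5])
```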

We now turn to the construction itself:

Construction (Pseudo-Random Functions): Let S = {S_n}_{n∈N} be a collection of (I_{2n} → I_n) pseudo-random synthesizers and let I_S be a probabilistic polynomial-time key-generating algorithm for S (as in the definition of efficiently computable function ensembles). For every possible value k of I_S(1^n), denote by s_k the corresponding (I_{2n} → I_n) function. The function ensemble F = {F_n}_{n∈N} is defined as follows:

key-generation: On input 1^n, the probabilistic polynomial-time key-generating algorithm I_F outputs a pair (ā, k̄), where ā = {a_{1,0}, a_{1,1}, a_{2,0}, a_{2,1}, ..., a_{n,0}, a_{n,1}} is sampled from U_n^{2n} and k̄ = {k_1, k_2, ..., k_{⌈log n⌉}} is generated by ⌈log n⌉ independent executions of I_S on input 1^n (i.e., is sampled from (I_S(1^n))^{⌈log n⌉}).

evaluation: For every possible value (ā, k̄) of I_F(1^n), the function f_{ā,k̄} : I_n → I_n is defined as follows. On an n-bit input, x = x_1 x_2 ... x_n, the function outputs the single value in

    SQ_{s_{k_{⌈log n⌉}}}( ... (SQ_{s_{k_2}}(SQ_{s_{k_1}}({a_{1,x_1}, a_{2,x_2}, ..., a_{n,x_n}}))) ... ).

[Figure: a binary tree illustrating the evaluation of the pseudo-random function; the chosen leaf labels a_{i,x_i} are combined pairwise by the synthesizers s_{k_1}, s_{k_2}, ..., level by level, up to the root.]

Figure: Computing the Value of the Pseudo-Random Function

Finally, F_n is defined to be the random variable that assumes as values the functions f_{ā,k̄}, with the probability space induced by I_F(1^n).
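Putting the pieces together, evaluating f_{ā,k̄} is ⌈log n⌉ rounds of squeezing over the selected leaf labels. The sketch below wires up the whole construction with toy components (a keyed SHA-256 stand-in plays each s_k; it illustrates only the data flow, not security):

```python
import hashlib
import math
import os

def squeeze(S, L):
    # one squeeze round: pair up the current labels and apply S
    out = [S(L[2 * i], L[2 * i + 1]) for i in range(len(L) // 2)]
    if len(L) % 2:
        out.append(L[-1])
    return out

def toy_synth(k):
    # keyed stand-in for s_k : I_2n -> I_n (illustrative only; a real
    # instantiation would use a pseudo-random synthesizer here)
    return lambda x, y: hashlib.sha256(k + x + y).digest()[: len(x)]

def keygen(n):
    # key: 2n random leaf labels a_{i,b} plus ceil(log n) synthesizer keys
    nbytes = n // 8
    a = {(i, b): os.urandom(nbytes) for i in range(n) for b in (0, 1)}
    ks = [os.urandom(nbytes) for _ in range(math.ceil(math.log2(n)))]
    return a, ks

def f(a, ks, x_bits):
    # bottom-up labeling: start from the leaves a_{i, x_i}, then one
    # squeeze round per level of the tree
    labels = [a[(i, b)] for i, b in enumerate(x_bits)]
    for k in ks:
        labels = squeeze(toy_synth(k), labels)
    return labels[0]

a, ks = keygen(16)
x = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
out = f(a, ks, x)  # a fixed pair (a, ks) defines one function of the ensemble
```

For n = 16 the evaluation performs 8 + 4 + 2 + 1 = 15 = n − 1 synthesizer invocations in 4 = log n rounds, exactly the cost profile claimed in the introduction.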

The evaluation of f_{ā,k̄}(x) can be thought of as a recursive labeling process of a binary tree with n leaves and depth ⌈log n⌉: The i-th leaf has two possible labels, a_{i,0} and a_{i,1}; the i-th input bit, x_i, selects one of these labels, a_{i,x_i}. The label of each internal node at depth d is the value of s_{k_{⌈log n⌉−d}} on the labels of its children, and the value of f_{ā,k̄}(x) is simply the label of the root. The figure above illustrates the evaluation of f_{ā,k̄} for a small value of n. We note that this labeling process is very different from the one associated with the GGM construction. First, the binary tree is of depth ⌈log n⌉ instead of depth n as in GGM. Secondly, the labeling process is bottom-up instead of top-down (i.e., it starts at the leaves instead of the root). Moreover, here each input defines a different labeling of the tree, whereas in the GGM construction the labeling of the tree is fully determined by the key, and the input only determines a leaf such that its label is the value of the function on this input.

Efficiency of the Construction

It is clear that F is efficiently computable, given that S is efficiently computable. Furthermore, the parallel time complexity of functions in F_n is larger by a factor of O(log n) than the parallel time complexity of functions in S_n. The parallel time complexity of I_F and of I_S is identical, since the ⌈log n⌉ executions of I_S can be performed in parallel.

We note that, for simplicity, the parameter n serves a double role: n is both the length of inputs to f_{ā,k̄} ∈ F_n and the security parameter for such a function (the second role is expressed by the fact that the strings in ā are n bits long). In practice, however, these roles would be separated: the security parameter would be determined by the quality of the synthesizers, and the length of inputs to the pseudo-random functions would be determined by their application. In fact, one can usually use a pseudo-random function with a reasonably small input length, as long as it is large enough to prevent a birthday attack. This is implied by the suggestion of Levin to pairwise-independently hash the input before applying the pseudo-random function (this idea is described in more detail in the introduction).

Reducing the Key-Length

An apparent disadvantage of our construction is the large key-length of a function f_{ā,k̄} ∈ F_n. In particular, the sequence ā is defined by 2n^2 bits. However, this is not truly a problem, since: (a) in Section 6 a related construction is described, where ā consists of a constant number of strings and is therefore defined by O(n) bits; (b) the truly random sequence ā can be replaced by a pseudo-random sequence without increasing the depth of the construction by more than a constant factor. This is achieved as follows. Let G be a pseudo-random generator that expands the input by a factor of two, and let G' be the pseudo-random generator that can be constructed from G according to the corollary above for p(n) = 2n (i.e., by using O(log n) levels of the recursion). Then ā can be replaced by G'(a), where a is an n-bit seed.

In addition to ā, the key of f_{ā,k̄} ∈ F_n consists of ⌈log n⌉ keys of functions in S_n. It turns out that for some collections of synthesizers, such as those described in this paper, this overhead can be eliminated as well. This is certainly true when using a single synthesizer instead of a collection. Moreover, from the proof of security of the construction one can easily extract the following claim: if the collection of synthesizers remains secure even when it uses a public key (i.e., if C_{s_k}(X, Y) remains pseudo-random even when the distinguisher sees k), then the ⌈log n⌉ keys can be replaced with a single one (i.e., the same key can be used at all levels of the recursion).

Security of the Construction

Theorem: Let S and F be as in the construction above, and let R = {R_n} be the uniform I^n -> I^n function ensemble. Then F is an efficiently computable pseudo-random function ensemble. Furthermore, any efficient distinguisher M between F and R yields an efficient distinguisher D for S, such that the success probability of D is smaller by a factor of at most ⌈log n⌉ than the success probability of M.

To prove the theorem we use a hybrid argument (see the literature on hybrid arguments for details about this proof technique). We first define a sequence of ⌈log n⌉ + 1 function distributions such that the two extreme distributions are R_n and F_n. We then show that any distinguisher for two neighboring distributions can be transformed into a distinguisher for the pseudo-random synthesizers. For simplicity, we define these hybrid-distributions for the case that n is a power of two; the definition easily extends to a general value of n such that the claims below still hold.

For any 0 <= j <= log n, denote by H^j_n the j-th hybrid-distribution. The computation of functions in H^j_n may be described as a labeling process of a binary tree with n leaves and depth log n (an analogous description for F_n appears above). Here, the labeling process starts with the nodes at depth j. The i-th such node has 2^{n/2^j} possible labels, {a_{i,s} : s in I^{n/2^j}}, which are part of the key. The i-th (n/2^j)-bit substring of the input x, denoted x_i, selects one of these labels, a_{i,x_i}. The rest of the labeling process is the same as it was for functions in F_n: the label of each node at depth d < j is the value of s_{k_{d+1}} on the labels of its children. The value of the function on this input is simply the label of the root.

Another way to think of H^j_n is via the concept of a k-dimensional synthesizer (defined above). As was the case for F_n, the construction of functions in H^j_n can be described in two steps: (1) a recursive construction of a 2^j-dimensional synthesizer S^j from a two-dimensional synthesizer S; (2) an immediate construction of the pseudo-random function f from S^j:

    f(x = x_1 x_2 ... x_{2^j})  def=  S^j(a_{1,x_1}, a_{2,x_2}, ..., a_{2^j,x_{2^j}}),

where a = {a_{r,s} : 1 <= r <= 2^j, s in I^{n/2^j}}.

We turn to the formal definition of the hybrid-distributions.

Definition: Let I_S be the key-generating algorithm of S. Let n and j be two integers such that n is a power of two and 0 <= j <= log n. For every sequence k = {k_1, k_2, ..., k_j} of possible values of I_S(1^n), and for every sequence of n-bit strings a = {a_{r,s} : 1 <= r <= 2^j, s in I^{n/2^j}}, the function f_{a,k} : I^n -> I^n is defined as follows. On input x = x_1 x_2 ... x_{2^j}, where x_i in I^{n/2^j}, the function outputs the single value in

    SQ_{s_{k_1}, s_{k_2}, ..., s_{k_j}}({a_{1,x_1}, a_{2,x_2}, ..., a_{2^j,x_{2^j}}}).

H^j_n is the random variable that assumes as values the functions f_{a,k} defined above, where the k_i's are independently distributed according to I_S(1^n) and every string in a is independently distributed according to U_n.

This definition immediately implies:

Claim: H^{⌈log n⌉}_n and F_n are identically distributed, and H^0_n and R_n are identically distributed.
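To make the hybrid concrete, here is a small executable sketch of the labeling process just defined. This is an illustration only: toy_synth is a hash-based stand-in for a synthesizer, and the order in which the keys are applied, along with all parameters, are assumptions of the sketch rather than the paper's exact conventions.

```python
import hashlib
import secrets

def toy_synth(key, x, y, bits=16):
    # stand-in for a pseudo-random synthesizer s_k(x, y) (hypothetical)
    d = hashlib.sha256(key + x.to_bytes(4, "big") + y.to_bytes(4, "big")).digest()
    return int.from_bytes(d[:bits // 8], "big")

def hybrid_eval(a, keys, x_bits, j):
    # H^j: split the input into 2^j substrings; the i-th substring selects the
    # starting label a[i][substring]; then j levels of pairwise "squeeze".
    chunk = len(x_bits) // (1 << j)
    labels = [a[i][x_bits[i * chunk:(i + 1) * chunk]] for i in range(1 << j)]
    for level in range(j):               # combine bottom-up toward the root
        key = keys[j - 1 - level]        # assumed key order, for illustration
        labels = [toy_synth(key, labels[2 * i], labels[2 * i + 1])
                  for i in range(len(labels) // 2)]
    return labels[0]                     # label of the root

n, j = 8, 3                              # j = log n: this hybrid is exactly F_n
keys = [secrets.token_bytes(8) for _ in range(j)]
a = [{format(s, "0" + str(n // (1 << j)) + "b"): secrets.randbits(16)
      for s in range(1 << (n // (1 << j)))} for _ in range(1 << j)]
x = format(secrets.randbits(n), "08b")
value = hybrid_eval(a, keys, x, j)
```

With j = log n each substring is a single bit and the key table a has 2n entries, matching the main construction; smaller j trades a larger table for fewer squeeze levels.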

The proof below shows that, for every j, the two neighboring ensembles H^j_n and H^{j+1}_n are computationally indistinguishable. As shown below, this implies the theorem by a standard hybrid argument.

Proof of the theorem: As mentioned above, it is obvious that F is an efficiently computable function ensemble. Assume that F is not pseudo-random. By the definition of pseudo-random function ensembles, there exist a polynomial-time oracle machine M and a polynomial p() such that for infinitely many n's

    | Pr[ M^{F_n}(1^n) = 1 ] - Pr[ M^{R_n}(1^n) = 1 ] |  >  1/p(n),

where R = {R_n} is the uniform I^n -> I^n function ensemble. Let t() be a polynomial that bounds the number of queries that M makes on input 1^n.

Given M, we define a probabilistic polynomial-time algorithm D that distinguishes the output of S from random. Let m = m(n) be defined by m(n) = t(n)*n. For every n, the input of D is an m(n) x m(n) matrix B = (b_{i,j}) whose entries are n-bit strings. As part of its algorithm, D invokes M on input 1^n; the definition of m allows D to answer all the queries of M (which are bounded by t(n)). It will be shown below that D distinguishes between the following two distributions of B: (a) C_S(X, Y), where X and Y are independently drawn from U^{m(n)}_n; (b) the uniform distribution over m(n) x m(n) matrices of n-bit strings.

For simplicity of presentation, we only define the algorithm that D performs when n is a power of two. It is easy to extend this definition to a general value of n such that the two claims below still hold. The algorithm is defined as follows. On input B = (b_{i,j}):

1. Choose J in {0, 1, ..., ⌈log n⌉ - 1} uniformly at random.

2. Generate k = {k_1, k_2, ..., k_J} by J independent executions of I_S on input 1^n.

3. Extract 2^J submatrices of B: for 1 <= i <= 2^J, denote by B^i = (b^i_{u,v}) the t(n) x t(n) diagonal submatrix of B defined by

    b^i_{u,v}  def=  b_{(i-1)*t(n)+u, (i-1)*t(n)+v}.

4. Invoke M on input 1^n. Denote by q^r = q^r_1 q^r_2 ... q^r_{2^{J+1}} the r-th query M makes, where q^r_i in I^{n/2^{J+1}} for 1 <= i <= 2^{J+1}. On each of these queries D answers as follows:

   (a) For every 1 <= i <= 2^J, denote by a_{i,(q^r_{2i-1},q^r_{2i})} the entry b^i_{u,v} of B^i, where u = min{ j <= r : q^j_{2i-1} = q^r_{2i-1} } and v = min{ j <= r : q^j_{2i} = q^r_{2i} }.

   (b) Answer the query with the single value in

    SQ_{s_{k_1}, ..., s_{k_J}}({a_{1,(q^r_1,q^r_2)}, a_{2,(q^r_3,q^r_4)}, ..., a_{2^J,(q^r_{2^{J+1}-1},q^r_{2^{J+1}})}}).

5. Output whatever M outputs.

It is obvious that D is a polynomial-time algorithm. To show that D is also a distinguisher for the pseudo-random synthesizers, we first state and prove the following two claims.

Claim: For every 0 <= j <= ⌈log n⌉ - 1,

    Pr[ D(U^{m(n) x m(n)}_n) = 1 | J = j ]  =  Pr[ M^{H^j_n}(1^n) = 1 ].

Proof: As part of its algorithm, D denotes some of B's entries by names of the form a_{r,s}, where 1 <= r <= 2^J and s in I^{n/2^J}. Note that D never denotes an entry of B by two different names. Assume, for the sake of the proof, that any name a_{r,s} that was not used by D is assigned an independently and uniformly distributed n-bit string, and denote by a the sequence {a_{r,s} : 1 <= r <= 2^J, s in I^{n/2^J}}. It is easy to verify that:

(1) When B is uniformly distributed, all the strings in a are independent and uniformly distributed.

(2) D answers every query q of M with the value f_{a,k}(q), where f_{a,k} is as in the definition of H^J_n.

The claim immediately follows from (1) and (2) and from the definition of k. □

Claim: Let X and Y be independently drawn from U^{m(n)}_n. Then for every 0 <= j <= ⌈log n⌉ - 1,

    Pr[ D(C_S(X,Y)) = 1 | J = j ]  =  Pr[ M^{H^{j+1}_n}(1^n) = 1 ].

Proof: Let X = {x_1, x_2, ..., x_{m(n)}} and Y = {y_1, y_2, ..., y_{m(n)}} be independently drawn from U^{m(n)}_n, and let s_k be drawn from S_n. Assume that the input of D is B = C_{s_k}(X,Y). For the sake of the proof, define the vector a' = {a'_{i,s} : 1 <= i <= 2^{J+1}, s in I^{n/2^{J+1}}} as follows:

(1) If D denoted by a_{i,(q^r_{2i-1},q^r_{2i})} the entry b^i_{u,v} of B^i, then define a'_{2i-1,q^r_{2i-1}} to be x_{(i-1)*t(n)+u} and a'_{2i,q^r_{2i}} to be y_{(i-1)*t(n)+v}. Note that a_{i,(q^r_{2i-1},q^r_{2i})} = s_k(a'_{2i-1,q^r_{2i-1}}, a'_{2i,q^r_{2i}}).

(2) For all other values in a', assign an independently and uniformly distributed n-bit string.

It is easy to verify that all the strings in a' are independent and uniformly distributed. Let k' be the sequence {k_1, k_2, ..., k_J, k} and let f_{a',k'} be as in the definition of H^{J+1}_n. We now have that the answer D gives to the r-th query q^r of M is

    SQ_{s_{k_1},...,s_{k_J}}({a_{1,(q^r_1,q^r_2)}, ..., a_{2^J,(q^r_{2^{J+1}-1},q^r_{2^{J+1}})}})
    =  SQ_{s_{k_1},...,s_{k_J},s_k}({a'_{1,q^r_1}, a'_{2,q^r_2}, ..., a'_{2^{J+1},q^r_{2^{J+1}}}})
    =  f_{a',k'}(q^r).

From this fact and from the definition of a' and k', we immediately get the claim. □

By the two claims above, we can now conclude the following. Let X and Y be independently drawn from U^{m(n)}_n; then for infinitely many n's,

    | Pr[ D(C_S(X,Y)) = 1 ] - Pr[ D(U^{m(n) x m(n)}_n) = 1 ] |

    =  | (1/⌈log n⌉) * Sum_{j=0}^{⌈log n⌉-1} Pr[ D(C_S(X,Y)) = 1 | J = j ]
         - (1/⌈log n⌉) * Sum_{j=0}^{⌈log n⌉-1} Pr[ D(U^{m(n) x m(n)}_n) = 1 | J = j ] |

    =  (1/⌈log n⌉) * | Sum_{j=0}^{⌈log n⌉-1} ( Pr[ M^{H^{j+1}_n}(1^n) = 1 ] - Pr[ M^{H^j_n}(1^n) = 1 ] ) |

    =  (1/⌈log n⌉) * | Pr[ M^{H^{⌈log n⌉}_n}(1^n) = 1 ] - Pr[ M^{H^0_n}(1^n) = 1 ] |

    =  (1/⌈log n⌉) * | Pr[ M^{F_n}(1^n) = 1 ] - Pr[ M^{R_n}(1^n) = 1 ] |

    >  1 / (p(n) * ⌈log n⌉).

This contradicts the assumption that S is a collection of pseudo-random synthesizers and completes the proof of the theorem. □

Corollary: For any collection of pseudo-random synthesizers S whose functions are computable in NC^i, there exists an efficiently computable pseudo-random function ensemble F whose functions are computable in NC^{i+1}. Furthermore, the corresponding key-generating algorithms I_S and I_F have the same parallel time complexity.

Proof: By the output-expansion lemma, we can construct from S a new collection of I^n x I^n -> I^n pseudo-random synthesizers S' whose functions are computable in NC^i. By the theorem above, we can construct from S' an efficiently computable pseudo-random function ensemble F whose functions are computable in NC^{i+1}. Both constructions preserve the parallel time complexity of the key-generating algorithms. □

A Related Construction and Additional Properties

Though designed to enable efficient computation in parallel, our construction enjoys some additional useful properties. In this section we describe two such properties: a rather sharp time-space tradeoff and an incremental property. We also show how to adjust the construction in order to improve upon these properties.

Time-Space Tradeoff

Our construction of pseudo-random functions has the advantage of a sharp time-space tradeoff. In order to get an even sharper tradeoff, we describe an alternative construction of pseudo-random functions. The best way to understand the revised construction is by viewing the computation process backwards. Every function on n bits is defined by the length-2^n sequence of all its values. Assume that we could sample and store two length-2^{n/2} sequences X and Y of random strings as the key of a pseudo-random function. In this case, given a pseudo-random synthesizer S, we can define the 2^n values of the pseudo-random function to be the entries of the matrix C_S(X,Y). In order to reduce the key size, we can replace the random sequences X and Y with pseudo-random sequences. Such sequences X and Y can be obtained together from C_S(X', Y'), where X' and Y' are two shorter random sequences (of length approximately 2^{n/4}). By continuing this process of reducing the key size log n times, we get a key with a constant number of strings (see the figure for an illustration of the construction).

[Figure: Illustration of the alternative construction — a key of m strings expanded, level by level, into the 2^n values of the function.]

In order to understand where the original construction is wasteful in the size of the key, we can describe it in similar terms. The values of the function are still the values of C_S(X,Y) for two sequences X and Y (in the description of the computation as a tree-labeling process, these are all the possible labels of the root's children), but then we get X and Y separately, as C_S(X_1,Y_1) and C_S(X_2,Y_2). By the time the sequences have constant length, there are O(n) of them.

Returning to the new construction, note that if we allow a key of m strings, we only need t = log n - log log m of the steps described above. Computing such functions requires t phases, with several parallel invocations of S in each phase; the total number of invocations of S is 2^t = n / log m. This seems to be a relatively sharp time-space tradeoff and, to the best of our knowledge, one that cannot be obtained by the GGM-construction.

For some applications (like the protection of the data on a disk), we need pseudo-random functions with a reasonably small number of entries. In this case, by storing relatively few strings, we can achieve a very easy-to-compute function: a relatively small number of random bit strings already defines a pseudo-random function on a small domain, and computing this function requires only a few invocations of a pseudo-random synthesizer, performed in a small number of phases.

We now formalize the definition of the alternative construction.

Construction (Alternative Construction of Pseudo-Random Functions): Let S = {S_n} be a collection of I^n x I^n -> I^n pseudo-random synthesizers, and let I_S be a probabilistic polynomial-time key-generating algorithm for S (as in the corresponding definition). For every possible value k of I_S(1^n), denote by s_k the corresponding I^n x I^n -> I^n function. The function ensemble F = {F_n} is defined as follows.

Key-generation: Let m_j denote the value 2^{j+1}, and let t = t_n denote the smallest integer such that m_t >= n. On input 1^n, the probabilistic polynomial-time key-generating algorithm I_F outputs a pair (a, k), where a = {a_1, a_2, a_3, a_4} is generated according to U^4_n and k = {k_1, k_2, ..., k_t} is generated by t independent executions of I_S on input 1^n (i.e., is sampled from (I_S(1^n))^t).

Evaluation: For every possible value (a, k) of I_F(1^n) and every j such that 0 <= j <= t, define the function f^j_{a,k} : I^{m_j} -> I^n by recursion on j. For x in I^{m_0}, define f^0_{a,k}(x) to be a_x (where the two-bit string x is interpreted as an index into a). For any 0 < j <= t and x = x_1 x_2 in I^{m_j} (x_1 and x_2 are m_{j-1}-bit strings), define f^j_{a,k}(x) to be s_{k_j}(f^{j-1}_{a,k}(x_1), f^{j-1}_{a,k}(x_2)).

For every x in I^n, the value of the function f_{a,k} : I^n -> I^n on x is defined to be f^t_{a,k}(x'), where x' is obtained by padding x with m_t - n zeros.

Finally, F_n is defined to be the random variable that assumes as values the functions f_{a,k}, with the probability space induced by I_F(1^n).

The proof of security for this construction is omitted, since it is almost identical to the proof of security for the main construction.
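A runnable sketch of the recursion above, with a hash-based stand-in for the synthesizer (toy_synth, the key table size, and the parameters are assumptions for illustration, not the paper's exact objects):

```python
import hashlib
import secrets

def toy_synth(key, x, y):
    # stand-in for a synthesizer s_k on n-bit strings (hypothetical)
    return hashlib.sha256(key + x + y).digest()[:8]

def alt_prf(a, keys, x_bits):
    # f^0 on 2-bit inputs is a table lookup in the constant-size key a;
    # f^j(x1 x2) = s_{k_j}(f^{j-1}(x1), f^{j-1}(x2))
    if len(x_bits) == 2:
        return a[int(x_bits, 2)]
    j = len(x_bits).bit_length() - 2          # len(x_bits) = 2^(j+1)
    half = len(x_bits) // 2
    left = alt_prf(a, keys, x_bits[:half])
    right = alt_prf(a, keys, x_bits[half:])
    return toy_synth(keys[j - 1], left, right)

n = 6
t = 2                                         # smallest t with 2^(t+1) >= n
keys = [secrets.token_bytes(8) for _ in range(t)]
a = [secrets.token_bytes(8) for _ in range(4)]
x = format(secrets.randbits(n), "06b")
value = alt_prf(a, keys, x + "0" * (2 ** (t + 1) - n))
```

Caching the values of f^i on all of I^{m_i} (i.e., storing 2^{m_i} strings) cuts the recursion off at level i, which is exactly the time-space tradeoff discussed next.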

We can now state the exact form of the time-space tradeoff, under the notation of the alternative construction. If a contains 2^{m_i} strings instead of four, then we can define f^i_{a,k}(x) to be a distinct value in a for every x in I^{m_i}, and keep the recursive definition of f^j_{a,k} as before for j > i. In this case, computing f^t_{a,k}(x) can be done in t - i phases, with a total of 2^{t-i} - 1 invocations of the synthesizers. The next lemma follows (for simplicity, this lemma is stated in terms of a single synthesizer instead of a collection of synthesizers).

Lemma: Let S be a pseudo-random synthesizer with output-length function ℓ(n) = n, and assume that S can be computed in parallel-time D(n) and work W(n) on n-bit inputs. Then, for every function m() with 4 <= m(n) <= 2^n, there exists an efficiently computable pseudo-random function ensemble F = {F_n} such that the key of a function in F_n is a sequence of at most m(n) random n-bit strings, and this function can be computed in parallel-time (log n - log log m(n) + O(1)) * D(n) and work O((n / log m(n)) * W(n)).

Incremental Property

We now describe an observation of Mihir Bellare that gives rise to an interesting incremental property of our construction (for the formulation and treatment of incremental cryptography, see the work of Bellare, Goldreich and Goldwasser). Let f be any function in F_n, where F = {F_n} is the pseudo-random function ensemble defined by the main construction, and let x, y in I^n be of Hamming distance one (i.e., x and y differ on exactly one bit). Then, given the computation of f(x) (including all intermediate values), we only need log n additional invocations of the pseudo-random synthesizers (instead of the roughly n invocations needed from scratch) in order to evaluate f(y). The easiest way to see the correctness of this observation is to recall the description of the computation of f(x) as a labeling process on a depth-(log n) binary tree: the only labels that change as a result of flipping one bit of x are those of the nodes on the path from one leaf to the root, i.e., log n labels.

If a Gray-code representation of numbers is used, we get a similar observation for the computation of f(x) and f(x+1): given the computation of one of these values, computing the other requires only log n additional invocations of the pseudo-random synthesizers. It is not hard to imagine situations where one of these incremental properties is useful.

The observation regarding the computation of f(x) and f(y), for x and y of Hamming distance one, also holds for the functions of the alternative construction. The observation regarding the computation of f(x) and f(x+1) holds if we use a different representation of numbers (this representation is similar to a Gray-code, though a bit more complicated).
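The standard binary-reflected Gray code is one easy-to-compute permutation with the required property; the following quick check (an illustration, not from the paper) verifies that it is a permutation and that cyclically successive codewords are at Hamming distance one:

```python
def gray(x):
    # binary-reflected Gray code: P(x) and P(x+1 mod 2^n) differ in one bit
    return x ^ (x >> 1)

n = 8
codes = [gray(x) for x in range(1 << n)]
# Hamming distance between cyclically successive codewords
distances = [bin(codes[i] ^ codes[(i + 1) % (1 << n)]).count("1")
             for i in range(1 << n)]
```

Enumerating inputs of f in this order, each evaluation reuses all but log n of the previous tree labels.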

Construction of Pseudo-Random Synthesizers Based on General Cryptographic Primitives

Much of this paper is devoted to showing parallel constructions of pseudo-random synthesizers. In this section we provide a simple construction of pseudo-random synthesizers based on what we call weak pseudo-random functions. This construction immediately implies a construction of pseudo-random synthesizers based on trapdoor one-way permutations, and an additional construction based on any hard-to-learn problem (considered in a later section). An interesting line for further research is the parallel construction of pseudo-random synthesizers from other cryptographic primitives. In particular, we do not know of such a construction from pseudo-random generators, or directly from one-way functions.

Weak Pseudo-Random Functions

The reason pseudo-random functions are hard to construct is that they must endure a very powerful kind of attack: the adversary (the distinguisher) may query their values at every point, and may adapt its queries based on the answers it gets. We can weaken the opponent by letting the only access it has to the function be a polynomial-size sample of random points together with the value of the function at these points. We call functions that look random to such an adversary weak pseudo-random functions. In this section it is shown that weak pseudo-random functions yield pseudo-random synthesizers in a straightforward manner. We therefore get a parallel construction of standard pseudo-random functions from weak pseudo-random functions.

For simplicity, we define weak pseudo-random functions as length-preserving. Their definition uses the following notation.

Notation: For every function f and every sequence X = {x_1, ..., x_k} of values in the domain of f, denote by V_X(f) the sequence {<x_1, f(x_1)>, <x_2, f(x_2)>, ..., <x_k, f(x_k)>} ("V" stands for "values").

Definition (collection of weak pseudo-random functions): An efficiently computable I^n -> I^n function ensemble F = {F_n} is a collection of weak pseudo-random functions if, for every probabilistic polynomial-time algorithm D, every two polynomials p and m, and all sufficiently large n's,

    | Pr[ D(V_{U^{m(n)}_n}(F_n)) = 1 ] - Pr[ D(U^{2m(n)}_n) = 1 ]  |  <  1/p(n).

That is, D either sees m(n) independent uniform points with the values of a function from F_n on them, or 2*m(n) independent uniform n-bit strings.

Footnote: A permutation P on I^n is called a Gray-code representation if for every 0 <= x < 2^n, the Hamming distance between P(x) and P(x+1 mod 2^n) is one. Such a P defines a Hamiltonian cycle on the n-dimensional cube. It is not hard to see that an easy-to-compute P can be defined.

Let F be a collection of weak pseudo-random functions, and let I be the polynomial-time key-generating algorithm for F. The next lemma shows how to construct a pseudo-random synthesizer from F and I. Since the random bits of I can be replaced by pseudo-random bits, we can assume that I uses only n truly random bits on input 1^n (in fact, this is merely a simplifying assumption, which is not really required for the construction of pseudo-random synthesizers). For every r in I^n, denote by I_r the value of I(1^n) when I uses r as its random bits.

Lemma: Let F and I be as above, and define S : {0,1}^n x {0,1}^n -> {0,1}^n by: for all x, y in I^n, S(x, y) = f_{I_y}(x). Then S is a pseudo-random synthesizer.

Proof: It is obvious that S is efficiently computable. Assume, in contradiction to the lemma, that S is not a pseudo-random synthesizer. Then there exist a probabilistic polynomial-time algorithm D and polynomials p and m such that for infinitely many n's

    | Pr[ D(C_S(X,Y)) = 1 ] - Pr[ D(U^{m(n) x m(n)}_n) = 1 ] |  >  1/p(n),

where X and Y are independently drawn from U^{m(n)}_n.

For every n and every 0 <= i <= m(n), define the i-th hybrid distribution H^i_n over m(n) x m(n) matrices as follows: the first i columns are distributed according to C_S(X, Y'), where X is drawn from U^{m(n)}_n and Y' is independently drawn from U^i_n; the last m(n) - i columns are independently and uniformly distributed. It is immediate that for infinitely many n's

    | Pr[ D(H^{m(n)}_n) = 1 ] - Pr[ D(H^0_n) = 1 ] |  >  1/p(n).

We now define a distinguisher D' for F. Given {<x_1,z_1>, <x_2,z_2>, ..., <x_{m(n)},z_{m(n)}>} as its input, D' performs the following algorithm:

1. Define X = {x_1, ..., x_{m(n)}} and Z = {z_1, ..., z_{m(n)}}.

2. Uniformly choose J in {0, 1, ..., m(n) - 1}.

3. Sample Y from U^J_n and generate an m(n) x m(n) matrix B whose first J columns are C_S(X,Y), whose (J+1)-st column is Z, and whose last m(n) - J - 1 columns are independently and uniformly distributed.

4. Output D(B).

It is obvious that D' is efficiently computable. It is also easy to verify that

    Pr[ D'(V_{U^{m(n)}_n}(F_n)) = 1 | J = j ]  =  Pr[ D(H^{j+1}_n) = 1 ]

and that

    Pr[ D'(U^{2m(n)}_n) = 1 | J = j ]  =  Pr[ D(H^j_n) = 1 ].

Thus, by a standard hybrid argument, we get that for infinitely many n's

    | Pr[ D'(V_{U^{m(n)}_n}(F_n)) = 1 ] - Pr[ D'(U^{2m(n)}_n) = 1 ] |  >  1/(p(n)*m(n)),

in contradiction to the assumption that F is a collection of weak pseudo-random functions. We can therefore conclude the lemma. □

Notice that S, as in the lemma above, obeys an even more powerful requirement than is needed by the definition of a pseudo-random synthesizer: for random X and Y, the matrix C_S(X,Y) cannot be efficiently distinguished from a random matrix even if we allow the distinguisher access to X.

Corollary: If there exist weak pseudo-random functions that can be sampled and evaluated in NC^i, then there also exist a pseudo-random synthesizer in NC^i and standard pseudo-random functions that can be sampled and evaluated in NC^{i+1}.
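The synthesizer of the lemma, S(x, y) = f_{I_y}(x), can be sketched as follows. This is a minimal illustration: weak_prf is a hash-based stand-in for a weak pseudo-random function, and taking the key to simply be the randomness r is the simplifying assumption mentioned above.

```python
import hashlib

def weak_prf(key, x):
    # stand-in for a weak pseudo-random function f_key (hypothetical)
    return hashlib.sha256(key + x).digest()

def keygen_from_bits(r):
    # I_r: the key-generating algorithm run with r as its random bits;
    # here the key simply *is* r (a simplifying assumption)
    return r

def synth(x, y):
    # S(x, y) = f_{I_y}(x): y supplies the randomness for key generation,
    # x is the input on which the weak PRF is evaluated
    return weak_prf(keygen_from_bits(y), x)

x = b"\x01" * 16
y = b"\x02" * 16
out = synth(x, y)
```

Each column of C_S(X, Y) is then a weak-PRF transcript under an independent key, which is exactly what the hybrid argument in the proof exploits.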

Trapdoor One-Way Permutations

We now describe a rather simple construction of weak pseudo-random functions from a collection of trapdoor permutations. Given the lemma above, we therefore get a construction of a pseudo-random synthesizer out of a collection of trapdoor permutations. This pseudo-random synthesizer is in NC if the trapdoor permutations can be sampled and inverted in NC (in fact, there is an additional requirement of a hard-core predicate in NC, but this is already satisfied by the Goldreich-Levin predicate). Since we have no concrete example of this sort, we only give a brief and informal description of the construction (for formal definitions of trapdoor one-way permutations and hard-core bits, see the standard references).

Let F = {F_n} be a permutation ensemble such that every f_i in F_n is a permutation over a domain D_i. Informally, F is a collection of trapdoor one-way permutations if the key-generating algorithm I_F of F outputs both a public-key i and a trapdoor-key t(i), and we have that:

1. Given i, the function f_i is easy to compute everywhere but hard to invert on the average.

2. Given t(i), the function f_i is easy to compute and to invert everywhere.

Let F = {F_n : D_n -> D_n} be a collection of trapdoor one-way permutations. Assume that the collection is one-way for the uniform distribution over the inputs (i.e., it is hard to compute x given F_n(x), where x is uniformly distributed in D_n). Let the sequence of functions {b_n : D_n -> I} be a hard-core predicate for F; informally, this means that given F_n(x), for a uniformly distributed x, it is hard to guess b_n(x) with probability non-negligibly better than half.

We can now define a collection of weak pseudo-random functions, G = {G_n}, in the following way. For every x in I^n, denote by D_n(x) the element in D_n sampled using x as the random bits. For every key i of f_i in F_n, define g_i : I^n -> I as follows:

    g_i(x)  def=  b_n(f_i^{-1}(D_n(x))).

Note that computing g_i(x) requires knowledge of the trapdoor-key t(i). Let G_n be the random variable that assumes as values the functions g_i, with the probability space induced by the distribution over the keys in F_n.

Claim (without proof): The function ensemble G defined above is a collection of weak pseudo-random functions.
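The shape of this construction can be illustrated with a toy RSA trapdoor permutation. All parameters here are assumptions for illustration only: the primes are far too small to be secure, and the "hard-core bit" used below (the parity of the preimage) is merely a stand-in for a genuine hard-core predicate such as Goldreich-Levin.

```python
import secrets

# toy trapdoor permutation: RSA over a tiny modulus (illustration only --
# real instantiations need large random primes)
p, q = 1009, 1013
N, phi = p * q, (p - 1) * (q - 1)
e = 65537
d = pow(e, -1, phi)                 # the trapdoor key t(i)

def f(x):                           # easy given the public key (N, e)
    return pow(x, e, N)

def f_inv(y):                       # easy only given the trapdoor d
    return pow(y, d, N)

def g(x):
    # weak PRF candidate g_i(x) = (stand-in hard-core bit of) f^{-1}(D_n(x));
    # D_n(x) samples a domain element from the random bits x
    point = x % N
    return f_inv(point) & 1

x = secrets.randbelow(N)
bit = g(x)
```

Evaluating g requires the trapdoor d, while an adversary who sees only random points and values faces inverting f — the intuition behind the claim above.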

Number-Theoretic Constructions of Pseudo-Random Synthesizers

In this section we present several NC constructions of pseudo-random synthesizers based on concrete, frequently-used intractability assumptions. The first construction is at least as secure as the Diffie-Hellman assumption. As we shall see, we also get a construction that is at least as secure as factoring, since the Diffie-Hellman assumption modulo a composite is not stronger than the assumption that factoring is hard. Finally, we show two constructions that are at least as secure as the RSA assumption. Although the RSA assumption is not weaker than factoring, the constructions based on RSA might have other advantages; for example, under the assumption that n least-significant bits are simultaneously hard for RSA, we get pseudo-random synthesizers with linear output length. In addition, the constructions based on RSA and their proof of security use several interesting ideas that might be useful elsewhere. We first address issues that are common to all the constructions.

The evaluation of our pseudo-random synthesizers in NC relies on a preprocessing stage. This stage can be performed as part of the (sequential) key-generating algorithm. In this idea we follow the work of Kearns and Valiant; in their context the additional data is forced into the input, whereas in our context it is added to the key.

The analysis of the parallel-time complexity of the synthesizers uses previous results on the parallel-time complexity of arithmetic operations (see Karp and Ramachandran for a review). In particular, we use the result of Beame, Cook and Hoover, who showed that iterated multiplication (multiplying n numbers of length n) and related operations can be performed by log-depth circuits (these circuits can be constructed efficiently, though sequentially). These results enable the computation of modular exponentiation in NC, given preprocessing that depends only on the base: computing b^e mod N reduces to an iterated multiplication (and an additional modular reduction) given the values b^{2^i} mod N for 0 <= i < |e| (the length of e).
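The reduction just described can be sketched as follows (a sequential illustration of the data flow; in the NC setting the final product is computed by a log-depth iterated-multiplication circuit rather than a loop):

```python
from functools import reduce

def precompute_powers(b, N, bits):
    # preprocessing that depends only on the base: b^(2^i) mod N, 0 <= i < bits
    powers, cur = [], b % N
    for _ in range(bits):
        powers.append(cur)
        cur = cur * cur % N
    return powers

def mod_exp(powers, e, N):
    # b^e mod N as a single iterated product of the precomputed powers
    # selected by the bits of e
    selected = [powers[i] for i in range(e.bit_length()) if (e >> i) & 1]
    return reduce(lambda u, v: u * v % N, selected, 1)

N, b, e = 1000003, 12345, 987654321
powers = precompute_powers(b, N, e.bit_length())
```

Since the selected powers are known in advance (once the base is fixed), only the product step remains at evaluation time.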

The pseudo-random synthesizers constructed in this section are Boolean functions. Two methods for expanding the output-length of pseudo-random synthesizers were shown above. The first method requires a pseudo-random generator that expands the input by a factor of two; a natural choice for this purpose, in the case of the synthesizers described in this section, is the pseudo-random generator of Blum, Blum and Shub, or the one of Hastad, Schrift and Shamir. Given appropriate preprocessing, both generators can be computed in NC, and their security is based on the assumption that factoring integers (Blum-integers in the first case) is hard.

We note that all the constructions of this section give collections of pseudo-random synthesizers. However, the security of these synthesizers does not rely on keeping their key private. As discussed above, this allows us to use a single synthesizer at all the levels of both the main and the alternative constructions.

Common Tools

In our constructions we use the result of Goldreich and Levin, which gives a hard-core predicate for any one-way function.

Theorem (Goldreich-Levin): Let f be any one-way function. For every probabilistic polynomial-time algorithm A, every polynomial p, and all sufficiently large n's,

    Pr[ A(f(x), r) = r ⊙ x ]  <  1/2 + 1/p(n),

where x and r are independently drawn from U_n (recall that r ⊙ x denotes the inner product mod 2 of r and x).

Footnote: In fact, all these synthesizers can be made to output a logarithmic number of bits. Furthermore, given stronger assumptions, they may output an even larger number of bits; see the remark below for an example.

In fact, we use their result in a slightly different context. Loosely speaking, if given f(x) it is hard to compute g(x), then given f(x) it is also hard to guess g(x) ⊙ r. An important improvement on the application of the Goldreich-Levin result in the context of the Diffie-Hellman assumption was made by Shoup.

In addition, the proof of security of all the constructions uses the next-bit prediction tests of Blum and Micali. The equivalence between pseudo-random ensembles and ensembles that pass all polynomial-time next-bit tests was shown by Yao.

The Diffie-Hellman Assumption

We now define a collection of pseudo-random synthesizers that are at least as secure as the Diffie-Hellman assumption (DH-Assumption). This assumption was introduced in the seminal paper of Diffie and Hellman as a requirement for the security of their key-exchange protocol, and its validity has been studied quite extensively over the last two decades. Maurer, and Boneh and Lipton, have shown that in several settings the DH-Assumption is equivalent to the assumption that computing the discrete-log is hard. In particular, for any specific prime P there is an efficient reduction of the discrete-log problem in Z*_P to the DH-Problem in Z*_P, given some information that only depends on P. Shoup has shown that the DH-Assumption holds against what he calls "generic" algorithms.

For concreteness, we state the DH-Assumption in the group Z*_P, where P is a prime. However, our construction works just as well given the DH-Assumption in other groups; we use this fact later to get pseudo-random synthesizers which are at least as secure as factoring. In order to formalize the DH-Assumption in Z*_P, we need to specify the distribution of P. One possible choice is to let P be a uniformly distributed prime of a given length; however, there are other possible choices (for example, it is not inconceivable that P can be fixed for any given length). To keep our results general, we let P be generated by some polynomial-time algorithm IG_DH (where IG stands for "instance generator").

Definition (IG_DH): The Diffie-Hellman instance generator, IG_DH, is a probabilistic polynomial-time algorithm such that on input 1^n the output of IG_DH is distributed over n-bit primes.

DH

In addition we need to sp ecify the distribution of a generator g of Z It can b e shown that

P

if the DHAssumption holds for some distribution of g then it also holds if we let g b e a uniformly

distributed generator of Z since there exists a simple randomizedreduction of the DHProblem

P

for any g to the DHProblem with a uniformly distributed g

the denition of P will b e clear by All exp onentiations in the rest of this subsection are in Z

P

the context To simplify the notations we omit the expression mo d P from now on We can

now formally state the DHAssumption for the instance generator given IG

DH

Assumption DieHel lman For every probabilistic polynomialtime algorithm A for

every polynomial q and al l suciently large ns

h i

a b ab

Pr AP g g g g

q n

n

where the distribution of P is IG the distribution of g is uniform over the set of generators

DH

of Z and the distribution of ha bi is U

n

P

Based on this assumption, we define a collection S_DH of I^n x I^n -> I pseudo-random synthesizers.

Definition: For every n-bit prime P, every generator g of Z*_P, and every r in I^n, define s_{P,g,r} : I^n x I^n -> I by

    for all x, y in I^n,   s_{P,g,r}(x, y)  def=  g^{x*y} ⊙ r.

Let S_n be the random variable that assumes as values the functions s_{P,g,r}, where the distribution of P is IG_DH(1^n), the distribution of g is uniform over the set of generators of Z*_P, and the distribution of r is U_n. The function ensemble S_DH is defined to be {S_n}.

Note that in a subsequent work we show a direct and efficient construction of n-dimensional pseudo-random synthesizers based on the (stronger) decisional version of the DH-Assumption. This construction gives very efficient pseudo-random functions.
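The synthesizer just defined can be sketched with toy parameters. Here P = 2^127 − 1 stands in for the output of IG_DH(1^n), and g = 3 is merely an assumed generator for illustration; no security is claimed at these sizes.

```python
import secrets

def inner_product_mod2(u, v):
    # <u, v> mod 2, viewing integers as bit vectors
    return bin(u & v).count("1") & 1

P = 2 ** 127 - 1            # a toy (Mersenne) prime standing in for IG_DH(1^n)
g = 3                       # assumed generator, for illustration only
r = secrets.randbits(127)   # the Goldreich-Levin randomness in the key

def s(x, y):
    # s_{P,g,r}(x, y) = (g^(x*y) mod P) . r
    return inner_product_mod2(pow(g, x * y, P), r)

x, y = secrets.randbits(64), secrets.randbits(64)
bit = s(x, y)
# the same bit is computable from (g^x, y) alone -- the fact used in the proof
gx = pow(g, x, P)
bit_from_gx = inner_product_mod2(pow(gx, y, P), r)
```

The last two lines demonstrate the key observation used below: s(x, y) can be evaluated knowing either (g^x, y) or (g^y, x), without knowing both exponents.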

Theorem: If the DH-Assumption holds, then S_DH is a collection of I^n x I^n -> I pseudo-random synthesizers.

Proof It is obvious that S fS g is eciently computable Assume that S is not a col

DH n nN DH

lection of pseudorandom synthesizers Then there exists a p olynomial m such that the ensemble

E fE g is not pseudorandom where E C X Y for X and Y that are indep endently drawn

n n S

n

mn

from U Therefore there exists an ecient nextbit prediction test T and a p olynomial q

n

such that for innitely many ns it holds that

Given a prex of E of uniformly chosen length T succeeds to predict the next bit with

n

probability greater than

q n

We now show how to use T in order to define an efficient algorithm A such that for infinitely many n's,

    Pr[A(P, g, g^a, g^b, r) = ⟨g^{ab}, r⟩] > 1/2 + 1/q(n),

where the distribution of P, g, a and b is as in the DH-Assumption and r is drawn from U_n. By the Goldreich-Levin hard-core bit theorem, this means that g^{ab} can also be efficiently computed, which contradicts the DH-Assumption and completes the proof of the lemma.

In the definition of A, we use the fact that in order to compute g^{x·y} = (g^x)^y = (g^y)^x, it is enough to either know g^x and y or to know g^y and x (i.e., it is not required to know both x and y). This enables A to define a matrix which is distributed according to E_n such that one of its entries is ⟨g^{ab}, r⟩ (the value A tries to guess), while all other entries can be computed by A. It is now possible for A to guess ⟨g^{ab}, r⟩ by invoking T on the appropriate prefix of this matrix.

In more detail, on input ⟨P, g, g^a, g^b, r⟩, the algorithm A is defined as follows:

1. Uniformly choose ⟨i, j⟩ ∈ {1, ..., m(n)} × {1, ..., m(n)}.

2. Define X = {x_1, ..., x_{m(n)}} and Y = {y_1, ..., y_{m(n)}} by setting x_i = a, y_j = b, and independently drawing all other values from U_n.

3. Define B = (b_{u,v}) to be C_{s_{P,g,r}}(X, Y).

   Note that A knows all the values of X and Y except x_i and y_j. Therefore, A can compute all the entries of B except b_{i,j} = ⟨g^{ab}, r⟩.

4. Invoke T and feed it all the entries of B up to b_{i,j} (i.e., the first i−1 rows and the first j−1 entries of the i-th row).

5. Output T's prediction of b_{i,j}.

It is obvious that A is efficient. Furthermore, since the distribution of B is exactly E_n, it is immediate that for infinitely many n's, Pr[A(P, g, g^a, g^b, r) = ⟨g^{ab}, r⟩] > 1/2 + 1/q(n), where the distribution of P, g, a, b and r is as above. This contradicts the DH-Assumption and proves the lemma. □

Corollary. If the DH-Assumption holds, then there exist pseudo-random functions that are computable in NC² given a sequential precomputation, which is part of the key-generating algorithm.

Proof. Given that the DH-Assumption holds, S_DH is a collection of pseudo-random synthesizers (by the theorem above). If the key-generating algorithm precomputes g^{2^i} mod P for 0 ≤ i < 2n, then the functions of S_DH can be evaluated in NC¹. This precomputation reduces any modular exponentiation with g as the base to an iterated multiplication and an additional modular reduction (see also the discussion at the beginning of this section). Therefore, there exist pseudo-random functions in NC² (the key-generating algorithm in both cases is sequential). □
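The precomputation trick in the proof above can be illustrated directly. The sketch below, with toy parameters, stores the table g^{2^i} mod P; with this table in hand, raising g to any exponent becomes a single product of precomputed values (plus modular reductions), which is the step that parallelizes.

```python
# Sketch of the precomputation trick: with the table g^(2^i) mod P in hand,
# computing g^e mod P for any exponent e reduces to one iterated product of
# precomputed values. Parameters are illustrative toy values.

def precompute_powers(g: int, P: int, bits: int):
    table, cur = [], g % P
    for _ in range(bits):
        table.append(cur)       # table[i] = g^(2^i) mod P
        cur = cur * cur % P
    return table

def fixed_base_exp(table, e: int, P: int) -> int:
    acc = 1
    for i, gi in enumerate(table):
        if (e >> i) & 1:
            acc = acc * gi % P  # these selected factors can be multiplied in parallel
    return acc

P, g = 101, 2
table = precompute_powers(g, P, 16)
assert fixed_base_exp(table, 45, P) == pow(g, 45, P)
```

The sequential part (squaring g repeatedly) happens once, at key generation; every subsequent evaluation only multiplies selected table entries together.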

Remark. Assume that IG_DH(1^n) has a single possible value, P_n, for every n. Then S_DH can be transformed into a synthesizer (rather than a collection of synthesizers). In this case, the key-generating algorithm of the pseudo-random functions we get is in non-uniform NC.

Composite Diffie-Hellman Assumption and Factoring

The collection of pseudo-random synthesizers S_DH is at least as secure as the DH-Assumption modulo a prime. As mentioned above, the DH-Assumption in any other group gives a corresponding construction of pseudo-random synthesizers with practically the same proof of security. We now consider the DH-Assumption modulo a Blum-integer (the composite DH-Assumption). McCurley and Shmuely have shown that the composite DH-Assumption is implied by the assumption that factoring Blum-integers is hard (see also the work of Biham, Boneh and Reingold for definitions and a proof that are more consistent with our setting). We therefore get a simple construction of pseudo-random synthesizers which is at least as secure as factoring. In what follows, we give the relevant definitions and claims; we omit the proofs, since they are practically the same as those in the previous subsection.

To formalize the composite DH-Assumption, we let this composite be generated by some polynomial-time algorithm, IG_F. We restrict the output of IG_F to the set of Blum-integers. This restriction is quite standard, and it is meant to simplify the reduction of the composite DH-Assumption to factoring.

Definition (IG_F). The factoring instance generator, IG_F, is a probabilistic polynomial-time algorithm such that on input 1^n its output, N, is distributed over 2n-bit integers, where N = P · Q for two n-bit primes P and Q such that P ≡ Q ≡ 3 mod 4 (such an integer is known as a Blum-integer).
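A sampler in the spirit of IG_F can be sketched as follows. This is an illustrative toy (small bit-sizes, a plain Miller-Rabin test written inline); the paper only requires that such a generator be probabilistic polynomial-time.

```python
import random

# Illustrative sampler for IG_F: a Blum integer N = P*Q with two primes
# P, Q satisfying P ≡ Q ≡ 3 (mod 4). Toy bit-sizes, Miller-Rabin testing.

def is_prime(n: int, rounds: int = 40) -> bool:
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(rounds):
        x = pow(random.randrange(2, n - 1), d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def random_blum_prime(bits: int) -> int:
    while True:
        # Force the top bit and the low bits 11, so p ≡ 3 (mod 4).
        p = random.getrandbits(bits) | (1 << (bits - 1)) | 3
        if is_prime(p):
            return p

def ig_f(n_bits: int) -> int:
    p = random_blum_prime(n_bits)
    q = random_blum_prime(n_bits)
    while q == p:
        q = random_blum_prime(n_bits)
    return p * q

N = ig_f(16)
```

Since P ≡ Q ≡ 3 (mod 4), the product always satisfies N ≡ 1 (mod 4), a quick sanity check on the output.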

We note that the most natural distribution of IG_F(1^n) is the uniform distribution over 2n-bit Blum-integers. Furthermore, it is essential that IG_F(1^n) have many possible values, since otherwise factoring would be non-uniformly easy (in this respect it is very different from the case of the Diffie-Hellman instance generator, IG_DH).

All exponentiations in the rest of this subsection are in Z*_N (the definition of N will be clear from the context). To simplify the notation, we omit the expression "mod N" from now on. We can now define both the composite DH-Assumption and the assumption that factoring Blum-integers is hard, for the instances generated by IG_F.

Assumption (Composite Diffie-Hellman). For every probabilistic polynomial-time algorithm A, for every polynomial q(·), and all sufficiently large n's,

    Pr[A(N, g, g^a, g^b) = g^{ab}] < 1/q(n),

where the distribution of N is IG_F(1^n), the distribution of g is uniform over the set of quadratic-residues in Z*_N, and the distribution of ⟨a, b⟩ is U_{4n}.

Assumption (Factoring). For every probabilistic polynomial-time algorithm A, for every polynomial q(·), and all sufficiently large n's,

    Pr[A(P · Q) ∈ {P, Q}] < 1/q(n),

where the distribution of N = P · Q is IG_F(1^n).

We define a collection of I_{2n} × I_{2n} → I_1 pseudo-random synthesizers, S_F, in analogy to the definition of S_DH.

Definition. For every 2n-bit Blum-integer N, every quadratic-residue g in Z*_N, and every r ∈ I_{2n}, define s_{N,g,r}: I_{2n} × I_{2n} → I_1 by

    ∀x, y ∈ I_{2n},  s_{N,g,r}(x, y) def= ⟨g^{x·y}, r⟩.

Let S_n be the random variable that assumes as values the functions s_{N,g,r}, where the distribution of N is IG_F(1^n), the distribution of g is uniform over the set of quadratic-residues in Z*_N, and the distribution of r is U_{2n}. The function ensemble S_F is defined to be {S_n}_{n∈N}.

In the same way the corresponding theorem for S_DH was proven, we get:

Theorem. If the composite DH-Assumption holds, then S_F is a collection of I_{2n} × I_{2n} → I_1 pseudo-random synthesizers.

Since the composite DH-Assumption is implied by the factoring assumption, we get:

Corollary. If factoring Blum-integers is hard, then S_F is a collection of I_{2n} × I_{2n} → I_1 pseudo-random synthesizers.

Finally, we can conclude:

Corollary. If factoring Blum-integers is hard, then there exist pseudo-random functions that are computable in NC² given a sequential precomputation, which is part of the key-generating algorithm.

The RSA Assumption

We now define two collections of pseudo-random synthesizers under the assumption that the RSA permutations of Rivest, Shamir and Adleman are indeed one-way (i.e., under the assumption that it is hard to extract roots modulo a composite). This assumption is not weaker than the factoring assumption. However, the constructions based on RSA might have other advantages. For example, the second RSA-construction gives pseudo-random synthesizers with linear output length, under the assumption that n least-significant bits are simultaneously hard for RSA. Another reason to include these constructions is that they use several interesting techniques that might be useful elsewhere (e.g., the multiple role played by the subset product function).

As was the case with the previous assumptions, we keep the definition of the RSA-Assumption general by using some polynomial-time instance generator, IG_RSA.

Definition (IG_RSA). The RSA instance generator, IG_RSA, is a probabilistic polynomial-time algorithm such that on input 1^n its output is distributed over pairs ⟨N, e⟩, where N = P · Q is a 2n-bit integer, P and Q are two n-bit primes, and e ∈ Z*_{φ(N)} (i.e., e is relatively prime to the order of Z*_N, which is denoted by φ(N)).

All exponentiations in the rest of this subsection are in Z*_N (the definition of N will be clear from the context). To simplify the notation, we omit the expression "mod N" from now on. We can now define the RSA-Assumption for the instances generated by IG_RSA.

Assumption (RSA). For every probabilistic polynomial-time algorithm A, for every polynomial q(·), and all sufficiently large n's,

    Pr[A(N, e, m^e) = m] < 1/q(n),

where the distribution of ⟨N, e⟩ is IG_RSA(1^n) and m is uniformly distributed in Z*_N.

The RSA-Assumption gives a collection of trapdoor one-way permutations: the public key is ⟨N, e⟩; the function f_{N,e} is defined by f_{N,e}(m) def= m^e; and the trapdoor-key is ⟨N, e, d⟩, where d = e^{−1} mod φ(N), which enables efficient inversion by the formula m = (m^e)^d. In a previous section we showed a general construction of pseudo-random synthesizers out of trapdoor one-way permutations. However, a straightforward application of this construction to the RSA collection gives very inefficient synthesizers. In the following few paragraphs we describe this construction, the reasons it is inefficient, and some of the ideas and tools that allow us to get more efficient synthesizers, which are also computable in NC.

Applying that general construction to the RSA collection (using the Goldreich-Levin hard-core predicate) gives the following collection of synthesizers. The key of each synthesizer is a uniformly distributed string r, and for every x and y, s_r(x, y) def= ⟨m^d, r⟩, where x samples the trapdoor-key ⟨N, e, d⟩ and y samples a uniformly chosen element m ∈ Z*_N. The most obvious drawback of this definition is that computing the value s_r(x, y) involves sampling an RSA trapdoor-key. In particular, computing s_r(x, y) involves sampling a Blum-integer N. This rather heavy operation might be acceptable as part of the key-generating algorithm of the pseudo-random synthesizers (or functions), but is extremely undesirable as part of their evaluation.

In the direct constructions of pseudo-random synthesizers based on RSA, we manage to push the composite N into the key of the synthesizers, thus overcoming the drawback described in the previous paragraph. Nevertheless, we are still left with the following problem: computing m^d in NC requires precomputation that depends on m. To enable this precomputation, it seems that m needs to be part of the key as well. However, in the construction described above, m depends on the input (and is uniformly distributed for a random input). In order to overcome this problem, we show a method of sampling m (almost uniformly at random) in a way that facilitates the necessary preprocessing. This method uses the subset product functions. We first define these functions and then describe the way they are used in our context.

Definition. Let G be a finite group and let ȳ = {y_1, ..., y_n} be an n-tuple of elements in G. For any n-bit string x = x_1 x_2 ... x_n, define the subset product SP_{G,ȳ}(x) to be the product (in G) of the elements y_i such that x_i = 1.
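For G = Z*_N, the subset product is just a selective modular product. A minimal sketch, with a toy modulus and tuple of my own choosing:

```python
# Sketch of the subset product SP_{G,y}(x) for G = Z*_N: multiply together
# the elements y_i whose selecting bit x_i is 1. Toy values throughout.

def subset_product(N: int, ys, x_bits) -> int:
    acc = 1
    for bit, y in zip(x_bits, ys):
        if bit:
            acc = acc * y % N
    return acc

N = 35                     # toy modulus
ys = [2, 3, 4, 9]          # all coprime to 35, so they lie in Z*_35
assert subset_product(N, ys, [1, 0, 1, 1]) == (2 * 4 * 9) % N  # 72 % 35 = 2
assert subset_product(N, ys, [0, 0, 0, 0]) == 1                # empty product
```

The selected factors are independent of one another, so in a parallel model they can all be multiplied in one iterated product.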

The following lemma was shown by Impagliazzo and Naor, and is based on the leftover hash lemma.

Lemma. Let G be a finite group, and let n ≥ c · log |G| for a constant c > 1. Then, for all but an exponentially small fraction of the choices of ȳ ∈ G^n, the distribution SP_{G,ȳ}(U_n) is statistically indistinguishable (within an exponentially small amount) from the uniform distribution over G.

Let N be a 2n-bit integer. The lemma gives a way of defining a collection of functions {f_ḡ: I_{4n} → Z*_N} which solves the problem of sampling an almost uniformly distributed element m ∈ Z*_N for which m^d can be computed in NC. This collection is {f_ḡ def= SP_{Z*_N,ḡ}}, where ḡ = {g_1, ..., g_{4n}} is a sequence of 4n elements in Z*_N. The functions {f_ḡ} have the following properties:

1. For almost all choices of the key ḡ, we have that f_ḡ(U_{4n}) is almost uniformly distributed in Z*_N.

2. Following preprocessing that depends only on the key ḡ, each value (f_ḡ(x))^y can be computed in NC¹. The values that need to be precomputed are g_i^{2^j}, where 1 ≤ i ≤ 4n and 0 ≤ j ≤ |y| (the length of y). With these values, the computation of (f_ḡ(x))^y is reduced to a single iterated multiplication and an additional modular reduction.
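Property 2 is the heart of the efficiency argument, so a small sketch may help. With the table g_i^{2^j} precomputed, the value (f_ḡ(x))^y = Π_{x_i = 1} g_i^y unfolds into a single product of precomputed entries, selected by the bits of x and y. All names and parameters below are toy illustrations.

```python
# Sketch of property 2: with the table g_i^(2^j) mod N precomputed (it
# depends only on the key), (f_g(x))^y collapses to one iterated product
# of precomputed values. Toy parameters throughout.

def precompute(N, gs, ybits):
    table = []
    for g in gs:
        row, cur = [], g % N
        for _ in range(ybits):
            row.append(cur)        # row[j] = g^(2^j) mod N
            cur = cur * cur % N
        table.append(row)
    return table

def sp_power(N, table, x_bits, y):
    acc = 1
    for i, bit in enumerate(x_bits):
        if bit:
            for j, gij in enumerate(table[i]):
                if (y >> j) & 1:
                    acc = acc * gij % N  # all selected factors multiply in parallel
    return acc

N, gs = 101, [3, 7, 11, 13]
table = precompute(N, gs, 8)
x, y = [1, 0, 1, 0], 25
assert sp_power(N, table, x, y) == pow((3 * 11) % N, y, N)
```

The only sequential work (repeated squaring of each g_i) happens once, at key generation, exactly as property 2 states.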

The First RSA Construction

For our first RSA construction, we need to assume that it is hard to extract the e-th root modulo a composite when e is a large prime. To formalize this, we assume that for every possible value ⟨N, e⟩ of IG_RSA(1^n), we have that e is a 2n-bit prime (which, in particular, means that e ∈ Z*_{φ(N)}). Based on this version of the RSA-Assumption, we define a collection of I_{4n} × I_{4n} → I_1 pseudo-random synthesizers, S_RSA.

Definition. Let N be a 2n-bit integer, let ḡ = {g_1, ..., g_{4n}} be a sequence of 4n elements in Z*_N, and let r be a 2n-bit string. Define the function s_{N,ḡ,r}: I_{4n} × I_{4n} → I_1 by

    ∀x, y ∈ I_{4n},  s_{N,ḡ,r}(x, y) def= ⟨g_x^y, r⟩,

where g_x = SP_{Z*_N,ḡ}(x). Let S_n be the random variable that assumes as values the functions s_{N,ḡ,r}, where the distribution of N is induced by IG_RSA(1^n), and ḡ and r are uniformly distributed in their range. The function ensemble S_RSA is defined to be {S_n}_{n∈N}.

Note that the only reason we let y be a 4n-bit number (instead of a 2n-bit number) is to make both inputs of s_{N,ḡ,r} be of the same length, which is not really necessary for our constructions.

Theorem. Assume that the RSA-Assumption holds and that, for every possible value ⟨N, e⟩ of IG_RSA(1^n), e is a 2n-bit prime. Then S_RSA is a collection of I_{4n} × I_{4n} → I_1 pseudo-random synthesizers.

Proof. It is obvious that S_RSA = {S_n}_{n∈N} is efficiently computable. Assume that S_RSA is not a collection of pseudo-random synthesizers. Then there exists a polynomial m(·) such that the ensemble E = {E_n}_{n∈N} is not pseudo-random, where E_n = C_{S_n}(X, Y) for X and Y that are independently drawn from U_{m(n)·4n}. Therefore, there exists an efficient next-bit prediction test T and a polynomial q(·) such that for infinitely many n's it holds that:

    Given a prefix of E_n of uniformly chosen length, T succeeds in predicting the next bit with probability greater than 1/2 + 1/q(n).

We now show how to use T in order to define an efficient algorithm A such that for infinitely many n's,

    Pr[A(N, e, m^e, z, r) = ⟨m^z, r⟩] > 1/2 + 1/q(n),

where the distribution of N, e and m is as in the RSA-Assumption (with the restriction that e is a 2n-bit prime), r is drawn from U_{2n}, and z is uniformly distributed over the set of 4n-bit integers that are relatively prime to e. By the Goldreich-Levin hard-core bit theorem, this means that m^z can also be efficiently computed. Following Shamir, we note that given any z such that gcd(e, z) = 1 and given m^z, it is easy to compute m. The reason is that if gcd(e, z) = 1, then m can be computed by the formula m = m^{a·e + b·z} = (m^e)^a · (m^z)^b, where a, b ∈ Z satisfy a·e + b·z = 1 (and can be efficiently computed as well). Thus, the existence of such an algorithm A contradicts the RSA-Assumption and completes the proof of the lemma.
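The gcd trick just invoked is simple enough to state in code. A minimal sketch, with a toy modulus of my own choosing (the reduction itself works over a full-size RSA modulus):

```python
# Sketch of the gcd trick: given m^e and m^z with gcd(e, z) = 1, recover
# m = (m^e)^a * (m^z)^b mod N, where a*e + b*z = 1 comes from extended
# Euclid. Toy modulus; negative exponents in pow() need Python 3.8+.

def egcd(a: int, b: int):
    """Return (g, x, y) with a*x + b*y = g = gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def recover_root(N: int, me: int, mz: int, e: int, z: int) -> int:
    g, a, b = egcd(e, z)
    assert g == 1, "e and z must be coprime"
    return pow(me, a, N) * pow(mz, b, N) % N

N, m = 3233, 42            # toy modulus 61 * 53; m coprime to N
e, z = 17, 5               # gcd(17, 5) = 1
assert recover_root(N, pow(m, e, N), pow(m, z, N), e, z) == m
```

Here a = -2 and b = 7 (since -2·17 + 7·5 = 1), and Python's three-argument pow handles the negative exponent by taking a modular inverse.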

The algorithm A defines a matrix B which is distributed almost identically to E_n. One of the entries of B is ⟨m^z, r⟩ (the value A tries to guess), and all other entries can be computed by A. It is now possible for A to guess ⟨m^z, r⟩ by invoking T on the appropriate prefix of this matrix. In more detail, on input ⟨N, e, m^e, z, r⟩, the algorithm A is defined as follows:

1. Uniformly choose ⟨i, j⟩ ∈ {1, ..., m(n)} × {1, ..., m(n)}.

2. Define the values {h_1, ..., h_{m(n)}} and {d_1, ..., d_{m(n)}} by setting h_i = m, uniformly drawing all other h_u's from Z*_N, setting d_j = z · e^{−1} mod φ(N), and drawing all other d_v's from U_{4n}.

3. Define B = (b_{u,v}) by setting b_{u,v} = ⟨h_u^{e·d_v}, r⟩.

   Note that A can compute any entry b_{u,v} except b_{i,j} = ⟨m^z, r⟩. The reason is that if v ≠ j, then A knows both d_v and h_u^e (in particular, h_i^e = m^e), and if u ≠ i, then A can compute b_{u,j} = ⟨h_u^{e·z·e^{−1}}, r⟩ = ⟨h_u^z, r⟩, since it knows both h_u and z.

4. Invoke T and feed it all the entries of B up to b_{i,j} (i.e., the first i−1 rows and the first j−1 entries of the i-th row).

5. Output T's prediction of b_{i,j}.

It is obvious that A is efficient. In order to complete the proof, we need to show that if ⟨N, e, m^e, z, r⟩ is distributed as above, then B and E_n are of exponentially small statistical distance. This would imply that for infinitely many n's, if we feed T with the bits of B up to b_{i,j}, it predicts b_{i,j} = ⟨m^z, r⟩ with probability greater than, say, 1/2 + 1/(2q(n)). As argued above, this would contradict the RSA-Assumption and would complete the proof.

To see that B and E_n are indeed statistically close, notice that:

1. Since e ∈ Z*_{φ(N)} and, for every u ≤ m(n), the value h_u is uniformly distributed in Z*_N, we have that for every u ≤ m(n) the value h_u^e is also uniformly distributed in Z*_N (raising to the power e is a permutation of Z*_N).

2. By the subset-product lemma, we therefore have that the distribution of {h_u^e}_{u ≤ m(n)} is statistically close to the distribution of {g_{x_u} = SP_{Z*_N,ḡ}(x_u)}_{u ≤ m(n)}, for uniformly distributed values {x_1, ..., x_{m(n)}} ∈ (I_{4n})^{m(n)} and ḡ = {g_1, ..., g_{4n}} ∈ (Z*_N)^{4n}.

3. For z that is chosen from U_{4n}, the distributions z · e^{−1} mod φ(N) and U_{4n} mod φ(N) are statistically close. Since e is a large prime, these distributions remain statistically close even given the restriction that z is relatively prime to e.

Given these observations, it is easy to verify that B and E_n are indeed of exponentially small statistical distance. □

Claim. The functions in S_RSA can be evaluated in NC¹, given a sequential precomputation which is part of the key-generating algorithm.

Proof. Given that the key-generating algorithm precomputes g_i^{2^j} for 1 ≤ i ≤ 4n and 0 ≤ j < 4n, the evaluation of the functions in S_RSA is reduced to an iterated multiplication and an additional modular reduction. □

The Second RSA Construction

The security of S_RSA depends on the assumption that it is hard to extract the e-th root modulo a composite, where e is a large prime. Here we define another collection of synthesizers, under the assumption that it is hard to extract the e-th root modulo a composite N without any restriction on the distribution of e ∈ Z*_{φ(N)}. However, we introduce a new restriction on the possible values of the composite N.

Definition. Let G_n be the set of 2n-bit integers N = P · Q such that P and Q are two n-bit primes and φ(N) has no odd factor smaller than n.

It is easy to verify that if N ∈ G_n, then a sequence of 2n uniformly chosen odd values d̄ = {d_1, ..., d_{2n}} ∈ (Z_N)^{2n} has a constant probability of lying in Z*_{φ(N)}. By the subset-product lemma, given such a sequence it is easy to almost uniformly sample any polynomial number of values in Z*_{φ(N)}, even without knowledge of φ(N) (see the note below). This can be done by using the subset product function SP_{Z*_{φ(N)},d̄}. Notice that here the subset product function serves an additional role to the one already described above.

Sieve theory shows that G_n is not too sparse. For example, denote by B(x) the number of primes p smaller than x such that (p − 1)/2 is the product of two primes, each of which is larger than a fixed power of p. Then there exists a positive constant c such that B(x) ≥ c·x/log²x. Several results of this sort exist, and they are more than sufficient for our purpose. As a result, we get that: (a) if the RSA-Assumption holds for a uniformly distributed value of N, then it also holds under the restriction N ∈ G_n; and (b) the uniform distribution over G_n can be efficiently sampled using Bach's algorithm. Given (a) and (b), it seems that this restriction is rather reasonable.

(Note: without knowledge of φ(N), we cannot really compute SP_{Z*_{φ(N)},d̄}. However, for every input x we can still compute a value y such that SP_{Z*_{φ(N)},d̄}(x) = y mod φ(N), by multiplying over the integers. Such a value y is just as good as SP_{Z*_{φ(N)},d̄}(x) for our proof.)

Based on the RSA-Assumption, with the restriction that N ∈ G_n, we define a collection of I_{4n} × I_{4n} → I_1 pseudo-random synthesizers, S'_RSA. In the definition of S'_RSA, we use the least-significant bit (LSB) instead of the Goldreich-Levin hard-core bit. Alexi et al. showed that LSB is a hard-core bit for RSA; Fischlin and Schnorr have recently provided a stronger reduction for this bit.

Definition. Let N be a 2n-bit integer, let ḡ = {g_1, ..., g_{4n}} be a sequence of 4n elements in Z*_N, and let d̄ = {d_1, ..., d_{4n}} be a sequence of 4n elements in Z*_{φ(N)}. Define the function s_{N,ḡ,d̄}: I_{4n} × I_{4n} → I_1 by

    ∀x, y ∈ I_{4n},  s_{N,ḡ,d̄}(x, y) def= LSB(g_x^{d_y}),

where g_x = SP_{Z*_N,ḡ}(x) and d_y = SP_{Z*_{φ(N)},d̄}(y). Let S_n be the random variable that assumes as values the functions s_{N,ḡ,d̄}, where the distribution of N is induced by IG_RSA(1^n), and ḡ and d̄ are uniformly distributed in their range. The function ensemble S'_RSA is defined to be {S_n}_{n∈N}.

Theorem. Assume that the RSA-Assumption holds and that, for every possible value ⟨N, e⟩ of IG_RSA(1^n), we have N ∈ G_n. Then S'_RSA is a collection of I_{4n} × I_{4n} → I_1 pseudo-random synthesizers.

Proof. It is obvious that S'_RSA = {S_n}_{n∈N} is efficiently computable. Assume that S'_RSA is not a collection of pseudo-random synthesizers. Then there exists a polynomial m(·) such that the ensemble E = {E_n}_{n∈N} is not pseudo-random, where E_n = C_{S_n}(X, Y) for X and Y that are independently drawn from U_{m(n)·4n}. Therefore, there exists an efficient next-bit prediction test T and a polynomial q(·) such that for infinitely many n's it holds that:

    Given a prefix of E_n of uniformly chosen length, T succeeds in predicting the next bit with probability greater than 1/2 + 1/q(n).

We now show how to use T in order to define an efficient algorithm A such that for infinitely many n's,

    Pr[A(N, e, m^e) = LSB(m)] > 1/2 + 1/q(n),

where the distribution of N, e and m is as in the RSA-Assumption, with the restriction that N ∈ G_n. By the hard-core bit reduction of Alexi et al., this contradicts the RSA-Assumption and completes the proof of the lemma.

The basic idea in the definition of the algorithm A is similar to that of the first RSA construction: the algorithm A defines a matrix B which is distributed almost identically to E_n. One of the entries of B is LSB(m) (the value A tries to guess), and all other entries can be computed by A. It is now possible for A to guess LSB(m) by invoking T on the appropriate prefix of this matrix. In more detail, on input ⟨N, e, m^e⟩ (distributed as above), the algorithm A is defined as follows:

1. Uniformly choose ⟨i, j⟩ ∈ {1, ..., m(n)} × {1, ..., m(n)}.

2. Define e' to be e · d, where d is almost uniformly distributed in Z*_{φ(N)} (such a value d can be sampled because N ∈ G_n).

   Note that e' is almost uniformly distributed in Z*_{φ(N)}, and that A can compute m^{e'} = (m^e)^d.

3. Define the values {h_1, ..., h_{m(n)}} and {d_1, ..., d_{m(n)}} by setting h_i = m, uniformly drawing all other h_u's from Z*_N, setting d_j = (e')^{−1} mod φ(N), and sampling all other d_v's almost uniformly from Z*_{φ(N)}.

4. Define B = (b_{u,v}) by setting b_{u,v} = LSB(h_u^{e'·d_v}).

   Note that A can compute any entry b_{u,v} except b_{i,j} = LSB(m). The reason is that if v ≠ j, then A knows both d_v and h_u^{e'} (in particular, h_i^{e'} = (m^e)^d), and if u ≠ i, then A can compute b_{u,j} = LSB(h_u^{e'·(e')^{−1}}) = LSB(h_u), since it knows h_u.

5. Invoke T and feed it all the entries of B up to b_{i,j} (i.e., the first i−1 rows and the first j−1 entries of the i-th row).

6. Output T's prediction of b_{i,j}.

It is obvious that A is efficient. It is also easy to verify that if ⟨N, e, m^e⟩ is distributed as above, then B and E_n are of exponentially small statistical distance. Therefore, for infinitely many n's, if we feed T with the bits of B up to b_{i,j}, it predicts b_{i,j} = LSB(m) with probability greater than, say, 1/2 + 1/(2q(n)). As argued above, this contradicts the RSA-Assumption and completes the proof of the lemma. □

Claim. The functions in S'_RSA can be evaluated in NC¹, given a sequential precomputation which is part of the key-generating algorithm.

Proof. Given that the key-generating algorithm precomputes g_i^{2^j} for 1 ≤ i ≤ 4n and every j up to the maximal length of the exponent d_y, the evaluation of the functions in S'_RSA is reduced to two iterated multiplications and two modular reductions. □

Remark. Since Alexi et al. showed that the log n least-significant bits are simultaneously hard for RSA, we can adjust the functions in S'_RSA to output log n bits. If we make the stronger assumption that n bits are simultaneously hard for RSA, we get a direct construction of pseudo-random synthesizers with linear output size. Although the stronger assumption is not known to be equivalent to the RSA-Assumption, it is still quite standard.

Pseudo-Randomness and Learning Theory

Synthesizers Based on Hard-to-Learn Problems

The traditional connection between cryptography and learning theory is the use of cryptographic assumptions to deduce computational non-learnability results. Blum, Furst, Kearns and Lipton have suggested that the other direction is interesting as well. They have shown how to construct several cryptographic primitives out of hard-to-learn functions, in a way that preserves the degree of parallelism of the functions. A major motivation for presenting such constructions is the simplicity of function classes that are believed to be hard for efficient learning.

We show that, under their definitions, pseudo-random synthesizers can easily be constructed from distributions of functions that are hard to learn. Thus, by the constructions shown in this paper, two additional cryptographic primitives can be constructed (in parallel) out of hard-to-learn functions: (1) pseudo-random generators with large expansion ratio, without assuming (as they do) that the functions are hard to learn with membership queries; and (2) pseudo-random functions.

There is a difference between standard learning-theory definitions and standard cryptographic definitions. Loosely speaking, a collection of concepts is hard to learn if for every efficient algorithm there exists a distribution over the concepts that is hard to learn for this specific algorithm. In cryptographic settings, the order of quantifiers is reversed: the hard distribution should be hard for every efficient algorithm. In order for hard-to-learn problems to be useful in cryptographic settings, an average-case learning model was introduced by Blum et al.

Informally describing one of their definitions, we can say that a distribution ensemble of functions F = {F_n}_{n∈N} is not weakly predictable on the average, with respect to a distribution D on the inputs, if the following holds: there is no efficient algorithm that can predict f(x) with probability 1/2 + 1/poly(n), given x and a polynomial-length sequence {⟨x_i, f(x_i)⟩}, where f is distributed according to F_n and all the inputs are independently distributed according to D.

It is easy to verify that a distribution ensemble of functions F is not weakly predictable on the average with respect to the uniform distribution if and only if it is a collection of weak pseudo-random functions. Thus, such a distribution defines a pseudo-random synthesizer S_n, where S_n(x, y) is simply f_y(x) (recall that f_y denotes the function that is sampled from F_n using y as random bits). Using S_n, we can construct pseudo-random generators and pseudo-random functions. Moreover, the pseudo-random generator we construct may have a large expansion ratio (n^c for every constant c). The pseudo-random generator constructed by Blum et al. under the same assumption has expansion ratio bounded by n.

A Concrete Hard-to-Learn Problem

Consider the following distribution on functions, with parameters k and n. Each function is defined by two uniformly distributed disjoint sets A, B ⊆ {1, ..., n}, each of size k. Given an n-bit input, the output of the function is the exclusive-or of two values: the parity of the bits indexed by A, and the majority of the bits indexed by B. Blum et al. estimate that these functions, for k = log n, cannot be weakly predicted without profoundly new ideas. If this distribution of functions is indeed not weakly predictable on the average for some k, then it defines an extremely efficient synthesizer. Therefore, using the constructions of this paper, we get efficient parallel pseudo-random functions.
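The candidate functions above are simple enough to write down directly. A minimal sampler and evaluator, with toy parameters of my own choosing:

```python
import random

# Toy sampler/evaluator for the candidate hard-to-learn functions: output is
# (parity of the bits indexed by A) XOR (majority of the bits indexed by B),
# where A and B are disjoint random size-k subsets of {0, ..., n-1}.

def sample_function(n: int, k: int):
    idx = random.sample(range(n), 2 * k)   # 2k distinct indices
    A, B = idx[:k], idx[k:]
    def f(x_bits):
        parity = sum(x_bits[i] for i in A) % 2
        majority = int(sum(x_bits[i] for i in B) * 2 > k)
        return parity ^ majority
    return f

f = sample_function(n=16, k=3)
# On the all-zeros input, parity = 0 and majority = 0, so f outputs 0
# regardless of which A and B were sampled:
assert f([0] * 16) == 0
```

Each evaluation reads only 2k input bits and uses one parity and one majority gate, which is what makes the resulting synthesizer (and hence the pseudo-random functions built from it) so shallow.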

The Application of Pseudo-Random Functions to Learning Theory

As observed by Valiant, if a concept class contains pseudo-random functions, then a very strong unlearnability result holds for this class. Informally, it means that there exists a distribution of concepts in this class that is hard for every learning algorithm and for every non-trivial distribution on inputs, even when membership queries are allowed. Since no parallel pseudo-random functions were known before the current work, this observation could not previously be applied to NC. Nevertheless, other techniques (based on cryptographic assumptions) were used to show hardness results for NC and additional classes. For example, Kharitonov used the following fact: after preprocessing, a polynomial-length pseudo-random bit-string (based on the Blum-Blum-Shub generator) can be produced in NC, where the length of the string can stay undetermined at the preprocessing stage. The existence of pseudo-random functions in NC might still be of interest to computational learning theory, because the result it implies is stronger than previous ones. To briefly state the difference, we note that some of the previous results use a very specific distribution on the inputs that is hard to learn, while others strongly rely on the order of quantifiers in learning-theory models (mentioned above): for any given learning algorithm they show a different hard concept, which can still be easily learned by another algorithm with a somewhat larger running-time.

Further Research

In the preceding sections we discussed the existence of pseudo-random synthesizers in NC. Additional work should be done in this area. The most obvious question is: what general assumptions (in cryptography or in other fields) imply the existence of pseudo-random synthesizers in NC? In particular, do there exist parallel constructions of pseudo-random synthesizers out of pseudo-random generators, or directly from one-way functions?

It is also of interest to find parallel constructions of pseudo-random synthesizers based on other concrete intractability assumptions. A task of practical importance is to derive more efficient concrete constructions of pseudo-random synthesizers, in order to get efficient constructions of pseudo-random functions. As described above, an important contribution to the efficiency of the pseudo-random functions would be a direct construction of synthesizers with linear output length.

An extensive research field deals with pseudo-random generators that fool algorithms performing space-bounded computations. Generators of this kind can be constructed without any unproven assumptions (see the literature for definitions, constructions and applications). It is possible that the concept of pseudo-random synthesizers, and the idea of our construction, can be applied to the world of space-bounded computations. As a motivating remark, note that one of these constructions bears some resemblance to the GGM construction.

In some sense, we can think of the inner product function as a pseudo-random synthesizer for space-bounded computation. Let IP(x, y) be the inner product of x and y mod 2, and let X and Y be random length-m sequences of n-bit strings. For some constant α > 0 and s = α·n, it can be shown that C_IP(X, Y) is a pseudo-random generator for SPACE(s) with parameter m·2^{−s}, when C_IP(X, Y) is given row by row. The only fact we use is that approximating IP is hard in the communication-complexity model.
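The object C_IP(X, Y) in the remark above is just the matrix of pairwise inner products mod 2, output row by row. A toy sketch (sizes and seeds are illustrative only):

```python
import random

# Sketch of the "inner product as a synthesizer" remark: C_IP(X, Y) is the
# m-by-m matrix of inner products mod 2 of n-bit strings, read row by row.
# Toy sizes; the claim in the text rests on IP being hard to approximate
# with low communication.

def ip(x: int, y: int) -> int:
    """Inner product mod 2 of the bit representations of x and y."""
    return bin(x & y).count("1") % 2

def c_ip(X, Y):
    return [[ip(x, y) for y in Y] for x in X]

random.seed(0)
n, m = 8, 4
X = [random.getrandbits(n) for _ in range(m)]
Y = [random.getrandbits(n) for _ in range(m)]
bits = c_ip(X, Y)
assert len(bits) == m and all(b in (0, 1) for row in bits for b in row)
```

From 2·m·n truly random bits one obtains m² output bits, which is the expansion the remark refers to.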

One might also try to apply the concept of pseudo-random synthesizers to other classes of algorithms; for example, to construct pseudo-random generators for polynomial-size constant-depth circuits and, in general, for any class for which hard problems are known.

Our primary motivation for introducing pseudo-random synthesizers is the parallel construction of pseudo-random functions. The special characteristics of pseudo-random synthesizers lead us to believe that other desirable applications may exist. For instance, pseudo-random synthesizers easily define a pseudo-random generator with large output length and the ability to directly compute subsequences of the output. This, together with the properties discussed earlier, suggests that pseudo-random synthesizers may be useful for software implementations of pseudo-random generators (or functions). Another possible application of the idea behind our construction that should be examined is converting encryption methods that are not immune to chosen-plaintext attacks into ones that are immune.

As mentioned in the introduction, in a subsequent work we describe constructions of pseudo-random functions and other cryptographic primitives based on several number-theoretic assumptions. These functions can be computed in NC (in fact, even in TC⁰) and are very efficient. We note that this subsequent work is motivated by the current one, and that it can be described as a direct and efficient construction of n-dimensional pseudo-random synthesizers. The question that arises is whether there exist other such constructions of n-dimensional pseudo-random synthesizers.

An alternative direction for constructing parallel pseudo-random functions is to try to generalize the philosophy behind the Data Encryption Standard (DES), while maintaining its apparent efficiency. Some interesting ideas and results on the generalization of DES can be found in Cleve's work.

Acknowledgments

We thank the two anonymous referees and Benny Pinkas for their many helpful comments. We thank Shafi Goldwasser and Jon Sorenson, who brought relevant references to our attention, and Mihir Bellare for his observation, described above.

References

M. Ajtai and A. Wigderson, Deterministic simulations of probabilistic constant depth circuits,
Advances in Computing Research. Preliminary version: Proc. IEEE Symp. on Foundations of
Computer Science.

W. B. Alexi, B. Chor, O. Goldreich, and C. P. Schnorr, RSA and Rabin functions: certain parts
are as hard as the whole, SIAM J. Comput.

D. Angluin and M. Kharitonov, When won't membership queries help?, J. Comput. System Sci.

L. Babai, N. Nisan, and M. Szegedy, Multiparty protocols, pseudorandom generators for logspace,
and time-space trade-offs, J. Comput. System Sci.

E. Bach, How to generate factored random numbers, SIAM J. Comput.

P. W. Beame, S. A. Cook, and H. J. Hoover, Log depth circuits for division and related problems,
SIAM J. Comput.

M. Bellare, O. Goldreich, and S. Goldwasser, Incremental cryptography: the case of hashing and
signing, Advances in Cryptology - CRYPTO, Lecture Notes in Computer Science, Springer-Verlag.

M. Bellare, O. Goldreich, and S. Goldwasser, Incremental cryptography with application to virus
protection, Proc. ACM Symp. on Theory of Computing.

M. Bellare and S. Goldwasser, New paradigms for digital signatures and message authentication
based on non-interactive zero knowledge proofs, Advances in Cryptology - CRYPTO, Lecture Notes
in Computer Science, Springer-Verlag.

E. Biham, D. Boneh, and O. Reingold, Generalized Diffie-Hellman modulo a composite is not weaker
than factoring, Theory of Cryptography Library,
http://theory.lcs.mit.edu/~tcryptol/homepage.html.

L. Blum, M. Blum, and M. Shub, A simple secure unpredictable pseudorandom number generator,
SIAM J. Comput.

M. Blum, W. Evans, P. Gemmell, S. Kannan, and M. Naor, Checking the correctness of memories,
Algorithmica. Preliminary version: Proc. IEEE Symp. on Foundations of Computer Science.

A. Blum, M. Furst, M. Kearns, and R. J. Lipton, Cryptographic primitives based on hard learning
problems, Advances in Cryptology - CRYPTO, Lecture Notes in Computer Science, Springer-Verlag.

M. Blum and S. Micali, How to generate cryptographically strong sequences of pseudorandom bits,
SIAM J. Comput.

D. Boneh and R. Lipton, Algorithms for black-box fields and their application to cryptography,
Advances in Cryptology - CRYPTO, Lecture Notes in Computer Science, Springer-Verlag.

G. Brassard, Modern cryptology, Lecture Notes in Computer Science, Springer-Verlag.

R. Cleve, Methodologies for designing block ciphers and cryptographic protocols (Part I),
Ph.D. Thesis, University of Toronto.

R. Cleve, Complexity theoretic issues concerning block ciphers related to DES, Advances in
Cryptology - CRYPTO, Lecture Notes in Computer Science, Springer-Verlag.

B. Chor, A. Fiat, and M. Naor, Tracing traitors, Advances in Cryptology - CRYPTO,
Springer-Verlag.

B. Chor and O. Goldreich, Unbiased bits from sources of weak randomness and probabilistic
communication complexity, SIAM J. Comput. Preliminary version: Proc. IEEE Symp. on Foundations
of Computer Science.

W. Diffie and M. Hellman, New directions in cryptography, IEEE Trans. Inform. Theory.

R. Fischlin and C. P. Schnorr, Stronger security proofs for RSA and Rabin bits, Advances in
Cryptology - EUROCRYPT, Lecture Notes in Computer Science, Springer-Verlag.

O. Goldreich, Two remarks concerning the Goldwasser-Micali-Rivest signature scheme, Advances in
Cryptology - CRYPTO, Lecture Notes in Computer Science, Springer-Verlag.

O. Goldreich, Towards a theory of software protection, Proc. ACM Symp. on Theory of Computing.

O. Goldreich, Foundations of Cryptography (Fragments of a Book), electronic publication,
Electronic Colloquium on Computational Complexity,
http://www.eccc.uni-trier.de/eccc/info/ECCC-Books/eccc-books.html.

O. Goldreich, S. Goldwasser, and S. Micali, How to construct random functions, J. of the ACM.

O. Goldreich, S. Goldwasser, and S. Micali, On the cryptographic applications of random
functions, Advances in Cryptology - CRYPTO, Lecture Notes in Computer Science, Springer-Verlag.

O. Goldreich and L. Levin, A hard-core predicate for all one-way functions, Proc. ACM Symp. on
Theory of Computing.

S. Goldwasser and S. Micali, Probabilistic encryption, J. Comput. System Sci.

J. Hastad, A. W. Schrift, and A. Shamir, The discrete logarithm modulo a composite hides O(n)
bits, J. of Computer and System Sciences.

J. Hastad, R. Impagliazzo, L. A. Levin, and M. Luby, Construction of a pseudorandom generator
from any one-way function, to appear in SIAM J. Comput. Preliminary versions by Impagliazzo et
al. and by Hastad in Proc. ACM Symp. on Theory of Computing.

A. Herzberg and M. Luby, Public randomness in cryptography, Advances in Cryptology - CRYPTO,
Lecture Notes in Computer Science, Springer-Verlag.

R. Impagliazzo and L. Levin, No better ways to generate hard NP instances than picking uniformly
at random, Proc. IEEE Symp. on Foundations of Computer Science.

R. Impagliazzo and M. Naor, Efficient cryptographic schemes provably as secure as subset sum,
J. of Cryptology.

R. Impagliazzo and D. Zuckerman, Recycling random bits, Proc. IEEE Symp. on Foundations of
Computer Science.

R. M. Karp and V. Ramachandran, Parallel algorithms for shared-memory machines, in J. van
Leeuwen, ed., Handbook of Theoretical Computer Science, vol. A, MIT Press.

M. Kearns and L. Valiant, Cryptographic limitations on learning Boolean formulae and finite
automata, J. of the ACM.

M. Kharitonov, Cryptographic hardness of distribution-specific learning, Proc. ACM Symp. on
Theory of Computing.

N. Linial, Y. Mansour, and N. Nisan, Constant depth circuits, Fourier transform, and
learnability, J. of the ACM.

M. Luby, Pseudorandomness and cryptographic applications, Princeton University Press.

M. Luby and C. Rackoff, How to construct pseudorandom permutations from pseudorandom functions,
SIAM J. Comput.

L. A. Levin, One-way functions and pseudorandom generators, Proc. ACM Symp. on Theory of
Computing.

U. Maurer, Towards the equivalence of breaking the Diffie-Hellman protocol and computing
discrete logarithms, Advances in Cryptology - CRYPTO, Lecture Notes in Computer Science,
Springer-Verlag.

K. McCurley, A key distribution system equivalent to factoring, J. of Cryptology.

M. Naor and O. Reingold, On the construction of pseudorandom permutations: Luby-Rackoff
revisited, to appear in J. of Cryptology. Preliminary version: Proc. ACM Symp. on Theory of
Computing.

M. Naor and O. Reingold, Number-theoretic constructions of efficient pseudorandom functions,
Proc. IEEE Symp. on Foundations of Computer Science.

N. Nisan, Pseudorandom generators for space-bounded computation, Combinatorica.

N. Nisan, RL ⊆ SC, Proc. ACM Symp. on Theory of Computing.

N. Nisan and A. Wigderson, Hardness vs. randomness, J. Comput. System Sci. Preliminary version:
Proc. IEEE Symp. on Foundations of Computer Science.

N. Nisan and D. Zuckerman, Randomness is linear in space, J. Comput. System Sci. Preliminary
version: More deterministic simulations in logspace, Proc. ACM Symp. on Theory of Computing.

R. Ostrovsky, An efficient software protection scheme, Proc. ACM Symp. on Theory of Computing.

M. Ram Murty, Artin's conjecture for primitive roots, The Mathematical Intelligencer,
Springer-Verlag.

A. Razborov and S. Rudich, Natural proofs, J. of Computer and System Sciences. Preliminary
version: Proc. ACM Symp. on Theory of Computing.

J. H. Reif and J. D. Tygar, Efficient parallel pseudorandom number generation, SIAM J. Comput.

R. L. Rivest, A. Shamir, and L. M. Adleman, A method for obtaining digital signatures and
public-key cryptosystems, Comm. ACM.

A. Shamir, On the generation of cryptographically strong pseudorandom number sequences, ACM
Trans. Comput. Sys.

V. Shoup, Lower bounds for discrete logarithms and related problems, Advances in Cryptology -
EUROCRYPT, Lecture Notes in Computer Science, Springer-Verlag.

Z. Shmuely, Composite Diffie-Hellman public-key generating systems are hard to break, Technical
Report, Computer Science Department, Technion, Israel.

L. G. Valiant, A theory of the learnable, Comm. ACM.

U. V. Vazirani, Strong communication complexity or generating quasirandom sequences from two
communicating semi-random sources, Combinatorica. Preliminary version: Proc. ACM Symp. on
Theory of Computing.

A. C. Yao, Theory and applications of trapdoor functions, Proc. IEEE Symp. on Foundations of
Computer Science.