Texts in Computational Complexity: Pseudorandom Generators


Oded Goldreich
Department of Computer Science and Applied Mathematics
Weizmann Institute of Science, Rehovot, Israel
January

    "Indistinguishable things are identical."  (G.W. Leibniz)*

* This is the Principle of Identity of Indiscernibles. Leibniz admits that counterexamples to this principle are conceivable, but will not occur in real life, because God is much too benevolent. We thus believe that he would have agreed to the theme of this text, which asserts that indistinguishable things should be considered as identical.

A fresh view at the question of randomness has been taken in the theory of computing: It has been postulated that a distribution is pseudorandom if it cannot be told apart from the uniform distribution by any efficient procedure. The paradigm, originally associating efficient procedures with polynomial-time algorithms, has also been applied with respect to a variety of limited classes of such distinguishing procedures. At the extreme, this approach says that the question of whether the world is deterministic or allows for some free choice (which may be viewed as sources of randomness) is irrelevant. What matters is how the world looks to us and to various computationally bounded devices. That is, if some phenomenon looks random, then we may just treat it as if it were random. Likewise, if we can generate sequences that cannot be told apart from the uniform distribution by any efficient procedure, then we can use these sequences in any efficient randomized application instead of the ideal random bits that are postulated in the design of this application.

Summary: A generic formulation of pseudorandom generators consists of specifying three fundamental aspects: (1) the stretch measure of the generators; (2) the class of distinguishers that the generators are supposed to fool, i.e., the algorithms with respect to which the computational indistinguishability requirement should hold; and (3) the resources that the generators are allowed to use, i.e., their own computational complexity. The archetypical case of pseudorandom generators refers to efficient generators that fool any feasible procedure; that is, the potential distinguisher is any probabilistic polynomial-time algorithm, which may be more complex than the generator itself (which, in turn, has time complexity bounded by a fixed polynomial). These generators are called general-purpose, because their output can be safely used in any efficient application. Such (general-purpose) pseudorandom generators exist if and only if one-way functions exist.
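In standard notation (writing U_m for the uniform distribution over {0,1}^m), the archetypical case just summarized amounts, roughly speaking, to requiring an efficiently computable map

    \[ G : \{0,1\}^{k} \longrightarrow \{0,1\}^{\ell(k)} , \qquad \ell(k) > k , \]

such that, for every probabilistic polynomial-time distinguisher D, every positive polynomial p, and all sufficiently large k,

    \[ \Bigl|\, \Pr[D(G(U_k)) = 1] \;-\; \Pr[D(U_{\ell(k)}) = 1] \,\Bigr| \;<\; \frac{1}{p(k)} . \]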
For purposes of derandomization, one may use pseudorandom generators that are somewhat more complex than the potential distinguisher (which represents the algorithm to be derandomized). Following this approach, suitable pseudorandom generators, which can be constructed assuming the existence of problems in E that have no subexponential-size circuits, yield a full derandomization of BPP (i.e., BPP = P).

It is also beneficial to consider pseudorandom generators that fool space-bounded distinguishers, as well as generators that exhibit some limited random behavior (e.g., outputting a pairwise-independent or a small-bias sequence).

Contents

  Introduction
  The General Paradigm
  General-Purpose Pseudorandom Generators
    The basic definition
    The archetypical application
    Computational Indistinguishability
    Amplifying the stretch function
    Constructions
    Non-uniformly strong pseudorandom generators
    Other variants and a conceptual discussion
      Stronger notions
      Conceptual Discussion
  Derandomization of time-complexity classes
    Definition
    Construction
    Variants and a conceptual discussion
  Space Pseudorandom Generators
    Definitional issues
    Two constructions
      Overviews of the proofs of Theorems ... and ...
    Derandomization of space-complexity classes
  Special Purpose Generators
    Pairwise-Independence Generators
      Constructions
      Applications
    Small-Bias Generators
      Constructions
      Applications
    Random Walks on Expanders
  Notes
  Exercises
  Bibliography

Introduction

The second half of this century has witnessed the development of three theories of randomness, a notion which has been puzzling thinkers for ages. The first theory, initiated by Shannon, is rooted in probability theory and is focused on distributions that are not perfectly random. Shannon's Information Theory characterizes perfect randomness as the extreme case in which the information content is maximized (i.e., there is no redundancy at all). Thus, perfect randomness is associated with a unique distribution: the uniform one. In particular, by definition, one cannot (deterministically) generate such perfect random strings from shorter random seeds.

The second theory, due to Solomonov, Kolmogorov, and Chaitin, is rooted in computability theory, and specifically in the notion of a universal language (equivalently, a universal machine or computing device). It measures the complexity of objects in terms of the length of the shortest program (for a fixed universal machine) that generates the object. Like Shannon's theory, Kolmogorov Complexity is quantitative, and perfectly random objects appear as an extreme case. However, in this approach one may say that a single object, rather than a distribution over objects, is perfectly random. Still, Kolmogorov's approach is inherently intractable (i.e., Kolmogorov Complexity is uncomputable), and, by definition, one cannot (deterministically) generate strings of high Kolmogorov Complexity from short random seeds.
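In standard notation, these two quantitative measures read as follows. The Shannon entropy of a distribution X is

    \[ H(X) \;=\; - \sum_{x} \Pr[X = x] \cdot \log_2 \Pr[X = x] , \]

which, over n-bit strings, is maximized (with value n) exactly by the uniform distribution; and the Kolmogorov Complexity of a string x, relative to a fixed universal machine U, is

    \[ K_U(x) \;=\; \min \{\, |p| \;:\; U(p) = x \,\} , \]

where, under one common convention, x is deemed random if K_U(x) \geq |x|.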
The third theory is rooted in complexity theory and is the focus of this text. This approach is explicitly aimed at providing a notion of randomness that nevertheless allows for an efficient and deterministic generation of (random) strings from shorter random seeds. The heart of this approach is the suggestion to view objects as equal if they cannot be told apart by any efficient procedure. Consequently, a distribution that cannot be efficiently distinguished from the uniform distribution will be considered as being random, or rather called pseudorandom. Thus, randomness is not an inherent property of objects (or distributions), but is rather relative to an observer and its computational abilities.

To demonstrate this approach, let us consider the following mental experiment. Alice and Bob play head or tail in one of the following four ways. In each of them, Alice flips an unbiased coin and Bob is asked to guess its outcome before the coin hits the floor. The alternative ways differ by the knowledge Bob has before making his guess.

In the first alternative, Bob has to announce his guess before Alice flips the coin. Clearly, in this case Bob wins with probability 1/2. In the second alternative, Bob has to announce his guess while the coin is spinning in the air. Although the outcome is determined in principle by the motion of the coin, Bob does not have accurate information on the motion, and thus we believe that also in this case Bob wins with probability 1/2. The third alternative is similar to the second, except that Bob has at his disposal sophisticated equipment capable of providing accurate information on the coin's motion as well as on the environment affecting the outcome. However, Bob cannot process this information in time to improve his guess. In the fourth alternative, Bob's recording equipment is directly connected to a powerful computer, programmed to solve the motion equations and output a prediction. It is conceivable that in such a case Bob can substantially improve his guess of the outcome of the coin.

We conclude that the randomness of an event is relative to the information and computing resources at our disposal. Thus, a natural concept of pseudorandomness arises: a distribution is pseudorandom if no efficient procedure can distinguish it from the uniform distribution, where efficient procedures are associated with probabilistic polynomial-time algorithms. This notion of pseudorandomness is indeed the most fundamental one, and much of this text is focused on it. Weaker notions of pseudorandomness arise as well; they refer to indistinguishability by weaker procedures, such as space-bounded algorithms, constant-depth circuits, etc. Stretching this approach even further, one may consider algorithms that are designed on purpose so as not to distinguish even weaker forms of "pseudorandom" sequences from random ones; such algorithms arise naturally when trying to convert some natural randomized algorithms into deterministic ones (see the sections on derandomization).

The foregoing discussion has focused on one aspect of the pseudorandomness question: the resources or type of the observer (or potential distinguisher). Another important aspect is whether such pseudorandom sequences can be generated from much shorter ones, and at what cost (or complexity). A natural approach is that the generation process has to be at least as efficient as the distinguisher (equivalently, that the distinguisher is allowed at least as many resources as the generator).
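To illustrate the notion of a distinguishing gap that underlies this definition of pseudorandomness, the following minimal Python sketch estimates the quantity |Pr[D(G(U_k))=1] - Pr[D(U_{2k})=1]| by sampling, for a deliberately weak toy generator and a trivial statistical test. The names toy_generator and toy_distinguisher are hypothetical stand-ins used only for illustration, not constructions from this text; a general-purpose pseudorandom generator would make this gap negligible for every probabilistic polynomial-time distinguisher, whereas for the toy generator below the gap is essentially 1.

    import secrets

    def random_bits(n: int) -> str:
        """Return n uniformly random bits as a '0'/'1' string."""
        return format(secrets.randbits(n), f"0{n}b")

    def toy_generator(seed: str) -> str:
        """Hypothetical, deliberately weak 'generator': doubles the seed by repetition."""
        return seed + seed

    def toy_distinguisher(x: str) -> int:
        """Trivial statistical test: accept iff the two halves of the input agree."""
        half = len(x) // 2
        return 1 if x[:half] == x[half:] else 0

    def estimated_gap(k: int = 32, trials: int = 10000) -> float:
        """Estimate |Pr[D(G(U_k))=1] - Pr[D(U_{2k})=1]| by sampling."""
        hits_on_pseudo = sum(toy_distinguisher(toy_generator(random_bits(k))) for _ in range(trials))
        hits_on_random = sum(toy_distinguisher(random_bits(2 * k)) for _ in range(trials))
        return abs(hits_on_pseudo - hits_on_random) / trials

    if __name__ == "__main__":
        # The estimated gap is close to 1, witnessing that the toy generator is not pseudorandom.
        print(f"estimated distinguishing gap: {estimated_gap():.4f}")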
