
B. B. Mandelbrot: Interdisciplinary Science Reviews, 12, 1987, 117-127

Towards a second stage of indeterminism in science

Benoit Mandelbrot

When the International Congress for Logic, Methodology, and the Philosophy of Science was held in Jerusalem, in September 1964, I delivered an invited address titled "The Epistemology of Chance in Certain Newer Sciences." But I hardly tried to prepare a text for the Proceedings, and for many years I kept resisting friendly suggestions – notably by the Berkeley molecular biologist Gunther S. Stent – that the draft be reworked, completed, and printed. One reason was that success kept eluding repeated attempts to state a technical point while also making clear its philosophical implications. But it is good to see the old text published at long last. It has been substantially edited for style and shortened, but not otherwise modified, and it is preceded by a few pages of miscellaneous observations, which have been recast in the form of a dialogue. References were updated in 2002.

1. Reflections from the perspective of 1987 on a premature fractal manifesto written in 1964

While the word fractal did not appear until 1975, this 1964 draft was important in the gestation of fractal geometry, an interdisciplinary enterprise I conceived in 1964, then developed. I have devoted to it almost all my creative life.

Question: Why should this old text be of historical interest today?

BBM: It occupies a critical position along the tortuous path that eventually led to fractals. This, and the fact that this text appears in a journal called Interdisciplinary Science Reviews, seems to call for a few philosophical and autobiographical comments.

My research career, which must be described as "improbable," was triggered by a casual side interest in diverse isolated empirical regularities that everyone else viewed as of little consequence. As I look back, my life divides into three well-separated periods. A period of gestation started with my PhD thesis in 1952 and lasted until 1964. Jumping ahead, the third period, which started in 1975, witnessed consolidation and increasingly broad, rapid, and smooth development, marked by books that do seem to involve an effective mix of technique and philosophy. Fractal geometry has the special charm of allowing uninterrupted interplay between concrete fields (ranging from widely practiced ones to the very obscure) and sophisticated pure mathematics. It has been successful as mathematics. In fact, it has shamed the iconoclastic tradition that ran from Laplace to Bourbaki by stimulating or reviving several mathematical theories; it has become a widely used tool in the description of nature and in the wide search for order in chaos; and finally, fractal art is now becoming widely admired as art, irrespective of its unusual origin.

The middle period lasted from 1964 to 1975. From the viewpoint of fractals' development, it was in many ways the most interesting, but from a personal viewpoint it was the most frustrating. This period was punctuated by successive fractal manifestos, the most notable ones having been a 1972 lecture at the Collège de France in Paris, which followed a Trumbull Lecture at Yale in 1970, and the even earlier 1964 Jerusalem lecture with which we deal here.

When chance or duty makes me reread this and other unpublished texts of the middle period, I am surprised at the precision and clarity given to many ideas that were not fully worked out until much later in my life. But my style failed to encourage the reader to plow through papers that had already acquired the reputation of advancing very disturbing ideas. It is useful, therefore, to state at this point one basic idea of fractals.

Why is school geometry so often described as "cold" and "dry"? One reason is this geometry's inability to tell what shape a cloud is, or a mountain, or a coastline. Clouds are not spheres, mountains are not cones, coastlines are not circles; more generally, man's oldest questions concerning the shape of this world were left unanswered by Euclid and his successors, who concerned themselves exclusively with an unrealistically orderly universe. In order to achieve a handle on nature, a radically different geometry is needed, one that must contradict many old ideas that have become so familiar as to seem obvious and universally valid. However, to negate these ideas completely would be self-defeating, because it would replace excessive order with utter chaos. Fractal geometry is a new and very different broad area of order within the domain of the old chaos. Some fractals imitate the mountains and the clouds, while others are wild and wonderful new shapes. More generally, the new fractal world is in some cases hard to tell from the real one, and in other cases it is of fantastic and surprising beauty.

Question: Is there any relation between second-stage indeterminism and chaotic fluctuations?

BBM: The conventional wisdom has long been that the study of the weather and of economics is harder than the study of perfect gases, but will eventually use the same tools to achieve the same degree of perfection. To the contrary, my work suggested a profound qualitative distinction between the underlying fluctuations, and as a result the theories of the corresponding phenomena were bound to differ sharply. On the one hand, the fluctuations that characterize the theory of gases should be viewed as "mild," and the first stage of indeterminism in science was comparatively easy because of their being mild. On the other hand, the facts already established by 1964 indicated that the fluctuations of the weather and of prices were "wild." I used to use "erratic," an ill-chosen Latin word that did not last. My work invited the sciences to move on to a second stage of indeterminism.

How was this invitation received? Certainly not to my satisfaction! On the one hand, many influential scholars considered my discoveries to be potentially important, and offered me a number of renowned pulpits from which to present them. Yet, until 1975, they were called controversial. In fact, they provoked little discussion, pro or con, that would justify their being called controversial. They failed to affect the work of the numerous, diverse, distinguished, and often well-disposed people who heard me. To use a term favored by Stent, my work suffered from being "premature."

Question: You have said that, in your work, a growing role is played by sophisticated graphics, dear to a geometer's and an artist's eye. Could you elaborate?

BBM: Being premature is particularly painful when one's whole scientific work has been interdisciplinary. Thus, it is unusual indeed that fractal geometry managed to survive and to become part of the mainstream without having first to be forgotten and later rediscovered by others when its time came. Why did its time come after 1975, but not before? We cannot be sure, except that an essential role has clearly been played by computer graphics – of which I became a pioneer by necessity.

Mention of Stent's paper necessarily brings forth a thought concerning the issue of uniqueness in scientific discovery. Indeed, Stent draws our attention to the (hostile) review that the biochemist Erwin Chargaff wrote of The Double Helix by James D. Watson. In that review, we read that "Timon of Athens could not have been written, Les Demoiselles d'Avignon could not have been painted, had Shakespeare and Picasso not existed. But of how many scientific achievements can this be claimed? One could almost say that, with very few exceptions, it is not the men that make science, it is science that makes the men. What A does today, B or C or D could surely do tomorrow."

This may be true of many of the individual strands of fractal geometry. But fractal geometry is not merely a juxtaposition of its individual strands. It arose as an integrated whole, ruled by a philosophy that was conceived and developed under conditions that – for better or worse – were greatly affected by my peculiar life story. Would another individual, or some collectivity, have reached the same philosophy and built the same whole? A worthy question for the future, assuming that this whole actually survives.

2. Text of the Premature Fractal Manifesto of 1964

Since the turn of the century, acceptance of indeterministic stochastic theories in science has spread spectacularly. A new epistemology has arisen as a result, superseding the epistemology built upon deterministic causal theories. In certain areas of physics, the new approach was rapidly and strikingly successful, for example in the study of thermal fluctuations in gases and in solids. Elsewhere, progress has turned out to be slow, and the fulfillment of high initial expectations is continually postponed. Such is the case in meteorology and in most of economics. The present paper proposes to trace this difference to the existence of a deep qualitative contrast between the nature of the observed fluctuations in the "developed" and the "less developed" sciences.

2.1. Differences in scientific development

It is often asserted that differences in development between sciences are solely due to differences of "age," as measured from the earliest systematic investigation of the different topics. I disagree. Indeed, probability theory saw its first triumphs in the natural sciences, but first arose in the study of the statistical problems raised by economic-psychological phenomena. In the hands of Laplace (circa 1800), a probabilistic view of social science and an arch-deterministic view of physics had reached a high point at the same time. Even as late as 1912, statistical social science could still be presented as a model to be followed by statistical physics. Similarly, in the works of Boussinesq (1872) and Osborne Reynolds (1895), the statistical concept of turbulence in fluids was roughly contemporary with Maxwell's and Boltzmann's (1866) kinetic theories of gases. But stochastic theories dashed forward in the study of gases, while they still lag in the study of turbulence.
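The qualitative contrast at issue here – "mild" fluctuations, for which averaging rapidly stabilizes, versus "wild" ones, for which it does not – is easy to exhibit numerically. The sketch below is my illustration, not part of the original text; it uses the standard Cauchy distribution (whose population mean does not exist, and which the manifesto's later nod to Cauchy in 1853 evokes) as a stand-in for a wild distribution:

```python
import math
import random
import statistics

def sample_means(draw, n, reps, rng):
    """Return `reps` independent sample means, each averaging `n` draws."""
    return [sum(draw(rng) for _ in range(n)) / n for _ in range(reps)]

def iqr(values):
    """Interquartile range: a spread measure that exists even for Cauchy data."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    return q3 - q1

rng = random.Random(42)
n, reps = 10_000, 200

# "Mild": Gaussian draws. The law of large numbers applies, so the
# spread of sample means shrinks like 1/sqrt(n).
gauss_spread = iqr(sample_means(lambda r: r.gauss(0.0, 1.0), n, reps, rng))

# "Wild": standard Cauchy draws, generated by inverting the Cauchy CDF.
# The mean of n such draws is again standard Cauchy, so averaging
# 10,000 observations leaves the spread of sample means unchanged.
cauchy_spread = iqr(sample_means(
    lambda r: math.tan(math.pi * (r.random() - 0.5)), n, reps, rng))

print(f"spread of Gaussian sample means: {gauss_spread:.4f}")
print(f"spread of Cauchy sample means:   {cauchy_spread:.4f}")
```

With these settings the Gaussian spread comes out on the order of 0.01, while the Cauchy spread remains of order unity: first-stage reasoning works for the former and silently fails for the latter.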

2.2. Articulation

Before proposing an explanation of this difference in fate, it is good to recall that an "articulation" is at the root of many statistical theories: small systems combine into big systems, and one is interested in a statistical theory that applies to the latter and is based on a limit theorem of probability. In the case of thermal fluctuations, the small systems have physical reality but are microscopic, that is, inaccessible to human perception. Only the large systems are on man's spatial and temporal scale. Moreover, the following ideas are held to be true:

• The details of the microsystems have no effect on the macrosystems, and brutal approximations concerning the structure of the former do not affect the effectiveness of a macroscopic theory.

• The "classical central limit theorem" is applicable. A fortiori, the "law of large numbers" is applicable.

Let us recall the meaning of the terms used in the second statement. As applied to temporal means, the classical central limit theorem holds that T^(-1/2) Σ_{t=1..T} [X(t) − E(X)] is approximately Gaussian for large T. As applied to means over large numbers of systems, it states that N^(-1/2) Σ_{n=1..N} [X(n) − E(X)] becomes approximately Gaussian for large N. The (strong) law of large numbers states that there is a probability equal to one that, for increasingly long samples, T^(-1) Σ_{t=1..T} X(t) → E(X) as T → ∞, and, for increasingly large assemblies, N^(-1) Σ_{n=1..N} X(n) → E(X) as N → ∞.

First-stage indeterminism has the virtue of being closely related to causal theories. When it prevails, successful statistical theories can be constrained so that a "correspondence principle" holds: the ensemble trend E[N^(-1) Σ_{n=1..N} X(n)], or the temporal trend E[T^(-1) Σ_{t=1..T} X(t)], may be made to match the predictions of an approximating deterministic-causal theory. In thermodynamics, for example, the additional information provided by the kinetic theory is an important but detailed correction, an "error term," a "fluctuation around an equilibrium state."

It is usually felt that this correspondence principle is obvious, and that scientists' universal reliance upon the law of large numbers and the classical central limit theorem requires no special justification. At best, a scientist may occasionally observe that the conditions of validity of these theorems are so undemanding, or weak, that they are overwhelmingly likely to be verified. But natural science exhibits very few cases (if any) where the validity of these conditions is rigorously reduced to basic physical laws. Usually, their validity is listed as a kind of phenomenological principle that happens to be remarkably effective.

2.3. Less-developed sciences and articulation

With this in mind, consider less developed statistical theories that also involve a clear-cut articulation. Here is a first main point. My investigations lead me to believe that the less developed sciences are precisely those for which the classical central limit theorem or even the law of large numbers fails to hold. Does this belief imply that statistical techniques become helpless? It does not, by any means. However, and this is my second main point, the new models will necessarily differ in kind from the old ones. In other words, they will usher a new stage of indeterminism into science. The change will affect not only the details of the answers but the very characterization of what makes a question well-posed, or capable of being answered, and hence worth asking.

There are several possible reasons why the classical limit theorems may fail to hold, and a corresponding variety of "kinds" of new statistical theories. No fallacious unity is therefore implied by referring to the aggregate of these theories as constituting a new second-stage indeterminism. Also, this last term does not exclude the possibility that indeterministic theories may lie between the causal and the first-stage indeterministic theories, rather than beyond the latter.

Anticipating briefly questions to be discussed below, we may note that second-stage indeterministic models may be avoided by giving up the concept of statistical stationarity. If we do so, however, there could be no theory, and this would be a very poor bargain.

2.4. Possible reasons for failure of first-stage indeterminism

Even when the random quantities X(t) or X(n) are statistically independent, first-stage indeterminism fails when the distribution of X(t) is excessively "long-tailed" (that is, there is a very large probability of X being very large, for example exceeding 4 or 10 times the interquartile range). Let the distribution of X be made increasingly fat-tailed. Sufficiently fat tails cause the population variance to become infinite, and the classical central limit theorem necessarily becomes invalid. Later on, the population mean itself fails to converge, and the law of large numbers becomes invalid as well.

The mathematicians' search for interesting "pathologies" conceived these possibilities long ago, at least since Cauchy in 1853. But it is only recently that my work showed that these possibilities are not pathological, but practical and even indispensable. Examples occur in economics: Pareto's law of income distribution; the variation of speculative prices; the problem of industrial concentration; and so on. Examples also occur in physics, among them the flow of water from Lake Albert into the Nile River; this is also the context in which one ought to re-examine the distribution of the energy of primary cosmic rays.

There is a second possible reason for first-stage indeterminism to fail. Even in cases where the distribution of X(t) itself is short-tailed (for example, Gaussian or even bounded), first-stage indeterminism fails when the intensity of the interdependence between X(t) and X(t+T), as measured by a correlation, decreases very slowly as T → ∞. Indeed, as the span of interdependence lengthens, the classical central limit theorem eventually fails.

S_N being a sum of N independent variables X(n), consider the relative contribution to S_N coming from the largest among the X(n). When the expectation of X is zero and S_N/N^(1/2) tends to the Gaussian as N → ∞, a theorem says that the relative contribution of the largest among the X(n) vanishes asymptotically. When S_N/N^(1/2) does not tend to the Gaussian, the situation is sharply different: as N → ∞, the relative contribution of the largest X(n) may tend (in statistical law) to a nonvanishing limit. In such cases, the few largest contributions to S_N stick out, and there is a strong temptation to censor them a posteriori, calling them outliers, and to use first-stage indeterministic analysis to study what is left.

2.5. Permanence of second-stage indeterminism

The features described in the preceding two paragraphs both raise a question. How final or permanent may one expect the second-stage indeterministic models to be? Classically, of course, there have been at least three distinct views of the roots of indeterminism:

A. Some stochastic models are held to be irreducible; this is the Copenhagen school's view of quantum theory.

B. Other stochastic models are held to describe the state of ignorance of some observer. For Laplace, once the past is fully known, so, potentially, will be the future. If so, the observer's state of ignorance will eventually reduce, more or less thoroughly, to an interplay of causal relationships. As was already mentioned, the usual argument holds that, when these causes are very numerous, and each contributes negligibly to the whole, one should expect the whole to be ruled by first-stage indeterminism.

C. The third classical view is favored by some extremist historians, who claim that the past can be of interest only for its own sake, and not as a basis of forecasting. In particular, statistical regularities in past records can be of no predictive interest.

Turn now to problems accessible to second-stage indeterministic models. They do fall between Laplace's conception of physics and the extremist's conception of history. Therefore, they introduce a fourth possibility.

D. One cannot always expect the causal reduction of fluctuations described under B to be possible. It may, however, be possible to reduce a second-stage indeterministic model to a mixture of the causal and the first-stage indeterministic approaches.

Reductions of this kind are reassuring, because one has learned how to live with the two classical approaches. From the viewpoint of historical description, they are feasible ex post facto and are obviously a useful tool. In other contexts, they may also help classify and name the parts of the Unknown, an activity whose performance is known to procure a feeling of power.

As a comment upon this last feature, observe that ordinary vocabulary is biased against terms that denote entities with a very long-tailed statistical distribution. For example, the size distribution of "human agglomerations" is much longer-tailed than those of "cities," "towns," or "villages," considered separately. Do the distinctions between city, town, and village also have other foundations, more intrinsic than the desire to avoid a long-tailed distribution? However, similar distinctions often lack other motivation. If so, Descartes' precept, "subdivide the difficulties into parts," may be very dangerous. Common sense and the old informal science embedded in the vocabulary both involve a preliminary processing that risks disfiguring the results of experience. In areas where second-stage indeterministic models are indicated, examples abound where such faux pas are to be feared. For example, the flow of water alternates between being laminar and turbulent; hence the temptation to try to study the two kinds of regimes separately. In fact, my work suggests that the study of the whole natural flow may be simpler than that of its turbulent inserts taken alone. Similarly, in studying economic records, it may very well be preferable to avoid the temptation to attack periods of crisis separately, as if the economy changed in kind during the major depressions and booms.

To decide between alternatives A and D is a task that a scientist must not face on grounds of a priori epistemological preferences. Most scientists, however, agree with the Copenhagen view of quantum theory that, as long as A has not been disproved by explicit construction, it is pragmatically unattackable. Most will also agree that one may conceive of cases when A has been disproved, but the best rules of behavior continue to be the same as if A held true. For example, the engineer's attitude towards thermal fluctuations is hardly affected by the fact that they are ultimately explainable by the kinetic theory of gases. A fortiori, it is likely in many fields that the best action will long continue to be based upon a second-stage indeterministic theory, without concern about whether it may theoretically be reducible.

2.6. Meaningfulness of hierarchical descriptions

An important special issue is the problem of the meaningfulness of hierarchical description. I have found that samples generated by a second-stage indeterministic model often seem "stratified" or "hierarchical" even though no hierarchy had been built into the generating model. For example, generate an economic time series by a stationary random process with a continuous spectrum and a smooth spectral density with no local maximum but a pole at zero frequency. Usually the sample functions of such a process seem to exhibit long Kondratieff-like cycles that recall ordinary business cycles, and so on down to short-period wiggles much like the speculative fluctuations. A similar process engenders stars that group into galaxies, then into clusters, and then into superclusters of galaxies, and so on. Obviously, the presence of such striking patterns in "forgeries" engendered by processes with no built-in hierarchy has far-reaching consequences.