Is Glasersfeld's Constructivism a Dangerous Intellectual Tendency?
Riegler, A. (2007) Is Glasersfeld's constructivism a dangerous intellectual tendency? In: Glanville, R. & Riegler, A. (eds.) The importance of being Ernst. Echoraum: Vienna, pp. 263–275.

Alexander Riegler

Paper type: Conceptual paper
Approach: philosophical–epistemological
School: Radical Constructivism

Purpose: Radical Constructivism (RC) has been subject to extensive criticism and denigration, for example the claim that it is a naturalized biologism which supports an “anything goes” philosophy of arbitrarily constructed realities. In an extreme case, RC is equated with intellectual silliness. These accusations are to be refuted.

Approach: Based on the concept that cognition can work only with experiences, we investigate the question of where their apparent order comes from. Arguments are presented that favor the amorphousness of the “external” world. To support the idea of “internal” order, we review results in formal network research.

Findings: The properties of networks suggest that order arises without influence from the outside.

Conclusions: RC based on network models (a) does not need any empirical support and is therefore neither a biologism nor a naturalism, (b) forgoes arbitrariness, and (c) goes beyond narrative (armchair) philosophy.

Key words: amorphousness, naturalism, closure, reality, random Boolean networks, scale-free networks, small-world networks.

INTRODUCTION

Radical Constructivism (RC) has been subject to extensive criticism and denigration. The most frequent accusations include the allegation of simply repeating old (and often outdated) philosophical positions. RC is reproached for being: a naturalized biologism which refutes itself as it gets caught in argumentative circles; a form of extreme idealism that unavoidably results in solipsism; an “anything goes” philosophy claiming that realities are arbitrarily constructed, which grants all constructions the same standing irrespective of whether they are science or voodoo; or a flavor of postmodernism with applications to literature and education only. Probably the fiercest statement reads, “I have a candidate for the most dangerous contemporary intellectual tendency, it is … constructivism … Constructivism attacks the immune system that saves us from silliness” (Devitt 1991). This is a serious accusation. The author is right to be concerned about threatening tendencies in the academic world. However, he is wrong about constructivism. It is my objective to refute these claims and to present RC as a streamlined, uniform discipline.

THE WORLD AS A BLACK BOX

For Ernst von Glasersfeld, searching for knowledge can be compared with the situation of a subject who is facing a black box and who is trying to read sense into its behavior. “[The world] is a black box with which we can deal remarkably well.” (Glasersfeld 1974/2007, p. 81). In his view, experience “basically consists of signals… [any] representation of an ‘outside reality’ will necessarily be based on such regularities as they can establish in the experienced signal sequences… any representation of the outside reality will be a model of an inaccessible black box in which the input, registered as effector signals, is systematically related to the output, registered as receptor signals” (Glasersfeld 1979, p. 79).[1]

[1] The concept of the black box was extensively discussed by Ashby (1956). See also Glanville (1982).

Central to Glasersfeld’s theory are recognition patterns that enable us to recognize experiential sequences.
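The epistemic situation Glasersfeld describes can be illustrated with a short computational sketch. The following Python fragment is a minimal illustration of the idea only; the box function and all names in it are invented stand-ins, not taken from Ashby or Glasersfeld. The point it makes is that the observer’s “model” consists of nothing but recorded regularities between outgoing and incoming signals, and that such a model can predict perfectly without any access to the box’s inside.

```python
# Minimal sketch of reading sense into a black box. The observer never
# inspects the mechanism; it only pairs effector signals (what it sends)
# with receptor signals (what it gets back). The box below is an
# arbitrary stand-in invented for this illustration.

def black_box(effector_signal):
    """The inaccessible mechanism; the observer cannot see this code."""
    return (3 * effector_signal + 1) % 7

# The observer probes the box and records the experienced regularities.
observations = {}
for signal in range(7):
    observations[signal] = black_box(signal)

# The "model" is nothing but these recorded signal regularities ...
model = dict(observations)

# ... yet it predicts the box's behavior perfectly, while entailing
# nothing about what the box "really" is.
assert all(model[s] == black_box(s) for s in range(7))
print("the model deals with the box remarkably well")
```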
The construction of reality is based on the recurrent extraction of repetitive patterns from the stream of experience. That we succeed in recognizing patterns says nothing about any ontological existence of these patterns, so even “if we posit causes for the sense data […], this does in no way entail that these causes exist in the spatio-temporal or other relational structures into which we have coordinated them” (ibid. p. 82).

The relationship between experience and regularity was explored by, among others, Ernst Mach (1912/1960). In his concept of the “economy of thoughts” he emphasized the importance of compressing experiences into laws. He wrote that it “is the object of science to replace, or save, experiences, by the reproduction and anticipation of facts in thought. Memory is handier than experience, and often answers the same purpose” (p. 577). Likewise, Herbert Simon defined science as pattern recognition. He characterized the “discovery” of laws as “detecting the pattern information contained in the data, and using this information to recode the data in more parsimonious form” (Simon 1973, p. 479). For Gerald Weinberg science can only proceed by simplification: “Newton’s genius was … his ability to simplify, idealize, and streamline the world so that it became, in some measure, tractable to the brains of perfectly ordinary men” (Weinberg 1972/1991, p. 505). Therefore, laws are not only convenient and economical, they are indispensable for scientific thinking. This idea culminates in Ross Ashby’s claim, “The systems theorist of the future, I suggest, must be an expert in how to simplify” (Ashby 1964/1991, p. 510).[2]

[2] I am grateful to Ranulph Glanville for drawing my attention to Weinberg and Ashby.

However, one has to proceed carefully with the idea of compressing experiences into laws. For example, the physicist Roman Sexl (1983) showed that the hollow earth theory forms a complete alternative physical theory when certain formal inversions are applied to conventional Newtonian physics. The two theories – living on the surface of planet Earth and living on its inside – may appear to be in extreme opposition to each other, but both can be reconciled with the physical data obtained in experiments. The hollow earth theory is therefore a coherent scientific world view; in a sense, it is a mathematically equivalent version of our usual world view, and the two can be mapped onto each other. It is up to the metaphysical criteria of science to decide which of them is “truer” than the other. Such criteria include simplicity, clarity, and beauty (McAllister 1996), which strictly speaking have nothing to do with the physical experiments forming the basis of either theory.

In the philosophy of science this is known as the empirical underdetermination of theories, which says that there is an asymmetry between observable facts (which can be described in propositions) and postulated laws. As Pierre Duhem (1906) claimed, there is a practically infinite number of possible laws that can be extracted from a given data set. Starting from Glasersfeld’s assumption that knowledge can be characterized as the isolation of repetitive patterns from the totality of sense data, the underdetermination of theories lets us arrive at the conclusion that there must be an infinite number of possible meaningful entities that can be constructed from the stream of experience. Glasersfeld notes, “[t]here may, indeed, be countless ways of operating and arriving at coherent structures that are no less recurrently imposable on our stream of experience than the ones we have come to construct.” (Glasersfeld 1974/2007, p. 82).
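Duhem’s point can be made concrete with a toy computation. In the sketch below (the data points and the family of “laws” are invented for this illustration), a one-parameter family of mutually incompatible laws reproduces the same finite data set exactly: the laws agree on every observation and disagree everywhere else.

```python
# Toy illustration of the underdetermination of laws by data: infinitely
# many distinct "laws" compress the same finite data set exactly.
# Data points are invented for this example.

import numpy as np

xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([1.0, 2.0, 5.0, 10.0])   # happens to satisfy y = x**2 + 1

def make_law(c):
    """Each law adds to y = x**2 + 1 a term that vanishes at every
    observed x, so no observation in the data set can tell them apart."""
    return lambda x: x**2 + 1 + c * x * (x - 1) * (x - 2) * (x - 3)

for c in (0.0, 1.0, -2.5, 100.0):
    law = make_law(c)
    assert np.allclose(law(xs), ys)    # every member fits the data exactly

# Off the observed data, the laws diverge arbitrarily:
print(make_law(0.0)(1.5), make_law(100.0)(1.5))
```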
In view of the underdetermination of laws, we face the question of the origin of the order that we observe in our reality. Let us first explore the traditional account according to which order comes from the “outside.”

EXTERNAL ORDER

One of the central assumptions of naturalism is that the world is structured. Even constructivists like Gerhard Roth assure us that “of course, nobody has serious doubts about the fact that the brain- and consciousness-independent world is ordered” (Roth 1996, p. 365, my translation).

This genuinely metaphysical assumption about the inherent ontological structure at the basis of (scientific) data sets has been challenged, among others, by James McAllister. He claims that “[a]ny given data set can be interpreted as the sum of any conceivable pattern and a certain noise level. In other words, there are infinitely many descriptions of any data set as ‘Pattern A + noise at m percent’, ‘Pattern B + noise at n percent’, and so on, ranging over all conceivable patterns” (McAllister 1997, pp. 219–220). (Here, the notion “noise” refers, in an information-theoretical sense, to the purely mathematical discrepancy between a certain pattern and a given data set.) Therefore all laws and patterns that can be read into a data set have the same status, which means that all of them match the structures of the “world” to the same degree. The statement that the world encompasses all possible structures is equivalent to the claim that it does not contain any structure and is, therefore, amorphous, i.e., structureless (McAllister 1999).

However, if the world is amorphous, doesn’t this mean that all scientists (can) do is cut out arbitrarily sized clusters of phenomena from a given collection of sense data? McAllister, too, points to this problem. He writes, “a scientific law or theory provides an algorithmic compression not of a data set in its entirety … but only of a regularity that constitutes a component of the data set and that the scientist picks out in the data” (McAllister 2003, p. 644). In this sense scientific laws are algorithmic compressions of regularities that partially cover a given data set. These regularities – or patterns – compete with each other (e.g., the traditional model and the hollow earth theory), and scientists must choose which pattern they prefer over the others.

How is this decision making carried out? How do scientists determine which observational features or experimental results are trustworthy and relevant evidence of the investigated phenomenon? For example, Johannes Kepler, facing a huge amount of observational data about the passage of the planets across the sky, needed many years to choose the proper data that led him to the formulation of his first law of planetary motion, according to which planets revolve around the sun in ellipses (rather than circles or any other geometrical shape Kepler had taken into consideration in those years). As soon as he had isolated the appropriate data sets, the formulation of the law fell into place, whereas before, other data sets had obscured it.
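McAllister’s “pattern + noise” reading can be sketched numerically. In the fragment below (all numbers and candidate patterns are invented for this illustration), every conceivable pattern yields an exact decomposition of the same data set, and only the resulting noise level distinguishes the candidates, much as Kepler had to choose among candidate shapes for the planetary orbits.

```python
# Sketch of McAllister's claim: any data set equals *any* chosen pattern
# plus noise; only the noise level differs between the decompositions.
# Data and candidate patterns are invented for this illustration.

import numpy as np

data = np.array([1.1, 2.0, 4.9, 10.2, 16.8])

candidates = {
    "quadratic": np.array([1.0, 2.0, 5.0, 10.0, 17.0]),
    "linear":    np.array([1.0, 5.0, 9.0, 13.0, 17.0]),
    "constant":  np.full(5, float(np.mean(data))),
}

for name, pattern in candidates.items():
    noise = data - pattern                 # the decomposition is exact
    level = 100 * np.sqrt(np.mean(noise**2)) / np.sqrt(np.mean(data**2))
    print(f"data = {name} pattern + noise at {level:.1f} percent")

# The data alone license every decomposition; preferring one pattern
# (as Kepler preferred the ellipse) is a choice the scientist makes.
```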