From Quantum Axiomatics to Quantum Conceptuality∗

Diederik Aerts1, Massimiliano Sassoli de Bianchi1,2, Sandro Sozzo3 and Tomas Veloz1,4,5

1 Center Leo Apostel for Interdisciplinary Studies, Brussels Free University, Krijgskundestraat 33, 1160 Brussels, Belgium
E-Mails: [email protected], [email protected]
2 Laboratorio di Autoricerca di Base, Lugano, Switzerland
E-Mail: [email protected]
3 School of Business and IQSCS, University of Leicester, University Road, LE1 7RH Leicester, United Kingdom
E-Mail: [email protected]
4 Instituto de Filosofía y Ciencias de la Complejidad IFICC, Los Alerces 3024, Ñuñoa, Santiago, Chile
5 Departamento Ciencias Biológicas, Facultad Ciencias de la Vida, Universidad Andres Bello, 8370146 Santiago, Chile
E-Mail: [email protected]

arXiv:1805.12122v1 [quant-ph] 29 May 2018

Abstract

Since its inception, many physicists have seen in quantum mechanics the possibility, if not the necessity, of bringing cognitive aspects into play, which were instead absent, or unnoticed, in the previous classical theories. In this article, we outline the path that led us to support the hypothesis that our physical reality is fundamentally conceptual-like and cognitivistic-like. However, contrary to the 'abstract ego hypothesis' introduced by John von Neumann and further explored, in more recent times, by Henry Stapp, our approach does not rely on the measurement problem as expressing a possible 'gap in physical causation', which would point to a reality lying beyond the mind-matter distinction. On the contrary, in our approach the measurement problem is considered to be essentially solved, at least for what concerns the origin of quantum probabilities, which we have reasons to believe are epistemic.
Our conclusion that conceptuality and cognition would be an integral part of all physical processes comes instead from the observation of the striking similarities between the non-spatial behavior of the quantum micro-physical entities and that of human concepts. This gave birth to a new interpretation of quantum mechanics, called the 'conceptualistic interpretation', currently under investigation within our group in Brussels.

Keywords: Quantum theory; Conceptuality interpretation; Quantum cognition; Extended Bloch representation; Quantum structures; Quantum probabilities; Non-spatiality

∗Submitted to a special issue of Activitas Nervosa Superior: Brain, Mind and Cognition, dedicated to Henry Stapp in honor of his 90th birthday.

1 Introduction

Following the traces left by pioneers such as Niels Bohr, Werner Heisenberg, John von Neumann and Eugene Wigner, Henry Stapp has been a staunch defender, over the years, of the idea that what quantum theory has revealed to us, among other things, is that reality cannot consist of mindless physical entities (like particles and fields), as the deterministic edifice of classical physics has led us to believe. According to Stapp, the most radical shift brought about by quantum physics is the explicit introduction of mind into the basic conceptual structure of the theory. For instance, he saw in the dimension of potentiality a signature of the fact that what reality is made of is more akin to a conceptual/cognitive stuff, formed for instance by 'ideas of what might happen', rather than to the persisting substances posited by Newtonian (pre-quantum) theories (Stapp, 2009, 2011). Our group in Brussels was also led in recent years to consider the possibility that 'the stuff our reality is made of is fundamentally conceptual'.
As for Stapp, we arrived at this view by taking very seriously the quantum formalism, in particular the fact that the two processes that quantum theory contemplates – one deterministic, described by the Schrödinger equation, and the other indeterministic, governed by the projection postulate and the Born rule – are both fundamental. The ontology that emerges from our analysis is however quite different from that presupposed by Stapp – although not necessarily incompatible with it – and it is the purpose of this article to trace the trajectory of the ideas that led us to consider this conceptualistic paradigm shift.

2 Quantum axiomatics

It began with Garrett Birkhoff & John von Neumann (1936), who instigated a domain of research that achieved a number of interesting results in the seventies and eighties of the last century. At that time, pioneers like George Mackey (1963), Josef Maria Jauch (1968), Constantin Piron (1976), David Foulis & Charles Randall (1978) and Günther Ludwig (1983) were all dedicated to the task of building an axiomatic foundation for quantum theory starting from axioms that were as operationally founded as possible, also allowing in this way for structurally more general quantum-like theories, in case some of the proposed axioms were relaxed. One of us, a student of Constantin Piron, was also thoroughly engaged in this research on quantum axiomatics in the last century (Aerts, 1982, 1986, 1999a; Aerts & Durt, 1994; Aerts et al., 1997a, 1999b), so much so that the Geneva school of quantum mechanics became also known – at least among the insiders – as the 'Geneva-Brussels school'. As we mentioned, an important aspect of these axiomatic approaches is the possibility to frame standard quantum mechanics within more general mathematical structures, able to jointly describe quantum and classical systems, understanding them as limit situations of more general 'in between quantum and classical' intermediary systems.
This allowed in particular to appreciate a key point: physical entities have properties that can be classical, quantum or intermediate, and this does not depend on whether they are microscopic or macroscopic, or immersed in a cold or hot environment. In other words, and more precisely, the realization was that what we came to call "quantum" is first of all a form of organization, which can be present in certain experimental contexts and revealed by performing suitably designed experiments.

3 Quantum machines

One of the consequences of this understanding was that, since the appearance of the quantum laws is not limited to the micro-level and to very low temperature regimes, it was possible to conceive a number of macroscopic physical systems with a quantum behavior, due to the specific way measurements were defined with respect to them. In other words, these were physical systems of a classical nature which could acquire a 'quantum-like nature' when some specific measurements, perfectly defined in operational terms, replaced the classical ones. It could indeed be proved that these macroscopic quantum-like physical systems entailed non-Kolmogorovian probability models, which made them a very valuable study object for the investigation of the possible origins of non-Kolmogorovity and quantumness (Aerts, 1986; Aerts et al., 1997b; Aerts, 1998, 1999b; Sassoli de Bianchi, 2013a,b). More specifically, these "quantum machines" were analyzed in great depth with the aim of shedding light on the origin of quantum indeterminism. It was well known, thanks to the existing no-go theorems restricting the permissible hidden-variable models (Von Neumann, 1932; Bell, 1966; Gleason, 1957; Jauch & Piron, 1963; Kochen & Specker, 1967; Gudder, 1970), that quantum probabilities could not reflect a situation of lack of knowledge about 'better defined states' of a quantum entity.
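The paradigmatic two-dimensional quantum machine (Aerts, 1986) can be pictured as follows: the state is a point on a unit sphere at angle θ from the measurement axis; the particle falls orthogonally onto an elastic band stretched between the poles −1 and +1, the elastic breaks at a uniformly distributed hidden point, and the particle is dragged to the pole it remains attached to. The following minimal Monte Carlo sketch (our own illustration; function names are ours, not from the cited papers) checks that averaging over the uniformly distributed breaking point reproduces the Born probability cos²(θ/2) of spin measurements on a spin-1/2 entity.

```python
import math
import random

def sphere_measurement(theta, u):
    """One run of the two-dimensional 'quantum machine' (sphere model).

    theta : angle between the state (a point on the unit sphere) and
            the measurement axis; the particle falls orthogonally onto
            an elastic stretched between the poles -1 and +1, landing
            at height cos(theta).
    u     : the hidden interaction variable, i.e. the point in [-1, 1]
            where the elastic breaks, assumed uniformly distributed.
    """
    landing = math.cos(theta)
    # If the break occurs below the landing point, the particle stays
    # attached to the piece ending at the +1 pole and is pulled up.
    return +1 if u < landing else -1

def probability_up(theta, n=200_000, seed=42):
    """Monte Carlo estimate of P(outcome = +1) for a state at angle theta."""
    rng = random.Random(seed)
    ups = sum(sphere_measurement(theta, rng.uniform(-1.0, 1.0)) == +1
              for _ in range(n))
    return ups / n

theta = math.pi / 3
print(probability_up(theta))        # empirical frequency, close to 0.75
print(math.cos(theta / 2) ** 2)     # Born rule: cos^2(theta/2) = 0.75
```

Note that in this model the randomness sits entirely in the measurement interaction (the breaking point) and not in the state, which is perfectly well defined before the measurement; this is precisely the kind of hidden-variable scheme the above no-go theorems do not cover.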
As a consequence, the majority of physicists were led to believe that quantum probabilities are 'ontic', and not 'epistemic', i.e., not explainable as a lack of knowledge about an objective deeper reality. But the quantum machines unveiled a completely different scenario that opened up a new possibility. Indeed, because of their macroscopic nature, and the explicit way in which measurements were defined, everything was perfectly visible about how these machines operate. What they showed is that the fluctuations at the origin of the quantum probabilities were not to be attributed to the states of the measured systems, which were perfectly well defined prior to the measurements, but to their interactions with the measuring apparatuses. The no-go theorems, however, assumed that the hidden variables were to be assigned to the states of the entities under consideration, so they no longer applied if the hidden variables were instead assigned to the measuring interactions (Aerts, 1984). More interestingly, the insights gathered from these macroscopic quantum machines allowed for the identification of a theoretical approach to derive the Born rule in a non-circular way, first for two-dimensional systems (Aerts, 1986) and then, in more recent times, for systems of arbitrary dimension, in what has now been called the 'extended Bloch representation of quantum mechanics' (Aerts & Sassoli de Bianchi, 2014, 2016, 2018a).

4 Observations and creations

The important aspect that was revealed by these studies is that in the description of a physical entity one needs to distinguish between two typologies of lack of knowledge. One is an incomplete knowledge of the actual state of the entity, and the other is an incomplete knowledge of the interactions arising between the entity and its experimental context, particularly the measuring apparatus.
Classical probabilities, obeying Kolmogorov's axioms, formalize the former kind of ignorance, whereas quantum probabilities, not obeying Kolmogorov's axioms, formalize the latter; more precisely, they correspond to the limit situation where there is complete knowledge of the state of the entity but a maximal lack of knowledge about the interaction