
Physica A 348 (2005) 505–543
www.elsevier.com/locate/physa

Econophysics: from Game Theory and Information Theory to Quantum Mechanics

Edward Jimenez (a,b,c,*), Douglas Moya (d)

(a) Experimental Economics, Todo1 Services Inc., Miami, FL 33126, USA
(b) GATE, UMR 5824 CNRS, France
(c) Research and Development Department, Petroecuador, Paul Rivet E11-134 y 6 de Diciembre, Quito, Ecuador
(d) Physics Department, Escuela Politecnica Nacional, Ladrón de Guevara E11-253, Quito, Ecuador

* Corresponding author. Research and Development Department, Petroecuador, Ecuador; Experimental Economics, Todo1 Services, Miami, USA. E-mail address: [email protected] (E. Jimenez).

Received 11 March 2004; received in revised form 20 September 2004; available online 27 October 2004.
doi:10.1016/j.physa.2004.09.029

Abstract

Rationality is the universal invariant shared by human behavior, the physical laws of the universe, and ordered, complex biological systems. Econophysics is both the use of physical concepts in Finance and Economics and the use of Information Economics in Physics. In particular, we will show that it is possible to obtain the principles of Quantum Mechanics using Information Theory and Game Theory.
© 2004 Elsevier B.V. All rights reserved.

PACS: 02.30.Cj; 02.50.Le; 03.65.Ta

Keywords: Rationality; Optimal laws; Quantum mechanics laws; Energy; Information

1. Introduction

At the moment, Information Theory is studied or utilized from multiple perspectives (Economics, Game Theory, Physics, Mathematics, and Computer Science). Our goal in this paper is to present the different approaches to information, to select the main ones, and to apply them in the Economics of Information and in Game Theory.

Economics and Game Theory¹ are interested in the use of information and in the ordered state, in order to maximize the utility functions of the rational² and intelligent³ players who are part of a conflict of interest. The players gather and process information. The information can be perfect⁴ and complete⁵; see Refs. [1–4] and (Sorin, 1992). On the other hand, Mathematics, Physics and Computer Science are all interested in the representation of information, in entropy (a measure of disorder), in the optimality of physical laws, and in the internal order of living beings; see Refs. [5–7]. Finally, information is stored, transmitted and processed by physical means. Thus, the concepts of information and computation can be formulated not only in the context of Economics, Game Theory and Mathematics, but also in the context of physical theory. Therefore, the study of information ultimately requires experimentation and multidisciplinary approaches such as the introduction of the Optimality Concept [8].

¹ Game Theory can be defined as the study of mathematical models of conflict and cooperation among intelligent, rational decision-makers.
² A decision-maker is rational if he makes decisions in pursuit of his own objectives. It is normal to assume that each player's objective is to maximize the expected utility value of his own payoff.
³ A player in the game is intelligent if he knows everything that we know about the game and he can make inferences about the situation that we can make.
⁴ Perfect information means that at each time only one of the players moves, that the game depends only on their choices, that they remember past information (utilities, strategies), and that in principle they know all possible futures of the game [1].
⁵ In games of incomplete information the state of nature is fixed but not known to all players. In repeated games of incomplete information, what changes over time is each player's knowledge about the other players' past actions, which affects his beliefs about the (fixed) state of nature. Games of incomplete information are usually classified according to the nature of the three important elements of the model, namely players and payoffs (within two-person games: zero-sum games and non-zero-sum games), prior information on the state of nature (within two-person games: incomplete information on one side and incomplete information on two sides), and signaling structure (full monitoring and state-independent signals) [2,3,37].

The Optimality Concept is the essence of the economic and natural sciences [9,10]. Economics introduces the optimality concept (maximum utility and minimum risk) as the equivalent of rationality, while Physics understands the principle of least action and maximum entropy (maximum information) as the explanation of the laws of nature [11,36]. If the two sciences have a common backbone, then they should allow certain analogies and share other elements such as: equilibrium conditions, evolution, uncertainty measurement and the entropy concept.
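To make the notion of rationality as expected-utility maximization (footnote 2) concrete, the following minimal sketch, our own illustration and not part of the paper, evaluates the expected utility of a few candidate mixed strategies against a fixed opponent mixture in a hypothetical 2x2 game; the payoff matrix and probabilities are assumptions chosen only for this example.

```python
# Illustrative sketch (not from the paper): rationality as expected-utility maximization
# in a hypothetical 2x2 game.
import numpy as np

# Hypothetical payoff matrix for the row player: U[i, j] is the payoff when the
# row player uses pure strategy i and the column player uses pure strategy j.
U = np.array([[3.0, 0.0],
              [1.0, 2.0]])

def expected_utility(p, q, payoff):
    """Expected payoff when the row player mixes with p and the column player with q."""
    return p @ payoff @ q

q = np.array([0.4, 0.6])                 # opponent's (fixed) mixed strategy
candidates = [np.array([1.0, 0.0]),      # pure strategy 1
              np.array([0.0, 1.0]),      # pure strategy 2
              np.array([0.5, 0.5])]      # an even mixture

# A rational row player picks, among these candidates, the one with the highest expected utility.
for p in candidates:
    print(p, expected_utility(p, q, U))
best = max(candidates, key=lambda p: expected_utility(p, q, U))
print("best response among candidates:", best)
```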
In this paper, the contributions of Physics (Quantum Information Theory)⁶ and Mathematics (Classical Information Theory)⁷ are used in Game Theory and Economics, making it possible to explain mixed-strategy Nash equilibria using Shannon's entropy [12–15]. According to [16, p. 11], "quantities of the form $H = -\sum p_i \log p_i$ play a central role in information theory as measures of information, choice and uncertainty. The form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics, where $p_i$ is the probability of a system being in cell i of its phase space; ..."

⁶ Quantum Information Theory is fundamentally richer than Classical Information Theory, because quantum mechanics includes so many more elementary classes of static and dynamic resources—not only does it support all the familiar classical types, but there are entirely new classes, such as the static resource of entanglement (correlated equilibria), making life even more interesting than it is classically.
⁷ Classical Information Theory is mostly concerned with the problems of sending classical information—letters in an alphabet, speech, strings of bits—over communication channels which operate within the laws of classical physics. "The key concept of Classical Information Theory is the Shannon entropy. Suppose we learn the value of a random variable X. The Shannon entropy of X quantifies how much information we gain, on average, when we learn the value of X. An alternative view is that the entropy of X measures the amount of uncertainty about X before we learn its value. These two views are complementary; we can view the entropy either as a measure of our uncertainty before we learn the value of X, or as a measure of how much information we have gained after we learn the value of X." [15, p. 500].
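The quantity H quoted above is easy to evaluate numerically. The following minimal sketch, again our own illustration under the usual convention of base-2 logarithms, computes the Shannon entropy of a few mixed strategies: a pure strategy carries no uncertainty, while the uniform mixture over two pure strategies carries exactly one bit.

```python
# Illustrative sketch (not from the paper): Shannon entropy H = -sum_i p_i log2 p_i
# of a mixed strategy, the uncertainty measure quoted above from [16].
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a probability vector p (zero probabilities contribute nothing)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(0.0 - np.sum(nz * np.log2(nz)))

print(shannon_entropy([1.0, 0.0]))        # pure strategy: 0 bits of uncertainty
print(shannon_entropy([0.5, 0.5]))        # evenly mixed strategy: 1 bit (maximum for 2 strategies)
print(shannon_entropy([0.7, 0.2, 0.1]))   # a generic mixed strategy over 3 pure strategies
```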
In Quantum Information Theory, correlated equilibria in two-player games mean that the associated probabilities of each player's strategies are functions of a correlation matrix. Entanglement, which according to the Austrian physicist Erwin Schrödinger is the essence of Quantum Mechanics, has long been known to be the source of a number of paradoxical and counterintuitive phenomena. Of those, the most remarkable is the phenomenon usually called non-locality, which is at the heart of the Einstein–Podolsky–Rosen (EPR) paradox; see Ref. [17, p. 12]. Einstein et al. [18] considered a quantum system consisting of two particles separated by a long distance. "EPR suggests that a measurement on particle 1 cannot have any actual influence on particle 2 (locality condition); thus the properties of particle 2 must be independent of the measurement performed on particle 1." Experiments have verified that the two particles in the EPR case are always part of one quantum system, and thus a measurement on one particle changes the possible predictions that can be made for the whole system and therefore for the other particle [5].

It is evident that physical and mathematical theories have been of great utility to the economic sciences, but it is necessary to highlight that Information Theory and Economics also contribute to the explanation of the laws of Quantum Mechanics. Will the strict incorporation of elements of Classical and Quantum Information Theory allow the development of Economics and Game Theory? The definitive answer is yes. Economics has carried out its own developments around information theory; in particular, it has demonstrated both that asymmetry of information causes errors in the definition of an optimal negotiation and that the assumption of perfect markets is untenable in the presence of asymmetric information; see Refs. [19,20]. The asymmetry of information, according to the formalism of Game Theory, can have two causes: incomplete information and imperfect information. As we will see in the development of this paper, Information Economics does not yet incorporate in a formal way either elements of Classical Information Theory or Quantum Information concepts.

The creators of Information Theory are Shannon and von Neumann; see Ref. [15, Chapter 11]. Shannon, the creator of Classical Information Theory, introduced entropy as the heart of his theory, endowing it with a probabilistic character. On the other hand, von Neumann, also a creator of Game Theory, used the probabilistic elements taken into account by Shannon but defined a new mathematical formulation of entropy using the density matrix of Quantum Mechanics. Both entropy formulations, developed by Shannon and von Neumann respectively, permit us to model pure states (pure strategies) and mixed states (mixed strategies).

Eisert et al. [21,35] not only give a physical model of quantum strategies but also express the idea of identifying moves with quantum operations and quantum properties. This approach appears to be fruitful in at least two ways. On one hand, several recently proposed applications of quantum information theory can already be conceived as competitive situations in which several parties with opposing motives interact; these parties may apply quantum operations on a bipartite quantum system. On the other hand, generalizing decision theory to the domain of quantum probabilities seems interesting, as game theory is partly rooted in probability theory.
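To make the contrast between the two entropy formulations concrete, the following minimal sketch, again our own illustration (assuming NumPy, not the authors' code), computes the von Neumann entropy S(rho) = -Tr(rho log2 rho) of a single-qubit density matrix. It reduces to the Shannon entropy of the eigenvalues of rho, so a pure state (the analogue of a pure strategy) has zero entropy while the maximally mixed state (the analogue of an evenly mixed strategy) has one bit.

```python
# Illustrative sketch (not from the paper): von Neumann entropy S(rho) = -Tr(rho log2 rho),
# the density-matrix formulation of entropy attributed to von Neumann above.
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy in bits: the Shannon entropy of the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)
    nz = eigvals[eigvals > 1e-12]
    return float(0.0 - np.sum(nz * np.log2(nz)))

# A pure state (pure-strategy analogue): rho = |0><0| has zero entropy.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])

# The maximally mixed single-qubit state (evenly mixed strategy analogue): one bit.
mixed = 0.5 * np.eye(2)

print(von_neumann_entropy(pure))   # ~0.0
print(von_neumann_entropy(mixed))  # ~1.0
```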