Calculations on Randomness Effects on the Business Cycle of Corporations through Markov Chains

Christian Holm
[email protected]

under the direction of
Mr. Gaultier Lambert
Department of Mathematics
Royal Institute of Technology

Research Academy for Young Scientists
July 10, 2013

Abstract

Markov chains are used to describe random processes and can be represented by both a matrix and a graph. Markov chains are commonly used to describe economic correlations. In this study we create a Markov chain model in order to describe the competition between two hypothetical corporations. We use observations of this model to discuss how market randomness affects the corporations' financial health. This is done by testing the model for different market parameters and analysing their effect on our model. Furthermore, we analyse how our model can be applied to the real market, so as to further understand the market.

Contents

1 Introduction
2 Notations and Definitions
  2.1 Introduction to Probability Theory
  2.2 Introduction to Graph and Matrix Representation
  2.3 Discrete Time Markov Chain on V
3 General Theory and Examples
  3.1 Hidden Markov Models in Financial Systems
4 Application of Markov Chains on two Corporations in Competition
  4.1 Model
  4.2 Simulations
  4.3 Results
  4.4 Discussion
Acknowledgements
A Equation 10
B Matlab Code
C Matlab Code, Different β:s

1 Introduction

The study of Markov chains began in the early 1900s, when the Russian mathematician Andrey Markov first formalised the theory of Markov processes [1]. A Markov process is a memoryless process that is used to describe probability measures evolving over time. Markov was the first to formalise the Markov property mathematically, but not the first to use it. A famous use of a memoryless process that pre-dated Markov is found in Karl Marx's "Das Kapital", where the concept is used to describe the economic development of a capitalist society [2]. Markov's development of memoryless processes opened a new field of applications in mathematics that later made possible inventions such as hidden Markov models and Google's PageRank, a Markov-process algorithm that ranks internet pages [3, 4]. These are two of the applications that have made Markov processes a popular subject of mathematical research.

Markov processes have frequently been used within the financial sector and in economic research [1]. This paper attempts to apply the Markov property to micro-financial systems. We present a model which describes a system of two corporations in competition. Customers buy products at random from the two corporations according to a Markov chain, and a corporation receives an increased amount of assets, or increased fitness, when it is chosen by a customer. In this paper we formalise a mathematical model of this market in order to describe the effect of market randomness on the business cycle of the corporations.

2 Notations and Definitions

2.1 Introduction to Probability Theory

In this section, we review some elementary facts of probability theory. Throughout these notes, we take $V = \{1, \dots, N\}$, for some $N \in \mathbb{N}$, as our probability space. The probability that an event $A$ occurs is denoted $\mathbb{P}[A]$.

Definition 1. Let $A$, $B$ be two events. We denote by $\mathbb{P}[A \mid B]$ the probability that the event $A$ happens knowing that $B$ happens. $\mathbb{P}[A \mid B]$ is called the probability of $A$ given $B$.

Example 1. To illustrate this definition, imagine that you have a deck of 52 cards. You pick a card which you know is a face card; what is the probability that this card is a king? The answer is of course $4/12 = 1/3$. Observe that this probability is strictly greater than the probability of simply picking a king from the whole deck, which is $\mathbb{P}[\text{king}] = 1/13$.

If $\mathbb{P}[B] > 0$, we compute the probability of $A$ knowing that the event $B$ occurs by

$$\mathbb{P}[A \mid B] = \frac{\mathbb{P}[A \cap B]}{\mathbb{P}[B]} \qquad (1)$$

where $A \cap B$ is the event that both $A$ and $B$ happen. Thus the probability $\mathbb{P}[A \mid B]$ that $A$ happens given that $B$ happens is equal to the probability of $A$ and $B$ happening together, divided by the probability that $B$ occurs.

Definition 2. Two events $A$ and $B$ are independent if $\mathbb{P}[A \cap B] = \mathbb{P}[A]\,\mathbb{P}[B]$.

This means that $A$ and $B$ are independent if the probability that they both happen is equal to the product of the probabilities that they happen separately.

Theorem 1. Two events $A$ and $B$ with $\mathbb{P}[B] > 0$ are independent if and only if $\mathbb{P}[A \mid B] = \mathbb{P}[A]$.

Proof. By equation (1), $\mathbb{P}[A \mid B] = \mathbb{P}[A \cap B]/\mathbb{P}[B]$. By Definition 2, $A$ and $B$ are independent if and only if $\mathbb{P}[A \cap B] = \mathbb{P}[A]\,\mathbb{P}[B]$, that is, if and only if

$$\mathbb{P}[A \mid B] = \frac{\mathbb{P}[A]\,\mathbb{P}[B]}{\mathbb{P}[B]} = \mathbb{P}[A].$$
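To make the card computation concrete, the following MATLAB sketch (MATLAB is also the language of the code in Appendices B and C) estimates the conditional probability of Example 1 by repeated random draws and compares it with the unconditional probability, as in Theorem 1. It is illustrative only and not part of the paper's model; the deck is reduced to ranks, and names such as pKingGivenFace are our own.

% Illustrative sketch (not from the paper): Monte Carlo check of Example 1.
% A card is modelled by its rank 1..13 (11 = jack, 12 = queen, 13 = king);
% the suit is irrelevant for these two events.
rng(1);                                  % reproducible draws
n    = 1e6;
rank = randi(13, n, 1);                  % uniform random ranks
face = (rank >= 11);                     % event B: the card is a face card
king = (rank == 13);                     % event A: the card is a king

pKingGivenFace = sum(king & face) / sum(face);   % relative frequency of A among draws where B holds
pKing          = mean(king);                     % relative frequency of A overall

fprintf('P[king | face] ~ %.4f  (exact 1/3)\n',  pKingGivenFace);
fprintf('P[king]        ~ %.4f  (exact 1/13)\n', pKing);
% The two estimates differ, so the two events are not independent (Theorem 1).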
Definition 3. A probability measure on $V = \{1, \dots, N\}$ is a vector $\mu \in \mathbb{R}^N$ that satisfies $\mu_i \geq 0$ for all $i \in \{1, \dots, N\}$ and $\sum_{i=1}^{N} \mu_i = 1$.

In other words, if we let all possible states be elements $i \in \{1, \dots, N\}$ of the space $V$, then $\mu_i$ is the probability of observing state $i$. The probabilities of all possible states must add up to 1. This is only true if all states are disjoint.

Example 2. If we consider the three events of picking a king, a red card and a black card from the same deck as before, then our events are not disjoint; for example, it is possible to pick a card that is both a king and red. Accordingly, the sum of the probabilities of these events is not equal to 1: it is $1/13 + 1/2 + 1/2 = 14/13$.

2.2 Introduction to Graph and Matrix Representation

Throughout this section we review some elementary facts about graphs and matrices, facts that the paper is based on and will occasionally refer to.

A directed graph $G = (V, \vec{E})$ is a collection of vertices $V$ and a collection of edges $\vec{ij} \in \vec{E}$ connecting them. An edge $\vec{ij}$ is represented by an arrow directed from the vertex $i$ to the vertex $j$. In this report, we consider the graph $G$ to be finite and denote the set of vertices $V = \{1, \dots, N\}$.

Example 3. Figure 1 shows a graph with states 1, 2, 3 and 4. Between these states there are edges, one of which is the edge $\vec{13}$, represented by an arrow. This edge is the possible path between the states 1 and 3.

Definition 4. An $N \times N$ matrix $P$ is an array of real numbers $P_{ij}$ for $i, j \in \{1, \dots, N\}$, for example

$$\begin{bmatrix} 1 & 0 \\ 1 & 2 \end{bmatrix} \qquad (2)$$

As an example of how to read this $2 \times 2$ matrix, the value of $P_{12}$ is 0 and the value of $P_{22}$ is 2.

Figure 1: The directed graph of example 3.

A matrix $P$ corresponds to a linear map from $\mathbb{R}^N$ to $\mathbb{R}^N$. For any vector $x \in \mathbb{R}^N$, the $i$:th component of the vector $Px$ is given by the equation

$$(Px)_i = \sum_{j=1}^{N} P_{ij} x_j \qquad (3)$$

Definition 5. The number $\lambda \in \mathbb{R}$ is a real eigenvalue of the matrix $P$ if there exists a vector $x \in \mathbb{R}^N \setminus \{0\}$ which satisfies the equation $Px = \lambda x$. We call $x$ an eigenvector associated to $\lambda$.

A matrix does not necessarily have real eigenvalues. Moreover, eigenvectors are not uniquely defined: for instance, if $x$ is an eigenvector associated to $\lambda$, then so is $cx$ for any constant $c \in \mathbb{R}$, $c \neq 0$.
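As a small numerical illustration of equation (3) and Definition 5, the MATLAB sketch below applies the matrix of equation (2) to a vector, once with the component formula and once with the built-in product, and then computes its eigenvalues. This is a sketch for illustration only, not part of the code in the appendices; the test vector x is an arbitrary choice.

% Illustrative sketch (not from the paper): the matrix of equation (2) as a linear map.
P = [1 0;
     1 2];
x = [3; -1];                   % an arbitrary test vector

% Component formula, equation (3): (Px)_i = sum_j P_ij * x_j
y = zeros(2, 1);
for i = 1:2
    y(i) = sum(P(i, :) .* x');
end
disp([y, P*x])                 % both columns agree: the loop matches the built-in product

% Real eigenvalues and eigenvectors (Definition 5); here lambda = 1 and lambda = 2.
[V, D] = eig(P);
disp(diag(D))                  % the eigenvalues
disp(V)                        % one eigenvector per column, defined only up to scaling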
Given a graph $G = (V, \vec{E})$, we can put probability weights on its edges. For any two vertices $i$ and $j$, if there is an edge $\vec{ji} \in \vec{E}$, we let $P_{ij}$ be the probability of going from $j$ to $i$; otherwise, we let $P_{ij} = 0$. In this way, we construct a matrix $P$ which corresponds to the graph $G$ and its probability weights.

Since the sum of the probabilities of ending in any possible state $i$, given any starting state $j$, is equal to 1, we must have $\sum_{i=1}^{N} P_{ij} = 1$. In the theory of Markov processes, a matrix $P$ satisfying the latter condition is called a stochastic or transition matrix.

Example 4. We reuse Figure 1 and attach weights to its edges, as seen in Figure 2. This directed graph has edges that stretch between states, illustrating the possible changes of state in this system. To the edges there are weights attached; for example, the weight attached to the edge $\vec{13}$ is $1/2$. The weight may be seen as the probability of travelling along an edge.

Figure 2: Figure 1 with weights.

$$P = \begin{bmatrix} 1/2 & 0 & 0 & 0 \\ 0 & 1/6 & 1/2 & 1/6 \\ 1/2 & 1/3 & 0 & 5/6 \\ 0 & 1/2 & 1/2 & 0 \end{bmatrix}$$

This stochastic matrix is the matrix corresponding to the graph in Example 4. The sum of each of its columns, that is the sum of the probabilities of entering any state $i$ from a given starting state $j$, is equal to 1. Like every stochastic matrix, $P$ can equally be drawn as a graph; the two ways of illustrating a process are equivalent.
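The correspondence between the matrix and the graph can be checked numerically. The MATLAB sketch below (again illustrative, not the simulation code of Appendix B) builds the stochastic matrix of Example 4, verifies that every column sums to 1, evolves a probability measure $\mu$ under the chain, and samples one random trajectory; the starting state and the number of steps are arbitrary choices of ours.

% Illustrative sketch (not from the paper): the column-stochastic matrix of Example 4.
P = [1/2   0    0    0  ;
     0    1/6  1/2  1/6 ;
     1/2  1/3   0   5/6 ;
     0    1/2  1/2   0  ];

disp(sum(P, 1))                % every column sums to 1, so P is a transition matrix

% A probability measure mu on V = {1,...,4} evolves as mu <- P*mu,
% since P_ij is the probability of going from state j to state i.
mu = [1; 0; 0; 0];             % start in state 1 with probability 1
for t = 1:20
    mu = P * mu;
end
disp(mu')                      % the distribution over the four states after 20 steps

% Sample one random trajectory of the chain.
rng(2);
state = 1;
path  = state;
for t = 1:10
    state = find(rand <= cumsum(P(:, state)), 1);   % next state drawn from the current column
    path(end+1) = state;
end
disp(path)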