
Linear Regression in Python

by Mirko Stojiljkovic, Apr 15, 2019 (data-science, intermediate, machine-learning)

Table of Contents

- Regression
  - What Is Regression?
  - When Do You Need Regression?
- Linear Regression
  - Problem Formulation
  - Regression Performance
  - Simple Linear Regression
  - Multiple Linear Regression
  - Polynomial Regression
  - Underfitting and Overfitting
- Implementing Linear Regression in Python
  - Python Packages for Linear Regression
  - Simple Linear Regression With scikit-learn
  - Multiple Linear Regression With scikit-learn
  - Polynomial Regression With scikit-learn
  - Advanced Linear Regression With statsmodels
- Beyond Linear Regression
- Conclusion

We're living in the era of large amounts of data, powerful computers, and artificial intelligence. This is just the beginning. Data science and machine learning are driving image recognition, the development of autonomous vehicles, decisions in the financial and energy sectors, advances in medicine, the rise of social networks, and more. Linear regression is an important part of this.

Linear regression is one of the fundamental statistical and machine learning techniques. Whether you want to do statistics, machine learning, or scientific computing, there's a good chance you'll need it. It's advisable to learn it first and then proceed toward more complex methods.

By the end of this article, you'll have learned:

- What linear regression is
- What linear regression is used for
- How linear regression works
- How to implement linear regression in Python, step by step

Regression

Regression analysis is one of the most important fields in statistics and machine learning. There are many regression methods available. Linear regression is one of them.

What Is Regression?

Regression searches for relationships among variables. For example, you can observe several employees of some company and try to understand how their salaries depend on features such as experience, level of education, role, city they work in, and so on.

This is a regression problem where the data related to each employee represent one observation. The presumption is that experience, education, role, and city are the independent features, while salary depends on them.

Similarly, you can try to establish a mathematical dependence of the prices of houses on their areas, numbers of bedrooms, distances to the city center, and so on.

Generally, in regression analysis, you consider some phenomenon of interest and have a number of observations. Each observation has two or more features. Following the assumption that (at least) one of the features depends on the others, you try to establish a relation among them. In other words, you need to find a function that maps some features or variables to others sufficiently well.

The dependent features are called the dependent variables, outputs, or responses. The independent features are called the independent variables, inputs, or predictors.

Regression problems usually have one continuous and unbounded dependent variable. The inputs, however, can be continuous, discrete, or even categorical data such as gender, nationality, brand, and so on.

It's a common practice to denote the outputs with 푦 and the inputs with 푥.
If there are two or more independent variables, they can be represented as the vector 퐱 = (푥₁, …, 푥ᵣ), where 푟 is the number of inputs.

When Do You Need Regression?

Typically, you need regression to answer whether and how some phenomenon influences another, or how several variables are related. For example, you can use it to determine whether and to what extent experience or gender impacts salaries.

Regression is also useful when you want to forecast a response using a new set of predictors. For example, you could try to predict the electricity consumption of a household for the next hour given the outdoor temperature, the time of day, and the number of residents in that household.

Regression is used in many different fields: economics, computer science, the social sciences, and so on. Its importance rises every day with the availability of large amounts of data and increased awareness of the practical value of data.

Linear Regression

Linear regression is probably one of the most important and widely used regression techniques. It's among the simplest regression methods. One of its main advantages is the ease of interpreting results.

Problem Formulation

When implementing linear regression of some dependent variable 푦 on the set of independent variables 퐱 = (푥₁, …, 푥ᵣ), where 푟 is the number of predictors, you assume a linear relationship between 푦 and 퐱: 푦 = 훽₀ + 훽₁푥₁ + ⋯ + 훽ᵣ푥ᵣ + 휀. This equation is the regression equation. 훽₀, 훽₁, …, 훽ᵣ are the regression coefficients, and 휀 is the random error.

Linear regression calculates the estimators of the regression coefficients, or simply the predicted weights, denoted with 푏₀, 푏₁, …, 푏ᵣ. They define the estimated regression function 푓(퐱) = 푏₀ + 푏₁푥₁ + ⋯ + 푏ᵣ푥ᵣ. This function should capture the dependencies between the inputs and the output sufficiently well.

The estimated or predicted response, 푓(퐱ᵢ), for each observation 푖 = 1, …, 푛, should be as close as possible to the corresponding actual response 푦ᵢ. The differences 푦ᵢ - 푓(퐱ᵢ) for all observations 푖 = 1, …, 푛, are called the residuals. Regression is about determining the best predicted weights, that is, the weights corresponding to the smallest residuals.

To get the best weights, you usually minimize the sum of squared residuals (SSR) for all observations 푖 = 1, …, 푛: SSR = Σᵢ(푦ᵢ - 푓(퐱ᵢ))². This approach is called the method of ordinary least squares.
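To make this concrete, here's a minimal NumPy sketch of ordinary least squares: it prepends a column of ones to the inputs so the intercept 푏₀ is estimated along with the other weights, then finds the weights that minimize SSR. The data is made up purely for illustration:

```python
import numpy as np

# Made-up observations: n = 5 rows, r = 2 inputs per observation.
X = np.array([[1.0, 2.0],
              [2.0, 0.5],
              [3.0, 1.5],
              [4.0, 3.0],
              [5.0, 2.5]])
y = np.array([4.1, 4.9, 7.2, 9.8, 10.4])

# Prepend a column of ones so that b0 is estimated along with b1, ..., br.
A = np.column_stack([np.ones(len(X)), X])

# Ordinary least squares: the weights b that minimize SSR.
b, ssr, *_ = np.linalg.lstsq(A, y, rcond=None)
print(b)    # [b0, b1, b2]
print(ssr)  # the sum of squared residuals at the optimum
```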
Regression Performance

The variation of actual responses 푦ᵢ, 푖 = 1, …, 푛, occurs partly due to the dependence on the predictors 퐱ᵢ. However, there's also an additional inherent variance of the output.

The coefficient of determination, denoted as 푅², tells you what amount of the variation in 푦 can be explained by the dependence on 퐱, using the particular regression model. A larger 푅² indicates a better fit and means that the model can better explain the variation of the output with different inputs.

The value 푅² = 1 corresponds to SSR = 0, that is, to the perfect fit, since the predicted and actual responses fit each other completely.
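This definition translates directly into code. Here's a minimal sketch (the helper name r_squared is just for illustration; scikit-learn, used later in this article, computes the same quantity with the .score() method of its regression models and with sklearn.metrics.r2_score()):

```python
import numpy as np

def r_squared(y_actual, y_predicted):
    """Coefficient of determination: 1 - SSR / SST."""
    y_actual, y_predicted = np.asarray(y_actual), np.asarray(y_predicted)
    ssr = np.sum((y_actual - y_predicted) ** 2)      # sum of squared residuals
    sst = np.sum((y_actual - y_actual.mean()) ** 2)  # total variation of y
    return 1 - ssr / sst
```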
Simple Linear Regression

Simple or single-variate linear regression is the simplest case of linear regression, with a single independent variable, 퐱 = 푥.

The following figure illustrates simple linear regression:

[Figure: Example of simple linear regression]

When implementing simple linear regression, you typically start with a given set of input-output (푥-푦) pairs (green circles). These pairs are your observations. For example, the leftmost observation (green circle) has the input 푥 = 5 and the actual output (response) 푦 = 5. The next one has 푥 = 15 and 푦 = 20, and so on.

The estimated regression function (black line) has the equation 푓(푥) = 푏₀ + 푏₁푥. Your goal is to calculate the optimal values of the predicted weights 푏₀ and 푏₁ that minimize SSR and determine the estimated regression function.

The value of 푏₀, also called the intercept, shows the point where the estimated regression line crosses the 푦 axis. It's the value of the estimated response 푓(푥) for 푥 = 0. The value of 푏₁ determines the slope of the estimated regression line.

The predicted responses (red squares) are the points on the regression line that correspond to the input values. For example, for the input 푥 = 5, the predicted response is 푓(5) = 8.33 (represented with the leftmost red square).

The residuals (vertical dashed gray lines) can be calculated as 푦ᵢ - 푓(퐱ᵢ) = 푦ᵢ - 푏₀ - 푏₁푥ᵢ for 푖 = 1, …, 푛. They are the distances between the green circles and the red squares. When you implement linear regression, you're actually trying to minimize these distances and make the red squares as close to the predefined green circles as possible.
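As a preview of the scikit-learn workflow covered in detail later in this article, here's a minimal sketch that fits a model like the one in the figure. Only the first two pairs, (5, 5) and (15, 20), are given in the text above; the remaining values are assumed for illustration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# scikit-learn expects a two-dimensional input array, hence .reshape(-1, 1).
x = np.array([5, 15, 25, 35, 45, 55]).reshape(-1, 1)
y = np.array([5, 20, 14, 32, 22, 38])

model = LinearRegression().fit(x, y)
print(model.intercept_)      # b0, where the line crosses the y axis
print(model.coef_)           # [b1], the slope
print(model.predict([[5]]))  # the predicted response f(5)
```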
Multiple Linear Regression

Multiple or multivariate linear regression is a case of linear regression with two or more independent variables.

If there are just two independent variables, the estimated regression function is 푓(푥₁, 푥₂) = 푏₀ + 푏₁푥₁ + 푏₂푥₂. It represents a regression plane in a three-dimensional space. The goal of regression is to determine the values of the weights 푏₀, 푏₁, and 푏₂ such that this plane is as close as possible to the actual responses and yields the minimal SSR.

The case of more than two independent variables is similar, but more general. The estimated regression function is 푓(푥₁, …, 푥ᵣ) = 푏₀ + 푏₁푥₁ + ⋯ + 푏ᵣ푥ᵣ, and there are 푟 + 1 weights to be determined when the number of inputs is 푟.
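The scikit-learn code barely changes with two inputs: each row of the input array now holds one observation (푥₁, 푥₂). A minimal sketch, again with made-up data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Made-up data: each row of X is one observation (x1, x2).
X = np.array([[0, 1], [5, 1], [15, 2], [25, 5],
              [35, 11], [45, 15], [55, 34], [60, 35]])
y = np.array([4, 5, 20, 14, 32, 22, 38, 43])

model = LinearRegression().fit(X, y)
print(model.intercept_)          # b0
print(model.coef_)               # [b1, b2], one weight per input
print(model.predict([[20, 4]]))  # the estimated response for a new observation
```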
Polynomial Regression

You can regard polynomial regression as a generalized case of linear regression. You assume a polynomial dependence between the output and the inputs and, consequently, a polynomial estimated regression function.

In other words, in addition to linear terms like 푏₁푥₁, your regression function 푓 can include nonlinear terms such as 푏₂푥₁², 푏₃푥₁³, or even 푏₄푥₁푥₂, 푏₅푥₁²푥₂, and so on.

The simplest example of polynomial regression has a single independent variable, and the estimated regression function is a polynomial of degree two: 푓(푥) = 푏₀ + 푏₁푥 + 푏₂푥².

Now, remember that you want to calculate 푏₀, 푏₁, and 푏₂ to minimize SSR. These are your unknowns! Keeping this in mind, compare the previous regression function with the function 푓(푥₁, 푥₂) = 푏₀ + 푏₁푥₁ + 푏₂푥₂ used for linear regression. The two have the same structure: both are linear in the unknown weights, which is why you can fit a polynomial model with the same least-squares approach.
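Since the weights still enter linearly, a practical recipe is to transform the input to add the 푥² column and then run an ordinary linear regression on the result. Here's a minimal sketch using scikit-learn's PolynomialFeatures, covered in detail later in this article; the data is made up for illustration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Made-up single-input data.
x = np.array([5, 15, 25, 35, 45, 55]).reshape(-1, 1)
y = np.array([15, 11, 2, 8, 25, 32])

# Generate the x**2 column, then fit ordinary linear regression on [x, x**2].
transformer = PolynomialFeatures(degree=2, include_bias=False)
x_poly = transformer.fit_transform(x)

model = LinearRegression().fit(x_poly, y)
print(model.intercept_)  # b0
print(model.coef_)       # [b1, b2], the weights for x and x**2
```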