MS&E 226: "Small" Data
Lecture 9: Logistic regression (v2)
Ramesh Johari
[email protected]

Regression methods for binary outcomes

Binary outcomes
For the duration of this lecture, suppose the outcome variable $Y_i \in \{0, 1\}$ for each $i$. (Much of what we cover generalizes to discrete outcomes with more than two levels, but we focus on the binary case.)

Linear regression?
Why doesn't linear regression work well for prediction? A picture:
[Figure: binary outcomes $Y \in \{0, 1\}$ plotted against $X$, for $X$ ranging from $-20$ to $20$.]

Logistic regression
At its core, logistic regression is a method that directly addresses this issue with linear regression: it produces fitted values that always lie in $[0, 1]$.
Input: sample data $\mathbf{X}$ and $\mathbf{Y}$.
Output: a fitted model $\hat{f}(\cdot)$, where we interpret $\hat{f}(\vec{X})$ as an estimate of the probability that the corresponding outcome $Y$ is equal to 1.

Logistic regression: Basics

The population model
Logistic regression assumes the following population model:
$$P(Y = 1 \mid \vec{X}) = \frac{\exp(\vec{X}\beta)}{1 + \exp(\vec{X}\beta)} = 1 - P(Y = 0 \mid \vec{X}).$$
A plot in the case with only one covariate, and $\beta_0 = 0$, $\beta_1 = 1$:
[Figure: $P(Y = 1 \mid X)$ as a function of $X$, rising from near 0 to near 1 as $X$ ranges from $-10$ to $10$.]

Logistic curve
The function $g$ given by
$$g(q) = \log \frac{q}{1 - q}$$
is called the logit function. It has the following inverse, called the logistic curve:
$$g^{-1}(z) = \frac{\exp(z)}{1 + \exp(z)}.$$
In terms of $g$, we can write the population model as:\footnote{This is one example of a generalized linear model (GLM); for a GLM, $g$ is called the link function.}
$$P(Y = 1 \mid \vec{X}) = g^{-1}(\vec{X}\beta).$$
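The logit and logistic functions above are easy to check numerically. A minimal sketch in Python (illustrative, not part of the lecture; the function names `logit` and `logistic` are chosen here to match the slide's notation):

```python
import math

def logit(q):
    # g(q) = log(q / (1 - q)): maps a probability in (0, 1) to the real line.
    return math.log(q / (1 - q))

def logistic(z):
    # g^{-1}(z) = exp(z) / (1 + exp(z)): maps the real line back to (0, 1).
    return math.exp(z) / (1 + math.exp(z))

# The logistic curve inverts the logit.
assert abs(logit(logistic(2.5)) - 2.5) < 1e-12

# Population model with one covariate and beta0 = 0, beta1 = 1
# (the parameters in the slide's plot): P(Y = 1 | X = x) = logistic(x).
print(logistic(0.0))   # at x = 0 the two outcomes are equally likely
```

Note that `logistic(0) = 1/2`, matching the plotted curve crossing 0.5 at $X = 0$, and `logistic(z) + logistic(-z) = 1`, which is the identity $P(Y = 1 \mid \vec{X}) = 1 - P(Y = 0 \mid \vec{X})$ in this symmetric form.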