
6.1 Point Estimation

What is an estimate?
• We want to study a population
• Ideally we would know its entire distribution
• Often a few parameters of the population are the first consideration: θ
• We use the information from a sample to estimate θ
• An estimator θ̂ defines the procedure (the formula) for computing the estimate
• Different samples yield different estimates; each estimator is a random variable

Examples of estimators

• We want to estimate the mean µ of a normal distribution
• Possible estimators:
• the sample mean X̄ = (1/n) Σ Xᵢ
• the sample median
• the midrange: the average of the smallest and the largest observations
• the 10% trimmed mean
Which one is best (or better)?
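The four candidate estimators above can all be computed from the same sample. A minimal sketch in NumPy (the data here are simulated, not from the slides; the "10% trimmed mean" drops roughly 10% of observations from each end):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=2.0, size=25)  # sample from N(mu=10, sigma=2)

sample_mean = x.mean()                  # X-bar = (1/n) * sum(X_i)
sample_median = np.median(x)
midrange = (x.min() + x.max()) / 2      # average of smallest and largest
trimmed = np.mean(np.sort(x)[2:-2])     # ~10% trimmed mean: drop 2 of 25 per side

print(sample_mean, sample_median, midrange, trimmed)
```

All four are plausible estimates of µ = 10; which is "best" is exactly the question the following slides address.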

Error in Estimation
• θ̂ = θ + error of estimation
• The error is random, depending on the sample
• We would like to control the error:
• mean 0 (unbiased)
• smallest possible variance

Unbiased Estimators
• An estimator is a particular statistic computed from the sample
• Different estimators behave differently: some tend to overestimate, others to underestimate
• Unbiased: E(θ̂) = θ
• The difference E(θ̂) − θ is called the bias
• Examples: the sample mean (unbiased); the sample proportion X/n (unbiased) if X is a binomial rv
• If possible, we should always choose an unbiased estimator

Sample Variance
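Unbiasedness of the sample proportion can be checked by simulation: averaging p̂ = X/n over many repeated samples should recover p. A small sketch (parameters chosen here for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 0.3       # X ~ Binomial(n, p)
reps = 20000

# p-hat = X/n; unbiased means E(p-hat) = p
p_hats = rng.binomial(n, p, size=reps) / n
print(p_hats.mean())  # close to p = 0.3
```

Each individual p̂ misses p by a random error, but the errors average out to zero, which is precisely the unbiasedness property E(θ̂) = θ.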

• We need an estimator for the population variance σ²
• Natural candidate: the sample variance S² = Σ(Xᵢ − X̄)² / (n − 1)
• Why divide by n − 1, not n? To make the estimator unbiased: E(S²) = σ²
• We cannot say the same for the sample standard deviation S: it is a biased estimator of σ

Unbiased Estimators for the Population Mean
• There are several choices (which complicates matters)
• The sample mean is always unbiased
• If the distribution is continuous and symmetric, the sample median and any trimmed mean are also unbiased

Minimum Variance
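The effect of the n − 1 divisor shows up clearly in simulation: averaging the two versions of the sample variance over many samples, the n − 1 version centers on σ², while the n version centers on (n − 1)/n · σ². A sketch (values chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
sigma2 = 4.0
n, reps = 5, 100000

samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
s2_unbiased = samples.var(axis=1, ddof=1)  # divide by n - 1
s2_biased = samples.var(axis=1, ddof=0)    # divide by n

print(s2_unbiased.mean())  # centers on sigma^2 = 4.0
print(s2_biased.mean())    # centers on (n-1)/n * sigma^2 = 3.2
```

With a small n (here n = 5) the bias of the n divisor is substantial; it shrinks as n grows but never vanishes.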

• Suppose we have two estimators, both unbiased; which one do we prefer?
• Pick the one with the smaller variance
• The best choice is the minimum variance unbiased estimator (MVUE)
• For a normal distribution, the sample mean X̄ is the MVUE for the population mean

Complications
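For normal data, both the sample mean and the sample median are unbiased estimators of µ, so the MVUE criterion picks the one with smaller variance. A simulation sketch comparing the two (sample size and repetitions are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 25, 50000
samples = rng.normal(0.0, 1.0, size=(reps, n))  # N(0, 1), true mean 0

var_mean = samples.mean(axis=1).var()        # variance of X-bar: ~ 1/n = 0.04
var_median = np.median(samples, axis=1).var()  # larger: ~ (pi/2)/n for normal data

print(var_mean, var_median)
```

Both estimators center on the true mean 0, but the sample mean has noticeably smaller variance, consistent with X̄ being the MVUE under normality.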

• Which estimator is better? It depends on the distribution
• Cauchy distribution: the MVUE is unknown; the sample median is better than the sample mean
• Uniform distribution: the midrange X̄ₑ (average of the extremes) is the best
• The trimmed mean is quite versatile and useful in practice
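Both claims can be illustrated by simulation: under Cauchy data the sample mean is wildly unstable while the median stays tight, and under uniform data the midrange beats the sample mean. A sketch (distribution parameters chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps = 25, 20000

# Cauchy (center 0): the sample mean has no finite variance;
# the sample median is far more stable
cauchy = rng.standard_cauchy(size=(reps, n))
print(np.median(cauchy, axis=1).std())  # small spread
print(cauchy.mean(axis=1).std())        # huge / unstable spread

# Uniform(0, 1): the midrange beats the mean for estimating the center 0.5
unif = rng.uniform(0.0, 1.0, size=(reps, n))
midrange = (unif.min(axis=1) + unif.max(axis=1)) / 2
print(midrange.std(), unif.mean(axis=1).std())
```

This is why "which estimator is better" has no universal answer: the ranking of estimators flips as the underlying distribution changes.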

The Standard Error
• What quantity shall we use to report the error in estimation?
• The standard error of θ̂: σ_θ̂ = √V(θ̂)
• Not always available in theory!
• May involve a plug-in estimate of other parameters, giving the estimated standard error
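For the sample mean, the standard error is σ/√n; since σ is usually unknown, the plug-in step replaces it with the sample standard deviation s. A sketch with made-up measurement data (not from the slides):

```python
import numpy as np

# Hypothetical measurements of a single quantity
x = np.array([64.1, 64.7, 64.5, 64.6, 64.5, 64.3, 64.6, 64.8, 64.2, 64.3])
n = len(x)

# Standard error of X-bar is sigma / sqrt(n); sigma is unknown,
# so plug in the sample standard deviation s (estimated standard error)
s = x.std(ddof=1)
est_se = s / np.sqrt(n)
print(x.mean(), est_se)
```

The reported estimate would then be X̄ together with its estimated standard error, which quantifies the typical size of the estimation error.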