
QUANTILE REGRESSION

ROGER KOENKER

Abstract. Classical least squares regression may be viewed as a natural way of extending the idea of estimating an unconditional mean parameter to the problem of estimating conditional mean functions; the crucial link is the formulation of an optimization problem that encompasses both problems. Likewise, quantile regression offers an extension of univariate quantile estimation to estimation of conditional quantile functions via an optimization of a piecewise linear objective function in the residuals. Median regression minimizes the sum of absolute residuals, an idea introduced by Boscovich in the 18th century.

The asymptotic theory of quantile regression closely parallels the theory of the univariate sample quantiles. Computation of quantile regression estimators may be formulated as a linear programming problem and efficiently solved by simplex or barrier methods. A close link to rank based inference has been forged from the theory of the dual regression quantile process, or regression rankscore process.

Quantile regression is a statistical technique intended to estimate, and conduct inference about, conditional quantile functions. Just as classical methods based on minimizing sums of squared residuals enable one to estimate models for conditional mean functions, quantile regression methods offer a mechanism for estimating models for the conditional median function, and the full range of other conditional quantile functions. By supplementing the estimation of conditional mean functions with techniques for estimating an entire family of conditional quantile functions, quantile regression is capable of providing a more complete statistical analysis of the stochastic relationships among random variables.

Quantile regression has been used in a broad range of application settings. Reference growth curves for children's height and weight have a long history in pediatric medicine; quantile regression methods may be used to estimate upper and lower quantile reference curves as a function of age, sex, and other covariates without imposing stringent parametric assumptions on the relationships among these curves. Quantile regression methods have been widely used in economics to study determinants of wages, discrimination effects, and trends in income inequality. Several recent studies have modeled the performance of public school students on standardized exams as a function of socio-economic characteristics like their parents' income and educational attainment, and policy variables like class size, school expenditures, and teacher qualifications. It seems rather implausible that such covariate effects should all act so as to shift the entire distribution of test results by a fixed amount. It is of obvious interest to know whether policy interventions alter the performance of the strongest students in the same way that weaker students are affected. Such questions are naturally investigated within the quantile regression framework.

In ecology, theory often suggests how observable covariates affect limiting sustainable population sizes, and quantile regression has been used to directly estimate models for upper quantiles of the conditional distribution rather than inferring such relationships from models based on conditional means. In survival analysis, and event history analysis more generally, there is often also a desire to focus attention on particular segments of the conditional distribution, for example survival prospects of the oldest-old, without the imposition of global distributional assumptions.

Version: October 25, 2000. This article has been prepared for the statistics section of the International Encyclopedia of the Social Sciences edited by Stephen Fienberg and Jay Kadane. The research was partially supported by NSF grant SBR-9617206.

1. Quantiles, Ranks and Optimization

We say that a student scores at the τth quantile of a standardized exam if he performs better than the proportion τ, and worse than the proportion 1 − τ, of the reference group of students. Thus, half of the students perform better than the median student, and half perform worse. Similarly, the quartiles divide the population into four segments with equal proportions of the population in each segment. The quintiles divide the population into 5 equal segments; the deciles into 10 equal parts. The quantile, or percentile, refers to the general case.

More formally, any real valued random variable, Y, may be characterized by its distribution function,

F(y) = Prob(Y ≤ y),

while for any 0 < τ < 1,

Q(τ) = inf{y : F(y) ≥ τ}

is called the τth quantile of Y. The median, Q(1/2), plays the central role. Like the distribution function, the quantile function provides a complete characterization of the random variable, Y.

The quantiles may be formulated as the solution to a simple optimization problem. For any 0 < τ < 1, define the piecewise linear "check function", ρ_τ(u) = u(τ − I(u < 0)), illustrated in Figure 1.

Figure 1. The quantile regression check function ρ_τ(u), with slope τ − 1 to the left of the origin and slope τ to the right.

Minimizing the expectation of ρ_τ(Y − ξ) with respect to ξ yields solutions, ξ̂(τ), the smallest of which is Q(τ) defined above.

The sample analogue of Q(τ), based on a random sample, {y_1, ..., y_n}, of Y's, is called the τth sample quantile, and may be found by solving

min_{ξ ∈ ℝ} Σ_{i=1}^n ρ_τ(y_i − ξ).
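To make this optimization view concrete, the following short Python sketch (the simulated sample and the chosen quantiles are illustrative, not from the text) computes the τth sample quantile by minimizing the check-function criterion and compares it with numpy's order-statistic based quantile; because the objective is piecewise linear and convex, at least one observation attains the minimum, so a search over the sample points suffices.

    import numpy as np

    def rho(u, tau):
        """Check function: rho_tau(u) = u * (tau - I(u < 0))."""
        return u * (tau - (u < 0))

    def sample_quantile(y, tau):
        """Minimize sum_i rho_tau(y_i - xi) over xi by searching the sample points."""
        y = np.asarray(y, dtype=float)
        return min(y, key=lambda xi: rho(y - xi, tau).sum())

    rng = np.random.default_rng(0)
    y = rng.normal(size=101)
    for tau in (0.25, 0.5, 0.75):
        # With n = 101 and these tau's the two definitions coincide exactly.
        print(tau, sample_quantile(y, tau), np.quantile(y, tau))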


While it is more common to define the sample quantiles in terms of the order statistics, y_(1) ≤ y_(2) ≤ ... ≤ y_(n), constituting a sorted rearrangement of the original sample, their formulation as a minimization problem has the advantage that it yields a natural generalization of the quantiles to the regression context.

Just as the idea of estimating the unconditional mean, viewed as the minimizer,

μ̂ = argmin_{μ ∈ ℝ} Σ_i (y_i − μ)²,

can be extended to estimation of the linear conditional mean function E(Y | X = x) = x'β by solving

β̂ = argmin_{β ∈ ℝ^p} Σ_i (y_i − x_i'β)²,

the linear conditional quantile function, Q_Y(τ | X = x) = x'β(τ), can be estimated by solving

β̂(τ) = argmin_{β ∈ ℝ^p} Σ_i ρ_τ(y_i − x_i'β).
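In practice one rarely codes this minimization from scratch. As a hedged illustration, the sketch below fits several conditional quantile lines to a simulated heteroscedastic sample using the QuantReg class from statsmodels; the data-generating model, seed, and chosen quantiles are assumptions made for the example.

    import numpy as np
    import statsmodels.api as sm

    # Simulated heteroscedastic data: the spread of y grows with x,
    # so the fitted quantile lines should fan out as x increases.
    rng = np.random.default_rng(42)
    n = 500
    x = rng.uniform(0, 10, size=n)
    y = 1.0 + 0.5 * x + (0.2 + 0.1 * x) * rng.normal(size=n)

    X = sm.add_constant(x)  # design matrix with an intercept column
    for tau in (0.1, 0.5, 0.9):
        fit = sm.QuantReg(y, X).fit(q=tau)
        print(f"tau={tau:.1f}  intercept={fit.params[0]:+.3f}  slope={fit.params[1]:+.3f}")

The estimated slopes differ across τ, which is exactly the kind of effect beyond a pure location shift that quantile regression is designed to reveal.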

The median case, τ = 1/2, which is equivalent to minimizing the sum of absolute values of the residuals, has a long history. In the mid-18th century Boscovich proposed estimating a bivariate linear model for the ellipticity of the earth by minimizing the sum of absolute values of residuals subject to the condition that the mean residual took the value zero. Subsequent work by Laplace characterized Boscovich's estimate of the slope parameter as a weighted median and derived its asymptotic distribution. F.Y. Edgeworth seems to have been the first to suggest a general formulation of median regression involving a multivariate vector of explanatory variables, a technique he called the "plural median". The extension to quantiles other than the median was introduced in Koenker and Bassett (1978).

2. An Example

To illustrate the approach we may consider an analysis of a simple first order autoregressive model for maximum daily temperature in Melbourne, Australia. The data are taken from Hyndman, Bashtannyk, and Grunwald (1996). In Figure 2 we provide a scatter plot of 10 years of daily temperature data: today's maximum daily temperature is plotted against yesterday's maximum. Our first observation from the plot is that there is a strong tendency for the data to cluster along the dashed 45 degree line, implying that with high probability today's maximum is near yesterday's maximum. But closer examination of the plot reveals that this impression is based primarily on the left side of the plot, where the central tendency of the scatter follows the 45 degree line very closely. On the right side, however, corresponding to summer conditions, the pattern is more complicated. There, it appears that either there is another hot day, falling again along the 45 degree line, or there is a dramatic cooling off. But a mild cooling off appears to be more rare. In the language of conditional densities, if today is hot, tomorrow's temperature appears to be bimodal with one mode roughly centered at today's maximum, and the other mode centered at about 20 degrees.

Several estimated quantile regression curves have been superimposed on the scatterplot. Each curve is specified as a linear B-spline. Under winter conditions these curves are bunched around the 45 degree line; in the summer, however, it appears that the upper quantile curves are bunched around the 45 degree line and around 20 degrees. In the intermediate temperatures the spacing of the quantile curves is somewhat greater, indicating lower probability of this temperature range. This impression is strengthened by considering a sequence of density plots based on the quantile regression estimates. Given a family of reasonably densely spaced estimated conditional quantile functions, it is straightforward to estimate the conditional density of the response at various values of the conditioning covariate. In Figure 3 we illustrate this approach with several density estimates based on the Melbourne data.


[Figure 2: scatterplot of today's maximum temperature (vertical axis) against yesterday's maximum temperature (horizontal axis).]

Figure 2. Melbourne Maximum Daily Temperature: The plot illustrates 10 years of daily maximum temperature data in degrees centigrade for Melbourne, Australia as an AR(1) scatterplot. The data are scattered around the dashed 45 degree line, suggesting that today is roughly similar to yesterday. Superimposed on the scatterplot are estimated conditional quantile functions for the quantiles τ ∈ {.05, .10, ..., .95}. Note that when yesterday's temperature is high, the spacing between adjacent quantile curves is narrower around the 45 degree line and at about 20 degrees centigrade than it is in the intermediate region. This suggests bimodality of the conditional density in the summer.


[Figure 3: six panels showing the estimated conditional density of today's maximum temperature when yesterday's maximum was 11, 16, 21, 25, 30, and 35 degrees.]

Figure 3. Conditional density estimates of today's maximum temperature for several values of yesterday's maximum temperature, based on the Melbourne data: These density estimates are based on a kernel smoothing of the conditional quantile estimates as illustrated in the previous figure, using 99 distinct quantiles. Note that the conditional density of today's maximum temperature is bimodal when yesterday was hot.

Conditioning on a low previous-day temperature we see a nice unimodal conditional density for the following day's maximum temperature, but as the previous day's temperature increases we see a tendency for the lower tail to lengthen and eventually we see a clearly bimodal density. In this example, the classical regression assumption that the covariates affect only the location of the response distribution, but not its scale or shape, is violated.
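The step from a dense grid of estimated conditional quantiles to a conditional density estimate can be sketched with a simple difference quotient, f(Q(τ_j)) ≈ (τ_{j+1} − τ_{j−1}) / (Q(τ_{j+1}) − Q(τ_{j−1})); the kernel smoothing used for Figure 3 is not reproduced here, and the quantile grid and the standard normal check below are purely illustrative.

    import numpy as np
    from scipy.stats import norm

    def density_from_quantiles(taus, qhat):
        """Difference-quotient density estimate at the estimated quantiles:
        the reciprocal of a centered slope of the quantile function."""
        taus = np.asarray(taus, dtype=float)
        qhat = np.asarray(qhat, dtype=float)
        dens = (taus[2:] - taus[:-2]) / (qhat[2:] - qhat[:-2])
        return qhat[1:-1], dens  # evaluation points and density values

    # Sanity check against a known density: feed in exact normal quantiles.
    taus = np.linspace(0.01, 0.99, 99)
    points, dens = density_from_quantiles(taus, norm.ppf(taus))
    print(np.max(np.abs(dens - norm.pdf(points))))  # small approximation error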

3. Interpretation of Quantile Regression

Least squares estimation of mean regression models asks the question, "How does the conditional mean of Y depend on the covariates X?" Quantile regression asks this question at each quantile of the conditional distribution, enabling one to obtain a more complete description of how the conditional distribution of Y given X = x depends on x. Rather than assuming that covariates shift only the location or scale of the conditional distribution, quantile regression methods enable one to explore potential effects on the shape of the distribution as well. Thus, for example, the effect of a job-training program on the length of participants' current unemployment spell might be to lengthen the shortest spells while dramatically reducing the probability of very long spells. The mean treatment effect in such circumstances might be small, but the treatment effect on the shape of the distribution of unemployment durations could, nevertheless, be quite significant.

3.1. Quantile Treatment Effects. The simplest formulation of quantile regression is the two-sample treatment-control model. In place of the classical Fisherian experimental design model, in which the treatment induces a simple location shift of the response distribution, Lehmann (1974) proposed the following general model of treatment response:

"Suppose the treatment adds the amount Δ(x) when the response of the untreated subject would be x. Then the distribution G of the treatment responses is that of the random variable X + Δ(X) where X is distributed according to F."

Special cases obviously include the location shift model, Δ(X) = Δ_0, and the scale shift model, Δ(X) = σX, but the general case is natural within the quantile regression paradigm.

Doksum (1974) shows that if Δ(x) is defined as the "horizontal distance" between F and G at x, so that

F(x) = G(x + Δ(x)),

then Δ(x) is uniquely defined and can be expressed as

Δ(x) = G⁻¹(F(x)) − x.

Changing variables so τ = F(x), one may define the quantile treatment effect,

δ(τ) = Δ(F⁻¹(τ)) = G⁻¹(τ) − F⁻¹(τ).

In the two-sample setting this quantity is naturally estimable by

δ̂(τ) = Ĝ_n⁻¹(τ) − F̂_m⁻¹(τ),

where Ĝ_n and F̂_m denote the empirical distribution functions of the treatment and control observations, based on n and m observations respectively.
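A minimal sketch of this two-sample estimator, using nothing beyond empirical quantiles (the simulated samples, in which the treatment stretches the upper tail, are an illustrative assumption):

    import numpy as np

    def qte(treated, control, taus):
        """Lehmann-Doksum quantile treatment effect estimate:
        delta_hat(tau) = G_n^{-1}(tau) - F_m^{-1}(tau) via empirical quantiles."""
        return np.quantile(treated, taus) - np.quantile(control, taus)

    rng = np.random.default_rng(1)
    control = rng.exponential(scale=1.0, size=400)
    treated = 1.5 * rng.exponential(scale=1.0, size=300) ** 0.8  # effect varies across quantiles

    taus = np.array([0.1, 0.25, 0.5, 0.75, 0.9])
    for tau, d in zip(taus, qte(treated, control, taus)):
        print(f"tau={tau:.2f}  delta_hat={d:+.3f}")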

Formulating the quantile regression model for the binary treatment problem as

Q_{Y_i}(τ | D_i) = α(τ) + δ(τ)D_i,

where D_i denotes the treatment indicator, with D_i = 1 indicating treatment and D_i = 0 control, the quantile treatment effect can be estimated by solving

(α̂(τ), δ̂(τ)) = argmin_{(α, δ)} Σ_{i=1}^n ρ_τ(y_i − α − δD_i).

The solution yields α̂(τ) = F̂_n⁻¹(τ), corresponding to the control sample, and

α̂(τ) + δ̂(τ) = Ĝ_n⁻¹(τ).

Doksum suggests that one may interpret control subjects in terms of a latent characteristic: for example, in survival analysis applications, a control subject may be called frail if he is prone to die at an early age, and robust if he is prone to die at an advanced age. This latent characteristic is thus implicitly indexed by τ, the quantile of the survival distribution at which the subject would appear if untreated, i.e., (Y_i | D_i = 0) = α(τ). The treatment, under the Lehmann model, is assumed to alter the subject's control response, α(τ), making it α(τ) + δ(τ) under the treatment. If the latent characteristic, say the propensity for longevity, were observable ex ante, then one could view the treatment effect δ(τ) as an explicit interaction with this observable variable. In the absence of such an observable variable, however, the quantile treatment effect may be regarded as a natural measure of the treatment response.

It may be noted that the quantile treatment effect is intimately tied to the two-sample QQ-plot, which has a long history as a graphical diagnostic device. The function Δ̂(x) = Ĝ_n⁻¹(F̂_m(x)) − x is exactly what is plotted in the traditional two-sample QQ-plot. If F and G are identical, then the function Ĝ_n⁻¹(F̂_m(x)) will lie along the 45 degree line; if they differ only by a location-scale shift, then Ĝ_n⁻¹(F̂_m(x)) will lie along another line with intercept and slope determined by the location and scale shift, respectively. Quantile regression may be seen as a means of extending the two-sample QQ-plot and related methods to general regression settings with continuous covariates.

When the treatment variable takes more than two values, the Lehmann-Doksum quantile treatment effect requires only minor reinterpretation. If the treatment variable is continuous, as, for example, in dose-response studies, then it is natural to consider the assumption that its effect is linear, and write

Q_{Y_i}(τ | x_i) = α(τ) + β(τ)x_i.

We assume thereby that the treatment effect, β(τ), of changing x from x_0 to x_0 + 1 is the same as the treatment effect of an alteration of x from x_1 to x_1 + 1. Note that this notion of the quantile treatment effect measures, for each τ, the change in the response required to stay on the τth conditional quantile function.

3.2. Transformation Equivariance of Quantile Regression. An important property of the quantile regression model is that, for any monotone function, h,

Q_{h(T)}(τ | x) = h(Q_T(τ | x)).

This follows immediately from observing that

Prob(T ≤ t | x) = Prob(h(T) ≤ h(t) | x).

This equivariance to monotone transformations of the conditional quantile function is a crucial feature, allowing one to decouple the potentially conflicting objectives of transformations of the response variable. This equivariance property is in direct contrast to the inherent conflicts in estimating transformation models for conditional mean relationships. Since, in general, E[h(T) | x] ≠ h(E[T | x]), the transformation alters in a fundamental way what is being estimated in ordinary least squares regression.
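A quick numerical illustration of this contrast (the sample and the choice h = exp are arbitrary): an empirical quantile taken as an order statistic commutes exactly with a monotone increasing transformation, while the sample mean does not.

    import numpy as np

    def emp_quantile(y, tau):
        """Empirical quantile taken as the order statistic y_(ceil(n * tau))."""
        ys = np.sort(np.asarray(y, dtype=float))
        return ys[int(np.ceil(tau * len(ys))) - 1]

    rng = np.random.default_rng(7)
    y = rng.normal(size=200)
    h = np.exp  # a monotone increasing transformation
    tau = 0.75

    print(h(emp_quantile(y, tau)), emp_quantile(h(y), tau))  # identical values
    print(h(np.mean(y)), np.mean(h(y)))                      # different values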

A particularly important application of this equivariance result, and one that has proven extremely influential in the econometric application of quantile regression, involves censoring of the observed response variable. The simplest model of censoring may be formulated as follows. Let y_i* denote a latent (unobservable) response assumed to be generated from the linear model

y_i* = x_i'β + u_i,   i = 1, ..., n,

with {u_i} iid from distribution function F. Due to censoring, the y_i*'s are not observed directly; instead one observes

y_i = max{0, y_i*}.

Powell (1986) noted that the equivariance of the quantiles to monotone transformations implied that in this model the conditional quantile functions of the response depended only on the censoring point, but were independent of F. Formally, the τth conditional quantile function of the observed response, y_i, in this model may be expressed as

Q_{y_i}(τ | x_i) = max{0, x_i'β + F_u⁻¹(τ)}.


The parameters of the conditional quantile functions may now be estimated by solving

min_b Σ_{i=1}^n ρ_τ(y_i − max{0, x_i'b}),

where it is assumed that the design vectors, x_i, contain an intercept to absorb the additive effect of F_u⁻¹(τ). This model is computationally somewhat more demanding than conventional linear quantile regression because it is non-linear in its parameters.
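The sketch below minimizes this censored objective directly with a derivative-free search. This is only a crude illustration of the estimator's definition, not of the specialized algorithms used in practice, and the data-generating design, starting value, and choice of optimizer are all assumptions made for the example.

    import numpy as np
    from scipy.optimize import minimize

    def rho(u, tau):
        return u * (tau - (u < 0))

    def censored_qr(y, X, tau, b0):
        """Powell-type censored quantile regression:
        minimize sum_i rho_tau(y_i - max(0, x_i'b)) by direct search from b0."""
        obj = lambda b: rho(y - np.maximum(0.0, X @ b), tau).sum()
        return minimize(obj, b0, method="Nelder-Mead",
                        options={"xatol": 1e-6, "fatol": 1e-6, "maxiter": 20000}).x

    rng = np.random.default_rng(3)
    n = 1000
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    ystar = X @ np.array([0.5, 1.0]) + rng.normal(size=n)  # latent response
    y = np.maximum(0.0, ystar)                             # censoring at zero

    b0 = np.linalg.lstsq(X, y, rcond=None)[0]              # crude starting value
    print(censored_qr(y, X, tau=0.5, b0=b0))               # roughly (0.5, 1.0) at the median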

3.3. Robustness. Robustness to distributional assumptions is an important consideration throughout statistics, so it is important to emphasize that quantile regression inherits certain robustness properties of the ordinary sample quantiles. The estimates and the associated inference apparatus have an inherent distribution-free character, since quantile estimation is influenced only by the local behavior of the conditional distribution of the response near the specified quantile. Given a solution β̂(τ), based on observations, {y, X}, as long as one doesn't alter the sign of the residuals, any of the y observations may be arbitrarily altered without altering the initial solution. Only the signs of the residuals matter in determining the quantile regression estimates, and thus outlying responses influence the fit in so far as they are either above or below the fitted hyperplane, but how far above or below is irrelevant.

While quantile regression estimates are inherently robust to contamination of the response observations, they can be quite sensitive to contamination of the design observations, {x_i}. Several proposals have been made to ameliorate this effect.

4. Computational Aspects of Quantile Regression

Although it was recognized by a number of early authors, including Gauss, that solutions to the median regression problem are characterized by an exact fit through p sample observations when p linear parameters are estimated, no effective algorithm arose until the development of linear programming in the 1940's. It was then quickly recognized that the median regression problem could be formulated as a linear program, and the simplex method employed to solve it. The algorithm of Barrodale and Roberts (1973) provided the first efficient implementation specifically designed for median regression and is still widely used in statistical software. It can be concisely described as follows. At each step, we have a trial set of p "basic observations" whose exact fit may constitute a solution. We compute the directional derivative of the objective function in each of the 2p directions that correspond to removing one of the current basic observations and taking either a positive or a negative step. If none of these directional derivatives is negative, the solution has been found; otherwise one chooses the most negative, the direction of steepest descent, and goes in that direction until the objective function ceases to decrease. This one-dimensional search can be formulated as the problem of finding the solution to a scalar weighted quantile problem. Having chosen the step length, we have in effect determined a new observation to enter the basic set; a simplex pivot occurs to update the current solution, and the iteration continues.

This modified simplex strategy is highly effective on problems with a modest number of observations, achieving speeds comparable to the corresponding least squares solutions. But for larger problems with, say, n > 100,000 observations, the simplex approach eventually becomes considerably slower than least squares. For large problems, recently developed interior point methods for linear programming are highly effective. Portnoy and Koenker (1997) describe an approach that combines some statistical preprocessing with interior point methods and achieves performance comparable to least squares solutions even in very large problems.
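To make the linear programming connection explicit, here is a sketch of median regression written in the standard split-residual LP form, min 1'u + 1'v subject to Xb + u − v = y with u, v ≥ 0, and handed to scipy's general-purpose linprog solver rather than the specialized simplex or interior point codes discussed above; the simulated data are illustrative.

    import numpy as np
    from scipy.optimize import linprog

    def median_regression_lp(y, X):
        """L1 (median) regression as a linear program:
        min 1'u + 1'v  s.t.  Xb + u - v = y,  u >= 0, v >= 0,  b free."""
        n, p = X.shape
        c = np.concatenate([np.zeros(p), np.ones(2 * n)])
        A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
        bounds = [(None, None)] * p + [(0, None)] * (2 * n)
        res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
        return res.x[:p]

    rng = np.random.default_rng(5)
    n = 200
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=3, size=n)  # heavy-tailed noise
    print(median_regression_lp(y, X))  # close to (1.0, 2.0)

For a general quantile τ one simply weights the u-part of the objective by τ and the v-part by 1 − τ.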


An important feature of the linear programming formulation of quantile regression is that the entire range of solutions for τ ∈ (0, 1) can be efficiently computed by parametric programming. At any solution β̂(τ_0) there is an interval of τ's over which this solution remains optimal; it is straightforward to compute the endpoints of this interval, and thus one can solve iteratively for the entire sample path β̂(τ) by making one simplex pivot at each of the endpoints of these intervals.

5. Statistical Inference for Quantile Regression

The asymptotic behavior of the quantile regression process {β̂(τ) : τ ∈ (0, 1)} closely parallels the theory of ordinary sample quantiles in the one-sample problem. Koenker and Bassett (1978) show that in the classical linear model,

y_i = x_i'β + u_i,

with u_i iid from df F, with density f(u) > 0 on its support {u : 0 < F(u) < 1}, the joint asymptotic distribution of the m p-variate random vectors √n(β̂_n(τ_i) − β(τ_i)), i = 1, ..., m, is normal with mean 0 and covariance matrix Ω ⊗ D⁻¹. Here β(τ) = β + F_u⁻¹(τ)e_1, e_1 = (1, 0, ..., 0)', x_{1i} ≡ 1, n⁻¹ Σ_i x_i x_i' → D, a positive definite matrix, and

Ω = (ω_{ij}),   ω_{ij} = (min{τ_i, τ_j} − τ_i τ_j) / [f(F⁻¹(τ_i)) f(F⁻¹(τ_j))].

When the response is conditionally independent over i, but not identically distributed, the asymptotic covariance matrix of ζ(τ) = √n(β̂(τ) − β(τ)) is somewhat more complicated. Let ξ_i(τ) = x_i'β(τ) denote the conditional quantile function of y_i given x_i, and f_i(·) the corresponding conditional density, and define

J_n(τ_1, τ_2) = (min{τ_1, τ_2} − τ_1 τ_2) n⁻¹ Σ_{i=1}^n x_i x_i',

and

H_n(τ) = n⁻¹ Σ_i x_i x_i' f_i(ξ_i(τ)).

Under mild regularity conditions on the {f_i}'s and {x_i}'s, we have joint asymptotic normality for the vectors (ζ(τ_1), ..., ζ(τ_m)) with mean zero and covariance matrix

V_n = (H_n(τ_i)⁻¹ J_n(τ_i, τ_j) H_n(τ_j)⁻¹)_{i,j=1}^m.
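As a hedged sketch of how these matrices might be assembled for a single τ, the code below takes the conditional density values f_i(ξ_i(τ)) as given; in applications they must themselves be estimated, for example from difference quotients of neighbouring fitted quantiles, a step not shown here.

    import numpy as np

    def qr_sandwich_cov(X, fvals, tau):
        """Sandwich covariance H_n(tau)^{-1} J_n(tau, tau) H_n(tau)^{-1} for one
        quantile, given the design X and fvals[i] = f_i(xi_i(tau))."""
        n = X.shape[0]
        J = tau * (1.0 - tau) * (X.T @ X) / n  # J_n(tau, tau)
        H = (X.T * fvals) @ X / n              # H_n(tau) = n^{-1} sum_i f_i x_i x_i'
        Hinv = np.linalg.inv(H)
        return Hinv @ J @ Hinv

    # Illustrative iid check: with a common density value f0, the result reduces
    # to tau * (1 - tau) / f0^2 times D_n^{-1}, the Koenker-Bassett form above.
    rng = np.random.default_rng(11)
    X = np.column_stack([np.ones(500), rng.normal(size=500)])
    print(qr_sandwich_cov(X, np.full(500, 0.30), tau=0.5))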

An important link to the classical theory of rank tests was made by Gutenbrunner and Jurečková (1992), who showed that the rankscore functions of Hájek and Šidák (1967) could be viewed as a special case of a more general formulation for the linear quantile regression model. The formal dual of the quantile regression linear programming problem may be expressed as

max{y'a | X'a = (1 − τ)X'1, a ∈ [0, 1]^n}.

The dual solution â(τ) reduces to the Hájek-Šidák rankscore process when the design matrix, X, takes the simple form of an n-vector of ones. The regression rankscore process â(τ) behaves asymptotically much like the classical univariate rankscore process, and thus offers a way to extend many rank based inference procedures to the more general regression context.


6. Extensions and Future Developments

There is considerable scope for further development of quantile regression methods. Applications to survival analysis and time-series modeling seem particularly attractive, where censoring and recursive estimation pose, respectively, interesting challenges. For the classical latent variable form of the binary response model, where

y_i = I(x_i'β + u_i ≥ 0)

and the median of u_i conditional on x_i is assumed to be zero for all i = 1, ..., n, Manski (1975) proposed an estimator solving

max_{‖b‖=1} Σ_i (y_i − 1/2) I(x_i'b ≥ 0).

This "maximum score" estimator can be viewed as a median version of the general linear quantile regression estimator for binary response,

min_{‖b‖=1} Σ_i ρ_τ(y_i − I(x_i'b ≥ 0)).

In this formulation it is possible to estimate a family of quantile regression models and explore, semi-parametrically, a full range of linear conditional quantile functions for the latent variable form of the binary response model.
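Both of these objectives are step functions of b, so a derivative-free search is required. The toy sketch below computes the maximum score estimate by simply scanning directions on the unit circle for a two-covariate design; the angular grid, the data-generating process, and the error scale are all assumptions made for the example.

    import numpy as np

    def max_score_2d(y, X, n_grid=2000):
        """Manski's maximum score estimator for p = 2: scan unit vectors
        b(theta) = (cos theta, sin theta), maximizing sum_i (y_i - 1/2) I(x_i'b >= 0)."""
        thetas = np.linspace(0.0, 2.0 * np.pi, n_grid, endpoint=False)
        B = np.column_stack([np.cos(thetas), np.sin(thetas)])  # candidate directions
        scores = ((y - 0.5)[:, None] * (X @ B.T >= 0)).sum(axis=0)
        return B[np.argmax(scores)]

    rng = np.random.default_rng(9)
    n = 2000
    X = rng.normal(size=(n, 2))
    beta = np.array([1.0, 2.0]) / np.linalg.norm([1.0, 2.0])  # true direction (unit norm)
    u = 0.3 * rng.standard_cauchy(size=n)                     # median-zero errors
    y = (X @ beta + u >= 0).astype(float)
    print(max_score_2d(y, X))  # roughly equal to the true unit direction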

Koenker and Machado (1999) introduce inference methods closely related to classical goodness-of-fit statistics based on the full quantile regression process. There have been several proposals dealing with generalizations of quantile regression to nonparametric response functions involving both local polynomial methods and splines. Extension of quantile regression methods to multivariate response models is a particularly important challenge.

7. Conclusion

Classical least squares regression may be viewed as a natural way of extending the idea of estimating an unconditional mean parameter to the problem of estimating conditional mean functions; the crucial step is the formulation of an optimization problem that encompasses both problems. Likewise, quantile regression offers an extension of univariate quantile estimation to estimation of conditional quantile functions via an optimization of a piecewise linear objective function in the residuals. Median regression minimizes the sum of absolute residuals, an idea introduced by Boscovich in the 18th century.

The asymptotic theory of quantile regression closely parallels the theory of the univariate sample quantiles; computation of quantile regression estimators may be formulated as a linear programming problem and efficiently solved by simplex or barrier methods. A close link to rank based inference has been forged from the theory of the dual regression quantile process, or regression rankscore process.

Recent non-technical introductions to quantile regression are provided by Buchinsky (1998) and Koenker (2001). A more complete introduction will be provided in the forthcoming monograph of Koenker (2002). Most of the major statistical computing languages now include some capabilities for quantile regression estimation and inference. Quantile regression packages are available for R and Splus from the R archives at http://lib.stat.cmu.edu/R/CRAN and Statlib at http://lib.stat.cmu.edu/S, respectively. Stata's central core provides quantile regression estimation and inference functions. SAS offers some, rather limited, facilities for quantile regression.


References

Barrodale, I. and F.D.K. Roberts (1974). Solution of an overdetermined system of equations in the ℓ₁ norm, Communications of the ACM, 17, 319-320.

Buchinsky, M. (1998). Recent advances in quantile regression models: A practical guide for empirical research, J. of Human Resources, 33, 88-126.

Doksum, K. (1974). Empirical probability plots and statistical inference for nonlinear models in the two sample case, Annals of Statistics, 2, 267-277.

Gutenbrunner, C. and J. Jurečková (1992). Regression quantile and regression rank score process in the linear model and derived statistics, Annals of Statistics, 20, 305-330.

Hájek, J. and Z. Šidák (1967). Theory of Rank Tests, Academia: Prague.

Hyndman, R.J., D.M. Bashtannyk, and G.K. Grunwald (1996). Estimating and visualizing conditional densities, J. of Computational and Graphical Statistics, 5, 315-336.

Koenker, R. and G. Bassett (1978). Regression quantiles, Econometrica, 46, 33-50.

Koenker, R. (2001). Quantile regression, J. of Economic Perspectives, forthcoming.

Koenker, R. (2002). Quantile Regression, forthcoming.

Lehmann, E. (1974). Nonparametrics: Statistical Methods Based on Ranks, Holden-Day: San Francisco.

Manski, C. (1985). Semiparametric analysis of discrete response: asymptotic properties of the maximum score estimator, J. of Econometrics, 27, 313-334.

Portnoy, S. and R. Koenker (1997). The Gaussian hare and the Laplacian tortoise: computability of squared-error vs. absolute-error estimators, with discussion, Statistical Science, 12, 279-300.

Powell, J.L. (1986). Censored regression quantiles, J. of Econometrics, 32, 143-155.