arXiv:0806.4445v1 [math.ST] 27 Jun 2008

Institute of Mathematical Statistics
COLLECTIONS
Volume 3

Pushing the Limits of Contemporary Statistics:
Contributions in Honor of Jayanta K. Ghosh

Bertrand Clarke and Subhashis Ghosal, Editors

Institute of Mathematical Statistics
Beachwood, Ohio, USA

Institute of Mathematical Statistics Collections
Series Editor: Anthony Davison

The production of the Institute of Mathematical Statistics Collections is managed by the IMS Office: Rong Chen, Treasurer, and Elyse Gustafson, Executive Director.

Library of Congress Control Number: 2008924408
International Standard Book Number 978-0-940600-75-1
International Standard Serial Number 1939-4039

Copyright © 2008 Institute of Mathematical Statistics
All rights reserved
Printed in the United States of America

Contents

Preface
    Bertrand Clarke and Subhashis Ghosal .......................... v
Contributors ...................................................... vii
J. K. Ghosh’s contribution to statistics: A brief outline
    Bertrand Clarke and Subhashis Ghosal .......................... 1
Objective Bayesian analysis under sequential experimentation
    Dongchu Sun and James O. Berger ............................... 19
Sequential tests and estimates after overrunning based on p-value combination
    W. J. Hall and Keyue Ding ..................................... 33
On predictive probability matching priors
    Trevor J. Sweeting ............................................ 46
Data-dependent probability matching priors for empirical and related likelihoods
    Rahul Mukerjee ................................................ 60
Probability matching priors for some parameters of the bivariate normal distribution
    Malay Ghosh, Upasana Santra and Dalho Kim ..................... 71
Fuzzy set representation of a prior distribution
    Glen Meeden ................................................... 82
Fuzzy sets in nonparametric Bayes regression
    Jean-François Angers and Mohan Delampady ...................... 89
Objective Bayes testing of Poisson versus inflated Poisson models
    M. J. Bayarri, James O. Berger and Gauri S. Datta ............. 105
Consistent selection via the Lasso for high dimensional approximating regression models
    Florentina Bunea .............................................. 122
Asymptotic optimality of a cross-validatory predictive approach to linear model selection
    Arijit Chakrabarti and Tapas Samanta .......................... 138
Risk and resampling under model uncertainty
    Snigdhansu Chatterjee and Nitai D. Mukhopadhyay ............... 155
Remarks on consistency of posterior distributions
    Taeryon Choi and R. V. Ramamoorthi ............................ 170
Large sample asymptotics for the two-parameter Poisson–Dirichlet process
    Lancelot F. James ............................................. 187
Reproducing kernel Hilbert spaces of Gaussian priors
    A. W. van der Vaart and J. H. van Zanten ...................... 200
A Bayesian semi-parametric model for small area estimation
    Donald Malec and Peter Müller ................................. 223
A hierarchical Bayesian approach for estimating the origin of a mixed population
    Feng Guo, Dipak K. Dey and Kent E. Holsinger .................. 237
Kendall’s tau in high-dimensional genomic parsimony
    Pranab K. Sen ................................................. 251
Orthogonalized smoothing for rescaled spike and slab models
    Hemant Ishwaran and Ariadni Papana ............................ 267
Nonparametric statistics on manifolds with applications to shape spaces
    Abhishek Bhattacharya and Rabi Bhattacharya ................... 282
An ensemble approach to improved prediction from multitype data
    Jennifer Clarke and David Seo ................................. 302
Sharp failure rates for the bootstrap particle filter in high dimensions
    Peter Bickel, Bo Li and Thomas Bengtsson ...................... 318

Preface

Jayanta Kumar Ghosh is one of the most extraordinary professors in the field of Statistics. His research in numerous areas, especially asymptotics, has been groundbreaking, influential throughout the world, and widely recognized through awards and other honors. His leadership in Statistics as Director of the Indian Statistical Institute and President of the International Statistical Institute, among other eminent positions, has been likewise outstanding.

In recognition of Jayanta’s enormous impact, this volume is an effort to honor him by drawing together contributions to the main areas in which he has worked and continues to work. The papers naturally fall into five categories.

First, sequential estimation was Jayanta’s starting point. Thus, beginning with that topic, there are two papers: one classical, by Hall and Ding, leading to a variant on p-values, and one Bayesian, by Berger and Sun, extending reference priors to stopping time problems.

Second, there are five papers in the general area of prior specification. Much of Jayanta’s earlier work involved group families, as does Sweeting’s paper here, for instance. There are also two papers dwelling on the link between fuzzy sets and priors, by Meeden and by Delampady and Angers. Equally daring is the work by Mukerjee with data-dependent priors and the pleasing confluence of several prior selection criteria found by Ghosh, Santra and Kim. Jayanta himself studied a variety of prior selection criteria, including probability matching priors and reference priors.

Third, between his work on parametric Bayes and nonparametrics, Jayanta took an interest in model selection. Accordingly, three papers on model selection come next. Bunea’s work on consistency echoes Jayanta’s work on consistency of the BIC. Chatterjee and Mukhopadhyay’s work on data adaptive model averaging continues the direction they started under Jayanta’s guidance. Chakrabarti and Samanta’s work on the asymptotic optimality of predictive cross validation contrasts nicely with standard Bayes model selection, via the BIC for instance.

Fourth, there are five papers generally on Bayesian nonparametrics. Some are applied, as in Malec and Mueller’s work on semi-parametrics in small area estimation or Guo, Dey and Holsinger’s work carefully using prior selection for modeling purposes. And some are more theoretical: Choi and Ramamoorthi provide a review, with some new results, on posterior consistency, while James focuses on a class of priors, and van der Vaart and van Zanten focus on the role of reproducing kernel Hilbert spaces in Bayesian nonparametrics with Gaussian process priors.

Finally, Jayanta has most recently turned his attention to high dimensional problems. On this topic, there are five papers from a variety of standpoints. For instance, it is possible to make unexpected use of the information in the large dimensions themselves, as in Sen’s work with Kendall’s tau. Others focus on the parametric parts of a nonparametric model, as in Ishwaran and Papana, or in Bhattacharya and Bhattacharya.
A third tack, in Clarke and Seo, is the focus on selecting the dimensions for use in emerging model classes. Finally, the work of Bickel, Li and Bengtsson establishes a general convergence result for computing conditional distributions.

As can be seen, some papers fit comfortably into more than one section and some only fit into a section if it is interpreted broadly. Even so, we would like to think that the papers have achieved a nice tradeoff between clustering rather nicely around the topic of each category and maintaining a reasonable diversity in line with Jayanta’s work.

Despite his manifold research interests, asymptotics and their applications have been the main recurring theme of Jayanta’s research since he published his first paper in sequential statistics in 1960 (at the age of 23). So, as a generality, asymptotics undergirds most of the material in this volume honoring him. Fortunately, asymptotic thinking pervades statistical inference, even in the most applied contexts, so this is hardly a limitation. On the other hand, asymptotics has a way of being impenetrably abstract. However, all the papers here are, in Woodroofe’s memorable phrase, written at a level that would be ‘accessible to a determined graduate student’. We encourage readers to have a look at least at the introductions of papers outside their research area, just for pure love of the field and the joy of intellectual stimulation. We suspect that once someone has read the introduction, he or she will be ineluctably led to finish reading the whole paper.

As editors, we have been delighted at the depth and quality of work our contributors submitted. They all make foundational points in the spirit of Jayanta. We believe each paper will be of interest to researchers, theoretical and applied, who confront problems that are difficult enough that conventional solutions are inadequate and closed form solutions are intractable in the several areas covered here. We are deeply grateful to all contributors for offering their finest work to this volume.

Of course, no volume such as this could have been possible without the free and anonymous labor of referees: you folks know who you are, but for the sake of confidentiality we cannot name you. We especially thank those who provided extremely prompt reports when we badly needed them. If any of you meet one of us at a conference, we owe you a drink. Probably two – you helped us immeasurably.

In terms of actually producing this volume, Jennifer Clarke provided invaluable support. She helped us repeatedly with compiling complete versions of the volume. In particular, the final, detailed copy-editing was largely her work; the balance of her account