American Political Science Review Vol. 89, No. 2 June 1995

THE IMPORTANCE OF RESEARCH DESIGN IN POLITICAL SCIENCE

GARY KING, ROBERT O. KEOHANE, and SIDNEY VERBA
Harvard University

Receiving five serious reviews in this symposium is gratifying and confirms our belief that research design should be a priority for our discipline. We are pleased that our five distinguished reviewers appear to agree with our unified approach to the logic of inference in the social sciences, and with our fundamental point: that good quantitative and good qualitative research designs are based fundamentally on the same logic of inference. The reviewers also raised virtually no objections to the main practical contribution of our book--our many specific procedures for avoiding bias, getting the most out of qualitative data, and making reliable inferences. However, the reviews make clear that although our book may be the latest word on research design in political science, it is surely not the last. We are taxed for failing to include important issues in our analysis and for dealing inadequately with some of what we included.

Before responding to the reviewers' most direct criticisms, let us explain what we emphasize in Designing Social Inquiry and how it relates to some of the points raised by the reviewers.

WHAT WE TRIED TO DO

Designing Social Inquiry grew out of our discussions while coteaching a graduate seminar on research design, reflecting on job talks in our department, and reading the professional literature in our respective subfields. Although many of the students, job candidates, and authors were highly sophisticated qualitative and quantitative data collectors, interviewers, soakers and pokers, theorists, philosophers, formal modelers, and advanced statistical analysts, many nevertheless had trouble defining a research question and designing the empirical research to answer it. The students proposed impossible fieldwork to answer unanswerable questions. Even many active scholars had difficulty with the basic questions: What do you want to find out? How are you going to find it out? and, above all, How would you know if you were right or wrong?

We found conventional statistical training to be only marginally relevant to those with qualitative data. We even found it inadequate for students with projects amenable to quantitative analysis, since social science statistics texts do not frequently focus on research design in observational settings. With a few important exceptions, the scholarly literatures in quantitative political methodology and other social science statistics fields treat existing data and their problems as given. As a result, these literatures largely ignore research design and, instead, focus on making valid inferences through statistical corrections to data problems. This approach has led to some dramatic progress; but it slights the advantage of improving research design to produce better data in the first place, which almost always improves inferences more than the necessarily after-the-fact statistical solutions.

This lack of focus on research design in social science statistics is as surprising as it is disappointing, since some of the most historically important works in the more general field of statistics are devoted to problems of research design (see, e.g., Fisher (1935) The Design of Experiments). Experiments in the social sciences are relatively uncommon, but we can still have an enormous effect on the value of our qualitative or quantitative information, even without statistical corrections, by improving the design of our research. We hope our book will help move these fields toward studying innovations in research design.

We culled much useful information from the social science statistics literatures and qualitative methods fields. But for our goal of explicating and unifying the logic of inference, both literatures had problems. Social science statistics focuses too little on research design, and its language seems arcane if not impenetrable. The numerous languages used to describe methods in qualitative research are diverse, inconsistent in jargon and methodological advice, and not always helpful to researchers. We agree with David Collier that aspects of our advice can be rephrased into some of the languages used in the qualitative methods literature or that used by quantitative researchers. We hope our unified logic and, as David Laitin puts it, our "common vocabulary" will help foster communication about these important issues among all social scientists. But we believe that any coherent language could be used to convey the same ideas.

We demonstrated that "the differences between the quantitative and qualitative traditions are only stylistic and are methodologically and substantively unimportant" (p. 4). Indeed, much of the best social science research can combine quantitative and qualitative data, precisely because there is no contradiction between the fundamental processes of inference involved in each. Sidney Tarrow asks whether we agree that "it is the combination of quantitative and qualitative" approaches that we desire (p. 473). We do. But to combine both types of data sources productively, researchers need to understand the fundamental logic of inference and the more specific rules and procedures that follow from an explication of this logic.

Social science, both quantitative and qualitative, seeks to develop and evaluate theories. Our concern is less with the development of theory than theory evaluation--how to use the hard facts of empirical reality to form scientific opinions about the theories and generalizations that are the hoped-for outcome of our efforts. Our social scientist uses theory to generate observable implications, then systematically applies publicly known procedures to infer from evidence whether what the theory implied is correct. Some theories emerge from detailed observation, but they should be evaluated with new observations, preferably ones that had not been gathered when the theories were being formulated. Our logic of theory evaluation stresses maximizing leverage--explaining as much as possible with as little as possible. It also stresses minimizing bias. Lastly, though it cannot eliminate uncertainty, it encourages researchers to report estimates of the uncertainty of their conclusions.

Theory and empirical work, from this perspective, cannot productively exist in isolation. We believe that it should become standard practice to demand clear implications of theory and observations checking those implications derived through a method that minimizes bias. We hope that Designing Social Inquiry helps to "discipline political science" in this way, as David Laitin recommends; and we hope, along with James Caporaso, that "improvements in measurement accuracy, theoretical specification, and research should yield a smaller range of allowable outcomes consistent with the predictions made" (p. 459).

Our book also contains much specific advice, some of it new and some at least freshly stated. We explain how to distinguish systematic from nonsystematic components of phenomena under study and focus explicitly on trade-offs that may exist between the goals of unbiasedness and efficiency (chap. 2). We discuss causality in relation to counterfactual analysis and what Paul Holland calls the "fundamental problem of causal inference" and consider possible complications introduced by thinking about causal mechanisms and multiple causality (chap. 3). Our discussion of counterfactual reasoning is, we believe, consistent with Donald Campbell's "quasi-experimental" emphasis; and we thank James Caporaso for clarifying this.

We pay special attention in chapter 4 to issues of what to observe: how to avoid confusion about what constitutes a "case" and, especially, how to avoid or limit selection bias. We show that selection on values of explanatory variables does not introduce bias but [...]

[...] specify ways to increase the information in qualitative studies that can be used to evaluate theories; we show how this can be accomplished without returning to the field for additional data collection. Throughout the book, we illustrate our propositions not only with hypothetical examples but with reference to some of the best contemporary research in political science.

This statement of our purposes and fundamental arguments should put some of the reviewers' complaints about omissions into context. Our book is about doing empirical research designed to evaluate theories and learn about the world--to make inferences--not about generating theories to evaluate. We believe that researchers who understand how to evaluate a theory will generate better theories--theories that are not only more internally consistent but also have more observable implications (are more at risk of being wrong) and are more consistent with prior evidence. If, as Laitin suggests, our single-mindedness in driving home this argument led us implicitly to downgrade the importance of such matters as concept formation and theory creation in political science, this was not our intention.

Designing Social Inquiry repeatedly emphasizes the attributes of good theory. How else to avoid omitted variable bias, choose causal effects to estimate, or derive observable implications? We did not offer much advice about what is often called the "irrational nature of discovery," and we leave it to individual researchers to decide what theories they feel are worth evaluating. We do set forth some criteria for choosing theories to evaluate--in terms of their importance to social science and to the real world--but our methodological advice about research design applies to any type of theory. We come neither to praise nor to bury rational-choice theory, nor to make an argument in favor of deductive over inductive theory. All we ask is that whatever theory is chosen be evaluated by the same standards of inference. Ronald Rogowski's favorite physicist, Richard Feynman, explains clearly how to evaluate a theory (which he refers to as a "guess"): "If it disagrees with [the empirical evidence], it is wrong. In that simple statement is the key to science."