Research at the United States Bureau of the Census



Carmen E. Hoy and Cynthia Z.F. Clark
U.S. Bureau of the Census, Washington, D.C.

1. Introduction

Statistical and survey methodology research at the U.S. Bureau of the Census (Census Bureau or Bureau) is conducted by a number of researchers throughout the agency, but most of them are concentrated within the Methodology and Standards Directorate. The directorate has three divisions/offices, each of which contains groups of persons with expertise in specific areas of research. Additionally, the directorate oversees sampling and survey methodology and applied research; establishes best practices; and sets standards for the decennial census, for demographic surveys, and for economic census and survey programs of the Bureau.

The Statistical Research Division (SRD) works in three main research areas: statistical computing, survey statistics, and survey methodology.

• The statistical computing staff specializes in statistical computing, computing applications, and technology research. Statistical computing deals with data editing and imputation, record linkage, and variance estimation. Technology research deals with various technology applications, including graphics and metadata.

• The survey statistics staff focuses on sampling, estimation, and small area/domain estimation research; statistical estimation and analysis; disclosure limitation research; and time series and seasonal adjustment research.

• The survey methodology staff has expertise in social science research, i.e., questionnaire design and measurement research, usability, questionnaire pretesting, and human factors research.

The Planning, Research, and Evaluation Division (PRED) includes groups with interests in administrative records research, survey and census evaluations, and planning for the year 2010 census. The Computer Assisted Survey Research Office (CASRO) coordinates research, develops prototypes, and conducts pretests on computer-assisted survey data collection.

2. Types of Interaction with Academic and Other Researchers

On any given research project there may be various opportunities for Bureau staff to interact with academic researchers and researchers at other agencies. Some researchers are funded under contracts with the Census Bureau to conduct research off-site (i.e., at their academic institutions) on the Bureau's priority projects. Some of these contracts provide for shared expenses and benefits between the contractor and the Census Bureau. For other research contracts the Census Bureau awards a large, multi-year general research contract and pays the entire cost of services rendered. For the general research contract, many of the prime contractors are teamed with one or more organizations or have arrangements with outside experts/consultants to broaden their ability to meet all of the potential needs of the Bureau. These contracts allow divisions and offices to quickly and easily obtain outside advisory and assistance services to support their research and development efforts. The contracts undergo public competitive bidding every five years, and winners work on specific research task orders submitted by the Census Bureau throughout the contract period. The Bureau is currently in the second round of these competitive research contracts.
Since 1978 the Census Bureau has partnered with the National Science Foundation and the American Statistical Association to solicit research proposals from academics and other researchers to conduct research at the Census Bureau (on-site) through the ASA/NSF/Census Bureau Research Fellow Program. For these research projects, experienced researchers are encouraged to submit proposals whose results would benefit Census Bureau programs. The program allows researchers with Special Sworn Status to use Census Bureau confidential data that would not be made available to them at their academic sites. This arrangement facilitates interaction and the exchange of knowledge between the senior researcher and Census Bureau staff. It also exposes the researchers to interesting problems confronting the Bureau. The researchers often continue working on these problems after returning to their academic institutions, and may interest their graduate students in them, continuing a knowledge exchange over many years.

The Census Bureau has several other forums in which staff interact with academic researchers not funded by the agency. Such interactions occur at Bureau-sponsored conferences, workshops, expert review panels, professional advisory committee meetings, and professional meetings. For 12 years (1985-1996) the Census Bureau sponsored an Annual Research Conference that brought together researchers from academia and from organizations producing official statistics. This conference was successful in interesting researchers in the work of official statisticians. During 1997-98 the Census Bureau hosted three one-day conferences on, respectively, Censuses of Population and Housing, the American Community Survey, and Small Area Estimation. These conferences also attracted a mix of individuals from academia and from official statistical organizations. In November 1999 the Office of Management and Budget's Federal Committee on Statistical Methodology (FCSM) will, with significant support from the Census Bureau, host the FCSM Research Conference. In June 2000 the Census Bureau will join with other official statistical agencies in sponsoring the Second International Conference on Establishment Surveys.

Two ongoing forums are particularly valuable to the Census Bureau. The Bureau has a professional advisory committee, meeting twice yearly for two days, made up of representatives of the American Statistical Association, the Population Association of America, the American Economic Association, and the American Marketing Association. When an in-depth review of a specific survey topic is needed, the Bureau contracts with the National Academy of Sciences' Committee on National Statistics to create a panel of academic researchers to review the area and make recommendations. Several panels of note have been convened during the past few years, particularly to review census methodology:

• Panel on Census Requirements in the Year 2000 and Beyond;
• Panel to Evaluate Alternative Census Methods; and
• Panel on Alternative Census Methodologies.

Two new census panels were initiated recently: the Panel to Review the 2000 Census and the Panel on the Census in the Twenty-first Century. Additionally, the Department of Education commissioned a Panel on the Small Area Estimates of Income and Poverty (SAIPE), estimates that the Census Bureau was producing as the basis for distributing $8 billion in funds to local school districts.
More recently, a workshop on the newly initiated American Community Survey (ACS) was held: the American Community Survey Workshop - Building a Research Agenda.

3. Examples of Research Projects

We have selected several projects in the areas of research mentioned above to illustrate the impact of academic interactions on the statistical and survey research of the Census Bureau.

3.1. Mathematical Statistics Research

Disclosure Limitation Methods. The purpose of this research is to develop disclosure limitation methods for Census Bureau data products that are to be made publicly available, with emphasis on techniques that implement disclosure limitation at the processing stage. On this project, researchers from academia, other countries, and other agencies have interacted. The Census Bureau has recently funded contracts with Steve Samuels at Purdue University and, through Westat, Inc., with Stephen Fienberg at Carnegie Mellon University. Samuels' research focused on a measure of disclosure risk for microdata; Fienberg's focused on disclosure limitation methods for demographic tabular data. The Bureau plans to fund a contract for the disclosure review of American FactFinder, the computer system that will be used to facilitate user access to Census Bureau data products and user-defined data products. With academics and federal workers from nine European countries, the Bureau also funded a recent workshop on confidentiality and data access. The focus of the workshop was to coordinate disclosure limitation research, encourage collaborations, and prioritize research; attendees will now be seeking funding for this collaborative work. With Census Bureau leadership, the United States will host a follow-up conference in the fall of 2000. In addition, the Census Bureau recently made free cell-suppression and auditing software available to other statistical agencies; a minimal sketch of the cell-suppression idea appears at the end of this section.

All staff in the disclosure limitation research group within the Statistical Research Division are members of the Interagency Confidentiality and Data Access Group (ICDAG), an interest group of the FCSM. ICDAG has conducted a series of workshops on Privacy, Confidentiality, and the Protection of Data: A Statistical Perspective. It has also prepared a Checklist on Disclosure Potential of Proposed Data Releases, which aids reviewers in screening proposed data releases for potential disclosure problems. Other ICDAG products under development include five-dimensional cell-suppression auditing software, a general brochure on confidentiality and data access issues, a review of data licensing procedures, and a manual on restricted access options.

Time Series and Seasonal Adjustment Research. For many years there has been significant
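To make the cell-suppression idea concrete, here is a minimal sketch. The threshold rule, the table, and all names are illustrative assumptions only; this is not the Bureau's released software, which additionally audits that suppressed values cannot be inferred from combinations of published cells.

    # Minimal sketch of threshold-based cell suppression for a frequency
    # table (hypothetical rule and data, not the Census Bureau's software).
    THRESHOLD = 5  # assumed minimum publishable cell count

    table = {
        "county_a": [12, 3, 40],   # counts for three publication cells
        "county_b": [8, 25, 17],
    }

    def suppress(row):
        """Suppress cells under THRESHOLD, plus one complementary cell."""
        flags = [v < THRESHOLD for v in row]
        if any(flags):
            # Complementary suppression: also hide the smallest remaining
            # cell, otherwise a primary suppression could be recovered by
            # subtraction from the published row total.
            visible = [i for i, f in enumerate(flags) if not f]
            if visible:
                flags[min(visible, key=lambda i: row[i])] = True
        return ["D" if f else v for f, v in zip(flags, row)]  # "D" = withheld

    for name, row in table.items():
        print(name, suppress(row))   # county_a -> ['D', 'D', 40]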
Recommended publications
  • E-Survey Methodology
    Chapter I: E-Survey Methodology. Karen J. Jansen, The Pennsylvania State University, USA; Kevin G. Corley, Arizona State University, USA; Bernard J. Jansen, The Pennsylvania State University, USA.

    ABSTRACT: With computer network access nearly ubiquitous in much of the world, alternative means of data collection are being made available to researchers. Recent studies have explored various computer-based techniques (e.g., electronic mail and Internet surveys). However, exploitation of these techniques requires careful consideration of conceptual and methodological issues associated with their use. We identify and explore these issues by defining and developing a typology of "e-survey" techniques in organizational research. We examine the strengths, weaknesses, and threats to reliability, validity, sampling, and generalizability of these approaches. We conclude with a consideration of emerging issues of security, privacy, and ethics associated with the design and implications of e-survey methodology.

    INTRODUCTION: For the researcher considering the use of electronic surveys, there is a rapidly growing body of literature addressing design issues and providing laundry lists of costs and benefits associated with electronic survey techniques (c.f., Lazar & Preece, 1999; Schmidt, 1997; Stanton, 1998). Perhaps the three most common reasons for choosing an e-survey … (1999; Oppermann, 1995; Saris, 1991). Although research over the past 15 years has been mixed on the realization of these benefits (Kiesler & Sproull, 1986; Mehta & Sivadas, 1995; Sproull, 1986; Tse, Tse, Yin, Ting, Yi, Yee, & Hong, 1995), for the most part researchers agree that faster response times and decreased costs are attainable benefits, while response rates differ based on variables beyond administration mode alone.
  • Questionnaire Design Guidelines for Establishment Surveys
    Journal of Official Statistics, Vol. 26, No. 1, 2010, pp. 43-85. Questionnaire Design Guidelines for Establishment Surveys. Rebecca L. Morrison, Don A. Dillman, and Leah M. Christian.

    Previous literature has shown the effects of question wording or visual design on the data provided by respondents. However, few articles have been published that link the effects of question wording and visual design to the development of questionnaire design guidelines. This article proposes specific guidelines for the design of establishment surveys within statistical agencies based on theories regarding communication and visual perception, experimental research on question wording and visual design, and findings from cognitive interviews with establishment survey respondents. The guidelines are applicable to both paper and electronic instruments, and cover such topics as the phrasing of questions, the use of space, the placement and wording of instructions, the design of answer spaces, and matrices. Key words: Visual design; question wording; cognitive interviews.

    1. Introduction: In recent years, considerable effort has been made to develop questionnaire construction guidelines for how questions should appear in establishment surveys. Examples include guidelines developed by the Australian Bureau of Statistics (2006) and Statistics Norway (Nøtnæs 2006). These guidelines have utilized the rapidly emerging research on how the choice of survey mode, question wording, and visual layout influence respondent answers, in order to improve the quality of responses and to encourage similarity of construction when more than one survey data collection mode is used. Redesign efforts for surveys at the Central Bureau of Statistics in the Netherlands (Snijkers 2007), Statistics Denmark (Conrad 2007), and the Office for National Statistics in the United Kingdom (Jones et al.
  • Interactive Voice Response for Data Collection in Low and Middle-Income Countries
    Interactive Voice Response for Data Collection in Low and Middle-Income Countries. Viamo Brief, April 2018. Suggested Citation: Greenleaf, A.R., Vogel, L. 2018. Interactive Voice Response for Data Collection in Low and Middle-Income Countries. Toronto, Canada: Viamo.

    EXECUTIVE SUMMARY: Expanding mobile network coverage, the decreasing cost of cellphones and airtime, and a more literate population have made mobile phone surveys an increasingly viable option for data collection in low- and middle-income countries (LMICs). Interactive voice response (IVR) is a fast and cost-effective option for survey data collection. The benefits of trying to reach respondents in LMICs via cell phone have been described by The World Bank,[1] academics,[2,3] and practitioners[4] alike. IVR, a faster and less expensive option than face-to-face surveys, can collect data in areas that are difficult for human interviewers to reach. This brief explains applications of IVR for data collection in LMICs. Sections 1-4 provide background information about IVR and detail the advantages of "robo-calls". The next three sections explain the three main target groups for IVR. Beginning with Section 5, we outline the four approaches to sampling a general population and address IVR data quality. Known respondents, who are often enrolled for monitoring and evaluation, are covered in Section 6, along with best practices for maximizing participant engagement. Finally, in Section 7 we explain how professionals use IVR for surveillance and reporting. Woven throughout Sections 5-7, four case studies illustrate how four organizations have successfully used IVR for data collection.
  • When Should We Ask, When Should We Measure?
    CONGRESS 2015, Copyright © ESOMAR 2015. WHEN SHOULD WE ASK, WHEN SHOULD WE MEASURE? COMPARING INFORMATION FROM PASSIVE AND ACTIVE DATA COLLECTION. Melanie Revilla, Carlos Ochoa, Roos Voorend, Germán Loewe.

    INTRODUCTION: Different sources of data. Questionnaires have been a fundamental tool for market research for decades. With the arrival of the internet, the questionnaire, a tool invented more than 100 years ago, was simply adapted to online data collection. The arrival of online panels in the 2000s meant an important boost for online questionnaires and, as a consequence, the tipping point for their online migration. We have come this far making all kinds of market research projects using almost always the same tool. But this does not mean there is no room or need for improvement. If researchers massively used questionnaires during all these years, it is not because this is the tool best suited to all the problems they faced, but because there were no better alternatives available. In addition, nowadays things are changing very fast. This affects survey-based market research, particularly when the data collection process relies on respondents' memory. A tool that worked reasonably well in the past may not be sufficient anymore. Indeed, in recent decades, with the development of new technologies, an increasing part of consumer activity has moved onto the internet. Also, we have witnessed an explosion of events relevant for marketing: increased consumer ad exposure through multiple channels, increased availability of products and brands, complex and quick decision-making processes, etc. Larger memory issues may be expected in this new situation.
  • 2021 RHFS Survey Methodology
    Survey Design. For purposes of this document, the following definitions are provided:
    • Building—a separate physical structure identified by the respondent containing one or more units.
    • Property—one or more buildings owned by a single entity (person, group, leasing company, and so on). For example, an apartment complex may have several buildings, but they are owned as one property.

    Target population: All rental housing properties in the United States, circa 2020.

    Sampling frame: The RHFS sample frame is a single frame based on a subset of the 2019 American Housing Survey (AHS) sample units. The RHFS frame included all 2019 AHS sample units that were identified as:
    1. Rented or occupied without payment of rent.
    2. Owner occupied and listed as "for sale or rent".
    3. Vacant units for rent, for rent or sale, or rented but not yet occupied.

    By design, the RHFS sample frame excluded public housing and transient housing types (i.e., boat, RV, van, other); these inclusion and exclusion rules are sketched in code below. Public housing units are identified in the AHS through a match with Department of Housing and Urban Development (HUD) administrative records. The RHFS frame is derived from the AHS sample, which is itself composed of housing units derived from the Census Bureau Master Address File. The AHS sample frame excludes group quarters housing. Group quarters are places where people live or stay in a group living arrangement; examples include dormitories, residential treatment centers, skilled nursing facilities, correctional facilities, military barracks, group homes, and maritime or military vessels. As such, all of these types of group quarters housing facilities are, by design, excluded from the RHFS.
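    The frame definition above is, in effect, a filter applied to AHS sample units. A minimal sketch of those rules follows; the class, field names, and status codes are invented for illustration and are not the actual AHS variables.

        # Hypothetical sketch of the RHFS frame rules; field names and
        # status codes are invented, not actual AHS variables.
        from dataclasses import dataclass

        @dataclass
        class AhsUnit:
            status: str           # tenure/vacancy status of the unit
            public_housing: bool  # True if matched to HUD admin records
            transient: bool       # boat, RV, van, other

        IN_SCOPE = {
            "rented",
            "occupied_no_rent",
            "owner_for_sale_or_rent",
            "vacant_for_rent",
            "vacant_rent_or_sale",
            "rented_not_occupied",
        }

        def in_rhfs_frame(unit: AhsUnit) -> bool:
            """Apply the three inclusion rules, then the two exclusions."""
            return (unit.status in IN_SCOPE
                    and not unit.public_housing
                    and not unit.transient)

        units = [
            AhsUnit("rented", public_housing=False, transient=False),
            AhsUnit("rented", public_housing=True, transient=False),
            AhsUnit("owner_occupied", public_housing=False, transient=False),
        ]
        frame = [u for u in units if in_rhfs_frame(u)]
        print(len(frame))  # 1: public housing and out-of-scope units dropped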
  • A Meta-Analysis of the Effect of Concurrent Web Options on Mail Survey Response Rates
    When More Gets You Less: A Meta-Analysis of the Effect of Concurrent Web Options on Mail Survey Response Rates. Jenna Fulton and Rebecca Medway, Joint Program in Survey Methodology, University of Maryland, May 19, 2012.

    Background: Mixed-Mode Surveys
    • Growing use of mixed-mode surveys among practitioners; potential benefits for cost, coverage, and response rate.
    • One specific mixed-mode design, mail + Web, is often used in an attempt to increase response rates. Advantages: both are self-administered modes and likely have similar measurement error properties.
    • Two strategies for administration:
      • "Sequential" mixed-mode: one mode in initial contacts, switch to the other in later contacts. Benefits response rates relative to a mail survey.
      • "Concurrent" mixed-mode: both modes offered simultaneously in all contacts. Mixed effects on response rates relative to a mail survey.

    Methods: Meta-Analysis
    • Given the mixed results in the literature, we conducted a meta-analysis
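    The pooling step in a meta-analysis like the one above is commonly an inverse-variance weighted average of per-study effects. Below is a minimal fixed-effect sketch; the study effects and variances are made up, and this is not the authors' data or code.

        # Fixed-effect meta-analysis of response-rate differences
        # (concurrent web+mail minus mail-only). Numbers are invented.
        studies = [
            # (effect = difference in response rates, variance of effect)
            (-0.04, 0.0004),
            (-0.01, 0.0009),
            ( 0.02, 0.0016),
        ]

        weights = [1.0 / v for _, v in studies]   # inverse-variance weights
        pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
        se = (1.0 / sum(weights)) ** 0.5          # standard error of pooled effect

        print(f"pooled effect = {pooled:+.3f}, 95% CI half-width = {1.96 * se:.3f}")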
  • A Computationally Efficient Method for Selecting a Split Questionnaire Design
    Iowa State University Capstones, Theses and Dissertations: Creative Components, Spring 2019.

    A Computationally Efficient Method for Selecting a Split Questionnaire Design. Matthew Stuart and Cindy Yu, Department of Statistics, Iowa State University, Ames, IA 50011.

    Recommended Citation: Stuart, Matthew, "A Computationally Efficient Method for Selecting a Split Questionnaire Design" (2019). Creative Components. 252. https://lib.dr.iastate.edu/creativecomponents/252

    Abstract: Split questionnaire design (SQD) is a relatively new survey tool to reduce response burden and increase the quality of responses. Among a set of possible SQD choices, a design is considered best if it leads to the least information loss, quantified by the Kullback-Leibler divergence (KLD) distance. However, calculating the KLD distance requires computing the distribution function of the observed data after integrating out all the missing variables in a particular SQD. For a typical survey questionnaire with a large number of categorical variables, this computation can become practically infeasible. Motivated by the Horvitz-Thompson estimator, we propose an approach that approximates the distribution function of the observed data in much reduced computation time while losing little valuable information when comparing different choices of SQDs.
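    As a toy version of the selection criterion described in the abstract: a split that never asks two questions together can only recover their margins, so the best reconstruction is the independence approximation, and the KLD between the true joint and that approximation is the split's information loss. The joint distribution below is invented, and this is not the authors' algorithm.

        # Toy information-loss calculation for a split questionnaire:
        # KLD between an invented true joint and the independence
        # approximation recoverable from a split's margins.
        from math import log

        def kld(p, q):
            """Discrete Kullback-Leibler divergence D(p || q) in nats."""
            return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

        # True joint of two yes/no questions A and B, ordered as
        # P(A=0,B=0), P(A=0,B=1), P(A=1,B=0), P(A=1,B=1).
        joint = [0.40, 0.10, 0.10, 0.40]

        pa1 = joint[2] + joint[3]   # P(A=1)
        pb1 = joint[1] + joint[3]   # P(B=1)
        indep = [(1 - pa1) * (1 - pb1), (1 - pa1) * pb1,
                 pa1 * (1 - pb1), pa1 * pb1]

        # A design keeping A and B on the same form loses nothing (KLD = 0);
        # separating them costs their mutual information:
        print(f"information loss of the split: {kld(joint, indep):.4f} nats")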
  • Chi Square Survey Questionnaire
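    As a minimal illustration of the test this entry's title refers to, the sketch below runs a chi-square test of independence on an invented survey cross-tabulation; the counts are hypothetical and not taken from the entry.

        # Chi-square test of independence on an invented survey crosstab.
        # Rows: respondent group; columns: answer (yes / no / unsure).
        observed = [
            [30, 20, 10],
            [20, 25, 15],
        ]

        rows = [sum(r) for r in observed]        # row totals
        cols = [sum(c) for c in zip(*observed)]  # column totals
        n = sum(rows)

        # Expected count under independence: row total * column total / n.
        chi2 = sum(
            (observed[i][j] - rows[i] * cols[j] / n) ** 2
            / (rows[i] * cols[j] / n)
            for i in range(len(rows))
            for j in range(len(cols))
        )
        dof = (len(rows) - 1) * (len(cols) - 1)

        print(f"chi2 = {chi2:.2f} with dof = {dof}")
        # Compare to the critical value 5.99 (dof = 2, alpha = 0.05);
        # reject independence only if chi2 exceeds it.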
  • Survey Data Collection Using Complex Automated Questionnaires
    From: AAAI Technical Report SS-03-04. Compilation copyright © 2003, AAAI (www.aaai.org). All rights reserved.

    Survey Data Collection Using Complex Automated Questionnaires. William P. Mockovak, Ph.D., Bureau of Labor Statistics, Office of Survey Methods Research, 2 Massachusetts Ave. N.E., Washington, DC 20212. [email protected]

    Abstract: Since the early 1980s, researchers in the federal statistical community have been moving complex surveys from paper questionnaires to computer-assisted interviewing (CAI). The data collected through such surveys cover a variety of needs, from critical information on the health of the economy to social issues as measured by statistics on health, crime, expenditures, and education. This paper covers some of the key issues involved in developing applications used primarily by a middle-age, part-time workforce, which uses the software in a variety of situations, while interacting with an occasionally uncooperative public.

    Introduction: Government surveys provide critical information on a variety of … The CPS is conducted by a workforce of about 1,500 interviewers scattered across the U.S. who conduct interviews in respondents' homes and, occasionally, on doorsteps, porches, lawns, etc., or over the telephone from either an interviewer's home or from a centralized telephone interviewing facility. Although the composition of the interviewer workforce has undergone some changes in recent years with the increased hiring of retirees, historically, the majority of interviewers have been women between the ages of 45 and 55, who work part-time, and who lack many of the computer skills taken for granted in younger populations. Many interviewers tend to work on a single, primary survey, but are available to work on others (some continuous, some one-time).
  • Planning and Implementing Household Surveys Under COVID-19
    TECHNICAL GUIDANCE NOTE: Planning and Implementing Household Surveys Under COVID-19. Date: 15 December 2020. Keywords: Household surveys, Face-to-face interview, Safety protocol, COVID-19.

    Contents: Acknowledgements; Introduction; 1. General principles; 2. Planning data collection (2.1 Setting/Revisiting survey objectives; 2.2 Assessing the COVID-19 situation; 2.3 Building the project team; 2.4 Budgeting; 2.5 Choosing the mode of data collection; 2.6 Designing the questionnaire); 3. Field organization (3.1 Recruiting field staff; 3.2 Organizing field staff; 3.3 Advocacy and communication; 3.4 Survey materials and equipment; 3.5 Training field staff: protocols, venue, supplies, during training; 3.6 Making a fieldwork plan); 4. Fieldwork (4.1 Before the interview; 4.2 During the interview; 4.3 After the interview; 4.4 Transportation to and from the field); 5. Post fieldwork. Annex 1: Checklist for planning and carrying out household surveys under COVID-19. Annex 2: Etiquette for organizing and attending remote training sessions. Annex 3: COVID-19 Risk Assessment Questionnaire. Annex 4: Informed consent (example).

    Acknowledgements: This Technical Guidance Note was prepared by Haoyi Chen from the Inter-Secretariat Working Group on Household Surveys and Gbemisola Oseni, Amparo Palacios-Lopez, and Akiko Sagesaka from the Living Standards Measurement Study team of the World Bank. It was produced under the direction of the co-Leads of the COVID-19 Task Force: Francesca Perucci, Deputy Director of the United Nations Statistics Division; Gero Carletto, Manager of the Living Standards Measurement Study of the World Bank; and Silvia Montoya, Director of the UNESCO Institute for Statistics.
  • Survey Methods
    SURVEY METHODS

    About the survey. In the spring of 2015, the Center for Survey Research at the University of Virginia entered into an agreement with Foothills Forum, a nonprofit, nonpartisan group of citizens in Rappahannock County, Virginia, to design and then conduct a mail-out survey of households in Rappahannock County. Foothills Forum is currently organized as a 501(c)3 non-profit. Foothills Forum was represented by its chairman, Larry "Bud" Meyer, and a survey committee charged with assisting in the development of the questionnaire. The goal of the survey was to determine citizen opinion on issues important to them regarding life in the County.

    Questionnaire development. Beginning in January 2015, the staff at the Center for Survey Research and the survey committee for Foothills Forum discussed the aims of the survey, based on an initial conceptual outline formulated by Foothills Forum. Foothills Forum also conducted a series of focus groups, not designed or assisted by the Center for Survey Research, in order to help clarify issues to be included in the questionnaire. A preliminary questionnaire was developed by August 2015 and was pretested at a focus group in Washington, Virginia, held on September 15, 2015. Foothills Forum was responsible for the recruitment of volunteers for the focus group and for arrangements and set-up of the meeting. The group was facilitated by Kathryn Wood, assisted by Matthew Braswell, who served as recorder. As a result of the focus group, significant modifications were made to the questionnaire. The final questionnaire was approved by the Foothills Forum survey committee on October 9, 2015 and was submitted for review and approval by the University of Virginia's Institutional Review Board for the Social and Behavioral Sciences.
  • Preparing and Fielding High-Quality Surveys Practical Strategies for Successfully Implementing Neighborhood and School Climate Surveys in Promise Neighborhoods
    NEIGHBORHOODS, CITIES, AND METROS RESEARCH REPORT. Preparing and Fielding High-Quality Surveys: Practical Strategies for Successfully Implementing Neighborhood and School Climate Surveys in Promise Neighborhoods. Kaitlin Franks Hildner, Elizabeth Oo, Peter A. Tatian. June 2015.

    ABOUT THE URBAN INSTITUTE: The nonprofit Urban Institute is dedicated to elevating the debate on social and economic policy. For nearly five decades, Urban scholars have conducted research and offered evidence-based solutions that improve lives and strengthen communities across a rapidly urbanizing world. Their objective research helps expand opportunities for all, reduce hardship among the most vulnerable, and strengthen the effectiveness of the public sector. Copyright © June 2015, Urban Institute. Permission is granted for reproduction of this file, with attribution to the Urban Institute. Cover image by Tim Meko.

    Contents: Acknowledgments; Introduction; Characteristics of a High-Quality Survey; Neighborhood Survey (Mode of Survey Data Collection; Considerations When Constructing a Survey Sample; Preparation and Logistics of Survey Field Management; Recommendations for Sample Training Scenarios); School Survey; Conclusion; Appendix A; Appendix B; Notes; References; About the Authors; Statement of Independence.

    Acknowledgments: This report was funded by the US Department of Education. We are grateful to them and to all our funders, who make it possible for Urban to advance its mission. Funders do not, however, determine our research findings or the insights and recommendations of our experts. The views expressed are those of the authors and should not be attributed to the Urban Institute, its trustees, or its funders.