Extensible Public Opinion Research


67th Annual Conference of the World Association for Public Opinion Research (WAPOR)
Extensible Public Opinion
4-6 September 2014, Radisson Blu, Nice, France

NORC, an independent research organization, informs decision-makers and the public on key issues facing society by collecting, analyzing, and disseminating objective information. NORC is pleased to be a sponsor of WAPOR 2014. Downtown Chicago | University of Chicago | DC Metro | Atlanta | Boston | San Francisco. www.norc.org | [email protected]

Introduction to the Program

The 2014 WAPOR Annual Conference presents about 150 papers with the theme of Extensible Public Opinion. Consistent with the theme, the papers are quite varied and extend the boundaries of public opinion research. All papers are presented in 90-minute sessions of four or five papers each.

All authors chose ideas relevant to their papers from the 28 ideas in Appendix A at the end of this program. These choices were used to assign papers with shared ideas to the same sessions. The ideas are focused enough to allow authors to have discussions based on some common interests, yet broad enough that sessions can include both unusual and more mainstream papers. The intermingling of diverse papers was designed to encourage the cross-pollination among authors that is the goal of the conference theme. One consequence of the diversity is that sessions are not always easy to label; therefore, session titles are the consensus ideas chosen by authors from Appendix A.

In addition to the conference program in a standard printed format, there is an online program with both an InfAlign map and an e-commerce version available at http://infotrend.com/?page_id=250. We look forward to your comments about this approach of assigning loosely focused sessions that still have enough in common for meaningful interactions among people who might otherwise not meet.
WAPOR thanks its generous sponsors for supporting the organization and this year's annual conference. We would like to acknowledge our platinum sponsors NORC, Nielsen, and the Social and Economic Survey Research Institute (SESRI); gold sponsor D3 Systems; silver sponsors Oxford University Press and Westat; and bronze sponsors Gulf Opinions Center for Polls and Statistics, Instituto Tecnológico Autónomo de México (ITAM), Parametro, RTI International, and the Roper Center.

With our best wishes for an enjoyable conference,
David Fan, Program Chair

We would like to extend thanks to all of the reviewers who read and scored abstracts for inclusion in the program. They are:
Porismita Borah, Washington State University (USA)
Gabriella Catterberg, Universidad de Buenos Aires (Argentina)
Roei Davidson, University of Haifa (Israel)
Claire Durand, University of Montreal (Canada)
Michael Elasmar, Boston University (USA)
Stefan Geiss, University of Mainz (Germany)
Homero Gil de Zuniga, Universität Wien (Austria)
Jibum Kim, Sungkyunkwan University (Republic of Korea)
Patricia Moy, University of Washington (USA)
Alejandro Moreno, Instituto Tecnologico Autonomo de Mexico, ITAM (Mexico)
Seungahn Nah, University of Kentucky (USA)
Lilach Nir, Hebrew University of Jerusalem (Israel)
Thomas Petersen, Institut für Demoskopie Allensbach (Germany)
Oliver Quiring, University of Mainz (Germany)
Hernando Rojas, University of Wisconsin (USA)
Yariv Tsfati, University of Haifa (Israel)
Claes H. De Vreese, University of Amsterdam (Netherlands)

WAPOR Council 2014

EXECUTIVE COUNCIL
President: Dr. Alejandro Moreno, Instituto Tecnologico Autonomo de Mexico, ITAM (Mexico), email: [email protected]
Vice President & President-Elect: Prof. Patricia Moy, University of Washington (USA), email: [email protected]
Past President: Dr. Tom W. Smith, NORC (USA), email: [email protected]
Secretary-Treasurer: Prof. Claire Durand, University of Montreal (Canada), email: [email protected]

COUNCIL
Chair, Professional Standards Committee: Dr. Anne Niedermann, Institut für Demoskopie Allensbach (Germany), email: [email protected]
Chair, Liaison Committee: Dr. Robert TY Chung, The University of Hong Kong (Hong Kong), email: [email protected]
Chair, Publications Committee: Mr. Trevor Tompson, Associated Press-NORC Center for Public Affairs Research, NORC at the University of Chicago (USA), email: [email protected]
Chair, Conference Committee: Prof. Patricia Moy, University of Washington (USA), email: [email protected]
Chair, Media Relations Committee: Ms. Jennifer Agiesta, The Associated Press (USA), email: [email protected]
Chair, Membership Committee: Prof. Dominique Joye, University of Lausanne (Switzerland), email: [email protected]
IJPOR Editor: Prof. Dr. Claes H. De Vreese, University of Amsterdam (Netherlands), email: [email protected]
Historian: Dr. Kathleen A. Frankovic (USA), email: [email protected]
Chair, 2014 WAPOR Annual Conference: Dr. David Fan, University of Minnesota (USA), email: [email protected]

OTHER POSITIONS
General Secretary: Prof. Dr. Allan McCutcheon, University of Nebraska-Lincoln (USA), email: [email protected]

NATIONAL AND REGIONAL REPRESENTATIVES
Argentina, Ms. María Braun; Australia, Prof. Murray Goot; Belgium, Prof. Jaak Billiet; Brazil, Ms. Helcimara Telles; Canada, Ms. Rosanna Shih; Chile, Ms. Marta Lagos; Colombia, Dr. Hernando Rojas; Czech Republic, Prof. Hynek Jerabek; Egypt, Prof. Ragia Kandil; Germany, Dr. Thomas Petersen; Hong Kong, Dr. Robert TY Chung; India, Mr. Prakash Nijhara; Italy, Ms. Maria Francesca Romano; Japan, Prof. Etsushi Tanifuji; Korea, Prof. Sung Kyum Cho; Netherlands, Mr. Jeroen Slot; Norway, Prof. Ottar Hellevik; Poland, Dr. Krzysztof Zagórski; Qatar, Dr. Darwish Alamadi; Singapore, Prof. Weiyu Zhang; Spain, Mr. José Carlos Del Ama; Switzerland, Prof. Dominique Joye; Taiwan, Ruoh-rong Yu; UK, Mr. Nick Moon; USA, Mr. Mark Schulman

At-A-Glance

THURSDAY, SEPTEMBER 4
8:00-12:00 WAPOR Council Meeting (Amirauté)
10:00-12:00 IJPOR Editors Meeting (Fregate)
12:00-18:00 Reception Desk Open (Foyer)
13:00-13:20 Welcome and Introduction to the Program (Baie des Anges)
13:30-15:00 Concurrent Sessions A1-A4 (rooms listed on pp. 6-7)
15:00-15:25 Coffee Break (Foyer)
15:30-17:00 Concurrent Sessions B1-B4 (rooms listed on pp. 7-9)
18:00-20:00 Welcome Reception (Terrace)

FRIDAY, SEPTEMBER 5
8:00-all day Reception Desk Open (Foyer)
8:30-10:00 Concurrent Sessions C1-C4 (rooms listed on pp. 9-10)
10:00-10:25 Coffee Break (Foyer)
10:30-12:00 Concurrent Sessions D1-D4 (rooms listed on pp. 10-11)
12:00-13:25 Lunch (Terrace)
13:30-15:00 Concurrent Sessions E1-E4 (rooms listed on pp. 11-12)
15:00-15:25 Coffee Break (Foyer)
15:30-17:00 Concurrent Sessions F1-F4 (rooms listed on pp. 12-14)
17:00-18:15 Regional Chapters Meeting (Courants)
18:30-19:30 Cocktails (cash bar) (Terrace)
19:30-21:00 Awards Banquet (Riviera)

SATURDAY, SEPTEMBER 6
8:00-11:30 Reception Desk Open (Foyer)
8:30-9:45 WAPOR business meeting, open to all (Azur)
9:45-10:10 Coffee Break (Foyer)
10:15-11:45 Concurrent Sessions G1-G4 (rooms listed on pp. 14-15)
11:45-13:00 Lunch (Terrace)
13:00-14:30 Concurrent Sessions H1-H4 (rooms listed on pp. 15-16)
14:30-14:55 Coffee Break (Foyer)
15:00-16:30 Concurrent Sessions I1-I4 (rooms listed on p. 17)

WAPOR 67th Annual Conference
Extensible Public Opinion
4-6 September 2014, Radisson Blu, Nice, France

Intro | Baie des Anges | Thursday 13:00 | Using the Online Conference Program
Conference Chair: David Fan

A-1 | Azur | Thursday 13:30 | Method: Pre-Interview
Session Chair: Johnny Heald (ORB International)
Coverage Error in Mobile Web Surveys Across European Countries - Tanja Kunz (Darmstadt University of Technology), Anke Metzler (Darmstadt University of Technology), Marek Fuchs (Darmstadt University of Technology)
Challenges and Solutions of Introducing Mobile Phones in CATI Surveys - Kim De Cuyper (GfK Belgium), Sara Gysen (GfK Belgium), Christine Tresignie (GfK Belgium)
Holistic Approach to Quality Management in Multi-Country Research - Sara Gysen (GfK Belgium), Kim De Cuyper (GfK Belgium), Christine Tresignie (GfK Belgium)
What's the Impact of Coverage Error in CATI Surveys? The Italian Case - Emanuela Sala (Università di Milano Bicocca), Roberto Lillini (Università di Milano Bicocca)

A-2 | Courants | Thursday 13:30 | Quantitative; Opinion Theory
Session Chair: Allan McCutcheon (University of Nebraska-Lincoln)
Problems and Developments in Data Analysis of Cross-National Comparative Surveys - Kazufumi Manabe (Aoyama Gakuin University)
Estimating the Effects of Nonresponses in Online Panels Through Imputation - Weiyu Zhang (National University of Singapore). This paper tells you how much bias nonresponses in online panels will bring, using an imputation method.
Adjust Survey Response Distributions Using Multiple Imputation: A Simulation with External Validation - Yu-Sung Su (Tsinghua University), Frank C. S. Liu (National Sun Yat-Sen University)
How and Why Opinions Become Public. The Social Conditions of a Public Opinion - Nathalie Heinich (CNRS)

A-3 | Mistral | Thursday 13:30 | AAPOR Panel: Measuring Opinion in a Changing World: Reports from the AAPOR Taskforce on Emerging Technologies
Session Chair: Michael Link (Nielsen)
Beyond Surveys: Multimode Capabilities of Mobile Devices to Enhance (or Replace) Traditional Surveys - Michael Link (Nielsen). Researchers leveraging the multimode capabilities of mobile devices to enhance/replace surveys - AAPOR TF Report.
Social Media Platform Touchpoints Across the Survey Lifecycle - Joe Murphy (RTI International). Social platforms provide new tools for survey researchers - surveys, recruitment, testing - AAPOR TF Report.
Surveys Conducted on Mobile Devices: What's New and What Has Not Changed? - Michael Schober
Recommended publications
  • Hybrid Online Audience Measurement Calculations Are Only As Good As the Input Data Quality
    People First: A User-Centric Hybrid Online Audience Measurement Model
    by John Brauer, Manager, Product Leadership, Nielsen

    Overview: As with any media measurement methodology, hybrid online audience measurement calculations are only as good as the input data quality. Hybrid online audience measurement goes beyond a single metric to forecast online audiences more accurately by drawing on multiple data sources. The open question for online publishers and advertisers is, "Which hybrid audience measurement model provides the most accurate measure?"

    • User-centric online hybrid models rely exclusively on high-quality panel data for audience measurement calculations, then project activity to all sites informed by consumer behavior patterns on tagged sites.
    • Site-centric online hybrid models depend on cookies for audience calculations, subject to the inherent weaknesses of cookie data that cannot tie directly to unique individuals and their valuable demographic information.
    • Both hybrid models assume that panel members are representative of the universe being measured and comprise a base large enough to provide statistical reliability, measured in a way that accurately captures behavior.
    • The site-centric hybrid model suffers from the issue of inaccurately capturing unique users as a result of cookie deletion and related activities, which can yield audience measurement estimates that differ from the actual number.
    • Rapid uptake of new online access devices further exacerbates site-centric cookie data tracking issues (in some cases overstating unique browsers by a factor of five) by increasing the likelihood of one individual representing multiple cookies.

    Only the user-centric hybrid model enables a fair comparison of audiences across all web sites, avoids the pitfall of counting multiple cookies versus unique users, and employs a framework that anticipates future changes such as additional data sets and access devices.
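The overstatement mechanism described in the last bullet (one person accumulating several cookies through deletion and multiple devices) can be illustrated with a toy simulation. All numbers below are invented for illustration and are not Nielsen figures:

```python
import random

random.seed(0)
true_users = 10_000

# Each user ends the measurement window holding 1-5 cookies
# (cookie deletion and extra devices each mint a new cookie)
cookies_per_user = [random.randint(1, 5) for _ in range(true_users)]

# A site-centric count tallies cookies, not people
site_centric_count = sum(cookies_per_user)
inflation = site_centric_count / true_users

print(f"site-centric estimate: {site_centric_count:,} 'unique' browsers")
print(f"true unique users:     {true_users:,}")
print(f"overstatement factor:  {inflation:.2f}x")
```

With uniform 1-5 cookies per user the cookie count runs roughly three times the true audience, which is the kind of gap the excerpt attributes to site-centric models.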
  • Sampling and Fieldwork Practices in Europe
    www.ssoar.info
    Sampling and Fieldwork Practices in Europe: Analysis of Methodological Documentation From 1,537 Surveys in Five Cross-National Projects, 1981-2017
    Piotr Jabkowski [a], Marta Kołczyńska [b]
    [a] Faculty of Sociology, Adam Mickiewicz University, Poznan, Poland. [b] Department of Socio-Political Systems, Institute of Political Studies of the Polish Academy of Science, Warsaw, Poland.

    Suggested citation: Jabkowski, P., & Kołczyńska, M. (2020). Sampling and Fieldwork Practices in Europe: Analysis of Methodological Documentation From 1,537 Surveys in Five Cross-National Projects, 1981-2017. Methodology: European Journal of Research Methods for the Behavioral and Social Sciences, 16(3), 186-207. https://doi.org/10.5964/meth.2795

    Terms of use: This document (published version, journal article) is made available under a CC BY Licence (Attribution). For more information see: https://creativecommons.org/licenses/by/4.0. This version is citable under: https://nbn-resolving.org/urn:nbn:de:0168-ssoar-74317-3

    Received: 2019-03-23. Accepted: 2019-11-08. Published (VoR): 2020-09-30.
    Corresponding author: Piotr Jabkowski, Szamarzewskiego 89C, 60-568 Poznań, Poland, +48 504063762, e-mail: [email protected]

    Abstract: This article addresses the comparability of sampling and fieldwork with an analysis of methodological data describing 1,537 national surveys from five major comparative cross-national survey projects in Europe carried out in the period from 1981 to 2017.
  • Political Socialization, Public Opinion, and Polling
    Political Socialization, Public Opinion, and Polling

    Basic Tenets of American Political Culture
    Liberty/Freedom: People should be free to live their lives with minimal governmental interference.
    Equality: Political and legal equality, and equality of opportunity, but not economic equality.
    Individualism: Responsibility for one's own decisions.
    Democracy: Consent of the governed; majority rule, with minority rights; citizen responsibilities (voting, jury duty, etc.).

    Political Culture vs. Ideology: Most Americans share common beliefs in the democratic foundation of government (culture) yet have differing opinions on the proper role and scope of government (ideology).

    Political Socialization: The process through which an individual acquires particular political orientations, beliefs, and values.

    What factors influence political opinion formation?
    Family: The #1 source for political identification. Between 60-70% of all children agree with their parents upon reaching adulthood. If one switches his/her party affiliation, it usually goes from Republican/Democrat to independent.
    School/level of education: "Building good citizens" - civic education reinforces views about participation (the Pledge of Allegiance, volunteering); civic pride is nationwide. Informed student-citizens are more likely to participate/vote as adults. Voting patterns track level of educational attainment.
    Peer group/occupation: Increases in importance during the teen years; occupational influences.
    Mass media: Impact
  • ESOMAR 28 Questions
    Responses to ESOMAR 28 Questions

    INTRODUCTION

    Intro to TapResearch: TapResearch connects mobile, tablet and PC users interested in completing surveys with market researchers who need their opinions. Through our partnerships with dozens of leading mobile apps, ad networks and websites, we're able to reach an audience exceeding 100 million people in the United States. We are focused on being a top-quality partner for ad hoc survey sampling, panel recruitment, and router integrations. Our technology platform enables reliable feasibility estimates, highly competitive costs, sophisticated quality enforcement, and quick-turnaround project management.

    Intro to ESOMAR: ESOMAR (European Society for Opinion and Market Research) is the essential organization for encouraging, advancing and elevating market research worldwide. Since 1948, ESOMAR's aim has been to promote the value of market and opinion research in effective decision-making. The ICC/ESOMAR Code on Market and Social Research, which was developed jointly with the International Chamber of Commerce, sets out global guidelines for self-regulation for researchers and has been undersigned by all ESOMAR members and adopted or endorsed by more than 60 national market research associations worldwide.

    COMPANY PROFILE

    1) What experience does your company have in providing online samples for market research? TapResearch connects mobile, tablet and PC users interested in completing surveys with market researchers who need their opinions. Through our partnerships with dozens of leading mobile apps, ad networks and websites, we're able to reach an audience exceeding 100 million people in the United States; we're currently adding about 30,000 panelists/day and this rate is increasing.
  • 10 Questions Opinion Polls
    10 Questions you might have on opinion polls

    1. What is an opinion poll? An opinion poll is a survey carried out to measure views on a certain topic within a specific group of people. For example, the topic may relate to who Kenyans support in the presidential race, in which case, the group of people interviewed will be registered voters.

    2. How are interviewees for an opinion poll selected? The group of people interviewed for an opinion poll is called a sample. As the name suggests, a sample is a group of people that represents the total population whose opinion is being surveyed. In a scientific opinion poll, everyone has an equal chance of being interviewed.

    3. So how come I have never been interviewed for an opinion poll? You have the same chance of being polled as anyone else living in Kenya. However, chances of this are very small and are estimated at about 1 in 14,000. This is because there are approximately 14 million registered voters in Kenya and, for practical and cost reasons, usually only between 1,000 and 2,000 people are interviewed for each survey carried out.

    4. How can such a small group be representative of the entire population? In order to ensure that the sample/survey group is representative of the population, the surveyors must ensure that the group reflects the characteristics of the whole. For instance, to get a general idea of who might win the Kenyan presidential election, only the views of registered voters in Kenya will be surveyed as these are the people who will be able to influence the election.
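The "1 in 14,000" figure in question 3 follows directly from the numbers quoted in the excerpt, and the same sample size also fixes the poll's margin of error. A short sketch of both calculations (the 95% confidence level and worst-case p = 0.5 are standard polling assumptions, not stated in the excerpt):

```python
import math

population = 14_000_000   # approximate registered voters in Kenya (from the excerpt)
sample_size = 1_000       # lower end of the 1,000-2,000 range quoted

# Chance that any one voter lands in a simple random sample
selection_prob = sample_size / population
print(f"1 in {round(1 / selection_prob):,}")          # prints "1 in 14,000"

# Worst-case (p = 0.5) margin of error at 95% confidence
moe = 1.96 * math.sqrt(0.5 * 0.5 / sample_size)
print(f"margin of error: about ±{moe * 100:.1f} points")  # about ±3.1 points
```

This is why national polls rarely exceed a couple of thousand respondents: doubling the sample to 2,000 only narrows the margin to about ±2.2 points.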
  • Esomar/Grbn Guideline for Online Sample Quality
    ESOMAR/GRBN GUIDELINE FOR ONLINE SAMPLE QUALITY

    ESOMAR, the World Association for Social, Opinion and Market Research, is the essential organisation for encouraging, advancing and elevating market research: www.esomar.org. GRBN, the Global Research Business Network, connects 38 research associations and over 3500 research businesses on five continents: www.grbn.org.

    © 2015 ESOMAR and GRBN. Issued February 2015. This Guideline is drafted in English and the English text is the definitive version. The text may be copied, distributed and transmitted under the condition that appropriate attribution is made and the following notice is included "© 2015 ESOMAR and GRBN".

    CONTENTS
    1 Introduction and scope (p. 4)
    2 Definitions (p. 4)
    3 Key requirements (p. 6)
    3.1 The claimed identity of each research participant should be validated (p. 6)
    3.2 Providers must ensure that no research participant completes the same survey more than once (p. 8)
    3.3 Research participant engagement should be measured and reported on (p. 9)
    3.4 The identity and personal
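Requirement 3.2 above (no participant completes the same survey more than once) is in practice enforced by keying completions on a stable participant identifier. The guideline does not prescribe an implementation; the sketch below is a minimal illustration with invented survey and participant IDs:

```python
# survey_id -> set of participant ids that have already completed it
completed: dict[str, set[str]] = {}

def record_completion(survey_id: str, participant_id: str) -> bool:
    """Return True for a first completion, False for a duplicate attempt."""
    seen = completed.setdefault(survey_id, set())
    if participant_id in seen:
        return False  # duplicate: reject the completion
    seen.add(participant_id)
    return True

print(record_completion("S-100", "p1"))  # True  (first completion)
print(record_completion("S-100", "p1"))  # False (duplicate blocked)
print(record_completion("S-100", "p2"))  # True  (different participant)
```

Real panel providers layer device fingerprinting and identity validation (requirement 3.1) on top of this, since a determined duplicator can present a fresh identifier.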
  • Random Selection in Politics
    Random selection in politics
    Lyn Carson and Brian Martin
    Published in 1999 by Praeger Publishers, Westport, CT. Available for purchase from Praeger. This is the text submitted to Praeger in February 1999. It differs from the published version in minor ways, including formatting, copy-editing changes, page numbering (100 pages instead of 161) and omission of the index. This version prepared August 2008.

    Contents
    1. Introduction (p. 1)
    2. Random selection in decision making (p. 10)
    3. Direct democracy (p. 26)
    4. Citizen participation without random selection (p. 36)
    5. Citizen participation with random selection: the early days (p. 43)
    6. Citizen participation with random selection: yesterday, today, and tomorrow (p. 52)
    7. Sortition futures (p. 65)
    8. Strategies (p. 76)
    Appendix: Examples of citizen participation (p. 84)
    Bibliography (p. 93)

    Acknowledgments
    Many people we've talked to find the idea of random selection in politics unnatural and unwelcome. This didn't deter us! Fortunately, there were a few enthusiasts who got and kept us going. Alan Davies, Fred Emery, and Merrelyn Emery provided the original inspiration many years ago as well as ongoing conversations. Ted Becker encouraged us to write this book. On drafts, we received extensive comments from Arthur Brownlea, John Burnheim, Ned Crosby, Jim Dator, Mary Lane, Marcus Schmidt, and Stuart White. Their input was valuable even when we decided on an alternative approach. Helpful comments were also received from Ted Becker, Stephen Healy, Lars Klüver, and Ken Russell. Others who provided information are acknowledged in the text. The process of obtaining comments has been stimulating because not all readers go with us all the way on randomness.
  • 13 Collecting Statistical Data
    13 Collecting Statistical Data
    13.1 The Population | 13.2 Sampling | 13.3 Random Sampling

    Polls, studies, surveys and other data collecting tools collect data from a small part of a larger group so that we can learn something about the larger group. This is a common and important goal of statistics: learn about a large group by examining data from some of its members.

    Data: collections of observations (such as measurements, genders, survey responses).
    Statistics: the science of planning studies and experiments, obtaining data, and then organizing, summarizing, presenting, analyzing, interpreting, and drawing conclusions based on the data.
    Population: the complete collection of all individuals (scores, people, measurements, and so on) to be studied; the collection is complete in the sense that it includes all of the individuals to be studied.
    Census: collection of data from every member of a population.
    Sample: subcollection of members selected from a population.

    A Survey
    The practical alternative to a census is to collect data only from some members of the population and use that data to draw conclusions and make inferences about the entire population. Statisticians call this approach a survey (or a poll when the data collection is done by asking questions). The subgroup chosen to provide the data is called the sample, and the act of selecting a sample is called sampling.
    The first important step in a survey is to distinguish the population for which the survey applies (the target population) and the actual subset of the population from which the sample will be drawn, called the sampling frame.
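The target population / sampling frame / sample distinction above maps cleanly onto a few lines of code. In this sketch the population IDs and the gap in the frame are invented to illustrate the definitions:

```python
import random

# Target population: everyone we want conclusions about (hypothetical IDs)
target_population = set(range(10_000))

# Sampling frame: the list we can actually draw from; here it misses
# the last 500 members, so the frame undercovers the target population
sampling_frame = list(range(9_500))

# Simple random sample: every frame member has an equal chance of selection
random.seed(42)
sample = random.sample(sampling_frame, k=100)

covered = len(sampling_frame) / len(target_population)
print(f"frame covers {covered:.0%} of the target population")  # 95%
print(f"sample size: {len(sample)}")                           # 100
```

The 500 members missing from the frame have zero chance of selection no matter how carefully the sample is drawn, which is why the text insists on distinguishing the frame from the target population.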
  • EFAMRO / ESOMAR Position Statement on the Proposal for an Eprivacy Regulation —
    EFAMRO / ESOMAR Position Statement on the Proposal for an ePrivacy Regulation, April 2017

    Table of contents
    1. About EFAMRO and ESOMAR (p. 3)
    2. Key recommendations (p. 3)
    3. Overview (p. 4)
    4. Audience measurement research (p. 5)
    5. Telephone and online research (p. 7)
    6. GDPR framework for research purposes (p. 10)
    7. List of proposed amendments (p. 11)
       a. Recitals (p. 11)
       b. Articles (p. 13)

    01. About EFAMRO and ESOMAR

    This position statement is submitted on behalf of EFAMRO, the European Research Federation, and ESOMAR, the World Association for Data, Research and Insights. In Europe, we represent the market, opinion and social research and data analytics sectors, accounting for an annual turnover of €15.51 billion (ESOMAR Global Market Research 2016).

    In particular our sector produces research outcomes that guide decisions of public authorities (e.g. the Eurobarometer), the non-profit sector including charities (e.g. political opinion polling), and business (e.g. satisfaction surveys, product improvement research). In a society increasingly driven by data, our profession ensures the application of appropriate methodologies, rigour and provenance controls, thus safeguarding access to quality, relevant, reliable, and aggregated data sets. These data sets lead to better decision making, inform targeted and cost-effective public policy, and support economic development - leading to growth and jobs.

    02. Key Recommendations

    We support the proposal for an ePrivacy Regulation to replace the ePrivacy Directive, as this will help to create a level playing field in a true European Digital Single Market whilst increasing the legal certainty for organisations operating in different EU member states.

    Amendment of Article 8 and Recital 21 to enable research organisations that comply with Article 89 of the General Data Protection Regulation (GDPR) to continue conducting independent audience measurement research activities.
  • Summary of Human Subjects Protection Issues Related to Large Sample Surveys
    Summary of Human Subjects Protection Issues Related to Large Sample Surveys
    U.S. Department of Justice, Bureau of Justice Statistics
    Joan E. Sieber
    June 2001, NCJ 187692

    Office of Justice Programs: John Ashcroft, Attorney General. Bureau of Justice Statistics: Lawrence A. Greenfeld, Acting Director.

    Report of work performed under a BJS purchase order to Joan E. Sieber, Department of Psychology, California State University at Hayward, Hayward, California 94542, (510) 538-5424, e-mail [email protected]. The author acknowledges the assistance of Caroline Wolf Harlow, BJS Statistician and project monitor. Ellen Goldberg edited the document. Contents of this report do not necessarily reflect the views or policies of the Bureau of Justice Statistics or the Department of Justice. This report and others from the Bureau of Justice Statistics are available through the Internet: http://www.ojp.usdoj.gov/bjs

    Table of Contents
    1. Introduction (p. 2)
       Limitations of the Common Rule with respect to survey research (p. 2)
    2. Risks and benefits of participation in sample surveys (p. 5)
       Standard risk issues, researcher responses, and IRB requirements (p. 5)
       Long-term consequences (p. 6)
       Background issues (p. 6)
    3. Procedures to protect privacy and maintain confidentiality (p. 9)
       Standard issues and problems (p. 9)
       Confidentiality assurances and their consequences (p. 21)
       Emerging issues of privacy and confidentiality (p. 22)
    4. Other procedures for minimizing risks and promoting benefits (p. 23)
       Identifying and minimizing risks (p. 23)
       Identifying and maximizing possible benefits (p. 26)
    5. Procedures for responding to requests for help or assistance (p. 28)
       Standard procedures (p. 28)
       Background considerations (p. 28)
       A specific recommendation: An experiment within the survey (p. 32)
    6.
  • 1988: Coverage Error in Establishment Surveys
    COVERAGE ERROR IN ESTABLISHMENT SURVEYS
    Carl A. Konschnik, U.S. Bureau of the Census

    I. Definition of Coverage Error

    Coverage error, which includes both undercoverage and overcoverage, is defined as "the error in an estimate that results from (1) failure to include all units belonging to the defined population or failure to include specified units in the conduct of the survey (undercoverage), and (2) inclusion of some units erroneously either because of a defective frame or because of inclusion of unspecified units or inclusion of specified units more than once in the actual survey (overcoverage)" (Office of Federal Statistical Policy and Standards, 1978). Coverage errors are closely related to but clearly distinct from content errors, which are defined as the "errors of observation or objective measurement, of recording, of imputation, or of other processing which results in associating a wrong value of the characteristic with a specified unit" (Office of Federal Statistical Policy and Standards, 1978).

    [...] in the survey planning stage results in a sampled population which is too far removed from the target population. Since estimates based on data drawn from the sampled population apply properly only to the sampled population, interest in the target population dictates that the sampled population be as close as practicable to the target population. Nevertheless, in the following discussion of the sources, measurement, and control of coverage error, only deficiencies relative to the sampled population are included. Thus, when speaking of defective frames, only those deficiencies are discussed which arise when the population which is sampled differs from the population intended to be sampled (the sampled population).

    Coverage Error Source Categories

    We will now look briefly at the two categories of coverage error--defective frames and
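The excerpt's central point, that estimates from a defective frame apply only to the sampled population, can be made concrete with a toy undercoverage simulation. The distribution, the 20% gap, and all numbers below are invented for illustration:

```python
import random
import statistics

random.seed(1)

# Target population: 50,000 establishments with some measured value
target = [random.gauss(100, 20) for _ in range(50_000)]

# Defective frame: undercoverage drops the 10,000 smallest establishments,
# so the sampled population differs systematically from the target
frame = sorted(target)[10_000:]

# A perfectly executed simple random sample from the defective frame
sample = random.sample(frame, 1_000)

print(f"target population mean:  {statistics.mean(target):6.1f}")
print(f"frame-based sample mean: {statistics.mean(sample):6.1f}  (biased upward)")
```

No increase in sample size fixes this: the estimate converges to the mean of the sampled population, not of the target population, which is exactly why the text insists the two be kept as close as practicable.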
  • American Community Survey Design and Methodology (January 2014) Chapter 15: Improving Data Quality by Reducing Non-Sampling Error
    American Community Survey Design and Methodology (January 2014)
    Chapter 15: Improving Data Quality by Reducing Non-Sampling Error
    Version 2.0, January 30, 2014

    Table of Contents
    15.1 Overview (p. 1)
    15.2 Coverage Error (p. 2)
    15.3 Nonresponse Error (p. 3)
    15.4 Measurement Error (p. 6)
    15.5 Processing Error (p. 7)
    15.6 Census Bureau Statistical Quality Standards (p. 8)
    15.7 References (p. 9)