AAPOR/WAPOR Task Force Report on Quality in Comparative Surveys

April 2021

Chairing Committee:
Lars Lyberg, Demoskop, AAPOR Task Force Chair
Beth-Ellen Pennell, University of Michigan, WAPOR Task Force Chair
Kristen Cibelli Hibben, University of Michigan, Co-Chair
Julie de Jong, University of Michigan, Co-Chair

Contributors:
Dorothée Behr, GESIS – Leibniz Institute for the Social Sciences
Jamie Burnett, Kantar Public
Rory Fitzgerald, City, University of London
Peter Granda, University of Michigan
Linda Luz Guerrero, Social Weather Stations
Hayk Gyuzalyan, Conflict Management Consulting
Tim Johnson, University of Illinois, Chicago
Jibum Kim, Sungkyunkwan University, South Korea
Zeina Mneimneh, University of Michigan
Patrick Moynihan, Pew Research Center
Michael Robbins, Princeton University
Alisú Schoua-Glusberg, Research Support Services
Mandy Sha, www.mandysha.com
Tom W. Smith, NORC at the University of Chicago
Ineke Stoop, The Netherlands
Irina Tomescu-Dubrow, Institute of Philosophy and Sociology, Polish Academy of Sciences (PAN) and CONSIRT at Ohio State University and PAN
Diana Zavala-Rojas, Universitat Pompeu Fabra, Barcelona
Elizabeth J. Zechmeister, Vanderbilt University, LAPOP

This report was commissioned by the AAPOR and WAPOR Executive Councils as a service to the profession. The report was reviewed and accepted by the AAPOR and WAPOR Executive Councils. The opinions expressed in this report are those of the authors and do not necessarily reflect the views of either council. The authors, who retain the copyright to this report, grant AAPOR a non-exclusive perpetual license to the version on the AAPOR website and the right to link to any published versions.

This report is dedicated to the memory of Lars Lyberg, who has had a profound and lasting influence on our field.
He was a generous collaborator, colleague, and mentor, and a great friend.

Table of Contents

Abbreviations used in the report
Executive Summary
    Background
    Priority areas for future research
1. Introduction
2. Background
    2.1 History of 3MC surveys
    2.2 3MC surveys in practice
    2.3 The fundamental challenges of 3MC surveys
3. Quality and comparability in 3MC surveys
4. Prevailing operational and design challenges
    4.1 Organizational structure
        Introduction and key operational and design challenges
        Current best practices
        Recent innovations
        Suggested future directions
    4.2 Sampling
        Introduction and key operational and design challenges
        Current best practices
        Recent innovations
        Suggested future directions
    4.3 Questionnaire design
        Introduction and key operational and design challenges
        Current best practices
        Recent innovations
        Suggested future directions
    4.4 Translation and adaptation
        Introduction and key operational and design challenges
        Current best practices
        Recent innovations
        Suggested future directions
    4.5 Questionnaire pretesting
        Introduction and key operational and design challenges
        Current best practices
        Recent innovations
        Suggested future directions
    4.6 Field implementation
        Introduction and key operational and design challenges
        Current best practices
        Recent innovations
        Suggested future directions
    4.7 Documentation in 3MC surveys
        Introduction and key operational challenges
        Current best practices
        Recent innovations
        Suggested future directions
5. The changing survey landscape
6. Summary and recommendations
Appendix 1 – Task Force Charge
Appendix 2 – Table 2 References
Appendix 3 – Smith’s 2011 TSE and comparison error figure
Appendix 4 – Pennell et al. 2017 TSE framework adapted for 3MC surveys
Appendix 5 – Bauer’s random route alternatives
    True Random Route (TRR)
    Street Section Sampling (SSS)
Appendix 6 – 3MC survey documentation standards for study-level and variable-level metadata and auxiliary data
References

Abbreviations used in the report

3MC – Multicultural, Multinational, and Multiregional
AAPOR – American Association for Public Opinion Research
ADQ – Asking different questions
ASQ – Ask-the-same question
ASQT – Ask-the-same questions and translating
AT – Advance translation
CAMCES – Computer-Assisted Measurement and Coding of Educational Qualifications in Surveys
CAPI – Computer-assisted personal interviews/interviewing
CCSG – Cross-Cultural Survey Guidelines
CESSDA – Consortium of European Social Science Data Archives
CIRF – Cognitive Interviewing Reporting Framework
CRONOS – Cross-National Online Survey
CSDI – Comparative Survey Design and Implementation
CSES – Comparative Study of Electoral Systems
DDI – Data Documentation Initiative
DHS – Demographic and Health Surveys
EES – European Election Studies
EFTA – European Free Trade Association
EQLS – European Quality of Life Survey
ESRA – European Survey Research Association
ESS – European Social Survey
EU-LFS – EU Labour Force Survey
EU-SILC – EU Statistics on Income and Living Conditions
EVS – European Values Surveys
EWCS – European Working Conditions Survey
fMoW – Functional Map of the World
GAP – Pew Global Attitudes Project
GDPR – General Data Protection Regulation
GEM – Global Entrepreneurship Monitor
GGS – Generations and Gender Survey
HETUS – Harmonised European Time Use Surveys
IALS – International Adult Literacy Survey
ICT – Information and Communications Technology
ISCED – International Standard Classification of Education
ISCO – International Standard Classification of Occupations
ISSP – International Social Survey Programme
LAPOP – Latin American Public Opinion Project
LSMS – World Bank Living Standards Measurement Survey
MCA – Multiple correspondence analysis
NCES – National Center for Education Statistics
ODNI – Office of the Director of National Intelligence
OECD – Organisation for Economic Co-operation and Development
OMB – Office of Management and Budget
PAPI – Paper-and-pencil interviews/interviewing
PCA – Principal component analysis
PIAAC – Programme for the International Assessment of Adult Competencies
PIRLS – Progress in International Reading Literacy Study
PISA – Programme for International Student Assessment
PPeS – Probability proportional to estimated size
QAS – Quality Appraisal System
QDDT – Questionnaire Design and Documentation Tool
QUAID – Question Understanding Aid
SERISS – Synergies for Europe’s Research Infrastructure in the Social Sciences
SHARE – Survey of Health, Aging and Retirement in Europe
SQP – Survey Quality Predictor
SRC – Survey Research Center
SSHOC – Social Sciences and Humanities Open Cloud
SSS – Street Section Sampling
SUSTAIN – Sustainable Tailored Integrated Care for Older People in Europe
TA – Translatability Assessment
TIMSS – Trends in International Mathematics and Science Study
TMT – Translation Management Tool
TRAPD – Translation, Review, Adjudication, Pretest, and Documentation
TRR – True Random Routes
TSE – Total Survey Error
TTT – Train-the-trainer
WAPOR – World Association for Public Opinion Research
WMHS – World Mental Health Survey
WVS – World Values Survey

Executive Summary

Background

Comparative surveys study more than one population with the purpose of comparing various characteristics of those populations. They are designed to facilitate research on social phenomena across populations and, frequently, over time. Researchers often refer to comparative surveys that take place in multinational, multiregional, and multicultural contexts as “3MC” surveys (Mneimneh et al., forthcoming). To achieve comparability, these surveys need to be carefully designed according to state-of-the-art principles and standards.
Many 3MC surveys are conducted within official statistics and in the academic and private sectors. They have become increasingly important to global and regional decision-making as well as to theory-building. At the same time, these surveys vary considerably in the methodological and administrative resources available, organizational infrastructure, awareness of error sources and error structures, level of standardized implementation across populations, and user involvement. These circumstances make 3MC surveys vulnerable from a quality perspective: quality problems present in single-population surveys are magnified in 3MC surveys, and there are additional quality problems specific to 3MC surveys, such as those arising from translation. The wealth of output from such surveys is usually not accompanied by a corresponding effort to inform researchers, decision-makers, and other users about quality shortcomings. This can lead to understated margins of error and estimates that therefore appear more precise than they actually are. There are also cases where researchers are informed about quality shortcomings but opt to ignore them in their research reports. There are, of course, many possible explanations for this state of affairs. One is that 3MC surveys are very expensive, and the formidable planning and implementation effort leaves relatively little room for a comprehensive treatment of quality issues. Another is that survey-taking cultures vary considerably across nations, as manifested by varying degrees of methodological capacity, risk assessment, and willingness to adhere to specifications that are not normally applied. The literature on data quality in 3MC surveys is scarce compared to the substantive literature.
There are exceptions, though, including the Cross-Cultural Survey Guidelines developed by the University of Michigan and members of the International Workshop on Comparative Survey Design and Implementation (CSDI). AAPOR has created a cross-cultural and multilingual research
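The summary’s point that unreported design complexity leads to understated margins of error can be made concrete with the standard design-effect adjustment. The sketch below uses conventional formulas; the sample size and design effect are hypothetical illustrations, not figures from the report:

```python
import math

def margin_of_error(p, n, deff=1.0, z=1.96):
    """Approximate 95% margin of error for an estimated proportion p
    from n respondents, inflating the variance by a design effect."""
    effective_n = n / deff  # clustering and weighting shrink the effective sample
    return z * math.sqrt(p * (1 - p) / effective_n)

# Treating a complex sample of 1,000 as if it were simple random sampling:
naive = margin_of_error(0.5, 1000)             # about +/-3.1 points
# Acknowledging a (hypothetical) design effect of 2:
adjusted = margin_of_error(0.5, 1000, deff=2)  # about +/-4.4 points
```

Reporting the naive figure makes the estimate look roughly 40% more precise than the design warrants, which is the kind of overstatement that can occur when quality information is not passed on to data users.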