The Contribution of the World Values Survey to the SDGs Monitoring


Measuring inclusive participation and beyond: the contribution of the World Values Survey to the SDGs monitoring

Kseniya Kizilova, Head of Secretariat at the World Values Survey Association; Vice-Director of the Institute for Comparative Survey Research, Vienna, Austria

Introduction to the World Values Survey

The World Values Survey (WVS) is a global cross-national, cross-sectional research program exploring human values and beliefs, their stability or change over time, and how they influence the social, political and economic development of societies around the globe.

▪ Largest non-commercial academic social survey program: covers 115 countries representing 92% of the world population.
▪ High-quality nationwide random representative samples (1,200 to 6,000 respondents per country); interviews conducted face to face.
▪ Time-series data covering 38 years (1981-2019), with over 700 indicators measured in this period.
▪ Collaboration of over 400 highly professional national survey teams worldwide.
▪ Free access to the data for researchers, civil society and international development agencies: www.worldvaluessurvey.org
▪ Over 15,000 publications, including academic articles and books, working papers and development reports.

World Values Survey geographic coverage (1981-2019): 115 countries

Some of the WVSA cooperation initiatives and partnerships (2014-2019)

Examples of global development reports that employ WVS data

WVS data for the SDGs measurement

▪ The WVS questionnaire contains 200+ indicators valid for monitoring SDGs 1, 2, 3, 4, 5, 6, 8, 9, 10, 11, 13, 16 and 17 as supplementary measures.
▪ High-quality samples: findings can be extrapolated to the total adult population of each country.
▪ Possibility of disaggregation by age, gender, education, wellbeing, social class, migration background, region of residence and type of settlement.
▪ Possibility of cross-country and cross-regional comparison for the same measures.
▪ All data freely accessible to individuals and organizations (HEIs, IDAs, CSOs, NGOs etc.) for any non-commercial purpose of use.
▪ Wide network of national research teams to explore the national context and engage with CSO/NGO actors.

SDG Target 16.5: Substantially reduce corruption and bribery in all their forms
[Scatter plot of country-level means: perceived scale of corruption (x-axis, 5.5-10.0) against the frequency with which ordinary people pay a bribe (y-axis, 1.3-2.7), for countries including Bangladesh, Nigeria, Iraq, Russia, Brazil, USA, Australia, Indonesia and Germany.]
Source: World Values Survey (2017-2019); www.worldvaluessurvey.org
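Country-level comparisons such as the corruption scatter above are built by aggregating respondent-level WVS items. As a rough illustration of how this can be done with the freely downloadable data, here is a minimal pandas sketch; the file name and the column names (country, corruption_scale, bribe_frequency) are placeholders rather than the official WVS variable codes, and survey weights are omitted for brevity.

```python
import pandas as pd

# Hypothetical respondent-level WVS extract; column names are placeholders,
# not the official WVS-7 variable codes, and no survey weights are applied.
wvs = pd.read_csv("wvs_wave7_extract.csv",
                  usecols=["country", "corruption_scale", "bribe_frequency"])

# Country-level means of the two items, one point per country in the SDG 16.5 chart
by_country = (wvs.groupby("country")[["corruption_scale", "bribe_frequency"]]
                 .mean()
                 .round(2))

print(by_country.sort_values("corruption_scale"))
by_country.plot.scatter(x="corruption_scale", y="bribe_frequency")  # requires matplotlib
```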
SDG Target 16.1: Significantly reduce all forms of violence and related death rates everywhere
[Scatter plot of country-level means: perceived security in the neighbourhood (x-axis, 2.0-3.8) against the frequency of robberies in the neighbourhood (y-axis, 2.25-3.85), for roughly sixty countries.]
Source: World Values Survey (2014-2019); www.worldvaluessurvey.org

SDG Target 16.6: Develop effective, accountable and transparent institutions at all levels
[Scatter plot of country-level means: confidence in the police (x-axis, 1.8-3.4) against confidence in the civil service (y-axis, 1.4-3.0), for around ninety countries.]
Source: World Values Survey (2014-2019); www.worldvaluessurvey.org

Pilot of Tier III indicator 16.7.2: Proportion of population who believe decision-making is inclusive and responsive, by population group

Question wording: "How much would you say the political system in your country allows people like you to have a say in what the government does?" (share answering "a great deal" or "a lot", in %).

▪ Implemented as part of a cooperation agreement between the UNDP and the WVSA.
▪ Pilot of the measure on inclusive and responsive decision-making conducted in 40 countries in 2018-2020.
▪ In every country representative national samples are interviewed; the item has so far been translated into 17 languages.
▪ Data collected via the face-to-face interview method (PAPI and CAPI modes).
▪ Possibility of data disaggregation by population group and location.
▪ Study of correlations with measures of democracy, voting and other forms of political participation, confidence in institutions, etc.

[Country bar chart; values shown range from 15.5% (Egypt) to 89.8% (Italy), including Slovenia 83.6%, Estonia 74.0%, Russia 71.1%, Czechia 61.9%, UK 54.6%, Sweden 53.1%, Germany 49.6%, Indonesia 41.2%, Nigeria 33.8%, Iraq 29.6%, Switzerland 25.6%, Lebanon 25.1%, Australia 18.3%, Brazil 17.9%.]
Source: World Values Survey (2017-2019); European Social Survey 8 (2016)
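The 16.7.2 indicator itself is a simple share of respondents choosing the top two response categories, which can then be disaggregated as in the country tables that follow. Below is a minimal pandas sketch of that computation; the file name, column names and response labels are placeholders, not the actual WVS questionnaire codes.

```python
import pandas as pd

# Hypothetical respondent-level extract from the 16.7.2 pilot; column names and
# response labels are placeholders, not the official questionnaire codes.
df = pd.read_csv("wvs_sdg1672_pilot.csv")

# SDG 16.7.2: share who say the political system allows them "a great deal" or "a lot" of say
df["inclusive"] = df["have_a_say"].isin(["a great deal", "a lot"])
print(f"National value: {100 * df['inclusive'].mean():.1f}%")

# Disaggregation by population group, mirroring the country tables below
for group in ["sex", "age_group", "education", "income_group", "settlement"]:
    print(df.groupby(group)["inclusive"].mean().mul(100).round(1))
```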
Proportion of the adult population in Bangladesh who believe decision-making is inclusive and responsive, by population group and region (%): Males 53.7; Females 45.8; 18-29 years 49.1; 30-45 years 51.7; 46-99 years 47.5; Primary or secondary education 43.8; Tertiary education 58.7; Low income 56.4; Medium income 50.1; High income 42.3; Urban 46.5; Rural 50.7.
Source: World Values Survey in Bangladesh (2018); www.worldvaluessurvey.org

Proportion of the adult population in Malaysia who believe decision-making is inclusive and responsive, by population group and region (%).
[Bar chart with values between 31.1% and 59.1% across groups: males, females, age groups 18-29, 30-49 and 50 and older, primary/secondary and tertiary education, low, medium and high income, urban and rural residents.]
Source: World Values Survey in Malaysia (2018); www.worldvaluessurvey.org

Proportion of the adult population in Pakistan who believe decision-making is inclusive and responsive, by population group and region (%): Males 41.6; Females 41.3; 18-25 years 47.8; 26-40 years 40.5; 41-99 years 39.1; Primary or secondary education 40.3; Tertiary education 45.7; Low income 38.8; Medium income 41.8; High income 48.4; Urban 43.7; Rural 40.4.
Source: World Values Survey in Pakistan (2018); www.worldvaluessurvey.org

Perceptions of inclusive and responsive decision-making and reported forms of political participation and civic activity (%), by perceived responsiveness of the political system (very much or a lot / some / little or no):
Voted in last elections: 64.1 / 60.7 / 61.1
Donated to a group or campaign: 28.5 / 25.7 / 29.2
Searched for information about politics online: 26.1 / 28.0 / 26.7
Encouraged others to vote: 24.9 / 25.7 / 25.9
Signed a petition: 22.2 / 18.3 / 21.3
Contacted a government official: 17.1 / 13.6 / 14.0
Attended a peaceful demonstration: 13.5 / 14.7 / 14.2
Signed an e-petition: 14.4 / 10.6 / 11.5
Encouraged others to take an action: 11.3 / 9.0 / 11.2
Joined a strike: 8.6 / 6.8 / 10.1
Organized an event or protest using social media: 6.3 / 3.8 / 3.7
Joined a boycott over a political issue: 6.0 / 6.0 / 7.0
Source: World Values Survey (2017-2019); www.worldvaluessurvey.org

Key methodological findings from the pilot

▪ Variation in the interpretation of "having a say" affects the translation and the overall meaning of the question in other languages => a remark for translators is required.
▪ In most languages the distance between the scale positions 1=Very much and 2=A lot is very small => it is difficult to reproduce the required difference between the two points.
▪ The item is a valid measure of external efficacy: responses correlate highly with perceived satisfaction with democracy and with the way the political system is developing in the country, as well as with confidence in the government.
▪ The question could be asked in all countries regardless of the type of political regime; in less democratic countries respondents more often select "hard to say" or refuse to answer (up to 20%) => consider developing supplementary measures.
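The external-efficacy validation described above can be checked with simple rank correlations between the pilot item and related attitude measures. A minimal sketch, assuming a respondent-level extract with numeric response codes and placeholder column names (have_a_say, satisfaction_with_democracy, confidence_in_government), none of which are the actual WVS variable codes.

```python
import pandas as pd

# Hypothetical respondent-level extract; columns are assumed to hold numeric response
# codes, and the column names are placeholders, not WVS variable codes.
df = pd.read_csv("wvs_pilot_extract.csv")

# Rank-order correlations between the "have a say" item and related external-efficacy measures
for other in ["satisfaction_with_democracy", "confidence_in_government"]:
    rho = df["have_a_say"].corr(df[other], method="spearman")
    print(f"have_a_say vs. {other}: Spearman rho = {rho:.2f}")
```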
Next steps: short-term and long-term

▪ Continue cooperation with UNDP and OGC on piloting SDG 16.7.2 in 2019-2020.
▪ Complete the pilot in 40 countries by July 2020.
▪ Submit the findings and methodological remarks for further refinement of the question and reclassification of the indicator from Tier III to Tier II.
▪ Explore possibilities to engage with other international development and civil society organizations that can benefit from the newly collected data at the global, regional and national levels.
▪ Further expand the number of SDG measures in the WVS questionnaire, in particular for the next WVS-8 round (2022-2025).
▪ Explore possibilities of combining survey activity with additional actions and events engaging local communities, CSOs and policy-makers in the studied countries.
Recommended publications
  • Statistical Inference: How Reliable Is a Survey?
    Math 203, Fall 2008: Statistical Inference: How reliable is a survey? Consider a survey with a single question, to which respondents are asked to give an answer of yes or no. Suppose you pick a random sample of n people, and you find that the proportion that answered yes is p̂. Question: How close is p̂ to the actual proportion p of people in the whole population who would have answered yes? In order for there to be a reliable answer to this question, the sample size, n, must be big enough so that the sample distribution is close to a bell shaped curve (i.e., close to a normal distribution). But even if n is big enough that the distribution is close to a normal distribution, usually you need to make n even bigger in order to make sure your margin of error is reasonably small. Thus the first thing to do is to be sure n is big enough for the sample distribution to be close to normal. The industry standard for being close enough is for n to be big enough so that n > 9(1 − p)/p and n > 9p/(1 − p) both hold. When p is about 50%, n can be as small as 10, but when p gets close to 0 or close to 1, the sample size n needs to get bigger. If p is 1% or 99%, then n must be at least 892, for example. (Note also that n here depends on p but not on the size of the whole population.) See Figures 1 and 2 showing frequency histograms for the number of yes respondents if p = 1% when the sample size n is 10 versus 1000 (this data was obtained by running a computer simulation taking 10000 samples).
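    The rule of thumb and the simulation described above are easy to reproduce. A minimal sketch in Python with NumPy (not part of the original note); it checks the n > 9(1 − p)/p and n > 9p/(1 − p) conditions and re-runs a 10,000-sample simulation for p = 1% with n = 10 versus n = 1,000.

```python
import numpy as np

def passes_rule_of_thumb(n, p):
    """Rule of thumb from the text: n > 9(1-p)/p and n > 9p/(1-p) must both hold."""
    return n > 9 * (1 - p) / p and n > 9 * p / (1 - p)

print(passes_rule_of_thumb(10, 0.50))   # True: n = 10 can be enough when p is near 50%
print(passes_rule_of_thumb(10, 0.01))   # False: far too small when p = 1%
print(passes_rule_of_thumb(892, 0.01))  # True: matches the "at least 892" example in the text

# Reproduce the described simulation: 10,000 samples with p = 1%, for n = 10 vs. n = 1,000
rng = np.random.default_rng(seed=1)
p, reps = 0.01, 10_000
for n in (10, 1_000):
    yes = rng.binomial(n, p, size=reps)   # "yes" count in each simulated survey
    p_hat = yes / n                       # sample proportion for each survey
    print(f"n={n}: mean of p_hat = {p_hat.mean():.4f}, "
          f"samples with zero 'yes' answers = {np.mean(yes == 0):.1%}")
```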
  • SAMPLING DESIGN & WEIGHTING in the Original
    APPENDIX A: SAMPLING DESIGN & WEIGHTING. In the original National Science Foundation grant, support was given for a modified probability sample. Samples for the 1972 through 1974 surveys followed this design. This modified probability design, described below, introduces the quota element at the block level. The NSF renewal grant, awarded for the 1975-1977 surveys, provided funds for a full probability sample design, a design which is acknowledged to be superior. Thus, having the wherewithal to shift to a full probability sample with predesignated respondents, the 1975 and 1976 studies were conducted with a transitional sample design, viz., one-half full probability and one-half block quota. The sample was divided into two parts for several reasons: 1) to provide data for possibly interesting methodological comparisons; and 2) on the chance that there are some differences over time, that it would be possible to assign these differences to either shifts in sample designs, or changes in response patterns. For example, if the percentage of respondents who indicated that they were "very happy" increased by 10 percent between 1974 and 1976, it would be possible to determine whether it was due to changes in sample design, or an actual increase in happiness. There is considerable controversy and ambiguity about the merits of these two samples. Text book tests of significance assume full rather than modified probability samples, and simple random rather than clustered random samples. In general, the question of what to do with a mixture of samples is no easier solved than the question of what to do with the "pure" types.
  • Summary of Human Subjects Protection Issues Related to Large Sample Surveys
    Summary of Human Subjects Protection Issues Related to Large Sample Surveys. U.S. Department of Justice, Bureau of Justice Statistics. Joan E. Sieber. June 2001, NCJ 187692. U.S. Department of Justice, Office of Justice Programs, John Ashcroft, Attorney General; Bureau of Justice Statistics, Lawrence A. Greenfeld, Acting Director. Report of work performed under a BJS purchase order to Joan E. Sieber, Department of Psychology, California State University at Hayward, Hayward, California 94542, (510) 538-5424, e-mail [email protected]. The author acknowledges the assistance of Caroline Wolf Harlow, BJS Statistician and project monitor. Ellen Goldberg edited the document. Contents of this report do not necessarily reflect the views or policies of the Bureau of Justice Statistics or the Department of Justice. This report and others from the Bureau of Justice Statistics are available through the Internet — http://www.ojp.usdoj.gov/bjs
    Table of Contents: 1. Introduction (2); Limitations of the Common Rule with respect to survey research (2). 2. Risks and benefits of participation in sample surveys (5); Standard risk issues, researcher responses, and IRB requirements (5); Long-term consequences (6); Background issues (6). 3. Procedures to protect privacy and maintain confidentiality (9); Standard issues and problems (9); Confidentiality assurances and their consequences (21); Emerging issues of privacy and confidentiality (22). 4. Other procedures for minimizing risks and promoting benefits (23); Identifying and minimizing risks (23); Identifying and maximizing possible benefits (26). 5. Procedures for responding to requests for help or assistance (28); Standard procedures (28); Background considerations (28); A specific recommendation: An experiment within the survey (32). 6.
  • Survey Experiments
    IU Workshop in Methods – 2019. Survey Experiments: Testing Causality in Diverse Samples. Trenton D. Mize, Department of Sociology & Advanced Methodologies (AMAP), Purdue University. Contents: Introduction (p. 8): Overview (8); What is a survey experiment? (9); What is an experiment? (10); Independent and dependent variables (11); Experimental conditions (12). Why conduct a survey experiment? (13): Internal, external, and construct validity
  • Evaluating Survey Questions
    Evaluating Survey Questions. Chase H. Harrison, Ph.D., Program on Survey Research, Harvard University. What respondents do to answer a question: comprehend the question; retrieve information from memory; summarize information; report an answer. Problems in answering survey questions – Failure to comprehend: if respondents don't understand the question, they cannot answer it; if different respondents understand the question differently, they end up answering different questions. – Failure to recall: questions assume respondents have information; if respondents never learned something, they cannot provide information about it; there are also problems when the researcher puts more emphasis on the subject than on the respondent. – Problems summarizing: if respondents are thinking about a lot of things, they can summarize inconsistently; if the way the respondent remembers something doesn't readily correspond to the question, they may be inconsistent. – Problems reporting answers: confusing or vague answer formats lead to variability; interactions with interviewers or technology can lead to problems (sensitive or embarrassing responses). Focus groups: a qualitative research tool, used to develop ideas for questionnaires and to understand the scope of issues. Evaluating survey questions: early stage – focus groups to understand topics or dimensions of measures; pre-test stage – cognitive interviews to understand question meaning; pre-test under typical field
  • MRS Guidance on How to Read Opinion Polls
    What are opinion polls? MRS Guidance Note: How to read opinion polls, June 2016 (www.mrs.org.uk). MRS has produced this Guidance Note to help individuals evaluate, understand and interpret Opinion Polls. This guidance is primarily for non-researchers who commission and/or use opinion polls. Researchers can use this guidance to support their understanding of the reporting rules contained within the MRS Code of Conduct. Opinion Polls – The Essential Points. What is an Opinion Poll? An opinion poll is a survey of public opinion obtained by questioning a representative sample of individuals selected from a clearly defined target audience or population. For example, it may be a survey of c. 1,000 UK adults aged 16 years and over. When conducted appropriately, opinion polls can add value to the national debate on topics of interest, including voting intentions. Typically, individuals or organisations commission a research organisation to undertake an opinion poll. The results of an opinion poll are either kept for private use or published. What is sampling? Opinion polls are carried out among a sub-set of a given target audience or population and this sub-set is called a sample. Whilst the number included in a sample may differ, opinion poll samples are typically between c. 1,000 and 2,000 participants. When a sample is selected from a given target audience or population, the possibility of a sampling error is introduced. This is because the demographic profile of the sub-sample selected may not be identical to the profile of the target audience / population.
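    The sampling error the guidance refers to can be quantified with the standard margin-of-error formula for a proportion (the formula is not part of the MRS note itself). A minimal sketch for typical poll sizes:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) for the sample sizes typical of published opinion polls
for n in (500, 1_000, 2_000):
    print(f"n = {n}: about ±{100 * margin_of_error(0.5, n):.1f} percentage points")
```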
  • The World Values Survey in the New Independent States
    THE WORLD'S LARGEST SOCIAL SCIENCE INFRASTRUCTURE AND ACADEMIC SURVEY RESEARCH PROGRAM: THE WORLD VALUES SURVEY IN THE NEW INDEPENDENT STATES. C. Haerpfer (University of Vienna, Austria) and K. Kizilova (V.N. Karazin Kharkiv National University, Kharkiv, Ukraine). The World Values Survey (WVS) is an international research program developed to assess the impact of values stability or change over time on the social, political and economic development of countries and societies. It was started in 1981 by Ronald Inglehart and his team and has since involved more than 100 world societies, turning into the largest non-commercial cross-national empirical time-series investigation of human beliefs and values ever executed on a global scale. The article consists of several sections, each with a different focus. The authors begin with a description of the survey methodology and organizational management that together ensure the cross-national and cross-regional comparative character of the study (the survey is implemented using the same questionnaire, a face-to-face mode of interviews, and the same sample type in every country). The next part of the article presents a short overview of the project history and the comparative surveys' time-series (so-called "waves" — periods of two to four years during which data are collected in several dozen countries using the same questionnaire; such waves are conducted every five years). Here the authors describe every wave of the WVS, mentioning coordination and management activities that were determined by the thematic and geographic extension of the project. After that the authors identify the key features of the WVS in the New Independent States and mention some of the results of the study conducted in NIS countries in 1990—2014, such as a high level of uncertainty in the choice of ideological preferences; rapid growth of declared religiosity; and an observed gap between the declared values and actual facts of social life, etc.
  • Exploratory Factor Analysis with the World Values Survey Diana Suhr, Ph.D
    Exploratory Factor Analysis with the World Values Survey Diana Suhr, Ph.D. University of Northern Colorado Abstract Exploratory factor analysis (EFA) investigates the possible underlying factor structure (dimensions) of a set of interrelated variables without imposing a preconceived structure on the outcome (Child, 1990). The World Values Survey (WVS) measures changes in what people want out of life and what they believe. WVS helps a worldwide network of social scientists study changing values and their impact on social and political life. This presentation will explore dimensions of selected WVS items using exploratory factor analysis techniques with SAS® PROC FACTOR. EFA guidelines and SAS code will be illustrated as well as a discussion of results. Introduction Exploratory factor analysis investigates the possible underlying structure of a set of interrelated variables. This paper discusses goals, assumptions and limitations as well as factor extraction methods, criteria to determine factor structure, and SAS code. Examples of EFA are shown using data collected from the World Values Survey. The World Values Survey (WVS) has collected data from over 57 countries since 1990. Data has been collected every 5 years from 1990 to 2010 with each data collection known as a wave. Selected items from the 2005 wave will be examined to investigate the factor structure (dimensions) of values that could impact social and political life across countries. The factor structure will be determined for the total group of participants. Then comparisons of the factor structure will be made between gender and between age groups. Limitations Survey questions were changed from wave to wave. Therefore determining the factor structure for common questions across waves and comparisons between waves was not possible.
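    The paper's examples use SAS PROC FACTOR; as a rough cross-check of the same workflow outside SAS, here is a minimal Python sketch using the third-party factor_analyzer package (a substitution, not what the paper uses). The input file and item selection are placeholders, not the WVS items analysed in the paper.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical extract of numeric, respondent-level WVS items; the file and the
# item selection are placeholders, not the variables used in the paper.
items = pd.read_csv("wvs_2005_items.csv").dropna()

# Choose the number of factors from the eigenvalues (Kaiser criterion)
fa_unrotated = FactorAnalyzer(rotation=None)
fa_unrotated.fit(items)
eigenvalues, _ = fa_unrotated.get_eigenvalues()
n_factors = int((eigenvalues > 1).sum())

# Extract and rotate the factors, then inspect the loadings
fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax")
fa.fit(items)
print(pd.DataFrame(fa.loadings_, index=items.columns).round(2))
```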
  • The Evidence from World Values Survey Data
    Munich Personal RePEc Archive The return of religious Antisemitism? The evidence from World Values Survey data Tausch, Arno Innsbruck University and Corvinus University 17 November 2018 Online at https://mpra.ub.uni-muenchen.de/90093/ MPRA Paper No. 90093, posted 18 Nov 2018 03:28 UTC The return of religious Antisemitism? The evidence from World Values Survey data Arno Tausch Abstract 1) Background: This paper addresses the return of religious Antisemitism by a multivariate analysis of global opinion data from 28 countries. 2) Methods: For the lack of any available alternative we used the World Values Survey (WVS) Antisemitism study item: rejection of Jewish neighbors. It is closely correlated with the recent ADL-100 Index of Antisemitism for more than 100 countries. To test the combined effects of religion and background variables like gender, age, education, income and life satisfaction on Antisemitism, we applied the full range of multivariate analysis including promax factor analysis and multiple OLS regression. 3) Results: Although religion as such still seems to be connected with the phenomenon of Antisemitism, intervening variables such as restrictive attitudes on gender and the religion-state relationship play an important role. Western Evangelical and Oriental Christianity, Islam, Hinduism and Buddhism are performing badly on this account, and there is also a clear global North-South divide for these phenomena. 4) Conclusions: Challenging patriarchic gender ideologies and fundamentalist conceptions of the relationship between religion and state, which are important drivers of Antisemitism, will be an important task in the future. Multiculturalism must be aware of prejudice, patriarchy and religious fundamentalism in the global South.
  • Sample IRB Application Relevant for Those Conducting Surveys
    Sample IRB application relevant for those conducting surveys. EXEMPTION FORM, San Jose State University Human Subjects–Institutional Review Board, Request for Exemption from Human Subjects Review. Name: Dr. Shishir Mathur and Dr. Melinda Jackson. Department: Urban and Regional Planning (Dr. Mathur); Political Science (Dr. Jackson). Phone Number: Work: 408-924-5875, During: 9 am to 6 pm; Home: N/A; Cell Phone: N/A; Pager: N/A. E-mail address: [email protected]. Address: One Washington Square, SJSU, San Jose, CA 95192-0185. Select one: SJSU Student ___ SJSU Faculty _X___ SJSU Staff ______ Non-SJSU Investigator _____ If Non-SJSU Investigator, SJSU contact: ______________________ If Student, Name of Faculty Advisor: Signature of Faculty Advisor: ______________________________________ Title of proposed project: Five Wounds / Brookwood Terrace (FWBT) Neighborhood: Residents' Perception Survey. Abstract: This study aims, through a survey of residents of the Five Wounds / Brookwood Terrace area of San José, to find out: a) residents' perception of the quality of their neighborhood; b) residents' perception of the involvement of San José State University in their neighborhood; and c) residents' political views. Funded by: INSTRUCTIONS. DESCRIBE: 1. Purpose of proposed research 2. Methodology 3. Timelines 4. Procedure for selecting subjects 5. Number and age of subjects 6. Status of the information collected is Archival Data Base Non-Collected Others: ____________________________ 7. How and where information collected will be kept safe ATTACH: 1. Example of materials such as questionnaires, interview questions, representation of computer-generated stimuli, etc. 2. Document (on SJSU letterhead) that ensures informed consent (form for subjects signature, text to be read in telephone interviews, or introduction to inquiry with "primary sources" 3.
  • Evidence from the Gallup World Poll
    Journal of Economic Perspectives—Volume 22, Number 2—Spring 2008—Pages 53–72. Income, Health, and Well-Being around the World: Evidence from the Gallup World Poll. Angus Deaton. The great promise of surveys in which people report their own level of life satisfaction is that such surveys might provide a straightforward and easily collected measure of individual or national well-being that aggregates over the various components of well-being, such as economic status, health, family circumstances, and even human and political rights. Layard (2005) argues forcefully that such measures do indeed achieve this end, providing measures of individual and aggregate happiness that should be the only gauges used to evaluate policy and progress. Such a position is in sharp contrast to the more widely accepted view, associated with Sen (1999), which is that human well-being depends on a range of functions and capabilities that enable people to lead a good life, each of which needs to be directly and objectively measured and which cannot, in general, be aggregated into a single summary measure. Which of life's circumstances are important for life satisfaction, and which—if any—have permanent as opposed to merely transitory effects, has been the subject of lively debate. For economists, who usually assume that higher incomes represent a gain to the satisfaction of individuals, the role of income is of particular interest. It is often argued that income is both relatively unimportant and relatively transitory compared with family circumstances, unemployment, or health (for example, Easterlin, 2003). Comparing results from a given country over time, Easterlin (1974, 1995) famously noted that average national happiness does not increase over long spans of time, in spite of large increases in per capita income.
  • Explaining Professed Popular Trust in Zimbabwe's Presidents
    Dispatch No. 399 | 20 October 2020. Fear and trust: Explaining professed popular trust in Zimbabwe's presidents. Afrobarometer Dispatch No. 399 | Simangele Moyo-Nyede. Summary: Popular trust in public institutions and officials is an important indicator of political legitimacy, a key resource for the development and functioning of modern democracies (Freitag & Bühlmann, 2009; Chingwete, 2016; Mishler & Rose, 2001; Newton, 2001). However, some analysts argue that while trust is important in a democracy, citizens would be naïve if they didn't have a certain level of distrust as well (van de Walle & Six, 2004). In Zimbabwe, almost two-thirds of Afrobarometer survey respondents in 2017 said they trusted then-President Robert Mugabe "somewhat" or "a lot." The following year, after Mugabe ended his 37-year rule under pressure from the military, more than half of respondents expressed trust in his successor, President Emmerson Mnangagwa. At the same time, clear majorities said their country was "going in the wrong direction," assessed the national economic situation as bad, rated the government's performance on the economy as poor, and said they did not feel free to criticize the president. As they had for years, headlines portrayed a country contending with a ruined economy, a collapsing health care system, high unemployment and corruption, and poor public services (Pindula News, 2018; Muronzi, 2020). This raises the question: How can citizens who see their country as "going in the wrong direction" express trust in the person leading it there? Do substantial numbers of Zimbabweans really trust their president? If so, what drives this trust? An analysis of Afrobarometer survey data from 2017 and 2018 – during Mugabe's last year in office and Mnangagwa's first – suggests that in addition to any number of possible reasons that Zimbabweans may have had for trusting their president, fear of appearing anti-government was one factor contributing to high levels of professed trust.