Meta-Analysis in Vocational Behavior: A Systematic Review and Recommendations for Best Practices

Cort W. Rudolph, Saint Louis University
Kahea Chang, Saint Louis University
Rachel S. Rauvola, Saint Louis University
Hannes Zacher, Leipzig University

Note: This is a pre-print version of an in-press, accepted manuscript. Please cite as: Rudolph, C. W., Chang, K., Rauvola, R. S., & Zacher, H. (2020, in press). Meta-analysis in vocational behavior: A systematic review and recommendations for best practices. Journal of Vocational Behavior.

Author Note
Cort W. Rudolph, Kahea Chang, & Rachel S. Rauvola, Department of Psychology, Saint Louis University, Saint Louis, MO (USA). Hannes Zacher, Institute of Psychology, Leipzig University, Leipzig, Germany. Correspondence concerning this article may be addressed to Cort W. Rudolph, Department of Psychology, Saint Louis University, Saint Louis, MO (USA), e-mail: [email protected]

Abstract
Meta-analysis is a powerful tool for the synthesis of quantitative empirical research. Overall, the field of vocational behavior has benefited from the results of meta-analyses. Yet, there is still much to learn about how the quality of meta-analyses reported in this field of inquiry can be improved. In this paper, we systematically review all meta-analyses published in the Journal of Vocational Behavior (JVB) to date. We do so to address two related goals: First, based on guidance from various sources (e.g., the American Psychological Association’s meta-analysis reporting standards; MARS), we introduce ten facets of meta-analysis that have particular bearing on statistical conclusion validity. Second, we systematically review meta-analyses published in JVB over the past 32 years, with a particular focus on the methods employed; this review informs a discussion of 19 associated “best practices” for researchers who are considering conducting a meta-analysis in the field of vocational behavior (or in related fields). Thus, this work serves as an important benchmark, indicating where we “have been” and where we “should go” with respect to conducting and reporting meta-analyses on vocational behavior topics.

Keywords: Systematic Review; Meta-Analysis; Methodology; Best Practices

Highlights:
- We review all meta-analyses published in the Journal of Vocational Behavior to date.
- We evaluate studies using ten criteria related to statistical conclusion validity.
- We derive 19 “best practices” for future meta-analyses in vocational behavior.

1. Introduction
The idea that quantitative research results can be statistically aggregated is not new. Over 100 years ago, Pearson (1904) empirically combined the results of multiple clinical studies of typhoid inoculations. Likewise, Fisher (1925) proposed a method for pooling p-values across null hypothesis significance tests. The introduction of more modern conceptualizations of “meta-analysis” as a research synthesis method is often attributed to Glass (1976), who is also credited with popularizing these ideas for the synthesis of psychological research.
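For readers unfamiliar with Fisher’s pooling approach, the brief sketch below illustrates the basic calculation. This is a generic illustration rather than material from the original article; it assumes NumPy and SciPy are available, and the function name and example p-values are hypothetical.

```python
import numpy as np
from scipy import stats

def fisher_combined_test(p_values):
    """Combine independent p-values via Fisher's (1925) method.

    Under the joint null hypothesis, -2 * sum(ln(p_i)) follows a
    chi-square distribution with 2k degrees of freedom.
    """
    p = np.asarray(p_values, dtype=float)
    chi_sq = -2.0 * np.sum(np.log(p))
    df = 2 * len(p)
    combined_p = stats.chi2.sf(chi_sq, df)
    return chi_sq, df, combined_p

# Hypothetical example: p-values from three independent tests
print(fisher_combined_test([0.04, 0.20, 0.07]))
```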
The field of vocational behavior was an early adopter of meta-analysis. Indeed, over the past 32 years, the flagship outlet for such work, the Journal of Vocational Behavior (JVB), has published 68 meta-analyses (see Figure 1), on topics ranging from occupational wellbeing (Assouline & Meir, 1987) to vocational interest congruence (Nye, Su, Rounds, & Drasgow, 2017). Meta-analyses have arguably had a substantial impact on the field of vocational behavior as a whole. For example, as of the writing of this manuscript, four of the 25 “most downloaded” JVB articles in the past 90 days (16%), and three of the 25 “most cited” articles since 2016 (12%), have been meta-analyses. Moreover, according to Web of Science citation counts, a meta-analysis of organizational commitment by Meyer, Stanley, Herscovitch, and Topolnytsky (2002) has been cited just over 2,100 times; the median citation count across all 68 meta-analyses published in JVB is 55 (M = 147.02, SD = 309.95; see also Figure 2).

In this manuscript, we systematically review and synthesize the entire corpus of meta-analytic articles that have been published in JVB. Systematic reviews are typically undertaken to synthesize the findings of primary empirical studies (e.g., Gough, Oliver, & Thomas, 2017). Our approach to this systematic review is somewhat different: Instead of integrating the findings of meta-analyses published in JVB in a general sense, our primary focus is on the methods employed to conduct these meta-analyses and on the structure used to report them.

Our goals for this systematic review are twofold. Our primary goal is to quantify the state of meta-analytic methods applied to the study of vocational behavior phenomena, and to trace their development over time, as published in JVB. We also aim to identify “gaps” that exist in the design, conduct, and presentation of the meta-analytic studies published therein to date. Informed by the results of this systematic review, our second goal is to outline a set of “best practices” that are organized around the ten facets of our review and that can guide the conduct and review of future meta-analyses in JVB and in the field of vocational behavior more broadly (see Table 1). Thus, the two overarching research questions that guide our review are, “How are meta-analyses published in JVB ‘done’?” and “Do meta-analyses published in JVB conform to ‘best practices’?”

To answer these questions, we organize our review around ten interrelated facets of the design and conduct of meta-analysis that have particular bearing on statistical conclusion validity (i.e., the extent to which the conclusions about meta-analyzed relationships are correct or “reasonable”; Shadish, Cook, & Campbell, 2002). These facets were derived from multiple sources. First, we consulted the American Psychological Association’s (APA) meta-analysis reporting standards, a comprehensive effort to establish criteria against which the scientific rigor of a meta-analysis can be judged (MARS; APA, 2008, 2010). Second, we considered more recent suggestions for applying the MARS standards specifically to meta-analyses in the organizational sciences (Kepes, McDaniel, Brannick, & Banks, 2013). Third, we referenced recent “best practice” recommendations for the conduct of meta-analyses (Siddaway, Wood, & Hedges, 2019).
Finally, we triangulated advice from each of these three sources against contemporary reference books regarding the design and conduct of meta-analyses (Borenstein, Hedges, Higgins, & Rothstein, 2011; Cooper, Hedges, & Valentine, 2009; Schmidt & Hunter, 2015). In our online appendix, we offer a “crosswalk” tying common advice across these multiple sources to the ten facets of meta-analysis and the 19 best practices we derive therefrom: https://osf.io/pgujx/. Importantly, the primary focus of our review is on the statistical methods involved in the conduct of meta-analyses, and not on the supporting methods involved in such reviews (for a comprehensive review of the literature search strategies that support systematic reviews and meta-analyses, see Harari, Parola, Hartwell, & Riegelman, 2020).

Of note, our focus on the ten facets of meta-analysis is not intended to represent an exhaustive methodological summary and critique of every meta-analysis published in JVB to date. Rather, we focus on those ten facets of the design and conduct of meta-analysis that, if adopted prescriptively, would have the most “influence” on the broader applicability and impact of meta-analytic findings for the field as a whole. Moreover, our focus is on those facets of the meta-analytic process that are most actionable (i.e., those over which researchers have the most control in the design, conduct, and reporting of meta-analyses) and that can be readily translated into best practices. Table 1 summarizes these ten facets and the best practice recommendations that we offer as guidance for researchers seeking to conduct meta-analyses of vocational behavior topics, including relevant cautionary notes, practical advice, and pointers to additional readings and resources to guide such efforts. To begin our discussion, we next summarize two predominant traditions of meta-analysis (i.e., Hedges-Olkin and Schmidt-Hunter) and then introduce the ten facets of meta-analysis that guided our review.

2. Two Traditions of Meta-Analysis
The term “meta-analysis” refers to a process of systematically and quantitatively summarizing a body of literature. Generally speaking, meta-analyses are conducted to achieve a set of common goals. The overarching goal of any meta-analysis is to estimate average effects that are representative of population effects (e.g., population correlations, ρxy) based upon the cumulation of multiple sample effects (e.g., correlations from individual primary studies, rxy). Moreover, meta-analyses generally involve procedures for differentially weighting such sample effects to account for variability in the precision of these estimates (e.g., weighting each rxy by its respective sample size, n). Finally, meta-analyses typically provide estimates of the variability of effects from study to study (i.e., estimates of
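To make the weighting logic just described concrete, the sketch below illustrates a minimal, “bare-bones” sample-size-weighted synthesis of correlations in the spirit of the Schmidt-Hunter tradition. It is an illustrative sketch only, not code from any of the reviewed meta-analyses; the function name, example values, and the particular sampling-error approximation are our own assumptions.

```python
import numpy as np

def bare_bones_meta(r, n):
    """Illustrative sample-size-weighted synthesis of correlations.

    r : observed correlations (r_xy) from k primary studies
    n : corresponding sample sizes
    """
    r = np.asarray(r, dtype=float)
    n = np.asarray(n, dtype=float)

    # n-weighted mean correlation: the meta-analytic estimate of rho_xy
    r_bar = np.sum(n * r) / np.sum(n)

    # n-weighted observed variance of correlations across studies
    var_obs = np.sum(n * (r - r_bar) ** 2) / np.sum(n)

    # One common approximation of the expected sampling-error variance
    var_err = np.mean((1 - r_bar ** 2) ** 2 / (n - 1))

    # Residual between-study variance, floored at zero
    var_rho = max(var_obs - var_err, 0.0)
    return r_bar, var_obs, var_rho

# Hypothetical example: five primary studies
print(bare_bones_meta(r=[0.20, 0.35, 0.28, 0.15, 0.30],
                      n=[120, 80, 200, 60, 150]))
```

(The Hedges-Olkin tradition typically applies inverse-variance weights to transformed effects rather than raw sample sizes, but the basic weighted-averaging logic is the same.)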