
Received: 27 August 2019 Revised: 22 April 2020 Accepted: 23 April 2020

DOI: 10.1111/1745-9125.12251

ARTICLE

The organizational justice effect among criminal justice employees: A meta-analysis*

Scott E. Wolfe Spencer G. Lawson

School of Criminal Justice, Michigan State University

Correspondence
Scott E. Wolfe, School of Criminal Justice, Michigan State University, Baker Hall 510, East Lansing, MI 48824.
Email: [email protected]

*Additional supporting information can be found in the full text tab for this article in the Wiley Online Library at http://onlinelibrary.wiley.com/doi/10.1111/crim.2020.58.issue-4/issuetoc.

The authors would like to thank Jillian Turanovic for her invaluable statistical guidance on this project.

Abstract
Organizational justice has been shown to be an important predictor of criminal justice employees' work-related perceptions, attitudes, and behaviors. In this study, we take stock of the organizational justice effect on criminal justice employees' work outcomes by subjecting the literature to a meta-analysis. Multilevel modeling based on 1,924 effect size estimates drawn from 143 studies (95 independent data sets) was used to establish the empirical status of the organizational justice effect. The results indicate a sizeable relationship between organizational justice and justice system employee work outcomes (Mz = .256, CI = [.230, .283]). The findings also demonstrate that the organizational justice effect size varies slightly across several methodological variations. Specifically, the organizational justice effect size is larger when the concept is measured with scales that contain survey items tapping into all four dimensions of justice. Also, we found that outcome type, presence of confounding mechanisms, research design, and sample characteristics moderate the justice effect. We conclude that organizational justice theory is a useful framework for developing a more theoretically informed understanding of justice system employees' work outcomes. We discuss the theoretical implications of the meta-analytic findings and avenues for future research based on the results.

KEYWORDS
criminal justice, meta-analysis, organizational justice, supervisor–employee relationships, work outcomes

Criminology. 2020;1–26. wileyonlinelibrary.com/journal/crim © 2020 American Society of Criminology

The concept of justice in the work environment has interested scholars for several decades. Empirical evidence from organizational behavior and management research consistently shows that employees' perceptions of their supervisors are some of the strongest predictors of beneficial and counterproductive work behaviors, perceptions, attitudes, emotions, and orientations. Specifically, employees who believe their supervisors treat them with organizational justice have higher levels of organizational commitment and productivity and are less likely to engage in deviance while at work (Cohen-Charash & Spector, 2001; Colquitt, Conlon, Wesson, Porter, & Ng, 2001). Criminology and criminal justice research also has produced a large evidence base concerning the role of justice in predicting a wide range of employee work outcomes in criminal justice organizations. Over the past few decades, research findings have shown that police officers who believe their supervisors use organizational justice have greater job satisfaction and more trust in their agency, and they are less likely to self-report engaging in misconduct (Myhill & Bradford, 2013; Tankebe & Meško, 2015; Wolfe & Nix, 2017; Wolfe & Piquero, 2011). Correctional officers also benefit from organizational justice. They are more likely to be committed to organizational goals and satisfied with their job, as well as less likely to be stressed at work, when they feel they have been treated in an organizationally just manner (Lambert et al., 2010; Mesko, Hacin, Tankebe, & Fields, 2017).

A quick read of the literature reveals that organizational justice "matters" to criminal justice employees. The problem, however, is that the research evidence is more chaotic than it appears at first glance because of numerous methodological differences across studies. The organizational justice effect has been examined on numerous criminal justice employee work outcomes, using many different measurement strategies, and across various sample characteristics. Such methodological variation limits our ability to understand how much organizational justice matters and under what conditions it matters. The organizational justice literature in our discipline has reached a critical juncture where it is imperative to assess its empirical status and bring order to the disjointed evidence base. Doing so is fundamental to our theoretical understanding of the importance of organizational justice to criminal justice employees and the practical implications for justice system managers.

Accordingly, the purpose of the present study is to subject the body of research on organizational justice and justice system employee outcomes to a meta-analysis. Our study is based on 1,924 effect size estimates derived from 143 empirical studies that used 95 independent data sets. Using multilevel modeling, the meta-analysis is guided by four primary goals. First, we will determine the overall effect size of organizational justice on various criminal justice employee work perceptions, attitudes, emotions, orientations, and behaviors. Second, our analysis will examine the extent to which the organizational justice effect varies across different organizational justice measurement strategies. Third, we will explore whether the organizational justice effect varies across other key methodological characteristics of the studies (i.e., does the size of the organizational justice effect vary across outcome type, inclusion of confounding variables, research design, or sample characteristics?).
Fourth, based on the meta-analytic findings, we will provide suggestions for future theoretical development and empirical research on organizational justice in our discipline. The overarching purpose of this meta-analysis is to underscore the extent to which perceptions of organizational justice predict justice system employee work outcomes, as well as to determine how the construct could be used to develop a more theoretically informed understanding of employees' work outcomes and the subsequent impact on communities and justice-involved populations.

1 ORGANIZATIONAL JUSTICE

Employees are subject to the decisions of their superiors on a routine basis. These decisions often deal with company policies and procedures, promotional opportunities, duty assignments, and the interpersonal dynamics of work life. Workers critically evaluate their supervisors because their actions and decisions have important economic and social consequences for employees (Colquitt, 2012). Organizational behavior and management research findings over the past half-century have shown that employees' evaluations of their supervisors focus primarily on notions of justice.

Justice research within authority–subordinate relations gained visibility with the equity theory introduced by Adams (1965). Focusing on input–outcome comparisons, Adams argued that employees assess supervisor fairness by calculating the ratio of their inputs (e.g., education, experience, and work contribution) to their outcomes (e.g., promotion and salary increase) and comparing that ratio with the ratios of their coworkers. If these input–outcome ratios do not correspond across comparable colleagues, inequity is perceived and may lead to feelings of discomfort. This construct of justice significantly impacted the history of organizational justice research because it framed an early conceptualization of justice in the workplace, termed distributive justice (i.e., how fairly employees believe resources are distributed). This work by Adams (1965) was followed by scholars such as Thibaut and Walker (1975) and Leventhal and colleagues (Leventhal, 1976, 1980; Leventhal, Karuza, & Fry, 1980), who argued that subordinates' perceptions of justice are also based on the perceived fairness of the processes that lead to decision outcomes. This aspect of justice, termed procedural justice, is attained when employees feel a sense of control during the decision-making process stage (Thibaut & Walker, 1975) and believe procedures were consistently applied, free of bias, correctable, representative, accurate, and ethical (Leventhal, 1976, 1980; Leventhal et al., 1980). Starting in the late 1980s, researchers began to conceptualize distributive and procedural justice under a more general, global term called organizational justice (Folger & Konovsky, 1989; Greenberg, 1987, 1990).

Bies and Moag (1986) advanced the study of organizational justice by introducing its third dimension, termed interactional justice, which is employees' perceptions of the interpersonal treatment received when policies and procedures are implemented in an organization. Early scholars, however, debated the conceptual status of this dimension. Some considered interactional justice as a distinct dimension of organizational justice (Cropanzano, Prehar, & Chen, 2002; Skarlicki & Folger, 1997), whereas others treated it as a subset of procedural justice (Moorman, 1991; Tyler & Bies, 1990). As another source of confusion in the literature, Greenberg (1993) argued that interactional justice is better conceptualized as two separate dimensions—interpersonal justice and informational justice. For this reason, many justice researchers consider interpersonal justice as the level of respect and propriety supervisors afford their subordinates and informational justice as the extent to which supervisors are candid, timely, and thorough in their communications with subordinates (Colquitt, 2001).

A large body of empirical evidence within the organizational behavior and management scholarship shows support for the idea that perceptions of fair managerial practices engender many pro-organizational outcomes (Cohen-Charash & Spector, 2001; Colquitt et al., 2001).
Employees in corporate settings who feel their managers use organizational justice tend to engage in more organizational citizenship behaviors (i.e., actions that go beyond the minimum requirements of the job) and have higher levels of productivity, organizational commitment, and job satisfaction (Ambrose & Schminke, 2009; Byrne, 2005; McFarlin & Sweeney, 1992). Organizational justice also seems to dissuade employees from engaging in behaviors and embracing attitudes that conflict with organizational expectations, goals, or rules (Bechtoldt, Welk, Zapf, & Hartig, 2007; Byrne, 2005; Colquitt, Noe, & Jackson, 2002; Fox, Spector, & Miles, 2001).

As interest in the theoretical conceptualization and empirical study of organizational justice increased, so did efforts to resolve issues producing theoretical indeterminacy in the literature. As one example, the measurement of organizational justice had been inconsistent (Greenberg, 1993; Lind & Tyler, 1988). This led organizational behavior and management researchers to devote attention toward standardizing the measurement of the concept (Ambrose & Schminke, 2009; Gilliland, 2008). In the most influential study on this issue, Colquitt (2001) developed a measure of organizational justice with items that most closely matched the original theorists’ arguments (Adams, 1965; Bies & Moag, 1986; Leventhal, 1976; Thibaut & Walker, 1975). Using a series of structural equation models across several samples, he concluded that “organizational justice is best conceptualized as four distinct dimensions: procedural justice, distributive justice, interpersonal justice, and informational justice” (Colquitt, 2001, p. 396). As another example, since the inception of organizational justice research, disparate theoretical approaches, outcomes of interest, and methodologies have clouded scholars’ understanding of the justice effect. In response, two large meta-analytic investigations were published in 2001, which helped summarize 25 years of research on the justice effect and its association with a variety of work behaviors, perceptions, and attitudes (Cohen-Charash & Spector, 2001; Colquitt et al., 2001).

2 ORGANIZATIONAL JUSTICE AMONG CRIMINAL JUSTICE EMPLOYEES

Criminal justice agencies are like other organizational settings in many ways. For example, police departments, correctional facilities, and the courts all serve a client base. Like workers in other organizational settings, criminal justice employees value fair managerial practices in their agencies. The findings from organizational justice research in criminology and criminal justice have shown that, when criminal justice employees believe their supervisors are fair, they are more likely to engage in (or self-report engaging in) a wide range of beneficial work behaviors or to hold attitudes favorable to such behaviors (Armeli, Eisenberger, Fasolo, & Lynch, 1998; Farmer, Beehr, & Love, 2003; Frear, Donsbach, Theilgard, & Shanock, 2018; Jacobs, Belschak, & Hartog, 2014).

Numerous organizational justice studies have appeared in criminology and criminal justice journals over the past few decades, and their results have shown that police officers, for example, who feel they have been treated fairly by their supervisors report more commitment to their agency's goals, more identification with their agency, and less cynicism toward their job or citizens compared with their counterparts who perceive less organizational justice from their supervisors (Bradford & Quinton, 2014; Bradford, Quinton, Myhill, & Porter, 2014; Donner, Maskaly, Fridell, & Jennings, 2015; Myhill & Bradford, 2013; Rosenbaum & McCarty, 2017; Trinkner, Tyler, & Goff, 2016). What is more, officers who experience fair supervisory treatment report greater job satisfaction and more trust in their agencies (Donner et al., 2015; Wolfe & Nix, 2017). Recent research findings also have shown that police officers who experience organizational justice from their supervisors are more likely to support democratic styles of policing by holding positive attitudes toward the use of procedural justice when interacting with citizens and being restrained in the use of force (Tankebe, 2014; Trinkner et al., 2016; Van Craen & Skogan, 2017a, 2017b). The benefits of organizational justice also seem to extend to counterproductive work behaviors. Police officers are less likely to self-report violating agency rules or committing misconduct when they believe they have been treated fairly by their supervisors (Bradford et al., 2014; Haas, Van Craen, Skogan, & Fleitas, 2015; Tyler, Callahan, & Frost, 2007; Wolfe & Piquero, 2011).

Although a tremendous amount of organizational justice research has been focused on police officers' perceptions of fair supervisory treatment, there is also a large body of scholarship in which organizational justice in a corrections context has been investigated. Correctional employees who perceive greater distributive and procedural justice from their supervisors report more job satisfaction and organizational commitment (Lambert, 2003; Lambert, Hogan, & Jiang, 2008). Dimensions of organizational justice also have been shown to be associated with less job stress among correctional officers (Cullen, Link, Wolfe, & Frank, 1985; Lambert, 2006; Lambert, Hogan, & Griffin, 2007; Taxman & Gordon, 2009).

The research evidence indicates that organizational justice is an important predictor of beneficial, harmful, and counterproductive work attitudes and behaviors among criminal justice employees. With that said, the study of organizational justice in criminology and criminal justice has now reached a critical point similar to what was experienced at the turn of the century in the broader organizational behavior literature.
There is a need to take stock of the research evidence on the organizational justice effect among criminal justice employees for several reasons. For one, even though the results of previous meta-analyses provided a wealth of information about the role of organizational justice in typical employee settings, they were conducted 20 years ago and prior to when almost all the organizational justice research in a criminal justice setting had taken place. As such, these meta-analyses included nearly no studies that were focused on criminal justice employees. Second, although some parallels exist between corporate and criminal justice organizational settings, criminal justice agencies are different from corporate-type organizations in fundamental ways that may impact the importance of organizational fairness to justice system employees. Third, some lingering questions in the criminological literature can only be answered with a meta-analysis. We will discuss each of these questions in turn.

2.1 Overall effect size

First, what is the average organizational justice effect size for justice system employees? Taking stock of the organizational justice effect through a meta-analysis will shed light on the theoretical importance of the construct in understanding justice system employee work outcomes. Additionally, this will help underscore how much organizational justice matters to justice employees and provide practical guidance to justice system managers aiming to improve employee relations and work outcomes.

2.2 Measurement inconsistency

Second, does the type of organizational justice measurement strategy influence its effect size? The criminological literature on organizational justice suffers from inconsistent and sometimes poor organizational justice measures. Although organizational behavior and management scholars recognized this issue in their literature, our field has yet to address inadequate measurement techniques. In some respects, it is understandable that our discipline lacks a standardized instrument with which to measure justice system employees' perceptions of fairness because the area of research is newer (at least in comparison with the organizational behavior literature) and because of the many contexts in which it has been studied. Indeed, the way we understand the words "justice" and "fairness" in the United States may not be generalizable to different corners of the world (Leung, 2005; Van den Bos & Lind, 2002). Despite this issue, a strong link between organizational justice and beneficial work outcomes has been found in various countries (Blader, Chang, & Tyler, 2001; Jiang, Gollan, & Brooks, 2017; Pearce, Bigley, & Branyiczki, 1998; Rahim, Magner, Antonioni, & Rahman, 2001). We should expect some variation in organizational justice measurement as researchers adapt survey items to cultural settings. At the same time, however, a considerable amount of measurement inconsistency within cultural settings still produces theoretical indeterminacy concerning the results from the overall literature. We need to acknowledge such measurement inconsistency and understand its impact on the conclusions derived from the organizational justice literature.

Measurement inconsistency is underscored by several problems related to operationalization in the literature to date. Survey items are sometimes haphazardly used in the construction of organizational justice scales across studies. Moreover, the ratio of survey items dedicated toward each dimension in full organizational justice scales (i.e., scales with items tapping into each of the four dimensions of justice) is often unequal (see the supplemental table in the online supporting information for examples).1 Another issue is cross-pollination, which involves organizational justice scales being labeled as one dimension but including survey items from other dimensions. For example, Tankebe's (2014) measure of "organizational procedural justice" included survey items from procedural (e.g., "Decisions by my senior officers are based on facts, not personal biases or opinions"), interpersonal (e.g., "My senior officers take account of my needs when they are making decisions that affect me"), and informational justice (e.g., "My senior officers usually give me an honest explanation for the decisions they make that affect me"). Sargeant, Antrobus, and Platz (2017) examined the impact of "supervisory procedural justice" on officer compliance, but their scale included survey items that also tapped into interpersonal justice (e.g., "My supervisor treats me with respect"). Although there is precedent to combine procedural justice and interpersonal justice (Moorman, 1991; Tyler & Bies, 1990), inconsistent labeling of scales extends beyond a simple debate in semantics. Cross-pollination of this type can artificially inflate or deflate the relationships among different types of justice and outcomes of interest (Colquitt & Shaw, 2005).
As a result, our ability to understand which theoretical variables of interest matter to employees is limited. Also, such theoretical indeterminacy muddies the waters regarding what should be targeted from a practical managerial standpoint.

Contamination is another measurement problem found in the criminology and criminal justice literature on organizational justice. It involves the inclusion of unrelated survey items in aggregated organizational justice scales. As one example, items related to organizational trust often creep into justice-related scales, but trust is an entirely separate construct (Hamm, Trinkner, & Carr, 2017). Nix and Wolfe (2016) included the following item in their organizational justice scale: "As an organization, my agency can be trusted to do what is right for the community." This item does not capture the essence of any of the organizational justice dimensions.

Running parallel to the organizational justice literature are studies on a variety of constructs tapping into different types of support received by employees (e.g., supervisory treatment characterized as caring and respectful). The literature on perceived organizational support (Eisenberger, Huntington, Hutchison, & Sowa, 1986) first appeared in the same year as Bies and Moag's (1986) dimension of interactional justice. The introduction of two comparable but distinct constructs of interpersonal treatment complicates theory development aimed at understanding how supervisor treatment impacts employees' outcomes because researchers now have two theoretical frameworks from which to draw. Indeed, a scholar who wants to study a supervisor's interpersonal treatment of employees but wants to avoid examining the other key dimensions of organizational justice, which may invite criticism from journal reviewers, may be inclined to use a simpler framework (e.g., perceived organizational support) as a means to achieve this end (Colquitt, Greenberg, & Scott, 2005). Given that the items included in these support-related scales have tremendous overlap with organizational justice survey items, specifically interpersonal and informational justice, these studies must be included in our meta-analysis. Allowing the organizational justice and organizational support literatures to run parallel without acknowledging their overlap creates theoretical indeterminacy in both literatures and inhibits our ability to provide criminal justice managers with clear, practical guidance.

In sum, measurement inconsistency and theoretical indeterminacy of this type confuse our understanding of the concepts and potentially lead to erroneous practical implications. The first step toward rectifying this problem is to conduct a meta-analysis with attention devoted toward the moderating role of measurement. This approach will help us understand what dimensions matter and what type of scales may be best to pursue in the future.

1 Additional supporting information can be found in the full text tab for this article in the Wiley Online Library at http://onlinelibrary.wiley.com/doi/10.1111/crim.2020.58.issue-4/issuetoc.

2.3 Other potential moderators

The final issue deals with whether other moderating factors in criminology and criminal justice studies on organizational justice influence the concept's average effect size. One potential moderator of the organizational justice effect is the type of outcome being examined. The results reported in the organizational behavior literature demonstrate that employees' justice perceptions impact different outcomes to varying degrees (Alexander & Ruderman, 1987; Masterson, Lewis, Goldman, & Taylor, 2000; Sweeney & McFarlin, 1993). It would be difficult to ascertain, however, whether the organizational justice effect among justice system employees varies across different work outcomes without subjecting the body of evidence to a meta-analysis.

The type of intervening mechanisms included in multivariate models may also serve as a potential moderator. Within criminal justice, limited theorizing has taken place regarding the relationship between organizational justice and work outcomes when other theoretical variables are considered simultaneously. Researchers often include control variables that have been shown to be related to whatever outcome they are examining, but it would be difficult to determine what intervening mechanisms matter and how they impact the magnitude of the organizational justice effect size simply by reading the literature. A meta-analysis can help bring order to the literature and pave a path for more sophisticated theorizing about confounding or mediating variables and organizational justice.

A final potentially important set of moderators concerns the characteristics of the sample. For example, does organizational justice matter equally for all types of justice system employees? Lambert (2003, p. 155) argued that "Correctional staff, as agents of the criminal justice system, are aware of the concerns of justice and fairness. Furthermore, correctional staff expect organizational justice as much or even more, than other workers." Although possible, it remains an open empirical question concerning whether the organizational justice effect is stronger or weaker for certain groups of justice system employees (e.g., police vs. correctional officers). Relatedly, it is useful to examine whether the organizational justice effect size varies across nationality, gender, or rank composition of samples, as well as the type of research design used in the analyses. Such an understanding has important theoretical implications. On one hand, if the organizational justice effect does not vary across sample characteristics, the construct likely has a general effect on justice system employees' work outcomes. On the other hand, if the organizational justice effect varies across sample characteristics, researchers would be motivated to explore the theoretical explanation of such variation.

2.4 Current study

Based on these issues, we conducted a meta-analysis to answer key questions in the criminological and criminal justice literature on organizational justice:

1. What is the average organizational justice effect size for justice system employees?
2. Does organizational justice measurement strategy influence its effect size?

3. How do potential moderators (e.g., type of outcome, confounding mechanisms, research design, and sample characteristics) included in studies of organizational justice influence the average effect size?

3 METHOD

3.1 Sample

Organizational justice studies in a criminal justice employee context published up to May 2019 were included in this meta-analysis. Several inclusion criteria guided the literature search. First, the study's sample must have included criminal justice professionals. Second, a dimension of organizational justice had to be included as an independent variable in the study and conceptualized as employees' perceptions of supervisor or agency justice. Third, the study must have included an outcome variable that was centered on some type of employee work-related perception, attitude, emotion, orientation, or behavior. Fourth, the study must have used a quantitative methodology. Fifth, the study must have been published between January 1975 and May 2019. We chose 1975 as a start date because it coincided with the publication of Thibaut and Walker's (1975) work introducing the dimension of procedural justice and, thereby, beginning the multidimensional study of justice.

Data collection proceeded in four steps. First, using the inclusion criteria, published studies written in English were gathered through a systematic search of 104 electronic databases with a Boolean search string (see the supplemental tables in the online supporting information for the complete list of databases and the search string). The preliminary database search identified 880 sources. The second step in the process involved screening the titles and abstracts for initial eligibility to reduce the results to studies that met our inclusion criteria. As a result, we identified 156 potentially eligible studies for the meta-analysis. Third, a full-text review was conducted to screen for studies that were centered on the populations of interest, included a dimension of organizational justice as an independent variable and a work-related dependent variable, and used a quantitative methodology. This step yielded 109 articles for inclusion in the meta-analysis. Studies were excluded from the meta-analysis primarily because they did not include a measure of organizational justice as an independent variable, did not list the individual items used for creating the justice scale, or did not provide enough statistical information to be included in the analysis. Finally, reference lists from these studies were reviewed to identify potentially eligible studies that were not captured during the database search. An additional 34 studies were identified through this step.

Our final sample comprised 143 empirical studies that contained 1,924 effect size estimates. These studies represent a total of 70,360 individual cases from 95 unique data sets and 17 countries.

3.2 Effect size estimate

The dependent variable of interest in a meta-analysis is the effect size estimate (ESE). We calculated ESEs of the relationship between organizational justice and criminal justice employee work-related outcomes. The ESE calculation was guided by the work of Pratt, Turanovic, and colleagues (Pratt, Turanovic, Fox, & Wright, 2014; see also Pratt & Cullen, 2000, 2005; Pratt, Turanovic, & Cullen, 2016; Pyrooz, Turanovic, Decker, & Wu, 2016; Turanovic, Pratt, & Piquero, 2017). Specifically, we used the zero-order correlation coefficient (r) for bivariate ESEs and standardized regression coefficients for multivariate ESEs (Hedges & Olkin, 2014; Peterson & Brown, 2005). Either an r or a standardized regression coefficient was reported by most studies in our meta-analysis. Approximately 6 percent of the effects relied only on unstandardized regression coefficients and standard errors. In these cases, we converted a t ratio into an r using the following equation: r = √(t²/(t² + n − 2)) (Rosenthal, 1994). Next, each ESE was converted into a z(r) score using Fisher's r-to-z transformation2 (Blalock, 1972; Hedges & Olkin, 2014; Overton, 1998; Rosenthal, 1994): z(r_i) = ½ ln[(1 + r_i)/(1 − r_i)]. This was done because z(r) scores approximate normality and the transformation can be applied to bivariate and multivariate effects (Field & Gillett, 2010; Hedges & Olkin, 2014). As Pratt et al. (2014, p. 94) noted, "Normally distributed effect size estimates are necessary for the accurate determination of mean effect size estimates and for unbiased tests of statistical significance."
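To make these conversion steps concrete, the sketch below implements the t-to-r conversion, the small-sample adjustment described in footnote 2, and the Fisher transformation in Python. It is our own illustration of the formulas reported in the text, not the authors' code, and the example values are hypothetical.

```python
import numpy as np

def t_to_r(t, n):
    # Convert a t ratio to the correlation metric: r = sqrt(t^2 / (t^2 + n - 2)),
    # keeping the sign of the original coefficient.
    return np.sign(t) * np.sqrt(t**2 / (t**2 + n - 2))

def adjust_r(r, n):
    # Small-sample adjustment applied before transformation (footnote 2):
    # r_i = r - [r(1 - r^2) / (2(n - 3))].
    return r - (r * (1 - r**2)) / (2 * (n - 3))

def fisher_z(r):
    # Fisher's r-to-z transformation: z(r) = (1/2) ln[(1 + r) / (1 - r)].
    return 0.5 * np.log((1 + r) / (1 - r))

# Hypothetical effect: an unstandardized coefficient with t = 4.2 from a sample of n = 300.
r = t_to_r(4.2, 300)
z = fisher_z(adjust_r(r, 300))
print(round(r, 3), round(z, 3))  # approximately 0.236 and 0.241
```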

3.3 Moderator variables

3.3.1 Organizational justice measurement

One key problem within our discipline's organizational justice literature is the lack of consistency in measurement across studies. To assess the impact of this variation on the magnitude of the organizational justice effect, we coded every effect size included in our meta-analysis for what type of justice-related scale(s) was used. Some effect sizes were estimated with scales of organizational justice that included survey items tapping into all four dimensions of the concept. We used a dummy variable—full OJ scale—when this was the case (1 = yes, 0 = no). At least one survey item from each of the four dimensions of organizational justice must have been included in the scale to be coded in this manner. Other effect sizes were estimated using combination scales that included items from several of the justice dimensions but not all of them. Combination OJ scale was coded 1 if the scale included at least one item from multiple justice dimensions but not all four. For example, a scale would be coded as a combination scale if it included items capturing procedural and distributive justice but no items from the other two dimensions. Researchers sometimes only captured employees' justice perceptions using items from a single dimension of organizational justice. Procedural justice is a dummy variable coded 1 if the scale contained survey items tapping into only procedural justice. Dummy variables for distributive justice, interpersonal justice, and informational justice were coded in a similar way. We are not assuming that organizational justice is best measured as a unitary construct. Our goal is to shed light on whether the type of scale impacts the overall ESE.3

We also constructed variables that capture the percentage of survey items in a given scale that tap into each of the justice dimensions: percent procedural justice, percent distributive justice, percent interpersonal justice, and percent informational justice. These variables allowed for us to assess the extent to which the percentage of items from a specific justice dimension impacted the magnitude of the ESE. Many studies, especially those comprising nontraditional organizational justice labels (e.g., organizational support or supervisor support), contained scale items that were not related to organizational justice or did not fit clearly into a single dimension of justice (e.g., the survey item "Overall, I have been treated fairly at this department" could be considered a measure of distributive or procedural justice). Rather than exclude these studies entirely from the meta-analysis, we coded for whether the scale was contaminated with a binary variable (1 = yes, 0 = no) and percent other to capture the percentage of contaminated items in the scale. These variables allowed for us to assess whether scales including any nonjustice or ambiguous survey items impacted the ESEs.

Colquitt's (2001) description of each justice dimension guided our coding of all survey items used in the justice-related scales for all effect sizes included in the meta-analysis. Specifically, a survey item was considered procedural justice if it captured employees' ability to voice their views during procedures or the ability to influence the outcome (Thibaut & Walker, 1975). Items were also coded as procedural justice if they tapped into employees' perceptions of process consistency, bias suppression/neutrality, accuracy (i.e., procedures based on accurate information), correctability (i.e., ability to correct errors in the process), or representation (i.e., all impacted groups are listened to; Leventhal, 1980). Survey items were coded as distributive justice if they captured employees' perceptions concerning whether the distribution of outcomes in their agency (e.g., rewards or punishments) is based in accordance with their contributions (i.e., equity; Adams, 1965; Deutsch, 1975; Leventhal, 1976). Survey items were coded as interpersonal justice if they captured employees' perceptions of whether their supervisors treat them with respect and propriety (Bies & Moag, 1986; Colquitt, 2001; Greenberg, 1990). Finally, survey items were coded as informational justice if they measured employees' perceptions of the candidness/truthfulness of their supervisors or the extent to which supervisors sufficiently explain the reasons for their decisions (Bies & Moag, 1986; Greenberg, 1990).

2 Prior to conducting the Fisher's r-to-z transformation, all ESEs were transformed using the following equation: r_i = r − [r(1 − r²)/(2(n − 3))]. This approach helps remove the slight positive bias produced by the Fisher transformation (Hedges & Olkin, 2014; Overton, 1998).

3 A study may have multiple organizational justice scales and effect sizes. For example, the researchers behind the study could have estimated the effect of a full organizational justice scale—one containing at least one item from each of the dimensions—and the effect of each of the four dimensions using separate scales. In such a case, we would have coded for five justice effect sizes—the first being a "full OJ scale" and the others coded as the respective justice dimensions.
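As a concrete illustration of the scale-coding scheme in section 3.3.1, the sketch below codes a single hypothetical scale into the dummy and percentage moderators described above. The item-to-dimension tags are invented for illustration; only the coding logic follows the text.

```python
from collections import Counter

# Hypothetical scale: each survey item is tagged with the justice dimension it taps
# ("procedural", "distributive", "interpersonal", "informational", or "other" for
# contaminated/ambiguous items). These tags are illustrative only.
scale_items = ["procedural", "procedural", "distributive", "interpersonal", "other"]

counts = Counter(scale_items)
n_items = len(scale_items)
justice_dims = {"procedural", "distributive", "interpersonal", "informational"}
dims_present = {d for d in counts if d != "other"}

moderators = {
    # Scale-type dummies (1 = yes, 0 = no); single-dimension dummies for the other
    # three dimensions would be coded analogously to procedural_only.
    "full_oj_scale": int(dims_present == justice_dims),
    "combo_oj_scale": int(1 < len(dims_present) < 4),
    "procedural_only": int(dims_present == {"procedural"}),
    "contaminated": int(counts["other"] > 0),
    # Percentage of scale items from each dimension, plus contaminated items
    **{f"percent_{d}": 100 * counts[d] / n_items for d in justice_dims},
    "percent_other": 100 * counts["other"] / n_items,
}
print(moderators)  # e.g., combo_oj_scale = 1, contaminated = 1, percent_procedural = 40.0
```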

3.3.2 Type of outcome

Numerous work outcomes have been explored in the organizational justice literature in our discipline. We coded all possible beneficial and counterproductive work-related dependent variables from each study into one of nine outcome types for the meta-analysis (each dummy coded; 1 = yes, 0 = no). Organizational commitment is a dummy variable that includes survey items or scales that tapped into employees' commitment to their agency (e.g., "I have a strong attachment to the force"). Organizational citizenship behaviors/attitudes (hereafter, OCBA) includes outcomes that measure employees' willingness to go beyond what is required of their job (e.g., "I am happy to take on extra work to help other people"). Study outcomes were coded as job satisfaction if they tapped into employees' general satisfaction with their job or pay (e.g., "Overall, I am satisfied with my job"). Compliance measures employees' adherence to rules, policies, or laws regarding their jobs (e.g., "I do what my supervisor asks me to do"). It is common in organizational justice studies to measure employees' nonjustice perceptions of their supervisors because greater feelings of justice should be associated with more favorable views of their supervisor. As such, we coded study outcomes as view of supervisor if the item or scale tapped into employees' general satisfaction with their supervisors' effectiveness (e.g., "There is adequate leadership direction"). View of public includes outcomes in which employees' perceptions of members of the public were measured, including citizens or incarcerated individuals (e.g., "The people in the community I patrol approach life with a strong moral code"). Attitudinal/emotional reactions captures employees' negative or positive attitudes or emotional reactions to their job. For example, items or scales referred to as job stress (e.g., "When I'm at work, I often feel tense or uptight"), self-legitimacy (e.g., "I have confidence in the authority vested in me as a law enforcement officer"), or job burnout (e.g., "I feel that I treat some inmates as if they were impersonal objects") were coded into this category. Role-related perceptions include outcomes that measured employees' perceptions of issues such as job autonomy (e.g., "I have flexibility in how and when to do my job duties"), job variety (e.g., "My job requires that I must constantly learn new things"), role clarity (e.g., "I know what my responsibilities are"), or role overload (e.g., "It often seems like I have too much work for one person to do"). Lastly, internal factors captures outcomes that measured issues such as employees' perceptions of colleague support (e.g., "I feel supported in my work by my colleague officers").

3.3.3 Confounding mechanisms

In the criminal justice literature, authors often account for potential confounding mechanisms in statistical models to examine the impact of organizational justice on work outcomes. We coded for the confounding mechanisms studies included in multivariate models (each dummy coded; 1 = yes, 0 = no). Not surprisingly, many of these variables overlap with the outcome variables described previously. These variables included organizational commitment, compliance, view of supervisor, view of public, attitudinal/emotional reactions, role-related perceptions, and internal factors. We also accounted for external factors if researchers controlled for employees' views regarding issues such as family support/conflict (e.g., "I have people in my family that I can talk to about the problems I have at work" or "I frequently argue with my spouse/family members about my job").

3.3.4 Research design and sample characteristics

Lastly, we included several measures regarding research design and sample characteristics as moderator variables in our meta-analysis. These factors include whether the ESE was drawn from a study comprising a longitudinal design (1 = yes, 0 = cross-sectional) or from a model using ordinary least-squares regression (OLS; 1 = yes, 0 = no). Concerning sample characteristics, we accounted for whether the ESE was drawn from a police sample (1 = yes, 0 = corrections-based sample) or a USA sample (1 = yes, 0 = international sample).4 We also accounted for the gender composition of the samples with three dummy variables: male sample (1 = all male sample), female sample (1 = all female sample), and mixed gender sample (1 = mixture of males and females). Finally, we used three dummy variables to account for the rank composition of the sample: line level-only sample (1 = all line-level sample), supervisor-only sample (1 = all supervisor sample), and mixed rank sample (1 = mixture of line-level officers and supervisors).5

3.4 Analytic strategy

Our sample of ESEs is nested within a hierarchical structure whereby level 1 contains the ESEs (n = 1,924), level 2 corresponds with the individual studies (n = 143), and level 3 contains the independent data sets used across the studies (n = 95; Snijders & Bosker, 2012). Accordingly, we used the multilevel modeling (MLM) procedure advocated by Pratt et al. (2014). Use of MLM allowed for us to analyze all effect sizes derived from all studies included in our meta-analysis (rather than picking a single effect size for inclusion) while accounting for the nonindependent nature of the data. MLM is advantageous because "several sources of dependence (e.g., dependence in effect size estimates both between and within studies and data sets) can be accounted for simultaneously" by incorporating a random effect for each level of analysis into the model (Pratt et al., 2014, p. 97). As Pratt et al. (2014) and others have noted (Hox & de Leeuw, 2003), some variance in the ESEs is assumed to be known. We account for this variation by calculating standard errors from information reported in each study. The following equation was used to calculate standard errors for bivariate ESEs: σ = √(1/(n − 3)) (Lipsey & Wilson, 2001). Multivariate ESE standard errors were calculated using

4 Approximately 51.4 percent of the ESEs were drawn from studies comprising police officer samples and 46.4 percent from corrections officer samples. No studies included judicial samples. Approximately 2.2 percent of the ESEs were derived from samples containing probation or parole officers. Accordingly, they were combined with the corrections samples. Approximately 60.1 percent of ESEs were drawn from U.S. samples.

5 It was common for scholars not to report enough information pertaining to the age (43 percent missing), race (57 percent missing), educational (48 percent missing), or experience (47 percent missing) composition of their samples. This prevented us from considering these factors as moderating variables in our analyses.

σ = Fisher's z(r)/(b/SE), where b is the unstandardized regression coefficient and SE is the corresponding standard error reported for each organizational justice effect.6 In this "variance-known model," the standard error of the ESE is incorporated as a predictor variable in the random part of level 1 in the MLM equation and the variance is constrained to one.

Our meta-analysis proceeded in two stages, the first of which involved the estimation of the overall mean effect sizes between organizational justice and criminal justice employee work outcomes. This step allowed for us to assess the strength of the organizational justice effect across all studies included in the sample. In the second stage, we conducted a series of moderator analyses to determine whether the effects of organizational justice on work outcomes are robust across different measurement strategies, types of outcomes, research designs, or sample characteristics. The results from this step speak to the stability of the organizational justice effect across methodological variations and to the theoretical generality of the effect. Stata 14's meglm with maximum likelihood estimation was used to estimate all variance-known MLMs.
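The two standard error formulas can be summarized in a short sketch using hypothetical values; this is our own illustration, and the variance-known MLM itself (estimated in Stata) is not reproduced here.

```python
import numpy as np

def se_bivariate(n):
    # Standard error of a Fisher-z ESE from a bivariate model: sqrt(1 / (n - 3)).
    return np.sqrt(1.0 / (n - 3))

def se_multivariate(z_r, b, se_b):
    # Standard error of a Fisher-z ESE from a multivariate model:
    # the transformed effect divided by the coefficient's t ratio (b / SE).
    return z_r / (b / se_b)

# Hypothetical examples: a bivariate effect from n = 250, and a multivariate effect
# with z(r) = 0.21 estimated from b = 0.30 (SE = 0.08).
print(round(se_bivariate(250), 4))                  # ~0.0636
print(round(se_multivariate(0.21, 0.30, 0.08), 4))  # ~0.056
```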

4 RESULTS

The first step in MLM is to estimate an unconditional random intercept model (i.e., with no predictors) to determine whether significant between-group variation exists, which in this case refers to variation between studies and between data sets. The level 1 unconditional variance component was .037 (p < .001), which indicates that significant variation between ESEs exists within studies. At level 2, the unconditional variance component was .012 (p < .001), indicating significant variation in ESEs between studies. The intraclass correlation coefficient (ICC) was .301, which indicates that an estimated 30 percent of the total variation in the organizational justice ESE is attributable to between-study differences. The unconditional variance component at level 3 was .004 (p = .124), and the ICC was .072, which indicates that an estimated 7 percent of the variation in the ESEs lies between data sets. Therefore, a portion of the variation in ESEs exists at each level of analysis, which supports the use of variance-known MLMs.
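For readers who want to trace the arithmetic, the reported ICCs can be approximately reproduced from the three variance components under the conventional three-level formulas; this is our reconstruction, as the exact expressions used are not spelled out in the text.

```python
# Unconditional variance components reported in the text (levels 1, 2, and 3).
v1, v2, v3 = 0.037, 0.012, 0.004
total = v1 + v2 + v3

icc_between_studies = (v2 + v3) / total  # share of variance at or above the study level, ~0.302
icc_between_datasets = v3 / total        # share of variance at the data-set level, ~0.075
print(round(icc_between_studies, 3), round(icc_between_datasets, 3))
# Close to the reported .301 and .072 (differences reflect rounding of the components).
```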

4.1 Overall strength of the organizational justice effect

We estimated the mean organizational justice effect size across three models. The models control for the grand-mean-centered sample size and number of effect sizes contained in the study from which the ESE was derived. Therefore, the mean ESEs presented in table 1 are model intercepts and are interpreted as the expected effect size when the sample size and number of effect sizes per study are held constant at their respective means. In the first model, the overall mean effect size was estimated, which represents the average organizational justice effect within the pooled sample of bivariate and multivariate effect sizes, net of study sample size and number of effects per study. The overall mean effect size for organizational justice on criminal justice employee work outcomes (Mz) was .256 (p < .001; confidence interval [CI] = .230–.283). Overall, organizational justice has a sizeable effect on criminal justice employees' work outcomes. The second and third models presented in table 1 reflect the mean organizational justice effect size for bivariate (Mz = .348, p < .001; CI = .311–.385)

6 Consistent with the results reported in other meta-analyses (Pratt, Turanovic, Fox, & Wright, 2014; Pyrooz, Turanovic, Decker, & Wu, 2016), some researchers did not report enough information to calculate a standard error for the ESE (n = 425; 22 percent of ESEs). We used MICE (10 imputations) to impute the missing standard errors (Allison, 2002; Carlin, Galati, & Royston, 2008; Royston, 2005). The missingness equation included all variables used in our analysis and √(1/(n − 3)).

TABLE 1 Mean effect size estimates for organizational justice on work outcomes

                           Mean ESE   SE     z        95% CI
Overall effect size        .256**     .014   18.947   .230–.283
Bivariate effect size      .348**     .019   18.393   .311–.385
Multivariate effect size   .198**     .014   14.163   .170–.225

Notes: There are 1,924 overall ESEs, 1,066 bivariate ESEs, and 858 multivariate ESEs. Each model controls for the grand-mean-centered sample size and number of effect sizes per study.
Abbreviations: CI = confidence interval; ESE = effect size estimate.
**p < .01 (two-tailed test).

and multivariate (Mz = .198, p < .001; CI = .170–.225) statistical models, respectively. These results indicate that organizational justice has a meaningful effect on criminal justice employees' work outcomes. An examination of the ESEs across studies showed that ~4.0 percent of the bivariate and ~11.5 percent of the multivariate effects were negative (counter to theoretical expectations). Additionally, 22.5 percent of the bivariate and 40.9 percent of the multivariate effects were statistically insignificant. The bivariate ESEs ranged from –.662 to 2.289 and the multivariate ESEs from –.534 to 1.127. These findings underscore the importance of examining whether any moderator variables impact the organizational justice ESE. We turn to these potential relationships now.
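Because the mean ESEs in table 1 are on the Fisher z scale, it can help interpretation to convert them back to the correlation metric with the standard inverse transformation r = tanh(z); this back-transformation is our own illustration rather than a statistic reported by the authors.

```python
import numpy as np

# Mean ESEs from table 1 (Fisher z scale) converted back to the correlation metric.
for label, z in [("overall", 0.256), ("bivariate", 0.348), ("multivariate", 0.198)]:
    print(label, round(float(np.tanh(z)), 3))
# overall ~0.25, bivariate ~0.33, multivariate ~0.20
```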

5 MODERATOR VARIABLES AND THE STABILITY OF THE ORGANIZATIONAL JUSTICE EFFECT

The second stage of our analysis involved assessing whether a series of moderator variables impacted the strength of the organizational justice effect on work outcomes. Table 2 presents the results from these moderator analyses across organizational justice operationalization, outcome type, research design, and sample characteristics. A separate variance-known MLM regressed ESE onto each moderator variable and controlled for the grand-mean-centered sample size and number of effect sizes per study. Several important findings emerged from these analyses. For starters, the model intercepts are the average justice ESE after controlling for differences in measurement strategy, outcome type, confounding mechanisms, research design, or sample characteristics in the respective models. The model intercepts reveal that justice perceptions have a consistent, sizeable effect on work outcomes across a wide range of methodological differences. At the same time, however, many moderator variables were shown to be significant predictors of the ESE. The first set of moderators in table 2 deals with the operationalization of organizational justice. The results of the analyses revealed that studies tended to have larger ESEs if organizational justice was operationalized with a full scale (b = .043, p < .10)—a scale constructed with at least one item from each of the four dimensions—or a scale containing only procedural justice survey items (b = .025, p < .10) in comparison with all other measurement strategies. The predicted ESE was smaller, however, if a study included a scale with only distributive justice (b = –.044, p < .01) or informational justice (b = –.037, p < .05) scale items. It is important to note that these results do not necessarily reveal that operationalizing organizational justice with a single scale comprising all four dimensions is the correct measurement strategy. Indeed, using separate scales may be justified if the individual dimensions in a study are empirically distinct or the research question requires examining a specific justice dimension. These results demonstrate which types of scales tend to be more strongly related to work outcomes. Also, we examined whether the presence of scale items that were unrelated to justice or were too ambiguous to be coded into a justice dimension were associated with the ESE (i.e., contamination). 14 WOLFE AND LAWSON .279 .276 .278 .288 .281 .286 .268 .275 .289 .283 .286 .275 .280 .300 .278 .288 .280 .282 .294 .279 .276 (Continues) − − − − − − − − − − − − − − − − − − − − − .225 .219 .223 .235 .228 .232 .208 .219 .235 .228 .232 .217 .227 .246 .225 .234 .227 .229 .239 .226 .221 ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** .252 .251 .262 .259 .238 .247 .262 .246 .273 .251 .261 .254 .267 .252 .249 .259 .255 .254 Model Intercept Intercept 95% CI † † ** * * * * † ** * * ** ** ** ** z .921 .716 2.261 7.502 1.669 2.698 2.416 3.107 2.406 3.077 1.853 1.363 2.888 − − − − − − − .001 .001 .017 .026 .016 .022 .042 .013 .023 .016 .044.037 .014 .015 .036 .012 .003.001.006 .001 .002 2.021 .003 .347 1.913 .255 .003 .001 .127 .043 .032 .053 .042 .025 .031 .045 − − − − − − − a a a a 19.4%) 16.1%) 19.1%) = = 31.6%) = = a 21.3%) = Do methodological differences impact the organizational justice effect size estimates? 
TABLE 2 Do methodological differences impact the organizational justice effect size estimates?

Table 2 reports, for each moderator, the coefficient, standard error, and z value from a separate variance-known MLM, along with the model intercept and its 95% confidence interval. The moderators are grouped as follows:
Type of measurement: full OJ scale (147), combo OJ scale (648), procedural justice scale (429), distributive justice scale (279), interpersonal justice scale (173), informational justice scale (248), contaminated (673), percent procedural justice items, percent distributive justice items, percent interpersonal justice items, percent informational justice items, percent other items.
Type of outcome: organizational commitment (263), OCBA (268), job satisfaction (194), compliance (154), view of supervisor (97), view of public (53), attitudinal/emotional reactions (495), role-related perceptions (226), internal factors (174).
Confounding mechanisms: controls for confounding mechanism (721), organizational commitment (108), compliance (108), view of supervisor (108), view of public (66), attitudinal/emotional reactions (531), role-related perceptions (280), internal factor (194), external factor (89).
Research design: longitudinal (26), OLS (667).
Sample characteristics: police sample (989), USA sample (1,156), male sample (64), female sample (65), mixed gender sample (1,591), line level-only sample (517), supervisor-only sample (41), mixed rank sample (948).

Notes: All estimates are based on the full sample of effect size estimates (N = 1,924), except the gender subsamples (N = 1,713) and rank subsamples (N = 1,501). All moderator variables are binary (1 = yes, 0 = no) unless otherwise noted. For binary variables, the frequency of effect size estimates coded "1" is reported in parentheses.
Abbreviations: CI = confidence interval; OCBA = organizational citizenship behaviors/attitudes; OJ = organizational justice; OLS = ordinary least-squares regression.
†p < .10; *p < .05; **p < .01 (two-tailed test).

The results showed that contaminated justice scales tended to have larger ESEs than did scales with purely organizational justice survey items (b = .042, p < .05). Therefore, researchers risk artificially inflating the magnitude of the organizational justice effect if they include scale items unrelated to the dimensions of organizational justice. Another way to examine the measurement issue is to assess whether the percentage of survey items from each justice dimension is associated with the ESE. The findings from the analyses revealed that a higher percentage of procedural justice items in the scale corresponded with larger ESEs (b = .003, p < .05; coefficient multiplied by 10 to stay within three decimal places), whereas more distributive justice items in the scale resulted in smaller ESEs (b = –.003, p < .05; coefficient multiplied by 10). The percentage of interpersonal or informational justice items in the scale was not associated with the size of the ESE. Scales with a higher percentage of items that were not categorized into one of the four justice dimensions (i.e., contaminated) tended to have slightly larger ESEs (b = .006, p < .10; coefficient multiplied by 10). Taken together, the results from this set of moderator analyses reveal that measurement strategy impacts the magnitude of the organizational justice effect size.

Next, we examined whether the magnitude of the organizational justice effect varied across different work outcomes. Nearly all the outcome moderators were significantly associated with the ESE. Job satisfaction (b = .041, p < .05), view of supervisor (b = .073, p < .01), role-related perceptions (b = .045, p < .01), and internal factors (b = .090, p < .01) tended to be more strongly related to organizational justice compared with other outcome variables. Conversely, organizational citizenship behaviors/attitudes (b = –.127, p < .01), compliance (b = –.053, p < .05), and attitudinal/emotional reactions (b = –.036, p < .01) all had weaker relationships with organizational justice when compared with all other outcomes. Studies that controlled for any intervening mechanisms had smaller organizational justice ESEs than did those that did not include such variables (b = –.159, p < .01). Failure to account for such confounding mechanisms in predictive models will artificially inflate the magnitude of the organizational justice effect size.

In the next set of analyses, we examined whether research design impacted the ESE. No statistically significant difference in the magnitude of the ESE was observed between longitudinal and cross-sectional studies. Only 1.35 percent of effect sizes in the meta-analysis, however, were derived from studies involving longitudinal samples. From a modeling standpoint, researchers using OLS (n = 667, 80.2 percent of multivariate effects) tended to report smaller ESEs compared with those using other models (b = –.114, p < .01).

The final set of moderator variables dealt with sample characteristics. Gender and rank composition of the sample seemed to impact the ESE. Specifically, mixed-gender samples tended to have larger ESEs versus samples containing only men or women (b = .121, p < .01). Female-only samples reported weaker organizational justice effects than did male-only or mixed-gender samples (b = –.065, p < .10). On average, samples with line level-only respondents (b = –.045, p < .10) had smaller ESEs than did samples containing a mixture of line-level and supervisor employees (b = .048, p < .10).
The results of the moderator analyses indicate that organizational justice measurement strategy, outcome type, intervening mechanisms, research design, and sample characteristics impact the magnitude of the organizational justice effect size. Yet, such analyses are limited because the moderator variables may be correlated with one another. As Pratt et al. (2014) noted, the inclusion of certain intervening mechanisms may be related to the magnitude of the ESE only because it is confounded with whether the study used a multivariate versus bivariate modeling strategy. In the last portion of our meta-analysis, we attempt to disentangle the effects of the moderators on the organizational justice ESE by using a multivariate equation of our own.

TABLE 3 Multivariate analysis of moderator impact on organizational justice effect size estimates

                                                Full Sample (Model 1)    Multivariate Sample (Model 2)
Moderator Variables                             Coeff.      SE           Coeff.      SE
Fixed Effects
  Intercept                                     .196**      .067         .065        .065
Level 1 Moderators
  Full OJ scale                                 .115**      .032         .087**      .032
  Contaminated                                  .036†       .019         .022        .022
  Organizational outcome                        .013        .012         .002        .018
  Controls for intervening mechanisms           –.011       .023         –.046*      .023
  Longitudinal                                  –.035       .083         –.022       .084
  Multivariate statistical model                –.156**     .023         –           –
  Police sample                                 .002        .030         .059*       .026
  USA sample                                    .005        .029         .032        .026
  Mixed gender sample                           .100†       .057         .087        .055
  Mixed rank sample                             .015        .027         .011        .023
  Sample size (b)                               .007        .007         –.124*      .006
Level 2 Moderators
  Number of effect sizes per study (a)          –.020*      .009         –.020*      .009
Random Effects
  Level 1 – Effect Size Estimates
    Variance between models, within studies     .026**      .001         .022**      .002
  Level 2 – Study
    Variance between studies, within data sets  .007**      .002         .006**      .002
  Level 3 – Data Set
    Variance between data sets                  .003†       .002         .000        .000
N (ESEs)                                        1,367                    623

Notes: All moderator variables are binary (1 = yes, 0 = no), except sample size and number of coefficients per study (both continuous).
Abbreviations: Coeff. = coefficient; ESEs = effect size estimates; OJ = organizational justice.
(a) Coefficient and standard error multiplied by 10.
(b) Coefficient and standard error multiplied by 10,000.
†p < .10; *p < .05; **p < .01 (two-tailed test).

Table 3 presents the results of two variance-known MLMs that simultaneously regressed ESEs on the various moderator variables. A separate equation was estimated for the full sample of ESEs (model 1) and for those derived from multivariate models (model 2). From a measurement standpoint, our main concern was with whether the full organizational justice scale was associated with variation in the ESEs compared with other measurement alternatives that do not include all four dimensions of the concept. A full organizational justice scale, however, is not necessarily better than other measurement strategies. Rather, this approach allows us to assess whether a scale that contains all four dimensions of the construct is associated with ESEs in a different way than other measurement strategies. Thus, the full OJ scale and contaminated indicators were included in the equations. Multicollinearity prevented us from including all the outcome moderators in the equations. Rather, we calculated a new binary variable labeled "organizational outcome" that was coded 1 if the outcome dealt with perceptions of or attitudes toward the organization (i.e., organizational commitment, OCBA, compliance, view of supervisor, and internal factors) and coded 0 if it dealt with a "personal outcome" (i.e., job satisfaction, view of public, attitudinal/emotional reactions, and role-related perceptions). This coding is consistent with McFarlin and Sweeney's (1992) distinction between organizational and personal outcomes. We accounted for whether the research design was longitudinal, whether the analysis was multivariate, and whether the sample was police-only, United States-based, mixed gender, or mixed rank. We also controlled for the grand-mean-centered sample size and the number of effect sizes per study.

Model 1 in table 3 presents the results of our multivariate MLM for the full sample of ESEs. The findings of the analysis reveal that full organizational justice scales tended to yield larger ESEs compared with other measurement strategies (b = .115, p < .01). Supplemental analyses (not reported) included all of the binary justice measures in the equation, with one justice measure removed as the reference category. A separate equation was estimated for the removal of each measurement variable. In each model, the full organizational justice scale was significantly and positively associated with the ESEs. As a result, we can be confident that the full scale produces larger effect sizes than any of the other scales. It is also important to note that contaminated scales (b = .036, p < .10) and mixed-gender samples (b = .100, p < .10) tended to have larger effect sizes. ESEs drawn from multivariate statistical models tended to be smaller than those derived from bivariate analyses (b = –.156, p < .01).

Model 2 in table 3 presents the results from the same equation but restricted to the subsample of ESEs that were drawn from multivariate models. The full organizational justice scale yielded larger ESEs than did the other scales (b = .087, p < .01). Researchers who controlled for intervening mechanisms (b = –.046, p < .05) or those with larger sample sizes (b = –.124, p < .05; coefficient multiplied by 10,000) had smaller effect sizes. Police samples had larger multivariate ESEs compared with corrections samples (b = .059, p < .05).
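For readers who want the structure behind table 3 written out, one plausible way to express a three-level model of the kind described here is sketched below. The notation is ours and is offered as an illustration, not a reproduction of the authors' estimation equations:

$$
\begin{aligned}
ES_{ijk} &= \gamma_{0} + \sum_{p} \gamma_{p} X_{p,ijk} + e_{ijk} + r_{jk} + u_{k},\\
e_{ijk} &\sim N\big(0, \sigma^{2}_{e}\big), \qquad r_{jk} \sim N\big(0, \sigma^{2}_{r}\big), \qquad u_{k} \sim N\big(0, \sigma^{2}_{u}\big),
\end{aligned}
$$

where $ES_{ijk}$ is effect size estimate $i$ drawn from study $j$ based on data set $k$, the $X_{p,ijk}$ are the moderators listed in table 3, the known sampling variance of each estimate enters estimation as a precision weight (the "variance-known" element), and $\sigma^{2}_{e}$, $\sigma^{2}_{r}$, and $\sigma^{2}_{u}$ correspond to the level-1, level-2, and level-3 variance components reported at the bottom of the table.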

6 DISCUSSION

After reviewing the literature, we find that a justice system employee's perception of organizational justice is an important predictor of work-related outcomes. Research findings of this type are important because many employee outcomes that stem from supervisor fairness ultimately impact the citizens the system serves. Organizational justice, for example, has been shown to be positively associated with police officers' willingness to treat citizens with procedural fairness and with correctional officers having less cynical views of rehabilitation efforts (Lambert, Hogan, & Barton-Bellessa, 2011; Tankebe, 2014). In this way, organizational justice is the key to the success of the criminal justice "business." At the same time, however, there has been a great deal of confusion in the literature concerning 1) how much organizational justice matters, 2) whether measurement decisions impact the magnitude of the organizational justice effect, and 3) whether moderators impact the justice effect. We conducted a meta-analysis to answer these questions. Consistent with results from traditional business-setting meta-analyses (Cohen-Charash & Spector, 2001; Colquitt et al., 2001), we found organizational justice has a sizeable effect on criminal justice employees' work-related outcomes (Mz = .256).7 To date, the broader literature on justice system employees' work outcomes lacks guiding theoretical frameworks. Our findings indicate that organizational justice theory is a useful framework to leverage as we move toward a deeper understanding of the factors that predict employees' work outcomes.

7 To compare our results with other predictor variables criminologists may be familiar with, the mean organizational justice effect size is similar in magnitude to attitudinal self-control (Mz = .257) and (Mz = .225), which are considered some of the most robust predictors of criminal behavior (Pratt & Cullen, 2000).
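As a further guide to the magnitude of the overall estimate reported above (Mz = .256), and assuming Mz denotes the mean Fisher's z-transformed correlation (the conventional reading of that notation), the estimate can be back-transformed to the correlation metric:

$$
r = \tanh(M_z) = \frac{e^{2M_z} - 1}{e^{2M_z} + 1} = \frac{e^{2(.256)} - 1}{e^{2(.256)} + 1} \approx .25
$$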

Police and correctional officers seem to care a great deal about how fairly they are treated by their supervisors. Such employees face a lot of uncertainty in their jobs—for example, the constant threat of danger, civil lawsuits for wrongdoing, and difficult-to-understand bureaucratic decision-making. Consistent with uncertainty management theory (Van den Bos, 2001), police and correctional officers likely use supervisor organizational justice as a way to manage such uncertainty (Wolfe, Rojek, Manjarrez, & Rojek, 2018). We encourage future researchers interested in justice system employees' work outcomes to consider the theoretical and empirical importance of organizational justice.

We found the organizational justice effect size varied across different measurement strategies—scales containing all four dimensions tended to produce larger effect sizes. Organizational justice theorists clearly argue that employees' work outcomes are predicted by a combination of all four justice dimensions, and the findings from our meta-analysis show support for this argument. Our advice for researchers is to use organizational justice scales containing survey items from all four dimensions of the concept. This advice should only be heeded if the goal of a research study is to understand the impact of organizational justice on justice system employees' work outcomes. Plenty of research questions could be addressed concerning the impact of only one dimension of justice on work outcomes. In this case, it would be perfectly acceptable to use scales containing items from a specific justice dimension. If, however, researchers use a scale that does not contain survey items from all four justice dimensions, the authors should clearly articulate the conceptualization and operationalization of the specific justice construct under consideration. Doing so is necessary because each dimension is focused on a different aspect of supervisor fairness. Failing to explain a justice construct in full will lead to further theoretical confusion in the literature.

Relatedly, we found that organizational justice scales that contained a higher percentage of procedural justice items tended to have larger effect sizes. The opposite was true for scales that had a higher percentage of distributive justice items. From a practical research standpoint, these findings indicate that the organizational justice effect size will be inflated if the scale has more procedural justice items and deflated if the scale contains more distributive justice items. Again, if the goal of a study is to understand broadly the effect of organizational justice, it is probably best to ensure the scale has an equal distribution of items from each of the four justice dimensions.

During the coding stage of our study, we noted several problems with the wording of individual survey items used in organizational justice scales. For example, Nix and Wolfe (2016) used the following item to capture police officers' perceptions of distributive justice: "Command staff treats employees the same regardless of their gender." This measure of distributive justice is commonly used in the criminal justice literature, but it is problematic because it is difficult to ascertain whether the item references procedures (procedural justice) or outcomes (distributive justice).8 Recall that distributive justice items should tap into employees' perceptions of outcome equity (Colquitt, 2001).
As such, better survey items could include the following: "Promotions in my agency are not based on employees' gender" or "Promotions reflect what employees have contributed to the agency." Our point is that care should be given to the wording of survey items and how accurately they capture the dimensions of organizational justice. Failure to follow this advice will result in theoretically ambiguous results. And, of course, avoiding this problem starts with sound theoretical arguments at the front end of research papers that carefully articulate the specific justice dimensions under consideration.

8 For our purposes, such ambiguous items were coded as "other" and the overall scale would be coded as "contaminated."

We also found that the type of outcome influenced the organizational justice effect size. Criminal justice employees' perceptions of organizational justice seem to be more strongly tied to job satisfaction, views of supervisors, role-related perceptions, and internal factors when each is compared with all other outcomes. Conversely, organizational justice has a weaker relationship with organizational citizenship behaviors/attitudes, compliance, and attitudinal/emotional reactions. It is important to note that these findings do not imply that organizational justice is unrelated to these outcomes. Rather, the effect size tends to be smaller in comparison with the aforementioned outcomes. Although organizational justice has a different relationship with certain outcome variables, it is interesting that the model intercepts reported in table 2 are consistent. Thus, the average organizational justice effect does not vary greatly across the different outcome variables. Researchers can expect slight differences in the magnitude of the organizational justice effect across different outcomes, but they should also anticipate a "general" organizational justice effect. Organizational justice is associated with a wide range of behavioral and attitudinal outcomes among criminal justice employees. At the same time, we encourage future researchers to examine the organizational justice effect on outcomes of interest that have not received much empirical attention. For example, many employee satisfaction scales include a combination of survey items that tap into various aspects of the construct. It may prove useful to focus on specific facets of satisfaction (e.g., pay satisfaction) that are particularly salient to criminal justice employees.

Next, our results provided clear evidence that controlling for intervening mechanisms reduces the magnitude of the organizational justice effect. When examining the role of organizational justice in justice system employee work outcomes, we suggest researchers consider controlling for factors such as employees' levels of organizational commitment and identification, compliance, views of supervisors, views of the public, attitudinal/emotional reactions to their work environment (e.g., job stress), role-related perceptions (e.g., role clarity or overload), external factors (e.g., work-life balance), and internal factors (e.g., relationship with colleagues). Doing so will help provide unbiased estimates of the organizational justice effect. Of course, decisions about which control variables to include should be guided by theoretical expectations. To date, we have avoided deeper theoretical discussions of what factors are important to consider in organizational justice studies. Our findings provide guidance for future researchers to do so, and we look forward to seeing what such investigations bring. Theoretical competition may be useful as we attempt to broaden our understanding of the factors that impact justice system employees' work outcomes beyond organizational justice theory.

From a sample characteristic standpoint, in a multivariate context, samples containing police officers had slightly larger organizational justice effect sizes compared with samples of correctional officers. It could be that organizational justice matters more to police officers than to their correctional officer counterparts, but the results of our meta-analysis cannot help us explain why this would be the case. Perhaps correctional settings are more structured and involve less discretion than is typical in policing. The use of discretion, and the greater job autonomy it creates in policing, may produce more uncertainty about how an officer's actions will be judged by superiors.
If true, police employees may be more likely to use organizational justice as a mental heuristic regarding how much their agency would support the decisions they make on the street (Van den Bos, 2001; Wolfe et al., 2018). We encourage future researchers to explore this theoretical possibility.

The findings from our meta-analysis provide insight into what we know about the body of organizational justice research in criminal justice settings. The findings also help call attention to what we do not know and what needs to be examined moving forward. First, we could not examine whether the organizational justice effect varied across different racial or ethnic groups, age, education level, or years of experience because too many studies did not include enough information for us to code in the meta-analysis. Future work should be aimed at considering the invariance of organizational justice across such groups to help shed light on the generality of the effect. There are theoretical reasons to expect both a general justice effect and one that varies across different subgroups (Wolfe, Nix, Kaminski, & Rojek, 2016).

Second, more research is needed to examine the causal mechanisms that tie organizational justice to criminal justice employees' work outcomes. For example, organizational justice might be associated with negative emotions such as anger and depression among employees (Wu, Sun, Chang, & Hsu, 2017). Such emotional responses may partially mediate the association between organizational justice and beneficial or counterproductive work outcomes. Also, specific justice dimensions might mediate the effect of other dimensions. Take, for example, our finding that distributive justice scales had smaller effect sizes than other operationalization strategies. In attempting to understand the theoretical reasons why such a relationship exists, future researchers should consider that distributive justice may partially mediate the effect of other dimensions such as procedural justice. Distributive justice perceptions are difficult for employees to make because they rarely have enough information to accurately judge outcome equity (Lind, Kulik, Ambrose, & de Vera Park, 1993). Theory and research findings indicate that people often use procedural justice perceptions as a mental heuristic for judging outcome fairness that, in turn, influences other outcomes of interest (McLean, 2019; Van den Bos, Lind, Vermunt, & Wilke, 1997). Relatedly, little research to date has been aimed at examining potential moderators of organizational justice. Future research should be focused on the moderating role of individual factors such as emotion or self-control, which have been found to shape citizens' perceptions of police procedural justice (Barkworth & Murphy, 2015; McLean & Wolfe, 2016; McLean, Wolfe, & Pratt, 2019; Murphy, 2009; Reisig, Wolfe, & Holtfreter, 2011; Wolfe, 2011). Criminal justice employees' feelings of uncertainty may also play a key role in the magnitude of the relationship between organizational justice and outcomes of interest (Wolfe et al., 2018). Another issue worth exploring further is the role of organizational identification in mediating or moderating justice employees' organizational justice perceptions (Bradford & Quinton, 2014; Bradford et al., 2014). Exploring how variables mediate or moderate organizational justice will provide a deeper understanding of how and why the concept is important to criminal justice employees.
Moreover, such efforts will help test the comprehensiveness of organizational justice theory as it applies to justice system employees' work outcomes.

Third, researchers should consider criminal justice employees' organizational justice perceptions of institutional sovereigns (i.e., entities that are independent of criminal justice organizations but influence their operations; Crank, 2003). Justice system employees (e.g., supervisors and line-level alike) may be impacted by how fairly they feel judges, prosecutors, city council members, or the mayor treat them. Organizational justice perceptions of such entities may play a key role in the prediction of employee work-related outcomes. Again, such work would help test the limits of organizational justice theory.

Fourth, we need researchers to explore the dimensionality of organizational justice (Tankebe, Reisig, & Wang, 2016). Colquitt (2001) found a four-factor model of organizational justice was best within business settings, but it is possible the justice dimensions load differently within criminal justice samples. The findings from such research could shed light on the theoretical issue of whether all justice dimensions are viewed the same by justice employees.

Our final suggestion for future research deals with the implications of these findings for criminal justice managers. The results of our meta-analysis clearly reveal that organizational justice is a key predictor of important outcomes among justice system employees. Criminal justice managers could benefit from studying the dimensions of organizational justice and tailoring their managerial practices in a manner that may produce the most beneficial outcomes from their employees. Yet, this raises the question of whether police or correctional managers can be trained on the principles of organizational justice. Researchers should partner with agencies to develop organizational justice training programs and evaluate their effectiveness. Can such training improve managers' orientations toward the use of organizational justice, does it translate into fair managerial practices, and does it improve employees' perceptions and behavioral outcomes?

In the end, we hope that the findings from this meta-analysis have shed light on the importance of organizational justice among criminal justice employees, answered lingering questions in the literature, and motivated more theoretical and empirical attention to the concept moving forward. Although we view this analysis as informative, it is important to point out that we have no meta-analyses on other predictors of justice system employee attitudes and behaviors. We encourage other researchers to conduct such analyses to provide the discipline a better understanding of the factors that shape criminal justice employees' behaviors and orientations toward their job. More research of this type will help pinpoint needed areas of improvement in justice system employees' working environments, which ultimately will help improve the "products" they deliver to the public. For now, organizational justice may rank as one of the strongest predictors of criminal justice employees' work outcomes.

REFERENCES

Adams, J. S. (1965). Inequity in social exchange. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 2, pp. 267–299). Academic Press.
Alexander, S., & Ruderman, M. (1987). The role of procedural and distributive justice in organizational behavior. Social Justice Research, 1(2), 177–198.
Allison, P. D. (2002). Missing data. SAGE.
Ambrose, M. L., & Schminke, M. (2009). The role of overall justice judgments in organizational justice research: A test of mediation. Journal of Applied Psychology, 94(2), 491–500.
Armeli, S., Eisenberger, R., Fasolo, P., & Lynch, P. (1998). Perceived organizational support and police performance: The moderating influence of socioemotional needs. Journal of Applied Psychology, 83(2), 288–297.
Barkworth, J. M., & Murphy, K. (2015). Procedural justice policing and citizen compliance behaviour: The importance of emotion. Psychology, Crime & Law, 21(3), 254–273.
Bechtoldt, M. N., Welk, C., Zapf, D., & Hartig, J. (2007). Main and moderating effects of self-control, organizational justice, and emotional labour on counterproductive behaviour at work. European Journal of Work and Organizational Psychology, 16(4), 479–500.
Bies, R. J., & Moag, J. F. (1986). Interactional justice: Communication criteria of fairness. In R. J. Lewicki, B. H. Sheppard, & M. H. Bazerman (Eds.), Research on negotiations in organizations (pp. 43–55). JAI Press.
Blalock, H. M. (1972). Social statistics (2nd ed.). McGraw-Hill.
Blader, S. L., Chang, C. C., & Tyler, T. R. (2001). Procedural justice and retaliation in organizations: Comparing cross-nationally the importance of fair group processes. International Journal of Conflict Management, 12(4), 295–311.
Bradford, B., & Quinton, P. (2014). Self-legitimacy, police culture and support for democratic policing in an English constabulary. British Journal of Criminology, 54(6), 1023–1046.
Bradford, B., Quinton, P., Myhill, A., & Porter, G. (2014). Why do "the law" comply? Procedural justice, group identification and officer motivation in police organizations. European Journal of Criminology, 11(1), 110–131.
Byrne, Z. S. (2005). Fairness reduces the negative effects of organizational politics on turnover intentions, citizenship behavior and job performance. Journal of Business and Psychology, 20(2), 175–200.
Carlin, J. B., Galati, J. C., & Royston, P. (2008). A new framework for managing and analyzing multiply imputed data in Stata. The Stata Journal, 8(1), 49–67.
Cohen-Charash, Y., & Spector, P. E. (2001). The role of justice in organizations: A meta-analysis. Organizational Behavior and Human Decision Processes, 86(2), 278–321.
Colquitt, J. A. (2001). On the dimensionality of organizational justice: A construct validation of a measure. Journal of Applied Psychology, 86(3), 386–400.
Colquitt, J. A. (2012). Organizational justice. In S. W. J. Kozlowski (Ed.), The Oxford handbook of organizational psychology (Vol. 1, pp. 526–547). Oxford University Press.
Colquitt, J. A., Conlon, D. E., Wesson, M. J., Porter, C. O. L. H., & Ng, K. Y. (2001). Justice at the millennium: A meta-analytic review of 25 years of organizational justice research. Journal of Applied Psychology, 86(3), 425–445.
Colquitt, J. A., Greenberg, J., & Scott, B. A. (2005). Organizational justice: Where do we stand? In J. Greenberg & J. A. Colquitt (Eds.), Handbook of organizational justice (pp. 113–152). Lawrence Erlbaum Associates.
Colquitt, J. A., Noe, R. A., & Jackson, C. L. (2002). Justice in teams: Antecedents and consequences of procedural justice climate. Personnel Psychology, 55(1), 83–109.
Colquitt, J. A., & Shaw, J. C. (2005). How should organizational justice be measured? In J. Greenberg & J. A. Colquitt (Eds.), Handbook of organizational justice (pp. 113–152). Lawrence Erlbaum Associates.

Crank, J. P. (2003). Institutional theory of police: A review of the state of the art. Policing: An International Journal of Police Strategies & Management, 26(2), 186–207.
Cropanzano, R., Prehar, C. A., & Chen, P. Y. (2002). Using social exchange theory to distinguish procedural from interactional justice. Group & Organization Management, 27(3), 324–351.
Cullen, F. T., Link, B. G., Wolfe, N. T., & Frank, J. (1985). The social dimensions of correctional officer stress. Justice Quarterly, 2(4), 505–533.
Deutsch, M. (1975). Equity, equality, and need: What determines which value will be used as the basis of distributive justice? Journal of Social Issues, 31(3), 137–149.
Donner, C., Maskaly, J., Fridell, L., & Jennings, W. G. (2015). Policing and procedural justice: A state-of-the-art review. Policing, 38(1), 153–172.
Eisenberger, R., Huntington, R., Hutchison, S., & Sowa, D. (1986). Perceived organizational support. Journal of Applied Psychology, 71(3), 500–507.
Farmer, S. J., Beehr, T. A., & Love, K. G. (2003). Becoming an undercover police officer: A note on fairness perceptions, behavior, and attitudes. Journal of Organizational Behavior, 24(4), 373–387.
Field, A. P., & Gillett, R. (2010). How to do a meta-analysis. British Journal of Mathematical and Statistical Psychology, 63(3), 665–694.
Folger, R., & Konovsky, M. A. (1989). Effects of procedural and distributive justice on reactions to pay raise decisions. Academy of Management Journal, 32(1), 115–130.
Fox, S., Spector, P. E., & Miles, D. (2001). Counterproductive work behavior (CWB) in response to job stressors and organizational justice: Some mediator and moderator tests for autonomy and emotions. Journal of Vocational Behavior, 59(3), 291–309.
Frear, K. A., Donsbach, J., Theilgard, N., & Shanock, L. R. (2018). Supported supervisors are more supportive, but why? A multilevel study of mechanisms and outcomes. Journal of Business and Psychology, 33(1), 55–69.
Gilliland, S. W. (2008). The tails of justice: A critical examination of the dimensionality of organizational justice constructs. Human Resource Management Review, 18(4), 271–281.
Greenberg, J. (1987). A taxonomy of organizational justice theories. The Academy of Management Review, 12(1), 9–22.
Greenberg, J. (1990). Employee theft as a reaction to underpayment inequity: The hidden cost of pay cuts. Journal of Applied Psychology, 561–568.
Greenberg, J. (1993). The social side of fairness: Interpersonal and informational classes of organizational justice. In R. Cropanzano (Ed.), Justice in the workplace: Approaching fairness in human resource management (pp. 79–103). Lawrence Erlbaum Associates.
Haas, N. E., Van Craen, M., Skogan, W. G., & Fleitas, D. M. (2015). Explaining officer compliance: The importance of procedural justice and trust inside a police organization. Criminology & Criminal Justice, 15(4), 442–463.
Hamm, J. A., Trinkner, R., & Carr, J. D. (2017). Fair process, trust, and cooperation: Moving toward an integrated framework of police legitimacy. Criminal Justice and Behavior, 44(9), 1183–1212.
Hedges, L. V., & Olkin, I. (2014). Statistical methods for meta-analysis. Academic Press.
Hox, J. J., & de Leeuw, E. D. (2003). Multilevel models for meta-analysis. In S. P. Reise & N. Duan (Eds.), Multilevel modeling: Methodological advances, issues, and applications (pp. 87–104). Psychological Press.
Jacobs, G., Belschak, F. D., & Hartog, D. N. (2014). (Un)ethical behavior and performance appraisal: The role of affect, support, and organizational justice. Journal of Business Ethics, 121(1), 63–76.
Jiang, Z., Gollan, P. J., & Brooks, G. (2017). Relationships between organizational justice, organizational trust and organizational commitment: A cross-cultural study of China, South Korea and Australia. The International Journal of Human Resource Management, 28(7), 973–1004.
Lambert, E. G. (2003). The impact of organizational justice on correctional staff. Journal of Criminal Justice, 31(2), 155–168.
Lambert, E. G. (2006). I want to leave: A test of a model of turnover intent among correctional staff. Applied Psychology in Criminal Justice, 2(1), 57–83.
Lambert, E. G., Hogan, N. L., & Barton-Bellessa, S. M. (2011). The association between perceptions of distributive justice and procedural justice with support of treatment and support of punishment among correctional staff. Journal of Offender Rehabilitation, 50(4), 202–220.
Lambert, E. G., Hogan, N. L., & Griffin, M. L. (2007). The impact of distributive and procedural justice on correctional staff job stress, job satisfaction, and organizational commitment. Journal of Criminal Justice, 35(6), 644–656.

Lambert, E. G., Hogan, N. L., & Jiang, S. (2008). Exploring antecedents of five types of organizational commitment among correctional staff: It matters what you measure. Criminal Justice Policy Review, 19(4), 466–490.
Lambert, E. G., Hogan, N. L., Jiang, S., Elechi, O. O., Benjamin, B., Morris, A., … Dupuy, P. (2010). The relationship among distributive and procedural justice and correctional life satisfaction, burnout, and turnover intent: An exploratory study. Journal of Criminal Justice, 38(1), 7–16.
Leung, K. (2005). How generalizable are justice effects across cultures? In J. Greenberg & J. A. Colquitt (Eds.), Handbook of organizational justice (pp. 555–589). Lawrence Erlbaum Associates.
Leventhal, G. S. (1976). The distribution of rewards and resources in groups and organizations. In L. Berkowitz & W. Walster (Eds.), Advances in experimental social psychology (Vol. 9, pp. 91–131). Academic Press.
Leventhal, G. S. (1980). What should be done with equity theory? New approaches to the study of fairness in social relationships. In K. Gergen, M. Greenberg, & R. Willis (Eds.), Social exchange: Advances in theory and research (pp. 27–55). Plenum.
Leventhal, G. S., Karuza, J., & Fry, W. R. (1980). Beyond fairness: A theory of allocation preferences. In G. Mikula (Ed.), Justice and social interaction (pp. 167–218). Springer-Verlag.
Lind, E. A., Kulik, C. T., Ambrose, M., & de Vera Park, M. V. (1993). Individual and corporate dispute resolution: Using procedural fairness as a decision heuristic. Administrative Science Quarterly, 38(2), 224.
Lind, E. A., & Tyler, T. R. (1988). The social psychology of procedural justice. Springer Science & Business Media.
Lipsey, M., & Wilson, D. B. (2001). Practical meta-analysis. SAGE.
Masterson, S. S., Lewis, K., Goldman, B. M., & Taylor, M. S. (2000). Integrating justice and social exchange: The differing effects of fair procedures and treatment on work relationships. Academy of Management Journal, 43(4), 738–748.
McFarlin, D. B., & Sweeney, P. D. (1992). Distributive and procedural justice as predictors of satisfaction with personal and organizational outcomes. Academy of Management Journal, 35(3), 626.
McLean, K. (2019). Revisiting the role of distributive justice in Tyler's legitimacy theory. Journal of , 1–12.
McLean, K., & Wolfe, S. E. (2016). A sense of injustice loosens the moral bind of law: Specifying the links between procedural injustice, neutralizations, and offending. Criminal Justice and Behavior, 43(1), 27–44.
McLean, K., Wolfe, S. E., & Pratt, T. C. (2019). Legitimacy and the life course: An age-graded examination of changes in legitimacy attitudes over time. Journal of Research in Crime and Delinquency, 56(1), 42–83.
Mesko, G., Hacin, R., Tankebe, J., & Fields, C. (2017). Self-legitimacy, organisational commitment and commitment to fair treatment of prisoners: An empirical study of prison officers in Slovenia. European Journal of Crime, Criminal Law and Criminal Justice, 25(1), 11–30.
Moorman, R. H. (1991). Relationship between organizational justice and organizational citizenship behaviors: Do fairness perceptions influence employee citizenship? Journal of Applied Psychology, 76(6), 845–855.
Murphy, K. (2009). Procedural justice and affect intensity: Understanding reactions to regulatory authorities. Social Justice Research, 22(1), 1–30.
Myhill, A., & Bradford, B. (2013). Overcoming cop culture? Organizational justice and police officers' attitudes toward the public. Policing, 36(2), 338–356.
Nix, J., & Wolfe, S. E. (2016). Sensitivity to the Ferguson effect: The role of managerial organizational justice. Journal of Criminal Justice, 47(1), 12–20.
Overton, R. C. (1998). A comparison of fixed-effects and mixed (random-effects) models for meta-analysis tests of moderator variable effects. Psychological Methods, 3(3), 354–379.
Pearce, J. L., Bigley, G. A., & Branyiczki, I. (1998). Procedural justice as modernism: Placing industrial/organisational psychology in context. Applied Psychology, 47(3), 371–396.
Peterson, R. A., & Brown, S. P. (2005). On the use of beta coefficients in meta-analysis. Journal of Applied Psychology, 90(1), 175–181.
Pratt, T. C., & Cullen, F. T. (2000). The empirical status of Gottfredson and Hirschi's general theory of crime: A meta-analysis. Criminology, 38(3), 931–964.
Pratt, T. C., & Cullen, F. T. (2005). Assessing macro-level predictors and theories of crime: A meta-analysis. Crime and Justice, 32, 373–450.
Pratt, T. C., Turanovic, J. J., & Cullen, F. T. (2016). Revisiting the criminological consequences of exposure to fetal testosterone: A meta-analysis of the 2D:4D digit ratio. Criminology, 54(4), 587–620.

Pratt, T. C., Turanovic, J. J., Fox, K. A., & Wright, K. A. (2014). Self-control and victimization: A meta-analysis. Criminology, 52(1), 87–116.
Pyrooz, D. C., Turanovic, J. J., Decker, S. H., & Wu, J. (2016). Taking stock of the relationship between gang membership and offending: A meta-analysis. Criminal Justice and Behavior, 43(3), 365–397.
Rahim, M. A., Magner, N. R., Antonioni, D., & Rahman, S. (2001). Do justice relationships with organization-directed reactions differ across US and Bangladesh employees? International Journal of Conflict Management, 12(4), 333–349.
Reisig, M. D., Wolfe, S. E., & Holtfreter, K. (2011). Legal cynicism, legitimacy, and criminal offending: The nonconfounding effect of low self-control. Criminal Justice and Behavior, 38(12), 1265–1279.
Rosenbaum, D. P., & McCarty, W. P. (2017). Organizational justice and officer "buy in" in American policing. Policing, 40(1), 71–85.
Rosenthal, R. (1994). Parametric measures of effect size. In H. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 231–244). Russell Sage Foundation.
Royston, P. (2005). Multiple imputation of missing values: Update of ICE. The Stata Journal, 5(2), 527–536.
Sargeant, E., Antrobus, E., & Platz, D. (2017). Promoting a culture of fairness: Police training, procedural justice, and compliance. Journal of Experimental Criminology, 13(3), 347–365.
Skarlicki, D. P., & Folger, R. (1997). Retaliation in the workplace: The roles of distributive, procedural, and interactional justice. Journal of Applied Psychology, 82(3), 434–443.
Snijders, T. A. B., & Bosker, R. J. (2012). Multilevel analysis: An introduction to basic and advanced multilevel modeling (2nd ed.). SAGE.
Sweeney, P. D., & McFarlin, D. B. (1993). Workers' evaluations of the "ends" and the "means": An examination of four models of distributive and procedural justice. Organizational Behavior and Human Decision Processes, 55(1), 23–40.
Tankebe, J. (2014). The making of "democracy's champions": Understanding police support for democracy in Ghana. Criminology & Criminal Justice, 14(1), 25–43.
Tankebe, J., & Meško, G. (2015). Police self-legitimacy, use of force, and pro-organizational behavior in Slovenia. In G. Mesko & J. Tankebe (Eds.), Trust and legitimacy in criminal justice: European perspectives (pp. 261–277). Springer International.
Tankebe, J., Reisig, M. D., & Wang, X. (2016). A multidimensional model of police legitimacy: A cross-cultural assessment. Law and Human Behavior, 40(1), 11–22.
Taxman, F. S., & Gordon, J. A. (2009). Do fairness and equity matter? An examination of organizational justice among correctional officers in adult prisons. Criminal Justice and Behavior, 36(7), 695–711.
Thibaut, J., & Walker, L. (1975). Procedural justice: A psychological analysis. Lawrence Erlbaum Associates.
Trinkner, R., Tyler, T. R., & Goff, P. A. (2016). Justice from within: The relations between a procedurally just organizational climate and police organizational efficiency, endorsement of democratic policing, and officer well-being. Psychology, Public Policy, and Law, 22(2), 158–172.
Turanovic, J. J., Pratt, T. C., & Piquero, A. R. (2017). Exposure to fetal testosterone, aggression, and violent behavior: A meta-analysis of the 2D:4D digit ratio. Aggression and Violent Behavior, 33, 51–61.
Tyler, T. R., & Bies, R. J. (1990). Beyond formal procedures: The interpersonal context of procedural justice. In J. Carroll (Ed.), Applied social psychology and organizational settings (pp. 77–98). Lawrence Erlbaum Associates.
Tyler, T. R., Callahan, P. E., & Frost, J. (2007). Armed, and dangerous (?): Motivating rule adherence among agents of social control. Law & Society Review, 41(2), 457–492.
Van Craen, M., & Skogan, W. G. (2017a). Achieving fairness in policing: The link between internal and external procedural justice. Police Quarterly, 20(1), 3–23.
Van Craen, M., & Skogan, W. G. (2017b). Officer support for use of force policy: The role of fair supervision. Criminal Justice and Behavior, 44(6), 843–861.
Van den Bos, K. (2001). Uncertainty management: The influence of uncertainty salience on reactions to perceived procedural fairness. Journal of Personality and Social Psychology, 80(6), 931–941.
Van den Bos, K., & Lind, E. A. (2002). Uncertainty management by means of fairness judgments. In M. P. Zanna (Ed.), Advances in experimental social psychology (Vol. 34, pp. 1–60). Academic Press.
Van den Bos, K., Lind, E. A., Vermunt, R., & Wilke, H. A. M. (1997). How do I judge my outcome when I do not know the outcome of others? The psychology of the fair process effect. Journal of Personality and Social Psychology, 72(5), 1034–1046.

Wolfe, S. E. (2011). The effect of low self-control on perceived police legitimacy. Journal of Criminal Justice, 39(1), 67–74.
Wolfe, S. E., & Nix, J. (2017). Police officers' trust in their agency: Does self-legitimacy protect against supervisor procedural injustice? Criminal Justice and Behavior, 44(5), 717–732.
Wolfe, S. E., Nix, J., Kaminski, R., & Rojek, J. (2016). Is the effect of procedural justice on police legitimacy invariant? Testing the generality of procedural justice and competing antecedents of legitimacy. Journal of Quantitative Criminology, 32(2), 253–282.
Wolfe, S. E., & Piquero, A. R. (2011). Organizational justice and police misconduct. Criminal Justice and Behavior, 38(4), 332–353.
Wolfe, S. E., Rojek, J., Manjarrez, V. M., & Rojek, A. (2018). Why does organizational justice matter? Uncertainty management among law enforcement officers. Journal of Criminal Justice, 54(1), 20–29.
Wu, Y., Sun, I. Y., Chang, C. K.-M., & Hsu, K. K.-L. (2017). Procedural justice received and given: Supervisory treatment, emotional states, and behavioral compliance among Taiwanese police officers. Criminal Justice and Behavior, 44(7), 963–982.

AUTHOR BIOGRAPHIES

Scott E. Wolfe is an Associate Professor in the School of Criminal Justice at Michigan State University. His research is focused on policing, organizational justice, and criminological theory.

Spencer G. Lawson is a doctoral student in the School of Criminal Justice at Michigan State University. His community-based research is primarily focused on organizational justice, the nexus between behavioral health and the criminal justice system, pretrial risk assessments, and incarceration and reentry.

SUPPORTING INFORMATION

Additional supporting information may be found online in the Supporting Information section at the end of the article.

How to cite this article: Wolfe SE, Lawson SG. The organizational justice effect among criminal justice employees: A meta-analysis. Criminology. 2020;1–26. https://doi.org/10.1111/1745-9125.12251