<p> Focus, Fiddle and Friends: Sources of Knowledge to Perform the Complex Task of Teaching 1</p><p>Kenneth A. Frank Yong Zhao</p><p>Michigan State University</p><p>William R. Penuel</p><p>SRI International</p><p>Nicole Ellefson</p><p>Michigan State University</p><p>Susan Porter</p><p>Michigan State University</p><p>Contact information: Kenneth A. Frank Room 462 Erickson Hall, Michigan State University East Lansing, MI 48824-1034 phone: 517-355-9567; e-mail: [email protected]. </p><p>1 This study was made possible by a grant from the Michigan Department of Education (MDE), but views and findings expressed in this report are not necessarily those of MDE. The following individuals participated in the design and implementation of this study: Yong Zhao, Ken Frank, Blaine Morrow, Kathryn Hershey, Joe Byers, Rick Banghart, Andrew Henry, and Nancy Hewat. Although we cannot identify the names of the schools that participated in this study, we want to thank all the teachers and administrators in these schools. Without their cooperation and support, this study would not have been possible. Running head: Focus, Fiddle and Friends</p><p>Focus, Fiddle and Friends: Sources of Knowledge to Perform the Complex Task of</p><p>Teaching</p><p>Abstract</p><p>Although knowledge has been linked to productivity within and between organizations, little is known about how knowledge flows into schools and then diffuses from teacher to teacher within schools. Here, we hypothesize that the value of different sources of knowledge depends on a teacher’s current level of implementation. We test our theory using longitudinal network data from 470 teachers in thirteen schools. From models of change (i.e. first differences) in teachers’ use of computers over a one year period, we infer that the more a teacher at the lowest initial levels of implementing an innovation is exposed to professional development focused on student learning, the more she increases her level of implementation (focus); the more a teacher at an intermediate initial level of implementation has opportunities to experiment and explore, the more she increases her level of implementation (fiddle); and the more a teacher at a high initial level of implementation accesses the knowledge of others, the more she increases her level of implementation (friends). Concerning the potential for selection bias, we quantify how large the impacts (Frank 2000) of confounding variables must be to invalidate our inferences. In the discussion, we emphasize the changing nature of knowledge through the diffusion process. </p><p>Keywords: Diffusion of Innovation, Professional Development, Implementation of Technology, </p><p>Causal Inference, Social Networks</p><p>2 Running head: Focus, Fiddle and Friends</p><p>INTRODUCTION</p><p>The goal of this study is to identify how different experiences contribute to knowledge required for the complex task of teaching. Generally, workers’ access to knowledge has been linked to productivity from the level of the state (Romer 1990) down to the individual (Burt </p><p>2002). In particular, over the course of the past decade, a number of organizational theorists have advanced the argument that the processes of knowledge creation, integration, management, and transfer are fundamental to the performance of an organization (Osterloh and Frey 2000; </p><p>Spender 1996; von Krogh, Ichijo, and Nonaka 2000). 
Knowledge flows within organizations have also been linked to such outcomes as overall organization performance (DeCarolis and </p><p>Deeds 1999; Gupta and Govindarajan 2000), product innovation (Hansen 1999), the transfer of best practices (Szulanski 1996), and workplace learning (Darr, Argote, and Epple 1995).</p><p>But even as knowledge transfer has moved to the fore of organizational theory, there is no integrated conceptualization of the nature of knowledge as it first permeates the organizational boundary and then diffuses from person to person within an organization. The macro economic perspective typically assumes that general knowledge circulates to organizations through sheer power of competition in the marketplace via generally accessible media, personnel transfers, etc. That is, the conceptualization is essentially of dissemination of fixed knowledge through unspecified mechanisms. On the other hand, the micro level perspective characterizes person to person diffusion within the organization but takes the body of knowledge within the organization as static and given without recognizing origins outside the organization. Even those studies that attend to how knowledge permeates the organizational </p><p>3 Running head: Focus, Fiddle and Friends boundary (e.g., 1Abrahamson and Rosenkopf, 1997) do not address the changing nature of knowledge as it does so.</p><p>Here, we address the gap between macro theories of knowledge circulation and micro processes of person to person diffusion within organizations by estimating the effects of different knowledge sources from outside and within the organization on changes in production practices. </p><p>As we do so we develop a trajectory of knowledge adaptation and evolution, as abstract knowledge is introduced from outside the organization, adapted as it is implemented on the shop floor, and then articulated as it circulates throughout an organization. </p><p>The empirical example of this study concerns teachers’ attempts to implement computer innovations (e.g., the Internet, educational software, and the digital camera) in the classroom. Of course, technology may generally directly contribute to production (Romer 1990). But in education there are still strong concerns regarding the value of digital technology for learning </p><p>(Cuban 1999, 2001; Dynarski et al. 2007). Regardless of the educational value of technology, there is strong pressure on schools to implement such innovations (Cuban 1999; Loveless 1996; </p><p>President's Committee 1997); even if computers do not enhance productivity, computers have strong institutionalized legitimacy (Rowan 1995). </p><p>THEORETICAL BACKGROUND</p><p>The implementation of technology in schools, like many reforms, has proven vexing for reformers and organizational theorists alike (Elmore, Peterson, and McCarthey 1996; Tyack and </p><p>Cuban 1995). Part of this resistance is due to the considerable variation within schools in how technology is implemented (Adelman et al. 2002). 
This is an example of resistance to </p><p>4 Running head: Focus, Fiddle and Friends comprehensive change that has prompted organizational theorists to give considerable attention to schools, (see Bidwell and Kasarda 1987, Bolman and Heller 1995, and Perrow 1986, for reviews) from control theory (Callahan 1962) to contingency theory (e.g., Greenfield 1975) and new institutionalism (Meyer and Rowan 1977; Rowan 1995).</p><p>Previous research on the diffusion of computers in schools has generally focused on the effects of four sets of factors on the adoption of computers. First, hardware and software and technical support affect implementation by making the technology more reliable (Collins 1996; </p><p>Cuban 1999; Zhao et al. 2002). Second, organizational factors, such as scheduling and types of school leadership, can facilitate conditions affecting teachers’ use of computers (Cuban 2001; </p><p>Hodas 1993; Loveless 1996; Zhao et al. 2002). Third, aspects of professional development can support the implementation of innovations (e.g., Berends, 2007; Little, 1993; Wilson and Berne, </p><p>1999). Technology uses that require teachers to adopt more constructivist methods, which increase the cognitive demands on students present particularly significant professional development needs, since they require teachers to adopt both new pedagogies and technology tools (Adelman et al., 2002; Windschitl and Sahl, 2002). Fourth, and perhaps most frequently cited, characteristics of individual teachers, including the willingness and ability to use technology and pedagogical style, as well as teacher preparation, affect implementation (Becker </p><p>2000; Smith et al., 2007; U.S. Congress, 1995).</p><p>Although some studies considered interactions among sets of factors, such as between teacher characteristics and professional development (Desimone, Smith, and Ueno, 2006; Penuel et al., 2007), few program designs that aim at technology implementation have seriously considered the complexity of teaching (Bidwell 1965; Woodward 1965). The complexity arises from multiple sources: variability in student needs, which can influence decisions about what and</p><p>5 Running head: Focus, Fiddle and Friends how to teach (e.g., Barr and Dreeben 1977 1983; Delpit 1988); conflicts among organizational demands that arise from policies enacted at different levels of organization (e.g., Bidwell and </p><p>Kasarda 1987; Honig, 2006); varying levels of coherence among curriculum, pedagogy, and assessments (Borman et al. 2003; Schmidt 2001); and from teachers’ unique educational trajectories, which exposes them to varying educational approaches (Lortie 1975). As a result, implementation may not be a simple function of physical resources, background institutional factors and individual motivation and preparation. Instead, implementation may depend on the adaptation of an innovation to the complex task of teaching within the organizational context of the school (Zhao and Frank 2003).</p><p>Given that teaching is complex, a teacher’s implementation of new practices can depend on her knowledge. Teaching is typically not just a simple matter of organizing a set of activities or delivering a particular curriculum. To the extent that teaching depends on interacting with students in context, teaching will depend on a teacher’s knowledge (e.g., Shulman, 1985). 
For example, the teacher needs this knowledge to anticipate student difficulties and integrate new learning with students’ existing understandings.</p><p>Where does Teacher Knowledge come from?</p><p>Knowledge for teaching has been differentiated primarily in terms of domains such as pedagogical versus content (e.g., Ball and Bass 2000; Darling-Hammond, 2000; Gess-Newsome and Lederman, 1999; Hill et al. 2005; Shulman, 1985). But here we differentiate knowledge by its source so that we may link knowledge flows to innovations as they permeate the organizational boundary of the school. In particular, we differentiate abstract general knowledge</p><p>6 Running head: Focus, Fiddle and Friends generated outside the school from specific knowledge that is locally adapted within the school. </p><p>For example, general knowledge might consist of the content of a particular lesson in social studies that can be conveyed to teachers in different schools, while local knowledge might consist of how to integrate that lesson given the backgrounds of the students and the previous units in a particular school. Given this distinction, we characterize the literature on the potential sources of teacher knowledge.</p><p>Externally Provided Professional Development. Teachers typically acquire some knowledge of core curricular subjects and how to teach them as part of their training. But innovations, by definition, are not generally part of a teacher’s pre-service training. Therefore, a large body of literature has accumulated on in-service teacher learning that includes recommendations for effective approaches to foster teacher change (e.g., Munby, Russell, and </p><p>Martin 2001; Richardson 2003; Wilson and Berne 1999). </p><p>Here we focus on two important new studies that have empirically analyzed the features of professional development programs that affect teaching practices: Garet, Porter, Desimone, </p><p>Birman, and Yoon (2001); and Desimone, Porter, Garet, Yoon, and Birman (2002). The Garet et al. and Desimone et al., studies establish direct connections between specific characteristics of professional development and teacher practices and help us understand the mechanisms by which certain features of professional development affect teachers’ practices. Analyzing professional development activities attended by a national sample of math and science teachers, Garet et. al. </p><p>(2001) found that professional development that positively affected teacher acquisition of knowledge and skills included a focus on content knowledge and coherence with other learning activities and with teachers’ own goals for professional development. Supporting Garet et al. </p><p>7 Running head: Focus, Fiddle and Friends</p><p>(2001), Desimone et al. (2002) found that professional development with a focus on a particular teaching practice leads to increased use of that practice in the classroom. </p><p>Opportunities for Experimentation. Garet et al and Desimone et al also attended to the format of professional development, finding that teachers were more likely to gain knowledge of how to implement an innovation when they had opportunities for active, “hands-on” learning. 
</p><p>This active learning provides teachers opportunities to process and adapt the information provided in the professional development, confirming long standing findings about the importance of experimentation for effective practices (e.g., Little, 1988).</p><p>The studies by Garet et al., (2001) and Desimone et al., (2002) significantly shortened the list of features that define effective professional development from earlier syntheses (e.g., </p><p>Loucks-Horsley et al. 1998; Wilson and Berne 1999) and drew on longitudinal data to support inferences about the effects of specific aspects of professional development on teacher practices. </p><p>In particular, the studies establish two of the tenets of our theory: that focus on specific practices and the knowledge needed to support those practices, and fiddling with innovations (through opportunities for active learning) are strongly related to teachers’ increased implementation of innovations.</p><p>We now extend Garet et al., and Desimone et al., by considering how innovations are adapted once they permeate the organizational boundary of the school. Thompson (1967) describes the processes of experimentation and exploration “intensive” because they require each worker to individually develop the knowledge necessary to modify practices to the local context. </p><p>This can be limiting because many workers, such as teachers, will simply not have time to experiment to acquire all of the local knowledge necessary to improve practices (Coburn, 2001; </p><p>Cohen and Hill, 2001). </p><p>8 Running head: Focus, Fiddle and Friends</p><p>Given the limitations of individual experimentation, we turn to a second source of local knowledge that inheres within the organizational boundary. As a social institution the organization adds to knowledge creation by facilitating interaction among those who possess diverse knowledge (Schumpeter, 1934) and by reducing transaction costs among dependent actors with common social contexts and cognitive schema (Williamson, 1981). From this perspective, organizations can be characterized by their capacity to create and convey knowledge without reliance on market mechanisms (Arrow, 1974; Coase, 1937). Within the organization of the school, teachers can learn from others in their local contexts who have adapted innovations given similar students, other curricular elements, and organizations contexts, and who have an interest in supporting others (Frank, Zhao and Borman, 2004). Implementation of an innovation then becomes a function of accessing the knowledge possessed by others in the local context.</p><p>Although other teachers may possess critical local knowledge, it may be difficult to access because teachers have limited opportunities to observe one another (Glidewell, et al., </p><p>1983; Hargreaves, 1991; Little, 1990, 2002; Lortie 1975). Instead the knowledge teachers gain from one another likely comes through interactions with others who have implemented practices in comparable contexts (Frank, Zhao and Borman, 2004; Hansen, 1999; Nonaka, 1994). 
2 </p><p>Consider the teacher below describing a routine of explanation concerning the implementation of a new pedagogy (Coburn and Russell 2008, page 218):</p><p>We talked about, like, the math message and the mental math and how to coordinate the two and that we should be linking the message to the initial onset of the mini lesson and how those two are connected and that that would get the children eventually into their individual work and that we should connect them and that the math messages is separated from the mental math after it’s done until we go back to it and use that as a lead in for the lesson.</p><p>2 This point has been supported in recent work in the economics of education (e.g., Croninger et al., 2007; Jackson and Bruegmann 2009; Rothstein 2009) 9 Running head: Focus, Fiddle and Friends</p><p>Note that the new approach, math message, must be coordinated with the old, mental math, creating a complex task. The complex task is then articulated and knowledge shared through the teachers’ talk pertaining to how to implement the new approach, motivate the children, differentiate the approaches, and structure the lesson. Each of these tasks depends on the local context defined by the students, curricula, and organizational context, which in this case included a math coach to whom the teacher was describing her interactions.</p><p>Consistent with the anecdotal quote above, the knowledge teachers access from one another has been shown to be essential for changing complex production adapted to a local context. For example, using a social capital perspective (Burt 1997; Lin, 1999, 2001; Portes, </p><p>1998), Frank, Zhao and Borman (2004) found that elementary teachers were able to implement computer technology in the classroom the more they could access related expertise through informal interactions with colleagues (see also Reiser et al., 2000; Penuel et al, 2007). The potential knowledge teachers can access from others in their social contexts defines the third tenet of our theory, which we colloquially define as friends.</p><p>While studies of knowledge flow add an important new dimension to our understanding of factors that affect teachers’ practices, they have not been situated within a broader frame of knowledge creation and adaptation. That is, they attend only to how teachers influenced one another, but not how teacher learning might also be influenced by other professional development experiences external to the school. In contrast, studies of external professional development have not attended to the intra-organizational social processes related to knowledge flows within the school. Thus, in this study, we seek to understand the interplay of formal professional development and social experiences as they affect teachers’ implementation of innovations. By so doing, we evaluate the effects of focus and fiddle in the professional </p><p>10 Running head: Focus, Fiddle and Friends development literature alongside the effects of friends as in the literature on intra-organizational diffusion through social networks.</p><p>Hypotheses: Knowledge Type and Current Level of Implementation </p><p>Our study concerns not just diffusion, but the role of the organizational boundary in shaping diffusion. Therefore, we consider how the effects of different sources of knowledge depend on the location of an innovation in the diffusion trajectory relative to the organizational boundary. 
In the diffusion trajectory, teachers first access explicit general knowledge through focused professional development from outside the organization (focus). This general knowledge is represented by the black puzzle piece in Figure 1. Next, they develop locally specific knowledge of the innovation through experimentation (fiddle) as they adapt the innovation to their organizational contexts. The tacit knowledge that permeates the organizational boundary is represented by the modified, gray puzzle piece in the figure. Finally teachers can articulate their local knowledge and can share and benefit from colleagues within their organizational contexts (friends). The new, rearticulated knowledge is represented by the black border surrounding the modified gray puzzle piece in the figure.</p><p>Insert Figure 1 about here</p><p>We develop specific hypotheses from the preceding logic that matches the value of each type of knowledge to where a teacher is located in the trajectory of implementation. When workers first access outside knowledge they are novices with respect to the innovation. As such, </p><p>11 Running head: Focus, Fiddle and Friends their learning is dependent largely on forming new concepts for problem solving and on learning from initial experiences (e.g., Daley 1999). These experiences are likely to occur during professional development focused on student learning. Therefore, we hypothesize:</p><p>H1: The more a teacher at the lowest initial levels of implementing an innovation is exposed to professional development focused on student learning, the more she will increase her level of implementation.</p><p>Second, as production workers advance beyond the novice stage they must adapt external knowledge to their particular organizational contexts. To do so they likely use active concept integration and self initiated strategies which can occur during experimentation and exploration </p><p>(Daley 1999; Ertmer and Newby 1996). Therefore we hypothesize:</p><p>H2: The more a teacher at an intermediate initial level of implementation has opportunities to experiment and explore, the more she will increase her level of implementation.</p><p>Finally, to maintain and extend knowledge, experts need access to other experts, to whom they can turn to for help when they encounter unusual, non-routine problems (Barley, 1990; </p><p>Hansen, 1999; Penuel and Cohen 2003). Therefore, we hypothesize:</p><p>H3: The more a teacher at a high initial level of implementation accesses the knowledge of others, the more she will increase her level of implementation.</p><p>12 Running head: Focus, Fiddle and Friends</p><p>Together these hypotheses imply that implementation may depend on access to different knowledge sources at different points in the diffusion trajectory. As such, the hypotheses have implications for the changing nature of knowledge (Polanyi 1966) as well as changing behaviors.</p><p>It is not just that teachers at one phase of implementation respond to one type of knowledge while teachers at another respond to a different type of knowledge. Because the underlying process is of diffusion, there are common elements of knowledge that are transferred (Argote and</p><p>Ingram 2000). And yet, in spite of those common elements, we hypothesize that the form of the knowledge changes through the diffusion process. Explicit codified knowledge in focused professional development becomes tacit when it is adapted to the specific setting through experimentation. 
It then becomes re-codified, but localized, when it is articulated by experts during interaction. In this last stage the knowledge has the potential to generalize to complex settings beyond the specifics of a particular classroom.</p><p>We also consider whether the source of knowledge has optimal effects when matched with a teacher's initial level of implementation. In particular, we will test whether the effect of focused professional development is strongest for those at the lowest initial levels of implementation, whether the effect of exploration and experimentation is strongest for those at intermediate initial levels, and whether access to the knowledge of others has the strongest effect for those at high initial levels. Evidence of the relative effects would inform us about the nature of the knowledge types as a set, for example, whether the types form a hierarchy or sequence in which each form of knowledge becomes less valuable as production workers increase their levels of implementation.</p><p>In the next section we present the context of our current study. We then present our measures, analytic approach, and results, including quantifying the robustness of our inferences with respect to concerns about internal validity. In the discussion, we emphasize the changing nature of knowledge through the diffusion process.</p><p>STUDY CONTEXT: ELEMENTARY TEACHERS' IMPLEMENTATION OF TECHNOLOGY</p><p>In the precursor to this study, we found examples of successful professional development programs that featured some or all of the factors: focus, fiddle, and friends (Zhao, Frank and Ellefson 2006, pages 169-177). But here we shift our attention from isolated professional development events or programs to the general characteristics of professional experiences to track how access to sources of knowledge affects teacher practices.</p><p>Our instrument and data collection methods offer distinct advantages over most extant studies. First, our instrument attended to comprehensive measures of the nature of the professional development experience (e.g., potential foci of professional development, opportunities for non-interactive professional development). Second, like Desimone et al. and Garet et al., our study design is longitudinal. Thus we are able to model factors that affect change in technology use. This allows us to make a stronger case for the inference of effects of our primary factors (Cook, Shadish and Wong, 2008; Shadish, Clark, and Steiner, 2008). Third, similar to Zhao and Frank (2003), our dependent measure is defined by the number of uses of technology for the core tasks of teaching (e.g., did teachers use the innovation for presenting material; for student to student communication; for student expression) – see O'Donnell (2008). This differentiates our instrument from others that attend to particular goals of an innovation (e.g., to have teachers use a particular innovation a given number of times in a week).</p><p>Sample</p><p>Because professional development is often organized and offered by school districts, we chose whole school districts as our first level of analysis. A total of ten districts were selected from one Midwestern state. Since our example of an innovation was technology, we needed schools that had technology available to teachers and students.
Operationally, the criteria used to select districts for participation in the study included passage (between 1996 and 2001) of a bond referendum or receipt of a community foundation grant for implementation of technology or recent investment in an externally supported professional development program for technology </p><p>(and of course the willingness of the superintendent to participate in the study). Because we wanted to study the social dynamics of technology implementation we focused on elementary schools that tend to be small and relatively tightly defined social systems. We selected all twenty five elementary schools within the ten districts.</p><p>As reported in Zhao and Frank (2003), the base of our sample had more access to technology than the national average (Cattagni and Farris 2001). This is valuable because we wanted to study examples of technology implementation that could anticipate future trends. </p><p>Furthermore, students attending the sampled schools came from slightly higher income families than average in the state in terms of percentage of students who qualified for free or reduced cost lunch. However, the sampled schools were not substantively different from other elementary schools on other measures such as per pupil expenditure, student teacher ratio, and school size.</p><p>15 Running head: Focus, Fiddle and Friends</p><p>The first round of data collection was completed in the spring of 2001 for 19 of the schools and in the Fall of 2001 for six of the schools (referred to as time 1). We then returned to each school in the Spring of 2002 and re-administered our instrument (referred to as time 2). </p><p>In order to obtain a comprehensive representation of technology uses we administered the survey to all staff within each school. We offered incentives to schools, as well as to individual teachers for high response rates to come as close as possible to enumerating each faculty population. </p><p>Ultimately we achieved a response rate of 92% or greater in each of our twenty-five schools at each time point. </p><p>Data Collected</p><p>We collected three types of data: a) survey of all staff; b) interviews with key informants in focus schools; and c) observations of professional development in each district. The survey included 33 various format items (e.g., Likert Scale, multiple choice, and open ended). The interviews were semi-structured, loosely following a set of questions about technology infrastructure, policy, investment, and attitudes and beliefs regarding technology. The interviews were conducted with the district superintendent, district technology director (or equivalent), principal of one focus school in each district, and three to five teachers in each focus school. </p><p>Though we do not report directly on the interviews and observations here, they were critical in informing our understanding of the phenomenon (Zhao and Frank 2003; Zhao Frank and </p><p>Ellefson 2006) as well as shaping our survey instruments. A professional independent research firm was contracted to collect the survey data and conduct some of the interviews. We conducted the professional development observations.</p><p>16 Running head: Focus, Fiddle and Friends</p><p>Measures3 </p><p>Dependent Variable: Change in Use of Computers. Our dependent variable represents the number of times a teacher indicated her students used computers for the core tasks of teaching. 
</p><p>In particular, level of implementation was measured based on responses to the following items: </p><p>How frequently [daily=180, weekly=40, monthly=9, yearly=2, never=0] do you or your students use computers for …classroom management and/or incentives for students; student to student communication; student inquiry; student expression; core curriculum skills development; remediation; and basic computer skills (alpha = .76). Thus the measure taps the implementation of the innovation (technology) for the core functions of teaching.</p><p>Key predictors</p><p>Hours of Professional Development Focused on Student Learning(focus). We measured each teacher’s perception of the focus of professional development on student learning from a composite of items beginning with the stem “What percentage of your technology professional development …” and concluding with “helped you improve the content you taught”; “helped you improve student achievement”; “helped you improve your teaching style”; “involved content directly linked to your curriculum”; “helped you learn how to integrate technology into the curriculum”; and “engaged you in the planning stages” (scale=0%,25%,50%,75%,100%; alpha=.81). Thus, like our dependent variable, this variable was defined by the core tasks of teaching. We then calculated the hours of professional development focused on student learning by multiplying the percentage focus by the total number of hours of professional development each teacher reported receiving between time 1 and time 2. </p><p>3 See on-line technical appendix for details. 17 Running head: Focus, Fiddle and Friends</p><p>Days per year Exploring or Experimenting with New Technology (fiddle). We measured exploration and experimentation by asking teachers to indicate the frequency [daily=180, weekly=40, monthly=9, yearly=2, never=0] with which they explored new technologies on their own and with which they experimented with district-supported software between time 1 and time</p><p>2. Our measure is the sum of the two items (which were correlated at .31). </p><p>Access to Knowledge of Technology Implementation through Talk and Help (friends). We defined a teacher’s access to knowledge through interaction as the sum of the knowledge of the others with whom she talked or from whom she received help regarding use of technology in the classroom. As such it is a measure of the manifestation of social capital because teachers accessed the resource of knowledge through the social relations of talk and help (Frank, Zhao and Borman 2004). As a hypothetical example, consider Lisa who indicated talking with Sue and Bob between time 1 and time 2. If Sue had a knowledge level of 9 at time 1 (out of a scale from 1 to 10), and Bob had a knowledge level of 3 at time 1, Lisa accessed 12 units (9+3=12) of knowledge through her interactions with Sue and Bob between time 1 and time 2. We divided by 180, the number of days in the school year, to create a metric representing the amount of uses per day of the others with whom a teacher interacted.</p><p>We used the self reported level of a teacher’s implementation at time 1 to measure the knowledge the teacher could potentially convey to others in her school. Therefore “knowledge” was not simply of the innovation, but of how to implement the innovation in the complex setting of the classroom (Casson 1994). Access to knowledge through talk and help was then the sum of access to knowledge through talk and help. 
We then took the log of the sum (plus one) to reduce positive skew (see technical appendix for details). We assigned a value of zero to those who did </p><p>18 Running head: Focus, Fiddle and Friends not list any others from whom they received help or with whom they talked about computers because they had no social capital on which to draw. The metric for access to knowledge through interactions with others is difficult to interpret because it is the logged sum of two products of frequency of interactions and others’ knowledge. Therefore we interpret primarily the standardized coefficient for this predictor and compare with estimates for our other key predictors. </p><p>Key Covariate: Teacher Perception of Technology. As in other settings (Rogers 1995), teacher perceptions of the value of technology are important predictors of use of technology </p><p>(Frank, Zhao and Borman 2004). We measured teachers’ perceptions of the value of computers for their own use in terms of responses to items with the stem “computers can help me...” and ending with core elements of teaching: “…integrate different aspects of the curriculum”; “…; teach innovatively”; “…direct student learning”; “…model an idea or activity”; “…connect the curriculum to real world tasks”; “…be more productive” (responses range from 1=strongly disagree to 6=strongly agree, alpha=.93).</p><p>Key Covariate: Seeking Help from others to Learn about Technology. It could be that access to knowledge is confounded with knowledge seeking and therefore an estimated effect of access to knowledge of colleagues may be upwardly biased because those who are more motivated to engage technology may seek more help and also increase their uses of technology. Therefore in all our models we controlled for the number of days per year (never=0; yearly=1; monthly=9; </p><p>19 Running head: Focus, Fiddle and Friends weekly=40; daily=180) a teacher indicated seeking help from others to learn about new technologies. 4 This variable was measured retrospectively at time 2.</p><p>Other Covariates. Of course, aspects of professional development, motivation for attending professional development, and adequacy of technology resources may affect technology implementation. Drawing on the professional development literature (e.g., Adelman et al. 
2002; </p><p>Wilson Berne 1999), we explored controls for focus of professional development on student skills (e.g., focus on how to use technology to help students navigate the web); 5 focus of professional development on teaching skills (e.g., power point); linkage of professional development to national and state standards; opportunities for general professional development activities other than as represented by our key predictors; pedagogical style of professional development (e.g., didactic, interactive, collaborative); location, timing, duration, and instructor for professional development; support of professional development for other teacher functions </p><p>(e.g., managing the classroom); focus of professional development on improving teaching style; motivation for attending professional development; perceived reliability of technology resources; perceived adequacy of technology resources; perceived district support for hardware; perceived district support for software; perceived value of computers for student uses; perceived pressure to use technology; perceived pace of technology implementation in the school; and teacher background characteristics.</p><p>ANALYTIC APPROACH</p><p>4 Seeking Help from others to Learn about Technology is differentiated from Access to Knowledge of Technology Implementation through Talk and Help because those who seek help may not necessarily receive help. 5 Zhao and Frank, 2003, distinguish uses of computers for the teacher’s purpose versus student skills because teacher uses help develop student learning while student uses help develop specific skills that might be applied in other contexts. As such they serve separate purposes form the teacher’s perspective.</p><p>20 Running head: Focus, Fiddle and Friends</p><p>We began by estimating a model of change in uses of computers as a function of our independent variables as well as our key covariates. We then estimated a second model using the residuals from the first as the dependent variable and adding other covariates using a stepwise selection procedure. This was done to identify possible covariates after controlling for our independent variables and key covariates. Of the possible covariates, three were statistically significant using a liberal criterion (p < .10): motivated to attend professional development for technology because of excitement to apply technology in classroom; percentage of time professional development took place in your school computer lab (negatively related to change in use of computers) and percentage of time professional development took place at an intermediate school district facility.</p><p>We then divided our sample into three groups: low level implementers (fewer than 58 uses of computers in the time 1 school year, n=209), intermediate level implementers (between </p><p>59 and 182 uses of computers in the time 1 school year, n=174), and high level implementers </p><p>(more than 182 uses of computers in the time 1 school year, n=197). 6 For each group we report descriptive statistics before using the SAS PROC MI procedure to impute five data sets for (the less than 10 cases) with missing values using a Markov chain Monte Carlo method (with a </p><p>Jeffrey prior), constraining values for each variable to the range that occurred in the observed data. The imputed values were based on our key covariates as well as those identified by the stepwise regression. </p><p>6 The sample was divided into approximately equal groups using SAS proc ranked. 
The sizes are unequal because of ties in the number of uses of computers.</p><p>21 Running head: Focus, Fiddle and Friends</p><p>To evaluate our primary hypotheses concerning the match between initial level of implementation and knowledge gaining experience (H1, H2, H3), we estimated a separate regression for those at each initial level of implementation, retaining our key predictors as well the covariates identified by the stepwise regression for the whole sample that were statistically significant (p < .10). The only covariate to satisfy this criterion for any group was an indicator of whether a teacher was motivated to attend professional development for technology because of excitement to apply technology in classroom (statistically significant for the group of teachers high on initial level of implementation). We retained this variable in each of our models for comparability. We also controlled for the school in which the teacher taught (using fixed effects </p><p>– twenty-four dummy variables for the twenty-five schools). 7 We used interaction terms in an analysis of the whole data set to test the exploratory hypotheses concerning whether the effect of the knowledge matched to the specific level of implementation was strongest at that level (e.g, whether the estimated effect of focused professional development was stronger for teachers at low initial levels of implementation than for others). </p><p>Because our outcome is the change in uses of computers and therefore can be considered a count variable, we confirmed all of our inferences using Poisson models, correcting for over- dispersion (equivalent to a negative binomial), although we report results from linear models for ease of interpretation (inferences from the Poisson were consistent with those reported). In a subsequent section we explore the robustness of our inferences to potential selection bias due to omitted confounding variables.</p><p>RESULTS</p><p>7 Supplemental analyses indicated that there was essentially zero variation attributable to schools once districts had been taken into account. </p><p>22 Running head: Focus, Fiddle and Friends</p><p>Table 1 presents descriptive statistics for change in computer uses and for our key theoretical independent variables as well as the key covariates. Teachers at low levels of implementation at time 1 used computers roughly 31 times throughout the first year of teaching and more than doubled their uses between the first and second year of our study (the change is statistically significant p < .001 using a paired t-test). Teachers at intermediate levels of implementation at time 1 used computers roughly 101 times throughout the first year of the study and added thirteen uses on average (the change is borderline statistically significant p < .07). </p><p>Teachers at high levels of implementation used computers roughly 319 times throughout the first year of the study but decreased their uses by about 73 occurrences (the change is statistically significant p < .001). The changes in uses of computers across the three groups were significantly different from one another with the decrease in uses for the initial high level group significantly different form the other two groups (using Tukey’s honestly significant differences, p < .05). 
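(A minimal sketch of these descriptive comparisons is given below. It is written in Python with hypothetical column names and an illustrative input file, uses the group cut points reported above, and is meant only to illustrate the paired t-tests and Tukey comparisons just described, not to reproduce the original analysis.)

```python
# Illustrative sketch of the descriptive comparisons described above:
# paired t-tests of time-2 versus time-1 computer use within each initial-
# implementation group, and Tukey's HSD comparing change scores across groups.
# Column names (use_t1, use_t2) and the input file are hypothetical.
import numpy as np
import pandas as pd
from scipy.stats import ttest_rel
from statsmodels.stats.multicomp import pairwise_tukeyhsd

teachers = pd.read_csv("teacher_survey.csv")                 # hypothetical file
teachers["change"] = teachers["use_t2"] - teachers["use_t1"]

# Group by initial (time-1) level of use, with the cut points reported in the text
teachers["group"] = pd.cut(teachers["use_t1"],
                           bins=[-np.inf, 58, 182, np.inf],
                           labels=["low", "intermediate", "high"])

# Within-group change: paired t-test of time-2 against time-1 use
for name, g in teachers.groupby("group"):
    result = ttest_rel(g["use_t2"], g["use_t1"])
    print(f"{name}: mean change = {g['change'].mean():.1f}, "
          f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")

# Across-group comparison of change scores using Tukey's honestly significant differences
print(pairwise_tukeyhsd(endog=teachers["change"], groups=teachers["group"], alpha=0.05))
```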
Thus, there is a tendency for regression to the mean, with teachers at the lowest initial levels increasing and teachers at the highest initial levels decreasing (see the discussion for further comment).</p><p>Insert table 1 about here</p><p>Regarding the independent variables, each independent variable increased with initial level of implementation. That is, teachers at the highest levels of implementation at time 1 subsequently experienced the most hours of professional development focused on student learning (8.6); spent the most days per year exploring technology (50); accessed the most expertise of others (2.7); held the highest perceptions of the value of technology (5.1 on a 6 point likert scale); sought the most help from others to learn about technology – about eighteen days; </p><p>23 Running head: Focus, Fiddle and Friends and were most likely (61%) to be motivated to attend technology professional development because they were excited to apply technology in their classrooms (all differences among the three groups were statistically significant at p < .001, except days per year sought help from others to learn about new technologies which was significant at p < .05). Thus, those at the highest initial level of implementation most actively engaged new sources of knowledge. This emphasizes the need to control for initial levels of implementation when evaluating our hypotheses. </p><p>Results for the estimated model of change in use of computers are reported in Table 2. </p><p>Consistent with hypothesis 1 (focus), the estimated effect of professional development focused on student learning was statistically significant (p < .01) for teachers at low levels of implementation at time 1. Teachers in this group increased about four and a half uses of computers for each hour of technology based professional development focused on student learning. Consistent with hypothesis 2 (fiddle), the estimated effect of experimentation was statistically significant (p < .05) for teachers at intermediate levels of implementation at time 1. </p><p>For each day exploring or experimenting with computers teachers in this group increased about one use of computers. Last, consistent with hypothesis 3, the estimated effect of access to knowledge through interactions with others (friends) was statistically significant (p <.01) for teachers at high levels of implementation at time 1. A one unit increase in the log of expertise accessed through help and talk per day (recalling the measure was divided by 180 to reflect access to expertise per day), resulted in almost thirty three more uses per year of computers for the core tasks of teaching. For this group, the standardized coefficient for access to knowledge through interactions was .27, comparable to the coefficient for days per year exploring or </p><p>24 Running head: Focus, Fiddle and Friends experimenting and roughly three times the size of the standardized coefficient for focused professional development.</p><p>Regarding the comparison of estimates across initial levels of implementation, the effect of focused professional development for the low implementers was not significantly different from the estimated effects for the intermediate and high level implementers. The effect of experimentation for the intermediate group was not stronger than for the low and high groups. </p><p>This is because the high group also benefitted from experimentation. 
Thus, the estimated effect of experimentation for the intermediate and high groups combined was statistically different from the estimated effect for the low group (p < .10). Finally, the effect of access to knowledge through talk and help was statistically higher for the high group than for either of the other two groups (p < .001). The effects generally were strongest for the specified levels of initial implementation, although the effects of each source of knowledge were relatively strong for those at high initial levels of implementation.</p><p>Beyond the effects directly pertaining to our hypotheses, the perceived value of computers to help with core tasks was statistically significant for teachers at low levels of implementation at time 1 (and negative for the other two groups). This suggests that low level users may well be motivated by their perceptions, as in the classic diffusion model. Low level users were also strongly influenced by how much they sought others for interaction, suggesting this alternative explanation to the effect of access to knowledge may apply to groups in the early phases of adoption. Last, the motivation to attend professional development for technology because of excitement to apply technology in the classroom approached statistical significance (p</p><p>< .10) for the high group. Overall, our model explained about 24% of the variation for each group.</p><p>25 Running head: Focus, Fiddle and Friends</p><p>Insert Table 2 about here</p><p>QUANTIFYING THE ROBUSTNESS OF OUR INFERENCES</p><p>Any policy or theoretical interpretations we make in our study will depend on the robustness of our inferences. Recognizing the importance of causal inference, our outcome of a difference score controls for implementation of technology in the classroom at time 1 (Allison, </p><p>1990). Furthermore, we also controlled for other key covariates. But, our inferences may be invalid because teachers more interested in technology might use technology more as well as seek more focused professional development, explore and experiment more, and seek more interactions. That is, our covariates may be inadequate as controls. For example, Heckman and </p><p>Hotz (1989) argue strongly for controlling not only for the prior measure of an outcome, but the prior trajectory that would also forecast a subject’s outcome. </p><p>We recognize that no matter how many statistical controls we employ there will be inevitable concerns about the validity of our inferences. Therefore to inform discourse about our inferences, we quantify the concerns about the potential to invalidate our inferences. Our approach can be considered an extension of sensitivity analysis (e.g., Copas and Li 1997; Robins,</p><p>Rotnitzky, and Scharfstein 2000; Rosenbaum and Rubin 1983). </p><p>Classically, internal validity can be expressed in terms of confounding variables that are correlated with both the predictor of interest and the outcome (Shadish, Cook, and Campbell </p><p>2002). For example, the effect of focused professional development could be confounded with motivation to attend professional development because motivation could be correlated with the </p><p>26 Running head: Focus, Fiddle and Friends type of professional development received as well as subsequent changes in practices. This is also known as concern over 2selection bias (Heckman 1978) or identification (Manski 1995). 
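To make the estimation concrete, the sketch below illustrates the kind of adjusted change-score model described in the analytic approach, estimated separately by initial-implementation group with school fixed effects, whose estimates the robustness analysis that follows interrogates. It is written in Python/statsmodels rather than the software used in the original analysis, and all variable names (pd_focus_hours, fiddle_days, log_access_friends, and so on) are hypothetical stand-ins for the measures defined earlier.

```python
# Sketch of the per-group change-score regression with school fixed effects.
# Variable names are hypothetical stand-ins for the measures described in the text;
# "group" is assumed to hold the low / intermediate / high split on time-1 use.
import pandas as pd
import statsmodels.formula.api as smf

teachers = pd.read_csv("teacher_survey.csv")                 # hypothetical file
teachers["change"] = teachers["use_t2"] - teachers["use_t1"]

formula = ("change ~ pd_focus_hours + fiddle_days + log_access_friends"
           " + perceived_value + help_seeking_days + excited_to_apply + C(school)")

for name, g in teachers.groupby("group"):
    fit = smf.ols(formula, data=g).fit()
    print(f"--- {name} initial implementers ---")
    print(fit.params[["pd_focus_hours", "fiddle_days", "log_access_friends"]])
    print(fit.pvalues[["pd_focus_hours", "fiddle_days", "log_access_friends"]])

# The paper also confirmed these inferences with Poisson models corrected for
# over-dispersion; that check is omitted from this sketch.
```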
</p><p>To express robustness that accounts for the relationship between a confounding variable and the predictor of interest and between the confounding variable and the outcome, Frank (2000) defines the impact of a confounding variable on an estimated regression coefficient as impact = r(v,y) × r(v,x). In this expression, r(v,y) is the correlation between a confounding variable, v (e.g., motivation), and the outcome y (e.g., change in computer uses), and r(v,x) is the correlation between v and x, a predictor of interest (e.g., hours of technology professional development focused on student learning). Frank (2000) then quantifies how large the impact must be to invalidate an inference.</p><p>Using Frank's calculations, the impact of a confound would have to be greater than .047 to invalidate the inference of an effect of focused professional development on change in implementation for low level implementers. In terms of the component correlations, r(v,y) (the correlation between the confound and the outcome) must be greater than 0.21 and r(v,x) (the correlation between the confound and the predictor of interest) must be greater than 0.22 to invalidate the inference (using Frank's multivariate correction). </p><p>Though the magnitude of the impact threshold for an unmeasured variable can be interpreted in terms of general effect sizes in the social sciences (Cohen and Cohen 1983), it is helpful to compare the threshold to the impacts of measured covariates. Partialling only for schools as fixed effects, the strongest covariate for the low level implementers is "seeking help from others to learn technology," with an impact of .038 (.038 = .259 × .148). Comparing .038 with the impact threshold of .047, the impact of an unmeasured confound would have to be about 50% greater than the impact of our strongest covariate to invalidate the inference of an effect of professional development focused on student learning. </p><p>Using a similar logic for those at intermediate initial levels of implementation, the impact threshold for the effect of days per year exploring or experimenting with technology is .036, with component correlations of r(v,y) = .187 and r(v,x) = .193. For this group, excitement to apply technology in the classroom has the strongest impact, a value of .065 (.065 = .127 × .508). Thus if the impact of an unmeasured confounding variable had about half the impact of the strongest measured covariate, the inference would be invalid.</p><p>Last, for those at high initial levels of implementation, an unmeasured confound would have to have an impact of .084 (with component correlations of .29) to invalidate the inference that access to knowledge through talk and help affects change in level of implementation. As with the intermediate group, excitement to apply technology in the classroom had the largest impact with a value of .076 (.076 = .187 × .404). Thus the impact of an unmeasured confounding variable would have to be 10% stronger than the impact of the strongest covariate to invalidate our inference.</p><p>Overall, the impact thresholds reveal that the inference for exploring and experimenting with technology for the intermediate group is less robust than the other inferences.
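The calculation behind these thresholds is simple enough to script; a minimal sketch follows, using the definition impact = r(v,y) × r(v,x) given above and, as an example, the reported values for the low-implementation group. It is illustrative only and introduces no new results.

```python
# Impact of a confounding variable v on the estimated coefficient of a predictor x
# in a model of outcome y (Frank 2000): impact = corr(v, y) * corr(v, x).
# The numbers below simply re-use values reported in the text for the low group.
import math

def impact(r_vy: float, r_vx: float) -> float:
    """Impact of confound v: product of its correlations with the outcome and the predictor."""
    return r_vy * r_vx

threshold_low = 0.047                                   # impact needed to invalidate the inference
covariate = impact(r_vy=0.259, r_vx=0.148)              # "seeking help from others" covariate
print(round(covariate, 3))                              # 0.038, below the 0.047 threshold

# If the threshold impact were split evenly between its two component correlations,
# each would need to be roughly sqrt(threshold); the paper reports 0.21 and 0.22
# after a multivariate correction.
print(round(math.sqrt(threshold_low), 2))               # 0.22
```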
This is important to acknowledge, although we note that the inference for the effect of experimentation is more robust for the high group (t-ratio of 3.17 for the high group versus 2.47 for the intermediate group), supporting the general effect of experimentation and exploration on change in use of computers.</p><p>DISCUSSION</p><p>28 Running head: Focus, Fiddle and Friends</p><p>The macro level link between knowledge and productivity has been conceptualized for the manufacturing process (Romer 1990). Productivity increases as new knowledge is transmitted via simple communication or transfer of personnel from one organization to another. </p><p>But when the production process is complex, one must pay careful attention to the evolution of knowledge that supports diffusion. As in the example in this study, the teachers in one school cannot typically “manufacture” human capital using the techniques learned from teachers in another school or experts in a particular technology. This suggests a diffusion and evolution of knowledge intertwined with the implementation of innovations.</p><p>Our findings support our primary hypotheses pertaining to the need to match knowledge source with initial level of implementation. The effect of focused professional development was statistically significant for those at the lowest levels of implementation. The effect of exploration and experimentation was statistically significant for teachers at intermediate initial levels of implementation, and the effect of interactions with others was statistically significant for those at the highest initial levels of implementation. Note this last finding is especially important because the highest initial implementers were likely to decrease in their use of computers. Thus interactions with colleagues may be especially important to sustaining the implementation of innovations.</p><p>Though some of the estimated effects are small, because our dependent variable is defined as a difference score we emphasize that these are effects on changes in levels of implementation. As such, the effects may accumulate across years or may complement one another as teachers develop knowledge and implement innovations. Furthermore, inferences for </p><p>29 Running head: Focus, Fiddle and Friends the effects for low and high level implementers are at least moderately robust with respect to concerns for unmeasured confounding variables (i.e., selection bias).</p><p>Our findings suggest a three part evolution in knowledge as innovations permeate organizational boundaries. Most teachers begin by learning basics from an external agent who conveys knowledge relevant for the teacher’s goals (focus). This process allows for the diffusion of knowledge from organization to organization, or from researcher to practitioner. Teachers may then adapt the innovation to their unique contexts by exploring and experimenting with the innovation (fiddle). Last, teachers may benefit from further interactions with others who may support their initiative to innovate, and from whom they can obtain further, context specific, knowledge (friends).</p><p>We emphasize how the organizational boundary shapes the evolution of knowledge because it defines the salient local conditions and identifies organizational membership. In this sense, our story is not simply one of a learning theory that includes experimentation and interaction applied in a professional context (e.g., Cohen, McLaughlin and Talbert, 1993; </p><p>Wilson, and Berne, 1999). 
It is not just that teachers at intermediate levels of implementation must generally adapt practices, but they must adapt them to the local conditions defined by their organization – the school. And it is not just that any interaction will do for teachers at high levels of implementation – school colleagues who know and understand the organizational context will be able to provide the most local knowledge and support. Our conceptualization also is not one of novices entering a community of practice in which the local knowledge is already embedded (e.g. Lave and Wenger, 1991). Instead the novices are vectors of knowledge transfer as they adapt knowledge generated external to the organization to the organizational context and then interact with others in that context.</p><p>30 Running head: Focus, Fiddle and Friends</p><p>The question then turns to the value of each type of knowledge when it is not matched to a specific level of implementation. Our findings provide modest support for the relative effects of different sources of knowledge depending on initial level of implementation. In particular, the tacit knowledge gained through experimentation may not be valuable to novices, and the explicit knowledge embedded in the local context may be valuable only to those who already possess general and tacit knowledge. The primary departure from these trends was that those at high initial levels of implementation tended to benefit from all sources of knowledge. Perhaps they have the scaffolding to filter knowledge from professional development and to extract value from experimentation. This suggests that knowledge sources are cumulative and not hierarchical through evolution. Each source is valuable for its targeted level as well as those at higher levels of implementation. </p><p>Limitations</p><p>While our research synthesizes the professional development literature with that of diffusion of innovations, there are still important limitations to recognize. First, we develop our theory generally with respect to innovations, but our data feature technology as an innovation. </p><p>While technology is an important innovation in elementary schools today, it is not clear what the effects of professional development will be for other innovations. The work by Garret et al., and </p><p>Desimone et al., suggests that focus and fiddle will be effective forms of professional development across innovations and some of our most recent work suggests that social factors will also be relevant across innovations (Fishman, Penuel, and Yamaguchi 2006; Penuel, et al., </p><p>31 Running head: Focus, Fiddle and Friends</p><p>2007), but research should explore whether the effects of focus, fiddle and friends, depend on initial levels of implementation in other contexts and relative to other outcomes.</p><p>Second, one of our strengths is that we comprehensively measured professional development experiences. Yet we do not know their order, nor how elements of specific programs might interact. For example, an ecological theory (Zhao and Frank 2003) would suggest that experimentation might have a stronger effect if it comes after focused professional development, but we were not able to empirically test this. </p><p>Even given these limitations, our data have tremendous strengths. Our longitudinal data allowed us to control for many alternative explanations. Our social network data yielded more valid measures of the knowledge teachers access through interactions than mere self report. 
We measured the full range of professional development experiences as well as multiple dimensions of professional development. Therefore, in the next subsection we draw policy implications based on inferences from our data.</p><p>Policy Implications</p><p>Our findings and alliterative mnemonic (focus, fiddle and friends) lead to specific guidelines. Professional development provided by an external agent and focused on student learning will be most effective for teachers or schools just beginning to implement an innovation. This type of professional development provides the baseline knowledge necessary to understand the innovation. As implementation progresses, schools should support localized experimentation and exploration (fiddle). In this phase, schools might do well to devote their resources to teacher release time for experimentation, and to validating the legitimacy of experimentation, some of which may not be immediately beneficial in the classroom. Last, once teachers attain high levels of implementation, they have the general knowledge of the innovation and the specific knowledge of adaptation in their classrooms needed to benefit from interactions with colleagues (friends).</p><p>In this phase, schools might do well to facilitate interactions among teachers, for example, by designating teachers knowledgeable in a particular area to provide professional development to others in the school; coordinating release time among sets of teachers to promote knowledge sharing; and cultivating new sources of knowledge or interactions to ensure that knowledge does not become isolated in social pockets of the school. For example, Frank, Krause, and Penuel (2007) present the intriguing finding that schools as organizations are most responsive to knowledge flows that are generated by a small number of subgroups but that are distributed across the school.</p><p>Our emphasis on interactions among teachers has profound importance for schools as social organizations. In our data, teachers found it difficult to sustain high levels of implementation in the absence of interaction with colleagues. By implication, when interaction regarding an innovation is concentrated in localized pockets, so too will be sustained implementation of the innovation. The result could be uneven implementation across the organization (e.g., McLaughlin and Talbert 2001; Zmud and Apple 1992). Furthermore, because current teaching practices form the landscape for the implementation of new innovations, uneven implementation of one innovation could generate uneven implementation of future innovations, creating further coordination problems for the school (Bidwell 2000, 2001; Frank, Zhao, and Borman 2004).</p><p>Now imagine learning in a school in which one group of teachers wholeheartedly implements an innovation such as technology while another group does not. Transitions from grade to grade or subject to subject could be extremely difficult and likely will be mastered only by those students with the most advantages and support. For example, a parent who is adept with technology may help her second grader engage new technology to which the student was not exposed in first grade. In this way, stratification with respect to background can emerge.</p><p>Conclusion</p><p>The preceding policy recommendations derive from a developmental conceptualization of implementation. 
As such, they align the form of professional development with current levels of implementation. Moreover, the developmental trajectory of implementation informs how innovations diffuse first between organizations and then within organizations. In this case, professional development conveys general knowledge between schools that is then adapted through diffusion within schools. It is through altering these diffusion processes that organizations such as schools shape the implementation of innovations. That is, the social institution of the school contributes to society not just as a venue for transmitting knowledge to students but as an organization that shapes and transforms the knowledge used by teachers.</p><p>On-line Technical Appendix: Measures</p><p>Dependent variable: use of computers for student purposes</p><p>Based on a composite of 7 items (alpha = .76):</p><p>How frequently do you or your students use computers for … classroom management and/or incentives for students; student to student communication; student inquiry; student expression; core curriculum; remediation; basic computer skills.</p><p>The response scale (never, yearly, monthly, weekly, daily) was recoded into days per year (never = 0; yearly = 2; monthly = 9; weekly = 40; daily = 180).</p><p>For independent variables, all responses included 0%, 25%, 50%, 75%, 100%, and Don’t Know (Don’t Know was coded as missing) unless otherwise specified.</p><p>Key Predictors</p><p>Focus of professional development on student learning (focus)</p><p>Teacher’s perception of the focus of the professional development on student learning (alpha=.81): What percentage of your technology professional development … helped you improve students’ engagement in learning; helped you improve the content you taught; helped you improve student achievement; helped you improve your teaching style; involved content directly linked to your curriculum; helped you learn how to integrate technology into the curriculum; engaged you in the planning stages?</p><p>We then multiplied this percentage by the number of hours the teacher attended district or school in-service programs for new technologies.</p><p>Opportunities for experimentation and exploration (fiddle)</p><p>Please indicate the frequency with which you engage in each activity below (never = 0; yearly = 2; monthly = 9; weekly = 40; daily = 180). The two items (correlated at .31) were combined:</p><p>Explore new technologies on my own; Experiment with district-supported software.</p><p>Log of access to knowledge through talk and help (friends)</p><p>To measure access to the knowledge of others, we developed the following network effects measure (Marsden and Friedkin 1994). First, we asked each respondent to list the others in the school with whom they talked about new uses for computers in their teaching. Then, the helpfulness of another depends on the helper’s use of computers for student purposes at time 1. Our reasoning is that the helper’s “knowledge” as a resource is best measured not by the helper’s knowledge of computers in the abstract but by the helper’s ability to implement computers in her own classroom (Zhao and Frank 2003; Frank, Zhao, and Borman 2004). Formally, access to knowledge through talk for teacher i is the sum of the extent of implementation of the others with whom the respondent talked: Σ_{i′} talk_{ii′} × (level of implementation)_{i′, t1}. 
Here, talk_{ii′} takes a value of one if actor i indicated talking to i′ about technology and zero otherwise, and (level of implementation)_{i′, t1} is the extent to which i′ implemented technology at time 1. A similar measure was created based on help_{ii′}. Access to knowledge through talk and help was then the sum of access to knowledge through talk and access to knowledge through help. We then took the log of the sum (plus one) to reduce positive skew. (A short illustrative sketch of this computation appears below, following the list of alternative foci.)</p><p>Key Covariates</p><p>Perception of the value of computers for teacher’s own use (responses range from 1=strongly disagree to 6=strongly agree, alpha=.93): Computers can help me … integrate different aspects of the curriculum; teach innovatively; direct student learning; model an idea or activity; connect the curriculum to real world tasks; be more productive.</p><p>This measure was obtained at time 1 and time 2 for most teachers. Therefore we took the average of the non-missing measures, improving reliability and representing the state of their perceptions roughly between the first and second time points.</p><p>Seeking Help. Please indicate the frequency with which you engage in each activity below (1=never; 2=yearly; 3=monthly; 4=weekly; 5=daily; items used separately – intended as separate dimensions): Seek help from others to learn about new technologies.</p><p>Other Covariates</p><p>Alternative Foci</p><p>Extent to which professional development focused on student skills (alpha=.67): Approximately what percentage of your technology professional development … focused on how to use technology with diverse learners; focused on how to use technology to help students navigate the web; helped you learn how to teach technology skills to your students.</p><p>Focus on teacher skills (items used separately because no combination of 3 or more produced alpha > .6): Approximately what percentage of your technology professional development helped you learn basic technology skills (i.e., word processing, PowerPoint); Approximately what percentage of your technology professional development focused on … how to use technology to present information (i.e., PowerPoint); how to use technology for assessment (i.e., using technology to track and evaluate student achievement, record keeping); how to use technology to find information on the web for professional purposes (i.e., lesson plans).</p><p>Linkage to national standards: Approximately what percentage of your technology professional development was associated with (items used separately – intended as separate dimensions): MEAP and/or district testing; curricular changes; changes in hardware or administrative software (i.e., electronic report cards, grade books); state or national curriculum standards. 
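</p><p>To make the friends measure above concrete, the following minimal sketch (ours, not the authors’ original code; the array names and the use of Python/NumPy are illustrative assumptions) computes the logged access-to-knowledge score from a talk matrix, a help matrix, and the vector of time 1 implementation levels:</p><p>
import numpy as np

def friends_measure(talk, help_, impl_t1):
    # talk, help_: n-by-n arrays of 0/1 nominations; entry [i, j] is 1 if teacher i
    #   named teacher j as someone she talks to (or gets help from) about technology.
    # impl_t1: length-n array of each teacher's use of computers for student
    #   purposes at time 1 (days per year).
    access_talk = talk @ impl_t1    # sum of nominated colleagues' implementation (talk)
    access_help = help_ @ impl_t1   # sum of nominated colleagues' implementation (help)
    return np.log(access_talk + access_help + 1.0)  # log(sum + 1) reduces positive skew
</p><p>Each teacher’s score thus rises both with the number of colleagues she names and with those colleagues’ own implementation at time 1. 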
</p><p>Opportunities for other professional development activities</p><p>Please indicate the frequency with which you engage in each activity below (1=never; 2=yearly; 3=monthly; 4=weekly; 5=daily; items used separately – intended as separate dimensions): Attend professional-development conferences about new technologies; Read professional journals about new technologies.</p><p>Professional development pedagogical style</p><p>37 Running head: Focus, Fiddle and Friends</p><p>(items used separately – intended as separate dimensions): Approximately what percent of your technology professional development during this school year was … continuation of technology professional development from previous years; didactic (i.e., you sat and listened to lectures); interactive (i.e. instructors responded directly to your needs and requests); collaborative (i.e., you worked with other participants during the training to achieve common goals); experimental (i.e., you experimented with technology during the professional development). </p><p>Structural characteristics </p><p>Location (items used separately – intended as separate dimensions): Approximately what percentage of your technology professional development during this school year took place … in your school computer lab; in your classroom; at a facility in your district; at an ISD facility; outside the district (public/private agency). </p><p>Timing of professional development (items used separately – intended as separate dimensions): Please indicate approximately what percent of your technology professional development during this school year occurred: before school; after school; during school; on weekends; during vacation don’t know </p><p>Duration of professional development (items used separately – intended as separate dimensions): Please indicate approximately what percent of your technology professional development during this school lasted … a single day; multiple days; a semester; a year. </p><p>Type of instructor (items used separately – intended as separate dimensions): Approximately what percentage of your technology professional development was instructed by a/an</p><p>38 Running head: Focus, Fiddle and Friends</p><p> classroom teacher; district technology specialist; building technology or media specialist; administrator; outside specialist/consultant </p><p>Support other teacher functions (items used separately – intended as separate dimensions): Approximately what percentage of your technology professional development helped you to … manage the classroom while using technology; understand procedures for accessing technology resources (i.e. money from the district for software, technical support); trouble shoot technology problems; access technology experts; understand district technology policies (e.g. acceptable use, restrictions, internet use permission slips).</p><p>Improve teaching style: Approximately what percentage of your technology professional development helped you improve your teaching style?</p><p>Motivation</p><p>Please check any and all of the factors that motivated you to attend technology professional development this year. Compensation (i.e., stipends, release time, credits, reimbursement, hardware/software resources); Contractual stipulation; Increased professional opportunities; Recognition; My excitement to apply technology in my classroom. 
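</p><p>Several of the frequency measures above (the dependent variable and the fiddle items) recode a never/yearly/monthly/weekly/daily response into approximate days per year. A minimal sketch of that recoding and of composites built from it follows (our illustration, not the authors’ code; the function names are hypothetical, and summing rather than averaging the recoded items is our assumption, chosen because it is consistent with the scale of the means reported in Table 1):</p><p>
DAYS_PER_YEAR = {"never": 0, "yearly": 2, "monthly": 9, "weekly": 40, "daily": 180}

def recode(response):
    # Map a never/yearly/monthly/weekly/daily response to approximate days per year.
    return DAYS_PER_YEAR[response]

def use_of_computers(item_responses):
    # Composite of the seven student-use items (summed here; see note above).
    return sum(recode(r) for r in item_responses)

def fiddle(explore, experiment):
    # Days per year exploring or experimenting with technology (the two items combined).
    return recode(explore) + recode(experiment)
</p><p>For example, a teacher who explores new technologies monthly and experiments with district-supported software weekly would receive a fiddle score of 9 + 40 = 49 days per year. 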
</p><p>Technology Resources</p><p>Reliability of computer resources (items used separately – intended as separate dimensions): During the current school year what percent of the time have you had technical problems with the computers in the lab?; During the current school year, what percent of the time have you had technical problems with the computer(s) in your classroom? Of the times you have had technical problems with computers, what percent of the time did you solve the problem yourself?</p><p>39 Running head: Focus, Fiddle and Friends</p><p>Adequacy of resources (1= strongly disagree to 6=strongly agree; alpha=.84): Please indicate the extent to which you agree with each of the following statements The computer resources in my room are adequate for my instructional needs (e.g., lesson and unit planning, accessing materials such as pictures); The computer resources in my room are adequate for student uses (e.g., student research, writing, artwork); It is easy to implement new software in this school; It is easy to implement new hardware in this school.</p><p>District Support for Hardware(Please rate the district in terms of the following:1=Poor; 2=Fair; 3=Neutral; 4=Good; 5=Excellent; alpha=.92): Providing enough hardware; Choosing appropriate hardware; Providing a reliable server; Updating hardware; Providing technical support for hardware use.</p><p>District Support for Software (Please rate the district in terms of the following:1=Poor; 2=Fair; 3=Neutral; 4=Good; 5=Excellent; alpha=.95): Providing enough software; Choosing appropriate software; Updating software; Engaging teachers in decisions about software purchases; Providing professional development for software use; Providing technical support for software use; Recognition for technological innovation.</p><p>Perception of the value of computers for students’ use (responses range from 1=strongly disagree to 6=strongly agree, alpha=.93): Computers can help students … develop new ways of thinking; think critically; gather and organize information; explore a topic; be more creative; be more productive </p><p>This measure was obtained at time 1 and time 2 for most teachers. Therefore we took the average of the non-missing measures, improving reliability and representing the state of their perceptions roughly between the first and second time points.</p><p>Pressure to use technology (1= strongly disagree to 6=strongly agree, correlated at .69) Please indicate the extent to which you agree with the following statements. I need to use computers to keep up in this school; Others in this school expect me to use computers</p><p>Pace of technology (1= strongly disagree to 6=strongly agree, correlated at .28)</p><p>40 Running head: Focus, Fiddle and Friends</p><p>Please indicate the extent to which you agree with the following statements. We introduce many new things in this school; It is difficult to implement all of the new things in this school</p><p>Teacher background</p><p>Grade=highest grade taught; Seniority= years in the school; Gender=1 for female, 0 for male; Tendency to innovate: (1= strongly disagree to 6=strongly agree. Alpha=.74) Please indicate the extent to which you agree with the following statements. I try new things in the classroom; I am the first to try something new in the classroom; I enjoy introducing something new in the classroom).</p><p>41 Running head: Focus, Fiddle and Friends</p><p>References</p><p>3Abrahamson, E., and Rosenkopf, L. (1997, May-June). 
Social network effects on the extent of innovation diffusion: A computer simulation. Organization Science, 8(3). Abdal-Haqq, I. 1996. Making Time for Teacher Professional Development. ERIC Digest. Access ERIC: FullText (071 Information Analyses--ERIC IAPs No. EDO-SP-95-4). District of Columbia: ERIC Clearinghouse on Teaching and Teacher Education, Washington, DC. Adelman, N., Donnelly, M. B., Dove, T., Tiffany-Morales, J., Wayne, A., and Zucker, A. A. (2002). The Integrated Studies of Educational Technology: Professional development and teachers' use of technology. Subtask 5: Evaluation of key factors impacting the effective use of technology in schools. Menlo Park, CA: SRI International. Adler, P. S., and Kwon, S. 2002. Social capital: Prospects for a new concept. Academy of Management Review, 27(1), 17–40. Adelman, N., Donnelly, M. B., Dove, T., Tiffany-Morales, J., Wayne, A., and Zucker, A. A. 2002. The Integrated Studies of Educational Technology: Professional development and teachers' use of technology. Menlo Park, CA: SRI International. 4Aiken, M, S. B. Bacharach, and J. L. French. 1980. "Organizational Structure, Work Process, and Proposal Making in Administrative Bureaucracies." Academy of Management Journal 23:631-52. Akerlof, G., and Kranton, R.E. 2002. Identity and schooling: Some lessons for the economics of education. Journal of Economic Literature, 40, 1167-1201. Allison, P. D. (1990). Change scores as dependent variables in regression analysis. Sociological Methodology, 20, 93-114. Argote, Linda, Sara L. Beckman, and Dennis Epple. 1990. "The Persistence and Transfer of Learning in Industrial Settings." Management Science 36:140-154. Argote, L., P. Ingram 2000. "Knowledge transfer A Basis for Competitive Advantage in Firms." Organizational Behavior and Human Decision Processes 82(1): 150-169. Arrow, K. J. 1974. The Limits of Organization. New York: W. W. Norton. 5Attewell, P. 1992. "Technology Diffusion and Organizational Learning." Organization Science 3(1):1-19. Ball, D. L., and Bass, H. (2000). Interweaving content and pedagogy in teaching and learning to teach: Knowing and using mathematics. In J. Boaler (Ed.), Multiple perspectives on the teaching and learning of mathematics (Vol. 83-104). Westport, CT: Ablex. Barley, S. R. (1990).The Alignment of Technology and Structure through Roles and Networks. Administrative Science Quarterly, Vol. 35, No. 1, Special Issue: Technology, Organizations, and Innovation. (Mar., 1990), pp. 61-103. 6Barr, R. and R. Dreeben. 1977. "Instruction in Classrooms." In Review of Research in Education: Vol. 5, edited by L. Shulman. Itasca, IL: Peacock. ------. 1983. How Schools Work. Chicago: University of Chicago Press. Becker, H. J. 2000. Findings from the Teaching, Learning, and Computing Survey: Is Larry Cuban Right? Education Policy Analysis Archives, 8(51), 2-32. Begon, M., C.R. Townsend, and J.L. Harper. 2006. Ecology: From individuals to ecosystems. 4th edition. Blackwell Publishing, Ltd. Malden, MA. pp.599-600. Berends, M. (2000). Teacher-reported effects of New American School designs: Exploring relationships </p><p>42 Running head: Focus, Fiddle and Friends</p><p> to teacher background and school context. Educational Evaluation and Policy Analysis, 22(1), 65-82. 7Bidwell, C. E. 1965. "The School as a Formal Organization." Pp. 972-1022 in The Handbook of Organizations, edited by J. March. Chicago, IL: Rand McNally. 8Bidwell, Charles E. 2000. School as Context and Construction: A Social Psychological Approach to the Study of Schooling. 
Maureen T. Hallinan, Ed. Pp. 15-36. New York: Kluwer Academic/Plenum Publishers. Bidwell, C. E. 2001. “Analyzing Schools as Organizations: Long Term Permanence and Short Term Change.” Sociology of Education, Extra Issue, 100-114. 9Bidwell, C. E. and J. D. Kasarda. 1987. Structuring in Organizations: Ecosystem Theory Evaluated. Greenwich, CT: JAI Press Inc. 10Bolman, L. G. and R. Heller. 1995. "School Administrators as Leaders." Pp. 315-58 in Images of Schools, edited by S. Bacharach and B. Mundell. Thousand Oaks, CA: Corwin Press, Inc. Borman, G., and Hewes, G. 2002. The long-term effects and cost-effectiveness of Success for All. Educational Evaluation and Policy Analysis, 24 (4), 243-266. Borman, G.D., Hewes, G.M., Overman, L.T., and Brown, S. 2003. Comprehensive school reform and achievement: A meta-analysis. Review of Educational Research, 73 (2), 125- 230. Burt, R. S. 1997. The Contingent Value of Social Capital. Administrative Science Quarterly, Vol. 42: 339—365. Burt, R 2002. "The Network Structure of Social Capital," (Pre-print of a chapter in Research in Organizational Behavior, Volume 22, edited by Robert I. Sutton and Barry M. Staw. Elsevier Science, 2000) . http://gsbwww.uchicago.edu/fac/ronald.burt/research/NSSC.pdf 11Callahan, R. E. 1962. Education and the Cult of Efficiency. Chicago: University of Chicago Press. Casson, M.C. 1994. Why are firms hierarchical? International Journal of the Economics of Business, 1 (1), 47-76. Cattagni, A., and Farris, E. 2001. Internet Access in U.S. Public Schools and Classrooms: 1994 – 2000. Washington DC: National Center for Educational Statistics. Coase, R. H. 1937. The nature of the Firm. Economica. Vol 4(16): 386-405. Coburn, C. E., and J.L. Russell. 2008. District policy and teachers' social networks. Educational Evaluation and Policy Analysis, 30(3), 203–235. Cochran-Smith, M., and Lytle, S. L. 1999. Relationships of Knowledge and Practice: Teacher Learning in Communities. Review of Educational Research, 24, 249-305. Cohen, J., and Cohen, P. 1983. Applied Multiple Regression/Correlation Analysis for the Behavioral Science. Hillsdale, NJ: Lawrence Erlbaum. 12Cohen, D. K., McLaughlin, M.W., and Joan E. Talbert. 1993. Teaching for Understanding: Challenges for Policy and Practice. San Francisco: Jossey Bass. Cohen, D., Raudenbush, S., and Ball, D. 2003. Resources, instruction, and research (Adobe PDF). Educational Evaluation and Policy Analysis, 25 (2), 1-24. Collins, A. 1996. Whither Technology and Schools? Collected Thoughts on the Last and Next Quarter Centuries. In C. Fisher and D. C. Dwyer and K. Yocam (Eds.), Education and Technology: Reflections on Computing in Classrooms (pp. 51-66). San Francisco, CA: Jossey-Bass.</p><p>43 Running head: Focus, Fiddle and Friends</p><p>Cook, T. D., Shadish, S., and Wong, V. A. (2008). Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons. Journal of Policy and Management. 27 (4), 724–750. Copas, J.B. and Li, H.G. 1997. Inference for Non-Random Samples. Journal of the Royal Statistical Society, Series B (Methodological),59 (1), 55-95. Croninger, Robert G. Jennifer King Rice, Amy Rathbun and Masako Nishio. 2007. Teacher qualifications and early learning: Effects of certification, degree, and experience on first- grade student achievement" Economics of Education Review, Vol 26 (3): pp 312-324. 13Cuban, L. 2001. Oversold and Underused: Computers in Schools 1980-2000. Cambridge, MA: Harvard University Press. 
Cuban, L. 1999 (August 4). The Technology Puzzle: Why Is Greater Access Not Translating Into Better Classroom Use? Education Week, pp. 68, 47. Daley, B. J. 1999. Novice to expert: An exploration of how professionals learn. Adult Education Quarterly, 49(4), 133-147. Darling-Hammond, L. 2000. Teacher quality and student achievement: A review of state policy evidence, Journal of Education Policy Analysis 8:1. 14Darling-Hammond, L. and M. McLaughlin. 1995. "Policies That Support Professional Development in an Era of Reform." Phi Delta Kappan. Darr, Eric, Linda Argote, and Dennis Epple. 1995. "The Acquisition, Transfer and Depreciation of Knowledge in Service Organizations: Productivity in Franchises." Management Science 41:1750-1762. DeCarolis, D. M. and Deeds, D. L. 1999. The impact of stocks and flows of organizational knowledge on firm performance: an empirical investigation of the biotechnology industry. Strategic Management Journal 20 953-968. 15Delpit, L. D. 1988. The silence dialogue: Power and pedogogy in educating other people's children. Harvard Educational Review, 58(3). Desimone, L. M., Porter, A. C., Garet, M. S., Yoon, K. S., and Birman, B. F. 2002. Effects of Professional Development on Teachers' Instruction: Results from a Three-year Longitudinal Study. Educational Evaluation and Policy Analysis, 24(2), 81-112. Diprete, T. and Gangl, M. 2004. Assessing Bias in the Estimation of Causal Effects: Rosenbaum Bounds on Matching Estimators and Instrumental Variables Estimation with Imperfect Instruments. Sociological Methodology 34. Dynarski, Mark; Agodini, Roberto; Heaviside, Sheila; Novka, Timothy; Carey, nancy; Larissa Campuzano; means, Barbara; Murhpy, Robert; Penuel, William; Javitz, Hal; Emery, Deborah; Sussex, Willow. 2007. Effectiveness of Reading and Mathematics Software Products: Findings from the First Student Cohort. Report to Congress. Institute for Educational Sciences. http://www.mathematica-mpr.com/publications/pdfs/effectread.pdf Elmore, R. F., Peterson, P. L., and McCarthey, S. J. 1996. Restructuring in the classroom: Teaching, learning, and school organization. San Francisco: Jossey-Bass. Ertmer, P. A., and Newby, T. J. 1996. The expert learner: Strategic, self-regulated, and reflective. Instructional Science, 24(1), 1-24. 16Eveland, J.D. and Tornatsky, L. 1990. The deployment of technology. In L. Tornatsky and M. Fleischer (Eds.), The processes of technological innovation (chap. 6). Lexington, MA: Lexington Books. Fishman, B., Penuel, W. R., and Yamaguchi, R. 2006. Fostering innovation implementation: Findings about supporting scale from GLOBE. In S. A. Barab, K. E. Hay and D. T. </p><p>44 Running head: Focus, Fiddle and Friends</p><p>Hickey (Eds.), Proceedings of the 7th International Conference of the Learning Sciences (pp. 168-174). Mahwah, NJ: Erlbaum. Frank, K. 2000. “Impact of a Confounding Variable on the Inference of a Regression Coefficient.” Sociological Methods and Research, 29(2), 147-194. Frank, K.A., Gary Sykes, Dorothea Anagnostopoulos, Marisa Cannata, Linda Chard, Ann Krause, Raven McCrory. 2008. Does NBPTS Certification Affect the Number of Colleagues a Teacher Helps with Instructional Matters? Education, Evaluation, and Policy Analysis. Vol 30(1): 3-30. Frank, Kenneth A., Krause, A., and Penuel, W. 2007. Knowledge Flow and Organizational Change. Invited presentation, University of Chicago Sociology and Northwestern University, Business. Frank, K. A. and Zhao, Y. 2005. 
“Subgroups as a Meso-Level Entity in the Social Organization of Schools.” Chapter 10, pages 279-318. Social Organization of Schools. Book honoring Charles Bidwell’s retirement, edited by Larry Hedges and Barbara Schneider. New York: Sage publications. Frank, K. A., Zhao, Y., and Borman. 2004. Social Capital and the Diffusion of Innovations within Organizations: Application to the Implementation of Computer Technology in Schools” Sociology of Education, 77: 148-171. Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., and Yoon, K. S. 2001. What Makes Professional Development Effective? Results from a National Sample of Teachers. American Educational Research Journal, 38(4), 915-945. Gess-Newsome, J., and Lederman , L. M. (Eds.). (1999). Examining pedagogical content knowledge. Boston, MA: Kluwer. Glidewell, J. C., Tucker, S., Todt, M., and Cox, S. (1983). Professional support systems: The teaching profession. In A. Nadler, J. D. Fisher and B. M. DePaulo (Eds.), New directions in helping: Applied perspectives on help-seeking and receiving (Vol. 3, pp. 189-212). New York: Academic Press. Goldhaber, D.D. and Brewer, D.J. 1997. Why don’t schools and teachers seem to matter? Assessing the impact of unobservables on educational productivity, The Journal of Human Resources 32 (1997) (3), pp. 505–523. Goldhaber, D.D. and Brewer, D.J.2000. Does teacher certification matter? High school teacher certification status and student achievement, Educational Evaluation and Policy Analysis 22 (2000) (2), pp. 129–146. Granovetter, Mark 1985. Economic action and social structure: The problem of embeddedness. American Journal of Sociology. 1985; 91(3):481-510. Grant, R. M. 2001. Knowledge And Organization. I. Nonaka and D. Teece, eds. Managing Industrial Knowledge: Creation, transfer, and utilization. London: Sage, 145-169. 17Greenfield, T. B. 1975. "Theory About Organization: A New Perspective and Its Implications for Schools." In Administrative Behavior in Education, edited by R. Campbell and R. Gregg. London: Athlone. Gupta, A. K. and Govindarajan, V. 2000. Knowledge management's social dimension: Lessons from Nucor Steel. Sloan Management Review Vol (42) 71. Hammond, L. D., and Sykes, G. (Eds.). 1999. Teaching as the Learning Profession: Handbook of Policy and Practice. San Francisco, CA: Jossey-Bass. Hansen, Morten T. 1999. "The Search-Transfer Problem: The Role of Weak Ties in Sharing Knowledge Across Organization Subunits." Administrative Science Quarterly 44:82-111.</p><p>45 Running head: Focus, Fiddle and Friends</p><p>Hanushek, Eric A. 1981. "Throwing Money at Schools". Journal of Policy Analysis and Management, 1,Journal of Policy Analysis and Management, 1,19-41. Hanushek, E. A. 1989. The impact of differential expenditures on school performance. Educational Researcher, 18(4), 45-65. Hanushek, E. A., 1996. School resources and student performance. In G. Burtless (Ed.), Does money matter. Washington, DC: Brookings Institution. Hargreaves, A. (1991). Contrived collegiality: The micropolitics of teacher collaboration. In J. Blase (Ed.), The politics of life in schools: Power, conflict, and cooperation (pp. 46-72). Newbury Park, CA: Sage. 18Heckman, James J. 1978. “Dummy Endogenous Variables in a Simultaneous Equations System” Econometrica, 47: 153-161. Heckman, J., Hotz, J., 1989. Choosing among alternative nonexperimental methods for estimating the impact of social programs: the case of manpower training. Journal of American Statistical Association 84 (408), 862–874. 
Hill, H.C., Rowan, B., and Ball, D.L. 2005. Effects of teachers' mathematical knowledge for teaching on student achievement. American Educational Research Journal 42, 371-406. Hodas, S. 1993. Technology refusal and the organizational culture of schools. Educational Policy Analysis Archives, 1(10). 19Holland, P. W. 1986. Statistics and causal inference. Journal of the American Statistical Association 81, 945-70. Honig, M.I. (Ed.). 2006. New directions in education policy implementation: Confronting complexity. Albany, NY: The State University of New York Press. 20Ibarra, Herminia. 1993. "Network Centrality, Power, and Innovation Involvement." Academy of Management Journal 36(3):471-501. Jackson, C. Kirabo, and Elias Bruegmann. 2009. "Teaching Students and Teaching Each Other: The Importance of Peer Learning for Teachers." American Economic Journal: Applied Economics 1(4): 85–108. 21Kanter, R. M. 1983. The Change Masters. New York: Simon and Schuster. Kogut, B. and Zander, U. 1993. "Knowledge Of The Firm And The Evolutionary Theory Of The Mul", Journal of International Business Studies, Vol. 24, No. 4; pp. 625 – 645 von Krogh, G., Ichijo, K., and Nonaka, I. 2000. Enabling Knowledge Creation: How to Unlock the Mystery of Tacit Knowledge and Release the Power of Innovation. Oxford University Press US. Lave, J and Wenger, E. 1991. Situated Learning. Legitimate peripheral participation. Cambridge: University of Cambridge Press Lin, N. 1999. Sunbelt Keynote Address. Connections, 22(1), 28–51. Lin, N. 2001. Social capital: A theory of social structure and action. New York: Cambridge University Press. Little, J. W. 1988. Seductive Images and Organizational Realities in Professional Development. In A. Lieberman (Ed.), Rethinking School Improvement. New York: Teachers College Press. Little, J. W. (1990). The persistence of privacy: Autonomy and initiative in teachers' professional relations. Teachers College Record, 91, 129-151. Little, J. W. (1993). Teachers' professional development in a climate of educational reform. Educational Evaluation and Policy Analysis, 15(2), 129-151. Little, J. W. (2002). Locating learning in teachers' communities of practice: Opening up problems of </p><p>46 Running head: Focus, Fiddle and Friends</p><p> analysis in records of everyday work. Teaching and Teacher Education, 18(8), 917-946. 22Lortie, D.C. 1975. Schoolteacher: A Sociological Study. Chicago: University of Chicago Press. Loucks-Horsley, S., Hewson, P. W., Love, N., and Stiles, K. E. (1998). Designing professional development for teachers of science and mathematics. Thousand Oaks, CA: Corwin Press. 23Loveless, T. 1996. Why aren't computers used more in schools? Educational Policy, 10(4), 448-467. 24Manski, C. 1995. Identification Problems in the Social Sciences. Cambridge, Ma: Harvard University Press. 25Marsden, P., and Friedkin, N. E. 1994. Network studies of social influence. Sociological Methods and Research, 22(1), 127-151. Maurer, I. and Ebers, M. 2006. Dynamics of Social Capital and Their Performance Implications: Lessons from Biotechnology Start-ups. Administrative Science Quarterly, vol. 51: 262-292. McLaughlin, M.W. and J.E. Talbert, Professional Communities: and the Work of High School Teaching. Chicago: University of Chicago Press, 2001. 26Meyer, J. and B. Rowan. 1977. "Institutionalized Organizations: Formal Structure as Myth and Ceremony." American Journal of Sociology 30:431-50. Munby, H., Russell, T., and Martin, A. K. 2001. Teachers Knowledge and How It Develops. In V. 
Richardson (Ed.), Handbook of Research on Teaching (4th edition ed., pp. 877-904). Washington DC: American Educational Research Association. Nonaka, I. 1994. “A Dynamic Theory of Organizational Knowledge Creation.” Organization Science. Vol 5(1): 14-37. O'Donnell, C. L. (2008). Defining, conceptualizing, and measuring fidelity of implementation and its relationship to outcomes in K-12 curriculum intervention research. Review of Educational Research, 78(1), 33-84. Osterloh, M. and Frey, B. S. 2000. Motivation, knowledge transfer, and organizational forms. Organization Science 2000 538-551. Penuel, William R. and Andrew L. Cohen. 2003. "Coming to the Crossroads of Knowledge, Learning, and Technology: Integrating Knowledge Management and Workplace Learning." Pp. 57-76 in Sharing Expertise: Beyond Knowledge management, edited by Mark Ackerman, V. Pipek, and V. Wulf. Cambridge, MA: MIT Press. Penuel, W. R., Fishman, B. J., Yamaguchi, R., and Gallagher, L. P. (2007). What makes professional development effective? Strategies that foster curriculum implementation. American Educational Research Journal, 44(4), 921-958. Penuel, W. R., and Means, B. (2004). Implementation variation and fidelity in an inquiry science program: An analysis of GLOBE data reporting patterns. Journal of Research in Science Teaching, 41(3), 294-315.</p><p>Penuel, W. R., Shear, L., Korbak, C., and Sparrow, E. 2005. The roles of regional partners in supporting an international Earth science education program. Science Education, 89(6), 956-979. Penuel, W. R., Fishman, B. J., Yamaguchi, R., and Gallagher, L.P. 2007. What makes professional development effective? Strategies that foster curriculum implementation. American Educational Research Journal, 44(4), 921–958.</p><p>47 Running head: Focus, Fiddle and Friends</p><p>Penuel, W., Riel, M., Krause, A., and Frank, K. A. 2009. Analyzing Teachers' Professional Interactions in a School as Social Capital: A Social Network Approach. Teachers College Record Vol 111 Number 1. 27Perrow, C. 1986. Complex Organizations: A Critical Essay. New York: Random House. Polanyi, Michael. 1966. "The Tacit Dimension". First published Doubleday and Co, 1966. Reprinted Peter Smith, Gloucester, Mass, 1983. Chapter 1: "Tacit Knowing". Portes, A. 1998. Social Capital: Its origins and applications in modern sociology. Annual Review of Sociology 24, 1–24. 28President's committee of advisors on science and technology (Panel on Educational Technology). 1997. Report to the President on the Use of Technology to Strengthen K-12 Education in the United States. Washington, DC: President's committee of advisors on science and technology. Reiser, B. J., Spillane, J. P., Steinmuler, F., Sorsa, D., Carney, K., and Kyza, E. 2000. Investigating the mutual adaptation process in teachers' design of technology-infused curricula. In B. Fishman and S. O’Connor-Divelbiss (Eds.), Fourth International Conference of the Learning Sciences (pp. 342–349). Mahwah, NJ: Erlbaum. Richardson, V. 2003. The Dilemmas of Professional Development. Phi Delta Kappan, 84(5), 401-406. Richardson, V., and Placier, P. 2001. Teacher Change. In V. Richardson (Ed.), Handbook of Research on Teaching (4th edition ed., pp. 905-947). Washington, DC: American Educational Research Association. Robins, J., Rotnisky, A., and Scharfstein, D. 2000. Sensitivity analysis for selection bias and unmeasured confounding in missing data and causal inference models. In Halloran, E. and Berry, D. (Eds.). Title (pp. 1-95). 29Rogers, E. 1995. Diffusion of Innovations. 
New York: The Free press. Romer, P. 1990. Endogenous technological change. Journal of Political Economy, 98(5), S71- S102. Rosenbaum, P. R. and Rubin, D. B. 1983. The central role of the propensity score in observational studies for causal effects. Biometrika, 70(1), 41-55. Rothstein, J. 2009. "Student Sorting and Bias in Value Added Estimation: Selection on Observables and Unobservables." Education Finance and Policy 4(4): 537-571. 30Rowan, B. 1995. "The Organizational Design of Schools." Pp. 11-42 in Images of Schools, edited by S.B. Bacharach and B. Mundell. Thousand Oaks, CA: Corwin Press, Inc. Rowan, Brian and Robert J. Miller. 2007. "Organizational Strategies for Promoting Instructional Change: Implementation Dynamics in Schools Working with Comprehensive School Reform Providers." American Educational Research Journal 44:252-297. Sandholtz, J. H., Ringstaff, C., and Dwyer, D. C. 1997. Teaching with Technology: Creating Student-Centered Classrooms. New York: Teachers College Press. Scharfstein, D. A. and Irizarry, R.A.\2003. Generalized additive selection models for the analysis of studies with potentially non-ignorable missing data. Biometrics. 59(3): 601-613. Schmidt, W. H., et. al. 2001. Why Schools Matter: Using TIMSS to investigate curriculum and learning. Jossey-Bass. Shadish, W. R., Clark, M. H., and Steiner, P. M. (2008). Can nonrandomized experiments yield accurate answers? A randomized experiment comparing random to nonrandom assignment. Journal of the American Statistical Association, 103(484), 1334-1344. Shadish, W. R., Cook, T. D., and Campbell, D. T. 2002. Experimental and Quasi-Experimental </p><p>48 Running head: Focus, Fiddle and Friends</p><p>Designs for Generalized Causal Inference. Boston, NY: Houghton Mifflin. Skrla, L. and Goddard, R.D. Under re-review. A mixed methods investigation of the sources and effects of collective efficacy in a large urban school district: Implications for equity. American Journal of Education. Smith, T. M., Desimone, L. M., Zeidner, T. L., Dunn, A. C., Bhatt, M., and Rumyantseva, N. L. (2007). Inquiry-oriented instruction in science: Who teaches that way? Educational Evaluation and Policy Analysis, 29(3), 169-199. Spender, J. 1996. Making Knowledge the Basis of a Dynamic Theory of the Firm. Strategic Management Journal, Special Issue: Knowledge and the Firm 17 45-62. Squire, K. D., MaKinster, J. G., Barnett, M., Luehmann, A. L., and S.L. Barab, S. L 2003. Designed curriculum and local culture: Acknowledging the primacy of classroom culture. Science Education 87(4), 468–489. Supovitz, J. A., and Turner, H. M. 2000. The effects of professional development on science teaching practices and classroom culture. Journal of Research in Science Teaching, 37(2), 963-980. Szulanski, Gabriel 1996. "Exploring Internal Stickiness: Impediments to the Transfer of Best Practice within the Firm." Strategic Management Journal 17:17-43. Thompson, J. D. 1967. Organizations in Action. New York: McGraw-Hill.</p><p>31Tornatzky, Louis G. and Fleischer, Mitchell 1990. The Process of Technological Innovation. Lexington Books. Tyack, D., and Cuban, L. 1995. Tinkering toward utopia. Cambridge, MA: HarvardUniversity Press. Pp. 26-39 32US Congress Office of Technology Assessment. 1995. Teachers and Technology: Making the Connection (OTA-EHR-616). Washington DC: Office of Technology Assessment. Uzzi, B. 1999. "Embeddedness in the Making of Financial Capital: How Social Relations and Networks Benefit Firms Seeking Financing." 
American Sociological Review 64 (August):481_505. Williamson, O. E. 1981. "The Economics of Organization: The Transaction Cost Approach." American Journal of Sociology 87(3). Wilson, S. M., and Berne, J. (1999). Teacher learning and the acquisition of professional knowledge: An examination of research on contemporary professional development. In A. Iran-Nejad and P. D. Pearson (Eds.), Review of research in education (pp. 173-209). Washington, DC: American Educational Research Association. Windschitl, M., and Sahl, K. (2002). Tracing teachers' use of technology in a laptop computer school: The interplay of teacher beliefs, social dynamics, and institutional culture. American Educational Research Journal, 39(1), 165-205. 33Woodward, J. 1965. Industrial Organization: Theory and Practice. New York: Oxford University Press. Zhao, Y., Pugh, K., Sheldon, S., and Byers, J. 2002. Conditions for Classroom Technology Innovations. Teachers College Record, 104(3). Zhao, Y., and Frank, K. A. 2003. Factors Affecting Technology Uses in Schools: An Ecological Perspective. American Educational Research Journal, 40(4), 807-840.</p><p>49 Running head: Focus, Fiddle and Friends</p><p>Zhao, Yong, Frank, Kenneth A., and Ellefson, N. C. 2006. Fostering Meaningful Teaching and learning with technology: Characteristics of Effective professional Development Efforts. In Floden R. and Ashburn, E. (Eds.). Leadership for Meaningful Learning with Technology. Teachers College Press. 34Zmud, R.W.; Apple, L.E. 1992. Measuring Technology Incorporation/Infusion. J. Prod. Innov. Management, 9: 148-155.</p><p>50 Running head: Focus, Fiddle and Friends</p><p>Table 1 Descriptive Statistics of Uses of Computers and Predictors for Teachers at Low, Intermediate and High Levels of Implementation at Time 1 Level of Implementation at time 1 Low Intermediate High n Mean Std n Mean Std n Mean Std</p><p>Variable Uses of computers for core 176 30.6 19.4 141 101 33.6 160 318.6 119.1 teaching tasks at time 1 Change in uses of computers 176 33.4 73.8 141 14.4 93.1 160 -73.3 167.6 for core tasks for teaching between time 1 and time 2 Hours of professional 176 5.8 3.6 136 7.4 4.1 157 8.6 4.4 development focused on student learning (between time 1 and time 2) Focus Days per year exploring or 175 15.8 25.8 139 19.5 24.4 160 49.7 60.8 experimenting with technology (between time 1 and time 2) Fiddle Log of access to knowledge of 176 2.1 1.4 141 2.7 1.2 160 2.7 1.4 technology implementation through talk and help (between time 1 and time 2) Friends Perception of extent to which 174 4.5 1 140 4.8 .76 159 5.1 .90 computers can help with core tasks (time 1) Days per year sought help 174 11.8 28.0 136 11.33 13.2 158 18.7 32.7 from others to learn about new technologies Motivated to attend 176 .34 .48 141 .54 .50 160 .61 .49 professional development for technology because of excitement to apply technology in the classroom</p><p>51 Running head: Focus, Fiddle and Friends</p><p>Table 2</p><p>Regression of Change in Uses of Computers between Time 1 and Time 2 for Teachers at Low,</p><p>Intermediate and High Levels of Implementation at Time 1a</p><p>Level of Implementation at Time 1 Low Intermediate High</p><p>Variable (n=176) (n=141) (n=160) Hours of professional development focused on 4.43** .32 3.39 student learning (between time 1 and time 2) (1.64) (2.36) (3.55) Focus [.22] [.01] [.09] Days per year exploring or experimenting with -.19 .84* .70** technology (between time 1 and time 2) (.21) (.34) (.23) Fiddleb [-.07] [.22] [.25] Log of access to 
knowledge of technology -2.53 7.16 32.86** implementation through talk and help (between (4.06) (6.82) (10.37) time 1 and time 2) [-.05] [.09] [.27] Friendsc Covariates Perception of extent to which computers can help 11.46* -9.50 -9.49 with core tasks (time 1) (5.65) (11.16) (15.13) Days per year sought help from others to learn .64*** .69 -.46 about new technologies (.20) (.64) (.44) Motivated to attend professional development for -18.81 19.28 53.13d technology because of excitement to apply (11.60) (17.68) (31.24) technology in the classroom R2 .24 .25 .23 Standard errors in parentheses, standardized coefficients in square brackets. a All estimates based on ordinary least squares regression with multiple imputation for missing data. All inferences based on statistical significance confirmed with a Poisson model with over dispersion. b Estimate is statistically smaller for low group than others (p < .10). c Estimate is statistically larger for high group than others (p < .001). d p < .10; * p < .05; ** p < .01; *** p < .001; </p><p>52</p>
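<p>As a rough sketch of the analysis summarized in Table 2 (ours, not the authors’ code: the column names and input file are hypothetical, and the multiple imputation for missing data and the confirmatory Poisson model noted under the table are omitted), the change in use of computers is regressed on the three knowledge-source measures and the covariates separately within each initial level of implementation:</p><p>
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical teacher-level file with one row per teacher.
df = pd.read_csv("teacher_survey.csv")
df["change_use"] = df["use_t2"] - df["use_t1"]   # first difference in uses of computers

formula = ("change_use ~ focus + fiddle + friends_log"
           " + value_core + seek_help + motivated")

for level in ["low", "intermediate", "high"]:    # initial level of implementation at time 1
    fit = smf.ols(formula, data=df[df["group"] == level]).fit()
    print(level, fit.params.round(2), round(fit.rsquared, 2))
</p><p>Estimating the model separately within groups mirrors the paper’s strategy of allowing the value of each knowledge source to depend on a teacher’s initial level of implementation.</p>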