
Evaluation Across Contexts: Evaluating the Impact of Technology Integration Professional Development Partnerships

Louanne Smolin, National Louis University
Kimberly A. Lawless, University of Illinois at Chicago

Abstract

Professional development is a necessary component for effectively integrating technology into classrooms. Unfortunately, the evaluation of technology integration professional development (TIPD) rarely moves beyond participation satisfaction surveys, nor does it reflect the concerns of the multiple stakeholders participating in technology integration efforts. In this article, the authors discuss collaborative models that hold potential for evaluating TIPD partnerships. They advocate for TIPD partners to define a collaborative and holistic vision of success that can guide the evaluation process. The authors discuss three specific collaborative evaluation models, examine key issues associated with implementing them, and analyze how each model has the potential to strengthen and sustain professional development partnerships. (Keywords: Technology integration, professional development, program evaluation)

In his informative work on program evaluation, Patton maintains "that social science has proven especially inept at offering solutions for the great problems of our time…. There is a pressing need to make headway with these large challenges and push the boundaries of social innovation to make real progress" (Patton, 2006, p. 28). One arena where there is great potential for pushing the boundaries of social innovation forward is integrating technology in schools. Students are already well versed and facile with using technology to shape their worlds outside of school (Jones, Johnson-Yale, Perez, & Schuler, 2007). The potential for technology to impact students in school should be realized as well. Teachers can use technology to transform the teaching and learning context in a way that will position their students for future opportunities in the global context, preparing them for the flattened world that technology has helped to make possible. Through technology, teachers and students can soften the boundaries between life in schools and in communities as well as between their present and future lives. Technology has the potential to expand learning in ways that a traditional curriculum cannot.

Yet the evaluation of technology integration, including professional development for technology integration, has done little to define what constitutes effective practices for realizing such potential. So although the images of our classrooms have significantly changed due to the ubiquity of technology, and many teachers are incorporating technology in their learning environments, these changes have done little to truly reform teaching and learning. In essence, the "more things change, the more they remain the same" (Sarason, 1996, p. 338).

This article explores the possibilities for collaborative evaluation of technology integration professional development (TIPD) to transform technology practices in schools. To achieve this transformation, we need to shift and expand the research and evaluation processes we use to learn about how TIPD impacts classroom practices and student learning.

Expanding Current Evaluation Models of Technology Integration Professional Development (TIPD)

In their provocative review of research, "Professional Development in Integrating Technology into Teaching and Learning," Lawless and Pellegrino (2007) advocate that the evaluation of TIPD can facilitate changes to teaching and learning in schools. They maintain that although there is a strong perceived need for action in terms of TIPD, the knowledge base derived through research does not guide it. In particular, they advocate for more careful and more systematic approaches for documenting how technology integration occurs within schools, what increases its adoption by teachers, and the long-term impacts that these investments have on teachers and students. Based on this, they propose a sequential three-phase evaluation design that includes evaluation of (1) program characteristics, (2) teacher outcomes, and (3) sustained teacher change and student achievement effects. They maintain that these phases should be sequential. They contend that more needs to be known about the varieties of program structures, such as mentoring or train-the-trainers models, before student and teacher outcomes can be evaluated. With this sequence, causal links between specific program variables and outcomes can be substantiated. Within each of these phases, they raise key questions that should underscore the design of evaluation and possible outcomes of such an evaluation.

Desimone (2009) proposes a similar, outcomes-based professional development evaluation model. She states that the goal when evaluating professional development is to move beyond participant satisfaction surveys and to systematically measure its impact on teachers and students in a coherent way. However, rather than a sequential model, she advocates for a connected model in which the evaluation of program characteristics, teacher practice, and student outcomes are recursive. Such analysis can help

92 | Journal of Digital Learning in Teacher Education | Volume 27 Number 3

Copyright © 2011, ISTE (International Society for Technology in Education), 800.336.5191 (U.S. & Canada) or 541.302.3777 (Int'l), [email protected], iste.org. All rights reserved.

evaluators examine relationships among what is taught in professional development, what occurs in classrooms, and how it affects students.

Both models underscore the necessity of analyzing the impact of professional development program activities on both teacher learning and student achievement. They are noteworthy in their attempts to create connected evaluations that link program inputs to teacher and student outcomes. Yet they are incomplete. Both models leave out the importance of the number of stakeholders involved in technology integration, as well as the impact of the stakeholders' collaborative relationships on the meaningful integration of technology in today's schools. They focus on what is to be evaluated rather than how. We must build on these models, incorporating aspects of partnership, to deepen and sustain the impact of TIPD. In our evaluations, we must take into account how to foster long-term partnerships where all partners benefit.

A myriad of stakeholders is involved in the integration of technology in schools, including funders, teachers, school administrators, ICT personnel, higher education institutions, parents, and community members. Working together, these collaborators can garner sustained funding. They can engage in joint planning that benefits a continuum of learners from preservice teachers and inservice teachers to K–12 students. Through collaborative exchanges, university-based content area experts and classroom-based teacher practitioners can maintain a focus on integrated academic content rather than decontextualized technology skill building. Therefore, in a professional development partnership, a balance of learning occurs. As such, teachers have the benefit of working closely with content area experts, and university content area experts have the benefit of learning about pedagogical practice situated in local contexts. In this way, professional development approaches can be tailored and individualized to participants' needs. Teachers have greater potential to apply what they are learning in professional development programs to their specific classrooms, and teacher educators can tailor their methods courses to better prepare teacher candidates for classroom-based technology integration.

Yet often the evaluation of TIPD is not a collaborative endeavor. Of the stakeholders described above, we have been concerned with three primary groups: funders, institutions of higher education, and teachers. From their vantage points, each of these stakeholders has defined their own vision for what constitutes effective TIPD, which lacks a holistic view.

For example, funders often deem TIPD successful if it affects student learning. Technology integration is a costly, ongoing expense. External sources of funding are necessary to maintain not only meaningful technology integration, but also functional technology-infused learning environments. Funders have a stake in knowing whether and how their support affects teacher and student learning. Funders want to know whether the benefits of their technology funding outweigh the costs (Kleiman, 2004; Lemke & Coughlin, 1998). Yet often the outcomes of collaborative teacher professional development are difficult to measure because they do not tend to provide tangible and immediate evidence (Lieberman & Grolnick, 1996). Therefore, they might not reflect what funders are interested in knowing. Specifically, a funder's vision of successful TIPD is one that quantifies the difference that their funds have made with respect to student learning, and without tangible evidence this could be difficult to establish.

Institutions of higher education play two important roles: They have the potential to create new frontiers for transforming teaching and learning environments with the use of technology, and they provide a pipeline of technology-proficient inservice and preservice teachers who can implement these transformations and become leader advocates for technology integration across university and school contexts (Strudler & Weitzel, 1999). Their vision of successful TIPD is the extent to which participants use the transformative technology-infused practices they teach, the quality of the participants' technology-integration practices, and the resulting student outcomes.

Teachers are "frontline" stakeholders. Through their roles, they can adjust and fine-tune what they have learned in professional development sessions to their unique contexts and students. For them, success is inextricably bound to their teaching and classroom culture. Teachers deem professional development successful if it deepens their teaching of a particular concept, helps them create instructional conditions conducive to student engagement, and fosters student learning of content (Mumtaz, 2000). So it is through their direct work with students that they can incorporate what they have learned within their teaching practice and implement transformative technology practices within their classrooms. Ironically, teachers' roles in evaluation are typically as limited, passive respondents.

These individualistically derived notions of success have resulted in short-term changes and have hampered long-term improvements in teaching and learning. Teachers have difficulty sustaining the transformative practices they learn in professional development without ongoing support and mentorship. As such, their potential for affecting their students' learning, as well as their mentorship of new teachers, is difficult to achieve. Higher education partners lose an important laboratory of innovation as well as placements for their students. When success cannot be sustained long-term, funders are hesitant to continue their support. As a result, teaching and learning may revert back to the status quo.

Lack of collaboration also impacts the evaluation of TIPD. The traditional program evaluation scenario is one in which an external "expert" evaluator carefully collects information about a program or some aspect of a program to make necessary decisions about the program. This approach does not highlight the relationships evaluators must develop to collect data and information, nor does it detail who should be making decisions related to the evaluation,



who should have access to the results, and how they should implement recommendations. Absent these considerations, a danger exists that evaluation outcomes will be irrelevant or impractical for all stakeholders.

We advocate collaborative and holistic visions of success rather than separate visions characteristic of traditional approaches to evaluation. Through collaboration, stakeholders can arrive at a shared vision of success through evaluation. In so doing, stakeholders can change and enlarge the lenses through which they evaluate success.

Collaborative evaluation involves all stakeholders asking common questions such as "What is it that teachers need to have to successfully integrate technology?" "What do higher education partners need for success?" and "What demonstrative proof do funders need?" Stakeholders must also explore what evaluation processes foster collaboration, how information should be gathered, and how it should be shared. Evaluations grounded in shared questioning ensure that all stakeholders' needs are met and that the definitions of success are mutually valued and collaboratively derived.

When evaluation becomes a collaborative endeavor, boundaries between the various stakeholders can be crossed. For example, technology integration is a field-based process, and the contextual realities surrounding technology-integration efforts may be unique from participant to participant. Evaluation of TIPD can uncover the contextual realities surrounding a program as it is being implemented in the field. Stakeholders can then arrive at a shared understanding of what is going on in the field and draw upon the field when designing instruments, analyzing results, and offering recommendations that are feasible to integrate in practice.

Collaborative Approaches that Can Inform TIPD Evaluation

How can collaborative evaluation be achieved when it is "more often messy, not orderly, emergent, not controlled" (Patton, 2006, p. 33)? TIPD evaluation must provide feedback and information that is useful for all stakeholders in their roles. The evaluation should be nested in the contextual realities in which the technology is integrated. It should take into account the dynamic nature of technology development. To accommodate such demands, many researchers are advocating for collaborative evaluation approaches that are participant oriented. For such approaches, the evaluation methods are "iterative and characterized by a process of experimentation, learning, and adaptation" (Patton, 2006, p. 33). Collaborative evaluation should build long-term relationships rather than short-term encounters.

As stakeholders enlarge their evaluation lenses through collaborative evaluation, there will be an impact on professional development. Through collaboration, we can move beyond "inoculation" or "dump truck" models of professional development and focus our efforts on enhancing the success of ongoing long-term professional development strategies such as mentoring and situated modeling. In this way we will be providing support and feedback to teachers as they attain new knowledge, develop resiliency in the face of challenges, and apply what they have learned on a long-term basis.

Collaborative approaches to program evaluation emphasize emergent and participant-oriented designs. These designs are ideally suited for TIPD evaluations because they are more closely aligned with the unique characteristics of successful technology integration, including collaboration between multiple professionals; the importance that contexts play in successful technology integration; and the ever changing nature of technology equipment and resources. This paper will discuss three closely related methodological approaches (summarized in Table 1): developmental evaluation, responsive evaluation, and layered research.

Table 1. Collaborative Evaluation Models

Developmental Evaluation
  Focus of Evaluation: Continuous program growth and progress
  Key Processes: Dialogic, facilitating emergent evaluation designs that nurture program growth

Responsive Evaluation
  Focus of Evaluation: Providing assistance to educators/participants that is situated in their classroom contexts
  Key Processes: Collaborative, gaining the multiple perspectives of all stakeholders concerning data collection strategies and interpretations of findings

Layered Research
  Focus of Evaluation: Expanding the knowledge base concerning effective classroom practice based on participants' inquiries
  Key Processes: Action research/teacher inquiry, including participant-driven documentation

Developmental Evaluation

Developmental program evaluations are designed to support program growth, continuous progress, and change. They potentially provide "rapid response to complex dynamic situations" (Patton, 2006, p. 33) and are therefore "ideally suited for ongoing adaptation and rapid responsiveness" (Patton, 2006, p. 31) that leads to a reasoned program evolution. The goals of developmental evaluations are not generalization or exact replication of a program model, but to "discover and articulate principles of intervention and development" (Patton, 2006, p. 31) that program developers can apply to their respective programs. There are methodological and interactional differences with traditional models of evaluation. For example, in a traditional evaluation, goals and outcomes are routinely identified prior to implementing the evaluation. In a developmental evaluation, rather than being preset, goals and outcomes emerge through the learning process. This emergent nature affects the relationships between program evaluators and implementers. In developmental evaluation, the evaluator is often an integral member of the program design team, not only varying the evaluation methodology with the problem to be solved, but helping those involved in program design and implementation to "think evaluatively." As such, developmental evaluation is tailored to the needs of all partners and is "designed to be congruent with and nurture developmental, emergent, innovative, and transformative processes" among all those involved (Patton, 2006, p. 28).

Dialogic processes facilitate the collaborative, emergent nature of developmental evaluations. Through dialog, evaluation participants make adjustments to evaluation design based on what is possible and what is desirable for the program being evaluated. This necessitates an "open-ended approach to data gathering, where the questions and concerns are emergent, and where trial and error is carefully mined for learning" (Patton, 2006, p. 33).

Table 2. Collaborative Evaluation Affects Professional Development and Partnership Relationships

Developmental Evaluation
  How Model Influences Recursive Professional Development: Because evaluation is emergent and dialogic, new processes and topics for professional development can be implemented during the course of professional development rather than for future, new programs with new participants.
  How Model Contributes to Sustained, Long-Term Relationships: Partners become co-designers. Ownership of and commitment to the professional development process can be sustained. Relationships can nurture new professional development opportunities within the partnership.

Responsive Evaluation
  How Model Influences Recursive Professional Development: As evaluation focuses on the settings where learning occurs (i.e., both professional development and classroom contexts), evaluations can be accomplished across boundaries and focus on how teachers learn, regardless of the setting. There is greater potential for professional development to emphasize how participants learn.
  How Model Contributes to Sustained, Long-Term Relationships: Partners become co-learners, serving as resources to one another both within and outside of the professional development context.

Layered Research
  How Model Influences Recursive Professional Development: Participants and evaluators develop evaluation inquiries and data collection tools. Because participants' inquiries inform the evaluation, professional development content can become authentic to participants' work.
  How Model Contributes to Sustained, Long-Term Relationships: Partners become co-researchers, focusing their relationship on developing new knowledge that can inform both classroom practice and the field as a whole.

Responsive Evaluation

Whereas developmental evaluations emphasize emergent design of evaluation to support the evolution of a program, responsive evaluation approaches (Stake, 1973) emphasize collaboration. Responsive evaluation reflects a "participatory democracy among stakeholders in evaluation settings" (Schubert, 2008, p. 403) and focuses on "the settings where learning occurs, teaching transactions, judgment data, holistic reporting, and giving assistance to educators" (Stake, 1973, p. 3). Like developmental evaluation, responsive evaluation does not begin with a preordinate approach. Rather, "it is evaluation based on what people do naturally to evaluate things: they observe and react" (Stake, 1975, p. 4). An educational evaluation is a responsive evaluation if: (a) it orients more directly to program activities than to program intents, (b) it responds to audience requirements for information, and (c) the different value-perspectives of the people at hand are referred to in reporting the success and failure of the program. In these three separate ways, an evaluation plan can be responsive.

Due to the responsive nature, data collection methods are naturalistic and built on the perspectives of those involved. As such, ongoing observations, interviews, and document analysis are important components of responsive evaluations, as are the interpretations and meanings that collaborators make of the various forms of data collected.

Layered Research

Another evolution of collaborative evaluation, layered research (Burnaford, 2006), explores how teacher participation in research and evaluation informs the instructional decisions they make in their classrooms. As such, the evaluation and classroom contexts of a program are inextricably bound. In layered research, as teachers engage in program implementation, they are "also engaged in documenting and investigating their work with respect to student learning and their own professional development" (Burnaford, 2006, p. 2). In traditional evaluation, the "agenda for evaluation is typically set by the stakeholders, which includes funders, board members, and consumers. The agenda shifts when participants in the implementation contribute as data gatherers and analysts to that agenda" (Burnaford, 2006, p. 3). This creates a milieu in which practitioners inform the knowledge base about what constitutes an effective classroom practice "while still requiring the evaluative procedures that provide them with direction for future planning" of professional development activities (Burnaford, 2006, p. 3). As such, an evaluation can be situated in classroom contexts and therefore has great potential to inform professional development programs that affect teacher and student learning. The core procedures for such an evaluation are documentation and action research (Burnaford, 2006). Through layered research, teachers frame their own research questions and engage in action research, documenting evidence related to their inquiries as a part of the action research process. Their action research inquiries are incorporated into the overall program evaluation. As such, teachers are "involved in continuous professional development" (Schubert, 2008, p. 403).

These three collaborative approaches shift the processes used for program evaluation: from identifying inputs and measuring outcomes toward a contextually bound approach that is inclusive of the multiple perspectives of all stakeholders. These approaches are therefore ideally suited for TIPD. They expand the number of stakeholders involved in the evaluation and value dialogic processes, so they foster multiple definitions of success. The core processes for data collection, synthesis, and analysis are naturalistic and emergent and can therefore be derived from the unique dynamic contexts in which technology is integrated. Finally, the evaluation methodologies are designed to make teaching and learning visible in both professional development and classroom contexts. Therefore, the outcomes of professional development can be relevant to all involved stakeholders.



Each of these models of collaborative evaluation fosters unique opportunities for the growth of professional development as well as benefits to the partnership relationships. Table 2 summarizes these aspects.

Each of the collaborative evaluation models affirms qualities advocated by the National Staff Development Council (NSDC). Each model seeks information and feedback from multiple audiences and incorporates multiple sources of evidence so that findings can be relevant for all participants. Findings are made available during the course of professional development so participants know the impact of their professional development experiences and act on those findings rather than waiting for the results of yearly testing. Finally, the NSDC maintains that audiences should have the "prerequisite knowledge and skills to interpret and use the information" (National Staff Development Council, n.d.).

TIPD Evaluation Field Example

How might these approaches look when grounded in a real evaluation? To explore this question, we draw from our own experiences developing, implementing, and evaluating a TIPD partnership and share our lessons learned. Through a Preparing Tomorrow's Teachers to Use Technology (PT3) grant program between a large urban university and a large urban public school system, we implemented integrated professional development and curriculum development with teachers in the public school system and teacher educators in the university setting. Our program consisted of a multi-week institute, ongoing professional development throughout the academic year, and a Web infrastructure that supported teachers and teacher educators as they created Web-based classroom curriculum materials. As program implementers, our evaluation goals were to:

• Provide feedback to program developers and implementers that could be used to modify programs based on participants' needs
• Assess change in participants' knowledge, attitudes, and behaviors related to technology integration
• Document impact on teacher practice

Our evaluation model included pre/post surveys, observations, expert reviews, and artifact analysis. Professional development participants completed surveys prior to and following professional development summer institutes. Through their responses, we learned that teachers increased their knowledge of classroom-based technology integration practices, their attitudes toward the benefits of technology integration, and their (self-reported) behaviors toward integrating technology in their classrooms. Through this documentation, we learned that a given professional development activity was successful because it increased learners' knowledge of the content, "heightened the value or importance they placed on integrating these new techniques, and increased the number of classroom-related behaviors related to these new practices" (Cunningham et al., p. 157). The results also provided feedback that was useful for adapting professional development sessions based on our program goals.

Experts evaluated the materials that teams of teacher educators and classroom teachers created. This document analysis ensured that the modules created could be applicable and educationally relevant in both teacher education classes and K–12 classrooms. In this way, document analysis served to secure connections between professional development and classroom contexts.

Observations also helped situate the professional development sessions within classroom environments. To evaluate the quality of the lessons created during professional development, the researchers developed and implemented an observation protocol in participants' classrooms. Using the protocol, we evaluated lessons along a four-level continuum, from non-technology use to technology innovation. Results of these observations indicated that teachers were creating lessons that employed more sophisticated uses of technology, a goal of the professional development program.

Through our evaluation methods, we learned that our professional development program increased teachers' knowledge, attitudes, and behaviors toward technology integration. The quality of the lessons and materials used in both teacher education and K–12 classrooms also improved. These outcomes affected approximately 1,500 students, 300 teachers, and 35 teacher educators.

Even though our professional development program was a collaborative that brought together stakeholders for professional development and curriculum design, the evaluation approaches discussed above were traditional and expert oriented. We did not evaluate the partnerships themselves, nor did we engage in collaborative evaluation processes. We never evaluated the collaboration between school administrators, teachers, and teacher educators. We did not evaluate the relationships between each of these stakeholders and us as project implementers. Therefore, we do not know what or how each of these groups learned from each other. Although the relationships between school and university partners were supportive, we often had difficulty integrating our various perspectives beyond the program activities. As such, it was challenging to implement practices outside of the professional development context.

We did not take a close look at the scaffolds we put in place for collaboration, such as the Web-based support materials, technical support, and ongoing mentoring. Therefore, we were not able to learn how we could solidify longer-term relationships.

Our evaluation enabled us to learn about the benefits of the various pieces of our program, but we lacked a holistic view because we drew upon limited perspectives to guide the evaluation. We did not include all the stakeholders as agents of the evaluation; instead, they remained passive respondents. As such, we missed the opportunity to guide our participants to become the sources of their own and each others' professional development. Because of this missed


Copyright © 2011, ISTE (International Society for Technology in Education), 800.336.5191 (U.S. & Canada) or 541.302.3777 (Int'l), [email protected], iste.org. All rights reserved.

opportunity, four years later, we see that our relationships were not sustained, and without continued funding, long-term gains were not realized.

How could the collaborative processes discussed above foster long-term success? We will examine some possibilities by synthesizing big ideas contained in the three approaches to collaborative evaluation discussed above.

Big Idea #1: Collaborative evaluations build on the benefits of partnership; as such, collaborative evaluations can be guided by similar principles.

The literature on partnerships indicates that successful partnerships share certain features. For example, stakeholders should jointly conceive and agree upon program goals; teachers should be actively involved in the implementation process; each stakeholder must develop an appreciation of the others' contributions; and all should mutually understand and own the outcomes of a program (Cole & Knowles, 1993; Ravid & Handler, 2001). These features can be fostered when evaluating a program as well. When sharing joint responsibility for program goals, participants can jointly define visions for what constitutes success and can drive the evaluation. As stakeholders become knowledgeable about the goals and processes of evaluation, program goals become more transparent to them, and they can become invested in program outcomes.

Collaboration creates the conditions for all stakeholders to be equally valued, and therefore all involved can achieve parity. If our program had incorporated the perspectives of our stakeholders early on, similar to developmental evaluations, we would have been able to build stronger relationships with our participants, and our program could have developed in ways that reflected the needs of our participants. The outcomes of our program, including participants' increased understanding and application of technological pedagogical content knowledge, could have been sustained over the long term.

Big Idea #2: Collaborative evaluation can provide information useful for adjusting the program as it is being implemented.

Through the process of implementation, ongoing interpretation of evaluation data can create a culture of problem posing among all of the stakeholders. In our PT3 evaluation, had we periodically created opportunities for our stakeholders to look over observational data, we might have been able to discuss patterns emerging through the data and problematize these patterns in ways that could have informed our professional development sessions. Our professional development activities might have become more meaningful and authentic to a larger population of our participants. In turn, this could have eased participants' abilities to incorporate what they learned in the professional development context with their students, leading to higher-level technology integration in the classroom.

Big Idea #3: Participants are involved in developing evaluation instruments and interpreting findings.

Instruments used for evaluation data collection must be based on what constitutes success. When all stakeholders' perspectives are brought to bear, the outcomes are relevant for them. When participants have input into the construction of data collection instruments, there is a greater potential for them to have clarity about the purposes of professional development. Because participants are a part of the data collection process, a feedback loop is created that maximizes the likelihood that the evaluation can be responsive to their needs. Therefore, all stakeholders mutually value and own outcomes. If we had framed the findings of our surveys and evaluations through the perspectives of all involved, stakeholders could have established learning relationships that would last despite changes in funding.

Big Idea #4: Teachers frame their own inquiry questions related to the overall evaluation, and their data collection informs findings of the overall evaluation.

Had we involved our participants in framing their own inquiries concerning their own professional growth, documenting their efforts, and analyzing student artifacts, as layered approaches advocate, there would have been greater potential for them to become sources of their own professional development as well as mentors to one another. By sharing observation tools with participants and gathering their feedback, we would have been able to develop a shared understanding of what successful technology integration means. We would have gotten more feedback from the field, enabling us to garner multiple perspectives useful for ongoing program modifications as well as more holistic views of success.

Conclusion

Collaborative evaluations foster working relationships and shared understandings. They shift and expand the focus from the evaluation of outcomes only to the evaluation of processes and outcomes. By engaging stakeholders in both the processes and the outcomes of evaluation, professional development can be dynamic, responsive to the needs of a greater number of stakeholders, and sustainable over the long term. To be sure, funding must support collaborative evaluation. Traditionally, funders direct only approximately 10% of support toward evaluation of a professional development program. This is only sufficient for inoculation forms of professional development and rarely leads to insights helpful for sustaining transformations in classrooms. When funders provide increased resources for broader and more inclusive evaluations, we will be able to create ongoing professional development partnerships that transform teaching and learning, better positioning students to engage in a technology-infused, flattened world.

Author Notes

Louanne Smolin is an independent education consultant. Prior to that, she was clinical associate professor of curriculum and instruction in the Department of Education at the University of Illinois at Chicago. She specializes in curriculum development and evaluation, particularly with respect to multimedia and technology integration in public school classrooms. She has developed professional development partnerships with numerous Chicago



public schools. She currently serves as a research and evaluation consultant for organizations devoted to curriculum reform in urban schools. She has published journal articles and book chapters related to technology professional development and has presented her work at numerous national and international conferences. Please address correspondence to Louanne Smolin, 1443 Dartmouth Lane, Deerfield, IL 60015. E-mail: [email protected]

Kimberly A. Lawless is a professor and chair at the University of Illinois at Chicago. Her research interests involve understanding how individuals search for, evaluate, and integrate information across multiple sources of information, particularly those found in digital, networked environments. Her work in these areas has been published in more than 100 book chapters and articles. She is currently a principal investigator on two grants funded through the Institute of Education Sciences in the Department of Education examining these and related issues. Please address correspondence to Kimberly Lawless, 1040 West Harrison Street UMC 147, University of Illinois at Chicago, Chicago, IL 60607. E-mail: [email protected]

References

Burnaford, G. (2006). Moving toward a culture of evidence: Documentation and action research inside CAPE's veteran partnerships. Program report. Chicago, IL: Chicago Arts Partnerships in Education.

Cole, A., & Knowles, G. (1993). Teacher development partnership research: A focus on methods and issues. American Educational Research Journal, 30(3), 473–495.

Cunningham, C. A., McPherson, S., Lawless, K. A., Radinsky, J., Brown, S. W., & Zumpano, N. (2008). In A. Borthwick & M. Pierson (Eds.), Transforming classroom practice: Professional development strategies in educational technology (pp. 149–167). Eugene, OR: International Society for Technology in Education.

Desimone, L. M. (2009). Improving impact studies of teachers' professional development: Toward better conceptualizations and measures. Educational Researcher, 38, 181–199.

Handler, M., & Ravid, R. (2001). The many faces of school-university collaboration: Characteristics of successful partnerships. Santa Barbara, CA: Teacher Ideas Press.

Jones, S., Johnson-Yale, C., Perez, F., & Schuler, J. (2007). The internet landscape in college. In L. Smolin, K. Lawless, & N. Burbules (Eds.), Information and communication technologies: Considerations of practice for teachers and teacher educators (pp. 39–51). Malden, MA: Blackwell Publishing.

Kleiman, G. M. (2004). Myths and realities about technology in K–12 schools: Five years later. Contemporary Issues in Technology and Teacher Education, 4(2), 248–253.

Lawless, K. A., & Pellegrino, J. W. (2007). Professional development in integrating technology into teaching and learning: Knowns, unknowns, and ways to pursue better questions and answers. Review of Educational Research, 77(4), 575–614.

Lemke, C., & Coughlin, E. (1998). Technology in American schools: Seven dimensions for gauging progress. Santa Monica, CA: Milken Exchange on Education Technology.

Lieberman, A., & Grolnick, M. (1996). Networks and reform in American education. Teachers College Record, 98, 7–45.

Mumtaz, S. (2000). Factors affecting teachers' use of information and communications technology: A review of the literature. Journal of Information Technology for Teacher Education, 9(3), 319–342.

National Staff Development Council. (n.d.). Standards: Evaluation. Retrieved from http://www.learningforward.org/standards/evaluation.cfm

Patton, M. (2006). Evaluation for the way we work. The Nonprofit Quarterly, 12(1), 28–33.

Sarason, S. (1996). Revisiting "The culture of the school and the problem of change." New York, NY: Teachers College Press.

Schubert, W. (2008). Curriculum inquiry. In M. Connelly, M. He, & J. Phillion (Eds.), The Sage handbook of curriculum and instruction (pp. 399–419). Thousand Oaks, CA: Sage Publications.

Stake, R. (1973). Program evaluation, particularly responsive evaluation. Paper presented at New Trends in Evaluation, Göteborg, Sweden.

Strudler, N., & Wetzel, K. (1999). Lessons from exemplary colleges of education: Factors affecting technology integration in preservice programs. Educational Technology Research and Development, 47(4), 63–81.

