Running head: DATA, DESIGN, DIALOGUE
Data, design, dialogue:
Kindergarten literacy program evaluation and advocacy
Ryan R. Kelly, Ph.D.
Assistant Professor of Reading, Arkansas State University
Stephanie Martin, MSE
Jonesboro Public Schools
Teri Spillman, MSE
Jonesboro Public Schools
Abstract
In order to conduct an extensive evaluation of the literacy program already established at a Kindergarten-level facility in a major Arkansas community, standardized, diagnostic, and screening assessment data were collected. Additionally, demographic information from multiple sources within and outside this district, along with interviews and multiple surveys, was utilized.
Based on all culminating data and pertinent information, findings indicated that while there was significant qualitative data to support skill-based literacy, three other areas would benefit from further assessment to drive curricular and instructional decisions: comprehension, writing, and vocabulary. These areas lacked the same volume of qualitative data, yet all were being taught as part of the comprehensive literacy program in place. Furthermore, through a survey, teachers provided feedback regarding the literacy area in which they wanted to develop and utilize a formative assessment. Multiple professional development meetings were facilitated, and through collaboration and dialogue all stakeholders formed and implemented a common formative assessment.
Keywords: formative assessment, comprehensive literacy model, cognitive development,
literacy coaching, literacy coaching models, professional learning community
Background and Literacy Coaching Models
Literacy leadership, or “literacy coaching” as it is aptly dubbed in the field of education and literacy, is at its best a collaborative and multi-faceted endeavor. It stems from a belief that members of a collaborative team work together with a common vision, for a common efficacy, and operate with logistical planning and reflection (Vogt & Shearer, 2011; McKenna & Walpole,
2008). Furthermore, literacy coaches should have at their fingertips a flexible palette of models that allow them to interact with various tiers of the district/school curriculum, as well as work in various ways with teachers in order to both build and support approaches directed at improved student literacy. Literacy coaching is also a service directed at improving curriculum, professional development of teaching strategies and approaches, and reflecting upon results
(Moxley & Taylor, 2006). Ultimately, the roles and responsibilities of literacy coaches in the present age of education are highly varied—and in some schools and districts, not as specifically defined (Mraz, Algozzine, & Kissel, 2009).
Dole (2004) emphasized the shift away from “reading coaches,” when the practice encompassed a great deal more emphasis on teaching strategies, visible performance, and reflection. Walpole and Blamey (2008) note that literacy coaching is an evolving practice and one where experiencing conflict and ambiguity in the role is not unusual; they further suggest this is due to the recent arrival of a variety of research-based models in schools (p. 223). Few of these professionals may meet all identifiable standards and roles bundled into literacy coaching; all make critical and highly informed choices about how to balance their time and invest highly in professional relationships, trust, and collaboration (Walpole & Blamey, 2008; Rainville & Jones, 2008). Literacy coaches must work within a professional conflux where they “engage themselves in particular ways that are most appropriate to the time, place, people and practices” at hand (Rainville & Jones, 2008, p. 441).
Strong literacy coaching, especially in the context of our study, is founded on data-driven decisions made within a strong professional unity. School leadership teams are an essential part of literacy coaching, employing a common language and vision (Morgan & Clonts, 2008). In a time of increased accountability at the local, state, and federal level, a system of support for collaborative analysis of data on the road to shared goals holds strong value (Mokhtari, Rosemary & Edwards, 2007). Further, within this type of climate, collaborative analysis becomes a vehicle for making decisions, promoting shared vision and accountability, and further bonding teachers and administrators through shared goals (Mokhtari, Thoma & Edwards, 2009).
Situating literacy coaching against the backdrop of curriculum, McKenna and Walpole (2008) offer strong insight in their selection and organization of coaching models, and offer a somewhat broader, but highly attainable, framework for our study. While they offer a variety of organized models, they grant that “all take the stance that improving student achievement” is at the heart of the matter (p. 4). Most useful of their models are cognitive coaching, which targets “the invisible thinking that guides a teacher’s work” (p. 5), and peer coaching, which builds “a bridge between formal professional development and classroom implementation” (p. 7). Additionally, they identify subject- and program-specific coaching, as well as reform-oriented coaching; however, these are not nearly as applicable to our study (yet we recognize their importance in settings where various amounts of curricular and pedagogical reform might be necessary). Ultimately, these authors guide what makes literacy coaching a powerful practice: the melding of context, knowledge-building, goal identification, research and policy intersection, respect for existing teacher expertise, and reflection (Walpole, McKenna & Morrill, 2011).
Project Design
Demographics
This body of work is situated within a Kindergarten-level facility in a major Arkansas community. Students may begin their education at this community’s Pre-K level, which encompasses ten classrooms and provides 80 students a year with free child care and education due to an Arkansas Better Chance grant and district matching funds. At the district’s Kindergarten-level facility, students begin their public school education before dispersing to one of five elementary schools. All five schools are demographically diverse due to parent choice for student enrollment. After completing sixth grade, students transfer to one of the two junior high schools, and conclude their educational experience at a high school for grades 10-12.
At the time of this project, there were 5,582 students enrolled district wide; 75.59 percent received free or reduced lunch.
This Kindergarten-level facility is at the epicenter of its community. While most of the surrounding school districts have chosen to embed their kindergarten classrooms into an elementary school, this district retains a separate building to house the entire district of kindergartners. This provides both advantages and challenges, as communication and collaboration with elementary teachers, namely first-grade educators, to establish vertical alignment can be difficult. As teaching staff they are able to align curriculum to standards, develop assessments, and collaborate overall as a unified body, which is unique to their grade level. Comprising this Kindergarten-level facility’s faculty are twenty-five certified and highly qualified teachers.
Since the district’s student body begins their elementary foundation at the Kindergarten level, the student population is representative of the district as a whole. Of the four hundred eighty-nine students that comprise the diverse student population at this Kindergarten-level facility, forty-six percent are African-American while forty percent are Caucasian. Additionally, thirteen percent of students are of Hispanic descent, while only one percent are Asian and Indian.
Figure 1. Kindergarten-level facility demographics. This figure illustrates the demographic breakdown of this Kindergarten-level facility.
It is evident from Figure 1, above, that the demographics of the district and this Kindergarten-level facility nearly mirror one another, lending evidence that the school is quite representative of the district student population. Another similarity of this Kindergarten-level facility to that of the district is the free and reduced lunch percentage of 83.87%. Additionally, the Kindergarten-level facility is composed of 269 male and 221 female students.
School Literacy Program
The core literacy program in place at this Kindergarten-level facility is the comprehensive literacy model, which is research based and encompasses the five major areas for effective reading development as identified by the National Reading Panel. The majority of the teaching staff has been certified through ELLA to properly implement the comprehensive literacy model using explicit and systematic instruction. Additionally, this year the district chose to fully incorporate Common Core standards for kindergarten through second grade, and this Kindergarten-level facility’s teachers are integrating them into the comprehensive literacy model.
During the first semester of kindergarten there is direct implementation of the emergent literacy theory intertwined with sociolinguistic theory as students often need oral language development to build schema, vocabulary and comprehension. Likewise, speaking, listening, reading and writing are processes that are learned at varying rates, thus they emerge through experience (Tracey & Morrow, 2006). While some students at this Kindergarten-level facility have a strong literacy foundation, many initially lack essential skills that are evident through poor oral language expression and reception. According to Hart and Risley (1995), from birth to age three, the “talkativeness” in the home directly correlates to later literacy achievement.
Furthermore, they revealed through a longitudinal study the impact of a child's socio-economic background on the amount of oral language experiences provided. Thus, children in more professional homes experience positive, rich language experiences versus more
“business talk” that lacks the commentary necessary for vocabulary, affective, phonological, orthographic, and lexical storage and later retrieval.
Within the Kindergarten-level facility, the student body reflects a wide range of socio-economic backgrounds and emergent literacy experiences, and social opportunities allow space, experiences, and time for each child to develop these essential literacy skills. As the year progresses, there is less focus on oral language as a central component, as students develop and begin to utilize speaking and listening skills on a deeper level through questioning, analyzing, and expressing opinions. Daily conversations regarding shared experiences occur through “buzz time,” where students express thoughts and feelings or recount events in their lives; share time after writing also allows them to use new vocabulary and orally explain their message. These literacy skills are edified through the IRA English Language Arts twelfth standard: “Students use spoken, written, and visual language to accomplish their own purposes (e.g., for learning, enjoyment, persuasion, and the exchange of information)” (IRA, 1996). Often during read-alouds, students make connections between texts or key words and other books or personal experiences.
Emergent literacy continues as students also construct their learning by interacting in small groups, through partner games, and in literacy work stations. Social constructivism is a central component of a classroom at this Kindergarten-level facility, as students are provided many rich opportunities to utilize prior knowledge while exploring novel concepts and adapting their schemas through conversation. Additionally, Rosenblatt’s
Transactional Reading Response theory can be observed during guided reading, read-alouds, and big books as students actively listen to or read a story while using prior knowledge to interact with the text. Both efferent and aesthetic responses are primed as a blend of non-fiction and fictional texts are utilized to further develop comprehension and reading attainment. Students also use previous knowledge as they are encouraged to make text-to-self, text-to-text, and text-to-world connections. As students establish metacognitive strategies, modeled by the teacher, they begin making inferences using background knowledge in conjunction with text events, character development, setting, and other elements of the story to comprehend. All of these crucial reading skills and applications are founded in the IRA English Language Arts third standard which states: “Students apply a wide range of strategies to comprehend, interpret, evaluate, and appreciate texts” (IRA, 1996).
During actual reading acquisition, students are actively utilizing cognitive processing perspectives, as this is one of the main premises of the comprehensive literacy model. Adams’ parallel distributed processing model represents the process by which readers interpret the units of text through varied processors. For instance, as a student reads a word, each grapheme is identified by the orthographic processor based on the schemas established from previous exposure. Then, the phonological processor is primed to aid in sounding out the word with phoneme knowledge. As the individual letter or grouping of letters (a word) is read, the meaning processor searches established vocabulary for retrieval; if it is a novel word, the meaning processor is at a disadvantage. Finally, depending on the syntax of the word in a sentence, the context processor interplays with the meaning processor to “detect” the next word in the series. While the orthographic and phonological processors are the foundation of the model, the meaning and context processors carry out the comprehending functions. With the Phonetic Connections research-based phonics and phonological program already utilized, students at this Kindergarten-level facility are building their lexicon by developing rich phonological awareness, phonics, and word study. Beginning with phonological and phonemic awareness, training the ear to hear phonemes and building synaptic connections for later retrieval, and continuing with later grapheme and word study lessons, instruction increases decoding and encoding strategies for reading acquisition.
Various foundational reading areas are evident through the literacy-rich components comprising each day. Fluency experiences are embedded into familiar reading as students spend ten to fifteen minutes reading independent, familiar texts that have already been read during guided reading. Buddy reading with a partner also encourages further fluency development and is often evident during work stations or familiar reading with a partner. Students retell stories in the classroom library, reenact stories using props, re-read big books, and recite repetitive text and poems to promote increased intonation, comprehension, word recognition, accuracy, and speed acquisition that directly correlates with reading fluency.
The central component of the literacy curriculum is writing. Within the classroom, teachers utilize Writer’s Workshop, based upon Lucy Calkins, to model authorship through meaningful writing mini-lessons often anchored to a mentor text. During this time, all components of literacy are utilized as students use vocabulary, schema, and oral language skills to develop a message. Next, using phonemic awareness and phonics, the message is encoded through an initially laborious process. By re-reading their story, the student further develops fluency while ensuring meaning and adding details with expanded vocabulary and syntax. Culminating each lesson is an author’s chair in which students share their writing, often under the document camera. Sometimes peers or the teachers themselves will inquire about aspects of an illustration or story, which leads to comprehension and possibly revision.
While the majority of the class is able to make adequate literacy progress with the core curriculum and small group differentiated instruction, some students will need further intervention. Brain research edifies this: functional magnetic resonance brain scans depict the left side of the brain primed for optimal reading function, yet struggling readers often use the right side and exert greater effort with little progress (Hempenstall, 2006). Over time, and with explicit and systematic phonics instruction, the brain can be rewired to read. Thus, this edifies our need for response to intervention (RTI).
School-wide, staff at this Kindergarten-level facility are quite conscientious about implementing RTI as needed. While there are official Tier II and Tier III procedures in place, the faculty uses small group differentiated instruction and classroom intervention to meet the needs of Tier IIa and Tier IIb guidelines. Once a student has been placed on an Academic Improvement Plan (AIP), a meeting between the parent, counselor, and teacher is arranged; the plan is explained, and all stakeholders approve and sign it. Throughout the year, interventions are conducted, assessments given, and progress documented. The process is similar for students on an Intensive Reading Improvement (IRI) plan, except that a monthly meeting with the committee is arranged to discuss progress and modify the intervention plan as warranted. If, after six consecutive assessments, the student has not made acceptable progress, they are placed in Tier III. Faculty at this Kindergarten-level facility take a proactive approach by automatically developing a plan for those retained and documenting progress from the onset.
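To make the escalation rule concrete, the snippet below is a minimal sketch, in Python, of the Tier III decision described above. The six-assessment window comes from the text; what counts as “acceptable progress” is not defined here, so the goal-score comparison is a hypothetical placeholder.

```python
# Minimal sketch of the Tier III escalation rule described above: a student
# who has not made acceptable progress after six consecutive progress-
# monitoring assessments moves to Tier III. "Acceptable progress" is not
# specified in the text, so a placeholder goal score is used here.

def should_move_to_tier_three(scores: list[int], goal: int, window: int = 6) -> bool:
    """True if the last `window` consecutive scores all fall short of the goal."""
    if len(scores) < window:
        return False
    return all(score < goal for score in scores[-window:])

# Hypothetical bi-weekly progress-monitoring scores against a goal of 35.
history = [18, 20, 21, 22, 24, 25, 26]
print(should_move_to_tier_three(history, goal=35))  # True: six straight scores below goal
```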
In an effort to promote RTI on a more comprehensive level, the district hired two additional interventionists so each of the five professional groups, referred to in this work as “pods” (i.e., a grouping of five classrooms), could have an interventionist to support literacy and math growth. Using AIMsWeb data as the main measure, students are selected, a plan is devised, and the interventionist works with three students from each classroom daily to execute their specific plan. At the beginning of the year, they used a new intervention curriculum program, 95%, but are also using other research-based interventions.
As with any school, some students begin the school year with already-identified exceptionalities and are seen throughout the week by the resource teacher. Based on their individual IEPs, lessons are conducted individually or in a small group. If necessitated, a speech pathologist, school-based therapist, case worker, and counselor are available on campus daily, while an occupational therapist and physical therapist are based off site. English Language Learners are assessed using the MAC2 assessment and, if they qualify, are pulled Monday through Thursday for thirty-minute classes. With a targeted focus on vocabulary, phonics, phonemic awareness, and reading skills, the ELL teacher strives to increase English speaking, listening, and reading in a group setting. To culminate the year, teachers of ELL students complete the ELDA assessment to measure English language acquisition development.
Data Analysis
An umbrella of assessments is administered, disaggregated, analyzed, and then utilized by various faculty at this Kindergarten-level facility to provide all students with the most comprehensive and differentiated education possible. From state-mandated assessments and standardized summative assessments to AIMsWeb and running records, teachers and administrators alike gather during “Pod” meetings every Thursday to collaborate and encourage continuity. Upon examination of all available data, the consensus is that most data are literacy-skill based, with only slight measures of comprehension. As Common Core standards further permeate the education landscape and place an emphasis on comprehension, vocabulary, and writing, the measures in place will be insufficient to guide instructional decision making.
Prior to the first day of school, a teacher-created screener is given to all registered kindergarteners to establish a baseline of development in specific literacy and math skills. By mid-September, students are additionally assessed using the state-mandated QUALLS observation-based assessment to further establish literacy, math, and work-habits development. The QUALLS data are used along with AIMsWeb data to identify students for AIPs and IRIs by mid-October, which is a precursor to the official RTI process in place at this Kindergarten-level facility. All parents of students with AIPs and IRIs are required to meet with the teacher and counselor to discuss the QUALLS results and any other supporting data or work samples, and to develop a plan which is then signed by all stakeholders.
Prior to this academic year, this Kindergarten-level facility had utilized the DIBELS assessment but has since transitioned to the AIMsWeb online-based assessment, using iPads to record student responses. Primarily, the district chose to adopt AIMsWeb for the technological benefits, namely the data collection, retrieval, and ability to modify benchmark scoring. There are three benchmarks, beginning of the year (BOY, September), mid-year (MOY, January), and end of year (EOY, May), to establish whether students are on target, at some risk, or at risk. Students found below benchmark on any of the four assessments (Letter Naming Fluency, Letter Sound Fluency, Phoneme Segmentation Fluency, and Nonsense Word Fluency) are progress monitored every two weeks.
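As a rough illustration of this decision logic, the sketch below (a minimal Python example, not part of the AIMsWeb system) classifies one student's mid-year scores and flags anyone below benchmark on any measure for bi-weekly progress monitoring. Only the LNF mid-year benchmark of 39 comes from the figures cited later in this section; every other cut score is a hypothetical placeholder.

```python
# Minimal sketch of the benchmark classification described above. Cut scores
# are hypothetical placeholders, except LNF-MOY (39), which is cited later in
# this section; the actual targets are configured within AIMsWeb itself.

MOY_TARGETS = {
    "LNF": {"benchmark": 39, "some_risk": 30},   # Letter Naming Fluency
    "LSF": {"benchmark": 25, "some_risk": 18},   # Letter Sound Fluency (hypothetical)
    "PSF": {"benchmark": 35, "some_risk": 25},   # Phoneme Segmentation Fluency (hypothetical)
    "NWF": {"benchmark": 28, "some_risk": 20},   # Nonsense Word Fluency (hypothetical)
}

def classify(measure: str, score: int) -> str:
    """Return 'on target', 'some risk', or 'at risk' for one measure."""
    cuts = MOY_TARGETS[measure]
    if score >= cuts["benchmark"]:
        return "on target"
    if score >= cuts["some_risk"]:
        return "some risk"
    return "at risk"

def needs_progress_monitoring(scores: dict[str, int]) -> bool:
    """A student below benchmark on ANY measure is progress monitored bi-weekly."""
    return any(classify(m, s) != "on target" for m, s in scores.items())

# Example: one student's hypothetical mid-year scores
student = {"LNF": 41, "LSF": 22, "PSF": 30, "NWF": 19}
for measure, score in student.items():
    print(measure, score, classify(measure, score))
print("Progress monitor every two weeks:", needs_progress_monitoring(student))
```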
The comprehensive data is recorded on a spreadsheet designed and distributed by the literacy facilitator at this Kindergarten-level facility and then submitted to the principal for school-wide analysis and disaggregation. Simultaneously, the data is given to the district literacy specialist for further analysis at the school-wide level. Since AIMsWeb is web-based, automatic plot line graphs are generated to show student progress on each assessment. These reports are easily accessible and track progress with target scores and trajectories. If a student is near, below, or above target, the graph will indicate it. Of all the assessments, faculty use AIMsWeb data to determine differentiation, intervention, literacy program growth, and intervention program efficacy.
In the past several years, students at this Kindergarten-level facility have taken differing standardized summative assessments which measured various skills. Some overlap of content exists, but more blatant are the variations, leading to insufficient measurement of the literacy program as an entity. Between the past two years, the MAT-8 lacked a vocabulary assessment whereas the ITBS did assess this literacy component. To more accurately mark growth with any program initiative, there must be consistency among the components. Furthermore, the SAT-9 assessment in 2008 was deemed invalid because the practice test distributed was, by mistake, the actual test. In contrast to previous years, the school district administration, coupled with the administrators of this Kindergarten-level facility, decided that a summative evaluation would be devised to assess kindergarten students at the end of this academic year based on literacy and math Common Core Standards.
Figure 2. Standardized assessment comparisons. This figure illustrates the similarities and differences of specific literacy and math subtests of these assessments taken by students at this Kindergarten-level facility.
According to the mid-year AIMsWeb quantitative data, a significant increase in phonemic segmentation from the previous two years was noted by the leadership team in place. There are significant increases in all three subtests, even with a dramatic benchmark score variance from the previous year’s assessment. With the transition from DIBELS to AIMsWeb, the benchmark, at risk, and some risk target scores all increased, some considerably. For instance, the LNF-MOY benchmark for DIBELS was 27, but the equivalent benchmark with AIMsWeb is presently 39.
Table 1
DIBELS Assessment Mid-Year Data

                    LNF                          PSF                          NWF
Pod        At Risk  Some Risk  Bench.   At Risk  Some Risk  Bench.   At Risk  Some Risk  Bench.   Total # of Students
Pod 1         14        25       57        17        25       53        17        28       50            96
Pod 2          6        17       75         6         8       84         5        20       73            98
Pod 3         25        13       61        23        22       53        31        23       44            99
Pod 4          9        20       69        11        15       72        13        25       60            98
Pod 5         13        17       66        21        17       58        23        23       50            96
Av. # at Bench.                  66                            64                           55
Av. % at Bench.                 64%                           62%                          53%
Table 2
AIMsWeb Assessment Mid-Year Data

                    LNF                          PSF                          NWF
Pod        At Risk  Some Risk  Bench.   At Risk  Some Risk  Bench.   At Risk  Some Risk  Bench.   Total # of Students
Pod 1          7        21       70        12        12       74        16        25       57            98
Pod 2          4        17       77         8        11       89        13        22       63            98
Pod 3         10         8       78        18        14       64        21        27       48            96
Pod 4         10        12       77        12        25       62        13        29       57            99
Pod 5          6        14       77        11         7       53        18        17       62            97
Av. # at Bench.                  76                            69                           57
Av. % at Bench.                 74%                           68%                          55%
While a direct correlation to attribute the increase is inconclusive, the literacy facilitator at this Kindergarten-level facility deduced it could be due to additional building-wide intervention initiatives and/or more refined and systematic teacher instruction and curricular implementation of phonics, phonemic awareness, and differentiation. Comparing the data in Tables 1 and 2, the percentage of students that met benchmark in LNF-MOY showed a gain of 10 percentage points from 2011. If the variation in benchmark scores were taken into account, the percentage gain would increase proportionally.
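Because Tables 1 and 2 report pod-level counts, the “Av. # at Bench.” rows can be checked directly. The minimal sketch below recomputes those averages from the benchmark columns and shows the change between the DIBELS and AIMsWeb administrations; it sticks to counts rather than the published percentage rows.

```python
# Recompute the "Av. # at Bench." rows from the pod-level benchmark counts in
# Tables 1 and 2, plus the change between the two administrations. The results
# closely track the averages reported in the tables.

dibels_bench = {   # Table 1 (DIBELS): students at benchmark per pod
    "LNF": [57, 75, 61, 69, 66],
    "PSF": [53, 84, 53, 72, 58],
    "NWF": [50, 73, 44, 60, 50],
}
aimsweb_bench = {  # Table 2 (AIMsWeb): students at benchmark per pod
    "LNF": [70, 77, 78, 77, 77],
    "PSF": [74, 89, 64, 62, 53],
    "NWF": [57, 63, 48, 57, 62],
}

def average_at_benchmark(counts: dict[str, list[int]]) -> dict[str, float]:
    """Average number of students at benchmark across the five pods."""
    return {measure: sum(pods) / len(pods) for measure, pods in counts.items()}

before = average_at_benchmark(dibels_bench)
after = average_at_benchmark(aimsweb_bench)
for measure in before:
    print(f"{measure}: {before[measure]:.1f} -> {after[measure]:.1f} "
          f"(change: {after[measure] - before[measure]:+.1f})")
```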
Results
Initial Preparation
Prior to preparing for the foundational meeting, we met on several occasions to discuss the data as it was gathered. Then, we formulated questions for further inquiry to better establish feedback from various sources: teachers, administration, and support staff. From the onset, we established that gathering perspectives and feedback from others was not only informational, but would also gain their support as we embedded their voice in each meeting. Additionally, we continued to listen during staff development, weekly professional learning community meetings, conversations among staff members, e-mails, and surveys.
Initially, we planned to orient our meetings toward RTI and differentiated instruction, especially with the impetus of the newly established emphasis of our district. Our Pod interventionist had inquired throughout the year about intervention suggestions, and we felt this might be quite beneficial. Then, by being receptive and flexible, and given the lack of data in certain literacy skills directly relating to Common Core Standards, we shifted to common formative assessments. The Backwards Design model of lesson planning became an ideal way to establish practicality and to centralize the work around assessments that gain information.
First Meeting
Beginning with positivity, in conjunction with quantified, school-wide data growth, seemed to pique interest, while we expressed the value of their contributions to implementing research-based curricula and pedagogy. The premise behind the meeting was to establish gains within the school, while simultaneously expressing the need for further data in the areas of comprehension, vocabulary, and writing. We established the meaning of formative assessments through sound research, further edifying the purposefulness of the meeting. We do feel, in hindsight, that visual examples of summative and formative assessments or a discussion regarding specific characteristics could have further supported this objective. Then, a survey was distributed and all meeting attendees were given time within the meeting to complete it. Originally, we had planned to send it through Survey Monkey after the meeting adjourned, but this would have required extra time and been less likely to gain full feedback.
During our first professional development meeting, we also discussed the positive results of monitoring phonics skills at this Kindergarten-level facility. Significant growth had occurred in the areas of letter naming fluency and nonsense word fluency. We also discussed the advantages of using formative assessments, and we received feedback from teachers on how and if they use formative assessments in their classrooms. According to our discussions and survey results from this professional development meeting, all teachers and administrators taking the survey expressed that they would like to learn about formative assessments, believe that formative assessments can give them reliable information on their students, and would specifically like to learn more about formative assessments in regards to comprehension. This information helped to shape our second professional development meeting.
Second Meeting
Our second professional development meeting started with discussing some of the results from the last survey. As mentioned above, teachers overwhelmingly supported the idea that they would like to know more about formative assessments, and they would specifically like to know about comprehension formative assessments. We then talked about research supporting teachers collaborating together (like we were doing to develop formative assessments). This research came from articles by Denise N. Morgan, Christy M. Clonts, and Barbara Steckel. We also cited research from www.readingrockets.org that advocated using formative assessments, specifically story maps, in the classroom. Additionally we presented research from Wiggins and
McTighe’s (1998) book, Understanding by Design, regarding the backwards design lesson plan.
This approach supports using essential questions to drive lesson plans. This is the approach we took in this professional development meeting to help us design a story map about comprehension and a corresponding rubric. The bulk of the meeting consisted of discussing different types of story maps, choosing a story map, and developing a rubric together to assess story maps in the classroom.
After pitching the idea to use a story map as an informal comprehension assessment tool, we had to devise a way to acquire numerical data for comparison. Further reading led us to the joint decision to collaborate during the meeting to formulate a rubric. This transforms a graphic organizer into a quantitative measuring tool to determine mastery of specific story elements.
Various story maps were downloaded, and again we wanted each of the teachers to select the most applicable one for use.
We used the survey results to establish purpose, then led into the research and theory to support the specific details related to the initiative. Our time was now limited and the main focus of the meeting was yet to transpire. By making our research points brief, we quickly moved into the core by selecting a story map and developing a rubric. Due to time constraints, we started with the rubric. Typically, the entire formative assessment would be collaborative so all members would have choice and voice. As time slipped away, the meeting became quite rushed.
Our concerns were realized over the following week, when multiple teachers were confused in regard to the date of implementation, when to send the data, and the next meeting date. By speaking to them individually we were able to dispel any further confusion. A reminder e-mail was sent the week the data was due so we could meet one final time to develop discussion questions and the data indicators and to create a culminating survey.
Third Meeting
While developing the second meeting, we realized that the only way to determine the effectiveness of the common assessment was to establish a third meeting. By having all the teachers record their data, e-mail it, and then delve into discussion, the outcome could be deciphered. At the time we had no concept of how impactful and instrumental this culminating meeting would be to our entire semester of longitudinal work.
Once again, the meeting was initiated with data discussion. Of course, the highlight was accolades to the Pod 2 teachers for their rich teaching of sequencing and details. A brief discussion as to specific lessons that could have produced such growth commenced. Next, we examined the discrepancy in setting scores among the classes. It was determined that the two higher-scoring classes used trade books with familiar and stationary settings. In contrast, the other three teachers used books that had changing and more ambiguous settings (see Table 3, below):
Table 3
Story Map Formative Data Collection Results

Classroom (# of Participating Students)  Characters  Setting  Details  Sequence  Total Points
Pod 1 (19)                                   32         26       29        28        115
Pod 2 (14)                                   27         25       31        31        114
Pod 3 (19)                                   33         44       54        56        187
Pod 4 (17)                                   48         51       40        44        183
Pod 5 (14)                                   39         26       35        39        139
Average Percentage                          72%        69%      76%       80%
As a result, the group decided to establish a common book for all teachers to use when assessing the story map.
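For readers who want to reproduce the Average Percentage row in Table 3, the sketch below recomputes it from the class point totals. It assumes each story element was scored with a three-point maximum per student; that ceiling is an inference consistent with the reported figures, not a detail stated in the meeting notes.

```python
# Recompute Table 3's "Average Percentage" row from class point totals.
# A three-point maximum per story element per student is assumed; this is an
# inference consistent with the reported percentages, not a stated fact.

MAX_POINTS_PER_ELEMENT = 3  # assumed rubric ceiling per student

# (participating students, characters, setting, details, sequence)
classrooms = {
    "Pod 1": (19, 32, 26, 29, 28),
    "Pod 2": (14, 27, 25, 31, 31),
    "Pod 3": (19, 33, 44, 54, 56),
    "Pod 4": (17, 48, 51, 40, 44),
    "Pod 5": (14, 39, 26, 35, 39),
}

elements = ("Characters", "Setting", "Details", "Sequence")
total_students = sum(row[0] for row in classrooms.values())

for i, element in enumerate(elements, start=1):
    earned = sum(row[i] for row in classrooms.values())
    possible = total_students * MAX_POINTS_PER_ELEMENT
    print(f"{element}: {earned}/{possible} = {earned / possible:.0%}")
```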
Not only did all the teachers collaborate to construct a story map rubric, but they also reflected both during and after its use to establish refinements to the assessment for future use.
Teachers liked using the story maps and rubrics, and they also had great ideas on how to adapt them to better suit their classrooms. The meeting adjourned after all attendees completed a culminating survey of the overall experience through the series of meetings and effectiveness of creating a formative assessment. The survey results were quite validating, as 100% of participants felt their opinions were valued and all components of the meetings were worthwhile and purposeful.
Everyone listed the professional development sessions as being “very helpful,” and teachers felt that looking at the data from their school helped them to understand the areas of strength and areas for improvement in literacy. Of particular importance was that all teachers and administrators that took this survey marked “yes,” indicating that they felt they were heard and their opinion was valued at the professional development meetings.
Findings
Challenges and Achievement
Various challenges presented themselves to us as we began our project. It was difficult to get all of the data we needed in a timely manner and to distribute it to all five teachers within a Pod at this Kindergarten-level facility. In reflection, it may have been wise to have one person appointed to collect and distribute the data. However, we all wanted to feel that we were contributing to this process. Also, developing the first survey was difficult. Since we did not want to burden the survey takers with many different surveys asking the same questions, we wanted to develop one comprehensive survey. We had different ideas about how long the survey should be, what questions should be asked, and who should be asked to take the survey.
We came to a consensus about the details, and we were able to distribute the survey, although later than some of us wanted.
It was also a little difficult to have everyone available for the professional development meetings. We arranged our meetings during a normal meeting time for the teachers with whom we worked. However, another meeting had recently been scheduled that conflicted with our first meeting, and the principal and some of the teachers were not able to come. Although we were discouraged, we knew that no matter when we scheduled a meeting, some people would not be able to come. More people were able to come to the second meeting. This was important to us because we were creating and adapting a story map and rubric that teachers were going to use in their classrooms. We had to adapt and be flexible.
We were very pleased at the goals we were able to achieve during our professional development times. We feel that teachers and staff were able to reflect on the areas of strength and areas for improvement for their school. We also feel that teachers were successful in discussing, collaborating, and reflecting about literacy and learning. Additionally, the area of assessing comprehension in a formative way was enhanced by using a story map and rubric that we had created and adapted together in our professional development meetings. We noticed that teachers’ voices were heard and validated by our questions and the space given to talk. We believed it was vital that teachers understood that these meetings were for them and their concerns. We wanted these sessions to be meaningful for them. We feel that we accomplished this by tailoring our time to their thoughts and ideas. We also feel that by giving surveys and by adjusting our meetings in response to the surveys, teachers were assured that their ideas were important to us.
Coaching Models and Student Learning
At the last Pod meeting, a conversation arose regarding future professional development on the Common Core that teachers would prefer to attend, and the teachers expressed fervent interest in common formative assessments that relate to these new standards. As aptly noted by McKenna & Walpole (2008), “the focus of the initiative must be research-based and realistic so that it can be reliably predicted to improve achievement and so that all teachers can actually adopt it” (p. 76). Since the teachers and administration already identify a need for formative assessments, the context is in place and has the potential for sustainability.
From the initial meeting preparation, the foundational coaching models identified through conversation and personal theoretical perspectives were peer coaching and cognitive coaching.
Both of these coaching approaches are soft models that encourage “bridge building” through meaningful dialogue, collaboration, and a unified goal (McKenna & Walpole, 2008, p. 7). The previously established Pods set the stage for the facilitation of the literacy initiative. All faculty, administration, and support staff present reported that they felt their opinions and feedback were welcomed and valued during each meeting.
Most notably, the theoretical backbone that emerged through the preparation for each of the three professional development meetings was Joyce and Showers' 2002 model, the Professional Development Cycle (McKenna & Walpole, 2008, p. 78). Within this model, a cyclical coaching system in which theory, demonstration, practice, and feedback are present is imperative for establishing meaningful, relevant, systematic professional development that leads to longevity. By establishing support that is based on data and theory and is “job-embedded,” all stakeholders are more willing to be receptive, attempt implementation, reflect, discuss, and seek guidance.
Student learning was impacted at this Kindergarten-level facility because teachers were reflecting on what they did in the classroom. Teachers also worked on improving how they assessed comprehension, which was an area in which they wanted to sharpen their skills. This collaboration during our professional meetings sparked many conversations in classrooms, hallways, and before and after school. Teachers, administrators, and other staff were thinking of ways to improve, and it was trickling down to the students.
By researching how reading is taught at this Kindergarten-level facility, and by understanding the resources and materials available, everyone involved can determine what is working and what needs to be corrected. We were able to tap into the insights and talents available in the teachers and staff with whom we worked. Teachers were focusing on the best, research-based ways to teach and assess reading at their school. Additionally, they were sharing and collaborating with each other about their ideas. All of this led to growth in the learning and literacy environments for the students.
Recommendations
Our experience with this project leaves us with several key experiences and insights that bear consideration as crucial to the processes of literacy leadership and coaching. First, it is essential to examine the available data, perhaps using a backwards design model, and seek as best as possible what is missing from the initial data. Second, employ a survey device to further triangulate findings and gain teacher insight; surveys really drove the direction of this project and set the course for each subsequent meeting. Third, meetings should unfold organically (not be contrived based on data), should include additional surveys or data as needed, and should develop through meeting discourse. Fourth, essential questions are key to facilitating the discussion and further shape the initial direction; this structure maintained focus without overly restricting its boundaries. Fifth, this project greatly affirmed the impact of a “Pod” philosophy (i.e., a Professional Learning Community) and its purpose and appropriate design. And finally, during discussion it was natural for leaders and leadership to morph. Though it may be difficult in these meaningful conversations to ascertain a single leader, it was clear that all stakeholders could have space to reflect and voice perspectives.
Ultimately, this project reaffirms our view that the following concepts play a fundamental role in the processes of literacy leadership and coaching:
Longevity and Accountability. We knew there would need to be a third meeting to ensure longevity and accountability. This allowed us and the teachers to see the efficacy of the collaboration and assessments.
Teacher-Driven Authenticity. During discussions, different leaders emerged and came forward. We gave the group time and space to decompress and look at the data. It was imperative that our professional development was not contrived, but relevant and based on real quantitative needs. It needed to be something that the teachers could see was important. The teachers expressed the need for comprehension formative assessments and more data.
Collaborative Reflection. Teachers had input in all the major areas of the assessments and professional development. The teachers' voice and discussion were pivotal to the project. The Pod meetings needed to be bottom-up for the teachers to “buy in.” We used the Pod meetings effectively by making reflection, collaboration, and discussion a main focus in all of the meetings. We gave everyone time to digest and participate. The “meat” of the project happened with collaboration and reflection in the meetings, which is what made it so effective: it wasn’t about us or the project, it was about the teachers and what was happening at their school.
Final Epiphany
From the unfurling of fears at the onset to allowing the process to unfold, this was growth we never imagined. When specific data was not available, we created surveys to acquire the information and feedback to move forward. Additionally, nearly one hundred students were impacted during this process, and we truly feel the momentum will continue onward into next year. After the final meeting, we spoke with our Pod leader in regard to creating more formative assessments the following year. All teachers seemed to be on board, and we are excited about the further impetus this will have for our students.
References
Calkins, L. (1994). The art of teaching writing (2nd ed.). Portsmouth, NH: Heinemann.
Clark, I. (2010). Formative assessment: ‘There is nothing so practical as a good theory’.
Australian Journal of Education, 54(3), 341-352.
Dole, J. A. (2004). The changing role of the reading specialist in school reform. The Reading
Teacher, 57(5), 461-471.
Gambrell, L. B. (2011). Seven rules of engagement. The Reading Teacher, 66(3), 172-178.
Hart, B., & Risley, T.R. (1995). Meaningful differences in the everyday experience of young
American children (1st ed.). Baltimore, MD: Paul H. Brookes Publishing.
Hempenstall, K. (2006). What brain research can tell us about reading instruction. Learning
Difficulties Australia Bulletin, 38(1), 15-16.
International Reading Association (1996). Standards for the English Language Arts. Retrieved
on February 7, 2012 from
http://www.reading.org//General/Publications/Books/bk889.aspx?mode=redirect.
McKenna, M. C., & Walpole, S. (2008). The literacy coaching challenge: Models and methods
K-8 (1st ed.). New York: Guilford Press.
Mokhtari, K., Rosemary, C. A., & Edwards, P. A. (2007). Making instructional decisions based
on data: What, how, and why. The Reading Teacher, 61(4), 354-359.
Mokhtari, K., Thoma, J., & Edwards, P. (2009). How one Midwestern elementary school uses
data to help raise students’ reading achievement. The Reading Teacher, 63(4), 334-337.
Morgan, D. N. & Clonts, C. M. (2008). School leadership teams: Extending the reach of school-
based literacy coaches. Language Arts, 85(5), 345-353.
Moxley, D. E. & Taylor, R. T. (2006). Literacy coaching: A handbook for school leaders.
Thousand Oaks, CA: Corwin Press.
Mraz, M., Algozzine, B., & Kissel, B. (2009). The literacy coach’s companion: PreK-3.
Thousand Oaks, CA: Corwin Press.
National Institute for Literacy (2000). Put reading first: The research building blocks for
teaching children to read. Jessup, MD: National Institute for Literacy at ED Pubs.
Rainville, K. N., & Jones, S. (2008). Situated identities: Power and positioning in the work of a
literacy coach. The Reading Teacher, 61(6), 440-448.
Reading Rockets (2012). Classroom strategies: Story maps. Retrieved on March 13, 2012 from
www.readingrockets.org/strategies/story_maps/?theme=print.
Steckel, B. (2009). Fulfilling the promise of literacy coaches in urban schools: What does it take
to make an impact? The Reading Teacher, 63(1), 14-23.
Stiggins, R. & DuFour, R. (2009). Maximizing the power of formative assessments. Phi Delta
Kappan, 90(9), 640-644.
Tatum, A. W. (2004). A road map for reading specialists entering schools without exemplary
reading programs: Seven quick lessons. The Reading Teacher, 58(1), 28-38.
Tracey, D. H., & Morrow, L. M. (2006). Lenses on reading: An introduction to theories and
models. New York: The Guilford Press.
Vogt, M., & Shearer, B. A. (2011). Reading specialists and literacy coaches in the real world.
Boston: Pearson.
Walpole, S., & Blamey, K. L. (2008). Elementary literacy coaches: The reality of dual roles.
The Reading Teacher, 63(1), 222-231.
Walpole, S., McKenna, M. C., & Morrill, J. K. (2011). Building and rebuilding a statewide
support system for literacy coaches. Reading & Writing Quarterly: Overcoming
Learning Difficulties, 27(3), 261-280.
Wiggins, G., & McTighe, J. (1998). Understanding by design. Alexandria, VA: ASCD.