APPLYING TRANSFORMATIVE LEARNING THEORY TO UNDERSTAND PRESERVICE TEACHERS’ LEARNING EXPERIENCES ABOUT FORMATIVE ASSESSMENT STRATEGIES

A doctoral thesis presented

by

Jamie M. Lee Korns

to

the College of Professional Studies – Graduate School of Education

in partial fulfillment of the requirements for the degree of Doctor of Education

in the field of Education Curriculum, Teaching, Learning, and Leadership

Northeastern University

Boston, Massachusetts

February 2018

Committee Members:

Kristal Moore Clemons, Ph.D.

Lynda Beltz, Ph.D.

Elizabeth Farley-Ripple, Ph.D.

Abstract

Despite widespread awareness regarding the value of training educators to use formative assessment strategies (FAS), an overreliance on summative assessment data in K-12 education has caused a deficit in the depth and breadth of preservice teacher training in this area. The purpose of this study was to examine how learning experiences in a course about educational assessment impacted the professional dispositions of elementary preservice teachers regarding FAS. The central research question and two subquestions explored whether preservice educators experienced a perspective transformation related to the effective use of FAS; the questions also sought to reveal how specific learning experiences contributed to these changes. Mezirow’s transformative learning theory (TLT) was applied as a theoretical framework to structure this study, and research instruments, including a survey, semi-structured interviews, and document analysis, were aligned to TLT to measure transformative learning in response to learning activities.

Utilizing mixed methods research strategies and employing case study methodology, this research gathered Learning Activities Survey (LAS) data from thirty-two (32) participants and engaged five (5) participants in semi-structured interviews and document analysis. The findings of this study indicate that 78.1% of participants experienced transformational learning related to the types and roles of assessments, and particularly toward the effective use of FAS. These transformations generally may be categorized as shifts in mindsets regarding best assessment practices and the growth of a “teacher mentality” among preservice educators. Approximately 59% of participants felt their transformation was impacted by a person involved with the course, such as their peers or the instructor; 78% felt their shift was affected by certain class assignments, such as opportunities for verbal discussion and class/group projects.

Implications of these findings include recommendations for praxis for those who oversee and instruct preservice teachers in educational assessment, as well as recommendations for future research.

Keywords: formative assessment strategies, teacher preparation, preservice education, transformative learning, learning experiences

Dedication

“If we teach today’s students as we taught yesterday’s, we rob them of tomorrow.” John Dewey, 1944, in Democracy and Education

“Do the best you can until you know better. Then when you know better, you do better.” Maya Angelou, as cited by Oprah Winfrey, 2011

This research is dedicated to the students of tomorrow, that we do better to teach them than we did for the students of today.

Acknowledgements

First, I would like to thank my advisor, Dr. Kristal Clemons, whose understanding of my passion for my topic and my personal and professional goals helped me bring my research to fruition. I would also like to thank Dr. Lynda Beltz, whose steadfast reassurance and wisdom guided me to become a stronger writer and a more passionate scholar-practitioner during my time in her course. Her patience and guidance continued through this research process as part of my doctoral thesis committee, for which I am extremely grateful. Last, but certainly not least, I owe the utmost gratitude to my third reader, Dr. Elizabeth Farley-Ripple, who ignited within me an insatiable appetite for using data-based decision-making to drive continuous educational improvement. My first encounters with Liz were as her student in two sequential graduate courses when I pursued my master’s degree. Under her tutelage and encouragement, I became a risk-taker and began to develop from a teacher to a true teacher leader.

In addition to my doctoral thesis committee, I have grown as a woman and as an educator because of the mentorship of other lifelong educators who have touched my life. As a student, Franklin Regional Junior High School (Murrysville, PA) teachers Ghislaine Dibiasi-Hess and Dominic Colangelo taught me everything I never would have learned on my own about the power of believing in a student and never giving up on him or her. Their lessons impressed upon me the entire heart and soul of teaching that I sought to embody when I became a classroom teacher. Then, as a classroom teacher, 2004-2005 Delaware State Teacher of the Year Kathleen—Kathi—Thomas mentored me in the spirit that my former teachers had, reminding me regularly that showing students we care about them is an essential ingredient for a successful classroom. The evident passion of Kathi and district administrator Dr. Sherri Kijowski, together with my coursework with Liz, impacted me at the same point in my career and left remarkable impressions on me regarding the need for educational assessment that truly measures what students know and can do. Of course, I would not have learned so much if it were not for former principal, and current district superintendent, Dr. Kevin Fitzgerald. Supportive of me from the day I entered the building, Dr. Fitzgerald encouraged me to follow my passions and supported me as I honed my craft as a new teacher, learning to employ a variety of assessment practices aligned to standards-based grading (a major passion of mine). Together, these educators were the impetus behind this doctoral research and my deep commitment to supporting current and future educators.

Finally, I want to thank my family. My mother and father, Elaine and Greg Lee, always told me I could do anything I put my mind to. My husband, Michael, is truly my better half and my best friend, and his encouragement and downright insistence that I “go back to school before we start having children, lest I regret it for the rest of my life and complain about it when we rock in our rocking chairs on our front porch some day in our twilight years,” led me to pursue this degree and research.

Table of Contents

ABSTRACT ...... 2
DEDICATION ...... 4
ACKNOWLEDGEMENTS ...... 5
CHAPTER 1 – INTRODUCTION ...... 11
PURPOSE OF THE STUDY ...... 12
PROBLEM STATEMENT ...... 12
JUSTIFICATION & EVIDENCE OF THE RESEARCH PROBLEM ...... 13
DEFICIENCIES IN THE EVIDENCE ...... 15
SIGNIFICANCE AND CONTEXT ...... 16
DEFINITION OF TERMS ...... 16
POSITIONALITY STATEMENT ...... 18
  Professional and personal perspectives ...... 19
  Effects of positionality ...... 22
RESEARCH QUESTIONS ...... 24
THEORETICAL FRAMEWORK ...... 25
THEORETICAL FRAMING OF THIS RESEARCH ...... 30
CONCLUSION ...... 32
CHAPTER 2 – LITERATURE REVIEW ...... 34
CHARACTERISTICS OF EFFECTIVE FAS ...... 34
  Use of FAS in K-12 Education ...... 35
  Distinguishing FAS from “Other Assessment” ...... 37
  Impact of Effective FAS on Student Learning ...... 38
  Conclusion ...... 39
EDUCATORS’ PERCEPTIONS OF FAS ...... 39
  Teachers’ Misconceptions of FAS ...... 40
  Teachers’ Perceptions of FAS with Respect to Training Methods ...... 42
  Conclusion ...... 46
EFFECTIVENESS OF FAS TRAINING ...... 47
  Evolution and Current State of Teacher Preparation Curricula and Standards ...... 47
  Traditional Professional Development ...... 50
  Preservice Teacher Training ...... 52
  The Re-Culturing of Teacher Education ...... 54
  Conclusion ...... 57
SUMMATION ...... 57
CHAPTER 3 – METHODOLOGY ...... 62
OVERVIEW OF THE RESEARCH DESIGN ...... 62
RESEARCH PARADIGM ...... 63
RESEARCH METHOD AND ALIGNMENT ...... 64
POPULATION AND RECRUITMENT ...... 65
SAMPLING STRATEGIES AND CRITERIA ...... 66
DATA COLLECTION ...... 67
DATA CODING AND ANALYSIS PROCESS ...... 73
RECIPROCITY ...... 77
LIMITATIONS & TRUSTWORTHINESS ...... 77
PROTECTION OF HUMAN SUBJECTS ...... 79
DATA STORAGE ...... 80
CHAPTER 4 – ANALYSIS OF FINDINGS ...... 81
CONTEXT AND BACKGROUND ...... 81
STRUCTURE OF DATA ANALYSIS ...... 82
OVERVIEW AND ANALYSIS OF TRANSFORMATIVE LEARNING BY DEMOGRAPHICS ...... 83
SELECTED CASE PARTICIPANT (SCP) OVERVIEWS ...... 93
SCP INTERVIEW OVERVIEWS ...... 95
ELLEN: SCP #1 ...... 95
  Profile from survey responses ...... 95
  Interview discussion ...... 96
CHRISTINE: SCP #2 ...... 98
  Profile from survey responses ...... 98
  Interview discussion ...... 99
MARIA: SCP #3 ...... 102
  Profile from survey responses ...... 102
  Interview discussion ...... 103
LINDSEY: SCP #4 ...... 106
  Profile from survey responses ...... 106
  Interview discussion ...... 107
SARA: SCP #5 ...... 109
  Profile from survey responses ...... 109
  Interview discussion ...... 110
ANALYSIS OF DATA RELATED TO RESEARCH QUESTIONS ...... 113
PRIMARY RESEARCH QUESTION: IMPACT OF LEARNING EXPERIENCES ...... 113
  Primary Research Question: Quantitative Analysis ...... 113
  Primary Research Question: Qualitative Analysis ...... 123
  Primary Research Question: Analysis Summary ...... 127
SUBQUESTION 1: SHIFTS IN HABITS OF MIND ...... 128
  Subquestion 1: Quantitative Data ...... 128
  Subquestion 1: Qualitative Data ...... 133
  Subquestion 1: Analysis Summary ...... 139
SUBQUESTION 2: EFFECT OF LEARNING EXPERIENCES ON HABITS OF MIND ...... 140
  Subquestion 2: Quantitative Data ...... 140
  Subquestion 2: Qualitative Data ...... 144
  Subquestion 2: Analysis Summary ...... 151
EMERGENT THEMES ...... 152
  Remedying Misconceptions and Misuses of Data ...... 153
  Increasing Data Literacy Skills ...... 155
  Expanding Practice Scope and Sequence ...... 158
  Re-culturing Assessment Training ...... 161
SYNTHESIS OF THEMES ...... 163
REFLEXIVITY ...... 164
CHAPTER 5 – IMPLICATIONS ...... 167
CONTEXT AND REVIEW OF THE PROBLEM OF PRACTICE ...... 167
REVIEW OF METHODOLOGY AND ANALYSIS APPROACH ...... 168
DISCUSSION OF FINDINGS ...... 169
REVIEW OF THEMES ...... 170
  Explication Regarding the Theoretical Framework ...... 170
  Explication Regarding Scholarly Literature ...... 172
APPLICATIONS OF PRAXIS ...... 174
  Implications for Theory ...... 174
  Implications for Future Research ...... 177
  Direct Application for Praxis ...... 181
CONCLUSION ...... 181
APPENDIX A: INVITATION TO PARTICIPATE IN THE STUDY ...... 198
APPENDIX B: INFORMED CONSENT FOR THE ONLINE SURVEY ...... 200
APPENDIX C: THE LEARNING ACTIVITIES SURVEY ...... 202
APPENDIX D: INFORMED CONSENT TO FOLLOW-UP ACTIVITIES ...... 211
APPENDIX E: LEARNING ACTIVITIES SURVEY FOLLOW-UP INTERVIEW ...... 214

Chapter 1 – Introduction

Over 100 years ago, John Dewey warned, “If we teach today’s students as we taught yesterday’s, we rob them of tomorrow” (Dewey, 1944, p. 167). Unfortunately, students across the United States are being robbed of the best education they can receive because their teachers are not using the assessment strategies best suited to eliciting evidence of student learning and then adapting their instruction based on their findings. Although research suggests doing just this—that is, using what is often referred to as formative assessment strategies (FAS)—could vastly improve the educational experience of students, teachers are not entering the field with a common body of knowledge and skills related to how to use FAS. As a result, many students are being failed. They are being taught in a way that hinges on judgments that deem them successes or failures based on their scores on high-stakes assessments; these judgments often are made without regard for students’ needs, without awareness of the growth they have demonstrated, and without promise that instruction and assessment measures might be revised to better meet their needs. Specifically, there is an overreliance on the use of summative assessment data in education, leading to misjudgments of students’ abilities and misguided intervention attempts by classroom teachers and school and district administrators alike.

At the root of this misuse of summative assessment data is the need for educators to learn how to gauge what students know, understand, and can do during routine instruction—referred to herein as FAS—so that instruction can be adapted in real time, not after a unit of instruction has concluded and it is too late to remediate or accelerate to best meet the needs of students.

Although shifts in teacher education have occurred to provide some training around the use of FAS—necessary to adapt to standards for teacher preparation—much of this training comes when educators are already in the field, and not from their initial teacher preparation, when it may have the most value and impact on future teachers’ habits of mind—their beliefs and schema—related to the role of effective assessment strategies with K-12 students.

Purpose of the Study

The purpose of this case study is to examine how learning experiences designed for preservice teachers in an elementary educational assessment course may affect their perspectives or habits of mind related to the use of effective FAS. With respect to this study, I refer to preservice teachers as those students enrolled in traditional educator preparation programs who are learning how to teach. At this stage in the research, altered perspectives and habits of mind—the foundation of professional dispositions—will be examined to reveal the types of learning experiences most impactful for creating positive shifts in professional dispositions related to FAS. Knowledge generated from this study is expected to inform future education of preservice teachers.

Problem Statement

An overemphasis on summative assessment in the field of education has contributed to a weakness in training at the preservice level with respect to the necessary knowledge and skills associated with the effective use of FAS. Assessment in education has long focused more on the use of summative assessment, or assessment of learning that has already occurred, as opposed to assessment for learning (Allendoerfer, Wilson, Kim, & Burpee, 2014; Bennett, 2011; Chueachot, Srisa-ard, & Srihamongkol, 2013; Collins, 2012), the latter of which is targeted explicitly by FAS (Jonsson, Lundahl, & Holmgren, 2015; Vlachou, 2015). Numerous studies have found that overemphasis on summative assessment, without providing opportunities for educators to elicit evidence of student learning and to adjust instruction during teaching, is a common practice stemming from habits of mind that do not align with best pedagogical practices (Bennett, 2011; CfBT Education Trust, 2013; Collins, 2012; Hanover Research, 2014; Vlachou, 2015). To this end, research suggests that improving preservice teacher education in the use of FAS would help educators to be better prepared to support student learning when they enter the field (Curry, Mwavita, Holter, & Harris, 2015; Hill, 2011b), and that improving teacher preparation programs may be more effective than targeting professional development for educators already in the field (Burns, 2005; Jacobs, Burns, & Yendol-Hoppey, 2015; Marshall, 2007).

Justification & Evidence of the Research Problem

The current atmosphere of high-stakes testing in public schools is pervasive in education with few exceptions; it requires teachers to prepare their K-12 students to demonstrate mastery of education standards (Azis, 2015; Rashid & Jaidin, 2014; Volante & Becket, 2011; Yan & Cheng, 2015). Research suggests that improved preservice teacher education in the use of FAS would help to ensure a common body of knowledge and skills is shared among educators entering the field (Curry et al., 2015; Hill, 2011a). Given the requirements for educators to be able to prepare students to meet standards, it seems natural to expect that teacher preparation programs would have been revised by now to ensure that preservice educators receive instruction around assessment strategies, and specifically FAS. Alas, a national panel report issued by the Association of American Colleges and Universities indicated that “The shape of the undergraduate curriculum was essentially fixed half a century ago. Although listed in the catalog as part of a curriculum, individual courses are effectively ‘owned’ by departments, and most advanced courses by individual professors” (2002, p. 16). More than a decade after this report was issued, scholar Randy Bass noted that this lack of change in curricula persists despite broad implementation of high-impact practices designed to improve experiential learning in teacher preparation programs—these become “extra-curricular,” he warns, as opposed to integrated within core curricula (Bass, 2012).

Indeed, despite national standards governing teacher preparation that call for educators entering the field to be able to use FAS, many preservice educators completing these programs are not adequately equipped to apply FAS in their own K-12 classrooms (Birenbaum, Kimron, & Shilton, 2011; Mandinach, Friedman, & Gummer, 2015), particularly when juxtaposed with the habits of mind that have developed around the role of summative assessments (Allendoerfer et al., 2014; DeLuca, Chavez, & Cao, 2013; Leimer, 2012; Yu & Li, 2014). When educators from the classroom level to the administrative level are not adequately trained in data literacy—that is, in the types and uses of data available to them—overreliance on summative assessment data occurs, resulting in an array of data abuses including, but not limited to, students misidentified as underachieving and misappropriated funds for academic intervention programs (Datnow & Hubbard, 2015; Mandinach & Gummer, 2015).

Current research uniformly articulates that most preservice and practicing teachers do not feel prepared to interpret or use data in ways that can help them better understand students’ needs and revise instruction based on these data (Abrams, McMillan, & Wetzel, 2015; Ciampa & Gallagher, 2016; Crosswell & Beutel, 2012; Jimerson & Wayman, 2015). The skills of understanding evidence of student learning and responding to it are part of formative assessment strategies (FAS), or the process by which educators deliberately elicit evidence of student learning during instruction to provide feedback and to adjust their teaching, adapting to students’ areas of strength and need (Abrams et al., 2015; Council of Chief State School Officers, 2012).

A significant body of research about preservice and practicing teachers’ exposure to learning about FAS already exists to support these findings; what is underexplored in scholarly literature is how learning experiences around assessment that do occur for preservice teachers may be devised to impactfully transform their knowledge, skills, and dispositions in the area of effective use of FAS. By its very nature, teacher education provides opportunities for preservice teachers to experience and reflect on their habits of mind related to educational practices. Development and transformation of preservice teachers’ beliefs are critical, and transformative learning, by definition, involves a shift in thinking and practice grounded in critical reflection on learning (Dirkx, 1998; Mezirow, 2003).

Deficiencies in the Evidence

Gathering this information from preservice educators is essential, as it helps to fill a gap in existing research about teacher preparation. For example, determining the extent and quality of current teacher training practices for preservice educators can be challenging because of existing deficiencies in the evidence. While a majority of programs report that they provide coursework about educational assessment, some studies suggest this inclusion may often refer only to a single course offering that focuses almost solely on rudimentary assessment literacy—knowledge and skills related to assessment construction and purposes (Bernhardt, 2004)—and does not adequately teach data literacy (Mandinach et al., 2015). Moreover, examining educator readiness to implement FAS can be challenging, owing to research limitations stemming from participants’ self-reporting of their professional dispositions in many surveys and interviews (Birenbaum et al., 2011; Volante & Becket, 2011). By exploring perspective transformations and shifts in habits of mind, instead of examining only reported professional dispositions, this study may contribute research of value to the field of teacher preparation.

Significance and Context

The consequences of a lack of adequate training around FAS are that many preservice teachers entering the field report feeling unprepared to use FAS, and that FAS practices among K-12 public school educators are varied and unreliable (Athanases, Bennett, & Wahleithner, 2013; Nortvedt, Santos, & Pinto, 2016). A study of the general understandings and potential shifts of perspectives or habits of mind among preservice educators undertaking coursework about assessments in education is significant for several reasons. First, it can support existing research around educator preparation in the areas of assessment and data literacy. Second, it can help to address deficiencies in current research that stem from the self-reporting nature of teacher preparation programs (Birenbaum et al., 2011; Volante & Becket, 2011), particularly those that note they offer coursework about the use of assessments in K-12 education but do not provide further details of what this coursework entails, nor how it may impact the professional dispositions of their graduates. Third, it may enable recommendations regarding the standards that govern teacher preparation programs. This last possibility is particularly vital, as better-prepared teachers are better able to educate students.

Definition of Terms

Even within the field of education, there may be some variability in interpretations of a variety of terms, and so I offer the following definitions to provide additional context and guide readers specifically for the purposes of this study:

• Assessment literacy: Having knowledge and skills around sound assessment construction and purposes (Bernhardt, 2004)

• CAEP: The Council for the Accreditation of Educator Preparation, which helps to establish parameters for qualifying teacher preparation programs. CAEP formed in July 2013 through the merger of the National Council for Accreditation of Teacher Education (NCATE) and the Teacher Education Accreditation Council (TEAC)

• Data literacy: Having the ability to select, interpret, and use student-related information to improve instruction or other aspects of education (Bernhardt, 2004)

• Formative assessment: Any means of gathering data about student learning that can provide information for use by teachers to modify teaching and learning activities to best meet the needs of students; formative assessment is designed to be iterative and ongoing during and throughout units of instruction, as opposed to summative assessment, which occurs at the end of a unit of instruction (Black & Wiliam, 2006)

• Formative assessment strategies (FAS): Any activities designed to elicit data categorized as formative assessment data (Black & Wiliam, 2006)

• Habits of mind: Beliefs and schema that govern thoughts and actions, particularly around preservice teachers’ understandings of the role and use of assessment strategies

• INTASC standards: The body of standards that define professional dispositions for teachers, as set forth by the Interstate Teacher Assessment and Support Consortium (INTASC)

• Learning experiences: Instructional activities planned by instructors in teacher preparation programs to help preservice educators/teachers master knowledge and skills related to professional dispositions

• Perspective transformation experiences: The process through which “learners shift their understandings or assumptions in order to cope with new information” (King, 2009, p. 4)

• Professional dispositions: The values, attitudes, and understandings that comprise desired traits among teachers, as outlined by the Council of Chief State School Officers (CCSSO) and articulated through INTASC standards (Council of Chief State School Officers, 2011)

• Summative assessment: Any assessment that occurs at the end of a unit of instruction and is used to provide summary or judgment data about learning processes that have already occurred; summative assessment is considered culminating, as opposed to formative assessment, which occurs during and throughout instruction (Black & Wiliam, 2006)

• Teacher preparation program: A traditional, four-year undergraduate program at an accredited college or university with the aim of preparing preservice educators/teachers

• Transformative learning: Learning that represents evolution from previous thinking (fixed expectations, including habits of mind or perspectives) based on new experiences that trigger autonomous thinking, reflection, and interpretation (Mezirow, 1997)

Positionality Statement

In brief, I have been a student in a teacher preparation program, a guest lecturer and course preceptor for another teacher preparation program, and a leader of teacher and administrative professional development at the school, district, state, and national levels within the United States. I now help to oversee teacher certification and training both in the United States and abroad, particularly for several countries in Sub-Saharan Africa. To deny the development of subjectivity that has stemmed from my career trajectory would be, as Freire cautions, both “naïve and simplistic” (Freire, 2013, p. 50). In case study research, however, it is critical to understand the perspectives and perceptions of others so that their story is told; it is therefore imperative that I contemplate and acknowledge how my own experiences have impacted my positionality and bias as I approach my research (Machi & McEvoy, 2012; Ravitch & Riggan, 2017). To examine my own positionality with respect to this case study, I will relay my professional and personal perspectives with regard to FAS as they have been shaped by my experiences, and then acknowledge my biases.

Professional and personal perspectives. Though I noted my career steps concisely above, the specific development of my career over the years has significantly shaped my understanding of my problem of practice and fuels my passion for this research topic. I have firsthand experience with how instructors learn about and use a variety of data, and especially information gained from the use of effective FAS. My frames of reference include time spent as a student in an educator preparation program, as a secondary English teacher who experienced professional development initiatives, and as a data and instructional coach. During my undergraduate education at a well-known university not dissimilar from the one that is the site for this research study, I was offered and enrolled in only one course about assessment. The course primarily focused on sound assessment construction, and data collection methods were limited to rubrics and checklists for traditional assessments.

It was not until I entered the classroom as a teacher that I was exposed to concepts about FAS. In several professional development workshops, I worked with colleagues in multiple content areas to discuss FAS and plan for its use in our classrooms. However, apart from typical verbal questioning techniques, few of my colleagues pursued the use of FAS. In my Master of Education program, while pursuing a degree in school leadership, I encountered several courses that taught me a great deal about both assessment and data literacy and how to learn about students’ needs.

More specifically, this coursework started by broadly teaching me about assessment literacy and data literacy, which were excellent foundations that supported my later understanding of FAS. I came to understand how vital these terms were to unlocking awareness of my students’ learning, and to knowing how to understand and meet their individual academic needs. Assessment literacy is defined as the knowledge and skills around sound assessment construction and purposes; data literacy refers to the ability to select, interpret, and use student-related information to improve instruction or other aspects of education (Avramides, Hunter, Oliver, & Luckin, 2015; Bernhardt, 2004; Council of Chief State School Officers, 2012). While the areas of assessment literacy and data literacy are broad, they did provide a frame of reference for the myriad ways to check and act on student learning through true FAS. This learning left me feeling a sense of awe and power over the approved curriculum: I could truly differentiate it based on my students’ needs, and I had the power to uncover those needs by using FAS. It also left me with a raw realization that, while the knowledge about the power of FAS was not new, other educators with whom I worked did not seem to know or use FAS or anything similar to facilitate instruction in their own classrooms.

Using the knowledge gained in my own educator preparation programs and as a classroom teacher, I became increasingly committed to helping other educators learn about and use the knowledge and skills around FAS that I had acquired. When the opportunity arose, I eagerly accepted an offer to leave the classroom to become a data and instructional coach for public school districts and charter schools across the state of Delaware. This position involved working directly with teachers, as well as with their school and district administrators. When I worked with teachers, I guided them in ways to access and utilize various measures of data to improve instruction in English, mathematics, science, social studies, and elective courses for pre-kindergarten through twelfth grade. My work with administrators was different and involved constructing and presenting workshops about the best ways for them to conduct staff development in the areas of assessment and data literacy. During this time, my commitment to educator training increased as I saw the ways that student learning could evolve through impactful use of FAS.

After several years of coaching other educators in the understanding and uses of data, I felt driven to find other ways to support educators more specifically around the use of FAS. I moved into a national role designing professional development materials. Much of my work in this role involved designing training materials around assessment and data literacy, and more specifically about cyclical assessment for learning, for a variety of stakeholders across 24 states in America: leaders in state departments of education, district administrators, school administrators, and teachers. I conducted research to frame my work—this research comprised the guiding concepts and theories that reflected what I had learned and observed during my time as a classroom teacher and then as a data and instructional coach. I came to believe deeply what I discovered in research, namely that effective FAS (1) serve as one of the most effective strategies for engaging students by increasing their learning and achievement (Cooper & Cowie, 2010); (2) positively increase students’ ownership over their own learning with respect to goal setting that correlates to student learning gains (Curry et al., 2015; Hargreaves, 2013); (3) have been found to have a statistically significant impact on students’ self-efficacy (Chueachot et al., 2013); and (4) have been linked to some of the most significant academic gains (Abrams et al., 2015; Chueachot et al., 2013; Curry et al., 2015; Jonsson et al., 2015; Yu & Li, 2014).

Effects of positionality. Upon analyzing my perspectives, I can clearly identify how my own evolving understanding of and work in the area of FAS has affected my goals, commitments, frames of reference, and guiding concepts and theories. Together, these areas have impacted my perspectives, which in turn frame the working assumptions that guide my work as both an educator in the field today and as a scholar-practitioner. Most of all, my experiences have led to my positionality, driven by the belief that presenting methodology and hands-on learning opportunities at the preservice level is critical if educators are to employ effective FAS early in their own classrooms. At the heart of my belief is my underlying assumption that students who are not taught by teachers who employ FAS in their instructional practices are not being served fairly. Carlton Parsons (2008) cautions against misinterpreting “deficit” and “difference” in terms of equality, and I certainly need to be aware that what I label as concern for students may stem from a bias: “difference” in terms of teachers’ use, or lack thereof, of FAS does not necessarily constitute a deficit in the eyes of others.

However, I am committed to being a conscientious researcher, and I took precautions to ensure that my positionality neither biased me nor steered my research conclusions. Specifically, by relying on a mixed methods approach, I worked to ensure that the voices of my participants were developed through their own narratives, from steps I took to member check survey and interview results to making use of descriptive and in vivo coding that ensured participants’ words came forth in my representations. Additionally, I worked carefully to maintain analytic memos in my own reflexive journal, recording what participants portrayed in the moment, instead of waiting until later, when my positionality, combined with time to reflect, may have impacted or steered my interpretations. In this way, I was careful to allow relationships to be established among emerging patterns from the data, rather than imposing any sort of hypothesis, adding to the credibility and trustworthiness of my mixed methods research (Groenewald, 2008).

Though the above outlines my overarching positionality, particular facets of my positionality stem from specific experiences at each step in my career. For instance, as a novice teacher, I was excited to learn about FAS because it felt like a more accurate way to assess students, but most of my colleagues rejected this notion, explaining that “giving participation and homework completion points is just as good and helps students pass.” I quickly developed a sense of those who chose to align with quality assessment practices and the “other,” questioning the abilities of the latter to effectively teach if they were not open to research (Briscoe, 2005). I felt frustrated that best practice in education was being ignored, but at the time, I recognized I was newer to the profession and felt I should respect the experiences of the seasoned teachers.

When I was moved into the role of a data and instructional coach, I was asked to help construct the framework of knowledge and skills around data use that Delaware educators were to learn, and I realized I had not been misguided in my efforts to implement best practice in FAS. It is possible that my feelings were tainted with what Roulston and Shelton (2015) describe as observer bias. Though my work certainly did not constitute targeted research, I felt mental validation when my own experiences as a teacher were confirmed in the cultures of FAS use in the new schools where I worked.

Unfortunately, my concerns regarding the lack of consistent FAS in classrooms and at the state level were again confirmed when I transitioned into a national role. When I traveled to oversee training and meet with educators at high levels in 24 different states, I heard a relatively consistent message about struggles in state and district training attempts to support teachers’ use of FAS. Ultimately, I learned from experience and from hearing the voices of others that, despite the best of intentions, intervention programs and professional development efforts in schools rarely are sustainable. I noticed that there seldom are structures in place to prepare new educators who enter schools after specific intervention or professional development efforts cease; that knowledge is lost when trained educators resign or retire; and that intervention implementation and professional development experiences may not be consistent across schools for educators who transfer.

Research Questions

Parallel to the shift in standards that call for teachers to be able to understand and use multiple measures of data to improve student learning are standards that call for teacher education programs to provide the related knowledge and skills to do so (Council of Chief State School Officers, 2011; Henson, 2009). Specifically, the Council for the Accreditation of Educator Preparation (CAEP) requires teacher preparation programs to submit evidence of teacher candidates’ dispositions as related to the knowledge and skills encompassed within these programs (Kennedy, 2000; Wayda & Lund, 2005). Though there is research about the professional dispositions of developing teachers reported in the area of FAS, there is a gap in the research related to how perspectives and habits of mind—foundations of disposition development—of preservice educators are shaped during their coursework in teacher preparation programs.

Therefore, the purpose of this research study is to examine how learning experiences designed for preservice teachers during their engagement in an elementary teacher educational assessment course may affect their professional dispositions—that is, their perspectives and habits of mind—related to the effective use of FAS. To explore these questions, I will garner information using King’s (2009) Learning Activities Survey (LAS), a valid and reliable instrument for collecting information regarding learners’ habits of mind, as well as semi-structured interviews and document analysis. Three questions will guide this study:

• Primary Research Question: How are elementary preservice teachers’ habits of mind about formative assessment strategies (FAS) impacted by their learning experiences?

• Research Subquestions: (1) What shifts in habits of mind, if any, do elementary preservice teachers perceive they experienced regarding understanding the role of effective formative assessment strategies (FAS) in education as a result of their coursework about educational assessment? (2) What learning experiences, if any, do elementary preservice teachers believe affected their habits of mind during their coursework about educational assessment?

Theoretical Framework

Scholar William S. Moore argues, “The most powerful learning—the learning that most of us really want to see learners achieve as a result of their experiences with classes/curricula— involves significant qualitative changes in the learners themselves” (Moore, 1994, p. 60). It is my compelling desire to understand the nature of powerful learning experiences and their effect on learners’ perceptions; application of Transformative Learning Theory (TLT) is wholly suited to this pursuit because of this framework’s underlying tenets, years of evolving applicability, and its alignment with carefully chosen research instruments.

When used as a lens to understand learning, TLT, based on the work of Jack Mezirow, helps to portray how dispositions, or habits of mind, of learners evolve as a result of learning experiences. TLT has been noted as a lens through which individual reflection and change can be analyzed, and it has been applied to a variety of career fields (Taylor, 2009), ranging from medical care to business management (Mezirow, 1991; Mezirow & Associates, 2009). Its general application, however, has been in higher education, with specific attention paid to transformations of adult learning and the relationships between planned learning experiences and resulting habits of mind (Glisczinski, 2007). In seeking to classify how dispositions that lead to transformative learning may be quantified, Glisczinski (2007) expands on the research of Herbers (1998), explicating four quadrants in which these thoughts or actions occur for learners:

I. learners experience a disorienting dilemma; then,

II. learners engage in individual critical reflection; and/or

III. learners engage in rational dialogue. Resultingly,

IV. learners take action.

Mezirow (2000) details how applications to adult learning are especially germane when one compares simplistic theories of learning for children and adults: learning in childhood generally is considered formative; what is learned originates from authority and socialized sources outside of oneself and helps to create initial knowledge and understanding. Adult learning, however, is transformative: “adults are more capable of seeing distortion in their own beliefs, feelings, and attitudes” (Mezirow, 1991, p. 58).

The following diagram, adapted from Mezirow (1997), portrays the basic tenets of TLT that he built from his evolving work and first presented at Teachers College, Columbia University, in 1978.

Figure 1. Basic tenets about TLT (adapted from Mezirow, 1997)

Reflecting a constructivist paradigm of reality, formative learning may start to shift as adults engage with new experiences. Through autonomous thinking, independent reflection, and facilitated education, adults start to construct new meanings related to their experiences, engaging cyclically in further thinking and reflection as their formative learning transforms to new understandings.

Primary criticisms of TLT targeted the generalizations Mezirow drew from his original work, which began around 1953; critics claimed that his focus—primarily on adolescents, community development, and the effects of social change—was limited in scope and that his efforts at generalizability were premature (Collard & Law, 1989). This lack of generalizability, in particular, was targeted because critics felt that the theory failed to account for the role of communication and social aspects that impact adult learners, as emphasized heavily within the works by Glaser and Strauss (1967) and Becker, Geer, and Hughes (1968) that were of primary influence to Mezirow (Taylor, 2000; Tennant & Pogson, 1995).

However, later works by Mezirow and his contemporaries have continued to explore the applicability of TLT, proposing that, by its very nature, TLT allows opportunities for adults to reflect on and construct new meaning independently or socially: the meaning making is up to the individual (Mezirow, 2000; Mezirow & Associates, 1990). In fact, multiple learning theories are grounded in the idea that learning experiences can be transformative, including social constructivism, which considers the “social” aspect of the learning process as learners “construct” their knowledge (Dewey, 1938; Mertens, 2010); the sociocultural perspective, which scrutinizes how perceptions and perspectives of learners are shaped by the social and cultural influences of a learning environment (David, 2018; Lave & Wenger, 1991; Wenger, 1998; Wenger & Snyder, 2000); and experiential learning, which explores the role of experience in transformative learning (Cranton, 2006; Kolb, 1984; McLeod, 2017).

More recently, and more specifically applicable to this study, researcher and educator Kathleen P. King has made significant contributions to the field of adult learning in higher education through her explorations of TLT and shared research around implications for reshaping pedagogy. King (2005) advocates that TLT offers a model through which the design and implementation of learning opportunities can and should be planned, with planning centered around desired dispositional outcomes for learners. Specifically, she urges those who develop learning experiences to consider the impacts of these experiences on learners to lead them to question previous assumptions, attempt new strategies, reflect on their own views and approaches to the subject being taught, and plan for their own transitions to apply their new learning (King, 2009).

King’s work with regard to TLT is rooted in Mezirow’s and can help to frame how preservice teachers experience learning “that transforms problematic frames of reference—sets of fixed assumptions and expectations—to make them more inclusive, discriminating, open, reflective, and emotionally able to change” (Mezirow, 2003, p. 58). As preservice education programs are designed to transform the dispositions of teacher candidates, TLT is a compelling lens through which this research may be viewed. With respect to this study, TLT may be used to measure the transformation of habits of mind related to the context of assessment practices in education.

King (2009) advocates that what has been suspected of transformative learning for years has now been substantiated by numerous research studies: “it is critical reflection, dialogue, situated learning, and relationships that are the most effective as the facilitators” (p. xxiv). The research instruments she has developed, including the Learning Activities Survey (LAS) and LAS follow-up interview questions that will be primary vehicles of data collection for this study, are designed to measure contextualized adult learning in terms of these facilitators of transformative learning, and specifically align to the ten phases that define Mezirow’s TLT as delineated in Mezirow (1991):

• Phase 1: Disorienting dilemma (from the result of an experience)

• Phase 2: Self-examination (including awareness of feelings of shame, confusion, frustration, guilt, fear, or anger)

• Phase 3: Critical assessment of initial assumptions

• Phase 4: Recognition of discontent and of the process of transformation

• Phase 5: Exploration of options related to new roles, relationships, or actions

• Phase 6: Planning a course of action

• Phase 7: Acquisition of knowledge and skills related to plan implementation

• Phase 8: Provisional trying of new roles

• Phase 9: Construction of new competence, confidence, and efficacy in new roles and relationships

• Phase 10: Reintegration into one’s roles and relationships inclusive of transformed perspective
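Because the study’s coding of interview and document data is organized around these ten phases, it may help to see how such a scheme can be made operational. The following is a minimal sketch, not drawn from this study’s actual analysis tooling, of how interview excerpts might be tagged against Mezirow’s phases; the class names, field names, and sample excerpt are hypothetical illustrations only.

```python
from collections import Counter
from dataclasses import dataclass
from enum import IntEnum

class TLTPhase(IntEnum):
    """Mezirow's (1991) ten phases of transformative learning."""
    DISORIENTING_DILEMMA = 1
    SELF_EXAMINATION = 2
    CRITICAL_ASSESSMENT = 3
    RECOGNITION_OF_DISCONTENT = 4
    EXPLORATION_OF_OPTIONS = 5
    PLANNING_COURSE_OF_ACTION = 6
    ACQUIRING_KNOWLEDGE_AND_SKILLS = 7
    PROVISIONAL_TRYING_OF_ROLES = 8
    BUILDING_COMPETENCE_AND_CONFIDENCE = 9
    REINTEGRATION = 10

@dataclass
class CodedExcerpt:
    """One interview excerpt tagged with the TLT phase(s) it evidences."""
    participant: str
    text: str
    phases: list[TLTPhase]

# A hypothetical excerpt coded to two phases.
excerpts = [
    CodedExcerpt(
        participant="SCP-1",
        text="I realized my old view of tests did not hold up "
             "once we tried exit tickets in class.",
        phases=[TLTPhase.DISORIENTING_DILEMMA,
                TLTPhase.CRITICAL_ASSESSMENT],
    ),
]

# Tally how often each phase is evidenced across all coded excerpts.
phase_counts = Counter(phase for e in excerpts for phase in e.phases)
for phase, count in sorted(phase_counts.items()):
    print(f"Phase {phase.value} ({phase.name}): {count}")
```

Tallying phase frequencies in this way is one simple means of seeing which parts of the transformative process a set of learners reports most often; the actual instrument and coding procedures belong to King (2009).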

Theoretical Framing of This Research

As a framework for this study, TLT directly relates to the problem of practice by helping to reveal how learning experiences may succeed or fail in transforming the dispositions of preservice educators related to the effective use of FAS. More specifically, TLT is woven into the research questions and the basis of the primary research instrument underlying this case study, and the ten phases of TLT are well suited to illuminating how learning outcomes for preservice teachers have been guided by their learning experiences.

Applying the underlying concepts and assumptions of Cranton (2006), Dirkx (1998), King (2005), Mezirow (1997), Mezirow and Associates (2009), and Taylor (2009), I initially made a number of propositions with respect to this study. First, preservice teachers who feel they experienced a transformation in their learning likely did so (Phase 1) in response to a disorienting dilemma—a conversation, class activity, or encounter with another individual, for instance—that results in (Phase 2) self-examination and (Phase 3) critical reflection about one’s prior beliefs that (Phase 4) leads one to either accept or reject those beliefs. Then, preservice teachers (Phase 5) are able to contemplate the implications of this realization and (Phase 6) plan for how they may act with regard to the subject in the future. They (Phases 7 and 8) may have opportunities to practice applying this new learning—in class activities, for instance—or to apply these new strategies in a field setting, (Phase 9) developing over time new skills and feelings related to the topic until (Phase 10) their new understanding becomes part of their routine practices. Though this is a simplified summary that relies on underlying assumptions about how preservice teachers may experience transformative learning, each aspect of the study is designed to measure, in part or in whole, how preservice teachers experience shifts related to the ten phases of TLT.

Method

This research used a sequential, explanatory case study approach that drew on mixed methods through work with students in a higher education course about educational assessment. This course was offered to preservice teachers in an elementary education program at a large Mid-Atlantic liberal arts university. Primary research methods centered around the research instrument found in King’s (2009) Handbook of the Evolving Research of Transformative Learning (referred to herein as the Handbook). In the Handbook, King (2009) features a TLT research instrument adapted to align with research context and used under copyright release “to original purchasers of the book […] for their own research efforts” (p. 35). The Handbook includes directions from King to researchers for how to modify specific sections of the instrument, and the final instrument featured in this study reflects King’s (2009) directives, as well as subtle inclusions from modified versions King includes in Part Three of the Handbook. All adaptations created are reflective of these elements and received approval from Dr. King. The survey was provided to all teacher candidates in the class (N=39); it served as the first method of data collection. The survey was primarily closed-ended but also included two open-ended prompts to which participants were asked to respond.

King’s (2009) tools to study TLT also include a follow-up interview for respondents to the LAS; this tool is designed to be used selectively to fit the needs of the research study, and as such it was administered to five (5) participants, chosen via purposive, critical case sampling (Creswell, 2012; Miles, Huberman, & Saldaña, 2013) from among respondents who expressed willingness to be interviewed and who did undergo a transformative learning experience, as identified from their scores on the LAS. Additionally, document analysis was conducted on available materials, including the program description that encompasses this course, the course description, the course syllabus, and related evidence of student learning.
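To make the selection logic concrete, the following is a minimal sketch, under the assumption of a simplified yes/no transformation flag, of how LAS responses could be screened to compute the share of respondents reporting a transformation and to assemble the purposive interview pool. The field names are hypothetical and do not reproduce King’s (2009) actual scoring scheme.

```python
from dataclasses import dataclass

@dataclass
class LASResponse:
    """Simplified record of one Learning Activities Survey response.

    The real LAS captures much richer data (e.g., which of Mezirow's
    phases a respondent endorses); this boolean flag is a stand-in.
    """
    respondent_id: str
    experienced_transformation: bool  # shift tied to the assessment course
    willing_to_interview: bool

def transformation_rate(responses: list[LASResponse]) -> float:
    """Percentage of respondents reporting a perspective transformation."""
    transformed = sum(r.experienced_transformation for r in responses)
    return 100.0 * transformed / len(responses)

def interview_pool(responses: list[LASResponse]) -> list[str]:
    """Purposive pool: transformed respondents who agreed to follow up."""
    return [r.respondent_id for r in responses
            if r.experienced_transformation and r.willing_to_interview]

# Illustrative data with counts mirroring this study: 25 of 32 (78.1%).
responses = [
    LASResponse(f"P{i:02d}",
                experienced_transformation=(i < 25),
                willing_to_interview=(i < 10))
    for i in range(32)
]
print(f"{transformation_rate(responses):.1f}% reported a transformation")
print("Interview pool:", interview_pool(responses)[:5])
```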

Relating the Discussion to Audiences

Both teacher candidates and those already in the field would benefit from this research, as the potential to share common language and understanding of FAS could help improve skills across faculty while reducing the sense of inconsistent preparation educators may feel as they enter the field. Additionally, students and their families would benefit from more equally prepared teachers. Finally, those involved in teacher training—including policy makers for the CCSSO and CAEP, as well as administrators and professors in higher education—may be affected by this research as they work to guide measures designed to shape educator preparation initiatives.

Conclusion

As educational accountability around high-stakes testing persists, teachers are expected to enhance their data literacy to include research-based best practices, particularly those around formative assessment strategies (FAS). However, despite related legislation and initiatives that call for teachers to be knowledgeable and skilled at using data to adjust instruction, training for educators at the preservice level is neither sufficient nor consistent. This results in educators’ misconceptions with respect to their uses of FAS that transfer into incorrect practices. The following chapter (chapter two) explores relevant literature related to the effectiveness of FAS, educators’ perceptions about their own training related to FAS, and implications for teacher training, particularly at the preservice level. Chapter three will provide an overview of the research paradigm and methods employed in this study, as well as specifics related to population and recruitment; sampling strategies and criteria; data collection, coding, and analysis; limitations; data storage; and more. Chapter four presents the findings and analysis of the results of the LAS, interviews, and document analysis, while chapter five expands on these data, discussing and examining implications of this research for preservice education and considerations for future research.

Chapter 2 – Literature Review

Although there is abundant research about the value of FAS, and as such the value of training educators to use FAS effectively, the advent and persistence of a high-stakes accountability culture in education has focused much of America’s attention on the use of summative assessment measures to ascribe value judgments about the quality of schools, teachers, and students. As a result, the call for improved preservice education in the types and uses of assessment data has not been heeded consistently, leaving inadequately and unequally prepared teachers entering the field. With the goal of exploring related research that provides vital context to this problem, this literature review presents characteristics of effective FAS, including their placement in teacher preparation curricula and standards, as well as their impacts on instruction and student learning in K-12 education. Next, educators’ perceptions about FAS and their related training in this area will be reviewed. Advantages and disadvantages of these methods of training will be portrayed. Finally, implications for teacher training, particularly at the preservice level, will be discussed. Recommendations to enhance teacher preparation will be offered, as will considerations for future research.

Characteristics of Effective FAS

Defining the characteristics of effective FAS and understanding how FAS are perceived are vital to understanding the strengths and needs of existing training structures around these measures. This section explores the background of how FAS are used, reviews research about traits that affect quality FAS implementation, and summarizes measures of effective FAS with respect to student learning.

Use of FAS in K-12 Education

Research spanning the past two decades calls for teacher preparation programs and professional development for practicing educators to incorporate training around testing and resulting information about student learning, yet findings by numerous scholars indicate that these efforts have not been demonstrably successful on a national or even global scale (Birenbaum et al., 2011; Yan & Cheng, 2015). The CCSSO has helped to prioritize this training as part of a national effort in the United States and other nations to support educators around assessment literacy and data literacy (U.S. Department of Education, 2016). Assessment literacy is generally defined as having knowledge and skills around sound assessment construction and purposes; data literacy refers to having the ability to select, interpret, and use student-related information to improve instruction or other aspects of education (Avramides et al., 2015; Council of Chief State School Officers, 2012). Representing a combination of these two categories, FAS enable educators to understand and act upon the real-time data they collect from a variety of assessments to best support student learning.

Nonetheless, there appears to be little evidence that training measures around assessment literacy and data literacy are sufficient to support sustainable instructional practices. First, as indicated, studies have revealed widespread inconsistencies in the training of educators, resulting in misaligned perceptions and practices (Curry et al., 2015; Reeves & Honig, 2015). These inconsistencies leave educators feeling un- or underprepared to collect and analyze evidence of student learning (Abrams et al., 2015; Ciampa & Gallagher, 2016; Crosswell & Beutel, 2012; Jimerson & Wayman, 2015), creating an urgent need to improve teacher training in these areas.

In a longitudinal research study examining traits of effective FAS implementation in school systems across several different nations, Davies, Busick, Herbst, and Sherman (2014) concluded that training had the best chance of creating sustainable paradigm shifts for educators around FAS only with strong support from school and district leaders and meticulously planned and delivered training that occurred over time. Some of this successful training has occurred through widespread professional development adoption supported by administrator instructional rounds, or supportive classroom visits (DeLuca, Klinger, Pyper, & Woods, 2015); lesson study models that occurred over a period of at least two years (Pella, 2012); and ongoing collaborative sessions (such as professional learning communities, or PLCs) among teachers with administrative and data coach support (Jonsson et al., 2015; Linder, 2011; Marsh, Bertrand, & Huguet, 2015).

Researchers also identify teacher-dependent traits, including teaching style and classroom culture, that indicate FAS are more likely to be implemented successfully. These traits generally are ascribed to educators who identify with constructivist, rather than authoritative, philosophies of education and who are described as open-minded toward professional learning, comfortable providing descriptive feedback to students, and consistent in constructing cohesive learning that provides transparency around learning targets, sequencing of content, and assessments (Birenbaum et al., 2011; DeLuca et al., 2015; Pierce, 2013). With respect to classroom culture, Birenbaum et al. (2011) argue that teachers who routinely invest time in encouraging students’ intellectual risk-taking and engendering mutual trust create environments in which FAS are most likely to flourish.


Distinguishing FAS from “Other Assessment”

Although some FAS may utilize techniques that overlap those used in summative assessments of learning, the chief difference is how the results are used (Chueachot et al., 2013; Pella, 2012). Simply put, if results are used to provide a summary or judgment of learning, then the technique is considered summative and does not qualify as evidence of learning for use in FAS. In addition to intended use, the most important aspects of FAS are that they occur during, not after, instruction and that information, or data, gathered from these techniques are used to adjust teaching practices. Adjustment includes providing timely, non-evaluative, descriptive feedback to students to guide them. It also may include engaging students in peer- or self-assessment to help them reflect on their own learning, revising the pace of instruction, providing further supplemental supports around areas of need, and ensuring students are on the best course to demonstrate mastery when they reach summative assessments (Abrams et al., 2015; Curry et al., 2015).

Among the ways that educators can assess students, FAS include, but are not limited to, a variety of types of formative assessments that can be either formal (such as quizzes or benchmark tests) or informal (such as exit tickets or thumbs-up/thumbs-down) (Abrams et al., 2015; Cooper & Cowie, 2010). FAS provide crucial sources of information about what students know, understand, and can do. Teachers who demonstrate data literacy with respect to assessments for learning are better able to adjust instruction on a regular basis to ensure that classroom learning is reflective of what students need to progress academically (Allendoerfer et al., 2014; Athanases et al., 2013; Baere, Marshall, Torgerson, Tracz, & Chiero, 2012; Cooper & Cowie, 2010).


Impact of Effective FAS on Student Learning

When implemented effectively, FAS have been found to have a range of positive effects on student learning. FAS are found to naturally engage students in active learning, which results in both increased learning and improved student performance (Cooper & Cowie, 2010; Pierce, 2013). Curry et al. (2015) collected evidence of FAS from classroom observations and anecdotal notes, interviewed educators, and reviewed student achievement data from both formative and summative sources that spanned a three-year period. They assert that FAS increased students’ ownership over their own learning with respect to goal setting that appeared to be linked to achievement gains. Chueachot et al. (2013) utilized interviews and a questionnaire in a study of FAS in elementary math classrooms and determined that these strategies have a statistically significant impact on students’ self-efficacy. In fact, students in classrooms found to include effective FAS reported these same feelings of active learning, increased ownership over their own learning and goal setting, and self-efficacy (Hargreaves, 2013).

Several studies have found that, when FAS are implemented deliberately and successfully, they lead to some of the largest gains of any instructional practice or intervention (Chueachot et al., 2013; Hill, 2011a). These gains appear consistently from the classroom achievement level to results from high-stakes assessments. For instance, a small study reports that achievement gains for students exposed to FAS exceeded 10%, a significant improvement over students in the control group, who did not learn under FAS conditions and improved by about 2.4% (Yu & Li, 2014). Jonsson et al. (2015) report a clear correlation between students’ grade point averages and their responses on a questionnaire about their teachers’ FAS practices. Other studies link FAS to improved scores on Common Core State Standards (CCSS)-aligned state and federal assessments (Curry et al., 2015), particularly for students who typically scored among the lowest achievers (Abrams et al., 2015; Chueachot et al., 2013; Curry et al., 2015).

Conclusion

Instructional styles and classroom culture are directly linked to educators’ own training and philosophies of education, which can result in disparity among students’ perceptions of FAS and potential achievement gains. While evidence demonstrates the potential for FAS to positively impact student learning experiences and achievement, particularly for historically low-achieving students, these effects are sustainable only when school leadership is committed to concerted, ongoing training that supports teachers in learning about and using FAS. Given the focus on high-stakes testing in the United States, it is imperative that educators are taught how to utilize effective FAS to improve student achievement. For this to happen, though, educators must have a common understanding of FAS practices and an awareness of their own perceptions and knowledge.

Educators’ Perceptions of FAS

Evidence from numerous studies illustrates pervasive irregularities in assessment and data literacy training, resulting in misconceptions and misaligned practices of FAS (Curry et al., 2015; Reeves & Honig, 2015). Other researchers contend that these inconsistent training practices contribute to teachers’ lack of preparation around FAS (Abrams et al., 2015; Ciampa & Gallagher, 2016; Crosswell & Beutel, 2012; Jimerson & Wayman, 2015). After conducting an extensive review of scholarly research with respect to international FAS in education, Azis (2015) engaged in mixed-methods research that included surveying and interviewing teachers participating in related ongoing professional development. She reported that analysis of teachers’ understandings of FAS and their actual practices indicated confusion around which techniques are formative and which are summative, concluding that teachers’ perceptions merited serious consideration before related studies around the actual effectiveness of FAS continued.

These studies indicate that linking teachers’ perceptions and misconceptions of FAS to specific training methods may provide further insight into how to improve teachers’ knowledge and skills.

Teachers’ Misconceptions of FAS

There is some evidence to suggest that teachers’ misconceptions of FAS pertain to nuances in understanding, wherein teachers understand concepts but either lack knowledge or do not feel comfortable implementing them. For instance, Rashid and Jaidin (2014) undertook a qualitative approach using a phenomenographic methodology to gain insight into teachers’ application of FAS in elementary classrooms. The results of semi-structured, in-depth interviews revealed that nearly all teachers had learned about and conscientiously employed questioning techniques that spanned the levels of Bloom’s Taxonomy. Remarks from the interviews highlighted teachers’ feelings of satisfaction in their increased use of questioning, which they described as elemental to the use of varied FAS. However, following classroom visits, observations, and subsequent interview sessions, it was found that though teachers understood the process of questioning, they struggled to provide feedback to students that was descriptive, as opposed to evaluative. Another study found similar results of teachers leaving students out of the FAS process: teachers in elementary settings are the least likely to engage students in peer- and self-assessment, citing hesitation to try either approach for fear that they may cause students to feel anxiety or tension (Volante & Becket, 2011).

Further insight into teachers’ beliefs about their own practices versus their actual practices can be gleaned from a study by Allendoerfer et al. (2014), who surveyed and interviewed 350 educators across 13 different schools about the learning environments in their classrooms. When asked if their classrooms were knowledge-centered, community-centered, assessment-centered, or learner-centered, every respondent stated that his or her classroom was knowledge-centered. However, follow-up research revealed that only 25% of teachers had accurately assessed their own classroom environments; the rest were split among the other environments. Similarly, the same study asked whether teachers’ approaches to instruction were generally teacher-centered or student-centered. Though a majority reported student-centered approaches, the reverse was found to be true (Allendoerfer et al., 2014).

In trying to determine the cause of these teachers’ misconceptions with respect to their use of FAS, teaching style, and classroom culture, the prevailing theory is that a majority of educators simply do not understand the role of FAS as a process, rather than an isolated technique, that is designed to be integrated into regular teaching practices (Yan & Cheng, 2015). In a study that focused on helping teachers shift their use of rubrics from summative to formative, more than 80% of teachers reported they felt comfortable with this shift; however, classroom observations and analysis of actual practice revealed that less than 25% of these teachers were able to use rubrics to provide descriptive feedback without attaching summative grades (Mui So & Hoi Lee, 2011).

Other researchers cite teachers’ self-efficacy as an attribute that leads teachers to feel confident in their general abilities to teach, and therefore in their incorporation of FAS, causing them to overestimate their practices. In fact, across several studies that surveyed both teachers and students, teachers’ self-efficacy ratings were found to be incongruent with students’ ratings of teachers’ uses of FAS (Pat-El, Tillema, Segers, & Vedder, 2015; Reeves & Honig, 2015; Yu & Li, 2014).


Teachers’ Perceptions of FAS with Respect to Training Methods

Studies indicate there are extensive inconsistencies in teacher training that lead to misaligned perceptions and practices around FAS (Curry et al., 2015; Reeves & Honig, 2015), leaving educators feeling ill-equipped to use student learning data to adjust instruction (Abrams et al., 2015; Ciampa & Gallagher, 2016; Crosswell & Beutel, 2012; Jimerson & Wayman, 2015).

Understanding educators’ perceptions of their own learning regarding FAS can provide insight into how misconceptions might be addressed through improved training. Although in isolated cases teachers who resisted district-wide initiatives around FAS found success individually learning about and employing FAS through means such as lesson studies (Pella, 2012), most research reveals that training for practicing teachers is effective when it is collaborative in nature and carefully planned and implemented with ongoing administrative and data coach support (DeLuca et al., 2015; Jonsson et al., 2015; Linder, 2011; Marsh et al., 2015). These findings draw on teacher perceptions of training and of FAS that are corroborated by reviews of actual classroom practice and longitudinal student achievement.

Positive perceptions with respect to training methods. In a case study involving middle school teachers, Linder (2011) concluded that collaborative time together was the primary factor that contributed to data literacy and successful implementation of FAS, citing educators’ comments about shared planning time that enabled them to explore new data-based practices and review evidence of student learning together. The teachers in Linder’s study also noted that administrator support in protecting collaborative time helped them to learn. In a similar but larger study, DeLuca et al. (2015) reported that collaborative learning was key for teachers but secondary to very strong support from educational leadership. The determining factor in this study was that educators were most successful when their principals elected to join them in training sessions and engaged in instructional rounds (IRs), or classroom visits, to provide teachers with feedback about their use of FAS.

Marsh et al. (2015) conducted a study at low-performing schools across four districts that drew on a collection of artifacts and achievement data, collaborative meeting observations, interviews, and focus groups. Perceptions reported from teachers, coaches, and PLC lead teachers contributed to the view that administrative support was particularly beneficial when school leaders regularly attended and contributed to training sessions. Marsh et al. (2015) also found that their data generally revealed very positive perceptions of data coaches who helped teachers understand how to analyze formative assessment data and to use these data in support of FAS.

Other studies involving collaborative sessions with educators also found success when these sessions were carefully planned and implemented with administrator support. In fifteen focus group sessions involving elementary and middle school teachers (N=67), teachers largely credited collaborative time as enabling them to discuss patterns of strength and need in student data and to share instructional strategies to address these needs (Abrams et al., 2015). Using questionnaires and interviews to gather teachers’ perceptions of a larger-scale implementation of teacher learning communities (TLCs) as a model for imparting data literacy, Jonsson et al. (2015) reported teachers’ positive views regarding more frequent pedagogical discussions, an increase in school-wide and classroom-level transparency, and an increase in FAS to support student learning.

Similarly, teachers in another study were asked about the impact of using FAS in their classrooms; these teachers reported they felt collaborative time greatly improved their own effectiveness as educators, including their ability to improve student learning in relation to academic standards (Cooper & Cowie, 2010). The reference to standards-based learning in the study by Cooper and Cowie (2010) is strongly echoed in other research around collaborative data literacy training with respect to FAS. Generally, educators appear to support FAS when they can see a clear tie to educational standards (Baere et al., 2012), noting that FAS are among the most powerful, transparent methods they can use to adjust their instruction to help reach student achievement goals (Curry et al., 2015). In the study by Abrams et al. (2015), teachers reported that formative assessment data used solely for FAS purposes were the most valuable sources available to them for recognizing and addressing gaps in their instructional practices.

Negative perceptions with respect to training methods. Despite teachers’ generally positive perceptions of their development of FAS from data literacy training, research also reveals that some teachers report concerns about an increase in their workloads. Jonsson et al. (2015) attributed the concerns of teachers in their study to students’ ongoing demands for opportunities to revise their work based on feedback, despite teachers’ obligations to eventually pause instruction to administer summative assessments. Jonsson et al. (2015) further linked teachers’ concerns about workload to potential misconceptions of their self-efficacy with regard to using FAS effectively and fostering knowledge-centered learning environments, which may shed light on findings noted previously in this literature review with respect to Allendoerfer et al. (2014), Mui So and Hoi Lee (2011), and Yan and Cheng (2015).

Another concern regarding FAS and teachers’ workload involves the timeliness of data availability. In cases where educators elicit evidence of student learning themselves, via exit tickets, thumbs-up/thumbs-down, quizzes, and the like, teachers have instant or near-instant access to student learning data that they can act on to adjust instruction. In some districts, however, administrators may seek to utilize formative assessments on a larger scale (for instance, at a grade or school level), which can increase teachers’ workloads as they grade these assessments or wait for the resulting data. Unfortunately, using formative assessments on a larger scale or delaying the time it takes teachers to access data directly contradicts the underlying principles of FAS (Abrams et al., 2015).

Perceptions with respect to needs across training methods. Although training for teachers is certainly happening at different levels, the greatest needs seem to be in helping teachers engage in paradigm shifts with respect to instructional practices and in providing targeted training based on needs that teachers express. With respect to paradigm shifts, Rashid and Jaidin (2014) learned that many elementary teachers in one study were not implementing FAS correctly despite their intentions and concluded that training for educators must stress the importance of using formative assessment as a part of routine instruction that includes students as owners of their own learning, not as an assessment measure independent of instruction and student participation. In a very similar study, teachers stated they found it difficult to include students in the FAS process because they were concerned doing so would overwhelm students or cause them to feel uncomfortable about their learning (Volante & Becket, 2011). In a case study involving eighth and ninth grade teachers, Ciampa and Gallagher (2016) used qualitative data sources including anecdotal notes, teacher interviews, and professional learning community blog entries to evaluate teachers’ needs with respect to data literacy training and FAS. They concluded that most teachers found it difficult to shift their emphasis from grading to learning.

With respect to needs directly expressed by teachers, most research centers on data interpretation and use. In a district-wide qualitative case study, researchers sought to determine teachers’ perceptions and needs around FAS. Though a majority of teachers felt their practices had changed for the better, they emphasized that learning to use data effectively to understand and respond to students’ needs is a skill that can only be developed over time and with support from others versed in data analysis (Curry et al., 2015).

In another study, Jimerson and Wayman (2015) categorized three specific needs teachers have with respect to using formative assessment data to make instructional decisions: (1) making meaning from data, (2) learning how to communicate about data interpretations in low-stakes conversations with other educators and with students, and (3) assimilating knowledge from data training into sustainable, ongoing practice (pp. 11-12). Based on this study, the researchers concluded that what teachers most needed was training in data conversations, so that all educators learned what questions to ask of their data, what conclusions to draw from their data, how to become comfortable using these findings to impact instruction, and in what ways to share their findings with multiple stakeholders. Essentially, they felt that if teachers knew how to talk about their data, FAS would become internalized in their school culture (Jimerson & Wayman, 2015).

Conclusion

Examining the research with respect to teachers’ understandings and perceptions of FAS provides insight into the current state of FAS implementation and further needs in this field. Although FAS training is not a new concept in education, multiple studies reveal that many teachers do not wholly understand aspects of FAS, resulting in misaligned practices being implemented in their classrooms. Some early research into these misconceptions describes teachers’ self-efficacy and confidence as traits that inflate their perceptions of their classrooms as learner-centered and knowledge-centered. Other studies echo these findings, reporting that some teachers admit to hesitancy about involving their students in the learning process, particularly at the elementary level, for fear of making students uncomfortable.


Training for educators that is carefully planned and implemented seems to contribute to teachers’ positive perceptions of FAS in ways that support student learning and achievement, particularly when supported by administrators or data coaches in ways that enable teacher collaboration. Despite these positive perceptions, there is still a demonstrated need for training to address data analysis and data conversations. Doing so would address teachers’ requests for training in these areas and, in many cases, may reduce the time it takes to engage in data analysis, thereby mitigating some concerns about workloads.

Effectiveness of FAS Training

Thus far, this literature review has examined the use and effectiveness of FAS as well as teachers’ perceptions of FAS with respect to specific training methods. This section provides a general review of the evolution and current state of standards that impact teacher education, and then reviews the two primary methods of teacher training around FAS: preservice preparation and professional development. Furthermore, it explores the benefits, drawbacks, and needs of each method.

Evolution and Current State of Teacher Preparation Curricula and Standards

The use of FAS by teachers has become increasingly important as, over the past several decades, various educational movements in the United States have led to an increased emphasis on educational achievement. Legislative initiatives including No Child Left Behind (2001) and Race to the Top, part of the American Recovery and Reinvestment Act of 2009, highlight the emphasis that the American public school system has placed on accountability. Pursuant to this emphasis, the United States utilizes high-stakes assessments as measures of student achievement and supports initiatives and programs that augment teacher training so educators are prepared to help students reach the standards measured by these tests (U.S. Department of Education, 2016).


Today, “teachers are expected to not only be experts in their content, but they are also expected to understand the needs of all learners and how to differentiate instruction to meet those needs” (Mikulec & Miller, 2012, p. 34). Keeping pace with these initiatives and expectations, various standards and bodies related to teacher education have evolved. For years, the National Council for the Accreditation of Teacher Education (NCATE) and the Teacher Education Accreditation Council (TEAC) helped to establish parameters for qualifying teacher preparation programs; in July 2013, these organizations merged to become the Council for the Accreditation of Educator Preparation (CAEP) (Council for the Accreditation of Educator Preparation, 2014).

Examining trends in teacher education and pre-K-12 learning, Kennedy (2000) notes the development of the Interstate Teacher Assessment and Support Consortium (InTASC) standards and the National Council for the Accreditation of Teacher Education (NCATE), citing that it is essential for educators to learn to use multiple measures of student learning to plan for and adjust instruction that meets the needs of all learners. Unfortunately, Kennedy (2000) also notes that, while many states have outlined their own standards for teacher education that reflect national guidelines, a variety of competing influences have led to such complications that implementation of the standards at the federal and state levels is inconsistent and ineffective.

Acknowledging complications in bureaucracy and legislation with relation to the InTASC standards, Shaw (2013) examines how the 10 standards around which educators are expected to be competent have evolved since their inception in 1992. Projecting further evolution of standards and needs in education, she cautions that teachers must recognize the strengths and limitations of standardized testing and be skilled in using a variety of measures of data to have the clearest understanding of students’ areas of strength and need. In other words, the need for educators to learn how to use FAS has persisted for nearly twenty-five years, and this need continues to increase to match federal expectations.

Despite national standards calling for teacher preparation programs to include training around assessment and data literacy for some of these very reasons, professional development delivery in these areas is still necessary because many educators entering the field have not received adequate coursework in them. Sadly, over a decade ago, a national panel report issued by the Association of American Colleges and Universities indicated that “The shape of the undergraduate curriculum was essentially fixed half a century ago. Although listed in the catalog as part of a curriculum, individual courses are effectively ‘owned’ by departments, and most advanced courses by individual professors” (National Panel Report, 2002, p. 16). More recently, scholar Randy Bass notes that this lack of change in curricula continues despite broad implementation of high-impact practices designed to improve experiential learning in teacher preparation programs; these practices become extracurricular rather than integrated within core curricula (Bass, 2012, p. 25).

Based on the evolution of the InTASC standards and the goals first articulated by NCATE and TEAC, which are now espoused by CAEP, scholars in teacher education call for explicit training for educators that is hands-on and authentic, such that educators can integrate these standards into ongoing practice (Henson, 2009; Memory, Yoder, & Williams, 2003; Shaw, 2013). Responses to these standards and calls for action can be noted through examination of the two primary methods of teacher education: traditional professional development and preservice teacher training.


Traditional Professional Development

In a survey that explored the relationship between teachers’ accountability for student learning and their acquisition of assessment and data literacy, participants (N=129) revealed great disparity in the scope, sequence, and quality of their professional development experiences (Jacobs et al., 2015). As discussed previously, training in data literacy and FAS that occurs through traditional professional development (used in this context to refer to training for teachers who are already in the field) generally takes place as administrator-directed or teacher-directed. Administrator-directed professional development generally occurs as individual training days, ongoing professional development in the form of a series of training sessions, or meetings of collaborative teacher teams (Birenbaum et al., 2011; Birenbaum, Kimron, Shilton, & Shahaf-Barzilay, 2009; Davies et al., 2014). Teacher-directed professional development often occurs in the form of lesson studies, wherein individual or small groups of teachers select topics to research and apply through short action research (Cooper & Cowie, 2010; Pella, 2012).

Traditional professional development generally is effective for teachers who value opportunities to learn in a linear fashion, engage in question-and-answer sessions, collaborate with instructors or peers, and receive feedback on their own learning (Birenbaum et al., 2009; Hiebert & Morris, 2012). Specific to data literacy and FAS training, traditional professional development most benefits educators who perceive value in discussing data and instructional practices with their peers, particularly when they feel supported by administrators or data coaches (Linder, 2011; Marsh et al., 2015).

Additionally, the PLC model of training generally affords educators the opportunity to feel as though their professional development is customized to their needs, particularly because leaders in such training typically are aware of, and often members of, the various subject departments and grade levels at their own schools (Hill, 2011a). In PLC-centered professional development guided by school-based administration or peer leaders, Ciampa and Gallagher (2016) find that reflective practice among educators increases because of opportunities to engage in experiential learning. They add that this type of embedded professional learning is particularly effective for teachers when it fosters peer interaction about new skills, such as data literacy and FAS, because collaboration around authentic practice directly impacts self-efficacy in a way that can cause cultural shifts (Ciampa & Gallagher, 2016).

Although traditional professional development has its merits, several studies related to its effectiveness reveal challenges teachers encounter and the negative perceptions they hold. For example, although middle school teachers in her study were generally positive about their professional development experiences in PLCs, Linder (2011) reports two negative experiences of teachers in this training: (1) they felt isolated from their peers who were not directly involved in this method of professional development, such as non-core educators, and (2) they felt as though this collaborative time focusing specifically on training content sometimes took away from opportunities to discuss other school-based concerns, including non-instructional events or administrative tasks (p. 81).

A study by Jimerson and Wayman (2015), which focused more specifically on the effectiveness of professional development on teachers’ knowledge and skills around FAS, supports Linder’s (2011) finding that traditional training for educators can be isolating. As such, despite the effort that may go into traditional professional development, the preservation and availability of knowledge and tools after training has concluded are vital (Jimerson & Wayman, 2015). Finally, implementation and sustainability of professional development at the secondary education level is also historically difficult, which detracts from this method’s effectiveness (Hill, 2011a; Volante & Becket, 2011).

Throughout the research, several traits emerge as enabling traditional professional development to succeed by helping teachers to develop sustainable knowledge and skills. First, it is recommended that administrators cast themselves in the role of change agents (Davies et al., 2014; Hill, 2011a), ensuring that planning and intentionality are at the heart of all teacher training (Jimerson & Wayman, 2015). Additionally, teachers must be able to perceive the value and relevance of training (Hiebert & Morris, 2012; Volante & Becket, 2011) and must be able to interact with other staff members to ensure their learning is not isolated (DeLuca et al., 2015).

Preservice Teacher Training

Several studies with respect to preservice teacher training have overarching implications for assessment and data literacy, particularly with respect to FAS. First, a survey of 350 faculty members from 13 teacher education colleges summarizes the need for preservice teachers to acquire both theoretical and practical knowledge (Allendoerfer et al., 2014). These findings are supported by earlier research from a survey of student teacher site supervisors (N=293), the results of which led Shepherd and Alpert (2012) to caution that student teachers who do not have explicit opportunities to practice applying their learning in authentic situations would not be prepared to support students based on contemporary research. Finally, Crosswell and Beutel (2012) distributed a questionnaire designed to assess preservice teachers’ (N=34) perceptions of how prepared they were to teach; few responses included references to theoretical knowledge, and all responses cited the need for more authentic, hands-on practice.

Educators in the field appear to echo the findings of these studies. Volante and Becket (2011) interviewed practicing elementary and secondary teachers (N=20) to explore their beliefs about how their training prepared them to use formative assessment data as part of FAS. Though their participants had between eight and 28 years of experience, a majority credited their preservice preparation as their primary source of data literacy training. Interestingly, in follow-up questions, the researchers could not determine any patterns with respect to teacher training or background. In other words, most teachers felt their best preparation to use data for FAS came from their undergraduate and graduate coursework before they entered the field, yet they did not study at the same schools or in the same programs.

Indeed, research indicates that teacher preparation coursework benefits those entering the teaching field in terms of both skill and confidence levels. In one study by Reeves and Honig (2015), student teachers (N=64) took part in a single, six-hour seminar on data literacy; results from a pre-test and post-test found that all participants demonstrated a greater ability to use data analysis tools, such as Excel, to identify patterns of strength and need for students with respect to instructional content. Moreover, these student teachers also reported greater self-efficacy and greater understanding around using data to prescribe adaptations to instructional plans. In a much larger study of students (N=190) enrolled in a semester-long course about the use of assessment data, including for FAS, the Assessment Literacy Inventory (ALI) was used to measure students’ progress. Coursework was found to increase assessment literacy in all areas for all students, and to have a particularly positive impact in the areas of ethical data use, selecting assessment methods, and scoring (McGee & Colby, 2014).
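
To make the kind of pattern-finding described above concrete, the short sketch below illustrates, in Python rather than the Excel used in the seminar, how formative scores might be grouped to flag areas of strength and need. It is purely hypothetical: the student names, standards, scores, and the 70% mastery cutoff are invented for demonstration and are not drawn from Reeves and Honig (2015).

# Hypothetical illustration of identifying patterns of strength and
# need from formative quiz scores; data and threshold are invented.
import pandas as pd

scores = pd.DataFrame({
    "student":  ["Ana", "Ana", "Ben", "Ben", "Cam", "Cam"],
    "standard": ["fractions", "decimals", "fractions", "decimals",
                 "fractions", "decimals"],
    "percent":  [92, 55, 78, 61, 44, 88],
})

MASTERY = 70  # assumed cutoff separating "strength" from "need"

# Average performance per standard reveals class-wide patterns
# that might prompt reteaching a whole topic.
by_standard = scores.groupby("standard")["percent"].mean()
print("Class averages by standard:")
print(by_standard)

# Per-student flags identify who needs support on which standard,
# the kind of actionable, real-time information FAS depend on.
needs = scores[scores["percent"] < MASTERY]
print("Students needing support:")
print(needs)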

However, the results of the study by McGee and Colby (2014) also indicate that preservice education students grew the least in the areas of interpreting and communicating assessment results, skills for which students received theoretical, but not hands-on, exposure within this course. This finding is not surprising, given the aforementioned research about best practices with respect to teacher preparation by Allendoerfer et al. (2014) and Shepherd and Alpert (2012).

Moreover, the need for preservice teachers to receive practical training around the use of data literacy and FAS skills can be seen in the results of additional research. With respect to “practical” training, Mikulec and Miller (2012) remark that “While preservice teachers are good at parroting notions of differentiating instruction and meeting the needs of learners, they live in blissful ignorance of what this means in practice” (p. 36). Indeed, several studies reflect this analysis. For example, a separate, six-year longitudinal study of the training of preservice teachers (N=80) who then entered the teaching field found that those who had coursework about data collection and data analysis during their preparation felt better equipped to choose and collect data from various assessment measures than educators who had not received this training, but felt less prepared to actually use these data to alter their instruction (Athanases et al., 2013). In another study, Ogan-Bekiroglu and Suzuk (2014) reported that preservice teachers (N=28) who received training in assessment and data literacy generally had strong theoretical knowledge but lacked practical experience applying their learning.

The Re-Culturing of Teacher Education

Many sources included in this literature review refer to the effectiveness of FAS based on educator training approaches and conclude that these methods need to be revised. A major problem raised with respect to restructuring professional development for preservice educators stems from resistance to changing culture and existing beliefs about the role of data in education (Baere et al., 2012). As such, changes to cultural perceptions of data are not widespread, and the most prevalent mode of training for teachers who already are practicing in the field takes the form of school-based PLCs, led or directed by administrators, that support collaboration for practicing educators (Birenbaum et al., 2011; Ciampa & Gallagher, 2016; Davies et al., 2014).

Because of the perceived need for training to alter school culture in order to be effective, not all researchers are confident in professional development, particularly when it is guided by administrators or other members of the school community perceived as leadership. In their study about training for data literacy and FAS in elementary and secondary classrooms, Volante and Becket (2011) conducted extensive interviews of teachers and found an overwhelming sense of displeasure with professional development directed by school and district administration, particularly among secondary teachers. Participants reported perceptions that top-down teacher training mandates are neither practical nor sustainable because they do not mesh with existing school culture, offering self-selected and self-directed approaches as alternative means of extending their professional learning.

These findings are consistent with those from a mixed-methods study by Birenbaum et al. (2011), which considered survey and interview data to understand the contexts that affect educators’ perceptions of school-based professional development. Birenbaum et al. (2011) summarized their findings, stating that teachers in their study generally resist forms of teacher training if they do not perceive an immediate and authentic connection between the content of the professional development and the needs and challenges facing their schools and their own classrooms.

Despite generally positive findings from her study with respect to administrator and teacher-leader support of professional development, Hill (2011a) acknowledged that affecting school-wide culture with respect to assessment practices is complex and challenging because it requires a complete paradigm shift in educators’ beliefs and practices.


Based on results from their quantitative study, Yan and Cheng (2015) posit that this is particularly difficult because educators frequently perceive training around new skills as practices being added to their already-extensive existing responsibilities. Curry et al. (2015) offer a teacher-led approach to professional development as an alternative to administrator-led or administrator-directed training, suggesting that teacher motivation and collaboration are higher when teachers can select the data literacy skills they feel are necessary and choose their own approach to learning. The consensus among researchers seems to be that a shift in teachers’ culture is nearly impossible unless teachers perceive authentic value in new knowledge and are taught how to integrate new skills into their regular practice instead of taking on skills as new practices in addition to their regular responsibilities (Curry et al., 2015; Yan & Cheng, 2015).

Revisions to teacher training for preservice teachers also require a shift in culture, but these changes may be easier to effect, given the very nature of teacher preparation. In their article analyzing factors affecting preservice teachers’ assessment and data literacy preparation, Baere et al. (2012) find that a majority of higher education programs do not include sufficient coursework in these areas, despite program mission statements to the contrary. For instance, in a review of teacher preparation programs’ syllabi and a survey across 208 institutions of higher education, Mandinach et al. (2015) reveal grave concerns about teacher preparation programs with respect to readying educators to use data as part of FAS.

While a majority of programs report that they provide coursework around data-based decision-making, this generally refers to a single course offering that focuses almost solely on assessment literacy, not data literacy (Mandinach et al., 2015). Additionally, DeLuca et al. (2013) argue that the demand for teachers to demonstrate data literacy in the field is not matched by preparation measures in programs for preservice educators, owing to cultural resistance at institutions of higher education that focus more on traditional methodology than on research-based best practices that have emerged in the past several decades.

Conclusion

Although traditional professional development is used for practicing educators to learn about FAS, many practicing educators credit coursework they received when preparing to enter the field as their primary source of knowledge about using data to improve instruction, particularly with respect to FAS. However, preservice coursework, while clearly beneficial in preparing teachers before they enter the field, may not provide participants with the authentic, hands-on field experience needed to use data effectively in support of student learning. There is room for improvement in preservice teacher training, though shifts in the culture of education must occur first for these changes to be sustainable.

Summation

One looming gap in some of this research is the inherent weakness of data that are self-reported by participants; collecting this information from preservice educators nonetheless will be essential because it will help to fill a gap about teacher preparation that is not wholly present in existing research. For example, determining the extent and quality of current teacher training practices for preservice educators can be challenging because of existing deficiencies in the evidence. While a majority of programs report that they provide coursework around data-based decision-making, some studies suggest this may often refer only to a single course offering that focuses almost solely on assessment uses and does not adequately cover assessment or data literacy (Mandinach et al., 2015). Moreover, examining teacher readiness to implement FAS can be challenging, owing to reported limitations related to the self-reporting nature of participants in many surveys and interviews (Birenbaum et al., 2011; Volante & Becket, 2011). Given each of these factors, a study that can help provide insight into preservice teachers’ conceptions of FAS prior to and resulting from learning experiences about educational assessment (that is, a study that can provide insight into how transformative learning may occur for preservice teachers as a result of this coursework) is appropriate and may provide invaluable insight for teacher preparation. Though study results could not be generalized, they certainly would provide grounds for further research.

Despite limitations described in this literature review, the general conclusions support existing research around the need for educators to be trained to use FAS effectively. Though FAS are among the most effective strategies for improving student achievement, particularly among historically lower-achieving students, there clearly are misconceptions resulting in weak or incorrect data use among some teachers. Many of these misunderstandings and misaligned practices are attributable to flaws in the primary learning experiences used to train teachers, whether for practicing educators or within teacher preparation programs. There are strengths and drawbacks to both types of training, but there is little logic in relying on professional development for practicing teachers as the primary method to convey knowledge and skills that support data literacy and FAS. Research further demonstrates that most current preservice teacher training is not sufficient in scope and sequence, nor does it provide enough authentic, hands-on experience to adequately prepare those seeking to enter the teaching profession.

Additionally, research calls for revising teacher professional development, noting that current measures alone are not sufficient for data literacy and FAS training. For instance, despite identified conditions under which traditional professional development is most likely to be successful, research demonstrates that knowledge and skills imparted through this training method are not consistent across schools or districts, let alone across the K-12 public education system (Athanases et al., 2013; Nortvedt et al., 2016). Moreover, educators sometimes appear to resent top-down professional development, particularly at the secondary level (Hill, 2011a; Volante & Becket, 2011). While some practicing educators appear to credit their preservice coursework as the basis for their ability to use data to inform instruction (Volante & Becket, 2011), researchers caution that preservice coursework, while beneficial, generally places more emphasis on assessment literacy than on data literacy (Mandinach et al., 2015) and does not provide sufficient opportunities for preservice educators to practice the application of their learning (DeLuca et al., 2013).

Making lasting changes to any type of teacher training would require widespread policy changes (Curry et al., 2015; Hill, 2011a). Though the cultural shifts that research suggests would be needed for such broad changes may not be forthcoming, a step in the right direction would be to require that a much stronger foundation in data literacy be offered at the preservice education level. One important result of increasing preservice education is the potential savings in terms of both time and money: most efforts to improve student achievement in American public education are largely reactive, as opposed to proactive (Baere et al., 2012; Bernhardt, 2004; Ginsberg & Kingston, 2014). In other words, a great deal of time and money are spent on teacher professional development to help educators already in the field develop knowledge and skills that they might have acquired during their teacher preparation programs had they been given the opportunity.

Unfortunately, there currently does not seem to be a standard measure of the actual effectiveness of teacher preparation programs (Athanases et al., 2013; Ginsberg & Kingston, 2014), revealing this as one area for future research. Though some in education cite teacher licensure examinations as a measure of teacher preparation, these types of assessments are a minimum standard that falls short of measuring what teachers are able to do (Benedict, Thomas, Kimerling, & Leko, 2013; Libman, 2009). Without a better standard to measure this effectiveness, best practices must be ascertained from existing research, which includes recommendations for coursework that provides theoretical knowledge of both assessment and data literacy, coupled with student teaching requirements that provide hands-on opportunities to practice using data as part of FAS (Allendoerfer et al., 2014; Benedict et al., 2013; Crosswell & Beutel, 2012; Shepherd & Alpert, 2012).

Moreover, authentic exercises should be included that support preservice teachers in learning to communicate about analysis and instructional strategies with multiple stakeholders, including students and other teachers, in low-stakes data conversations (Jimerson & Wayman, 2015). If preservice teachers enter the field equipped with this solid foundation, there will be more opportunities for universal implementation of FAS and positive impacts on student achievement. Ultimately, traditional professional development would better serve educators if it operated as a refresher or a means for sharing research-related best practices with practicing educators, as opposed to being relied on as the primary training method for knowledge and skills as crucial as FAS.

Both teacher candidates and those already in the field would benefit from this research, as the potential to share a common language and understanding of FAS could help improve skills across faculty while reducing the sense of isolation educators may feel from inconsistent preparation. Additionally, students and their families would benefit from more equally prepared teachers. Ultimately, those involved in teacher training, including policy makers for the CCSSO, CAEP, and those responsible for teacher preparation in higher education, would be most affected by further research. The latter is particularly vital, as findings ultimately would enhance teachers’ skills and improve education for K-12 students. The following chapter will provide an articulation of the selected methodology for this research project.


Chapter 3 – Methodology

The general purpose of this mixed-methods case study was to examine how learning experiences designed for preservice teachers at a large liberal arts university in the Mid-Atlantic region, during their engagement in an educational assessment course for elementary teachers, affected their habits of mind related to the effective use of FAS.

Overview of the Research Design

Primary data collection methods for this study included a survey, semi-structured interviews, and document analysis. These methods are explored in greater detail in the Data Collection section of this chapter. Using these methods, I sought to answer the following research questions:

• Primary Research Question: How are elementary preservice teachers’ habits of mind about formative assessment strategies (FAS) impacted by their learning experiences?

• Research Subquestions: (1) What shifts in habits of mind, if any, do elementary preservice teachers perceive they experienced regarding understanding the role of effective FAS in education as a result of their coursework about educational assessment? (2) What learning experiences, if any, do elementary preservice teachers believe affected their habits of mind during their coursework about educational assessment?

Case study methodology, the approach chosen for this study, was ideal because case studies are designed to explore the beliefs and practices of participants within an environment bound by time and space (Stake, 1995; Yin, 2013), offering an exploration of participants and how they experienced the given phenomena in a real-life context (Merriam, 1998; Yin, 2013). This case study considered how course participants engaged with the learning experiences in this assessment course.

Research Paradigm

This study applied case study research that used mixed methods of data collection, an approach that is ideally suited to describe the narrative that builds from a study’s participants (Creamer, 2018; Creswell & Plano Clark, 2018; Kitchenbaum, 2010). Integration of both quantitative and qualitative methods helped to ensure that the planning of this study and its research instruments supported me through “multiple ways of seeing” (Creswell & Plano Clark, 2018, p. 4) the participants’ narratives as each was built, particularly through the use of case study research.

Case study research is one of the most frequently used mixed methods approaches, although seminal scholars vary in their assessment of its protocols and methodologies (Baxter & Jack, 2008; Boblin, Ireland, Kirkpatrick, & Robertson, 2013; Creamer, 2018; Creswell & Plano Clark, 2018; Yazan, 2015). Some scholars trace case study research to the renowned work of Charles Darwin in the nineteenth century (Stewart, 2014), although case study research did not gain significant traction as a research method until the past several decades through its applications in social and health science fields (Baharein & Noor, 2008; Baxter & Jack, 2008; Yazan, 2015). More specifically, case study research is an approach that enables researchers to explore the “how” and “why” of subjects or issues (Yin, 2013), including events, groups, individuals, or other entities (Baharein & Noor, 2008; Baxter & Jack, 2008). With respect to my study, I hoped to gain insight into how learning experiences planned by the instructor may have influenced students’ habits of mind regarding the use of FAS.

Additionally, case study research serves as empirical inquiry that investigates relationships between phenomena and their real-life context (Merriam, 1998; Yin, 2013), allowing exploration of these relationships through multiple lenses (Baxter & Jack, 2008). Case studies focus on an environment bounded by time and space (Stake, 1995; Yin, 2013), using various sources of data that may be triangulated to help describe a given phenomenon by providing a holistic view (Baharein & Noor, 2008; Boblin et al., 2013; Yazan, 2015). This course, which lasted for approximately three and one-half months and offered multiple instances of data to explore my research problem, fit well into the parameters of case study research.

Research Method and Alignment

I approached this problem of practice from a constructivist paradigm, believing ontologically that truth is based on multiple, socially constructed realities that stem from qualitative, hermeneutical, and dialectical methodology (Lincoln & Guba, 2000; Mertens, 2010) and that are strongly represented through narrative presentation. Stake (1995) and Merriam (1998) both agree that a case study is an ideal approach from the constructivist paradigm and that the narrative co-created between researcher and participant(s) is a valid, quality representation of the data. Employing mixed methods within this case study integrated qualitative and quantitative instruments that generated two distinct sets of data, enabling richer interpretation and the opportunity for “enhanced understanding of the integrated conclusions” from these sets (Creswell & Plano Clark, 2018, p. 118).

The theoretical perspective, research question, and time/activity boundaries inherent in studying preservice teachers within their class environments supported strong alignment with a case study approach (Merriam, 1998; Miles et al., 2013; Stake, 1995). Specifically, this research strove to describe preservice teachers’ perspectives in a single course to support richer findings and generalizability (Baharein & Noor, 2008) and to explore similarities and differences between and within case settings (Yin, 2013), and it thus took on the shape of an explanatory, or descriptive, case study (Baxter & Jack, 2008).

In order to maintain fidelity to my chosen paradigm and research approach, I maintained a reflexivity journal in the form of personal, analytic memos, as well as field notes, throughout the research process. These ancillary materials served as supplementary data for my research, inviting me to reflect on my practices as a researcher and reminding me of my purpose as a scholar-practitioner. Though these materials will not be submitted as part of my formal research, their maintenance and regular consideration greatly enhanced my learning and growth.

Population and Recruitment

The population for my study was a cohort of preservice educators who were participating in a required course on educational assessment. This course was one of two courses about educational assessment offered at a large, public university in the Mid-Atlantic region of the United States. Known for its liberal arts educational offerings, the university and this course were chosen based on typical case sampling. Additionally, this is the primary course about educational assessment that each student in this class takes, and for some it will be the only one. However, a number of students may elect to take an additional course on educational evaluation for exceptional children in the future. Access to this cohort was negotiated through permission from the instructor of the course, who was willing to let me conduct research with her students.

The course itself is unique because of its nontraditional structure, which mimicked, in part, a flipped classroom style. Although its description slated the course for two sessions per week for the duration of the semester, students worked on a series of assignments independently the first day each week instead of reporting to class. Then, they brought their work with them to class for the second session of each week to discuss their reflections and concerns with their peers and the instructor.

Students in this course accounted for a pool of 39 possible participants, all of whom were asked to voluntarily participate in the survey portion of this study, which was conducted online using SurveyMonkey. The course included sophomores, juniors, and seniors who were preparing to enter their first preservice teaching experience following this term. Additionally, all students were invited to participate in follow-up activities, which included providing a release to review evidence of their learning, sharing this evidence with me electronically, and participating in interviews, which were conducted online utilizing Zoom videoconference software.

Sampling Strategies and Criteria

This mixed methods research used a case study approach that drew on participants from a course about educational assessment for elementary preservice teachers. The case selected for this study was at a large liberal arts university in the Mid-Atlantic known for its commitment to teacher preparation, and therefore represented typical case sampling to highlight standard participants’ perceptions (Creswell & Poth, 2013). The pool of potential participants (N=39) and actual respondents (N=32) was large enough to support logical generalizations, though statistical generalizations were not possible with this method (Miles et al., 2013). Participants included those who voluntarily completed the survey (N=32); both the initial survey informed consent and the demographics questions helped to ensure that no data were collected from any participants under the age of eighteen years old; in this way, the instrument measured perceptions of learners as intended (King, 2009).

Additionally, using purposive, critical case sampling, five participants were chosen to partake in follow-up activities. Each indicated they had experienced a shift in their thinking; this shift was associated explicitly with the course, including other participants and/or the instructor and related course activities; and they consented to take part in these activities. Selecting follow-up participants only if their transformation was associated with an element of the course—as opposed to any participants who indicated they experienced a significant life change (e.g., a change in relationship or life circumstance)—contributed to internal validity by filtering out of further analysis those students for whom rival explanations for their transformation may be present. Data regarding the trigger for the transformation were further member checked during the follow-up interview.

Ultimately, based on the possible pool of participants and this particular case, these five participants represent an ideal number for semi-structured interviews in case study research (Creswell, 2012; Miles et al., 2013).

Data Collection

As noted, this study relied on a survey, semi-structured interviews, and document analysis, which represent three of the six primary evidentiary sources recommended by Yin (2013) for case study research. Utilizing multiple sources of data is considered one way to increase research construct validity by providing opportunities for triangulation that support a researcher’s efforts in data analysis of a single case under study.


Figure 2. Chosen sources of evidence to triangulate participants’ perspective transformations

The first two vehicles of data collection for this study—the survey and interviews—represent slight adaptations from the instrument presented by King (2009) as her LAS Higher Education Format and included other very minor adjustments recommended and approved by King. The instrument utilized for this study received King’s approval prior to its use and was carefully designed to match the original, calibrated research instrument, which has been found to reliably identify “whether […] learners have had a perspective transformation in relation to their educational experience; and if so, determining what learning activities have contributed to it” (King, 2009, p. 14). The survey, an instrument designed to help researchers identify and categorize transformative learning experiences as a result of learning activities (King, 1997), was distributed to all students (N=39) in this educational assessment course.

This study’s survey was based on adjustments prescribed and approved by Dr. King and included elements from calibrated, modified versions that she offers in the Handbook. First, King (2009) notes researchers must make modifications, particularly in “the learning activities (Items 4 and 7) and demographic questions (Items 10-14) [sections]” (p. 37). These minimal variations enhanced the instrument’s applicability in higher education, specifically to preservice teachers; the validity of the survey as a data collection instrument was not compromised.

The approved adaptations contextualized the questions specifically to address the learners and learning activities of this research’s participants.

Specifically, the adaptations represented previously piloted and validated versions of King’s (2009) instruments. Some adaptations came from the Preservice Teachers version (Caruana, 2011), which was chosen for its adaptations of question 6 to offer learning activities more typical of undergraduate teacher preparation, as well as for minor deviations that modernized and contextualized the “life events” options in question 6 by including “immigration” and [a relationship] “breakup,” respectively. Another minor adaptation reflected the approach taken in the ESL Format (1998) featured by King (2009), the beginning of which was selected and modified to provide a more appropriate introduction to participants of this study, including language stipulated by university IRB informed consent forms.

The four sections of the LAS corresponded to the ten phases outlined in Mezirow’s TLT, and results were scored with King’s Perspective Transformation (PT)-Index, which helped to articulate to what extent, if any, a participant experienced a perspective transformation as a result of his or her education or other factors. The PT-Index enabled scoring of participants as those who did not have a perspective transformation experience (PT-Index = 1); those who had one not associated with their education (PT-Index = 2); and those who had one associated with their education (PT-Index = 3). The survey asked learners whether they had experienced a changed point of view about the research topic at hand, and if so, what factors may have contributed to this change. It further asked learners if they tended to reflect on their own learning and growth, and what, if any, life events may have occurred during their enrollment in the course. The survey concluded with general demographic questions designed to aid in research analysis.
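
Because the PT-Index drives much of the analysis that follows, a concrete illustration may help. The following is a minimal sketch in Python of the classification logic described above; the function and field names are hypothetical, and this is an illustration of the scoring rule rather than King’s instrument itself.

    def pt_index(reported_shift, shift_tied_to_education):
        """Illustrative mapping of two survey answers onto King's (2009) PT-Index."""
        if not reported_shift:
            return 1  # no perspective transformation reported
        if not shift_tied_to_education:
            return 2  # transformation reported, but attributed to factors outside the course
        return 3      # transformation associated with the educational experience

For example, pt_index(True, True) returns 3, the score assigned to participants whose transformation was associated with their education.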

This survey was delivered using SurveyMonkey and packaged to include informed consent information, then distributed to potential participants through emails that I sent, as well as posted by me in the online course platform. Survey results were generally collected anonymously, though the final page of the survey referred to follow-up activities and required participants who elected to opt in to provide their names so they could be contacted for additional research. Any records of these results that were preserved, including those transcribed and included within this study, were anonymized with pseudonyms.

In addition to the LAS, I conducted semi-structured interviews, which offered flexibility coupled with the opportunity to collect rich data (Baharein & Noor, 2008); these interviews represent another critical resource that helped me to understand participants’ experiences and perceptions while maintaining the authentic narratives (Creswell & Poth, 2013; Merriam, 1998) that built upon participants’ survey responses. Interview questions reflected King’s (2009) higher education follow-up questions and, as noted, included very slight, contextualized adaptations to reference this specific research topic. The interview utilized questions designed to draw explanatory and descriptive responses from participants based on their “score” on King’s PT-Index.

Data from these semi-structured interviews reflect the same pseudonyms chosen for participants’ survey results; data were captured via a paid Zoom account that provided a recording option to capture interviews in either audio/visual or audio-only format. Transcription service from Rev.com was utilized; transcripts were reviewed by the researcher, corrected as needed to align with the interview recordings, and then uploaded into NVivo to facilitate coding. Recordings were supplemented with handwritten notes that I took during each interview to capture my own perceptions (Seidman, 2006) and reflected back upon during review of the recordings to enhance my ability to discern patterns among responses (Groenewald, 2008). These findings will be shared in chapter four and reviewed for implications in chapter five.

As a final data source, document analysis of program and course descriptions, the course syllabus, and relevant student learning artifacts further provided context (Merriam, 1998; Stake, 1995) for this study by shedding light on contextual factors that helped to define intended learning experiences (Baxter & Jack, 2008; Boblin et al., 2013; Yin, 2013). I took steps during collection and maintenance of these documents to ensure privacy related to the university, instructor, and students, removing names and any other identifying information where applicable. Specifically, I reviewed the university-provided program description; the course description; the course syllabus; and participants’ course assignments that they elected to provide after opting in through the informed consent process.

Among participant assignments were a variety of discussion prompts and tasks used both formatively and summatively. Many prompts outside of class were designed to elicit evidence of student learning and perceptions, such as discussion prompts that asked students to view or read about a particular topic, and then to explain an idea that squared or agreed with their thinking, an idea that struck them as interesting, and a question that they still had. Another prompt, for example, asked students to describe their thoughts, opinions, and/or feelings about an assignment.

In addition to prompts designed to provoke thinking, other assignments more concretely asked students to demonstrate hands-on learning and skills. For each of these assignments, the instructor provided a clear description of the assignment, explained why the students were doing it, provided examples of what she was looking for, included a rubric of precisely how students would be graded, and included notes about how and when students would have opportunities to improve their work prior to submitting the final assignment. One such assignment, for example, was a formative assessment task. The task was broken down into five sub-tasks that asked students to respond to the task from different angles: (1) deconstruct a standard into concrete, measurable objectives; (2) ascribe the level of Bloom’s Taxonomy for each objective; (3) review provided examples for more insight; (4) list and explain formative assessment strategies that could be used to measure the objectives, and link these strategies to the objectives identified for sub-task 1; and (5) use Google Forms to create a formal assessment based on certain parameters. Students were asked to bring a copy of their work to class to get peer feedback prior to submitting the final assignment for a grade.

Altogether, these resources served as rich data sources that complemented the narrative constructed through students’ survey and follow-up interview responses. Table 1, below, summarizes the overarching methodology for this study by connecting research questions to corresponding sources of evidence:

Table 1. Research Questions and Corresponding Sources of Evidence

Research Question                                          Sources of Evidence

(Primary Research Question) How are elementary             • Survey
preservice teachers’ habits of mind about formative        • Interviews
assessment strategies (FAS) impacted by their              • Document analysis (assignments)
learning experiences?

(Subquestion 1) What shifts in habits of mind, if any,     • Survey
do elementary preservice teachers perceive they            • Interviews
experienced regarding understanding the role of            • Document analysis (assignments,
effective FAS in education as a result of their              program & course description)
coursework about educational assessment?

(Subquestion 2) What learning experiences, if any, do      • Survey
elementary preservice teachers believe affected their      • Interviews
habits of mind during their coursework about               • Document analysis (assignments,
educational assessment?                                      course syllabus)

Data Coding and Analysis Process

Of the pool of 39 possible participants, 32 students completed the survey. Data gathered from the 32 surveys were member checked through repetition of key questions during the interviews with interview participants (N=5), and interview participants had an opportunity to member check their own interview responses through two closing questions designed to allow them to reflect back on what they said and to ask me, the interviewer, questions. Then, responses and notes from the surveys, interviews, and related document analysis were recorded in tables and coded to help me identify emergent themes as they related to King’s PT-Index and Mezirow’s ten phases of TLT. I initially reviewed all qualitative (open-ended) survey results and interview responses to ensure optimum familiarity with each participant’s responses, then reviewed and coded materials pertaining to my document analysis. Next, I incorporated SPSS and NVivo software to support coding and organization, facilitating statistical analysis of quantitative data as well as pattern and theme identification through qualitative analysis. Ultimately, answers to these research questions help to reveal strengths and areas of need for preservice teacher training about FAS, inform implications for preservice training, and illustrate considerations for future research.

Because I wanted to learn about preservice teachers’ perceptions and experiences related to transformative learning about FAS, inductive reasoning guided me to help participants’ narratives unfold. Specifically, inductive reasoning assisted me in discovering the stories revealed in my data, rather than leading me to use my data to define the stories being told (Trochim, 2006). Inductive reasoning was the best method for the mixed methods nature of my study, and was, of course, the opposite of how I would have conducted my study had I been trying to confirm a specific hypothesis about educators and their use of FAS.

Broadly, data analysis, which is discussed at greater length in chapter four, occurred as a sequential explanatory design, ideally suited to the nature of mixed-methods research (Creamer, 2018; Creswell & Plano Clark, 2018). Simply, sequential explanatory design led me through clear steps involving first quantitative, then qualitative analysis, such that quantitative results were expanded on and more fully explained as patterns and themes were identified. Analysis relied on “three concurrent flows of activity: (1) data condensation, (2) data display, and (3) conclusion drawing/verification” (Young, n.d.). Through my coding and analysis, I followed the six steps in the analysis process outlined by Smith, Flowers, and Larkin (2012): (1) reading and rereading; (2) initial noting of codes; (3) developing emerging themes; (4) searching for connections; (5) moving to the next case; and (6) searching for patterns between cases.

Coding and analysis of my quantitative data involved tabulating responses from the survey, determining a PT-Index of 1, 2, or 3 for each participant, and aggregating these data across participants to gain insight into which learning activities may have been most impactful on the students’ learning experiences and perspective transformations related to this course. As noted, SPSS software was employed to produce descriptive statistics and, when appropriate, to test for association (namely through chi-squared and Fisher exact tests) across these and other survey questions. Results were analyzed, with notes related to potentially emerging themes recorded in NVivo for later reference after qualitative analysis.
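
To illustrate the tabulation step (SPSS was the software actually used), the same descriptive summary can be produced with the open-source pandas library; the scores below are a hypothetical distribution consistent with the split of participants who did and did not report transformative learning, not the study’s raw data.

    import pandas as pd

    # Hypothetical PT-Index scores for 32 respondents: 25 education-associated
    # transformations (PT-Index = 3) and 7 with no transformation (PT-Index = 1).
    pt_scores = pd.Series([3] * 25 + [1] * 7, name="PT-Index")

    print(pt_scores.value_counts().sort_index())                # frequency of each score
    print(pt_scores.value_counts(normalize=True).sort_index())  # proportion of each score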

In terms of qualitative analysis, my first-level coding identified a total of 673 comments, including 622 from interview transcripts, 25 from shared student work that helped to tell the participants’ stories in their own words, 5 salient phrases in the course description, and 21 in the course syllabus. Though the program description was reviewed as intended, it did not reference educational assessment in any way, and therefore was neither coded nor considered in the qualitative component of this research. I marked each of these in NVivo software as descriptive and in vivo codes, creating approximately 30 initial pattern codes, as I sought to create a sort of “inventory of topics for indexing and categorizing” (Miles et al., 2013, p. 53).

Use of these tools gave way to second-level coding, through which I was able to narrow my comments and codes to fourteen pattern codes and four themes (Chenail, 1995; Miles et al., 2013) that could be cross-referenced with those noted during quantitative analysis. The pattern codes were guided by the theoretical propositions inherent in the validated LAS materials and allowed for matching against the empirical data derived from qualitative analysis, creating a third and final layer of analysis that revealed emergent themes. Thus, the process of sequential explanatory design was fulfilled.

The table below portrays the final pattern codes derived from qualitative analysis for this study, along with each code’s frequency and percent. Codes are sorted from highest to lowest frequency, with “Learning of New Information from New Material” (N=75) being the most frequently noted pattern and “Benefits of This Non-Traditional Structure of Course” (N=21) being the least frequently noted, but still important, pattern. Based on these patterns, four key themes emerged, which will be explored in chapters four and five: (1) remedying misconceptions and misuses of data; (2) increasing data literacy skills; (3) expanding practice scope and sequence; and (4) re-culturing assessment training.

Table 2. Pattern Codes from Qualitative Analysis Leading to Thematic Development

Pattern Codes from Qualitative Analysis                                    Frequency by Code   Percent by Code
Learning of New Information from New Material                                     75                11.1%
Growth & Development of a Teacher Mentality                                       72                10.7%
Instructor-Guided Reflection & Thought                                            65                 9.7%
Value of Quality Examples, Materials, & Resources                                 61                 9.1%
Value of a Collaborative Learning Environment                                     58                 8.6%
Learning a New Meaning from Old Material                                          52                 7.7%
Presence of Peer Discussion to Facilitate Learning                                50                 7.4%
Instructor Support & Reinforcement                                                43                 6.4%
Benefits of Opportunities to Practice in Class                                    42                 6.2%
Value of Repetitive Practice of Skills In/Out of Class                            41                 6.1%
Learning to Distinguish Formative & Summative Assessment Best Practices           33                 4.9%
Impact of Peer Feedback on Individual Learning                                    33                 4.9%
Instructor Modeling of Best Practices                                             27                 4.0%
Benefits of This Non-Traditional Structure of Course                              21                 3.1%
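
The percent column in Table 2 is each code’s frequency divided by the 673 total coded comments. A quick check in Python, using the highest- and lowest-frequency codes, reproduces the table’s rounding:

    total_comments = 673
    for code, frequency in [("Learning of New Information from New Material", 75),
                            ("Benefits of This Non-Traditional Structure of Course", 21)]:
        print(f"{code}: {frequency / total_comments:.1%}")  # prints 11.1% and 3.1%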


Reciprocity

Patton (2002) aptly articulates, “Participants in research provide us with something of great value, their stories and their perspectives on their world. We show that we value what they give us by offering something in exchange” (p. 415). Seeking to relate to my participants’ passion for the field of education, I offered to share the results of my research in its final format with each participant who completed the survey, as well as with the instructor of the course. No additional compensation was provided to participants.

Limitations & Trustworthiness

Broadly speaking, there are four categories of threats to validity in case studies, which are a form of social research: (1) construct validity; (2) internal validity; (3) external validity; and (4) reliability (Yin, 2013). Some of these factors have been touched upon briefly during the explanation of King’s (2009) LAS materials. The following table denotes strategies used during the planning, implementation, and analysis of this research.

Table 3. General Strategies to Ensure Validity & Reliability (adapted from Yin, 2013)

Challenge            Research Phase                       Researcher Strategy

Construct Validity   Data collection & analysis           Use multiple sources of evidence
                     Data collection                      Establish a clear chain of evidence through
                                                          study documentation & dissertation construction

Internal Validity    Research design                      Selection of validated research tool; all
                                                          modifications approved
                     Data analysis                        Use pattern matching
                     Research design & data analysis      Address rival explanations

External Validity    Research design                      Apply theory in case study

Reliability          Research design & data collection    Use research-based case study protocol

Within the constructs of this specific study, limitations include the self-report nature of the volunteers who were recruited as participants. As such, the narrative co-constructed with the researcher represents the perspectives of these participants only and is not generalizable to preservice teachers at large. The primary threats, and the steps taken to assure trustworthiness, concerned reliability and internal validity. Reliability of the scores from the LAS is limited because the survey was administered to only a small sample, and the scores are therefore subject to errors of measurement (Crocker & Algina, 1986). In her work, developer King (2009) has demonstrated previous reliability of scores attained through use of her instrument, which was constructed to be administered at different times to measure different perspective transformations for a wide variety of possible participants and scenarios.

Additionally, King (2009) highlights threats to internal validity for researchers to consider, including experimental mortality, primacy, and social desirability bias (LeCompte & Schensul, 1999; Miles et al., 2013). Experimental mortality did not occur, as the five participants who indicated they would be willing to partake in follow-up activities all did so. The primacy effect, the name given to the phenomenon that leads participants to select options positioned first in close-ended, selected-response surveys (LeCompte & Schensul, 1999), is a risk inherent in utilizing the LAS. To counter this risk, follow-up questions in the semi-structured interviews were structured to probe beyond initial responses. Finally, social desirability bias, inherent in the self-reported nature of any case study, was minimized through the use of electronic survey delivery (SurveyMonkey).

Protection of Human Subjects

Yin (2013) cautions that honesty and transparency are vital hallmarks of research, and that their absence can jeopardize the integrity of a case study. In keeping with the protection of human subjects, the privacy of all participants has been maintained: I coded any provided names during data collection and analysis to ensure strict confidentiality related to all materials throughout and after the research process. Additionally, I sought to afford the greatest respect to participants’ perspectives and voices throughout the process.

From the outset of my communication with potential participants, I clearly established that my role carried neither authority over nor impact on the students’ academic achievement. Informed consent was provided electronically. For the survey, informed consent was unsigned, and it was made clear to participants that continuing the survey past the introduction screen signified their consent (Appendix B). For the follow-up activities, selected participants were sent informed consent documents electronically (Appendix D); these documents were discussed verbally prior to the commencement of interviews to ensure participants had an opportunity to seek clarification on any aspect of the informed consent. Then, participants signed and returned their written informed consent, and these signed forms were securely stored as noted in the subsequent section, Data Storage.


Throughout the study, participants were assured that their role was voluntary; that any and all data collected were treated in accordance with strict confidentiality; that measures were and would continue to be taken to protect their privacy; that no identifying information regarding the university or them individually would be shared in this research; and that data collected would be used only in support of research to illustrate findings and recommendations that may help to improve teacher education. Finally, I sought IRB approval and followed all IRB guidelines for ethical considerations related to participant selection and study guidelines; participant recruitment materials and IRB consent forms are referenced and provided in the appendices of this doctoral thesis.

Data Storage

All electronic data, including survey responses, transcripts, pseudonym information (which was assigned as early as feasible within the data collection process), my reflexive journal, field notes, and documents collected for analysis, were stored in password-encrypted local (not cloud) storage that requires two-step authentication. Any tangible, hard copies of the same are locked in a fire-resistant metal filing cabinet accessible only to me. Precautions were taken to ensure names and other identifying information were never captured in interview recordings, and any names or identifying information that may have been inadvertently captured were edited out of these materials at the transcription stage. Once all data collection was completed, all materials that were stored electronically were placed on an external hard drive and placed in the aforementioned fire-resistant, locked metal filing cabinet. After conferral of my doctoral degree, and after an additional three years have elapsed during which an authority may request verification of my research, all transcripts and recordings associated with this research will be destroyed.


Chapter 4 – Analysis of Findings

Context and Background

The overall purpose of this research was to examine to what extent learning experiences designed for and implemented within a teacher preparation course on educational assessment may impact the professional dispositions of elementary teacher candidates with respect to FAS.

Within the constructs of this study, professional dispositions are understood to be the values, attitudes, and understandings that constitute desired traits among teachers (Council of Chief State School Officers, 2011). Although research has demonstrated that various factors, including life changes, may impact transformations within learners, learning experiences were targeted in this research as the lone variable connected by research to changes in preservice teachers’ perspectives that can be deliberately manipulated by course instructors. Further, learning experiences were explored in this research because of their under-exploration in related scholarly literature with respect to how perspectives and habits of mind—which are foundations of preservice teacher disposition development—may be shaped during coursework in teacher preparation programs.

Using both quantitative and qualitative data collection and analysis, this study sought to explore the connection between dispositions and perspective transformations among preservice educators, and to evaluate the extent to which learning experiences may have contributed to these perspective transformations. A sequential explanatory design, ideally suited to mixed methods research analysis, was employed, such that data were analyzed—and are presented—in terms of quantitative and then qualitative analysis, followed by pattern matching across all data through analysis of individual responses and associations among all participants (Creamer, 2018; Creswell & Plano Clark, 2018; Kitchenbaum, 2010). The study was framed by three research questions to guide this exploration:

• Primary Research Question: How are elementary preservice teachers’ habits of mind about formative assessment strategies (FAS) impacted by their learning experiences?

• Research Subquestions: (1) What shifts in habits of mind, if any, do elementary preservice teachers perceive they experienced regarding understanding the role of effective formative assessment strategies (FAS) in education as a result of their coursework about educational assessment? (2) What learning experiences, if any, do elementary preservice teachers believe affected their habits of mind during their coursework about educational assessment?

Structure of Data Analysis

After a brief series of explanations and tables to provide demographic background and a simplified analysis of transformative learning by demographic traits, participant profiles and individual interviews from selected case participants (SCPs)—those individuals selected for interviews and assignment document analysis—are depicted. The next section shares a presentation and deeper analysis of all data sources, exploring each research question by quantitative and qualitative findings, in turn. SCPs’ verbal examples and germane segments from participant-submitted documents are further woven into these qualitative findings to illustrate key patterns.

To conclude this chapter, emergent themes generated from the data are delineated, briefly and individually. To more broadly illustrate these themes, a summation is provided as it relates to the transformative change experienced by participants. Finally, a synthesis is offered that corresponds to this study’s purpose and leads into a short review of the researcher’s reflexivity in relation to this analysis.

Overview and Analysis of Transformative Learning by Demographics

To orient readers to the participants involved in this study, the following short explanations and tables provide an overview of participant demographics, analyzing both program and personal demographics in terms of frequencies and rankings. Participant demographics (Tables 4 – 16) review class status and semester longevity; prior education; prior assessment course experience; participants’ academic majors; age; sex; race; and relationship status. Where warranted by enough responses to survey questions, further analysis examines the frequency and percent of students who did and did not indicate a transformative learning experience by demographic trait.

Of a possible pool of 39 participants, 32 chose to participate in the LAS component of this study, including sophomores (N=19), juniors (N=12), and one senior. In an open-ended question, participants indicated the number of semesters in which they had been enrolled in their program, which helps to provide some insight into the experience each participant held within the preservice teaching program. These data are presented in Table 4:

Table 4. Participants by Class Status and Semesters Enrolled

Class Status        # of Semesters Enrolled    Frequency of Participants    Percent of Participants
                                               by Semesters Enrolled        by Semesters Enrolled
Sophomore (N=19)    3                          18                           58.1%
                    4                          1                            3.2%
Junior (N=12)       5                          12                           35.5%
Senior (N=1)        8                          1                            3.2%


Because there was some variation in class status, data were next analyzed in terms of the frequency and percentage of students indicating transformative learning by class status. Sophomores who indicated a transformative learning experience constituted 43.8% of all respondents, and juniors who did so constituted 31.2%; within each class, 14 of 19 sophomores (73.7%) and 10 of 12 juniors (83.3%) indicated a transformative learning experience. The lone senior in the data also reported a transformative learning experience, constituting 3.1% of all students responding they had experienced transformative learning, though little can be drawn from a single response. Additionally, as the senior reported having 8 semesters of experience, and the next closest number of semesters reported was 5, the researcher felt this gap in experience was substantial enough that the senior’s data were not grouped with the juniors’ responses for the sake of statistical analysis.

Calculations regarding these data, excluding the senior participant, return a chi-squared statistic of 0.3917 for the relationship between sophomore/junior class status and the frequency of students who reported experiencing transformative learning, equating to a p-value of 0.53. No statistically significant association between class status and transformative learning can therefore be claimed, and the small difference between the within-class rates (73.7% versus 83.3%) may simply reflect the modest sample size. Results are captured in Table 5.


Table 5. Participants Indicating Transformative Change by Class Status

Class Status        Frequency Indicating      Percent Indicating      Frequency Not Indicating    Percent Not Indicating
                    Transformative Learning   Transformative Learning Transformative Learning     Transformative Learning
Sophomore (N=19)    14                        43.8%                   5                           15.6%
Junior (N=12)       10                        31.2%                   2                           6.3%
Senior (N=1)        1                         3.1%                    -                           -
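
Readers without SPSS can reproduce this calculation with the open-source scipy library; the minimal sketch below uses the counts from Table 5 (excluding the senior) and omits the Yates continuity correction, which matches the reported statistic.

    from scipy.stats import chi2_contingency

    observed = [[14, 5],   # sophomores: indicated / did not indicate transformative learning
                [10, 2]]   # juniors: indicated / did not indicate transformative learning
    chi2, p, dof, expected = chi2_contingency(observed, correction=False)
    print(round(chi2, 4), round(p, 2))  # 0.3917 0.53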

Most participants had limited prior formal education, with a majority of all participants holding only high school diplomas or GEDs (81.3%), four holding associate’s degrees (12.5%), and one holding a bachelor’s degree (3.1%). These answer choices corresponded to a selected-response question. Due to the small number of students with a degree beyond a high school diploma or GED, this study does not attempt an analysis of transformative learning by this demographic. Frequency and percent of participants by prior education level are presented in Table 6:

Table 6. Participants’ Prior Education

Prior Education Completed    Frequency of Participants    Percent of Participants
                             by Education Completed       by Education Completed
High school diploma/GED      26                           81.3%
Associate’s degree           4                            12.5%
Bachelor’s degree            1                            3.1%
Did not answer               1                            3.1%


Illustrating the lack of prior assessment course experience, a majority of participants had not taken any previous assessment course(s), despite the fact that none were freshmen. As revealed below in Table 7, twenty-nine (29) participants (90.6% of all participants) had not previously taken any other course on assessment; three (3) participants (9.4% of all participants) had taken one other course on assessment prior to this course. A textbox prompted those who responded yes to specify which course, and all three identified a course for assessing students whose native language was not English. Though some respondents clearly indicated the course number and name, it has been genericized to preserve the anonymity of the site. Table 7 illustrates the frequency and percent of students among the participants who had and had not taken the prior course, “Second Language Assessment” (course name genericized).

Table 7. Participants with Prior Assessment Course Experience

Other Assessment Course(s) Taken    Frequency of Participants    Percent of Participants
                                    by Other Courses Taken       by Other Courses Taken
None                                29                           90.6%
Second Language Assessment          3                            9.4%

Best practices for quantitative data analysis do not recommend drawing generalizations from data representing fewer than five participants (Creswell, 2012), and as such no calculations to test for association were attempted. However, the researcher felt analysis of this small data set may be relevant simply to log considerations for future research, since each of the three students who reported they had taken a prior course on assessment also reported a transformative learning shift.

In other words, while the number of participants who had taken the prior course on assessment (genericized as “Second Language Assessment”) is too small to support statistical claims within this pool of participants, it is perhaps of interest to note that every student (100%) who took a prior course in educational assessment felt they had experienced transformative learning, whereas 22 of the 29 (75.9%) who had not taken a prior assessment course still noted transformative learning from their experience in this course.

Again, while these data are not significant because of the extremely small number of students who reported having taken a prior course in educational assessment, a future study of a much larger participant pool, and particularly students who have taken more than one course on educational assessment, may consider how such an experience may contribute to transformative learning among preservice students. Data from this analysis can be found in Table 8:

Table 8. Participants Indicating Transformative Change by Prior Assessment Course Experience

Prior Course Status             Frequency Indicating      Percent Indicating        Frequency Not Indicating    Percent Not Indicating
                                Transformative Learning   Transformative Learning   Transformative Learning     Transformative Learning
Yes, Took Prior Course          3                         100%                      -                           -
No, Did Not Take Prior Course   22                        75.86%                    7                           24.14%

All of the participants were elementary education majors (N=32), which aligns with the purpose of the course as one designated for educational assessment for those majoring in elementary education. However, several students majoring in elementary education had also formally declared a concentration in special education (N=12). These responses were indicated in response to a close-ended item, though an “other: please specify” field was provided. Results are portrayed in Table 9:

Table 9. Academic Major of Participants

Academic Major                                           Frequency of Participants    Percent of Participants
                                                         by Academic Major            by Academic Major
Elementary Education, only                               20                           62.5%
Elementary Education, Special Education concentration    12                           37.5%

As nearly 40% of the elementary education majors also specialized in special education, data were disaggregated by those who did and did not specialize in special education and by those who did or did not indicate a transformative learning experience. Specifically, 46.9% of all participants were students majoring solely in elementary education who indicated transformative learning occurred for them, and 15.6% were students in that group who indicated no transformative learning; within the group, this equates to a ratio of 3:1 (15 of 20, or 75%). Of those participants majoring in elementary education with a specialization in special education, 31.2% of all participants indicated transformative learning occurred for them and 6.3% indicated it had not, a within-group ratio of 5:1 (10 of 12, or 83.3%). Students with the special education concentration thus reported transformative learning at a marginally higher rate. As indicated in Table 10, these results return a calculated chi-squared statistic of 0.3048, with a p-value of 0.58, indicating no statistically significant association between major specialization and reported transformative learning; the small descriptive difference between the groups should therefore be interpreted cautiously and may simply reflect the small subgroup sizes.

Table 10. Participants Indicating Transformative Change by Major Specialization

Academic Major                                Frequency Indicating      Percent Indicating        Frequency Not Indicating    Percent Not Indicating
                                              Transformative Learning   Transformative Learning   Transformative Learning     Transformative Learning
Elementary Education, only                    15                        46.9%                     5                           15.6%
Elementary Education, Special Education      10                        31.2%                     2                           6.3%
concentration
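
The same scipy call sketched after Table 5 reproduces this statistic from the Table 10 counts:

    from scipy.stats import chi2_contingency

    observed = [[15, 5],   # elementary education only
                [10, 2]]   # elementary education with special education concentration
    chi2, p, _, _ = chi2_contingency(observed, correction=False)
    print(round(chi2, 4), round(p, 2))  # 0.3048 0.58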

In addition to education and program-related demographics, some personal demographics were also sought and are presented below, both to facilitate understanding of these data and to help situate this research within the field. Personal demographic data (Tables 11 - 16) depict participants’ relative ages; sex; race; and relationship status.

Of these participants, a majority were between the ages of 18 and 20 (N=24), while the others were between the ages of 21 and 24 (N=8). No participants were under the age of 18, nor were any age 25 or over. Table 11 depicts the distribution of preservice teachers by age bracket, indicated in a selected-response item, distinguishing those below from those at or above the age (21) conventionally associated with “adult” learning:

90

Table 11. Participants by Age Bracket

Age Bracket    Frequency of Participants    Percent of Participants
               by Age Bracket               by Age Bracket
18-20          24                           75%
21-24          8                            25%

Table 12 presents self-reported transformative learning by age bracket among all preservice teachers. Seventeen participants between the ages of 18 and 20 reported transformative learning, representing 53% of all participants and 71% of that age bracket; seven (22% of all participants, and 29% of the bracket) did not. Additionally, all students between the ages of 21 and 24 indicated they experienced transformative learning, representing 25% of all participants and 100% of participants from that age bracket. Calculation using the Fisher exact test returns a p-value of 0.1497, indicating that the association between age bracket and transformative learning is not statistically significant. Unfortunately, as age was reported only by bracket rather than precisely, further analysis comparing exact age to class level (sophomore, junior, senior) is not possible for the purposes of this study.

Table 12. Participants by Age Bracket Indicating Transformative Change Among All Participants

Age Bracket    Frequency Indicating      Percent Indicating        Frequency Not Indicating    Percent Not Indicating
               Transformative Learning   Transformative Learning   Transformative Learning     Transformative Learning
18-20          17                        53%                       7                           22%
21-24          8                         25%                       -                           -
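
As with the chi-squared results above, this value can be reproduced with scipy using the Table 12 counts; fisher_exact performs a two-sided test by default.

    from scipy.stats import fisher_exact

    observed = [[17, 7],  # ages 18-20: indicated / did not indicate transformative learning
                [8, 0]]   # ages 21-24: indicated / did not indicate transformative learning
    oddsratio, p = fisher_exact(observed)
    print(round(p, 4))  # 0.1497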


Selected-response options for sex self-identification included male, female, transgender, non-binary, or “other”; participants chose only male (N=4) or female (N=28). As a majority (88%) of students identified themselves as female, this study does not attempt a statistical analysis of transformative learning by this demographic. Results of this selected-response question are portrayed in Table 13:

Table 13. Sex of Participants

Sex       Frequency of Participants by Sex    Percent of Participants by Sex
Male      4                                   12%
Female    28                                  88%

Once again, best practices for quantitative data analysis do not recommend drawing generalizations from data representing fewer than five participants (Creswell, 2012), and as such no calculations to test for association were attempted. Of note, though, 100% of male participants noted they had experienced transformative learning, whereas 75% of female participants did. This sample size is too small for meaningful analysis, but future research may consider transformative learning in terms of impactful persons, life changes, learning experiences, and so on, of preservice teachers by sex. Frequency and percent experiencing transformative learning by sex are depicted in Table 14:


Table 14. Participants Indicating Transformative Change by Sex

Sex       Frequency Indicating      Percent Indicating        Frequency Not Indicating    Percent Not Indicating
          Transformative Learning   Transformative Learning   Transformative Learning     Transformative Learning
Male      4                         100%                      -                           -
Female    21                        75%                       7                           25%

With respect to race, participants were provided an array of options including White, non-Hispanic; Black, non-Hispanic; Native American; Asian or Pacific Islander; Hispanic; Bi-Racial; or Multi-Racial. Four categories were marked by participants, representing full participation in response (i.e., no one skipped the question). As a majority (90.6%) of students identified themselves as White, non-Hispanic, this study does not attempt an analysis of transformative learning by this demographic. Results are displayed below in Table 15:

Table 15. Race of Participants

Race                         Frequency of Participants by Race    Percent of Participants by Race
White, non-Hispanic          29                                   90.6%
Black, non-Hispanic          1                                    3.1%
Native American              -                                    -
Asian or Pacific Islander    1                                    3.1%
Hispanic                     1                                    3.1%
Bi-Racial                    -                                    -
Multi-Racial                 -                                    -


The final personal demographic question asked participants to identify their relationship status. Students were asked to designate one of five possible options: single; married; partnered; divorced/separated; or widowed. All participants indicated they were either single or partnered. Because “single” and “partnered” were not defined, limited interpretation is possible from this question, and because only three participants indicated a relationship status other than single, data analysis by this domain is not recommended. The 90.6% of participants who stated their relationship status was single, and the 9.4% who reported being partnered, are displayed in Table 16:

Table 16. Relationship Status

Relationship Status    Frequency of Participants       Percent of Participants
                       by Relationship Status          by Relationship Status
Single                 29                              90.6%
Married                -                               -
Partnered              3                               9.4%
Divorced/separated     -                               -
Widowed                -                               -

Selected Case Participant (SCP) Overviews

Of the 32 respondents to the survey, five (5) participants who indicated they had a transformative learning experience related to the role of formative assessment and assessment purposes agreed to participate in follow-up activities, including a semi-structured interview and sharing artifacts that represented their learning in the course about educational assessment. To best explore the research questions, data from all participants, including closed- and open-ended survey responses, are portrayed. Selected Case Participants’ (SCPs’) interviews and shared artifacts provided further qualitative context that enabled the researcher to gain a more complete sense of the answers to each research question and a stronger grasp on the emergent themes that resulted from this study. To orient readers to the demographics and basic transformative learning survey responses specific to each SCP, Tables 17 and 18 provide pseudonymized information about each, presented in the order in which each was interviewed:

Table 17. Pseudonymized SCP Profiles

Name         Gender    Age Range    Race                   Relationship Status    Major
Ellen        Female    21-24        White, non-Hispanic    Single                 Elem. Ed.
Christine    Female    18-20        White, non-Hispanic    Single                 Elem. Ed. w/ Spec. Ed.
Maria        Female    18-20        White, non-Hispanic    Single                 Elem. Ed.
Lindsey      Female    21-24        White, non-Hispanic    Single                 Elem. Ed.
Sara         Female    18-20        White, non-Hispanic    Single                 Elem. Ed. w/ Spec. Ed.

95

Table 18. Pseudonymized SCP Profiles, Continued

Name         Prior Education      Class Status    Prior Course     Experienced       Change Related to a...
             Level                                in Assessment    Transformative    Person    Class Assign.    Life Event
                                                                   Learning
Ellen        H.S. Diploma/GED     Junior          No               Yes               Yes       Yes              No
Christine    H.S. Diploma/GED     Sophomore       No               Yes               Yes       Yes              No
Maria        H.S. Diploma/GED     Sophomore       No               Yes               No        Yes              No
Lindsey      H.S. Diploma/GED     Junior          No               Yes               Yes       Yes              No
Sara         H.S. Diploma/GED     Sophomore       No               Yes               Yes       Yes              No

SCP Interview Overviews

The researcher met with each SCP individually using Zoom videoconference software. Of the five interviews, the longest lasted for 42 minutes, and the shortest was 25 minutes. The following narratives describe each SCP’s general survey responses, reveal pertinent open-ended survey responses, and then briefly summarize the context and contents of each interview through subsections titled “Profile from Survey Responses” and “Interview Discussion.” More detailed analyses of the interviews are discussed during the quantitative and qualitative “Analysis of Data Related to Research Questions.”

Ellen: SCP # 1

Profile from survey responses. The first SCP was Ellen, a junior who had no prior educational assessment course experience. As with all SCPs, Ellen’s survey responses indicated she had experienced a transformational change related to her beliefs, opinions, or expectations about assessment techniques and the roles of formative and summative assessment. Additionally, Ellen had responded that she would not characterize herself as one who normally thinks back over previous decisions or past behavior, but as one who does frequently reflect on the meaning of her studies for herself, personally. In an open-ended survey response to question 3, she briefly described her reflection about how this change occurred for her: “I realized that assessment is so much more than testing students. It is a process almost that is happening all the time when it is done right. It is so much more than just summative assessments all the time.”

Like many of her classmates who responded to the survey, Ellen felt people, including another student’s support, her classmates’ support, a challenge from her instructor, and her instructor’s support, had shaped this change. Additionally, she indicated that a variety of course assignments, including readings, class/group projects, verbally discussing her concerns, self-assessment in the class, class activities/exercises, and personal reflection, had helped her to experience this transformation most of all. Of the instructor and activities, Ellen noted they impacted her change mainly because of numerous instructor-prompted opportunities to discuss what the class was reading about and doing. On question 10, Ellen submitted: “It was mostly discussion and the professor pushing us to think about what we had experienced as students and how they influenced us. Thinking about strategies I had experienced but not realized it really made me think about what to do as a teacher.”

Interview discussion. The researcher met with Ellen to conduct the interview within days of Ellen submitting the survey online. She was finishing her school day and took time to interview in the middle of finals; the interview lasted for 25 minutes. Ellen seemed very excited to interview and to share her experiences in the course, and her interview answers aligned with the survey responses she had submitted. She confirmed she felt she had experienced a time when she realized that her values, beliefs, opinions, or expectations about assessment techniques, the role of formative assessment, or the role of summative assessment had changed. Specifically, Ellen described the transformation in her thinking in terms of learning—both learning entirely new concepts and learning deeper meanings of concepts to which she had been exposed as a student. For instance, she noted that she learned new student-assessment strategies with which she had not been familiar. Additionally, she realized some of her K-12 teachers had used a variety of formative assessment strategies with her classmates and her, though she had not realized or understood this when she was a student.

What solidified many of her new understandings about assessment techniques and the role of assessments, Ellen asserted repeatedly, were ongoing opportunities for discussion and practice in class. She explained that she and her classmates were placed into small groups that mimicked Professional Learning Communities, or PLCs, for the duration of the semester, and that they had numerous opportunities “to collaborate together and to […] bounce ideas off of each other, […] and then had time to apply feedback or ideas from our classmates.” The class time to collaborate and practice, Ellen highlighted, was particularly beneficial because the groups were regularly spurred by thought-provoking assignments and discussion initiated by the instructor. Instructor-initiated collaboration was often based on assignments and supplemental materials the instructor shared to engage and educate Ellen and her peers.

The instructor, explained Ellen, “challenged us to think a lot about what we had experienced as students, and then to think a lot about videos and articles and stuff. We practiced building assessments, looking at assessments, and she really made sure we had all the help we needed to improve our understanding. It wasn’t just, hand in an assignment, get a grade, to the next one.” Ellen acknowledged several times, in several different ways, that the instructor had modeled the same best practices about which she was teaching, ensuring that students’ learning was supported and measured at multiple checkpoints prior to students’ receiving grades on summative assessments.

In addition to crediting practice and discussion with her peers, often grounded in assignments or challenges from the instructor, Ellen felt that she was encouraged and chose to take the time to reflect regularly on her future role as a teacher, juxtaposing this future with her own past experiences as a student. For Ellen, this type of thinking appeared to delve into metacognition — “You can’t have a change in thinking without thinking what you’re thinking about,” Ellen quipped. Ellen articulated that all of the thinking and learning led to a transformation in her own values, beliefs, opinions, or expectations about assessment types and roles, which she perceived in an “Aha! moment.” In Ellen’s words, “It’s different than what I always thought […]. It’s not just all about recording grades […]. It’s about so much more than that. It’s about kids, and helping kids grow. If you focus on just the bottom line, the grades, you don’t really know if the kids are learning what they need to learn […]. And it’s not about them growing at all, it’s just about that grade in your grade book.”

When asked explicitly to what she credits the “Aha! moment” when she realized she had experienced a transformation in her thinking, Ellen cited the discussions about assessments in class that helped her to better understand the purpose, and power, of assessments. She said she has changed her thinking about what she will do regarding the role and use of assessments when she is a teacher in her own classroom, closing her insight by submitting, “I think it’s going to be really helpful as a teacher. It’s going to make me a better teacher from the start.”

Christine: SCP # 2

Profile from survey responses. Christine, a sophomore, had no prior experience with another course in educational assessment. In her survey responses, she characterized herself as one who does not usually think back over previous decisions or past behavior, nor one who frequently reflects upon the meaning of her studies. However, she indicated that she did realize while completing this survey that she had experienced a transformative change related to her values, beliefs, opinions, or expectations related to assessment techniques and the roles of types of assessments. To the open-ended question 3, she briefly described her reflection about how this change occurred for her: “I started to realize how important it is to measure what students learn. I realized assessment is about student growth and learning, not about grades.”

Aligned with responses from many classmates, Christine ascribed this change to her instructor’s support and four different class assignments: supplemental reading or materials, verbally discussing her concerns, self-assessment in the class, and personal reflection. Of the instructor and these activities, Christine noted they impacted her change mainly because of numerous instructor-prompted opportunities to discuss what the class was reading about and doing. Christine submitted for question 10: “She [the instructor] kept prompting us to think about our own experiences with grading, and reflecting on assignments we give to students and what their point is, to know if we are effective as teachers.”

Interview discussion. The interview with Christine took place the same evening as the interview with Ellen, and Christine’s interview lasted for 27 minutes. Christine was in the middle of studying for final exams and wanted to be interviewed before she concluded her exams and traveled home for the holidays. As the interview progressed, Christine seemed to gain confidence, adding richness to her answers, all of which coincided with and built off of the responses she had provided in her survey.

At the beginning of the interview, Christine described her experience of shifting in her values, beliefs, opinions, or expectations about the types and roles of assessments as one grounded in her experiences in small group PLCs during the semester. In particular, Christine related an experience she had during a discussion activity that involved creating formative assessment questions related to the branches of government. Through peer collaboration and feedback, Christine observed that some of her peers evidently had had different experiences as K-12 students than she had, and that they seemed to have a stronger grasp than she did of how teachers can use formative assessment strategies to learn what students know, and then adapt and scaffold instruction in response to data gathered through these strategies. Christine felt this particular conversation with her peers influenced her thinking around the purpose of planning and grading assignments as a teacher. Christine articulated, “It clicked for me. The purpose of grading is actually to help students. Now, […] I want to grade things with the purpose of improving my instruction, to help students learn better.”

As the interview continued, Christine continued to attribute her shift in thinking largely to her instructor’s prompting, as well as to the support of her classmates through repeated practice opportunities that made use of peer feedback and collaborative discussion. Specifically, Christine felt the instructor regularly invited her and her classmates to reflect on experiences they had had as students; to glean new learning from a variety of quality examples, materials, and resources; and to practice applying their new or adapted understandings while providing one another with feedback.

For instance, Christine cited video excerpts and articles that helped her to identify numerous examples of how teachers she had had as a younger student had misapplied the purposes of assessment. She noted most of her teachers had graded everything, and they did not often make use of the variety of types of assessments that could have been used to gather information about what she and other students already knew. Through instructor-initiated dialogue and peer collaboration, Christine felt her views on the purpose of grading transformed substantially during this particular course about educational assessment. She remarked that rethinking what she thought she previously knew opened her to learning a variety of purposes for multiple types of assessments, including learning what students know and improving instruction, and to reflecting on her own teaching to consider whether she could improve her use of pedagogical strategies.

In addition to reframing her prior understandings about assessment types and purposes around videos and articles the instructor had shared with the class, Christine described new learning that occurred through peer collaboration as a result of the instructor’s use of assessment examples. Another key learning moment for Christine occurred during a PLC discussion about creating rubrics. After reviewing multiple exemplars shared by the instructor and working to create her own rubric for the first time, Christine received feedback from her peers that led her to realize she still was not measuring precisely what she had intended to measure.

Ultimately, Christine ascribed her significant transformation in thinking to the intersection of various supports: her instructor’s support and reinforcement, instructor-prompted thinking, peer feedback, repeated opportunities to practice what she was learning in class, and the chance to learn from and interact with multiple exemplar materials. She felt this transformation in her thinking had “everything to do with the class” and submitted, “If it weren’t for this class, I’d probably just be that teacher who grades everything to have grades in the grade book.” Demonstrating her reflection on past experiences and her recent growth, Christine added, “I want to assess students differently than a lot of my teachers assessed me.” Finally, and perhaps most powerfully, Christine acknowledged that while she does not fully know what teachers in the field are doing now [as opposed to what they had done when she was a K-12 student], “I think if a lot of us are learning to do better, we will do better. […] And that’s just so important.”

Maria: SCP # 3

Profile from survey responses. A sophomore with no prior experience in another course about educational assessment, Maria characterized herself as one who usually does think back about her previous decisions or past behavior, as well as one who frequently reflects on the meaning of her studies for her own sake. Maria responded that she had experienced a change in her thinking about assessment techniques and the role of assessment types. In describing what happened, Maria claimed: “I used to believe assessments were for teachers only. In creating formative assessments myself, I realized they are checkpoints for not only the teachers, but the students. Teachers fit future instruction to help [students] based on data” (question 3).

Maria was the only SCP, and one of only five (5) students in the course, who indicated in her survey responses that she did not feel either a person or a life change triggered this transformation for her. More specifically, she was one of three (3) students in the course who felt this transformation was caused by an experience in the course, and in particular the non-traditional structure of the course, which gave her the opportunity to do many activities through online modules and homework assignments, and then to get feedback in class to help her improve her own knowledge and skills. In her open-ended survey response to question 10, Maria asserted: “The experience of actually making assessments provided the change in my thoughts. I saw that my own motives to see where my students are at, and have them meet objectives, with the willingness to change my instruction, likely mirrored that of my past and future teachers.”

Interview discussion. Maria and the researcher had the opportunity to meet about a week and a half after she had completed the survey. The interview occurred on a Sunday afternoon and lasted for 29 minutes. Maria was very curious about the purpose of this study and the nature of teacher preparation with respect to educational assessment, so the researcher shared the general scope of the research, summarizing the purpose as outlined in the study’s Informed Consent paperwork and sharing the synopsis written into the invitation to participate in the study. The researcher was careful to ensure that the level of detail was objective and not stated in a way that could bias the participant.

During the interview, Maria did maintain that her experiences in class, and particularly assignments, had triggered the shift in her values, beliefs, opinions, or expectations about assessment techniques, the role of formative assessment, and the role of summative assessment. However, throughout the interview, Maria also made it clear that this shift would not have occurred for her without prompts from the instructor that helped her to frame and reframe her understandings about these assessment concepts, nor without opportunities to collaborate with and gain feedback from her peers. In fact, towards the end of the interview, Maria noted that the repeated opportunities for practice and for collaboration in her PLC had substantially impacted her mindset as both a learner and a future teacher. When asked what could have been done differently in the class to further support her transformation, Maria laughed, “The course could have been offered longer, past the semester. I think more opportunities to practice, more time in the PLCs, was really helpful. The class had to end when the semester ended. It’s a really good thing it was a required class because it was so beneficial! I’m kind of realizing more right now just having to talk about the course!”

Most of all, Maria highlighted how the non-traditional structure of the course afforded her and her classmates opportunities she felt may not have been present in a more traditional class setting. She described the non-traditional experience as one that shifted how class time was used for half of the class sessions. That is, rather than meet twice each week, students completed “homework assignments” that involved researching and preparing for the following class. Maria felt that pursuing new learning through assignments outside of class, on her own, helped her to concentrate without distractions and made her better equipped to discuss what she had learned, and what questions she had, when she came to class. She also felt it made her peers more prepared for this class than students in other, more traditional classes.

Because she and her peers were more prepared, suggested Maria, their PLC discussion time was much more focused and afforded them ample opportunity to really help one another. Maria noted that PLC collaboration helped her to come to the realization that, while much of the work she did initially was flawed, she felt encouraged to take risks and learn for the sake of learning, instead of feeling pressured to perform perfectly from the start. Maria reflected on an experience she had in class one day while getting feedback on an assignment to create a standards-aligned rubric. A peer in her group helped her to see that the rubric parameters she had constructed were ambiguous and did not measure intended learning. Maria realized that, without the opportunity to practice this sort of assignment in class, she would have made and repeated this mistake as a new teacher.

In conjunction with discussing how she felt free to take risks and could really apply what she was learning to improve her own understanding of the subject, Maria credited the instructor with “practicing what she preached.” Maria stated that she felt the instructor truly modeled the appropriate application of formative and summative assessment, ensuring she and her classmates had many chances to learn about course topics from multiple perspectives prior to having to submit final evidence of learning for a grade. Referring back to the example of how she realized the rubric she had created did not measure intended learning, Maria talked about how the instructor did not penalize her grade for her mistakes. Maria explained, “She graded our work, […] but if it didn’t come out perfect, she’s like ‘It’s fine, just write about it. Just say it didn’t work and why and she’ll take that, too, because we’re thinking about it and learning, and that’s what matters.’”

Another factor to which Maria attributed her learning was the presence of numerous high-quality exemplars that the instructor provided, including assessment artifacts from the field (e.g., sample assessment items, rubrics, or exit tickets), coupled with additional supplemental materials (e.g., videos and articles). Maria noted the instructor did an exceptional job of focusing her and her peers on the most salient aspects of these artifacts and materials. Rather than just give the class assessment items as samples, the instructor highlighted some of the items’ strengths and weaknesses. And, instead of just giving them links to multiple videos, the instructor also provided notes to call the students’ attention to the points in each video that would be most germane to the topic about which they had been learning.

As the interview concluded, Maria articulated that she felt being in this course about educational assessment had transformed her learning because her instructor, a veteran teacher, had shared many ideas about assessment in ways that made it fun and made her think differently. Maria felt that she learned many new ideas from her instructor that affect both her mindset as a future teacher and the toolkit of strategies she will use when employed as a teacher. Maria will enroll in her student teaching next year, and alluding to this, she ended the interview by noting, “I feel like I wouldn’t be ready to go into lesson planning and creating formative assessments next year without [this course].”

Lindsey: SCP # 4

Profile from survey responses. Lindsey, a junior with no prior course experience about educational assessment, characterized herself in the survey as one who usually thinks back over previous decisions or past behavior, as well as one who frequently reflects upon the meaning of her studies for herself, personally. Indicating that she had experienced a transformation in her values, beliefs, opinions, or expectations regarding assessment techniques and roles, she described this shift simply in an open-ended survey response (question 3): “I learned that in order to have successful summative assignments, you need to test your students formatively throughout the school year.”

Lindsey, like many of her classmates, seemed to feel that other people and class assignments influenced this change in her thinking. She identified influences including her instructor’s support, class/group projects, verbally discussing her concerns, and class activities/exercises. Lindsey identified that she first realized this change was taking place when, “We were talking about successful summative assessment at the time and that is what really got me to change my views” (question 10).

Interview discussion. Lindsey and the researcher spoke about a week and a half after she had completed her survey, several hours after the interview with Maria occurred. The interview lasted for 32 minutes and provided the opportunity for the researcher to learn more about Lindsey’s survey responses and experience of transformative learning as a result of being a student in this course about educational assessment. During the interview, Lindsey’s responses reiterated and expanded upon the answers she had submitted to the LAS. However, Lindsey also revealed a greater impact from the non-traditional structure of the course than she had noted in the survey, and she highlighted this impact at several points throughout the interview.

Lindsey opened by sharing her enthusiasm for the course and, particularly, the instructor. She referenced the instructor’s passion seven (7) separate times, and then confided that she had been rethinking becoming a teacher because of previous course experiences, but that her experiences in this course had solidified her commitment to entering the profession. What had particularly bothered Lindsey about becoming a teacher was what she described as a misconception that had been building about assessment and the uses of assessment data. Having grown up in a time when high-stakes testing was at its peak in the United States, Lindsey felt that teachers’ roles were perhaps more rigid than she wanted to deal with in the profession.

However, Lindsey beamed when she related how the instructor “showed her how free you can be” and articulated a variety of strategies for developing types of assessments that encourage relationship-building with students, such that assessment is a tool for guiding the teaching-learning relationship in classrooms.

Lindsey described this shift in her thinking, noting that she still sees that assessing students is important, but for entirely different reasons than she had previously come to believe. Now, she feels formative assessments in particular can be used as a tool to better understand her students, “meet them where they are at,” and then empower them through high-quality instruction. She noted: “This is one of the courses that I really spent a lot of time on, and not only because you necessarily needed to, but just because of the things she was putting out were so interesting to me.” Lindsey now feels more passionate than ever about becoming a teacher and working with children to personally help them learn and grow.

How did this shift in her thinking occur? According to Lindsey, it was a combination of the non-traditional structure of the course, exposure to numerous quality exemplars, and instructor-guided use of class time that facilitated collaboration and discussion among her peers.

In describing how the non-traditional structure affected her, she highlighted that the instructor was “behind the scenes” planning everything. She felt the instructor had very carefully chosen their “homework” assignments, which were to be explored on the days when the class did not meet, so that all students came to class prepared and class time was maximized for optimal learning. Lindsey noted with a wry chuckle, “It was weird because […] homework is not normally the most fun thing, but it was actually kind of interesting every week to see what she would put up, and what else we have to learn about and read about. And it really helped!”

Lindsey felt the in-class practice time was invaluable to her and to her peers, noting that all of the assignments were “thought provoking.” She described the assignments as both challenging and fun, and she felt that repeated opportunities to gain feedback from her peers and the instructor helped to build her learning in a nonjudgmental environment. Like Ellen, Christine, and Maria, Lindsey referenced the PLC-like structure of the course and talked about how being placed in small groups that met regularly to provide one another with feedback on their work was “less intimidating” than the peer feedback models other instructors had used in various courses. Additionally, Lindsey felt that it made the instructor more approachable, noting that the instructor regularly spent time with each group, speaking with individual students in the groups, clarifying any misunderstandings and, if needed, articulating these clarifications for the benefit of everyone in the class.

While Lindsey felt everything that she learned in the class would help her when she becomes a teacher, she did submit that she wished they had learned more about how to analyze the data teachers collect in the classroom: “I think a bit more time could have been spent on analyzing student data because while we did that . . . . I think that was one of the hardest parts of the class is kind of looking at how to compile data about your students in your class as a whole. So, I think that could have been a little stressed a little bit more, but other than that it was really one of my favorite classes I've had.”

Despite wishing she had learned more about analyzing data, Lindsey stated that she feels “great” about her experience in this class, claiming, “a huge part of my worry about teaching was the rigidity of giving tests […], and now that I know that it’s a much more flexible and creative kind of thing, it makes me so much happier.”

Sara: SCP # 5

Profile from survey responses. Sara is a sophomore who, like the other SCPs, had no prior experience in an educational assessment course. Sara characterized herself in the survey as one who typically does think back over previous decisions or past behavior, and also as one who frequently reflects upon the meaning of her studies for herself. Acknowledging she had experienced a transformation in her thinking, Sara briefly described what happened that helped her to realize this: “I think that many of my previous conceptions about assessment were altered. I learned the purpose that assessment has for both teachers and students. Furthermore, there were numerous types of assessment that I learned about that I didn't know previously. As a future educator, I have learned new strategies and methods of testing student knowledge and ways to improve instruction” (question 3).

Sara was another of three students who specifically noted the impact of the non-traditional structure of the course on her learning. Additionally, she felt her classmates’ and instructor’s support influenced this change. She also responded that class activities and exercises were contributors to this change. When prompted in the survey (question 10) to think back to when she realized her views had changed, Sara reported: “The course materials explained through various supplemental readings the importance of assessment in the classroom, and their different purposes. Also, discussion with peers/instructor in the classroom changed my prior conceptions.”

Interview discussion. Approximately a week and a half after she completed the survey, Sara was able to meet during a weekday morning. The researcher and Sara met for 42 minutes, during which time Sara shared her experiences in the class and the role she felt her classmates, her instructor, and course assignments played in her transformative thinking regarding assessment. Sara’s responses in the interview matched and expanded upon her survey responses, and she spoke at length about the role her peers and the instructor played in her shift in thinking about types of assessments and their respective roles.

Overall, Sara described the instructor providing numerous high-quality examples of assessment tools, supplemental materials to explain and build on these examples, and ongoing opportunities to practice creating and using her own assessments. The practice time seemed key to Sara, as she noted it was typically provided in class, based on real-world scenarios, and coupled with opportunities to get peer and instructor support as she refined her own understanding and abilities related to creating and using a variety of assessment tools for a variety of purposes.

Sara also talked about two separate opportunities to practice applying what she was learning. First, she was concurrently enrolled in a course about English language learners (ELLs), in which she learned strategies to support ELL education. Sara felt the types of assessments she learned about in this educational assessment course provided her with numerous examples through which to reflect on her learning and solidify her understanding of the purpose and various types of formative assessment. Second, due to her ELL course, Sara had the opportunity to work with ELL students in the field over several visits. Sara felt this course in educational assessment equipped her for that fieldwork much better than many of her peers, who had not yet taken the educational assessment course. Sara discussed how she was able to apply formative assessment strategies to better understand the needs of the ELL student, an emergent reader, with whom she was paired directly.

Sara felt her ability to practice applying in the field what she had learned in the classroom was an invaluable opportunity that enabled her to build her knowledge of educational assessment. Sara also noted that, while the site provides numerous field placement opportunities in conjunction with courses, none of those opportunities are combined directly with this course on educational assessment, and she felt that such an opportunity would greatly benefit all students’ understanding of course concepts. Summarizing this position, Sara reiterated that the instructor of this course had exposed them to a variety of formative assessment strategies that could be used in their own classrooms. Sara asserted, “I think it would be really cool actually to be able to have a placement with [this] class. I think being able to actually utilize what we were learning would be beneficial to our knowledge because just seeing the differences. […] It would be really beneficial for us to see how students are actually able to communicate what they’re learning […] through formative assessments.”

Like both Maria and Lindsey, Sara talked about how the non-traditional structure of the course, which prompted students to engage with a variety of materials one day a week in lieu of class time, led her to come to class prepared and helped her and her peers to maximize learning during their class time. Furthermore, Sara felt the instructor’s modeling of best practices encouraged risk-taking and creativity. During class sessions, Sara noted, she and her peers were afforded numerous opportunities to practice and improve upon their work, and she emphasized that her growth occurred because the instructor built in opportunities for peer and instructor feedback. In this way, the instructor utilized formative assessment multiple times prior to requiring students to submit their work for summative evaluation.

Ultimately, Sara felt that her enrollment in this course had brought about a transformation in her learning because of the opportunities for practice both in the class and in her external field placement. She stated that she realized her mindset about the role of assessment was changing midway through the course as she thought about how she would be able to use what she was learning as a teacher, and how it differed from what her K-12 teachers had done when she was a student in their classrooms. Sara stated that being a student in this course and receiving the combined value of collaborative learning with her peers, support from her instructor, and opportunities to practice using the multiple strategies to which she was exposed, “Just honestly opens up your eyes to a whole new world of teachers and assessing.” She ended the interview talking about how she looked forward to using everything she had learned when she was a teacher by constantly balancing the relationship between planning for instruction, measuring what students were learning, and utilizing summative assessments when she was ready to gauge students’ final understanding. Sara smiled, ending with, “This is really beneficial because it guides you in figuring out how to better your students, better your instruction, better your classroom as a whole.”

Analysis of Data Related to Research Questions

Now that preliminary program and personal demographic data have been explored in isolation, they will be reviewed and analyzed in conjunction with the research questions that guided this study, including the overall guiding question regarding the impact of learning experiences on preservice teachers, as well as two subquestions related to potential shifts in these preservice teachers’ habits of mind and the extent to which learning experiences may have contributed to these shifts.

Primary Research Question: Impact of Learning Experiences

The primary research question seeks to explore, “How are elementary preservice teachers’ habits of mind about formative assessment strategies (FAS) impacted by their learning experiences?” Key to exploring this research question is understanding how many students felt they experienced transformations in their learning, and how many of those students attributed these shifts to the individuals or activities that comprised their learning experiences. Strong context is provided through open-ended survey responses, as well as salient remarks made by case study participants during interviews and relevant passages from analysis of their work.

Primary Research Question: Quantitative Analysis

Survey questions 2 and 4-9 provide quantitative data that help to reveal how elementary preservice teachers’ habits of mind about formative assessment strategies are impacted by their learning experiences, indicating whether preservice teachers felt they experienced a transformation as a result of other people, learning activities in their course, or significant life changes. These responses can then be quantified again with respect to TLT.

Question 2 of the survey asked students to identify whether they had experienced a time, while enrolled in this course about educational assessment, when they realized that their values, beliefs, opinions, or expectations related to assessment techniques, the role of formative assessment, or the role of summative assessment had changed. All participants responded to this question. A majority of participants (N=25) responded “yes” (representative of 78.1% of responses); some (N=7) responded “no” (representative of 21.9% of responses). Follow-up questions asked whether a person (question 4: yes, N=19; no, N=6; 7 omissions); part of a class assignment (question 6: yes, N=25; no, N=7; 0 omissions); or a significant change in the participant’s life (question 8: yes, N=3; no, N=22; 7 omissions) had influenced the change. The following table places these data into perspective among those participants who indicated they had experienced a change. Note that individuals who responded “no” to survey question 2 were instructed to skip ahead to question 11, and as such, data from those individuals are not represented in Table 19.
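These percentages follow directly from the frequencies over the full sample of thirty-two respondents: 25/32 ≈ 78.1% and 7/32 ≈ 21.9%. The same denominator applies to the frequencies reported for questions 4, 6, and 8 above (e.g., 19/32 ≈ 59.4%).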

The results from Table 19, below, reveal that a majority of students felt a person (59.4%) or a class assignment (78%) impacted their shift in thinking, while fewer than 10% of students felt a significant life change impacted this shift. Though small, and though never indicated as the sole reason for a shift in thinking, the 9.4% of students who did feel a significant life change impacted this change is still essential to note, as research indicates that individuals are more open to changes in habits of mind, and therefore to transformations in learning, in response to a variety of factors, including major life events (Dirkx, 1998; King, 1997, 2004, 2005, 2009; Mezirow, 1991, 1997, 2003; Mezirow & Associates, 2009; Taylor, 2000, 2009). The life changes noted by students who felt such a change was part of their transformation regarding educational assessment (question 9), which will be explored later in conjunction with Table 23, included moving (N=1), a divorce/separation/break up (N=1), and the death of a loved one (N=1).

Table 19. Participant-Indicated Causes of Perspective Change

Cause                      Indicated    Frequency    Percent
Person                     Yes          19           59.4%
                           No           6            18.8%
                           Skipped      7            21.9%
Class Assignment           Yes          25           78%
                           No           7            22%
                           Skipped      -            -
Significant Life Change    Yes          3            9.4%
                           No           22           68.8%
                           Skipped      7            21.9%

Reviewing these data related to transformation shifts starts to get at the heart of King’s (2009) work, in which she articulates an identifiable PT-Index to describe the sort of shift, or perspective transformation, a participant experienced as a result of his or her education or other factors. With respect to this study, a suitable adapted description of each PT-Index involves applying the Index to the central topic of the study, as refined in question 2: Since you’ve been enrolled in this course about educational assessment, do you believe you have experienced a time when you realized that your values, beliefs, opinions or expectations about assessment techniques, the role of formative assessment, or the role of summative assessment have changed?

The modified PT-Index enabled scoring of participants as those who did not have a perspective transformation experience toward the role of assessment (PT-Index = 1); those who had a perspective transformation experience toward the role of assessment that was not associated with their experiences in this course (PT-Index = 2); and those who had a perspective transformation toward assessment that was associated with their experiences in the course (PT-Index = 3). Notably, PT-Index 3 encompasses the levers of change involving support and class assignments, which were the primary learning experiences intended to be measured through the LAS as entities known to impact transformative learning in teacher education.
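To make the scoring rules concrete, the following minimal sketch expresses the adapted PT-Index logic in code. The dictionary keys and category labels are hypothetical stand-ins for LAS items 2, 5, 6, and 8, not identifiers from the instrument itself, and indices 2 and 3 are treated as non-exclusive because, as discussed below, participants could cite both course-based and external triggers.

COURSE_PERSONS = {"another student", "classmates", "instructor"}

def pt_indices(resp):
    """Return the set of adapted PT-Indices supported by one participant's
    responses; a participant citing both course-based and external triggers
    is counted under both index 2 and index 3."""
    if not resp["q2_changed"]:                        # no reported change
        return {1}
    indices = set()
    if resp["q5_persons"] & COURSE_PERSONS or resp["q6_assignment"]:
        indices.add(3)                                # associated with the course
    if resp["q8_life_change"] or "advisor" in resp["q5_persons"]:
        indices.add(2)                                # associated with outside factors
    # A reported change with no recorded trigger defaults to index 2 here;
    # that default is an assumption of this sketch, not a rule from King (2009).
    return indices or {2}

# Example: a participant citing both a class assignment and a life change
resp = {"q2_changed": True, "q5_persons": {"instructor"},
        "q6_assignment": True, "q8_life_change": True}
print(pt_indices(resp))                               # -> {2, 3}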

Ascribing the PT-Index to questions 2, 4, 6, and 8, as just explored through the depiction and analysis of Table 19 (that is, whether individuals had experienced a transformation shift and whether it was the result of a person, class assignment, or significant life change), helps to portray those data. Participants who responded “no” to question 2 are identified with PT-Index 1, as those who did not experience a transformation change (N=7). Participants who indicated that they had experienced a transformation change but ascribed it only to a significant life change, in response to question 8, are identified with PT-Index 2 (N=3).

More complex was determining the frequency and percent of responses for PT-Index 3, which required investigating associated question 5 (this question named a variety of individuals, some involved directly with the course and some not, who may have impacted a shift associated with a person as per question 4) and combining these data with responses to question 6 (Was it part of a class assignment that influenced the change?). Specifically, question 5 asked participants who indicated an individual was responsible for their transformation change to identify whether the person(s) responsible included those directly involved with the course (other students or classmates, the instructor) or an advisor (someone not directly involved with the course); it also provided space for another individual to be described.

A total of nine (9) participants indicated another student or classmates were responsible for the change; 21 participants named their instructor; four (4) participants indicated their advisor; no participants selected other. To analyze those identified by PT-Index 3, only those individuals connected directly with course learning experiences (another student, classmates, or the instructor) were accounted for. The other type of trigger identified by PT-Index 3 is a class assignment, as identified in question 6. Seventy-eight percent of participants, representing a majority, also indicated their transformation change was impacted by a class assignment (N=25).

No students indicated that only a significant change in their lives affected this transformation, although such an indication would have been grounds for exclusion from the study data. However, because some students may have indicated transformations associated with factors both outside of the course and because of the course (for instance, both an advisor external to the course and the course instructor; or a major life change and course assignments), the totals by frequency and percent of index shown in Table 20 exceed the total number of participants in the class (7 + 3 + 25 = 35 index assignments across 32 participants):

Table 20. PT-Indices of Participants Indicating Transformative Learning Toward the Role of Educational Assessment

Perspective Transformation Index                            Frequency    Percent
PT-Index 1: No transformation                               7            21.9%
PT-Index 2: Transformation not associated with the course   3            9.4%
PT-Index 3: Transformation associated with the course       25           75%

To further explore the question, “How are elementary preservice teachers’ habits of mind about formative assessment strategies (FAS) impacted by their learning experiences?” it is necessary to delve more deeply into responses from participants who indicated that their shift in thinking was related to a person or to a class activity in order to better understand the relationship among their responses and the potential impact of learning experiences.

Participants who indicated that a person impacted this change for them selected only from the individuals listed in the survey (question 4); though participants were encouraged to mark and describe others who may have been responsible, none chose the “other” option. In question 5, a majority of the participants who answered this question (N=20) noted that their instructor’s support (80%) and/or a challenge from their instructor (30%) were the leading contributors to their shift in thinking; some indicated another student (20%) or, more broadly, the support of their classmates (25%), were contributors.

Four participants (20%) indicated their advisor’s support also contributed to the change, but a review of individual responses indicates that no participant named their advisor (external to the class) as the only person contributing to this change. These data are displayed in Table 21, which indicates the frequency and percent of the person(s) that participants felt were responsible for their shift in thinking:

Table 21. Person(s) Indicated as Contributing to Transformative Learning by Participants

Person(s) Indicated                    Frequency    Percent
Another student's support              4            20%
Your classmates' support               5            25%
Your advisor's support                 4            20%
A challenge from your instructor       6            30%
Your instructor's support              16           80%
Other (please specify)                 -            -
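The percentages in Table 21 appear to be computed against the twenty participants who answered question 5 rather than the full sample of thirty-two; for example, 16/20 = 80% for instructor support and 6/20 = 30% for a challenge from the instructor. Because participants could select multiple persons, the percentages sum to more than 100%.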

Finally, as noted previously in Table 19, 78% of survey participants indicated that they felt a class assignment contributed to their shift in thinking (question 6). To explore the potential impact of various class assignments, question 7 asked participants who responded “yes” to question 6 to identify all class assignments they felt had contributed to this shift. Twenty-five participants responded to this question and seven left it blank, which reflects those who had said that part of a class assignment had influenced the change for them (N=25) versus those who responded that a class assignment had not influenced their change (N=7).

The top three class assignments indicated in the participants’ responses to have been most influential were (1) “class activity/exercise” (N=16); (2) “verbally discussing your concerns” (N=15); and (3) “class/group projects” (N=14). Those answer selections significantly led others, such as “supplemental readings or materials” (N=8), “personal reflection” (N=7), and several others that received fewer than four selections total, as shown in Table 22, below. It should be noted that two participants selected “other” and remarked: “Mostly interactions with my professor” (N=1) and “Creating an assessment, learning objectives and scoring guides, giving out the assessment, analyzing data, thinking about how this affects future instruction” (N=1). The participant who noted “mostly interactions with my professor” did respond “yes” to question 4, indicating that it was a person’s influence that affected the shift for them, and in question 5 he or she also noted both a challenge from the instructor and the instructor’s support; therefore, the researcher contends that that response has been captured accurately. Additionally, the participant who delineated a specific activity under “other” did select both “class/group projects” and “class activity/exercise,” so the researcher considers that sentiment to be reflected accurately in the data as well.

Table 22. Class Assignments Indicated as Contributing to Transformative Learning by Participants

Class Assignment Choices                                        Frequency    Percent
Readings in a textbook                                          3            12%
Chapter questions in a book                                     1            4%
Supplemental readings or materials                              8            32%
Class/group projects                                            14           56%
Verbally discussing your concerns                               15           60%
Writing about your concerns                                     2            8%
Journal entries                                                 -            -
Self-assessment in the class                                    6            24%
Class activity/exercise                                         16           64%
Deep, concentrated thought                                      4            16%
Personal reflection                                             7            28%
Autoethnography or personal learning experience paper/project   -            -
Non-traditional structure of this particular course             3            12%
Field experience                                                -            -
Other                                                           2            8%
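As with Table 21, the percentages in Table 22 appear to be computed against the twenty-five participants who responded to question 7, not the full sample; for example, 16/25 = 64% for “class activity/exercise” and 15/25 = 60% for “verbally discussing your concerns.” Again, multiple selections were permitted, so the column does not sum to 100%.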

The final broad category that participants could select as a contributor to a shift in their thinking about assessment techniques and the role of assessments was the influence of a significant change in their life. If participants selected “Yes” to question 8 (indicating a significant life change had impacted their shift in thinking), they were asked to choose from among an array of options, including space to indicate and explain “Other,” in question 9. Of note, two participants who said that a significant life change was not an influential factor in the shift of their thinking nevertheless responded to question 9. One participant indicated the death of a loved one had occurred during his or her time in the class; the other participant simply indicated “Other” but specified, “None.”

An analysis of individual responses reveals that the three participants who felt a significant life change had impacted their thinking about assessments and the role of assessments in the class had experienced the “death of a loved one” (N=1), “moving” (N=1), and a “divorce/separation/break up” (N=1). Because of how some participants responded to this question, the column labels of Table 23 are revised versions of those used in similar tables previously in this research:

Table 23. Significant Life Changes Indicated by Participants as Contributing to Transformative Change

Significant Life Change        Impacted:    Impacted:   Did Not Impact:   Did Not Impact:
                               Frequency    Percent     Frequency         Percent
Marriage                       -            -           -                 -
Birth/adoption of a child      -            -           -                 -
Moving                         1            20%         -                 -
Divorce/separation/break up    1            20%         -                 -
Death of a loved one           1            20%         1                 20%
Illness of a loved one         -            -           -                 -
Change of job                  -            -           -                 -
Loss of job                    -            -           -                 -
Immigration to a new country   -            -           -                 -
Other                          -            -           1                 20%


Primary Research Question: Qualitative Analysis

Qualitative analysis was conducted to allow for matching of empirical data with quantitative findings and depended on coding as a means of formulating an inventory of topics that could be indexed or categorized (Miles et al., 2013). These qualitative data are cross-

referenced with quantitative findings and explored later in this study in terms of emergent themes

(Chenail, 1995; Miles et al., 2013). To explore the primary research question, “How are

elementary preservice teachers’ habits of mind about formative assessment strategies (FAS)

impacted by their learning experiences?” responses to open-ended survey question 3 are

reviewed in conjunction with interview data and data resulting from document analysis.

Survey question 3 was a follow-up to question 2. Question 2 had asked participants if they had experienced a change in their values, beliefs, opinions, or expectations with regard to assessment techniques, the role of formative assessment, or the role of summative assessment since they had been enrolled in this course about educational assessment. Question 3 prompted participants who replied “Yes” to question 2 to “Briefly describe what happened.” Seventy-eight percent of participants (N=25) had indicated they did feel they had experienced a change (question 2); just fewer than two-thirds of participants (N=21, 65.6%) chose to respond to question 3, while about one-third of participants (N=11, 34.4%) skipped it.

A variety of key words related to the tenets of TLT surfaced from the responses with some frequency, including verbs such as “realized” (N=11), “learned” (N=8), and “thought” (N=6). Put simply, in their own words, preservice teachers described how their thinking shifted as a result of the course as they realized, learned, and adapted what they thought during the course. The following sample quotations, extracted from question 3’s results, help to highlight this shift in the words of the participants:

• “I started to realize how important it is to measure what students learn. I realized assessment is about student growth and learning, not about grades.”

• “I think that many of my previous conceptions about assessment were altered. I learned the purpose that assessment has for both teachers and students. Furthermore, there were numerous types of assessment that I learned about that I didn't know previously. As a future educator, I have learned new strategies and methods of testing student knowledge and ways to improve instruction.”

• “I realized that assessment is so much more than testing students. It is a process almost that is happening all the time when it is done right. It is so much more than just summative assessments all the time.”

• “I realized that formative assessments can be very innovative and that teachers have the ability to make them fun and engaging for their students.”

• “I used to believe assessments were for teachers only. In creating formative assessments myself, I realized they are checkpoints for not only the teachers, but the students. Teachers fit future instruction to help [students] based on data.”

• “I thought that formative assessments were just worksheets for student's [sic] to complete almost to waste time. Instead I realized that formative assessments can be just as important as a summative assessment.”

Data from interviews and document analysis also revealed similar patterns in how participants’ habits of mind related to FAS were impacted by their learning experiences. As revealed in the survey, the participants’ interview responses indicated their habits of mind with respect to FAS were transformed as participants developed a new understanding of the types and purposes of assessments. Relevant pattern codes included “Learning of New Information from New Material” (N=75); “Learning a New Meaning from Old Material” (N=52); “Value of Quality Examples, Materials, & Resources” (N=61); and “Value of a Collaborative Learning Environment” (N=58).

Participants’ interview responses related to these codes revealed that, while each was familiar with a variety of assessment techniques from their time as K-12 students, they developed a deeper understanding of types of assessments and their roles. Specifically, each participant recognized areas where their previous teachers had or had not used FAS or summative assessments appropriately. Participants also repeatedly articulated the value of quality examples, materials, and resources from which to learn, as well as a collaborative learning environment to reinforce new learning.

“Learning of New Information from New Material” was revealed (N=75) in all transcripts and documents analyzed. Codes were generated from excerpts such as this remark by Ellen in her interview: “A lot of the strategies we learned felt really new, […] and we talked about how they were designed, what they were designed to measure.” Similarly, Maria noted, “I got a lot of new ideas on how to [measure what students know]. So, just trying to implement some new ideas and making sure that my students in the future know that assessment in this class is to see where you’re at, so I know where you’re at, how to help you.” In her interview, Lindsey talked about an activity where she and her peers had built team Pinterest boards and collected a variety of resources related to new FAS they had learned about in the course. In one discussion assignment shared with the researcher, Sara wrote about a variety of FAS that taught her about multiple ways to measure students’ learning. And, in another assignment, Christine wrote about learning to perceive the value of FAS, particularly when used before and during instruction, noting that it is often too late to gain actionable information by the time summative assessment occurs. Patterns evident from these data align to the course description and the course syllabus, which articulate that students will be exposed to a variety of methods to support teacher inquiry into student learning.

“Learning a New Meaning from Old Material,” noted 52 times, relates to the participants’ learning about the value of FAS to which they had previously been exposed without realizing the potential of these strategies. In her interview, Ellen remarked, “I was a student in this class learning to be a teacher, learning strategies teachers had used with me” and “I always thought of assessments as test. I never realized that some of my teachers, my better teachers, were always gathering data, were always getting information about what I knew, what my classmates knew.” Similarly, Christine recalled, “We were all learning about assessments. I think a lot of us had not realized when teachers we had before had used formative assessments with us.” She added that she felt most of her previous teachers had just graded everything, and that she felt that mentality had affected her and many other aspiring teachers. Maria noted the peer aspect of learning from prior experiences during her interview, stating, “We all came from different states, so we all had different experiences with assessment.” Additionally, Lindsey wrote and talked in her interview about how different her learning in this course was from her prior experiences as a K-12 student, noting that only now does she understand that using FAS “is important […] throughout the year, rather than just kind of one at the end of the year test.”

Two additional patterns that help to reveal how participants’ habits of mind were impacted by their learning experiences include repeated references to the value of quality examples, materials, and resources, as well as the value of a collaborative learning environment. With respect to exemplars, every participant referenced videos, articles, and a wide variety of sample assessment items. They also wrote about these exemplars in multiple discussion board and other assignments, several of which were analyzed by the researcher after obtaining participants’ consent. With respect to the value of a collaborative learning environment, every participant also discussed how they felt collaboration with their peers and instructor helped to clarify misunderstandings, reveal perspectives not previously considered, and reinforce new learning. Both of these sets of patterns are discussed further in conjunction with research subquestion 2, where they are expanded upon in conjunction with patterns related to multiple peer and instructor supports.

Primary Research Question: Analysis Summary

In summary, the primary research question sought to answer, “How are elementary preservice teachers’ habits of mind about formative assessment strategies (FAS) impacted by their learning experiences?” Generally, this question can be answered with the statement that elementary preservice teachers’ habits of mind about FAS are impacted by the people and the assignments they encounter as part of their learning experiences. These people and class assignments helped preservice teachers to develop their learning and grow as future teachers.

Both quantitative and qualitative data were cross-referenced for patterns and correlations, and together they reveal that values, beliefs, opinions, or expectations with regard to assessment techniques, the role of formative assessment, or the role of summative assessment had transformed for 78.1% of participants since they had been enrolled in this course about educational assessment. In exploring how participants’ mindsets were transformed, it was revealed that 59.4% felt their shift was impacted by a person, such as their peers or the instructor, and 78% felt their shift was impacted by assignments in the class. Quantitative and qualitative data confirm that the individuals primarily credited with impacting the shift were peers and the instructor. Ultimately, participants were impacted by their learning experiences because they learned from new material and gained new understanding of material to which they had previously been exposed, particularly with respect to the collaborative learning environment of the class setting.

Subquestion 1: Shifts in Habits of Mind

Having examined how elementary preservice teachers are impacted by their learning experiences, the first subquestion seeks to examine more specifically what habits of mind in relation to transformative learning were impacted in this course. It asks, “What shifts in habits of mind, if any, do elementary preservice teachers perceive they experienced regarding understanding the role of effective formative assessment strategies (FAS) in education as a result of their coursework about educational assessment?” In this section, quantitative survey data are explored in conjunction with the phases and quadrants that frame Mezirow’s TLT. Data from an open-ended survey question, as well as interviews and document analysis, are discussed in the qualitative analysis section, and then conclusions related to research subquestion 1 are drawn.

Subquestion 1: Quantitative Data

Shifts in habits of mind may be categorized through several means, and quantitative data for this study related to shifts in habits of mind may be gleaned fully from survey question 1, with further insight provided by survey questions 11 and 12. In conjunction with the theoretical framework that guides this study, the researcher categorizes these shifts by cross-referencing responses to survey question 1, specifically. Question 1 examines the general thoughts and actions students felt shifted for them in the class, and these responses are analyzed with regard to Mezirow’s ten phases of TLT. As a reminder, the ten phases of Mezirow’s TLT are generally described as indicative of those experiences, thoughts, or actions that lead toward a transformation in learning. Responses from question 1 can be considered in terms of these ten phases, or classified by the quadrants that lead to a transformation in learning as identified by Glisczinski (2007) and Herbers (1998).

Question 1 was structured to assess all participants’ perspectives in alignment with the ten phases of TLT. The question asked students to think about their educational experiences in this course and to select any statements that may apply to how they felt or responded to these experiences. All students responded, and only three (3) felt they did not experience any of the statements described. Many students indicated that they questioned their normal actions or perceptions, realized their peers did as well, and at least contemplated trying new roles and actions. The following table describes the frequency and percent of students who identified with one or more elements of Mezirow’s (1991) TLT framework:

Table 24. Responses of Students Indicating Connections to the TLT Framework

TLT phases and corresponding thought/action statements, with the frequency and percent of participants indicating each statement:

Phase 1:
  a. I had an experience that caused me to question the way I normally act. (10; 31.25%)
  b. I had an experience that caused me to question my ideas about social roles (examples of social roles include what a mother or father should do or how an adult child should act). (13; 40.63%)
Phase 2:
  c. As I questioned my ideas, I realized I still agree with my beliefs or role expectations. (9; 28.13%)
Phase 3:
  d. Or instead, as I questioned my ideas, I realized I no longer agree with my beliefs or role expectations. (7; 21.88%)
Phase 4:
  e. I realized that other people also questioned their beliefs. (13; 40.63%)
Phase 5:
  f. I thought about acting in a different way from my usual beliefs and roles. (8; 25.00%)
  g. I felt uncomfortable with traditional social expectations. (3; 9.38%)
Phase 6:
  h. I tried out new roles so that I would become comfortable or confident in them. (14; 43.75%)
Phase 7:
  i. I tried to figure out a way to adopt these new ways of acting. (14; 43.75%)
Phase 8:
  j. I gathered the information I needed to adopt these new ways of acting. (13; 40.63%)
Phase 9:
  k. I began to think about the reactions and feedback from my new behavior. (12; 37.50%)
Phase 10:
  l. I took action and adopted these new ways of acting. (8; 25.00%)
N/A:
  m. I do not identify with any of the statements above. (3; 9.38%)
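Note. Each percent in Table 24 appears to reflect the statement’s frequency over the full sample of thirty-two (32) participants, rounded to two decimal places; for example, for statement a:

$$\frac{10}{32} = 0.3125 = 31.25\%$$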


Separate from, but corresponding to, the ten phases of Mezirow’s TLT are the experiences, thoughts, or actions that lead toward habits of mind that result in transformative learning.

Glisczinski (2007) classified the dispositions of these ten phases based on the research of Herbers (1998). These data can be considered in two ways. First, and more complex, Table 25 lists and then tallies the total number of responses indicated by participants from survey questions 1 and 7, revealing all data by response and by quadrant. Using this approach, the phases and corresponding statements from question 1 each correspond to one of the four quadrants in which the thoughts or actions that lead learners toward transformative change occur. Phase 1 connects to the experience of a Disorienting Dilemma. Phases 2-9 correspond with Critical Reflection. Phase 10 corresponds with Action.

Essential to fully identifying activities in conjunction with these quadrants are data from question 7, which considers certain class activities that involved critical reflection and rational dialogue; these responses are also tabulated within the quadrants. Likewise, select statements in question 7 correspond to Rational Dialogue. (One statement from question 7, “field experience,” corresponds to the Action quadrant; although no participants marked it as an influencer in question 7, SCP “Sara” amended her response during her interview with the researcher, and that response is quantified in Table 25.) When the same phases and statements outlined in Table 24 are quantified with respect to the quadrants that lead toward transformative learning, and coupled with responses to question 7 that involved critical reflection and rational dialogue, the analysis summarizes the data from Table 24 and adds the data derived from responses to question 7 in Table 25, which follows:


Table 25. Responses of Participants Indicating Connections to the TLT Framework

Transformative learning quadrant, with the frequency and percent of response occurrence (all responses) by quadrant:

I. Disorienting Dilemma: 10 (5.24%)
  Phase 1 corresponding statement: 10
II. Critical Reflection: 157 (82.20%)
  Totals from Phases 2-9 corresponding statements: 93
  Readings in a textbook: 3
  Chapter questions in a book: 1
  Supplemental readings or materials: 8
  Class/group projects: 14
  Writing about your concerns: 2
  Journal entries: -
  Self-assessment in the class: 6
  Class activity/exercise: 16
  Deep, concentrated thought: 4
  Personal reflection: 7
  Autoethnography or personal learning experience paper/project: -
  Non-traditional structure of this particular course: 3
III. Rational Dialogue: 15 (7.85%)
  Verbally discussing your concerns: 15
IV. Action: 9 (4.71%)
  Phase 10 corresponding statement: 8
  Field experience: 1


Second, and more simply, the total number of responses by quadrant can be reviewed to determine which dispositions most influenced transformative learning for the participants in this research. Table 26 depicts the data from Table 25 in a more succinct view, to help provide insight into the specific domains affected for learners, rather than portraying the variety of tasks and processes attributed by participants as most influential. From the data portrayed in Table 26, a majority of responses (82.20%) fell within the Critical Reflection quadrant, indicating participants were most influenced by critical reflection. These dispositions by reported experiences are shown by quadrant in Table 26.

Table 26. Simplified Totals of Dispositions by Reported Experiences Leading to Transformative Learning

Transformative learning quadrant, with the frequency and percent of response occurrence (all responses) by quadrant:

I. Disorienting Dilemma: 10 (5.24%)
II. Critical Reflection: 157 (82.20%)
III. Rational Dialogue: 15 (7.85%)
IV. Action: 9 (4.71%)
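Note. The quadrant percentages in Tables 25 and 26 appear to be computed over the 191 total coded responses rather than over the number of participants; as a check for the Critical Reflection quadrant:

$$\frac{157}{10 + 157 + 15 + 9} = \frac{157}{191} \approx 0.8220 = 82.20\%$$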

Subquestion 1: Qualitative Data

Analysis of qualitative data related to subquestion 1 occurs primarily through a review of responses to survey questions 11 and 12, which asked participants to reflect on their own metacognition related to past behavior and learning, as well as question 3, which asked participants to describe their experience of change related to their habits of mind. Additionally, abundant data from interviews and document analysis were coded for patterns.


Generally, participants in this course considered themselves to be students who are thoughtful about previous decisions and past behavior, and many participants also considered themselves to frequently reflect upon the meaning of their studies. To this point, survey question 11 asked students, “Would you characterize yourself as one who usually thinks back over previous decisions or past behavior?” Of the 32 participants in this survey, 94% (N=30) responded “Yes,” whereas just 6% (N=2) responded “No.” Additionally, question 12 asked, “Would you say that you frequently reflect upon the meaning of your studies for yourself, personally?” Again, 94% (N=30) responded “Yes,” and just 6% (N=2) responded “No.” Of note, each participant who responded “No” to either question still reported experiencing a shift in his or her thinking related to assessment in this course (question 2). While the fact that most participants consider themselves reflective about their pasts and learning experiences does not directly relate to a research question, it helps to provide context about the participants in this study and aids the researcher’s exploration of how participants perceive shifts in their learning as a result of this coursework.

Though survey questions 2 and 3 were considered in the quantitative and qualitative analyses related to the primary research question of this study, to determine how preservice teachers’ habits of mind are impacted by their learning experiences, careful consideration of responses to survey question 3 is also germane to this subquestion, which considers what shifts in habits of mind preservice teachers perceive they experienced. Specifically, an analysis of all 21 responses indicates a direct correlation to the first two TLT Phases (Mezirow, 1997, 2003; Taylor, 2009) and the corresponding “Disorienting Dilemma” and “Critical Reflection” quadrants of transformative learning (Glisczinski, 2007; Herbers, 1998). Accordingly, because survey questions 2 and 3 sought only descriptions of changes in thinking, and did not broach class assignments, experiences, or activities, neither quadrant III nor quadrant IV, “Rational Dialogue” or “Action,” is represented in these responses. However, those quadrants are explored later in the qualitative analysis section pertaining to research subquestion 2, which looks at what learning experiences preservice teachers believe affected their habits of mind during this course.

Descriptive coding was applied to the responses to question 3, which were then tabulated by frequency and percent. Responses that indicated Phase 1 shifts, such as participants having experiences that caused them to question how they normally act, or experiences that caused participants to question their ideas about social roles in the classroom, were coded in line with Quadrant I: Disorienting Dilemma. Responses that indicated Phases 2-9 shifts were coded in line with Quadrant II: Critical Reflection, and include remarks such as participants questioning their ideas to determine whether new learning aligned with previous understanding or with new beliefs or expectations; discomfort with traditional social roles; or trying to find new ways of acting based on new learning. Those data are portrayed by dispositions inherent in transformative learning quadrants in Table 27, which includes sample responses elicited from participants in response to question 3:


Table 27. Dispositions by Described Experiences of Transformative Learning

Transformative learning quadrant, with the frequency and percent of response occurrence (all responses) by quadrant and relevant sample responses to survey question 3:

I. Disorienting Dilemma: 4 (19%)
  • “I always found assessments as a bad thing from a students [sic] perspective. As I learned more about that and how we can structure them positively I found them as a good thing”
  • “I used to believe assessments were for teachers only. In creating formative assessments myself, I realized they are checkpoints for not only the teachers, but the students. Teachers fit future instruction to help [students] based on data.”

II. Critical Reflection: 21 (100%)
  • “I realized that assessment is so much more than testing students. It is a process almost that is happening all the time when it is done right. It is so much more than just summative assessments all the time.”
  • “I realized that formative assessments can be very innovative and that teachers have the ability to make them fun and engaging for their students.”
  • “I realized that the kinds of assessments we give are interconnected and dictate the ways we teach not only the current class but also future classes.”
  • “I realized the different ways students need to be assessed, the different types of environments in a classroom and the role and importance of a wide range of assessments.”


Qualitative data patterns reveal two primary shifts in habits of mind among participants: an inclination towards a teacher mentality, and the ability to better understand various types and purposes of assessments. Related pattern codes included “Growth & Development of a Teacher Mentality” (N=72) and “Learning to Distinguish Between Formative & Summative Assessment Best Practices” (N=33).

The growth and development of a teacher mentality, as participants began to transition from thinking of themselves as students to thinking of themselves as future teachers, was evident through a variety of qualitative sources; it surfaced at least 72 times, as coded by the researcher. The course description articulates a goal of presenting “methods to enhance teacher inquiry into student learning,” and is written with the budding preservice teacher in mind. The course syllabus likewise outlines objectives for novice teachers, listing eight (8) separate indicators of student success for students learning to be teachers. Participant interviews and shared assignments highlighted, through both direct and indirect remarks, that each preservice teacher was experiencing this shift in mentality.

In her interview, Ellen noted, “I was a student in this class learning to be a teacher.” Her shift in thinking about her role was evident when she contrasted her current perspective with how she had thought back when she was a student. Ellen also wrote about how she had never felt comfortable speaking up as a student when she did not understand a topic, and how she now intends to use FAS to understand what her students know without embarrassing them. Christine talked and wrote about how she hated when teachers asked an entire class if they understood material, noting like Ellen that she did not feel comfortable indicating if she did not understand something. In her interview, she spoke at length about wanting to do things differently when she is a teacher by applying what she has learned about assessments and using them to gather the right information about student learning at the right times. Lindsey emphasized that she had a major shift in terms of understanding assessments now as a way to build relationships with students by showing them that she cares what they know, and how much she can help them to learn. Lindsey noted that this represents a major shift for her because, prior to this course, she had thought of assessment only in terms of rigid, scheduled testing.

Finally, “Learning to Distinguish Between Formative & Summative Assessment Best Practices” surfaced in each participant’s interview, shared written assignments, and analysis of the course syllabus a total of 33 times. The course syllabus explicitly called for students to be able to engage in “the decision-making that results from teacher assessments,” and for preservice teachers to be able to understand the “use of assessment data to inform instruction and improve teaching and learning.” Based on the qualitative data collected from participants, it appears both goals were met.

For instance, in discussing assessment practices, Ellen explained that she had learned to distinguish between types of assessments based on their respective purposes, adding that “[Assessment] is about kids, and helping kids grow.” Christine articulated what she felt was the value of FAS with respect to how teachers plan and use assessment, stating, “If everybody does poorly on an assignment, then maybe it’s not because they don’t know the information as much as it is that I didn’t teach them the information, or I was not getting it across to them in the way that they needed.” Maria, who had previously felt many assessments were intended to be used summatively, shared her revised understanding in her interview: “[FAS] is just a check! It’s just a check.” Lindsey revealed the moment she realized that her learning about assessment practices had transformed, explaining, “And that's when kind of it hit me like, ‘Oh, testing isn't just this thing that you have to do for your students to get them to move onto another level, but it's something to really aid in understanding.’” Similarly, Sara elucidated, “I think the most important thing about formative assessment is that you are getting some kind of knowledge about how your students are learning. Not like the grade for your students because actually the point of formative assessment is to see how to change your instruction or how to benefit your students.”

Subquestion 1: Analysis Summary

Overall, two main answers align with research subquestion 1, “What shifts in habits of mind, if any, do elementary preservice teachers perceive they experienced regarding understanding the role of effective formative assessment strategies (FAS) in education as a result of their coursework about educational assessment?” These answers are the growth and development of a teacher mentality out of a student mindset, and the development of a working understanding of formative and summative assessment best practices.

Quantitative and qualitative data were analyzed and cross-referenced for patterns and correlations. Matching of these data, particularly in conjunction with Mezirow’s TLT and Glisczinski’s (2007) and Herbers’ (1998) quadrants of TLT, reveals that a significant majority of participants engaged in critical reflection, which is connected to shifts in habits of mind. For instance, the data revealed a frequency of 157 coded survey responses related to critical reflection. Further survey data indicated that 94% of participants characterize themselves as individuals who usually think back over previous decisions and past behavior, and the same proportion, 94%, indicated they frequently reflect upon the meaning of their studies. Numerous spoken and written remarks by participants during their interviews and through their shared assignments reveal they each frequently engaged in critical


reflection about their own mindsets and learning in the educational assessment course as they

planned for their future careers as teachers.

Subquestion 2: Effect of Learning Experiences on Habits of Mind

With an understanding of how elementary preservice teachers’ habits of mind were impacted, and what specific shifts in habits of mind they experienced, research subquestion 2 delves more deeply into the learning experiences that impacted learners’ habits of mind. It asks, “What learning experiences, if any, do elementary preservice teachers believe affected their habits of mind during their coursework about educational assessment?” Quantitative survey data provide clear, direct insight into which learning experiences participants indicated as impactful, and qualitative data stemming from an open-ended survey question, interviews, and document analysis provide greater depth to this subquestion’s analysis.

Subquestion 2: Quantitative Data

Primary quantitative data sources related to research subquestion 2 include those survey questions which asked participants what experiences related to the course were most impactful for them, including which persons involved in the class (question 5) and which assignments (question 7), as well as question 13, which simply asked participants which persons and assignments had been a part of their experience in the course. By comparing data across these questions, the researcher is able to gain insight into what resources were available to students, and which of those resources the students felt most affected their habits of mind during their coursework about educational assessment. Question 7 relates to this subquestion to a great extent, and question 13 to a lesser extent. When these responses are analyzed in conjunction with the dispositions discussed in Tables 25 and 27, a strong correlation emerges: the individuals and activities that engaged students in critical reflection were those most likely to be credited by preservice teachers as affecting their habits of mind during their coursework about educational assessment.

Tables 28 and 29, below, explore these correlations. Table 28 depicts data from two survey questions, 5 and 13, which explored which people participants noted impacted the shift for them, and which people were part of their experience in the course, respectively. Table 29 depicts data from survey questions 7 and 13, which explored class assignments that participants noted impacted the shift for them, and which assignments were part of their experience in the course, respectively. To facilitate data presentation, the items have been reordered from greatest to least based on which people or class assignments the participants responded affected their habits of mind during their coursework. In the event that a person or class assignment received the same number of indications of its impact, the items are ranked by participants’ indication of presence. In the event that both the impact and presence were reported in equal measures by participants, they are presented in the order in which they appeared in the survey. Overall, responding participants reported that their instructor’s support (80%), a challenge from their instructor (30%), and their classmates’ support (25%) were most influential on their experiences of transformative learning. Likewise, the three persons indicated as most influential were also the three indicated as most present. These data are communicated in Table 28:


Table 28. Person(s) Indicated by Participants as Contributing to Transformative Learning

Person(s) indicated, with the frequency and percent of participants indicating the person(s) impacted transformative learning, followed by the frequency and percent of participants indicating the presence of the person(s) in the course:

Your instructor’s support: impacted 16 (80%); present 27 (84.38%)
A challenge from your instructor: impacted 6 (30%); present 16 (50%)
Your classmates’ support: impacted 5 (25%); present 22 (68.75%)
Another student’s support: impacted 4 (20%); present 15 (46.88%)
Your advisor’s support: impacted 4 (20%); present 8 (25%)
Other: impacted -; present -

Similarly, participants were asked which class assignments most contributed to their experience of transformative learning, and which were present in their class experience. While Table 22 depicted which assignments participants noted the presence of, those data alone do not indicate the impact of these assignments as perceived by participants. Exploring the perceived impact of these assignments is integral to understanding subquestion 2. Once again, it should be noted that two participants had selected “other,” but one such response indicated a specific class assignment and is included with the “class activity/exercise” data from Table 29. The other response cited interactions with the professor, and is aggregated in the data in Table 28. Inclusion of these remarks within more appropriate categories facilitates comparison across questions 7 and 13. The leading class assignments attributed by participants as impactful on their experiences of transformative learning include verbally discussing their concerns (64%), a class activity/exercise (64%), class/group projects (56%), supplemental reading or materials (32%), and personal reflection (32%). As was the case with the person(s) influencing transformative learning, the five class assignments indicated as most impactful were also those most present in this course.

Table 29. Class Assignments Indicated as Contributing to Transformative Learning by Participants

Class assignment choices, with the frequency and percent of participants indicating the assignment(s) impacted transformative learning, followed by the frequency and percent of participants indicating the presence of the assignment(s) in the course:

Verbally discussing your concerns: impacted 16 (64%); present 12 (37.50%)
Class activity/exercise: impacted 16 (64%); present 21 (65.63%)
Class/group projects: impacted 14 (56%); present 23 (71.88%)
Supplemental reading or materials: impacted 8 (32%); present 17 (53.13%)
Personal reflection: impacted 8 (32%); present 15 (46.88%)
Self-assessment in the class: impacted 6 (24%); present 10 (31.25%)
Deep, concentrated thought: impacted 4 (16%); present 5 (15.63%)
Readings in a textbook: impacted 3 (12%); present 3 (9.38%)
Non-traditional structure of this particular course: impacted 3 (12%); present 3 (9.38%)
Writing about your concerns: impacted 2 (8%); present 7 (21.88%)
Chapter questions in a book: impacted 1 (4%); present 1 (3.13%)
Journal entries: impacted - (0%); present - (0.00%)
Autoethnography or personal learning experience paper/project: impacted - (0%); present 1 (3.13%)
Field experience: impacted - (0%); present 1 (3.13%)
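Note. The presence percentages in Tables 28 and 29 are consistent with the full sample of 32 participants, while the impact percentages are consistent with smaller pools of respondents to those questions (an inferred twenty respondents for question 5 and twenty-five for question 7, as these counts are not stated directly); for example:

$$\frac{16}{20} = 80\%, \qquad \frac{16}{25} = 64\%, \qquad \frac{27}{32} \approx 84.38\%$$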

Subquestion 2: Qualitative Data

Survey question 10 was another open-ended survey question, one that considered the extension of participants’ experiences of transformative learning. Specifically, question 10 explored how class assignments, experiences, or activities were perceived by participants to have affected their experiences of change. Question 10 prompted participants to move beyond their potential experiences with a disorienting dilemma (quadrant I) and instead to consider what learning experiences, specifically, they felt affected their habits of mind. As such, Quadrants II, III, and IV, “Critical Reflection,” “Rational Dialogue,” and “Action,” are represented in their responses.

In response to question 10, 19 participants answered, while 13 skipped it. Participant responses were coded through descriptive coding, then tabulated in terms of frequency and percent. A majority of participants’ responses (79%) suggest that their experiences in the class impacted Critical Reflection for them, while nearly half (42%) of participants who answered this question were engaged in the Rational Dialogue component of TLT. One participant, representing 5% of the responses to this question, may have teetered on the verge of “Action,” though it merits noting that this action was taken within the class as part of a course assignment, and was not action conducted by the participant independently as a result of transformative learning. Those data are portrayed by dispositions inherent in transformative learning, along with sample responses elicited from participant responses to question 10, in Table 30, below:

Table 30. Dispositions by Described Experiences of Transformative Learning

Transformative learning quadrant, with the frequency and percent of response occurrence (all responses) by quadrant and relevant sample responses to survey question 10:

II. Critical Reflection: 15 (79%)
  • “The class exposed me to new ideas concerning formative assessment and helped me reach the realization that there are lots of different and fun things that you can do with formative assessments that will make them fun for students.”
  • “Online modules, group collaboration, and lectures all influenced the change in my thinking about assessments and different ways I could administer an assessment in my future classroom.”
  • “The experience of actually making assessments provided the change in my thoughts. I saw that my own motives to see where my students are at, and have them meet objectives, with the willingness to change my instruction, likely mirrored that of my past and future teachers.”

III. Rational Dialogue: 8 (42%)
  • “It was discussion led by the instructor a lot. She kept prompting us to think about our own experiences with grading, and reflecting on the assignments we give to students and what their point is, to know if we are effective as teachers.”
  • “It was mostly discussion and the professor pushing us to think about what we had experienced as students and how they influenced us. Thinking about strategies I had experienced but not realized it really made me think about what to do as a teacher.”

IV. Action: 1 (5%)
  • “Creating our own formal assessments.”
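Note. The percentages in Table 30 appear to be taken over the 19 participants who answered question 10:

$$\frac{15}{19} \approx 79\%, \qquad \frac{8}{19} \approx 42\%, \qquad \frac{1}{19} \approx 5\%$$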

In addition to qualitative data gleaned from the survey, a variety of pattern codes were noted in relation to interviews and document analysis to help the researcher explore subquestion 2, “What learning experiences, if any, do elementary preservice teachers believe affected their habits of mind during their coursework about educational assessment?” These codes correspond to ten categories, which can be further organized into learning experiences related to the course; instructor-driven or instructor-related learning experiences; and peer-related learning experiences. These patterns are truly interrelated and help to provide insight into this research question.

As noted previously, the course followed a non-traditional structure that is not dissimilar to a “flipped classroom” model, in which students learn course content as part of “homework” assignments, then enhance their learning from their homework during in-person class sessions. Students credited the influence of the instructor for the success of this non-traditional model in the educational assessment course, and credited the instructor and their peers for the quality of their learning during in-person class sessions.

Pattern codes were used to categorize comments related to the “Benefits of This Non-Traditional Course Structure” (N=21). Many comments that related to this category were cross-indicated under the pattern codes “Value of Repetitive Practice of Skills In/Out of Class” (N=41) and “Benefits of Opportunities to Practice in Class” (N=42). Accordingly, Lindsey talked about how a great deal of initial learning for the course occurred through the online learning platform outside of class, noting that she and her peers had new modules to complete each week that involved watching videos, reading articles, and responding to short, written prompts. Lindsey, Sara, and Maria referred to the homework as “surprisingly neat,” “interesting,” and “something to actually look forward to” because it enhanced everyone’s learning outside of class so that students generally came to class prepared to discuss their learning and ask questions if needed. Maria added that she felt the preparation work helped her and her peers to be prepared for “higher-order thinking.”

This higher-order thinking was part of what the participants felt they gained through the flipped model, such that their in-class lessons were able to engage them more deeply than if they were just learning the material for the first time in person. As such, participants felt afforded more opportunities to practice in class, and felt there was great value in practicing what they were learning both in and out of class. Ellen referred to the non-traditional course structure as one that provided her and her classmates with numerous practice opportunities, saying, “we practiced a lot of the types of formative assessments we can use as teachers with our own students, and then we talked about these all the time.” Christine expanded on this, noting, “We had a lot of chances to practice what we read about online or about projects we were working on, and could bounce ideas off each other, get feedback [in class]. It was supposed to be really collaborative most days [and] it was.” Maria talked about the learning experience of being able to create rubrics, write learning objectives, and create formative assessment activities outside of class, and then bring these into class to receive feedback and continually practice improving these assignments prior to submitting them for a grade to their instructor.

While revealing how the instructor supported their learning and affected their habits of mind, participants spoke and wrote about a variety of influences. Related pattern codes include “Instructor-Guided Reflection & Thought” (N=65), “Instructor Support & Reinforcement” (N=43), “Instructor Modeling of Best Practices” (N=27), and the “Value of Quality Examples, Materials, & Resources” (N=61). Related to the non-traditional structure of the course and opportunities to practice, Ellen credited her instructor, saying, “She [the instructor] has us practice all the time and talk about it with our classmates in little groups to better understand.” Christine felt the instructor constantly helped to guide their thinking through prompts, then allowed her and her classmates plenty of time to discuss and explore their thinking, noting, “She made a big difference in our thinking […], made you rethink the way you’re doing things to improve how [students] are doing.”

Maria lauded how the instructor guided their use of supplemental materials—for instance, the instructor would pinpoint sections of videos to watch, or point out page or paragraph numbers of readings she wanted the students to pay particular attention to. Maria felt the instructor facilitated an optimal learning environment, noting that even though she and her classmates were frequently engaged in sharing their developing understandings with one another and giving and receiving critical feedback, “It wasn’t intimidating in any way. I think that’s just the environment my professor created.” Lindsey, likewise, spoke very positively


about the influence of the instructor in how she supported and reinforced learning, making the

learning environment fun and engaging because of her passion for the subject.

Additionally, the participants felt they learned a great deal about the role and proper uses

of formative and summative assessment because the instructor “practiced what she preached”

and provided numerous exemplars as models to enhance their learning. The class environment was described by all as one in which they felt encouraged to take risks through the instructor’s own use of a variety of low-stakes formative assessment techniques. Ellen reported, “Our grades were not about getting the answers perfectly right, they were about us practicing, and us showing we were learning” and “She really made sure we had all the help we needed to improve our understanding. It wasn’t just, hand in an assignment, get a grade, do the next one.” Maria talked about how she and her peers used rubrics they created in class to formatively assess their classmates’ learning related to an entirely different assignment about the branches of government.

Christine, Lindsey, and Sara each talked about how consistently prepared the instructor was, and how she regularly devoted time to them in their PLCs and also individually. Lindsey even noted, “In every conversation that we had or every lesson that she was teaching, she planned it and executed it with so much excitement and enthusiasm.” The experiences of the

participants in this class appear to align with the instructor’s expectations of herself, as the

course syllabus articulates: “Teacher candidates can expect the instructor to be knowledgeable

and fully prepared to discuss the content of the course; to provide engaging classroom

experiences, to support your learning by providing timely and constructive feedback and being

available to assist with individual needs or concerns; and to provide explicit, transparent, and fair

expectations and assessments for the course.”


In addition to the evident impact of the instructor on preservice teachers’ learning experiences, participants also credited the influence of their peers in the course learning environment. Pattern codes were assessed with regard to the “Value of a Collaborative Learning Environment” (N=58), along with the “Presence of Peer Discussion to Facilitate Learning” (N=50) and the “Impact of Peer Feedback on Individual Learning” (N=33). The course syllabus clearly indicated that collaborative learning was an expectation of the instructor, and included this language: “During class, teacher candidates are expected to participate through active listening, asking relevant questions, and contributing to small and whole group discussions.” Responses in the participant interviews indicate that there was mutual understanding of this expectation.

For instance, Maria remarked that, though the course was one of her larger courses in the program, the presence and use of small learning groups—PLCs—helped the class to feel more intimate and her learning to feel more personalized. Christine described how the collaborative learning environment felt for her and how it impacted her learning experience: “We were learning together. I did not really know anyone in the class, so it wasn’t like we were trying to be cool and impress each other with our grades, and most of the stuff, like it wasn’t graded for being perfect, you know, so we could just actually learn together without worrying.” Sara talked about how many opportunities she and her peers had for discussion, and she noted that a major benefit of the collaborative learning environment was learning from other students whose prior educational experiences had occurred in a variety of settings all across the United States.

In addition to the benefits inherent in rich discussion, participants seemed to feel that peer feedback significantly impacted their habits of mind during their learning experiences in this course. Christine directly credited peer feedback as the significant factor that impacted her


learning, then explained, “We had a lot of chances to practice what we read about online or about

projects we were working on and could bounce ideas off each other, get feedback.” She

elaborated, “We talked […] about assignments we were doing to apply what we learned and

talked to help each other get better.” Likewise, Maria felt peer feedback helped to improve her

own learning, stating that she and her classmates had multiple chances to improve their work

before submitting it for a grade. These chances were based on the opportunities provided in

class to speak with PLC members, and then edit their work based on the feedback they received.

Maria noted how important a learning experience this was for her because it occurred at the “practice” level while she still had time to improve, and while her mistakes did not affect anyone. She compared this to what would have happened had she not had the chance to learn from peer feedback and instead had employed her flawed work—such as a rubric that did not measure what she had intended—in the field with elementary students.

Subquestion 2: Analysis Summary

Analysis and review of patterns and correlations across quantitative and qualitative data contribute to an understanding of research subquestion 2, “What learning experiences, if any, do elementary preservice teachers believe affected their habits of mind during their coursework about educational assessment?” The primary learning experiences that participants indicated affected their habits of mind include being challenged and coached by their instructor, being guided and supported by their peers, and having numerous opportunities to practice and improve upon their learning through a variety of class assignments. Specifically, preservice teachers’ dispositions were noted as shifting through learning experiences engaged in the TLT quadrants of Critical Reflection (79% of participants) and Rational Dialogue (42% of participants).


Overall, quantitative survey data revealed that 80% of participants credited their instructor’s support, 30% credited a challenge from their instructor, and 25% credited their classmates’ support. Primary class assignments indicated as most impactful in helping to transform their habits of mind include verbally discussing their concerns (64%), a class activity/exercise (64%), class/group projects (56%), supplemental reading or materials (32%), and personal reflection (32%). The persons and class assignments noted in the quantitative data results also emerged in qualitative data analysis through each participant’s interview responses and documents submitted for analysis. Finally, the course description and course syllabus referenced the expected influence of class assignments and a collaborative learning environment on preservice teachers in this course.

Emergent Themes

Through the analysis of case study data collected through a sequential, explanatory design, patterns were identified and pattern matching was explored through analysis of individual responses and correlations among all participants. This analysis led to the emergence of four primary themes. While these themes clearly emerged through patterns observed in this study’s participants’ experiences in their course about educational assessment, it is notable that the themes generally reflect and build upon findings implicated in the literature review, as the researcher intended.

Accordingly, these themes contribute to the researcher’s understanding of how learning experiences designed for preservice teachers affected their professional dispositions—their perspectives and habits of mind—related to the effective use of FAS. Furthermore, these themes address the researcher’s purpose for conducting this study, which was based on gaps identified in the literature review. The most prominent themes that emerged were related to (1) remedying misconceptions and misuses of data; (2) increasing data literacy skills; (3) expanding practice scope and sequence; and (4) re-culturing assessment training.

Remedying Misconceptions and Misuses of Data

“I think if a lot of us are learning to do better, we will do better. […] And that’s just so important.”

(SCP Lindsey, interview, referring to how many of her former teachers had misused assessments and data)

This theme encompasses both assessment and data literacy, insofar as teachers need to be proficient in identifying what they need to know, then selecting the appropriate type of assessment, to ensure that the resulting data are aligned with the initially identified need. Data generated from this study help to provide insight into preservice teachers’ perceptions of their own learning related to FAS, which in turn provides insight into how misconceptions and misuses of data may be addressed at the preservice level. These data were analyzed from quantitative survey results, and observed through reiterations of patterns such as “Growth & Development of a Teacher Mentality” (N=72), “Learning a New Meaning from Old Material” (N=52), “Learning of New Information from New Material” (N=75), “Value of Quality Examples, Materials, & Resources” (N=61), and “Value of a Collaborative Learning Environment” (N=58), each of which dealt with an aspect of participants’ learning to distinguish between inappropriate and correct understandings of assessment purposes and uses of data through reflection and discussion of their experiences and available exemplars.

Quantitative data from this study demonstrated that 78.1% of participants (N=32) felt they had experienced a shift in their values, beliefs, opinions, or expectations related to assessment techniques, the role of formative assessment, or the role of summative assessment. Moreover, 75% of these participants felt they had experienced a transformation in their mindset in these areas associated directly with the course. Further exploration into the qualitative data reveals these shifts impacted the preservice teachers’ professional dispositions related not only to what they thought, but also to what they felt they learned how to do, regarding the effective use of FAS.

For instance, in describing the transformation they experienced (survey question 3), many participants referenced misconceptions or misuses of data they had witnessed and internalized as a result of their own teachers in previous classes, and they wrote about how their thinking had changed as a result of what they realized and learned. Similar remarks were observed through document analysis of participants’ written assignments, as well as in related comments that surfaced in interviews.

The participants clearly reflected on prior misconceptions and shifts in their thinking, often reflecting back to previous learning experiences. One participant responded to survey

question 3, stating: “I Never [sic] gave any thought to informal assessments because as a student

in K-12 I was always given formal assessments. I love the ideas of creating fun and imaginative

ways in assessing student progress.” Another noted, “Before this class I thought that tests were

meant to like keep students on their toes. However, this class showed me assessments are more

about seeing where the students are at so you can make sure they are learning.” A third

responded, “I thought that formative assessments were just worksheets for student's [sic] to

complete almost to waste time. Instead I realized that formative assessments can be just as

important as a summative assessment.” During her interview, SCP Lindsey talked about prior

misuses of formative and summative data by her former teachers but noted that she now

understands they simply did not know how to use data to “know their students.” Acknowledging

a weakness in how her teachers had taught, Lindsey asserted the value of strong teacher preparation in the area of educational assessment, stating, “I think if a lot of us are learning to do better, we will do better. […] And that’s just so important.”

Additionally, SCP Christine talked about how her former teachers had misused formative and summative assessments. During her interview, Christine explained, “I want to assess students differently than a lot of my teachers assessed me.” However, participants also demonstrated a commitment to learning best practices and using data correctly to support students. Writing about changed conceptions and learning to use data correctly, one participant responded to question 3, asserting, “I realized the different ways students need to be assessed, the different types of environments in a classroom and the role and importance of a wide range of assessments” while another participant supplied, “I learned the definitions of formative and summative assessments. Before this class, I have taken both types of assessments but did not realize what they were.”

Increasing Data Literacy Skills

“I think a bit more time could have been spent on analyzing student data because while we did that […] one of the hardest parts […] is looking at how to compile data about your students in your class as a whole.”

(SCP Lindsey, interview, revealing what she would have changed about this course)

The theme of Increasing Data Literacy Skills relates to the clear and persistent need for training in data literacy and emerged in this study through quantitative and qualitative data analysis. This theme is distinctly different from the theme of Remedying Misconceptions and Misuses of Data, as the latter presumes teachers should know what types of assessments will generate the data needed. Data literacy refers more specifically to the knowledge and understanding of how to analyze and act upon collected data to support the needs of all learners. As noted, quantitative data from this study demonstrated that 78.1% of participants (N=32) felt they had experienced a shift in their professional dispositions related to assessment types and purposes, and 75% of those who indicated they experienced this shift associated the shift directly with their participation in this course. Additionally, patterns related to participants’ “Learning of New Information from New Material” (N=75), “Value of Quality Examples, Materials, & Resources” (N=61), “Learning New Meaning from Old Material” (N=52), “Instructor Modeling of Best Practices” (N=27), and “Learning to Distinguish Formative & Summative Assessment Best Practices” (N=33) surfaced in interviews and document analysis, contributing to the emergence of this theme.

Although the misconceptions and misuses of data resulting from inadequately prepared teachers in the field are broadly documented, participants in this educational assessment course appear on track to at least have an understanding of their own misconceptions, and to be appreciative of opportunities not only to remedy prior learning, but also to build their skill sets in relation to data literacy. While data pertaining to the theme of Remedying Misconceptions and Misuses of Data indicated that participants learned the difference between types of assessments, patterns related to the theme of Increasing Data Literacy Skills support the theory that participants also learned the skills associated with this knowledge in this educational assessment course.

For example, in response to survey question 3, which asked participants to briefly describe their experience of transformation, one participant disclosed, “I previously did not know how to create formative or summative assessments, so this class XXXX [course name, redacted] allowed me to create and grasp skills to do so.” In her response to this survey question, SCP

Maria wrote, “I used to believe assessments were for teachers only. In creating formative


assessments myself, I realized they are checkpoints for not only the teachers, but the students.

Teachers fit future instruction to help [students] based on data.”

Survey question 10 asked participants to describe what being in this course had to do with their experience of change, and multiple responses related to the pattern of being able to use what they learned in this class. One preservice educator described how the activity of learning to construct a formative assessment helped to clarify his or her prior misconception about the role of formative assessment: “I never fully agreed with what I believed

about formative assessment because I was never good with tests, but when I was creating a

formative assessment for this class I learned the true purpose of formative assessment and that

really changed my opinion. My teachers used everything for grades, practically like tests. Now

I feel like I can create true formative assessments to learn what students know, to change my

instruction if I need to, or just give a kid more support.” Another wrote, very simply, that the

experience of “creating our own formal assessments” changed his or her perceptions about the

types and roles of assessments. In her answer to this question, SCP Maria explained, “The

experience of actually making assessments provided the change in my thoughts. I saw that my

own motives to see where my students are at, and have them meet objectives, with the

willingness to change my instruction, likely mirrored that of my past and future teachers.”

Very notably, each SCP expanded on the types of written responses received in the survey, discussing how it was not just the act of learning to create formative assessments that affected them, but also the practice they received in learning how to analyze collected data and in collaborating with peers to plan ways to adjust instruction based on these data. Indication of this theme may not have surfaced without SCP elaboration about the application of this skill, as the survey was not


tailored very closely to data literacy or the skills associated with professional dispositions of

emerging teachers.

Two SCPs indicated this was an area they wished had been expanded upon in class; a third SCP felt she had enough practice that she could replicate this skill if needed. For instance, Maria mentioned that she and her peers learned how to create both formative and summative assessments. She noted, however, that they mainly concentrated on analyzing summative assessments, so that they could create purposefully aligned formative assessments that would help them to

improve student progress toward summative standards. SCP Lindsey felt that they spent some

time learning to analyze data, but that it was one of the hardest skills practiced in class, and noted

she wished they had spent more time on this skill. SCP Sara talked about one activity where the

class was asked to analyze a student’s scores on a standardized assessment and then role play

communicating the implications of this analysis to the student’s parents. Sara felt she could

replicate this skill in her own classroom because she took a lot of notes based on definitions she

learned and discussions she had with her classmates about this assignment.

Expanding Practice Scope and Sequence

“The course could have been offered longer, past the semester. I think more opportunities to practice, more time in the PLCs, was really helpful. The class had to end when the semester ended. It’s a really good thing it was a required class because it was so beneficial!”

(SCP Maria, interview, responding to what could have been done differently in the class)

Building on the themes of Remedying Misconceptions and Misuses of Data and

Increasing Data Literacy Skills is the theme of Expanding Practice Scope and Sequence in teacher preparation programs. Participants from this course indeed indicated they recognized the


need for repeated practice, and for authentic, hands-on practice opportunities. However, expanding beyond what scholarly literature had revealed, they overwhelmingly asserted the importance of gaining peer feedback to help them improve on the skills they were practicing.

These results indicate that opportunities for preservice educators to practice data literacy should be scaffolded in such a way that numerous opportunities to receive feedback and build on learning are routinely present in educational assessment coursework.

While patterns indicative of the theme Expanding Practice Scope and Sequence are implied through some quantitative data, such as the number of participants who indicated that a challenge from their instructor (30%), their instructor’s support (80%), class activities/exercises (64%), and class/group projects (56%) helped to transform their learning, these deductions would not be possible without correlating qualitative data. As such, the primary sources of data leading to pattern codes associated with this theme are qualitative, particularly qualitative survey questions and interview responses. The pattern code “Benefits of Opportunities to Practice” arose 42 times, while the “Value of Repetitive Practice of Skills In/Out of Class” arose 41 times. The “scope and sequence” component of this theme is indicated through the actual comments that resulted in multiple pattern codes, such as “Value of a Collaborative Learning Environment” (N=58), “Presence of Peer Discussion to Facilitate Learning” (N=50), “Instructor Support & Reinforcement” (N=43), and “Impact of Peer Feedback on Individual Learning” (N=33). In other words, opportunities to practice new learning may not be sufficient to prepare preservice teachers to enter the field. Participants in this course cited tremendous value in repetitive practice and in opportunities to revise their work based on feedback from peers and their instructor.


A synthesis of these pattern codes emerged in the participants’ interviews. For instance, each SCP spoke multiple times about the value of ongoing practice both in and out of the course, based on challenges and guidance from the instructor and course assignments. Maria rued that the class was only one semester long, noting how beneficial the opportunities to practice and collaborate had been. Ellen said they “practiced a lot of the types of formative assessments we can use as teachers with our own students, and we talked about these all the time,” stated of the instructor, “She had us practice all the time and talk about it with our classmates in little groups to better understand,” and articulated, “We practiced building assessments, looking at assessments, and she [the instructor] really made sure we had all the help we needed to improve our understanding. It wasn’t just, hand in an assignment, get a grade, do the next one.”

Christine’s interview echoed similar comments. Christine noted, “We had a lot of chances to practice what we read about online or about projects we were working on and could bounce ideas off each other, get feedback.” Christine also shared an experience of working to create a rubric for the first time, and she noted that only through repeated practice and gaining feedback did she feel she created a product worthy of using with students in the field. Lindsey discussed how her own knowledge and skills improved when she had opportunities to bring drafts of her assignments to her PLC and other classmates, articulating that the instructor “had us do a lot of just small group work and discussions in class. And those were the best because we could ... we bounced ideas off of each other, and not only in our little table groups, but really throughout the whole class.”

Sara related a variety of class activities and individual and group projects that helped her to practice using her new knowledge and skills. For instance, she described creating a variety of assessments, comparing and contrasting formative and summative assessments, reviewing SAT


data and role playing communicating it to a parent, and writing learning objectives about the

branches of government that were scaffolded on Bloom’s Taxonomy. Of each of these, she noted that the instructor “never graded us on practicing,” and felt multiple opportunities to improve at each assignment helped her to learn and grow as a future teacher.

Re-culturing Assessment Training

“A lot of teachers, teachers I had […] graded everything… and you know, I think it’s probably because they didn’t have a class like this. I think probably research about how students learn has evolved a lot, so we are learning what teachers we had didn’t really know how to do.”

(SCP Ellen, interview, referring to the role having this course played in her shift in thinking about formative assessment strategies)

Many scholarly sources refer to the effectiveness of FAS based on educator training approaches and conclude that these methods need to be revised. While the course chosen for this study is integrated within the core curricula at this site, the value perceived by participants in the course—and their expressed desire for more like it—emerge from qualitative data as both a need from these participants and a theme that spans scholarly literature and results from this study alike. During the discussion of this potential site for study participation, the course instructor noted that this particular course was one of the only courses regarding educational assessment at the university, and it was the only such course that many of her students would take.

Unsurprisingly, only three (3) of the participants (12.5%) indicated they had taken any other course about educational assessment. In each case, the participant had taken a single course on assessing students who were ELLs. One student actually responded to this question with a “sad face” emoticon: “NONE ;(”.

Overarching pattern codes and the emergence of themes related to the need to clarify misconceptions and misuses of data, as well as to increase the scope and sequence of practice related to data literacy, strongly support the need for a re-culturation of assessment training, particularly at the preservice level. Key pattern codes that supported the explication of this theme include “Growth & Development of a Teacher Mentality” (N=72), “Instructor-Guided Reflection & Thought” (N=65), “Learning a New Meaning from Old Material” (N=52), “Presence of Peer Discussion to Facilitate Learning” (N=50), “Instructor Support & Reinforcement” (N=43), “Instructor Modeling of Best Practices” (N=27), and “Benefits of Non-Traditional Structure of Course” (N=21).

Most impactful, and underlying the importance of each code, the sheer number of participants in this class who referenced their former teachers’ misconceptions and misuses of data underscores that teachers had not been entering the field with adequate data literacy. Likewise, the frequency with which participants indicated a thirst for more practice, and particularly authentic practice coupled with peer and instructor feedback, emphasizes that preservice teachers are open and eager to build their data literacy. In her interview, SCP Ellen felt that having had even one course in educational assessment was invaluable, noting that she felt the knowledge and skills she had gained were “going to make me a better teacher from the start.” As noted earlier, Christine recognized that a number of her previous K-12 teachers had been ill-prepared to use assessments correctly, but she felt the learning she and her peers attained in this single class would prepare them to be better teachers than many they had learned from. She asserted, “I think if a lot of us are learning to do better, we will do better. […] And that’s just so important.”

Maria noted she planned to put much more time and effort into assessment going forward, now that she recognizes more deeply the value of FAS. While chatting after our formal interview ended, Maria shared that some of her friends were pursuing teacher education degrees at other universities and did not have even one course about educational assessment like she did. She seemed aghast as she shared this, arguing, “They’re never going to be prepared to support student learning. They’re just not!” Lindsey touted how her views about FAS had changed, highlighting, “It will definitely affect what I do in the future!” Sara shared similar sentiments, discussing how she had a clearer understanding of how to implement formative and summative assessments as a result of the course. She stated that learning this balance “…is really beneficial because it guides you in figuring out how to better your students, better your instruction, better your classroom as a whole.”

Synthesis of Themes

The patterns that emerged from quantitative and qualitative analysis of this study build a collective narrative of themes that helped the researcher to gain a better understanding of how learning experiences designed for preservice teachers may affect their professional dispositions related to the effective use of FAS. Furthermore, the narrative of these themes relates to and builds upon the survey of scholarly literature provided in the literature review.

Understanding educators’ perceptions of their own learning regarding FAS can provide insight into how misconceptions might be addressed through training. When misconceptions can be remedied through refined preservice teacher education, teachers entering the field may possess a stronger ability to understand and correctly use data. To facilitate this improved teacher preparation, scholarly research, coupled with data analysis from this study, suggests that preservice teacher education should focus not only on assessment literacy, but also explicitly aim to increase the skills associated with data literacy.

While this case study represents only the experiences of a small group of research participants (N=32) and cannot be generalized, the results clearly indicate a need for more types of practice activities. Additionally, results from these participants suggest the possible value inherent in receiving, and being able to act on, peer and instructor feedback to improve one’s nascent professional dispositions. Finally, to help bring about these shifts in preservice teacher education, there is a need for a re-culturing of assessment training that starts at the preservice level, well before teachers ever enter the field.

Reflexivity

When I commenced this doctoral program and initially brainstormed my problem of practice, I approached this research with the intent of becoming a true scholar-practitioner in the field of teacher education. My own positionality was affected by my former roles as a K-12 student, a student in an undergraduate teacher preparation program, a classroom teacher, a graduate student learning about school leadership, and additional career roles training teachers at the state and national levels. I presently serve as the Dean of Students for a STEM teacher education program.

My experiences in each of these roles have led to the culminating belief that many, if not most, teacher preparation programs are not fully preparing teachers to enter the field with adequate knowledge and skills related to data literacy, and particularly related to the effective use of FAS. Additionally, I have come to believe that teachers in the field also are not receiving appropriate training through professional development to build their own knowledge and skills to an adequate level. At the heart of my concern about these areas of need is the driving belief that students who are not taught by teachers who employ FAS in their instructional practices are not being served fairly.

Having completed this study, I find that even one educational assessment class that seems to have been incredibly well-received by its participants is still just… one class. I was taken aback to discover that the program description for this site—one well known for its liberal arts education and its teacher preparation program—made almost no reference to how standards for teacher preparation have evolved, noting only the emergence of technology in its program in response to these standards. Absolutely no mention was made with regard to assessments, assessment literacy, data literacy, or any similar topic. After analyzing collected survey data, reading the course syllabus, reviewing assignments shared with me by participants, and revisiting interview transcripts multiple times, I was struck with the realization that one class, no matter how wonderful, was simply not enough. Indeed, this study caused me to reflect again on my own teacher preparation, and I looked up my undergraduate teacher preparation transcript from over a decade ago: in the scope of earning my Bachelor of Science in English Education degree, I had taken dozens of courses. Nineteen of them were courses about literature required for my degree. Two were based on educational psychology. One was about school law. One was about special education. And just one, like this research site required, was about educational assessment.

Armed with the knowledge that little appears to have changed in the field of teacher education, I am still committed to acting as a change agent in teacher preparation. I desired to learn about the types of learning experiences that would help better prepare preservice teachers entering the field to use FAS effectively, and I feel empowered to do that now, more than ever.

As often happens when one gathers and analyzes data, however, I have just as many new questions as I have answers to the questions I did ask. It is my greatest hope for my career that I can continue to research and learn about implications related to the learning needs of preservice teachers. If preservice teachers enter the field equipped with a more solid data literacy foundation, there will be more opportunities for universal implementation of FAS and positive impacts on K-12 student achievement.


Chapter 5 – Implications

In this chapter, a summary of the study is provided, which includes a brief review of context, reiteration of the research questions, and a review of the methodology and analysis utilized. Findings and themes are reiterated briefly, then explicated with respect to the theoretical framework and scholarly literature. Finally, applications of praxis are explored.

Context and Review of the Problem of Practice

Based on the evolution of the InTASC standards and goals first articulated by NCATE and TEAC, and now espoused by CAEP, scholars in teacher education call for explicit training for educators that is hands-on and authentic, such that educators can integrate these standards into ongoing practice (Henson, 2009; Memory et al., 2003; Shaw, 2013).

Unfortunately, there has been an overemphasis on summative assessment in the field of education, which has contributed to a weakness in training around the knowledge and skills associated with the effective use of FAS (Allendoerfer et al., 2014; Bennett, 2011; Chueachot et al., 2013; Collins, 2012).

Multiple studies suggest that this overemphasis on summative assessment has been caused by a misalignment between professional dispositions and pedagogical best practices (Bennett, 2011; CfBT Education Trust, 2013; Collins, 2012; Hanover Research, 2014; Vlachou, 2015). To address this misalignment, researchers have suggested that improving teacher preparation programs would likely be more time- and cost-effective than targeting professional development for teachers who are already in the field (Burns, 2005; Jacobs et al., 2015; Marshall, 2007).

While existing scholarly research delineates current knowledge about the professional dispositions of preservice teachers in the domain of FAS, there is a gap in the research related to how these dispositions are shaped as a result of teacher preparation coursework. This research was intended to help provide information related to this gap by exploring how the professional dispositions—that is, the perspectives and habits of mind—of preservice teachers may be affected by their learning experiences in a course about educational assessment. Three questions guided this study:

• Primary Research Question: How are elementary preservice teachers’ habits of mind about formative assessment strategies (FAS) impacted by their learning experiences?

• Research Subquestions: (1) What shifts in habits of mind, if any, do elementary preservice teachers perceive they experienced regarding understanding the role of effective formative assessment strategies (FAS) in education as a result of their coursework about educational assessment? (2) What learning experiences, if any, do elementary preservice teachers believe affected their habits of mind during their coursework about educational assessment?

Review of Methodology and Analysis Approach

This research relied on a mixed-methods case study to examine how learning experiences designed for elementary preservice teachers at a large liberal arts university in the Mid-Atlantic region, during their engagement in a course about educational assessment, affected their habits of mind related to the effective use of FAS. Case study methodology was chosen because it is ideal for exploring the beliefs and practices of participants within an environment bound by time and space (Stake, 1995; Yin, 2013), offering an exploration of participants and how they experience a given phenomenon in a real-life context (Merriam, 1998; Yin, 2013). Specifically, this case study considered how course participants experienced shifts in professional dispositions based on their participation in a course about educational assessment.

Primary data collection methods for this study included a survey, semi-structured interviews, and document analysis. Thirty-two participants responded to the survey, and five of these thirty-two took part in follow-up activities, which included the semi-structured interviews and providing consent for their coursework to be studied. Document analysis was intended to include the program description, course description, and course syllabus so that the approach of each with regard to teacher training in the areas of assessment or data literacy could be coded. However, the program description made no references to either, so this was noted, and the resource was not coded.

Discussion of Findings

To explore the resulting data from this study, a sequential explanatory design approach was applied. This approach is ideally suited to mixed methods research: it structures the process as quantitative analysis first, then qualitative analysis, followed by pattern matching that involves an analysis of individual responses and correlations among participants (Creamer, 2018; Creswell & Plano Clark, 2018; Kitchenbaum, 2010). Use of this approach revealed, in short, that a majority of participants (78.1%) indicated experiencing transformative learning, and that this occurred primarily through personal meaning-making opportunities influenced by their peers, instructor, and class assignments. Just over 59% of participants cited the influence of their peers or instructor, while 78% cited the impact of class assignments. The most significant shift for participants was in their understanding of how to use assessment data elicited from a variety of assessments, and particularly from formative assessment strategies.

Also noted was the emergence of a teacher mentality, as participants began to think about themselves less as students and more as future teachers of students. Finally, participants cited that repetitive practice, both inside and outside of class, and ongoing opportunities to engage in critical reflection and rational dialogue with their peers, helped to shift their thinking about the use of formative assessment strategies.

Review of Themes

Four themes emerged from analysis of these data. These themes each correspond to professional dispositions of preservice teachers and are related to (1) remedying misconceptions and misuses of data; (2) increasing data literacy skills; (3) expanding practice scope and sequence; and (4) re-culturing assessment training. Additionally, these themes may be considered with respect to the theoretical framework of TLT that shaped various aspects of this research, as well as the scholarly literature that this research sought to explore and add to.

Explication Regarding the Theoretical Framework

TLT was chosen for its potential to reveal how learning experiences for educators may succeed or fail in transforming the dispositions of preservice educators related to the effective use of FAS. TLT was woven throughout the research questions and was the basis for selection of the primary research instrument, the LAS and LAS follow-up activities, based on the research of King (2009). The researcher determined that TLT would be the most appropriate framework for helping to illuminate how learning outcomes for preservice teachers have been guided by their learning experiences.

Based on careful exploration of the underlying concepts and assumptions of Cranton (2006), Dirkx (1998), King (2005), Mezirow (1997), Mezirow and Associates (2009), and Taylor (2009), a number of propositions were initially made with respect to this study to determine how preservice teachers might demonstrate their engagement with the ten classic phases of TLT. Firstly, preservice teachers who feel they did experience a transformation in their learning will likely do so (Phase 1) in response to a disorienting dilemma—a conversation, class activity, or encounter with another individual, for instance—that results in (Phase 2) self-examination and (Phase 3) critical reflection about one’s prior beliefs that (Phase 4) lead one to either accept or reject those beliefs. Then, preservice teachers (Phase 5) are able to contemplate the implications of this realization and (Phase 6) plan how they may act with regard to the subject in the future. They (Phases 7 and 8) may have opportunities to practice applying this new learning—in class activities, for instance—or apply these new strategies in a field setting, (Phase 9) developing over time new skills and feelings related to the topic until (Phase 10) their new understanding becomes part of their routine practices.

The engagement of preservice teachers in the phases of TLT indicates neither a positive nor a negative outcome. The process itself leaves open for researchers the opportunity to analyze and interpret the outcome of each participant’s experiences based on their transformation. Each of the five (5) SCPs indicated strong, positive transformations related to their values, beliefs, opinions, or expectations regarding assessment types and purposes. Additionally, each SCP reported a positive experience in their new learning and efficacy regarding the effective use of FAS.

To trace whether preservice teachers’ learning outcomes were guided by their learning experiences, Mezirow’s (1991) TLT phases were carefully analyzed in conjunction with quantitative survey data (see Table 24). The researcher was able to conclude that 91% of participants had experiences during the course that aligned with the initial TLT framework presuppositions noted above. Furthermore, the researcher was able to match these data with Glisczinski’s (2007) and Herbers’ (1998) Quadrants of TLT, finding that a significant majority of participants engaged in critical reflection, which is connected to shifts in habits of mind. All participants (100%) who felt they had experienced a disorienting dilemma indicated they had engaged in Critical Reflection (Quadrant II) as a result. And, of the 79% of participants who had engaged in Critical Reflection as a result of an experience associated with the course, 42% were then able to engage in Rational Dialogue (Quadrant III) related to this reflection.

The researcher was able to analyze sufficient quantitative and qualitative data to explore the purpose of the research as intended and to answer each of the research questions. As such, the researcher concludes that TLT was an appropriate framework for exploring how learning experiences for educators succeeded or failed in transforming the dispositions of preservice educators related to the effective use of FAS.

Explication Regarding Scholarly Literature

Once again, the four primary themes that emerged from the data include (1) remedying misconceptions and misuses of data; (2) increasing data literacy skills; (3) expanding practice scope and sequence; and (4) re-culturing assessment training. These themes are indicated through the data analysis of this study, and they also address and build upon the scholarly literature reviewed in Chapter 2.

Theme 1: Remedying misconceptions and misuses of data. Although correctly applied FAS are known to be among the most effective strategies for improving student achievement, research has demonstrated myriad misconceptions among educators, resulting in weak or incorrect data use. Misconceptions and misuses of data generally stem from inconsistent teacher training in FAS (Curry et al., 2015; Reeves & Honig, 2015), leaving teachers entering the field feeling improperly prepared to use student learning data to adjust instruction (Abrams et al., 2015; Ciampa & Gallagher, 2016; Crosswell & Beutel, 2012; Jimerson & Wayman, 2015).

Theme 2: Increasing data literacy skills. While a review of scholarly literature clearly indicates that a majority of practicing teachers who feel familiar with assessment or data literacy credit their preservice coursework for that knowledge and those skills (Volante & Becket, 2011), researchers caution that preservice coursework, while beneficial, generally places more emphasis on assessment literacy than on the actual skills associated with data literacy (Mandinach et al., 2015). As a reminder, assessment literacy tends to relate to the knowledge and skills associated with sound assessment construction and assessment purposes, while data literacy refers to the ability to select, interpret, and use student-related information to improve instruction or other aspects of education (Bernhardt, 2004). DeLuca et al. (2013) caution that most preservice educator coursework does not provide sufficient opportunities for preservice educators to practice the application of their learning, particularly with regard to data literacy.

Theme 3: Expanding practice scope and sequence. Scholarly research demonstrates that most current preservice teacher training is sufficient in neither the scope nor the sequence of scaffolded practice experiences, nor does it provide enough authentic, hands-on experience to adequately prepare those seeking to enter the teaching profession. Allendoerfer et al. (2014) urge those who facilitate teacher preparation programs to ensure that preservice educators are equipped with both theoretical and practical knowledge. Shepherd and Alpert (2012) call for preservice teachers to have ample opportunities to practice not only creating a variety of assessments, but also learning to analyze and plan to act on data resulting from assessment implementation. McGee and Colby (2014) cite an increase in overall preservice educator exposure to assessment literacy, yet they caution professors of preservice teachers that they must incorporate numerous opportunities for students to practice interpreting and communicating assessment results. Jimerson and Wayman (2015) discuss the range of activities described in other scholarly literature, and they find that repeated opportunities for preservice teachers to practice their skills in low-stakes data conversations are essential to adequate teacher preparation.

Theme 4: Re-culturing assessment training. A major problem raised with respect to restructuring professional development for preservice educators stems from resistance to changing culture and existing beliefs about the role of data in education (Baere et al., 2012). Baere et al. (2012) find that a majority of higher education programs do not include sufficient coursework in these areas, despite program mission statements to the contrary. For instance, in a review of teacher preparation programs’ syllabi and a survey across 208 institutions of higher education, Mandinach et al. (2015) reveal grave concerns about teacher preparation programs with respect to readying educators to use data as part of FAS. These concerns have persisted over time. For instance, a national panel report issued by the Association of American Colleges and Universities (2002) indicated that “The shape of the undergraduate curriculum was essentially fixed half a century ago” (p. 16), and scholar Randy Bass (2012) indicates that even when preservice programs respond to evolving standards for preservice teachers, added courses are usually extra-curricular, as opposed to being required within core curricula.

Applications of Praxis

Findings from this study are not generalizable, owing to the short period of time over which data were collected and the small sample size. However, findings provide implications for theory and implications for future research, as well as some direct application for praxis.

Implications for Theory

The researcher sought to explore how learning experiences designed for elementary preservice teachers during their engagement in a course about educational assessment may affect their professional dispositions related to the effective use of FAS. Numerous theoretical frameworks exist to help guide research through deeper analysis of how learners experience changes in their mindsets as a result of learning experiences. TLT, which was chosen for this study, is one such framework that helped the researcher to explore the ways that learners experience a shift in their values, opinions, beliefs, or expectations about a particular concept.

A variety of implications for those in administrative and instructional capacities in the field of teacher preparation surfaced, and these are revealed through the analysis of rich data gathered from participants. The themes that emerged from these data may be considered and used in any number of ways. On the surface, the themes simply help to synthesize the nearly 700 comments coded for further exploration, which resulted in 14 pattern codes. For example, if those in charge of structuring teacher preparation considered insights gained from these data, they might choose to review each of the themes in conjunction with standards for teacher preparation, building in opportunities to help address professional dispositions related to misconceptions and misuses of data; expand practice opportunities related to data literacy in terms of both scope and sequence; increase data literacy skills by enhancing both theory and application; and provide more required and elective course offerings in assessment training at the preservice level. Facets of these themes apply to those who structure teacher preparation programs, as well as to instructors of preservice teachers.

The themes that are most applicable to structuring preservice education courses are those related to expanding the scope and sequence of practice opportunities for teachers, as well as providing more required and elective course offerings in assessment training. Preservice teachers in this study articulated that they perceived tremendous value in the “flipped classroom” component of this course, which bolstered their ability to come to class prepared to discuss and collaborate about new learning garnered from instructor-selected video excerpts, articles, and other resources. There is not presently extensive research available about the application of a “flipped classroom” model in higher education, nor specifically with respect to teacher education, so this concept represents an opportunity both for future research and for potential application in the field by progressive programs and instructors. Additionally, a “flipped classroom” initiative may be more beneficial for programs that are able to increase the number of course offerings in assessment training. Considerations that emerged from this study related to increasing course offerings include providing separate courses in assessment literacy and in data literacy, as well as providing offerings that enable preservice teachers to engage in simple opportunities to apply their learning in real classroom settings through intermittent field experiences before engaging in full-time student teaching or matriculating into the field.

Expanding the scope and sequence of practice opportunities is also germane for instructors of preservice teachers. For instance, instructors may deliberately act on the levers of change involving preservice teacher support and class assignments, which were the primary learning experiences measured through the LAS that preservice teachers indicated impacted their transformative learning. One way to act on these levers of change is to apply effective FAS as part of the course design to ascertain incoming students’ perceptions of and attitudes towards the roles and purposes of different types of assessments, in conjunction with their familiarity with analyzing and acting on data. Doing so may help to mitigate misconceptions and potential misuses of data among preservice teachers: scholarly research shows a high likelihood that these are common among incoming future teachers, and data from this study indicated that a majority (78.1%) of preservice educators also held misconceptions about the role of assessments when they entered this course.

Another way that instructors of preservice teachers may act on implications from these data is to ensure that future teachers have ample opportunities to practice applying what they have learned through authentic, hands-on scenarios. This need is demonstrated in scholarly literature and emerged repeatedly in pattern codes that contributed to the themes regarding expanding the scope and sequence of practice, and also incorporating greater opportunities to target learning in the area of data literacy. Preservice teachers in this cohort, for example, referenced numerous real-life data practice activities that spanned both assessment and data literacy.

Assessment literacy activities included creating and applying rubrics and developing and testing learning objectives aligned to standards and Bloom’s Taxonomy. The primary data literacy activity noted by participants involved analyzing data and explaining the interpretation to students’ parents. Based on participant feedback, including more data literacy activities would better prepare preservice teachers entering the field. Moreover, ensuring that all courses offered to teachers in the area of educational assessment embed numerous opportunities to practice these types of learning experiences, in conjunction with iterative peer and instructor feedback, may prove invaluable in future learning opportunities for other preservice educators.

Implications for Future Research

The results from this study provide any number of implications for future research, particularly in terms of opportunities to explore the impacts of learning experiences for additional demographics, opportunities to measure at additional points in time and over time, and opportunities to explore this research through different applications of this or other theoretical frameworks.

Future research by demographics. Missing from this research, for instance, were opportunities to disaggregate and explore the data further based on attributes of the preservice teacher population, such as class status, whether participants were specializing in a particular concentration, or whether participants had experience with any other course on educational assessment, owing to the relatively small sample size of this study. For instance, while best practices for quantitative data analysis do not recommend using data from fewer than five participants to draw any generalizations (Creswell, 2012), the researcher did note that every student (100%) who took a prior course in educational assessment felt they had experienced transformative learning, whereas a majority (68.8%) of those who had not taken a prior assessment course still noted transformative learning from their experience in this course. Of course, these data were not significant because of the extremely small number of students who reported having taken a prior course in educational assessment, but a future study of a much larger participant pool, and particularly of students who have taken more than one course on educational assessment, may be conducted to explore how such an experience may contribute to transformative learning among preservice students.

Other instances for future research may be associated more narrowly with demographic traits, such as sex, age, or relationship status. For example, 100% of male participants and only 75% of female participants in this study indicated transformative learning. Yet, there were only four (4) male participants, so there can be no statistical significance to this finding. Nonetheless, future research may explore how male and female preservice educators experience transformative learning.

Future research at additional points in time and over time. Particularly useful for future research may be opportunities to measure how learning experiences affect the transformational learning of preservice teachers at additional points in time, as well as over time. For instance, this study essentially acted as an exit interview of sorts, gathering data from preservice teachers as they concluded this course. One SCP actually noted that she had not realized she had experienced a clear shift in her learning until she received my survey. A study that assessed preservice teachers’ perceptions as they entered a course about educational assessment, at a midpoint during the course, and at the conclusion of the course may provide different data.

Alternatively, a study may be employed to track a cohort of participants as they progress through a given teacher preparation program. For instance, the possible pool of participants for this study, and the participants who responded to this research, were primarily sophomores and juniors. Research that explored the perspectives of these preservice teachers in their freshman year, prior to their taking this course, and then compared their responses when they finished the course, may provide useful insight into the full experience of transformative learning.

Another potential longitudinal study could measure how the experience of transformative learning does or does not persist for preservice educators as they enter the field, with attention paid to the types of learning experiences that seemed to affect professional dispositions in a long-term capacity. Likewise, a study of the transformation in professional dispositions that followed participants through their senior year and into their first several years in the field could yield rich data about the types of learning experiences that had and had not been most beneficial once educators entered the field. Finally, while not touched upon in this study, there is abundant research in the area of teacher attrition. Longitudinal research about how well prepared teachers entering the field were to use data literacy skills—skills that are increasingly necessary in the era of high-stakes testing—could prove not only tremendously interesting, but also immensely useful.


Future research shaped by additional or alternative theoretical frameworks. Without a doubt, the application of TLT was the skeleton of this entire study, impacting it from research structure through research analysis. The researcher acknowledges that there are multiple ways that TLT could have been applied to a study’s structure, and that each way may have yielded a different angle for structure and interpretation. One way may be to take further steps to frame the professional dispositions of preservice teachers with respect to the effective use of FAS within the phases of TLT. For instance, shifts in dispositions were observed in relation to all phases, ranging from the experience of a disorienting dilemma based on misconceptions or misuses of data or assessment literacy, and following through the subsequent phases of self-examination; critical assessment of assumptions; transformation; exploration of new roles, relationships, or actions; planning a course of action; acquiring new knowledge and skills; trying new roles; and construction of new competencies, confidence, and efficacy. However, deeper exploration into participants’ experiences as they cycled through these phases was not conducted and may provide much greater insight into planning learning experiences to impact preservice teachers’ professional dispositions.

Finally, TLT suggests that learners may experience shifts in their mindsets related to their experiences within a course, but it also articulates that significant life changes, such as a divorce, move, or change in job, may contribute to one’s susceptibility to undergo transformative learning. No participants in this study indicated transformative learning as a result only of a significant life change. However, further exploration into the experiences of preservice educators who did undergo significant life changes that they felt triggered transformative learning is another possible area for future research.


Direct Application for Praxis

Findings from this study may be explored in conjunction with any number of presentations and workshops related to preservice teacher education, and they may be taken into account when planning guest lecture activities in higher education. Additionally, the research here may be expanded upon in future study, and those results may be publishable in any number of peer-reviewed journals. Perhaps more germane, though, is how the results of this study, which provide insight into the general understandings and shifts in habits of mind among preservice educators in a course about educational assessment, have the potential for direct application in the field in several ways.

First, findings and themes support existing research around educator preparation in the areas of assessment and data literacy by helping to portray the learning experiences that participants of this study felt were most impactful. Second, findings and themes articulated herein help to address deficiencies in current research that stem from the self-reporting nature of teacher-preparation programs (Birenbaum et al., 2011; Volante & Becket, 2011) by providing insight from preservice teachers’ perspectives. Third, and perhaps most vital, the findings and themes from this study help to convey that, despite a decades-old call for teacher preparation programs to offer robust training to prepare teacher candidates to use data effectively, even an institution of higher education touted for its exemplary teacher preparation offers only two courses in educational assessment, and requires only one.

Conclusion

Despite a variety of federal, state, and district-wide initiatives that call for educators to be both knowledgeable and skilled in the use of data to adjust instruction, training at both the preservice teacher and professional development levels is neither sufficient nor consistent (Athanases et al., 2013; Nortvedt et al., 2016). This lack of adequate preparation has resulted in educators’ misconceptions and misuses of data in the field. An array of research indicates that most practicing teachers, despite having exposure to traditional professional development to learn about FAS, instead credit the training they received at the preservice level (Volante & Becket, 2011). Yet, preservice coursework generally lacks the breadth and depth needed to ensure future educators have adequate, hands-on experiential learning that will equip them to use data effectively to support student learning when they enter the field (DeLuca et al., 2013; Mandinach et al., 2015).

There is room for improvement in both arenas of teacher training, but such improvement is likely to occur only through shifts in the culture of education, and particularly with regard to assessment training. For the reasons explored in this doctoral thesis, it is my belief that teacher preparation programs need to adapt to widespread policy change implications (Curry et al., 2015; Hill, 2011a) by conducting thorough reviews of the courses required in their degree programs and ensuring adequate coverage of coursework to address the professional dispositions required of those entering the field. Ultimately, the quotations I included in the Dedication of this doctoral thesis resonate with my motivations as a scholar-practitioner, particularly having completed this research. In many instances, including both post-secondary teacher education programs and K-12 education, we have failed to heed John Dewey’s (1944) warning from Democracy and Education: “If we teach today’s students as we taught yesterday’s, we rob them of tomorrow” (p. 167). While we may know better, in many cases, we certainly are not doing better.


References

Abrams, L. M., McMillan, J. H., & Wetzel, A. P. (2015). Implementing benchmark testing for formative purposes: Teacher voices about what works. Educational Assessment, Evaluation and Accountability, 27(4), 347-375. doi:10.1007/s11092-015-9214-9

Allendoerfer, C., Wilson, D., Kim, M. J., & Burpee, E. (2014). Mapping beliefs about teaching to patterns of instruction within science, technology, engineering, and mathematics. Teaching in Higher Education, 19(7), 758-771. doi:10.1080/13562517.2014.901962

Athanases, S. Z., Bennett, L. H., & Wahleithner, J. M. (2013). Fostering data literacy through preservice teacher inquiry in English language arts. The Teacher Educator, 48(1), 8-28. doi:10.1080/08878730.2012.740151

Avramides, K., Hunter, J., Oliver, M., & Luckin, R. (2015). A method for teacher inquiry in cross-curricular projects: Lessons from a case study. British Journal of Educational Technology, 46(2), 249-264. doi:10.1111/bjet.12233

Azis, A. (2015). Conceptions and practices of assessment: A case of teachers representing improvement conception. TEFLIN Journal, 26(2), 129-154. doi:10.15639/teflinjournal.v26i2/129-154

Baere, P., Marshall, J., Torgerson, C., Tracz, S., & Chiero, R. (2012). Toward a culture of evidence: Factors affecting survey assessment of teacher preparation. Teacher Education Quarterly, 39(1), 159-173.

Baharein, K., & Noor, M. (2008). Case study: A strategic research methodology. American Journal of Applied Sciences, 5(11), 1602-1604.

Bass, R. (2012). Disrupting ourselves: The problem of learning in higher education. EDUCAUSE Review, 47(2), 23-33.

Baxter, P., & Jack, S. (2008). Qualitative case study methodology: Study design and implementation for novice researchers. The Qualitative Report, 13(4), 544-559.

Becker, H. S., Geer, B., & Hughes, E. C. (1968). Making the grade: The academic side of college life. New York: John Wiley & Sons.

Benedict, A. E., Thomas, R. A., Kimerling, J., & Leko, C. (2013). Trends in teacher evaluation: What every special education teacher should know. Teaching Exceptional Children, 45(5), 60-68.

Bennett, R. E. (2011). Formative assessment: A critical review. Assessment in Education: Principles, Policy & Practice, 18(1), 5-25. doi:10.1080/0969594X.2010.513678

Bernhardt, V. (2004). Data analysis for continuous school improvement. Larchmont, NY: Eye on Education.

Birenbaum, M., Kimron, H., & Shilton, H. (2011). Nested contexts that shape assessment "for" learning: School-based professional learning community and classroom culture. Studies in Educational Evaluation, 37(1), 35-48.

Birenbaum, M., Kimron, H., Shilton, H., & Shahaf-Barzilay, R. (2009). Cycles of inquiry: Formative assessment in service of learning in classrooms and in school-based professional communities. Studies in Educational Evaluation, 35(4), 130-149. doi:10.1016/j.stueduc.2010.01.001

Black, P., & Wiliam, D. (2006). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7-74. doi:10.1080/0969595980050102

Boblin, S. L., Ireland, S., Kirkpatrick, H., & Robertson, K. (2013). Using Stake's qualitative case study approach to explore implementation of evidence-based practice. Qualitative Health Research, 23(9), 1267-1275.

Briscoe, F. M. (2005). A question of representation in educational discourse: Multiplicities and intersections of identities and positionalities. Educational Studies, 38(1), 23-41.

Burns, C. (2005). Tensions between national, school and teacher development needs: A survey of teachers' views about continuing professional development within a group of rural primary schools. Journal of In-Service Education, 31(2), 353-372.

Carlton Parsons, E. R. (2008). Positionality of African Americans and a theoretical accommodation of it: Rethinking science education research. Science Education, 92(6), 1127-1144. doi:10.1002/sce.20273

Caruana, V. (2011). Preservice teachers' perceptions of their perspective transformations: A case study (Doctoral dissertation). University of South Florida. (UMI 3477812)

CfBT Education Trust. (2013). Assessment for learning: Effects and impact. Retrieved from http://files.eric.ed.gov/fulltext/ED546817.pdf

Chenail, R. J. (1995). Presenting qualitative data. The Qualitative Report, 2(3), 1-8.

Chueachot, S., Srisa-ard, B., & Srihamongkol, Y. (2013). The development of an assessment for learning model for elementary classroom. International Education Studies, 6(9), 119-124. doi:10.5539/ies.v6n9p119

Ciampa, K., & Gallagher, T. L. (2016). Teacher collaborative inquiry in the context of literacy education: Examining the effects on teacher self-efficacy, instructional and assessment practices. Teachers and Teaching, 22(7), 858-878. doi:10.1080/13540602.2016.1185821

Collard, S., & Law, M. (1989). The limits of perspective transformation: A critique of Mezirow's theory. Adult Education Quarterly, 39(2), 99-107.

Collins, N. M. (2012). The impact of assessment for learning: Benefits and barriers to student achievement (Doctoral dissertation). Cardinal Stritch University. Retrieved from ProQuest Dissertations and Theses Global database.

Cooper, B., & Cowie, B. (2010). Collaborative research for assessment for learning. Teaching and Teacher Education, 26(4), 979-986. doi:10.1016/j.tate.2009.10.040

Council for the Accreditation of Educator Preparation. (2014). Update on NCATE/TEAC U.S. Department of Education recognition. Retrieved from http://caepnet.org/accreditation/caep-accreditation/caep-accreditation-resources/update-on-ncate-teac

Council of Chief State School Officers. (2011). Interstate Teacher Assessment and Support Consortium (InTASC) model core teaching standards: A resource for state dialogue. Retrieved from http://www.ccsso.org/Documents/2011/InTASC_Model_Core_Teaching_Standards_2011.pdf

Council of Chief State School Officers. (2012). Our responsibility, our promise: Transforming educator preparation and entry into the profession. Retrieved from http://ccsso.org/Documents/2012/Our%20Responsibility%20Our%20Promise_2012.pdf

Cranton, P. (2006). Understanding and promoting transformative learning: A guide for educators of adults (2nd ed.). San Francisco, CA: Jossey-Bass.

Creamer, E. G. (2018). An introduction to fully integrated mixed methods research. Thousand Oaks, CA: SAGE Publications, Inc.

Creswell, J. (2012). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (4th ed.). Boston: Pearson.

Creswell, J., & Plano Clark, V. L. (2018). Designing and conducting mixed methods research. Thousand Oaks, CA: SAGE Publications, Inc.

Creswell, J., & Poth, C. N. (2013). Qualitative inquiry & research design: Choosing among five traditions (4th ed.). Thousand Oaks, CA: Sage.

Crocker, L., & Algina, J. (1986). Introduction to classical & modern test theory. New York: Harcourt Brace Jovanovich College Publishers.

Crosswell, L., & Beutel, D. (2012). Investigating pre-service teachers' perceptions of their preparedness to teach. The International Journal of Learning, 18(4), 203-211.

Curry, K. A., Mwavita, M., Holter, A., & Harris, E. (2015). Getting assessment right at the classroom level: Using formative assessment for decision making. Educational Assessment, Evaluation and Accountability, 28(1), 89-104. doi:10.1007/s11092-015-9226-5

Datnow, A., & Hubbard, L. (2015). Teachers' use of assessment data to inform instruction: Lessons learned from the past and prospects for the future. Teachers College Record, 117(4), 1-26.

David, L. (2018). Summaries of learning theories and models. Learning Theories. Retrieved from https://www.learning-theories.com/

Davies, A., Busick, K., Herbst, S., & Sherman, A. (2014). System leaders using assessment for learning as both the change and the change process: Developing theory from practice. The Curriculum Journal, 25(4), 567-592. doi:10.1080/09585176.2014.964276

DeLuca, C., Chavez, T., & Cao, C. (2013). Establishing a foundation for valid teacher judgement on student learning: The role of pre-service assessment education. Assessment in Education: Principles, Policy & Practice, 20(1), 107-126. doi:10.1080/0969594x.2012.668870

DeLuca, C., Klinger, D., Pyper, J., & Woods, J. (2015). Instructional rounds as a professional learning model for systemic implementation of assessment for learning. Assessment in Education: Principles, Policy & Practice, 22(1), 122-139.

Dewey, J. (1938). Experience and education. New York, NY: Touchstone Rockefeller Center.

Dewey, J. (1944). Democracy and education. New York, NY: Macmillan Company.

Dirkx, J. M. (1998). Transformative learning theory in the practice of adult education: An overview. Journal of Lifelong Learning, 7, 1-14.

Freire, P. (2013). Pedagogy of the oppressed. In D. J. Flinders & S. J. Thornton (Eds.), The curriculum studies reader (4th ed.). New York: Routledge.

Ginsberg, R., & Kingston, N. (2014). Caught in a vise: The challenges facing teacher preparation in an era of accountability. Teachers College Record, 116(1), 1-48.

Glaser, B. G., & Strauss, A. C. (1967). The discovery of grounded theory. Chicago: Aldine Publishing Co.

Glisczinski, D. J. (2007). Transformative higher education: A meaningful degree of understanding. Journal of Transformative Education, 5(4), 317-328.

Groenewald, T. (2008). Memos and memoing. In L. M. Given (Ed.), The SAGE encyclopedia of qualitative research methods. Thousand Oaks, CA: SAGE Publications, Ltd.

Hanover Research. (2014). The impact of formative assessment and learning intentions on student achievement. Washington, D.C.: Author.

Hargreaves, E. (2013). Inquiring into children's experiences of teacher feedback: Reconceptualising assessment for learning. Oxford Review of Education, 39(2), 229-246. doi:10.1080/03054985.2013.787922

Henson, K. (2009). Making the most of INTASC standards. SRATE Journal, 18(2), 34-40.

Herbers, M. S. (1998). Perspective transformation in preservice teachers (Unpublished doctoral dissertation). University of Memphis, Tennessee.

Hiebert, J., & Morris, A. K. (2012). Teaching, rather than teachers, as a path toward improving classroom instruction. Journal of Teacher Education, 63(2), 92-102. doi:10.1177/0022487111428328

Hill, M. F. (2011a). 'Getting traction': Enablers and barriers to implementing assessment for learning in secondary schools. Assessment in Education: Principles, Policy & Practice, 18(4), 347-364. doi:10.1080/0969594X.2011.600247

Hill, M. F. (2011b). "Getting traction": Enablers and barriers to implementing assessment for learning in secondary schools. Assessment in Education: Principles, Policy & Practice, 18(4), 347-364. doi:10.1080/0969594X.2011.600247

Jacobs, J., Burns, R. W., & Yendol-Hoppey, D. (2015). The inequitable influence that varying accountability contexts in the United States have on teacher professional development. Professional Development in Education, 41(5), 849-872. doi:10.1080/19415257.2014.994657

Jimerson, J. B., & Wayman, J. C. (2015). Professional learning for using data: Examining teacher needs & supports. Teachers College Record, 117(4), 1-36.

Jonsson, A., Lundahl, C., & Holmgren, A. (2015). Evaluating a large-scale implementation of assessment for learning in Sweden. Assessment in Education: Principles, Policy & Practice, 22(1), 104-121. doi:10.1080/0969594X.2014.970612

Kennedy, H. L. (2000). Assessing outcomes in teacher education: Weathering the storms. Assessment Update, 12(5), 3-14.

King, K. P. (1997). Examining learning activities and transformational learning. International Journal of University Adult Education, 36(3), 23-37.

King, K. P. (2004). Both sides now: Examining transformative learning and professional development of educators. Innovative Higher Education, 29(2), 155-175.

King, K. P. (2005). Bringing transformative learning to life: Understanding its meanings and opportunities for learners and educators. Malabar, FL: Krieger.

King, K. P. (2009). Handbook of the evolving research of transformative learning. Charlotte, NC: Information Age Publishing.

Kitchenbaum, A. D. (2010). Mixed methods in case study research. In A. J. Mills, G. Durepos, & E. Wiebe (Eds.), Encyclopedia of case study research (pp. 562-564). Thousand Oaks, CA: SAGE Publications, Inc.

Kolb, D. A. (1984). The process of experiential learning. In Experiential learning: Experience as the source of learning and development (pp. 20-39). Englewood Cliffs, NJ: Prentice-Hall.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York, NY: Cambridge University Press.

LeCompte, M. D., & Schensul, J. J. (1999). Analysis and interpretation. In Analyzing and interpreting ethnographic data. Walnut Creek, CA: AltaMira Press, a division of Sage, Inc.

Leimer, C. (2012). Organizing for evidence-based decision making and improvement. Change: The Magazine of Higher Learning, 44(4), 45-51.

Libman, Z. (2009). Teacher licensing examinations—True progress or an illusion? Studies in Educational Evaluation, 35(1), 7-15. doi:10.1016/j.stueduc.2009.01.003

Lincoln, Y. S., & Guba, E. G. (2000). Paradigmatic controversies, contradictions, and emerging confluences. In N. K. Denzin & Y. S. Lincoln (Eds.), The SAGE handbook of qualitative research (2nd ed., pp. 163-188). Thousand Oaks, CA: Sage.

Linder, R. (2011). A professional learning community implements formative assessment in middle school language arts classes. College Reading Association Yearbook, 34, 71-85.

Machi, L. A., & McEvoy, B. T. (2012). The literature review: Six steps to success. Thousand Oaks, CA: Corwin, a SAGE Company.

Mandinach, E. B., Friedman, J. M., & Gummer, E. S. (2015). How can schools of education help to build educators' capacity to use data? A systemic view of the issue. Teachers College Record, 117(4), 1-50.

Mandinach, E. B., & Gummer, E. (2015). Data-driven decision making: Components of the enculturation of data use in education. Teachers College Record, 117(4), 1-8.

Marsh, J. A., Bertrand, M., & Huguet, A. (2015). Using data to alter instructional practice: The mediating role of coaches and professional learning communities. Teachers College Record, 117(4), 1-40.

Marshall, K. (2007). Preparing better teachers focus of new book published by Pennsylvania Academy for the Profession of Teaching and Learning: Focus is on Professional Development Schools and their role in enhancing teacher quality.

McGee, J., & Colby, S. (2014). Impact of an assessment course on teacher candidates' assessment literacy. Action in Teacher Education, 36(5-6), 522-532. doi:10.1080/01626620.2014.977753

McLeod, S. (2017). Kolb — learning styles. Simply Psychology. Retrieved from https://www.simplypsychology.org/learning-kolb.html

Memory, D. M., Yoder, C. L., & Williams, R. O. (2003). Using problem-centered learning for teaching collaboration in a general methods course. The Clearing House: A Journal of Educational Strategies, Issues and Ideas, 77, 67-72. doi:10.1080/00098650309601231

Merriam, S. (1998). Case studies as qualitative research. San Francisco, CA: Jossey-Bass.

Mertens, D. M. (2010). Chapter 1: An introduction to research. In D. M. Mertens (Ed.), Research and evaluation in education and psychology: Integrating diversity with quantitative, qualitative, and mixed methods (4th ed.). Thousand Oaks, CA: Sage.

Mezirow, J. (1991). Transformative dimensions in adult learning. San Francisco, CA: Jossey-Bass.

Mezirow, J. (1997). Transformative learning: Theory to practice. New Directions for Adult and Continuing Education, 74, 5-12. doi:10.1002/ace.7401

Mezirow, J. (2000). Learning as transformation: Critical perspectives on a theory in progress. San Francisco, CA: John Wiley & Sons, Inc.

Mezirow, J. (2003). Transformative learning as discourse. Journal of Transformative Education, 1(1), 58-63. doi:10.1177/1541344603252172

Mezirow, J., & Associates. (1990). Fostering critical reflection in adulthood: A guide to transformative and emancipatory learning. San Francisco: Jossey-Bass.

Mezirow, J., & Associates. (2009). Transformative learning in practice: Insights from community, workplace, and higher education. San Francisco, CA: Jossey-Bass.

Mikulec, E., & Miller, P. C. (2012). The odd couple: Freire and the InTASC teacher education standards. Journal of Thought, 47(3), 34-48.

Miles, M., Huberman, A., & Saldaña, J. (2013). Qualitative data analysis: A methods sourcebook. Thousand Oaks, CA: SAGE Publications.

Moore, W. S. (1994). Student and faculty epistemology in the college classroom: The Perry schema of intellectual and ethical development. In K. W. Prichard & R. M. Sawyer (Eds.), Handbook of college teaching: Theory and applications (pp. 45-68). Westport, CT: Greenwood Press.

Mui So, W. W., & Hoi Lee, T. T. (2011). Influence of teachers' perceptions of teaching and learning on the implementation of assessment for learning in inquiry study. Assessment in Education: Principles, Policy & Practice, 18(4), 417-432. doi:10.1080/0969594x.2011.577409

National Panel Report. (2002). Greater expectations: A new vision for learning as a nation goes to college. Washington, D.C.: Association of American Colleges and Universities.

Nortvedt, G. A., Santos, L., & Pinto, J. (2016). Assessment for learning in Norway and Portugal: The case of primary school mathematics teaching. Assessment in Education: Principles, Policy & Practice, 23(3), 377-395. doi:10.1080/0969594X.2015.1108900

Ogan-Bekiroglu, F., & Suzuk, E. (2014). Pre-service teachers' assessment literacy and its implementation into practice. The Curriculum Journal, 25(3), 344-371. doi:10.1080/09585176.2014.899916

Pat-El, R. J., Tillema, H., Segers, M., & Vedder, P. (2015). Multilevel predictors of differing perceptions of assessment for learning practices between teachers and students. Assessment in Education: Principles, Policy & Practice, 22(2), 282-298. doi:10.1080/0969594X.2014.975675

Patton, M. Q. (2002). Qualitative research & evaluation methods. Thousand Oaks, CA: Sage Publications.

Pella, S. (2012). What should count as data for data-driven instruction? Toward contextualized data-inquiry models for teacher education and professional development. Middle Grades Research Journal, 7(1), 57-75.

Pierce, R. J. (2013). Assessment for learning: Improving student outcomes through course design. Journal of Applied Learning Technology, 3(4), 26-30.

Rashid, R. A., & Jaidin, J. H. (2014). Exploring primary school teachers' conceptions of "assessment for learning". International Education Studies, 7(9), 69-83. doi:10.5539/ies.v7n9p69

Ravitch, S. M., & Riggan, M. (2017). Reason and rigor: How conceptual frameworks guide research (2nd ed.). Los Angeles: SAGE.

Reeves, T. D., & Honig, S. L. (2015). A classroom data literacy intervention for pre-service teachers. Teaching and Teacher Education, 50(1), 90-101.

Roulston, K. J., & Shelton, S. A. (2015). Reconceptualizing bias in teaching qualitative methods. Qualitative Inquiry, 21(4), 332-342. doi:10.1177/1077800414563803

Seidman, I. (2006). Interviewing as qualitative research: A guide for researchers in education and the social sciences. New York: Teachers College Press.

Shaw, P. A. (2013). Are you good enough to teach our grandchildren? Kappa Delta Pi Record, 49(4), 148-152. doi:10.1080/00228958.2013.845507

Shepherd, C. M., & Alpert, M. (2012). Reaping what you sow: The quality of a student teacher program is reflected in its student teachers. Review of Higher Education and Self-Learning, 5(15), 21-26.

Smith, J. A., Flowers, P., & Larkin, M. (2012). Collecting data. In Interpretive phenomenological analysis: Theory, method and research (pp. 56-78). Thousand Oaks, CA: Sage Publications.

Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: SAGE Publications.

Stewart, A. (2014). Case study. In J. Mills & M. Birks (Eds.), Qualitative methodology: A practical guide (pp. 145-159). Thousand Oaks, CA: SAGE Publications.

Taylor, E. (2009). Fostering transformative learning. In J. Mezirow, E. Taylor, & Associates (Eds.), Transformative learning in practice: Insights from community, workplace, and higher education (pp. 3-17). San Francisco, CA: Jossey-Bass.

Taylor, E. W. (2000). Analyzing research on transformative learning theory. In J. Mezirow & Associates (Eds.), Learning as transformation (pp. 29-310). San Francisco, CA: Jossey-Bass.

Tennant, M., & Pogson, P. (1995). Learning and change in the adult years. San Francisco, CA:

Jossey-Bass.

Trochim, W. M. K. (2006). Deduction & induction. Retrieved from

http://www.socialresearchmethods.net/kb/dedind.php

196

U.S. Department of Education. (2016). FY 2015 Annual Performance Report and FY 2017

Annual Performance Plan. Retrieved from

http://www2.ed.gov/about/reports/annual/2017plan/2015-2017-apr-app.pdf

Vlachou, M. A. (2015). Does assessment for learning work to promote student learning? The

England paradigm. The Clearing House, 88(3), 101-107.

doi:10.1080/00098655.2015.1032194

Volante, L., & Becket, D. (2011). Formative assessment and the contemporary classroom:

Synergies and tensions between research and practice. Canadian Journal of Education,

34(2), 239-255.

Wayda, V., & Lund, J. (2005). Assessing dispositions: An unresolved chanllenge in teacher

education. JOPERD, 76(1), 34-41.

Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. New York, NY:

Cambridge University Press.

Wenger, E. C., & Snyder, W. M. (2000). Communities of practice: The organizational frontier.

Harvard Business Review, 78(1), 139-145.

Winfrey, O. (Producer). (2011). The powerful lesson Maya Angelou taught Oprah. Oprah’s Life

Class. Retrieved from http://www.oprah.com/oprahs-lifeclass/The-Powerful-Lesson-

Maya-Angelou-Taught-Oprah-Video

Yan, Z., & Cheng, E. C. K. (2015). Primary teachers' attitudes, intentions and practices regarding

formative assessment. Teaching and Teacher Education, 45, 128-136.

doi:10.1016/j.tate.2014.10.002

Yazan, B. (2015). Three approaches to case study methods in education: Yin, Merriam, and

Stake. The Qualitative Report, 20(2), 135-152.

197

Yin, R. K. (2013). Case study research: Design and methods (5th ed.). Thousand Oaks, CA:

SAGE Publications.

Young, C. (n.d.). Coding the data collection. Lecture. Northeastern University. Retrieved from

https://northeastern.blackboard.com/bbcswebdav/pid-18087533-dt-content-rid-

40872720_1/xid-40872720_1

Yu, H., & Li, H. (2014). Group-based formative assessment: A successful way to make

summative assessment effective. Theory and Practice in Language Studies, 4(4), 839-

844. doi:10.4304/tpls.4.4.839-844


Appendix A: Invitation to Participate in the Study

To: Undergraduate Students in This Educational Assessment Course

From: Jamie Lee Korns, M.Ed., Doctoral Candidate at Northeastern University

My name is Jamie Lee Korns, and I am a graduate student at Northeastern University.

This Fall 2018, I am conducting a study about how preservice teachers learn about educational assessment and, specifically, about formative assessment strategies. Your instructor has allowed me to invite you to participate in this study. As an educator who is passionate about teacher training, I am interested in how preservice teachers learn new information about educational assessment and then understand it for themselves. Through this research, I hope to learn more about this process and how your learning experiences impacted your development as a preservice teacher.

Everyone who is a part of the study will be asked to answer a brief questionnaire concerning his or her experience in the course chosen for this study. This should take about 15 minutes and will be conducted online. All responses will be confidential. At the end of the questionnaire, you will be asked whether you would be willing to participate in brief follow-up activities that will also be conducted online. If so, you will be able to sign up. These follow-up activities include a short interview and permission for me to review documents relevant to your learning experiences about formative assessment strategies (e.g., your K-W-L and related class assignments). Interviews and review of these documents will greatly assist me in more accurately interpreting the survey results. The decision to participate in this research will have no effect on your school performance or school records.


I am available to answer any questions you may have about this study or the results of this research at (XXX) XXX-XXXX or [email protected]. The Human Subject Research Protection coordinator at my university's IRB Office, Nan Clark Regina, may be contacted concerning rights of participants in this study at (617) 373-4588 or [email protected].

By taking the survey online, you give your consent to participate in this study. Thank you for considering this research project! You may access the survey at https://www.surveymonkey.com/r/NEULAS


Appendix B: Informed Consent for the Online Survey

This informed consent template will appear as the first screen of the online Learning Activities Survey, implemented through SurveyMonkey.

Northeastern University, College of Professional Studies, Doctor of Education (EdD)

Name of Investigator(s): Dr. Kristal Clemons; Jamie Lee Korns, M.Ed.

Title of Project: Applying Transformative Learning Theory to Understand Preservice Teachers' Learning Experiences About Formative Assessment Strategies

Request to Participate in Research

We would like to invite you to participate in a web-based online survey. The survey is part of a research study whose purpose is to explore how preservice teachers learn new information about educational assessment, and particularly the role of formative assessment strategies, and then understand it for themselves. This survey should take about 15 minutes to complete.

We are asking you to participate in this study because you are a preservice teacher enrolled in a course about educational assessment. You must be at least 18 years old to take this survey.

The decision to participate in this research project is voluntary. You do not have to participate, and you can refuse to answer any question. Even if you begin the web-based online survey, you can stop at any time.

There are no foreseeable risks or discomforts to you for taking part in this study.

There are no direct benefits to you from participating in this study. However, your responses may help us learn more about preservice teacher education.

You will not be paid for your participation in this study.

Your part in this study will be handled in a confidential manner. Any reports or publications based on this research will use only group data and will not identify you or any individual as being affiliated with this project.


If you have any questions regarding electronic privacy, please feel free to contact Mark Nardone, NU’s Director of Information Security via phone at 617-373-7901, or via email at [email protected].

If you have any questions about this study, please feel free to contact Jamie Lee Korns, the person mainly responsible for the research, at [email protected] or (XXX) XXX-XXXX. You can also contact Dr. Kristal Clemons, the Principal Investigator, at [email protected] or (850) 629-9132.

If you have any questions regarding your rights as a research participant, please contact Nan C. Regina, Director, Human Subject Research Protection, Mail Stop: 560-177, 360 Huntington Avenue, Northeastern University, Boston, MA 02115. Tel: 617-373-4588, Email: [email protected]. You may call anonymously if you wish.

This study has been reviewed and approved by the Northeastern University Institutional Review Board (# CPS18-10-06).

By clicking on the “accept” button below you are indicating that you consent to participate in this study. Please print out a copy of this consent form for your records.

Thank you for your time.

Yours in education,

Jamie Lee Korns


Appendix C: The Learning Activities Survey

Instructions: As preservice teachers learn about new information, they often think about how it fits into their way of thinking. This questionnaire is about your experience as a preservice teacher in this course about educational assessment. Only with your help can we learn more about how preservice teachers learn about educational assessment, and especially about the role and use of formative assessment strategies (FAS). The survey only takes a short time to complete, and your responses will be kept completely anonymous. Thank you for being part of this project. We greatly appreciate your cooperation.

1. Consider your knowledge and skills related to what you know and will be able to do as a teacher. Then, thinking about your educational experiences in this course about educational assessment, check off any statements that may apply.

a. I had an experience that caused me to question the way I normally act.

b. I had an experience that caused me to question my ideas about social roles. (Examples of social roles include what a mother or father should do or how an adult child should act.)

c. As I questioned my ideas, I realized I still agree with my beliefs or role expectations.

d. Or instead, as I questioned my ideas, I realized I no longer agree with my beliefs or role expectations.

e. I realized that other people also questioned their beliefs.

f. I thought about acting in a different way from my usual beliefs and roles.

g. I felt uncomfortable with traditional social expectations.

h. I tried out new roles so that I would become comfortable or confident in them.

i. I tried to figure out a way to adopt these new ways of acting.

j. I gathered the information I needed to adopt these new ways of acting.

k. I began to think about the reactions and feedback from my new behavior.

l. I took action and adopted these new ways of acting.

m. I do not identify with any of the statements above.

2. Since you've been enrolled in this course about educational assessment, do you believe you have experienced a time when you realized that your values, beliefs, opinions or expectations about assessment techniques, the role of formative assessment, or the role of summative assessment have changed? If "yes," please continue the survey with question #3. If "no," please continue the survey with question #11.

a. Yes

b. No

3. Briefly describe what happened. ______

______

______

______

______

______

4. Was it a person who influenced this change?

a. Yes

b. No

5. If "Yes," was it . . . (check all that apply)

a. Another student's support

b. Your classmates' support

c. Your advisor’s support

d. A challenge from your instructor

e. Your instructor’s support

f. Other (please specify): ______

6. Was it part of a class assignment that influenced the change?

a. Yes

b. No

7. If "Yes," what was it? (check all that apply)

a. Readings in a textbook

b. Chapter questions in a book

c. Supplemental reading or materials

d. Class/group projects

e. Verbally discussing your concerns

f. Writing about your concerns

g. Journal entries

h. Self-assessment in the class

i. Class activity/exercise

j. Deep, concentrated thought

k. Personal reflection

l. Autoethnography or personal learning experience paper/project

m. Non-traditional structure of this particular course

n. Field experience

o. Other (please specify): ______

8. Was it a significant change in your life that influenced the change?

a. Yes

b. No

9. If "Yes," what was it? (Please check all that apply)

a. Marriage

b. Birth/adoption of a child

c. Moving

d. Divorce/separation/break up

e. Death of a loved one

f. Illness of loved one

g. Change of job

h. Loss of job

i. Immigration to a new country

j. Other (please specify): ______

10. Thinking back to when you first realized that your views or perspective about assessment techniques, the role of formative assessment, or the role of summative assessment had changed, what did program or course materials in your class have to do with the experience of change? ______

______

______

______

______

______

11. Would you characterize yourself as one who usually thinks back over previous decisions or past behavior?

a. Yes

b. No

12. Would you say that you frequently reflect upon the meaning of your studies for yourself, personally?

a. Yes

b. No

13. Which of the following have been a part of your experience in this class? (Please check all that apply)

a. Another student's support

b. Your classmates' support

c. Your advisor’s support

d. A challenge from your instructor

e. Your instructor’s support

f. Readings in a textbook

g. Chapter questions in a book

h. Supplemental reading or materials

i. Class/group projects

j. Verbally discussing your concerns

k. Writing about your concerns

l. Journal entries

m. Self-assessment in the class

n. Class activity/exercise

o. Deep, concentrated thought

p. Personal reflection

q. Autoethnography or personal learning experience paper/project

r. Non-traditional structure of this particular course

s. Field experience

t. Other (please specify): ______

14. Which of the following occurred while you have been enrolled in this class?

a. Marriage

b. Birth/adoption of a child

c. Moving

d. Divorce/separation/break up

e. Death of a loved one

f. Illness of loved one

g. Change of job

h. Loss of job

i. Immigration to a new country

j. Other (please specify): ______

15. What is your sex?

a. Male

b. Female

c. Transgender

d. Non-binary

e. Other (please specify): ______

16. What is your marital status?

a. Single

b. Married

c. Partnered

d. Divorced/separated

e. Widowed

17. What is your race? How do you identify yourself?

a. White, non-Hispanic

b. Black, non-Hispanic

c. Native American

d. Asian or Pacific Islander

e. Hispanic

f. Bi-Racial

g. Multi-Racial

h. Other (please specify): ______

18. What is your current major?

a. Education-Elementary

b. Education-Secondary

c. Education-Special Education

d. Education-Early Childhood

e. Education-Technology

f. Non-Education Major (please specify): ______

19. Prior education:

a. High school diploma/GED

b. Associate's degree

c. Bachelor's degree

d. Master's degree

e. Doctorate

f. Other (please specify): ______

20. How many semesters have you been enrolled in your program? _____

21. What is your age?

a. Below 21

b. 21-24

c. 25-29

d. 30-39

e. 40-49

f. 50-59

g. 60 or over

22. If you have taken any other courses on educational assessment, please indicate the course name(s): ______

As a participant in this survey, you are also invited to take part in follow-up activities, which include a short interview and permission for me to review documents that you share with me relevant to your learning experiences about formative assessment strategies (e.g., your K-W-L and related class assignments). These activities can all be done online. The follow-up interview may take about 30-45 minutes, and sharing any materials you choose should only take the time needed to locate and email them to me. As a reminder, any information you share will be kept completely confidential. Please indicate your willingness to participate:

Yes, I am willing to participate in a follow-up interview.

No, I would not like to participate in the follow-up interview.

If you answered "Yes," you may receive an email and/or a phone call from Jamie Lee Korns, Doctoral Candidate, to set up a time for the interview.

Name: ______ Email Address: ______ Phone Number: ______

Thank you so much for completing this survey!

Appendix D: Informed Consent to Follow-Up Activities

Northeastern University, College of Professional Studies, Doctor of Education (EdD)

Name of Investigator(s): Dr. Kristal Clemons; Jamie Lee Korns, M.Ed.

Title of Project: Applying Transformative Learning Theory to Understand Preservice Teachers' Learning Experiences About Formative Assessment Strategies

Informed Consent to Participate in a Research Study

We are inviting you to take part in a research study. This form will tell you about the study, but the researcher will explain it to you first. You may ask this person any questions that you have. When you are ready to make a decision, you may tell the researcher if you want to participate or not. You do not have to participate if you do not want to. If you decide to participate, the researcher will ask you to sign this statement and will give you a copy to keep.

Why am I being asked to take part in this research study?

We are asking you to be in this study because you are a preservice teacher enrolled in a course about educational assessment. Your responses on the survey indicate that you have undergone a shift in how you think about course material, and I would like to understand this shift better.

Why is this research study being done?

This study involves researching how preservice teachers learn about educational assessment, and especially about the role and use of formative assessment strategies. I want to understand how different types of learning experiences that an instructor might plan may affect learners' perceptions, beliefs, and habits about using formative assessment strategies. The goal of this research is to understand which types of learning experiences might be the most impactful for creating positive shifts in preservice teachers' knowledge and abilities to use formative assessment strategies.

What will I be asked to do?

If you decide to take part in this part of the study, we will ask you to complete two activities as a follow-up to the survey you already took. The activities include (1) participating in an interview and (2) providing written permission for me to review documents relevant to your learning experiences (e.g., your K-W-L and related class assignments) that you choose to share with me.

Where will this take place and how much of my time will it take?

These follow-up activities are all online. The interview will take place using free video conference software and should take about 30-45 minutes of your time. Sharing evidence of your learning in this class, like a copy of your completed K-W-L, reflection papers, or anything else you think helps to demonstrate your understanding about formative assessment strategies, can be done via email and should take no longer than 30 minutes to locate and send as file attachments or scanned copies.


Will there be any risk or discomfort to me?

There are no foreseeable risks or discomfort for you.

Will I benefit by being in this research?

There will be no direct benefit to you for taking part in the study.

Who will see the information about me?

Your part in this study will be confidential. Only the researchers on this study will see the information about you. No reports or publications will use information that can identify you or any individual as being affiliated with this project.

Any reporting of your experiences will use a pseudonym. No other identifying information will be provided in the research report. You will also have an opportunity to review the transcripts of your interview to ensure accuracy. All information and data collected from you or about you will be stored for the duration of this study in password-encrypted local (not cloud) storage that requires two-step authentication and is available only to me, Jamie Lee Korns, as the researcher. If any tangible, hard copies of the same are generated (e.g., printed transcripts of interviews to assist in data analysis), they will be locked in a fire-resistant metal filing cabinet accessible only to me. Once this study is completed, all materials stored electronically will be placed on an external device such as a flash drive or external hard drive and placed in the locked, fire-resistant metal filing cabinet. Finally, all transcripts and recordings associated with this research will be destroyed three years after the conferral of my doctoral degree, a period during which an authority may request verification of my research.

In rare instances, authorized people may request to see research information about you and other people in this study. This is done only to be sure that the research is done properly. We would only permit people who are authorized by organizations such as the Northeastern University Institutional Review Board to see this information.

What will happen if I suffer any harm from this research?

No research-related injuries are possible from this research.

Can I stop my participation in this study?

Your participation in this research is completely voluntary. You do not have to participate if you do not want to, and you can refuse to answer any question. Even if you begin the study, you may quit at any time. If you do not participate or if you decide to quit, you will not lose any rights, benefits, or services that you would otherwise have as a student.

Who can I contact if I have questions or problems?

If you have any questions about this study, please feel free to contact Jamie Lee Korns, the person mainly responsible for the research, at [email protected] or (XXX) XXX-XXXX. You can also contact Dr. Kristal Clemons, the Principal Investigator, at [email protected] or (850) 629-9132.

Who can I contact about my rights as a participant?

If you have any questions about your rights in this research, you may contact Nan C. Regina, Director, Human Subject Research Protection, Mail Stop: 560-177, 360 Huntington Avenue, Northeastern University, Boston, MA 02115. Tel: 617-373-4588, Email: [email protected]. You may call anonymously if you wish.

Will I be paid for my participation?

You will not be paid for your participation.

Will it cost me anything to participate?

There is no cost to you to participate.

Is there anything else I need to know?

You must be at least 18 years old to participate in this research.

I agree to take part in this research.

____________________________ Date: __________
Signature of person agreeing to take part

____________________________
Printed name of person above

____________________________ Date: __________
Signature of person who explained the study to the participant above and obtained consent

____________________________
Printed name of person above


Appendix E: Learning Activities Survey Follow-Up Interview

Name:

Date:

This interview is part of research that included the survey you took. The research is about the experience of preservice teachers in this course about educational assessment. Only with your help can we learn more about how preservice teachers learn about educational assessment, and especially about the role and use of formative assessment strategies (FAS). The interview should only take about 30-45 minutes to complete, and your responses will be kept completely anonymous. Thank you for being part of this project. We greatly appreciate your cooperation.

The interview questions are designed to gather further information about the topics covered in the original survey, so some of them may sound familiar to you. Because your responses are so important, and I want to make sure to capture everything you say, I would like to audio tape our conversation today, so I will ask if I have your permission to record this interview. If you grant permission, I will turn on the recording feature for this interview, and then I will ask permission once more so my request and this permission are captured in the recording. Do I have your permission to record this interview?

[Recording will be turned on with participant’s permission]

Do I have your permission to record this interview?

1. Since you've been enrolled in this course about educational assessment, do you believe you have experienced a time when you realized that your values, beliefs, opinions or expectations about assessment techniques, the role of formative assessment, or the role of summative assessment have changed?

2. Briefly describe that experience.

3. Do you know what triggered it? If so, please explain.

4. Which of the following influenced this change? (Check all that apply)

A. Was it a person who influenced this change?

a. Yes

b. No

B. If "Yes," was it . . . (check all that apply)

a. Another student's support

b. Your classmates' support

c. Your advisor's support

d. A challenge from your instructor

e. Your instructor's support

f. Other: ______

C. Was it part of a class assignment that influenced the change?

a. Yes

b. No

D. If "Yes," what was it? (check all that apply)

a. Readings in a textbook

b. Chapter questions in a book

c. Supplemental reading or materials

d. Class/group projects

e. Verbally discussing your concerns

f. Writing about your concerns

g. Journal entries

h. Self-assessment in the class

i. Class activity/exercise

j. Deep, concentrated thought

k. Personal reflection

l. Autoethnography or personal learning experience paper/project

m. Non-traditional structure of this particular course

n. Field experience

o. Other: ______

E. Or, was it a significant change in your life that influenced the change?

a. Yes

b. No

F. If "Yes," what was it? (Please check all that apply)

a. Marriage

b. Birth/adoption of a child

c. Moving

d. Divorce/separation/break up

e. Death of a loved one

f. Illness of loved one

g. Change of job

h. Loss of job

i. Immigration to a new country

j. Other: ______

G. Perhaps it was something else that influenced the change. If so, please describe it:

5. Describe how any of the above educational experiences influenced the change:

6. What could have been done differently in the class to have helped this change? What specific activities?

7. Thinking back to when you first realized that your views or perspectives had changed:

A. When did you first realize this change had happened? Was it while it was happening, mid-change, or once it had entirely happened (retrospective)?

B. What made you aware that this change had happened?

C. What did your being in this class have to do with it?

D. What did you do about it?

E. How did/do you feel about the change?

8. We are nearly at the end of this interview. Do you feel there is anything we have left out of this interview related to your experience of change about your values, beliefs, opinions or expectations regarding assessment techniques, the role of formative assessment, or the role of summative assessment?

9. As we wrap up, do you have any questions for me?