EXAMINATION OF THE CHANGE IN CONTENT KNOWLEDGE, PERSONAL SCIENCE TEACHER EFFICACY, AND SCIENCE TEACHING OUTCOME EXPECTANCY DUE TO PARTICIPATION IN MODELING INSTRUCTION PROFESSIONAL DEVELOPMENT

Gloria Kreischer-Gajewicz

A Dissertation

Submitted to the Graduate College of Bowling Green State University in partial fulfillment of the requirements for the degree of

DOCTOR OF EDUCATION

December 2019

Committee:

Judith Jackson May, Advisor

Starr E. Keyes Graduate Faculty Representative

Tracy Huziak-Clark

Colleen Megowan-Romanowicz

Rachel Vannatta

© 2019

Gloria Kreischer-Gajewicz

All Rights Reserved

ABSTRACT

Judith Jackson May, Advisor

Highly effective teachers have a positive impact on their students and on student performance.

Therefore, in-service teachers must continue to grow and develop their craft (Marzano, 2003).

Research has shown that factors influencing teacher effectiveness include content knowledge, self-efficacy, and outcome expectancy (Bandura, 1977; Ohle, Boone, & Fischer, 2014;

Sargent, Ferrell, Smith, & Scroggins, 2018; Tschannen-Moran & Hoy, 2001).

Teachers who spend almost all of their class time teaching content and not working on improving student understanding have lower self-efficacy and outcome expectancy and struggle to improve their students’ academic success (Petty, 2009). According to research, professional development should focus on pedagogical content knowledge (PCK) to have a more significant impact on student achievement (Kleickmann, Richter, Kunter, Elsner, Besser, Krauss, &

Baumert, 2013; Shulman, 1986).

Modeling Instruction is a type of professional development focused on the improvement of science teaching pedagogy. There is significant research on the impact of Modeling

Instruction (MI) on students. However, little research on the effects of MI on teachers exists

(Brewe, 2008). The purpose of this research was to examine how Modeling Instruction professional development impacts teachers. Examining a t-test of dependent samples using a pretest-posttest design will help to determine if MI professional development for in-service science teachers is an appropriate means for improving teacher content knowledge, self-efficacy, and outcome expectancy. The sample for this study included 567 participants in physical science Modeling Instruction professional development workshops from 21 different states in the

United States. The researcher employed a quasi-experimental research design utilizing surveys disseminated through voluntary participation in the Modeling Instruction professional development from 2016 to 2018 to ascertain the level of content knowledge (CK), personal science teaching efficacy (PSTE), and science teaching outcome expectancy (STOE) for each of the teachers both pre and post participation.

Paired t-tests revealed that Modeling Instruction professional development has a positive impact on content knowledge, self-efficacy, and outcome expectancy. Multiple regression analysis revealed several predictors for the dependent variables, with gender as a common thread throughout. This research also provides implications for leadership and teaching.

This dissertation is dedicated to my family who has stood by my side, giving me help, hope and

motivation when the journey seemed never-ending. I want to thank my husband Chris, whose

unwavering support throughout tremendous personal struggles has taught me persistence, resiliency, and that laughter truly is the best medicine. To my son Jordan, whose gentle soul and

amazing compassion have taught me how to love unconditionally. To my son Daniel, whose

random acts of kindness and perpetual curiosity have made me a better teacher and mother. To

my best friend, Amanda, who walked into this journey with me and never left my side. Life knocked me down a few times; I experienced sadness and failure. But you were there to help me

get back up. To my parents, Randye and Michael, who showed me the power of education,

dedication, and perseverance. I saw firsthand how hard you worked to put your family first

while furthering your education.

I thank you all for inspiring me to pursue my doctorate.

ACKNOWLEDGMENTS

The completion of my doctorate would not have been possible without the financial support generously extended to me by the Dr. Neil Pohlman award. I want to thank Dr. Patrick

Pauken and Dr. Judith Jackson-May for nominating me and the Leadership Studies doctoral faculty at BGSU for selecting me to receive this award. Also, I would like to thank the Neil

Pohlman family for initiating this award.

I am humbled by the five incredible women who were part of my dissertation committee.

I value their presence in my life and am lucky to have such amazing women share their experiences and be with me along this dissertation journey. I hope that one day I can do the same for other women in my life.

I am grateful to the American Modeling Teachers Association senior advisor, Dr.

Colleen Megowan-Romanowicz, who inspired me to become a Modeling Instruction teacher leader. Her unparalleled knowledge about Modeling Instruction and classroom discourse is inspirational. Colleen granted me access to AMTA data files that were instrumental to this project. AMTA has lit the fire for thousands of science teachers and transformed classrooms all over the world.

I would like to thank my chair, Dr. Judith Jackson May, for her insightful comments and editor’s eye. She kept encouraging me to focus on the theoretical constructs that informed my research and pushing (sometimes dragging) me toward my goal of becoming a doctor. She has shown me, by her example, what a great leader looks like. Judith has a prodigious presence and

I will remember to always take the cannoli.

I am also indebted to Dr. Rachel Vannatta, whose expertise in quantitative methodology is invaluable. She has taught me a great deal about scientific research. Rachel’s patience, encouragement, and friendship helped me break down complex statistical analyses into manageable pieces for my research.

Dr. Tracy Huziak-Clark inspired me to drink the Kool-Aid and started me on my path of transforming my classroom. She guided me to embrace Modeling Instruction and provided me with a topic idea for this research. Tracy has aided many teachers to transform their classrooms, but without her vision of bringing modeling to BGSU, I would not be the teacher I am today.

Thank you for lighting my fire.

Finally, I would like to thank Dr. Starr Keyes, my graduate representative. She has a keen eye for details and her suggestions made my research more succinct. Starr has shed light on areas of my research that guided my editing process and made my experience more robust.


TABLE OF CONTENTS

Page

CHAPTER I. INTRODUCTION ...... 1

Background ...... 3

Purpose of the Study ...... 5

Research Questions ...... 9

Theoretical Foundation ...... 9

Overview of Methodology ...... 12

Significance of the Study ...... 13

Importance of Modeling Instruction in Professional Development ...... 15

Researcher Assumptions ...... 16

Definition of Key Terms ...... 19

Organization of the Study ...... 20

CHAPTER II. LITERATURE REVIEW ...... 22

Historical Background ...... 23

Conceptual Framework ...... 24

Review of the Literature ...... 27

Content Knowledge ...... 27

Development of content knowledge and its impact on student learning ...... 28

Pedagogical Content Knowledge ...... 31
Development of pedagogical content knowledge ...... 33
Impact of pedagogical content knowledge on student learning ...... 33
Self-Efficacy ...... 34

Development of self-efficacy ...... 35
Impact of self-efficacy on student learning ...... 36
Collective Efficacy ...... 36
Outcome Expectancy ...... 38
Development of outcome expectancy ...... 38
Impact of outcome expectancy ...... 39
Modeling Instruction ...... 39
Models ...... 40
Modeling ...... 42

Modeling Instruction and student learning ...... 44
The Modeling Cycle ...... 47
Science misconceptions ...... 48
Modeling Instruction impact ...... 50
The Force Concept Inventory and Assessment of Basic Chemistry Concepts ...... 51
Force Concept Inventory development ...... 52
Impact of the Force Concept Inventory ...... 53
Assessment of Basic Chemistry Concepts ...... 54
Science Teaching Efficacy Belief Instrument ...... 55
Development of the Science Teaching Efficacy Belief Instrument-A ...... 55

Effective Professional Development ...... 57
Components of effective professional development ...... 57
Framework for Modeling Instruction professional development ...... 59
Leadership ...... 60
Next Generation Science Standards and professional development ...... 62

Gender differences in the science classroom ...... 62
Summary ...... 64

CHAPTER III. METHODOLOGY ...... 66

Introduction ...... 66

Research Design ...... 66

Participants ...... 67
Instrumentation ...... 68
Science Teacher Attitudes Survey ...... 69

Concept Inventory ...... 70
Procedures ...... 71
Research Questions ...... 72
Data Analysis ...... 75
Research Question One ...... 76
Research Questions Two, Three, and Four ...... 76
Study Assumptions ...... 80

CHAPTER IV. RESULTS ...... 81

Descriptive Statistics ...... 81

Inferential Statistics ...... 87

Research Question One ...... 88
Research Question Two ...... 89
Research Question Three ...... 92
Research Question Four ...... 95
Chapter Summary ...... 95
CHAPTER V. DISCUSSION ...... 98
Summary of Conclusions ...... 98
Study Overview ...... 99

Discussion of The Findings ...... 103
Introduction ...... 103

Impact on self-efficacy ...... 103

Impact on outcome expectancy ...... 105
Impact on content knowledge ...... 109
The Degree to Which the Independent Variables Predict Content Knowledge ...... 112
Correlation between independent variables and content knowledge ...... 112
Predictive model for content knowledge ...... 113
The Degree to Which the Independent Variables Predict Self-Efficacy ...... 114
Correlation between independent variables and self-efficacy ...... 114
Predictive model for self-efficacy ...... 115
The Degree to Which the Independent Variables Predict Outcome Expectancy ...... 116
Recommendations ...... 116
Implications for Professional Development ...... 117
Implications for Leadership ...... 120
Implications for Modeling Instruction Practice ...... 122
Limitations and Recommendations for Future Research ...... 124
Limitations ...... 124
Future Research ...... 125
Final Thoughts ...... 126

REFERENCES ...... 128

APPENDIX A. INSTITUTIONAL REVIEW BOARD STATEMENT ...... 149

APPENDIX B. SCIENCE TEACHER ATTITUDES SURVEY ...... 150

APPENDIX C. ASSESSMENT OF BASIC CHEMISTRY CONCEPTS SAMPLE QUESTIONS ...... 153
APPENDIX D. FORCE CONCEPT INVENTORY SAMPLE QUESTIONS ...... 154
APPENDIX E. SCORING FOR THE SCIENCE TEACHER ATTITUDES SURVEY ...... 155

LIST OF FIGURES

Figure Page

1 The five goals of Modeling Instruction ...... 47

2 Number of teachers by years of experience ...... 83

3 Number of teachers by gender and teaching experience ...... 83

4 Number of teachers by content alignment and gender ...... 84

5 Number of teachers by content alignment and years of experience ...... 84

6 Number of teachers by content alignment, gender, and years of experience ...... 85

LIST OF TABLES

Table Page

1 Example Agenda for Modeling Instruction professional development ...... 74

2 Variables, Values, and Source ...... 75

3 Research Questions, Variables, and Data Analyses ...... 79

4 Sample Size and Percentage for Gender, Highest Degree, and Teaching Content (n=576) ...... 82

5 Descriptive Statistics of Pre and Post Subscales ...... 86

6 Descriptive Statistics of Predictor Variables ...... 87
7 Paired t-test Results for Dependent Variables ...... 89

8 Pearson Correlation Between Content Knowledge and Predictors ...... 91

9 Regression Predictors of Content Knowledge for Pre, Post, and Growth Scores ...... 92
10 Pearson Correlation Between Personal Science Teaching Efficacy and Each Predictor ...... 93
11 Regression Predictors of Personal Science Teaching Efficacy for Pre, Post, and Growth Scores ...... 94
12 Pearson Correlation Between Science Teaching Outcome Expectancy and Each Predictor ...... 95
13 Results Summary by Research Question ...... 96

CHAPTER I. INTRODUCTION

The United States is underperforming in science when compared to other countries around the world. Research by the National Commission on Mathematics and Science Teaching for the 21st Century claims new reports do not look favorably upon the performance of our students from middle school to high school in science (National Commission on Mathematics and

Science Teaching for the 21st Century, 2000). Sixty-six percent of 8th-grade and seventy-eight percent of 12th-grade students scored below the proficient level on the 2015 National

Assessment of Educational Progress in Science (McFarland, Hussar, Wang, X., Zhang, Wang,

K., Rathbun, Barmer, Cataldi, & Mann, 2018). As reported by the Organization for Economic

Co-operation and Development’s (OECD) 2015 report on The Program for International Student

Assessment (PISA), the U.S. ranks twenty-fourth out of seventy-one countries in science, compared with countries like Japan, Estonia, Finland, and Canada. Research on the reasons for the lag in

U.S. performance is varied. One relatively new area of inquiry that may contribute to low performance relates to the dearth of high-quality science teacher professional development geared towards preparing students for the 21st Century science classroom.

Teaching science through inquiry has been advocated for over a decade and is supported throughout the Next Generation Science Standards (Wilcox, Kruse, & Clough, 2015). Inquiry- based science education utilizes opportunities for students to investigate natural phenomena by asking questions, exploring possible solutions, developing explanations, and evaluating their understanding based on the data they have collected. “Scientific inquiry requires the use of evidence, logic, and imagination in developing explanations about the natural world” (Newman,

Abell, Hubbard, McDonald, Otaala & Martini, 2004, p. 258). Inquiry-based teaching requires teachers to have an understanding of the nature of science, discipline-specific content knowledge, and pedagogical content knowledge (Newman et al., 2004). In order to use inquiry techniques, teachers need to know how scientific laws, principles, and mathematical expressions will subsume the content they teach (Newman et al., 2004). Teachers also must possess a firm understanding of the scientific evidence that informs their content and the process of engaging students in scientific endeavors (Newman et al., 2004). Teachers tend to have limited experiences with teaching and learning through inquiry and must be engaged in professional development to develop these skills (Newman et al., 2004). In the 2011-2012 school year, 85% of K-12 educators reported participating in subject-specific professional development. However, most of these experiences were eight hours in length or less (Goldring, Gray, & Bitterman, 2013). Typical professional development does not facilitate the critical thinking necessary for science classrooms of the 21st century, which remains challenging for most teachers to achieve within their classrooms (Darling-

Hammond, Wei, Andree, Richardson & Orphanos, 2009). Inquiry teaching requires considerable content knowledge and pedagogical content knowledge. Teachers need to be able to understand what their students are hearing, interject with real-world examples, and manage the unpredictable nature of the students’ experiences (National Academies of Sciences, Engineering, and

Medicine, 2015).

Chapter One is organized into several sections. First, the background of the study provides the reader with information about why this study is necessary and appropriate. Next, the purpose of the study sets the stage for this research and introduces the use of Modeling Instruction as a science teaching practice, and research questions are presented to further the topics of study. The theoretical foundation examines the work of Shulman and Bandura in the context of science education. An overview of the methodology briefly explains how the research questions guided the quantitative study proposed. The significance of the study, the role of the researcher, and researcher assumptions examine the context of the research. The conclusion of the chapter lists key terms with their definitions, followed by an explanation of the organization of the rest of the dissertation. The purpose of this study is to establish the impact of Modeling Instruction on science teachers and to underscore the importance of this professional development for content knowledge, self-efficacy, and outcome expectancy.

Background

Before it's too late: A report to the nation from the National Commission on Mathematics and Science Teaching for the 21st Century (2000) claims that students in America need to improve their performance in mathematics and science in order to succeed in today's world economy. One of the report’s main messages suggests that “the most direct route to improving mathematics and science achievement for all students is better mathematics and science teaching” (National Commission on Mathematics and Science Teaching for the 21st Century,

2000, p. 18). The National Science Foundation (2001) promotes improvement in science teaching since “advances in science and engineering … determine economic growth, quality of life, and health and security of our planet” (p. 7).

Unfortunately, we face a growing shortage of qualified science teachers, as the number of replacement teachers graduating with content-specific education cannot meet the number of teaching vacancies in the U.S. (Hestenes, Megowan-Romanowicz, Osborn Popp, Jackson, &

Culbertson, 2011). For example, cross-over teachers, or teachers that have certification in a science specialty that is different from the one they teach, have been necessary to address a long- standing shortage in classrooms (Hestenes et al., 2011; Hobbs, 2012; Ingersoll, 2002).

One crucial step for science education reform is to focus on improving in-service teachers’ content knowledge by cultivating expertise (Hestenes et al., 2011). A lack of content knowledge and pedagogical content knowledge is a crucial issue for content-specific teachers (Darling-

Hammond, 2006).

Improvement in science education also requires educators to place pedagogy at the core of their focus while implementing a curriculum that incorporates contemporary science (Hestenes, 2013). However, schools continue to be ill-equipped to support laboratory experiences, and educators lack the necessary expertise in science and mathematics (Bradshaw, 2012; Hestenes et al., 2011; Jackson, 2010). Therefore, an essential outcome for science teacher education is providing pre-service teachers with practical situations to practice the application of content knowledge with pedagogical knowledge (Krepf, Plöger,

Scholl, & Seifert, 2018). Moreover, essential outcomes for in-service teachers should be to provide opportunities for professional development that deepen content knowledge, improve pedagogical skills, keep teachers up to date with current scientific developments, provide opportunities to make professional contributions, and improve feedback to students while reflecting on teaching and learning (National Commission on Mathematics and Science Teaching for the 21st

Century, 2000).

Content knowledge refers to the laws, principles, and mathematical expressions and the necessary prerequisite science knowledge that students need or learn through their science courses (Loewenberg Ball, Thames, & Phelps, 2008). Content knowledge is also the general conceptual understanding of a subject the teacher obtained by completing their required coursework (Shulman, 1986). Pedagogical knowledge is unique to teachers and refers to the specialized knowledge they possess in creating and facilitating productive learning environments for their students.

Problems of insufficient content knowledge and teachers’ participation in limited and sporadic professional development have exacerbated the mismatch between pre-service science teacher education and the subjects they teach. Many new science teachers utilize didactic instruction with coverage of curriculum topics taking the lead in driving curriculum choices.

Knowledge learned in breadth requires students to depend on others who have a deeper understanding of the content, while knowledge learned in depth allows students to become the expert (Egan, 2010). Didactic instruction is a traditional approach to education that is teacher-centered and values learners who listen attentively while passively accepting the teacher as the expert. According to Banilower and colleagues (2013), 43% of high school in-service science teachers received 15 hours or less of professional development focused on science content within the last three years. The education system in the U.S. lacks coherent, well-articulated professional development opportunities for teachers to advance their expertise further while in the classroom (National Academies of Sciences, Engineering, and Medicine, 2015). A plethora of evidence also exists indicating that content-heavy professional development should be replaced by engagement in activities that advance pedagogical content knowledge to improve science teacher efficacy beliefs (Fox, 2014; Halim, Abdullah, & Meerah, 2014; Martin, 2018; Menon &

Sadler, 2016; Shulman, 2013; Swackhamer, Koellner, Basile, & Kimbrough, 2009).

Purpose of the Study

According to Marzano (2003), highly effective teachers have a more positive impact on student learning. Therefore, it is essential to grow their ranks. There is a strong need to deliver professional development for science teachers to improve their effectiveness (Marzano,

2003). Professional development that focuses on content and teacher engagement in learner-centered pedagogies should, in turn, improve student performance in science (Blank, De las

Alas, & Smith, 2008). Modeling Instruction (MI), a research-based pedagogy, may provide an ideal professional development platform through its support of students’ engagement in the processes and discussion of science (Jackson, Dukerich, & Hestenes, 2008). Modeling

Instruction guides students with each facet of scientific knowledge as they work as scientists and develop basic models for topics such as energy, matter, motion, and forces. Those facets include our understanding of natural phenomena through asking questions, developing models, using computational thinking, obtaining and evaluating information, and constructing explanations. This instruction integrates structured inquiry techniques to develop critical thinking and communication skills.

There is significant research on the impact of Modeling Instruction (MI) on students; however, little quantitative research exists on the effects of the professional development on science teachers (Brewe, 2008). Many references describe MI professional development opportunities for teachers; MI was designated as an exemplary science education program by the U.S.

Department of Education in 2001 (USDOE, 2001). For example, The Ohio Revised Science

Education Standards and Model Curriculum, a document that serves as a basis for what all students should know and be able to do, mentions MI in the Physics and Physical Science education standards as an opportunity for professional development. The document states,

“Modeling workshops are available nationally that help teachers develop a framework for using guided inquiry in their instruction” (Ohio Department of Education, 2011, pp. 282, 285 & 333).

Even though Ohio Revised Science Education Standards, Model Curriculum, and The Next

Generation Science Standards mention MI several times, this methodology is uncommon when training pre-service teachers in colleges and universities in Ohio.

The Next Generation Science Standards (NGSS) set expectations for all science students and were developed to improve science education nationally. The National

Research Council (2012) discusses the use of models and modeling when implementing the Next

Generation Science Standards:

Modeling can begin in the earliest grades, with students’ models progressing from concrete pictures and physical scale models (e.g., a toy car) to more abstract representations of relevant relationships in later grades, such as a diagram representing forces on a particular object in a system. Students should be asked to use diagrams, maps, and other abstract models as tools that enable them to elaborate on their ideas or findings and present them to others. Young students should be encouraged to devise pictorial and straightforward graphical representations of the findings of their investigations and to use these models in developing their explanations of what occurred. (p. 58)

More sophisticated types of models should increasingly be used across the grades, both in instruction and curriculum materials, as students advance through their science education. The quality of a student-developed model will be highly dependent on prior knowledge and skill and also on the student’s understanding of the system modeled, so students should be expected to refine their models as their understanding develops. Curricula will need to stress the role of models explicitly and provide students with modeling tools … so that students come to value this core practice and develop a level of facility in constructing and applying appropriate models. (p. 59)

Models provide a framework for students while they develop their science conceptions and are necessary tools during scientific inquiry (Halloun, 2004). Science conceptual models make comparisons to the familiar to help explain an unfamiliar idea, to better visualize the concept under investigation, and to develop a possible solution to a problem (NRC,

2012). For example, the Bohr model of the atom represents electrons orbiting a nucleus similar to the way planets orbit around the sun.

To guide students in this process, teachers need to have an in-depth understanding of science content and how to facilitate the development of the core content models for their curriculum. The National Research Council (2012) suggests “[c]urricula will need to stress the role of models explicitly and provide students with modeling tools … so that students come to value this core practice and develop a level of facility in constructing and applying appropriate models” (p. 59).

The goal of professional development should be the development of the pedagogical skills necessary for science teachers, as indicated in NGSS and the Framework for K-12 Science

Education (NRC, 2012). This study explored MI as a means to improve the pedagogical content knowledge of physical science teachers by measuring content knowledge (CK), self-efficacy

(SE), and outcome expectancy (OE). Physical science content comprises exploring phenomena of the physical world as it relates to motion, matter, and energy. A Concept Inventory was utilized to collect pretest and posttest scores measuring CK. Teachers completed survey questions measuring SE and OE to assess how they perceived their comfort in teaching science before and after participation in an MI professional development workshop.

The outcome of the data analysis for CK, SE, and OE was used to show the impact of Modeling

Instruction as an effective means of professional development and ultimately as a means to improve student performance in physical science on standardized tests.

Research Questions

This research focused on four primary questions:

1. What is the impact of participation in the Modeling Instruction professional development on science teachers’ content knowledge, self-efficacy, and outcome expectancy?

2. What is the degree to which years of teaching experience, gender, highest level of educational attainment, and teaching in content predict: a) content knowledge before Modeling

Instruction; b) content knowledge after Modeling Instruction; and c) growth in content knowledge?

3. What is the degree to which years of teaching experience, gender, highest level of educational attainment, and teaching in content predict: a) self-efficacy before Modeling

Instruction; b) self-efficacy after Modeling Instruction; and c) growth in self-efficacy?

4. What is the degree to which years of teaching experience, gender, highest level of educational attainment, and teaching in content predict: a) outcome expectancy before Modeling

Instruction; b) outcome expectancy after Modeling Instruction; and c) growth in outcome expectancy?

Theoretical Foundation

Scientists accumulate experiential, content, and procedural knowledge as they develop working models of natural phenomena (Brewe, 2008). In science, a model is a representational tool that is used for reference when describing or explaining phenomena (Hinrichs, 2006).

Models become central to the work of scientists, both in their research and when communicating their results. Content knowledge includes facts, theories, and experiments within the context of the science content (Shulman, 1986). Extensive research has shown that content knowledge is necessary for effective science teaching. Without a solid understanding of science content, teachers cannot effectively teach their subject material (Zhang, Parker, Koehler, &

Eberhardt, 2015).

Teachers who possess intellectual and pedagogical proficiency are essential in the creation of experiences that actively engage students’ ideas, crosscutting concepts, and discussion of phenomena (National Academies of Sciences, Engineering, and Medicine, 2015).

Crosscutting concepts help to bridge academic boundaries to unite core ideas throughout the different disciplines of science and to deepen student understanding of those core ideas. When teachers are not able to assess and understand their missteps in content material, they are not able to assist their students in correcting misconceptions of science content (National Research

Council, 1997).

Misconceptions are preconceived notions or conceptual misunderstandings that do not match currently accepted scientific knowledge and concepts. Holding on to misconceptions often leads to difficulty in giving up faulty ideas, particularly if the misconception has existed for an extended period (NRC, 1997). Continuing to build content knowledge with misconceptions as a foundation can have a severe impact on learning. Science teachers need broad but in-depth knowledge of their content to make connections among concepts, representations such as graphs, and students’ understanding. For example, Desimone (2009) found that teachers who do not have a strong foundation of content knowledge often present their subject material as a series of unrelated facts that do not interact, when in fact, all science subjects share universal concepts and models.

Shulman (1986) first described pedagogical content knowledge (PCK) as the blending of content knowledge with an understanding of how to deliver the content to students within the classroom.

Shulman and Shulman (2004) state that for students to learn, teachers must first understand how students learn the material to improve student comprehension. Teachers must be aware of the preconceptions and misconceptions that are common for students with specific content. Awareness requires teachers to understand how to organize, adapt, and represent specific topics to a diverse group of learners. Weak teachers spend almost all of their class time teaching content and not working on improving student understanding by utilizing pedagogical content knowledge (Petty, 2009). Research shows that professional development should focus on PCK to have a more significant impact on student achievement (Kleickmann, Richter, Kunter,

Elsner, Besser, Krauss, & Baumert, 2013; Shulman, 1986).

Self-efficacy and outcome expectancy arose from constructs within Bandura’s Social-

Cognitive Theory. Bandura (1977) contends that an individual’s beliefs regarding these two classes of expectations influence human behavior. Self-efficacy is a judgment about one’s capabilities to deal with different situations and one’s effectiveness in teaching every student

(Bandura, 1977; Tschannen-Moran & Hoy, 2001). Self-efficacy is dependent on performance achievements, vicarious experiences, verbal persuasion, and emotional arousal. Bandura

(1977) defines outcome expectancy as a person’s evaluation of whether or not certain behaviors will lead to particular outcomes. Within a classroom, outcome expectancies are judgments made by a teacher about how they will be able to perform in a given situation and the likelihood that specific outcomes will result.

Teachers with high self-efficacy are highly confident in their ability to teach every student. They are also more likely to try new teaching strategies and encourage a student-centered classroom atmosphere while working patiently with academically challenging students.

Teachers with low self-efficacy, on the other hand, are more likely to become discontent with struggling students and agonize over their ability to teach science content (Bandura, 1977, 1997).

High outcome-expectancy scores are exhibited by teachers who feel confident in their ability to positively impact their students within their classroom, regardless of outside factors such as socioeconomic status, education level of parents, or school setting (Riggs & Enochs, 1989).

Teachers with low outcome expectancies identify a student’s external circumstances as obstacles to a student’s achievement (Guskey, 1988).

Overview of Methodology

The study I proposed examined the change in science content knowledge, self-efficacy, and outcome expectancy due to participation in Modeling Instruction professional development.

The first research question utilized a series of paired t-tests for each dependent variable: content knowledge, self-efficacy, and outcome expectancy. Paired t-tests were employed to determine if mean differences existed between pre and posttest data for content knowledge, self-efficacy, and outcome expectancy as a result of participation in Modeling Instruction professional development. I also examined the degree to which years of teaching experience, gender, the highest level of educational attainment, and teaching in content predicted content knowledge, self-efficacy, and outcome expectancy. I utilized multiple regression for the pre-treatment data and post-treatment data. Multiple regression explored the relationship between the dependent and independent variables.

I examined the content knowledge of teachers before MI professional development and established a baseline for growth using a Concept Inventory, either the Force Concept Inventory for physics or the Assessment of Basic Chemistry Concepts for chemistry. I also explored each teacher's years of experience, self-efficacy, and outcome expectancy using a survey method. The

Science Teaching Efficacy Belief Instrument, developed by Enochs and Riggs (1990), is the basis for the development of the survey that I used in this research, the Science Teacher Attitudes

Survey. I used the pretest scores from the MI professional development survey to help predict the posttest content knowledge scores on the Concept Inventory.
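To make the analysis plan concrete, the short Python sketch below illustrates how the paired t-tests and one of the multiple regression models could be specified with pandas, SciPy, and statsmodels. This is a minimal illustration under assumed conditions rather than the study's actual analysis code: the data file and the column names (ck, pste, and stoe for the three dependent variables; years_experience, gender, highest_degree, and in_content for the predictors) are hypothetical stand-ins for the study's variables.

# Minimal sketch of the analyses described above; the data file and all column names are hypothetical.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("mi_workshop_scores.csv")  # hypothetical file of matched pre/post records

# Research Question 1: paired t-tests comparing pre- and post-workshop scores
for pre, post in [("ck_pre", "ck_post"),        # content knowledge
                  ("pste_pre", "pste_post"),    # personal science teaching efficacy
                  ("stoe_pre", "stoe_post")]:   # science teaching outcome expectancy
    t, p = stats.ttest_rel(df[post], df[pre])
    print(f"{pre} vs. {post}: t = {t:.2f}, p = {p:.4f}")

# Research Questions 2-4: multiple regression of a growth score on the four predictors
df["ck_growth"] = df["ck_post"] - df["ck_pre"]
model = smf.ols(
    "ck_growth ~ years_experience + C(gender) + C(highest_degree) + C(in_content)",
    data=df,
).fit()
print(model.summary())

The same regression specification would be repeated with the pre, post, and growth scores of each dependent variable as the outcome, mirroring the pre-, post-, and growth-score models framed in Research Questions Two through Four.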

Significance of the Study

Teachers utilizing traditional teaching methods may need help in transforming their practice to a more inquiry-based pedagogy like Modeling Instruction. Often MI begins with activity before content and is the antithesis of traditional teaching methods (Jackson et al., 2008).

Engaging students in laboratory experiences before delving into vocabulary and conceptual explanations allows students to explore and form their own ideas and predictions before teachers prejudice them with their own thoughts and ideas about science content (Hestenes, 1997;

Wells, Hestenes, & Swackhamer, 1995).

A powerful tool to encourage colleagues to work together is peer coaching. Peer coaching focuses on the teacher as a learner and allows teachers to interact with one another to brainstorm, receive feedback, and alleviate the frustration that results from working in isolation when making changes (DeChenne, Nugent, Kunz, Luo, Berry, Craven, & Riggs, 2012).

Teachers are afforded the time to take risks, to try out new teaching strategies and approaches, and to discuss results with their colleagues. MI professional development utilizes peer coaching to help teachers to overcome the frustrations and challenges when utilizing Modeling Instruction pedagogy within the classroom. Explicitly focusing on developing and refining lessons helps to improve self-efficacy by celebrating teachers’ achieved goals in implementing inquiry and developing models in their classrooms (DeChenne et al., 2012). Through the process of

Science teachers are on the precipice of radical education reform. President Barack

Obama declared in his 2011 State of the Union address, “This is our generation’s Sputnik moment” when discussing the importance of American innovation. “We know what it takes to compete for the jobs and industries of our time. We need to out-innovate, out-educate, and out-build the rest of the world.” Science innovation and discoveries are dependent upon scientifically literate citizens (NRC, 2012). Future job skills and environmental issues such as global climate change require our populace to interpret information from a variety of sources and draw conclusions based on evidence. Science, Technology, Engineering, and Math (STEM) education has become a code word for this comprehensive education reform. STEM education has sought to incorporate experimentation, data collection, group discussion, and projects into classrooms to allow students to develop the essential skills necessary to become 21st-century citizens (NRC, 2012). As such, they can make informed decisions about energy and resource consumption, environmental issues, national security, quality of life, and personal health.

Students should be engaging in scientific inquiry rather than remaining passive learners, as they have been in traditional classrooms. Students need to acquire and apply the necessary skills and knowledge to be global citizens in situations involving science and technology

(National Commission on Mathematics and Science Teaching for the 21st Century, 2000).

Competencies such as problem-solving, critical thinking, collaboration, communication, innovation, and creativity are necessary to succeed in citizenship, career, and life. Teachers play a crucial role in creating environments that enhance scientific literacy. Science education is the common denominator of political, environmental, economic, and social changes in modern times.

Hestenes (2013) explains that modeling is a means to frame science and mathematics, and “Modeling Instruction refers to making and using conceptual models of real systems and processes (both natural and artificial) as central to learning and doing science and engineering”

(p. 16). According to Hestenes (2013), successful implementation of modeling curriculum requires collaboration throughout the school year to resolve problems and discuss alternative solutions. Ideally, universities would partner with local schools to continue to support teachers with the implementation of modeling pedagogy within their classrooms. Modeling Instruction is an inquiry-based teaching style that was developed over three decades ago to integrate scientific research with classroom experience. Through Modeling Instruction professional development, teachers can improve their students’ abilities to make sense of the physical world, understand scientific claims, articulate and defend their arguments, and evaluate evidence (Hestenes, 2013, p. 17).

Importance of Modeling Instruction in Professional Development

Scientific endeavors begin through inquiry. Moreover, science classrooms should embody the principles of the scientific process. However, the challenges accompanying standardized testing endanger teachers’ willingness to embrace authentic science (Buxton,

2006). Our current system of testing students’ knowledge of science typically reflects what they

(Dukerich, 2015). Teachers can draw upon this network to help them modify their curricula and seek advice from their more experienced peers.

Professional development is necessary to prepare teachers for the 21st century to ensure that all students possess sufficient knowledge of and skills in science. The results of this study could establish Modeling Instruction professional development as an avenue for improving teachers’ pedagogical content knowledge, self-efficacy, and outcome expectancy. High-quality teaching makes a significant difference in student learning, and maintaining quality science teachers requires meaningful, ongoing professional development. Schools should devote more time and resources to productive and useful professional development, such as Modeling Instruction for all K–12 teachers of science, to support learning throughout their teaching careers. Attention to such professional development could have a significant impact on student achievement in science and boost our nation’s standing on international assessments (Darling-Hammond et al.,

2009).

Researcher Assumptions

In Milner’s (2007) four-part framework, it is essential to examine the researcher positionality. During this process, the researcher pays attention to any assumptions they are making about the subjects within the research. They also need to examine how they relate to the 17 topic while being reflexive about their own identity. Feelings the researcher has connected to their research should also be identified. There are three types of dangers that a researcher may encounter during an examination of their positionality. Those dangers include seen, unseen, and unforeseen.

Seen dangers within the context of research are the basis on which the researcher makes decisions. These dangers include those that are on the surface of the study. They include the variables selected to study, the collection of data, and the selection of participants for the study.

Modeling Instruction professional development workshops that were used to collect data for this research had seen dangers, or confounding variables, such as years of teaching experience, the gender of the participants within the study, and the type of school district that the participants serve.

Unseen dangers include those that are implicit or hidden. These dangers may surface if the researcher does not appropriately evaluate their research within the context of science education professional development. Hidden dangers within this study included communication and culture within the framework of the workshop. These dangers are often embedded in communication. Communication is the way we code, analyze, and process our experiences. Culture, in turn, integrates our communication (Gay, 2010). Therefore, the researcher needs to consider how they communicate within the context of their research. Within the context of MI professional development, unseen dangers could include how leaders communicated subject matter to the participants. Communication with the participants needs to be culturally neutral to try to reduce the impact of hidden dangers.

Unanticipated or unpredicted dangers may also prove to be problematic within our research practice. These dangers are difficult to predict; therefore, careful reflection is necessary

English as their primary language. Modeling Instruction professional development was in

English. Unforeseen dangers may impact any participants who do not use English as their primary language. Also, local colloquialisms may have affected how workshop participants translated the information given during the workshop.

In researching the relationship to the system, one must examine the barriers that exist.

Those barriers include structural, cultural, and semantic obstacles (Milner, 2007). Structural barriers consist of the policies and rules that are in place associated with the school. Those barriers may inhibit cultural diversity and prevent teachers from participating in a Modeling

Workshop. Professional development practices that are encouraged by the school, or the funding available to participate in them, could be another source of structural barriers. Many science teachers do not have rich experiences in their content, nor with redesigning their curriculum

(National Academies of Sciences, Engineering, and Medicine, 2015). Cultural barriers that may have existed in the context of this research may include the attitudes of the participants.

Participants who came from a background that embraces collaborative efforts may have felt comfortable in the environment of the workshop. Those cultures that embrace individuality may have struggled more in the collaborative environment. The behaviors that teachers bring are another type of cultural barrier. Teachers who were more skeptical and did not immerse themselves in the process of change nor form relationships with other teachers within the workshop had a more challenging time gaining knowledge and developing pedagogical tools within the MI professional development.

Definition of Key Terms

This research includes several essential terms that need defining for further understanding.

Content knowledge: The facts, representations, and theories common to a particular field of study (Shulman, 1986).

Inquiry: Students are engaged in open-ended, student-centered, hands-on activities within a science classroom (Colburn, 2000, p. 42).

Misconception: Alternative conceptions or preconceptions that differ from currently accepted scientific knowledge (Burgoon, Heddle, & Duran, 2010).

Model: A knowledge structure; a conceptual representation of a real thing or structure in a physical system and/or its properties (Hestenes, 1997, p. 8).

Modeling: An activity that results in the construction, validation, and application of models (Megowan, 2007).

Modeling Instruction: Research-based, student-centered teaching methodology that fosters the development and use of models in a culture of scientific collaboration and application of structured inquiry techniques (Hestenes, 2013).

Modeling Workshop: A peer-led professional development experience where teachers learn to utilize Modeling Instruction pedagogy through practice and interaction with other teachers within a community (Hestenes, Megowan-Romanowicz, Osborn Popp, Jackson, &

Culbertson, 2011).

Outcome-expectancy: A teacher's expectation that a given course of behavior will produce specific outcomes (Bandura, 1997).

Pedagogical content knowledge: The ways teachers represent and formulate science to make it comprehensible to students, including the conceptions and misconceptions that students bring with them to the learning (Shulman, 2013).

Professional development: Development of the knowledge and skills of the classroom teacher (Birman, Desimone, Porter, & Garet, 2000).

Self-efficacy: The perceived capability to perform a behavior (Bandura, 1997).

STEM education: An acronym that refers to education in the fields of science, technology, engineering, and math; teaching, learning, and curriculum based on the idea of educating students in the fields of science, technology, engineering, and math in an interdisciplinary approach (Ramaley, 2002).

Organization of the Study

This study is organized into five chapters. This chapter provides critical concepts and terminology commonly referenced in science education and professional development. A synopsis includes the state of science education, as well as a description of Modeling Instruction.

The chapter also provides an examination of the problem to be studied in this research and its significance. A theoretical foundation for the study provides a context for the research. Chapter

Two presents a review of the literature related to Modeling Instruction, self-efficacy, and outcome expectancy. The research assists the reader in gaining a deeper understanding of scholarly investigations in these areas. This helps set the stage, since each of these is a component of the research questions in this study. Chapter Two also examines relevant research about professional development in science and what we know about sound professional development practices.

Chapter Three reviews the methodology utilized in the collection and analysis of the data, explicitly detailing the process of using multiple regression analysis. This chapter includes information about the development of the survey tool for the study. Chapter Four reports the data for each research question to present the findings for the study. Chapter Five provides recommendations based on data and literature for future science teacher professional development concerning Modeling Instruction. A summary of the data with conclusions ties the study together. Future research and study limitations address how to develop future science teacher professional development for in-service teachers as well as recommendations for pre-service science teacher development.

CHAPTER II. LITERATURE REVIEW

Presently, there is a dearth of research on the impact of quality professional development, such as Modeling Instruction, on teachers and their content knowledge (Ross & Bruce, 2007), self-efficacy, and outcome expectancy. Improving professional development for teachers is imperative to transform schools and improve student academic achievement (Darling-Hammond et al., 2009). Researchers found that while most teachers report participating in professional development within the last 12 months, many did not find the content useful (Darling-Hammond et al., 2009). By examining professional development opportunities such as MI, education leaders and policymakers can begin to evaluate the impact of teachers’ learning opportunities and determine ways they can be further supported.

Next Generation Science Standards and the Ohio Model Curriculum acknowledge that model development is an essential strategy for students to be successful in the development of their science content knowledge (NGSS Lead States, 2013; Ohio Department of Education,

2011). Modeling Instruction, augmented by professional development research, integrates best practices in professional development to further the progress of science teachers as professionals in their field.

This review discusses the relevant literature that impacts significant components of this study. Background information for each variable sets the stage for what required further examination. Then, a conceptual framework for this study is expressed utilizing Shulman’s work with pedagogical content knowledge (PCK) and Bandura’s Social Cognitive Theory. A description of how this theoretical framework applied to the problem informed the research.

Lastly, current literature relevant to this research is examined to establish what we know about teacher content knowledge, PCK, self-efficacy, outcome expectancy, and their importance in science education.

Critical features that embody essential elements of professional development help explain the strengths of Modeling Instruction professional development. This review also includes how

Concept Inventories measure content knowledge growth. Current research studies established the essential features of sound professional development practices. The strengths and weaknesses of previous research revealed what this research builds upon and improves.

Historical Background

Traditionally, teachers teach how they have been taught and tend to focus on the memorization of narrated content and use repetition to teach their students. Often, we view students as receptacles for the knowledge that a teacher possesses. Inquiry and problem-solving techniques are becoming a more acceptable method of instruction and must replace this banking concept of education in the science classroom to break the traditional vertical pattern of education. Teachers should utilize dialogue as a means to reveal students’ cognitive processes and to stimulate creativity.

Many studies have proposed that self-efficacy has a puissant impact on student learning

(Bray-Clark & Bates, 2003; Bruce, Esmonde, Ross, Dookie, & Beatty, 2010; Fox, 2014;

Goddard, Hoy & Hoy, 2000; Lekhu, 2013; Tschannen-Moran & Hoy, 2001), while other studies have revealed the influence of outcome expectancy (Angle & Moseley, 2009; Trusz, 2018;

Sargent, Ferrell, Smith, & Scroggins, 2018; Williams, 2010) on student learning. However, self- efficacy and outcome expectancy are not the only factors influencing teachers’ goal setting and persistence; teachers’ content knowledge is also a factor (Loewenberg Ball et al., 2008; Ohle,

Boone, & Fischer, 2015; Rice & Roychoudhury, 2003; Sadler, Sonnert, Coyle, Cook-Smith &

Miller, 2013; Swackhamer et al., 2009). Swars and Dooley (2010) have also suggested that teachers who have deficiencies in science content knowledge may have lowered personal self-efficacy, highlighting the influence of both content and pedagogical components of teacher knowledge.

Teachers with low self-efficacy and outcome expectancy often have low content knowledge and weak PCK (Swackhamer et al., 2009). These struggling teachers are of significant concern to schools and the teaching community since they have a critical role in the development of content knowledge of their students (Loewenberg Ball et al., 2008). Modeling

Instruction professional development can provide an enriching environment for teachers to develop their pedagogical content knowledge, learn how to recognize and correct common student misconceptions, and practice effective teaching strategies, all within a learning community that encourages teachers to thrive and develop (Cabot, 2008). Teaching content knowledge is not an explicit goal of MI professional development; however, some teachers learn science content and experience gains in content knowledge. More importantly, research indicates that the best way for students to improve their achievement in science is to learn from teachers with strong content knowledge (Sadler et al., 2013). There is a need, therefore, to examine the impact of MI on teacher knowledge to determine future funding structures of professional development and best practices for offering quality professional development.

Conceptual Framework

Bandura’s social cognitive theory explains the relationship that exists between perceived self-efficacy and behavioral change. According to Bandura (1977), self-efficacy has in two expectations, personal self-efficacy and outcome expectancy. Bandura (1977) identifies the effects of self-efficacy on behavior as a probable determinant of our actions in response to stressful situations. A person’s perceived self-efficacy predicts the choices they make and their 25 accomplishments (Bandura, 1997). While self-belief may not guarantee achievement, Bandura implies its importance when he states, ‘‘self-disbelief assuredly spawns failure’’ (1977, p. 77).

The perception of self-efficacy accounts for how steadfast someone is and how much effort they expend in the face of challenging situations and rests on more than a teacher’s ability to transmit subject knowledge. Studies have shown that students whose teachers have lower self-efficacy than their peers have lowered performance expectations (Bandura, 1993). Teachers tend to work within their perceived capabilities and altogether avoid any situations they feel go beyond their capabilities. Teachers with high self-efficacy tend to adopt student-centered strategies and inquiry-based techniques (Czerniak, 1990).

Lee Shulman has studied professional practice throughout his career. Two themes emerged from his research: (a) professional judgment in times of ambiguity; and (b) professional expertise. His studies generated comparative frameworks across professional education including signature pedagogies for a specific profession and the cognitive preparation to think like a professional, more specifically, a teacher. Shulman drew from the work of John Dewey to inform his construct about what teachers should know and what they should be able to accomplish. Dewey (1938) believed inquiry is at the heart of knowledge activation and is indispensable in reflective teaching, and requires a specialized body of knowledge paired with the knowledge necessary for teaching. These concepts are still at the forefront of education and led Shulman (1986) to describe pedagogical content knowledge, moving beyond subject matter knowledge to incorporate knowledge for teaching a particular subject.

Teaching is a highly complex cognitive activity that requires a teacher to apply knowledge from multiple domains. Teachers with differentiated and integrated knowledge have a greater ability to perfect their craft than those whose knowledge is limited and fragmented.

Shulman has defined what professional knowledge teachers should embrace. Shulman (1986) described the importance of a teacher’s knowledge of their students’ conceptions and preconceptions. When those preconceptions are misconceptions, as they often are, teachers need to know how to guide their students in reconstructing knowledge and replacing those misconceptions.

Shulman (1987) identified seven components of teacher knowledge, including subject matter content knowledge, general pedagogical knowledge, PCK, knowledge of educational aims, curricular knowledge, knowledge of learners, and knowledge of the educational context.

Subject matter content knowledge is the knowledge base that teachers need and is at the heart of a teacher’s practice. Content knowledge is specific to the subject that a teacher teaches, and includes the overarching paradigms, language, and critical structures for that subject. General pedagogical knowledge includes principles and strategies for classroom management that cross the curriculum. Pedagogical content knowledge is the knowledge of integrating subject-specific content knowledge and the pedagogical knowledge for teaching that particular subject.

Pedagogical content knowledge includes the required cognitive knowledge for creating effective teaching and learning environments. Knowledge of educational aims includes the moral direction and values of education. Curricular knowledge includes knowledge of the curriculum materials used to teach the subject and the program of study. Teachers also need to possess knowledge of their learners; in other words, they need to know about their pupils’ developmental stages and background information about the students within their classroom. Knowledge of context allows the teacher to assess other factors that influence their teaching, such as the type and size of the school, school climate, and the amount and quality of teacher support (Shulman, 1987). Of these seven components, Shulman’s work has focused on PCK, an academic construct that has been extensively studied to better understand what makes learning easy or difficult.

Review of the Literature

Content Knowledge

Content knowledge refers to the magnitude and arrangement of knowledge about a particular discipline in the mind of the teacher. There are several ways to represent content knowledge, including Bloom's cognitive taxonomy, which ranges from simple content recall to understanding, applying, and analyzing content. Content knowledge goes beyond the mere recall of facts and figures and delves into an understanding of the importance of that content within the context of learning.

The definition of content knowledge is the understanding of discipline-specific subject matter. A mixed-method study by Loewenberg Ball, Thames, and Phelps (2008) examined the qualities of content knowledge for mathematics teachers. The purpose of their research was to examine mathematics teaching to suggest the nature of professionally oriented subject matter knowledge in mathematics. In doing so, they identified the mathematical knowledge for teaching in conjunction with the development of measures for mathematical knowledge. Their research indicates two subdomains within what Shulman refers to as PCK. Those two subdomains include knowledge of content and students and knowledge of content and teaching.

Loewenberg Ball et al. (2008) also revealed a vital subdomain of the knowledge of content that is pure content knowledge and is unique to the work of teaching. The authors explain that pure content knowledge is highly specialized, consisting of the specific knowledge necessary, only in a classroom setting, for the explicit purpose of teaching, and does not consist of knowledge of students or teaching pedagogy. Knowledge of content and teaching is the specialized knowledge needed to engage students in a classroom setting and combines knowing about teaching with knowing about science. This knowledge is necessary to build a bridge with pedagogy and to establish an understanding of students within the context of a particular subject (Loewenberg Ball et al., 2008).

Development of content knowledge and its impact on student learning. Teachers gain their specialized content knowledge through formal learning opportunities and informal situations not intentionally organized. Subject-specific content courses during undergraduate preparation generally lead to teaching qualifications in a particular content area. Pre-service teacher programs, professional development, and teaching experience all contribute to a teacher’s knowledge of their content and may offer learning opportunities for acquiring content knowledge (Kleickmann et al., 2013). Routine teaching, however, does not necessarily improve knowledge competence. Ohio, for example, has reexamined what science teachers must know to be highly qualified as part of the federal Excellent Educators for All Initiative approved in 2015

(Ohio Department of Education, 2015).

No Child Left Behind unleashed a wave of testing student knowledge; however, relatively little testing of a teacher’s content knowledge resulted. A quantitative study conducted by Sadler, Sonnert, Coyle, Cook-Smith, and Miller (2013) used a pre-posttest design to examine the influence of a teacher’s knowledge on their students’ learning. According to the authors, studies of teacher effectiveness that focus on proxy measures, such as previous coursework, grades, degrees, or certification, are poor predictors of their students’ achievement (Sadler et al.,

2013). They found relatively little research that directly measures teacher content knowledge, as opposed to alternative measures such as instructional effectiveness and student achievement.

Past attempts to use tests of teacher subject matter knowledge as predictors of student learning have failed because the teachers did not take the same test administered to their students.

Sadler and colleagues (2013) applied the rarely used method of administering an identical measurement tool to both teachers and students and found that student gains are related to teacher knowledge. While Sadler et al. (2013) suggest that teacher content knowledge is a strong predictor of gain scores on content material that does not involve strong misconceptions, it is not sufficient to predict gain scores on content that does involve strong misconceptions. When teachers are unaware of their students’ misconceptions, their attempts at teaching essential concepts may be compromised (Sadler et al., 2013).

The creation of a simple multiple-choice assessment utilized in this study provided an easy way to measure a teacher’s awareness of their students’ mental models, which is a component of PCK. The authors further suggest that teachers who know the common misconceptions held by their students are more efficient at teaching than their counterparts who do not. Also, this study informs future research of the importance of identifying both teacher and student misconceptions alongside the development of content knowledge to deepen the impact of science professional development and aid teachers in constructing learning activities, laboratory experiments, demonstrations, and discussions (Sadler et al., 2013).

A mixed-method study of random fourth-grade classrooms from urban and rural elementary schools in NW Germany investigated the impact of teachers’ content knowledge in

Physics on student outcomes (Ohle, Boone, & Fischer, 2015). This study consisted of student and teacher components. The student components included a student questionnaire of interest in physics and a pre-post design measurement of student content understanding. The teacher component consisted of a questionnaire to determine interest in teaching physics, a test of teacher content knowledge, and a video analysis of actual classroom teaching to observe the quality of instruction. According to this study, content knowledge (facts, concepts, understanding of the structures of a subject, and the relation to other content) can limit what opportunities a teacher can present to students to facilitate learning within the classroom. The authors further suggest that an elementary teacher’s lack of content knowledge and routine for dealing with incomplete knowledge is an international issue. This study proposes that both CK and PCK are essential for successful teaching, suggesting that future research should examine both to understand the effectiveness of the intervention.

A mixed-method study investigated how pre-service teachers’ self-efficacy changed while participating in a school-based science methods course within a Professional Development

School and revealed an increase in personal science teacher efficacy (Swars & Dooley, 2010).

Outcome expectancy did not show significant changes. The concurrent triangulation design examined both self-efficacy and outcome expectancy values that were obtained using the Science

Teaching Efficacy Belief Instrument-B and an open-ended questionnaire. Some pre-service teachers who expressed low self-efficacy cited a lack of adequate content knowledge, while those with strong self-efficacy cited strong content knowledge (Swars & Dooley, 2010). This study suggests that a deliberate content knowledge component is particularly important for science teacher preparation, especially so that pre-service teachers can develop their pedagogy through an understanding of student learners and knowledge about the process of scientific inquiry. Swars and Dooley (2010) recommend the exploration of a more detailed view of teaching efficacy to provide a better understanding of this construct as it relates to science teacher development.

Swackhamer, Koellner, Basile, and Kimbrough (2009) conducted a study to explore whether in-service teachers’ level of self-efficacy changed as a result of the development of content knowledge intertwined with pedagogy. They conducted a five-year project, targeting middle school teachers in seven districts in Denver. Over 200 teachers enrolled in at least one of the 15 content-based math and science courses. The project utilized a Science Teacher Efficacy

Belief Instrument (STEBI), enhanced with math and science questions, to measure self-efficacy, and the number of content courses as a measure of content knowledge. The results showed that the self-efficacy of teachers who completed four or more content courses was much higher than that of teachers who completed only one to three courses. The results of this study suggest that support for the development of a teacher's content knowledge and pedagogy could increase levels of self-efficacy. The authors of the study suggest the use of a pretest/posttest design to strengthen their results and allow for a more definitive interpretation of pre-existing levels of each teacher's self-efficacy.

The studies mentioned above solidify the importance of measuring content knowledge when examining the effectiveness of science teacher professional development.

Teacher content knowledge is a critical lens through which to view the impact of intervention strategies that ultimately influence student achievement.

Overall, few studies exist that measure teacher content knowledge to determine the impact of professional development; therefore, the proposed research adopts this practice. These studies also suggest examining additional factors, such as self-efficacy, that influence teacher PCK. Accordingly, this research also examined PCK and its measurement to inform the proposed pretest/posttest design.

Pedagogical Content Knowledge

Current research has put a strong focus on one of Shulman’s components of teacher knowledge, pedagogical content knowledge. Shulman has defined PCK as “the most useful forms of representation of those ideas, the most powerful analogies, illustrations, examples, explanations and demonstrations” (1986, p. 6). The notion of PCK, since its introduction in 1986, has permeated educational research as an epistemological concept that usefully blends the traditionally separated knowledge bases of content and pedagogy. A well-developed knowledge of teaching procedures (e.g., concept maps, laboratory experiments, Socratic dialogue) and specialized content knowledge (e.g., physics, biology, or chemistry) is demonstrated by a skilled teacher’s ability to combine such knowledge of content and pedagogy in meaningful ways

(Loughran, Berry, & Mulhall, 2012).

When a rich understanding of content is lacking, teachers struggle to deliver elements of professional practice, such as conceptual hooks, triggers for learning, and ways to deal with student misconceptions (Loughran et al., 2012), a struggle often experienced by teachers with low self-efficacy. Increasing the level of content knowledge influences the level of self-efficacy and will affect the implementation of PCK (Swackhamer et al., 2009). Bandura (1993) showed that an increase in knowledge or mastery experiences would lead to an increase in self-efficacy beliefs.

Persons who are persistent respond to situations with active and assured responsiveness

(Bandura, 1977) and explore how to teach particular content in ways that promote student understanding.

Shulman (1986) describes PCK as an amalgamation of productive knowledge of content along with how to select, shape, and modify lessons to reach students in a particular content area.

Pedagogical content knowledge is the integration of subject expertise with the skills needed to teach science. Pedagogical content knowledge is general knowledge about how students learn, best teaching practices, methods of assessment, and the background knowledge of different learning theories (Shulman, 1986). Pedagogical content knowledge serves as the assimilation of content and pedagogy, where a teacher focuses on particular aspects of the subject matter to adapt and represent the content for instruction to improve student learning. The manner of transforming the subject matter for teaching occurs when the teacher interprets the subject matter and finds different representational tools to make content accessible to learners.

Development of pedagogical content knowledge. Studies investigating the development of content knowledge along with PCK are limited in their scope and size since they are case studies or have small sample sizes (Kleickmann et al., 2013). Many quantitative studies have relied on self-reporting measures to determine the relationship between a teacher’s content knowledge and their PCK. Two unresolved issues still need to be addressed. Is content knowledge a prerequisite for PCK? What is the role of thoughtful reflection and practice in the development of a teacher? Much documentation exists to show that limited knowledge and misconceptions impact a teacher’s ability to appropriately respond to their students’ conceptions

(Kleickmann et al., 2013). The cross-sectional quantitative study conducted by Kleickmann and his colleagues (2013) suggests that strong content knowledge does not directly lead to the development of PCK; rather, experience coupled with reflection is needed to further develop a teacher's PCK. This study examined content knowledge and PCK across different phases of

German teachers’ education, beginning during pre-service preparation and continuing into the in-service phase of their careers. Other studies refute this claim and focus on the relationship between CK and PCK.

Impact of pedagogical content knowledge on student learning. Pedagogical content knowledge enables teachers to make the subject matter accessible to their students. Few studies have investigated the link between teacher PCK and student learning. A paper-and-pencil test assessed the PCK of teachers in Finland, Germany, and Switzerland (Ergonenc,

Neumann, & Fischer, 2014). This study focused on three main areas linked to Shulman’s initial definition of PCK: knowledge about misconceptions, knowledge about curriculum, and knowledge about difficulties. Video analysis of teacher instruction was used to determine the effect of teacher PCK on student cognitive activation (Ergonenc et al., 2014). Cognitive activation by teachers includes their ability to plan for the learning process such that the materials used in the lesson encourage student learning. An essential facet of PCK is the knowledge of student misconceptions (Ergonenc et al., 2014). A misconception in science is an incorrect conception that stems from a student’s experiences. Pedagogical content knowledge is how teachers translate their subject material into something comprehensible for their students.

The results of this study suggest there is a correlation between teacher PCK and student cognitive activation, indicating the importance of PCK, alongside content knowledge, to student learning.

Self-Efficacy

Since Bandura identified self-efficacy as the judgment of one’s capabilities, researchers have explored how teachers make judgments about their abilities. Self-efficacy pairs with a teacher’s capacity to respond adequately to student misconceptions. “Research has indicated that teachers with strong, positive efficacy beliefs about their teaching ability are more likely to take risks and use new techniques” (Bray-Clark & Bates, 2003, p. 15). Research has also shown that teachers with high efficacy beliefs are more persistent in helping students who are having difficulty or struggle with misconceptions (Bray-Clark & Bates, 2003). Teachers are also more likely to try out strategies that have positive outcomes on student misconceptions (Bray-Clark &

Bates, 2003). Teachers with high self-efficacy are more inclined to yield better student performance because they are more tenacious when helping struggling students (Haney,

Czerniak, & Lumpe, 1996).

Efficacy is a future-oriented belief about one’s capabilities to deal with different circumstances and perform specific tasks (Bandura, 1993). Self-efficacy is the belief in one’s abilities to manage a variety of circumstances while performing particular tasks. From the perspective of teaching, efficacy is the judgment that one’s abilities would be effective for the achievement and learning of students (Tschannen-Moran & Hoy, 2001).

Development of self-efficacy. The development of self-efficacy depends on four interrelated sources, according to Bandura’s (1977) theory: mastery experiences, vicarious experiences, social persuasion, and emotional arousal processes. Of the four primary sources, mastery experiences are the most powerful (Tschannen-Moran & Hoy, 2007).

Mastery experiences occur when we overcome obstacles through perseverance and effort. The performance is perceived as successful when the task at hand is difficult and completed with little intervention. Physiological and emotional arousal helps to focus a teacher’s energy on the task at hand. Success builds a robust belief in the teacher's sense of efficacy, while failure can undermine efficacy (Tschannen-Moran, Hoy, & Hoy, 1998).

Vicarious experience allows for the modeling of appropriate skills and behavior and serves as a useful tool for boosting self-efficacy. Teachers do not rely only on direct experiences; hearing about how their colleagues have solved related problems can also promote efficacy. Novice teachers are particularly sensitive to vicarious experiences in situations where they are uncertain about their abilities (Mulholland & Wallace, 2001).

Social persuasion can boost attempts at new strategies when others express faith in one’s capabilities. Support from colleagues significantly contributes to a novice teacher’s self-efficacy beliefs (Tschannen-Moran & Hoy, 2007). Encouragement and feedback from leadership can also encourage a faculty to give extra effort, ultimately leading to student success.

Persuasion can support a teacher’s persistence, and persistence can lead to teachers finding ways to solve problems.

Finally, emotional arousal processes relate to how teachers deal with the pressures and stress of their jobs. How teachers learn to adapt and cope with challenges reveals how efficacious they are. Highly efficacious teachers do not misinterpret stimuli and are not dysfunctional in their response to challenging situations (Bandura, 1977). On the other hand, teachers with low efficacy do not deal very well with stressful situations.

Impact of self-efficacy on student learning. Self-efficacy is the confidence we have in our abilities that we can overcome obstacles and achieve our learning goals (Hattie, 2012).

Efficacy impacts the perseverance and effort we expend when working on achieving specific objectives and plays a significant role in how we approach tasks, goals, and challenges (Bandura,

1997). As a component of the social-cognitive theory, self-efficacy focuses on the interrelationships among self-efficacy, outcome expectancy, and behavior (Bandura, 1993).

Teacher efficacy beliefs will affect a teacher’s implementation of PCK due to the challenging nature of implementing new teaching methods (Hodges, Gale, & Meng, 2016). Teachers demonstrating a strong self-efficacy tend to invest more effort in their teaching (Tschannen-

Moran et al., 1998), develop alternative teaching strategies (Guskey, 1988), create student- centered atmospheres (Ramey-Gassert, Shroyer & Starver, 1996), and work patiently with academically challenged students (Gibson & Dembo, 1984).

Collective Efficacy

Hattie’s (2012) meta-analysis about what works best in schools revealed that collective efficacy is at the top of the list, with an effect size of 1.57, indicating its substantial influence.

Social cognition results from the shared belief that people can work together to make a difference and ultimately improve efficacy. The social nature of collective efficacy implies that the group is more than its individual parts. More than the sum of personal attributes, collective efficacy is “the group’s shared belief in conjoint capabilities to organize and execute courses of action required to produce given levels of attainment” (Bandura, 1997, p. 477). The collective efficacy of teachers is affected by their tasks, efforts, persistence, and shared ideas, as well as the achievement of their students.

According to Bandura (1997), collective efficacy in a school setting refers to the perception teachers maintain that their efforts as a whole faculty will have an impact on student learning. Bandura also stated that in order to impact student learning positively, teachers need to believe, as a group, they can improve student learning. Teachers who believe in their instructional efficacy create mastery experiences for their students. On the other hand, staff that feels powerless to get students to achieve conveys a group sense of futility. Collective efficacy results from collegial environments where people learn from and collaborate with colleagues.

This environment is especially crucial for new teachers to develop their place within the school.

“Once formed, efficacy beliefs contribute significantly to the level and quality of human functioning” (Bandura, 1993, p. 145).

The goals of self-directed learners include personal growth and participation in an open dialogue (Bandura, 1997). Success in accomplishing tasks and support from educational leaders strengthen teachers' confidence (Goddard et al., 2004). Teachers and educational leaders believe the fundamental task of teaching is to evaluate the effect of their practice on students' achievement (Hattie, 2012). Educational leaders need to shift as much control as possible to teachers in the learning process to improve their self-efficacy and, in turn, collective teacher efficacy. Education leaders work to build collective efficacy by inspiring teachers to achieve and have ownership in their learning (Bandura, 1997). Therefore, the future professional development of science teachers should not take place in isolation from colleagues.

Outcome Expectancy

If we affix Bandura’s theory to the construct of teacher efficacy, outcome expectancy would primarily reflect the degree to which teachers believe they can overcome their students’ environment, that is, the extent to which students are teachable, given such factors as family background, IQ, and school conditions. Self-efficacy beliefs are a teacher’s evaluation of their ability to bring about positive student change, given certain environmental conditions (Bandura,

1984).

According to Bandura (1977), “an outcome expectancy is defined as a person’s estimate that a given behavior will lead to certain outcomes” (p. 193). In other words, outcome expectancy reflects our ideas about the likelihood that something will happen based on a specific behavior. Outcome expectancy is how well a teacher judges their ability to perform specific tasks in given situations. Gibson and

Dembo explain outcome expectancy regarding self-efficacy as “the degree to which teachers believed the environment could be controlled, that is, the extent to which students can be taught given such factors as family background, IQ, and school conditions” (1984, p. 570).

Development of outcome expectancy. Our understanding of expected outcomes is critical to understanding what teachers believe their students are capable of accomplishing.

Outcome expectancy, therefore, is highly predictive of behavior. Research within the last decade has rekindled the debate about the influence of outcome expectancy as a causal determinant of self-efficacy (Williams, 2010). If outcome expectancy is a determining factor of self-efficacy, then its importance should also be considered when designing professional development and measuring its effectiveness.

Impact of outcome expectancy. Self-efficacy and outcome expectancy, as measured by the STEBI, were used to predict the score on a Biology end-of-course exam in a study conducted by Angle and Moseley (2009). The authors found that 99% of the variance of the

Biology exam scores was accounted for by the outcome expectancy score, meaning that regardless of a student’s home environment, motivation, or availability of course materials, a teacher’s expectation of their students’ ability to learn Biology content is significantly related to the Biology exam scores (Angle & Moseley, 2009). A study from Poland compared student test scores in math to teachers’ outcome expectancy and found that outcome expectancy had a significant and positive effect on the Scholastic Assessment scores in math (Trusz, 2018). In another study measuring the outcome expectancy of reading teachers, 84% of teachers scored low or average on a 16-question survey. The authors believe these results indicate many teachers lack the steadfast conviction that their teaching will have a positive impact on literacy development, and in turn, have an impact on pedagogy and, ultimately, student achievement (Sargent et al., 2018). These studies suggest a relationship between outcome expectancy and student achievement exists and warrants further examination in the proposed research.
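As a point of interpretation rather than a reanalysis of the studies cited above, the percentage of variance accounted for by a predictor is conventionally the coefficient of determination,

\[ R^{2} = 1 - \frac{SS_{\mathrm{residual}}}{SS_{\mathrm{total}}}, \]

so a reported value of .99 would mean that 99% of the variability in the exam scores is associated with the predictor(s) in the regression model.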

Modeling Instruction

The human brain has enabled us to create mental maps to navigate our world, design tools, communicate, and cooperate with other humans. Model-based reasoning integrates multiple representations of information into neatly structured mental models constructed all within the mind. Conceptual models activate these mental models so that we can communicate with others through the construction of shared conceptual models (Hestenes, 1997). Scientific models are the basic unit of coherently structured knowledge, from which we can make logical inferences, predictions, explanations, and hypotheses, and design experiments. Modeling Instruction produces students who learn and understand fundamental science concepts by experiencing the natural world around them and by constructing, evaluating, refining, and applying conceptual models to understand and explain the natural phenomena they experience (Jackson et al., 2008).

Models. “A model is a representation of structure in a physical system and/or its properties” (Hestenes, 1997, p. 8). A model enables one to group large amounts of information into smaller units, placing fewer demands on perceptual processing (Gerjets, Scheiter, &

Catrambone, 2004). The National Science Education Standards (NRC, 1996) and Benchmarks for Science Literacy (AAAS, 1994) both recommend models and modeling as a central theme for science and math education. In MI, a model depicts the basic structure of physical phenomena

(Wells, Hestenes, & Swackhamer, 1995) and “…is a representation of a specific pattern in the real world, and modeling involves habits of mind for model construction, corroboration, and deployment” (Halloun, 2011, p.78). The Framework for K-12 Science Education defines a model as a “system (or parts of a system) under study to aid in the development of questions and explanations, and to communicate ideas to others” (NRC, 2012, p. 57) by generating data that can be used to make predictions. This definition implies that students need to be able to evaluate and refine models through an orderly progression. “Core knowledge corresponds to a limited number of the most basic models in the context of which, from a pedagogical perspective, can be developed the most fundamental and critical conceptions and habits of mind of the corresponding scientific theory” (Halloun, 2011, p. 79).

Models have a serendipitous element of making student reasoning visible (Lehrer &

Schauble, 2000). Models become the visible foothold for students to ground their explanations

(Windschitl, Thompson, Braaten & Stroupe, 2012). Scientific models depict, predict, diagram, and monitor the variables of physical phenomena utilizing mathematical, graphical, and pictorial means (Wells, Hestenes, & Swackhamer, 1995). They provide a connection between scientific theory and the accessible, efficient, and reliable ingredients in knowledge construction and deployment (Halloun, 2011). Models are a way for people studying science to understand scientific theories of natural phenomena. In science, a theory is an accurate explanation of some facet of natural phenomena, based on facts or data that have been, time and again, confirmed through observation and experiment. Halloun (2006) defined a scientific model as “a conceptual system mapped, with the context of scientific theory, onto a specific pattern in the structure and/or behavior of a set of physical systems as to reliably represent the pattern in the question and serve specific function in its regard” (p. 24). Passmore, Stewart, and Cartier (2009) further explained that a model gathers “theoretical objects and the processes they undergo and thus serves as a mechanism that can be used to explain why something in the natural world works the way it does” (p. 395). Models construct factual information into consumable and reasoned bits in conjunction with general laws or principles (Dukerich, 2015).

The development and utilization of conceptual models of natural phenomena are essential to learning and doing science (Hestenes, 1987; Hestenes, 1997; Wells, Hestenes & Swackhamer,

1995). To understand science is to know how scientific models are developed and verified

(Hestenes, 2010). Once developed, a model is validated by correlating what the model predicts from the physical system with data collected from the system in action (Hestenes, 1997).

Students in science classes must apply models to illustrate, analyze, design experiments, or anticipate outcomes in order to understand the work of scientists (Hestenes, 1997). A model, when manipulated, can help students understand unique situations, make predictions, or answer questions (Megowan, 2007). A model can serve as the foundation of instruction (Windschitl et al., 2012) and should be the focus of science content (Manthey & Brewe, 2013). Scientific models become meaningful units of structured knowledge (Hestenes, 1997).

The supposition that things with definite, physical properties occupy the cosmos and require multiple representations serves as the basis for models (Hestenes, 1997), which conceptually represent a real thing or structure in a system (Hestenes, 1987). There are five kinds of model structures: systemic, geometric, temporal, interactive, and object (Hestenes, 1992). Depictions of the composition, environment, and connections of objects within the system are systemic structures, or system schemas. The geometric structure, or relations among places, includes the position and configuration of objects within the system. An event structure, or chronological structure over time, including descriptive or causal features, represents the temporal structure. The interaction structure illustrates the interaction laws in the natural world. The specification of what the system contains and its intrinsic properties represents the object structure. A model may have any mixture of these five kinds of structures (Hestenes, 2006). In science, models are regarded as a part of a more extensive, more comprehensive system and are a “set of hypothesized relationships among objects, processes, and events” (Windschitl & Thompson,

2006, p. 796).

Modeling. Modeling is essentially the cognitive process of applying theoretical principles to produce, test, modify, and use a model (Hestenes, 1987). Modeling engages students collaboratively in making and using models to describe, explain, predict, design, and manipulate physical phenomena (Jackson, Dukerich & Hestenes, 2008) and tackles how to restructure students’ intuitions by engaging them in the construction and manipulation of externally structured representations (Hestenes, 1997). Modeling begins with a simple situation that students seek to understand. First, students represent the salient features of a system before they identify the variables.

The general intent of modeling is to test an idea, represented as a system of related processes, events, or structures, against observations in the real world (Windschitl et al., 2006).

Modeling is a problem-solving process that has four stages: description, formulation, ramification, and validation (Hestenes, 1987). Modeling allows scientists to predict patterns as they gather data to formulate descriptions of natural phenomena (Passmore et al., 2009).

Essentially, modeling is how scientists reason, argue, and structure their knowledge and is the central activity of engaging in scientific endeavors (Manthey & Brewe, 2013). Therefore, it is crucial for our students, or rather our future scientists, to engage in modeling (Manthey & Brewe,

2013).

Modeling helps teachers to ground unifying concepts around scientific models that engage students collaboratively in a small group setting. Teachers create interactive classrooms through hands-on experiences. A supportive atmosphere guides students as they collect data through observations in a non-threatening environment. As a result, teachers reduce the likelihood of passing along scientific misconceptions and may challenge inaccurate preconceptions (Dukerich, 2015). When adequately treated, many misconceptions fall away

(Hestenes, 1997).

Modeling Instruction pedagogy specifically asks students to create, validate, deploy, and review scientific models (Manthey & Brewe, 2013). Student discourse is a vital component of modeling (Megowan, 2007). Model application is used to solve problems, and students make model-based inferences (Hestenes, 1997). Teachers utilizing a modeling approach to scientific inquiry should have proficiency with conceptual modeling tools, qualitative reasoning skills with model representations, and procedures for quantitative measurements, and should be able to compare models to data (Hestenes, 2010).

Modeling pedagogy cannot be disseminated through traditional means, such as a textbook, since it entails social practices learned through the synergy with other teachers and deliberate classroom practice (Passmore et al., 2009). Modeling should be the primary activity in which students are engaged throughout a science course (Manthey & Brewe, 2013). The instructional framework for modeling guides students by engaging them with a question or a problem. Students develop a tentative model about the relationships of phenomena before they create a model to account for observations made during laboratory activities. Evaluation of the model’s usefulness, predictive power, or explanatory adequacy influences any revisions to the original model before applying the refined model in new situations (Windschitl et al., 2006).

Modeling Instruction and student learning. Modeling Instruction is a constructivist approach that reflects how previous scientists advanced their understanding of science concepts

(Dukerich, 2015). Students in an MI classroom are invariably engaged in a variety of hands-on, minds-on modeling activities to help them purposely establish scientific conceptions and develop scientific habits of mind (Halloun, 2011). “Coherent understanding cannot be transferred from teacher to student by lucid explanation or brilliant demonstrations” (Wells, Hestenes,

Swackhamer, 1995, p. 21). Instead, understanding must come from immersion in the process of developing an understanding of the content (Dukerich, 2015).

Modeling Instruction is a student-centered instructional method formulated in Modeling

Cycles and focuses on the conceptual representation of natural phenomena (Hestenes, 1987,

1997). A Modeling Cycle is similar to the learning cycle and provides a framework to structure learning so students can develop and use the conceptual models that form the content core of the discipline they are studying. By integrating curriculum and pedagogy, MI addresses obstacles students face in learning science and scaffolds content with inquiry-based lab activities

(Hestenes, 2015). A teacher’s role in MI is to introduce representational tools when necessary to sharpen models, facilitate modeling activities, manage discourse by facilitating Socratic dialogue and introducing appropriate scientific terms, and address misconceptions (Hestenes,

1997). Teachers must also mediate student interactions to prevent students from going awry and getting tangled in unproductive discourse (Halloun, 2011). Modeling Instruction, therefore, encourages cooperation among students and facilitates discourse about complicated concepts in a supportive environment (Jackson, Dukerich & Hestenes, 2008).

The overall objectives of MI pedagogy include high-quality student thinking, sense-making, and learning through the development of a clear concept of a model while students become familiar with core models of science content, utilize the skills and techniques of modeling, and gain experience with deploying models to understand the physical world

(Hestenes, 2015). Modeling Instruction differs from traditional science teaching methods in the way content is organized, the role laboratory experiments play in developing concepts, and the manner in which an instructor interacts with their students (Dukerich, 2015). Modeling Instruction is a pedagogy that exemplifies a meaningful execution of all eight science and engineering practices described as essential for all students to learn by the Framework for K-12 Science Education

(NRC, 2012). Those eight practices include: proposing questions and clarifying problems, developing and applying models, devising and conducting investigations, examining and explaining data, using arithmetic and computational reasoning, building explanations and devising solutions, engaging in argument from data, and obtaining, evaluating, and communicating information (NRC, 2012).

Teachers participate in professional development during the summer to experience MI.

Teacher leaders facilitate and model instructional practices by engaging teachers as learners of science content and pedagogy. Model-based pedagogy introduces a systematic approach to design curriculum and instruction, accomplished through the examination of educational research along with the exploration of the significance of those models to science teaching and learning. Also, rotation between the roles of student and instructor helps the teacher practice instructional strategies that engage and guide learners in cooperative inquiry, developing and applying models, evaluating evidence, and conducting discourse. Modeling Instruction professional development explores ways to integrate resources into teaching and to collaborate when rethinking and redesigning high school science courses and curriculum materials to enhance learning (Wells et al., 1995). Whiteboard sessions have become a signature feature in

MI due to the productive classroom interactions they support as students summarize their model and evidence to display to the class. The whiteboard also serves as a focal point for class discussion to anchor shared understanding within the classroom (Hestenes, 2010). Whiteboards reinforce understanding of concepts; students are motivated to understand what they put on the board (Jackson, Dukerich & Hestenes, 2008).

During MI professional development, teachers develop robust teaching methodology in a clear pedagogical framework to help students make sense of physical experiences, understand scientific claims, articulate and defend their opinions, and evaluate evidence (Jackson, Dukerich

& Hestenes, 2008). A teacher’s skill at implementing pedagogy and understanding science content are the critical factors of success in the modeling method (Hestenes, 2010). There is a strong focus on how to help students rectify naive conceptions by constructing appropriate models (Wells et al., 1995).

Figure 1. The five goals of Modeling Instruction to guide students during a high school science course, according to L. Dukerich (2015): (a) perceive modeling as the method of obtaining and using scientific knowledge; (b) build scientific models to describe, explain, predict, and manipulate physical phenomena; (c) develop a small set of models that serve as the content core for the subject; (d) represent physical systems using pictorial, graphical, and mathematical expressions; and (e) evaluate scientific models by evaluating their connections to experimental data.

The Modeling Cycle. Modeling cycles organize science content around scientific models in an MI classroom instead of in traditional units. The Modeling Cycle starts with a paradigm lab to construct the model, followed by refinement and expansion of the model before the application of the model (Megowan-Romanowicz, 2016). There are two main stages within the cycle. The first stage is model development and encompasses exploration, concept introduction, or invention (Wells et al., 1995). To begin the first stage of the cycle, the teacher presents a demonstration of a testable phenomenon. Students discuss the demonstration to determine the variables they wish to quantify and correlate in order to predict the relationship between those variables.

Students collaborate to plan and collect data, analyze their data, and share their findings through whiteboard presentations (Belcher, 2017; Megowan, 2007). Further laboratory activities with justification based on evidence refine and expand the models.

The second stage of the modeling cycle is the deployment of the model (Wells et al.,

1995). Deployment activities apply the model to a different situation to test how adequately the model explains natural phenomena (Belcher, 2017; Jackson, Dukerich & Hestenes, 2008;

Megowan, 2007). Models build on each other, and when a model fails to account for empirical data, students must refine their model or create a new one (Belcher, 2017). As models are challenged and refined, the cycle continues. It is through this cycle that students and teachers genuinely engage in the process of science. Many traditional classrooms today are disengaged from this process.

Science misconceptions. Misconceptions, also known as naïve beliefs or preconceptions, are based on faulty understanding and result in incorrect thinking, inaccurate perception, preconceived notions, or a flawed understanding (NRC, 1997). The basis of science instruction should be to challenge the misconceptions students possess and to link analogies between and within our physical, mental, and cultural worlds to change our understanding.

Teachers cannot help children learn what they do not understand (Sadler et al., 2013). Many misconceptions exist in the field of physical science. Examples of common misconceptions include: (a) all atoms have a charge, (b) objects use up energy, (c) inertia is a force, (d) doubling the speed of an object doubles the kinetic energy, (e) when an object is not moving there is no energy, (f) adding heat always increases temperature, (g) the mass after a reaction is less than the reactants, and (h) particles of solids do not exhibit motion (New York Science Teacher, 2010).

A multitude of resources have extensive lists of student misconceptions (DiSpezio, 2010; Heller

& Stewart, 2010; New York Science Teacher, 2010; Rosenblatt, 2012; Sadler & Sonnert, 2016).

Students' misconceptions about natural phenomena diminish their ability to learn new concepts when alternative models already exist in their minds (NRC, 1997). Halloun and

Hestenes (1985) found that not only did college students enter their first course in physics lacking basic physics concepts, but their “alternative misconceptions about mechanics are firmly in place” (p. 11). These misconceptions persist as uncertainty in a student's mind and impede further learning (McDermott, 1991).

Conceptual change happens when cognitive conflict arises from a discrepancy between one’s existing conceptions and new experiences. Those existing conceptions need restructuring.

Humans organize knowledge by arranging basic principles gained through observation and then making inferences when information is incomplete or only moderately understood (Posner, Strike,

Hewson, & Gertzog, 1982). When contradiction exists between naive conceptions and new experiences or accommodations, a conceptual change occurs, reflecting the cognitive dissonance between the two (Posner et al., 1982). The strength of the misconceptions students maintain is not due to a lack of competence, but rather a result of resistance to changing those misconceptions when the underlying conditions for conceptual change are not met

(Hestenes, 1987). According to Posner, Strike, Hewson, and Gertzog (1982), those conditions include dissatisfaction with current conceptions and a new concept that is clear and understandable, appears initially plausible, and suggests further investigations. To help facilitate students’ conceptual change, science teachers should design their curriculum to expose students to this cognitive conflict and devise strategies to help make conceptions scientifically aligned

(Posner et al., 1982).

Modeling Instruction reduces the number of misconceptions retained by students (Wells et al., 1995). Students discover through scientific discourse that when their misconceptions do not fit the model developed in class, they must modify their prior conceptions. Modeling is different from other inquiry methods of instruction since it incorporates scientific discourse while students are making sense of data from experiments (Megowan, 2007). This discourse is essential for students to challenge and replace their misconceptions with more accurate understanding (Campbell, Oh, & Neilson, 2012; Chen, Benus, & Yarker, 2016; Ford, 2012; Van

Hook & Huziak-Clark, 2008). Conceptual change research contends that students’ prior conceptual understanding of scientific phenomena can interfere with embracing new scientific information (Chi, 2008; Vosniadou, Vamvakoussi, & Skopeltiti, 2008). The longer a misconception exists, the more difficult it is to change. Therefore, learning must move students progressively toward deeper conceptual understanding (Gooding & Metz, 2011).

Modeling Instruction impact. Improvement of teachers’ conceptual understanding of science content potentially enhances their influence on student learning and comfort with teaching content that addresses misconceptions (Kleickmann et al., 2013), and ultimately their overall self-efficacy of teaching (Haag & Megowan, 2015). Students who learn through

Modeling Instruction have shown an increase in their content knowledge scores from pretests to posttests as measured by Concept Inventories since the 1990s (Hestenes, 2000; Jackson et al.,

2008).

How does participation in an MI workshop affect a teacher’s instructional practice?

Barlow, Frick, Barker, and Phelps (2014) sought to find the answer to this question and determine the fidelity of the implementation of MI. The hope of a project titled Transforming

Instruction through Modeling Experiences was to increase secondary science teachers’ content knowledge after participation in the program. A total of 59 teachers attended for two weeks in the summer. Nine teachers were purposively selected to participate in the research, all of whom taught within a southeastern state in the U.S. The Reformed Teaching Observation Protocol

(RTOP) and an observational checklist were used to record evidence based on classroom observations (Barlow et al., 2014). The RTOP is an observational instrument used to measure reformed teaching practices demonstrated by a teacher within their classroom (Barlow et al.,

2014). Researchers conducted observations in pairs, with an interview immediately following.

Participants received a total score based on 25 items across five categories on the RTOP. The quantitative RTOP scores suggested that all participants demonstrated improvement in their content knowledge concerning propositional knowledge as measured through observations

(Barlow et al., 2014). Further research is needed to quantify the gain teachers experience through participation in professional development such as MI. Concept Inventories would further improve studies such as the one described here.
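To make the scoring described above concrete, the following minimal sketch in Python aggregates RTOP-style observation ratings into category and total scores. It assumes each of the 25 items is rated on a 0-4 scale, as is typical for the RTOP; the uniform grouping of five consecutive items per category is a placeholder for illustration, not the protocol's published item-to-category key.

# Hypothetical aggregation of RTOP-style observation ratings.
# Assumes 25 items rated 0-4 and five categories; grouping five consecutive
# items per category is a placeholder, not the published key.
def rtop_scores(ratings):
    """ratings: list of 25 integer ratings, each between 0 and 4."""
    if len(ratings) != 25 or any(r < 0 or r > 4 for r in ratings):
        raise ValueError("Expected 25 ratings between 0 and 4.")
    by_category = {
        "category_%d" % (i + 1): sum(ratings[i * 5:(i + 1) * 5]) for i in range(5)
    }
    return {"total": sum(ratings), "by_category": by_category}

# Example: a uniform rating of 2 on every item yields a total score of 50.
print(rtop_scores([2] * 25))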

The Force Concept Inventory and Assessment of Basic Chemistry Concepts

The Force Concept Inventory (FCI) is a multiple-choice assessment with Newtonian concepts and common sense alternatives, widely used for evaluating students’ understanding of

Newtonian Physics. The use of the FCI, first published in March of 1992, extends to physics classrooms throughout the country and around the world (Hestenes, Wells, & Swackhamer,

1992). The Inventory is not a test of intelligence; it is a probe of common physics misconceptions.

Force Concept Inventory development. The FCI, developed by Halloun and Hestenes in 1985 and revised by Halloun, Hake, Mosca, and Hestenes in 1995, has been translated into 33 different languages. The inventory consists of 30 questions organized into six groups of

Newtonian concepts: kinematics, Newton’s first law, Newton’s second law, Newton’s third law, superposition principle, and types of force (Hestenes, Wells, & Swackhamer, 1992). The inventory itself is not a measure of intelligence, but rather an inquiry into previous conceptions

(Hestenes et al., 1992).

The FCI is the most widely used, most researched, and statistically reliable instrument for assessing physics instruction (Hake, 1998). According to Hestenes and Halloun (1995), the FCI is useful for making comparisons among different teaching methods or determining the effectiveness of new science teaching pedagogy. The design of the FCI is to provide a choice between Newtonian concepts and common sense alternatives or misconceptions. About 60% of the FCI was based on the Mechanics Diagnostic Test (MDT), previously used as a diagnostic tool to identify and classify misconceptions in mechanics (Hestenes et al., 1992). There is strong evidence that suggests the FCI score correlates with other Newtonian skills, such as problem-solving (Hestenes & Halloun, 1995).

The FCI has a Newtonian mastery threshold of 85% correct responses, while a score of

60% is indicative of an entry-level threshold to Newtonian physics (Hestenes & Halloun, 1995).

Questions on the FCI were designed to be meaningful to students who do not have any training in mechanics in order to reveal their understanding of the subject (Hestenes, 1997). At first, the FCI was constructed to gauge whether college mechanics courses could prepare students to reliably discriminate between science concepts and naive conceptions in everyday physical situations.

The FCI data revealed that students do not easily give up their misconceptions (Hestenes, 1997).
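As a purely illustrative calculation of these thresholds (the raw scores below are hypothetical, not data from the studies cited), a student answering 26 of the 30 items correctly scores

\[ \frac{26}{30} \approx 86.7\%, \]

which exceeds the 85% Newtonian mastery threshold, whereas 18 correct answers yield exactly 18/30 = 60%, the entry-level threshold.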

In a study to corroborate the reliability of the FCI, Lasry, Rosenfield, Dedic, Dahan, and Reshef (2011) collected pre- and posttest scores for 100 students enrolled in an electricity and magnetism class at John Abbott College, a two-year college in Montreal, Canada. Data collection occurred over three years during the fall term each year. The instructor remained constant for the class. Internal consistency was determined by examining how students responded on the posttest compared to their pretest answers. Force Concept Inventory test-retest response expected frequencies show that the FCI total score is a precise metric (Lasry, Rosenfield, Dedic,

Dahan, & Reshef, 2011). The results of this study suggest that the FCI is a reliable predictor of student understanding in a subfield of physics, such as electricity and magnetism. Also, the total score on the FCI has high test-retest reliability.

Impact of the Force Concept Inventory. Savinainen and Viiri (2008) used the FCI as a measure of 49 Finnish students’ conceptual coherence using posttest percentage data. They defined conceptual coherence in terms of the conceptual framework, contextual coherence, and representational coherence. Through their research, they wanted to determine the extent to which a student whose FCI score was over 80% exhibited contextual coherence and in what dimensions and representations of the force concept students were the strongest and weakest. An Interactive

Conceptual Instruction approach, during a 27-hour high school mechanics course, was used for collecting FCI scores followed by interview questions to probe the students’ understanding.

Classification of the results on the posttest fell into three levels of achievement in contextual coherence (Savinainen & Viiri, 2008). This study points to the effectiveness of utilizing FCI scores, even for students with strong conceptual knowledge. In the proposed study, some teachers possessed strong content knowledge. However, the use of the FCI still proved to be fruitful in data collection.

Coletta, Phillips, and Steinert (2007) studied the correlation of normalized gains on the

FCI and SAT scores as a result of participation in an interactive engagement physics course.

Pre-Instruction math, verbal, and cumulative SAT scores, as well as normalized gains on the

FCI, were collected for 292 students in various introductory Mechanics classes at Loyola

Marymount University, in addition to 335 high school physics students taught using interactive engagement. The correlations of math and verbal scores with FCI scores were significant, with p-values < 0.0001. The relationship between FCI normalized gain and SAT math and verbal scores was determined from the slope of the plotted data (Coletta, Phillips,

& Steinert, 2007). The researchers suggested that the SAT is correlated with student learning, as measured by the normalized gain score on the FCI, and that the attained level of formal reasoning could benefit from interactive engagement pedagogy.
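For readers unfamiliar with the metric, the normalized gain used in these FCI studies is Hake's (1998) gain, computed from pretest and posttest percent scores. A minimal Python sketch follows; the example scores are hypothetical:

    # Hake's normalized gain: g = (post% - pre%) / (100 - pre%)
    def normalized_gain(pre_percent: float, post_percent: float) -> float:
        """Fraction of the possible improvement that was actually achieved."""
        if pre_percent >= 100:
            raise ValueError("A perfect pretest leaves no room for gain.")
        return (post_percent - pre_percent) / (100.0 - pre_percent)

    # Hypothetical student moving from 40% to 70% on the FCI:
    print(normalized_gain(40.0, 70.0))  # 0.5 -- half of the possible gain was realized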

Assessment of Basic Chemistry Concepts. The Assessment of Basic Chemistry

Concepts (ABCC) was created by modifying the Chemistry Concept Inventory designed by

Mulford and Robinson (2002) for the evaluation of first-semester college chemistry students.

The Chemistry Concept Inventory was first designed to measure entering students' conceptions about topics found in the first semester of a general chemistry college course and to observe any changes after such a course. The development of the 22-question multiple-choice assessment arose from responses to the pilot inventory. The authors were also able to demonstrate that traditional teaching methods used in the general chemistry course resulted in only modest improvements in understanding of basic concepts, thus supporting the proposed research of utilizing a new pedagogical approach to teaching science.

Modification of the Chemistry Concept Inventory was necessary for use in a high school chemistry classroom. Adaptation of this inventory included an additional set of energy-related questions developed by Dr. Guy Ashkenazi (Royce, 2012). There are 27 questions on this

Concept Inventory. The internal reliability, as measured by a coefficient alpha of .798, was reported in Brenda Royce's 2012 master's thesis (Royce, 2012). The ABCC was used in this research to gather data about content knowledge for the chemistry MI professional development.

Science Teaching Efficacy Belief Instrument

Science teaching efficacy gives researchers a measurable construct for quantifying factors that contribute to an educational environment, such as persistence in difficult professional circumstances and response to criticism (Deehan, 2016). The Science Teaching Efficacy Belief

Instrument-A (STEBI-A) has been a cornerstone in science education research, giving researchers a reliable measurement tool for self-efficacy and outcome expectancy for over 25 years (Deehan, 2016).

Development of the Science Teaching Efficacy Belief Instrument-A. Gibson and

Dembo (1984) further developed Bandura's construct of self-efficacy by establishing a link between teacher efficacy and persistence in difficult teaching situations while implementing new teaching techniques, applying Bandura's self-efficacy theory to teaching outcomes. Gibson's Teacher Efficacy Scale has seeded a growing number of subject-specific scales that investigate how teachers create student learning activities and how they persist while adapting their instruction to meet the diverse needs of students (Guskey,

1988). This instrument influenced the development of the Science Teaching Efficacy Belief

Instrument (STEBI-A) by Riggs and Enochs (1989) to measure in-service science teacher self-efficacy and outcome expectancy. The STEBI-A instrument maintained many of the items from

Gibson’s survey with some minor alterations specific to science content. Self-efficacy beliefs are situational and depend on the context as well as the specific subject matter (Bandura, 1997;

Tschannen-Moran et al., 1998).

The STEBI-A was developed by Riggs and Enochs (1989) to measure two constructs that comprise in-service science teaching efficacy beliefs. Riggs and Enochs (1989) explained the association among belief, attitude, and behavior in teaching. The instrument’s two subscales include Personal Science Teacher Efficacy (PSTE) and Science Teacher Outcome Expectancy

(STOE). PSTE measures a science teacher's belief about their ability to develop their students' science knowledge and skills (Deehan, 2016). The items measuring STOE examine a teacher's belief about their ability to overcome external obstacles that can impact a student's ability to learn (Deehan, 2016). On the original questionnaire, respondents rated their level of agreement with 30 statements on a 5-point scale from 'strongly agree' to 'strongly disagree'

(Riggs & Enochs, 1989). The Cronbach's alpha (reliability) for the PSTE items was 0.92, and for the STOE items it was 0.74 (Riggs & Enochs, 1989).
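For context, reliability coefficients such as these are Cronbach's alpha values computed from item-level responses. The following Python sketch illustrates the calculation; the response matrix shown is hypothetical, not STEBI-A data:

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for a (respondents x items) matrix of scores.

        alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
        """
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)       # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scale scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical five-point Likert responses: 6 respondents x 4 items
    responses = np.array([
        [4, 5, 4, 4],
        [3, 3, 2, 3],
        [5, 5, 4, 5],
        [2, 2, 3, 2],
        [4, 4, 4, 5],
        [3, 2, 3, 3],
    ])
    print(round(cronbach_alpha(responses), 3))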

The STEBI-A instrument was used by Lekhu in 2013 to explore the relationship between science content knowledge, STOE, and PSTE. The author found that teachers' content knowledge influenced their confidence to teach chemistry content. The author also discovered a significant relationship between content knowledge and confidence to teach several concepts (Lekhu, 2013). The STEBI-A has proven to be an accurate and decisive instrument that makes the intangible measurable and should be utilized in more science education research to evaluate the effectiveness of interventions. This research utilized questions from the STEBI-A that were tailored specifically for MI professional development as a tool to measure PSTE and

STOE.

Effective Professional Development

Yoon, Duncan, Lee, Scarloss, and Shapley (2007) posit that professional development affects student achievement in three steps. In the first step, professional development enhances teacher content and pedagogical knowledge and skills. Second, improved knowledge and skills lead to improved classroom teaching. Finally, improved teaching raises student achievement (Guskey & Yoon, 2009). Scholars Yoon, Duncan, Lee, Scarloss, and Shapley

(2007), analyzed the findings of 1,343 studies that address professional development activities.

These studies were screened using a protocol inspired by What Works Clearinghouse, from the

Institute of Education Sciences. The review process included studies from Australia, Canada, the

United Kingdom, and the United States. The protocol used in consideration of the impact of in-service professional development on student achievement included studies: (a) with an effect size of 0.25 or higher; (b) specifically focused on English, math, and science teachers in grades K-12; (c) that utilized randomized control trials or quasi-experimental designs; (d) that included a measure of student achievement; (e) considered to be accurate and consistent; and (f) conducted between

1986 and 2006. Only nine studies of over 1300 reviewed passed all the established criteria

(Yoon et al., 2007). These nine studies aid our understanding of the components of effective professional development and their impact on student achievement.

Components of effective professional development. The review of these studies revealed four essential characteristics for professional development to influence student achievement. First, time is a crucial factor in successful professional development. Studies of fourteen or more hours show a positive and significant effect on student achievement when the professional development is organized, well-structured, purposefully directed, and focused on content, pedagogy, or both.

Another crucial factor included significant, structured, and sustained follow-up after the primary professional development activity. Third, activities are another key to success. Activities must be adapted to specific content, process, and context elements to be useful in professional development. The final component is the subject or content. Professional development that focuses on what teachers teach, pedagogic practices, and how students acquire knowledge and skills has been shown to influence student achievement (Yoon et al., 2007; Zhang et al., 2015).

Research has shown that the effectiveness of professional development improves when it is longer in duration, is supportive during the learning process, encourages active participation, models the process of science, and is grounded in the content of a teacher’s particular discipline

(Desimone, 2009; Garet, Porter, Desimone, Birman, & Yoon, 2001; Gulamhussein, 2013; Zhang et al., 2015). In-depth studies have not revealed an exact turning point for the duration of professional development. However, research suggests a significant impact on changing teacher practice when professional development extends beyond 14 hours (Haag & Megowan, 2015; Yoon et al., 2007), includes 20 or more hours of activities spread over a semester, or takes the form of intensive summer training with additional contact time during the semester (Desimone, 2009). Collective participation is more likely to afford opportunities for active learning. Teachers need to have a sophisticated understanding of their content to teach students the new content standards.

Encouraging teachers to be more engaged in meaningful discussion, planning, and practice has a more significant impact on the potential change in teaching practices.

In a survey research design using a Likert scale, Zhang et al. (2015) examined patterns that emerged from a survey of 118 K-12 in-service teachers attending a summer professional development opportunity through a mid-western US university. Teachers chose among 230 science topics on which they would like to receive professional development to improve their practice. The results showed four significant reasons why teachers chose specific topics. First, a teacher felt they possessed weak or insufficient training in specific content. Second, they found a topic challenging for students to learn or considered the topic significant in their students' learning. Third, they wanted to refine content for student engagement. Finally, teachers may have selected a particular topic because they needed assistance with realigning the curriculum to new content standards (Zhang et al., 2015). Data analysis revealed that teachers needed help with several areas of pedagogical content knowledge, particularly inquiry teaching, which remains one of the most significant challenges for most teachers (Zhang et al., 2015). Therefore, professional development should take into account teacher background, target common topics that many teachers find challenging to teach, and be designed to help teachers use inquiry to teach specific science concepts.

There are several key features that studies have shown to influence the effectiveness of professional development opportunities. Professional development that is longer in duration tends to have more contact hours and to span a greater period of time. Both dimensions were found to be significant in a quantitative summary of findings from a correlational analysis of representative teacher data (Garet et al., 2001). Also, factors such as a focus on content (Garet et al., 2001), active engagement (Fullan & Langworthy, 2014), and alignment with teacher and state goals (Guskey, 1994) significantly influence effective professional development.

Framework for Modeling Instruction professional development. Modeling

Instruction professional development possesses characteristics that researchers have described as necessary components of professional development to improve student achievement (Desimone, 2009). Leaders of MI workshops focus on helping science teachers understand how to help their students learn content knowledge by providing them with robust pedagogy that enables them to identify misconceptions and naïve beliefs. Participants are actively engaged in conducting science experiments as they develop models for teaching and learning (Wells, Hestenes, &

Swackhamer, 1995). Another critical feature of the workshop is the collective participation of teachers from the same discipline, who are encouraged to work as a group to implement the transformative steps of MI in their classrooms. Active participation allows teachers to work together and discuss the process of transforming their schools and classrooms to use MI. Modeling Instruction aligns content with the 2013 Next Generation Science Standards

(Haag & Megowan, 2015). Finally, MI professional development takes place through two or three weeks in the summer, often with follow-up sessions with cohort members during the school year. Participants join a national pool of participants who share ideas, questions, and challenges through a daily LISTSERV, webinars, and regional conferences.

Leadership. Leithwood, Jantzi, and Steinbach (1999) found that transformational

leadership consistently impacted a teacher’s willingness to change their classroom practices and attitudes. The qualities of transformational leaders that impact teacher efficacy include their relentless pursuit of values, their curiosity to learn, their dissatisfaction with the status quo, their desire to rise above adversity by their unwillingness to compromise with injustice, and their desire for feedback and collaborative expertise (Sahgal & Pathak, 2007). In his book, What

Works Best in Education: The Politics of Collaborative Expertise, Hattie (2015) explains:

The most significant influence on student progression in learning is having highly expert, inspired and passionate teachers and school leaders working together to maximize the effect of their teaching on all students in their care. There is a significant role for school leaders: to harness the expertise in their schools and to lead successful transformations. There is also a role for the system: to provide the support, time and resources for this to happen. Putting all three of these (teachers, leaders, system) together gets at the heart of collaborative expertise. (p. 2)

As a consequence of the Every Student Succeeds Act of 2015, many school leaders encounter new challenges regarding district-wide decisions that call previous leadership models into question. Townsend, Acker-Hocevar, Ballenger, and Place (2013) revealed three essential factors driving instructional leadership: defining the school's mission, promoting a positive school learning environment, and managing the instructional program. Educational leaders who have an obligation to meet federal, state, and local expectations are demanding enhanced insight into ways their efforts will satisfy the needs of each stakeholder.

Within the last ten years, student performance on standardized tests has become a focal point in education. Effective leadership addressing student achievement is, therefore, a must.

Successful learning institutions cannot exist without highly compelling leadership (Townsend et al., 2013). Evaluation of the leader-follower relationship, the position of power, and task structure will aid in determining the most effective style of leadership within Fiedler’s

Contingency Model framework (Cunningham & Cordeiro, 2009).

The current vision for science education involves the emphasis on doing science rather than just learning about science (National Academies of Sciences, Engineering, and Medicine,

2015). As professional development focused on science content grows, it is challenging to identify high-quality experiences supportive of science teachers during this age of educational reform. The science teacher is essential in the creation and orchestration of experiences that actively engage students in science content, crosscutting concepts, and discussion of natural phenomena, which requires intellectual and pedagogical proficiency (National Academies of Sciences,

Engineering, and Medicine, 2015). There is a gap between current science teaching practices and the vision for science learning in the NGSS. Science and education leaders need to identify and encourage participation in effective professional development, such as MI, for teachers to refine their craft and cultivate future science teacher leaders.

Next Generation Science Standards and professional development. Teaching science through inquiry has been advocated for over 30 years and is supported throughout the Next

Generation Science Standards (Wilcox et al., 2015) and A Framework for K-12 Science

Education, which preceded and established the agenda for the development of NGSS. In 2009,

90% of K-12 educators reported participating in some form of professional development

(Fleischman, Hopstock, Pelczar, & Shelley, 2010). However, most found that it was not conducive to improving their teaching practices (Fleischman et al., 2010). Typical professional development does not facilitate the critical thinking necessary for science classrooms of the 21st century. Teacher preparation has focused a great deal on inquiry teaching (Darling-Hammond et al., 2009). However, this remains difficult for most teachers to achieve within their classrooms

(Darling-Hammond et al., 2009) in the absence of effective professional development, such as workshops on MI.

Gender differences in the science classroom. Females have made advances in many areas of science despite their underrepresentation in STEM-related fields such as physics, engineering, computer science, and chemistry. The marginalization of females in science education is reflected predominantly in their persistent underrepresentation in science careers (NSF,

2015). The scientific community must further bolster females' access to and advancement in

STEM fields. However, this would require the scientific community to understand how science education is falling short. While there have been improvements to promote females and encourage them to take classes in STEM, females still enroll in fewer advanced-level science courses in high school and graduate from college with fewer STEM degrees than their male counterparts (NCES, 2012; Sadler et al., 2012).

Promoting successful female role models in science is a research-based practice for supporting girls in STEM (Baker, 2013). Also, ensuring that professional development and teacher training programs include discussions about gender equity pedagogy may help teachers change the gender inequities in science. Professional development should aid teachers in the creation of an inclusive learning environment that respects and values all students (NSTA, 2011).

Promoting classroom environments where all students participate fully in class discussions, science activities, and investigations should take top priority, as should the practice of self-reflection to uncover any gender biases teachers may hold, in order to minimize negative impacts on student achievement (Halpern et al., 2007). In-service professional development opportunities are a valuable type of gender equity intervention in science education.

In addition to creating a safe and open learning environment, research suggests that giving attention to the different learning styles of males and females will provide many opportunities to improve gender equity in the science classroom (NSTA, 2011). Research by Witkin and

Goodenough (1981) supports that female learning styles are generally more cooperative and interdependent than those of their male counterparts. There are other differences, as well. Males have a preference for abstract conceptualization, whereas females tend to seek applicability or personal connections with new material. Females are more socially and performance-oriented, while males tend to be achievement-oriented. Females rank social interaction with others and self-confidence as of higher importance than males do (Hayes, 2001).

Summary

Few studies have examined the relationship between content knowledge and self-efficacy

(Lakshmanan, Heath, Perlmutter, & Elder, 2011; Menon & Sadler, 2016). A mixed-method study that investigated changes in pre-service teachers’ self-efficacy and physics content knowledge observed significant gain scores on both the STEBI-B (used explicitly for pre-service teachers) instrument and a physical science concept test. A specialized physics content course, whose aim was to teach teachers to learn to integrate their understanding of science with pedagogical models, may give those teachers an advantage of having higher science self-efficacy beliefs than their peers and support a smooth transition into their science teaching practice

(Menon & Sadler, 2016). Studies have also shown that the most influential models for professional development include a focus on content, are longer in duration, actively engage participants, and provide follow-up activities (Yoon et al., 2007; Zhang et al., 2015). However, missing from the research is the impact of specific professional development opportunities such as MI on teacher content knowledge, self-efficacy, and outcome expectancy. School districts are under pressure from their respective state education departments and federal education mandates to select professional development that is substantively different from the professional development of the past. They must seek out robust research data and practical applications to meet the cognitive demands of new science curriculum standards like NGSS. This research grappled with the growing demands for innovative approaches to professional development that are theoretically grounded and supported by research.

The role of self-efficacy, at the heart of social cognitive theory, is a compelling intervening factor between learning and future performance and has been established by a multitude of research in many different contexts, including teacher development. Designing in-service professional development that focuses not only on pedagogical content knowledge but also on self-efficacy may provide a sound framework for understanding teacher growth. Practical tools such as feedback, access to content knowledge experts, and collegial support can be used to foster positive efficacy beliefs, enhance teacher content knowledge, and ultimately raise student achievement.

Therefore, in light of the preceding arguments, and in order to fully understand the impact of MI on teacher knowledge, the following research questions were examined:

1. What is the impact of participation in the MI professional development on content

knowledge, self-efficacy, and outcome expectancy?

2. What is the degree to which years of teaching experience, gender, highest level of

educational attainment, and teaching in content predict: a) content knowledge before MI;

b) content knowledge after MI; and c) growth in content knowledge?

3. What is the degree to which years of teaching experience, gender, highest level of

educational attainment, and teaching in content predict: a) self-efficacy before MI; b)

self-efficacy after MI; and c) growth in self-efficacy?

4. What is the degree to which years of teaching experience, gender, highest level of

educational attainment, and teaching in content predict: a) outcome expectancy before

MI; b) outcome expectancy after MI; and c) growth in outcome expectancy?

CHAPTER III. METHODOLOGY

Introduction

This study examined the change in science content knowledge, personal science teacher efficacy, and science teacher outcome expectancy following participation in MI professional development. Pre and post-treatment data were examined. The study also examined the degree to which years of teaching experience, gender, the highest level of educational attainment, and teaching in content area predict content knowledge, self-efficacy, and outcome expectancy both before and after participation in MI professional development. This chapter describes the study's research design, the participants, the methods used for data collection, and each research question in depth.

Research Design

This study utilized a pretest-posttest, treatment-only, quasi-experimental design. Quasi-experiments aim to evaluate interventions using non-randomized groups (Creswell, 2009). For this study, the selection of in-service teachers consisted of those who enrolled in a summer MI workshop. Modeling Instruction professional development workshops occur during two- or three-week sessions in the summer. Workshop leaders were classroom science teachers with experience using MI pedagogy in their classrooms. Workshop participants experienced the MI methodology as learners, while leaders focused on enhancing their scientific, pedagogical content knowledge. Participants were actively engaged in fundamental elements of MI as they engaged in data collection through laboratory experimentation. Participants worked collaboratively in small groups to develop models and practice modeling techniques.

Participants were also provided time to practice questioning and discourse management strategies and to receive feedback from workshop leaders. Through observation of MI experts, participants saw the impact of whole-class scientific discourse in making sense of the concepts under investigation. Teachers cannot learn MI by merely reading a book. The process of becoming a Modeling Instructor includes learning through direct interaction with other teachers and classroom practice (Hestenes, 2013).

The Modeling Cycle organizes teaching into two stages: model development and model deployment. A typical Modeling Cycle is about two weeks in length and includes whiteboard-mediated modeling discourse, in which students report and discuss outcomes and anchor shared understanding. Each modeling cycle generates a model that can solve a multitude of problems. Modeling Instruction professional development provides teachers an opportunity to practice the Modeling Cycle and to learn how to utilize the necessary pedagogical tools to help their students construct models in their classrooms.

A pre-post evaluation strategy examined participants before and after MI training to determine if this intervention improved teachers’ content knowledge and could be used to predict self-efficacy and outcome expectancy. Gain scores allowed for conclusions about the effect of

MI and the improvement of teacher content knowledge, self-efficacy, and outcome expectancy.

Participants

This study included data from 567 certified, practicing secondary science teachers who have participated in the MI professional development designed for physical science teachers in grades 9-12. Physical science teachers focus on content in physics or chemistry. The participants selected for this study were those who voluntarily participated in MI professional development. Over 40 states have offered MI professional development through the American

Modeling Teachers Association, which maintains a national data set collected over the past twenty years. This study utilized data from 2016 to 2018.

Modeling Instruction professional development is open to all science teachers across the country, with workshops currently offered in 14 states. According to the National Science

Teachers Association website (2019), there are 111,000 science teachers at the high school level.

Nationwide, 47% of high school chemistry teachers are women, and 70% of physics teachers are men. Most teachers (82%) at the high school level have a B.S. or related science degree (NSB,

2014).

Modeling Instruction professional development enrollment identified the participants for this study. This study utilized a convenience sample of participants. Since these workshops are optional, not mandatory, the sample is nonrandomized.

The researcher sought approval to conduct this study from Bowling Green State

University's Human Subjects Review Board by completing the appropriate paperwork. The researcher did not directly interact with the participants and did not obtain data through direct intervention; therefore, the researcher requested the exempt category for this study (see Appendix A). No personal identifiers were associated with the secondary data set utilized in this study, rendering the participants entirely anonymous to the researcher.

Instrumentation

Data collection for this research occurred both before and after teachers participated in

MI professional development. The Concept Inventory assessed content knowledge, while self-efficacy and outcome expectancy were measured using the Science Teacher Attitudes (STA) survey. The questions on this survey were based on those found on the STEBI-A. The STEBI instrument has two versions, STEBI-A and STEBI-B. The recommended use of the STEBI-A questionnaire is with an in-service teacher population, while the STEBI-B questionnaire is best used with a pre-service teacher population (Bleicher, 2004; Enochs & Riggs, 1990).

Further description of the STA survey and the Concept Inventory follow.

Science Teacher Attitudes Survey

The original STEBI-A is a 23-item instrument explicitly used to assess the efficacy of teaching science and has been extensively studied (Tschannen-Moran et al., 1998). Two distinct subscales exist for the STEBI-A. The first subscale, personal science teacher efficacy (PSTE), consisted of 13 items and examined teachers' beliefs in their capacity to teach science. The second subscale, science teacher outcome expectancy (STOE), measured teachers' beliefs that effective methods of teaching science can influence student achievement. Twelve items measured the STOE subscale. The STEBI-A survey has been referenced and utilized in many studies, and the data have been found to have an alpha of 0.92 for the original 13 PSTE items and an alpha of 0.77 for the 12 original STOE items (Riggs & Enochs, 1989).

This study utilized an adapted version of the STEBI-A called the Science Teacher

Attitudes (STA) survey. Personal Science Teacher Efficacy and Science Teacher Outcome

Expectancy for this research are both measured using survey questions from the STA. The STA survey consisted of 14 PSTE items and 10 STOE items. This survey also collected the number of years of experience, along with other demographic information. See Appendix B for a copy of the STA survey questions based on the STEBI-A. Questions that are negatively worded must be reverse scored in order to produce values consistent with those of positively worded questions. Reversing the scores on these items produced high scores for those with high PSTE and STOE beliefs and low scores for those with low PSTE and STOE beliefs.

Data collection from 2016 to 2018 occurred electronically. The STA survey questions are separate from the CI questions. Generation of survey access codes for each participant occurred before participation in the MI professional development. Participants were invited to complete the survey by using an email invitation through LimeSurvey that provided them with a survey link. Surveys were open to receive answers before or at the beginning of the professional development.

The twenty-four question survey measured PSTE and STOE by utilizing a five-point

Likert scale on a continuum from strongly disagree (1) to strongly agree (5). Respondents are offered the choice uncertain if they neither agree nor disagree (3). A response of ‘‘strongly agree’’ on positively worded questions were given a value of 5. A response of ‘‘agree’’ was assigned a value of 4. A response of uncertain was assigned a value of 3, and so on. Negatively worded questions have reverse scoring. Based on this scoring, the total sum ranges from 24 to

120. For self-efficacy, 14 items had scores that could range from 14 to 70. Ten questions measure outcome expectancy with scores ranging from 10 to 50. Higher scores correspond with higher self-efficacy and outcome expectancy.
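To make this scoring concrete, the sketch below shows how reverse coding and subscale sums of this kind can be computed in Python. It is illustrative only; the item groupings and the reverse-scored item numbers are placeholders rather than the actual STA assignments (see Appendix B):

    import pandas as pd

    LIKERT_MAX = 5  # five-point scale: strongly disagree (1) ... strongly agree (5)

    # Placeholder item groupings -- the real PSTE/STOE assignments are in Appendix B/E.
    PSTE_ITEMS = [f"q{i}" for i in range(1, 15)]    # 14 PSTE items -> sums range 14-70
    STOE_ITEMS = [f"q{i}" for i in range(15, 25)]   # 10 STOE items -> sums range 10-50
    REVERSED_ITEMS = ["q3", "q7", "q10", "q20"]     # hypothetical negatively worded items

    def score_sta(responses: pd.DataFrame) -> pd.DataFrame:
        """Reverse-code negatively worded items, then sum each subscale."""
        scored = responses.copy()
        # Reverse coding on a 1-5 scale: new value = 6 - old value.
        scored[REVERSED_ITEMS] = (LIKERT_MAX + 1) - scored[REVERSED_ITEMS]
        scored["PSTE"] = scored[PSTE_ITEMS].sum(axis=1)
        scored["STOE"] = scored[STOE_ITEMS].sum(axis=1)
        return scored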

Concept Inventory

To measure content knowledge, teachers took a Concept Inventory (CI) of primary physical science content. The CI content is based on input from many science teachers and professors and is constructed based on a variety of common misconceptions (Hestenes &

Halloun, 1995). The CI for physics, in particular, has been used many times by others in the field. Therefore, the reliability of the CI is high. Lasry, Rosenfield, Dedic, Dahan, and Reshef

(2011) determined that the overall reliability coefficient for the Force Concept Inventory (FCI), designed explicitly for physics questions, was 0.865. In another study, the FCI was validated by correlating students' scores with their scores on the Force and Motion Conceptual Evaluation, an instrument specifically used to identify Newtonian and prevalent student views of force and motion (Thornton, Kuhl, Cummings, & Marx, 2009).

The CI used for this study is either the Force Concept Inventory (FCI) for physics content questions or the Assessment of Basic Chemistry Concepts (ABCC) for chemistry content questions. The FCI consists of 30 multiple choice questions that have four to five choices for each question. Many studies utilize the FCI as the standard instrument for evaluating the conceptual understanding of physical science content (Belcher, 2017; Desbien, 2002; Hake,

1998; Hestenes et al., 2011; Hestenes et al., 1992; Jackson, 2010; Megowan, 2007; Wright,

2012). The ABCC consists of 27 questions, with three to five choices for each question. Scores were converted into percent scores to place both CI scales on a comparable basis. The CI is designed to probe students' understanding of science concepts and examine their ability to discriminate fundamental aspects of science. See Appendix C for sample ABCC chemistry content questions and Appendix D for sample FCI physics content questions.

The full version of each CI is available through the American Modeling Teachers Association website and is password protected.
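Because the two inventories have different lengths (30 FCI items versus 27 ABCC items), raw scores were placed on a common percent scale before comparison; a minimal sketch of that conversion, using hypothetical raw scores:

    def ci_percent(correct: int, total_items: int) -> float:
        """Convert a raw Concept Inventory score to a percent score."""
        return 100.0 * correct / total_items

    print(ci_percent(24, 30))  # FCI example: 80.0
    print(ci_percent(24, 27))  # ABCC example: approximately 88.9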

The CI was administered at the beginning of the workshop. Again, data collection from 2016 to

2018 occurred electronically. The number of correct responses out of 30 allowed for the computation of each participant's score. Teachers with a higher total score are presumed to possess higher content knowledge than those with a lower total score. A paired sample t-test examined the mean difference in content knowledge before and after participation in the MI workshop.

Procedures

Responses to the survey questions were collected before and after participation in MI professional development. Teachers were given access to complete the survey just before the professional development and immediately after the completion of the professional development.

Workshop leaders provided participants with electronic access to surveys. The STA survey took approximately 15 minutes for the participants to complete.

Alongside the STA survey, responses to the CI were also collected before and after participation in the MI professional development. Teachers were again given access to complete the survey both at the start of the professional development and immediately after its completion. Workshop leaders administered the CI to participants at each workshop location.

The CI took approximately thirty minutes for completion for each participant.

The MI professional development spans the course of two or three weeks resulting in 80 to 120 contact hours. Participants were engaged in a series of Modeling Cycles in student mode, mimicking how their students will encounter the material. Participants practiced with the development of scientific models and deployed them in new situations throughout the workshop.

Participants also engaged in professional conversations in teacher mode facilitated by workshop leaders and other participants. The article discussions were meant to challenge traditional teaching methods and develop a sense of urgency in adopting MI pedagogy. Table 1 reviews a sample agenda for a two-week workshop, illustrating the timing of pre and posttests, model development, model deployment, and professional conversations.

Research Questions

1. What is the impact of participation in the MI professional development on content

knowledge, self-efficacy, and outcome expectancy?

2. What is the degree to which years of teaching experience, gender, highest level of

educational attainment, and teaching in content predict: a) content knowledge before MI;

b) content knowledge after MI; and c) growth in content knowledge?

3. What is the degree to which years of teaching experience, gender, highest level of

educational attainment, and teaching in content predict: a) self-efficacy before MI; b)

self-efficacy after MI; and c) growth in self-efficacy?

4. What is the degree to which years of teaching experience, gender, highest level of

educational attainment, and teaching in content predict: a) outcome expectancy before

MI; b) outcome expectancy after MI; and c) growth in outcome expectancy?

Table 1

Example Agenda for Modeling Instruction professional development

Day Activities

1 • Introductions, go over schedule, and establish norms • Participants complete the Concept Inventory and adapted STEBI-A survey (pretest) • Participants enter student mode to develop the first model and to engage in the modeling cycle • Participants enter teacher mode to talk about and answer questions about the day • Leaders assign reading for homework to begin to develop a sense of urgency with teacher participants to adopt MI methods

2 • Article discussion from previous night's homework • Review the contents of the binder provided to the participants • Participants enter student mode to continue to develop the model from Day 1 • Participants enter teacher mode to talk about and answer questions about the day • Leaders assign reading for homework

3-5 • Participants engage in a discussion of the article and begin to flesh out their ideas about MI • Participants enter student mode to deploy the model from Day 1 • Review the first unit of material from the binder provided to the participants • Participants enter teacher mode to talk about and answer questions about the day • Leaders assign reading for homework

6-7 • Reflection on the workshop so far • Revisit the norms and schedule for the week • Participants enter student mode • Participants enter teacher mode to talk about and answer questions about the day • Leaders assign reading for homework

8-9 • Article discussion from previous night's homework • Participants enter student mode to continue development of the next model and engage in the modeling cycle • Participants begin the process of leading classroom discourse and whiteboard sessions • Participants enter teacher mode to talk about MI • Leaders assign reading for homework to reflect on MI methods

10 • Article discussion from previous night's homework • Wrap up of the week and discuss an overview of the unit • Discuss where participants go from this point in model development • Participants complete the Concept Inventory and adapted STEBI-A survey (posttest)

Data Analysis

The quantitative data for this research was downloaded into the Statistical Package for

Social Sciences (SPSS Version 23) to record the values for each variable and to prescreen the data. Computation of subscale scores from their respective items occurred after the examination of missing data and outliers. Table 2 presents all of the subscales, variables, their associated items, and calculation methods.

Table 2

Variables, Values, and Source

Variable Score/Items Values Source

Content knowledge Scale 0-30 Concept Inventory

PSTE Scale 1 = Strongly disagree STA survey (sum of items) 2 = Disagree 3 = Uncertain 4 = Agree 5 = Strongly agree

STOE Scale 1 = Strongly disagree STA survey (sum of items) 2 = Disagree 3 = Uncertain 4 = Agree 5 = Strongly agree

Years of experience Actual number STA survey

Gender Categorical 1 = Male STA survey 0 = Female

Education level Categorical 1 = Bachelor's degree STA survey 2 = Masters 3 = Doctoral

Teaching in content Categorical 1 = Yes STA survey 0 = No

Research Question One

Research question one examined group differences to determine if there was a causal relationship between participation in MI professional development and each quantitative dependent variable (content knowledge, self-efficacy, and outcome expectancy). The groups examined included pre and post participation in MI professional development. A paired-samples t-test was used to analyze the same participants over time. The paired t-test determined whether the mean difference between the pretreatment and posttreatment scores for content knowledge, self-efficacy, and outcome expectancy was zero. The effect size was computed to measure the magnitude of the treatment effect. Boxplots were used to detect univariate outliers within the data set. Examination of z-scores, to determine whether 99% of the content knowledge, self-efficacy, or outcome expectancy scores fell within plus or minus three standard deviations of the mean, helped to identify outliers within the data. Scores beyond the 99% limit were considered outliers and caused the removal of those participants from the data set. A test of the assumptions of normality and homogeneity occurred before the paired-samples t-test on gain scores. The generation of descriptive statistics followed data screening. These included the mean, standard deviation, and sample size for each quantitative variable (content knowledge, self-efficacy, and outcome expectancy).
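The analysis described above can be sketched in Python as follows. The column names and data file are hypothetical stand-ins for the study's SPSS workflow, and the sketch assumes complete pre/post records:

    import numpy as np
    import pandas as pd
    from scipy import stats

    def paired_analysis(df: pd.DataFrame, pre_col: str, post_col: str):
        """Screen outliers on the gain score, then run a paired t-test with Cohen's d."""
        gain = df[post_col] - df[pre_col]

        # Univariate outlier screening: drop cases whose gain z-score exceeds |3|.
        z = (gain - gain.mean()) / gain.std(ddof=1)
        kept = df[np.abs(z) <= 3]

        # Paired-samples t-test on the retained cases.
        t_stat, p_value = stats.ttest_rel(kept[post_col], kept[pre_col])

        # Cohen's d for paired data: mean difference / SD of the differences.
        diff = kept[post_col] - kept[pre_col]
        cohens_d = diff.mean() / diff.std(ddof=1)
        return t_stat, p_value, cohens_d

    # Hypothetical usage, assuming columns named "pste_pre" and "pste_post":
    # df = pd.read_csv("mi_workshop_scores.csv")
    # print(paired_analysis(df, "pste_pre", "pste_post"))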

Research Questions Two, Three, and Four

Forward multiple regression was used to examine research questions two, three, and four.

Variables added to the model identified which ones increased R2 the most. Variables of greater theoretical importance were entered first into the predictive model. The dependent variables, content knowledge, self-efficacy, and outcome expectancy, were quantitative. The independent variables entered into the predictive model were chosen from years of teaching experience, gender, the highest level of educational attainment, and teaching in the content area. Multiple regression is an extension of simple linear regression with more than one independent variable.

The objective of multiple regression is to explain more of the variation of the dependent variable

(in this case, content knowledge, SE, or OE) than with simple linear regression. When several predictors are included, the relationship of each predictor (years of experience, gender, the highest level of educational attainment, and teaching in the content area) with the dependent variables can be calculated independent of the other predictors. Unlike simple linear regression, a multiple regression equation does not produce a single slope; it produces a coefficient for each independent variable. An R2 value reveals the best set of independent variables accounting for the most variation (Mertler & Vannatta, 2010).
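A forward-selection multiple regression of the kind described here can be sketched as follows in Python with statsmodels. The predictor and outcome column names are hypothetical, and the study itself conducted these analyses in SPSS rather than with this code:

    import pandas as pd
    import statsmodels.api as sm

    PREDICTORS = ["experience", "gender", "degree", "alignment"]  # hypothetical column names

    def forward_select(df: pd.DataFrame, outcome: str, alpha: float = 0.05):
        """Add, one at a time, the predictor that most increases R-squared."""
        selected, remaining = [], list(PREDICTORS)
        while remaining:
            # Fit a candidate model for each remaining predictor; keep the best R-squared.
            fits = {
                var: sm.OLS(df[outcome], sm.add_constant(df[selected + [var]])).fit()
                for var in remaining
            }
            best_var, best_fit = max(fits.items(), key=lambda kv: kv[1].rsquared)
            # Stop when the newly added predictor is not statistically significant.
            if best_fit.pvalues[best_var] > alpha:
                break
            selected.append(best_var)
            remaining.remove(best_var)
        final = sm.OLS(df[outcome], sm.add_constant(df[selected])).fit() if selected else None
        return selected, final  # final.rsquared, final.rsquared_adj, final.f_pvalue are reported

    # Hypothetical usage:
    # df = pd.read_csv("mi_predictors.csv")
    # chosen, model = forward_select(df, "ck_growth")
    # print(chosen, model.summary())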

The deletion of participant variables with missing or incomplete data was necessary when significant problems arose in the analysis. The calculation of the mean value allowed for the estimation of any missing values for the years of experience (Mertler & Vannatta, 2010). After pre-screening the data, a check for a linear relationship between the dependent variable and each independent variable was necessary. Examination of a scatterplot of each independent variable versus the dependent variable showed whether there was a linear correlation. This scatterplot also revealed the direction of the relationship. For each combination of independent variables, the dependent variables should reveal a normal distribution with constant variance. Curvilinear relationships may need transformation, as plots that display more elliptical shapes are representative of linearity and normality. Residual plots within a preliminary regression examine multivariate normality and homoscedasticity. Values consistently spread out on the residual plots are indicative of normality and homoscedasticity (Mertler & Vannatta, 2010).

Multicollinearity is present when independent variables used to explain the dependent variable account for similar portions of variance, making it difficult to determine their individual effects (Mertler & Vannatta, 2010). Examination of a scatterplot will show the relationship of each independent variable to the others. A strong linear relationship among the independent variables sets up the potential for multicollinearity.

Tolerance indicates the strength of the linear relationships among the independent variables. A tolerance value close to zero indicates that the variable is almost a linear combination of the other independent variables and is considered to have multicollinearity. The tolerance value should be closer to one to avoid multicollinearity (Mertler & Vannatta, 2010).
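Tolerance is commonly obtained as the reciprocal of the variance inflation factor (VIF), which comes from regressing each predictor on the remaining predictors. A Python sketch under the same hypothetical column names as above:

    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    def tolerance_table(df: pd.DataFrame, predictors: list) -> pd.DataFrame:
        """Return VIF and tolerance (1/VIF) for each predictor."""
        exog = sm.add_constant(df[predictors])
        rows = []
        for i, name in enumerate(exog.columns):
            if name == "const":
                continue
            vif = variance_inflation_factor(exog.values, i)
            rows.append({"predictor": name, "VIF": vif, "tolerance": 1.0 / vif})
        return pd.DataFrame(rows)

    # Tolerance values near 1 suggest little multicollinearity; values near 0 are a warning sign.
    # print(tolerance_table(df, ["experience", "gender", "degree", "alignment"]))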

There were several outputs generated through multiple regression. An R2 value reveals the strength of the relationship between the dependent variable and the combined set of independent variables. The R2 value indicates the percentage of variation in the response explained by the model. Models with larger R2 values have better predictive ability

(Mertler & Vannatta, 2010). The adjusted R squared (R2adj) accounts for the number of independent variables in the study. Converted into a percentage, R2adj represents the proportion of variance in self-efficacy, outcome expectancy, and content knowledge scores that is explained by years of experience, gender, highest degree attainment, and content alignment (Mertler & Vannatta, 2010).

Multiple regression generated a p-value for each independent variable. This p-value tests the null hypothesis (H0) that a particular beta (coefficient) is equal to zero. A low p-value allows one to reject the null hypothesis, indicating that the independent variable is likely to be meaningful to the regression model because changes in that independent variable are related to changes in the dependent variable.

The ANOVA summary table indicates the significance of the model in predicting each dependent variable. The null hypothesis is rejected if at least one coefficient does not equal zero. The

ANOVA tests all of the slopes (i.e., the coefficients) at once to keep the probability of a Type 1 error low (at an alpha level of 0.05, or 5%). The F-value and level of significance tell whether or not the model is significantly related to the dependent variable in the population (Mertler &

Vannatta, 2010). Table 3 summarizes the data analysis techniques for each research question.

Table 3

Research Questions, Variables, and Data Analyses

Research Question 1. What is the impact of participation in MI professional development on content knowledge, self-efficacy, and outcome expectancy?
Independent Variable(s): Participation in MI
Dependent Variable(s): a) CK; b) PSTE; c) STOE
Data Analysis: Paired-samples t-test

Research Question 2. What is the degree to which years of teaching experience, gender, highest level of educational attainment, and teaching in content predict: a) content knowledge before MI; b) content knowledge after MI; and c) growth in content knowledge?
Independent Variable(s): Teaching experience; Gender; Education level; Teaching in content
Dependent Variable(s): a) Pre CK; b) Post CK; c) CK Growth
Data Analysis: Multiple regression

Research Question 3. What is the degree to which years of teaching experience, gender, highest level of educational attainment, and teaching in content predict: a) self-efficacy before MI; b) self-efficacy after MI; and c) growth in self-efficacy?
Independent Variable(s): Teaching experience; Gender; Education level; Teaching in content
Dependent Variable(s): a) Pre SE; b) Post SE; c) SE Growth
Data Analysis: Multiple regression

Research Question 4. What is the degree to which years of teaching experience, gender, highest level of educational attainment, and teaching in content predict: a) outcome expectancy before MI; b) outcome expectancy after MI; and c) growth in outcome expectancy?
Independent Variable(s): Teaching experience; Gender; Education level; Teaching in content
Dependent Variable(s): a) Pre OE; b) Post OE; c) OE Growth
Data Analysis: Multiple regression

Study Assumptions

Participants are presumed to have responded to the survey honestly and without the use of outside sources to answer the questions. Anonymity and confidentiality were preserved for each participant; participants volunteered to complete the surveys and could withdraw from the study at any time with no ramifications. Teachers are expected to have accurately reported their years of teaching experience, gender, teaching content, and highest level of educational attainment. The researcher assumes the participants answered the STOE and PSTE survey questions honestly. Correlational studies do not establish cause and effect. The selection of the samples and the distribution of the populations are assumed to be normal. All participants are presumed to have experienced similar exposure to MI professional development.

CHAPTER IV. RESULTS

The purpose of this study was to determine the change in personal science teacher efficacy, science teacher outcome expectancy, and content knowledge as a result of participation in MI professional development. The study also examined which science teacher demographic variables (years of teaching experience, gender, the highest level of educational attainment, and teaching in the content area) best predicted CK, PSTE, and STOE, both before and after participation in MI professional development. Self-reported surveys allowed the researcher to collect and analyze the data. Personal Science Teacher Efficacy and Science Teacher Outcome

Expectancy were measured using items from the Science Teacher Attitudes survey. Appendix E presents the question items for each subscale. Content knowledge was measured using either the

Force Concept Inventory (Halloun, Hake, Mosca, & Hestenes, 1995) or the Assessment of Basic

Chemistry Concepts (Royce, 2012). Both descriptive and inferential statistics generated from the research data examined the four research questions. This chapter will review those statistics and present a summary of the results for this nonrandomized, quasi-experimental study.

Descriptive Statistics

Only completed data sets informed this study. A complete data set included both pre and post-Science Teacher Attitudes Surveys, as well as both pre and post Concept

Inventories. Teachers from the following states provided the data for this study: Alabama,

Arizona, Arkansas, California, Colorado, Illinois, Indiana, Iowa, Kansas, Maine, Maryland,

Massachusetts, Michigan, Minnesota, New Jersey, New York, Ohio, Tennessee, Texas,

Vermont, and Virginia. Data for each variable were obtained from 68 workshops from 2016 to

2018, with more than 1,000 participants invited to take the surveys at the beginning of the professional development. Eighteen of those workshops submitted complete data sets from the

MI chemistry professional development, and 23 from physics workshops, yielding 567 participants in the final data set (N = 567) used for this study. The removal of participants from the study occurred for those with incomplete data sets.

As indicated in Table 4, more participants had a Master’s degree (n = 353) than a

Bachelor’s (n = 173) or Doctoral degree (n = 41). Fifty percent of the participants were women, and forty-seven percent were men. Three percent of the responders indicated other for gender.

Most of the teachers (77.4%) were teaching the content for which they were prepared to teach through their college preparation. The number of years of teaching experience ranged from zero to forty-five years.

Table 4

Sample Size and Percentage for Gender, Highest Degree, and Teaching Content (n=567)

Characteristic N %

Gender Female 285 50.3

Male 267 47.1

Other 15 2.6

Highest Degree Bachelors 173 30.5

Masters 353 62.3

Doctoral 41 7.2

Teaching in Content Yes 439 77.4

No 128 22.6

Figure 2 depicts the breakdown of teaching experience into four categories for participants. Categories include 0 to 3 years, 4 to 9 years, 10 to 19 years, and 20 years or more of teaching experience. The category with the most participants, 4 to 9 years of experience, is comparable in size to the 0 to 3 and 10 to 19 years of experience categories (see Figure 2).


Figure 2. Number of teachers by years of experience.

Figure 3 displays the number of teachers by experience for each gender. The category with the most female teachers was four to nine years of experience, while among males, the largest group had zero to three years of experience. Both the male and female groups included between 45 and 55 participants with 20 or more years of teaching experience.


Figure 3. Number of teachers by gender and teaching experience.

A comparison between gender and content alignment (Figure 4) revealed that the number of female teachers aligned with their content (212) was very similar to the number of males

(214). There were differences, however, between females and males not aligned with their content. There were twenty more females not aligned with their content than males.


Figure 4. Number of teachers by content alignment and gender.

When comparing years of experience with content alignment, the largest group of teachers aligned with their content had 10 to 19 years of experience, while the largest group of teachers not aligned with their content had 4 to 9 years of experience.

The data also showed that, among teachers with 20 or more years of experience, a higher number were aligned with their content than not (see Figure 5).


Figure 5. Number of teachers by content alignment and years of experience.

Figure 6 illustrates the number of teachers for each gender based on content alignment and years of teaching experience. More females with 4 to 9 years of experience were not aligned with their content, while teachers with 10 to 19 years of experience were aligned. Males with 0 to 3 years of experience represented the largest category of teachers not aligned with their content. Two categories were very similar in the number of males aligned with content, 0 to 3

(60) and 4 to 9 (59).


Figure 6. Number of teachers by content alignment, gender, and years of experience.

The descriptive statistics, shown in Table 5, reveal pretest, posttest, and growth scores for

PSTE, STOE, and CK. The mean posttest scores for each subscale show a positive change from the pretest scores. Growth scores reveal the largest growth for CK (M =

5.52), followed by PSTE (M = 4.03), with the smallest mean growth for STOE (M = 1.13).

Table 5

Descriptive Statistics of Pre and Post Subscales

Pre Post Growth

M SD M SD M SD

PSTE 50.31 7.89 54.34 6.96 4.03 6.06

STOE 33.96 3.85 35.10 3.58 1.13 3.76

CK 79.80 20.75 85.32 16.72 5.52 11.43
Note. PSTE = Personal Science Teacher Efficacy; STOE = Science Teacher Outcome Expectancy; CK = Content Knowledge.

Descriptive statistics were generated for each independent demographic variable, including gender, years of teaching experience, level of educational attainment, and teaching in the content area of teacher preparation (see Table 6). The mean number of years of teaching experience was 10.13 years. Although the remaining independent variables were categorical, they were treated quantitatively for inferential analysis. Table 6 includes means and standard deviations for each independent variable. Gender was coded 1 for females and 2 for males for statistical analysis. The mean value for gender (M = 1.48) indicated an almost equal representation of males and females within the MI professional development. The highest degree earned among participants had a mean value of 1.75, corresponding with a value between a Bachelor's (value =

1) and a Master's degree (value = 2). The mean value for content alignment was M = 1.78 (value one = no, value two = yes), indicating that approximately 78% of participants were aligned with the content for which they were prepared to teach.

Table 6

Descriptive Statistics of Predictor Variables

M SD

Gender 1.48 0.50

Experience 10.13 8.03

Degree 1.75 0.54

Alignment 1.78 0.41
Note. N = 566.

Inferential Statistics

Before conducting the statistical analysis using SPSS (version 23) for each research question, four negatively worded questions on the Science Teacher Attitudes survey were reverse coded. Reverse scoring produced consistent values between positively and negatively worded questions, so that high scores corresponded to high PSTE and STOE beliefs and low scores to low PSTE and STOE beliefs. Multiple regression analyses were used to answer research questions two, three, and four. The four demographic variables (years of teaching experience, gender, the highest level of educational attainment, and teaching in the content area) served as the predictor variables for those analyses and were used to predict self-efficacy, outcome expectancy, and content knowledge growth scores. This study checked the assumptions of normality, homogeneity of variance, and multicollinearity. Effect size was assessed using R2 values, which represent the proportion of variance in self-efficacy, outcome expectancy, or content knowledge scores explained by the combination of predictor variables entered into the model (Mertler & Vannatta, 2010).
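To make the reverse-coding step concrete, the short Python sketch below shows one way negatively worded Likert items could be recoded before analysis; the item names, the 1-5 response range, and the example data are hypothetical illustrations, since the study itself carried out this step in SPSS.

import pandas as pd

# Hypothetical column names standing in for the four negatively worded STA items.
NEGATIVE_ITEMS = ["sta_q3", "sta_q10", "sta_q13", "sta_q21"]

def reverse_code(df, items=NEGATIVE_ITEMS, scale_min=1, scale_max=5):
    """Reverse-score negatively worded Likert items so that high values
    consistently indicate stronger PSTE/STOE beliefs."""
    out = df.copy()
    for item in items:
        # On a 1-5 scale, a response r becomes (1 + 5) - r.
        out[item] = (scale_min + scale_max) - out[item]
    return out

# Illustrative usage on a small made-up response set.
survey = pd.DataFrame({"sta_q3": [1, 2, 5], "sta_q10": [4, 3, 1],
                       "sta_q13": [2, 2, 4], "sta_q21": [5, 1, 3]})
print(reverse_code(survey))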

Research Question One

Research question one examined the impact of participation in the MI professional development on science teachers' content knowledge, self-efficacy, and outcome expectancy. A paired-samples t-test was used to examine the pre and posttest scores for PSTE, STOE, and

CK. The purpose of the analysis was to determine whether the mean difference between the paired samples (pretest and posttest scores) was significantly different from zero. Normality was examined before conducting the analysis. Table 7 presents the results of this test. Each of the three paired samples generated positive t values that indicated a significant difference in scores for PSTE (M=4.03, SD=6.06; t(566)=15.83, p<.0001), STOE (M=1.13, SD=3.76; t(566)=7.19, p<.0001), and CK (M=5.52, SD=11.43; t(566)=11.49, p<.0001). The p-value for each dependent variable is less than .0001, so the null hypothesis of equal pretest and posttest scores was rejected.

The largest difference was for PSTE, followed by CK, and finally STOE. These results suggest that participation in MI professional development positively impacts a participant's PSTE, STOE, and CK. Cohen's d expresses the magnitude of the difference between the pretest and posttest scores and is calculated as the mean of the pretest-posttest differences divided by the standard deviation of those differences. Cohen's d indicated a medium effect for the PSTE (0.67), CK (0.48), and STOE (0.30) dependent variables (Cohen, 1988).

Table 7

Paired t-test Results for Dependent Variables

                     Paired Differences
         M       SD       95% CI              t        p         Cohen's d
                          Lower     Upper
PSTE     4.03    6.06     3.53      4.52      15.83    <.0001    .665
STOE     1.13    3.76     0.82      1.44      7.19     <.0001    .302
CK       5.52    11.43    4.58      6.46      11.49    <.0001    .484
Note. PSTE = Personal Science Teacher Efficacy; STOE = Science Teacher Outcome Expectancy; CK = Content Knowledge; n = 566.
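A minimal Python sketch of the paired-samples analysis summarized in Table 7 follows, assuming pretest and posttest scores are held in two equal-length arrays; the simulated data and variable names are illustrative only and do not reproduce the study's actual data files.

import numpy as np
from scipy import stats

def paired_summary(pre, post, alpha=0.05):
    """Paired t-test, 95% CI of the mean difference, and Cohen's d
    (mean of the differences divided by the SD of the differences)."""
    diff = np.asarray(post) - np.asarray(pre)
    n = diff.size
    m, sd = diff.mean(), diff.std(ddof=1)
    t_stat, p_val = stats.ttest_rel(post, pre)
    margin = stats.t.ppf(1 - alpha / 2, df=n - 1) * sd / np.sqrt(n)
    return {"mean_diff": m, "sd_diff": sd, "ci": (m - margin, m + margin),
            "t": t_stat, "p": p_val, "cohens_d": m / sd}

# Illustrative call with simulated scores of the same length as the sample (n = 567).
rng = np.random.default_rng(0)
pre = rng.normal(50, 8, 567)
post = pre + rng.normal(4, 6, 567)
print(paired_summary(pre, post))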

Research Question Two

Research question two examined the degree to which years of teaching experience, gender, the highest level of educational attainment, and teaching in content predict: a) content knowledge before MI; b) content knowledge after MI; and c) growth in content knowledge. Multivariate outliers were identified using Mahalanobis' distance; 44 cases were eliminated because they exceeded the chi-square criterion of 7.81.
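One way the Mahalanobis-distance screening described above could be carried out in Python is sketched below; the 7.81 cutoff matches the chi-square criterion reported in this study, while the data frame and predictor column names are assumptions for illustration.

import numpy as np
import pandas as pd

def mahalanobis_screen(df, cols, cutoff=7.81):
    """Drop rows whose squared Mahalanobis distance on the predictor
    columns exceeds the chi-square cutoff used in the study."""
    X = df[cols].to_numpy(dtype=float)
    centered = X - X.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
    # Squared Mahalanobis distance for every row.
    d2 = np.einsum("ij,jk,ik->i", centered, inv_cov, centered)
    return df[d2 <= cutoff]

# Illustrative usage on hypothetical predictor columns:
# screened = mahalanobis_screen(survey_data, ["gender", "experience", "degree", "alignment"])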

The relationship between gender (as indicated on the Science Teachers Attitudes survey) and Content Knowledge scores (as measured by the Concept Inventory) was investigated using

Pearson product-moment correlation coefficient (Pearson r). Exploratory analyses were executed to establish that no violation of assumptions of normality, linearity, and homoscedasticity existed. There was a positive correlation between both pre, r = .218, p<.05, and posttest, r = .193, p<.05, Content Knowledge scores and gender. The results revealed that males had higher pretest and posttest Content Knowledge scores than females (see Table

8). There was a negative correlation between gender and Content Knowledge growth (r = -.112, p<.05), indicating that females experienced more growth than males due to the coding for gender (females = 1; males = 2).

A positive correlation between alignment and pretest scores (r = .153, p<.05) and posttest scores (r = .119, p<.05) was revealed, suggesting that teachers aligned with their content had higher Content Knowledge scores. A negative correlation between Content Knowledge growth and content alignment was found (r = -.101, p<.05), indicating that those who were not prepared to teach their assigned subject had higher growth scores. A positive relationship between degree and Content Knowledge pretest scores (r = .126, p<.05) as well as posttest scores (r = .097, p<.05) suggested that those with a higher degree had higher pretest and posttest scores. Finally, a negative Pearson r between degree and Content Knowledge growth (r = -.084, p<.05) indicates that those with a lower degree experienced more Content Knowledge growth.

Content Knowledge pretest scores indicated a positive correlation with experience (r =

.093, p<.05), suggesting that teachers with more teaching experience have higher Content

Knowledge scores. Content Knowledge growth scores revealed a negative correlation with experience (r = -.099, p<.05), suggesting that teachers with less experience acquired more

Content Knowledge growth after participating in MI professional development.

Table 8

Pearson Correlation Between Content Knowledge and Each Predictor

               Pre CK     Post CK    Growth CK
Gender         .218*      .193*      -.112*
Alignment      .153*      .119*      -.101*
Degree         .126*      .097*      -.084*
Experience     .093*      .046       -.099*
Note. * indicates significance at p < .05. CK = Content Knowledge. N = 523.
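The Pearson correlations in Table 8 could be computed along the lines of the sketch below, assuming a data frame with numeric predictor columns and the three Content Knowledge scores; all column names are illustrative.

import pandas as pd
from scipy import stats

def correlation_table(df, predictors, outcomes):
    """Pearson r between each predictor and each outcome, starring p < .05."""
    rows = []
    for pred in predictors:
        row = {"predictor": pred}
        for out in outcomes:
            r, p = stats.pearsonr(df[pred], df[out])
            row[out] = f"{r:.3f}{'*' if p < 0.05 else ''}"
        rows.append(row)
    return pd.DataFrame(rows)

# Illustrative call (column names hypothetical):
# print(correlation_table(data, ["gender", "alignment", "degree", "experience"],
#                         ["ck_pre", "ck_post", "ck_growth"]))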

Forward multiple regression was completed to determine which of the independent variables (gender [gender], years of teaching [experience], the highest level of educational attainment [degree], and teaching in content [alignment]) best predict Content Knowledge (pre, post, and growth). Data screening led to the elimination of 44 cases. Regression results revealed three predictors (gender, content alignment, and level of degree attainment) that significantly predict Content Knowledge pretest scores, F(3,519) = 15.09, p<.0001. Together, the three variables accounted for 8% of the variance in pretest Content Knowledge scores: gender accounted for 4.8%, content alignment for 2%, and degree for 1.2%.

Three of the predictors (gender, content alignment, and degree) significantly predicted posttest Content Knowledge scores, F(3,519)=10.44, p<.0001. Degree accounted for 0.8% of the variance in posttest Content Knowledge scores, alignment accounted for 1.2% of the variance, and gender accounted for 3.7%. Overall, gender, content alignment, and degree account for

5.7% of the variance in Content Knowledge posttest scores.

Finally, a two-factor model (gender and experience) was generated to predict Content Knowledge growth. The coefficients indicated that females with less teaching experience showed greater growth in Content Knowledge than other participants, F(2,520)=6.53, p<.002. Gender and experience each accounted for 1.2% of the variance; together they account for 2.4% of the variance in Content Knowledge growth scores. The tolerance level for all predictors of Content Knowledge was >.982; in other words, multicollinearity was not present in the predictor models, so the predictor variables can be interpreted when determining their effect on Content Knowledge scores. Table 9 presents a summary of the regression coefficients.

Table 9

Regression Predictors of Content Knowledge for Pre, Post, and Growth Scores

Pre CK            R       R2      B        β        Part     Partial
1. Gender         .218    .048    8.79     .213     .213     .217
2. Alignment      .261    .068    6.40     .128     .127     .131
3. Degree         .283    .080    4.30     .112     .111     .115
Post CK
1. Gender         .193    .037    6.29     .190     .189     .191
2. Alignment      .222    .049    3.97     .099     .098     .101
3. Degree         .239    .057    2.69     .087     .086     .089
Growth in CK
1. Gender         .112    .012    -2.81    -.122    -.121    -.122
2. Experience     .156    .024    -.158    -.110    -.110    -.099
Note. N = 523. CK = Content Knowledge.
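As a rough illustration of the forward regression procedure behind Tables 9 and 11, the sketch below adds, at each step, the candidate predictor that most improves R-squared and is significant at p < .05, stopping when no remaining predictor qualifies. This only approximates SPSS's forward method, and the entry criterion and column names are assumptions.

import pandas as pd
import statsmodels.api as sm

def forward_selection(df, outcome, candidates, alpha=0.05):
    """Greedy forward selection on OLS R-squared with a p-value entry criterion."""
    selected, remaining = [], list(candidates)
    while remaining:
        best = None
        for cand in remaining:
            X = sm.add_constant(df[selected + [cand]])
            fit = sm.OLS(df[outcome], X).fit()
            if fit.pvalues[cand] < alpha and (best is None or fit.rsquared > best[1]):
                best = (cand, fit.rsquared)
        if best is None:
            break
        selected.append(best[0])
        remaining.remove(best[0])
    return sm.OLS(df[outcome], sm.add_constant(df[selected])).fit()

# Illustrative call (column names hypothetical):
# model = forward_selection(data, "ck_growth", ["gender", "experience", "degree", "alignment"])
# print(model.summary())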

Research Question Three

Research question three examined the degree to which years of teaching experience, gender, the highest level of educational attainment, and teaching in content predict: a) self-efficacy before MI; b) self-efficacy after MI; and c) growth in self-efficacy.

The relationship between all potential predictors and the three dependent variables for

PSTE (pre, post, and growth), was investigated using the Pearson product-moment correlation coefficient (Pearson r). A positive correlation between gender and PSTE pretest scores, r = .189, p<.05, indicated that males had higher PSTE pretest scores than females (see Table 10). Growth in PSTE showed a negative correlation with gender, r = -.115, p<.05, indicating that females experienced more significant growth. Highest degree attainment revealed a positive correlation with pretest scores, r = .104, p<.05, and a negative correlation with growth, r = -.104, p<.05. These results suggested that those with a higher degree initially had higher PSTE scores, while those with a Bachelor's degree experienced more growth in efficacy scores. Content alignment showed a negative correlation with PSTE growth, r = -.097, p<.05, revealing that teachers who were not teaching in the science content for which they were prepared experienced more growth than their peers.

Table 10

Pearson Correlation Between Personal Science Teacher Efficacy and Each Predictor

               Pre PSTE    Post PSTE    Growth PSTE
Gender         .189*       .109         -.115*
Alignment      .037        -.037        -.097*
Degree         .104*       .029         -.104*
Experience     .023        -.025        -.063
Note. * indicates significance at p < .05. PSTE = Personal Science Teacher Efficacy.

This research conducted a forward multiple regression to determine which of the independent variables (gender [gender], years of teaching [experience], the highest level of educational attainment [degree], and teaching in content [alignment]) best predicted PSTE. Data screening led to the elimination of 44 cases. Regression results indicated that gender and degree significantly predict pretest PSTE scores, F(2,522)=12.80, p<.0001. Gender accounted for the most variance (3.6%), followed by degree (1.1%); overall, gender and degree account for 4.7% of the variance in PSTE pretest scores. Of the four predictors, only gender significantly predicted the posttest scores for PSTE, F(2,522)=15.51, p<.0001, meaning that males had higher PSTE posttest scores than females.

Gender and degree significantly predicted growth in PSTE scores, F(2,522)=6.43, p<.002, and both were negatively related to growth. Gender accounted for 1.3% of the variance and degree for 1.1%; overall, gender and degree account for 2.4% of the variance in PSTE growth scores. The negative coefficients suggested that females with a lower degree experienced more growth in PSTE scores. Table 11 summarizes the regression coefficients for this model. The tolerance level for all predictors of PSTE was >.894, indicating no multicollinearity; the predictors for PSTE can therefore be interpreted without substantial overlap among them.
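The tolerance values quoted above can be checked with statsmodels' variance inflation factor, since tolerance is simply 1/VIF; the data frame and column names below are assumed for illustration.

import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

def tolerance_table(df, predictors):
    """Tolerance (1 / VIF) for each predictor; values near 1 indicate
    little multicollinearity among the predictors."""
    X = sm.add_constant(df[predictors])
    rows = []
    for i, name in enumerate(predictors, start=1):  # index 0 is the constant
        vif = variance_inflation_factor(X.values, i)
        rows.append({"predictor": name, "VIF": vif, "tolerance": 1.0 / vif})
    return pd.DataFrame(rows)

# Illustrative call (column names hypothetical):
# print(tolerance_table(data, ["gender", "experience", "degree", "alignment"]))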

Table 11

Regression Predictors of Personal Science Teacher Efficacy for Pre, Post, and Growth Scores

Pre PSTE          R       R2      B        β        Part     Partial
1. Gender         .189    .036    2.81     .190     .190     .191
2. Degree         .217    .047    1.47     .107     .107     .109
Post PSTE
1. Gender         .109    .012    1.54     .109     .109     .109
Growth PSTE
1. Gender         .115    .013    -1.27    -.116    -.116    -.116
2. Degree         .155    .024    -1.07    -.105    -.105    -.106
Note. N = 523. PSTE = Personal Science Teacher Efficacy.

Research Question Four

Research question four examined the degree to which years of teaching experience, gender, the highest level of educational attainment, and teaching in content predict: a) outcome expectancy before MI; b) outcome expectancy after MI; and c) growth in outcome expectancy.

Table 12 presents the Pearson r correlations between STOE and each of the independent variables (gender, content alignment, highest degree attained, and years of experience). Two variables, gender and content alignment, showed a significant correlation with posttest STOE scores, suggesting that males aligned with the content for which they were prepared to teach possessed higher STOE scores. Multiple regression analysis did not reveal any predictors for STOE.

Table 12

Pearson Correlation Between Science Teacher Outcome Expectancy and Each Predictor

               Pre STOE    Post STOE    Growth STOE
Gender         .053        .217*        -.028
Alignment      -.031       .106*        .000
Degree         -.053       .078         .017
Experience     -.017       .007         .054
Note. * indicates significance at p < .05. STOE = Science Teacher Outcome Expectancy.

Chapter Summary

This chapter provided descriptive data for each dependent variable, Personal Science

Teacher Efficacy, Science Teacher Outcome Expectancy, and Content Knowledge. Modeling

Instruction professional development participant responses concerning gender, education level, years of experience, and alignment with teaching preparation were collected and reported. The researcher utilized data collected by the American Modeling Teachers Association from workshops on physics and chemistry topics held throughout the United States between 2016 and 2018. The original sample consisted of 1,037 participants; after data screening, 567 participants remained in the sample. Screening for multivariate outliers using Mahalanobis' distance eliminated a further 44 cases that exceeded a chi-square criterion of 7.81, reducing the sample size for the multiple regression analyses to N = 523.

Four research questions guided the data analysis. A summary of the results, provided in

Table 13, relates to each of the four research questions. The paired-samples t-test revealed a significant difference in pretest and posttest scores for all three dependent variables. The most significant impact was PSTE, then CK, and finally, STOE.

A forward multiple regression analysis generated a predictive model for CK and

PSTE. Females with less experience accomplished higher growth in CK, while females with a lower degree exhibited more growth in PSTE scores.

Table 13

Results Summary by Research Question

Research Question 1. What is the impact of participation in the MI professional development on science teachers' content knowledge, self-efficacy, and outcome expectancy?
Results (paired samples t-test):
• A significant difference exists between the pretest and posttest scores for CK, PSTE, and STOE, indicating Modeling Instruction professional development had a positive impact on the dependent variable scores.

Research Question 2. What is the degree to which years of teaching experience, gender, the highest level of educational attainment, and teaching in content predict: a) content knowledge before MI; b) content knowledge after MI; and c) growth in content knowledge?
Results (Pearson r):
• Gender and degree were significantly and positively correlated with both pretest and posttest scores.
• Males had higher pretest and posttest CK scores than females.
• Teachers with a higher degree had higher pretest and posttest scores.
• Teachers aligned with their content had higher CK posttest scores.
• Gender, experience, and alignment were negatively correlated with CK growth scores.
• Females with less teaching experience who were not aligned to their content had more Content Knowledge growth.
Results (multiple regression):
• Three predictors (gender, content alignment, and degree attainment) significantly predict pretest Content Knowledge scores.
• Gender and experience predict Content Knowledge growth.
• Females with less experience revealed more Content Knowledge growth.

Research Question 3. What is the degree to which years of teaching experience, gender, the highest level of educational attainment, and teaching in content predict: a) self-efficacy before MI; b) self-efficacy after MI; and c) growth in self-efficacy?
Results (Pearson r):
• Gender and degree are significantly correlated with pretest scores, while gender, alignment, and degree are significantly correlated with PSTE growth scores.
• Males had higher PSTE pretest scores.
• Teachers with a higher degree had higher pretest scores.
• Those with a Bachelor's degree experienced more growth.
• Teachers not aligned with their content experienced more growth.
Results (multiple regression):
• Gender and degree predict pretest scores, gender predicts posttest scores, and gender and degree predict growth scores for PSTE.
• Males had higher pretest and posttest self-efficacy scores.
• Females experienced greater growth in self-efficacy than males.

Research Question 4. What is the degree to which years of teaching experience, gender, the highest level of educational attainment, and teaching in content predict: a) outcome expectancy before MI; b) outcome expectancy after MI; and c) growth in outcome expectancy?
Results (Pearson r):
• Gender and alignment are significantly correlated with posttest scores for STOE.
• Males aligned with their content had higher STOE posttest scores.
Results (multiple regression):
• No predictors were entered into the model.

CHAPTER V. DISCUSSION

The previous chapter presented the results of the data analysis in this study. Chapter five begins with a summary of conclusions followed by an overview of the problem, which includes the study purpose and description of the research design. This chapter will explore each of the findings in more detail and discuss the statistical results from the data gathered in Chapter IV for each research question. This chapter will also discuss the application of the results to professional development settings, leadership roles, as well as the impact of MI on teaching practice. Conclusions drawn will precede a list of limitations for the study. The chapter concludes with a discussion of how the results will inform future research, recommendations for practice, and final thoughts.

Summary of Conclusions

Results from this study lead to the following conclusions:

1. Participation in Modeling Instruction professional development positively impacts a

participant’s self-efficacy, outcome expectancy, and content knowledge. The mean for

posttest scores for each dependent variable (self-efficacy, outcome expectancy, and content

knowledge) is higher than the mean for pretest scores indicating that each dependent variable

experienced growth.

2. Three variables entered into the predictive model for CK pretest scores. Gender, content

alignment, and level of degree attainment significantly predict CK pretest scores.

3. Three variables entered into the predictive model for CK posttest scores. Gender, content

alignment, and degree significantly predict content posttest scores.

4. Two variables entered into the predictive model for CK growth. Gender and experience

significantly predict CK growth scores.

5. There was a significant correlation between gender and CK and years of experience and CK.

Females with fewer years in the classroom experienced the most growth in CK after

participating in MI professional development.

6. Two variables entered into the predictive model for self-efficacy pretest scores. Gender and

degree significantly predict PSTE scores.

7. One variable entered into the predictive model for self-efficacy posttest scores. Gender

significantly predicts PSTE posttest scores.

8. Two variables predict self-efficacy growth scores. Gender and degree significantly predict

PSTE growth scores.

9. A significant correlation existed between gender and self-efficacy and degree and self-

efficacy. Females with a bachelor’s degree experienced more PSTE growth than males after

participating in MI professional development.

Study Overview

The sample for this study included 567 participants from a population of 1,037 total participants in physical science MI professional development workshops from 21 different states in the United States. The researcher utilized a quasi-experimental research design utilizing surveys disseminated through voluntary participation in the MI professional development from

2016 to 2018. The surveys collected data to ascertain the level of PSTE, STOE, and CK for each of the teachers, both pre and post participation. Teachers completed the surveys before and after the MI professional development through a program called LimeSurvey. Analysis in this study only included participants who completed both pre and post surveys.

Inquiry-based science education, advocated for over a decade, employs opportunities for students to investigate natural phenomena by asking questions, exploring possible solutions, developing explanations, and evaluating their understanding based on the data they have collected (Wilcox, Kruse, & Clough, 2015). The Next Generation Science Standards suggest that "scientific inquiry requires the use of evidence, logic, and imagination in developing explanations about the natural world" (Newman, Abell, Hubbard, McDonald, Otaala & Martini,

2004, p. 258). Scientific inquiry within the classroom, however, requires teachers to have a deep understanding of the nature of science, discipline-specific content knowledge, and pedagogical content knowledge, such as how scientific laws, principles, and mathematical expressions intertwine with the content they teach (Newman et al., 2004). In-service teachers tend to have limited experiences from their college training with teaching and learning through inquiry (Swars

& Dooley, 2010). Therefore, they must be engaged in professional development to promote these skills beyond their undergraduate college courses (Newman et al., 2004).

Conditions perceived as typical professional development do not always promote the critical thinking needed in the science classrooms of the 21st century. Traditional professional development is often a one-day workshop model that is not sustained long term and typically does not engage teachers in the same learning activities designed for their students (Darling-

Hammond, Hyler, Gardner, & Espinoza, 2017). Also, professional development is not typically designed to encourage feedback and reflection and lacks support from experts. Science inquiry, therefore, remains challenging for most teachers to achieve within their classrooms as a result of traditional professional development (Darling-Hammond, Wei, Andree, Richardson & Orphanos,

2009). Teaching through inquiry requires a considerable amount of CK and PCK, which in turn demands more in-depth professional development. Science teachers must have a deeper understanding of the preconceptions of their students, know how to base lessons on real-world examples, and have a grasp of the variable nature of students' previous experiences

(National Academies of Sciences, Engineering, and Medicine, 2015).

Modeling Instruction, a research-based pedagogy, provides a professional development platform to support students’ engagement in the processes and discussion of science (Jackson,

Dukerich, & Hestenes, 2008). Modeling Instruction guides students through each facet of scientific knowledge, such as understanding natural phenomena by asking questions, developing models, using computational thinking, obtaining and evaluating information, and constructing explanations. Modeling Instruction integrates structured inquiry techniques to develop critical thinking and communication skills. The research presented here shows the positive impact of participation in MI professional development on teachers' PSTE, STOE, and CK and, by extension, their pedagogical skills in the classroom. However, MI methodology remains uncommon in pre-service teacher training, with only a few colleges and universities in about six states using MI as a science methods course. Furthermore, less than ten percent of all in-service teachers in

Ohio have participated in MI professional development (C. Megowan-Romanowicz, personal communication, May 3, 2019).

The current study is one of the first to have affirmed that MI professional development had a positive, quantitative impact on growth scores for science teachers’ content knowledge

(M=5.52), self-efficacy (M=4.03), and outcome expectancy (M=1.13). The research findings presented here make a case for increasing enrollment in MI workshops as a means to improve in- service physical science teachers’ PSTE, STOE, and CK. Improving PSTE, STOE, and CK improves the effectiveness of teachers (Knaggs & Sondergeld, 2015; Menon & Sadler, 2016;

Swackhamer et al., 2009; Swars & Dooley, 2010).

According to Marzano (2003), highly effective teachers have a more positive impact on student learning; therefore, it is essential to increase the number of highly effective teachers within the science classroom. Specifically, there is a strong need to deliver high-quality, nationally recognized, and valuable professional development such as MI for science teachers to improve their performance. Professional development that focuses on content and teacher engagement in learner-centered pedagogies should improve CK and efficacy, in turn improving student performance in science (Blank et al., 2008).

The study presented here examined the degree to which years of teaching experience, gender, the highest level of educational attainment, and teaching in content area predict content knowledge, self-efficacy, and outcome expectancy both before and after participation in MI professional development utilizing a pretest-posttest treatment. The Science Teacher Attitudes survey measured both SE and OE and was adjusted to accurately determine the SE beliefs and

OE of science teachers before and after participating in MI professional development. This study is the first to use the STA to explore the impact of Modeling Instruction on teachers’ SE and OE.

The selection of in-service teachers for this study consisted of those who enrolled in a summer MI workshop. The purpose of this research was to explore the development of a predictive model for participation in MI professional development. Although multiple regression predictors cannot necessarily be interpreted as causal (Warner, 2008), they do provide an area of focus for further study. The predictors also provided information regarding those variables that did not contribute to the model.

Discussion of The Findings

Introduction

Research question one queried the impact of participation in MI professional development on science teachers' content knowledge, self-efficacy, and outcome expectancy. A paired samples t-test revealed a significant, positive difference between pretest and posttest scores for all three dependent variables (Personal Science Teacher Efficacy t(566)=15.83, p<.0001; Science Teacher Outcome Expectancy t(566)=7.19, p<.0001; and Content Knowledge t(566)=11.49, p<.0001). Therefore, the null hypothesis was rejected for research question one, indicating that most participants experienced growth after participation in MI professional development. Each of these variables is explored in the subsequent discussion.

Impact on self-efficacy. The scores indicated, with 95% confidence, that the mean

PSTE score was less on the pretest (M=50.31, SD=7.89) than on the posttest

(M=54.34, SD=6.96). The 95% confidence interval for the gain scores showed that participants' PSTE grew by 3.5 to 4.5 points, on average, after participating in an MI professional development workshop. Teacher SE is a construct propelled by motivation, meaning that if teachers believe they can accomplish a task, they will more than likely implement steps toward achieving their goal (Bandura, 1997). "Persons who have a strong sense of efficacy deploy their attention and effort to the demands of the situation and are spurred by obstacles to greater effort"

(Bandura, 1986, p. 394).
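The 3.5 to 4.5 point range quoted above can be reproduced from the Table 7 values; a minimal derivation, assuming the usual t-based confidence interval for a mean difference with n = 567, is

\[
CI_{95\%} = \bar{d} \pm t_{0.975,\,n-1}\frac{s_d}{\sqrt{n}}
          = 4.03 \pm 1.964 \times \frac{6.06}{\sqrt{567}}
          \approx 4.03 \pm 0.50 = [3.53,\ 4.53],
\]

which matches the interval of approximately 3.53 to 4.52 reported for PSTE in Table 7, with small differences due to rounding.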

This study revealed that participation in MI further developed the SE of the participants, thus increasing their confidence in their ability to teach science. Improving the self-efficacy of science teachers is a particularly desirable outcome since research presented in this current study indicated teachers with a higher level of perceived SE accomplish more and persist longer at a specific task when compared to individuals with lower perceived SE. The level of self-efficacy is of specific interest to workshop leaders to determine which participants persist while implementing MI in their classrooms and which participants may need additional support.

The primary focus of MI professional development is for participants to experience MI methodology as learners while enhancing their scientific, pedagogical content knowledge (Cabot,

2008; Jackson et al., 2008). This experience allows participants to strive towards improving their SE. Teachers actively participated in the fundamental elements of MI as they engaged in data collection through laboratory experimentation. Teachers utilized their data to build conceptual models to solve problems and facilitate discussion. The structure of the workshop provided teachers with opportunities for success that encouraged content knowledge, self-efficacy, and outcome expectancy growth, ultimately developing their MI pedagogy. This process is similar to how teachers would engage in MI in their classrooms. Teachers help their students develop conceptual models by engaging them in similar experiments, classroom discussions, and problem-solving. These activities, in turn, will improve their own students' content knowledge and self-efficacy in science. Modeling Instruction pedagogy specifically asks students to develop and deploy conceptual models through scientific inquiry. Modeling

Instruction professional development is essential, considering teachers are performing the same tasks as their students, giving them first-hand experience with what they are asking their students to perform.

There are several possibilities to account for the modest gains in SE experienced by some teachers. The teachers in the current study may have overestimated their self-efficacy at the beginning of the professional development. An overestimation in SE can influence the amount of effort that a teacher might put forth during professional development (Tschannen-Moran &

Hoy, 1998). Another possible explanation for the modest gains in SE scores for some participants is that teachers who rate their SE high in the pre-workshop survey may realize, by the end of the workshop, they initially possessed an inflated notion of their expertise as a teacher.

They may have also come to appreciate the level of critical thinking required for higher-level teaching and learning. By the end of the professional development, those teachers with an inflated sense of SE may have raised their ceiling on what is required for higher-level thinking and teaching, thus impacting their SE identity. Truly immersing in the process of designing teaching strategies for student understanding is arduous if a teacher possesses gaps in their CK.

Similarly, merely achieving efficiency at covering content, performing well on formal teacher evaluations, and excelling at classroom management does not equate to higher-level teaching. The MI workshop allows teachers to glimpse new possibilities for student thinking never explored before. As a result, teachers may see themselves as novices at higher-level teaching and learning. Further examination of the current study's gain scores for different groups of teachers based on their level of experience, degree attainment, or content alignment may reveal differences in the size of self-efficacy growth.

Impact on outcome expectancy. Teachers responded on the STA about the impact they feel they have on their students’ achievement. Science Teacher Outcome Expectancy scores indicated with 95% confidence the mean for pretest scores (M=33.96, SD=3.85) was smaller than the mean for posttest scores (M=35.10, SD=3.58), indicating that participation in MI professional development improved participants’ outcome expectancy scores by a modest and significant amount.

Gain scores for OE in the current study revealed that participation in MI professional development did not have as sizable an impact on OE as on the other dependent variables. The mean pretest STOE percent score was 67.9% and grew to a mean posttest score of 70.2%. These scores were less than the mean PSTE pretest percent score

(71.8%) and the mean posttest percent score (77.6%).
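The percent scores quoted here appear to be the raw subscale means expressed as a percentage of the maximum possible subscale score. Assuming maximum scores of 50 points for STOE and 70 points for PSTE (back-calculated from the reported percentages rather than stated in the instrument description), the conversion is

\[
\text{STOE}_{pre} = \frac{33.96}{50}\times 100 \approx 67.9\%, \qquad
\text{STOE}_{post} = \frac{35.10}{50}\times 100 \approx 70.2\%,
\]
\[
\text{PSTE}_{pre} = \frac{50.31}{70}\times 100 \approx 71.9\%, \qquad
\text{PSTE}_{post} = \frac{54.34}{70}\times 100 \approx 77.6\%,
\]

consistent (to rounding) with the values reported above.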

A few contributing factors may account for the modest gain score in the current study.

One possible explanation is that some teachers may not be experienced in judging the academic potential of their students in response to their own efforts to improve student performance in the classroom, contributing to lower pretest OE scores. Examining OE by teaching experience may reveal differences. New teachers may have OE similar to that of pre-service teachers due to their lack of classroom teaching experience, and comparisons of the OE of new teachers to pre-service teachers may reveal a pattern for OE growth. Shroyer, Riggs, and Enochs (2014) report,

“As a point of comparison, outcome expectancy beliefs of pre-service teachers were more difficult to interpret and not as easily changed” (p. 108).

A second possibility is that the activities planned during MI professional development did not specifically address OE, nor did the process of participation prompt teachers to evaluate themselves in terms of their potential impact on students. Factors external to the scope of the current study impact OE related to the teachers' estimation of their influence on student learning. External factors would include a student's reaction or performance based on instruction, observation of how students respond to the instruction, and the methodology or delivery of instruction. MI professional development is not designed to gather feedback from students during the workshop. As a result, the teachers must rely on other means of determining the effectiveness of their instruction, such as peer interaction and vicarious experiences during the workshop. Future workshops may need to explicitly incorporate student feedback to inform participants of their instructional practices.

A third possibility may be the transformative effect of MI on a teacher's PCK.

Teachers may not feel as though they can accurately predict the performance of their students until they successfully implement the new teaching practices within their classrooms. We know very little about the degree to which changes in expectancies predict outcomes. Further understanding of the predictive pathway may inform teaching strategies and, in turn, augment outcomes. Individuals with low self-efficacy and low outcome-expectancy may benefit from additional strategies designed to enhance their expectancies, though further research is needed.

Altering deeply ingrained rituals of teaching science requires intellectual, emotional, and social support; teachers must unlearn those rituals for transformation to take place and for the more profound behavioral changes needed for 21st-century educational practices to emerge. This process takes time and continued support. Therefore, measuring OE after the first year of implementation of MI may provide a more accurate measure of this subscale.

Lastly, OE may not have been a valid construct to consider in this study. Bandura (1997) concluded that OE and SE are closely linked. Therefore, measuring OE alone may not provide additional information. Examination of combined SE and OE subscales may reveal a more considerable impact of MI on the participants in the current study.

The modest improvement for OE in this current study is not surprising, since other studies have found little to no statistically significant difference in OE scores (Liang & Richardson, 2009; Nauerth, 2015; Schoon & Boone, 1998) or only small changes according to gender or experience (Azar, 2010). Bandura (1997) defined outcome expectancy as "a judgment of the likely consequences such performances will produce" (p. 21). In other words, OE is the perception a teacher has about how they will influence student learning or whether their behavior will result in the desired outcome. The likelihood of teachers increasing OE is minimal when teachers feel as though they are novices while transforming their curriculum. Teachers may feel as though they have mastered teaching after their seventh year in the classroom. However, the MI professional development placed teachers in the role of a novice once again. Bandura argued that accomplishments are the most compelling strategy for enhancing self-efficacy. Therefore, allowing teachers to implement MI in their classrooms for at least three years may help them feel like experts once again; the prime time to evaluate OE might be after that point.

Performance accomplishment (i.e., improving student participation) is likely to impact outcome expectancies as well. Thus, behavioral mastery during MI professional development may be particularly useful in raising both SE and OE. Also, workshop leaders should specifically address potential difficulties when implementing the MI approach, and verbal encouragement from the leaders about confidence in the participants’ ability to implement MI can improve SE. Similarly, more in-depth discussions with participants about MI pedagogy and leaders’ belief in their ability to implement this pedagogy within the participants’ classrooms may be beneficial. Additional methods for enhancing SE from the literature include actively praising small successes, reframing perceived failures as moderate successes, and preventing or removing barriers through problem-solving skills.

Teachers are likely to participate in professional development they believe will expand their knowledge and skills, contribute to their growth, and augment their effectiveness with students. However, teachers are also pragmatic and hope to gain specific, concrete, and practical ideas from their professional development that directly relate to the operation of their classrooms. Some teachers who participated in MI may not have believed that the changes to their teaching pedagogy will directly impact their students, a concept that may warrant further exploration in the future.

Impact on content knowledge. Of the three dependent variables, CK is one of the most impactful influences on student performance, yet its improvement is an unintended consequence of MI participation. The confidence interval for CK scores indicated, with 95% confidence, that scores improved from pretest to posttest by 4.6 to 6.5 percentage points. Therefore, for the sample in this study, participation in MI professional development improved a participant's knowledge of the physical science content utilized within the workshop.

While the primary focus of MI professional development is to aid teachers in transforming their classrooms to align with modeling pedagogy, improving the CK of participants may be an unintended benefit. Increased CK, therefore, is a bonus that is especially important for participants who are teaching outside of the content area they were trained to teach or for in-service teachers with less experience in the classroom (Hestenes et al., 2011; Hobbs, 2012;

Ingersoll, 2002). Since CK is not the primary focus of MI professional development, any gain in

CK scores is notable. Due to the positive effect of MI, as observed in the current study, it should be considered as a means to improve CK for in-service teachers. However, the primary focus of

MI professional development should continue to be on the implementation of MI pedagogy.

When the focus becomes strictly on improving CK, teachers will lose the vital pedagogical component of MI professional development. The purpose of MI professional development is to engage in hands-on, minds-on modeling activities to help purposely establish scientific conceptions and develop scientific habits of mind (Halloun, 2011).

Upon closer inspection of the scores in the current study, participants' increased CK growth scores did not necessarily correspond to high growth scores for SE and OE. One reason for the disconnect might be the 51 participants with a perfect CK score on both the pretest and posttest.

While mean scores are designed to measure the center of a data set, they are affected by outliers and can skew the data by moving away from the typical value (Mertler & Vannatta, 2010).

Also of note, there was one participant whose CK score went from a 100% on the pretest to a 17% on the posttest, leading the researcher to believe that this participant may not have completed the posttest with fidelity. Other participants also may have rushed through their answers. As a result, their overall score may have decreased or not improved at all. Workshop leaders may want to carefully consider the importance of collecting the pretest and posttest data to be sure that adequate time is set aside for completion of the surveys. The importance of data collection must be relayed to the participants to further our understanding of the impact of MI professional development.

Since teachers have considerable control over what is taught, improving teachers' CK is an essential step in improving student achievement (NRC, 1996). Improving CK is especially vital in light of how teachers use their subject knowledge to organize and present content in ways their students can understand. A lack of CK can limit the opportunities a teacher brings to their students, thus influencing the learning that happens within the classroom (Ohle et al., 2015). Additionally, teachers with high levels of CK are more likely to respond to the needs of a particular class, recognize students who are struggling, and change the way subject matter is presented to make it more understandable to students.

Modeling Instruction is a pedagogy that requires a departure from the traditional approach to teaching and learning in the science classroom. Based on the research in this study, one cannot deny that the CK of the teacher influences pedagogy. The content knowledge literature also implies that CK influences, to a degree, the transformative nature of PCK (Loewenberg

Ball et al., 2008). When we regard the primary focus of MI professional development as aiding teachers through the transformation of their classrooms to align with modeling pedagogy, we must also address the importance of CK. Content knowledge is essential to teaching science and to its improvement. Modeling Instruction professional development should embrace the opportunity to improve CK to further enhance PCK. Workshop leaders need to better explain how to use CK for effective teaching. Doing so can have significant implications for understanding teaching and for improving the content preparation of science teachers.

Teaching science requires subject-specific CK that is unique to a particular area of study such as chemistry, physics, or biology. Teachers lacking CK will struggle to change their pedagogy. For instance, a teacher who possesses misconceptions or incomplete content knowledge limits their ability to respond to student misconceptions or create cognitively challenging learning opportunities (Kleickmann et al., 2013). A teacher lacking a deep understanding of friction force, for example, will have a difficult time developing activities that help their students overcome the common misconception that everything that moves will eventually come to a stop. Alternatively, a teacher that does not understand electrostatic forces will not be able to dispel the misconception that ionic bonds form when nonmetals take an electron.

Researchers regard CK as a prerequisite for developing PCK. However, high content knowledge does not imply high PCK. Teachers with high PCK know the what, when, why, and how of their content. They tap into a reservoir of knowledge of best practices and experience.

There is significant evidence that teaching subject-specific content demands CK that pushes substantially beyond the bounds of typical college and university classes (Shulman, 1987).

Practical professional development opportunities such as MI have a pivotal role in developing the CK and PCK of in-service teachers.

Participants of MI professional development have a unique opportunity to strengthen their CK during participation. However, basic content knowledge is essential to help participants also grow their PCK. Basic CK may also help participants avoid becoming overwhelmed during the experience. The implication, therefore, is that there is a delicate balance between what teachers need to know prior to participation and what MI professional development is asking them to accomplish. Workshop leaders may need to pay special attention to those teachers lacking the CK necessary to develop basic models within the workshop.

The Degree to Which the Independent Variables Predict Content Knowledge

Research question two explored the degree to which years of teaching experience, gender, the highest level of educational attainment, and teaching in content predict content knowledge before and after participation in MI professional development. The predictive model also explored growth scores for content knowledge.

Correlation between independent variables and content knowledge. The Pearson r correlation revealed that females with less teaching experience (r=-.099, p<.05) who were teaching outside of their content area (r=-.101, p<.05) showed the most growth in CK. Upon further inspection of the data, females had an average pretest CK score of 79.6%, while their male counterparts scored an average of 86.1%. On average, females improved their CK score two percentage points more than males after participating in MI professional development. While it may seem intuitive that teachers with less experience would display more growth, the question remains why females experienced more CK growth than males. The remaining factors, experience and content alignment, revealed some compelling information.

Twenty more women than men enrolled in MI professional development were teaching outside the content area for which they were prepared to teach (see Figure 4). This difference may account for why females experienced more CK growth than males. Any teacher teaching outside their content area would benefit from participation in MI professional development as a means of increasing their CK. A future study might determine how many of those teaching outside their content area possessed a background in life science or biology rather than chemistry or physics, further influencing the data. The average number of years that females indicated they had taught was 11.4, while men had an average of 10.2 years of experience. Experience, therefore, seems an unlikely reason for the difference in CK scores between the genders. The social nature and structure of the workshop as a mock classroom may be the key to this difference. Cognitive differences in learning styles between men and women in a classroom setting may help explain these results. The differences between men and women in CK are essential to consider when designing MI professional development to maximize efforts to influence CK, to produce the highest impact on women in particular, and to develop a gender-responsive environment.

Predictive model for content knowledge. Multiple regression analysis revealed that gender is a predictor for pretest, posttest, and gain scores for CK. Males aligned with the content for which they are prepared to teach, with a higher degree, experienced higher pretest and posttest content knowledge scores than females. Females with less teaching experience were revealed to have the most growth. These results further our understanding of the impact of the

MI professional development on each gender and provide workshop leaders with important information about how to help their participants gain the most from their professional development.

The following discussion about gender differences may shed light on the discrepancies in CK in the current study. Research by Witkin and Goodenough (1981) suggests that males have a preference for abstract conceptualization, whereas females tend to seek applicability or personal connections with new material. Males tend to be achievement-oriented, whereas females are more socially and performance-oriented. Furthermore, males are more likely to attribute their success in the classroom to external reasons, whereas females tend to see their success linked to their efforts. The genders also vary in their views about what is essential in their education. Females rank social interaction with others and self-confidence as more important than males do (Hayes,

2001). The cognitive style theory suggests that females who rely on peer input to organize experiences and interpret situations are more dependent on interpersonal relationships, and often have different information processing and personality styles than males (Witkin & Goodenough,

1981).

The Degree to Which the Independent Variables Predict Self-Efficacy

Research question three explored the degree to which years of teaching experience, gender, the highest level of educational attainment, and teaching in content predict self-efficacy before and after participation in MI professional development. A predictive model for self-efficacy growth scores was also generated to reveal any relationships.

Correlation between independent variables and self-efficacy. Pearson r revealed a significant correlation between gender and pretest self-efficacy scores (r=.189, p<.05). Also, there was a significant and positive correlation between self-efficacy and the highest level of educational attainment (r=.104, p<.05). These results reveal that males with a master's or doctoral degree had higher PSTE pretest scores than females with a bachelor's degree. Correlation values for PSTE posttest scores were not significant. However, females who were teaching out of their content area, with less experience and a bachelor's degree, showed the most growth in PSTE scores (p<.05). These results complement the correlations found in research question two and the literature. The research literature has established the importance of a teacher's

CK as a significant factor influencing PCK, which, in turn, influences SE.

Based on the results of the current study, this research recommends participation in MI as a means of increasing science teacher SE. The work of Shulman (1986) provided a conceptual framework as well as analytical distinctions between the different kinds of knowledge needed for effective teaching. Research about teacher knowledge over the past two decades has focused on two overlapping domains, PCK and CK, and concluded that teachers' CK and PCK are related (Loewenberg Ball et al., 2008; Kleickmann et al., 2013). Teachers who acquire more training outperform their colleagues with less training in both CK and PCK. Content knowledge and PCK determine the quality of instruction and students' achievement in the classroom. Teachers with increased CK as a result of their participation in professional development, such as MI, are more likely to implement inquiry practices in their classrooms than teachers who have not participated (Buczynski & Hansen, 2010).

Predictive model for self-efficacy. Multiple regression analysis revealed that gender is a common predictor for pretest, posttest, and gain scores for PSTE. Males with higher degrees had higher PSTE pretest scores than their female counterparts, F(2,522)=12.80, p<.0001.

Posttest PSTE scores were also higher for males, F(2,522)=15.51, p<.0001. Females with a bachelor's degree revealed the highest growth for PSTE. We may attribute the gender differences to cognitive style theory, which suggests that females rely on peer input to organize experiences and interpret situations and are more dependent on interpersonal relationships (Witkin & Goodenough, 1981). Modeling Instruction is a student-centered instructional method that integrates curriculum and pedagogy and encourages cooperation among students while facilitating discourse in a supportive environment (Jackson et al., 2008). The MI environment, therefore, may be suited to a female's cognitive style.

The Degree to Which the Independent Variables Predict Outcome Expectancy

Research question four explored the degree to which years of teaching experience, gender, the highest level of educational attainment, and teaching in content predict outcome expectancy before and after participation in MI professional development. This research also explored a predictive model for growth scores.

Pearson r correlation analysis showed a significant relationship between outcome expectancy and males teaching within their content area. Multiple regression revealed that the statistical model did not retain any predictor variables. Thus, a predictor model for OE cannot be generated from the data collected in this study and cannot forecast participant pretest, posttest, or gain scores. The lack of a predictor model could be because this study only detected small changes in OE. As discussed previously, teachers may find it difficult to demonstrate OE growth until after the implementation of MI within their classrooms. Therefore, a future examination of SE scores that encompasses OE may reveal a different predictor model for overall SE.

Recommendations

Science professional development must remove the barriers that prevent inquiry pedagogy in science classrooms and should, therefore, support more learning opportunities such as MI. Barriers include a lack of time to teach using new pedagogical tools, mandated curriculum and pacing guides, lack of resources such as technology and equipment, and lack of an integrated, coherent approach to instruction (Darling-Hammond et al., 2017). Improving CK leads to an increase in self-efficacy beliefs (Bandura, 1989), which, in turn, improves student performance. Research indicates that content-heavy professional development should be replaced by engagement in activities, such as MI, that advance PCK alongside CK to improve science teacher efficacy beliefs (Fox, 2014; Halim et al., 2014; Martin, 2018; Menon & Sadler, 2016; Shulman, 2013; Swackhamer et al., 2009).

Implications for Professional Development

This research supports the notion that engagement in MI pedagogy positively impacted

SE, especially for females teaching science. Of all of the independent variables examined in this study, gender was consistently shown to have an impact on the dependent variables. Men tended to have higher pretest and posttest scores in this study, while females showed the most growth.

The women in this research showed higher gain scores for CK as well as SE. The link between these variables is clear in light of the research: the more CK a teacher possesses about their subject, the more confident they are to teach that content, and the more confident they are to teach the content, the more SE they possess. Female teachers are still underrepresented in the physical sciences (approximately 44% of all high school science teachers) and could benefit the most from participation in MI professional development (Hill & Stearns, 2015). To better serve female learners, science instructional practices should be primarily inquiry-based and should utilize open-ended discussions and hands-on learning opportunities (Halpern et al., 2007).

When we consider the positive impact the workshop has on female teachers, it is not difficult to imagine that MI could increase the participation of female students in science class by increasing the students' CK and belief in their ability to succeed in physical science. This research study suggests that MI may improve the content knowledge, self-efficacy, and outcome expectancy of participants. Further underscoring the importance of MI professional development, which notably impacted females with less teaching experience, MI encompasses both concrete and abstract thinking to promote learning opportunities for both genders.

Future research may include qualitative data concerning the female students of those female teachers that participated in MI professional development to determine the impact on their students’ success in physical science classes.

According to Darling-Hammond (2009), “An effective teacher is one who learns from teaching rather than one who has finished learning how to teach” (p. 3). Teachers must continually evaluate the effectiveness of their instruction to develop their pedagogy further.

Ensuring teachers are knowledgeable and skilled in their content area can be accomplished through continued professional development that fosters a productive learning environment, such as MI workshops, to allow teachers to become engrossed in the process of scientific inquiry and discovery (Manthey & Brewe, 2013). Darling-Hammond (1999) suggested that teachers be afforded opportunities to engage in discourse with colleagues to examine instruction more closely, focus on content to reform curriculum grounded within the context of the science classroom, and evaluate lesson effectiveness in order to enhance professional development.

These aspects are all found within the MI professional development workshop (Jackson et al., 2008).

The ability of science teachers to implement NGSS learning standards will require educational leaders to rethink how pre-service teacher programs navigate the advanced study of science and how to set a course for personal and professional development for in-service teachers. Teachers immersed in a culture of pressure to 'cover the content' may struggle to teach science through inquiry, even though inquiry advocates focusing on both learning content and understanding scientific practices. Researchers have recently conceptualized MI as a vital pedagogy for science education (Dukerich, 2015; Halloun, 2011; Jackson et al., 2008; Manthey & Brewe, 2013; Megowan, 2007; Windschitl et al., 2012).

Within an MI workshop, teachers are immersed in a robust model-development methodology that helps them show their students how to make sense of natural phenomena by developing, using, and modifying models (Jackson et al., 2008). Modeling Instruction connects science content with inquiry practices, improves CK, SE, and OE, and helps teachers gain insight into how to teach science through inquiry.

Since recent science education reform documents in the U.S., such as A Framework for K-12 Science Education (NRC, 2012), cite MI as having a prominent role, educational leaders need to remove barriers and find ways to enroll their science teachers in this professional development. Teachers need more time to plan how they will utilize the new pedagogical tools from MI professional development and more time to collaborate with their peers. Mandated curriculum and pacing guides should be relaxed while teachers implement MI pedagogy to alleviate the pressure of covering the curriculum during the school year. Addressing the lack of resources, such as technology and equipment, may also require additional funds. While MI is known for utilizing simple materials to gather data, infusing technology within an integrated, coherent approach to instruction is also important. Much research exists citing the positive impact MI has on students (Brewe, 2008; Dukerich, 2015; Hestenes, 2006; Wells et al., 1995; Wright, 2012). The current study is one of the first to affirm that this professional development had a positive, quantitative impact on growth scores for science teachers' content knowledge (M = 5.52), self-efficacy (M = 4.03), and outcome expectancy (M = 1.13). Modeling Instruction may assist teachers in further developing their pedagogy to improve science education for all students, and it may especially benefit female teachers and those new to the profession.
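For readers who want to see the growth-score computation in code form, the following minimal sketch (Python rather than the SPSS used in this study) computes gain scores and a dependent-samples t-test on hypothetical pre/post arrays; the numbers are illustrative and do not reproduce the study's results.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pre = rng.normal(15.0, 4.0, 100)        # hypothetical pretest CK scores
post = pre + rng.normal(5.5, 3.0, 100)  # hypothetical posttest CK scores

gain = post - pre                               # growth (gain) scores
t_stat, p_value = stats.ttest_rel(post, pre)    # paired (dependent-samples) t-test
print(f"mean gain = {gain.mean():.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")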

Implications for Leadership

To break the cycle of misconceptions perpetuated in schools, educational leaders must address the misconceptions some science teacher candidates hold in order to help improve their SE. Current classroom teachers may have completed their teacher preparation before the advent of misconception research and may not understand how to adequately plan instruction with student misconceptions in mind. Modeling Instruction pedagogy helps to address this issue.

Buczynski and Hansen (2010) reported that teachers who improved their CK were more likely to implement inquiry-based practices in their classrooms. Science professional development must provide support and remove the barriers that prevent inquiry pedagogy, such as limited resources, mandated curriculum pacing, and fidelity of implementation (Buczynski & Hansen, 2010). Inquiry pedagogy requires a shift in how scientific knowledge is taught and constructed in the classroom. Science instruction taught through traditional means often ignores the theoretical tenets of instruction; pre-service teachers, for example, understand specific science content but do not know how to address student difficulties or misconceptions. Since pedagogical knowledge goes beyond CK and includes strategies for presenting material to students (Shulman, 1986), teacher preparation programs should perhaps augment the way science is taught to develop a conceptual understanding based on the theoretical underpinnings of science.

Teacher training programs should be monitored periodically and should focus on activities that enhance self-efficacy (Azar, 2010). Research has revealed the importance of self-efficacy, and professional development opportunities such as MI should have a prominent role in teacher training programs based on the positive impact revealed in the current study. Modeling Instruction professional development provides teachers with content-focused time studying science pedagogy through scientific investigations, which leads teachers to increase their self-efficacy. Teachers with higher self-efficacy have a greater propensity to try new instructional methods such as MI (Guskey, 1988; Tschannen-Moran & Hoy, 1998), which previous research has shown positively impacts student performance.

According to the Schools and Staffing Survey, 25% of science teachers did not participate in any content-specific professional development in the last three years, and less than half (45%) of science teachers had opportunities to engage in scientific investigations (Hill & Stearns, 2015). To further develop teachers in science education, these statistics need to change. More science teachers need to be engaged in high-quality professional development, such as MI. Ideally, more workshops should be offered across the U.S. to allow more teachers to participate.

There are several things policymakers could do to improve science education in their schools. Learning is an active pursuit in which students must be engaged as genuine participants, so school policies should be designed with active learning in mind. Policymakers should therefore adopt standards that require in-service teachers to engage in MI professional development.

School leadership should identify and develop expert teachers and peer coaches to further support the development of MI in science classrooms. Research indicates that peer coaching is a powerful tool for encouraging colleagues to work together since the focus is on the teacher as a learner. Focusing on the teacher as a learner allows teachers to interact with one another to brainstorm, receive feedback, and alleviate the frustration that results from working in isolation while transforming their pedagogy (DeChenne et al., 2012).

Lastly, policies limiting classroom time to 45-minute increments need to be redesigned to accommodate the development of MI in classrooms. When teachers are afforded the time to take risks, to try out new teaching strategies and approaches, and to discuss results with their colleagues, they can overcome the frustrations and challenges of utilizing MI pedagogy within the classroom. The more uninterrupted time students can spend engaged in the Modeling Cycle, the more cooperative learning activities they can complete in one class period. Longer class periods also give students and teachers more time for reflection.

Implications for Modeling Instruction Practice

The ultimate goal of educational research is to improve practice and, in turn, student achievement. Learning communities, such as the cohort model in MI, help teachers to increase their CK along with their PCK. These learning communities help to facilitate change in teacher practice toward more inquiry-based approaches (Dogan, Pringle, & Mesa, 2016). The current research has shown the impact of MI professional development on science teachers. The Modeling approach takes hold within contexts that support, shape, and value inquiry.

Developing, revising, and manipulating models to figure things out and solve problems are not skills typically nurtured in school. Instead, teachers often treat these activities as conventional representations, disconnected from the kinds of problems they were meant to address.

Environments, such as an MI classroom, that support the expression of ideas with multiple representations enhance learning (Dukerich, 2015).

Based on the current study, Modeling Instruction professional development holds substantial advantages for teachers, such as improving self-efficacy, outcome expectancy, and science content knowledge. Because students record models and publicly defend and debate their validity, these activities render student thinking highly visible and generate ongoing informal assessment that guides instruction. Students learn to pose, evaluate, and pursue questions as they develop models.

Modeling Instruction is a way to get students thinking, as evidenced by previous studies mentioned in this research.

Classroom teachers confront many challenges, including high-stakes testing, prescribed curricula, increasing class sizes, and decreasing budgets. In this turbulent time in education, self-disbelief or low efficacy can impair a teacher's classroom instruction and ability to implement new pedagogy (Hodges, Gale, & Meng, 2016). If educational leaders want teachers to improve their PCK, then future professional development for science teachers should encourage enhancing self-efficacy to make room for new pedagogical approaches such as MI. The need for more effective professional development, like MI, has heightened in this time of reform in science education. Science education reform has required teachers to have different knowledge and skills. While there is consensus about what contributes to effective professional development, the new standards-driven environment requires teachers to deepen their science CK while developing teaching practices consistent with how children learn (Desimone, 2009).

Many professional development activities do not consider how teachers make sense of their experiences (Drago-Severson, 2012). Nor do they appear to offer the long-term support and challenges that promote teachers' learning and foster their growth.

Attempts by typical professional development to effect significant gains in teaching and learning have arguably been futile (Drago-Severson, 2012). However, the evidence from the research presented here suggests that MI may provide a way to effect significant gains in teaching. Policymakers and administrators who allocate funds for improving science teaching and learning must navigate the data to identify appropriate programs to support. Modeling Instruction professional development not only addresses these issues but also improves self-efficacy, outcome expectancy, and content knowledge.

The current study builds on an expansive body of research showing that well-designed professional development such as MI, when effectively implemented, can lead to desirable changes in teacher pedagogy and student achievement. Modeling Instruction workshops serve as a model of impactful professional development, engaging teachers in the same process of collaboratively constructing models and thereby reinforcing MI pedagogy for the participants.

Limitations and Recommendations for Future Research

Limitations

There are several limitations inherent in this study. First, the data produced within the Science Teacher Attitudes survey were self-reported. Each respondent was asked to rate themselves accurately and honestly according to their own beliefs. The intent was to minimize this limitation through the generation of a unique code. Slightly skewed data are possible due to participants' bias, degree of self-knowledge, and perception of the constructs for which the data were collected and measured (Austin, Gibson, McGregor, & Dent, 1998).

Another limitation of the study is that data were gathered in MI professional development workshops throughout the United States from many different workshop leaders. With varying policies, budgets, and teaching cultures throughout the U.S., bias may be a factor in the responses received. Additionally, the data for this study were collected electronically, which may, by its very nature, have skewed the results: teachers may not have been able to resist the temptation to search online for the correct answers to the concept inventory, thus skewing the data collected for content knowledge.

The use of a quasi-experimental design is another limitation of this study. Although predictor variables often point toward a causal relationship, the outcomes found in this study provide no proof that any of the independent variables (gender, years of experience, highest degree earned, and teaching in content) cause changes in any of the dependent variables (self-efficacy, outcome expectancy, and content knowledge). The analyses merely assess each variable's contribution to the variance and determine which independent variables significantly predict, rather than cause, outcomes when other predictor variables are controlled (Mertler & Vannatta, 2010).

Participants who obtained a perfect score on the pretest left no room for improvement in CK. Removing the group of teachers with perfect scores, however, would not be ideal for the overall experience of the MI workshop: this group adds to the overall content knowledge of the cohort, helps peers understand concepts, and facilitates meaningful discussions. While including their results may skew the data, this study notes the importance of those with perfect pretest CK scores.

This research relied on self-reported self-efficacy scores to establish growth, which raises questions about the accuracy of this technique. Actual performance within the classroom may be an additional layer missing from this study for measuring self-efficacy more accurately. Without direct classroom observation of a teacher's behavior, the accuracy of a teacher's self-reported score is challenging to determine. A future study might include a direct classroom observation component.

Future Research

Further study of CK gain score data that separates participants into groups based on their level of experience, degree attainment, or content alignment may reveal differences in the size of content knowledge growth. Future research could further examine differences among specific groups within the data gathered for this current study. Comparing MI Physics PSTE, STOE, and CK gain scores with MI Chemistry professional development gain scores may reveal differences between the two. Another avenue to explore is the examination of gain scores for early-, mid-, and late-career science teachers, which may reveal differences among the groups for PSTE, STOE, and CK; a simple group comparison of this kind is sketched below. Lastly, different regions of the United States may reveal differences in growth as a result of participation in MI professional development.
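As a concrete illustration of such a group comparison, the short sketch below (Python, with invented gain scores) contrasts CK gains for early-, mid-, and late-career teachers using a one-way ANOVA; the group boundaries and values are hypothetical and are not drawn from this study's data.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
early = rng.normal(6.0, 3.0, 40)   # hypothetical CK gains, early-career teachers
mid = rng.normal(5.5, 3.0, 40)     # mid-career
late = rng.normal(5.0, 3.0, 40)    # late-career

f_stat, p_value = stats.f_oneway(early, mid, late)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # a small p suggests the groups' mean gains differ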

Data collected from additional MI workshops, such as Biology, could serve as a comparison to the data from the current study. Classroom observations could be explored to determine a teacher's pedagogical content knowledge or to determine how self-efficacy and outcome expectancy are linked to a teacher's ability to implement MI within their classroom.

The Reformed Teaching Observation Protocol provides a standardized means for detecting the degree to which classroom instruction uses student-centered, engaged learning practice, and could be a useful tool for further research that includes a classroom observation component.

The crucial point is that MI professional development gives teachers the experience of implementing this pedagogy in their classrooms. Successful implementation of new teaching practices changes teachers' SE and OE: teachers strengthen their belief that MI works once they have seen it work in their own classrooms. This experience shapes their attitudes and beliefs, significantly changing their SE and OE and leading to improved learning outcomes for their students. Examining SE and OE within the first three years after the MI workshop, once teachers have implemented MI, may provide a clearer picture of the impact of this pedagogical approach on teachers' practice.

Final Thoughts

According to the Every Student Succeeds Act, professional development programs must be evidence-based; as a result, these programs must demonstrate a record of success with reliable and valid evidence. The current study presents evidence that MI professional development positively impacts teachers. Therefore, this research will help to justify the use of Title II funds for teachers' participation in MI professional development.

Title II, Part A replaced the Eisenhower Professional Development and the Class-Size Reduction programs to fund teacher professional development. Historically, the Eisenhower program focused on professional development for math and science, while Title II, Part A funds are available to support teacher professional development across all core academic subjects. Access to professional development funds to improve science pedagogy has become challenging as a result. Local teachers need to request these funds for professional development programs like MI to improve self-efficacy, content knowledge, science instruction, and ultimately students' academic performance.

Fewer females teach physical science than males, even though the education profession is predominantly composed of female teachers. Professional development programs that promote the development of females in the physical sciences, therefore, need to take center stage if we are to reach equal representation of the genders within science education. Modeling Instruction has been shown to have a positive impact, particularly on females, and should be seriously considered as school districts decide which professional development to invest in to improve self-efficacy, outcome expectancy, and content knowledge. That is not to say that a workshop should consist only of female teachers with less experience. Workshops with teachers of both genders, with varying levels of experience, different degrees, and differing levels of content knowledge are what make this experience impactful.


REFERENCES

American Association for the Advancement of Science. (1994). Benchmarks for science literacy.

Oxford University Press.

Angle, J., & Moseley, C. (2009). Science teacher efficacy and outcome expectancy as predictors

of students' end-of-instruction (EOI) biology I test scores. School Science and

Mathematics, 109(8), 473-483.

Austin, E. J., Deary, I. J., Gibson, G. J., McGregor, M. J., & Dent, J. B. (1998). Individual

response spread in self-report scales: Personality correlations and consequences.

Personality and Individual Differences, 24(3), 421-438. doi:10.1016/S0191-8869(97)00175-X

Azar, A. (2010). In-service and pre-service secondary science teachers’ self-efficacy beliefs

about science teaching. Educational Research and Reviews, 5(4), 172-185.

Baker, D. (2013). What works: Using curriculum and pedagogy to increase girls' interest and

participation in science. Theory into Practice, 52(1), 14-20.

doi:10.1080/07351690.2013.743760

Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change.

Psychological review, 84(2), 191.

Bandura, A. (1984). Recycling misconceptions of perceived self-efficacy. Cognitive therapy and

research, 8(3), 231-255.

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory.

Englewood Cliffs, NJ: Prentice-Hall.

Bandura, A. (1989). Human agency in social cognitive theory. American Psychologist, 44(9),

1175.

Bandura, A. (1993). Perceived self-efficacy in cognitive development and functioning.

Educational psychologist, 28(2), 117-148.

Bandura, A. (1997). Self-efficacy: The exercise of control (pp. 3-604). New York: W. H.

Freeman.

Banilower, E. R., Smith, P. S., Weiss, I. R., Malzahn, K. A., Campbell, K. M., & Weis, A. M.

(2013). Report of the 2012 national survey of science and mathematics education.

Horizon Research, Inc. (NJ1).

Barlow, A. T., Frick, T. M., Barker, H. L., & Phelps, A. J. (2014). Modeling instruction: The

impact of professional development on instructional practices. Science Educator, 23(1),

14.

Belcher, N. T. (2017). Modeling instruction in AP physics C: Mechanics and electricity and

magnetism

Blank, R. K., De las Alas, N., & Smith, C. (2008). Does teacher professional development have

effects on teaching and learning?: Analysis of evaluation findings from programs for

mathematics and science teachers in 14 states. Council of Chief State School Officers.

Birman, B. F., Desimone, L., Porter, A. C., & Garet, M. S. (2000). Designing professional

development that works. Educational leadership, 57(8), 28-33.

Bleicher, R. E. (2004). Revisiting the STEBI‐B: Measuring self‐efficacy in preservice

elementary teachers. School Science and Mathematics, 104(8), 383-391.

Bradshaw, T. J. (2012). Impact of inquiry based distance learning and availability of classroom

materials on physical science content knowledge of teachers and students in central

Appalachia. University of Kentucky.

Bray-Clark, N., & Bates, R. (2003). Self-efficacy beliefs and teacher effectiveness: Implications

for professional development. Professional Educator, 26(1), 13-22.

Brewe, E. (2008). Modeling theory applied: Modeling Instruction in introductory physics.

American Journal of Physics, 76(12), 1155-1160.

Bruce, C. D., Esmonde, I., Ross, J., Dookie, L., & Beatty, R. (2010). The effects of sustained

classroom-embedded teacher professional learning on teacher efficacy and related student

achievement. Teaching and Teacher Education, 26(8), 1598-1608.

Buczynski, S., & Hansen, C. B. (2010). Impact of professional development on teacher practice:

Uncovering connections. Teaching and teacher education, 26(3), 599-607.

Burgoon, J. N., Heddle, M. L., & Duran, E. (2010). Re-examining the similarities between

teacher and student conceptions about physical science. Journal of Science Teacher

Education, 21(7), 859-872. doi:10.1007/s10972-009-9177-0

Buxton, C. A. (2006). Creating contextually authentic science in a ‘‘low-performing’’ urban

elementary school. Journal of Research in Science Teaching, 43(7), 695–721.

Cabot, L. H. (2008). Transforming teacher knowledge: Modeling instruction in physics.

University of Washington.

Campbell, T., Oh, P., & Neilson, D. (2012). Discursive modes and their pedagogical functions in

model-based inquiry (MBI) classrooms. International Journal of Science

Education, 34(15), 2393-2419.

Chen, Y., Benus, M., & Yarker, M. (2016). Using models to support argumentation in the

science classroom. American Biology Teacher, 78(7), 549-559.

Chi, M. T. H. (2008). Three types of conceptual change: Belief revision, mental model

transformation, and categorical shift. In International Handbook of Research on

Conceptual Change (pp. 61-82). New York, NY: Routledge.

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed). Psychology

Press. New York, USA.

Colburn, A. (2000). An inquiry primer. Science scope, 23(6), 42-44.

Coletta, V. P., Phillips, J. A., & Steinert, J. J. (2007). Interpreting force concept inventory

scores: Normalized gain and SAT scores. Physical review special topics-physics

education research, 3(1), 010106.

Creswell, J. (2009). Research design: Qualitative, quantitative, and mixed methods approaches.

Sage publications.

Cunningham, W. G., & Cordeiro, P. A. (2009). Educational leadership: A bridge to improved

practice. (Fourth ed.). Boston: Pearson

Czerniak, C. M. (1990). A study of self-efficacy, anxiety, and science knowledge in preservice

elementary teachers. National Association for Research in Science Teaching, Atlanta,

GA.

Darling-Hammond, L. (2000). How teacher education matters. Journal of Teacher

Education, 51(3), 166-173. doi:10.1177/0022487100051003002

Darling-Hammond, L. (2006). Constructing 21st-century teacher education. Journal of Teacher

Education, 57(3), 300-314. doi:10.1177/0022487105285962

Darling-Hammond, L. (1999). Professional development for teachers: Setting the stage for

learning from teaching. Center for the Future of Teaching & Learning.

Darling-Hammond, L., Hyler, M. E., & Gardner, M. (2017). Effective teacher professional

development. Palo Alto, CA: Learning Policy Institute.

Darling-Hammond, L., Wei, R. C., Andree, A., Richardson, N., & Orphanos, S. (2009).

Professional learning in the learning profession. Washington, DC: National Staff

Development Council, 12.

DeChenne, S. E., Nugent, G., Kunz, G., Luo, L., Berry, B., Craven, K., & Riggs, A. (2012). A

case study of coaching in science, technology, engineering, and math professional

development. National Center for Research on Rural Education.

Deehan, J. (2016). The science teaching efficacy belief instruments (STEBI A and B): A

comprehensive review of methods and findings from 25 years of science education

research. Springer.

Desbien, D. M. (2002). Modeling discourse management compared to other classroom

management styles in university physics

Desimone, L. M. (2009). Improving impact studies of teachers’ professional development:

Toward better conceptualizations and measures. Educational Researcher, 38(3), 181-199.

Dewey, J. (1938). Experience and education. New York: Macmillan.

DiSpezio, M. (2010). Misconceptions in the science classroom. Science Scope, 34(1), 16-21.

Dogan, S., Pringle, R., & Mesa, J. (2016). The impacts of professional learning communities on

science teachers’ knowledge, practice and student learning: A review. Professional

development in education, 42(4), 569-588.

Drago-Severson, E. (2012). New opportunities for principal leadership: Shaping school climates

for enhanced teacher development. Teachers College Record, 114(3)

Dukerich, L. (2015). Applying modeling instruction to high school chemistry to improve

students' conceptual understanding. Journal of Chemical Education, 92(8), 1315; 1315.

Egan, K. (2010). Learning in depth: A simple innovation that can transform schooling.

University of Chicago Press.

Enochs, L. G., & Riggs, I. M. (1990). Further development of an elementary science teaching

efficacy belief instrument: A preservice elementary scale. School science and

mathematics, 90(8), 694-706.

Ergonenc, J., Neumann, K., & Fischer, H. E. (2014). The impact of pedagogical content

knowledge on cognitive activation and student learning. Quality of Instruction in Physics,

145-160.

Fleischman, H. L., Hopstock, P. J., Pelczar, M. P., & Shelley, B. E. (2010). Highlights from

PISA 2009: Performance of US 15-Year-Old Students in Reading, Mathematics, and

Science Literacy in an International Context. NCES 2011-004. National Center for

Education Statistics.

Ford, M. (2012). A dialogic account of sense-making in scientific argumentation and reasoning.

Cognition and Instruction, 30(3), 207.

Fox, A. M. (2014). Teacher self-efficacy, content and pedagogical knowledge, and their

relationship to student achievement in Algebra I. The College of William and Mary.

Fullan, M., & Langworthy, M. (2014). A rich seam: How new pedagogies find deep learning

MaRS Discovery District.

Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What makes

professional development effective? Results from a national sample of teachers.

American Educational Research Journal, 38(4), 915-945.

Gay, G. (2010). Culturally responsive teaching: Theory, research, and practice. Teachers

College Press.

Gerjets, P., Scheiter, K., & Catrambone, R. (2004). Designing instructional examples to reduce

intrinsic cognitive load: Molar versus modular presentation of solution procedures.

Instructional Science, 32(1-2), 33-58.

Gibson, S., & Dembo, M. H. (1984). Teacher efficacy: A construct validation. Journal of

Educational Psychology, 76(4), 569.

Goddard, R. D., Hoy, W. K., & Hoy, A. W. (2000). Collective teacher efficacy: Its meaning,

measure, and impact on student achievement. American Educational Research Journal,

37(2), 479-507.

Goldring, R., Gray, L., & Bitterman, A. (2013). Characteristics of Public and Private Elementary

and Secondary School Teachers in the United States: Results from the 2011-12 Schools

and Staffing Survey. First Look. NCES 2013-314. National Center for Education

Statistics.

Gooding, J., & Metz, B. (2011). From misconceptions to conceptual change: Tips for

identifying and overcoming students’ misconceptions. The Science Teacher, 78(4):

34-37.

Gulamhussein, A. (2013). Teaching the teachers: Effective professional development in an era of

high stakes accountability. Alexandria, VA: The Center for Public Education.

Guskey, T. R. (1988). Teacher efficacy, self-concept, and attitudes toward the implementation of

instructional innovation. Teaching and teacher education, 4(1), 63-69.

Guskey, T. R. (1994). Professional development in education: In search of the optimal mix.

Guskey, T. R., & Yoon, K. S. (2009). What works in professional development?. Phi delta

kappan, 90(7), 495-500.

Haag, S., & Megowan, C. (2015). Next generation science standards: A national Mixed

Methods study on teacher readiness. School Science and Mathematics, 115(8), 416-426.

Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student

survey of mechanics test data for introductory physics courses. American journal of

Physics, 66(1), 64-74.

Halim, L., Abdullah, S. I. S. S., & Meerah, T. S. M. (2014). Students’ perceptions of their

science teachers’ pedagogical content knowledge. Journal of Science Education and

Technology, 23(2), 227-237.

Halloun, I. A. (2004). Modeling theory in science education. Boston: Kluwer Academic

Publishers. doi:10.1007/1-4020-2140-2

Halloun, I. A. (2006). Modeling theory in science education: Science & technology education

library v. 24. NL: Springer.

Halloun, I. A. (2011). From modeling schemata to the profiling schema: Modeling across the

curricula for profile shaping education. In Models and modeling (pp. 77-96). Springer,

Dordrecht.

Halloun, I., Hake, R. R., Mosca, E. P., & Hestenes, D. (1995). Force Concept Inventory (Revised

1995). Available online (password protected) at http://modeling.la.asu.edu/R&E/Research.html

Halloun, I. A., & Hestenes, D. (1985). The initial knowledge state of college physics

students. American journal of Physics, 53(11), 1043-1055.

Halpern, D. F., Eliot, L., Bigler, R. S., Fabes, R. A., Hanish, L. D., et al. (2011). The

pseudoscience of single-gender schooling. Science, 23, 1706–1707.

doi:10.1126/science.1205031.

Haney, J. J., Czerniak, C. M., & Lumpe, A. T. (1996). Teacher beliefs and intentions regarding

the implementation of science education reform strands. Journal of Research in Science

Teaching: The Official Journal of the National Association for Research in Science

Teaching, 33(9), 971-993.

Hattie, J. (2012). Visible learning for teachers: Maximizing impact on learning. Routledge.

Hattie, J. (2015). What works best in education: The politics of collaborative expertise. British

Columbia Teachers' Federation.

Hayes, E. R. (2001). A new look at women's learning. New Directions for Adult and Continuing

Education, 2001(89), 35. doi:10.1002/ace.6

Heller, P., & Stewart, G. (2010). College ready physics standards: A look to the future. Physics

Teacher, ISSN, 151.

Hestenes, D. (1987). Toward a modeling theory of physics instruction. American Journal of

Physics, 55(5), 440-454.

Hestenes, D. (1997, March). Modeling methodology for physics teachers. In AIP conference

proceedings (Vol. 399, No. 1, pp. 935-958). AIP.

Hestenes, D. (2006). Notes for a modeling theory. In Proceedings of the 2006 GIREP

conference: Modeling in physics and physics education (Vol. 31, p. 27). Amsterdam:

University of Amsterdam.

Hestenes, D. (2007). Findings of the Modeling Workshop Project. excerpt from Final Report

submitted to the National Science Foundation for the Teacher Enhancement grant

Modeling Instruction in High School Physics. Downloaded from http://modeling.asu.edu/r&e/research.html

Hestenes, D. (2010). Modeling theory for math and science education. In Modeling students'

mathematical modeling competencies (pp. 13-41). Springer, Boston, MA.

Hestenes, D. (2013). Remodeling Science Education. European Journal of Science and

Mathematics Education, 1(1), 13-22.

Hestenes, D. (2015). Conceptual Modeling in physics, mathematics and cognitive science.

Hestenes, D., & Halloun, I. (1995). Interpreting the force concept inventory: A response to

March 1995 critique by Huffman and Heller. The Physics Teacher, 33(8), 502.

Hestenes, D., Megowan-Romanowicz, C., Osborn Popp, S. E., Jackson, J., & Culbertson, R. J.

(2011). A graduate program for high school physics and physical science teachers.

American Journal of Physics, 79(9), 971-979.

Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. The Physics

Teacher, 30(3), 141-158.

Hill, J., & Stearns, C. (2015). Education and Certification Qualifications of Departmentalized

Public High School-Level Teachers of Selected Subjects: Evidence from the 2011-12

Schools and Staffing Survey. NCES 2015-814. National Center for Education Statistics.

Hinrichs, B.E. (2006). Using the system schema representational tool to promote student

understanding of Newton’s third law. In AIP Conference Proceedings (Vol. 790, No. 1,

pp. 117-120). AIP.

Hobbs, L. (2012). Teaching out-of-field : Factors shaping identities of secondary science and

mathematics. Teaching Science, 58(1), 21-29.

Hodges, C., Gale, J., & Meng, A. (2016). Teacher self-efficacy during the implementation of a

problem-based science curriculum. Contemporary Issues in Technology and Teacher

Education, 16(4), 434-451.

Ingersoll, R. (2002). Out-of-field teaching, educational inequality, and the organization of

schools: An exploratory analysis.

Jackson, J. (2010). Arizona State University’s preparation of out-of-field physics teachers: MNS

summer program. Journal of Physics Teacher Education Online, 5(4), 2-10.

Jackson, J., Dukerich, L., & Hestenes, D. (2008). Modeling Instruction: An Effective Model for

Science Education. Science Educator, 17(1), 10-17.

Knaggs, C. M., & Sondergeld, T. A. (2015). Science as a learner and as a teacher: Measuring

science self-efficacy of elementary preservice teachers: Science as a learner and as a

teacher. School Science and Mathematics, 115(3), 117-128. doi:10.1111/ssm.12110

Kleickmann, T., Richter, D., Kunter, M., Elsner, J., Besser, M., Krauss, S., & Baumert, J. (2013).

Teachers’ content knowledge and pedagogical content knowledge: The role of structural

differences in teacher education. Journal of Teacher Education, 64(1), 90-106.

doi:10.1177/0022487112460398

Krepf, M., Plöger, W., Scholl, D., & Seifert, A. (2018). Pedagogical content knowledge of

experts and novices—what knowledge do they activate when analyzing science lessons?.

Journal of Research in Science Teaching, 55(1), 44-67. Doi: 10.1002/tea.

Lakshmanan, A., Heath, B. P., Perlmutter, A., & Elder, M. (2011). The impact of science content

and professional learning communities on science teaching efficacy and standards-based

instruction. Journal of Research in Science Teaching, 48(5), 534-551.

Lasry, N., Rosenfield, S., Dedic, H., Dahan, A., & Reshef, O. (2011). The puzzling reliability of

the Force Concept Inventory. American Journal of Physics, 79(9), 909-912.

Lehrer, R., & Schauble, L. (2000). Developing model-based reasoning in mathematics and

science. Journal of Applied Developmental Psychology, 21(1), 39-48.

Leithwood, K., Jantzi, D., & Steinbach, R. (1999). Changing leadership for changing times.

McGraw-Hill Education (UK).

Lekhu, M. A. (2013). Relationship between self-efficacy beliefs of science teachers and their

confidence in content knowledge. Journal of Psychology in Africa, 23(1), 109-112.

Liang, L. L., & Richardson, G. M. (2009). Enhancing prospective teachers’ science teaching

efficacy beliefs through scaffolded, student-directed inquiry. Journal of Elementary

Science Education, 21(1), 51-66.

Loughran, J., Berry, A., & Mulhall, P. (2012). Understanding and Developing ScienceTeachers’

Pedagogical Content Knowledge (Vol. 12). Springer Science & Business Media.

Loewenberg Ball, D., Thames, M. H., & Phelps, G. (2008). Content knowledge for teaching:

What makes it special? Journal of Teacher Education, 59(5), 389-407.

Manthey, S., & Brewe, E. (2013). Toward university modeling instruction—Biology: Adapting

curricular frameworks from physics to biology. CBE—Life Sciences Education, 12(2),

206-214.

Martin, C. L. (2018). Correlational analysis of self-efficacy and technological pedagogical

content knowledge of board certified teachers.

Marzano, R. J. (2003). What works in schools: Translating research into action. Alexandria,

Virginia: Association for Supervision and Curriculum Development.

McDermott, L. C. (1991). Millikan Lecture 1990: What we teach and what is learned—Closing

the gap. American Journal of Physics, 59(4), 301-315.

McFarland, J., Hussar, B., Wang, X., Zhang, J., Wang, K., Rathbun, A., Barmer, A., Cataldi, E.

F., & Mann, F. B. (2018). The Condition of Education 2018. NCES 2018-144. National

Center for Education Statistics.

Megowan, M. C. (2007). Framing discourse for optimal learning in science and mathematics.

Megowan-Romanowicz, C. (2016). Whiteboarding: A tool for moving classroom discourse from

answer-making to sense-making. The Physics Teacher, 54(2), 83-86.

Menon, D., & Sadler, T. D. (2016). Preservice elementary teachers’ science self-efficacy beliefs

and science content knowledge. Journal of Science Teacher Education, 27(6), 649-673.

Mertler, C. A., & Vannatta, R. V. (2010). Advanced and multivariate statistical methods:

Practical application and interpretation. Routledge.

Milner IV, H. R. (2007). Race, culture, and researcher positionality: Working through dangers

seen, unseen, and unforeseen. Educational Researcher, 36(7), 388-400.

Mulford, D. R., & Robinson, W. R. (2002). An inventory for alternate conceptions among

first-semester general chemistry students. Journal of Chemical Education, 79(6), 739.

doi:10.1021/ed079

Mulholland, J., & Wallace, J. (2001). Teacher induction and elementary science teaching:

Enhancing self-efficacy. Teaching and Teacher Education, 17(2), 243-261.

doi:10.1016/S0742-051X(00)00054-8

National Academies of Sciences, Engineering, and Medicine. (2015). Science teachers' learning:

Enhancing opportunities, creating supportive contexts. National Academies Press.

National Commission on Mathematics and Science Teaching for the 21st Century.

(2000). Before it's too late: A report to the nation from the National Commission on

Mathematics and Science teaching for the 21st century. Washington, D.C: U.S. Dept. of

Education.

National Research Council. (1996). National science education standards. National Academies

Press.

National Research Council. (1997). Science teaching reconsidered: A handbook. National

Academies Press.

National Research Council. (2000). Inquiry and the national science education standards: A

guide for teaching and learning. National Academies Press.

National Research Council. (2012). A framework for K-12 science education: Practices,

crosscutting concepts, and core ideas. National Academies Press.

National Science Foundation. (2001). Toward a More Effective Role for the US Government in

International Science and Engineering. Washington, D.C.

National Science Foundation. (2015). Women, minorities, and persons with disabilities in

science and engineering: 2015.

National Science Teachers Association - NSTA. (n.d.). About NSTA. Retrieved from

https://www.nsta.org/about/clpa/faq.aspx

National Science Teachers Association - NSTA. (n.d.). NSTA Position Statement. Retrieved

from https://www.nsta.org/about/positions/genderequity.aspx

Nauerth, D. A. (2015). The impact of lesson study professional development on teacher self

-efficacy and outcome expectancy (Doctoral dissertation, Kansas State University).

Newman, W. J., Abell, S. K., Hubbard, P. D., McDonald, J., Otaala, J., & Martini, M. (2004).

Dilemmas of teaching inquiry in elementary science methods. Journal of Science Teacher

Education, 15(4), 257-279. doi:10.1023/B:JSTE.0000048330.07586.d6

New York Science Teacher. 2010. Common science misconceptions.

www.newyorkscienceteacher.com/sci/pages/miscon/subject-index.php

NGSS, Lead States. (2013). Next Generation Science Standards: For States, by States. Retrieved

from https://ebookcentral.proquest.com

Obama, B. (2011). Remarks by the President in State of the Union Address. White House

transcript. January, 25.

Organization for Economic Co-operation and Development. (2016). PISA 2015 results (Volume

I): Excellence and equity in education. OECD Publishing.

Ohio Department of Education. (2011). Ohio’s new learning standards: Science standards.

Columbus, Ohio.

Ohio Department of Education. (2015). Ohio’s 2015 plan to ensure equitable access to excellent

educators. Columbus, Ohio.

Ohle, A., Boone, W. J., & Fischer, H. E. (2015). Investigating the impact of teachers’ physics ck

on students’ outcomes. International Journal of Science and Mathematics Education,

13(6), 1211-1233.

Passmore, C., Stewart, J., & Cartier, J. (2009). Model-based inquiry and school science: Creating

connections. School Science and Mathematics, 109(7), 394-402.

Petty, G. (2009). Evidence-based teaching: A practical approach. Nelson Thornes.

Posner, G. J., Strike, K. A., Hewson, P. W., & Gertzog, W. A. (1982). Accommodation of a

scientific conception: Toward a theory of conceptual change. Science Education, 66(2),

211-27.

Ramaley, J. A. (2002). New truths and old verities. New directions for higher education,

2002(119), 15-22.

Ramey-Gassert, L., Shroyer, M. G., & Staver, J. R. (1996). A qualitative study of factors

influencing science teaching self-efficacy of elementary level teachers. Science

Education, 80(3), 283-315.

Rice, D. C., & Roychoudhury, A. (2003). Preparing more confident preservice elementary

science teachers: One elementary science methods teacher's self-study. Journal of

Science Teacher Education, 14(2), 97-126.

Riggs, I. M., & Enochs, L. G. (1989). Toward the development of an elementary teacher's

science teaching efficacy belief instrument. Science Education, 74(6), 625-637.

Rosenblatt, R. J. (2012). Identifying and addressing student difficulties and misconceptions:

Examples from physics and from materials science and engineering

Ross, J.A., & Bruce, C.D. (2007). Professional development effects on teacher efficacy:

Results of randomized field trial. Journal of Educational Research, 101(1), 50–60.

Royce, B. R. (2012). Evaluation of student thinking on the ABCC. Unpublished master's

thesis, Fresno Pacific University, Fresno, California. Retrieved from

http://modeling.asu.edu/thesis/RoyceBrendaABCCthesis.pdf

Sadler, P. M., & Sonnert, G. (2016). Understanding misconceptions: Teaching and learning in

middle school physical science. American Educator, 40(1), 26.

Sadler, P. M., Sonnert, G., Coyle, H. P., Cook-Smith, N., & Miller, J. L. (2013). The influence

of teachers’ knowledge on student learning in middle school physical science classrooms.

American Educational Research Journal, 50(5), 1020-1049.

Sadler, P. M., Sonnert, G., Hazari, Z., & Tai, R. (2012). Stability and volatility of STEM career

interest in high school: A gender study. Science Education, 96(3), 411-427.

doi:10.1002/sce.21007

Sahgal, P., & Pathak, A. (2007). Transformational leaders: Their socialization, self-concept,

and shaping experiences. International Journal of Leadership Studies, 2(3), 263-279.

Sargent, S., Ferrell, J., Smith, M., & Scroggins, J. (2018). Outcome expectancy in literacy: Is

average good enough? Reading Improvement, 55(1), 1.

Savinainen, A., & Viiri, J. (2008). The force concept inventory as a measure of students

conceptual coherence. International Journal of Science and Mathematics Education,

6(4), 719-740.

Schoon, K. J., & Boone, W. J. (1998). Self-efficacy and alternative conceptions of science of

preservice elementary teachers. Science Education, 82(5), 553-568.

Shroyer, G., Riggs, I., & Enochs, L. (2014). Measurement of science teachers’ efficacy beliefs:

The role of the science teaching efficacy belief instrument. In The Role of Science

Teachers' Beliefs in International Classrooms (pp. 103-118). Brill Sense.

Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational

Researcher, 15(2), 4-14. doi:10.2307/1175860

Shulman, L. (1987). Knowledge and teaching: Foundations of the new reform. Harvard

Educational Review, 57(1), 1-23.

Shulman, L. S. (2013). Those who understand: Knowledge growth in teaching. Journal of

Education, 193(3), 1-11.

Shulman, L. S., & Shulman, J. H. (2004). How and what teachers learn: A shifting perspective.

Journal of Curriculum Studies, 36(2), 257-271.

SPSS, IBM. (2015). IBM SPSS statistics for Windows, version 23.0. New York: IBM

Corp, 440.

Swackhamer, L. E., Koellner, K., Basile, C., & Kimbrough, D. (2009). Increasing the self

efficacy of inservice teachers through content knowledge. Teacher Education Quarterly,

36(2), 63-78.

Swars, S. L., & Dooley, C. M. (2010). Changes in teaching efficacy during a professional

development school‐based science methods course. School Science and Mathematics,

110(4), 193-202.

Thornton, R. K., Kuhl, D., Cummings, K., & Marx, J. (2009). Comparing the force and motion

conceptual evaluation and the force concept inventory. Physical Review Special

Topics-Physics Education Research, 5(1), 010105.

Townsend, T., Ivory, G., Acker-Hocevar, M. A., Ballenger, J., & Place, A. W. (2013). School

leadership challenges under No Child Left Behind: Lessons from UCEA’s project. A.

Shoho & B. Barnett (Series Eds.) & B. Barnett, A. Shoho, & A. Bowers (Vol. Eds.),

International Research on School Leadership, 4.

Trusz, S. (2018). Four mediation models of teacher expectancy effects on students’ outcomes in

mathematics and literacy. Social Psychology of Education, 21(2), 257-287.

Tschannen-Moran, M., & Hoy, A. W. (2001). Teacher efficacy: Capturing an elusive construct.

Teaching and Teacher Education, 17(7), 783-805.

Tschannen-Moran, M., Hoy, A. W., & Hoy, W. K. (1998). Teacher efficacy: Its meaning and

measure. Review of Educational Research, 68(2), 202-248.

Tschannen-Moran, M., & Hoy, A. W. (2007). The differential antecedents of self-efficacy beliefs

of novice and experienced teachers. Teaching and Teacher Education, 23(6), 944-956.

U.S. Department of Education, Mathematics and Science Expert Panel. (2001). Exemplary and

Promising Science Programs 2001 (ED Publication No. 1.310/2:460863). Retrieved from

www.phschool.com/ebs/award_winning.pdf

Van Hook, S. J. & Huziak-Clark, T. (2008) Lift, squeeze, stretch and twist: Developing

kindergartners’ understanding of energy using inquiry-based instruction. The Journal of

Elementary Science Education, V. 20(3),1-16.

Vosniadou, S., Vamvakoussi, X., & Skopeliti, I. (2008). The framework theory approach to

the problem of conceptual change. In S. Vosniadou (Ed.), International Handbook of

Research on Conceptual Change (pp. 3-34). New York, NY: Routledge.

Warner, R. M. (2008). Applied statistics: From bivariate through multivariate techniques. Sage.

Wells, M., Hestenes, D., & Swackhamer, G. (1995). A modeling method for high school

physics instruction. American Journal of Physics, 63(7), 606-619.

Wilcox, J., Kruse, J. W., & Clough, M. P. (2015). Teaching science through inquiry. The Science

Teacher, 82(6), 62.

Williams, D. M. (2010). Outcome expectancy and self-efficacy: Theoretical implications of an

unresolved contradiction. Personality and Social Psychology Review, 14(4), 417-425.

Windschitl, M., Thompson, J., Braaten, M., & Stroupe, D. (2012). Proposing a core set of

instructional practices and tools for teachers of science. Science Education, 96(5), 878-

903.

Windschitl, M., & Thompson, J. (2006). Transcending simple forms of school science

investigation: The impact of preservice instruction on teachers' understandings of

model-based inquiry. American Educational Research Journal, 43(4), 783-835.

doi:10.3102/00028312043004783

Witkin, H. A., & Goodenough, D. R. (1981). Cognitive styles, essence and origins: Field

dependence and field independence. New York: International Universities Press.

Wright, T. L. (2012). The effects of modeling instruction on high school physics academic

achievement.

Yoon, K. S., Duncan, T., Lee, S. W., Scarloss, B., & Shapley, K. L. (2007). Reviewing the

evidence on how teacher professional development affects student achievement. issues &

answers. REL 2007-no. 033. Regional Educational Laboratory Southwest (NJ1).

Zhang, M., Parker, J., Koehler, M. J., & Eberhardt, J. (2015). Understanding inservice science

teachers’ needs for professional development. Journal of Science Teacher Education,

26(5), 471-496.


APPENDIX A. INSTITUTIONAL REVIEW BOARD STATEMENT

DATE: June 19, 2019

TO: Gloria Gajewicz

FROM: Bowling Green State University Institutional Review Board

PROJECT TITLE: [1156285-1] Examination of the Change in Science Content Knowledge, Personal Science Teacher Efficacy, and Science Teacher Outcome Expectancy As a Result of Participation in Modeling Instruction Professional Development

SUBMISSION TYPE: New Project

ACTION: DETERMINATION OF EXEMPT STATUS

DECISION DATE: June 19, 2019

REVIEW CATEGORY: Exemption category # 4

Thank you for your submission of New Project materials for this project. The Bowling Green State University Institutional Review Board has determined this project is exempt from IRB review according to federal regulations AND that the proposed research has met the principles outlined in the Belmont Report. You may now begin the research activities.

Note that changes cannot be made to exempt research because of the possibility that proposed changes may change the research in such a way that it no longer meets the criteria for exemption. If you want to make changes to this project, contact the Office of Research Compliance for guidance.

We will retain a copy of this correspondence within our records.

If you have any questions, please contact the Office of Research Compliance at 419-372-7716 or [email protected]. Please include your project title and reference number in all correspondence with this committee.

This letter has been electronically signed in accordance with all applicable regulations, and a copy is retained within Bowling Green State University Institutional Review Board's records.

APPENDIX B. SCIENCE TEACHER ATTITUDES SURVEY

Anonymous identifier: In the next four questions, we ask you for a series of letters and/or numbers to use for building an anonymous identifier to match your pre- and posttest scores.

1. What is the 5-digit zip code of your home address? (your current home address--if you're not from the US, please enter the zip code of your current residence in the US)
2. Enter the two-digit month and day of your birthday (for example, if your birthday is May 23rd you would enter 0523).
3. Enter the two-letter uppercase abbreviation for the state in which this workshop is taking place (e.g., if your workshop is in California you would enter CA).
4. Enter the first five digits of your street address (e.g., if your street address is 5808, you would enter "05808").
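The short sketch below shows, purely as an illustration, how these four responses could be concatenated into a single anonymous code for matching a participant's pre- and post-workshop surveys; the helper function and example values are hypothetical and are not part of the survey platform.

def build_identifier(home_zip: str, birthday_mmdd: str,
                     workshop_state: str, street_digits: str) -> str:
    # Combine the four survey responses in a fixed order to form the matching code.
    return f"{home_zip}-{birthday_mmdd}-{workshop_state.upper()}-{street_digits}"

# Example: zip 43402, birthday May 23, workshop in Ohio, street number 05808
print(build_identifier("43402", "0523", "OH", "05808"))  # -> 43402-0523-OH-05808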

Self-Efficacy and Outcome Expectancy: Please indicate the degree to which you agree or disagree with each statement below using the following scale:

Strongly Agree = 5, Agree = 4, Uncertain = 3, Disagree = 2, Strongly Disagree = 1

5. When a student does better than usual in science, it is often because the teacher exerted extra effort.
6. I am continually finding better ways to teach science.
7. When the science grades of students improve, it is most often due to teachers finding a more effective teaching approach.
8. I know the steps necessary to teach science effectively.
9. I am effective in monitoring science activities.
10. I am effective in monitoring engineering activities.
11. If students are underachieving in science, it is most likely due to ineffective science teaching.
12. The inadequacy of a student's science background can be overcome by good teaching.
13. The low science achievement of some students cannot generally be blamed on their teachers.
14. When a low achieving child progresses in science, it is usually due to extra attention given by the teacher.
15. I understand science concepts well enough to be effective in teaching the science courses I am assigned to teach.
16. The teacher is responsible for student achievement in science.
17. A student's achievement in science is directly related to their teacher's effectiveness in science teaching.
18. If parents comment that their child is showing more interest in science at school, it is probably due to the performance of the child's teacher.
19. I find it difficult to explain to students why science experiments produce the data that they do.
20. Effectiveness in science teaching has little influence on the achievement of students with low motivation.
21. I am typically able to answer students' science questions.
22. I am typically able to answer students' engineering questions.
23. I am confident in integrating engineering practices into my science instruction.
24. I am typically anxious about my preparation in the area of engineering.
25. I currently have the necessary skills to integrate science and engineering practices into my teaching.
26. I am currently able to use science activities in my classroom.
27. I am currently able to use engineering activities in my classroom.
28. I am currently able to teach content using NGSS science and engineering practices.

Engagement Questions: Please indicate the degree to which you agree or disagree with each statement below using the following scale:

Strongly Agree = 5, Agree = 4, Uncertain = 3, Disagree = 2, Strongly Disagree = 1

29. My classroom is usually interactive.
30. Students in my class interact with me often.
31. Students in my class interact with each other often.
32. I don't check for understanding (verify that my students understand) often enough.
33. I usually check for understanding (verify that my students understand) by asking questions out loud.
34. I usually check for understanding (verify that my students understand) by giving quizzes/tests.
35. I usually check for understanding (verify that my students understand) by assessing a project/presentation/written item.
36. I usually check for understanding (verify that my students understand) by observing student actions/expressions.
37. Most of my feedback to students illustrates what they know and what they don't (i.e., what they got right and what they got wrong).
38. Most of my feedback to students shows them what they should do next (i.e., comments on something to fix and how).
39. I provide feedback to my students often.

Demographics: Please provide some basic demographic information about yourself.

40. If you would be willing to complete a follow-up survey if contacted a year from now, please enter an email address where we can contact you.
41. Please comment on your confidence in integrating science and/or engineering practices into your teaching.
42. Do you teach middle school or high school? a. Choose 1 for high school b. 0 for middle school c. NA and enter comment.
43. What subjects are you prepared to teach? (If you are prepared to teach more than one subject, check your area of greatest preparation and list others in the comment box) a. physics b. chemistry c. biology d. earth and space science e. middle school science f. engineering g. technology h. mathematics i. other: please comment

44. What subjects are you assigned to teach? (If more than one, please check your primary subject assignment and list the other subjects you teach in the comment box.)

a. physics b. chemistry c. biology d. earth and space science e. middle school science f. engineering g. technology h. mathematics i. other: please comment
45. How many years have you been teaching? (Only numbers may be entered in this field.)
46. What is your highest level of educational attainment? Bachelor's degree = 1; Master's = 2; Doctoral = 3
47. What was your major? Please write your answer here:
48. How many other Modeling Workshops have you taken besides this one? a. 0 b. 1 c. 2 d. 3 e. 4 f. 5 g. 6 h. 7 i. 8 j. 9
49. Is your school an urban school? Yes = 1; No = 0
50. Is the student population of your school more than 50% free and reduced lunch? Yes = 1; No = 0
51. What is the 5-digit zip code of your school? (If you are between schools, please enter 00000.)
52. What is your gender? Male = 1; Female = 0

Thank you for completing this survey.

The STA was developed by Susan Haag and Colleen Megowan-Romanowicz (2011).
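As a purely illustrative aside, and not part of the instrument or the study's actual dataset, the coded demographic items above (items 42 and 45-52) might be stored for analysis along the following lines; the variable names and example values are assumptions introduced here only for illustration.

# Illustrative sketch only (not part of the survey or the study's dataset):
# one teacher's coded demographic responses, using assumed variable names.
teacher_demographics = {
    "high_school": 1,      # item 42: 1 = high school, 0 = middle school
    "years_teaching": 12,  # item 45: numeric entry (example value)
    "degree_level": 2,     # item 46: 1 = Bachelor's, 2 = Master's, 3 = Doctoral
    "prior_workshops": 1,  # item 48: previous Modeling Workshops taken
    "urban_school": 0,     # item 49: 1 = yes, 0 = no
    "high_poverty": 1,     # item 50: >50% free/reduced lunch, 1 = yes, 0 = no
    "gender": 0,           # item 52: 1 = male, 0 = female
}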


APPENDIX C. ASSESSMENT OF BASIC CHEMISTRY CONCEPTS SAMPLE QUESTIONS

1. Which of the following must be the same before and after a chemical reaction?
A. The sum of the masses of all substances involved.
B. The number of molecules of all substances involved.
C. The number of atoms of each type involved.
D. Both (A) and (C) must be the same.
E. Each of the answers (A), (B), and (C) must be the same.

2. Assume a beaker of pure water has been boiling for 30 minutes. What is in the bubbles in the boiling water?
A. Heat.
B. Air.
C. Oxygen gas and hydrogen gas.
D. Oxygen gas.
E. Gaseous water.

3. A glass of cold milk sometimes forms a coat of water on the outside of the glass (often referred to as 'sweat'). How does most of the water get there?
A. Water molecules from the air condense onto the outside of the glass.
B. Water molecules from the milk pass through the glass and condense on the outside of the glass.
C. Water molecules evaporate from the milk and condense on the outside of the glass.
D. The coldness causes oxygen and hydrogen from the air to combine on the glass, forming water.

4. What is the weight of the solution when 1 pound of salt is dissolved in 20 pounds of water?
A. More than 21 pounds.
B. 21 pounds.
C. Between 20 and 21 pounds.
D. 20 pounds.
E. Less than 20 pounds.

5. The diagram represents a mixture of sulfur (S) atoms and oxygen (O₂) molecules in a closed container. Which diagram shows the results after the mixture reacts as completely as possible according to the equation 2S + 3O₂ → 2SO₃? (Diagrams not reproduced here.)

APPENDIX D. FORCE CONCEPT INVENTORY SAMPLE QUESTIONS

1. Two metal balls are the same size but one weighs twice as much as the other. The balls are dropped from the roof of a single story building at the same instant of time. The time it takes the balls to reach the ground below will be:
(A) about half as long for the heavier ball as for the lighter one.
(B) about half as long for the lighter ball as for the heavier one.
(C) about the same for both balls.
(D) considerably less for the heavier ball, but not necessarily half as long.
(E) considerably less for the lighter ball, but not necessarily half as long.

2. The two metal balls of the previous problem roll off a horizontal table with the same speed. In this situation:
(A) both balls hit the floor at approximately the same horizontal distance from the base of the table.
(B) the heavier ball hits the floor at about half the horizontal distance from the base of the table than does the lighter ball.
(C) the lighter ball hits the floor at about half the horizontal distance from the base of the table than does the heavier ball.
(D) the heavier ball hits the floor considerably closer to the base of the table than the lighter ball, but not necessarily at half the horizontal distance.
(E) the lighter ball hits the floor considerably closer to the base of the table than the heavier ball, but not necessarily at half the horizontal distance.

3. A stone dropped from the roof of a single story building to the surface of the earth:
(A) reaches a maximum speed quite soon after release and then falls at a constant speed thereafter.
(B) speeds up as it falls because the gravitational attraction gets considerably stronger as the stone gets closer to the earth.
(C) speeds up because of an almost constant force of gravity acting upon it.
(D) falls because of the natural tendency of all objects to rest on the surface of the earth.
(E) falls because of the combined effects of the force of gravity pushing it downward and the force of the air pushing it downward.

4. A large truck collides head-on with a small compact car. During the collision:
(A) the truck exerts a greater amount of force on the car than the car exerts on the truck.
(B) the car exerts a greater amount of force on the truck than the truck exerts on the car.
(C) neither exerts a force on the other; the car gets smashed simply because it gets in the way of the truck.
(D) the truck exerts a force on the car but the car does not exert a force on the truck.
(E) the truck exerts the same amount of force on the car as the car exerts on the truck.

APPENDIX E. SCORING FOR THE SCIENCE TEACHER ATTITUDES SURVEY

Sum Scale Items

Items from the two scales are scattered randomly throughout the Science Teacher Attitudes (STA) survey. The scale designed to measure Personal Science Teaching Efficacy (PSTE) consists of:

questions 6, 8, 9, 10, 15, 19, 21, 22, 23, 24, 25, 26, 27, and 28 (14 items)

The scale for Science Teaching Outcome Expectancy (STOE) consists of:

questions 5, 7, 11, 12, 13, 14, 16, 17, 18, and 20 (10 items)

Reverse Selected Response Values

The following questions must be reverse scored to produce consistent values between positively and negatively worded questions. Reversing the scores on these items produces high scores for respondents with strong PSTE and STOE beliefs and low scores for respondents with weak PSTE and STOE beliefs.

questions 14, 19, 20, and 24

Recode command: 5 = 1; 4 = 2; 3 = 3; 2 = 4; 1 = 5. Recode the four questions above before summing the scale scores.
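For illustration only, the recoding and scale-sum steps described above could be scripted as follows. This is a minimal sketch, not taken from the dissertation; it assumes the STA responses are held in a pandas DataFrame whose columns are named q5 through q28, and the function name score_sta is likewise an assumption.

# A minimal scoring sketch: applies the recode and scale sums described above.
import pandas as pd

PSTE_ITEMS = [6, 8, 9, 10, 15, 19, 21, 22, 23, 24, 25, 26, 27, 28]
STOE_ITEMS = [5, 7, 11, 12, 13, 14, 16, 17, 18, 20]
REVERSED_ITEMS = [14, 19, 20, 24]  # negatively worded items

def score_sta(responses: pd.DataFrame) -> pd.DataFrame:
    """Return PSTE and STOE sum scores from 5-point Likert responses."""
    data = responses.copy()
    # Recode 5=1, 4=2, 3=3, 2=4, 1=5 on the reverse-scored items.
    for item in REVERSED_ITEMS:
        data[f"q{item}"] = 6 - data[f"q{item}"]
    scores = pd.DataFrame(index=data.index)
    scores["PSTE"] = data[[f"q{i}" for i in PSTE_ITEMS]].sum(axis=1)
    scores["STOE"] = data[[f"q{i}" for i in STOE_ITEMS]].sum(axis=1)
    return scores

Run separately on the pre- and post-workshop responses, a procedure like this yields each teacher's PSTE and STOE sums for comparison.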