Annual Reporting Measures
APRIL 30, 2019

ANNUAL REPORTING MEASURES: COLLECTING INFORMATION ON PROGRAM IMPACT AND OUTCOMES

Berea College Education Studies Department Teacher Education Program

Table of Contents

Purpose of Report
The Eight CAEP Annual Reporting Measures
Measure 1: Impact on P-12 Learning & Development
Measure 2: Indicators of Teaching Effectiveness
Measure 3: Employer Satisfaction and Completer Persistence
Measure 4: Completer Satisfaction
Measure 5: Completion/Graduation Rates
Measure 6: Licensure/Certification Rates
Measure 7: Employment Rate
Measure 8: Consumer Information

Purpose of Report

The Council for the Accreditation of Educator Preparation (CAEP) requires that Educator Preparation Providers (EPPs) publicly share program information and data in a form accessible to multiple stakeholders. At Berea College, the EPP is known as the Teacher Education Program (TEP) within the Education Studies Department (EDS). Eight annual reporting measures are used to examine both program impact and program outcomes. The 2018 CAEP Handbook[1] defines the measures as "basic indicators" of EPP performance that are associated with "candidates as they complete preparation, and to completers once they are on the job" (p. 26). Data for these measures were gathered from multiple sources, including departmental surveys and focus groups; reports generated by the Berea College Office of Institutional Research and Assessment; and reports published by state agencies such as the Kentucky Center for Statistics (KYStats) and the Kentucky Council on Postsecondary Education.

[1] At this time, the referenced (and most recent) handbook is the CAEP Handbook: Initial Level Programs 2018. The current CAEP Accreditation Handbook can be accessed at: http://caepnet.org/accreditation/caep-accreditation/caep-accreditation-handbook

The Eight CAEP Annual Reporting Measures

IMPACT: Performance of program completers once they are employed as P-12 teachers

1. Impact on P-12 Learning & Development (Component 4.1)
2. Indicators of Teaching Effectiveness (Component 4.2)
3. Employer Satisfaction and Completer Persistence (Component 4.3)
4. Completer Satisfaction (Component 4.4)

OUTCOME: Outcomes of program preparation and consumer reporting information

5. Completion/Graduation Rates
6. Licensure/Certification Rates
7. Employment Rate
8. Consumer Information

Measure 1: Impact on P-12 Learning & Development

At this time, Kentucky no longer has a mandatory, state-funded process for obtaining data related to completers' performance and P-12 student growth. This is due to a change in regulation that once required newly inducted teachers (program completers) to participate in the Kentucky Teacher Internship Program (KTIP). Participation in KTIP generated data on completers' performance in the classroom. Additionally, new leadership at the Kentucky Department of Education has indicated that the state will continue its practice of not releasing P-12 student data. These changes affect the department's plans for collecting data on P-12 student learning and development.

Given these changes in state leadership and mandated processes, EDS will focus on working with local school districts to obtain P-12 student data. Relationship building with community partners has been a priority for EDS thus far during the 2018-2019 academic year. EDS has begun discussions with the Community of Teachers (a stakeholder group consisting of P-12 educators) on how to obtain P-12 student data in a way that respects school districts, completers, and P-12 students. EDS will continue this collaboration and formulate plans for obtaining P-12 student data during summer retreats planned for the Community of Teachers and for EDS faculty and staff. EDS anticipates that data collection will begin during the 2019-2020 academic year, after receiving guidance and permission from school partners.
EDS will continue conducting completer focus groups so that the department can maintain strong relationships with completers and learn how they use their program preparation to positively influence their students. From analysis of focus group discussions, EDS has learned that completers apply program knowledge to measure student growth through formative assessments. Completers described how they analyze and interpret assessment results in order to better understand students' abilities, measure students' progress, and inform classroom interventions. P-12 student growth data are available through completers' classroom assessments. EDS must ensure that the processes for sharing assessment results and drawing conclusions from those data benefit both completers and EDS.

Measure 2: Indicators of Teaching Effectiveness

To successfully meet this component, CAEP requires that teaching effectiveness be demonstrated "through structured and validated observation instruments and/or student surveys." To date, EDS has collected employer survey and completer survey data that asked about teaching effectiveness (data are presented below). However, since observation instruments and student surveys are the desired sources of evidence, EDS will collaborate with its partners to develop processes as well as observation tools and P-12 student perception surveys. The measures need to be developmentally appropriate for users and must be approved by the school districts that are our collaborators. Additionally, data from these measures should inform the professional growth of program completers and provide them with feedback they can use to refine their practice.

Employer Survey Results

The employer survey is a departmental survey that was administered for the first time in January 2019 to the principals of program completers.
Survey indicators are elements from the Charlotte Danielson Framework for Teaching[2] that were selected by stakeholders; content validity for the indicators was established using the Lawshe (1975) method. Employers were identified from the responses of 2015-2016 and 2016-2017 completers who specified that they were currently employed as P-12 teachers. Twelve of the 29 completers indicated that they are current P-12 teachers and provided their employers' contact information. Six of the 13 employers who were contacted replied to the survey, resulting in a 46.2% response rate. Employers were asked to rate their employees' performance on each element using the following categories:

Exemplary (4): Meets the objective or standard with a high level of consistency.
Accomplished (3): Clearly developing the ability to meet the objective or standard with some degree of consistency.
Developing (2): Shows some development in addressing the objective or standard.
Ineffective (1): Shows little or no evidence of development in addressing the objective or standard.

The survey data reveal that employers perceive completers as effective teachers and are satisfied with their preparation from the TEP. Overall, employers rated completers' teaching effectiveness higher on elements related to planning and preparation and lower on elements related to instruction and the management of instructional groups.

[2] The Framework for Teaching that has been adapted for the Kentucky Department of Education can be accessed at: https://education.ky.gov/teachers/PGES/TPGES/Documents/Kentucky%20Framework%20for%20Teaching.pdf

Employer Ratings of Completers' Teaching Effectiveness

Framework for Teaching Element (Average Rating)
1B. Demonstrating Knowledge of Students – Knowledge of the Learning Process: 3.67
1C. Selecting Instructional Outcomes – Value, Sequence, and Alignment: 3.67
1E. Designing Coherent Instruction – Learning Activities: 3.67
2B. Establishing a Culture for Learning – Expectations for Learning and Achievement: 3.67
1A. Demonstrating Knowledge of Content and Pedagogy – Knowledge of Content-Related Pedagogy: 3.50
1E. Designing Coherent Instruction – Lesson and Unit Structure: 3.50
1F. Designing Student Assessment – Congruence with Instructional Outcomes: 3.50
1F. Designing Student Assessment – Design of Formative Assessments: 3.50
2A. Creating an Environment of Respect and Rapport – Teacher Interaction with Students: 3.50
3A. Communicating with Students – Expectations for Learning: 3.50
3D. Using Assessment in Instruction – Monitoring of Student Learning: 3.50
2D. Managing Student Behavior – Expectations: 3.33
2D. Managing Student Behavior – Monitoring of Student Behavior: 3.33
2D. Managing Student Behavior – Response to Student Misbehavior: 3.33
2C. Managing Classroom Procedures – Management of Instructional Groups: 3.17
3C. Engaging Students in Learning – Activities and Assignments: 3.17
3D. Using Assessment in Instruction – Feedback to Students: 3.17

Completer Survey Results

The completer survey is a departmental survey that was administered for the first time in January 2019. Survey indicators are elements from the Charlotte Danielson Framework for Teaching[3] that were selected by stakeholders; content validity for the indicators was established using the Lawshe (1975) method. The survey was sent to completers who graduated in either 2015-2016 or 2016-2017. Of the 29 completers who were contacted, 21 responded to the survey, resulting in a 72.4% response rate. Thirteen completers indicated that they are currently employed full-time as P-12 teachers. Completers represented the following specialty licensure areas: Elementary Education, Art, Physical Education and Health, Middle School Science, Middle School Math, English, Vocal Music, Instrumental