
TEAC Audit Report

Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

CONFIDENTIAL DOCUMENT Do NOT Quote or Cite Without Permission

TEAC Site Visit Report
Based on the Inquiry Brief of the Brigham Young University Educator Preparation Program
Provo, UT
December 2-4, 2014

Supplementary materials sent to auditors by January 14, 2015
First draft of audit report sent to program faculty on January 26, 2015
Final audit report accepted by program faculty on February 11, 2015

Audit Team Members

Glenn L. Koonce, Lead Auditor; Chair, Educational Leadership Programs, School of Education, Regent University, Virginia Beach, VA
Joan B. Johnson, Consulting Auditor; Associate Dean, School of Education, Norfolk State University, Norfolk, VA
Donna D. Cooner, Consulting Auditor; Director, School of Teacher Education and Principal Preparation, Colorado State University, Fort Collins, CO
Nedra Call, Curriculum and Professional Development Director, Nebo School District, Spanish Fork, UT
Travis Rawlings, State Representative; Educator Licensure Coordinator, Utah State Office of Education

Brief Authors Lynnette Erickson and Aaron Popham

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 1 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

Part One: INTRODUCTION

Summary of the Case
Brigham Young University Educator Preparation Program
December 2-4, 2014

The Summary of the Case is written by the auditors and approved by program faculty. The summary reflects the auditors' understanding of the case the faculty are making for accreditation.

Authorship and approval of the Inquiry Brief

The Inquiry Brief was prepared by Lynnette Erickson, Aaron Popham, Charles Graham, Sharon Black, Steve Shumway, Michelle Marchant, and Blair Bateman. The Educator Preparation Program faculty approved the Brief on August 29 and September 2, 2014. The Brief was approved by the Educator Preparation Program Executive Committee on August 11, 2014 and by the University Council on Teacher Education on September 16, 2014.

Introduction

Brigham Young University (BYU) was founded in 1875 as a private faith-based institution sponsored by The Church of Jesus Christ of Latter-day Saints (LDS), offering free tuition to those planning to go into teaching. With 33,000 students, BYU is now one of the largest privately owned faith-based universities in the country and hosts one of the largest teacher preparation programs. In 1996, the College of Education was renamed the David O. McKay School of Education, and in 2003 the teacher education preparation unit was redefined as the university-wide Educator Preparation Program (EPP), consisting of 27 majors and 23 minors grouped into 20 program areas across 7 colleges. The EPP is governed by the University Council on Teacher Education (UCOTE) and the EPP Executive Committee.

Table 1
EPP Program Options, Completers, and Enrollment

| 20 Program Options1 (27 Majors) | Level (UG, grad, post-bacc) | Completers 2009-10 | Completers 2010-11 | Completers 2011-12 | Completers 2012-13 | Completers Total | Enrollment 2012-2013 | Enrollment 2013-2014 |
| Educator Preparation Program (EPP) | UG | 612 | 706 | 663 | 613 | 2594 | 2735 | 2514 |
| Art Education K-12 (BA) | UG | 17 | 18 | 13 | 10 | 58 | 91 | 116 |
| Biological Science Education (BS) | UG | 18 | 19 | 13 | 21 | 71 | 75 | 67 |
| Dance Education K-12 (BA) | UG | 8 | 5 | 12 | 9 | 34 | 35 | 33 |
| Early Childhood Education (BS) | UG | 55 | 74 | 54 | 55 | 238 | 128 | 171 |
| Elementary Education (BS) | UG | 209 | 221 | 221 | 163 | 814 | 997 | 883 |
| English Teaching (BA) | UG | 44 | 37 | 37 | 35 | 153 | 165 | 145 |
| Family & Consumer Sciences Education (BS) | UG | 21 | 20 | 30 | 23 | 94 | 133 | 110 |
| French Teaching (BA) | UG | 2 | 7 | 7 | 8 | 24 | 15 | 13 |
| German Teaching (BA) | UG | 2 | 0 | 3 | 4 | 9 | 12 | 13 |
| History Teaching (BA) / Teaching Social Sciences (BS) | UG | 57 | 68 | 69 | 66 | 260 | 245 | 219 |
| Latin Teaching (BA) | UG | 0 | 1 | 2 | 1 | 4 | 2 | 1 |
| Mathematics Education (BS) | UG | 38 | 43 | 48 | 47 | 176 | 218 | 173 |
| Music Education, K-12 Choral (BM) / K-12 Instrumental (BM) / Elementary Music Specialist (BM) | UG | 18 | 27 | 21 | 20 | 86 | 90 | 97 |
| Physical Education Teacher Education / Coaching K-12 (BS) | UG | 18 | 31 | 20 | 25 | 94 | 100 | 90 |
| Physical Sciences: Physics Teaching (BS) / Teaching Physical Science (BS) / Earth & Space Science Education (BS) / Chemistry Education (BS) | UG | 14 | 17 | 13 | 18 | 62 | 83 | 88 |
| School Health Education* (BS) | UG | 13 | 25 | 17 | 16 | 71 | 57 | 45 |
| Spanish Teaching (BA) | UG | 13 | 17 | 14 | 11 | 55 | 37 | 42 |
| Special Education, Mild/Moderate (BS) / Severe/Profound (BS) | UG | 50 | 48 | 45 | 47 | 190 | 147 | 120 |
| Technology & Engineering Education (BS) | UG | 8 | 13 | 17 | 25 | 63 | 72 | 54 |
| Theatre Arts Education, K-12 (BA) | UG | 7 | 15 | 7 | 9 | 38 | 33 | 34 |

* This EPP program is not offered after Summer 2014. (http://reports.byu.edu)

Table 2
Faculty Gender, Ethnicity, and Academic Track and Rank, 2009-2013**

| EPP program domain | Total faculty | Gender: F | Gender: M | Ethnicity*: C | Ethnicity: M | Ethnicity: U | Track*: A | Track: P1 | Track: P2 | Rank*: N | Rank: A1 | Rank: A2 | Rank: P |
| ECE | 18 | 14 | 4 | 17 | 0 | 1 | 15 | 0 | 3 | 15 | 2 | 1 | 0 |
| ELED | 170 | 122 | 48 | 159 | 5 | 6 | 127 | 4 | 39 | 127 | 8 | 17 | 18 |
| SCED | 140 | 72 | 68 | 131 | 7 | 2 | 59 | 11 | 70 | 63 | 23 | 26 | 28 |
| SPED | 13 | 10 | 3 | 13 | 0 | 0 | 3 | 4 | 6 | 3 | 5 | 3 | 2 |
| Total | 341 | 218 | 123 | 320 | 12 | 9 | 204 | 19 | 118 | 208 | 38 | 47 | 48 |

Note. * C=Caucasian, M=Minority, U=Unknown; A=Adjunct, P1=Professional (CFS), P2=Professorial (CFS); N=None, A1=Assistant, A2=Associate, P=Professor. ** Numbers are based on the primary assignment of faculty even though some may teach in multiple programs.

The Inquiry Brief highlights five program elements as contributing to candidates' preparation in education at Brigham Young University: (1) the atmosphere created by the BYU culture, (2) the quality of the EPP faculty, (3) the quality of the teacher candidates, (4) the collaboration of the BYU-Public School Partnership (BYU-PSP), and (5) the opportunity to substitute an internship for traditional student teaching. The IB also describes the program's foundation as its belief that teaching is a moral endeavor, its identification of Five Commitments to Education (see below), and its alignment to the Utah Effective Teaching Standards (UETS).

Program Claims

EPP faculty at BYU make six claims they believe meet TEAC Quality Principle 1 (QP1). The first five claims are linked to the program's Five Commitments to Education.

1. The BYU EPP prepares candidates who model and teach the knowledge, skills, and dispositions required for civic virtue and engagement in our society (Civic Preparation and Engagement).

2. The BYU EPP develops candidates who are competent and caring, who promote engaged learning through appropriate instructional strategies and positive classroom environments and relationships (Engaged Learning Through Nurturing Pedagogy).

3. The BYU EPP develops candidates who are committed to and actively provide equitable access to academic knowledge and achievement by maintaining rigor in the mastery of curriculum content and instructional skills (Equitable Access to Academic Knowledge and Achievement).

4. The BYU EPP assists candidates in becoming responsible stewards in their schools and communities by dedicating themselves to shared purpose, renewal, and high standards of educator competence and learner performance (Stewardship in School and Community).

5. The BYU EPP fosters in candidates a commitment to renewal through consistent inquiry, reflection, and action within their professional practice, resulting in continuous improvement (Commitment to Renewal).

6. The BYU EPP prepares candidates to apply the Utah Effective Teaching Standards (UETS) in K-12 schools.

Evidence Supporting the Claims

- Clinical Dispositions Scale (CDS) (Claims 3 & 4): The CDS is a self-report instrument used by faculty and administrators to better understand candidates' perceptions of their dispositions and self-efficacy at the beginning and end of their programs. The paired-sample t-test results showed that the post-program CDS scores on the Locus of Control (3.78), Aspirations (3.51), and Diversity (4.19) scales are significantly greater than the pre-program CDS scores on the Locus of Control (3.70), Aspirations (3.41), and Diversity (4.05) scales.

- Clinical Practice Assessment System (CPAS) (Claims 1-6): The CPAS instrument is completed by mentor teachers and university supervisors to rate candidates' clinical performance during pre-student teaching field experiences and student teaching/internships and to provide candidates with formative and summative feedback. The average scores for graduates are above 4.0 (on a 5-point scale) for every item except one, and in all cases the scores are well above the 3.0 basic competence level. A one-sample t-test demonstrated that candidates were performing at a statistically significant level above the faculty criterion 3.0 cut score (university supervisor mean = 4.32, sd = .488, t(25245) = 136.568, p < .001; mentoring teacher mean = 4.39, sd = .503, t(2514) = 138.854, p < .001).

- EPP Employer Survey (ES) (Claims 1, 2, 3, 5 & 6): The ES requests feedback from Utah public school principals on EPP graduates' performance during their first three years of teaching, with items related to the INTASC standards and to candidates' preparedness to teach. With a 70.7% response rate, principals rated EPP graduates from 2010-2013 in two areas: (a) teacher skills and behavior, on four items, at a favorable 97%, 94%, 94%, and 93%, respectively, and (b) knowledge, preparedness, and comparison with other teachers at a favorable 94%, 87%, and 81%, respectively.

- Major GPA (Claims 2 & 3): Major GPA is the cumulative GPA for all required courses and is considered an important criterion for identifying candidates' qualifications in both subject matter and pedagogy. The average major GPA during the accreditation period was 3.63, and a one-sample t-test demonstrated that candidates performed at a statistically significant level above the 2.85 major GPA requirement (mean = 3.62, sd = .28275, t(2560) = 139.485, p < .001).

- Praxis II test series (Claim 3): Praxis II series tests are designed by Educational Testing Service (ETS) to assess teachers' content knowledge in their specialty areas. The Praxis II pass rate among graduates was 98.5%, and a one-sample t-test showed that candidates' aggregated mean scores were at a statistically significant level above the aggregated mean of the state cut scores at both the larger and smaller scales: t(135) = 17.962, p < .001 and t(24060) = 101.084, p < .001, respectively.

- Professional Interpersonal Behavior Rating Scale (PIBS) (Claim 1): After three revisions, the PIBS has been designated as a "red flag" instrument for faculty or program administrators to use when a candidate needs particular feedback or remediation on his or her professionalism and interpersonal dispositions. Use of the PIBS is a program-by-program option. Average scores for each individual assessment item are all above 3.5. Candidates at the 2.0 level are "red flagged," as noted above. The one-sample t-test results showed that candidates performed at a statistically significant level above the 2.0 "red flag" score with p < .001 on all 10 PIBS items.

- Teacher Work Sample (TWS) (Claims 1, 2, 3, 5 & 6): The TWS is a capstone assessment for which candidates develop, teach, and assess a unit of instruction and report the results. On the 0-2 scale, candidates' average performance was significantly above the 1.0 passing cut score (m = 1.84, sd = .193). A one-sample t-test showed that candidates scored statistically significantly above the 1.0 cut score on all eight TWS items, t(2265) = 217.833, p < .001.

- Technology Skills Assessment (TSA) (Claims 2 & 3): The TSA is a performance-based assessment representing basic technological skills foundational for eventually integrating technology into teaching. Candidates must successfully complete each assessment in 30 minutes or less with 100% accuracy on the evaluated skill. Results indicate that on each of the four assessments a significant number of candidates passed on the first attempt: 84.98% (Word Processing), 71.46% (Spreadsheet), 70.99% (Presentation Software), and 88.85% (Internet & Communications).
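The one-sample t-tests reported above all have the same shape: candidate scores are tested against a faculty-set cut score. The following is a minimal sketch of that calculation, assuming scipy is available; the scores array is fabricated for illustration and is not EPP data.

```python
# Minimal sketch of a one-sample t-test against a cut score (illustrative
# data only). scipy.stats.ttest_1samp returns a two-sided p-value, so the
# directional claim ("above the cut score") halves it when t is positive.
import numpy as np
from scipy import stats

cut_score = 3.0                                          # e.g., CPAS basic competence
scores = np.array([4.1, 4.5, 3.9, 4.3, 4.7, 4.0, 4.4])  # hypothetical ratings

t_stat, p_two_sided = stats.ttest_1samp(scores, cut_score)
p_one_sided = p_two_sided / 2 if t_stat > 0 else 1 - p_two_sided / 2
print(f"t({len(scores) - 1}) = {t_stat:.3f}, one-sided p = {p_one_sided:.4f}")
```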

Reliability and Validity of All Assessments

Because traditional educational data sources (e.g., grades, rating scales) are subject to variability that makes them less than ideally reliable for program evaluation, the BYU EPP faculty use multiple measures and sources for each claim to increase reliability and support their confidence in the results. Content validity is the primary source of validity for seven measures used by the faculty (Praxis II, CDS, CPAS, PIBS, TWS, TSA, & ES). Reliability for the CDS, CPAS, PIBS, & TWS is estimated using Cronbach's Alpha. The program relies on ETS for the total test consistency range in Praxis II scores. Three measures (Major GPA, TSA, & ES) have not been studied for reliability.
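As an illustration of the reliability coefficient named above, here is a minimal sketch of Cronbach's Alpha computed from a respondents-by-items matrix. The ratings are hypothetical; the EPP's actual calculations are on its evidence website (see Clarification Task 13).

```python
# Minimal sketch of Cronbach's alpha (hypothetical ratings, not EPP data).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

ratings = np.array([[4, 5, 4], [3, 4, 4], [5, 5, 5], [4, 4, 3], [2, 3, 3]])
print(f"alpha = {cronbach_alpha(ratings):.3f}")
```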

Internal Audit

The Internal Audit was led by the TEAC Accreditation Team (TAT), which developed the Quality Control System (QCS) and the audit plan approved by the EPP Executive Committee and the University Council on Teacher Education in the fall of 2013. The audit was completed in summer 2014, with the TAT developing the final report, which was approved September 16, 2014 by UCOTE. Faculty conclusions for further action include:

- A desire for more accurate assessment of candidates' academic, behavioral, and teaching performance.
- Work to form clear and unified definitions, expectations, and policies for the EPP.
- Re-educate all EPP members along with BYU-Public School Partnership colleagues concerning changes in the guiding vision, Utah State Office of Education (USOE) licensure requirements, and Council for the Accreditation of Educator Preparation (CAEP) accreditation standards.
- Investigate rank advancement for faculty.
- Analyze the mYlink data management system for effectiveness.
- Reconsider the current QCS for a more explicit guide to a quality program.

Plans for Program Improvement

- Additions and adaptations were identified to correct noted weaknesses and to make strong candidates and faculty even stronger: (a) address weaknesses noted in the data results; (b) more thoroughly analyze Claim 4; (c) more clearly align courses with national, state, & program outcomes; (d) investigate ongoing candidate self-assessment and faculty evaluation of candidates' dispositions and behaviors; and (e) investigate advancement of faculty to the rank of professor.

- Additional plans include (a) making data more widely available throughout the EPP and involving more faculty in data analysis; (b) assuring that the new mYlink data system meets faculty needs; (c) implementing an annual "data day" to examine program and major data; (d) establishing subcommittees within the EPP to conduct internal audits for specific probes; and (e) recommending that the EPP audit and accreditation be conducted more directly by UCOTE, with a full-time FTE allocated under UCOTE for this responsibility.

- Quality control will be improved based on analysis of new program goals and needs where possible: (a) evaluating how much data are actually needed; (b) exploring additional candidate learning assessments; (c) addressing the non-pass rate on the TSA; (d) broadening understanding of alternative methods and processes for evaluating candidate performance; and (e) developing and implementing further training for university supervisors and mentor teachers for greater inter-rater reliability.

- In an already large program facing additional requirements (e.g., new CAEP standards), there is a need to reduce unnecessary sources of confusion and to manage program complexity by (a) responding to findings from the internal audit of the QCS; (b) renegotiating student teacher classroom observation requirements to support the EPP programs and hold them accountable; (c) providing re-education for all EPP members and partnership colleagues regarding the shifts and changes that have occurred in the vision and standards foundational to the EPP; and (d) to increase compliance with Utah Administrative Board Rule and the requirements of the new accrediting body, CAEP, considering restructuring the EPP to include all education-related licensure programs on the BYU campus (e.g., Audiology, Speech Language Pathology, School Leadership, School Psychology, School Counseling, & School Social Work).

Statement Regarding Commitment and Capacity

The faculty concluded that Brigham Young University is committed to the EPP and that there is sufficient capacity to offer a quality program.

1 Undergraduate program options in Early Childhood Education, Elementary Education, Secondary Education, and Special Education with 27 majors including Art Education (K-12), Dance Education (K-12), Music Education (K-12), Physical Education/Coaching (K-12), and Theatre Arts (K-12); and secondary options in Biological Science, Chemistry, Earth & Space Science, English, Family & Consumer Sciences, French, German, History/Social Studies, Latin, Mathematics, Physical Sciences, Physics, School Health, Spanish, and Technology & Engineering. (Note that the School Health option will not be offered after summer 2014.) The State of Utah, at its discretion, offers teaching licenses to program completers in the option areas.

Acceptance of the Summary of the Case

The faculty accepted the Summary of the Case as accurate on 11/29/2014.

Audit Logistics

Auditors examined documents and compiled their work in Room 327 of the McKay Building and met with various groups in Rooms 185 & 305 of the McKay Building, 3252 Wilkinson Student Center, 217 Hinckley Center, and the BYU Conference Center. Classes were observed in McKay Building Rooms 280 & 283.

Audit Opinion

Overall the Brief earned a clean audit opinion, including clean opinions for both evidence of student learning and institutional learning and quality control (see Table 2 in Part Four). The auditors also concluded that the evidence supports the view that Brigham Young University is committed to the Educator Preparation Program.


Part Two: METHOD OF THE AUDIT

The IB Commission staff and the auditors selected a number of targets from the Brief and created tasks designed to verify these targets. (A target is any aspect of the Brief, such as text, data, or a figure, which is related to any of TEAC’s principles and standards.) In addition, the auditors may have created follow-up audit tasks based on their on-site experiences.

With regard to any one component of the TEAC system, the auditors employ a range of tasks. Some tasks (the clarification questions) are intended to clarify the meaning of targets in the Brief that the auditors find unclear. Most tasks are straightforward probes designed to verify or confirm the target (e.g., recalculating figures, interviewing informants, examining catalogs or policy manuals). Some tasks seek to reconcile multiple representations of the same target in the Brief to establish internal consistency (e.g., figures in two tables on the same point, restatements of the target in other places of the Brief). A few tasks seek to corroborate the target by examining evidence not cited in the Brief but relevant to assertions in the Brief. The auditors may corroborate the evidence in the Brief by new or extended statistical analyses of the evidence cited in the Brief and related evidence outside the Brief (e.g., on-site and on-line surveys of key informants).

The auditors will also, whenever possible and feasible, examine the primary source for any target (e.g., the actual rating or survey forms, formal documents, student portfolios, artifacts, roll & grade books, classroom facilities, budgets, correspondence).

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 9 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

Part Three: AUDIT MAP

Audit tasks are organized by TEAC elements and components and are noted as verified, verified with error, not verified, or disclaimer. Audit task numbers are hyperlinked to the audit tasks in the accompanying report.

Table 1: Audit Tasks by TEAC Component and Result

| TEAC Component | Verified | Verified with error | Not verified | Disclaimer |
| 1.1 Subject matter | A1, A2, A3, A4, A6, A7 | A5, A20 | | |
| 1.2 Pedagogy | A8, A9, A21 | | | |
| 1.3 Caring and effective teaching skill | A10, A22 | | | |
| 1.4 Cross-cutting themes | A11, A12, A13, A14, A23 | A15 | | |
| 1.5 Evidence of reliability and validity | A16, A17, A18, A24 | A19 | | |
| 2.1 Rationale for assessments | B1, B2, B3 | | | |
| 2.2 Use of evidence | B4, B5, B6, B7, B8, B9, B10 | B11, B12 | | |
| 2.3 Quality control system | B13, B14, B15, B16, B17, B18, B20, B21, B22 | B19 | | |

Part Four: AUDIT OPINION

Scoring and Meaning of the Audit Task Findings

Each audit task is scored in one of four ways:

- Verified, indicating that the auditors found that the target was accurately described or represented in the Brief
- Verified with error, indicating that the auditors found some inaccuracy in the target, but the inaccuracy did not alter the basic meaning of the target
- Not verified, indicating that the auditors found inaccuracy in the target that did alter its basic meaning
- Disclaimer, indicating that the auditors were unable to undertake the task


Table 2: Audit Findings and Audit Opinions for the Brief

| TEAC Element | 1. Number of targets | 2. Number of targets verified | 3. Number of targets verified with errors | 2/1 % | 3/1 % | Audit opinion |
| 1.0 Evidence of student learning | 24 | 23 | 5 | 95.8% | 20.8% | Clean |
| 2.0 Institutional learning and quality control | 22 | 20 | 4 | 90.9% | 18.2% | Clean |
| Overall totals | 46 | 43 | 9 | 93.4% | 19.6% | Clean |

Column 1 = Total number of targets
Column 2 = Number of targets scored as verified and verified with error
Column 3 = Number of targets scored as verified with error and not verified
Column 4 = Column 2 divided by Column 1: the percentage of targets verified
Column 5 = Column 3 divided by Column 1: the percentage of targets with errors

Audit Opinion

The Inquiry Brief overall received a clean audit opinion because 93.4% of the targets were verified; the Brief was therefore found to be acceptably accurate and trustworthy. A majority of errors were due to missing items/scores, possibly indicating a need for attention to data collection and management.

The auditors are initially guided in their award of clean, qualified, adverse, or disclaimer audit opinions by the following considerations. An element receives a clean opinion if at least 90% of its associated targets are confirmed. An element is given a qualified opinion when at least 75% but less than 90% of its targets are confirmed or if more than 25% of the targets reveal misstatements of any kind (i.e., if the associated audit tasks are scored as either verified with error or not verified). If less than 75% of the targets can be verified, the element or component receives an adverse opinion if the examined evidence did not support the target or a disclaimer opinion if the audit tasks could not be performed or completed.
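The considerations above reduce to a simple heuristic, sketched below for illustration only (this is not TEAC software, and, as the next paragraph stresses, auditors do not apply the guidelines mechanically).

```python
# Sketch of the initial opinion heuristic described above.
def audit_opinion(verified_pct: float, error_pct: float) -> str:
    if verified_pct < 75:
        # Adverse if the evidence did not support the targets, or a
        # disclaimer if the audit tasks could not be performed or completed.
        return "adverse or disclaimer"
    if verified_pct >= 90 and error_pct <= 25:
        return "clean"
    return "qualified"

# Applying it to the rows of Table 2 reproduces the clean opinions:
print(audit_opinion(95.8, 20.8), audit_opinion(90.9, 18.2), audit_opinion(93.4, 19.6))
```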

These guidelines are not strict rules, because a simple counting of the outcomes of probes may be misleading with regard to the trustworthiness of the Brief. Some audit tasks may be more revealing than others. For example, some may have targeted only minor points, and some may be merely following up on other audit tasks on a single point. Others may probe significant targets that are critical in the case for accreditation. The guidelines may prove unreliable in cases with a small number of audit tasks. The auditors therefore do not treat the guidelines or heuristics as rules that can be mechanically applied. If the findings suggest anomalies that make the heuristic unworkable, the auditors rely on their good judgment, explaining in their audit report the difficulties they experienced and the reasons for their opinions.

The auditors are also alert to evidence that is at variance with how the program is represented in the Brief, and they report events and experiences during the audit that were not fully consistent with the manner in which the Brief portrays the program.

Finally, it must be reemphasized that the audit opinion is not an opinion about the quality of the program or the degree to which the evidence in the Brief satisfies TEAC’s quality principles and capacity standards. It is solely an opinion about whether the Brief is accurate as written.

Part Five: AUDIT FINDINGS

The audit findings consist of clarification task findings and audit task findings. Both clarification tasks and audit tasks consist of a target from the Brief and a probe about that target. The audit tasks are associated with specific components of the TEAC system, which are denoted in parentheses following the task number.

Clarification Tasks

This section of the report contains tasks intended to clarify or elaborate statements in the Brief.

Clarification Task 1

Target: On page 83, the IB states, "The faculty of the EPP conducted the internal audit of the QCS led by the TAT." In contrast, on pages 91 and 94, the audit procedures indicate that the targets were assigned to all TAT members.

Question: Who conducted the internal audit of the QCS?

Campus Response: The faculty responded to this question in writing prior to the audit visit: The TAT members led the internal audit and included the faculty and staff as appropriate to complete the tasks and probes. Each TAT member was assigned different tasks, which they coordinated with the appropriate faculty and staff to collect, analyze, and report the needed data for each task.

Clarification Task 2

Target: "The EPP employs full-time professorial and professional track faculty with continuing faculty status (CFS)…" (p. 11)

Question: What is the difference between "professorial" and "professional" track faculty as mentioned on p. 11 and throughout the IB?

Campus Response: The faculty responded to this question in writing prior to the audit visit: The professorial track represents the traditional expectations for university faculty. The university definition of professional track faculty reads,


Professional faculty are faculty who have specialized responsibilities. Professional faculty include teaching faculty, research faculty, clinical faculty, librarians, athletic professionals, and others. Professional faculty enjoy the same basic privileges as professorial faculty. They may receive continuing faculty status (except for athletic professionals, including trainers) and rank advancement. They may vote in departmental decisions regarding faculty appointments, continuing faculty status, rank advancement, and all other matters. They may serve as chairs or deans, on committees, and in other administrative assignments, and they are eligible for university awards. (Rank and Status Policy, https://policy.byu.edu/view/index.php?p=103#s620 [note that this site is password protected.])

Clarification Task 3

Target: "In selected national rankings from the 2013-2014 academic year (http://yfacts.byu.edu/Article?id=306), BYU has been recognized in several areas" (p. 1). One area, the No. 1 "stone cold sober" institution in the country for the last 15 years, was not found on the website provided.

Question: Where was this item obtained?

Campus Response: The faculty responded to this question in writing prior to the audit visit: The BYU public information office has updated the information reported on the Y Facts page cited in the Inquiry Brief. This original fact is attributed to the Princeton Review, which indicates that BYU has been the No. 1 "stone cold sober" institution in the country for the last 17 years. To support the claim please refer to the following external sites:

http://www.princetonreview.com/schools/college/CollegeRankings.aspx?iid=1023349#/
http://www.deseretnews.com/article/865584140/Surprise-BYU-is-nations-stone-cold-sober-school-once-again.html?pg=all
http://www.sltrib.com/news/1393197-155/university-college-princeton-review-party-syracuse
http://fox13now.com/2014/08/04/byu-named-most-stone-cold-sober-school-in-nation-for-17th-year/

Clarification Task 4

Target: "In 1996 the partnership extended to include the BYU colleges and departments that participate in preparing secondary teachers, forming a tripartite organization in which these entities participate as equal partners, sharing governance, resources, and responsibilities" (p. 2). The authors of the Brief describe in various places (particularly pp. 5, 6 & 7) the importance of the BYU-PSP to the EPP.

Question: How is "equal partners" defined for the three separate categories: "sharing governance, sharing resources, and sharing responsibilities"?

Campus Response: The faculty responded to this question in writing prior to the audit visit: The Brigham Young University-Public School Partnership (BYU-PSP) brings together a private university and five public school districts. The university and the public school districts are dissimilar in distinct ways such as purposes, settings, needs, resources, relationships, and administrator support. The school districts vary in size, number of employees, students served, community needs, and facilities. It is through its common mission that the university and the districts are able to collaborate in ways that benefit all partners. Not all partners contribute in the same way or to the same degree, and not all partners benefit from the association in an equal or proportional manner. Equal does not imply sameness. The longevity of the Partnership (30 years) is due in large part to each partner having something different to offer, possessing some self-interests that overlap, and maintaining a willingness to make some sacrifices that may benefit one partner more than another. The BYU-PSP has operated on the belief that partners working together can achieve more than they can accomplish working separately.

The sense in which the partners are equal derives from an equal commitment to the mission and purposes of the Partnership and its shared governance. One indicator of equal commitment is through resource acquisition and allocation. Each of the partners makes significant financial and human resource contributions. This signifies that the partners are “equally yoked” to the goals, mission, and objectives of the Partnership.

Another indicator is shared governance. Members of the Governing Board share the same privileges, status, and rights. A chair of the Governing Board is selected by its members and acts in concert with the Partnership’s executive director in carrying out the responsibilities and activities decided by the board. Board decisions direct the acquisition and use of resources and the assignment and stewardship of responsibilities. Resources and responsibilities are best deployed after discussion and deliberation among the board members who jointly share power in decision making. As “equal partners” each representative of the partner organizations contributes to determining which issues to pursue, what resources are needed to address the needs, how to secure the resources, and how to assign responsibility and accountability for their use. Working together the partners are best able to achieve more than they could accomplish separately.

In some instances certain partners may contribute more resources, commit specific types of resources, or derive more benefit than another partner. Attempting to quantify costs or benefits to each partner is not useful because of the varying levels of needs, implementation, and readiness. The partners are however equal in their ability to draw upon the resources and activities of the Partnership regardless of their size, resource contribution, or any other distinguishing factor.

Clarification Task 5

Target: "The Center for the Improvement of Teacher Education and Schooling (CITES)" is noted on page 2 as "an administrative unit."

Question: Who are the members of this "administrative unit," and what titles do they hold?


Campus Response: The faculty responded to this question in writing prior to the audit visit: The following list of personnel and organization chart show the members of CITES:

Gary Seastrand, director of CITES & co-director of Principals' Academy
Barry Graff, assistant director for professional development
Lynnette Christensen, assistant director for research
Paul Wangemann, assistant director of support services
Robert Bullough, professor and lead researcher
Paul Caldarella, associate professor
Cally Flox, director of the Arts Partnership Program
Joyce Terry, executive secretary
Leslie Williams, research associate
Doug Allen, Arts program associate
Barry Newbold, co-director of Principals' Academy
Nettina Smith, co-director of the Central Utah Science and Engineering Program

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 15 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

Clarification Task 6

Target: "Students are admitted to LEPs by application" (p. 8).

Question: Appendix D, pp. 163-164, "Table D3, Requirements for Admission to EPP Programs" does not indicate the specifics for admission to an LEP beyond regular admission requirements. What distinguishes those who are selected for an LEP other than an application, and does the application itself explain this?

Campus Response: The faculty responded to this question in writing prior to the audit visit. The BYU EPP includes nine LEP programs:

Art Education K-12
Dance Education K-12
English Teaching
Music Elementary Specialist
Music K-12 Choral
Music K-12 Instrumental
Spanish Teaching
Special Education Mild/Moderate
Special Education Severe

Each of these LEP programs establishes its own admission requirements as part of its specific application process. Information regarding each of these programs and its admission requirements can be found in the BYU catalog at http://registrar.byu.edu/catalog/2014-2015ucat/Advisement/LEPList.php.

Clarification Task 7

Target: On page 14 the following statement is made: "To support and operationalize this vision, the BYU-PSP adopted the Five Commitments to Education, an extension which accompanies our vision statement. These commitments focus the EPP mission, program, and course objectives/outcomes, and course instruction."

Question: It is evident that the Five Commitments are reflected in the EPP claims (pp. 17-18) and aligned with EPP evidence (Table 5, p. 19). What evidence exists that the Five Commitments are found in course objectives/outcomes and course instruction? Can they be found anywhere at the course level (e.g., syllabi)?

Campus Response: The faculty responded to this question in writing prior to the audit visit. The TAT assessed the level to which the Moral Dimensions of Teaching/Five Commitments were addressed by program learning outcomes (see Appendix A.III.4.1.3) and found that "100% of the major [programs] have learning outcomes aligned with the Moral Dimensions of Teaching/Five Commitments" (Appendix A, p. 98). Additionally the TAT assessed faculty syllabi (see Appendix A.III.4.2.4) for indicators of the Moral Dimensions of Teaching/Five Commitments. Our findings indicated "only half (53%) of the syllabi referred to the Moral Dimensions/Five Commitments. Further effort is needed to educate and encourage faculty to include the Five Commitments in their syllabi and to address them in their courses" (Appendix A, p. 99).

Clarification Task 8

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 17 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

Target: "The CDS is a candidate self-report instrument; therefore the EPP has not set performance criteria" (p. 21).

Question: Why can't the EPP set performance criteria for the CDS?

Campus Response: The faculty responded to this question in writing prior to the audit visit: The CDS is a candidate self-report instrument. Candidates rate their perceptions of their locus of control, aspirations, and diversity at the beginning and again at the end of the program. Our purpose in administering the CDS was to document changes in student perceptions, with the hope that their efficacies would increase from the beginning to the end of the program. No performance criteria were established because there were no minimal expectations of efficacy.

Clarification Task 9

Target: Table 9 (pp. 38-40) indicates that Claims 1-6 have various CPAS items measured (Claim 1, CPAS 3 & 9; Claim 2, CPAS 1-7; Claim 3, CPAS 1-7; Claim 4, no CPAS; Claim 5, CPAS 8 & 9; and Claim 6, CPAS 1-9).

Question: How were the various items selected for each claim? This was not clearly revealed in the CPAS description and rationale (pp. 21-22).

Campus Response: The faculty responded to this question in writing prior to the audit visit. The INTASC standards (1992) were originally adopted and analyzed for their alignment to each of our six claims. Each of the ten items on the CPAS was originally created from the INTASC standards (1992). When we transitioned to the InTASC standards (2011) and Utah Effective Teaching Standards (UETS; 2011), the CPAS items were adjusted to align to the new standards, allowing us to maintain our alignment to each of the six claims. Therefore the CPAS items allow for a direct assessment of our six claims. We provide crosswalks showing this process and alignment in Appendix G, Tables G.1-G.3.

Clarification Task 10

Target: Table 17 (p. 51), "CPAS (MT and US) Data Comparing Programs to EPP Mean +/- 0.84 x SD."

Question: It is understood why there are no scores in "MT CPAS 10" and "US CPAS 10." What is the rationale for having a final column for both mentoring teacher and university supervisor that indicates "MT CPAS 3 (INTASC 6 only)" and "US CPAS 3 (INTASC 6 only)"?

Campus Response: The faculty responded to this question in writing prior to the audit visit: While we aligned from INTASC to InTASC and then to UETS standards, we chose to report our data only according to the UETS standards (see crosswalk in Appendix G, Table G.1). The UETS standards do not include INTASC Standard 6, "Communication and Technology," so we could not directly report those data using the UETS standards. In our crosswalk moving from INTASC to InTASC and finally to UETS, we interpreted what was formerly INTASC 6 to now be part of UETS 3; thus INTASC CPAS 6 now is UETS CPAS 3. Therefore, we chose to report these data as INTASC Standard 6 in Table 17 because they did not align with UETS Standard 3.

Clarification Task 11


Target: "During our reporting period the EPP assessment instruments have been aligned with three different sets of standards: INTASC (1992), InTASC (2011), and UETS (2011). This situation complicated the tasks of aggregating and analyzing assessment data across academic years. Therefore, we developed a series of crosswalks (see Appendix G) to align our EPP claims, UETS standards, and TEAC Quality Principle I and cross-cutting themes. Once this alignment was completed, we were able to compile all our assessment data into a comprehensive dataset for analysis" (pp. 31-32).

Question: Who developed the crosswalks, and how were the alignments developed in Appendix G, Tables G1-G3 (pp. 427-432)? What does the term Composite mean in Table G3 (pp. 430-432)? In Table G3, was there no UETS alignment with QP1.4.3, Technology?

Campus Response: The faculty responded to this question in writing prior to the audit visit. The TAT analyzed the standards and developed the alignment of the INTASC (1992), InTASC (2011), and UETS (2011) standards, which resulted in Tables G.1-G.3; these were reviewed and approved by the BYU EPP Executive Committee. The term composite in Table G.3 refers to the inclusion of all 10 UETS standards. In Table G.3 there is no direct UETS alignment to QP1.4, Technology, because UETS does not have a standard that directly addresses technology. Technology is embedded throughout the standards.

Clarification Task 12

Target: On the third version of the CPAS (pp. 282-284), there is no place for summary statements as on the two earlier versions.

Question: Is there a reason why there seems to be no place for a summary statement on the third version of the CPAS?

Campus Response: The faculty responded to this question in writing prior to the audit visit: On pages 282-284, the third CPAS instrument is a copy of our formative observation form. Because it is formative, there is room for evaluator comments after each item. We do not consider a summative statement appropriate for formative observations.

Clarification Task 13

Target: "For our reporting period, we evaluated the internal consistency and reliability of both the CPAS US and CPAS MT, calculating Cronbach's Alpha coefficients to be alpha = .919 and alpha = .923, respectively" (pp. 35 & 36), "and for the CDS Cronbach's Alpha coefficients to be alpha = .780 and alpha = .799 respectively" (p. 35).

Question: Where can the raw data for these calculations (Cronbach's Alpha) be found?

Campus Response: The faculty responded to this question in writing prior to the audit visit: Please see our online evidence website (https://sites.google.com/site/byueducatorpreparationprogram/misc) for these raw data and Cronbach's Alpha calculations.

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 19 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

Clarification Task 14

Target: Tables 17-19 present a cell for each core assessment item for each program. Cells left blank represent programs with a mean value within a z-score of +/-0.84 of the overall EPP mean. Green cells represent scores above a z-score of 0.84, and pink cells represent scores below a z-score of -0.84. The number in the cell represents the difference between the program mean and the EPP mean +/- 0.84 multiplied by the standard deviation.

Question: Where can the raw data for the z-scores be found?

Campus Response: The faculty responded to this question in writing prior to the audit visit: Please see our online evidence website (https://sites.google.com/site/byueducatorpreparationprogram/misc) for these raw data and z-score calculations.
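The flagging rule in this target is straightforward to express. Below is a minimal sketch, assuming a known EPP-wide mean and standard deviation for one assessment item; the program means are fabricated for illustration.

```python
# Sketch of the Tables 17-19 flagging rule: flag program means that fall
# more than 0.84 SD from the EPP mean; the reported cell value is the
# difference between the program mean and (EPP mean +/- 0.84 * SD).
epp_mean, epp_sd = 4.30, 0.50                      # hypothetical item statistics
program_means = {"ELED": 4.35, "SCED": 3.75, "SPED": 4.80}

for program, m in program_means.items():
    z = (m - epp_mean) / epp_sd
    if z > 0.84:
        flag, cell = "green", m - (epp_mean + 0.84 * epp_sd)
    elif z < -0.84:
        flag, cell = "pink", m - (epp_mean - 0.84 * epp_sd)
    else:
        flag, cell = "blank", 0.0
    print(f"{program}: z = {z:+.2f}, flag = {flag}, cell = {cell:+.2f}")
```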

Clarification Task 15

Target: "Figure 4" (p. 46) shows the EPP data for candidate performance on the Professional Interpersonal Behavior Scale (PIBS).

Question: For Item 3, "Learning Community," the n is 941, while the other nine items have n values that range only between 1207 and 1218 (a difference of 12). Why does Item 3 have a difference of 266 from the lowest n (1207)?

Campus Response: The faculty responded to this question in writing prior to the audit visit. We rechecked the raw student data, and there are no responses to PIBS Item 3 for the 266 students in question. Without further investigation we cannot explain why there are so many missing data.

Clarification Task 16

Target: "Table 16, EPP Pass Rates on Four Skill Assessments of the TSA 2009-2013" (p. 49) indicates that the following percentages of students did not pass the designated skill: "Word Processing 8.02%, Spread Sheet 10.71%, Presentation Software 10.43%, and Internet and Communications 10.43%." The following is stated on p. 28: "Candidates must successfully complete each assessment in 30 minutes or less with 100% accuracy on the evaluated skills. Candidates who are unable to pass an assessment in two attempts are offered individualized tutoring from the TEC lab before taking the assessment again. With the individualized tutoring, candidates are expected to pass the TSA the third time they take the assessment."

Question: What happens to the individuals who fell into the "did not pass" category?

Campus Response: The faculty responded to this question in writing prior to the audit visit. The EPP relies on program faculty to monitor students and uphold the EPP policy. We found in our report that for various reasons EPP faculty did not enforce the policy and require candidates to pass the TSA. In our subsection What We Plan for Further Improvement, we explain, "We found that approximately 10% of our candidates never passed the TSA. We propose to further explore this concern and develop processes to ensure that all candidates pass the TSA prior to being accepted into an EPP program" (p. 74).


Clarification Task 17

Target: Page 32 of the IB indicates that the TWS are distributed to raters in the program for scoring with the EPP-approved rubric.

Question: Who are the raters, and how are they selected? How are raters of the TWS trained?

Campus Response: The faculty responded to this question in writing prior to the audit visit: TWS raters consist of university personnel working within each of the BYU EPP programs. Additional raters are sometimes employed to assist in the scoring of the TWS. These additional raters are drawn from a pool of teachers, former clinical faculty, and retired faculty. Raters receive instruction on how to score the TWS from their individual departments.

Clarification Task 18

Target: Page 107 of the IB states the "EPP has made changes in our assessment instruments on three separate occasions"; however, the faculty elected to use data from the years 2009-2013.

Question: What is the rationale for using this period of time for the IB? Would the changes in assessment instruments impact validity, reliability, and program decisions?

Campus Response: The faculty responded to this question in writing prior to the audit visit. The BYU EPP accreditation period began in 2009 and ended in 2013; therefore we elected to use data from 2009-2013 in our report. We believe that the changes in the assessment instruments during that time probably had an impact on the reliability of our data and the confidence we have in the data for program decision making. But had we not used the data we had from the multiple instruments in use during that time, we would not have had any more than two years of data to report. The TAT created the crosswalks (see Appendix G) between the standards and the items on the assessments in an effort to increase the reliability and validity of the assessment instruments used.

The EPP Executive Committee made the decision to align our assessments with the revised InTASC standards after a year of reviewing and studying the revised standards and evaluating the potential impact on our assessments. Less than a year after that decision to move to the revised InTASC standards, the USOE mandated the transition to the UETS, which was not good timing for the BYU EPP, introducing new reliability and validity concerns. The BYU EPP had prepared for the transition from the original INTASC standards to the revised InTASC standards, but had no choice in the matter of moving to the UETS.

Clarification Task 19

Target: Table 3, Faculty Gender, Ethnicity, and Academic Track and Rank, 2009-2013: "Note. * C=Caucasian, M=Minority, U=Unknown; A=Adjunct, P1=Professional (CFS), P2=Professorial (CFS); N=None, A1=Assistant, A2=Associate, P=Professor" (pp. 11-12).

Question: What would "N=None" under Academic Rank mean?

Campus Response: The faculty responded to this question in writing prior to the audit visit: The N or "None" column in the Academic Rank section of Table 3 indicates those faculty members who are classified as Non-CFS (non-tenure). Non-CFS faculty members (e.g., adjunct faculty) do not hold an academic rank at BYU.

Clarification Task 20

Target: Table 6 (pp. 20-21) and page 24 of the IB: "We have chosen to use the major GPA as an EPP assessment because it supports our claim concerning our candidates' ability to provide equitable access to academic knowledge and achievement, which is linked with TEAC Quality Principle I, subject matter knowledge and pedagogical knowledge (QP 1.1, 1.2), and with the cross-cutting themes of multicultural perspectives and accuracy and technology (QP 1.4.2, 1.4.3)."

Question: What is the rationale for using GPA as evidence for the two cross-cutting themes of multicultural perspectives and technology?

Campus Response: The faculty responded to this question in writing prior to the audit visit. The candidates' major GPAs include the grades they earned in the required multicultural and technology courses in every EPP program. Additionally, the content and pedagogy of the multicultural and technology courses are implicit cross-cutting constructs in all of the professional course work and required knowledge demonstrated not only in course work but also in the EPP assessments (e.g., TWS, CPAS). We believe that total major GPAs represent collective evidence of individual candidates' subject matter knowledge and pedagogical knowledge, providing evidence of the cross-cutting themes of multicultural perspectives and accuracy and technology.

Audit Tasks

This section of the report addresses targets associated with Quality Principle I: Evidence of Candidate Learning, which has the following requirements:

Program Content and Outcomes

1.1 Subject matter knowledge. The program candidates must learn and understand the subject matter they will teach.

1.2 Pedagogical knowledge. The program candidates must be able to convert their knowledge of subject matter into compelling lessons that meet the needs of a wide range of pupils and students.

1.3 Caring and effective teaching skill. The program candidates must be able to teach effectively, professionally, and in a caring manner.

1.4 Cross-cutting liberal education program content themes. For each component of element 1.0, the program must also address three cross-cutting liberal education themes:

- Learning how to learn. Candidates must demonstrate that they have learned how to learn information on their own, that they can transfer what they have learned to new situations, and that they have acquired the dispositions and skills that will support lifelong learning in their field.
- Multicultural perspectives and accuracy. Candidates must demonstrate that they have learned accurate and sound information on matters of race, gender, individual differences, and ethnic and cultural perspectives.
- Technology. Candidates must know the technologies that enhance student learning and the work of leaders and staff. TEAC requires evidence that graduates have acquired the basic productivity tools of the profession.

1.5 Evidence of valid assessment. The program must provide evidence regarding the trustworthiness, reliability, and validity of the evidence produced from the assessment method or methods that it has adopted.

Audit Task A1 (1.1)

Target: Clarification Task 9. In responding to how the various items were selected for each claim in Table 9 (pp. 38-40 in the Brief), the faculty stated, "The INTASC standards (1992) were originally adopted and analyzed for their alignment to each of our six claims. Each of the ten items on the CPAS was originally created from the INTASC standards (1992). When we transitioned to the InTASC standards (2011) and Utah Effective Teaching Standards (UETS; 2011) the CPAS items were adjusted to align to the new standards, allowing us to maintain our alignment to each of the six claims. Therefore the CPAS items allow for a direct assessment of our six claims. We provide crosswalks showing this process and alignment in Appendix G, Tables G.1-G.3."

Probe: Check Appendix G, Tables G.1-G.3 as a crosswalk that includes the UETS standards. Check these items to see that they are the Utah Effective Teaching Standards by visiting the Utah State Office of Education website and/or contacting a representative from the Utah State Office of Education.

Finding: The UETS standards are included in Tables G1, G2, & G3 (pp. 427-432). These UETS standards are the same as those found on the Utah State Office of Education website at http://www.uen.org/k12educator/uets/. In addition, Travis Rawlings, Educator Licensure Coordinator at the Utah State Office of Education, confirmed that the standards in Appendix G1-G3 are the UETS.

Verified

Audit Task A2 (1.1)

Target: "A candidate must maintain a major GPA of 2.85 or above to qualify for student teaching/internships" (pp. 8 & 33).

Probe: For a sample of 20 undergraduate completers, verify a minimum GPA of 2.85 at entrance to student teaching.

Finding: Nineteen students had a GPA of 2.85 or above, and one student (DB, 2.74) did not. The sample of 20 students used for these audit tasks was selected by using the random number generator function in Excel; students were sorted into their respective majors and, within each major, by their assigned random number. Students with the highest numbers were included in the sample.

Table A2: GPA of Sample of 20 Students

| Name | Major GPA |
| 1. TB | 3.97 |
| 2. AS | 3.68 |
| 3. JJ | 3.1 |
| 4. MP | 3.41 |
| 5. MP | 3.53 |
| 6. AR | 3.19 |
| 7. BE | 3.58 |
| 8. JD | 3.78 |
| 9. RP | 3.2 |
| 10. AF | 3.76 |
| 11. HA | 3.87 |
| 12. JB | 3.98 |
| 13. AK | 3.23 |
| 14. DG | 3.41 |
| 15. LW | 3.3 |
| 16. CU | 3.42 |
| 17. DB | 2.74 |
| 18. HF | 3.54 |
| 19. SP | 3.19 |
| 20. RL | 3.69 |

Verified with error due to one student (5%) in the sample (DB) having a 2.74 major GPA (minor error)

Program Response: We acknowledge the findings of the audit team: One student in our random sample did not have a major GPA of 2.85 or higher. On page 8 of the IB we cite our then current policy and link to it (http://education.byu.edu/sites/default/shared/documents/ess/documents/GPA%20and%20C-%20Grade%20Policy.pdf). Please note that this policy went into effect in September 2012. According to the student data set provided to the auditors (file: Master Student Data.xlsx) found on our Google Site, the student in question, DB, student taught winter semester 2010 and graduated in April 2010, prior to the implementation of this policy. Thus DB was not held accountable for the 2.85 major GPA criterion since the policy was not yet in place.

TEAC Response: TEAC reviewed page 8 in the IB, the policy located at the link noted in the program response, and the Master Student Data.xlsx file and agrees that DB should not be held accountable for the 2.85 major GPA criterion since the policy had not at that time been put into place. This information changes the outcome of the finding from Verified with error to Verified. TEAC thanks the program for this clarification.

Verified
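The sampling procedure described in the findings for Audit Tasks A2-A4 (random numbers assigned in Excel, sorted within major, highest numbers kept) can be reproduced in a few lines. The sketch below uses pandas under assumed column names; the roster is synthetic, not the EPP's completer data.

```python
# Sketch of the auditors' sampling procedure with a synthetic roster.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=2014)
completers = pd.DataFrame({
    "name": [f"S{i:03d}" for i in range(100)],              # hypothetical IDs
    "major": rng.choice(["ECE", "ELED", "SCED", "SPED"], size=100),
})
completers["rand"] = rng.random(len(completers))             # Excel RAND() analogue

# Sort by random number within each major and keep the five highest per
# major, yielding a 20-student sample.
sample = (completers
          .sort_values(["major", "rand"], ascending=[True, False])
          .groupby("major")
          .head(5))
print(sample)
```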

Audit Task A3 (1.1)


Target: "The average scores for graduates are above 4.0 (on a 5-point scale) for every item on the CPAS except CPAS 2 (Learning Differences): m = 3.95, sd = .662 as rated by the university supervisors" (p. 52).

Probe: For a sample of 20 program completers, verify average scores above 4.0 on every item of the CPAS except CPAS 2.

Finding: All 20 program completers had average scores above 4.0 on the CPAS. The sample of 20 students used for these audit tasks was selected by using the random number generator function in Excel; students were sorted into their respective majors and, within each major, by their assigned random number. Students with the highest numbers were included in the sample.

Table A3: CPAS Averages of Sample of 20 Students

| Averages | Item 1 | Item 2 | Item 3 | Item 4 | Item 5 | Item 6 | Item 7 | Item 8 | Item 9 |
| Mentor teachers | 4.2 | 4.3 | 4.1 | 4.5 | 4.2 | 4.6 | 4.3 | 4.5 | 4.4 |
| University supervisors | 4.1 | 4.2 | 4.2 | 4.2 | 4.0 | 4.4 | 4.3 | 4.7 | 4.5 |

Verified
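Recalculating figures like these is one of the probe types named in Part Two. A minimal sketch of that recalculation, assuming a long-format export with hypothetical column names and synthetic ratings:

```python
# Sketch of recomputing per-item CPAS averages by rater type (synthetic data).
import pandas as pd

ratings = pd.DataFrame({
    "student": ["TB", "TB", "AS", "AS"],
    "rater":   ["mentor", "supervisor", "mentor", "supervisor"],
    "item_1":  [4.0, 4.5, 4.5, 4.0],
    "item_2":  [4.5, 4.0, 4.0, 4.5],
})

item_cols = [c for c in ratings.columns if c.startswith("item_")]
# One row per rater type, mirroring the two rows of Table A3.
print(ratings.groupby("rater")[item_cols].mean().round(2))
```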

Audit Task A4 (1.1) Target: "Average performance on the TWS items is significantly above the 1.0 passing cut score" (p. 55). Probe: For a sample of 20 completers, verify that TWS scores are above the 1.0 passing cut score. Finding: Five of the TWS samples were "not found": AS, BE, AK, SP & RL. The sample of 20 students used for these audit tasks was selected by using the random number generator function in Excel and sorted into their respective majors. Within each major, students were sorted by their randomly generated number, and students with the highest numbers were included in the sample.

Table A4: TWS Scores of Sample of 20 Students

Name      TWS Final Score
1. TB     2
2. AS     NF
3. JJ     1.95
4. MP     1.75
5. MP     1.68
6. AR     1.91
7. BE     NF
8. JD     2
9. RP     1.93
10. AF    2
11. HA    1.71
12. JB    2
13. AK    NF
14. DG    1.28
15. LW    1.86
16. CU    1.92
17. DB    2
18. HF    1.8
19. SP    NF
20. RL    NF

Verified with error, as 5 (25%) of the 20 sample students' grades on the TWS were "not found" (major error)

Program Response: We acknowledge the findings of the audit team: In the data sample, scores for the TWS were missing for 5 students. In the IB on pages 69-70 we stated, "The inquiry into our quality control system showed that how and when the assessments are administered, collected, and reported and how faculty and programs are held accountable for the process have been inconsistent across the EPP, resulting in more missing data than desired," which shows that we recognize this as a problem that needs to be addressed in our data management system and data collection process. We continue on page 71, stating, "Having recognized the inadequacy of the EPP's fragmented data management system, we are creating a system to meet the specific needs of our program. We have initiated development of a centralized system for data collection and management, which we call mYlink. Over the past three years we have made significant progress in creating and refining mYlink to unify the EPP student assessments and licensure data, along with various other data necessary for maintaining the quality of our program." On page 70 of the IB we also explain that the size and complexity of our program contribute to our difficulty in collecting all the data for all our candidates.

In our plan for further improvements, we explain on page 73, "The data management system that has been developed and initially piloted for its ability to furnish accurate and complete candidate, faculty, and curriculum data, mYlink, will be analyzed to be sure it meets our needs. Plans must be solidified for moving through additional phases and implementing mYlink across the EPP for collecting, archiving, and analyzing EPP data. We anticipate that the mYlink system will make data for individual and aggregated groups of candidates available and accessible to EPP programs in real time, including faculty, program designees, and the accreditation team." We also explain on page 75, "As the size of the EPP is unlikely to change, we need to allow the findings of our assessments to guide us in reducing unnecessary complexity. Our practices and policies need to be expressed clearly and disseminated widely. Consistency in definition, intent, and application must be maintained. In responding to the new CAEP requirements, we may find it necessary to make our organization even more complex; thus reducing unnecessary sources of confusion becomes critical."

Though we acknowledge the audit team’s findings, we believe that we presented evidence in our Section 5: Discussion and Plan to alert the audit team that the program size and complexity and the antiquated data management systems we had depended on during this accreditation cycle would yield findings that would include multiple instances of missing data.

TEAC Response: TEAC acknowledges that the program included statements in the IB on pages 69-71, 73, and 75 identifying problems with missing data and plans for further improvement of the quality control system. TEAC appreciates knowing the program has plans for continued improvement in this area. The finding of this audit task remains Verified with Error.

Audit Task A5 (1.1) Target: "Principals have generally been very pleased with the level of graduates' knowledge in the areas they are teaching" (p. 58). Probe: Examine a sample of responses to the most recent Employer Survey and interview superintendents to verify. Finding: The Employer Survey consisted of 874 surveys with a 70.7% response rate. The principals rated the content knowledge category very high, reporting that 97% of the graduates always or frequently demonstrate that they know the content and create meaningful learning experiences. Five superintendents from partner school districts were interviewed and corroborated the survey results. Verified

Audit Task A6 (1.1) Target: “. . . develops candidates who are committed to and actively provide equitable access to academic knowledge” (p. 48). Probe: Interview faculty and students to determine the meaning of “equitable access to academic knowledge” and discover what evidence the program collects to satisfy this goal; then examine the evidence for 20 completers. Finding: Students and faculty were able to define “equitable access to academic knowledge” as providing differentiated instruction to maximize learning for all students. Both students and faculty had similar definitions and indicated it was a major part of the curriculum. Both faculty and students identified the Teacher Work Samples (TWS) as containing evidence of students’ demonstration of understanding. TWSs of 20 completers were reviewed for evidence. Seventeen TWSs contained information to support the claim, but three TWSs were not found (NF).

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 27 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

Table A6: Evidence of Providing Equitable Access in TWS of Sample of 20 Students

Name      Evidence in TWS
1. TB     yes
2. AS     yes
3. JJ     yes
4. MP     yes
5. MP     NF
6. AR     yes
7. BE     NF
8. JD     yes
9. RP     NF
10. AF    yes
11. HA    yes
12. JB    yes
13. AK    yes
14. DG    yes
15. LW    yes
16. CU    yes
17. DB    yes
18. HF    yes
19. SP    yes
20. RL    yes

Verified with error, as 17 TWSs contained information to support the claim, but 3 TWSs (15%) were not found (NF) (minor error)

Program Response: We acknowledge the findings of the audit team. We verified that in the random sample for this probe 3 of the 20 students had no TWS assignment available for the audit team to review.

TEAC Response: TEAC appreciates the program’s response acknowledging this error.

Audit Task A7 (1.1) Target: "… assists candidates in becoming responsible stewards in their schools and communities" (p. 49). Probe: Interview faculty and students to determine the meaning of "responsible stewards" and to discover what evidence the program collects to satisfy this goal; then examine the evidence for a sample of 20 completers. Finding: Both faculty and student groups were able to describe and give examples of how candidates are assisted and encouraged to become responsible stewards in their schools and communities. Principles 9 and 10 on the CPAS were identified as evidence for this goal (see below). Both mentor teachers and university supervisors rated completers on these principles during student teaching on the CPAS using the following scale: 5 = exceptional; 4 and 3 = competent; and 2 and 1 = emerging.


PRINCIPLE 9: Reflective Practitioner. The teacher is a reflective practitioner who continually evaluates the effects of his/her choices and actions on others (students, parents, and other professionals in the learning community) and who actively seeks out opportunities to grow professionally. PRINCIPLE 10: Professionalism and Interpersonal Relationships. The teacher fosters relationships with school colleagues, parents, and agencies in the larger community to support students' learning and well-being. The sample of 20 students used for these audit tasks was selected by using the random number generator function in Excel and sorted into their respective majors. Within each major, students were sorted by their randomly generated number, and students with the highest numbers were included in the sample.

Table A7: Scores for Items #9 and #10 on CPAS for Sample of 20 Students

Name      Mentor Teacher CPAS 9   Mentor Teacher CPAS 10   Univ. Super. CPAS 9   Univ. Super. CPAS 10
1. TB     5                       5                        5                     5
2. AS     4                       4                        5                     4
3. JJ     5                       4                        5                     5
4. MP     4                       4                        4                     4
5. MP     --                      --                       5                     5
6. AR     5                       5                        5                     5
7. BE     5                       5                        5                     5
8. JD     4                       3                        5                     5
9. RP     3                       4                        5                     2
10. AF    4                       5                        5                     5
11. HA    4                       4                        4                     4
12. JB    5                       5                        5                     5
13. AK    --                      --                       --                    --
14. DG    4                       4                        5                     5
15. LW    --                      --                       4                     3
16. CU    4                       4                        4                     5
17. DB    5                       4                        3                     3
18. HF    5                       4                        5                     5
19. SP    5                       5                        5                     5
20. RL    5                       5                        5                     5

Verified with error, as three students (15%) were missing data (minor error)

Program Response: We acknowledge the findings of the audit team based on the CPAS evaluations in the students' electronic files (https://sites.google.com/site/byueducatorpreparationprogram/cpas-folder) provided to the audit team during our site visit. However, according to the student data set provided to the auditors (file: Master Student Data.xlsx) found on our Google Site, mentor teacher CPAS data are available for MP, AK, and LW. No university supervisor CPAS data are available for AK. See the table below with the corrected data.

Name      Mentor Teacher CPAS 9*   Mentor Teacher CPAS 10*   Univ. Super. CPAS 9*   Univ. Super. CPAS 10*
5. MP     5                        5                         5                      5
13. AK    5                        4                         --                     --
15. LW    5                        5                         4                      3

*Note: Due to changes from INTASC to InTASC to UETS (explained in Appendix G), the data found in the Master Student Data file are all reported based on UETS. Therefore, the data reported in this table and in the Master Student Data file are taken from the Standard 8 and 9 columns, which align with Items 9 and 10 of the version of the CPAS used for MP, AK, and LW.

TEAC Response: TEAC appreciates the program's response and clarified information. Because the university supervisor ratings for AK are still missing, the finding remains Verified with Error.

Audit Task A8 (1.2) Target: "Technology & Engineering Education, Claim 2: Engaged Learning Through Nurturing Pedagogy, QP 1.2: Pedagogical Knowledge" (Appendix I: Disaggregated Assessment Data for Each of the 20 EPP Programs, under the Teacher Work Sample column, p. 1227): "TWS 1: m = 1.93, sd = 0.210, n = 61, 96.83%, #recorded 61, #expected 63." Probe: Review the data for QP 1.2 and re-compute from the raw data to confirm findings. Finding: Faculty calculated the standard deviation using the Excel STDEV.P function. When auditors sought verification of the data, they used STDEV and reviewed TWS 1-6 for QP 1.2 for each academic year and the total cohort. The two functions use different denominators (STDEV divides by n − 1, STDEV.P by n), so the discrepancy grows with more missing data and smaller N sizes, but the actual difference in the data is insignificant (i.e., 0.2284 vs. 0.227, N = 63). Verified
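The distinction at issue is that Excel's STDEV estimates the sample standard deviation (dividing by n − 1) while STDEV.P computes the population standard deviation (dividing by n), so the two converge as n grows. A minimal Python sketch with hypothetical TWS scores illustrates this; statistics.stdev and statistics.pstdev mirror the two Excel functions.

    import statistics

    scores = [2.0, 1.9, 1.7, 1.8, 2.0, 1.6]   # hypothetical TWS item scores

    print(statistics.stdev(scores))    # like Excel STDEV: divides by n - 1
    print(statistics.pstdev(scores))   # like Excel STDEV.P: divides by n
    # With small n the two diverge noticeably; as n grows the ratio
    # sqrt((n - 1) / n) approaches 1 and the difference becomes negligible.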

Audit Task A9 (1.2) Target: "Elementary Education, Claim 2: Engaged Learning Through Nurturing Pedagogy, QP 1.2: Pedagogical Knowledge" (Appendix I: Disaggregated Assessment Data for Each of the 20 EPP Programs, under the Teacher Work Sample column, p. 654): "TWS 1: m = 1.82, sd = 0.279, n = 699, 85.87%, #recorded 699, #expected 814." Probe: Review the data for QP 1.2 and re-compute from the raw data to confirm findings. Finding: Auditors reviewed TWS 1-6 for QP 1.2 for each academic year and the total cohort with the following results: m = 1.82, sd = 0.279, n = 699, 85.87% recorded.


Auditors computed the difference between the number of scores expected and the number of scores recorded from the raw data and found a difference of 120 missing scores. The EPP (on p. 654) expressed its recorded scores (699 of 814 expected) as a percentage, 85.87%. Auditors computed the corresponding percentage from the raw data as 85.35%. The difference between 85.87% and 85.35%, .52%, was found by the auditors to be insignificant. Verified
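The completeness percentages reported throughout Appendix I follow one arithmetic pattern: the number of scores recorded divided by the number expected. Using the EPP's reported counts for this task as a worked example:

    completeness = #recorded ÷ #expected = 699 ÷ 814 ≈ 0.8587 = 85.87%

The auditors' slightly lower figure (85.35%) appears to reflect their independent recount from the raw data rather than the EPP's reported counts.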

Audit Task A10 (1.3) Target: In "Table D6, Summary of Requirements for Secondary Education," Caring Teaching Skills are evaluated by CPAS 2-10 in Sc Ed 378, Sc Ed 476, or Sc Ed 496 (p. 166). Probe: From a random sample of six course syllabi from Sc Ed 378, Sc Ed 476, and Sc Ed 496, check for evidence that CPAS 2-10 are included in evaluating student competencies for Caring Teaching Skills. Finding: Two syllabi from Sc Ed 378 and four syllabi from Sc Ed 476, including two with Sc Ed 496, were reviewed by auditors for evidence that the CPAS is included in the course assessment:
 Mathematics Education 377/378, Fall Semester 2013, CPAS found on p. 7;
 Family and Consumer Sciences Education SFL 378 Practicum Syllabus Foods: Section 1, Winter 2013, CPAS found on pp. 1, 2, 3, 4, 8, and the actual assessment on the last 3 pages of the syllabus;
 Bio 476 Student Teaching, no date, instructors M. Adair & H. Tippets, CPAS found on p. 2;
 Secondary Education 476R Section 8 (Physical Education), Winter Semester 2012, CPAS found on p. 1;
 Secondary Education 476R and 496R Student Teaching in French, Fall 2012, CPAS found on p. 3; and
 476/496R Guidelines for Student Teachers and Interns: Fall 2011 (English), CPAS found on p. 5.
Verified

Audit Task A11 (1.4.1) Target: "Elementary Education, Claim 3: Stewardship in School and Community, QP 1.4.1: Learning How to Learn 2009-2013" (Appendix I: Disaggregated Assessment Data for Each of the 20 EPP Programs, under CDS Pre & CDS Post columns, p. 665): "CDS Pre: m = 3.74, sd = 0.257, n = 777, 95.45%, #recorded 777, #expected 814; CDS Post: m = 3.81, sd = 0.231, n = 813, 99.88%, #recorded 813, #expected 814." Probe: Review the data for QP 1.4.1 and re-compute from the raw data to confirm findings. Finding: Auditors computed CDS Pre and CDS Post QP 1.4.1 from the raw data with the following results:

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 31 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

On CDS Pre, m = 3.74, sd = 0.257, n = 777, 95.45%, and on CDS Post, m = 3.81, sd = 0.231, n = 813, 99.88%; these results match the figures indicated in the target. Auditors computed the difference between the number of CDS Pre scores expected, 814, and the number of scores recorded, 777, as 37. The EPP (p. 665) expressed the recorded scores as a percentage, 95.45%. Auditors computed the percentage of 777 relative to 814 as 95.45%, matching the target. Auditors then computed the difference between the number of CDS Post scores expected (814) and the number of scores recorded (813) as 1. The EPP expressed the recorded scores as a percentage, 99.88% (p. 665). Auditors also computed the percentage of 813 relative to 814 as 99.88%, matching the target. Verified

Audit Task A12 (1.4.1) Target: "Technology & Engineering Education, Claim 3: Stewardship in School and Community, QP 1.4.1: Learning How to Learn 2009-2013" (Appendix I: Disaggregated Assessment Data for Each of the 20 EPP Programs, under CDS Pre & CDS Post columns, p. 1239): "CDS Pre: m = 3.72, sd = 0.281, n = 62, 98.41%, #recorded 62, #expected 63; CDS Post: m = 3.72, sd = 0.249, n = 63, 100%, #recorded 63, #expected 63." Probe: Review the data for QP 1.4.1 and re-compute from the raw data to confirm findings. Finding: Auditors computed CDS Pre and CDS Post QP 1.4.1 from the raw data with the following results: CDS Pre, m = 3.72, sd = 0.281, n = 62, 98.41%, and CDS Post, m = 3.72, sd = 0.249, n = 63, 100%. These results match the figures indicated in the target. Auditors computed the difference between the number of CDS Pre scores expected (63) and the number of scores recorded (62) as 1. The EPP, on p. 1239, expressed the recorded scores as a percentage, 98.41%. Auditors computed the percentage of 62 relative to 63 as 98.41%, matching the target. Auditors then found the number of CDS Post scores expected and recorded both to be 63, thus no difference. The EPP indicated the percentage recorded as 100% (p. 1239). Auditors also computed the percentage as 100%, matching the target. Verified

Audit Task A13 (1.4.2) Target: "Elementary Education, Claim 1: Civic Preparation & Engagement, QP 1.4.2: Multicultural Perspectives and Accuracy" (Appendix I: Disaggregated Assessment Data for Each of the 20 EPP Programs, under University Supervisor CPAS & Mentor Teacher CPAS columns, p. 651): "CPAS 3, University Supervisor: m = 4.26, sd = 0.604, n = 778, 95.58%, #recorded 778, #expected 814; Mentor Teacher: m = 4.33, sd = 0.622, n = 774, 95.09%, #recorded 774, #expected 814." Probe: Review the data for QP 1.4.2, CPAS 3, and re-compute from the raw data to confirm findings. Finding: Auditors reviewed CPAS 3 for both university supervisor and mentor teacher. The auditors then computed CPAS 3 for the University Supervisor and found m = 4.26, sd = 0.604, n = 778, 95.58%, #recorded 778, #expected 814, and for the Mentor Teacher, m = 4.33, sd = 0.622, n = 774, 95.09%, #recorded 774, #expected 814. These results match the figures indicated in the target. Auditors computed the difference between the number of CPAS 3 university supervisor scores expected (814) and the number recorded (778) as 36 missing scores. The EPP expressed the recorded scores as a percentage, 95.58% (p. 651). Auditors computed the percentage of 778 relative to 814 as 95.58%, matching the target. Auditors then computed the difference between the number of CPAS mentor teacher scores expected (814) and recorded (774) as 40. The EPP indicated the percentage recorded as 95.09% (p. 651). Auditors also computed the percentage as 95.09%, matching the target. Verified

Audit Task A14 (1.4.2) Target: "Technology & Engineering Education, Claim 1: Civic Preparation & Engagement, QP 1.4.2: Multicultural Perspectives and Accuracy" (Appendix I: Disaggregated Assessment Data for Each of the 20 EPP Programs, under University Supervisor CPAS & Mentor Teacher CPAS columns, p. 1225): "CPAS 3, University Supervisor: m = 4.39, sd = 0.574, n = 63, 100%, #recorded 63, #expected 63; CPAS Mentor Teacher: m = 4.25, sd = 0.689, n = 62, 98.41%, #recorded 62, #expected 63." Probe: Review the data for QP 1.4.2 and re-compute from the raw data to confirm findings. Auditors reviewed CPAS 3 for both university supervisor and mentor teacher for each academic year and the total cohort. Finding: The auditors computed CPAS 3 for the University Supervisor and found m = 4.39, sd = 0.574, n = 63, 100%, #recorded 63, #expected 63, and for the CPAS Mentor Teacher, m = 4.25, sd = 0.689, n = 62, 98.41%, #recorded 62, #expected 63. These results match the figures indicated in the target. Auditors found the number of CPAS 3 university supervisor scores expected and recorded both to be 63; the EPP indicated the percentage recorded as 100% (p. 1225), and auditors computed the percentage as 100%, matching the target. Auditors then computed the difference between the number of CPAS mentor teacher scores expected (63) and recorded (62) as 1. The EPP indicated the percentage recorded as 98.41% (p. 1225). Auditors computed the percentage also as 98.41%, matching the target. Verified

Audit Task A15 (1.4.3) Target: “EPP candidates are required to pass the TSA with 100% accuracy on the skills tested in 30 minutes or less at the beginning of their teaching program” (p. 57). Probe: For a sample of 20 completers, verify that all passed the technology skills test in 30 minutes or less at the beginning of their teaching program.

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 33 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

Finding: Four candidates did not complete the TSA, and the score for one was not found. The sample of 20 students used for these audit tasks was selected by using the random number generator function in Excel and sorted by students' majors. Within each major, students were sorted by their randomly generated number; students with the highest numbers were included in the sample.

Table A15: TSA Scores for Sample of 20 Students

Name      TSA
1. TB     PASS
2. AS     NF
3. JJ     PASS
4. MP     PASS
5. MP     PASS
6. AR     PASS
7. BE     NC
8. JD     PASS
9. RP     PASS
10. AF    PASS
11. HA    NC
12. JB    PASS
13. AK    NC
14. DG    PASS
15. LW    NC
16. CU    PASS
17. DB    PASS
18. HF    PASS
19. SP    PASS
20. RL    PASS

Verified with error, as 4 (20%) of the students sampled never completed the test (NC) and the score for one (5%) was not found (NF). Thus 5 total scores (25%) are missing, leaving no evidence that these five candidates passed the technology skills test in 30 minutes or less at the beginning of their teaching program.

Program Response: We acknowledge the findings of the audit team. We were aware of this problem, and we reported on page 48 in Table 16, EPP Pass Rates on Four Skill Assessments of the TSA 2009-2013, in Section 4: Results, that we had a large percentage of missing data. We also report this problem on page 67 in our Section 5: Discussion and Plan where we state, “We were surprised to find that 2.47% (n=64) of our EPP completers had been admitted without completing the TSA. (See Section 4, pp. 47-48).” We continue on page 74 in our plan for further improvement, “We found that approximately 10% of our candidates never passed the TSA. We propose to further explore this concern and develop processes to ensure that all candidates pass the TSA prior to being accepted into an EPP program.”


Though we acknowledge the audit team’s findings, we believe that we presented evidence in Sections 4 and 5 of our IB to alert the audit team that the process and procedures that we depended on during this accreditation cycle would yield findings with multiple missing data.

TEAC Response: TEAC acknowledges that the program included statements on pages 47-48, 67, and 74 in the IB recognizing problems with missing data and plans for further improvement of the quality control system. TEAC appreciates knowing the program has plans for continued improvement in this area. The finding remains Verified with Error.

Audit Task A16 (1.5) Target: "The EPP Executive Committee conducted three reliability studies to investigate the rater agreement on the Clinical Practice Assessment System (CPAS). Appendix H contains the executive reports of these studies. The third study was a comparison between university supervisors' and mentor teachers' CPAS scores on the same student, CPAS MT-US Reliability Study Summary 2011-2013. Analysis looked at inter-rater agreement between the MT and US for each candidate. Inter-rater agreement was calculated based on the number of agreed-upon evaluation decisions divided by the total number of evaluation decisions. Inter-rater agreement was calculated using the raw data and three transformations of the data intended to better understand where disagreements in rating might reside. The first transformation looked at the agreement for passing or failing of the assessment. With the pass/fail decision for the candidates, the MT/US inter-rater agreement ranged from 80 to 100% across all of the programs. Every program had at least 80% inter-rater agreement" (Appendix H: EPP Reliability Studies, pp. 437-438). Probe: Check the table titled CPAS MT-US Reliability Study Summary 2011-2013 on page 438, Pass/Fail column, to verify all scores are at or above 80%, then re-compute the raw data (Appendix I) for accuracy in the scores reported in the table for the Pass/Fail column. Finding: Using two methods, auditors verified the formulas and execution of the methodology in producing a "cleaned" inter-rater reliability data file, i.e., the data BYU created in conducting the inter-rater reliability study, which used observations specifically created for the study. Essentially the auditors examined the study data and verified the statistical findings. The second method was an attempt to repeat the inter-rater reliability study with raw student data in general rather than the controlled study data. This was difficult, as the data sets did not quite align between the reliability study and the master data file, but the statistical finding using the raw, real observation data was reasonably close to the percentages displayed on page 438, thus helping verify the validity of the study. Verified
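The agreement statistic described in this task is simple percent agreement: the number of decisions on which the two raters agree divided by the total number of paired decisions. The following is a minimal sketch in Python of that computation, not the auditors' actual tooling; the function name and the sample decisions are hypothetical.

    def percent_agreement(rater_a, rater_b):
        # Paired decisions from two raters; missing entries (None) are excluded.
        pairs = [(a, b) for a, b in zip(rater_a, rater_b)
                 if a is not None and b is not None]
        agreed = sum(1 for a, b in pairs if a == b)
        return agreed / len(pairs)

    # Hypothetical pass/fail decisions by mentor teacher (MT) and university supervisor (US):
    mt = ["pass", "pass", "fail", "pass", None]
    us = ["pass", "fail", "fail", "pass", "pass"]
    print(percent_agreement(mt, us))   # 3 of 4 scored pairs agree -> 0.75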

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 35 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

Audit Task A17 (1.5) Target: "For our reporting period we evaluated the internal consistency and reliability of both the CPAS US and CPAS MT, calculating Cronbach's Alpha coefficients to be alpha = .919 and alpha = .923, respectively" (pp. 35-36). Probe: Re-compute Cronbach's Alpha coefficients for the CPAS US and CPAS MT. Finding: Using SPSS, auditors recomputed CPAS-US alpha = 0.925 and CPAS-MT alpha = 0.928; although not exactly .919 and .923, the scores surpass the .70 acceptable level used by psychometrics experts: "The widely-accepted social science cut-off is that Cronbach's alpha should be .70 or higher for a set of items to be considered an internally-consistent scale" (Rovai, A., Baker, J. & Ponton, M., 2014, Social Science Research Design and Statistics, Watertree Press, Chesapeake, VA, p. 346). Verified

Audit Task A18 (1.5) Target: "For our reporting period, we evaluated the internal consistency and reliability of both the pre- and post-program administration of the CDS and calculated Cronbach's Alpha coefficients to be alpha = .780 and alpha = .799, respectively" (p. 35). Probe: Re-compute Cronbach's Alpha coefficients for the CDS. Finding: Using SPSS, auditors recomputed CDS-pre alpha = 0.778 and CDS-post alpha = 0.800, respectively, which surpass the .70 acceptable level used by psychometrics experts: "The widely-accepted social science cut-off is that Cronbach's alpha should be .70 or higher for a set of items to be considered an internally-consistent scale" (Rovai, A., Baker, J. & Ponton, M., 2014, Social Science Research Design and Statistics, Watertree Press, Chesapeake, VA, p. 346). Verified
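Cronbach's alpha, as recomputed in Tasks A17 and A18, follows the standard formula alpha = k/(k − 1) × (1 − sum of per-item variances ÷ variance of the summed scale). Below is a minimal sketch under the assumption that a numpy-based recomputation mirrors the SPSS procedure; the ratings matrix is hypothetical.

    import numpy as np

    def cronbach_alpha(items):
        # items: 2-D array, rows = respondents, columns = scale items.
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)        # per-item sample variances
        total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical 4-respondent, 3-item ratings:
    ratings = [[4, 5, 4], [3, 4, 3], [5, 5, 5], [2, 3, 2]]
    print(cronbach_alpha(ratings))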

Audit Task A19 (1.5) Target: Page 32 indicates that the TWS are distributed to raters in the program for scoring with an EPP-approved rubric. Furthermore, on page 36, the authors state, "We have tried to keep core aspects of the TWS rubric consistent across all EPP programs, allowing for some adaptation by program." Probe: Interview faculty to identify who the raters of the TWS are, to determine if a common rubric is used across multiple programs, and to determine how raters are trained. Examine the TWS rubrics used across multiple programs for consistent core criteria and evaluation expectations. Finding: During the group faculty meeting, the team was advised that the raters of the TWS vary across programs. For example, the Family Consumer Science program hires specialists to evaluate/rate only the TWS; in contrast, in the special education program, university supervisors complete the evaluation/rating for the TWS. TWS rubrics were reviewed for a sample of 20 students across multiple programs. The TWS rubric criteria/form was found in 16 of the 20 completed rubrics; 4 rubrics were missing from the sample (Eliason/Elem, Kimball/Latin, Lowe/Schl Health, Smith/Bio Sci).


Of the 16 rubrics reviewed, 3 had been submitted as blank templates. All blank templates were from 2013 (Wilkes/Theatre Arts, Perry/Physics, Johnson/Physical Science). Not verified, as 7 program rubrics (35%) were not available for review to verify consistency.

Program Response: We acknowledge the findings of the audit team, noting that 4 of the students (BE, AK, RL, AS) in the random sample did not have a TWS rubric recorded in the LiveText data system, and 3 students (JJ, SP, CW) had a blank rubric with a score assigned (i.e., Pass, A). Thus for 7 out of 20 students in the random sample there is no evidence that the university evaluators used a rubric to score the TWS.

TEAC Response: TEAC appreciates the program's acknowledgment of this error.

*******

The following audit tasks are based on the TEAC online survey of the program's candidates, faculty, and mentors. Prior to the audit, TEAC requests email addresses for program faculty, program candidates, and mentors who work with the program. Table A below indicates how many in each category were invited to take the email survey, how many of the emails were successfully delivered, how many recipients opened the email, and how many and what percentage responded to the email survey:

Table A On-Line Survey Responses

                        Invited   Email Delivered   Email Opened   Responses Completed   Percentage Responding
Program Faculty         241       232               85.30%         100                   43.10%
Program Candidates      1000      999               84.90%         125                   12.50%
Mentors                 929       774               86.30%         285                   36.80%
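The "Percentage Responding" column appears to be computed as completed responses divided by delivered emails; for program faculty, for example:

    percentage responding = completed ÷ delivered = 100 ÷ 232 ≈ 43.10%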

Candidates responding to the survey were asked to report their overall GPA. The mean GPA of the responding candidates was 3.60, which is comparable to the mean GPA of 3.63 of program candidates reported in the Brief, suggesting that candidates responding to the survey were academically similar to candidates in the program overall.

Audit Task A20 (1.1) Target: Results of subject matter knowledge assessments. Probe: Corroborate the results of the program assessments of subject matter knowledge by determining that TEAC on-line survey results reflect those of the program assessments. Finding: The TEAC on-line survey results are given in Table A20.1.

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 37 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

Table A20.1 Candidate, Faculty, and Cooperating Teacher Mean Ratings on the Adequacy of the Candidates' Accomplishments in Subject Matter Knowledge

Topic of Survey Question                      Minimum Rating   Maximum Rating   Mean Rating   STD
Candidate ratings of own knowledge            2                5                4.19          0.85
Candidate ratings of adequacy of courses      1                5                4.11          0.94
Candidate ratings of adequacy of faculty      1                5                4.25          0.92
Faculty ratings of candidate knowledge        2                5                4.27          0.85
Mentor ratings of candidate knowledge         1                5                4.09          0.89

1 = Inadequate, 2 = Barely Adequate, 3 = Adequate, 4 = More than Adequate, 5 = Excellent

The mean ratings for subject matter knowledge were more than adequate in all categories. The low minimum ratings of 1 for "Candidate ratings of adequacy of courses," "Candidate ratings of adequacy of faculty," and "Mentor ratings of candidate knowledge" are troubling, but the means for each group (4.09-4.27) were in the more-than-adequate range. Verified (Data reported were corroborated by survey responses.)

Audit Task A21 (1.2) Target: Results of pedagogical knowledge assessments. Probe: Corroborate the results of the program assessments of pedagogical knowledge by determining that TEAC on-line survey results reflect those of the program assessments. Finding: The TEAC on-line survey results are given in Table A21.1.

Table A21.1 Candidate, Faculty, and Cooperating Teacher Mean Ratings on the Adequacy of the Candidates' Accomplishments in Pedagogical Knowledge

Topic of Survey Question                      Minimum Rating   Maximum Rating   Mean Rating   STD
Candidate ratings of own knowledge            2                5                4.01          0.97
Candidate ratings of adequacy of courses      1                5                4.09          0.95
Candidate ratings of adequacy of faculty      1                5                4.26          1.00
Faculty ratings of candidate knowledge        2                5                4.23          0.65
Mentor ratings of candidate knowledge         1                5                3.91          0.90

1 = Inadequate, 2 = Barely Adequate, 3 = Adequate, 4 = More than Adequate, 5 = Excellent

The mean ratings for pedagogical knowledge were more than adequate in all but one category; "Mentor ratings of candidate knowledge" (3.91) was just below the more-than-adequate rating. The low minimum ratings of 1 for "Candidate ratings of adequacy of courses," "Candidate ratings of adequacy of faculty," and "Mentor ratings of candidate knowledge" in terms of candidate pedagogy are troubling, but the means for each group were in the adequate (3.91) to more-than-adequate (4.01-4.26) range. Verified (Data reported were corroborated by survey responses.)

Audit Task A22 (1.3) Target: Results of teaching skill assessments. Probe: Corroborate the results of the program assessments of teaching skill by determining that TEAC on-line survey results reflect those of the program assessments. Finding: The TEAC on-line and on-site survey results are given in Table A22.1.

Table A22.1 Candidate, Faculty, and Mentor Mean Ratings on the Adequacy of the Candidates' Accomplishments in Teaching Skill

Topic of Survey Question                      Minimum Rating   Maximum Rating   Mean Rating   STD
Candidate ratings of own knowledge            1                5                4.33          0.90
Candidate ratings of adequacy of courses      1                5                4.09          0.96
Candidate ratings of adequacy of faculty      1                5                4.36          0.94
Faculty ratings of candidate knowledge        3                5                4.62          0.57
Mentor ratings of candidate knowledge         1                5                4.18          0.88

1 = Inadequate, 2 = Barely Adequate, 3 = Adequate, 4 = More than Adequate, 5 = Excellent

The mean ratings for teaching skill were more than adequate in all categories. The four low minimum scores of 1 for "Candidate ratings of own knowledge," "Candidate ratings of adequacy of courses," "Candidate ratings of adequacy of faculty," and "Mentor ratings of candidate knowledge" in terms of candidate teaching skill are troubling, but the means for each group were in the more-than-adequate (4.09-4.62) range. Verified (Data reported were corroborated by survey responses.)

Audit Task A23 (1.4) Target: Results of cross-cutting themes assessments. Probe: Corroborate the results of the program assessments of cross-cutting themes by determining that TEAC on-line and on-site survey results reflect those of the program assessments. Finding: The TEAC on-line survey results are given in Table A23.1.

Table A23.1 On-Line Candidate, Faculty and Mentor Mean Ratings on the Adequacy of the Candidates’ Accomplishments in the Cross-Cutting Themes

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 39 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

Topic of Survey Question                      Minimum Rating   Maximum Rating   Mean Rating   STD

Learning How to Learn
Candidate ratings of own knowledge            1                5                4.26          0.94
Faculty ratings of candidate knowledge        2                5                4.26          0.72
Mentor ratings of candidate knowledge         1                5                3.99          0.95

Multicultural Perspectives and Accuracy
Candidate ratings of own knowledge            1                5                3.99          1.01
Faculty ratings of candidate knowledge        2                5                3.71          0.73
Mentor ratings of candidate knowledge         1                5                3.74          0.91

Technology
Candidate ratings of own knowledge            1                5                3.88          1.06
Faculty ratings of candidate knowledge        2                5                4.05          0.78
Mentor ratings of candidate knowledge         2                5                4.11          0.86

1 = Inadequate, 2 = Barely Adequate, 3 = Adequate, 4 = More than Adequate, 5 = Excellent

The mean ratings for cross-cutting themes were adequate or more than adequate in all categories; the low minimum ratings of 1 or 2 across the categories of learning how to learn, multicultural perspectives, and technology are troubling, but the means for each group were in the adequate (3.71-3.99) to more-than-adequate (4.05-4.26) range. Verified (Data reported were corroborated by survey responses.)

Audit Task A24 (1.5) Target: The validity of mentor ratings. Probe: Corroborate the program assertion of the validity of mentor ratings of candidates by determining that TEAC survey results reflect the raters’ preparation for their role. Finding: The results are given in Table A24 below:

Table A24 Mentor Ratings of Their Connections with the Teacher Education Program

                                      Minimum Rating   Maximum Rating   Mean Rating   STD
Relationship with program faculty     1                5                3.87          1.12
Training for evaluation role          1                5                3.58          1.06
Understanding of program              1                5                3.65          1.01

1 = Inadequate, 2 = Barely Adequate, 3 = Adequate, 4 = More than Adequate, 5 = Excellent

In spite of low ratings of 1 (inadequate) in terms of “Relationships with program faculty,” “Training for evaluation role,” and “Understanding of program,” the mean ratings on the online survey of cooperating teachers were in the adequate range (3.58–3.87). Verified (Means are adequate, and the on-site interview raised no issues.)


Summary of Tasks Related to Quality Principle 1: Evidence of Candidate Learning

On the whole, the auditors verified the evidence cited in the Inquiry Brief for the assessments associated with the program's claims. The auditors also verified the program's history of collecting data on student progress, mainly by reviewing candidate performance on eight assessments. TEAC survey data corroborate the faculty's conclusions about student outcomes. Audit Tasks 2, 4, 6, 7, 15, and 19 were problematic in that items/scores were missing from the data audited, indicating a need for attention to data collection and management.

B. Tasks Related to Quality Principle 2: Evidence of a Quality Control System

This section of the audit report addresses targets that are associated with Quality Principle II, which has the following requirements:

2.1 Rationale for the assessments. There must be a rationale for the program’s assessment methods that explains why the faculty thinks the assessments are valid and why the criteria and standards the faculty have selected as indicating success are appropriate. 2.2 Program decisions and planning based on evidence. Where appropriate, the program must base decisions to modify its assessment systems, pedagogical approaches, and curriculum and program requirements on evidence of student learning. 2.3 Influential quality control system. The program must provide evidence, based on an internal audit conducted by the program faculty, that the quality control system functions as it was designed, that it promotes the program’s continual improvement, and that it yields outcomes specified in TEAC subcomponents 2.3.1 Curriculum, 2.3.2 Faculty, 2.3.3 Candidates, and 2.3.4 Resources.

Audit Task B1 (2.1) Target: "Although we have found a high level of inter-rater agreement within programs, we recognize that not all programs have derived the same value from the TWS. We consider this value inconsistency as a potential challenge to TWS validity and plan to explore the issue further in the future" (p. 32). Probe: Interview faculty members to determine their understanding of this inconsistency and how they plan to respond to it. Finding: Faculty members, including adjunct and other instructional personnel, agreed that the large number of students and the various individuals who evaluate the TWS affect reliability. Most programs have begun discussing how to improve and manage this challenge. Several programs (Spec. Ed, English) indicated that the School of Education has provided additional training, involving video-case analysis among raters, which has greatly improved reliability. CFAs also indicated they are involved in monthly training for development and evaluation of the TWS. Verified

Audit Task B2 (2.1) Target: "Appendix H: EPP Reliability Studies. The EPP Executive Committee conducted four reliability studies to investigate the rater agreement on the Clinical Practice Assessment System (CPAS) and Teacher Work Sample (TWS). Appendix H contains the executive reports of these four studies. The first study looked at the rater agreement within EPP programs on the CPAS. The second study investigated the within-program agreement after the faculty had completed a calibration experience to increase their agreement on the CPAS. The third study was a comparison between university supervisors' and mentor teachers' CPAS scores on the same student, while the fourth study looked at within-program agreement on TWS scores of the same student" (p. 433). Probe: Interview faculty to determine if they were involved with the CPAS reliability studies and the rationale for their use. Finding: In meetings with the faculty (27 present at the a.m. session and 32 present at the p.m. session), which involved different faculty from different programs engaging in the discussion, auditors learned that there had been widespread involvement with the reliability studies. (In the p.m. session, faculty indicated their involvement by a unanimous show of hands when the question was asked.) Comments included "A lot of our collective time was put into both the CPAS and the TWS reliability studies" and "The results of the studies worked to bring us closer together on inter-rater reliability."

They discussed the 5-point scale, particularly the two top ratings. It was stated that principals like the 5-point scale to show differences in candidates who are above average. On the other hand, several faculty commented that the 5-point scale was "unacceptable" and that "the top scores were very difficult to differentiate." One faculty member brought the meeting's attention to the table on p. 438 describing the range of the CPAS mentor teacher (MT) and university supervisor (US) inter-rater agreement scores and their meaning, particularly in transforming the CPAS to a different rating scale than the 1-5 currently being used. He stated that the main challenge for raters is agreeing on the difference between a score of 4 and a score of 5. As the discussion continued, faculty from Engineering Education indicated that, as a result of the inter-rater data review and discussions, some faculty had been rating candidates against a master teacher's skill level rather than against other student teachers. Verified

Audit Task B3 (2.1) Target: “The CPAS data comprise one of two major pillars in the EPP's assessment system, providing relevant data for decisions involved with current program strengths and weaknesses” (p. 22).


Probe: Since the CPAS is one of two major pillars in the EPP assessment system, interview faculty to determine whether it is an accurate measure of subject matter knowledge. Finding: In two meetings with faculty and a meeting with CFAs, liaisons, field supervisors, and one principal, participants indicated that the CPAS is a good tool for measuring subject matter knowledge for student teachers. One strength noted was that the CPAS instrument contains many headings that allow for customization through the use of individual sub-headings, which enables the EPP to collect and review additional data on subject matter knowledge. Another strength noted was that CFAs have reviewed recorded videos of teachers and have developed a standard rubric for using the instrument. Verified

Audit Task B4 (2.2) Target: "Using the raw 5-point ratings, no EPP program had average inter-rater agreement above 70%. When the data were transformed to consider the pass/fail inter-rater agreement, all programs showed agreement in the high 90% range. When the data were transformed to combine failing scores of 1-2 and highest passing scores of 4-5 to make a low/medium/high rating, average inter-rater agreement scores were 79% or above for all programs but two (Art Education at 73%, Theatre and Media Arts at 69%). (More detailed results of the MT-US CPAS reliability study are given in Appendix H.) This analysis shows that we have very high agreement regarding whether a candidate is qualified to teach (pass/fail decision) and that we struggle most in agreeing on whether a candidate is good or excellent" (p. 35). Probe: Interview faculty and supervisors/mentor teachers to explore their analysis that differentiating between good and excellent performance is difficult. How will the program establish criteria for scoring a 4 (good) or 5 (excellent)? Finding: Charles Graham (faculty) provided an excellent explanation of faculty processes for investigating and analyzing items on the scales and the inter-rater agreement study. Other department faculty, supervisors, and liaisons all stated that the new rating scale has made scoring advanced skills at the maximum rating much clearer. One faculty member from Engineering offered some rationale for previous challenges with differentiation: through this data analysis, faculty learned that raters had been using different levels of student expectation (based on mastery at the pre-service level and not the practicing-teacher level), which affected reliability. Aaron Popham (assessment and accreditation director) indicated that with the change the program did not create new criteria for the modified scoring categories; the former rating of 5 (excellent) was simply dropped, and the new revised CPAS rating scale contains the former 4 criteria as the maximum performance level. Verified
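The two transformations described in the target, collapsing raw 5-point ratings to pass/fail and to low/medium/high, can be sketched as follows. The exact cut points are an assumption inferred from the target's description (1-2 failing, 3 passing, 4-5 highest passing); the sketch is illustrative, not the EPP's actual code.

    def to_pass_fail(score):
        # Assumed mapping: 1-2 fail, 3-5 pass.
        return "fail" if score <= 2 else "pass"

    def to_low_med_high(score):
        # Combines failing scores 1-2 and highest passing scores 4-5, per the target.
        if score <= 2:
            return "low"
        if score == 3:
            return "medium"
        return "high"

    ratings = [1, 2, 3, 4, 5]
    print([to_pass_fail(r) for r in ratings])      # ['fail', 'fail', 'pass', 'pass', 'pass']
    print([to_low_med_high(r) for r in ratings])   # ['low', 'low', 'medium', 'high', 'high']

Collapsing categories in this way raises agreement because raters who disagree only between adjacent top scores (4 vs. 5) fall into the same transformed category.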

Audit Task B5 (2.2) Target: "Additionally, based on a review of available data, the TSA was revised twice during the review period to adapt to new technologies and to increase the rigor of the assessment. Changes in these assessments and their uses have increased their value in determining program quality and gaining insights into candidate learning" (p. 73). "Additionally, the TSA was revised twice during the review period. These revisions were made to adapt to new technologies and to increase the rigor of the assessment based on a review of available data" (p. 105). Probe: Ask authors for a copy of the newly revised TSA and a copy of the previous TSA to review the differences involved with "adapt[ing] to new technologies and increas[ing] the rigor." Finding: Auditors compared the Technology Skills Assessment 2011 with the Technology Skills Assessment 2013 and found upgrades to new technology skills resulting in increased rigor in the new TSA. For example, under "Internet and Communications" Item 3, the item "Create a blog post" was replaced with "Convert audio file from WAV to MP3 using iTunes, Audacity or any other program." A new item was added requiring students to "Convert video file from WMV to MP4 using Streamclip, Handbrake or any other program." Another example is that under "Spreadsheets" the number of items increased from 6 to 8. Verified

Audit Task B6 (2.2) Target: "The EPP had a cumbersome system for reporting and following up on data related to the number and duration of formative observations of candidates during their field experiences, resulting in many supervisors failing to submit reports of their formative observations. To address this issue, the EPP Executive Committee added a section to the CPAS assessment instrument requiring field supervisors to report formative observation data. This requirement prompts supervisors to submit these reports, which are needed to assess the quality of our program" (pp. 73-74). "The EPP had a cumbersome system for reporting and following up on data related to the number and duration of formative observations of candidates in their field experiences. Thus many supervisors did not submit reports of their formative observations. To address this issue, the EPP Executive Committee added a section to the CPAS assessment instrument requiring field supervisors to report formative observation data. This requirement prompts supervisors to report requested candidate data that are needed to assess the quality of our program" (p. 106). Probe: Ask authors for a sample of the CPAS that displays the field supervisor's report of formative observation data. Finding: Auditors were provided a copy of the most current "Brigham Young University Clinical Practice Assessment (CPAS) Form, Final Evaluation." In reviewing this document and comparing it to the same instrument in Appendix F, auditors found that at the top right corner the assessment prominently includes "Total # Formative Observations" and "Total # Formative Observation Minutes." This corner contains a box of information from the evaluator that is new to the assessment. Verified


Audit Task B7 (2.2) Target: "Standardizing the requirement that ALL fingerprinting/background checks be completed before candidates participate in any field/practicum experiences has eliminated much confusion. The new policy expanded the requirement by adding that faculty must also obtain fingerprint/background clearance if they are involved in doing research or supervision in the schools" (p. 73). Probe: Ask authors for a copy of the new fingerprint policy. Finding: Auditors reviewed a number of documents/sites regarding this policy, including drafts of the Fingerprint Background Clearance Policy, the example of the Fingerprint Background Clearance Policy in the 2010-2011 BYU Catalogue, the Fingerprint Background Clearance Policy in the 2014-15 BYU Catalogue, Fingerprint Background Clearance information on the ESS website, the Fingerprint Background Clearance Policy approved by the AVP Council, the Fingerprint Background Clearance Policy approved by UCOTE, and UCOTE meeting minutes regarding the Fingerprint Background Clearance Policy (November 16, 2010). Verified

Audit Task B8 (2.2) Target: "The PIBS and CDS have been used differently across the EPP programs during the reporting period. Upon consideration, the EPP Executive Committee decided to allow this differentiated use. As a result the PIBS is not administered to all EPP candidates; it is currently used as a "red flag" for candidates across the EPP who demonstrate lack of progress or cause particular concern for the faculty (e.g., questionable professionalism, behaviors, or dispositions). The EPP Executive Committee reviewed the data and purpose of the CDS, and as a result a new assessment, the Professional Disposition Instrument (PDI), was subsequently developed and piloted in fall semester 2013" (pp. 104-105). Probe: Ask authors for a sample of the new Professional Disposition Instrument. Finding: Auditors were provided with and reviewed the "Brigham Young University Professional Disposition Instrument (PDI) Form, Final Evaluation." It is a 2-page assessment requesting student information and evaluator information on the first page. Additionally, the first page contains candidate assessment on three factors: "The Learner and Learning," "Content Knowledge," and "Instructional Practice." "Candidate Professional Responsibility" is assessed on page 2. Verified

Audit Task B9 (2.2) Target: “Any candidate who does not receive a passing score on the Praxis II test must retake the test until a passing score is achieved. Any candidate who is unable to improve his/her score to a passing level will not be eligible to graduate as a teaching major nor be recommended to the state for licensure” (p. 26). Probe: Interview program leaders and faculty to determine how many candidates do not pass Praxis and whether the program limits the number of times candidates can take the test.

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 45 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

Finding: In the meeting with the authors it was stated that Praxis exam pass rates are "98.5%," meaning 1.5% do not pass. The authors noted that this rate has "not changed much." In addition, they indicated that they have no "second time PRAXIS test data" and "no quality data to discuss attempts," and that "the current data collection system does not have capacity to keep records of test retake frequency." The EPP does not limit the number of times a candidate can take the test. Faculty state that students are not allowed to graduate with a teaching major without passing the Praxis II, which is also a state requirement. The authors hope that the newest data system, mYlink, can assist with this type of data collection in the future. Verified

Audit Task B10 (2.2) Target: The IB indicates that the "Employer Survey (ES) data are used to learn of graduates' overall performance during their first three years of teaching" (p. 23). Results summarized on page 49 highlight positive results from 2010 and 2013, and Appendix F (pp. 332-334) reveals that only 5% of graduates were rated below average. Comments from the Employer Survey appear on pages 341-358 of Appendix F. Probe: Review faculty meeting minutes for documentation of discussion and description of proposed program goals and changes as a result of the review of the Employer Survey results. Request access to any faculty or program meeting minutes. Finding: During group faculty meetings (including adjunct and other instructional members), faculty confirmed that results are shared in faculty meetings and used to establish future program goals. However, no minutes or other documentation were provided or shared from across multiple programs. Verified with error due to lack of any documentation that Employer Survey results are shared in faculty meetings and used to establish future program goals

Program Response: We acknowledge the findings of the audit team. We are aware that program minutes have not been consistently kept across the 27 teaching majors and 23 teaching minors.

TEAC Response: TEAC appreciates the program's acknowledgment of this error.

Audit Task B11 (2.2) Target: "The CPAS data comprise one of two major pillars in the EPP's assessment system, providing relevant data for decisions involved with current program strengths and weaknesses" (p. 22). Probe: Interview faculty to learn examples of how results were used for program improvement. Finding: When interviewed, faculty referred to the use of scores and written plans in the Expected Learning Outcomes – University Assessment Online System. Evidence was found in the online system for the 2013-2014 Elementary program: "CPAS scores reflect weakness in candidates' ability to demonstrate assessment of student learning. As a result, the program modified IP&T 213 courses in Instructional Design and Assessment to address weakness in skills." The following year's report reflected improvements in CPAS scores and reduced standard deviations. Verified

Audit Task B12 (2.2) Target: TWS Section 3 (Assessment Plan) has been the lowest-scored TWS section across the EPP (p. 46). Probe: Interview faculty to determine their understanding of why the Section 3 Assessment Plan would have the lowest scores and what changes to the program will be made to address the low-scoring area. Finding: During interviews, faculty indicated that the content of assessment was not given enough emphasis in coursework. Review of the online Expected Learning Outcomes system found documentation within the plans where Elementary faculty identified assessment as a weakness and modified several instructional design courses to address the student weakness. Program changes were recorded in the Learning Outcomes – University Level Assessment matrix over multiple years. Verified

Audit Task B13 (2.3) Target: "EPP faculty are qualified to teach the courses to which they are assigned and are highly rated by their students. Students recognize and value faculty members' preparation and their teaching experience (the majority having taught K-12 in addition to university)" (pp. 70 and 107). Probe: Review the Online Student Ratings Report for 5 randomly selected faculty from Physical Education Teacher Education (PETE) and for 5 randomly selected faculty from Elementary Education (EL ED) for indicators of high ratings. Finding: Auditors reviewed the "University Mean Instructor Overall" ratings for the following courses: Fall 2013 PETE 169-001, PETE 212-00, 230-001, 234-00, & 274-001; EL ED 202-001, EL ED 203-001, 323-001, 324-001, & 331-001. The "Instructor Above Overall" rating was indicated in 5 out of 5 PETE courses and 5 out of 5 EL ED courses. Random selection was accomplished by the auditors choosing PETE and EL ED and requesting the first 5 courses listed in Section 001 of each course for PETE and the first 5 listed in Section 001 of each course for EL ED. Verified

Audit Task B14 (2.3) Target: Clarification Task 4 (see p. 12 of this Audit Report). In responding to the question of what “equal partners” means to the BYU-Public School Partnership (BYU-PSP), the faculty state: “The BYU-PSP has operated on the belief that partners working together can achieve more than they can accomplish working separately. The sense in which the partners are “equal” derives from an equal commitment to the mission and purposes of the Partnership and its shared governance. One indicator of equal commitment is through resource acquisition and allocation. Each of the partners makes significant financial and human resource contributions. This signifies that the partners are “equally yoked” to the goals, mission, and objectives of the Partnership. Another indicator is shared governance. Members of the Governing Board share the same privileges, status, and rights. A chair of the Governing Board is selected by its members and acts in concert with an executive director in carrying out the responsibilities and activities decided by the board. Board decisions direct the acquisition and use of resources and the assignment and stewardship of responsibilities. Resources and responsibilities are best deployed after discussion and deliberation among the board members, who jointly share power in decision making.” Probe: At the meeting with the superintendents who are involved with the BYU-PSP, auditors asked about their commitment to the partnership and how they define “equal partnership” in “sharing governance, sharing resources, and sharing responsibilities” (p. 2 of the Brief). Finding: All five superintendents in the partnership were in attendance: Terry Shoemaker, Wasatch School District; Rick Nielson, Nebo School District; Keith Rittel, Provo City School District; Vern Henshaw, Alpine School District; and Patricia Johnson, Jordan School District. Early on, Superintendent Vern Henshaw indicated he had been a principal when the partnership began in 1984, and that it has never been about the superintendents but about the partnership being an “integral connection to the daily work of the school districts and the university.” Comments indicated that this partnership has been in existence for 30+ years with a very high level of collaboration, commitment, and shared resources. Professional learning communities (PLCs) and a “team” approach to the partnership were noted by each superintendent; they did not talk about their individual districts but about the “partnership.” Verified

Audit Task B15 (2.3) Target: “Most departments and colleges that house the EPP have their own requirements for teaching load etc., as well as varying requirements for their students” (p. 110). “We recognize the need to determine a policy on load for on-campus faculty and field supervisors. However, most departments and colleges that house the EPP programs have their own requirements for teaching load etc., as well as varying requirements for their students. We believe consistency can be achieved with ongoing efforts to be aware of these specific program expectations and to balance them with the need for consistent EPP policies and procedures as we consider adaptations” (p. 71). Probe: Interview faculty or university administration to investigate the faculty teaching load across colleges and departments, and review teaching load policies for multiple departments. Finding: 1. During the meeting with BYU Central Administration, VP Brent Webb was able to articulate the general faculty load policy as “15 credits or equivalent for full-time faculty (3 credits for citizenship and 12 for scholarship & teaching).” However, the VP and other administrators also indicated that “each department has authority to establish unique guidelines.” 2. The BYU policies for university-level teaching load were shared via electronic document (1 November 1989). The Faculty Work Load policy states, “Normally, 12 credit hours of teaching per semester represents a regular teaching load, although this must be flexible according to college and type of course.” 3. Although VP Webb stated that the general load policy is “15 credits or equivalent for full-time faculty,” he acknowledged that some inequality exists among teaching loads across campus because “each department has flexibility.” In addition, VP Webb wrote in a memo to Lynnette Erickson (Dec. 3, 2014), “Note that the implementation process of the Faculty Workload Policy has changed (specifically, the activity reports), but the principles governing assignment of faculty load in the policy remains in force. The 12 credit hours of ‘regular teaching load’ identified in the policy is the expectation for faculty who have no research expectation.” Verified that there are inconsistencies in faculty workload among EPP programs

Audit Task B16 (2.3) Target: “The one exception to this policy is the requirement that a candidate successfully pass a USOE fingerprint/background check. Without completing this requirement, a candidate will not be allowed to begin an EPP program, since this procedure is required by Utah administrative rule for anyone entering K-12 schools” (p. 8). Probe: For a sample of 20 program completers, verify that each successfully passed a USOE fingerprint/background check. Finding: The sample of 20 students used for these audit tasks was selected by assigning each student a number with the random number generator function in Excel and sorting students into their respective majors. Within a major, students were sorted by their assigned random number, and students with the highest numbers were included in the sample (see the sketch following this task). All 20 program completers had passed the USOE-required fingerprint/background check. Verified
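The sampling procedure described above can be made concrete with a short script. The sketch below is illustrative only: the roster, majors, and the detail of pooling across majors are hypothetical stand-ins, and only the procedure itself (assign random numbers, group by major, take the highest-numbered completers) mirrors the report.

```python
"""Illustrative re-creation of the audit sampling procedure; the roster is
hypothetical, and only the mechanics mirror the report's description."""
import random
from collections import defaultdict

SAMPLE_SIZE = 20

# Hypothetical roster of (student_id, major) pairs; real audit data not shown.
roster = [(f"S{i:03d}", major)
          for i, major in enumerate(
              ["Elementary Education", "Special Education",
               "Physical Education", "Music Education"] * 10)]

random.seed(42)  # fixed seed so the sketch is reproducible

# Assign each completer a random number, as with Excel's RAND() function.
numbered = [(random.random(), student, major) for student, major in roster]

# Sort completers into their respective majors.
by_major = defaultdict(list)
for r, student, major in numbered:
    by_major[major].append((r, student))

# Within each major, order by the assigned random number (highest first),
# then pool and take the highest-numbered completers overall.
pooled = []
for major, members in by_major.items():
    members.sort(reverse=True)
    pooled.extend((r, student, major) for r, student in members)

sample = sorted(pooled, reverse=True)[:SAMPLE_SIZE]
for r, student, major in sample:
    print(f"{student}  {major}  {r:.3f}")
```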

Audit Task B17 (2.3) Target: “Any candidate who does not receive a passing score on the TWS will have a remediation plan developed and his/her student teaching will be extended to implement the plan” (p. 37). Probe: Examine files for any students from the 2010-2013 academic years who did not receive a passing score on the TWS. Check for the remediation plan. Finding: Four students were selected at random who did not receive a passing score on the TWS. Documentation was requested from academic majors. Emails from academic contacts indicated some passed the TWS on further attempts, but scores were not recorded. Remediation plans were not evident for these students, nor was student teaching extended.

Table B17 Examination of candidates’ TWS remediation plans

Name | Email from | TWS score documentation | Remediation plan | Student teaching extended
CJ | Chris Moore, Family and Consumer Sciences | Database indicates student did not pass; faculty indicated student did pass on further attempts | N | N
NS | Ronald Terry | Database indicates student did not pass; faculty indicated student did pass | N | N
AT | Chris Moore, Family and Consumer Sciences | Database indicates student did not pass; faculty indicated student did pass on further attempts | N | N
PL | Tim Morrison, Department of Teacher Education | Database indicates student did not pass; faculty indicated student did pass | N | N

Not verified, as none of the student samples contained a remediation plan, and student teaching was not extended for any of the candidates

Program Response: We acknowledge the findings of the audit team and realize that we need to incorporate in our policy more uniform expectations and documentation methods for candidate remediation. We state on page 75 of our IB in Section 5: Discussion and Plan, “To make the complexity easier to manage, we need to respond to the findings of our quality control system and work on forming clear and unified definitions, expectations, and policies. Clarification of existing policies and procedures should also be a priority. In the future we will need to negotiate and communicate concerning additional EPP policies (e.g., data submission, field supervision credit hour load equivalents, essential assessments for all EPP majors).”

TEAC Response: TEAC appreciates learning of the program’s plan for continued improvement.

Audit Task B18 (2.3) Target: “Candidates at the 2.0 level are ‘red flagged’ for concerns with their professional interpersonal behaviors” (p. 55). Probe: Examine files for any students from the 2010-2013 academic years who scored at the 2.0 level on the Professional Interpersonal Behavior Scale. How are they “red flagged” and by whom? What remediation plan was put in place for them, and how was progress monitored?


Finding: Three students were selected who scored at or below the 2.0 level on the PIBS. Documentation was requested from academic majors.

Table B18 Examination of candidates’ PIBS remediation plans

Name | Email from | Red flag | Remediation plan
MO | Tim Morrison | Yes | Yes; repeated student teaching and completed a new PIBS with a passing score
BH | Jeff Nokes | Yes | Although emails reviewed from the university supervisor and mentor teacher expressed concerns, no resolution or remediation plan was documented after the final email of concern in March
RO | Ronald Terry | No | No remediation plan

Not verified, as two of the three student samples (66.67%) selected did not have a remediation plan

Program Response: We acknowledge the findings of the audit team and realize that more uniform expectations and methods of documentation for candidate remediation of behavior and professionalism must be incorporated into the policy.

TEAC Response: TEAC appreciates the program’s further explanation of the error and of the steps that may correct it.

Audit Task B19 (2.3) Target: “Qualified advisors are available to all candidates at times that are typically convenient to them” (p. 107). Probe: Interview faculty and students to describe the quality of advisement. Finding: Although students and faculty recognized that advisors were available, students were sometimes unsure which advisor would be best positioned to assist them. One faculty member indicated a student could have four different advisors depending on the major and minor. Some students (8 of 17 at the conclusion of the discussion on advising services) indicated frustration and confusion with the quality of advisement. Verified with error due to the number of students indicating frustration with the quality of advisement

Program Response: Candidate feedback and perceptions are very valuable to us; however, the size and complexity of our program prohibit us from assigning one advisor to support any given student throughout his or her educational career, continually meeting that student’s particular needs. EPP secondary education candidates are served in at least two advisement centers: a center in their content area college as well as the Education Student Services (ESS) in the McKay School of Education. These students receive guidance in their colleges for their content area coursework and general education. The ESS provides advisement regarding their professional education coursework and their student teaching or internships; the ESS recommends them for licensure. A candidate with a teaching minor outside his or her major college will be advised through the college housing the minor as well. Working with more than one advisor may be confusing for some candidates.

TEAC Response: TEAC appreciates the program’s further explanation of the error and the rationale for its occurrence.

Audit Task B20 (2.3) Target: “In place of student teaching, a candidate may opt to complete a full-year internship. This internship allows a teacher candidate to be hired as “teacher of record” for an entire school year and be paid half of a regular teacher’s salary with full benefits. This arrangement has been approved by the university and the Utah State Office of Education (USOE)” (p. 7). Probe: Check with the Utah State Office of Education whether the practice in the target has been approved. Finding: Travis Rawlings, State Representative, Educator Licensure Coordinator, Utah State Office of Education, stated that during the time of the Brief the target was accurate. Verified

Audit Task B21 (2.3) Target: “The inquiry into our quality control system showed that how and when the assessments are administered, collected, and reported and how faculty and programs are held accountable for the process have been inconsistent across the EPP, resulting in more missing data than desired” (pp. 70-71). Probe: Review Appendix I for evidence of missing data and interview authors as to what “administered,” “collected,” and “reported” mean in the context of missing data. Finding: The authors stated they “always report the number of expected scores, then the number of actual scores” for all data sets. In addition, the percentage of expected scores actually recorded is indicated, followed by the appropriate results. The authors indicate they displayed this throughout the Brief in order to be as accurate and consistent as possible. Auditors found this to be true in reviewing data from Appendix I. Auditors selected the following pages from Appendix I (pp. 441-1303) at random: pages 500, 600, 700, 800, 900, 1000, 1100, 1200, and 1300. These pages revealed the following missing data (the percentage in each entry is the share of expected scores that were recorded):
 p. 500, Art Education, 2011-2012, Major GPA: # expected 13, # recorded 10, a difference of 3, or 76.92% recorded
 p. 600, Dance Education, 2012-2013, CPAS 7, 8 & 9: # expected 9, # recorded 8, a difference of 1, or 88.89% recorded
 p. 700, English Teaching, 2009-2010, TWS 1, 2, 3 & 4: # expected 44, # recorded 42, a difference of 2, or 95.45% recorded
 p. 800, French Teaching, 2009-2013, ES: # expected 9, # recorded 7, a difference of 2, or 77.78% recorded


 p. 900, Latin Teaching, 2009-2013, CPAS 3, 5, 6 & 7: # expected 4, # recorded 2, a difference of 2, or 50.00% recorded
 p. 1000, Music Education, 2010-2011, TWS 1-8: # expected 27, # recorded 23, a difference of 4, or 85.19% recorded
 p. 1100, School Health Education, 2012-2013, TWS 4: # expected 16, # recorded 2, a difference of 14, or 12.50% recorded
 p. 1200, Special Education, 2011-2012, CPAS 8 & 9: # expected 47, # recorded 46, a difference of 1, or 97.87% recorded
 p. 1300, Theater & Media Arts Education, 2010-2011, TWS 8: # expected 15, # recorded 9, a difference of 6, or 60.00% recorded

In addition, audit tasks already completed in this report show missing data: A4, A6, A7, A8, A9, A11, A12, A13, A14, & A15. Verified
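As a quick cross-check of the figures above, the percent-recorded values follow directly from the expected and recorded counts. The sketch below restates the sampled (program, expected, recorded) triples from Appendix I and recomputes each entry; it is a verification aid, not part of the audit record.

```python
# Recompute the missing-data figures cited from the sampled Appendix I pages.
samples = [
    ("Art Education, Major GPA", 13, 10),
    ("Dance Education, CPAS 7, 8 & 9", 9, 8),
    ("English Teaching, TWS 1, 2, 3 & 4", 44, 42),
    ("French Teaching, ES", 9, 7),
    ("Latin Teaching, CPAS 3, 5, 6 & 7", 4, 2),
    ("Music Education, TWS 1-8", 27, 23),
    ("School Health Education, TWS 4", 16, 2),
    ("Special Education, CPAS 8 & 9", 47, 46),
    ("Theater & Media Arts Education, TWS 8", 15, 9),
]
for program, expected, recorded in samples:
    difference = expected - recorded
    pct_recorded = 100 * recorded / expected
    print(f"{program}: difference {difference}, {pct_recorded:.2f}% recorded")
```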

Audit Task B22 (2.3) Target: “Any complex operation must have a solid, consistent core to which all participants can commit. Under the direction of the Center for the Improvement of Teacher Education and Schooling, a study and discussion group was formed to examine the Moral Dimensions of Teaching (Goodlad, Soder, & Sirotnik, 1990), which had been for many years the philosophical core for the BYU-Public School Partnership (BYU-PSP) and the conceptual framework for the EPP. After year-long deliberation, the group came to more fully understand the Moral Dimensions and reached consensus concerning the moral and practical underpinnings of the BYU-PSP. The group decided to retain the basic concepts but make some revisions and adjustments as needed. Out of this thinking came the BYU-PSP Vision Statement and Five Commitments, the basis of our current claims” (p. 73). Probe: Ask authors for documentation/minutes showing that the CITES study and discussion group met in a “year-long deliberation.” Finding: The authors provided a number of documents that auditors reviewed regarding the CITES study, including an August 23, 2010 letter from Steve Baugh of the Center for the Improvement of Teacher Education and Schooling (CITES) and Paul Wangemann of the BYU-Public School Partnership discussing the project, which verified the participants’ names and affiliations. Included in the list were names representing the Alpine, Jordan, Nebo, Provo, and Wasatch County school districts, participating BYU colleges of arts and sciences, and the David O. McKay School of Education. A 2010-2011 schedule of activities was included, representing the entire year, August 2010-June 2011. Included in the auditors’ review were agendas for the September 2010 (orientation), January 2011, February-May 2011, June 2011, August 2011, and September, October, and November 2011 sessions. Verified

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 53 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

Summary of Tasks Related to Quality Principle 2: Evidence of a Quality Control System

The auditors were able to verify that the program’s quality control system functions as described in Appendix A of the Inquiry Brief and that the internal audit occurred as described. A rationale for the assessments exists and was confirmed by linking claims with assessments. Program faculty members collect data regularly but may not review all the data for recommendations for program improvement. Audit Tasks B17 and B18 are problematic due to missing student samples, indicating a need for attention to data collection and management. In addition, some student frustration with the quality of advisement was evident.

C. Tasks Related to Quality Principle 3: Documentation of Program Capacity

In Table C.1 below, the auditors have indicated whether they found evidence that satisfies each requirement for monitoring and control of program quality. Hyperlinked text refers to an audit task that explores the documentation further.

Table C.1 Quality Control of Capacity: Monitoring and Control (Component 2.3)
Documents were Found, Found in Part, Not Found, Not Checked, or Not Available for Inspection with regard to each area of TEAC’s requirements.

2.3.1 Curriculum (Target #1)
Target: Formal notification from the state that it has approved the program.
Finding: Found
Auditors’ Probe: In an e-mail dated December 3, 2014, the lead auditor received notification from Travis Rawlings, Utah State Office of Education Licensure Specialist, that the BYU Educator Preparation Program is currently approved in all areas listed on pages 3 and 4 of the Inquiry Brief.

2.3.2 Faculty (Target #2)
Target: Faculty have an accurate and balanced understanding of the field.
Finding: Found
Auditors’ Probe: Request and review signed approval statements that faculty read and approved the Inquiry Brief, as stated on its cover page. On the program’s TEAC website auditors found an agenda of the EPP TEAC faculty meetings of Aug. 26 and Sept. 2, 2013. A hard copy was presented with the actual signatures of program faculty.

2.3.3 Candidates (Target #3)
Target: Admissions policy of the program is published.
Finding: Found
Auditors’ Probe: View the admissions policy at https://admissions.byu.edu/acceptance-criteria. Auditors checked several of the programs (Special Education, Early Childhood Ed., Elementary Ed., and Physical Science) and found the admissions policy information published.

2.3.4 Resources (Target #4)
Target: TEAC survey results from faculty and students are satisfactory.
Finding: Found
Auditors’ Probe: The online survey results from candidates on appropriateness of classrooms, equipment, and supplies indicate a mean rating of 4.07 (sd 0.91, N=122). The online survey results from faculty on resources for teaching indicate a mean rating of 4.29 (sd 0.80, N=95). On a scale of 1 to 5, these ratings are fairly high, but there are minimum ratings as low as 2 for both candidates and faculty.

In Table C.2 below, the auditors have indicated whether they found evidence that satisfies each requirement for monitoring and control of program quality. Hyperlinked text refers to an audit task that explores the documentation further.

Table C.2 Parity between the Program and the Institution (Component 3.1)
Documents were Found, Found in Part, Not Found, Not Checked, or Not Available for Inspection with regard to parity between the program and institution in each area of TEAC’s requirements.

3.1.1 Curriculum (Target #5)
Target: The number of credits required for a degree at the institution and in the program is comparable.
Finding: Found
Auditors’ Probe: Check catalog for program requirements and compare against a random sample of other programs at the institution. Auditors examined the BYU course catalog and found total credit hours to be 120 for a BS degree. A check of the requirements for the Political Science, Elementary Ed., Early Childhood Ed., and Mechanical Engineering programs found them to be comparable. It was observed that the engineering programs were much more prescriptive, with fewer course choices within a required area, than the education course requirements.

3.1.2 Faculty (Target #6)
Target: The percentage of faculty with terminal degrees in the program and in the institution shows parity.
Finding: Partially Found
Auditors’ Probe: Interview appropriate administrator to identify the percentage of faculty in the institution who hold a terminal degree and compare against a random sample of program faculty. Through an interview with University Vice President Brent Webb, verbal verification was given that parity exists in the percentage of program faculty with terminal degrees compared to other institution faculty. Documents were not available for review.

3.1.3 Facilities (Target #7)
Target: The space and facilities assigned to the program and to similar programs show parity.
Finding: Found
Auditors’ Probe: Tour facilities, education classes, and offices, and those for other programs, to verify. Auditors toured facilities in the McKay education building as well as the mathematics and humanities facilities and found basically the same types of classrooms, offices, and conference rooms in each. All include three types of classrooms, which the university refers to as (1) standard non-enhanced; (2) media enhanced, with a pull-down computer center connected to an overhead projector; and (3) full-tech podium, with up-to-date technology connected to the overhead projector and to the BYU Television system. The three facilities visited had all three types of classrooms available.

3.1.4 Fiscal and administrative (Target #8)
Target: The average salary of program faculty and the average faculty salary at the institution show parity.
Finding: Partially Found
Auditors’ Probe: In the Brief under Appendix B, the chart shows a marked difference between EPP faculty salaries and those of other university faculty. During an interview with BYU central administrators Craig Hart, Brent Webb, Brian Evans, and Jeff Keith, there was an understanding among those present that, in their view, it was “inappropriate to compare the EPP faculty salaries with the average for the entire university.”

3.1.5 Candidate support (Target #9)
Target: The program students have the same access to services as students in other programs at the institution.
Finding: Found
Auditors’ Probe: Interview students to verify parity of access to services as compared to students of other programs. When interviewed, students stated and agreed that students in their various programs have equal access with students in other programs to services offered on campus; examples include counseling, career placement, library, financial aid, mental health, and other support services. When interns and student teachers were interviewed, several in attendance expressed frustration about the lack of consistency resulting from multiple advisors.

3.1.6 Candidate complaints (Target #10)
Target: Candidate complaints are proportionally no greater or more significant than the complaints by candidates in the institution’s other programs.
Finding: Found
Auditors’ Probe: Interview the administrator who handles complaints and review documentation to verify that candidate complaints are proportionally no greater or more significant than the complaints by candidates in the institution’s other programs. In an interview, University Vice President Brent Webb gave verbal verification that all complaints are handled equally at the university level and not at the individual program level. He was not aware of a greater number of candidate complaints coming from the EPP as compared to the institution’s other programs. According to Aaron Popham, Assessment and Accreditation Director, the EPP does not collect data on student complaints. In an interview with CFAs, liaisons, and field supervisors, a comment was made relating to Aaron Popham’s statement: “Even though a process is in place, complaints and concerns may not get resolved at the department level.”

In Table C.3 below, the auditors have indicated whether they found evidence that satisfies each requirement for sufficiency of program quality. Hyperlinked text refers to an audit task that explores the documentation further.

Table C.3 Quality Control of Capacity: Sufficiency (Component 3.2)
Documents were Found, Found in Part, Not Found, Not Checked, or Not Available for Inspection with regard to sufficiency in each area of TEAC’s requirements.

3.2.1 Curriculum (Target #11)
Targets: Credit hours required in the subject matter are tantamount to an academic major. Credit hours required in pedagogical subjects are tantamount to an academic minor.
Finding: Found
Auditors’ Probe: Review catalog to verify. In reviewing the catalog, auditors found the subject matter to be tantamount to an academic major. Examples follow:
 A mathematics degree in Applied and Computational Mathematics requires 70 hrs. A Math Ed degree requires 74 hrs, of which 35 credits are actual math content courses and other credits are pertinent to teaching math in schools.
 A Music Ed K-12 degree with an instrumental emphasis requires 90 hours, which includes licensure hrs.
 A Music Ed K-12 degree with a choral emphasis requires 83 hours, which includes licensure hrs.
 A Commercial Music degree requires 75-76 hrs.

3.2.2 Faculty (Target #12)
Target: Full-time faculty selected at random have a terminal degree (major or minor) in the areas of the course subjects they teach.
Finding: Found
Auditors’ Probe: Review the schedule of classes and CVs for at least 5 full-time faculty to verify that they hold a terminal degree in the areas of the course subjects they teach. From the TEAC website auditors found the following information on 5 full-time faculty members:
 Elementary Ed.: Brad Wilcox holds a terminal degree in curriculum/instruction, reading and language arts. He teaches elementary reading and literacy courses.
 ECE: Kendra Hall-Kenyon has a terminal degree in human development and teaches elementary and early childhood education classes.
 Secondary Ed.: Jethro Gilspe does not have a terminal degree; however, he is currently enrolled in his first year of a PhD program. He teaches art education classes.
 Secondary Ed.: Duane Merrell does not hold a terminal degree. He has a master’s in physical science and teaches in the Physical Science Department.
 Special Education: Darlene Anderson has a terminal degree in special education. She teaches courses in counseling, psychology, and special education.

3.2.3 Facilities (Target #13)
Target: TEAC survey results from program faculty are satisfactory.
Finding: Found
Auditors’ Probe: The online survey results from candidates on availability of classrooms, equipment, and supplies indicate a mean rating of 4.00 (sd 0.97, N=121). The online survey results from faculty on facilities for teaching indicate a mean rating of 4.35 (sd 0.80, N=97). On a scale of 1 to 5, these ratings are fairly high, but there are minimum ratings as low as 1 for candidates and 2 for faculty.

3.2.3 Facilities (Target #14)
Target: Auditors’ observations of at least two class sessions found that the rooms and equipment constitute adequate instructional settings.
Finding: Found
Auditors’ Probe: Observe at least 2 class sessions to verify that the rooms and equipment constitute adequate instructional settings. One auditor observed a Teaching Literacy class (EL ED 443) taught by Dr. Brad Wilcox and observed the following:
 22 students were seated at 5 round tables.
 95% of students were using laptops.
 There were 8 banks of lights in a 12x14 classroom, as well as natural window lighting. Lighting seemed bright and inviting.
 White boards were at the front and across one side so the classroom could be positioned two different ways for seating arrangements.
 The room had 3 wall bulletin boards.
 Chairs were comfortable and on wheels.
 The classroom was equipped with an overhead projector mounted to the ceiling and a pull-down screen. The equipment was used by the students as well as the teacher.
 There was a lot of student interaction and engagement.
 The auditor observed several different teaching strategies used by Dr. Wilcox: small-group interaction, various questioning techniques, and student reporting about class observations.

3.2.4 Fiscal and Administrative (Target #15)
Targets: Statement from financial auditor attesting to the financial health of the institution. Regional accreditor’s finding of financial soundness. A composite score of 1.5 or higher from USDE in its Report on Financial Statements. Alignment of education faculty teaching load with the institution average. Qualification of program administrators for their positions. Adequacy of resources to administer the program.
Finding: Partially Found
Auditors’ Probe: Interview appropriate administrator and request documentation attesting that resources are adequate to administer the program. In an interview, University Vice President Brent Webb verified what was stated in Appendix B: the program is given adequate resources to administer it. He indicated that all colleges are treated equally, but individual deans have the autonomy to disperse resources as they deem appropriate. A memo dated December 3, 2014 from Brent Webb to Lynnette Erickson stated that internal auditors Deloitte and Touche do not provide a letter certifying fiscal operations, but they do “certify that our financial statements are prepared in accordance with generally accepted accounting principles.” He also commented that “certification cannot be separated from the financial statements, which we treat as confidential.”

3.2.5 Candidate support (Target #16)
Target: TEAC survey results from students and faculty are satisfactory.
Finding: Found
Auditors’ Probe: The online survey results from candidates on helpfulness of candidate support services indicate a mean rating of 3.89 (sd 1.06, N=122); the results from faculty indicate a mean rating of 4.56 (sd 0.63, N=99). The online survey results from candidates on availability of candidate support services indicate a mean rating of 3.95 (sd 0.97, N=121); the results from faculty indicate a mean rating of 4.52 (sd 0.66, N=97). On a scale of 1 to 5, these ratings are fairly high, but there are minimum ratings as low as 1 for candidates and mid-range 3 for faculty.

3.2.6 Policies and practices (Target #17)
Target: An academic calendar is published.
Finding: Found
Auditors’ Probe: The calendar was reviewed at http://registrar.byu.edu/registrar/acadsched/calendar.php. An academic calendar for the current year is published.

3.2.6 Policies and practices (Target #18)
Targets: Random selections of two pages in the catalog that deal with the program have no inaccurate statements about the program. Claims made in the program website and catalog are consistent with claims made in the Brief.
Finding: Found
Auditors’ Probe: Randomly select two pages of the catalog and verify that there are no inaccurate statements about the program. Auditors selected, studied, and compared two pages in the catalog, one describing the BS in Music Education and one describing the Psychology major, and did not detect any inaccurate statements about the program.

3.2.6 Policies and practices (Target #19)
Target: Grading policy of the program is published and is accurate.
Finding: Found
Auditors’ Probe: Review grading policy for accuracy at http://registrar.byu.edu/registrar/records/grades.php. This website is directed to the grading policy of the university overall and not to individuals in the program. The syllabi of 5 instructors in the program were examined:
 Blake Peterson, Math Ed 308
 Debra Dean, English 423
 Scott Hendrickson, Math 308
 Blair Bateman, Secondary Ed 276R: Exploration of Foreign Lang.
 Jeffery Nokes, History 397
All five had their grading policy clearly explained in their published syllabi. Upon examination, auditors did not see any information that would be considered inaccurate.

3.2.6 Policies and practices (Target #20)
Target: Transfer of credit policy and transfer of student enrollment policy are published.
Finding: Found
Auditors’ Probe: Review policies at http://registrar.byu.edu/registrar/transferEvaluation/transferPolicyDisclosure.php. A section of the policy on the above website states the information about the transfer of students and credits.

3.2.6 Policies and practices (Target #21)
Targets: Program has procedures for student complaints. Program provides for student evaluations of courses.
Finding: Found
Auditors’ Probe: Review policy at https://admissions.byu.edu/acceptance-criteria. On the Brigham Young University website, in the undergraduate catalog, there is a section called “General Information.” Clicking on that section reveals a tab called “Grading/Reports”; clicking on that tab and scrolling down leads to a heading titled “Student Academic Grievance Policy.” This section outlines student procedures for filing a complaint or grievance.

3.2.6 Policies and practices (Target #22)
Target: If the audited program or any option within the program is delivered in a distance education format, the auditors verify that the program has the capacity to ensure timely delivery of distance education and support services and to accommodate current student numbers and expected near-term growth in enrollment.
Finding: NA
Auditors’ Probe: Currently no distance education classes are offered.

3.2.6 Policies and practices (Target #23)
Target: If the audited program or any option within the program is delivered in a distance education format, the auditors verify that the program has a process to verify the identity of students taking distance education courses.
Finding: NA
Auditors’ Probe: Currently no distance education classes are offered.

In Table C.4 below, the auditors have documented the results of the Call for Comment, which TEAC requires be distributed “to its communities of interest and to members of the public” according to CAEP Policy XLI (see http://caepnet.org/accreditation/).

Table C.4 Call for Comment
Call for comment to third parties distributed as required by TEAC policy (Target #23): Found
# Positive Comments: 14 | # Negative Comments: 3 | # Mixed Comments: 0


Summary of Tasks Related to Quality Principle 3: Documentation of Program Capacity

Of the 23 potential targets associated with Quality Principle III, the auditors examined 23 and found that 22 of these could be confirmed with the documentation provided by the program. These targets indicate that on the whole the program has parity with the institution and has documented its capacity for quality because the preponderance (at least 75%) of the supporting evidence was consistent with claims of capacity and commitment, and/or further audit tasks verified associated capacity targets.

D. Tasks Related to Quality Principle 3: Auditors’ Judgment of Institutional Commitment

In a meeting with Vice Presidents Brent Webb, Craig Hart, Brian Evans, and Jeff Keith, the auditors were informed that the Educator Preparation Program (EPP) is a very viable part of the university and that all programs, including the EPP, are treated with parity in distribution of rank and resources. University administration indicated that it considers it inappropriate to compare the EPP faculty salaries with the average for the entire university. The administration values the program and is well aware of faculty and staff commitment to their schools and programs as well as to the mission of the university. The vice presidents focused some of their comments on the university’s commitment to its clear mission statement and core values. The mission includes a “commitment to excellence,” and the aims include a commitment to “lifelong learning and service,” which attract ambitious and bright candidates. Results of the online surveys corroborate these findings. Statements made by the vice presidents are aligned with the Mission/Aims found on page 1 of the Inquiry Brief.

TEAC also surveyed students and faculty regarding aspects of the institutional commitment to the program. Results are in Table D.1 below:

Table D.1 On-Line Student and Faculty Mean Ratings on Indicators of Institutional Commitment

Survey item | Number of Raters | Minimum Rating | Maximum Rating | Mean Rating | Standard Deviation

Student Ratings
Appropriateness of classrooms, equipment, supplies | 122 | 2 | 5 | 4.07 | 0.91
Availability of classrooms, equipment, supplies | 121 | 1 | 5 | 4.00 | 0.97
Helpfulness of student support services | 122 | 1 | 5 | 3.89 | 1.06
Availability of student support services | 121 | 1 | 5 | 3.95 | 0.97

Faculty Ratings
Institutional commitment to program | 99 | 2 | 5 | 4.28 | 0.85
Resources for teaching | 95 | 2 | 5 | 4.29 | 0.80
Facilities for teaching | 97 | 2 | 5 | 4.35 | 0.80
Helpfulness of student support services | 99 | 3 | 5 | 4.56 | 0.63
Availability of student support services | 97 | 3 | 5 | 4.52 | 0.66

Scale: 1=Inadequate, 2=Barely Adequate, 3=Adequate, 4=More than Adequate, 5=Excellent

Although minimum ratings of 1 and 2 expressed by candidates on all four indicators of institutional commitment are troubling, the mean ratings were adequate (3.89-3.95) or more than adequate (4.00-4.07).

In three of five faculty categories, minimum ratings of barely adequate (2) were found, but in all five faculty categories mean ratings were more than adequate (4.28-4.56).

Part Six: AUDIT SCHEDULE


Part Seven: CASE ANALYSIS

Case Analysis for the Inquiry Brief Pathway Brigham Young University Educator Preparation Program Provo, UT December 2-4, 2014

Presentation of the Case aligned to TEAC Quality Principles

QUALITY PRINCIPLE 1: EVIDENCE OF CANDIDATE LEARNING

Component 1.1: Evidence of candidates’ subject matter knowledge
Evidence available to the panel that is consistent with subject matter knowledge comes from: Clinical Practice Assessment System (CPAS), Employer Survey (ES), Major GPA, Teacher Work Sample (TWS), and Praxis (IB, p. 20).
--CPAS: Candidates’ mentor teachers and university supervisors complete the CPAS for all clinical experiences (practica and student teaching/internship) (IB, p. 32). The average scores for graduates are above 4.0 (on a 5-point scale) for every item except one, and in all cases the scores are well above the 3.0 basic competence level (IB, pp. 38-40). One-sample t-tests demonstrated that candidates were performing at a statistically significant level above the faculty criterion 3.0 cut score (university supervisor mean = 4.32, sd = .488, t(25245) = 136.568, p < .001; mentor teacher mean = 4.39, sd = .503, t(2514) = 138.854, p < .001) (IB, pp. 43-45); a sketch of this computation follows this component. See the following Audit Tasks: A1, A3, A7, A10, A13, A14, A16, A17, B3, B6, B11, & B21.
--ES: The BYU Office of Institutional Assessment developed the ES and sampling procedures with input from the EPP; the survey is administered every three years (IB, p. 32). The ES requests feedback from Utah public school principals on the EPP graduates’ performance during their first three years of teaching, with items related to the INTASC standards and to candidates’ preparedness to teach. With a 70.7% response rate, principals rated EPP graduates from 2010-2013 favorably in two areas: (a) teacher skills and behavior, on four items, at 97%, 94%, 94%, and 93%, respectively, and (b) knowledge, preparedness, and comparison with other teachers at 94%, 87%, and 81%, respectively (IB, pp. 49 & 67). See the following Audit Tasks: A5, B10, & B21.
--Major GPA: Major GPAs are calculated from the content, pedagogical, and clinical courses in the candidate’s major (IB, p. 31). The average major GPA during the accreditation period was 3.63, and a one-sample t-test demonstrated the candidates performed at a statistically significant level above the 2.85 major GPA requirement (mean = 3.62, sd = .28275, t(2560) = 139.485, p < .001) (IB, pp. 41 & 68). See the following Audit Tasks: A2 & B21.
--TWS: Completed during the candidates’ student teaching/internship, the TWS is a capstone assessment for all EPP majors, for which candidates develop, teach, and assess a unit of instruction and report the results. On the 0-2 scale, candidates’ average performance was significantly above the 1.0 passing cut score (m = 1.84, sd = .193). A one-sample t-test showed that candidates scored statistically significantly above the 1.0 cut score on all eight TWS items, t(2265) = 217.833, p < .001. See the following Audit Tasks: A4, A6, A8, A9, A19, B12, B17, & B21.

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 69 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

--Praxis: EPP candidates are required to pass the Praxis II test before they can be recommended for state licensure. The Praxis II pass rate among graduates was 98.5%, and one-sample t-tests showed that candidates’ aggregated mean scores were at a statistically significant level above the aggregated mean of the state cut scores at both the larger and smaller scales: t(135) = 17.962, p < .001, and t(24060) = 101.084, p < .001, respectively. See the following Audit Task: B9.
--Audit Task A20 shows results of the TEAC on-line survey.
Evidence available to the panel that is inconsistent with subject matter knowledge: None
Rival explanations for the claim that the evidence is consistent with subject matter knowledge, showing that the evidence is consistent instead with something separate from the quality principle: None
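The one-sample t-tests reported for CPAS, Major GPA, TWS, and Praxis all take the same form: candidates’ scores are compared against a fixed faculty cut score. A minimal sketch of that computation, using hypothetical generated scores rather than program data:

```python
"""One-sample t-test against a cut score, as reported throughout Component 1.1.
The generated scores are hypothetical stand-ins for candidate ratings."""
import numpy as np
from scipy import stats

CUT_SCORE = 3.0  # faculty criterion for basic competence on the 5-point CPAS

rng = np.random.default_rng(0)
scores = rng.normal(loc=4.3, scale=0.5, size=500)  # hypothetical ratings

t_stat, p_value = stats.ttest_1samp(scores, popmean=CUT_SCORE)
print(f"mean = {scores.mean():.2f}, sd = {scores.std(ddof=1):.3f}, "
      f"t({len(scores) - 1}) = {t_stat:.3f}, p = {p_value:.3g}")
```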

Component 1.2: Evidence of candidates’ pedagogical knowledge
Evidence available to the panel that is consistent with pedagogical knowledge comes from: CPAS, ES, Major GPA, TWS, and Praxis (IB, p. 20).
--CPAS: See above
--ES: See above
--Major GPA: See above
--TWS: See above
--Audit Tasks A21 & A22 show results of the TEAC on-line survey.
Evidence available to the panel that is inconsistent with pedagogical knowledge: None
Rival explanations for the claim that the evidence is consistent with pedagogical knowledge, showing that the evidence is consistent instead with something separate from the quality principle: None

Component 1.3: Evidence of candidates’ caring and effective teaching skill
Evidence available to the panel that is consistent with caring and effective teaching skill comes from CPAS, ES, Major GPA, TWS, Praxis, and the Professional Interpersonal Behavior Rating Scale (PIBS) (IB, p. 20).
--CPAS: See above
--ES: See above
--Major GPA: See above
--TWS: See above
--PIBS: Instructors, clinical supervisors, mentor teachers, and teacher candidates complete these evaluations at various times during the candidates’ professional coursework and/or clinical experiences (IB, p. 32). After three revisions, the PIBS has been designated as a “red flag” instrument for faculty or program administrators to use when a candidate needs particular feedback or remediation on his or her professionalism and interpersonal dispositions. Use of the PIBS is a program-by-program option. Average scores for each individual assessment item are all above 3.5. Candidates at the 2.0 level are “red flagged,” as noted above; a sketch of this screen follows this component. The one-sample t-test results showed that candidates performed at a statistically significant level above the 2.0 “red flag” score, with p < .001 on all 10 PIBS items (IB, pp. 46 & 47). See Audit Tasks B8 & B18.
Evidence available to the panel that is inconsistent with caring and effective teaching skill: None
Rival explanations for the claim that the evidence is consistent with caring and effective teaching skill, showing that the evidence is consistent instead with something separate from the quality principle: None
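A simple way to picture the “red flag” screen is as a threshold filter over PIBS scores. The sketch below is hypothetical: candidate IDs and scores are invented, and it assumes the flag applies to a candidate’s mean item score, which the Brief does not specify.

```python
"""Hypothetical 'red flag' screen over PIBS scores (threshold from the Brief;
flagging on the mean item score is an assumption of this sketch)."""
RED_FLAG_THRESHOLD = 2.0

pibs_scores = {  # candidate -> scores on the 10 PIBS items (invented data)
    "Candidate A": [2.0, 1.8, 2.1, 1.9, 2.0, 2.2, 1.7, 2.0, 1.9, 2.0],
    "Candidate B": [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.4, 4.1, 4.0, 3.9],
}

for candidate, scores in pibs_scores.items():
    mean_score = sum(scores) / len(scores)
    if mean_score <= RED_FLAG_THRESHOLD:
        print(f"Red flag: {candidate} (mean PIBS = {mean_score:.2f})")
```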

Component 1.4: Evidence that the cross-cutting themes are embedded
Evidence available to the panel that is consistent with the cross-cutting themes comes from CPAS, ES, Major GPA, TWS, Praxis, the Clinical Disposition Score (CDS), and the Technology Skills Assessment (TSA) (IB, pp. 20 & 21).
--CPAS: See above
--ES: See above
--Major GPA: See above
--TWS: See above
--CDS: This candidate self-report instrument is administered as a pre/post assessment at program admission and again at the student teaching semester or the second semester of a year-long internship (IB, p. 31). The CDS is used by faculty and administrators to better understand candidates’ perceptions of their dispositions and self-efficacy at the beginning and end of their programs. Paired-sample t-test results showed that the post-program CDS scores on the Locus of Control (3.78), Aspirations (3.51), and Diversity (4.19) scales are significantly greater than the pre-program CDS scores on the Locus of Control (3.70), Aspirations (3.41), and Diversity (4.05) scales (IB, pp. 42 & 43); a sketch of this pre/post comparison follows this component. See Audit Tasks A11, A12, & A18.
--TSA: This performance assessment is administered prior to or near the beginning of each of the teacher preparation programs (IB, p. 32). The TSA is a performance-based assessment representing basic technological skills foundational for eventually integrating technology into teaching. Candidates must successfully complete each assessment in 30 minutes or less with 100% accuracy on the evaluated skill. Results indicate that on each of the four assessments a significant number were able to pass the first time: 84.98% (Word Processing), 71.46% (Spreadsheet), 70.99% (Presentation Software), and 88.85% (Internet & Communications) (IB, pp. 48 & 49).
--Audit Task A23 shows results of the TEAC on-line survey.
Evidence available to the panel that is inconsistent with the cross-cutting themes: None
Rival explanations for the claim that the evidence is consistent with the cross-cutting themes, showing that the evidence is consistent instead with something separate from the quality principle: None
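The CDS pre/post comparison above is a paired-sample t-test on matched admission and exit scores. A minimal sketch under that assumption, with invented scores in place of program data:

```python
"""Paired-sample t-test for a pre/post disposition scale, mirroring the CDS
analysis described above; the scores below are invented, not program data."""
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pre = rng.normal(loc=3.70, scale=0.40, size=200)          # scores at admission
post = pre + rng.normal(loc=0.08, scale=0.20, size=200)   # same candidates at exit

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"pre mean = {pre.mean():.2f}, post mean = {post.mean():.2f}, "
      f"t({len(pre) - 1}) = {t_stat:.2f}, p = {p_value:.3g}")
```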

Component 1.5: Evidence of valid interpretations of the assessments
Evidence available to the panel that is consistent with valid interpretations of the assessments: Pages 33-37 of the IB outline the faculty’s case for the validity and reliability of each assessment. Table 8 on IB p. 37 provides a summary of validity and reliability assessment measures. See Audit Tasks A24, B2, & B4.
--Audit Task A24 shows results of the TEAC on-line survey.
Evidence available to the panel that is inconsistent with valid interpretations of the assessments: See Audit Task A19 as evidence of missing data.

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 71 Council for the Accreditation of Educator Preparation Inquiry Brief Pathway

Rival explanations for the claim that the evidence is consistent with valid interpretations of the assessments, showing that the evidence is consistent instead with something separate from the quality principle: None

QUALITY PRINCIPLE 2: EVIDENCE OF FACULTY LEARNING AND INQUIRY

Component 2.1: Rationale for assessments
Evidence available to the panel that is consistent with a rationale for assessments: The faculty present a rationale for their assessments on pages 22-28 of the IB. See Audit Tasks A1, B2, B3, B4, & B5.
Evidence available to the panel that is inconsistent with a rationale for assessments: None
Rival explanations for the claim that the evidence is consistent with a rationale for assessments, showing that the evidence is consistent instead with something separate from the quality principle: None

Component 2.2: Evidence that program decisions and planning are based on evidence
Evidence available to the panel that is consistent with the program basing its decisions on evidence: The faculty describe plans based on their response to the evidence collected for the writing of their IB on pages 71-77. See Audit Tasks B4, B5, B6, B8, B11, & B12.
Evidence available to the panel that is inconsistent with the program basing its decisions on evidence: None
Rival explanations for the claim that the evidence is consistent with the program basing its decisions on evidence, showing that the evidence is consistent instead with something separate from the quality principle: None

Component 2.3: Evidence of an influential quality control system
Evidence available to the panel that is consistent with an influential quality control system: See IB, Appendix A (pp. 79-108).
Evidence available to the panel that is inconsistent with an influential quality control system: Missing data were found in Audit Tasks A2, A4, A6, A7, A9, A11, A12, A13, A15, & A19; missing remediation plans are described in Audit Tasks B17 and B18. Auditors acknowledge that faculty included statements in the IB on pages 47, 48, 67, & 74 recognizing problems with missing data.
Rival explanations for the claim that the evidence is consistent with an influential quality control system, showing that the evidence is consistent instead with something separate from the quality principle: None

QUALITY PRINCIPLE 3: EVIDENCE OF INSTITUTIONAL COMMITMENT AND CAPACITY FOR PROGRAM QUALITY

Evidence of institutional commitment and capacity for program quality
Evidence available to the panel that is consistent with the capacity for program quality: See the Brief, Appendix B, and Tables C.1, C.2, and C.3 in the audit report.
Evidence available to the panel that is inconsistent with capacity for program quality, showing that the evidence is consistent instead with something separate from the quality principle: None

Suggested Recommendations

Suggested Weakness in Component 2.3

Missing data were evident in a number of audit tasks. The faculty acknowledged these findings, alerting the audit team in Sections 4 and 5 of the Inquiry Brief that the processes and procedures they depended on would yield findings with multiple instances of missing data.

Suggested Accreditation Recommendation (shaded)

Accreditation status designations | Quality Principle 2: Faculty learning and inquiry | Quality Principle 3: Capacity & Commitment | Quality Principle 1: Candidate learning
Accreditation (7 years) | Above standard | Above standard | Above standard
Accreditation (2 years) | Above standard | Below standard | Above standard
Accreditation (2 years) | Below standard | Above standard | Above standard
Accreditation (2 years) | Above standard | Above standard | Below standard
Deny | Below standard | Below standard | Above standard
Deny | Below standard | Above standard | Below standard

©CAEP1140 19th Street NW Suite 400 Washington, DC 20036  202.753.1630 www.caepnet.org 73
