ABHES PROGRAM EFFECTIVENESS PLAN (PEP) MANUAL
Updated January 2012

TABLE OF CONTENTS

The Purpose of the Program Effectiveness Plan (PEP)
Developing, Implementing, and Monitoring the Program Effectiveness Plan
Format and Content Guidelines

Subsection 1 – Program Effectiveness Plan Content
Title/Cover Page
a. Student Population
b. Program Objectives
c. Program Retention Rates
d. Program Job Placement Rates
e. Credentialing Examination Participation Rates
f. Credentialing Examination Pass Rates
g. Program Assessment
h. Student, Clinical Extern Affiliate, Graduate and Employer Satisfaction Surveys
i. Faculty Professional Growth and In-Service Activities

Subsection 2 – Outcomes Assessment
i. Historical Outcomes
ii. Types and Uses of Assessment Data
iii. Initial Baseline Rates and Measurement of Results
iv. Summary and Analysis of Data Collected
v. How Data Is Used to Improve the Educational Process
vi. Goal Adjustment
vii. Activities Undertaken to Meet Future Goals
Format Examples
Conclusion

PURPOSE OF THE PROGRAM EFFECTIVENESS PLAN

The Program Effectiveness Plan (PEP) is an internal quality assessment tool that evaluates each program within an educational institution by

• establishing and documenting specific goals,
• gathering outcome data relevant to these goals,
• analyzing outcomes in relation to benchmarks and the program's short- and long-term objectives, and
• designing strategies to improve program performance.

The program effectiveness assessment is expected to result in the achievement and maintenance of outcomes. For each of the outcomes identified by a program, the program establishes the level of performance that serves as a benchmark for acceptable program performance. These benchmarks meet or exceed requirements established by any applicable state or federal authority and by ABHES policies and/or standards.

Program success is based on student achievement in relation to its mission, including but not limited to consideration of the following:
• Retention rates
• Participation in and results of licensing and certification examinations
• Graduation rates
• Job placement rates
• Program assessment
• Survey responses from students, clinical externship sites, graduates, and employers
• Faculty professional growth and in-service activities

Developing and using the Program Effectiveness Plan (PEP) should fulfill several purposes, including:

1. Assisting the institution in achieving internal effectiveness through establishing goals for short- and long-term successes. Further, criteria for measuring the accomplishment of these goals can be defined, allowing the institution to focus its plans and activities on the critical processes needed for effectiveness. Once defined, these goals and criteria should then be used to unify administrative and educational activities, which can achieve a high degree of commitment and common direction among all employees.

2. Assessing progress and the need for change and continuously reviewing the process to help the institution make timely changes based upon valid information to achieve even greater effectiveness.

3. Communicating key information about the institution's goals, its degree of effectiveness, and how it plans to enhance overall quality to external publics such as graduates, employers, and community leaders. Information depicting the most important elements of the institution's operation communicates clearly and accurately to external publics how well the institution is meeting the needs of students and providing quality learning experiences.

4. Measuring how the PEP meets the expectations and requirements of approving or accrediting organizations, including state boards and ABHES, to demonstrate regulatory compliance. A document which defines institutional goals and educational processes is a primary focus of most accrediting agencies as they measure overall effectiveness and the quality of programs and services provided.

All goals and activities are key indicators of program effectiveness and should relate to the institution’s mission to demonstrate mission achievement and continuous improvement, as the institution’s mission is the impetus and barometer of the program’s effectiveness. The PEP requires an institution to look at its past, present, future, and strategies and to continuously ask:

Where have we been? This data becomes the baseline for gauging and demonstrating improvements.

Where are we now? Current data demonstrates how you will measure change from the baseline data, using the comparison to identify changes needed.

Where do we want to go? A look toward the future for goals to improve or enhance processes and/or programs.

How do we get there? Processes used to achieve the new direction based upon the input of all relevant constituents.


DEVELOPING, IMPLEMENTING, AND MONITORING THE PROGRAM EFFECTIVENESS PLAN (Standards and Examples)

The standards addressing the Program Effectiveness Plan may be found in the ACCREDITATION MANUAL, 17th Edition, Chapter V, Section I, pages 72-76, as published by the ACCREDITING BUREAU OF HEALTH EDUCATION SCHOOLS (ABHES). The standards outline the ABHES requirements for the development, implementation, and maintenance of the PEP, including the outcomes assessment requirements (Section I, Subsection 2), and give a detailed description and explanation of the meaning and implications of the required components of the PEP. This manual provides suggestions and examples for addressing each PEP standard.

Developing a PEP requires that each program collect, maintain, and use information reflecting the areas outlined in Chapter V, Section I of the ABHES Accreditation Manual. The data should be analyzed for a specific 12-month period as defined by the institution and used as the foundation for making comparisons across future time periods. Many institutions perform their analysis in conjunction with their fiscal/calendar year or with the ABHES annual reporting period (July 1 – June 30), since the majority of the required PEP information is also required on the ABHES Annual Report. Regardless of the selected timeframe, the data is to be updated at least annually.

The PEP is unique to the program and institution, and the institution must evidence the efforts it makes to ensure continuous improvement. The process requires that the institution: (1) systematically collect data and information on each of the educational outcomes areas and achievement of its occupational objectives at least annually; (2) complete an analysis of the data and information including, but not limited to, performing a comparison with previous findings; and (3) identify what changes in educational operations or activities it will make based on the analysis.

Steps in preparing and managing the PEP are similar to those suggested for preparing an institution’s self-study. Structured organization is essential. Although the exact organizational procedures will vary from institution to institution, the following suggestions may be helpful:

• The program faculty (full time and part time), assisted by the president/director, director of education, and a representative from admissions and placement, are the key individuals acting as a team to initiate, guide, and direct the development and implementation of the PEP. It is their commitment to the PEP and empowerment of the team to oversee these activities that will ensure continuous improvement and the ultimate success of the planning process.

• The process is a collective effort that should involve all faculty, administrators, staff, and advisory board members. Consideration should also be given to actively recruiting student, graduate, and employer representatives into the process. It is important that all members of the administration, faculty, governing board, and student body understand and appreciate the importance of the PEP and its value to the institution.

• Establish subcommittees to prepare specific PEP sections. These subcommittees should be effectively utilized to complete the various tasks in all facets of the PEP, including development, implementation, and evaluation. The selection of subcommittee members should depend on each member's responsibilities. Include the names of those responsible for implementing and monitoring the PEP.

• Establish baseline rates developed through analyzing the results of past annual retention and placement rates, which will be used in the analysis process. The data collected each year on the ABHES Annual Report includes retention and placement percentages; therefore, it is a valuable part of the PEP. Each program should maintain these annual reports, with supporting documentation, for at least three years so as to provide historical data from which goals may be set. Be specific in the data to be collected, and collect data that will clearly evidence the level of educational outcomes and satisfaction experienced by current students, graduates, and employers.

• The PEP may include any other elements determined to be important measures of program effectiveness, such as a review of default rates in the student loan programs under Title IV of the Higher Education Act, based on the most recent data provided by the Secretary of Education. These findings may be coupled with student retention and placement rates to determine what correlation, if any, exists. Any correlation identified should be reviewed for correction.

• Because the PEP focuses on overall program improvement, it is a work in progress, as there are many potential elements of the institution's daily operations that are relevant and important to improving effectiveness. Each program is encouraged to collect a variety of statistical data that will assist it in improving educational outcomes.

• The PEP team and subcommittees should adopt and implement a realistic and enforceable periodic schedule throughout the year to review the PEP and document progress through minutes of all meetings where the PEP is discussed. The meeting minutes should show the progress to date, a short summary of the data analyzed, changes anticipated, and the continuation or new direction the institution is taking to improve the educational processes. Minor revisions to goals may be made during the monitoring of the PEP; however, substantial revisions should only be made at the annual review unless there is a major change in the institution's leadership and/or mission. These periodic meetings will ensure that the PEP is utilized and evaluated on a continuing basis.

PROGRAM EFFECTIVENESS PLAN FORMAT AND CONTENT GUIDELINES

While each program must address each element required of the Program Effectiveness Plan (PEP), the plan may be a comprehensive one that collectively represents all programs within the institution, or there may be individual plans for each distinct program. The following section is to serve as a guide, as it contains elements that should be incorporated into the PEP. Each standard is given, followed by examples of how an institution might demonstrate established goals and compliance.

Title/Cover Page, to include the following:
ABHES I.D. Code
Name of Institution
Address
City
Name of Program
Program Director
Credential awarded
Portion of program offered via distance learning
Length of program (clock hours, semester/quarter credits, weeks, etc.)
12-month period covered by the plan (e.g., July 1, 20?? through June 30, 20??)

V.I.1. A program has an established documented plan for assessing its effectiveness as defined by specific outcomes.

The Program Effectiveness Plan includes all of the following standards, clearly stated:

STANDARD a. Student population

A description of the characteristics of the student population is included in the plan. Student population demographics such as gender ratios, median age, race/ethnicity, marital status, and socioeconomic descriptions should also be included and identified by program if they differ from the overall institutional demographics.

EXAMPLES: There are many ways an institution may identify such information. The following examples include narrative, listing, and chart formats.

Narrative Format
The institution's student population has doubled over the last three years and is represented by a diversity of demographic characteristics. Approximately 80% of the population is independent with an average annual income below $22,000, and 20% are dependent with an average annual family/household income of $40,000. The male to female ratio is 39% to 61%, respectively, and student ages range from 18 to 63. Recent business closings have resulted in an increase in the student population of dislocated workers seeking retraining. The majority of students require some form of financial assistance. The race/ethnicity composition is African American/Black 13%, American Indian/Alaskan Native 0.3%, Asian/Pacific Islander 0.7%, Hispanic 1.5%, Mexican American 0.5%, Caucasian/White 78.8%, and undisclosed race 5.1%.

Listing Format
In the 2010-2011 Annual Report year, the student body consisted of approximately:
61% female, 39% male
60% attend day classes; 40% attend evening classes
83% earned an average or above grade in high school English
75% earned an average or above grade in high school math
9% English as a second language
58% high school diploma; 12% GED
30% had prior postsecondary education
71% were first in family to receive postsecondary education
61% were employed
80% were independent with a household income of $22,000 or less
91% attended full-time classes and 9% part-time
36% were married
29% under age 25, 34% age 25-34, 26% age 35-44, 7% age 45-54, 4% age 55+
African American/Black 13%, American Indian/Alaskan Native 0.3%, Asian/Pacific Islander 0.7%, Hispanic 1.5%, Mexican American/Chicano 0.7%, Caucasian/White 78.8%, undisclosed race 5%

Chart Format (program rows to be completed by the institution)

PROGRAM                              Gender ratio   Median   Race/Ethnicity   Marital   Socioeconomics
                                     (M / F)        age      (W / NW / U)     status    (Independent & <$22,000)
Nursing Assistant
Dental Hygienist
Massage Therapist
Medical Assistant
Medical Billing & Coding Specialist
Patient Care Specialist
Pharmacy Technician
Phlebotomy Technician
INSTITUTION TOTALS                   39 / 61                  82 / 13 / 5

STANDARD b. Program objectives

Program objectives are consistent with the field of study and the credential offered and include as an objective the comprehensive preparation of program graduates for work in the career field.

Program characteristics for each currently offered program should include:
• Degree level,
• Program description,
• Program objectives, and
• Description of student outcomes, specifying the competencies students should possess upon conclusion of the program.

If an institution offers Medical Assistant, Nursing Assistant, and Surgical Technology programs, then its PEP might present the following overview:

EXAMPLE: The Medical Assistant academic associate's degree program prepares the student to become a multi-skilled allied health professional with diverse duties in medical offices, clinics, and health centers. The program includes a balance of classroom, laboratory, and clinical experiences. Objectives of the program are to:
• Prepare a knowledgeable entry-level employee with the technical skills and work habits necessary to perform effectively in various health-care related fields, including medical transcriptionist, medical billing specialist, medical office manager, and medical assistant.
• Provide clinical activities that include assisting the physician in patient care responsibilities by recording medical histories, taking vital signs, preparing the patient for examination, assisting the physician during patient examinations and surgical procedures, collecting and performing various laboratory tests, administering medications, performing diagnostic procedures such as EKGs and dressings, and providing patient education.
• Teach courses in anatomy, physiology, pharmacology, computer applications, clinical procedures, interpersonal skills, confidentiality, medical ethics, professional behavior, and patient interface, as well as basic office procedures, to ensure competency.

At the completion of the program, the student will be able to:
• Assume a wide range of responsibilities in a medical office or ambulatory care center.
• Communicate with patients to schedule appointments and receive and process payments.
• Sit for the credentialing examination.

The Nursing Assistant diploma program prepares the student to function under the supervision of a physician and/or a registered nurse and to participate as a member of a healthcare team in providing nursing care. The program includes classroom, laboratory, and clinical patient care experiences. Objectives of the program are to:
• Prepare a competent nurse assistant to function effectively in acute, long-term care, and ambulatory settings.
• Provide a collaborative learning environment in which the student will develop and apply principles of systematic reasoning through critical thinking.
• Guide the learner in the continuing process of personal and professional growth.

At the completion of the program, the student will be able to:
• Function in the delivery of care to clients.
• Communicate with clients, client families, and members of the healthcare team.
• Perform nursing skills applying critical thinking.
• Integrate ethical, professional, and legal responsibility and accountability into actions and decisions.
• Assume responsibility for personal and professional growth.
• Sit for the State certification board exam.

The Surgical Technology certificate program prepares the graduate to function as an intraoperative team member under the direct supervision of a surgeon or registered nurse. The graduate is prepared for this role through didactic, laboratory, and external clinical experiences. Objectives of the program are to:
• Prepare the graduate for a professional career.
• Prepare a competent surgical technologist to perform intraoperative first scrub duties.
• Guide the learner in the processes for certification and professional development.

At the completion of the program, the graduate will be able to:
• Effectively perform pre-, intra-, and post-operative duties.
• Practice aseptic and sterile technique.
• Practice all patient safety measures and act in an ethical manner.
• Assume responsibility for personal and professional growth.
• Sit for a national certification exam.

Continue the same format for all programs offered at the institution.

STANDARD c. Program retention rate

At a minimum, an institution maintains the names of all enrollees by program, start date, and graduation date. The method of calculation, using the reporting period July 1 through June 30, is as follows:

(EE + G) / (BE + NS + RE) = R%

EE = Ending Enrollment (as of June 30)
G = Graduates
BE = Beginning Enrollment (as of July 1)
NS = New Starts
RE = Re-Entries
R% = Retention Percentage

Include the retention results for the last three annual reporting years as the baseline, if available, along with goals for the upcoming year. If an institution has developed long-term goals for retention, this information should also be included with status updates.
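For illustration only (ABHES does not prescribe any software), the calculation can be expressed as a short function; the figures in the usage lines are hypothetical.

```python
def retention_rate(ee, g, be, ns, re_entries):
    """ABHES annual retention: (EE + G) / (BE + NS + RE), as a percentage."""
    return 100 * (ee + g) / (be + ns + re_entries)

# Hypothetical program-year figures (July 1 - June 30):
# 60 enrolled on July 1, 40 new starts, 7 re-entries,
# 55 still enrolled on June 30, 20 graduates.
print(round(retention_rate(ee=55, g=20, be=60, ns=40, re_entries=7), 1))  # 70.1
```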

EXAMPLE: Retention rates for the past three years, taken from the institution's Annual Report:

                      2008-2009   2009-2010   2010-2011
Medical Assistant        67%         69%         70%
Nursing Assistant        64%         65%         67%
Surgical Technology      80%         81%         85%

To establish the goals for the next reporting period (2011-2012), an institution may choose to average the three previous years for each program. However, in this example the goal would be below the 70% benchmark in the Medical Assistant and Nursing Assistant programs; therefore, this would not be a practical way to determine the next year's program goal. A program may instead establish its goal as an increase of a given percentage each year, such as five percent, or by averaging the year-to-year increases across the three previous years. Note in the example that the Surgical Technology program increased retention 1% between 2008-2009 and 2009-2010 and then increased 4% between 2009-2010 and 2010-2011, so the average increase across those three years is 2.5%. Using the averaged-increase method, a realistic 2012 goal then might be 87.5%. The program may also establish intermittent goals of a percentage increase from month to month, or an increase in relation to the same month or other predetermined time periods in the previous year (e.g., a 1% increase from month to month, or a 2% increase in April 2012 over the April 2011 rate). Intermittent goals are advantageous as they keep everyone on target throughout the year. The chart below shows a comparison of the Medical Assisting, Nursing Assisting, and Surgical Technology programs to overall retention. The next step is to develop improvement plans and strategies for achieving the projected 2012 retention rates for each of the three programs. Surgical Technology is doing well, so its goal may be either to increase by 2.5% as stated previously or to maintain the 85% retention rate. The Medical Assisting program is just at 70% but did show a 1% increase between 2009-2010 and 2010-2011; therefore, it would be realistic to challenge it with perhaps another 1-2% increase for 2012. For the Nursing Assistant program to reach 70% would require an increase of 3% in one year. By establishing program goals, the institution is assured that all programs are working toward overall retention improvement.
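A minimal sketch of the averaged-increase method just described, using the Surgical Technology rates from the table above:

```python
# Surgical Technology retention rates, 2008-09 through 2010-11 (from the table).
rates = [80, 81, 85]

# Year-over-year increases: 1 point, then 4 points; average = 2.5 points.
increases = [later - earlier for earlier, later in zip(rates, rates[1:])]
avg_increase = sum(increases) / len(increases)

goal_2012 = rates[-1] + avg_increase
print(goal_2012)  # 87.5
```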

[Chart: annual retention rates for the Medical Assistant, Nursing Assistant, and Surgical Technology programs and overall, 2008-09 through 2010-11, on a 0-100% scale]

Other areas that might be considered to address retention include setting:
• an average daily attendance goal (for example, 90%);
• a maximum withdrawal rate per quarter (for example, 10%);
• a quarterly retention goal; and
• quarterly grade distribution goals for the percentage of As and Bs in selected courses/classes (for example, anatomy and physiology), as shown below.

Quarter    As %   Bs %
Dec 09      45     22
Mar 10      62     35
Jun 10      62     35
Sept 10     61     32
Dec 10      50     26
Mar 11      65     14
Jun 11      49     34
Sept 11     27      8
Dec 11      54     31
Mar 12      72     25
Jun 12      83     11
Sept 12     51     32
Total      681    305
Mean        57%    25%
Total average As & Bs: 82%

Based on this distribution, the institution might elect to develop strategies to maintain the 82% rate or raise the goal to 85%. Each quarter, an intervention plan might be developed for those struggling students not making As and Bs. Such an intervention plan might enhance retention.

• Similarly, quarterly grade distribution goals could be set for overall enrollment performance. In the analysis, trends of grade distribution are noted; goals then could be set to raise the percentage of As and Bs while reducing the percentage of Ds, Fs, and Ws, accompanied by departmental strategies.

Average Quarterly Grade Distribution for March 2010 - June 2011

Quarter    EOQ Students   Total FTEs   As%   Bs%   Cs%   Ds%   Fs%   Ws%
Mar 10         571           1606       35    33    16    11     5     0
June 10        354           1118       32    29    17     9     3    10
Sept 10        391           1180       32    28    16    13     2     9
Dec 10         417           1295       36    34    15    12     2     1
Etc.
Total                       15134      376   378   209   103    38    95
Mean                         1261       31%   32%   17%    9%    3%    8%
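A monitoring script of this kind (illustrative only; the lists reproduce the anatomy and physiology table above) can recompute the combined A/B benchmark each quarter:

```python
# Quarterly percentages of As and Bs from the anatomy and physiology table.
as_pct = [45, 62, 62, 61, 50, 65, 49, 27, 54, 72, 83, 51]
bs_pct = [22, 35, 35, 32, 26, 14, 34, 8, 31, 25, 11, 32]

mean_as = sum(as_pct) / len(as_pct)  # about 57%
mean_bs = sum(bs_pct) / len(bs_pct)  # about 25%
combined = mean_as + mean_bs         # about 82%, the benchmark to maintain or raise

print(round(mean_as), round(mean_bs), round(combined))  # 57 25 82
```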

STANDARD d. Job placement rate in the field

An institution has a system in place to assist with the successful initial employment of its graduates and is required to verify employment after the initial employment date. At a minimum, an institution maintains the names of graduates, place of employment, job title, employer telephone numbers, employment date, and verification dates. For any graduates identified as self-employed, an institution maintains evidence of employment. Documentation in the form of employer or graduate verification forms or other evidence of employment is retained. The method of calculation, using the reporting period July 1 through June 30, is as follows:

(F + R) / (G - U) = P%

F = Graduates placed in their field of training
R* = Graduates placed in a related field of training
G = Total graduates
U** = Graduates unavailable for placement
P% = Placement percentage

*Related field refers to a position wherein the graduate’s job functions are related to the skills and knowledge acquired through successful completion of the training program.

**Unavailable is defined only as documented: health-related issues, military obligations, incarceration, continuing-education status, or death.

Important note: Graduates pending the credentialing/licensure required to work in the field in a regulated profession, and thus not employed or not working in a related field as defined above, should be reported through the back-up information required in the Annual Report. This fact will then be taken into consideration if the program placement rate falls below expectations and an Action Plan is required by ABHES.
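Again as an illustration only, the placement formula with its exclusions can be written directly; the counts below are hypothetical.

```python
def placement_rate(f, r, g, u):
    """ABHES placement: (F + R) / (G - U), as a percentage.

    U counts only the documented "unavailable" categories listed above
    (health, military, incarceration, continuing education, death).
    """
    return 100 * (f + r) / (g - u)

# Hypothetical year: 26 graduates, 2 unavailable, 18 placed in the field,
# 3 placed in a related field.
print(round(placement_rate(f=18, r=3, g=26, u=2), 1))  # 87.5
```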

EXAMPLES: Placement results for the same annual reporting years identified above for retention are used as the baseline data, if available, along with goals for the upcoming year. In addition, if an institution has developed long-term goals for placement, this information should also be included with status updates.

Placement rates for the past three years, beginning with 2008-2009, taken from the institution's ABHES Annual Reports:

                      2008-2009   2009-2010   2010-2011
Medical Assistant        88%         90%         94%
Nursing Assistant        85%         88%         91%
Surgical Technology      86.5%       88%         91.3%

These rates indicate a steady annual increase, and all rates exceed the 70 percent ABHES benchmark. The chart below shows a comparison of the three programs to overall placement.

[Chart: annual placement rates for the Medical Assistant, Nursing Assistant, and Surgical Technology programs and overall, 2008-09 through 2010-11, on an 80-94% scale]

Since these are good placement rates, the institution may elect to hold at these rates for 2012 and develop strategies to maintain them, or it may elect to increase by a given percentage for 2012, such as one percent, or use an average of the increases.

STANDARD e. Credentialing examination participation rate

Participation of program graduates in credentialing or licensure examinations required for employment in the field in the geographic area(s) where graduates are likely to seek employment. The method of calculation, using ABHES' reporting period July 1 through June 30, is as follows:

Examination participation rate = G / T

T = Total graduates eligible to sit for examination
G = Total graduates taking examination

EXAMPLE: While credentialing may not be required for employment, concerted efforts should be made by the institution to promote and encourage participation in licensure exams by setting participation and pass-rate goals and establishing strategies for achieving those goals. Include results of periodic reviews conducted throughout the reporting year of certification exam results by program, along with goals for the upcoming year. If results are not easily accessible without student consent, the institution should consider incentive procedures or devise alternate methods to obtain results that can be documented to assess program effectiveness. Again, include the three most recent years of data collection. Data may be analyzed by class or just by program; data collected and analyzed by class provides more detail.

Example by program

                      GRADS            NUMBER TOOK EXAM   % GRADS TOOK EXAM
PROGRAM               '09  '10  '11    '09  '10  '11      '09  '10  '11
Nursing Assistant     20   26   24     15   20   21       75   77   88
Medical Assistant     24   26   30     20   21   25       83   81   83
Surgical Technology   23   19   17     15   11   13       65   58   76

EXAMPLE: For the Nursing Assistant program, averaging the percentage of graduates who took the exam over the last three years would produce a goal lower than the 2011 participation rate of 88%. Therefore, it would be more advantageous to calculate the percentage increases between '09/'10 and '10/'11 and average them to establish the percent increase for '11/'12: [2 (percent increase between 2009 and 2010) + 11 (percent increase between 2010 and 2011)] ÷ 2 = 6.5. So the participation goal for the Nursing Assistant exam in 2012 would be 88 + 6.5 = 94.5%.
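A sketch of the comparison just described, using the Nursing Assistant participation rates from the table:

```python
# Nursing Assistant participation rates by year (from the table above).
participation = {2009: 75, 2010: 77, 2011: 88}

# A simple three-year average would understate the most recent year:
print(sum(participation.values()) / 3)  # 80.0, below the 2011 rate of 88%

# Averaging the year-over-year increases instead: (2 + 11) / 2 = 6.5 points.
years = sorted(participation)
increases = [participation[b] - participation[a] for a, b in zip(years, years[1:])]
goal_2012 = participation[2011] + sum(increases) / len(increases)
print(goal_2012)  # 94.5
```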

If students are admitted and graduate on a quarterly basis, the institution might find data collected quarterly to be more beneficial, as in this example:

Example by program by class (useful tracking if employing adjunct faculty who change each term)

                      GRADS            NUMBER TOOK EXAM   % GRADS TOOK EXAM
PROGRAM               '09  '10  '11    '09  '10  '11      '09  '10  '11
Nursing Assistant     20   26   24     15   20   21       75   77   88
  Winter               2    4    7      2    3    5
  Spring               7    6    5      5    6    5
  Summer               6    7    6      3    3    3
  Fall                 5    9    6      4    7    6

Medical Assistant     24   26   30
  Winter               7    4    9
  Spring               5    6   10
  Summer               6    7    5
  Fall                 6    9    6

Other data to demonstrate student-learning outcomes may include entrance assessments, pre- and post-tests, course grades, GPA, CGPA, standardized tests, and portfolios.

STANDARD f. Credentialing examination pass rate

An ongoing review of graduate success on credentialing and/or licensing examinations required for employment in the field in the geographic area(s) where graduates are likely to seek employment is performed to identify curricular areas in need of improvement. A program maintains documentation of such review and any pertinent curricular changes made as a result.

The method of calculation, using ABHES' reporting period July 1 through June 30, is as follows:

F / G = L%

F = Graduates passing examination (any attempt)
G = Total graduates taking examination
L% = Percentage of graduates passing examination

At a minimum, the names of all graduates by program, actual graduation date, and the credentialing or licensure exam for which they are required to sit for employment are maintained.

Example by program

                      GRADS            NUMBER TOOK (G)   NUMBER PASSED (F)   PERCENT PASSED (L)
PROGRAM               '09  '10  '11    '09  '10  '11     '09  '10  '11       '09  '10  '11
Nursing Assistant     20   26   24     15   20   21      12   15   16        80   75   76
Medical Assistant     24   26   30     20   21   25      15   17   19        75   81   76
Surgical Technology   23   19   17     15   11   13      14    9   10        93   82   77

From this data, establish goals for the percentage of graduates passing the exam using the same methods described above for graduates taking the exam. Since passing rates have not climbed steadily, a reasonably achievable passing goal could be established by simply averaging the three most recent passing rates ((80 + 75 + 76) ÷ 3 = 77), which would give a goal of 77% passing for the Nursing Assistant program.
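A brief sketch of that calculation, using the Nursing Assistant figures from the table above:

```python
def pass_rate(passed, took):
    """ABHES pass rate: F / G, as a percentage (any attempt)."""
    return 100 * passed / took

# Nursing Assistant: passed 12 of 15, 15 of 20, and 16 of 21 over three years.
rates = [pass_rate(12, 15), pass_rate(15, 20), pass_rate(16, 21)]

# Rates have not climbed steadily, so average the three years for the goal.
goal = sum(rates) / len(rates)
print(round(goal))  # 77
```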

Again, if students are admitted and graduate on a quarterly basis, the institution might find data collected quarterly to be more beneficial, as in this example:

Example by program by class (useful tracking if employing adjunct faculty who change each term)

                      GRADS            NUMBER TOOK EXAM   NUMBER PASSED   PERCENT PASSED
PROGRAM               '09  '10  '11    '09  '10  '11      '09  '10  '11   '09  '10  '11
Nursing Assistant     20   26   24     15   20   21       12   15   16    80   75   76
  Winter               2    4    7      2    3    5
  Spring               7    6    5      5    6    5
  Summer               6    7    6      3    3    3
  Fall                 5    9    6      4    7    6

Medical Assistant     24   26   30
  Winter               7    4    9
  Spring               5    6   10
  Summer               6    7    5
  Fall                 6    9    6

STANDARD g. Program assessment

The program assesses students prior to graduation as an indicator of the program's quality. The assessment tool is designed to assess curricular quality and to measure overall achievement in the program as a class, not to measure an individual student's achievement or progress toward accomplishing the program's objectives and competencies (e.g., an exit tool for graduation). Results of the assessment are not required to be reported to ABHES but are considered in curriculum revision by such parties as the program supervisor, faculty, and the advisory board, and are included in the Program Effectiveness Plan.

EXAMPLES FOR USE AS PROGRAM ASSESSMENT:
• Comprehensive final exam
• Scenarios
• National practice exams
• Practical demonstrations using a comprehensive checklist before students go on externship

Comprehensive Final Exam
An important measure of program effectiveness is how well it prepares students with the entry-level competencies for the field of work. A comprehensive examination designed to measure the individual student's preparation in the required competencies identified by the program objectives is administered to every student prior to completion, and the collective results are used to assess the program's performance in preparing students, as a group, for employment in the field. The comprehensive examination may include written questions, practical demonstrations, or a combination of methods, so long as the necessary clinical competencies are validly and reliably assessed. Such an exam should be designed to incorporate all major elements of the curriculum for assessment of quality. A well-designed exam will point directly to the segment of the curriculum that needs remedy. For example, if scores are consistently low in the anatomy and physiology segment of the exam, as indicated by a three-year trend, then an action plan may include new textbooks, an instructor change, instructor professional development or in-service, or teaching the course as a prerequisite instead of a core element. These cohort exam scores are then closely monitored for upward trends that indicate the plan is working. A program may find it beneficial to score the exam with ranges rather than pass/fail; this communicates to the student that it is being used as an overall quality-improvement tool rather than a personal test.

Scenarios
Each student is provided instructions regarding the scenario. The students then role-play as they complete the scenarios under the direction of the faculty.

MA SCENARIO EXAMPLE (Thanks to Ross Medical Education Center for sharing these examples)

Program Assessment Evaluation Method: Medical Assistant Day in the Office Scenario

              Total number of seniors        Total number of seniors       Overall proficiency % for
              who completed day in the       scoring "Proficient" or       all combined day in the
MA Program    office scenarios               "Acceptable"                  office scenarios
Years         08-09   09-10   10-11          08-09   09-10   10-11         08-09   09-10   10-11
              N/A     N/A     100%           N/A     N/A     100%          N/A     N/A     100%

Day in the Office Scenario (Tasks)                              2010-11 Class Proficiency Percentage
Prepare and maintain electronic medical records                 100
Manual filing with alphabetic system                            100
Take and record height and weight                               100
Take a complete set of vital signs                              100
Perform visual acuity with Snellen chart                        100
Perform a physical and chemical U/A                             100
Administer an ID injection                                      100
Administer a SQ injection                                       100
Administer an IM injection                                      100
Administer a Z-track injection                                  100
Perform a standard 12-lead EKG                                  100
Perform a spirometry test                                       100
Measure infant height/weight, head and chest circumference      100
Perform a urine pregnancy test                                  100
Perform multi-draw venipuncture                                 100
Perform venipuncture with butterfly                             100
Perform a capillary puncture and microhematocrit                100
Perform a capillary puncture and hemoglobin                     100
Perform a capillary puncture and a CLIA-waived mono test        100
Measure blood glucose using a handheld monitor                  100
Wrapping instruments/operate an autoclave                       100
Apply sterile gloves/set up sterile tray/remove sutures         100
Code assignment/posting patient charges electronically          100

Rationale for Data: The program assessment evaluation tool is used to assess curricular quality and measure overall achievement in the program by class. The purpose of the assessment tool is to determine if classes are performing clinical skills at a high level and progressing toward accomplishing the program's objectives through demonstrated proficiencies.

Collection Procedures: Program assessment rubrics are used to assess senior students' clinical skills through end-of-term "Day in the Office" scenarios. The program assessments are compiled twice a year: in July for January 1 to June 30, and in January for July 1 to December 31. To ensure the data is significant, the data is compiled by class and by program and then reviewed/analyzed/assessed based on the results from the prior 6-month period. Program assessment of students' clinical skills is based on 4 categories: Proficient, Acceptable, Limited, and No Opportunity. If the score for any competency falls below the 90% proficiency level (totaling the Proficient + Acceptable categories), the concern is brought to the Director of Education in order to perform a comparative analysis of proficiency among all Ross campuses.

Goals for 2011-12: 80% participation rate of all classes during the reporting period; 75% of proficiencies listed on each rubric completed for each evaluation period.
Responsible Party: Director, Instructors
Review Dates: January 2012 and July 2012
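An illustrative check of the 90% proficiency benchmark described above. The rating counts are hypothetical, and excluding "No Opportunity" ratings from the denominator is an assumption of this sketch, not an ABHES rule.

```python
# Hypothetical rubric tallies for one competency across a graduating class.
ratings = {"Proficient": 17, "Acceptable": 2, "Limited": 1, "No Opportunity": 0}

# Assumption: "No Opportunity" ratings are excluded from the denominator.
rated = sum(ratings.values()) - ratings["No Opportunity"]
proficiency = 100 * (ratings["Proficient"] + ratings["Acceptable"]) / rated

if proficiency < 90:
    print(f"{proficiency:.0f}% - refer to the Director of Education")
else:
    print(f"{proficiency:.0f}% - benchmark met")  # prints "95% - benchmark met"
```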

STANDARD h. Surveys of student (classroom and clinical experience), clinical extern affiliate, graduate, and employer satisfaction with the program

A program must survey each of the constituents identified above. The purpose of the surveys is to collect data regarding student, clinical extern affiliate, graduate, and employer perceptions of a program's strengths and weaknesses. At a minimum, an annual review of the results of the surveys is conducted, and results are shared with administration, faculty, and advisory boards. Decisions and action plans are based upon review of the surveys, and any changes made are documented (e.g., meeting minutes, memoranda).

A representative sample must provide feedback to determine program effectiveness; therefore, the institution establishes two goals for all surveys:

(1) a goal for the percent of surveys returned, expressed as a survey participation rate (see the sketch after this list):

SP / NS = TP

SP = Survey Participation (those who actually completed the survey)
NS = Number Surveyed (total number of surveys sent out)
TP = Total Participation by program, by group, meaning the number of students/clinical extern affiliates/graduates/employers by program who were sent and completed the survey during the ABHES reporting period (July 1 – June 30)

(2) benchmarks for the level of satisfaction desired.
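As referenced in item (1), a minimal sketch of the participation-rate check; the counts and the 90% goal here are hypothetical.

```python
def survey_participation(sp, ns):
    """Survey participation: SP / NS = TP, expressed as a percentage."""
    return 100 * sp / ns

# Hypothetical: 54 of 60 surveyed students returned the survey; goal is 90%.
tp = survey_participation(sp=54, ns=60)
print(f"{tp:.0f}%", "- goal met" if tp >= 90 else "- goal not met")  # 90% - goal met
```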

Programs must assess satisfaction by surveying the currently enrolled student, the clinical extern affiliate, the recent graduate, and the graduate's employer.

Student: Student evaluations are used as a composite of student views relating to course importance and satisfaction and overall class attitudes about the classroom and clinical environments.

EXAMPLE: Student Satisfaction: The surveys conducted periodically throughout the reporting year should assess the students' satisfaction with the administration, faculty, and program training, including externship. The institution would also want to establish a percentage-return goal, such as 90% of students completing the survey.

Student Satisfaction Surveys
Rationale for Data: Secure feedback from students on the importance of, and satisfaction with, customer service and overall attitudes related to the institution's administration. Data used to reflect on what worked or didn't work.
Collection Procedures: Student Satisfaction Surveys (orientation through graduation) collected semiannually.
Goals: 90% student participation. Using the Student Satisfaction Surveys, the collected baselines are: Tutoring 80%; Academic Advising 80%; Support from Admissions 75%; Financial Aid 75%; Career Services 75%; Library 80%; Spirited/Fun Environment 50%; Orientation Sessions 75%; Recognition 65%; Mission Statement 50%; Admin Accessibility 80%; Facility 70%; Social Activities 50%.
Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.
Improvement Strategies: The data is collected, and benchmarks are set and analyzed for improvement strategies when measures fall below established baselines.

Student Course Surveys
Rationale for Data: End-of-term student evaluations are used as a composite of student views relating to course importance and satisfaction and overall class attitudes about the classroom environment. Faculty use the data to determine effective and ineffective activities and compare this information with other classes.
Collection Procedures: Student Course Surveys collected each term.
Goals: Using the Student Course Surveys, the following baselines have been established: Presentation/Delivery Methods 80%; Course Pace 80%; Course Objectives 75%; Information Clarity 85%; Textbook 75%; Supplementary Materials 75%; Classroom Assignments 80%; Support from Faculty 75%; Encouragement/Motivation 80%.
Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.
Improvement Strategies: Failure to achieve a baseline goal will be addressed at faculty and in-service meetings.

STANDARD Clinical extern affiliate: Externship site evaluations include a critique of students' knowledge and skills upon completion of their in-school training and reflect how well the students are trained to perform their required tasks. They include an assessment of the strengths and weaknesses of, and proposed changes in, the instructional activities for currently enrolled students. The sites also evaluate the responsiveness and support provided by the designated school representative, who visited the site and remained in contact with the site throughout the duration of the students' externship.

EXAMPLE: Externship sites are off-campus labs enabling students to apply acquired knowledge and skills. Students on externship should be given an opportunity to evaluate this experience just as they did in the classroom. Summarized results of the externship sites' evaluations of the students' knowledge and skills upon completion of their in-school training should reflect how well the students are trained to perform their required tasks and include an assessment of the strengths and weaknesses of, and proposed changes, if any, in the instructional activities for currently enrolled students. The sites should also evaluate the responsiveness and support provided by the designated school representative, who visited the site and remained in contact with the site throughout the duration of the students' externship.

Externship Site Survey
Rationale for Data: To maintain interaction with off-site labs, identify student skill level, and provide follow-up instruction when deficiencies are identified.
Collection Procedures: Externship site survey collected bi-weekly by the externship coordinator on Friday during the student externship assignment; student clinical experience evaluation.
Goals: 100% student participation. Attendance 100%; Initiative/Appearance 80%; Communication 80%; Critical Thinking 80%; Information Use 90%; Quality of Work 90%; Multi-Tasking 80%; Technical Procedural Proficiency 90%; Professional Attitude 80%; Interaction & Responsiveness of Institution 95%. For all students to rate their overall clinical experience at or above the "cut score" of 3 on a 5-point Likert scale.
Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.
Improvement Strategies: Failure to achieve a baseline goal will be addressed at faculty and curriculum meetings to design improvement.

Rating of School Representative
School representative's responsiveness 90%
Quality of the designated school representative's visit to the site 90%
Representative's contact with the site throughout the duration of the student's externship 90%
For all clinical affiliates to rate their contact with the school at or above the "cut score" of 4 on a 5-point Likert scale.

STANDARD Graduate: A program has a systematic plan for regularly surveying graduates. Graduate surveys are provided no sooner than 10 days following graduation. At a minimum, an annual review of the results is conducted and shared with administration, faculty, and advisory boards. Decisions and action plans are based upon the review of the surveys, and any changes made are documented (e.g., meeting minutes, memoranda).

EXAMPLE: Graduate Satisfaction: Include information reflecting the level of satisfaction of graduates with the academic program, how well the educational and clinical experiences prepared the student for employment, and how the education relates to their current position. Such information could include a measurement of the quality of instruction and the relevance and currency of curricula. The major distinction between graduate and student satisfaction is that graduate feedback should be sought once graduates have had an opportunity to seek and secure employment in their fields and have been employed long enough to be able to evaluate their training in relation to the tasks performed on the job.

Because graduate and employer satisfaction surveys provide valuable information for timely program revision and development, they must be conducted on an ongoing basis and summarized at least annually, as the information from these surveys is vital to the PEP in evaluating educational outcomes and setting short- and long-term goals. The institution should also establish a percentage-return goal, such as 60% of graduates returning the survey.

Alumni Survey
Rationale for Data: To ensure the ability of graduates to secure employment related to their interests and skills, both upon graduation and throughout their lifetime.
Collection Procedures: Data collected using the alumni survey a minimum of 30 days following student employment.
Goals: 60% of graduates employed during the reporting year will return the completed survey. The baseline is an 80% or higher evaluation in skill categories, and 70% or higher in the training and leadership areas of the surveys, rated at 3 or above on a 5-point Likert scale. Prepared for career in ability to: communicate with employer/co-workers; think critically; use information; multi-task; apply knowledge; use learned skills; perform to employer satisfaction; secure desired position. Satisfied with instruction quality. Provided effective job-search skills.
Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.
Improvement Strategies: The data is collected, and benchmarks are set and analyzed for improvement strategies when measures fall below established baselines.

Employer: A program has a systematic plan for regularly surveying employers. Employer surveys are provided to the employer no fewer than 30 days following employment. At a minimum, an annual review of the results is conducted and shared with administration, faculty, and advisory boards. Decisions and action plans are based upon the review of the surveys, and any changes made are documented (e.g., meeting minutes, memoranda).

EXAMPLE: Employer Satisfaction: Information about the degree of employer satisfaction regarding the competencies of graduates who have completed a program of study is a major part of determining program effectiveness. This information reflects how well employees (graduates) are trained (skill level) to perform their required tasks and includes an assessment of the strengths and weaknesses of, and proposed changes, if any, in the instructional activities for currently enrolled students. The example below is for the institution; however, to ensure that all program graduates are meeting this goal, data could be tallied by program. The institution should also establish a percentage-return goal, such as 50% of employers returning the survey.

Employer Survey
Rationale for Data: To maintain interaction with the employing community, identify current workplace needs, and anticipate future job requirements that will shape careers and graduate opportunities.
Collection Procedures: Employer Survey collected quarterly and tallied annually in November.
Goals: Using the Employer Survey, rate graduate job performance and use the data to update curriculum, program objectives, and program offerings. Employer Response Rate 50%. Data collected in the following: Technical Knowledge Proficiency 85%; Information Use 80%; Quality of Work 80%; Multi-Tasking 70%; Communication 85%; Critical Thinking 75%; Professional Attitude 85%.
Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.
Improvement Strategies: The data is collected, and benchmarks are set and analyzed for improvement strategies when measures fall below established baselines.
Other goal examples: For all employers to rate graduate overall knowledge base, clinical experience, and interpersonal communication skills at or above the "cut score" of 3 on a 5-point Likert scale.

STANDARD i. Faculty professional growth and in-service activities

A program maintains data that evidences faculty participation in professional growth activities and in-service sessions that promote continuous evaluation of the programs of study, instructional procedures, and training. Include the schedule, attendance roster, and topics discussed at in-service training sessions conducted during the reporting year. The data should evidence that the sessions promote continuous evaluation of the program of study, training in instructional procedures, and review of other aspects of the educational programs. Outline procedures for monitoring all full- and part-time faculty participation in professional growth activities in an effort to remain current in their fields. Include the past two years' in-service training and professional activities outside the institution for each faculty member.

EXAMPLE:
Rationale for Data: Invest in faculty development to ensure current expertise and ability.
Collection Procedures: Professional development plans tied to other assessments (student evaluations, faculty evaluations, etc.) to directly address identified areas of need; plans prepared annually based on instructor evaluation and reviewed quarterly; standard evaluation feedback form for professional development activities.
Goals: Full-time and adjunct faculty professional development participation, as documented in professional development plans and professional development programs, will increase. Participation minimums: on-campus quarterly in-service 3; instructor-initiated off-campus professional development directly related to teaching assignment 2; current licensure where required in the field.
Summary/Analysis: Feedback obtained from completed in-service and professional development evaluations tallied for each category.
Improvement Strategies: Data collected and benchmarks set and analyzed for improvement strategies when personnel fail to fulfill plans; future topics based on survey feedback.

Subsection 2 – Outcomes Assessment

STANDARD V.I.2. A program has a process for assessing effectiveness.

The Program Effectiveness Plan specifies a process and a timetable for the annual assessment of program effectiveness in achieving the outcomes it has identified with its objectives and criteria. The plan must:

i. Document historical outcomes and show evidence of how these historical data are used to identify expected outcomes and to achieve expected goals (e.g., evaluations, advisory boards, credentialing). Outcomes are the result of students' successful completion of a program and are generally defined in terms of, though not limited to, retention; placement; student competencies; and student, clinical, graduate, and employer satisfaction. Use at least three years' historical outcomes for each element. The last three PEPs and Annual Reports provide the necessary historical data. Data from other prior years may be used if they will better define the picture of progress or set more realistic goals. Describe the measurable standards used to judge the effectiveness of your institution.

ii. Identify and describe the types of data that are used for assessment, how the data were collected, the rationale for use of each type of data, the timetable for data collection, and the parties responsible for data collection.

Institutions are expected to collect data that clearly evidences the level of the educational outcomes of retention and placement and the satisfaction experienced by current students, graduates, clinical sites, and employers of graduates. In addition, institutions are to include information that is relevant to improving overall effectiveness, such as in-service training programs and professional growth opportunities for faculty.

The institution is encouraged to collect a variety of statistical data that will assist it in improving educational outcomes. A few examples of possible surveys and studies include:
• New or entering student surveys
• Program evaluations
• Faculty evaluation studies
• Alumni surveys
• Student demographic studies
• Labor market surveys

Studies of student performance might include:
• Admission assessments
• Pre-test and post-test results
• Grades by course
• Portfolios
• Standardized tests
• Graduate certification examination results
• Quarterly grade distribution
• Average daily attendance

Consider other studies such as a review of surveys of professional and trade associations, the Chamber of Commerce, the U.S. Department of Labor, or economic development board studies.

EXAMPLE:
Data Collection: Employer Survey collected quarterly and tallied annually in November by the career services department.
Rationale for Use: Using the Employer Survey, rate the job performance of graduates and use the data to update curriculum, program objectives, and program offerings. Rating goals: Employer Response Rate 75%. Goals for data collected in the following areas: Communication 84%; Critical Thinking 75%; Information Use 80%; Quality of Work 80%; Multi-Tasking 70%; Technical Knowledge Proficiency 85%; Professional Attitude 85%.

iii. Review initial baseline rates and measurements of results after planned activities have occurred.

Data related to the PEP must be evaluated at least once per year, and the evaluation should take place at a predetermined time. Many institutions evaluate data related to their PEP on a monthly or quarterly basis and then complete an annual comprehensive evaluation. As previously noted, it is suggested that an institution establish a schedule or range of evaluation dates for each year to ensure that timely monitoring is taking place. To maximize the integrity of the process and the opportunities for improving the educational programs, the individuals involved in the evaluation of the data must have the responsibility and authority for the development of the educational programs and educational processes.

An institution should develop an evaluation plan designed to meet its needs; no one model is prescribed, as each institution is unique. An example of how an institution may evaluate the PEP could be by completing the following activities:
a. Measuring the degree to which institutional or educational goals have been achieved.
b. Conducting a comprehensive evaluation of the elements outlined in the standards manual.
c. Summarizing the institutional changes that have been developed and/or implemented based upon information gained from the evaluation process.
d. Documenting changes in institutional, academic, or administrative processes such as revised goals, planning documents, or program goals and activities.

At the end of the year, a review of the data collected will demonstrate how well the predetermined goals were met in all categories and will identify changes needed. An example of how data may be reported could be to set it up in a table format for easy analysis.

Goal: Retention 75%
Summary/Analysis: Retention program focused on motivating students to stay in school; only 70% stayed.
Improvement Strategies:
• Instructors contact absent students daily and document the discussion.
• Refer students to individuals or services to help them overcome attendance obstacles.
• Department chairs distribute an at-risk list to every employee each Tuesday.

Goal: Placement (70% placed within 60 days of graduation)
Improvement Strategies:
• Placement staff will be on campus two days/week during the hours between day and evening classes to meet with students in their last term.
• Placement personnel will join and participate in local business and civic organizations.
• Increase employer presence on campus with mock interviews, speaking engagements, and a twice-a-year career fair.
• Establish an on-line career bank.

Goal: Graduate job performance improved to 80% or above
Summary/Analysis: Employers rated performance of graduates below 80% on the following: Critical Thinking 75%; Multitasking 70%.
Improvement Strategies:
• Add at least three critical thinking scenarios to the last three modules.
• Add four multitasking practica to the last two modules.

iv. Provide a summary and analysis of the data collected, and state how continuous improvement is made to enhance expected outcomes.

Provide an overview of the data collected. Summarize the findings for all elements reviewed, indicating the institution's strong and weak areas and, where applicable, plans for improvement. Use the results as the basis for the next annual review, presenting new ideas for changes that will help the institution further improve its effectiveness.

One of the most important indicators of the effectiveness of the PEP is the improvement of the educational programs offered at the institution. Establish specific goals as benchmarks to measure improvement of the institution as a whole as well as of each program. Goals can be set as an annual incremental increase or as a static goal, such as 50 percent for employer survey returns (a brief sketch of such a benchmark comparison follows this subsection).

Summary/Analysis:
 Employer Response Rate 45%
 Overall Job Performance Rating:
  Communication 82%
  Critical Thinking 75%
  Information Use 88%
  Quality of Work 85%
  Multi-Tasking 67%
  Technical Knowledge Proficiency 94%
  Professional Attitude 85%
 The employer response rate improved, but continued effort is needed to generate a better response. All baseline goals established were met or exceeded, with the exception of communication and multi-tasking.

Use of Data to Improve:
 Career Services maintains bi-weekly contact with employers and graduates. Phone calls are made to unresponsive employers encouraging them to complete the survey.
 The recommendation is that all programs add more emphasis on communications, critical thinking, and multi-tasking. More case studies, role-playing, and practical applications in these areas will be included in all third- and fourth-quarter courses.

v. Identify how data were used to improve the educational process. At least annually, monitor activities conducted, including systematically collecting data/information on each of the elements; analyzing the data/information and comparing it with previous findings; and, based on the findings, identifying changes to be made in educational activities.
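To illustrate the benchmark comparison described above, here is a minimal Python sketch, assuming a simple dictionary of goals and collected results; the figures mirror the example tables in this manual, and all names are illustrative assumptions rather than any prescribed ABHES format.

```python
# Hypothetical sketch: flag survey areas that fell below their benchmark goals.
# Figures echo the manual's examples; names and structure are illustrative only.
GOALS = {
    "Employer Response Rate": 50,   # static goal example from the text
    "Communication": 84,
    "Critical Thinking": 75,
    "Multi-Tasking": 70,
}
RESULTS = {
    "Employer Response Rate": 45,
    "Communication": 82,
    "Critical Thinking": 75,
    "Multi-Tasking": 67,
}

def flag_shortfalls(goals, results):
    """Return each area whose collected rate fell below its benchmark."""
    shortfalls = {}
    for area, goal in goals.items():
        actual = results.get(area)
        if actual is not None and actual < goal:
            shortfalls[area] = (actual, goal)
    return shortfalls

for area, (actual, goal) in flag_shortfalls(GOALS, RESULTS).items():
    print(f"{area}: {actual}% vs. goal {goal}% - improvement strategy needed")
```

Areas flagged this way would feed directly into the improvement-strategies column of the annual summary table.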

An institution may offer an exemplary educational program in terms of curriculum, but, for one reason or another, the educational processes may not allow the content to be delivered effectively to the students. By analyzing the data in the PEP, such as employer, graduate, and student surveys and faculty professional development records, an institution can change a process to enhance the program, or change the program entirely.

Summary/Analysis:
 Professional development participation, as documented in professional development plans, increased and included all full-time and adjunct faculty. Professional development was tied to annual evaluations, licensing requirements, and student evaluations to directly address identified areas of improvement. Five of the nine faculty attended two field-related workshops, three renewed licenses, and all attended at least two of the four in-services.

Use of Data to Improve:
 Faculty participation in professional development activities throughout the system has been widespread and generally successful. Full-time faculty members have usually followed through to complete their planned activities, thereby benefiting the College and the students. Part-time faculty members have attended a variety of training sessions organized by the College. For instance, quarterly Professional Development sessions on campuses have been beneficial to full-time and part-time instructors, as indicated by faculty evaluation results from the sessions. Individual faculty plans will include at least two field-related seminars with some financial support, and campus-wide training will continue to be implemented based on instructional needs, as determined by faculty members, the Dean's evaluation, mid-term and end-of-quarter student evaluations, and other assessment data.

vi. Adjust goals as a result of the evaluation of a PEP, based on an assessment of community and employer demand for graduates, which justifies the continued need for a program. At this juncture, it is advantageous to identify those responsible and establish periodic times for review to ensure that progress toward the new goals is on track or, if not, to determine why, develop new strategies, and/or adjust the goal.

Goal: Improve the employer survey return rate by hand delivering the surveys to respondents
Who Responsible: Clinical coordinator
Review Dates: April 30, 2012, & November 30, 2012
Summary/Analysis: As of 4-30-12, employer survey return rates have improved by 30 percent
Strategy Adjustment: Continue through 11-30-12; modify if a decline in returns occurs

vii. Identify the activities that will be undertaken to meet the goals set for the next year (a simple tracking-record sketch follows this list).

Problem/Deficiency: Inconsistent externship recordkeeping
Specific Activities: The externship coordinator will contact the externship supervisor two days prior to the report due date. Three days following the due date, if the externship report has not been received, the externship coordinator will contact the externship supervisor again.

Problem/Deficiency: ATB counseling not frequent enough and not meeting ABHES standards
Specific Activities: Institute required mentoring and tutoring for 1st- and 2nd-term students. Develop a standardized reporting form and procedures for ATB counseling by 7/31/-- and distribute to all faculty. The DOE will monitor to ensure that counseling is provided weekly and filed.

Problem/Deficiency: Low Employer Survey return
Specific Activities: Within 10 days of surveys failing to be returned, the career services department will make a follow-up phone call to those delinquent.

Problem/Deficiency: Inadequate Graduate Survey return
Specific Activities: Within 10 days of surveys not returned, the career services department will make a follow-up phone call to those delinquent.

Problem/Deficiency: Faculty files incomplete
Specific Activities: The dean of education will review faculty files quarterly and contact faculty who have not submitted professional development documentation. A quarterly memo to all personnel will remind them that all credentials received, CEUs, etc. are to be submitted to the DOE promptly after receipt.

Problem/Deficiency: Professional development
Specific Activities: Increase to quarterly internal monitoring of professional development for all instructors. Request credentialing information every 90 days. Require instructors to obtain two CEUs per year.
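As a minimal sketch of how the responsible-party and review-date tracking shown above might be kept in structured form, the following hypothetical Python record is one option; the field names and the example entry are assumptions for illustration, not a prescribed ABHES format.

```python
from dataclasses import dataclass, field

@dataclass
class GoalRecord:
    """One row of a goal-adjustment tracking table (hypothetical format)."""
    goal: str
    responsible: str
    review_dates: list = field(default_factory=list)
    summary: str = ""
    adjustment: str = ""

# Example entry mirroring the goal-adjustment table above
record = GoalRecord(
    goal="Improve employer survey return rate by hand delivering surveys",
    responsible="Clinical coordinator",
    review_dates=["2012-04-30", "2012-11-30"],
    summary="As of 4-30-12, return rates improved by 30 percent",
    adjustment="Continue through 11-30-12; modify if returns decline",
)
print(record.goal, "->", record.responsible)
```

A record like this makes it straightforward to review, at each scheduled date, whether progress toward each goal is on track.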

FORMAT EXAMPLES*

All PEPs must contain the following:

Title Page, to include all of the following:
 ABHES I.D. Code
 Name of Institution
 Address (City, State, and Zip Code)
 Name of Program
 Name of Program Director
 Credential Awarded
 Portion of the program, if any, offered via distance learning
 Length of program (clock hours, semester/quarter credits, weeks, etc.)
Introduction
Schedule of Review
Institutional Mission and Objectives
Program Description and Objectives
Student Population
 Overall Student Population
 Program Student Population

FORMAT EXAMPLE I

Program Retention

Retention statistics are extracted from reports in CampusVue. The annual period used to measure retention for the purposes of accreditation is July 1 through June 30. The following program retention rates were determined using the ABHES formula (a worked calculation sketch follows this example):

(EE + G) / (BE + NS + RE) = R%

EE = Ending enrollment (as of June 30 of the reporting period)
G = Graduates
BE = Beginning enrollment (as of July 1 of the reporting period)
NS = New starts
RE = Re-entries
R% = Retention percentage

Retention Percentage – Program   FY 2008   FY 2009   FY 2010
Medical Assistant, Diploma       57.46%    69.7%     75.0%

Summary/Analysis: The Medical Assistant Diploma Program started December 2004. Retention rates increased in Fiscal Year 2010 due to the implementation of several measures: biweekly meetings with the program chair, weekly attrition meetings, more accountability for instructors, and campus-wide efforts to retain students.

Rationale: To have the most current picture of who is in the program in order to monitor participation and progress and to build retention plans based on student needs.

Retention Goal: The retention goal for FY 2012 is 79%.

Responsible Parties: Academic Dean, Program Chair, Faculty, and other staff as applicable.

Review Dates: Retention in the program will be monitored weekly and reviewed in more depth in monthly meetings.

Types of Data/Methods of Assessment: Daily attendance reports, weekly retention meetings, monthly retention reports, faculty and student surveys, and quarterly retention reports. Reports are reviewed with team members: Registrar, DOE, Program Chairs.

Continuous Improvement Strategies:
• Daily monitoring of student attendance by the Instructor, Program Director, Academic Dean, and the Career Services Representative to proactively identify students at risk of withdrawing or being withdrawn. Regular meetings will be held to discuss at-risk students, and action plans will be developed both individually (advising) and collectively to address the challenges facing potential drop students.
• The Academic Dean will hold daily meetings with the Program Directors to identify each student absent four or more days.
• All absent students will be called daily by their Instructors, Program Directors, and/or Academic Dean, preferably before their regularly scheduled class is over.
• Faculty will be coached on holding students accountable for attendance and on developing engaging classrooms. If specific classes are identified as having lower than average attendance and/or retention, coaching and developmental activities will be implemented for the faculty member(s). Outstanding attendance and recognition awards will be given to instructors who have the highest retention and attendance each quarter.

Job Placement (repeat these sections for each of the elements to be evaluated)

Placement Percentage – Program   FY 2008   FY 2009   FY 2010
Summary/Analysis:
Rationale:
Goal:
Responsible Parties:
Review Dates:
Types of Data/Methods of Assessment:
Continuous Improvement Strategies:

Credentialing Examination Participation Rate –
Credentialing Examination Pass Rate –
Program Assessment Exam –
Surveys – Student Surveys, Clinical Affiliate Surveys, Graduate Surveys, Employer Surveys
Faculty Professional Growth and In-Service Activities –
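To make the arithmetic concrete, here is a minimal Python sketch of the ABHES retention formula shown above; the enrollment figures are hypothetical and chosen only so the result matches the FY 2010 rate in the example.

```python
def retention_rate(ee, g, be, ns, re_entries):
    """ABHES retention formula: R% = (EE + G) / (BE + NS + RE) * 100."""
    return 100 * (ee + g) / (be + ns + re_entries)

# Hypothetical enrollment figures for illustration only
rate = retention_rate(ee=45, g=30, be=60, ns=35, re_entries=5)
print(f"Retention: {rate:.1f}%")  # -> Retention: 75.0%
```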

FORMAT EXAMPLE 2

MA program started in FY2011

Program: MA

Where are the program objectives published?
 The program objective is stated on page 15 of the 2011‐2012 Michigan Campus Catalog.

How does the program determine that graduates have achieved the objectives (e.g., surveys, credentialing exam)?
 Graduation requirements are stated on page 9 of the 2011‐2012 Michigan Campus Catalog. Students must have a grade average of 70% or higher, with no less than 60% in any individual course, and attend no less than 85% of scheduled classroom days. In addition, students must successfully complete an externship.

Who reviews the data? What process is utilized (e.g., semiannually by advisory committee)?
 Data is reviewed by campus and corporate leadership on a weekly basis through retention and placement reports, and also reviewed by the campus advisory board twice annually.

What changes have resulted from data review?
 Currently no changes are in place for the Canton campus, as this is the first year of operation.

Date of most recent program review:
 The most recent Advisory Board meeting was held January 2011.

PROGRAM RETENTION RATE: MA

2008‐2009: na%
2009‐2010: na%
2010‐2011*: 82%

Goals for 2011‐12: Maintain a retention rate of 82%
Responsible Party: Director, Faculty, and All Staff
Review Dates: Weekly

Summary/Analysis:
 The Canton campus achieved a successful 2010‐2011 year with its first class, which began October 4, 2010. All 11 students completed the program. All 2010‐11 classes achieved a satisfactory 82% retention rate. Although some withdrawals were unavoidable due to serious medical conditions, it should be noted that some students simply left the program due to a change in their vocational desires. Ensuring that prospective students fully understand what the program entails, including academic and attendance requirements, will result in a more committed student population and, in turn, a higher retention rate.

Improvement Strategies:
 Admissions representatives are trained to uncover issues that may result in students having to withdraw from the program and to assess preventive strategies with the students so they are prepared to complete the course. Faculty members are encouraged to communicate concerns directly to the Director when a student is struggling with the program. When a student's attendance falls below 85%, the attendance card is brought to the front desk so the Director is aware of the student's attendance. The student is also counseled by the Director on attendance requirements.

Job Placement – Repeat this format for each of the elements to be evaluated.

*Thanks to Everest College, McLean, Virginia, and Ross Medical Education Center, Canton, Michigan, for agreeing to share their PEP formats for this Manual.

OTHER EXAMPLES

Examples of changes to a process that might enhance a program:
 If a course requires outside lab or practice time and an analysis of the students' actual lab or practice time demonstrates that the students are not completing the required hours, formally scheduling those hours or adding additional laboratory times may dramatically increase the effectiveness of that course.
 If an analysis of the data demonstrates that a large number of students are failing a specific course or are withdrawing in excessive numbers from a particular program, the institution may change the prerequisites for that course or offer extra lab hours or tutoring to see if the failure or withdrawal rates are positively affected.

Examples of changes to a program that might enhance it:
 If the analysis of the data indicates that large numbers of students are dropping or failing a course when taught by a particular instructor, the instructor may need additional training, or a different instructor may need to be assigned to teach that course.
 If surveys from employers and graduates indicate that a particular software program should be taught to provide the students with up-to-date training according to industry standards, the institution could add instruction in the use of that software program, but only after it is assured that the instructor has been properly trained in its use.

CONCLUSION

The PEP is a working document used as a resource to continually identify and assess the goals a program has established to meet its educational and occupational objectives. An effective PEP is regularly reviewed by key personnel and used in evaluating the effectiveness of each program and the overall operations of the institution. It is important for each institution to establish a plan that exhibits the institution's progress toward providing the highest quality education for its students, to present to ABHES evaluation teams, government regulatory groups, and the general public.

"Of all our human resources, the most precious is the desire to improve." –Unknown

[Figure: cyclical diagram with the student as the main actor: students admitted; student learning outcomes defined; teaching/learning; evaluation; satisfied, academically successful students (retention); students graduate; graduates placed in field; employers, graduates, and clinical sites satisfied.]
