Professional Development Manual on NRS Data Monitoring for Program Improvement



By: Mary Ann Corley, Principal Research Analyst

AMERICAN INSTITUTES FOR RESEARCH
1000 Thomas Jefferson Street, N.W.
Washington, DC 20007

This manual was prepared for the project:

Promoting the Quality and Use of National Reporting System (NRS) Data Contract #ED-01-CO-0026

U.S. DEPARTMENT OF EDUCATION

Office of Vocational and Adult Education
Division of Adult Education and Literacy

Susan Sclafani, Assistant Secretary for Vocational and Adult Education

Cheryl Keenan, Director, Division of Adult Education and Literacy

Mike Dean, Program Specialist, Division of Adult Education and Literacy

April 2004

Contents

Introduction
  Audience
  Purpose
Workshop Overview
Preparation Checklist
Workshop Outline
Before the Workshop

Facilitator’s Notes
  Facilitator’s Notes: Day 1
  Facilitator’s Notes: Day 2

Participant’s Handouts
  Workshop Objectives (H-1)
  Workshop Agenda (H-2)
  Why Get Engaged with Data? (H-3)
  Your Own Personal Motivators (H-4)
  Questions for Consideration (H-5)
  Decision for State Teams: Selecting a Standard-Setting Model (H-6)
  Adjusting Local Standards: Sample Scenarios (H-7)
  Reflection on Success of Past Efforts (H-8)
  Variations on a Theme (H-9)
  State Worksheet: Planning for Rewards and Sanctions (H-10)
  Data Carousel (H-11a)
  Monitoring Performance Using Indicators of Program Quality (H-12)
  Steps and Guidelines for Monitoring Local Programs (H-13)
  Planning and Implementing Program Improvement (H-14a)
  Aha! Experiences and Parking Lot Issues (H-15)
  Workshop Evaluation Form (H-16a)

PowerPoint Slides

Supplement
  Possible Questions to Ask When Examining the Data (Answers to H-5) (S-1)
  Glossary (S-2a)
  Letter to Send to Participants Prior to Training (S-3)
  Alternative Monitoring Exercise (S-4a)

INTRODUCTION TO PROFESSIONAL DEVELOPMENT ON NRS DATA MONITORING FOR PROGRAM IMPROVEMENT

Audience

This professional development sequence has three distinct audiences:

1. State-level staff (administrators, professional development coordinators, and data managers) who are responsible for statewide planning, management, and dissemination of information and procedures related to the NRS;

2. Professional development specialists who are responsible for rolling out the training statewide to local programs; and

3. Local program administrators, professional development coordinators, data managers, and instructors.

Purpose

The purpose of this professional development sequence is to help state and local literacy program personnel identify and define the interrelationships between data and program performance, explore ways to monitor local programs to strengthen the connection between performance and data, and identify and implement program improvement efforts.

This sequence can be adapted to meet the needs of individual states and local programs. It can be used as a train-the-trainers program in which state-level staff offer the workshop to key personnel (e.g., professional development specialists and data facilitators) who will then repeat the training for local program administrative and instructional staff. Information and activities from this training also can be selected and offered to meet the needs of specific audiences. For example, professional development for instructors may focus on using data to inform instruction. Professional development for local program administrators may focus on using data for enhancing performance or for making program improvements. Users of this training sequence are encouraged to adapt and augment activities accordingly.

Workshop Overview

Objectives: By the end of this professional development sequence, participants will be able to:

1. Describe the importance of getting involved with and using data;

2. Identify four models for setting performance standards as well as the policy strategies, advantages, and disadvantages of each model;

3. Determine when and how to adjust standards for local conditions;

4. Set policy for rewards and sanctions for local programs;

5. Identify programmatic and instructional elements underlying the measures of educational gain, NRS follow-up, enrollment, and retention;

6. Distinguish between the uses of desk reviews and on-site reviews for monitoring local programs;

7. Identify steps for monitoring local programs;

8. Identify and apply key elements of a change model; and

9. Work with local programs to plan for and implement changes that will enhance program performance and quality.

Time: The total time required for this workshop is approximately 12 hours of instructional time, or 2 full days of 6 hours each (not including lunch and break times). The 12 hours of training are further divided into 4 discrete segments of 3 hours each. The entire sequence, therefore, may be conducted over 2 consecutive days or delivered in individual 3-hour segments over a 2-week period, thereby affording participants the opportunity to integrate the knowledge and skills gained into their work activities.

Materials Checklist:

• Overhead projector (for use with transparencies), OR
• Laptop and LCD projector (for use with CD-ROM)
• Copies of participant handouts for each participant
• PowerPoint presentation on CD-ROM or transparencies of PowerPoint slides
• Facilitator’s Notes and Supplements
• Flipchart, flipchart stand, and marking pens
• Blank transparencies and transparency pens

Preparation Checklist

• Reserve space for the training.
• Duplicate handouts.
• Download the PowerPoint from the NRS Web site and create a CD-ROM for use during the workshop, or make overhead transparencies from the PowerPoint slides. Run copies of the PowerPoint slides as handouts, 3 to a page.
• Assemble participant packets with copies of handouts and PowerPoint slides.
• Make nametags and/or name tents for participants.
• Prepare an attendance sheet.
• Pre-divide attendees into groups (state teams, local program teams, or instructor teams, depending on the audience) for small group activities.
• Arrange for food and beverages, as appropriate.
• Arrive 1/2 hour before the training is scheduled to begin.
• Check equipment to ensure that it is working properly.
• Pre-label flipchart pages, one of the following headings per page:
  - Expectations—Setting Performance Standards
  - Expectations—Monitoring
  - Parking Lot Issues
  - The Motivation Continuum (5 or 6 pages labeled with this same heading, each with an arrow leading from Intrinsic to Extrinsic)
  - Matrix of States’ Preferred Standard-setting Models
  - Reward Structures
  - Suggested Sanctions
  - May Produce Unintended Effects
  - Questions to Ask Local Program About Educational Gain
  - Questions to Ask Local Program About NRS Follow-up
  - Questions to Ask Local Program About Retention
  - Questions to Ask Local Program About Enrollment
  - Needs/Resources
  - Participant Feedback (this flipchart should have two columns on the page, one labeled pluses (+) and one labeled deltas (Δ))

WORKSHOP OUTLINE

DAY 1

I. Welcome, Introduction, Objectives, Agenda Review (50 minutes)
   A. Welcome and Introductions
   B. Professional Development Objectives/Agenda/Expectations
   C. Parking Lot Issues
   D. Terms and Definitions
   E. Workshop Evaluation Form
   Materials: PPT-1 through PPT-5; flipchart pages for Expectations—Setting Performance Standards, Expectations—Monitoring, and Parking Lot Issues; S-2a, b, and c; H-16

II. The Power of Data
   A. Why Get Engaged with Data? (30 minutes)
      Materials: PPT-6, PPT-7; flipchart page for The Motivation Continuum; H-3; H-4
   BREAK (15 minutes)
   B. The Data-driven Program Improvement Model (25 minutes)
      Materials: PPT-9 through PPT-24; H-5; H-6; S-1 (answers to H-5)
   C. Setting Performance Standards for Program Quality (60 minutes)
      Materials: flipchart page for Matrix of States’ Preferred Standard-setting Models
   LUNCH (60 minutes)
   D. Adjusting Standards for Local Conditions (30 minutes)
      Materials: PPT-25 through PPT-33; H-7 through H-10
   E. Shared Accountability with Appropriate Rewards and Sanctions (60 minutes)
      Materials: flipchart pages for Reward Structures, Suggested Sanctions, and May Produce Unintended Effects
   BREAK (15 minutes)

III. Getting Under the Data: Performance Measures and Program Processes (60 minutes)
   Materials: PPT-34 through PPT-40; H-11a, b, c, d, and e; 4 flipchart pages: (1) Questions to Ask Local Program about Educational Gain, (2) Questions to Ask Local Program about NRS Follow-up Measures, (3) Questions to Ask Local Program about Retention, (4) Questions to Ask Local Program about Enrollment

IV. Day 1 Evaluation and Wrap-up (15 minutes)
   Materials: flipchart page for Parking Lot Issues; flipchart page of pluses and deltas (+ and Δ)

DAY 2

V. Agenda Review for Day 2 (30 minutes)
   Materials: flipchart page for Parking Lot Issues; PPT-4; PPT-5

VI. Planning for and Implementing Program Monitoring
   A. Presentation and Discussion (20 minutes)
      Materials: PPT-41 through PPT-54
   B. Small Group Work on Data Sources (60 minutes)
      Materials: PPT-55; H-12
   BREAK (15 minutes)
   C. Small Group Reports (30 minutes)
   D. Steps and Guidelines for Monitoring Local Programs (25 minutes)
      Materials: PPT-56; H-13
   LUNCH (60 minutes)

VII. Planning for and Implementing Program Improvement
   A. A Model of the Program Improvement Process (20 minutes)
      Materials: PPT-57 through PPT-66; H-14
   B. Bringing it Home: The Start of a State Action Plan (60 minutes)
   BREAK (15 minutes)
   C. Sharing Action Plans (45 minutes)

VIII. Closing and Evaluation (30 minutes)
   A. Review Parking Lot Issues
   B. Identify Additional Resources
   C. Reflection
   D. Workshop Evaluation
   Materials: flipchart page for Needs/Resources; H-16a, b, and c

BEFORE THE WORKSHOP

The following tasks should be completed before the workshop:

• Send out flyers announcing the workshop and the dates.
• Send out confirmation letters to those who have registered for the workshop. Tell them that, in preparation for the workshop, they should meet with other persons (from their state or local program) who also will be attending the workshop. Ask them to come prepared to give a 5-minute report that highlights their state or program data. (See S-3 for a sample letter.)
• Duplicate all handouts for the session (H-1 through H-16) and arrange them into participant packets. By providing a packet of materials to each participant, you can avoid constant distribution and handling of materials during the workshop.
• Download the PowerPoint slides from the NRS Web site (www.nrsweb.org) and create a CD-ROM for use during the workshop, or make overhead transparencies from the PowerPoint slides (PPT-1 through PPT-69).
• Pre-label flipchart pages for activities, as indicated in the Preparation Checklist and in the Facilitator’s Notes.
• Order all equipment (overhead projector, screen, flipcharts). If you plan to use a CD-ROM instead of overhead transparencies, be sure that you will have a laptop computer and LCD projector available for the session. Check the equipment to ensure that it is working properly. Also check the size of the screen and the clarity of print from the back of the room.
• Prepare nametags or name tents for participants.
• Make signs or folded cards for each group (state names if this session is for national training, local program names if this is for state training, class names [e.g., ESL, ABE, GED, Workplace Literacy, Family Literacy, EL Civics] if this is for instructor training).
• Arrange for a place to hold the workshop session and ensure that it has sufficient space and moveable chairs for break-out activities. Consider the room arrangement that will best facilitate your activities. For this workshop, it is suggested that, if possible, the room arrangement consist of table rounds that each seat from 5 to 8 persons.
• Prepare a participant sign-in sheet to verify attendance. Include spaces for participants’ names, program names, addresses, phone and fax numbers, and e-mail addresses. This will be useful if you need to make future contact with participants.
• Arrange for refreshments and lunch, as appropriate.
• Read the Facilitator’s Notes for the workshop, pages 1-12.
• Review the handouts (H-1 through H-16), the PowerPoint slides (PPT-1 through PPT-70), and the Supplements (S-1 through S-4).

FACILITATOR’S NOTES

FACILITATOR’S NOTES: DAY 1

I. Welcome, Introduction, Objectives, Agenda Review for Day 1 (50 minutes)

A. Welcome and Introduction (30 min.)
Materials: PPT-1

Welcome participants to this professional development workshop on NRS Data Monitoring for Program Improvement (PPT-1). Have each of the facilitators introduce themselves and make a brief statement about their background and expertise in either professional development or the management of data collection and reporting. Then ask participants to introduce themselves. If participants are few in number, they can introduce themselves one by one to the large group, stating their names, programs, and positions. Move the activity along, allowing each person to speak for only a minute. If the group is large, ask participants to pair up and share background information (name, program, position). As an optional activity, ask for a show of hands for those who are local program directors, instructors, and professional development coordinators. Ask whether there are other roles represented among the group and what those roles are.

B. Professional Development Objectives/Agenda (15 min.)
Materials: PPT-2, PPT-3, PPT-4, PPT-5

Show participants PPT-2 and outline for them the workshop objectives for Day 1. Then show PPT-3, Workshop Agenda. Quickly summarize the activities that will be part of this workshop and state their relationship to the expected outcomes. Also show them PPT-4 and PPT-5, the objectives and agenda for Day 2, but do not spend as much time on the Day 2 objectives and agenda as you did on Day 1. You will show these again on Day 2; the reason for showing them now is to give participants a sense of the objectives of the full 2-day workshop.

Materials: Flipchart pages titled Expectations—Setting Performance Standards and Expectations—Monitoring

Ask participants to consider one question they want answered about setting performance standards and one question they want answered about monitoring before this workshop is over. After about 5 minutes, sample responses from the group and record them on the flipcharts. Continue listing expectations until there are no more responses.

NOTE: It is not necessary that every participant respond to this question; it’s likely that some participants will have expectations that have already been listed.


Refer to the flipchart list and identify for participants those topics that have been planned for in this workshop, those that have not been planned for but that can be addressed easily during the workshop, and those, if any, that are outside the realm of this workshop. To the extent possible, identify resources that participants can access for information about those content issues that will not be covered in this workshop.

C. Parking Lot Issues (2 min.)
Materials: Flipchart page titled Parking Lot Issues; H-15

Tell participants that they will keep a “Parking Lot” of issues and questions that arise that are related to the NRS and assessment but not directly related to this workshop on NRS data monitoring for program improvement. To the extent possible, those questions and issues will be addressed at the end of this workshop.

NOTE: Post on the wall a flipchart page marked “Parking Lot Issues.” Also place a Post-It pad on each table. Ask participants, throughout the workshop, to write their questions on the Post-It Notes and place them on the flipchart. They may also use H-15 to keep notes of their issues for the parking lot, as well as to take notes of any revelations they have or any light bulbs that go off for them during the workshop.

D. Terms and Definitions (2 min.)
Materials: S-2a, b, and c

Point out that this professional development sequence, at times, uses technical terms common to the National Reporting System. A Glossary (S-2a, b, and c) of these terms has been provided to make the reading as clear as possible. The Glossary can be found in the Supplement.

E. Workshop Evaluation Form (1 min.)
Materials: H-16

Call participants’ attention to the evaluation form, H-16. Remind them that they will be asked to complete the evaluation form at the end of the workshop.

II. The Power of Data

A. Why Get Engaged with Data? [Warm-up Exercise] (30 minutes)
Materials: PPT-6, PPT-7, PPT-8; Post-It Notes; H-3; H-4; 5-6 flipchart pages of H-3, The Motivation Continuum, posted about the room

Divide participants into teams of 3 to 5 people. Show PPT-6 and PPT-7 and ask each team to consider the question, “Why is it important to be able to produce evidence of what your state (or local) adult education program achieves for its students?” Provide a supply of Post-It Notes to each team and ask the team to record one reason that it identifies on each of the Post-It Notes. Allow approximately 10 minutes for this activity. (10 min.)

Now show PPT-8. Refer to H-3 and to the large wall charts titled The Motivation Continuum. Ask each team to arrange the factors they have identified on one of the large wall charts, ranging in order from those factors that are internally motivated to those factors that are driven by external forces. (5 min.)

Ask each team to briefly report to the whole group any problems or questions they had to resolve or consider before placing one or more factors on the Motivation Continuum. Allow time for discussion. (10 min.)

Refer to H-3 again and ask participants individually to record on the arrow those factors that they believe are most meaningful. Tell them that these are the factors that are their own personal motivators for getting engaged with data. Refer them to H-4 and ask them to complete the sentence, “I can be motivated to work with our data if I remember that…” Point out that, when state or local program team members share their motivating factors with one another, this can be a powerful, unifying activity for the team in determining next steps in setting policy, monitoring programs, and initiating program improvement efforts. (5 min.)

BREAK (15 minutes)

B. Overview of the Data-driven Program Improvement Model (25 minutes)
Materials: PPT-9 through PPT-14; H-5; Supplemental Handout S-1

Tell participants that the relationship between data and program quality is dynamic because data, in the form of performance standards and other goals, not only measure program performance but can change it as well. Referring to PPT-9, PPT-10, and PPT-11, describe the steps of a model for data-driven program improvement. The process begins with the setting of standards that define acceptable levels of performance. Underlying the performance measures (or data) are the Powerful Ps, or the program elements of policies, procedures, processes, and products. It is these program elements underlying the data that can be observed and monitored with the aim of improving performance. State and local teams, acting collaboratively, can plan and implement program improvements by making changes to policies, procedures, processes, and products. (5 min.)

Now show PPT-12, which displays educational gains for ESL levels and performance standards for one adult education program. Point out that the program exceeded its targets for three of the ESL levels, failed to meet them for two levels, and did not serve any students at the high-advanced ESL level. Ask participants, working in pairs, to use H-5 to identify questions that they, as program monitors, would want to ask of local program staff. Allow 5 minutes for this activity and then sample responses from the whole group. Accept all answers. After responses from the group seem exhausted, show PPT-13 and refer participants to the Supplemental Handout, S-1, which lists possible questions, and invite them to review this list to see if it includes their questions. Tell them that this list represents only a subset of all possible questions that a visiting team might ask in monitoring program data and performance. (15 min.)

Tell participants that we now are ready to examine more closely the four areas of this workshop, namely, (1) setting performance standards, (2) examining the elements underlying the data, (3) program monitoring, and (4) program change. Before proceeding, ask if participants have any questions or comments. After responding to any questions, ask participants to get ready to begin their journey into harnessing the power of data. (Show PPT-14.) (5 min.)
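For facilitators who keep local program data in a spreadsheet or database, a minimal sketch like the one below can make the PPT-12 discussion concrete. It is an illustration only: the level names, gain rates, and standards are hypothetical, not the figures on PPT-12, and the script simply flags the levels a desk reviewer would want to ask questions about.

# Minimal illustrative sketch (hypothetical data, Python): flag educational-
# functioning levels where a program's gain rate misses its standard or where
# no students were served.

program_gain_rates = {          # percent of students advancing a level
    "Low Intermediate ESL": 42.0,
    "High Intermediate ESL": 28.0,
    "High Advanced ESL": None,  # None = no students served at this level
}

performance_standards = {       # negotiated targets, in percent
    "Low Intermediate ESL": 38.0,
    "High Intermediate ESL": 33.0,
    "High Advanced ESL": 30.0,
}

for level, standard in performance_standards.items():
    actual = program_gain_rates.get(level)
    if actual is None:
        print(f"{level}: no students served -- ask why enrollment is zero")
    elif actual >= standard:
        print(f"{level}: met standard ({actual:.0f}% vs. {standard:.0f}%)")
    else:
        print(f"{level}: missed standard by {standard - actual:.0f} points "
              f"-- a starting point for the H-5 questions")

Each flagged line corresponds to the kind of follow-up question participants generate on H-5; the comparison itself is simple, and the hard work is in the underlying program elements the questions uncover.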

C. Setting Performance Standards for Program Quality (60 minutes)

Materials: PPT-15, PPT-16

Show PPT-15 and tell participants that an accountability system can measure quality accurately only when it contains the following four elements (20 min.):
• An underlying set of goals that the program is to achieve;
• A common set of measures (either qualitative or quantitative) that reflect the goals;
• Performance standards tied to the measures; and
• Sanctions or rewards for programs, tied to performance.

Now show PPT-16 and explain that the goals of the federally funded adult education program (e.g., literacy skills development, lifelong learning, employment) are reflected in the NRS core outcome measures of:
• educational gain,
• GED credential attainment,
• entry into postsecondary education, and
• employment.

Materials: PPT-17; PPT-18 through PPT-22; PPT-23 and PPT-24; H-6; States’ prior performance data; flipchart page for Matrix of States’ Preferred Standard-setting Models

Show PPT-17, while reminding participants that each state sets its performance standards in collaboration with the U.S. Department of Education. However, a state’s performance is a reflection of the aggregate performance of all the programs it funds. At this point, distribute to each state team a copy of its prior performance. Be sure that you give each state only its own data, and not data from any other state or program. Suggest that each state team review and consider its negotiated performance standards throughout this workshop. Tell them that states may soon be required to set standards for their local programs and to monitor local programs with the aim of meeting or exceeding the state’s performance standards and improving program quality.

Show PPT-18 through PPT-22, which outline four different models for setting performance standards (continuous improvement, relative ranking, external criteria, and return on investment). Discuss each slide, inviting comments from participants. The slides are self-explanatory; however, it is not advisable that the facilitator simply read the slides. Participants will be more engaged in the presentation if you elaborate on some of the slides with comments and anecdotes and also invite comments from participants. Survey the group to determine whether any state currently uses a specific standard-setting model for local programs and ask for comments on the successes and the challenges faced in the standard-setting process. (20 min.)

Now show PPT-23 and PPT-24 and refer participants to H-6. Ask participants, in their state teams, to consider the questions on H-6. Allow 20 minutes for this. Then ask each state to report on the performance standard-setting model(s) it is leaning toward and to state the reason for this decision. Record the states’ responses by checking the selected model(s) on a flipchart matrix similar to the following, and post the chart on a wall for the duration of the workshop. Ensure that participants know that they are not committing to using this performance standard-setting model; this activity is just for the purpose of getting them familiar with the different models and their uses. (15 min.)

Matrix of States’ Preferred Standard-setting Models
State | Continuous Improvement | Relative Ranking | External Criteria | Return on Investment

Ask participants in what way the standard-setting model(s) they selected represent a policy statement about the relationship between performance and quality that the state wants to instill in its local programs (see H-6, question 4). (5 min.)

LUNCH (60 minutes)

D. Adjusting Standards for Local Conditions (30 minutes)
Materials: PPT-25, PPT-26; H-7

Welcome participants back following the lunch break. Tell them that they now need to consider whether the standard-setting models they have selected will have the desired effect on all programs. Show PPT-25 and tell them that research on the effective use of performance standards suggests that standards often need to be adjusted for local conditions before locals can work to improve program quality. Ask if anyone can tell you why this is so. (Answer: Standards that are set at the wrong level will not work—they will be either too easy or too difficult for the program to meet, and they will not result in program improvement.) (10 min.)

Show PPT-26 and explain that there are three main factors that affect program performance and that may require them to adjust standards for local conditions. These factors are:
• student characteristics,
• local program elements, and
• external conditions.

For example, the state standard may be too high for the local program that:
• serves predominantly lower-level students, or
• experiences a sudden influx of refugees, or
• sees a dramatic increase in student enrollment when the community’s largest employer closes its doors and moves out of state.

The state may find it helpful to adjust literacy standards for the program that emphasizes special content, such as workplace skills. Likewise, programs in areas of high unemployment may need to have lowered standards for “entered and retained employment.” And when natural disasters affect student attendance or availability of services, standards may need to be adjusted.

Now refer participants to H-7, sample scenarios. Tell them that each scenario represents a local program’s claim that it cannot meet the state-set performance standards. Divide the group into teams of 4-5 people and assign one scenario to each team. Ask each team to consider its scenario and to propose (1) a strategy for verifying the accuracy of the local program’s claim, and (2) a suggested solution or way to respond to the local program. Have each team select a recorder and a reporter. Allow 10 minutes for the team work and then ask each team to report to the whole group its strategy and suggested solution for its assigned scenario. (20 min.)

E. Shared Accountability with Appropriate Rewards and Sanctions (60 minutes)

Materials: H-8

Tell participants that the section of the training that they are now beginning may be one of the most critical in terms of the ultimate success (or failure) of their program improvement and reform efforts because, without local involvement and cooperation, every initiative launched by the state will be met with resistance and will be doomed to failure.

Refer state teams to H-8 and ask them to consider past efforts that their state has initiated and to identify those that have been successful and those that the locals resisted. Ask if they can identify elements that led to the success or failure of these initiatives. Allow 10 minutes for the state teams to work and then sample responses from the total group. It is not necessary for each state team to report here. (15 min.)

Materials: PPT-27 through PPT-32; H-9; H-10; flipchart pages for Reward Structures, Suggested Sanctions, and May Produce Unintended Effects; Post-It Notes and marking pens

Show PPT-27 through PPT-30 on Shared Accountability. The table on PPT-30 has a horizontal and a vertical dimension (or axis), each indicating movement from low to high. The horizontal axis represents state control and the vertical axis represents local involvement. The table cells representing the intersection of state guidance and support with local involvement show possible effects. In other words, low state control coupled with a low degree of local involvement will likely result in stagnation; high state control coupled with a low degree of local involvement will likely result in local animosity and resistance; and so on. If state control is delivered in the spirit of providing guidance and support, and if local involvement means that locals truly have a hand in identifying and designing program improvement efforts, then the local program will yield ever-increasing quality performance. In this scenario, everybody wins: students achieve their goals, programs win recognition and increased funding, and states increase overall performance. (10 min.)

Tell participants that states must consider the use of appropriate rewards and sanctions for local programs. Show PPT-31 and ask them which they think is the more powerful motivator: rewards or sanctions? (Answer: Research clearly indicates that rewards are more effective than sanctions in promoting program improvement.) Ask them how sanctioning might be counterproductive. (Answer: The pressure created from sanctions such as partial loss of funding may prompt undesirable behavior by locals, such as limiting enrollment to higher-level students or placing students in inappropriately low levels.) Explain that such questionable tactics designed to yield high performance results are known as unintended effects; that is, the action does not benefit students, and it is put in place for the sole purpose of avoiding harsh sanctions on the program.

Refer participants to H-9, Variations on a Theme, and show PPT-32. Ask them to work in groups of 3 or 4 to brainstorm possible reward structures and possible sanctions for local programs that meet or fail to meet their performance standards. Each group should select a recorder and a reporter. Ensure that there is an ample supply of Post-It Notes and marking pens on each table. Instruct the recorder to write one response per Post-It Note, using one color Post-It Note for the reward structures and a different color for the suggested sanctions. Allow 10 to 15 minutes for this activity and monitor the groups to determine when to call “Time.” (15 min.)

Invite one group to read one of its rewards Post-It Notes and to place the note on the Reward Structures flipchart. Ask other groups if they also had “variations on the theme” (e.g., monetary rewards) and to post these on the Reward Structures flipchart in a column under the first note. When all groups have posted their rewards related to monetary incentives, invite another group to read one of its remaining rewards (e.g., a published honor roll of programs that met or exceeded their performance standards) and repeat the process. Continue until there are no remaining rewards to be posted. (15 min.)

Then repeat the process for the sanctions, inviting groups to place their Post-It Notes on the Suggested Sanctions flipchart. For each sanction that is read, ask the whole group to serve as an applause-o-meter, clapping for the gentle sanctions and gonging for sanctions that may be too harsh. Place those sanctions that the group considers too harsh on the flipchart labeled May Produce Unintended Effects. (5 min.)

BREAK (15 minutes)

Materials: H-10

Allow 15 minutes for state teams to complete the activity described before the break, using H-10. Then ask if any team wants to share its strategies and perhaps help light bulbs go off for other states. (20 minutes)

Materials: PPT-33

In summary, show PPT-33, telling participants that the state process of setting local performance standards consists of the following five steps (5 minutes):
1. Select a standard-setting model;
2. Set rewards and sanctions policy;
3. Review performance levels for local adjustment;
4. Provide technical assistance to locals in an atmosphere of shared accountability; and
5. Monitor performance often.

III. Getting Under the Data: Performance Measures and Program Processes (60 minutes)

Materials: PPT-34, PPT-35; PPT-36 through PPT-40; H-11a, b, c, d, and e; 4 flipchart pages posted about the room: (1) Questions to Ask Local Program about Educational Gain, (2) Questions to Ask Local Program about NRS Follow-up Measures, (3) Questions to Ask Local Program about Retention, and (4) Questions to Ask Local Program about Enrollment. Next to each flipchart, post one of the data displays from H-11b, c, d, and e.

Show PPT-34 and PPT-35. Tell participants that, for the purpose of this workshop, we will consider four sets of measures:
1. Educational gain;
2. The NRS follow-up measures of obtained a secondary credential, entered and retained employment, and entered postsecondary education;
3. Retention; and
4. Enrollment.

Under each set of measures lie the programmatic and instructional decisions and procedures that affect program performance and quality. Show the corresponding data pyramids on PPT-36 through PPT-39.

Post the flipchart pages listed above at various places in the room, and next to each flipchart post a copy of one of the data displays found on H-11 (b through e). Now show PPT-40 and refer participants to the Data Carousel activity (H-11a, b, c, d, and e). Have participants, in groups of 4 or 5, visit each of the carousel displays and develop questions about the data and the underlying elements. Each group should identify as many underlying elements as possible that affect the data and the program’s performance and write its questions on the flipchart provided. Then each group should make recommendations for improving the program’s performance. Inform participants that they should spend approximately 15 minutes at each display (for a total of 1 hour) and that each group should be prepared in the morning to report on its findings. As each group rotates on the carousel to different stops in the room, it should read the questions generated by other groups and add its own questions. NOTE: Keep these carousel stops, with the questions posted, for the duration of the workshop.
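If it helps to ground the carousel discussion, the short sketch below shows one plausible way enrollment and a simple retention figure could be tabulated from program records. It is a hypothetical illustration only: the records are invented, and the 12-hour and 40-hour cutoffs are assumptions for the example rather than NRS reporting rules.

# Illustrative sketch with invented records (Python): tabulate enrollment and a
# simple retention indicator of the kind shown in the H-11 data displays.
# The 12-hour and 40-hour cutoffs below are assumptions for this example only.

students = [
    {"name": "A", "attendance_hours": 8},
    {"name": "B", "attendance_hours": 25},
    {"name": "C", "attendance_hours": 55},
    {"name": "D", "attendance_hours": 41},
    {"name": "E", "attendance_hours": 14},
]

ENROLLMENT_CUTOFF = 12   # hours assumed here to count as an enrolled participant
RETENTION_CUTOFF = 40    # hours assumed here to count a participant as retained

enrolled = [s for s in students if s["attendance_hours"] >= ENROLLMENT_CUTOFF]
retained = [s for s in enrolled if s["attendance_hours"] >= RETENTION_CUTOFF]

enrollment = len(enrolled)
retention_rate = 100 * len(retained) / enrollment if enrollment else 0.0

print(f"Enrolled participants: {enrollment} of {len(students)} students")
print(f"Retention rate: {retention_rate:.0f}% of enrolled participants")

The point of the sketch is the carousel's central question: the numbers themselves are easy to compute, and the underlying program decisions (intake, attendance policies, scheduling, instruction) are what the participants' questions should probe.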

IV. Day 1 Evaluation and Wrap-up (15 minutes)

Tell participants that they have now reached the mid-point in the workshop sequence on NRS Data Monitoring for Program Improvement. Review for them the content that they covered and the activities that they engaged in during this first day of the workshop on getting to understand and appreciate the value of data:

• Warming up to data through the Why Get Engaged with Data exercise;
• Selecting standard-setting models (from the four models of continuous improvement, relative ranking, external criteria, and return on investment);
• Adjusting performance levels to meet local circumstances of student characteristics, local program elements, and external conditions;
• Determining ways to share accountability with locals;
• Setting policy for rewards and sanctions; and
• Examining the programmatic and instructional decisions and procedures underlying the data.

Materials: Flipchart (Parking Lot Issues); flipchart page of pluses and deltas (+ and Δ)

Ask if there are any questions about the day’s workshop. Respond to questions that you can answer on the spot. If there are questions that will take some research before you can answer, or policy questions that you must refer to another source, be sure to add the questions to the “Parking Lot Issues.” These will be addressed at the end of the workshop sequence.

Now tell participants that you would like to “take the temperature” of the group concerning Day 1 activities and content by doing an informal pluses-and-deltas exercise. On a flipchart page, make two columns, one with a plus sign [+] and one with a delta [Δ]. Ask them to call out those things that they liked about today’s workshop. Accept all comments and write them under the [+] column. When there are no more responses, ask them to identify those things that they felt could have been improved about today’s workshop. Again, accept all comments and write them under the [Δ] column. Tell them that you appreciate and take their comments seriously and that, to the extent possible, you will attempt to address those items in the [Δ] column that are under your control throughout the remainder of the workshop sequence. Thank them for their participation and enthusiasm, tell them that you look forward to seeing them tomorrow (or at the next scheduled workshop), and give them the date and location for the next workshop.

FACILITATOR’S NOTES: DAY 2

V. Agenda Review for Day 2 (30 minutes)
Materials: Flipchart (Parking Lot Issues); PPT-4; PPT-5

Welcome participants back and ask if there are any questions or residual issues from Day 1 for which participants would like clarification before moving on to Day 2 activities. Respond to those questions and issues that relate to the content of this workshop on NRS Data Monitoring for Program Improvement. Use the “Parking Lot Issues” page to list those issues that are outside the realm of this workshop. Tell participants that the parking lot issues will be addressed at the end of this workshop. (5 min.)

Show PPT-4, Workshop Objectives for Day 2. Tell them that this handout reviews the same objectives that were stated at the beginning of this workshop sequence, but that you are specifically highlighting the objectives for today’s activities. Also show PPT-5, Workshop Agenda for Day 2. Quickly summarize the activities that will be part of this workshop and state their relationship to the expected outcomes. (5 min.)

Ask for feedback on yesterday’s Carousel Activity. Have one team report its findings and invite other teams to comment or to add questions about the data and suggestions for program improvement. Continue until each of the four Carousel stops has been discussed, approximately 5 minutes per stop. Now tell participants that all they have accomplished so far in this workshop will serve as background information for the next important responsibility of states: local program monitoring. (20 min.)

VI. Planning for and Implementing Program Monitoring

A. Presentation and Discussion (20 minutes)
Materials: PPT-41 through PPT-45

Show PPT-41 and ask participants how/why they think local programs might benefit from a state’s policy of conducting regular monitoring of local program performance and measuring it against performance standards. Sample responses and then show PPT-42, discussing each of the points. Encourage participants to share stories/experiences they have had related to these points in conducting local monitoring. Show PPT-43 and PPT-44. Tell them that you understand that the idea of monitoring may seem overwhelming to states with already overburdened staff, but that monitoring can be manageable and can provide an excellent opportunity to work with locals on program improvement. When showing PPT-45, ask whether any states currently include in their monitoring visits a discussion of whether locals are meeting performance standards and whether the monitoring is structured to encourage program improvement. If no states currently measure performance against performance standards, ask whether and how they currently monitor any aspect of local programs. Sample responses, and then ask whether the process of measuring performance against performance standards can be fit into their existing monitoring structures.

NOTE: Be prepared to hear that some states conduct no monitoring at all. Accept all responses without making judgmental statements. The purpose of this workshop is to provide tools for states to collaboratively set performance standards with locals, to measure local performance against performance standards, and to plan and initiate programmatic improvement.

Materials: PPT-46; PPT-47, PPT-48; PPT-49 through PPT-54

Show PPT-46 and discuss the difference between desk reviews and on-site monitoring, and then the advantages/disadvantages of each (PPT-47 and PPT-48). Then, showing PPT-49 through PPT-54, discuss the various data collection strategies for monitoring (Program Self-Reviews, Document Reviews, Observations, and Interviews). PPT-52 shows the difference between quantitative data (collected primarily via desk reviews) and qualitative data (collected primarily via on-site reviews).

B. Small Group Work on Data Sources (60 minutes)
Materials: PPT-55; H-12

Now show PPT-55 and ask participants, in groups of 4-5, to review H-12 and to fill in the data sources that they would use for each indicator and the strategies they would use if they were conducting (1) a desk review, and (2) an on-site review. Ask all groups to consider all nine indicators, but assign each group only one or two indicators to report on to the whole group. Ask each group to select a recorder and a reporter. Allow 1 hour for this activity before reconvening the group for the reports. Tell them that they also have the scheduled break time (an additional 15 minutes) to use if they need it. Each group will have 2 minutes to report on one indicator (or, if the groups each have been assigned two indicators to report on, they will have a total of 4 minutes to report).

BREAK (15 minutes)


C. Small Group Reports (30 minutes)

Ask each group to report on its indicator(s) and the data sources and monitoring strategies for on-site and desk reviews. Allow only 2 minutes per report on each indicator. Following the report on each indicator, ask the total group to comment or, if they wish, to suggest additional NRS data sources and other vehicles for conducting the desk and on-site reviews.

D. Steps and Guidelines for Monitoring Local Programs (25 minutes)
Materials: PPT-56; H-13; NRS Data Monitoring for Program Improvement guide, pp. 50-55

Show PPT-56 and conclude this section on Planning and Implementing Program Monitoring with a review of the steps and guidelines for monitoring local programs, H-13. The steps are as follows:
1. Identify state policy for monitoring and gather support from those who have a stake in the monitoring results;
2. Consider past practices when specifying the scope of work for monitoring;
3. Identify persons to lead and participate in monitoring;
4. Identify resources available for monitoring locals;
5. Determine a process for collecting data, with clearly defined criteria for rating;
6. Conduct monitoring;
7. Report findings and recommendations; and
8. Follow up on results.

More information about each of these steps can be found on pages 50 through 54 of the guide, NRS Data Monitoring for Program Improvement (2004). Point out other states’ models and procedures for monitoring (e.g., Pennsylvania, Tennessee), beginning on page 55 in the guide. Tell participants that, following lunch, state teams will have time to consider these steps and the guidelines listed on H-13 and begin to plan a process for local monitoring. Use this time before lunch to clarify any issues related to the workshop content presented thus far and to allow participants to share their concerns about, as well as their past experiences in, monitoring local programs. Now is also a good time to review any of the issues or questions posted on the Parking Lot.

LUNCH (60 minutes)

VII. Planning for and Implementing Program Improvement

A. A Model of the Program Improvement Process (20 minutes)
Materials: PPT-57 through PPT-62

Tell participants that they now are entering the final phase of this workshop series: planning and implementing a program improvement based on what they learn through program monitoring. Remind participants that, so far, they have reviewed strategies for setting performance standards for program quality, adjusting standards for local conditions, and setting policy for appropriate rewards and sanctions. They also have examined the programmatic components and policy decisions underlying the measures of educational gain, NRS follow-up, retention, and enrollment, and they have considered strategies for conducting monitoring through both desk reviews and on-site reviews, as well as the stakeholders they need to include in setting policy related to data monitoring.

Show PPT-57 and remind participants once again that data can be of considerable use to state and local programs. However, as indicated in PPT-58, data are useful only if the data are valid and reliable, if the state and locals ask appropriate questions after reviewing the data, and if data analysis leads to making wise decisions. Before moving on, it would be wise to remind participants about the change process—that it is a process, not an event. This means that change does not happen overnight. Show PPT-59, the factors that allow us to accept change; PPT-60, the stages of change; and PPT-61, a word of caution from the State Superintendent of Schools in Spokane, Washington. Now show PPT-62 and describe the four steps of a program improvement process:
1. Planning;
2. Implementing;
3. Evaluating; and
4. Documenting lessons learned and making adjustments, as needed.

B. Bringing it Home: The Start of a State Action Plan (60 minutes)
Materials: PPT-63 through PPT-66; H-14a and b

Show PPT-63 through PPT-66 and refer participants to H-14a and b. Ask them to work in their state teams to consider the questions on H-14a and b in beginning to plan a model for monitoring local programs. Tell them that they will have 1 hour for planning. They also may use the 15 minutes of scheduled break time, if they need it. When the group reconvenes following the break, each state is to report on the team’s plans as well as the potential problems it anticipates and the strategies it plans to use to mitigate potential problems. They are to be prepared to make 5-minute reports on their planned changes to the whole group.

BREAK (15 minutes)

C. Sharing Action Plans (45 minutes)

Ask representatives from each team to report on the team’s plans for implementing monitoring and program improvement processes. Allow 5 minutes for each report and encourage questions from the other teams. After all teams have reported their plans, ask if anyone would like to make any general observations about the reports—for example: Do states anticipate similar obstacles in setting policy for implementing a program monitoring and improvement process? Have states come up with vastly different models for involving local programs? Ask participants if the various state reports have generated ideas that they can use. Ask whether state teams will modify their plans based on the reports made by other states.

VIII. Closing and Evaluation (30 minutes)

A. Review Parking Lot Issues
Collect the flipchart pages with the Parking Lot Issues that were posted at the beginning of the workshop. Review the lists to determine if these questions have been answered during the workshop. Provide answers to unanswered questions or, if the questions need to be referred to others or if they need research, give participants an approximate date by which they can expect to receive either the answers or referrals to other information sources. Also ask if participants have any questions, items, or issues that still need to be clarified.

B. Identify Additional Resources
Materials: Flipchart (Needs/Resources)

Mark a flipchart page “Needs/Resources.” Ask participants to name additional resources that they need to implement the changes they identified in their action plans. These could include additional training, online resources, policy issues and changes, etc. The purpose of this activity is to provide information to the drivers of change and of policy at the federal and state levels to help them in their planning. Tell participants that their brainstormed list of needs and additional resources will be compiled and mailed to all workshop participants. Allow approximately 20 minutes for this activity.

C. Reflection
Provide closure to the workshop by asking participants to reflect on what they have learned and how they can apply the information they have discussed or acquired. Refer participants to the workshop objectives:

• Describe the importance of getting involved with and using data;

• Identify four models for setting performance standards as well as the policy strategies, advantages, and disadvantages of each model;

• Determine when and how to adjust standards for local conditions;

• Set policy for rewards and sanctions for local programs;

• Identify programmatic and instructional elements underlying the measures of educational gain, NRS follow-up, enrollment, and retention;

• Distinguish between the uses of desk reviews and on-site reviews for monitoring local programs;

• Identify steps for monitoring local programs;

• Identify and apply key elements of a change model; and

• Work with local programs to plan for and implement changes that will enhance program performance and quality.

D. Workshop Evaluation
Materials: H-16a, b, and c

Direct participants’ attention to H-16a, b, and c (Workshop Evaluation). Ask participants to complete the evaluation. Thank them for attending and participating, and tell them that you look forward to seeing them at the next workshop.

PARTICIPANT’S HANDOUTS

WORKSHOP OBJECTIVES (H-1)

By the end of this professional development sequence, participants will be able to:

Day 1

1. Describe the importance of getting involved with and using data;

2. Identify four models for setting performance standards as well as the policy strategies, advantages, and disadvantages of each model;

3. Determine when and how to adjust standards for local conditions;

4. Set policy for rewards and sanctions for local programs; and

5. Identify programmatic and instructional elements underlying the measures of educational gain, NRS follow-up, enrollment, and retention.

Day 2

1. Distinguish between the uses of desk reviews and on-site reviews for monitoring local programs;

2. Identify steps for monitoring local programs;

3. Identify and apply key elements of a change model; and

4. Work with local programs to plan for and implement changes that will enhance program performance and quality.

WORKSHOP AGENDA (H-2)

Day 1

I. Introductions, Objectives, Agenda Review for Day 1

II. The Power of Data

• Why Get Engaged with Data? (exercise and the motivation continuum)

• Overview of the Data-driven Program Improvement Model (questioning data exercise)

• Setting Performance Standards for Program Quality (presentation on four models for setting standards and exercise)

• Adjusting Standards for Local Conditions (scenarios)

• Shared Accountability with Appropriate Rewards and Sanctions (variations on a theme exercise)

III. Getting Under the Data: Performance Measures and Program Processes (data pyramids and the underlying program components, decisions, and processes)

IV. Day 1 Evaluation and Wrap-up

Day 2

V. Agenda Review for Day 2

VI. Planning for and Implementing Program Monitoring

• Differences between Desk Reviews and On-site Reviews

• Data Sources for Each (exercise)

• Steps and Guidelines for Local Program Monitoring

VII. Planning for and Implementing Program Improvement

• The Change Process

• A Model of the Program Improvement Process

• State Action Planning

• Sharing Reports

VIII. Closing and Evaluation

WHY GET ENGAGED WITH DATA? (H-3)

Directions: Form a team of 3 to 5 people and consider the following question:

Why is it important to be able to produce evidence of what your state (or local) adult education program achieves for its students?

Jot down as many responses as you can think of, writing one response on each Post-It Note.

Your workshop facilitator will provide you with a flipchart page titled The Motivation Continuum, similar to the chart that appears below. When your team can think of no more responses, take all the Post-It Notes your team has created and place them on The Motivation Continuum flipchart, ranging in order from those factors that are internally motivated to those factors that are driven by external forces.

The Motivation Continuum

Intrinsic Extrinsic

YOUR OWN PERSONAL MOTIVATORS (H-4)

Directions: Complete the following sentence with as many things as apply to you personally.

I can be motivated to work with our data if I remember that…

1. ______

2. ______

3. ______

4. ______

5. ______

6. ______

When state or local program team members share their motivating factors with one another, it can be a powerful, unifying activity for the team in determining next steps in setting policy, monitoring programs, and initiating program improvement efforts.

QUESTIONS FOR CONSIDERATION (H-5)

Directions: Examine the data in the following graph.

[Bar chart: the local program’s actual performance compared with its performance standards for each ESL level (Beginning Literacy, Beginning, Low Intermediate, High Intermediate, Low Advanced, High Advanced).]

List all the questions you can think of to ask about this local program’s data and what the underlying reasons might be for the results.

______

DECISION FOR STATE TEAMS: SELECTING A STANDARD-SETTING MODEL (H-6)

Directions: In your state teams, consider the following questions:

1. Which model do you favor for setting standards for/with local programs? Why? ______

2. Is it appropriate for your state to use one statewide model or will you need to use different models for different programs? ______

3. How will you involve the locals in setting the standards to which they will be held accountable? ______

Consider question #4, but do not include it in your state report. We will discuss this with the entire group following the state reports.

4. How do the standard-setting model(s) that states select represent a policy statement on the relationship between performance and quality that states want to instill in local programs? ______

ADJUSTING LOCAL STANDARDS: SAMPLE SCENARIOS (H-7)

Directions: Read the following scenarios. Each represents a local program’s claim that it cannot meet the state-set performance standards. In your small group, discuss how you would handle each claim by (1) proposing a strategy for verifying the accuracy of the claim, and (2) proposing a solution to the problem. Be prepared to report your team’s strategy and proposed solution to the whole group.

1. Continuous Improvement Model

Using a continuous improvement model, one state set performance standards for GED attainment for each local program at levels slightly higher than the previous year’s. However, in the previous year, several local programs had received grants to offer an extensive amount of “fast-track GED” instruction prior to the release of GED 2002 and, consequently, their secondary completion and GED rates soared. The “fast-track” grant is now over, and the programs think the current levels set by the state are too high and should be lowered, based on levels they attained before the grant.

Your Strategy for Verifying the Accuracy of the Local Program’s Claim: ______

Your Solution to the Problem and Response to the Local Program: ______

2. Relative Ranking Model

Another state uses a relative ranking model to set local performance standards. In reviewing its student demographic data, one local program that fails to meet its educational gain performance standards found that it serves a high proportion of older learners. The state average age of ABE learners is 33 years old, but the local program’s average student age is 49. The program requests that the state adjust standards lower for them, based on the common belief that older learners do not make gains as quickly.

Your Strategy for Verifying the Accuracy of the Local Program’s Claim: ______

Your Solution to the Problem and Response to the Local Program: ______

3. External Criteria Model

The state legislature requires all adult education programs to show at least a 20 percent increase in the percentage of participants who get jobs. In response, the state increases the standard for adult education programs by 25 percent over previous years for ‘entered employment.’ Several adult education programs claim that they cannot meet this standard because they serve significant numbers of learners who are already working, and the number of students with the goal of ‘obtain employment’ is low. If a program does not focus on employment skills, it cannot substantially increase its ‘entered employment’ rate.

Your Strategy for Verifying the Accuracy of the Local Program’s Claim: ______

Your Solution to the Problem and Response to the Local Program: ______

REFLECTION ON SUCCESS OF PAST EFFORTS (H-8)

Directions: In your state teams, consider past policy changes that your state adult education office has initiated and asked local programs to comply with. Select one policy change that was well received by the locals, and one policy change that was met with resistance from the locals. Then identify some of the factors that you think contributed to the success or failure of these initiatives. You have approximately 10 minutes for this exercise. Be prepared to share your responses with the whole group.

1. List a policy change (imposed by the state office on local programs) that was successful and well received by local programs. ______

What factors contributed to the success of this effort?

______

2. List a policy change (imposed by the state office on local programs) that was not successful and was met with resistance from local programs. ______

What factors contributed to the poor reception of this effort?

______

H-8 VARIATIONS ON A THEME

Directions: In a small group of 3 or 4 people, brainstorm as many possible rewards or incentives as you can for recognizing local programs that meet their performance standards. Write these in the left column below. Then brainstorm sanctions that the state might impose on local programs that do not meet their performance standards. Select a recorder for your group to write one reward per Post-It Note and one sanction per Post-It Note. When you have finished, wait for further instructions from the facilitator.

Reward Structures Sanctions

H-9 STATE WORKSHEET: PLANNING FOR REWARDS AND SANCTIONS

Directions: In your state team, make some preliminary decisions about the rewards and sanctions that you might use to motivate local programs to meet their performance standards. Consider the following questions.

1. What reward structures are you thinking of putting in place for local programs that meet their performance standards? ______

2. What sanctions are you thinking of putting in place for local programs that fail to meet their performance standards? ______

3. What timeline are you thinking of for putting the rewards and sanctions policy in place? ______

4. What stakeholders will you include in the decision-making process about rewards and sanctions when you get back to your home state? ______

5. Who in the state office will have primary responsibility for the following:

a. Announcing the new policy ______

b. Reviewing local programs’ data ______

c. Determining the reward or sanction for each program ______

d. Providing support and technical assistance to the programs that need to improve ______

e. What will be the nature of the support and technical assistance provided by the state? ______

6. What obstacles or resistance do you foresee in putting this policy in place? How might you plan in advance to lessen these obstacles? ______

7. What other factors, if any, do you need to consider that are specific to your state? ______

H-10 DATA CAROUSEL

Directions: In teams of 4-5, visit each of the four carousel stops around the room. Each of these stops represents one of your local programs. Review the data table or chart displayed at each stop (and also displayed on H-11b, c, d, and e). As you review the data at the first stop, ask yourselves what, if anything, in this data is a cause for concern? What questions do you want to ask the local program about these results? Note: Your questions should target and try to get at the underlying elements that may be causing the performance problem(s). Write your questions on the flipchart provided at this stop. Then proceed to the next carousel stop and repeat the process. At each stop, you will see questions that other review teams have written as they revolve on the data carousel. Add your questions, if they are different from the ones that already appear on the flipchart. You may spend up to 15 minutes at each stop.

H-11a DATA CAROUSEL—STOP #1

EDUCATIONAL GAIN ABE LEVELS

Definition: Students who advance an NRS Level

[Bar chart comparing the program's and the state's percentage of students completing each ABE level with the state performance standards; the data are shown in the table below.]

               Number Advancing Level / Total Enrolled      % Completing Level       State Performance
NRS Level        Program             State                  Program     State        Standard
ABE Beg. Lit.     2 / 14             169 / 761                14%        22%           35%
ABE Beg.          9 / 26             339 / 1,284              35%        26%           29%
ABE Int. Low      6 / 31             589 / 2,060              19%        29%           30%
ABE Int. High    40 / 58             683 / 3,339              69%        20%           19%
ASE Low          35 / 46             385 / 2,044              76%        19%           21%
ASE High         24 / 34             307 / 1,062              71%        29%           NA
TOTAL           116 / 209          2,470 / 10,550             55%        23%
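Facilitator's note (not part of the original handout): the percentages above are simple ratios of students advancing to students enrolled. For facilitators who want to regenerate or check this kind of comparison, the following minimal Python sketch is one possible approach; the dictionary layout and printout format are illustrative only, and the counts come from the table above.

```python
# Minimal sketch: level-completion rates for Data Carousel Stop #1, flagging
# levels where the program falls below the state performance standard.
levels = {
    # level: (program_advanced, program_enrolled, standard_percent or None)
    "ABE Beg. Lit.": (2, 14, 35),
    "ABE Beg.":      (9, 26, 29),
    "ABE Int. Low":  (6, 31, 30),
    "ABE Int. High": (40, 58, 19),
    "ASE Low":       (35, 46, 21),
    "ASE High":      (24, 34, None),  # no state standard set for this level
}

for level, (advanced, enrolled, standard) in levels.items():
    rate = 100 * advanced / enrolled
    flag = ""
    if standard is not None and rate < standard:
        flag = "  <-- below standard; a possible cause for concern"
    std_text = f"{standard}%" if standard is not None else "n/a"
    print(f"{level:14s} {rate:5.1f}% completed (standard {std_text}){flag}")
```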

H-11b DATA CAROUSEL—STOP #2

FOLLOW-UP RECEIPT OF SECONDARY SCHOOL DIPLOMA OR GED (2001-2003) Definition: The number of students who received a secondary school diploma or GED divided by the number of students who had that as a goal.

[Bar chart of the percent of students achieving the diploma/GED goal versus the performance standard, 2001-2003; the data are shown in the table below.]

Year    Number with Goal    Number Achieved Goal    Percent Achieved Goal    Performance Standard
2001          150                   75                      50%                     60%
2002          120                   40                      33%                     70%
2003          110                   40                      36%                     70%
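Facilitator's note (not part of the original handout): a minimal Python sketch, using the figures from the table above, that shows how the attainment rate is computed each year and how far it falls from the performance standard. The structure is illustrative only.

```python
# Minimal sketch: diploma/GED attainment rate for Stop #2 by year,
# with the gap to the performance standard.
data = [  # (year, students_with_goal, students_achieved_goal, standard_percent)
    (2001, 150, 75, 60),
    (2002, 120, 40, 70),
    (2003, 110, 40, 70),
]

for year, with_goal, achieved, standard in data:
    rate = 100 * achieved / with_goal
    print(f"{year}: {rate:.0f}% achieved goal vs. {standard}% standard "
          f"(gap to standard: {rate - standard:+.0f} points)")
```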

H-11c DATA CAROUSEL—STOP #3

RETENTION AVERAGE HOURS ATTENDED Definition: Total attended hours divided by number of enrolled students

[Bar chart of average attended hours for the program and the state, with performance standards, by NRS level; the data are shown in the table below.]

                     Average Attended Hours
NRS Level           Program      State      Performance Standard
ABE Beg. Lit.          40          90               70
ABE Beg.              120         120              100
ABE Int. Low           60         120               80
ABE Int. High         161         110              100
ASE Low                50         130               60
ASE High              150         140              100
TOTAL                  80         120               —
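Facilitator's note (not part of the original handout): a brief Python sketch, using the values above, that flags the levels whose average attended hours fall below the retention standard; the data structure is illustrative only.

```python
# Minimal sketch: comparing the program's average attended hours per NRS level
# (Stop #3) against the state performance standard.
hours = {  # level: (program_avg_hours, standard_hours)
    "ABE Beg. Lit.": (40, 70),
    "ABE Beg.":      (120, 100),
    "ABE Int. Low":  (60, 80),
    "ABE Int. High": (161, 100),
    "ASE Low":       (50, 60),
    "ASE High":      (150, 100),
}

below = [level for level, (avg, std) in hours.items() if avg < std]
print("Levels below the retention standard:", ", ".join(below) or "none")
```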

H-11d DATA CAROUSEL—STOP #4

Enrollment of Student Sub-Populations

[Bar chart comparing each target sub-population's actual percentage of total enrollment with the performance standard percentage; the data are shown in the table below.]

                                     Actual                    Performance Standard
Target Population               Number    % of Total         Number    % of Total
ABE Beginning Literacy             80         19%               90         20%
ESL Beginning Literacy             52         12%               60         13%
On Public Assistance               90         21%               80         18%
Immigrant/Refugee                  21          5%               35          8%
Aged 16-24                         25          6%               30          7%
Total Enrollment in Program       430          —               450          —
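Facilitator's note (not part of the original handout): a short Python sketch, using the counts above, showing how each sub-population's share of total enrollment is computed and compared with the standard share; the layout is illustrative only.

```python
# Minimal sketch: sub-population enrollment shares for Stop #4, compared with
# the performance-standard shares.
total_actual, total_standard = 430, 450
populations = {  # population: (actual_count, standard_count)
    "ABE Beginning Literacy": (80, 90),
    "ESL Beginning Literacy": (52, 60),
    "On Public Assistance":   (90, 80),
    "Immigrant/Refugee":      (21, 35),
    "Aged 16-24":             (25, 30),
}

for pop, (actual, standard) in populations.items():
    actual_pct = 100 * actual / total_actual
    standard_pct = 100 * standard / total_standard
    print(f"{pop:24s} actual {actual_pct:4.1f}% vs. standard {standard_pct:4.1f}%")
```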

H-11e MONITORING PERFORMANCE USING INDICATORS OF PROGRAM QUALITY

Directions: In groups of 4-5, for each indicator in the table below, fill in the data sources that you would use as well as the questions you would ask in monitoring local programs and the strategies you would use to conduct (1) a desk review, and (2) an on-site review. Complete all 9 indicators in the table below. Select a recorder and a reporter for your group and be prepared to report to the whole group on the indicator(s) assigned to your group by the facilitator. You have 1 hour for this activity, plus 15 minutes for the scheduled break. Your group will have 2 minutes to report on each indicator assigned.

Worksheet columns: Program Area in NRS Indicators of Program Quality; Data Source; Questions to Pose of Locals re: this Indicator; Strategies for Desk Review; Strategies for On-site Monitoring

Program Management (Data Reporting)

Recruitment

Goal Setting (Intake and Orientation)

Educational Gains

Assessment

Curriculum and Instruction

Persistence

Support Services

Professional Development

H-12 STEPS AND GUIDELINES FOR MONITORING LOCAL PROGRAMS

1. Identify state policy for monitoring. Gather support from those who have a stake in the results.
   Implementation Guidelines: Provide clear written guidelines to all levels of stakeholders on the scope of the monitoring activities (including process and timelines).
   Examples: The state plan should be open to the public and shared at all levels. State plans often specify outcome measures and frequency of evaluation.

2. Specify the scope of work for monitoring.
   Implementation Guidelines: Use quantitative and qualitative data for effective monitoring.
   Examples: Quantitative = look at outcome measurements; qualitative = look for evidence using program quality indicators.

3. Identify individuals to lead and to participate in monitoring activities.
   Implementation Guidelines: Consider the unique program features when identifying who should be involved from the local program and who should be part of the monitoring team. Consider strength in diversity.
   Examples: Local staff: practitioners, administrators, partners. External team members: content specialists, other educators, and staff from partnering agencies.

4. Identify resources available for monitoring local programs.
   Implementation Guidelines: With competing demands for resources (staff, time, and money for monitoring), consider formalizing a two-stage monitoring approach.
   Examples: Desk reviews look at program data from a distance. On-site reviews look at data in context, to see first-hand how the processes and operations lead to positive outcome measures.

5. Determine the process for collecting data, with clearly stated criteria for rating. Conduct monitoring activities.
   Implementation Guidelines: Create and use standard tools for data collection and analysis. Monitors (state staff and team) need to fully understand the tools, their use, and the rating criteria.
   Examples: Desk reviews can include data, proposals, plans, reports, and the program self-review. On-site reviews can include discussion of the self-review, observations, interviews, and a review of files and documents.

6. Report on the findings, including recommendations.
   Implementation Guidelines: Conclude on-site monitoring visits with a verbal debriefing followed by a written report.
   Examples: The report might include a short description of the monitoring activities with supporting qualitative description and quantitative information.

7. Follow up on the results.
   Implementation Guidelines: Given that the major purpose of monitoring is program improvement, follow-up is essential and should include an ongoing exchange between the state office and the local program.
   Examples: Follow-up activities might include reviewing performance standards and program improvement, rewarding or sanctioning, and the beginning of technical assistance.

H-13 PLANNING AND IMPLEMENTING PROGRAM IMPROVEMENT

Directions: With your state team members, consider the following questions and begin to plan a program improvement policy and process. Be prepared to report on your plans to the whole group.

1. Who should be included on your program improvement team? List the positions from the state office as well as the local program. Anyone else, such as community members, learners, etc.?

2. How will you prioritize areas needing improvement when you review a local program’s data and find several areas that may need to be addressed?

3. How will you gain cooperation from locals in this process?

4. What type of training or professional development will be needed to get local buy-in?

5. How will you identify and select strategies for effecting improvement?

6. Who will be responsible for taking the lead on ensuring that the change is implemented?

7. How will expectations for the change be promoted and nurtured?

8. How will the change be monitored?

H-14a 9. How will the changes that are implemented be evaluated?

10. Who will interpret the results?

11. Who will be on the lookout for unintended consequences?

12. Who will document the process of what worked, what didn’t, and lessons learned?

13. What problems do you anticipate facing as you plan for and implement policy related to data monitoring for program improvement?

14. What solutions or precautions can you suggest to avoid having these problems become major ones?

15. What is your timeline or expected completion date for the following activities:

A. Setting performance standards?______

B. Announcing the standards and asking locals to comply? ______

C. Developing a policy for adjusting standards for local conditions?______

D. Developing a policy for rewards and sanctions?______

E. Developing a policy and process for monitoring local programs?______

F. Developing a policy and process for effecting program improvements?______

G. Other ______

H-14b Aha! Experiences Parking Lot Issues

H-15 NRS DATA MONITORING FOR PROGRAM IMPROVEMENT WORKSHOP EVALUATION FORM

Date______Location of Workshop______

State or Local Program Name______

Your Position (Check all that apply)  Instructor  Local Administrator  Data Facilitator  Professional Development Coordinator  State Director or State Staff  Other (identify)______

1. The objectives of the NRS professional development packet were met

(not at all) 1 2 3 4 (completely)

The Power of Data

2. The “Why Get Engaged with Data?” exercise was (not effective) 1 2 3 4 (highly effective)

3. The concepts and information presented in “The Data-driven Program Improvement Model” were (not useful) 1 2 3 4 (highly useful)

4. The “Setting Performance Standards for Program Quality” exercise was useful in identifying standard-setting models we might use in our state. (not at all) 1 2 3 4 (extremely)

5. The concepts and information presented in “Adjusting Standards for Local Conditions” were (not helpful) 1 2 3 4 (extremely helpful)

6. The “Shared Accountability with Appropriate Rewards and Sanctions” exercise was useful for making decisions about the rewards and sanctions my state might put in place. (not at all) 1 2 3 4 (extremely)

General comments about the Power of Data Section:

H-16a Getting Under the Data: Performance Measures and Program Processes

7. The directions for the “Data Carousel” exercise were (confusing) 1 2 3 4 (clear)

8. The concepts and information presented in the data pyramids were (not helpful) 1 2 3 4 (extremely helpful)

General comments about the Getting Under the Data Section:

Planning for and Implementing Program Monitoring

9. The concepts and information presented in the Planning and Implementing Program Monitoring section were (not useful) 1 2 3 4 (very useful)

10. The small group work on Data Sources was helpful in understanding the differences in monitoring local programs using desk reviews versus on-site reviews. (not at all) 1 2 3 4 (extremely)

General comments about the Planning for and Implementing Program Monitoring Section:

Planning for and Implementing Program Improvement

11. The concepts and information presented in the “Model for Change” presentation and discussion were (not useful) 1 2 3 4 (very useful)

12. The state action planning was helpful in getting my state started toward data monitoring for program improvement. (not at all) 1 2 3 4 (extremely)

General comments about the Planning for and Implementing Program Improvement Section:

H-16b Overall Comments

1. What were the most helpful features of the workshop? Please be specific.

2. What do you think were the least helpful features of the workshop?

3. What suggestions do you have for improving this professional development activity?

POWERPOINT SLIDES
NRS DATA MONITORING FOR PROGRAM IMPROVEMENT (PowerPoint Slides)

Slide 1: NRS Data Monitoring for Program Improvement
Unlocking Your Data

Slide 2: Objectives—Day 1
1. Describe the importance of getting involved with and using data;
2. Identify four models for setting performance standards as well as the policy strategies, advantages, and disadvantages of each model;
3. Determine when and how to adjust standards for local conditions;
4. Set policy for rewards and sanctions for local programs;
5. Identify programmatic and instructional elements underlying the measures of educational gain, NRS follow-up, enrollment, and retention.

Slide 3: Agenda—Day 1
• Welcome, Introduction, Objectives, Agenda Review
• The Power of Data
  – Why Get Engaged with Data? Exercise
  – The Data-driven Program Improvement Model
  – Setting Performance Standards
  – Adjusting Standards for Local Conditions
  – Establishing a Policy for Rewards and Sanctions
• Getting Under the Data
  – Data Pyramids
  – Data Carousel
• Evaluation and Wrap-up for Day 1

Slide 4: Objectives—Day 2
1. Distinguish between the uses of desk reviews and on-site monitoring of local programs;
2. Identify steps for monitoring local programs;
3. Identify and apply key elements of a change model; and
4. Work with local programs to plan for and implement changes that will enhance program performance and quality.

Slide 5: Agenda—Day 2
• Agenda Review
• Planning for and Implementing Program Monitoring
  – Desk Reviews Versus On-site Reviews
  – Data Sources (small group work)
  – Steps and Guidelines for Monitoring Local Programs
• Planning for and Implementing Program Improvement
  – A Model of the Program Improvement Process
  – State Action Planning
• Closing and Evaluation

Slide 6: STOP! Why Get Engaged with Data?

Slide 7: Question for Consideration
Why is it important to be able to produce evidence of what your state (or local) adult education program achieves for its students?

Slide 8: The Motivation Continuum
Intrinsic ............................ Extrinsic
Which is the more powerful force for change?

Slide 9: NRS Data-driven Program Improvement (Cyclical Model)
STEPS
– Set performance standards
– Examine program elements underlying the data
– Monitor program data, policy, and procedures
– Plan and implement program improvement
– Evaluate progress and revise, as necessary, and recycle

Slide 10: What's Under Your Data? The Powerful Ps
Performance (Data)
Program
Policies
Procedures
Processes
Products

Slide 11: NRS Data-driven Program Improvement Model
[Cyclical diagram with NRS DATA at the center: Set Performance Standards -> Examine Program Elements Underlying the Data -> Monitor Program Data, Policy, Procedures -> Plan and Implement Program Improvement; Evaluate Improvement -> back to Set Performance Standards.]

Slide 12: Educational Gains for ESL Levels and Performance Standards
[Exhibit 1-2: bar chart comparing the program's educational gains with its performance standards for the ESL levels Beg. Lit., Beg., Low Int., High Int., Low Adv., and High Adv.]

Slide 13: Questions Raised by Exhibit 1-2
• How were performance standards set? Based on past performance?
• Are standards too low at the higher levels?
• Is performance pattern similar to that of previous years? If not, why not?
• What are program's assessment and placement procedures? Same assessments for high and low ESL?
• How do curriculum and instruction differ by level?
• What are student retention patterns by level?

Slide 14: The Power of Data: Setting Performance Standards

Slide 15: Essential Elements of Accountability Systems
• Goals
• Measures
• Performance Standards
• Sanctions and Rewards

Slide 16: National Adult Education Goals
Reflected in NRS Outcome Measures of educational gain, GED credential attainment, entry into postsecondary education, and employment.

Slide 17: Performance Standards
• Similar to a “sales quota”: how well are you going to perform this year?
  – Should be realistic and attainable, but
  – Should stretch you toward improvement
• Set by each state in collaboration with ED
• Each state's performance is a reflection of the aggregate performance of all the programs it funds

Slide 18: Standards-setting Models
Continuous Improvement
Relative Ranking
External Criteria
Return on Investment (ROI)

Slide 19: Continuous Improvement
• Standard based on past performance
• Designed to make all programs improve compared to themselves
• Works well when there is stability and a history of performance on which to base standard
• Ceiling reached over time, resulting in little additional improvement

Slide 20: Relative Ranking
• Standard is mean or median performance of all programs
• Programs ranked relative to each other
• Works for stable systems where median performance is acceptable
• Improvement focus mainly on low-performing programs
• Little incentive for high-performing programs to improve

Slide 21: External Criteria
• Set by formula or external policy
• Promotes a policy goal to achieve a higher standard
• Used when large-scale improvements are called for, over the long term
• No consideration of past performance: unrealistic, unattainable

Slide 22: Return on Investment
• Value of program :: Cost of program
• A business model; answers question, Are services or program worth the investment?
• Can be a powerful tool for garnering funding (high ROI) or for losing funding (low ROI)
• May ignore other benefits of program

Slide 23: Decision Time for State Teams
1. Which model(s) do you favor for setting standards for/with locals?
2. Is it appropriate to use one statewide model or different models for different programs?
3. How will you involve the locals in setting the standards they will be held to?
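Facilitator's note (not in the original slide set): the four models on Slides 18-22 can be made concrete with a small worked sketch. The Python example below uses entirely hypothetical rates and dollar figures; it is one possible illustration, not a prescribed calculation, and the 20 percent external-criteria increase simply echoes the legislated increase used in the sample scenarios handout.

```python
# Illustrative only: four ways a hypothetical state might set next year's
# educational-gain standard for one local program, one per standard-setting model.
past_rates = [0.22, 0.24, 0.25]                       # hypothetical program rates, last 3 years
all_program_rates = [0.18, 0.22, 0.25, 0.31, 0.35]    # hypothetical rates across all state programs
program_cost, program_value = 200_000, 260_000        # hypothetical dollars for the ROI model

# Continuous improvement: slightly higher than the program's own best recent performance.
continuous = max(past_rates) + 0.02

# Relative ranking: the median performance of all programs in the state.
ranked = sorted(all_program_rates)
relative = ranked[len(ranked) // 2]

# External criteria: set by policy or formula regardless of past performance
# (here, a 20 percent increase over the current rate).
external = past_rates[-1] * 1.20

# Return on investment: value of the program relative to its cost.
roi = program_value / program_cost

print(f"Continuous improvement standard: {continuous:.0%}")
print(f"Relative ranking standard:       {relative:.0%}")
print(f"External criteria standard:      {external:.0%}")
print(f"Return on investment:            {roi:.2f} (value per dollar spent)")
```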

Slide 24: Question for Consideration
How do the standard-setting model(s) that states select represent a policy statement on the relationship between performance and quality that states want to instill in local programs?

Slide 25: Adjusting Standards for Local Conditions
Research suggests that standards often need to be adjusted for local conditions before locals can work to improve program quality.
WHY IS THIS SO?

Slide 26: Factors that May Require Adjustment of Standards
Student Characteristics
  – An especially challenging group
  – Students at lower end of level
  – Influx of different types of students
Local Program Elements
External Conditions

Slide 27: Shared Accountability
State and locals share responsibility to meet accountability requirements
  – State provides tools and environment for improved performance
  – Locals agree to work toward improving performance

Slide 28: Locals should know…
• The purpose of the performance standards;
• The policy and programmatic goals the standards are meant to accomplish;
• The standard-setting model that the state adopts; and
• That state guidance and support is available to locals in effecting change.

Slide 29: Shared Accountability
• Which state-initiated efforts have been easy to implement at the local level?
• Which have not?
• What factors contributed to locals' successfully and willingly embracing the effort?
• What factors contributed to a failed effort?

Slide 30: Shared Accountability
[2 x 2 grid. Horizontal axis: State Administrative Control (Low to High). Vertical axis: Local Program Involvement (Low to High). Quadrants: low control / low involvement = "Anything Happening Out There??"; high control / low involvement = "Get OFF our backs!!"; low control / high involvement = "Locals Out of Control??"; high control / high involvement = "Hot Dog!! We're really moving!"]

Slide 31: What About Setting Rewards and Sanctions?
• Which is the more powerful motivator: rewards or sanctions?
• List all the different possible reward structures you can think of for local programs.
• How might sanctioning be counter-productive?
• List sanctioning methods that will not destroy locals' motivation to improve or adversely affect relationships with the state office.

Slide 32: Variations on a Theme Exercise
• (Refer to H-10.) Brainstorm as many possible rewards or incentives as you can for recognizing local programs that meet their performance standards.
• Then brainstorm sanctions that the state might impose on local programs that do not meet their performance standards.
• Select a recorder for your group to write one reward per Post-It Note and one sanction per Post-It Note.
• When you have finished, wait for further instructions from the facilitator.

Slide 33: Summary of Local Performance Standard-setting Process
Procedure                            Goal
Select standard-setting model        Reflect state policies; promote program improvement
Set rewards and sanctions policy     Create incentives; avoid unintended effects
Make local adjustments               Ensure standards are fair & realistic for all programs
Provide T/A                          Create atmosphere of shared accountability
Monitor often                        Identify and avoid potential problems

Slide 34: Getting Under the Data
NRS data, as measured and reported by states, represent the product of underlying programmatic and instructional decisions and procedures.

Slide 35: Four Sets of Measures
1. Educational gain
2. NRS Follow-up Measures
  – Obtained a secondary credential
  – Entered and retained employment
  – Entered postsecondary education
3. Retention
4. Enrollment

Slide 36: Educational Gain
[Data pyramid: Educational Gain at the top, resting on Assessment Policies and Approach; Assessment Procedures; Goal Setting and Placement Procedures; Retention; Class Organization; and Professional Development, with Instruction running alongside.]

Slide 37: Follow-up Measures
[Data pyramid: GED, Employment, and Postsecondary at the top, resting on Instruction; Support Services; Tracking Procedures; Retention; and Professional Development, with Goal-Setting running alongside.]

Slide 38: Retention
[Data pyramid: Retention at the top, resting on Students; Class Schedules and Locations; Placement Procedures; Support Services; Retention Support and Policies; and Professional Development, with Instruction running alongside.]

Slide 39: Enrollment
[Data pyramid: Enrollment at the top, resting on Community Characteristics; Class Schedules and Locations; Instruction; and Professional Development, with Recruitment running alongside.]

Slide 40: Data Carousel

Slide 41: Question for Consideration
How might it benefit local programs if the State office were to initiate and maintain a regular monitoring schedule to compare local program performance against performance standards?

Slide 42: Regular Monitoring of Performance Compared with Standards
• Keeps locals focused on outcomes and processes;
• Highlights issues of importance;
• Increases staff involvement in the process;
• Helps refine data collection processes and products;
• Identifies areas for program improvement;
• Identifies promising practices;
• Yields information for decision-making;
• Enhances program accountability.

Slide 43: BUT…
• How can states possibly monitor performance of all local programs?
• Don't we have enough to do already??
• Where will we find staff to conduct the reviews?
• You're kidding, right??

Slide 44: Not!

Slide 45: So….Let's Find Some Answers
• How can you monitor performance of locals without overburdening state staff?
• What successful models are already out there??
• How does your state office currently ensure local compliance with state requirements?
• Can you build on existing structures?

Slide 46: Approaches to Monitoring
Desk Reviews:
  – Ongoing process
  – Useful for quantitative data
    • Performance measures
    • Program improvement plans
    • Proposals
    • Staffing patterns
    • Budgets
On-site Reviews:
  – Single event, lasting 1-3 days
  – Useful for qualitative data
  – Review of processes & program quality
  – Input from diverse stakeholders

Slide 47: Advantages and Disadvantages of Desk Reviews
Advantages:
  – Data, reports, proposals, etc., already in state office
  – Review can be built into staff's regular workload
  – Data is quantitative; can be compared to previous years
  – No travel time or costs required
Disadvantages:
  – Assumes accurate data that reflect reality
  – Local staff and stakeholders not heard
  – Static view of data; no interaction in context
  – No team perspective

Slide 48: Advantages and Disadvantages of On-site Reviews
Advantages:
  – Data is qualitative; review of processes & program quality
  – Input from perspectives of diverse stakeholders
  – State works with locals to explore options for improvement; provides T/A
  – Opportunity to recognize strengths; offer praise; identify best practices
Disadvantages:
  – Stressful for local program and team
  – Arranging site visits and team is time-intensive for both locals and state
  – Requires time out-of-office
  – Incurs travel costs

Slide 49: Data Collection Strategies for Monitoring
1. Program Self-Reviews (PSRs)
2. Document Reviews
3. Observations
4. Interviews

Slide 50: Program Self-Reviews
• Conducted by local program staff
• Review indicators of program quality
• Completed in advance of monitoring visit and can help focus the on-site review
• Results can guide the program improvement process

Slide 51: Document Reviews
• Can review from a distance:
  – Proposals
  – Qualitative and quantitative reports
  – Improvement plans
• Can review on-site:
  – Student files
  – Attendance records
  – Entry and update records
  – Course evaluations

Slide 52: Qualitative and Quantitative Data
[Graphic only; no text content.]

Slide 53: Observations
Interactions
  – During meetings
  – At intake and orientation
  – In hallways and on grounds
  – In the classroom
Link what is observed to
  – Indicators of quality
  – Activities in the program plan
  – Professional development workshops

Slide 54: Interviews
Help clarify or explore ambiguous findings
Provide information re: stakeholders' opinions, knowledge, and needs
  – Administrative, instructional, and support staff
  – Community partners
  – Community agencies (e.g., employment, social services)
  – Learners

Slide 55: Fill in the Boxes: Monitoring with Indicators of Program Quality
In teams of 4-5 and using H-12, fill in the data sources you would expect to use, the questions you would ask locals, and the strategies you would use in conducting a desk review versus an on-site review.

Slide 56: Steps for Monitoring Local Programs
1. Identify state policy for monitoring; gather support from stakeholders.
2. Consider past practices when specifying scope of work for monitoring.
3. Identify persons to lead and participate in monitoring.
4. Identify resources available for monitoring locals.
5. Determine process for collecting data with clearly defined criteria for rating; conduct monitoring.
6. Report findings and recommendations.
7. Follow up on results.

Slide 57: Data Help…
• Measure student progress
• Measure program effectiveness
• Assess instructional effectiveness
• Guide curriculum development
• Allocate resources wisely
• Promote accountability
• Report to funders and to the community
• Meet state and federal reporting requirements
• Show trends

Slide 58: BUT…
Data do not help:
• If the data are not valid and reliable;
• If the appropriate questions are not asked after reviewing the data; or
• If data analysis is not used for making wise decisions.

Slide 59: A Word about the Change Process
Factors that allow us to accept change:
1. There is a compelling reason to do so;
2. We have a sense of ownership of the change;
3. Our leaders model that they are serious about supporting the change;
4. We have a clear picture of what the change will look like; and
5. We have organizational support for lasting systemic change.

Slide 60: Stages of Change
1. Maintenance of the old system
2. Awareness of new possibilities
3. Exploration of those new possibilities
4. Transition to some of those possibilities or changes
5. Emergence of a new infrastructure
6. Predominance of the new system

Slide 61: A Word of Caution
• Start small; don't overwhelm locals with a “data dump.”
• Begin with the core issues, such as educational gain.
• Listen to what the data tell about the big picture; don't get lost in too many details.
• Work to create trust and build support by laying data on the table without fear of recrimination.
• Provide training opportunities for staff on how to use data.
• Be patient, working with what is possible in the local program.
Source: Spokane, WA School Superintendent Brian Benzel

Slide 62: Planning and Implementing Program Improvement
Stages of the Program Improvement Process
1. Planning;
2. Implementing;
3. Evaluating; and
4. Documenting Lessons Learned and Making Adjustments, as needed

Slide 63: Planning Questions
• Who should be included on your program improvement team?
• How will you prioritize areas needing improvement?
• How will you identify and select strategies for effecting improvement?

Slide 64: Guiding Questions for Strategies
Is the strategy:
• Clear and understandable to all users?
• One specific action or activity, or dependent on other activities? (If so, describe the sequence of actions.)
• An activity that will lead to accomplishing the goal?
• Observable and measurable?
• Assignable to specific persons?
• Based on best practices?
• One that all team members endorse?
• Doable—one that can be implemented?

Slide 65: Implementation Questions
• Who will be responsible for taking the lead on ensuring that the change is implemented?
• Who will be members of the “change” team and what will be their roles?
• How will expectations for the change be promoted and nurtured?
• How will the change be monitored?

Slide 66: Evaluation Questions
• How will the changes that are implemented be evaluated?
• How will the team ensure that both short- and long-term effects are measured?
• Who will interpret the results?
• Who will be on the look-out for unintended consequences?

Slide 67: Possible Evaluation Results
Significant improvement with no significant unintended consequences: Stay the course.
Little or no improvement: Stay the course OR scrap the changes?
A deterioration in outcomes: Scrap the changes.

Slide 68: Documenting the Process
Document what worked and what didn't; lessons learned; and logical next steps or changes to the plan.
Use as guide for future action.

Slide 69: State Planning Time
In your state teams, consider the questions on H-14 and begin planning.
• Consider the stakeholders you want to include in your planning for data monitoring and program improvement.
• Consider the problems you anticipate facing and propose solutions to those problems.
• Complete H-14 to the best of your ability and be prepared to report on your plan in one hour.

Slide 70: Thank you
Great Audience! Great Participation! Great Ideas!
Live Long and Prosper! Good Luck!!

SUPPLEMENT

POSSIBLE QUESTIONS TO ASK WHEN EXAMINING THE DATA ON H-5

1. How were the performance standards set for this program? Were they based on past performance or some other criteria?

2. Are these standards appropriate given the pattern of performance the program has shown? For example, are the standards too low at the higher levels where performance greatly exceeded targets?

3. Is this performance pattern similar to that observed in previous years? If not, what has caused it to change? Will this affect setting of performance standards in the future?

4. What are the program’s assessment and placement procedures? What assessments are used for pre- and posttesting?

5. Is the program using the same assessment methods for high and low level ESL? If so, is this appropriate given the performance pattern?

6. What type of curriculum and instruction is the program offering? How does it differ by instructional level?

7. What are student retention patterns by level? Is retention affecting the differences in performance among students at different levels?

8. Could the program’s recruitment practices have had an influence on performance? How many students is it serving at each level?

9. Why are no students enrolled at the highest ESL level? Is this a result of recruitment, the type of classes the program offers, or placement procedures? Does the program need to change its recruitment practices?

S-1 GLOSSARY

Advancement: Learner advances from one educational functioning level to the next, based on the learner's performance on state designated assessments.
Aggregation, or data aggregation: The process of combining reports from one level of administration into a single report at the next (e.g., combining local program reports into one statewide report).
Alternative assessments: Procedures and techniques used as an alternative to traditional testing. The focus tends to be on individual student growth over time, rather than comparing students with one another.
Assessment: Measures of student progress, including standardized testing, teacher assessment, portfolios, checklists, etc.
Class level: The educational functioning level in which students are placed.
Contact hours: Hours of instruction or instructional activity the learner receives from the program. Instructional activity includes any program-sponsored activity designed to promote student learning in the program curriculum, such as classroom instruction, assessment, tutoring, or participation in a learning lab.
Continuous improvement: Model that uses past performance to set standards for the future and to plan for program improvement.
Data forms: A written or electronic document for collecting student information.
Data items: Individual questions or pieces of information contained on data forms.
Data quality: All states use the same definitions and coding categories for every data element in the NRS. States follow the same step-by-step instructions on how and when to collect each data element.
Desk review: A structured way to look at program information related to outcomes. May include review of data, proposals, reports, budget, etc.
Descriptive measures: For the purposes of the NRS, descriptive measures may include student demographics, educational status, and goals.
Earn a high school diploma or achieve a GED: Obtaining a state accredited secondary diploma/credential or passing the General Educational Development (GED) Tests.
Educational gain: Learner completes or advances one or more educational functioning levels from starting level as measured at program entry or beginning of an instructional cycle.
Employed: Learners who work as paid employees, work in their own business or farm, or who work 15 hours or more per week as unpaid workers on a farm or in a business operated by a member of the family. Also included are learners who are not currently working, but who have jobs or businesses from which they are temporarily absent.
English as a Second Language programs: Programs for limited English proficient students that focus on improving English communication skills such as speaking, reading, writing, and listening.
Enters employment: The learner obtains full- or part-time paid employment before the end of the first quarter after the program exit quarter.
Enters post-secondary education or training program: The learner enters another education or training program, such as community college, trade school, a four-year college or university, etc.
Evidence: Data and documentation to support findings.
External criteria: Model for setting standards with a formula based on factors not directly related to a program's performance in the past or in relation to others.
Family literacy programs: A program with a literacy component for parents and children or other intergenerational literacy components.
GED: Certificate given to learners who attain passing scores on the General Educational Development (GED) Tests.
Generalizable: The extent to which a finding can be generalized to other populations or situations.
Goals: Information collected at intake about the main reason(s) a student is enrolling in the adult education program. Consider both long- and short-term goals. For NRS purposes, report goals that can be reached within the fiscal year.
Indicators of program quality: Measures that define policies and practices for effective adult education programs.
Improved employment: The learner maintains his or her current employment and receives an increase in pay, additional responsibilities, or improved job-related skills.
Level benchmarks: Guidelines for determining students' educational functioning levels based on performance on standardized tests.
Longitudinal data: Data measured consistently from year to year in order to track learner progress over time.
Mandatory program: A local, state, or federal program that requires a student to attend adult education classes, for example welfare, NAFTA, or probation.
Mandatory students: Students who are required to attend adult education classes because of their participation in some other local, state, or federal program, including welfare, NAFTA, job training, or probation. Mandatory students do not include students required to attend classes by their employer.
Mean: The arithmetic average of a set of scores, or the sum of observations divided by the number of observations.
Median: The middle score of a set of scores.
Mode: The most frequently occurring score in a set of scores.
NAFTA program: A federal program to assist workers displaced by the North American Free Trade Agreement (NAFTA).
Norm-referenced tests: Tests on which the performance is interpreted in the context of the performance of a group with whom it is reasonable to compare the individual (for example, achieving at a 3.4 grade level).
On-site review: Monitoring local programs on site to verify data by looking at the processes and procedures being used. May include a review of files and recruiting materials, observations, interviews, etc.
Outcome measures: For the purposes of the NRS, core and secondary outcomes of adult education include learning gains, entry into post-secondary education and training, obtaining high school credentials, entering or advancing in employment, and other gains related to family, education, and community participation.
Performance standards (for states): Numeric levels established for outcome measures in the state plan indicating what proportion of students at each level will achieve each outcome.
Performance standards (for students): Statement that indicates how well or to what extent a student will demonstrate knowledge or skills.
Persistence: Student's ability to continue learning over time; the length of time a student remains engaged with learning.
Post-test: A test administered to a student at designated intervals during a program. It is usually used to measure progress or advancement in the program.
Pre-test: A test administered to a student upon entry into a program. It is usually used to determine level for placement.
Probation: A situation in which a student is under the supervision of a court and may be required to attend classes.
Program (or program area): The main emphasis of instruction for a set of classes. Examples of program areas are ABE, GED, workplace literacy, ESL, family literacy, etc.
P.I.P. or P.I.T.: Program Improvement Plan or Team.
Qualitative data: Detailed data collected in the form of words or images that is analyzed for description and themes.
Quantitative data: Data used to describe trends and relationships among variables. Analysis of the data entails the use of statistics.
Relative ranking: Model for setting standards based on a program's rank relative to the state mean or median rating or score.
Reliability: The extent to which others would arrive at similar results if they studied the same case using the same procedures; evidence of consistency of a measure.
Researchable question: A research question that can realistically be answered with the skills and resources available.
Retain employment: The learner remains employed in the third quarter after the exit quarter.
Return on Investment (ROI): Net value in relation to cost.
Rubric: A guide to evaluate a program (or student performance) on a scale with clearly defined criteria. Scales may be numeric (1 to 5) or descriptive (not evident to exemplary).
Standard deviation: A measure of the variability or spread of scores; the square root of the average of the squared deviations of the scores from the mean of the set of scores.
Student performance: Student attainment formally measured by some assessment method.
Student record system: A computerized or paper-based system for tracking student information related to intake information, goals, educational levels, attendance, achievements, and outcomes.
Student retention: Student attends the program long enough (persists) to show learning gains.
TANF: Temporary Assistance for Needy Families. A federal public assistance program.
Uniform system for collecting measures: All states and programs use the same methodology for collecting data on the measures. States certify validity through “data quality checklists.”
Validity: The extent to which a research instrument measures what it purports to measure.
Variance: A measure of the variability of the scores in a frequency distribution; more specifically, the square of its standard deviation.
Voluntary students: Students who attend adult education classes of their own free will; they are not required to attend by any state agency.
Work-based project learner activity: A short-term course (at least 12 hours but no more than 30 hours) in which instruction is designed to teach work-based skills and in which the educational outcomes and standards for achievement are specified.
Workplace literacy programs: A program designed to improve the literacy skills needed to perform a job and at least partly under the auspices of an employer.
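Note (not part of the original glossary): the statistical terms above (mean, median, mode, standard deviation, variance) can be illustrated with Python's standard library. The scores below are hypothetical and used only for illustration.

```python
# Illustrative only: computing the glossary's statistical terms for a small set
# of hypothetical assessment scores using Python's standard statistics module.
import statistics

scores = [41, 45, 45, 52, 58, 63, 70]  # hypothetical pre-test scale scores

print("mean:              ", statistics.mean(scores))       # sum of observations / number of observations
print("median:            ", statistics.median(scores))     # middle score
print("mode:              ", statistics.mode(scores))       # most frequently occurring score
print("standard deviation:", statistics.pstdev(scores))     # spread of scores around the mean
print("variance:          ", statistics.pvariance(scores))  # square of the standard deviation
```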

S-2c

LETTER TO SEND TO PARTICIPANTS PRIOR TO TRAINING

The Department of Education invites adult education staff to attend NRS Data Monitoring for Program Improvement. This workshop is one in a series of trainings designed to promote the quality and use of data collected under the National Reporting System (NRS). The goal of the workshop is to train staff to meet requirements for setting local program performance standards and to better monitor and continuously improve their local programs using NRS data.

TOPICS

Adult education is facing a time of change, as new legislation replacing the Workforce Investment Act is under consideration. As part of these changes, states may soon be developing new state plans and setting performance standards for program accountability. States will be facing new challenges to monitor and continuously improve local program performance. To assist in these efforts, this workshop will cover the following topics:

 Setting local performance standards – learn about performance standard setting models and how to select the model that best meets your state policies and promotes local program improvement.

 Local monitoring – explore how standards can reflect program instruction, assessment, retention and other procedures and how to use performance standards as indicators of local performance.

 Making change – the workshop will cover change models and how to make real changes in your program to improve program performance and student outcomes.

WHO SHOULD ATTEND

The intended audience for the training is adult education staff who are or will be responsible for setting performance standards, monitoring local programs and providing technical assistance for program improvement. Staff responsible for conducting professional development on these topics, and staff who perform these functions may also wish to attend. To provide a rich training experience, we encourage you to send a team of up to three persons to this training.

COST

The cost of this training is $ , which covers…

LOCATION AND DATES

Locations and dates of the trainings are:

Attendance is limited to xx persons, so please register as soon as possible.

To register for the training, please return the attached registration form no later than … For questions about the training, please contact:

S-3 Alternative Monitoring Exercise: Facilitator’s Notes

GENERAL NOTES FOR FACILITATORS:

This alternative session design allows participants to actually use the spiral bound guide, Data Monitoring for Program Improvement, to find information related to the purposes and strategies for monitoring. Participants can work in state teams or across state teams during Part One – using the guide to find information on different monitoring strategies that will link data collection with the Indicators of Program Quality. Later in Part Two, participants will return to work in state teams to look at the monitoring steps that the state already has in place and the steps that might require updating in the future.

Part One:

Time: 90 minutes. Materials: PPT 41-45; SH-12a; SH-12b or H-12.
A. (10 min) Introduce local program monitoring for program improvement.
B. (60 min) Small Group Work: use the monitoring guide for questions one and two on SH-12a, followed by more intensive practice using either SH-12b (an in-depth focus on three Indicators of Program Quality) or H-12 (a broader focus on nine Indicators of Program Quality).
C. (20 min) Small Group Report: share perspectives (5 min.) on the benefits of monitoring and identify the advantages and disadvantages of both desk and on-site reviews. Allow each group to report how they might monitor one indicator (15 min.).

Part Two:

Time: 25 minutes. Materials: PPT 56; SH-12c.
A. (5 min) Review steps and guidelines for a state monitoring system.
B. (15 min) State Team Work: identify what the state already has in place and what steps need to be developed or revised.
C. (5 min) Whole Group: wrap up the monitoring section by asking participants to share any salient ideas, strategies, or challenges that came to the surface during this segment of training.

S-4a Alternative Monitoring Exercise: Monitoring Local Programs

Directions for Search and Find in the spiral guidebook - NRS Data Monitoring for Program Improvement:

Work in state teams. Scan Chapter 4 for information to help complete the following exercises.

1. Identify 2-3 benefits for monitoring local programs (pages 37-38; suggested time 10 minutes).

FROM A STATE PERSPECTIVE…
  •
  •
  •
  •

FROM A LOCAL PROGRAM PERSPECTIVE…
  •
  •
  •
  •

2. The Monitoring Guide suggests that states can use a two-prong approach to monitoring – desk reviews and on-site reviews (pages 38-40, suggested time 10 minutes).

Scan the information about the approaches and the advantages/disadvantages for each approach.

a. Brainstorm ways that your state could use desk reviews and on-site reviews. b. Brainstorm any obstacles that your state might encounter with each approach.

Approach             Useful in our state             Possible obstacles in our state

Desk Reviews

On-Site Reviews

S-4b Alternative Monitoring Exercise: Using Indicators of Program Quality to Monitor Programs

Refer to the NRS Guide to Data Monitoring, Table 4-4 on pages 46-48 and the Pennsylvania sample on pages 55-56. (Suggested time: 40 minutes)
a. Select several of your state's Indicators of Program Quality; exact wording is not necessary. Alternatively, select an indicator from one of the sample states included in the Appendix, pages 73-78.
b. Complete the chart below for monitoring the outcomes and processes by identifying your data sources, questions to be answered, and then outlining effective strategies for desk reviews and on-site reviews.

1. Indicator of Program Quality

Data Sources (NRS and Local)    Questions to Pose with Local Staff    Strategies for Desk Review    Strategies for On-site Review

2. Indicator of Program Quality

Data Sources (NRS and Local)    Questions to Pose with Local Staff    Strategies for Desk Review    Strategies for On-site Review

3. Indicator of Program Quality

Data Sources (NRS and Local)    Questions to Pose with Local Staff    Strategies for Desk Review    Strategies for On-site Review

S-4c Alternative Monitoring Exercise: Steps and Guidelines for Monitoring Local Programs

Directions: Scan the steps and identify what your state has in place and what needs to be done.
Column 4: outline the processes and products in place.
Column 5: identify what needs to be developed.

1. Identify state policy for monitoring. Gather support from those who have a stake in the results.
   Implementation Guidelines: Provide clear written guidelines to all levels of stakeholders on the scope of monitoring activities (including process and timelines).
   Examples: The state plan should be open to the public and shared at all levels. State plans often specify: 1. Outcome measures; 2. Frequency.
   Process/Products in place: ______
   To Be Developed: ______

2. Specify the scope of work for monitoring.
   Implementation Guidelines: Use quantitative and qualitative data for effective monitoring.
   Examples: Quantitative = look at outcome measurements; qualitative = look for evidence using program quality indicators.
   Process/Products in place: ______
   To Be Developed: ______

3. Identify individuals to lead and to participate in monitoring activities.
   Implementation Guidelines: Consider the unique program features when identifying who should be involved from the local program and who should be part of the monitoring team. Consider strength in diversity.
   Examples: Local staff: practitioners, administrators, partners. External team members: content specialists, other educators, and staff from partnering agencies.
   Process/Products in place: ______
   To Be Developed: ______

4. Identify resources available for monitoring local programs.
   Implementation Guidelines: With competing demands for resources (staff, time, and money for monitoring), consider formalizing a two-stage monitoring approach.
   Examples: Desk reviews look at program data from a distance. On-site reviews look at data in context, to see first-hand how the processes and operations lead to positive outcome measures.
   Process/Products in place: ______
   To Be Developed: ______

5. Determine the process for collecting data, with clearly stated criteria for rating. Conduct monitoring activities.
   Implementation Guidelines: Create and use standard tools for data collection and analysis. Monitors (state staff and team) need to fully understand the tools, their use, and the rating criteria.
   Examples: Desk reviews can include data, proposals, plans, reports, and the program self-review. On-site reviews can include discussion of the self-review, observations, interviews, and a review of files and documents.
   Process/Products in place: ______
   To Be Developed: ______

6. Report on the findings, including recommendations.
   Implementation Guidelines: Conclude on-site monitoring visits with a verbal debriefing followed by a written report.
   Examples: The report might include a short description of the monitoring activities with supporting: A. Qualitative description; B. Quantitative information.
   Process/Products in place: ______
   To Be Developed: ______

7. Follow up on the results.
   Implementation Guidelines: Given that the major purpose of monitoring is program improvement, it should include an ongoing exchange between the state office and the local program.
   Examples: Follow-up activities might include reviewing performance standards and program improvement, rewarding or sanctioning, and technical assistance.
   Process/Products in place: ______
   To Be Developed: ______

S-4d
