Provided through the gracious contribution of Dr. Kurt Steuck

Planning the Evaluation of Sponsored Research Projects
Workshop Series, Jan - Feb 2016
Workshop Handouts
Table of Contents
This workshop was prepared and delivered by Steuck & Associates for the University of Texas at San Antonio, January 2016.

Sessions At A Glance
Jan 28: Introductions; Frameworks; Evaluation Questions; Logic Models; Evaluation Plans; Data Collection Tools; One-to-One Consulting
Feb 4: Review and Q&A; Evaluation Plans; Working with an Evaluator; Work Breakdown Structures; Cost Estimation; One-to-One Consulting
Feb 11: Review and Q&A; Data Management; Evaluation Design; Indicators; Data Collection Instruments; One-to-One Consulting
Activity #1: Evaluation Questions
Question and Method Oriented Approaches
1. To what extent was each program objective achieved?
2. Did the program effectively discharge its responsibilities?
3. Did tested performance meet or exceed pertinent norms?
4. Did tested performance meet or exceed standards?
5. Where does a group's tested performance rank compared with other groups?
6. Is a group's present performance better than past performance?
7. What sectors of a system are performing best and poorest?
8. Where are the shortfalls in specific curricular areas?
9. At what grade levels are the strengths and shortfalls?
10. What value is being added by particular programs?
11. To what extent can students effectively speak, write, figure, analyze, lead, work cooperatively, and solve problems?
12. What are a program's effects on outcomes?
13. Are program activities being implemented according to schedule, budget, and expected results?
14. What is the program's return on investment?
15. Is the program sustainable and transportable?
16. Is the program worthy of continuation and/or dissemination?
17. Is the program as good or better than others that address the same objectives?
18. What is the program in concept and practice?
19. How has the program evolved over time?
20. How does the program produce outcomes?
21. What has the program produced?
22. What are the program's shortfalls and negative side effects?
23. What are the program's positive side effects?
24. How do the various stakeholders value the programs?
25. Did the program meet all the benefactors' needs?
26. What were the most important reasons for the program's success or failure?
27. What are the program's most important unresolved issues?
28. How much did the program cost?
29. What were the costs per beneficiary, per year, etc.?
30. What parts of the program were successfully transported to the other sites?
31. What are the program's essence and salient characteristics?
32. What merits and demerits distinguish the program from similar programs?
33. Is the program grounded in a validated theory?
34. Are program operations consistent with the guiding theory?
35. Were hypothesized causal linkages confirmed?
36. What changes in the program's design or implementation might produce better outcomes?
37. What program features are essential for successful replication?
38. What interesting stories emerged?
Improvement/Accountability Evaluation
1. What consumer needs should be addressed?
2. What alternatives are available to address the needs and what are their comparative merits?
3. What plan should guide the program?
4. What facilities, materials, and equipment are needed?
5. Who should conduct the program and what roles should the different participants carry out?
6. Is the program working and should it be revised?
7. How can the program be improved?
8. Is the program reaching all the rightful beneficiaries?
9. What are the outcomes?
10. Did staff responsibly and effectively discharge their program responsibilities?
11. Is the program superior to critical competitors?
12. Is the program worth the required investment?
13. Is the program meeting minimum accreditation requirements?
Social Agenda/Advocacy Evaluation
1. Were questions negotiated with stakeholders?
2. What was achieved?
3. What were the impacts?
4. How did the program operate?
5. How do various stakeholders judge the program?
6. How do experts judge the program?
7. What is the program's rationale?
8. What were the costs?
9. What were the cost-benefits?
Adapted from: Stufflebeam, D. (2001). Evaluation models. New Directions for Evaluation, 2001(89), 7-98.
Evaluation Plan Example #1: UTSA DHS Evaluation Plan
Logic Model Component | Indicator | How Collected | When Collected

Activities (Management Focus)
Developed Long-Range Strategic Plan (P2 g); Communication Goals (P2 g); Outreach Goals (P2 g) | Compliance; Stakeholder Perceived Quality | Review of Project Documents; Interviews with Early Career Awardees and Mentors | Dec 2014; Aug 2015; Aug 2016; Aug 2018
Established Evaluation Metrics (P2 h) | Stakeholder Perceived Quality | Interviews with Early Career Awardees and Mentors | Sept 2014; Annually in August Thereafter
Establish an Enduring and Comprehensive Program of Study (P2 a) | Compliance with Project Plan; Perceived Quality | Review of Project Documents; Interviews with Early Career Awardees, Mentors, and DHS Scholars | Fall 2014 for Launch in Spring 2015; Annually in August Thereafter
Advisory Group Meetings | Compliance with Project Plan; Impact of Meetings | Review of Project Documents; Interviews with Early Career Awardees and Mentors | Biannually (August and February - Tentative)
PI Travel | Compliance | Interviews with Early Career Awardees and Mentors | Aug 2015; Annually in August Thereafter
Consulting for Curriculum Development | Compliance; Stakeholder Perceived Quality | Interviews with Dr. Nikki Austin (Curriculum Consultant), Early Career Awardees, and Mentors | Fall 2014 and as Needed with Advisory Board and Mentor Consultation
Establish Meaningful Partnerships with COEs (P2 e) | Compliance; Issues to Be Resolved | Interviews with COE PIs, Early Career Awardees, and Mentors | Dec 2014; Aug 2015; Annually in August Thereafter
Implemented Long-Range Strategic Plan (P2 g); Communication Goals (P2 g); Outreach Goals (P2 g) | Compliance with Long-Range Plan Created by the UTSA Grant Faculty/Staff | Review of Project Documents; Interviews with Early Career Awardees and Mentors | Spring 2015; Annually in August Thereafter
Leader Time Commitment to Project (P2 f1) and Faculty Involvement in Project (P2 j) | Compliance with Grant Proposal | Review of Project Documents; Interviews with Early Career Awardees and Mentors | Dec, May, Aug of Each Year of Grant
Projects Have Been Conducted in a Timely Manner (P2 f) | Compliance with Grant Proposal | Review of Project Documents; Interviews with Early Career Awardees and Mentors | Dec, May, Aug of Each Year of Grant
Research Projects (Faculty Involvement) (P2 i) | Progress Indicators (e.g., number of projects, duration of projects, number of students involved) | Review of Project Documents; Interviews with Early Career Awardees, Mentors, and DHS Scholars | Aug 2015, Aug 2016
Mentoring Junior Faculty | Perceived Quality of the Mentoring | Interviews with Early Career Awardees and Mentors | Dec, May, Aug of Each Year of Grant

Activities (Student Focus)
Recruiting and Selection (P2 m) | Compliance with Project Plan | Interviews with Director of Undergraduate Research, Early Career Awardees, and Mentors | Spring 2015, Fall 2016, Fall 2018
Coordination of and Participation in Internships | Progress Indicators (e.g., host companies contacted, agreements in place) | Interviews with Early Career Awardees, Mentors, and DHS Scholars; Phone Interviews with Host Companies | Aug 2015; Annually in August Thereafter
Research Projects (Student Involvement) (P2 i) | Progress Indicators (e.g., number of projects, duration of projects, number of students involved) | Review of Project Documents; Interviews with DHS Scholars | Aug 2015; Aug of Each Year of Grant
Student Travel | Compliance; Quality of Event/Conference | Interviews with Early Career Awardees and DHS Scholars | Aug 2015; Annually in August Thereafter
Faculty Involvement (P2 j) (With Students) | Compliance with Grant Proposal | Interviews with DHS Scholars | Dec, May, Aug of Each Year of Grant

Outputs/Products
Long-Range Strategic Plan (P2 g) | Compliance; Stakeholder Perceived Quality | Review of Project Documents; Survey of UTSA Stakeholders | Dec 2014; Aug 2015; Aug 2016; Aug 2018
Certification Program | Perceived Quality | Survey of Early Career Awardees, Mentors, and Administrators | Aug 2015; Annually in August Thereafter
Courses | Perceived Quality | Survey of Early Career Awardees, Mentors, and Administrators | Aug 2015; Annually in August Thereafter
Internships | Quality of Internships | Survey of UTSA Faculty and Students; Phone Interviews with Host Companies | Aug 2015; Annually in August Thereafter
Research Plan | Perceived Quality | Survey of Early Career Awardees and Mentors | Aug 2015; Annually in August Thereafter
Research Papers and Publications (P2 i) | Progress Indicator (e.g., number planned, number in draft) | Review of Project Documents; Interviews with Early Career Awardees and Mentors (possibly DHS Scholars) | Aug 2015; Annually in August Thereafter

Outcomes & Impacts
Students Benefit from Program Activities (P2 d and P2 l): Student Certifications, Undergraduate Research, Undergraduate Experiences | Compliance; Participation Levels; Perceived Quality | Review of Project Documents; Interviews with and Survey of Early Career Awardees and Mentors | Aug 2015; Annually in August Thereafter
Eligible Homeland Security Workforce (P2 b) | Perceived Quality | Survey of Early Career Awardees and Mentors | Aug 2016, Aug 2018, Aug 2020
Outlets and Career Paths (P2 n): Employment in DHS, Fed Lab, Graduate School | Perceived Quality | Survey of Early Career Awardees and Mentors | Aug 2016, Aug 2018, Aug 2020
Evaluation Plan Example #2: UTSA NSF Noyce
Activity, Output/Product, Impact | Indicator | How Collected | When Collected

Activities
Workshop - CEMM PD for 1st Cohort Master Teaching Fellows (MTFs) | Compliance with Project Plan; Perceived Quality; Self-Reported Learning (1st MTFs) | Evaluator Review; Interviews with MTFs and UTSA Faculty/Staff; Post-Session Survey | Fall 2014
Workshop - Action Research PD for 1st Cohort Mentors | Compliance with Project Plan; Perceived Quality; Self-Reported Learning (1st MTFs) | Evaluator Review; Interviews with MTFs and UTSA Faculty/Staff; Post-Session Survey | Summer 2015
Workshop - CEMM PD for 2nd Cohort Mentors | Compliance; Perceived Quality; Self-Reported Learning (2nd MTFs) | Interviews with MTFs and UTSA Faculty/Staff; Post-Session Survey | Fall 2015
Workshop - Action Research PD for 2nd Cohort Mentors | Compliance; Perceived Quality; Self-Reported Learning (2nd MTFs) | Interviews with MTFs and UTSA Faculty/Staff; Post-Session Survey | Summer 2016
Workshop - MS&T Teacher Presentations | Compliance; Perceived Quality | Interviews with UTSA Faculty/Staff | Spring 2017
Workshop - CEMM PD by Mentors for Mentees | Compliance; Perceived Quality; Self-Reported Learning (Mentees) | Interviews with MTFs and UTSA Faculty/Staff; Post-Session Survey of Mentees | Fall 2017
Workshop - Action Research PD by Mentors for Mentees | Compliance; Perceived Quality; Self-Reported Learning (Mentees) | Interviews with MTFs and UTSA Faculty/Staff; Post-Session Survey of Mentees | Spring 2018
Workshop - MS&T Teacher Presentations | Compliance; Perceived Quality | Interviews with UTSA Faculty/Staff | Summer 2019
Workshop - MS&T Teacher Presentations, 2nd Cohort | Compliance; Perceived Quality | Interviews with UTSA Faculty/Staff | Spring 2020
CEOP Use by Mentors | Compliance; Perceived Quality | Interviews with MTFs and UTSA Faculty/Staff | Spring 2015 - Spring 2019
Systemic Intentional Study by Mentors | Compliance; Perceived Quality | Interviews with MTFs and UTSA Faculty/Staff | Fall 2015; Fall 2016
Campus Systemic Intentional Study by Mentors | Compliance; Perceived Quality | Interviews with MTFs and UTSA Faculty/Staff | Summer 2018; Summer 2019
Content Specific Training (CST) | Compliance with Project Plan; Perceived Quality; Self-Reported Learning | Interviews with MTFs and UTSA Faculty/Staff; Post-Session Survey | End of Each Semester
SAMSEC | Compliance with Project Plan; Perceived Quality; Self-Reported Learning | Post-Session Survey | Summer 2015; Summer 2016; Summer 2017; Summer 2018; Summer 2019; Summer 2020
Observations of Mentors | Compliance; Perceived Quality | Interviews with UTSA Faculty/Staff | End of Each Semester
Participation in CofP | Compliance; Perceived Quality | Evaluator Review | Spring 2015 and Periodic Follow-Up
Recruitment & Selection of 1st Cohort | Compliance; Perceived Quality | Interview with UTSA Faculty/Staff | Summer/Fall 2014
Recruitment & Selection of 2nd Cohort | Compliance; Perceived Quality | Interview with UTSA Faculty/Staff | Spring/Summer 2015
Iterative Development Phase I | Compliance; Perceived Quality | Interview with UTSA Faculty/Staff | End of Year 1
Iterative Development Phase II | Compliance; Perceived Quality | Interview with UTSA Faculty/Staff | End of Year 2
Iterative Development Phase III | Compliance; Perceived Quality | Interview with UTSA Faculty/Staff | End of Year 3
Research Study | Compliance | Interview with UTSA Faculty/Staff | End of Each Semester

Outputs/Products
Workshops and Instructional Materials | Perceived Quality | Evaluator Review; Interviews with Mentors; Post-Session Survey | End of Each Semester
CEOP Use and Findings | Adequacy of Data for Decision-Making | Interview with UTSA Faculty/Staff | End of Each Semester
CST Professional Development Events and Materials | Perceived Quality | Evaluator Review; Interviews with MTFs; Post-Session Survey | End of Each Semester
SAMSEC Sessions | Perceived Quality; Self-Reported Learning | Post-Session Survey | Immediately Following Two-Day Sessions

Outcomes & Impacts
Project Impact (not addressed by the research studies) | Perceived Quality; Perceived Impact | Interviews with SAISD Participants and Stakeholders | Annual, Summer
Evaluation Plan Summary #1: TAMU-SA Cybersecurity
Goals of the Evaluation

There are two overarching goals for the project evaluation: 1) to ensure that the grant project plan is being followed (e.g., monitoring, compliance), and 2) to assess the quality of the activities, products, and outcomes. Monitoring questions focus on what happened during the grant, whereas the evaluation questions assess the value of the grant activities, products, and outcomes.
Indicators

There are three sets of indicators to be collected by the external evaluator. The first is compliance with the stated grant activities and objectives (e.g., Was the activity completed and documented in a sufficient manner? How many students and faculty participated in the event?). The second set of indicators is the perceived efficacy or quality of the activities (e.g., workshops) as judged by the participating TAMU-SA grant faculty/staff, teachers, and students. This category includes collecting best practices and lessons learned from implementing the workshops, professional development events, and lessons/exercises during each year of the grant. The third set of indicators consists of measures of student performance in the lessons, exercises, and competitions.
Methods

Multiple methods will be used in the evaluation of this project. Interviews and surveys are the primary tools for this evaluation. Data on student performance in the lessons and exercises, along with adoption rates, will support a quantitative analysis of the project.
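For illustration only, the sketch below shows one way the quantitative indicators mentioned above (exercise scores and adoption rates) might be tabulated; the semester labels, site names, and numbers are hypothetical placeholders, not project data.

```python
# Illustrative sketch only: all values below are hypothetical.
from statistics import mean

# Hypothetical exercise scores (0-100) collected at the end of each semester.
performance = {
    "Fall": [72, 85, 90, 64, 78],
    "Spring": [80, 88, 93, 70, 84],
}

# Hypothetical adoption counts: (lessons adopted, lessons offered) per site.
adoption = {"Site A": (4, 6), "Site B": (5, 6)}

for semester, scores in performance.items():
    print(f"{semester}: mean exercise score = {mean(scores):.1f} (n = {len(scores)})")

for site, (adopted, offered) in adoption.items():
    print(f"{site}: adoption rate = {adopted / offered:.0%}")
```

Summaries of this kind would feed the annual report described under Reporting below.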
Timeline

Evaluation activities will occur throughout the lifecycle of the grant. Depending on the data being collected, some interviews will occur at the end of each academic semester (i.e., May, August, December). Other data will be collected upon conclusion of an event. For example, interviews and/or surveys about the efficacy of a workshop will be conducted at the completion of the workshop.
Reporting

The external evaluator will provide an annual report to the TAMU-SA grant faculty/staff to coincide with the NSF grant requirement timeline (e.g., prior to the date when TAMU-SA must supply information to NSF). In addition, the external evaluator will provide informal written and verbal summaries as needed.
Evaluation Plan Summary #2: UTSA NSF Noyce
Goals of the Evaluation

There are two overarching goals for the project evaluation: 1) to ensure that the grant project plan is being followed (e.g., monitoring, compliance), and 2) to assess the quality of the activities and their outcomes/products in order to provide best practices and lessons learned to the grant staff. The external evaluator and the UTSA grant faculty/staff will ensure that potential duplication of effort between the research study and the evaluation is minimized. For example, if the assessment of the quality of workshop and training materials is adequately covered in the research study, then the evaluation will only note that this has been completed satisfactorily. A detailed evaluation plan is provided in Appendix xxx.
Indicators

There are three main sets of indicators to be collected by the external evaluator: 1) compliance with the stated grant activities and objectives, 2) perceived quality of the activities (e.g., workshops) as judged by the MTFs, mentees, UTSA grant faculty/staff, and SAISD stakeholders, and 3) perceived learning. This last indicator is a self-reported assessment of the degree to which the mentors and mentees believe they have learned the specific knowledge and skills targeted by the workshop or training. The research study will address more formal assessments of learning by the MTFs, mentees, and students.
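As a minimal sketch only (assuming a 5-point self-rating item, which the plan does not specify), perceived learning from a post-session survey could be summarized as shown below; the item names and ratings are hypothetical.

```python
# Hypothetical post-session self-ratings (1 = learned nothing, 5 = learned a great deal).
from statistics import mean

responses = {
    "CEMM strategies": [4, 5, 3, 4, 5],
    "Action research design": [3, 4, 4, 5, 4],
}

for item, ratings in responses.items():
    share_high = sum(r >= 4 for r in ratings) / len(ratings)
    print(f"{item}: mean = {mean(ratings):.1f}; rated 4 or 5 by {share_high:.0%} (n = {len(ratings)})")
```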
Methods

Interviews with UTSA faculty/staff. The number of interviews listed in the detailed evaluation plan may be misleading: interviews with UTSA faculty and staff will occur at the end of each semester or immediately after significant project events, and a single interview will cover several evaluation plan items (e.g., quality of the workshops, quality of the CEOP data and observations, quality of the CST).
Interviews with master teachers (both cohorts). Interviews with the MTFs will occur at the end of each semester or after significant project events. Initially, the interviews will be conducted face-to-face to build rapport between the MTFs and the external evaluator; subsequent interviews will be conducted by phone. To be respectful of the limited time teachers have each semester, each interview, whether face-to-face or by phone, will be less than 30 minutes long.
Short, focused surveys. The surveys will also be constructed so as to be respectful of the teachers’ limited time. Each survey will have 5-10 succinct questions focused on the events and/or training they have completed.
Reporting

The external evaluator will provide an annual report to the UTSA grant faculty/staff to coincide with the NSF grant requirement timeline (e.g., prior to the date when UTSA must supply information to NSF). In addition, the external evaluator will provide informal written and verbal summaries as needed.
Work Breakdown Structure Example #1: UTSA CEIG
WBS | Task | Hours | Number of | Duration

Prep - Information Gathering
  Review existing documentation | 0.5
  Meet with UTSA CEIG grant administrators | 1
  Determine what high school data is available/collected | 1

Refine the Evaluation Questions
  Determine what additional questions are of interest to UTSA | 0.5
  Anticipate stakeholders' (e.g., DoD) potential questions | 0.5

Aim 1
  Draft interview protocol dimensions | 1
  Draft interview questions | 2
  Vet the interview protocols with CEIG administrators | 1
  Schedule the interviews | 1
  Conduct the interviews | 4 | 4 | 1
  Transcribe the interview notes and observations | 8 | 4 | 2
  Develop a data analysis coding rubric for the interview data | 1 | 1 | 1
  Analyze the coded interview data | 8 | 4 | 2
  Verify data analysis | 1
  Summarize the analysis | 4

Aim 2
  Draft interview protocol dimensions | 1
  Draft interview questions | 2
  Vet the interview protocols with CEIG administrators | 1
  Schedule the interviews | 1
  Conduct the interviews - faculty | 2 | 2 | 1
  Conduct the interviews - students | 2 | 2 | 1
  Transcribe the interview notes and observations | 4 | 2 | 2
  Develop a data analysis coding rubric for the interview data | 1 | 1 | 1
  Analyze the coded interview data | 4 | 2 | 2
  Verify data analysis | 1
  Summarize the analysis | 4

Aim 3
  Determine IRB requirements for UTSA and for the high schools | 0.5
  Draft survey question dimensions | 1
  Draft survey questions | 2
  Vet the survey questions with CEIG administrators | 1
  Draft survey instructions | 1
  Administer the surveys to the high school students | 1
  Tabulate and analyze the survey data | 4
  Verify data analysis | 1
  Summarize data | 4

Proj - Reports and Presentations
  Draft interim report | 16
  Edit and deliver interim report in PDF format | 1
  Draft final report
  Edit and deliver final report in PDF format
  Create draft presentation slides | 0
  Deliver presentation slides | 0

Semester Total (Hours): 90
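As a quick arithmetic check, the sketch below sums the Hours column of the WBS above by phase and reproduces the 90-hour semester total. The per-task hours are copied from the table; the phase grouping and the treatment of blank estimates as zero are assumptions made for the example.

```python
# Hours column of the WBS above, grouped by phase; blank estimates treated as 0.
wbs_hours = {
    "Prep": [0.5, 1, 1],
    "Refine the Evaluation Questions": [0.5, 0.5],
    "Aim 1": [1, 2, 1, 1, 4, 8, 1, 8, 1, 4],
    "Aim 2": [1, 2, 1, 1, 2, 2, 4, 1, 4, 1, 4],
    "Aim 3": [0.5, 1, 2, 1, 1, 1, 4, 1, 4],
    "Reports and Presentations": [16, 1, 0, 0, 0, 0],
}

# Subtotal each phase, then total the semester.
for phase, hours in wbs_hours.items():
    print(f"{phase}: {sum(hours):g} hours")

print(f"Semester total: {sum(sum(h) for h in wbs_hours.values()):g} hours")  # prints 90
```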