Program Evaluation
Synthesis of Lessons Learned by Child Neglect Demonstration Projects
September 2005

Child Welfare Information Gateway
Children's Bureau/ACYF
U.S. Department of Health and Human Services
Administration for Children and Families
Administration on Children, Youth and Families
Children's Bureau
1250 Maryland Avenue, SW, Eighth Floor
Washington, DC 20024
703.385.7565 or 800.394.3366
Email: [email protected]
www.childwelfare.gov

This synthesis was made possible by the Children's Bureau, Administration on Children, Youth and Families, Administration for Children and Families, U.S. Department of Health and Human Services. The conclusions discussed here are solely the responsibility of the authors and do not represent the official views or policies of the funding agency. Publication does not in any way constitute an endorsement by the U.S. Department of Health and Human Services.

Suggested citation: Child Welfare Information Gateway. (2005). Program evaluation: Synthesis of lessons learned by child neglect demonstration projects. Washington, DC: U.S. Department of Health and Human Services.

This material may be freely reproduced and distributed. However, when doing so, please credit Child Welfare Information Gateway. Available online at www.childwelfare.gov/pubs/focus/evaldemo/index.cfm.

In 1996 and 1997, the Children's Bureau funded 10 demonstration projects to address the prevention, intervention, and treatment needs of neglected children and their families. These projects implemented and evaluated a wide variety of service strategies with large numbers of high-risk children and families. The programs varied considerably in terms of theoretical model (psychosocial or ecological), target population, location (in-home or out-of-home), duration, and intensity. The projects provided a great variety of services, including parent education and support, home visits, and referrals to other resources or services in the community. (For information about the programmatic aspects of these projects, see the companion synthesis, Child Neglect Demonstration Projects: Synthesis of Lessons Learned, published by Child Welfare Information Gateway.) Throughout the course of their 5-year projects, grantees faced a number of common challenges in combining service delivery with a rigorous program evaluation methodology. Despite these challenges, several of the programs conducted very thorough evaluations, and all reported positive outcomes.

In recent years, there has been growing interest in the evaluation of social service programs. Funders and other stakeholders want evidence that programs are achieving their intended outcomes. Practitioners want to develop more effective services by documenting interventions and measuring results, and they want to establish stronger connections between services and outcomes. This synthesis therefore summarizes the grantees' challenges, strategies, and lessons learned regarding program evaluation, so that these lessons may help future projects develop and implement effective evaluation plans. While some of the lessons learned will be most useful to other programs addressing child neglect, many are applicable to a broader range of social service programs. Contact information for each program discussed, along with information about evaluation designs, instruments, and outcomes, is included in the appendices for readers interested in learning more about individual projects.

Challenges and Successful Strategies for Evaluation Design and Methodology

All 10 grantees encountered challenges related to design and methodology. These challenges, and the strategies grantees reported to be successful in addressing them, are summarized below. They relate specifically to the processes of:

• Selecting instruments
• Establishing comparison groups
• Collecting data
• Analyzing the impact of individualized services

Selecting Instruments

All 10 programs used standardized instruments, and many used the same ones. The table in Appendix B shows which instruments each program used. The most commonly used were:

• Child Well-Being Scales (6 programs)
• Adult Adolescent Parenting Inventory (4)
• Child Behavior Checklist (4)
• Home Observation for Measurement of the Environment (4)
• Family Support Scale (2)
• Maternal Social Support Index (2)
• Parenting Stress Index (2)

Grantees did not recommend using standardized instruments alone to evaluate these programs. One grantee felt that these commonly used standardized instruments were designed for individual therapy sessions or for populations with less pervasive problems and therefore did not work well for measuring outcomes with the grantee's target population. The majority of projects found the Child Well-Being Scales to be the most appropriate and useful instruments, but one program found that these scales were not sensitive enough to detect risk and were more appropriate for higher functioning clients.¹ Another grantee reported that its interventions simply did not address some of the domains that the standardized instruments assessed and that, as a result, these instruments would not detect change.

Administering the instruments also proved challenging. One program reported that participants' low literacy levels, test anxiety, absenteeism, and hostility made it difficult to obtain valid and reliable pre- and posttest scores. Another grantee reported that in-home assessments were difficult to administer (these were not CPS-mandated cases, so home access was voluntary). Some participants were afraid to allow clinicians into their homes due to substandard living conditions, overcrowding, domestic violence, or drug dealing; those who did allow home visits were self-selected and may have had fewer of these concerns. Many young mothers lived with family members and were not free to invite clinicians into the home without the owner's permission. When access was possible, some grantees' staff also encountered scheduling and safety issues.

Some of the strategies suggested by grantees to address these challenges include:

• Modify existing instruments to meet your needs. For example, recommendations for use of the Child Well-Being Scales included clustering items to match program objectives and adding a seriousness scale to make the findings more interpretable.²

• Develop your own instrument. The Parent Empowerment Program believed its own instrument, the Personal Goal Achievement Measure (PGAM), assessed outcomes more accurately and meaningfully than the available standardized instruments. When program participants and program staff worked together to develop this instrument, participants were empowered by selecting their own goals and working toward meeting established criteria for successful achievement.³

• Use multiple strategies to collect data. All programs collected qualitative data to help them interpret and clarify the results of the standardized instruments. These data collection tools and techniques included:

  » Referral and discharge forms
  » Mentors' pre- and post-intervention surveys, weekly reports, and in-service meeting notes
  » Surveys of presenting problems (types and severity), focus groups, life history research, and satisfaction surveys
  » Staff notes, service documentation sheets, family case logs, activities notes, and meeting minutes
  » School interviews
  » CPS verification
  » Exit interviews with families and staff

¹ Magura, S., and Moses, B. S. (1987). Outcome measures for child welfare services: The Child Well-Being Scales and rating form. Washington, DC: Child Welfare League of America. Available online at www.cwla.org/pubs/pubdetails.asp?PUBID=3062.
² There is a risk, in adapting standardized tests, of nullifying the reliability and validity of the instruments. However, creating or modifying a risk assessment instrument to use in the interpretation of the results can have merit.
³ Data from site-specific instruments should be interpreted with caution. If reliability and validity have not been established, the ability to generalize their results is limited.

Establishing Comparison Groups

Six of the 10 projects reported program evaluation challenges related to the establishment of treatment and comparison groups. A number of factors affecting the use of comparison groups may have compromised evaluation results, including voluntary participation,

…have reached and benefited more families, resulting in better outcomes overall.

Strategies used or suggested by grantees include:

• Use service enhancements. Several programs avoided having to deny services to families in their comparison groups by providing core services to those families and adding service enhancements (such as a mentoring component) or longer-term services (such as 9 months rather than 3 months) for treatment group families.

• Establish and communicate clear participation criteria. Several grantees recommended establishing clear eligibility criteria for entry into the program to screen out families whose needs were greater than the program was designed to address. It was also recommended that care be taken during the process of assigning families to treatment and comparison groups to ensure that eventual outcomes will not