Program Evaluation

Synthesis of Lessons Learned by Child Neglect Demonstration Projects

September 2005

Child Welfare Information Gateway
Children’s Bureau/ACYF
1250 Maryland Avenue, SW
Eighth Floor
Washington, DC 20024
703.385.7565 or 800.394.3366
Email: [email protected]
www.childwelfare.gov

U.S. Department of Health and Human Services
Administration for Children and Families
Administration on Children, Youth and Families
Children’s Bureau

This synthesis was made possible by the Children’s Bureau, Administration on Children, Youth and Families, Administration for Children and Families, U.S. Department of Health and Human Services. The conclusions discussed here are solely the responsibility of the authors and do not represent the official views or policies of the funding agency. Publication does not in any way constitute an endorsement by the U.S. Department of Health and Human Services.

Suggested citation: Child Welfare Information Gateway. (2005). Program evaluation: Synthesis of lessons learned by child neglect demonstration projects. Washington, DC: U.S. Department of Health and Human Services.

In 1996 and 1997, the Children’s Bureau funded 10 demonstration projects to address the prevention, intervention, and treatment needs of neglected children and their families. These projects implemented and evaluated a wide variety of service strategies with large numbers of high-risk children and families. The programs varied considerably in terms of theoretical model (psychosocial or ecological), target population, location (in-home or out-of-home), duration, and intensity. The projects provided a great variety of services, including parent education and support, home visits, and referrals to other resources or services in the community. (For information about the programmatic aspects of these projects, see the companion synthesis, Child Neglect Demonstration Projects: Synthesis of Lessons Learned, published by Child Welfare Information Gateway.) Throughout the course of their 5-year projects, grantees faced a number of common challenges in combining service delivery with a rigorous program evaluation methodology. Despite these challenges, several of the programs conducted very thorough evaluations, and they all reported positive outcomes.

In recent years, there has been a growing interest in the evaluation of social service programs. Funders and other stakeholders want evidence that programs are achieving their intended outcomes. Practitioners want to develop more effective services by documenting interventions and measuring results, and they want to establish stronger connections between services and outcomes. Therefore, this program evaluation synthesis summarizes the grantees’ challenges, strategies, and lessons learned regarding program evaluation, so that these lessons may help future projects develop and implement effective evaluation plans. While some of the lessons learned will be most useful to other programs addressing child neglect, many are applicable to a broader range of social service programs. Contact information for each program discussed, and information about evaluation designs, instruments, and outcomes, are included in the appendices for readers interested in learning more about individual projects.

Challenges and Successful Strategies for Evaluation Design and Methodology

All 10 grantees encountered challenges related to design and methodology. These challenges, and the strategies grantees reported to be successful in addressing them, are summarized below. They relate specifically to the processes of:

• Selecting instruments
• Establishing comparison groups
• Collecting data
• Analyzing the impact of individualized services

Selecting Instruments

All 10 programs used standardized instruments, and many used the same ones. The table in Appendix B shows which instruments each program used. The most commonly used were:

• Child Well-Being Scales (6 programs)
• Adult-Adolescent Parenting Inventory (4)
• Child Behavior Checklist (4)
• Home Observation for Measurement of the Environment (4)
• Family Support Scale (2)
• Maternal Social Support Index (2)
• Parenting Stress Index (2)


Grantees did not recommend using standardized instruments alone for the evaluation of these programs. One grantee felt that these commonly used standardized instruments were designed for individual therapy sessions or for populations with less pervasive problems and therefore did not work well for measuring outcomes with the grantee’s target population. The majority of projects found the Child Well-Being Scales to be the most appropriate and useful instruments, but one program found that these scales were not sensitive to detecting risk and were more appropriate for higher functioning clients.¹ Another grantee reported that its interventions simply did not address some of the domains that the standardized instruments assessed and, thus, these instruments would not detect change.

Administering the instruments also proved challenging. One program reported that participants’ low literacy levels, test anxiety, absenteeism, and hostility made it difficult to obtain valid and reliable pre- and posttest scores. Another grantee reported that in-home assessments were difficult to administer (these were not CPS-mandated cases, so home access was voluntary). Some participants were afraid to allow clinicians into their homes due to substandard living conditions, overcrowding, domestic violence, or drug dealing. Those who did allow home visits were self-selected and may have had fewer of these concerns. Many young mothers lived with family members and were not free to invite clinicians into the home without the owner’s permission. When access was possible, some grantees’ staff also encountered scheduling and safety issues.

Some of the strategies suggested by grantees to address these challenges include:

• Modify existing instruments to meet your needs. For example, recommendations for use of the Child Well-Being Scales included clustering items to match program objectives and adding a seriousness scale to make the findings more interpretable.²

• Develop your own instrument. The Parent Empowerment Program believed its own instrument, the Personal Goal Achievement Measure (PGAM), assessed outcomes more accurately and meaningfully than the available standardized instruments. When the program participants and program staff worked together to develop this instrument, participants were empowered by selecting their own goals and working toward meeting established criteria for successful achievement.³

• Use multiple strategies to collect data. All programs collected qualitative data to help them interpret and clarify the results of the standardized instruments. These data collection tools and techniques included:

  » Referral and discharge forms
  » Mentors’ pre- and post-intervention surveys, weekly reports, and in-service meeting notes
  » Surveys of presenting problems (types and severity), focus groups, life history research, and satisfaction surveys
  » Staff notes, service documentation sheets, family case logs, activities notes, meeting minutes
  » School interviews
  » CPS verification
  » Exit interviews with families and staff

¹ Magura, S., and Moses, B. S. (1987). Outcome measures for child welfare services: The Child Well-Being Scales and rating form. Washington, DC: Child Welfare League of America. Available online at www.cwla.org/pubs/pubdetails.asp?PUBID=3062.
² There is a risk, in adapting standardized tests, of nullifying the reliability and validity of the instruments. However, creating or modifying a risk assessment instrument to use in the interpretation of the results can have merit.
³ Data from site-specific instruments should be interpreted with caution. If reliability and validity have not been established, the ability to generalize their results is limited.


Establishing Comparison Groups

Six of the 10 projects reported program evaluation challenges related to the establishment of treatment and comparison groups. A number of factors affecting the use of comparison groups may have compromised evaluation results, including voluntary participation, statutory requirements that made a “no-treatment” group impossible in at least one site, inappropriate referrals from community partners, difficulties achieving random assignment to groups, and unintended variations in service intensity.

In one program, only a few families were able to successfully engage and graduate, so it was not possible to compare treatment and comparison group service outcomes as planned. At least one grantee felt that the process of assigning some families to a comparison group and the fact that some families could not receive full services competed with the program’s goals. While not denying the value of experimental design for determining program effectiveness, they noted that without this constraint their services would have reached and benefited more families, resulting in better outcomes overall.

Strategies used or suggested by grantees include:

• Use service enhancements. Several programs avoided having to deny services to families in their comparison groups by providing core services to these families and adding service enhancements (such as a mentoring component) or longer-term services (such as 9 months rather than 3 months) for treatment group families.

• Establish and communicate clear participation criteria. Several grantees recommended establishing clear eligibility criteria for entry into the program to screen out families whose needs were greater than the program was designed to address. It was also recommended that care be taken during the process of assigning families to treatment and comparison groups to ensure that eventual outcomes will not be biased because of initial differences between groups in their risk of future neglect.
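Where random assignment is feasible, one way to guard against the initial group differences described above is to randomize within baseline risk strata. The sketch below is purely illustrative and is not drawn from any grantee’s actual procedures; the family records, field names, and risk categories are hypothetical.

```python
import random
from collections import defaultdict

def assign_groups(families, seed=42):
    """Randomly assign families to treatment or comparison within baseline
    risk strata, so both groups start with a similar mix of risk levels.
    (Hypothetical illustration; not a grantee's actual procedure.)"""
    rng = random.Random(seed)  # fixed seed so the assignment can be audited and reproduced
    strata = defaultdict(list)
    for family in families:
        strata[family["risk_level"]].append(family)

    assignments = {}
    for level, members in strata.items():
        rng.shuffle(members)
        # Alternate within each stratum: roughly half treatment, half comparison.
        for i, family in enumerate(members):
            assignments[family["id"]] = "treatment" if i % 2 == 0 else "comparison"
    return assignments

# Example with hypothetical referrals:
referrals = [
    {"id": "F001", "risk_level": "high"},
    {"id": "F002", "risk_level": "moderate"},
    {"id": "F003", "risk_level": "high"},
    {"id": "F004", "risk_level": "low"},
]
print(assign_groups(referrals))
```

Stratifying on a baseline risk measure before randomizing is one common way to reduce the chance that initial differences between groups, rather than the intervention itself, drive the observed outcomes.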


Collecting Data

At least seven grantees reported difficulties related to data collection. Challenges in this area varied widely, and related to:

• Establishing baselines. Several grantees reported difficulty accurately evaluating families at intake. One program reported that this intake process took 6 weeks, often due to family crises. Families drifted in and out of the program and moved in and out of crises, so their progress could not be represented as a straight line from intake to exit. Another program indicated that intake scores were unrealistically high, probably because families were not comfortable telling a stranger how bad things were or because their chaotic lives seemed normal to them. While thorough assessments were considered essential, the intake process was often found to be burdensome for large families. It sometimes took them 4 to 6 hours to complete the numerous standardized measures and provide demographic data.

• Collecting follow-up data. Grantees found many of the families were difficult to locate or contact. Follow-up interviews with caretakers sometimes were delayed due to scheduling difficulties. One program reported that posttest data were more difficult to collect simply because clients were reluctant to exit the program.

• Involving children. Children presented unique data collection challenges, because they often were unavailable or too tired to complete the process. One program found screening preschool-aged children nearly impossible, as parents were reluctant to risk having their children labeled at an early age.

• Using self-report measures. One program reported differences in findings from observational versus self-report measures. In several instances, self-report measures identified significant depression and child behavior concerns that were not observed by student interns.

• Generating large enough samples. One program reported that use of assessment instruments by the various intake workers was inconsistent, resulting in significant gaps in data. Another grantee reported that due to project extensions to maximize sample size, the 12-month follow-up period was not reached for all families at the time of the final report, so findings for some of their evaluation questions could not be included at that time. A third program reported that their sample size was only half of the projected size. Another program reported a high attrition rate: 438 families were referred, 250 enrolled, 195 started the program, and only 87 graduated.
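To put the attrition figures just cited in context, the retention rate at each stage can be computed directly from the reported counts (438 referred, 250 enrolled, 195 started, 87 graduated). The short calculation below uses only those numbers.

```python
# Stage-by-stage retention computed from the counts reported by one grantee.
stages = [("referred", 438), ("enrolled", 250), ("started", 195), ("graduated", 87)]

for (prev_name, prev_n), (name, n) in zip(stages, stages[1:]):
    print(f"{name}: {n} of {prev_n} {prev_name} ({n / prev_n:.0%})")

referred, graduated = stages[0][1], stages[-1][1]
print(f"overall: {graduated} of {referred} referred families graduated ({graduated / referred:.0%})")
```

Only about one in five referred families ultimately graduated, which is the scale of attrition evaluators may need to anticipate when projecting sample sizes and follow-up workloads.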


Strategies used or suggested by grantees included:

• Streamline data collection. Programs recommended striving for a balance between the need to conduct an in-depth initial assessment and the need to streamline initial demographic data collection. One recommendation was to use Bavolek’s Adult-Adolescent Parenting Inventory (AAPI) as a clinical tool to initiate conversation while gathering baseline data. The AAPI is a brief agree/disagree survey designed to assess parenting attitudes. Staff reported that using this tool was an efficient and effective way to gather data while engaging the family. (See Appendix C for more information on the AAPI.) Another program found it helpful to limit the amount of information families were asked to provide during initial meetings with agency staff.

• Use multiple data collection methods. Programs recommended using many sources of qualitative and quantitative data and gathering data more often than just when families enter and exit the program. One grantee recommended getting releases from families for access to the Department’s data to track what happens to families once they exit the program.

• Offer incentives. Providing incentives, such as cash or “graduation” parties, to families for participating in baseline and follow-up interviews was recommended.

• Gather child data. Most of the programs collected data on children as well as families. One program had success assessing children while in child care during parent group meetings.

Analyzing the Impact of Individualized Services

Programs frequently customized services, provided resources tailored to individual needs, changed plans when families were in crisis, and varied the level of service to meet families’ unique needs. At least 8 of the 10 grantees agreed that providing multifaceted and flexible services was good for meeting families’ needs. However, this strategy also makes comparisons across families more difficult and makes it very challenging to describe a replicable service model.

Strategies employed by grantees to address this challenge included:

• Document service variations. Several grantees opted to closely monitor the intervention to ensure it was being implemented as planned and to document variations as they occurred (see the sketch following this list).

• Work closely with the program evaluator. Several grantees believed that having the program evaluator participate in program activities and attend staff meetings deepened their understanding and helped ensure that the evaluation evolved along with the program.
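One low-tech way to document service variations as they occur is to keep a structured service log for each family, recording what was delivered, when, at what intensity, and whether it followed the service plan. The sketch below is a hypothetical illustration rather than a tool used by these grantees; the record fields and service types are invented.

```python
import csv
from dataclasses import dataclass, asdict, fields
from datetime import date

@dataclass
class ServiceContact:
    """One row in a hypothetical service log documenting what a family received."""
    family_id: str
    contact_date: date
    service_type: str   # e.g., "home visit", "parent education", "referral"
    minutes: int        # duration, to capture variations in service intensity
    planned: bool       # False marks a deviation from the family's service plan
    note: str = ""

log = [
    ServiceContact("F001", date(2005, 3, 2), "home visit", 90, True),
    ServiceContact("F001", date(2005, 3, 9), "crisis response", 45, False,
                   "substituted for the planned parent education session"),
]

# Writing the log to a CSV file keeps a running record the evaluator can later
# summarize (for example, total minutes of each service type per family).
with open("service_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(ServiceContact)])
    writer.writeheader()
    for contact in log:
        writer.writerow(asdict(contact))
```

Even a simple log like this makes it possible to describe, after the fact, how far the services actually delivered departed from the replicable model the program set out to test.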


Lessons Learned for Evaluation Management

Several factors related to management of the evaluation process contributed to these grantees’ ability to overcome numerous challenges and produce high quality program evaluations:

• Detailed evaluation plans
• Sufficient evaluation budget and project duration
• Strategies to address staff turnover
• Strong evaluation teams
• Federal evaluation supports: grantee cluster and technical assistance

Detailed Evaluation Plans

In its fiscal year 1996 funding announcement, the Children’s Bureau made its expectations clear. Applicants were asked to provide a logic model and propose an evaluation plan for their projects, including methods, data, and criteria that would be used to evaluate the proposed project’s processes and outcomes. As a result, all of the programs established clearly defined and measurable implementation and outcome objectives as they started their program planning. These objectives, along with operational definitions and specific measures for each outcome, helped programs assess their progress accurately and objectively. (For more information about how individual programs structured their evaluations, see Appendix B. For a sample logic model, see Appendix D.)

Sufficient Evaluation Budget and Project Duration

Grantees were expected to invest 10 to 15 percent of their budgets (or approximately $15,000 to $22,500 per year) in program evaluation. This helped the programs leverage the resources needed to support an in-house evaluation or contract with an external evaluator. (See Strong Evaluation Teams, below.) Many of these evaluators were able to invest sufficient time to become very familiar with the project. In several cases the evaluator also supported graduate students who assisted in data collection and processing.

These grantees had 5 years to implement and evaluate their projects. This allowed sufficient time to develop and implement strong program evaluation plans.

Strategies to Address Staff Turnover

Eight of the 10 programs reported that addressing staff turnover was a crucial part of managing the program evaluation. Staff turnover sometimes resulted in uneven data collection and analysis. One program found that its pre- and posttest evaluation design did not work well, because high staff turnover resulted in missing discharge data and inconsistency from pre- to posttesting. Another found that clients with a consistent caseworker had higher posttest scores and showed greater improvement than clients who experienced caseworker turnover.

Several programs minimized the impact of staff turnover on the evaluation process by focusing on training. Some projects provided their staff with evaluation training and manuals to ensure consistency despite staffing changes. Others used videotape training to reduce errors. One program found that an intervention manual, weekly seminars, and supervision were helpful in keeping the evaluation on track. One program found that using the evaluation itself as a learning opportunity helped minimize the impact of turnover. They designed their evaluation to be empowering for staff so they could learn while doing their work. Project staff reported that they enjoyed and benefited from the evaluation feedback provided about the types of success their efforts were supporting.

Strong Evaluation Teams

All 10 projects assigned program evaluation responsibilities to qualified individuals or organizations. Universities operated five of the projects, and a medical center operated one. Four of these projects used their own in-house program evaluators. All but one of the remaining programs used nonuniversity outside evaluators. Several of the evaluation teams also performed or assisted with data collection and entry. In several instances, the evaluation team also assisted with providing training and developing manuals.

Most of the program evaluators stayed with the program for the full 5 years, although one of the grantee programs changed its lead evaluator three times in 5 years and two projects using graduate student data collectors experienced schedule conflicts and high turnover due to graduation. All the evaluators maintained close regular contact with the programs. Grantees reported that having evaluators with in-depth knowledge of the program and its participants, and having staff willing to implement a program with an extensive evaluation component, were key components in the evaluation’s success. In particular, one program reported that placing research staff in the agency during the delivery of services to monitor recruitment of families, random assignment to treatment groups, timely completion of data packets, and service parameters (such as length) may have improved targeting, sample size, and adherence to the intervention’s original definitions.


Federal Evaluation Supports: Grantee Cluster and Technical Assistance

From the beginning, these projects were intended to operate cooperatively as a grantee cluster. The Children’s Bureau facilitates this cooperation within clusters of discretionary grantees by providing technical assistance and by encouraging the development of common evaluation criteria, data elements, and measures to maximize comparability of evaluation findings. Grantees reported that operating as a cluster led to stronger and more credible results. Project staff were able to refine their evaluation plans and implementation processes at annual grantee meetings and by keeping in touch via a listserv. Together with their Federal Project Officer and the evaluation technical assistance provider, grantees developed a uniform final report format that emphasized implementation and outcome evaluation strategies and results and, most importantly, helped them think through the relationship between program implementation and participant outcomes.

Conclusion

Several of the grantees in this cluster conducted very thorough evaluations, and they were all able to report positive outcomes for children and families. These outcomes included reductions in child behavior problems, parenting issues, foster care placements, and CPS reports; improvements in child and family health and well-being; and increased parenting skills. (See the companion synthesis, Child Neglect Demonstration Projects: Synthesis of Lessons Learned, for further information on program outcomes.)

Along the way, however, all of the grantees experienced some challenges in implementing their program evaluation plans. The difficulties they faced in selecting instruments, staffing their evaluations, establishing comparison groups, conducting sound evaluations of individualized and flexible services, and collecting data are similar to those any program manager will likely face when conducting an evaluation in the field. Starting with a detailed evaluation plan, committing substantial funds to program evaluation, developing strategies to minimize the impact of staff turnover on evaluation implementation, forming strong evaluation teams, working together as a cluster, and receiving evaluation technical assistance all contributed to the success of these grantees. These successful strategies, and the lessons learned throughout the grant cycle, will be helpful to other programs wishing to establish effective program evaluation designs and methods.


Appendix A: Children’s Bureau Child Neglect Demonstration Projects

Family Intervention Program
Valley Youth House Committee, Inc.
531 Main Street, 2nd Floor
Bethlehem, PA 18018
Anne Adams
(610) 954-9561 extension 24

Family Network Project
Joan A. Male Family Support Center (formerly Parents Anonymous of Buffalo and Erie County)
60 Dingens Street
Buffalo, NY 14206
Joan A. Male
(716) 822-0919

Family Preservation Services for African American Families at Risk of Neglect
Portland State University
P.O. Box 751
Portland, OR 97207-0751
Kristine Nelson, D.S.W.
(503) 725-5012

Family Reclaim: A Community-based Collaborative to Strengthen Families with Substance Abuse and Neglect Issues
Family Support Services of the Bay Area
554 Grand Avenue
Oakland, CA 94610
Patricia Chambers, Ph.D.
(415) 861-4060 extension 3024

Family Support and Intervention for Neglected Preschool Children
University of Rochester
Mt. Hope Family Center
187 Edinburgh Street
Rochester, NY 14608
Jody Todd Manly, Ph.D.
(585) 275-2991
www.psych.rochester.edu/research/mhfc

Healthy Families D.C.
Mary’s Center for Maternal and Child Care, Inc.
2333 Ontario Road, NW
Washington, DC 20009
Joan Yengo
(202) 483-8196

Helping Families Prevent Child Neglect
University of Maryland, Baltimore
School of Social Work
525 West Redwood Street
Baltimore, MD 21201
Diane DePanfilis, Ph.D., M.S.W.
(410) 706-3609
www.family.umaryland.edu

Homefriends
Temple University
Center for Intergenerational Learning (CIL)
1601 North Broad Street, USB206
Philadelphia, PA 19122
Adam Brunner, Ph.D., former Program Director
Nancy Henkin, Executive Director, CIL
(215) 204-6970
www.temple.edu/cil/Homefriendshome.htm


Neglected Children in Intergenerational Kinship Care
Georgia State University
College of Health and Human Sciences
University Plaza
Atlanta, GA 30303-3083
Susan J. Kelley, Ph.D.
(404) 651-3043
www.gsu.edu/~wwwalh/index.html

Parent Empowerment Program: Neglect Prevention Education for Pregnant and Parenting Teens
Montefiore Medical Center
Child Protection Center
3314 Steuben Avenue
Bronx, NY 10467
Karel Amaranth
(718) 920-6429
www.Montekids.org/programs/cpc

To order copies of any of these projects’ final reports, contact Child Welfare Information Gateway at [email protected] or 800.394.3366.


Appendix B: Project Evaluation Information

For each program, the entries below summarize the evaluation design, desired outcomes, evaluation tools and instruments, and reported evaluation outcomes.

Family Intervention Program

Evaluation design:
• Outside evaluator
• Comparison group
• No random assignment reported
• Three standardized scales
• Pre- and postintervention
• Statistical analysis of data

Desired outcomes:
• Reduce substance abuse, reduce emotional and economic problems, and increase parenting skills
• Reduce health, academic, behavioral, social, and emotional problems in project children
• Keep family unit intact and minimize the involvement of high-risk project families with child welfare services
• Test overall efficacy of the program and compare efficacy with two groups

Evaluation tools and instruments:
• Family Stress Inventory (Orkow, 1985)
• Parenting Skills Inventory
• Family Risk Scales (seven scales modified from Magura, Moses, & Jones, 1987)

Reported evaluation outcomes:
• Substance abuse impact reduced in 50% of families where it was a problem
• Decrease in caregiver emotional problems
• Increase in parenting skills for 65% of parents
• Improved health of children
• Decreased behavior problems of children
• Some reduction in social isolation of caregivers

Family Network Project

Evaluation design:
• Outside evaluator
• Comparison group
• No random assignment
• Two standardized scales
• Intake, 6 months, 1 year, 18 months, at closing, and at 6-month follow-up
• Statistical analysis of data

Desired outcomes:
• Maintain safe housing
• Master the skills necessary to ensure appropriate activities of daily living
• Ensure adequate health care
• Master the skills necessary to ensure the psycho-emotional needs of family members
• Master the skills necessary to ensure appropriate discipline

Evaluation tools and instruments:
• Adult-Adolescent Parenting Inventory (Bavolek, 1984)
• Family Profile
• Eco-Map (Hartman, 1979)
• Intake Packets (designed for program)
• Child Well-Being Scales (Magura & Moses, 1987)

Reported evaluation outcomes:
• Families maintained adequate housing
• Families achieved adequate health care
• Caregivers developed skills to meet children’s psycho-emotional needs
• Caregivers showed improvement in using appropriate discipline

Family Preservation Services

Evaluation design:
• Outside evaluator
• Comparison group
• Some random assignment
• Seven standardized scales
• CPS data 12 months prior and 12 months postintervention; child and family data at intake, after intensive service, and after care
• Statistical analysis of data

Desired outcomes:
• Reduce out-of-home placement of target children
• Reduce re-referrals for neglect
• Prevent referral for neglect for Outreach families

Evaluation tools and instruments:
• Child Behavior Checklist (Achenbach, 1991)
• Battelle Developmental Inventory (Newborg, Stock, & Wnek, 1984)
• Child Well-Being Scales (Magura & Moses, 1987)
• Family Support Scale (Dunst, Jenkins, & Trivette, 1984)
• Resource Scale for Teen Mothers (Dunst, Leet, Vance, & Cooper, 1988)
• Strengthening Multi-Ethnic Families & Communities: A Violence Prevention Parent Training Program (Steele, Marigna, Tello, & Johnson, 1999)
• Alaska Assessment for the Risk of Continued Neglect (used at intake) (Baird, 1988)

Reported evaluation outcomes:
• Decrease in founded neglect reports and out-of-home placement
• Child well-being increased from intake to the end of the intensive phase of services
• Increases in family support and family resources

Family Reclaim

Evaluation design:
• Outside evaluator
• No comparison group
• Two standardized scales
• Baseline and closure
• No statistical analysis of data

Desired outcomes:
• Improve the quality of parenting
• Improve the quality of home life
• Improve the overall healthy development of children in areas of self-esteem and self-respect
• Provide services that are as or more effective than routine services at an acceptable cost, relative to benefit
• Determine client/family characteristics that appear to respond best to the Family Reclaim model

Evaluation tools and instruments:
• Child Well-Being Scales (Magura & Moses, 1987)
• Family Well-Being Scale (developed for program)

Reported evaluation outcomes:
• Improvement in child well-being scores
• Improvement in family functioning for a significant number of families
• 97.5% of children at risk for removal were able to remain with their families
• Improvement in children’s academic performance and school attendance
• Cost effective



Family Support and Intervention for Neglected Preschool Children

Evaluation design:
• In-house evaluators
• Comparison group
• No random assignment reported
• 14 standardized scales
• Intake, conclusion, 1-year follow-up
• Statistical analysis of data not completed yet

Desired outcomes:
• Improve functioning for children and parents
• For children: improve cognitive and pre-academic skills, language and communication skills, gross and fine motor skills, and socioemotional development
• For families: improve skills in parenting, coping with stress, development of social networks, knowledge of appropriate developmental expectations, effective behavior management, positive interactions with children, lower rates of CPS reports, reductions in stress, and improvements in social supports

Evaluation tools and instruments:
• Demographics Interview (Carlson & Cicchetti, 1979)
• Home Observation for Measurement of the Environment (preschool version, Caldwell & Bradley, 1984)
• Parenting Dimensions Inventory (Slater & Power, 1987)
• Adult-Adolescent Parenting Inventory (Bavolek, 1984)
• Childhood Trauma Questionnaire (Bernstein, Fink, Handelsman, & Foote, 1994)
• Interpersonal Support Evaluation List (Cohen & Hoberman, 1983)
• Perceived Stress Scale (Cohen, Kamarck, & Mermelstein, 1983)
• Daily Hassles Scale of Parenting Events (Crnic & Greenberg, 1990)
• Semi-Structured Free Play (Beeghly & Cicchetti, 1994)
• Wechsler Preschool and Primary Scale of Intelligence-Revised (Wechsler, 1989)
• Child Behavior Checklist (Achenbach, 1991)
• Developmental Indicators for the Assessment of Learning-Revised (Mardell-Czudnowski & Goldberg, 1983)
• Preschool Symptom Self-Report (Martini, Strayhorn, & Puig-Antich, 1989)
• Maltreatment Classification and Rating System (Barnett, Manly, & Cicchetti, 1993)

Reported evaluation outcomes:
• 94% of families made progress on treatment goals
• 99% of children achieved at least one developmental goal
• Improved parenting skills and increased social support for caregivers
• Increased knowledge of child development and positive behavior management
• Children’s developmental adaptation exceeded that of control group

Healthy Families D.C.

Evaluation design:
• A series of three different outside evaluators
• No comparison group
• Eight standardized scales
• Baseline and at developmental intervals
• No statistical analysis of data

Desired outcomes:
• Promote optimal birth outcomes, child health, child development, and school readiness
• Foster positive parenting and successful parent-child interaction
• Promote optimal family functioning and life outcomes
• Prevent child abuse and neglect

Evaluation tools and instruments:
• Knowledge of Infant Development Inventory, Short Form (MacPhee, 1981)
• Home Observation for Measurement of the Environment (Caldwell & Bradley, 1984)
• Home Screening Questionnaire (Coons, Gay, Fandal, Ker, & Frankenburg, 1981)
• Adult-Adolescent Parenting Inventory (Bavolek, 1984)
• Maternal Social Support Index (Pascoe, Ialongo, Horn, Reinhart, & Perradatto, 1988)
• Carolina Parent Support Scale (Bristol, 1983)
• Maternal Health Beliefs Questionnaire (based on Hochbaum, 1958, & Rosenstock, 1974)
• Participant Satisfaction Survey (developed for program)

Reported evaluation outcomes: met objectives with regard to:
• Healthy birth weights
• Immunizations and well-care visits
• Developmental screenings
• Progress toward self-sufficiency goals
• No cases of CA/N

Helping Families Prevent Child Neglect

Evaluation design:
• In-house evaluators
• Random assignment to four different intervention groups
• 12 standardized scales
• Baseline, case closure, and 6-month follow-up
• Statistical analysis of data

Desired outcomes:
• Decrease risk factors for neglect: caregiver depressive symptoms, alcohol use, drug use, functioning, everyday stress, and parent stress
• Increase protective factors: attitudes toward change, parenting attitudes, parenting competence, social support, and family functioning
• Increase child safety: reduce child maltreatment, meet basic needs, improve quality of home environment
• Improve child well-being: improve behavior and functioning

Evaluation tools and instruments:
• Center for Epidemiologic Studies-Depressed Mood Scale (Radloff, 1977)
• CAGE Questionnaire (Ewing, 1984)
• NIMH Diagnostic Interview Schedule Version III (DIS-III-R) (Robins, Helzer, Cottler, & Goldring, 1989)
• Family Risk Scales (Magura, Moses, & Jones, 1987)
• Child Well-Being Subscale (Magura & Moses, 1987)
• Everyday Stressors Index (Hall, Williams, & Greenberg, 1985)
• Parenting Stress Index-Short Form (Abidin, 1995)
• Child Well-Being Scales (Magura & Moses, 1987)
• Adult-Adolescent Parenting Inventory (Bavolek, 1984)
• Parenting Sense of Competence Scale (Gibaud-Wallston & Wandersman, 1978)
• Social Provisions Scale (Russell & Cutrona, 1984)
• Self-Report Family Inventory (Beavers, Hampson, & Hulgus, 1985)
• Home Observation for Measurement of the Environment (Caldwell & Bradley, 1984)
• DHR CIS database
• Child Behavior Checklist (Achenbach, 1991)

Reported evaluation outcomes:
• Reduced caregiver depressive symptoms, drug use, life stress, and parenting stress
• Increased appropriate parenting attitudes, satisfaction with parenting, and perceived social support
• Fewer CPS reports on participants following than prior to intervention
• Enhanced physical and psychological care of children
• Decreased caregiver perceptions of child behavior problems



Homefriends

Evaluation design:
• Outside evaluator
• Random assignment to comparison and treatment groups
• Five standardized scales
• Baseline and at 9 months
• Statistical analysis of data

Desired outcomes:
• Improved caregiver parenting skills
• Increased caregiver knowledge and access to community resources
• Decreased social isolation
• Decreased parent stress
• Improved caregiver attitudes and response to children with disabilities/chronic illness

Evaluation tools and instruments:
• Parenting Stress Index (Abidin, 1995)
• Social Support Network Inventory (Flaherty, 1983)
• Index of knowledge and use of community resources (developed for this program)
• Child Well-Being Scales (Magura & Moses, 1987)
• Parent-Child Dysfunction Scale of the Parenting Stress Index (Abidin, 1995)

Reported evaluation outcomes:
• No families in the intervention group had a child placed in foster care
• Some improvement found in parental teaching and stimulation of children
• Parents experienced an improvement in their feelings and perceptions of themselves as parents

Neglected Children in Intergenerational Kinship Care

Evaluation design:
• Outside evaluator
• No comparison group
• 10 standardized scales
• Intake, 1-year exit, and 6-month follow-up
• Statistical analysis of data

Desired outcomes:
• Identify the negative effects of prior neglect and provide resources tailored to children’s needs
• Prevent subsequent neglect
• Decrease grandparents’ social isolation
• Maximize quality of life for grandparent caregivers

Evaluation tools and instruments:
• Child Behavior Checklist (Achenbach, 1991)
• Teacher’s Report Form (Achenbach, 1991)
• Child Well-Being Scale (Magura & Moses, 1987)
• Denver II (Frankenburg, Dodds, Archer, Bresnick, Maschka, et al., 1992)
• Grandparent Interview (developed for this program)
• Child Neglect Index (Trocmé, 1996)
• Home Observation for Measurement of the Environment (Caldwell & Bradley, 1984)
• Family Resource Scale (Dunst & Leet, 1987)
• Family Empowerment Scale (Koren, DeChillo, & Friesen, 1992)
• Family Support Scale (Dunst, Jenkins, & Trivette, 1984)
• Health Risk Appraisal (Hutchins, 1991)
• SF-36 (Ware, 1992)
• Brief Symptom Inventory (Derogatis, 1993)

Reported evaluation outcomes:
• Decreased child behavior problems
• Reduced risk for child neglect
• Improved caregiver health
• Caregiver empowerment
• Increase in caregiver social support
• Decrease in caregiver stress

Parent Empowerment Program

Evaluation design:
• In-house evaluator
• No comparison group
• Five standardized scales
• At enrollment and at graduation
• No statistical analysis of data reported

Desired outcomes:
• Empower parents to create nurturing homes for their children
• Improve participants’ self-esteem and trust in their parenting skills
• Reduce neglect and abuse
• Increase maternal social support
• Decrease child abuse potential
• Increase knowledge of infant development
• Improve home environment safety

Evaluation tools and instruments:
• Child Abuse Potential Inventory (Milner, 1994)
• Knowledge Inventory of Child Development and Behavior (Fulton, 1995)
• Maternal Social Support Index (Pascoe, Ialongo, Horn, Reinhart, & Perradatto, 1988)
• Personal Goal Achievement Measure (developed for program)
• Child Well-Being Check List (CWBCL)

Reported evaluation outcomes:
• Slight increase in child well-being scores
• Slight increase in knowledge of infant development
• Slight downward trend in child abuse potential
• Significant percentage of family-identified goals partially achieved or achieved


Appendix C: References for Evaluation Tools and Instruments

Abidin, R. R. (1995). Parenting Stress Index (3rd ed.). Lutz, FL: Psychological Assessment Resources, Inc. Available online at http://www3.parinc.com/products/product.aspx?Productid=PSI

Achenbach, T. M. (1991). Integrative guide for the 1991 CBCL/4-18, YSR, and TRF profiles. Burlington, VT: University of Vermont, Department of Psychiatry.

Baird, C. (1988). Development of risk assessment indices for the Alaska Department of Health and Social Services. In T. Tatara (Ed.), Validation research in CPS risk assessment: Summary of highlights. Washington, DC: American Public Welfare Association.

Barnett, D., Manly, J. T., & Cicchetti, D. (1993). Defining child maltreatment: The interface between policy and research. In D. Cicchetti and S. L. Toth (Eds.), Child abuse, child development, and social policy (pp. 7-73). Norwood, NJ: Ablex.

Bavolek, S. (1984). Adult-Adolescent Parenting Inventory. Schaumburg, IL: Family Development Associates. Available online at http://www.nurturingparenting.com/aapi/

Beavers, W. R., Hampson, R. B., & Hulgus, Y. F. (1985). The Beavers systems approach to family assessment. Family Process, 24, 398-405.

Beeghly, M., & Cicchetti, D. (1994). Child maltreatment, attachment, and the self system: Emergence of an internal state lexicon in toddlers at high social risk. Development and Psychopathology, 6(1), 5-30.

Bernstein, D. P., Fink, L., Handelsman, L., & Foote, J. (1994). Initial reliability and validity of a new retrospective measure of child abuse and neglect. American Journal of Psychiatry, 151(8), 1132-1136.

Bristol, N. N. (1983). Carolina Parent Support Scale. Unpublished manuscript. Chapel Hill: University of North Carolina, The Frank Porter Graham Child Development Center.

Caldwell, B. M., & Bradley, R. H. (1984). Home Observation for Measurement of the Environment. Little Rock: University of Arkansas at Little Rock.

Carlson, V., & Cicchetti, D. (1979). Demographics Interview. Unpublished paper. Cambridge, MA: Harvard University.


Cohen, S., & Hoberman, H. M. (1983). Positive events and social supports as buffers of life change stress. Journal of Applied Social , 13, 99-125.

Cohen, S., Kamarck, T., & Mermelstein, R. (1983). A global measure of perceived stress. Journal of Health and Social Behavior, 24, 385-396.

Coons, C. E., Gay, E. C., Fandal, A. W., Ker, C., & Frankenburg, W. K. (1981). The Home Screening Questionnaire Reference Manual. Denver, CO: JFK Child Development Center.

Crnic, K. A., & Greenberg, M. T. (1990). Minor parenting stresses with young children. Child Development, 61, 1628-1637.

Derogatis, L. R. (1993). Brief Symptom Inventory. Bloomington, MN: Pearson Assessments. Available online at http://www.pearsonassessments.com/tests/bsi.htm

Dunst, C. J., Jenkins, V., & Trivette, C. M. (1984). Family Support Scale: reliability and validity. Journal of Individual, Family, and Community Wellness, 1, 45-52.

Dunst, C. J., & Leet, H. E. (1987). Measuring the adequacy of resources in households with young children. Child: Care, Health and Development, 13, 111-125.

Dunst, C. J., Leet, H. E., Vance, S. D., & Cooper, C. S. (1988). In C. J. Dunst, C. M. Trivette, and A. G. Deal (Eds.), Enabling and empowering families: Principles and guidelines for practice (pp. 147-148). Cambridge, MA: Brookline Books.

Ewing, J. A. (1984). Detecting alcoholism: The CAGE Questionnaire. Journal of the American Medical Association, 252, 1905-1907.

Flaherty, J. A., Gaviria, F. M., & Pathak, D. S. (1983). The measurement of social support: the Social Support Network Inventory. Comprehensive Psychiatry, 24(6), 521-529.

Frankenburg, W. K., Dodds, J., Archer, P., Bresnick, B., Maschka, P., Edelman, N., et al. (1992). The DENVER II: Denver Developmental Screening Test. Denver, CO: Denver Developmental Materials, Inc.

Fulton, A. M. (1995). Knowledge Inventory of Development and Behavior: Infancy to School-age. Unpublished manuscript, Oklahoma State University at Stillwater.

Gibaud-Wallston, J., & Wandersman, L. (1978). Development and utility of the Parenting Sense of Competence Scale. Paper presented at the meeting of the American Psychological Association, Toronto.


Hall, L. A., Williams, C. A., & Greenberg, R. S. (1985). Supports, stressors, and depressive symptoms in low-income mothers of young children. American Journal of Public Health, 75, 518-522.

Hartman, A. (1979). Finding families: An ecological approach to family assessment in adoption. Beverly Hills, CA: Sage Publications, Inc.

Hochbaum, G. M. (1958). Public participation in medical screening programs: A socio- psychological study. PHS publication no. 572. Washington, D.C.: U.S. Government Printing Office.

Hutchins, E. B. (Ed.). (1991). Healthier people. Health risk appraisal program. Atlanta, GA: Carter Center of Emory University.

Koren, P. E., DeChillo, N., & Friesen, B. (1992). Measuring empowerment in families whose children have emotional disabilities: A brief questionnaire. Rehabilitation Psychology, 37(4), 305-321.

MacPhee, D. (1981). Manual: Knowledge of Infant Development Inventory. Unpublished manuscript. Chapel Hill: University of North Carolina.

Magura, S., & Moses, B. S. (1987). Outcome measures for child welfare services: Child Well-Being Scales and rating form. Washington, DC: Child Welfare League of America.

Magura, S., Moses, B. S., & Jones, M. A. (1987). Assessing risk and measuring change in families: The Family Risk Scales. Washington, DC: Child Welfare League of America.

Mardell-Czudnowski, C., & Goldberg, D. S. (1983). Developmental Indicators for the Assessment of Learning—Revised. Circle Pines, MN: American Guidance Service.

Martini, D. R., Strayhorn, J. M., & Puig-Antich, J. (1990). A symptom self-report measure for preschool children. Journal of the American Academy of Child and Adolescent Psychiatry, 29(4), 594-600.

Milner, J. S. (1994). Assessing physical child-abuse risk: The Child Abuse Potential Inventory. Clinical Psychology Review, 14(6), 547-583.

Newborg, J., Stock, J. R., & Wnek, L. (1984). Battelle Developmental Inventory. Itasca, IL: Riverside Publishing.

Orkow, B. (1985). Implementation of a family stress checklist. Child Abuse and Neglect, 9, 405-410.


Pascoe, J. M., Ialongo, N. S., Horn, W. F., Reinhart, M. A., & Perradatto, D. (1988). The reliability and validity of the Maternal Social Support Index. Family Medicine, 20(4), 271-276.

Radloff, L. S. (1977). The CES-D Scale: A self-report depression scale for research in the general population. Applied Psychological Measurement, 1, 385-401.

Robins, L., Helzer, J., Cottler, L., & Goldring, E. (1989). NIMH Diagnostic Interview Schedule– Version III Revised (DIS-III-R). Bethesda, MD: National Institute of Mental Health.

Rosenstock, I. M. (1974). Historical origins of the health belief model. Health Education Monographs, 2, 328-335.

Russell, D., & Cutrona, C. (1984). The provision of social relations and adaptation to stress. Paper presented at the meeting of the American Psychological Association, Toronto.

Slater, M. A., & Power, T. G. (1987). Multidimensional assessment of parenting in single parent families. In J. P. Vincent (Ed.), Advances in family intervention, assessment and theory (Vol. 4, pp. 197-228). Greenwich, CT: JAI.

Steele, M. L., Marigna, M. K., Tello, J., & Johnson, R. F. (1999). Strengthening multi-ethnic families and communities: A violence prevention parent training program. Los Angeles, CA: Consulting and Clinical Services.

Trocmé, N. (1996). Development and evaluation of a Child Neglect Index. Child Maltreatment, 1(2), 145-155.

Ware, J. E. (1992). The SF-36. Boston, MA: The Health Institute, New England Medical Center.

Wechsler, D. (1989). Wechsler Preschool and Primary Scale of Intelligence—Revised. San Antonio, TX: PsychCorp.


Appendix D: Sample Logic Model

Inputs (Resources):
• Office on Child Abuse and Neglect (HHS)
• Annie E. Casey
• DHR
• Title IV-E

Activities (Services): Multi-Faceted Home Based Intervention
• Emergency Services
• Family Assessment
• Outcome-Driven, Tailored Services

Short-Term Outcomes (Intermediate Outcomes):
• Increase Protective Factors
• Decrease Risk Factors

Long-Term Outcomes (Longer-Term Benefits):
• Child Safety
• Child Well-Being

DePanfilis, D. Family Connections: Program and research. USDHHS, Children’s Bureau sponsored conference, Evidence Based Practice, Washington, DC, June 29, 2004

This material may be freely reproduced and distributed. However, when doing so, please credit Child Welfare Information Gateway. Available online at www.childwelfare.gov/pubs/focus/evaldemo/index.cfm.