Evaluating Indigenous programs: a toolkit for change
Sara Hudson
Research Report | June 2017

National Library of Australia Cataloguing-in-Publication Data:
Creator: Hudson, Sara, 1974- author.
Title: Evaluating Indigenous programs: a toolkit for change / Sara Hudson.
ISBN: 9781922184870 (paperback)
Series: CIS research report ; RR28.
Subjects: Aboriginal Australians--Services for--Evaluation. Community development--Australia--Evaluation. Aboriginal Australians--Government policy.

Evaluating Indigenous programs: a toolkit for change
Sara Hudson
with contributions from Carlos Andres Monteverde Salvatierra and Eva Christensen
Research Report 28

Related CIS publications
Research Report RR18: Sara Hudson, Mapping the Indigenous Program and Funding Maze (2016).
Policy Monograph PM105: Sara Hudson, Closing the Accountability Gap: the first step towards better Indigenous health (2009).

Contents
Executive Summary
Introduction
The case for reform
Why evaluate?
Analysis of program evaluations
Analysing the evaluations: A hierarchy of evidence
   Productivity Commission's criteria for evidence of 'what works'
   Our criteria for evaluating the 'evaluations'
   Lessons to be learnt
   Examples of successful practices
Discussion and conclusion
Recommendations
Appendix A:
   Evaluation of CBA Programs
   Evaluation of SROI Programs
   Summary of Evaluations/Case-studies/Audits
Appendix B: Evaluation Toolkit
Appendix C: List of Tobacco cessation programs
Endnotes

ACKNOWLEDGEMENTS
This research report has been assisted by comments and suggestions from two anonymous external reviewers; my colleagues Simon Cowan, Michael Potter, Charles Jacobs, Heidi Kiekebosch-Fitt and Gary Banks; and participants who took part in a CIS roundtable on growing the evidence base for effective social programs. I am also grateful to Karla Pincott, who edited the report, and Ryan Acosta, who designed and laid out the report. All remaining errors are my own.
The Prosperity Project is supported by […]

Executive Summary

Previous CIS research indicated that the lack of evaluation of Indigenous programs is a significant problem. Of the 1082 Indigenous programs identified, only 88 (8%) had been evaluated.1

Following the release of that research and a Productivity Commission report that also called for more rigorous evaluation of Indigenous programs, the federal government announced it would allocate $40 million over four years to strengthen the evaluation of Indigenous programs and provide $50 million for research into Indigenous policy and its implementation.

However, given the average cost of an evaluation is $382,000, the extra $10 million a year for Indigenous program evaluations will not go far. To make the most of this additional funding, the government must change the way it evaluates and monitors programs.

Although formal evaluations for large government programs are important, evaluation need not involve contractors. Government must adopt a learning and developmental approach that embeds evaluation into a program's design as part of a continuous quality improvement process.

It is not enough just to evaluate. Government must use the findings from evaluations to improve service delivery. Unfortunately, many government agencies ignore evaluations when making funding decisions or implementing new programs. A recent audit of the NSW Evaluation Strategy found the NSW Treasury and NSW Department of Premier and Cabinet were not using evaluation outcomes to inform and improve practices.

Analysis of 49 Indigenous program evaluation reports found only three used rigorous methodology, and none used what is considered the 'gold standard' of evidence: Randomised Control Trials (RCTs). Overall, the evaluations were characterised by a lack of data and the absence of a control group, as well as an over-reliance on anecdotal evidence.

Particular features of robust evaluations include:
• A mixed method design, which involves triangulation of qualitative and quantitative data and some economic component of the program, such as a cost-effectiveness analysis or meta-analysis
• Local input into design and implementation of the program to ensure program objectives match community needs
• Clear and measurable objectives
• Pre and post program data to measure impact (see the illustrative sketch below)
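The budget arithmetic above and the last two dot points can be made concrete with a minimal sketch; the notation here is illustrative only and is not drawn from the report. The announced $40 million over four years stretches to roughly

\[
\frac{\$40{,}000{,}000 \,/\, 4\ \text{years}}{\$382{,}000\ \text{per evaluation}} \approx 26\ \text{evaluations per year},
\]

and pre and post program data, combined with a control group, support a simple difference-in-differences style comparison,

\[
\text{estimated impact} \;\approx\; \left(\bar{Y}^{\text{post}}_{\text{program}} - \bar{Y}^{\text{pre}}_{\text{program}}\right) - \left(\bar{Y}^{\text{post}}_{\text{control}} - \bar{Y}^{\text{pre}}_{\text{control}}\right),
\]

where \(\bar{Y}\) denotes the average outcome of interest for each group before and after the program. Without pre-program data the first bracket cannot be calculated, and without a control group the second bracket is unavailable, so any observed change cannot be separated from background trends.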
Adopting a co-accountability approach to evaluation will ensure that both the government agency funding the program, and the program provider delivering the program, are held accountable for results. An overarching evaluation framework could assist with the different levels of outcomes expected over the life of the program and the various indicators needed at each level to measure whether the program is meeting its objectives. Feedback loops and a process to escalate any concerns will help to ensure government and program providers monitor one another and program learnings are shared.

Suggestions for policy makers and program funders include:
• Embedding evaluation into program design and practice — evaluation should not be viewed as an 'add on' but should be built into a program's design and presented as part of a continuous quality improvement process, with funding for self-evaluation provided to organisations.
• Developing an evidence base through an accountability framework with regular feedback loops via an online data management system — to ensure the data being collected is used to inform practice and improve program outcomes, and that there is a process for escalating concerns.

Suggestions for program providers include:
• Embedding evaluation into program practice — evaluation should not be viewed as a negative process, but as an opportunity to learn.
• Developing an evidence base through the regular collection of data via an online data management system — not only to provide a stronger evidence base for recurrent funding, but also to improve service delivery and ensure client satisfaction with the program.

Introduction

The first CIS report in this series, 'Mapping the Indigenous Program and Funding Maze', provided quantitative evidence of the lack of evaluation of Indigenous programs. Of the 1082 Indigenous programs identified in our research, only 88 (8%) had been evaluated.2 This finding was corroborated by the Productivity Commission's 2016 Overcoming Indigenous Disadvantage Report, which found only 24 Indigenous programs had been rigorously evaluated and that there was a "pressing need for more and better evaluation of Indigenous policies and programs nationally if we are to see improvements in outcomes for Aboriginal and Torres Strait Islander Australians."3

Following the release of these reports, the federal government announced it would be allocating $4.5 million in the next financial year to a number of key evaluations of Indigenous programs, including an evaluation of the Community Development Programme (CDP) and RCTs to assess the impact of the Prisoner Throughcare Programme in the Northern Territory and the School Enrolment and Attendance Measure Programme. In […]

[…] AusTender procurement contracts found the average cost of an evaluation is $382,000.5 At this price, the additional $10 million will be enough for only 26 more evaluations of Indigenous programs per year.

The Australian government has for some time been aware of the lack of evidence on the effectiveness of Indigenous programs. However, the challenge is transitioning from awareness to action that will address the knowledge gap. For years, government has claimed to be focused on delivering evidence-based policy, but if this is to become more than just empty rhetoric, government needs to urgently change the way programs and services are funded and delivered.

Although broad scale changes to the service system are probably needed, the focus of this report is how best to measure the effectiveness of current Indigenous programs and then how to use that evidence to improve program design and implementation. Once more evidence is collected, the government will have a much better understanding