Evaluating Indigenous programs: a toolkit for change

Sara Hudson

Research Report | June 2017

National Library of Australia Cataloguing-in-Publication Data:

Creator: Hudson, Sara, 1974- author.

Title: Evaluating Indigenous programs: a toolkit for change / Sara Hudson.

ISBN: 9781922184870 (paperback)

Series: CIS research report ; RR28.

Subjects: Aboriginal Australians--Services for--Evaluation.

Community development--Australia--Evaluation.

Aboriginal Australians--Government policy.

Evaluating Indigenous programs: a toolkit for change

Sara Hudson with contributions from Carlos Andres Monteverde Salvatierra and Eva Christensen

Research Report 28

Related CIS publications

Research Report

RR18 Sara Hudson, Mapping the Indigenous Program and Funding Maze (2016).

Policy Monograph

PM105 Sara Hudson, Closing the Accountability Gap: the first step towards better Indigenous health (2009)

Contents

Executive Summary

Introduction

The case for reform

Why evaluate?

Analysis of program evaluations

Analysing the evaluations: A hierarchy of evidence

Productivity Commission's criteria for evidence of 'what works'

Our criteria for evaluating the 'evaluations'

Lessons to be learnt

Examples of successful practices

Discussion and conclusion

Recommendations

Appendix A: Evaluation of CBA Programs; Evaluation of SROI Programs; Summary of Evaluations/Case-studies/Audits

Appendix B: Evaluation Toolkit

Appendix C: List of Tobacco cessation programs

Endnotes

ACKNOWLEDGEMENTS

This research report has been assisted by comments and suggestions from two anonymous external reviewers; my colleagues Simon Cowan, Michael Potter, Charles Jacobs, Heidi Kiekebosch-Fitt and Gary Banks; and participants who took part in a CIS roundtable on growing the evidence base for effective social programs. I am also grateful to Karla Pincott, who edited the report, and Ryan Acosta, who designed and laid it out.

All remaining errors are my own.

The Prosperity Project is supported by

Executive Summary

Previous CIS research indicated lack of evaluation of Indigenous programs is a significant problem. Of the 1082 Indigenous programs identified, only 88 (8%) had been evaluated.1

Following the release of that research and a Productivity Commission report that also called for more rigorous evaluation of Indigenous programs, the federal government announced it would allocate $40 million over four years to strengthen the evaluation of Indigenous programs and provide $50 million for research into Indigenous policy and its implementation.

However, given the average cost of an evaluation is $382,000, the extra $10 million a year for Indigenous program evaluations will not go far. To make the most of this additional funding, the government must change the way it evaluates and monitors programs.

Although formal evaluations for large government programs are important, evaluation need not involve contractors. Government must adopt a learning and developmental approach that embeds evaluation into a program's design as part of a continuous quality improvement process.

It is not enough just to evaluate. Government must use the findings from evaluations to improve service delivery. Unfortunately, many government agencies ignore evaluations when making funding decisions or implementing new programs. A recent audit of the NSW Evaluation strategy found the NSW Treasury and NSW Department of Premier and Cabinet were not using evaluation outcomes to inform and improve practices.

Analysis of 49 Indigenous program evaluation reports found only three used rigorous methodology, and none used what is considered the 'gold standard' of evidence: Randomised Control Trials (RCTs). Overall, the evaluations were characterised by a lack of data and the absence of a control group, as well as an over-reliance on anecdotal evidence.

Particular features of robust evaluations include:

• A mixed-method design, which involves triangulation of qualitative and quantitative data and some economic analysis of the program, such as cost effectiveness and/or meta-analysis

• Local input into design and implementation of the program to ensure program objectives match community needs

• Clear and measurable objectives

• Pre- and post-program data to measure impact

Adopting a co-accountability approach to evaluation will ensure that both the government agency funding the program, and the program provider delivering the program, are held accountable for results. An overarching evaluation framework could assist with the different levels of outcomes expected over the life of the program and the various indicators needed at each level to measure whether the program is meeting its objectives. Feedback loops and a process to escalate any concerns will help to ensure government and program providers monitor one another and program learnings are shared.

Suggestions for policy makers and program funders include:

• Embedding evaluation into program design and practice — evaluation should not be viewed as an 'add on' but should be built into a program's design and presented as part of a continuous quality improvement process, with funding for self-evaluation provided to organisations.

• Developing an evidence base through an accountability framework with regular feedback loops via an online data management system — to ensure data being collected is used to inform practice and improve program outcomes, and there is a process for escalating concerns.

Suggestions for program providers include:

• Embedding evaluation into program practice — evaluation should not be viewed as a negative process, but as an opportunity to learn.

• Developing an evidence base through the regular collection of data via an online data management system to not only provide a stronger evidence base for recurrent funding, but also to improve service delivery and ensure client satisfaction with the program.

Introduction

The first CIS report in this series, 'Mapping the Indigenous Program and Funding Maze', provided quantitative evidence of the lack of evaluation of Indigenous programs. Of the 1082 Indigenous programs identified in our research, only 88 (8%) had been evaluated.2 This finding was corroborated by the Productivity Commission's 2016 Overcoming Indigenous Disadvantage Report, which found only 24 Indigenous programs had been rigorously evaluated and that there was a "pressing need for more and better evaluation of Indigenous policies and programs nationally if we are to see improvements in outcomes for Aboriginal and Torres Strait Islander Australians."3

Following the release of these reports, the federal government announced it would be allocating $4.5 million in the next financial year to a number of key evaluations of Indigenous programs, including an evaluation of the Community Development Programme (CDP) and RCTs to assess the impact of the Prisoner Throughcare Programme in the Northern Territory and the School Enrolment and Attendance Measure Programme. In early 2017, the federal government announced it will allocate $10 million a year over four years to strengthen the evaluation of Indigenous programs. According to the government, a formal Evidence and Evaluation Framework will be developed to strengthen the reporting and monitoring of the program evaluations.

In his 2017 Closing the Gap speech, Prime Minister Turnbull reiterated the government's emphasis on evaluation and announced the appointment of an Indigenous commissioner at the Productivity Commission and $50 million for research into Indigenous policy and its implementation.4 These announcements suggest the government is finally looking at doing something to address the serious shortfall in evidence. At the same time, the extra $10 million per year for Indigenous program evaluations will not go far. Analysis of AusTender procurement contracts found the average cost of an evaluation is $382,000.5 At this price, the additional $10 million will be enough for only 26 more evaluations of Indigenous programs per year.

The Australian government has for some time been aware of the lack of evidence on the effectiveness of Indigenous programs. However, the challenge is transitioning from awareness to action that will address the knowledge gap. For years, government has claimed to be focused on delivering evidence-based policy, but if this is to become more than just empty rhetoric, government needs to urgently change the way programs and services are funded and delivered.

Although broad-scale changes to the service system are probably needed, the focus of this report is how best to measure the effectiveness of current Indigenous programs and then how to use that evidence to improve program design and implementation. Once more evidence is collected, the government will have a much better understanding of what works and what changes are necessary to ensure programs meet the needs of Indigenous people and communities.

This report starts by outlining the case for reform and Indigenous people's frustration at the Indigenous Advancement Strategy, which saw community organisations lose funding for programs they felt were working, while programs and services communities did not want or need were introduced. Next, the report examines why it is important to evaluate programs, and the concept of co-accountability. The findings of a literature review of 111 Indigenous program evaluations/audits/reviews are analysed, including what constitutes a rigorous evaluation and a possible hierarchy of evidence. Finally, recommendations for improvements to practice for both policy makers and program providers are provided.
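The funding arithmetic above can be sanity-checked in a couple of lines (a quick sketch: the $10 million and $382,000 figures are the report's; rounding down to whole evaluations is our assumption):

```python
# How many evaluations the extra funding buys per year,
# using the figures quoted in the report.
annual_funding = 10_000_000      # extra evaluation funding per year ($)
avg_cost = 382_000               # average cost of one evaluation ($)

evaluations_per_year = annual_funding // avg_cost  # round down to whole evaluations
print(evaluations_per_year)  # 26
```

At roughly $382,000 each, the funding covers about 26 evaluations a year, matching the figure in the text.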

The case for reform

There is general consensus that more evidence on the effectiveness of Indigenous programs is needed to improve Indigenous outcomes. However, while there is bipartisan support to conduct evidence-based policy, in practice, policies are often based on ideology instead of practical, evidence-based measures that have been tested and proven to work. Each new government wants to put their own stamp on a particular policy or program. But new policies often recycle failed policies of the past, or throw good programs out with the 'bathwater'.

"There is a level of frenetic chopping and changing, and policy pulsing, that comes with electoral cycles and as the political pendulum swings from left to right…decision-making in Indigenous policy feels much like a merry-go-round—replete with the same old traps and reinvented wheels."6

A case in point is the Community Development Employment Program (see Box 1 overleaf), which has suffered, perhaps more than any other Indigenous program, from political pendulum shifts.7

The previous report, 'Mapping the Indigenous Program and Funding Maze', found there needs to be a much more rigorous process for allocating funding for Indigenous programs and for making decisions about which programs continue to receive funding.14 The inquiry into the tendering process for the Indigenous Advancement Strategy (IAS) funding criticised the procedures used by government and recommended a full internal review by the Australian National Audit Office (ANAO).

The ANAO report found the Department of Prime Minister and Cabinet (PMC) had not implemented the Strategy effectively, and the grants administration processes "…fell far short of the standard required to manage billions of dollars of funding."15 In particular, the Department was found to have not:

• assessed applications in line with the guidelines and public information provided by the Department;

• met some of its obligations under the Commonwealth Grants Rules and Guidelines;

• kept records of key decisions; and

• established performance targets for all funded projects.16

Nor did the Department advise the Minister of the risks involved in implementing the Strategy in such a short time frame. According to the Australian Public Service Commission, such timidity by public servants is reportedly becoming more common, which is a worrying sign, as a well-functioning government is reliant on the provision of free and frank advice to Ministers.17

Although a performance framework was established for the Strategy, the framework did not facilitate assessing whether program outcomes had been achieved. This therefore inhibited the Department's ability to

Box 1: CDEP to CDP — an example of government failure

Initial design was a community initiative and focused on community development: The first CDEP scheme was introduced in 1977 in Bamyili, a remote Indigenous community in the Northern Territory, as an alternative to unemployment benefit payments and as an instrument of community development. Instead of individual income support payments, the money was pooled to fund community development projects and to employ people. Significantly, the scheme was a community initiative rather than a government-designed and imposed program.8

CDEP expanded into urban and rural areas and reframed as a transition-to-work program: In the mid-1980s, CDEP became part of the Aboriginal Employment Development Policy (AEDP) and was expanded into Indigenous urban and regional communities as a transition-to-work program. However, by the late 1990s, issues with the reframing and expansion of CDEP were becoming increasingly apparent. An evaluation of CDEP in 1997 found that at least 33% of CDEP participants did no work.9 More than half (60%) of CDEP organisations paid people for home duties and mowing their own lawns. Only about 5% of CDEP participants moved from CDEP to real jobs, and more than 40% of Indigenous people on CDEP from remote communities had been on CDEP for five years or more. According to a government discussion paper, CDEP had "become a destination rather than a stepping stone towards jobs."10 There were a number of important reasons why CDEP was not meeting its objectives.

1) There were few jobs for people to transition to in remote areas.

2) There were no incentives to transfer people to mainstream jobs, particularly in remote areas where CDEP funding was used to fund local government, health, education, and policing services.

3) There was no recognition of the need to modify the program depending upon location (i.e. it may have been realistic to expect it to be a transition-to-employment program in mainstream areas, but not in remote areas, where it needed to take a more community development approach and actually create jobs).

4) There was not enough accountability of CDEP providers, with no repercussions if participants were paid for doing nothing.

Despite the problems with CDEP, some providers were actually doing a good job.11 But rather than learning from these success stories and reforming CDEP to ensure the program was meeting its objectives, or assessing whether the program's objectives were even achievable, the government decided to abolish CDEP, replacing it with the Remote Jobs and Communities Program (RJCP) in 2013.

Remote Jobs and Communities Program (RJCP) at odds with original intent of CDEP: Where the original CDEP program had been a community initiative aimed at avoiding the negative repercussions of welfare by pooling community members' social welfare payments, RJCP was a top-down government-controlled program. Its emphasis was on getting Indigenous people into employment and fining those who failed to meet their activity requirements. Unlike CDEP, which had large community support, RJCP failed to resonate with communities and had very burdensome administrative arrangements. The pendulum had swung too far towards a punitive model.

Rebadged RJCP to CDP: The unpopularity of RJCP and the high administration costs led the Coalition government to amend the program and change its name to the Community Development Program (CDP). Some people argue the similarity in names between CDEP and CDP was a deliberate ploy to try and get community buy-in. The then Prime Minister Tony Abbott admitted that: "Abolishing CDEP was a well-intentioned mistake and CDP is our attempt to atone for it."12

CDP: Along with the name change, the government announced there would be more consultation with communities about what projects and activities they wanted, and less red tape. Despite this, a number of people continue to think the CDP program is too punitive and does not take into account the challenges people living in remote communities face, such as the lack of jobs. A recent report by the Australian National University found 146,000 financial penalties had been applied to 34,000 CDP participants in 2015–16, compared to 104,000 penalties applied to approximately 750,000 jobactive participants in mainstream Australia.13 It seems the original reason CDEP was established — the lack of a real economy or many job opportunities in remote Indigenous communities — continues to be ignored.

Lessons to be learnt:

1) Before scaling up programs, check if the objectives need to be modified/tailored to different regions.

2) Don’t throw the baby out with the bathwater — learn from previous mistakes and successes about what does and does not work.

The pendulum swings with CDEP (and its replacements) illustrate the failings of going too far in either direction. Too lenient, and there tends to be an absence of accountability — as evident in CDEP participants receiving money for doing nothing at all; too far the other way, and approaches become excessively punitive.

To be effective, Indigenous policy initiatives need to adopt a middle ground — where there is accountability and oversight but the need for community involvement and flexibility is also recognised.

"effectively verify, analyse or report on program performance."18 The Department had reportedly started evaluating some individual projects but had not adopted an evaluation strategy.19 A draft evaluation and performance improvement strategy had been developed, and was considered by the Indigenous Affairs Reform Implementation Project Board in July 2014, but the plan was not formally agreed to, endorsed or funded.20

Worst of all, however, was that the Department did not document the processes they used when awarding contracts. The widespread awarding of contracts to non-Indigenous organisations meant many Aboriginal organisations had their funding reduced or missed out on funding entirely.21 Public hearings during the parliamentary inquiry into the IAS were filled with stories of organisations losing funding for programs that had run successfully for decades.22 An example was the Djarindjin domestic violence shelter on the Dampier Peninsula in Western Australia. The shelter is run by local Aboriginal women and services 50 Aboriginal communities 200 kilometres north of Broome. After their plight attracted considerable media attention, funding for the shelter was reinstated. However, there were many other organisations that were not so fortunate.23

The IAS funding process is symptomatic of a deeply flawed system that has led to gaps in programs and services in some areas and duplication and waste in others. Yet the problems existed before the IAS, as former Northern Territory Co-ordinator General for Remote Services, Olga Havnen, documented in her Remote Services Report in 2012:

"There are not only massive pre-existing service gaps but also a serious lack of high quality, evidence-based program and service development…This lack of long-term strategic vision means governments have spread resources as widely as possible in a 'scatter-gun' or 'confetti' approach. This results in partially funding community initiatives for short periods with no long term strategy for how the positions created or initiatives undertaken will be sustained."24

Soon after the release of this report, Olga Havnen was sacked from her position as Co-ordinator General for Remote Services.25

Since there is no strategic oversight, nor a requirement for an evidence base for funding, the number of Indigenous programs has increased over time with no appreciable improvements in outcomes. When the review of programs on the Indigenous HealthInfoNet was done at the beginning of January 2016, there were 2468 programs listed on the website, of which 2024 were Indigenous-specific. Over a year, the number of programs has increased by 383 to 2851.26

The way programs are funded through multiple small grants contributes to the growing number of programs. Our research identified at least 30 different Indigenous tobacco cessation programs (see Appendix C). Of these 30 programs, only two had been evaluated: the national program 'Tackling Indigenous Smoking' and the Victorian program 'Yarning it Up — Don't Smoke it Up'. The proliferation of tobacco cessation programs is probably due to the way funding is provided under the federal government's Tackling Indigenous Smoking regional grants program, which provides grants to support locally designed anti-smoking and smoking cessation programs.

A review of Tackling Indigenous Smoking was commissioned by the Department of Health in 2014. The review found evidence that multi-level approaches to tobacco control were the most effective at reducing smoking prevalence in Indigenous Australian communities. At the same time, the review also found a lack of monitoring and evaluating of the programs. Therefore, although the review recommended retaining the flexibility of the funding approach to tailor programs at the local level, it also recommended integrating a reporting and evaluating framework into future iterations of the program to develop a stronger evidence base around the effectiveness of the program.27 Following the review, the Department of Health introduced a revised Tackling Indigenous Smoking program with a budget of $116.8 million over three years ($35.3 million in 2015–16; $37.5 million in 2016–17 and $44 million in 2017–18).28

Despite the increase in the number of Indigenous programs, some communities continue to miss out on essential services. For example, Fitzroy Crossing in the East Kimberley suffers from one of the highest incidences of foetal alcohol spectrum disorder (FASD) in the world, but one of the town's most effective prevention initiatives is in danger of closing. An early learning centre that provides pre-natal and post-natal care to mothers and tuition to parents, as well as childcare, is set to close next year under changed subsidy arrangements that will see it lose $500,000 from its annual budget of $1.2 million.29

Six years ago, when alcohol restrictions were first introduced in Fitzroy Crossing, a study by Notre Dame University noted there were significant gaps in support services in the community. Most damning was the fact that while alcohol restrictions had been introduced to try and combat the epidemic of alcoholism in the town, there was no resident alcohol and drug counsellor or mental health worker. The community was serviced only twice a month by two regional mental health workers from Derby (a town several hours away).30 These are not isolated, one-off examples; they are endemic to the Indigenous program and service sector.

Mark Moran's book Serious Whitefella Stuff illustrates through a selection of case studies how governments often make decisions without involving local Indigenous people and cut funding to programs without any assessment of their effectiveness, even though there is now widespread recognition of the importance of engaging with local Indigenous people in the design and implementation of programs.31 According to Fred Chaney: "The system under which we operate is broken, and it is the broken system that we should be evaluating."32

Why evaluate?

There are many reasons for conducting evaluations of programs. For example: to highlight what is and is not working; to inform decision making about the allocation of resources; or to improve service delivery and client satisfaction with a program (see Appendix B for the Evaluation Toolkit and a more detailed explanation).

Ultimately, evaluation is necessary to ensure government is held accountable for monitoring how organisations are spending taxpayers' money. Yet there must be co-accountability — the organisation receiving the funding must be held accountable for how they have spent the money and whether the program has achieved its desired outcomes, and the government agency must be held accountable for monitoring whether the organisation is meeting its objectives, and for working with them to improve their practices if they have not. As Australian National University academic Will Sanders has argued: "Government must not prioritise excessive accountability to bureaucrats over accountability to communities."36 Organisations are accountable to the government agency funding them, but the government is accountable to the community.

Improved accountability, however, does not mean there has to be detailed daily monitoring of the activities of both providers and participants. If there is any lesson to be learnt from the failed RJCP, it is that excessive monitoring can be a huge administrative burden for little gain.33 There needs to be an appropriate balance between maintaining program fidelity and allowing organisations a certain degree of flexibility to tailor the program to meet community needs. This approach is different from traditional ideas of accountability, and involves moving away from simply monitoring and overseeing programs to supporting a learning and developmental approach to evaluation.34

It is also not enough to just evaluate; government must use the information from evaluations and reviews to improve service delivery.35 There is considerable evidence to suggest that even when programs have been evaluated, governments have not used the findings to inform funding decisions. For example, according to a report by Olga Havnen, the former Northern Territory Coordinator-General for Remote Services, a non-government organisation (not named in the report) was contracted to deliver a multi-million dollar program ($5 million over three years) in five Northern Territory communities. A mid-term evaluation of the program revealed "serious deficiencies" in the way the program was delivered and in the conduct of staff employed by the organisation. Despite the poor findings of the evaluation, the organisation was invited by the federal government to submit a proposal for the continuation and expansion of the program.37

Another example is a recent Indigenous health campaign – No Germs On Me – which ran three different television commercials encouraging people to use soap when they washed their hands. Although the evaluation found no change in participants' beliefs, behaviours or attitudes as a result of the campaign, the evaluators concluded the reach of the advertisement was satisfactory and the campaign was worth continuing.38

Every state and territory has some sort of evaluation or data monitoring guideline or strategy (see Table 1). Despite all these strategies and guidelines, a recent audit of the NSW Evaluation strategy by the Audit Office of NSW found the NSW Treasury and NSW Department of Premier and Cabinet were not using evaluation outcomes to inform and improve practices. According to the audit:

"The NSW Government's program evaluation initiative is largely ineffective, as it is not providing sufficient information to government decision makers on the performance of programs. For program evaluation to be effective, agencies should demonstrate they are evaluating the right programs, and the outcomes from completed evaluations should inform advice to the NSW Government on investment decisions."39

Table 1: State and Territory Evaluation Strategies

NSW
Type of documentation: The Centre for Program Evaluation and capability building; NSW Government Program Evaluation Guidelines (2016);40 NSW Evaluation Toolkit 2016.
Key features: Guidelines are a comprehensive document with best practice principles and links to other websites with other evaluation material.

VIC
Type of documentation: Evaluation Step-by-Step Guide (2008);41 Funded Organisation Performance Monitoring Framework (2017).42
Key features: Guide is for evaluation contractors — provides four steps for managing an evaluation. Useful material. The Framework is a performance framework for monitoring funded organisations.

QLD
Type of documentation: Queensland Government Program Evaluation Guidelines (2014).43
Key features: Comprehensive document; similar advice to the NSW and Victorian guidelines but better use of diagrams/tables to explain evaluation processes.

TAS
Type of documentation: Planning, evaluation and procurement guidelines (Tasmanian Government, 2015).44
Key features: Guidelines are focused on communication and not as comprehensive as other evaluation guidelines. Useful link to the Tasmanian Government approach to collaboration.

SA
Type of documentation: Managing a Community Organisation Evaluation (Social Inclusion, 2016).45
Key features: Guidelines directed at community organisations. Website has a series of six steps to follow when conducting or managing an evaluation.

WA
Type of documentation: Program Evaluation Unit (PEU) within the Department of Treasury;46 Program Evaluation website;47 Program Evaluation Guide (2015).48
Key features: Comprehensive guide but with almost identical material to other guidelines; some useful links to other sources, though, and a helpful program logic table with examples.

NT
Type of documentation: Aboriginal Affairs Monitoring, Evaluation and Reporting Framework (MERF).49
Key features: The MERF contains targets that relevant NTG agencies report on monthly. Every two months the results are reported to the sixteen Chief Executives of NTG agencies that are members of the Aboriginal Affairs Standing Committee (AASC). Twice a year the results are reported to Cabinet, with the Chief Minister publicly releasing the Framework performance report following Cabinet endorsement.

ACT
Type of documentation: ACT Government Evaluation Policy and Guidelines (2010).50
Key features: Quite a comprehensive document, with similar material to other state/territory guidelines. Useful table on the benefits of evaluation.

Analysis of program evaluations

Research for our previous report, 'Mapping the Indigenous Program and Funding Maze', identified 1082 current Indigenous-specific programs (see Figure 1). Of these:

• 49 were federal government programs;

• 236 were state and territory programs; and

• 797 were programs delivered by non-government organisations (though many of these are funded in part or full by government).

Of the 1082 programs, only 88 (8%) were found to have been (or were in the process of being) evaluated.

The largest category of programs was health-related programs (n=568), followed by cultural programs (n=145), then early childhood and education programs (n=130) — see Figure 1.

Figure 1: Number of programs by category and number of evaluations by category. Source: Government websites, major philanthropic and NGO websites, and analysis of IAS funding recipients and programs listed on the Australian Indigenous HealthInfoNet.

The program category with the highest number of evaluations was health (n=44), followed by early childhood and education (n=16). However, percentage-wise, more programs were evaluated under the jobs and economy category (15%) than the other program categories (see Figure 2).

Figure 2: Number and percentage of evaluations by category. Source: Authors' calculations based on a review of government, major philanthropic and NGO websites, and programs listed on the Australian Indigenous HealthInfoNet.

Of the 490 programs delivered by Aboriginal organisations, only 20 were evaluated (4%). The small number of businesses delivering a program (n=6) meant that while there were only two evaluations of Indigenous programs provided by a business, this category had the highest percentage of programs evaluated (33%). Similarly, while only six of the 33 programs delivered by schools and universities were evaluated, this category had the second-highest percentage of programs evaluated (23%). Conversely, government and non-Indigenous NGO delivered programs had the highest numbers of evaluations (n=36 and n=24), but much lower percentages of evaluations, as the numbers of overall programs were higher (n=278 and n=276).

Figure 3: Percentage of Indigenous programs evaluated by provider. Source: Government websites, major philanthropic and NGO websites, and programs listed on the Australian Indigenous HealthInfoNet.

Analysing the evaluations: A hierarchy of evidence

Not all evaluations are equal. Many evaluations are akin to a 'tick box' exercise, with limited data available to measure impact. The primary focus of these types of evaluations appears to be participation in the program, or throughputs, rather than outcomes. A number of program providers seem reluctant to admit the failings of their programs, and their evaluation reports read more like exercises in public relations than independent and rigorous analysis. The purpose of conducting an evaluation should be to look at what is and is not working: what some term a 'warts and all' evaluation.51 However, for many not-for-profits the pressure not to publish negative evaluations is high, with specific concerns ranging from whether negative publicity will affect funding, to how staff working on the ground may perceive any criticism of the project.52 Similarly, if the findings of a government evaluation are particularly negative, it is not uncommon for government to insist that the results are not published.53 Evaluations of government programs are often conducted by the department responsible for funding or delivering the program, and even if an external evaluator is used, their 'independence' is compromised by the client relationship.54 How much independence can a consultant claim to have when they are reliant on their clients for business?55 Consultants can sometimes be pressured to frame the results of evaluations in a certain way and to downplay any negative findings. For example, a recent evaluation of the cashless debit card trial came to some surprising conclusions about the effectiveness of the trial, given the weight of evidence to the contrary.56

In determining what constitutes a rigorous evaluation, state and territory evaluation guidelines provide examples of principles of 'best practice'. The NSW Program Evaluation Guidelines contain nine principles of best practice; these are:

1. Build evaluation into your program design.

2. Base your evaluation on sound methodology.

3. Include resources and time to evaluate.

4. Use the right mix of expertise and independence.

5. Ensure proper governance and oversight.

6. Be ethical in design and conduct.

7. Be informed and guided by relevant stakeholders.

8. Consider and use evaluation data meaningfully.

9. Be transparent and open to scrutiny.57

However, having principles and actually applying them are two different things. For instance, although evaluations should be built into the program design, in practice this does not always happen. Often evaluators are asked to evaluate a program after it has been running for a while, when there is no pre-program data or even any uniform collection of administrative data. As a result, the evaluation is not as useful as it could have been had the evaluation and implementation of the program occurred concurrently.

The second principle, basing your evaluation on sound methodology, also sounds like common sense. Yet although there is general agreement on a hierarchy of evidence, with meta-analyses of multiple randomised trials at the top (see Box 2), in practice RCTs of Indigenous programs are very rare. In fact, none of the evaluations of Indigenous programs reviewed in this report used RCTs. However, as mentioned earlier, the Australian government is starting to invest in the method, with funding for two RCTs of Indigenous programs recently announced.58

Box 2. Proposed Hierarchy of Evidence

Shadow Assistant Treasurer Andrew Leigh's hierarchy of evidence involves six levels, ranging from systematic reviews at the top to expert opinion and theoretical conjecture at the bottom.

1. Systematic reviews (meta-analyses) of multiple randomised trials

2. High quality randomised trials

3. Systematic reviews (meta-analyses) of natural experiments and before-after studies

4. Natural experiments (quasi-experiments) using techniques such as differences-in-difference, regression discontinuity, matching or multiple regression

5. Before and after (pre-post) studies

6. Expert opinion and theoretical conjecture.

While there is general agreement that RCTs are the gold standard of research evidence, there are some dissenting voices on the exact order of Leigh's hierarchy; for example, whether systematic reviews are a more rigorous methodology than genuine quasi-experimental work.59 University of Wollongong academic Peter Siminski argues that "studies relying only on matching or multiple regression are a lower grade of evidence than genuine quasi-experimental work".60 Quasi-experimental impact techniques are gaining in popularity as they are typically much cheaper, and face fewer practical barriers to implementation, than RCTs (see Box 3 for an example of some of the challenges in implementing RCTs). However, only one of the evaluations of Indigenous programs reviewed for this report adopted this type of approach. The issue in Australia is that there are few people who have the training required to conduct high quality quasi-experimental work. The fact that RCTs and quasi-experimental evaluations require highly trained practitioners restricts their usage, and is arguably a reason why alternative methods of evaluating Indigenous programs should be considered.

It is also important to note that there is a difference between a health or early-childhood intervention and a program. There may be evidence for the benefit of the intervention, but not evidence on how best to deliver that intervention as part of a program. For example, a review of Indigenous health projects in WA found there was a 'disconnect' between the strong scientific evidence for the health interventions and the way the service sector was delivering them.61 The success of the program was strongly influenced by the staff's knowledge of, and familiarity with, the interventions they were promoting or delivering. Research on 'implementation science' (how to implement evidence-based research into practice) has found it can take about 17 years for research evidence to be incorporated into health care practices.61 Program evaluations are also more challenging than measuring the benefit of a particular intervention, as programs to address complex social problems are likely to have multiple objectives.62

The underlying reason for conducting evaluations is to improve the delivery of programs and to achieve better outcomes. There is no point in evaluating programs and interrogating the standard of evidence if programs are not designed to use the evidence from evaluations to improve practice. As a result, it may be necessary to reach a compromise between what is considered the 'gold standard' in terms of research evidence and what is practical and achievable given limited resources.
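Of the quasi-experimental techniques named in Leigh's hierarchy, difference-in-differences is among the most widely used. The arithmetic is simple enough to sketch in a few lines; the outcome values below are entirely hypothetical, and the group and period labels are our own:

```python
# Minimal difference-in-differences (DiD) sketch. 'treated' stands for
# communities receiving a program, 'comparison' for similar communities
# without it; the values are a made-up outcome (e.g. school attendance, %)
# measured before and after the program starts.
means = {
    ("treated", "pre"): 72.0,
    ("treated", "post"): 80.0,
    ("comparison", "pre"): 70.0,
    ("comparison", "post"): 73.0,
}

# Change over time within each group
treated_change = means[("treated", "post")] - means[("treated", "pre")]
comparison_change = means[("comparison", "post")] - means[("comparison", "pre")]

# The DiD estimate: the treated group's change net of the background trend
# observed in the comparison group.
did_estimate = treated_change - comparison_change
print(f"DiD estimate of program impact: {did_estimate:.1f} percentage points")
```

The appeal of the method is that it nets out trends that would have occurred anyway, without requiring random assignment; its key (untestable) assumption is that both groups would have followed parallel trends absent the program.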

Box 3. Evaluation of ACT Extended Throughcare Pilot Program

A recent evaluation of the ACT's pilot Throughcare program, conducted by the Social Policy Research Centre, revealed issues with establishing a satisfactory RCT. The evaluation sought to use a sample of people who did not take part in the program over the period June 2013 to June 2016 as the control group. This sample was 'insufficient': participation in the program was cited as being 'very high', and therefore there were very few non-participants.

In an attempt to rectify this issue, a sample group was developed from the period 2010–2013, prior to the implementation of the program. This data had differing baseline characteristics and was supplemented with ‘before and after custodial episode data’ to attempt to account for this.

Another issue with the evaluation was that there was little data on outcomes for Indigenous people. The study highlights that 57.4% of the Indigenous male study group returned to custody, compared to 38.3% of the control group. For Indigenous females, the figures were 28.6% returning to custody, compared with 33.3% of the control group. Figures for recidivism rates were provided by ACT Corrective Services in the Productivity Commission's Report on Government Services (ROGS), but these do not explicitly identify the rates for Indigenous people. This highlights that despite Indigenous people being significantly overrepresented in the prison population, data is lacking on the Indigenous experience and outcomes in the program.63

Productivity Commission's criteria for evidence of 'what works'

In the Overcoming Indigenous Disadvantage Report, the Productivity Commission used a set of criteria to select case studies of programs or services they considered were having a positive impact on improving outcomes for Indigenous people.

The criteria used to select the case studies were that the program had:

• Measurable, up to date outcomes

• A reasonable track record of success (though what this means is not defined)

• Support from local Indigenous people who had used, or were affected by, the program; and

• Where possible, an analysis of costs and benefits.64

The rigour in the selection of case studies resulted in only 24 program evaluations being included in the report (though 10 more case studies of promising programs that had not yet been evaluated were also included).

Despite the relatively high number of evaluations of Indigenous health programs, the Productivity Commission found a lack of evidence on interventions to address a range of different health indicators measured in their Overcoming Indigenous Disadvantage report.65 For instance, they considered that there is currently no evaluated program on approaches that work to reduce smoking or alcohol consumption during pregnancy. Nor is there a published robust evaluation of interventions that contribute to a decrease in the prevalence of tobacco smoking for Aboriginal and Torres Strait Islander people, even though there has been a proliferation of tobacco cessation programs under the federal government's Tackling Indigenous Smoking program.66 Other gaps in evidence identified by the Productivity Commission included the lack of research and program evaluation on Indigenous school engagement, and the absence of evaluations of programs that work to improve home ownership for Aboriginal and Torres Strait Islander people.67

The 88 program evaluations identified in our research were compared with the Productivity Commission's evaluations in their Overcoming Indigenous Disadvantage Report. Overall:

• 12 of the Productivity Commission's 24 programs were included in the 88 program evaluations our research identified;

• 5 of the Productivity Commission's programs were not Indigenous-specific (a criterion for programs to be included in our research); and

• 7 evaluation reports were added to our literature review (these additional reports did not come up in our initial desk-top review of publicly available program evaluations).

Our criteria for evaluating the 'evaluations'

In developing a method for ranking the evaluations identified in our research, the following scale was used:

• Weak — limited methodology reliant on qualitative evidence or a survey with a small sample size, no pre and post data, or only a summary of the full evaluation report publicly available.

• Moderate — a mixture of qualitative and quantitative data, some attempt at triangulation of data (cross-verification from two or more sources), some evidence of impact, but no pre and post data and no control groups.

• Strong — a mixture of qualitative and quantitative data with evidence of triangulation of data; evidence the program is having an impact through the use of pre and post data or other benchmarking data; and the use of experimental design, randomised control trials or a control group. Or, in the absence of that, evidence the evaluation utilises, in addition to triangulation of data and benchmarking, one or more of the following: an economic component (through either a cost benefit or cost effectiveness analysis, or some mention of the financial impact of the program) or meta-analyses (reviews of multiple evaluations).

Some flexibility had to be employed in developing this list of criteria, as none of the evaluations reviewed employed RCTs. Therefore, evaluations were considered strong if they involved triangulation of data and two or more of the following: control group; meta-analyses; and cost effectiveness. This approach to weighting the methodology was based on the Victorian Department of Treasury and Finance's report 'Guide to Evaluation: How to plan and conduct effective evaluation for policy and programs', which ranked different evaluation data and method types by level of sophistication.68

The initial identification of Indigenous program evaluation reports was quite broad and included audits and reviews of programs. At the same time, the focus on evaluation excluded some other program reports, such as case studies, Cost Benefit Analyses (CBAs) and Social Return on Investment (SROI) reports (see Appendix A for a description of CBAs and SROIs). When these were included, a total of 111 reports were identified. These 111 reports were then broken down into six categories: evaluations, audits, reviews, CBAs, SROIs and others (i.e. case studies). In total, 75 evaluation reports were identified (though some programs had both an evaluation report and a CBA report, in which case only the evaluation report is included in the table below, so the program is not double counted).
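The weak/moderate/strong scale described above is essentially a checklist, so its decision logic can be sketched as a small function. The field names below are illustrative (our reading of the criteria, not the report's actual coding scheme):

```python
# Hypothetical sketch of the weak/moderate/strong rating logic described
# above. Parameter names are illustrative, not drawn from the report.
def rate_evaluation(
    mixed_methods: bool,       # qualitative AND quantitative data
    triangulation: bool,       # cross-verification from two or more sources
    pre_post_data: bool,       # pre- and post-program or benchmarking data
    control_group: bool,       # control group or experimental design
    economic_component: bool,  # CBA, cost effectiveness, or financial impact
    meta_analysis: bool,       # review of multiple evaluations
) -> str:
    # Strong: triangulation plus direct impact evidence (pre/post data or a
    # control group), or triangulation plus two or more of: control group,
    # meta-analysis, cost effectiveness (the relaxed criterion applied
    # because none of the reviewed evaluations used RCTs).
    extras = sum([control_group, meta_analysis, economic_component])
    if triangulation and (pre_post_data or control_group or extras >= 2):
        return "strong"
    # Moderate: mixed methods with some triangulation, but no impact data.
    if mixed_methods and triangulation:
        return "moderate"
    # Weak: qualitative-only, small samples, or no full report available.
    return "weak"

print(rate_evaluation(True, True, True, False, False, False))    # strong
print(rate_evaluation(True, True, False, False, False, False))   # moderate
print(rate_evaluation(False, False, False, False, False, False)) # weak
```

Encoding the scale this way makes the implicit ordering explicit: triangulation is the gatekeeper for anything above weak, and impact evidence (or its economic proxies) is what separates strong from moderate.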

Table 2 Breakdown of Indigenous program reports by type

Category     Evaluation  Audit  Reviews  CBA  SROI  Other  Total
Crime             7        1      –       1    –      1     10
Culture           4        1      –       1    1      1      8
Education        15        2      –       –    –      5     22
Health           41        2      5       1    3      –     52
Housing           4        –      4       –    2      1     11
Jobs              3        1      –       –    –      3      7
Transport         1        –      –       –    –      –      1
Total            75        7      9       3    6     11    111

Source: Government websites, major philanthropic and NGO websites, and programs listed on the Australian Indigenous HealthInfoNet.

Of the 111 program reports identified in our research, only 71 were reviewed in detail (5 CBAs, 6 SROI reports and 60 evaluations/audits/reviews), as the full text of the remaining 40 reports was not available (see Appendix A for summary tables of our assessment of the evaluation methodology).

In total, only 49 of the 60 program reports were able to be assessed against the scale (weak, moderate, strong) identified above, as the other 11 were not evaluation reports but audits or reviews.

Overall, our findings identified that:

• 23 evaluation reports had weak methodology

• 23 evaluation reports had moderate methodology

• 3 evaluation reports had strong methodology

Figure 4: Rating of evaluation methodology of Indigenous programs

In general, Indigenous evaluations are characterised by a lack of data and the absence of a control group, as well as an over-reliance on anecdotal evidence.

Table 3 highlights the findings of the assessment of the CBAs and the criteria used to determine the effectiveness of the methodology used. Overall, few CBAs appeared to have measurable objectives, to identify a range of options, or to consider equity implications. Conversely, good examples of CBA reports tended to consider possible constraints that could affect the impact of the program and to undertake sensitivity analysis, considering different scenarios and changing assumptions accordingly.

Table 3 Analysis of CBA methodology

Criteria                    Yes  No
Measurable objectives        1   4
Identification of options    1   4
Proper quantification        4   1
Sensitivity analysis         3   2
Equity implications          1   4
Discount rate                2   3
NPV calculations             4   1

Table 4 highlights the findings of the assessment of the SROIs. Overall, only 50% of the SROIs appeared to have measurable objectives or to look at the impact of the program in context. In addition, only two of the six SROIs used a discount rate in their methodology. At the same time, while the criteria used to assess the SROIs are helpful in evaluating them, the checklist does not tell the complete story about the quality or depth of the analysis underpinning the measurement of outcomes. For example, some SROI reports did not include all the criteria, but still demonstrated sound analysis.

Table 4 Analysis of SROI methodology

Criteria                    Yes  No
Measurable objectives        3   3
Quantification               6   0
Sensitivity analysis         4   2
Inputs/Outputs               4   2
Impact in context            3   3
Discount rate                2   4
NPV calculations             5   1
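Two of the checklist items, the discount rate and NPV calculations, refer to standard CBA arithmetic: future benefits are discounted before being netted against up-front costs. A minimal illustration follows; the cash flows and the 7% discount rate are hypothetical, not drawn from any of the reviewed reports:

```python
# Net present value (NPV) with a discount rate: the two CBA checklist items
# discussed above. All figures are hypothetical.
def npv(rate: float, cash_flows: list[float]) -> float:
    """Discount a series of annual net cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Year 0: program set-up cost; years 1-4: estimated annual net benefits
flows = [-100_000.0, 30_000.0, 30_000.0, 30_000.0, 30_000.0]

print(f"NPV at 7%: ${npv(0.07, flows):,.0f}")

# A basic sensitivity analysis: re-run the calculation under alternative
# discount rates to see whether the conclusion (NPV > 0) is robust.
for rate in (0.03, 0.07, 0.11):
    print(f"rate {rate:.0%}: NPV ${npv(rate, flows):,.0f}")
```

With these illustrative numbers the program is marginally worthwhile at 7% but not at 11%, which is exactly why the checklist treats an explicit discount rate and a sensitivity analysis as markers of sound CBA methodology.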

Lessons to be learnt

From our assessment of the evaluation methodology the following lessons can be drawn for policy makers and program providers.

Table 5 Lessons to be learnt about evaluation

Methodology

• It is important to use a mixed methodology and not just rely on qualitative evidence

• A case study or review should not be considered less rigorous than an evaluation; in fact, some case studies may utilise a more robust methodology than many evaluations

• There is potential for biased samples when program participants receive benefits from taking part in the program

Data

• The same standards of data collection need to be upheld in each program location in order for effective comparisons to be made

• It can be difficult to measure changes in behaviour if the right administrative data is not available or collected

• Program providers need to have strategies for recording and accessing administrative data before the program is rolled out, particularly for a small cohort of program participants where there are potential privacy concerns

Analysis and reporting

• Strong analysis can overcome some of the limitations of a small sample

• It is important to take into account the environment programs are operating in, and that some programs may have their impact minimised because they do not have certain authorities

• Evaluation reports need to be clear about whether the evaluation is of the framework/service delivery or the impacts/results the program produces, or a combination of both

Program design and delivery

• There need to be effective links between policy and program initiatives

• While the general model of a program may be transferable, much of the successful implementation of programs depends on having the right combination of people with the appropriate knowledge and skills

• People delivering programs need ongoing training to ensure they have up-to-date information on the evidence available about best practice approaches

• Participants are more likely to provide honest feedback on a program when program staff have made an effort to establish positive relationships with them; this is particularly the case when the intervention being delivered by the program is of a sensitive and private nature

Examples of successful practices

In addition to the lessons that can be learnt from the problems encountered in evaluating programs, there were also some examples of good practice. Particular features that made these evaluations stand out from the rest included:

• A mixed method design, which involved triangulation of qualitative and quantitative data and some economic components of the program such as cost effectiveness

• Local input into design and implementation of the program to ensure program objectives matched community needs

• Clear and measurable objectives

• Pre and post program data to measure impact

The following case studies illustrate examples of rigorous evaluation practice and/or successful programs that are regularly monitored and evaluated.

Ganbina: evidence rating = strong

Ganbina was established in 1997 to help improve school and further education completion rates and ‘real’ job prospects among about 6000 Indigenous people in the Goulburn Valley in Victoria. The program receives no government funding, relying instead on philanthropic and corporate sponsorship for its activities on an annual budget of $1.4 million. According to Ganbina’s Chief Executive, Anthony Cavanagh, “Not seeking government funding is a choice and allows the program to be innovative…” 69

An independent evaluation by PricewaterhouseCoopers in 2014 found very high Year 12 completion rates (100% in 2014) and high retention rates (over 95%). Ganbina's cost per participant of approximately $3500 was about half the average spend of other similar programs. Despite costing less to run, it also had the highest retention rate, the best gender balance and the broadest age range of the comparable programs.

An Impact Assessment was conducted by Social Ventures Australia (SVA) in 2016 to assess the cumulative impact of the program since it was first implemented in 2005.

The methodology used consisted of a desktop review of client data and previous evaluations and data collected on Ganbina and consultation with stakeholders.

The Impact Assessment found Year 11 to Year 12 retention rates increased from 62% in 2009-10 to 73% in 2015-16, considerably higher than the rate for Indigenous people in the Greater Shepparton area and national Indigenous rates. Ganbina achieved a 100% success rate for participants who had taken part in the program for five years or more and who were aged between 25 and 34 years, with all achieving a Year 12 or equivalent qualification.

University participation increased from two Ganbina participants in 2009 to 15 in 2016.

Key features of the program:

• Does not receive any government funding, which has enabled it to adopt a more innovative and cost-effective approach (much cheaper to run than other comparable programs)

• Complete transparency, with six-monthly reports provided to investors and bi-monthly newsletters that document exactly how much funding has been used on administration and how much is left.

Aboriginal Maternity Group Practice Program (AMGPP): evidence rating = strong

The AMGPP provides free antenatal and postnatal clinical care to pregnant Aboriginal women. Each client is supported by a team of health professionals during pregnancy and for four weeks after they have given birth. Support provided includes clinical care as well as cultural, social and emotional support.

The evaluation involved a non-randomised intervention study using data from the Western Australian Midwives Notification System. Methodology used included regression models to analyse data from 343 women (with 350 pregnancies). The analyses included developing historical and contemporary control groups of pregnant Aboriginal women and matching them for maternal age.

Participation in the AMGPP was associated with significantly improved neonatal health outcomes. Babies born to AMGPP participants were significantly less likely to be born preterm (9.1%, versus 15.9% for historical controls).

Key features of the evaluation/study:

Quasi-experimental design involving regression analysis and matched control group to show the impact of the program.
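The matched-control design used in the AMGPP study can be sketched in miniature: each participant is paired with an unused historical control of the same maternal age, and preterm rates are then compared within the matched sample. All records below are invented for illustration; the real study matched 343 women and used regression models on top of the matching.

```python
# Toy sketch of a matched historical-control comparison. Each participant
# is matched to an unused historical control with the same maternal age,
# then preterm birth rates are compared. All records are invented.
participants = [
    {"age": 19, "preterm": False},
    {"age": 24, "preterm": False},
    {"age": 24, "preterm": True},
    {"age": 31, "preterm": False},
]
historical_controls = [
    {"age": 19, "preterm": True},
    {"age": 24, "preterm": False},
    {"age": 24, "preterm": True},
    {"age": 31, "preterm": False},
    {"age": 38, "preterm": True},  # no participant this age: left unmatched
]

matched_pairs = []
used = set()
for p in participants:
    for i, c in enumerate(historical_controls):
        if i not in used and c["age"] == p["age"]:
            matched_pairs.append((p, c))
            used.add(i)
            break

def preterm_rate(records):
    return sum(r["preterm"] for r in records) / len(records)

treated_rate = preterm_rate([p for p, _ in matched_pairs])
control_rate = preterm_rate([c for _, c in matched_pairs])
print(f"matched pairs: {len(matched_pairs)}")
print(f"preterm rate: participants {treated_rate:.0%}, controls {control_rate:.0%}")
```

Matching on observed characteristics like maternal age removes one obvious source of confounding; the limitation, as the hierarchy of evidence discussion notes, is that unobserved differences between participants and historical controls remain uncontrolled.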

Australian Electoral Commission's (AEC) Indigenous Electoral Participation Program (IEPP): evidence rating = moderate

The IEPP aims to empower Aboriginal and Torres Strait Islander Australians to exercise their right to vote.

The evaluation methodology included: a literature scan and document review; semi-structured interviews with staff; focus groups; case studies with a cross section of communities and analysis of data available from the Queensland and the Northern Territory elections.

The evaluation found variation in the degree to which IEPP’s stated objectives and outcomes have been achieved. The evaluation recommended basing future changes to the program on evidence of ‘what works’, and harnessing the experiences of other government agencies and programs working in an Indigenous context. This includes the adoption of a robust monitoring and evaluation system and routine analysis of performance data.

Key features of the evaluation:

The methodology was relatively robust and included a Monitoring and Evaluation Framework and triangulation of qualitative and quantitative data. Information on the number of people who participated in interviews/focus groups was provided, and interview guides were included in appendices. However, it was difficult to identify changes in electoral behaviour, as ethnicity is not recorded on the electoral roll or when people vote. Performance data for the program was also not entered uniformly or consistently by states and territories.

Indigenous Community Volunteer (ICV) Program: Evidence rating = moderate

A case study which incorporated a social and economic impact assessment was conducted by KPMG in 2015.

ICV is a registered charity and non-profit community development organisation that matches volunteers' experience and skills with different Indigenous communities' needs to help address Indigenous disadvantage. In 2013/14 ICV worked with 169 communities.

Assessment of activities in two communities involved stakeholder consultations and document and data analysis, including assessing the impacts of the activities in economic terms.

The study found evidence that ICV was invited into communities and involved in discrete, well defined projects, and that volunteers were having a positive impact and building on existing work that had been done in the community. There was also evidence that ICV had developed positive partnerships with other organisations and was collaborating with them on activities.

Key features of the evaluation and program:

The study involved triangulation of data from multiple sources, including analysis of economic data. However, the study only looked at two communities, so it is difficult to extrapolate about overall program impact.

There was evidence of good practice in program design and implementation with volunteers ensuring communities wanted their assistance and only working on discrete well-defined projects in collaboration with other organisations involved in similar activities.

As discussed above, it may be necessary to reach a compromise between what is considered the 'gold standard' in terms of research evidence and what is practical and achievable given limited resources. There is also no point conducting 'rigorous' evaluations if the evidence is not used. As a result, instead of focusing on having the highest standard of evidence for assessing the impact of a program (such as in RCTs), it may be more practical to consider how to ensure evaluation learnings are used to inform program practice. Figure 5 shows an alternative hierarchy, where the minimum standard is evidence of learnings being applied to improve program outcomes, and the highest level is where there is evidence of the impact of the program and the benefit of the particular intervention, in addition to learning and program improvement.

Figure 5: Evidence of program impact and learnings

Government departments administering funding may conduct an evaluation to analyse funding distribution and to report on the achievements and impact of the program.70 However, these types of evaluations can make organisations feel like they have to pass a test in order to continue to receive funding, and they may resist the evaluation process as a result. Resistance could be indirect or subtle, such as avoiding or delaying entering program data into databases. There is evidence to suggest organisations are more likely to engage with the evaluation process when it is presented as a learning tool to improve program delivery than when it is presented as a review or audit of their performance.71 This is particularly the case if they are given the opportunity to provide input into the evaluation plan or framework, so they can see the benefit of the evaluation activities in documenting the impact of the program and contributing to evidence about what works. Evaluation as a learning tool could be considered similar to continuous quality improvement processes in the health sector, and usually involves 'reflective practice' to help identify and address issues with program design or delivery (see Appendix B for the Evaluation Toolkit, which explains reflective practice in more detail).

A reflective practice approach to evaluation relies on a two-way exchange, with the experiences of those on the ground delivering the program being used to inform the ongoing implementation of the program. This is different from a government top-down technocratic approach, which might have strict accountability measures in place but fails to recognise there may be better ways of delivering the program (see Table 6).

Another way of describing this iterative approach is 'developmental evaluation' — a relatively recent evaluation methodology that seeks to combine the rigour of evaluation with the flexibility and innovation of developmental approaches to social problems. The primary focus of developmental evaluation is adaptive learning to inform the implementation of programs or community development initiatives.72

The following text boxes provide examples of programs that have adopted an iterative or developmental approach to evaluation and that have used evaluation findings to improve the program. Neither of these two programs was reviewed as part of the assessment of evaluations, as Ability Links NSW is not an Indigenous-specific program, and the Martu Leadership Program (MLP) evaluation report was only released in April 2017, after the analysis of the evaluations was completed. However, they are included as they provide the best examples of programs where evaluation has been embedded into the delivery of the program and a reflective/developmental approach to evaluation is used.

Table 6 Differences between top-down and bottom-up approaches in program design and evaluation

Approach
  Top-down: Technocratic/evaluator as expert
  Bottom-up: Participatory/community engagement/empowerment

Orientation
  Top-down: Identifying weaknesses, problems or deficits
  Bottom-up: Strengthening capacity/improving competence

Who defines the issue/need?
  Top-down: Outside agent (government)
  Bottom-up: Community

Evaluation methodology
  Top-down: Quantifiable outcomes and targets
  Bottom-up: Pluralistic methods, documenting changes of importance to the Indigenous community

Source: Adapted from Laverack, 2000

Ability Links NSW and Early Links NSW Evaluation

Ability Links NSW ('ALNSW') is a program that was developed by the NSW Department of Family and Community Services, through extensive community consultation, to provide greater flexibility and control in the way services are delivered to people with disability. A concurrent program, Early Links NSW ('ELNSW'), was developed for children and young people.

ALNSW is staffed by 'Linkers', who work alongside people with a disability or their carers, assisting in life planning and connecting them to relevant community organisations. The program aims to empower people with a disability to make their own decisions and work towards achieving what is important to them. The program also includes community engagement, where Linkers work with community organisations to help them improve services and support for people with disability.

ALNSW commenced as a pilot in 2013/14 and was rolled out state-wide from July 2015. ALNSW was designed with evaluation in mind from the very beginning and evaluation processes were therefore embedded in the roll- out of the program. Urbis was commissioned by both ALNSW and ELNSW to evaluate the program over three years from 2013–2016, with Interim Evaluation Reports delivered annually.

The evaluation itself was uniquely designed as a collaborative joint approach, involving extensive participation at a community level (either people with a disability or their carers), staff involved in the program (Linkers and managers), and external linked agencies that worked with the program in various ways. Extensive consultations and surveys were undertaken with these stakeholders over a three-month period to allow for a comprehensive analysis of the effectiveness of ALNSW's implementation. A key feature of the program was embodying a 'culture of learning'. The annual Interim Evaluation Reports similarly provided an ongoing opportunity to review responses and apply the lessons learnt from the evaluation to the implementation of the program along the way.73

The Martu Leadership Program (MLP) and the Developmental Evaluation Methodology: The MLP has been at the forefront of social and economic development in Indigenous communities in the Pilbara over the past three years. A recent report by Social Ventures Australia revealed the strengths of a developmental evaluation methodology when assessing the outcomes of such programs.74

Focussing on capacity building and governance in the Martu community, the MLP originally aimed to enhance individual leadership skills through the establishment of a community Leadership Group, so that participants could return to impart knowledge and skills to remote communities and Martu companies.

However, the developmental strategy applied by the facilitating Martu organisation, Kanyirninpa Jukurrpa, has enabled the program to evolve in an organic manner that has had wide-ranging and unexpected benefits for the community.

The highly adaptive approach, co-designed with the Martu community, allowed the MLP to evolve its strategies, goals and targets based on developments over time.

The most noticeable benefit of this approach is the evolution of the MLP from an individually focused leadership training course to a collective Leadership Group that is actively and independently leading change in the Martu community.

Acting on behalf of all Martu, the Leadership Group now works to enhance the capacity and governance capabilities of Martu society by serving as a cohesive actor that has taken on numerous responsibilities. For example, the Leadership Group now provides a platform for Martu to meet and discuss and resolve sensitive social issues in an organised and open manner. The Group has also facilitated dialogues with external stakeholders such as Newcrest Mining to ensure the best social and economic outcomes for the community.

The evolution of the Leadership Group into an empowered body that is actively and independently promoting the Martu agenda on the national stage is a clear example of the benefits of a developmental evaluation approach. Flexible outcomes and community consultation enabled the MLP to evolve in a manner that best suited Martu interests, ultimately giving them greater control of their own development.

The overwhelmingly positive growth of the Leadership Group could not have occurred if the focus had remained on achieving the fixed outcomes originally listed by the MLP. Set targets and objectives are often the cornerstone of program evaluation; however, the success of the MLP demonstrates the unexpected positives that can arise from a more flexible approach.

Key Features of the Evaluation/Program: The report assesses the importance of viewing Indigenous economic development programs through a more qualitative lens. It emphasises the ability of programs to yield unexpected benefits and outcomes by utilising a developmental evaluation approach that enables context-based flexibility and adaptability.

Fear of failure can inhibit government from experimenting with different program approaches, but often it is only through this process of trial and error that evidence about what truly works can be collected.75 Genuine adoption of a 'learning by doing' approach can be a very accountable process, as evidenced by Malaysia's National Transformation Program.76 Under the Malaysian government's Performance Management and Delivery Unit (PEMANDU), a three-staged approach was developed that enabled initial Action Plans to be regularly updated depending on information received from those working on the ground. A distinctive feature of PEMANDU was the way in which any implementation issues were dealt with by being 'bumped up' through a series of ascending steps, from an email to the relevant managers to a closed-door meeting with the Minister (see Table 7).

Under this approach, 70% of the initial Action Plans were revised during implementation. However, this did not mean the initial plans were necessarily wrong, as the final plans tended to build on what was in the original Action Plans rather than starting from scratch.77

At the same time, while these types of participatory research approaches can allow programs to be adapted to suit local conditions, it should also be recognised that increasing community control over program design and implementation will not necessarily produce a 'perfect' program.78 According to research conducted by the World Bank, while involving local people can have positive impacts on program outcomes, care is required, as in some instances programs can be controlled by local 'elites' and more disadvantaged members of the community can miss out.79

Table 7: Process for escalation of concerns

Frequency | Action | Format
Annually | Annual report | Report published; televised address by PM
Once-to-twice per year | 'Putrajaya Inquisition' | Meeting chaired by PM to clear any issues not solved in lower meetings
Semi-annually | PM's performance review | Closed-door meeting: only PM, Minister and PEMANDU CEO
Monthly to quarterly | Steering Committee meeting | Co-chaired by Ministers, with senior officials from all agencies: principal decision-making forum
Weekly to fortnightly | Meeting of technical working group | Problem solving with relevant managers: principal working session
Weekly | Progress report | Emailed, uploaded, available on mobile devices

Source: Sabel and Jordan, 2015

Discussion and conclusion

The previous report, 'Mapping the Indigenous Program and Funding Maze', recommended that all Indigenous programs must be linked to outcomes and that all organisations must:

• formally account for how the money has been spent;

• provide evidence of the program's impact; and

• assess and report on whether the program is meeting its intended objectives.

This recommendation still stands. However, while large government programs should be subjected to formal evaluation, preferably utilising RCT or quasi-experimental methodology, it would not be an efficient use of taxpayer funding to expect every Indigenous program to be evaluated by external contractors. The NSW Government Evaluation guidelines outline how evaluations should be prioritised based on their "size, strategic significance and degree of risk."80 This is the correct approach to take, as it is not worthwhile formally evaluating a small program when the cost of the evaluation would outweigh the cost of actually delivering the program. Nor was it our intention in recommending more evaluation to unduly benefit evaluators.

Given that the average cost of an evaluation is $382,000,81 the extra $10 million a year for Indigenous program evaluations will not go far. In fact, it will be possible to formally evaluate only a small proportion of the 1000 or so Indigenous programs the federal government funds. Additional funding to conduct more evaluations is unlikely, given the critical budget situation. The government, therefore, needs to move away from traditional evaluation practices involving expensive external evaluators to approaches that embed evaluation and reflective practice into the delivery of programs.

Our research identified a plethora of small programs (particularly health and well-being programs) currently being delivered by Aboriginal organisations that are not being evaluated. For these small programs, a proper reporting and monitoring framework that allows for reflective practice and continuous quality improvement may be all that is required, rather than a formal, independent evaluation (see Evaluation Toolkit in Appendix B for an example of an evaluation framework).

At the same time, while it is not economical to evaluate multiple small and disparate programs, it is often community-initiated programs that appear to have the greatest impact. Unfortunately, few evaluations compare community-managed programs with non-Indigenous managed programs to provide evidence on the effectiveness of Indigenous community-led and designed programs.82 Therefore, there exists a paradox: small-scale, locally-based programs are less likely to be evaluated, but when they are evaluated they often have the best outcomes.83 Yet problems can arise when government or NGOs try to scale up and replicate these types of community-initiated programs. If programs are responsive to the needs of individual communities, any metrics recorded may not be readily compiled or compared with those from other programs.

Other researchers have also struggled to find examples of best practice in Indigenous evaluation and program delivery that could be replicated. Mark Moran, author of the book Serious Whitefella Stuff, states he spent 12 months looking for a standard of evidence to sort through the complexity of Indigenous program delivery to find what he calls "the best performers and team players."84 In examining the evidence base he assessed the following methodologies: "Randomised Control Trials; reverse cross-over (quasi-experimental) design; comparative case study analysis; process tracing; Bayesian analysis and fiscal ethnography." He concluded that too many programs were being implemented for too few people and that, as a result, it was difficult to find people who had not been "treated" to form a control group.85

However, this does not mean government, or anyone involved in the delivery of Indigenous programs, should not evaluate Indigenous programs. Without some sort of evaluation and accountability measures to track what is happening to the money spent on these programs, it is impossible to know whether the lack of progress in improving outcomes is because there is not enough money relative to need, or whether the funding for Indigenous programs and services is being wasted. Moreover, it is all very well to say that successful programs involve community involvement and buy-in, but how do you achieve this in communities resistant to change? Implementation science is a term increasingly being used to describe the field of study which examines the individual, organisational and community influences surrounding the implementation process of programs and the gaps between research and practice. Unfortunately, rigid funding guidelines often prevent flexibility in implementation timelines and innovation in program design and delivery. People, and by extension programs, are not like an assembly line. Cookie-cutter solutions do not tend to work. So while it is vital that government sets objectives for programs, it should not be overly prescriptive in how those objectives are achieved. Where there are national or state-wide programs, there needs to be a balance between maintaining program fidelity and allowing flexibility for local contexts. In this context, a developmental evaluation approach may be helpful, as the main focus of this type of evaluation is understanding the activities of a program and how the program operates in different environments.
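The budget arithmetic behind the point about evaluation costs can be checked with a quick calculation. This is a rough sketch only, using the $382,000 average evaluation cost, the $10 million annual evaluation budget and the approximately 1000 federally funded programs cited in the text:

```python
# Rough check: how far does $10m a year stretch at the average evaluation cost?
AVERAGE_EVALUATION_COST = 382_000    # dollars, figure cited in the report
ANNUAL_EVALUATION_BUDGET = 10_000_000
PROGRAM_COUNT = 1000                 # approximate number of federally funded Indigenous programs

evaluations_per_year = ANNUAL_EVALUATION_BUDGET // AVERAGE_EVALUATION_COST
share_of_programs = evaluations_per_year / PROGRAM_COUNT

print(evaluations_per_year)          # 26 evaluations a year
print(f"{share_of_programs:.1%}")    # 2.6% of programs
```

On these figures, formal external evaluation could reach only around 26 programs a year, which supports the report's argument for embedding cheaper self-evaluation and reflective practice into program delivery.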

Recommendations

The overarching recommendation of this report is:

There must be co-accountability for government funded programs

Organisations receiving funding must be held accountable for how they have spent the money and whether the program has achieved its desired outcomes, and the government agency must be held accountable for monitoring whether the organisation is meeting its objectives and working with them to improve their practices if they have not.

This approach is different from traditional ideas of accountability and involves moving away from simply monitoring and overseeing programs to supporting a learning and developmental approach to evaluation.86 A two-way, learning by doing approach to evaluation, with regular feedback loops, will help to ensure both government and program providers keep each other honest.

In recognition of this co-accountability, the following table presents recommendations for both policy makers/program funders and program providers.

Table 8: Recommendations

Embedding evaluation into program design and practice

Policy makers/program funders: Evaluation should not be viewed as an 'add on' but should be built into a program's design and presented as part of a continuous quality improvement process. Where funding constraints do not allow for an external evaluation, funding should be provided to organisations for self-evaluation.

Program providers: Evaluation should not be viewed as a negative process but rather as an opportunity to learn. If your organisation does not have the capacity to hire an external evaluator, consider hiring a professional evaluator to help with the development of an evaluation framework and for some advice/training in undertaking self-evaluations.

Developing an evidence base

Policy makers/program funders: Regular feedback loops with a process for escalating concerns should be part of the data and monitoring process to ensure data being collected is used to inform practice and improve program outcomes. Develop a co-accountability framework and consider providing funding for an online data management system for data collection, which will make it easier for program providers to enter and share data.

Program providers: Documenting how you have achieved the program's objectives through regular collection and analysis of data is important, not only for providing a stronger evidence base for recurrent funding but also to improve service delivery and ensure client satisfaction with the program. Consider using an online data management system for data collection, which will make it easier for staff to enter and share data.

Questions to ask before implementing/delivering a program

Policy makers/program funders:
• What is the program trying to achieve?
• Is the program needed?
• Is there community support for the program?
• Is there an existing program already addressing a similar need?
• What is different about this program?
• Who will implement the program?

Program providers:
• Do you think the program's objectives meet the needs of the community?
• Do you think the community will support this program?
• What is different about this program?
• What staff will you need to deliver this program?

Questions to ask before evaluating a program

Policy makers/program funders:
• What is the program trying to achieve AND how will you measure whether it is meeting this objective — are the program's objectives measurable?
• What type of data are you able to collect to monitor the effectiveness of the program?
• Is there existing data (e.g. administrative data/ABS data) that could be used to measure change/impact?
• How will you collect the data — what methodology will be used to collect the information, i.e. surveys/interviews?
• Who will collect the data/undertake the evaluation?

Program providers:
• What is the program trying to achieve AND how will you measure whether it is meeting this objective — are the program's objectives measurable?
• How will you collect the administration data needed to measure the impact of the program? For example, will there be an online database for staff to add data to, or will they be required to enter program data into an Excel spreadsheet? How could you make this process as streamlined as possible?
• How will you show evidence of the program's impact? e.g. will you undertake pre-admission surveys and post-exit surveys of participants in the program?

Appendix A

Evaluation of CBA Programs

Table 9 below analyses the appropriateness of the methodology of 5 recent CBAs which specifically measure programs aimed at helping Indigenous people in Australia.

Table 9: Evaluation of CBA reports*

Yuendumu Mediation and Justice Committee
Description: An Indigenous designed conflict mediation system
Measurable objectives: No | Identification of options: No | Proper quantification: Yes | Sensibility analysis: Yes | Equity implications: Yes | Discount rate: Yes
NPV calculations: Yes — the aggregate NPV was estimated as $14,163,000 in 2014 dollars.
Findings: The project returned economic benefits that exceeded its economic costs, with a benefit-cost ratio of 4.3. Full list of assumptions not provided.

Healing Foundation
Description: Establishment of 13 Indigenous healing centres for spiritual healing
Measurable objectives: No | Identification of options: No | Proper quantification: Yes | Sensibility analysis: No | Equity implications: No | Discount rate: No
NPV calculations: No
Findings: The CBA is prospective; the available evidence from similar initiatives indicates the healing centres could return, on average, a benefit to cost ratio of 4.1. Full list of assumptions not provided.

Australian Indigenous Mentoring Experience (AIME)
Description: A program providing education, training and employment for Indigenous Australians as well as protecting and conserving the environment
Measurable objectives: Yes | Identification of options: No | Proper quantification: Yes | Sensibility analysis: Yes | Equity implications: No | Discount rate: Yes
NPV calculations: Yes — the net benefit of the AIME program in 2012 was $38 million based on a 7% discount rate (benefits are $58 million while costs, including costs of education, are $21 million).
Findings: For each $1 spent, $7 in benefits is generated for the economy. Full list of assumptions not provided.

Opal Unleaded Petrol
Description: Strategy to address petrol sniffing
Measurable objectives: No | Identification of options: Yes | Proper quantification: Yes | Sensibility analysis: Yes | Equity implications: No | Discount rate: No
NPV calculations: Yes — scenario results range from a net gain of $4.3 million in the worst case to a net gain of $50.8 million in the best case.
Findings: Base case petrol sniffing benefits of $53.7 million per annum and base case Opal rollout costs of $26.6 million, producing a net gain of $27.1 million. Full list of assumptions provided.

Working on Country
Description: A program providing education, training and employment as well as protecting the environment
Measurable objectives: No | Identification of options: No | Proper quantification: No | Sensibility analysis: No | Equity implications: No | Discount rate: No
NPV calculations: Yes — the true cost of the program is estimated to be at least 17 to 23% lower than the book cost.
Findings: In 2009-10 the budget or book cost of Working on Country was $41.2 million; however, the true cost was found to be between $32 million and $34 million. Full list of assumptions not provided.

*Some of the programs had an evaluation report and a CBA.

Source: Access Economics, 2006; Allen Consulting Group, 2011; Daly, A. and Barrett, G., n.d.; Deloitte Access Economics, 2013, 2014; PWC, 2015; KPMG, 2013.
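The CBA metrics assessed in Table 9 — discounted benefits and costs, net present value (NPV) and the benefit-cost ratio (BCR) — all reduce to a short calculation. The sketch below uses hypothetical cash flows and the 7% discount rate mentioned for the AIME CBA; none of the dollar figures are drawn from the evaluated programs:

```python
# Sketch of core CBA arithmetic: discount annual benefits and costs to
# present value, then compute NPV and the benefit-cost ratio (BCR).
def present_value(cash_flows, rate):
    """Discount a list of annual cash flows (year 1 onwards) at `rate`."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

DISCOUNT_RATE = 0.07                 # 7%, the rate used in the AIME CBA
benefits = [2_000_000] * 10          # hypothetical: $2m of benefits a year for 10 years
costs = [500_000] * 10               # hypothetical: $0.5m of costs a year

pv_benefits = present_value(benefits, DISCOUNT_RATE)
pv_costs = present_value(costs, DISCOUNT_RATE)

npv = pv_benefits - pv_costs         # a positive NPV means benefits exceed costs
bcr = pv_benefits / pv_costs         # a BCR above 1 means each $1 of cost returns >$1

print(f"NPV: ${npv:,.0f}, BCR: {bcr:.1f}")
```

A reported figure such as "a benefit-cost ratio of 4.3" is simply this BCR; the choice of discount rate and the completeness of the quantified benefits (the criteria Table 9 scores) determine how credible that ratio is.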

Evaluation of SROI Programs

Table 10 indicates the methodologies used in a SROI report and analyses six SROI reports that measure the social impact of different programs aimed at helping Indigenous people in Australia. Overall, as SROI is still an evolving field, there is some variability in the comprehensiveness of the methodology used in the different reports. The table analyses which methodologies have been included and which have been omitted from each of the different reports.

Table 10: Evaluation of SROI reports

Kanyirninpa Jukurrpa (KJ) Ranger program
Key features: teams of ranger employees, trips (Return to Country) and culture and heritage programs
Measurable objectives: Yes | Quantification: Yes | Sensibility analysis: Yes | Inputs/Outputs: Yes | Impact in context: Yes | Discount rate: Yes | NPV calculations: Yes
Findings: The SROI ratio equates to 3:1, with an estimated $55m in social value generated compared with the $20m investment. The SROI is a forecast, not actual.

Helping Hand and Linking Youth
Key features: intensive case management to Aboriginal youth to address the underlying causes of offending
Measurable objectives: Yes | Quantification: Yes | Sensibility analysis: Yes | Inputs/Outputs: Yes | Impact in context: Yes | Discount rate: No | NPV calculations: Yes
Findings: An investment of $1.1m in the 2012 calendar year created $7.5m of social and economic value, a SROI ratio of 6.7:1, or $6.70 for every $1. Cost was approx. $3,500 per participant in 2012.

Ganbina
Key features: organisation working with Aboriginal young people through their school years and beyond, to make sure they get the right education, training, job experience and life skills
Measurable objectives: Yes | Quantification: Yes | Sensibility analysis: Yes | Inputs/Outputs: Yes | Impact in context: Yes | Discount rate: Yes | NPV calculations: Yes
Findings: The SROI was estimated to average $4.41 for every dollar of revenue; however, questionable methodology was used.

Supply Nation Certified Suppliers
Key features: connects Indigenous owned businesses with opportunities in corporate supply chains
Measurable objectives: No | Quantification: Yes | Sensibility analysis: No | Inputs/Outputs: No | Impact in context: No | Discount rate: No | NPV calculations: Yes
Findings: From an annual investment of $448,000, it was forecast that $1.89 million will be created in social value in one year.

National Australia Bank: Indigenous Money Mentor Program
Key features: initiative aimed at building the financial capacity of Indigenous people, through face-to-face practical financial support and access to microfinance
Measurable objectives: No | Quantification: Yes | Sensibility analysis: Yes | Inputs/Outputs: Yes | Impact in context: No | Discount rate: No | NPV calculations: Yes
Findings: Forecast SROI ratio from 2014.

NAB/Indigenous Trainees Program
Key features: school-based traineeships and internships for Indigenous Australians to pursue careers in financial services
Measurable objectives: No | Quantification: Yes | Sensibility analysis: No | Inputs/Outputs: No | Impact in context: No | Discount rate: No | NPV calculations: No
Findings: A SROI over 3 years of 1:2.71 for school-based traineeships and 1:3.14 for fulltime traineeships.

Source: Ravi, A. and Siddiqi, T., 2013; Social Ventures Australia, 2013, 2014a, 2014b, 2014c, 2014d; Burton, R. and Tomkinson, E., n.d.; EY, 2014.
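An SROI ratio is computed the same way as a benefit-cost ratio, with monetised social value in the numerator. The following minimal sketch shows the arithmetic; the investment figure and the monetised outcomes below are illustrative only, not taken from the reports in Table 10:

```python
# Sketch of an SROI calculation: monetise each outcome, sum the social
# value, and divide by the investment to get a ratio such as "3:1".
investment = 1_000_000            # hypothetical program investment

# Hypothetical monetised outcomes (outcome -> dollar value of social benefit)
monetised_outcomes = {
    "reduced contact with justice system": 1_800_000,
    "improved school attendance": 700_000,
    "increased employment": 500_000,
}

social_value = sum(monetised_outcomes.values())
sroi_ratio = social_value / investment

print(f"SROI ratio: {sroi_ratio:.1f}:1")   # 3.0:1 — $3 of social value per $1 invested
```

The variability Table 10 notes between reports comes largely from how each outcome is monetised and whether forecast or actual values are used, which is why a forecast SROI should be read differently from a measured one.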

Summary of Evaluations/Case-studies/Audits

Table 11: Summary of Evaluations/case-studies/audits

1. Crime — Circle Sentencing Program (NSW Attorney General's Office) — Evaluation Report (Cultural & Indigenous Research Centre Australia, May 2008)
The Circle Sentencing Program aimed to provide an alternative sentencing court for adult Aboriginal offenders that more directly involved Aboriginal people in the sentencing process.
Type of evaluation/methodology: The evaluation primarily adopted a qualitative approach, involving interviews and group discussions with 115 people and a literature review of trends with regard to Australian and Indigenous sentencing programs and existing evaluations. There was however some analysis of existing data on Circle Sentencing offenders and Local Court data. Quantitative data was collected from local police sources. Limitations listed included small sample sizes and a lack of control groups.
Key findings: The evaluation found the program met most of its objectives, and that the "general/stakeholder consensus" was that the program was 'culturally appropriate' and a success. However, the report also states that data from the NSW Bureau of Crime Statistics and Research shows that the program did not have an impact on rates of recidivism, which was also one of the main goals. There was a lack of quantitative data, which was attributed to the program taking place in different places with different data collection methods. Furthermore, in evaluating re-offending, it was found that the Circle groups were more likely to reoffend than the control group.
Rating of evidence: Moderate — although the qualitative data was of high quality, there was a lack of consistent quantitative data. Data on victim participation was not collected consistently.

2. Crime — Queensland Indigenous Alcohol Diversion Program (QIADP) — recidivism study (Specialist Courts and Diversion, Legal Services Branch, November 2010)
QIADP was a voluntary treatment program for Aboriginal people who have been arrested and appeared in court for alcohol related offences.
Key findings: Rates of recidivism improved while offenders were participating in the program. However, findings also showed that there was no drop in the severity of reoffending. The mixed results raised questions about the success of the program. Findings were deemed inconclusive.
Rating of evidence: Moderate — despite the lack of quantitative data, the data that was used in the report was analysed thoroughly in numerous ways. Analysis included different types of offences and types of crimes committed as well as different time frames. In conclusion, the handling of the data was quite strong, but due to the small sample sizes and lack of control groups, the rating can only be judged as 'moderate'.

3. Crime — The Tiwi Islands Youth Development and Diversion Unit (TIYDDU) (NT), based in Wurrumiyanga — evaluation (Stewart et al 2014; CTGCH 2014)
The program started in 2003 and consists of a 12-week diversion program for first-time youth offenders from the Tiwi Islands.
Type of evaluation/methodology: Evaluation involved qualitative and quantitative data analysis including interviews, document analysis, observations and analysis of administrative data.
Key findings: The evaluation found the program was effective in reducing adverse contact between Tiwi youth and the criminal justice system. Individual re-offence data from NT Police for program participants showed that 20% of participants (13 of 65 young people) had contact with the police for alleged offences in the year following commencement with the program — below what would be expected for this population without the intervention.
Rating of evidence: Moderate — data was not available for comparisons with youth that did not participate in the program during this time, and further assessment of the outcomes from this program by comparing with a similar group who did not participate would be desirable. School attendance data for individuals were requested but unable to be provided due to confidentiality and small numbers.

4. Crime — Yuendumu Mediation and Justice Committee program — CBA report (Daly and Barrett, n.d)
The Yuendumu Mediation and Justice Committee program has been operating since 2011 and draws upon traditional Warlpiri dispute resolution and relationship-sustaining practices to strengthen family relationships, promote family safety and develop strategies that address community violence.
Type of evaluation/methodology: The evaluation used a desktop review and interviews with key stakeholders and key documents to identify the economic costs and value of the benefits of the Yuendumu Mediation and Justice Committee, using an application of Cost Benefit Analysis.
Key findings: The project was found to have reduced and prevented violence by ensuring conflicts are settled peacefully and quickly. The Net Present Value in 2014 was $14.1 million with a benefit-cost ratio of 4.3.
Rating of evidence: N/A

5. Crime — Marapai Ngartathati Murri Womens Group and Yurru Ngartathati Mens Group Indigenous justice program — evaluation summary (James Cook University, November 2013)
The diversionary bail rehabilitative program targets Indigenous offending in Mount Isa, Queensland.
Type of evaluation/methodology: The evaluation involved talking to people who had taken part in the men's and women's groups and other community people involved with the program. The report consisted of three pages providing a summary of the evaluation findings.
Key findings: The findings of the evaluation briefly outlined what was working well with the program, the impact of the program on people and what needed to improve.
Rating of evidence: Weak — only the evaluation summary was available, so it is difficult to draw conclusions. Although people reported positive benefits from taking part in the program, there was very little evidence of the effectiveness of the program beyond anecdotal evidence.

6. Crime — On The Road Lismore driver education — evaluation report (Lismore Adult Community Education (ACE))
On the Road is a comprehensive driver education program that targets Aboriginal people living in the Far North Coast of NSW. The overall goal of the program is to reduce the over-representation of Aboriginal people living in the Far North Coast of NSW in the criminal justice system.
Type of evaluation/methodology: Literature review. Quantitative: statistics from the ACE enrolment and BOCSAR databases. Qualitative: 18 face-to-face interviews, 5 phone interviews and focus groups with 55 participants. No control group.
Key findings: The report states that the program has been successful in accessing its target audience and increasing awareness of its available resources. The report also states that there has been a decline in the number of driving offences for the targeted Aboriginal population; however, without a control group this cannot be attributed to the program. Furthermore, the lack of routinely collected data in the area (lack of identification of Aboriginality in RTA road crashes) makes it impossible to link improvements to the program.
Rating of evidence: Weak — the structure of the report is good and there were clear objectives. However, the report is only able to confirm that awareness in the region had been improved. The core objectives (reducing the number of driving offences and individuals coming into contact with the criminal system) could not be adequately accounted for.

7. Culture — Vibe Australia suite of products — evaluation (KPMG, 2013)
Vibe Australia is a NFP organisation which aims to improve the well-being of Indigenous people through national communications, media and events services.
Type of evaluation/methodology: Evaluation methodology consisted of: desktop research; a reader survey of Vibe's Deadly Vibe magazine readers (3,500 surveys were distributed); an online participant survey; focus groups; and telephone interviews.
Key findings: The majority of funding received by Vibe (89%) is expended on program delivery rather than administration. Vibe also generated additional funding as income from corporate entities, and as a result Vibe was able to match the federal government's contribution of $2.3 million to generate $4.6 million for its suite of products. Overall, while there was some variability between the products and integration between products could be improved, the evaluation found they provide value for money and a unique service.
Rating of evidence: Moderate — the methodology was comprehensive with a range of qualitative and quantitative methods utilised. However, the number of people who responded to the surveys and participated in the interviews was not provided.

8. Culture — The Yiriman Project (WA) (Palmer, D, 2013)
The Project is a cultural immersion program that consists of trips 'on country', where elders teach young people at risk of offending about their cultural heritage.
Type of evaluation/methodology: Case study/qualitative study. Methodology consisted of:
• a literature review, including written and verbal feedback from community members;
• direct participation and observation from field trips, including extended visits of two months during 2009 and 2010;
• interviews with staff; and
• review and appraisal of media articles.
Key findings: The qualitative study found that the program builds young people's confidence and improves their self-worth, and is considered to have helped curb suicide, self-harm and substance abuse in the participating communities.
Rating of evidence: N/A

9. Culture — Indigenous hip hop projects — Evaluation report of Indigenous Hip Hop Projects (Kurongkurl Katitjin, Centre for Indigenous Australian Education and Research, Faculty of Education and Arts, Edith Cowan University, 2009)
The IHHP is a partnership with Beyond Blue to provide workshops that combine Indigenous culture and hip hop to promote self-confidence and positive expression.
Type of evaluation/methodology: Impact evaluation of Indigenous Hip Hop Projects (IHHP) on young people in selected sites in the Kimberley and Pilbara regions of WA. The evaluation was conducted over three stages, using a combination of qualitative methods including questionnaires, one-on-one interviews and focus groups.
Key findings: The evaluation found that young people appeared to respond well to the health promotion messages of IHHP. There was some recall of the messages relating to depression and self-respect, as well as the key messages of look; listen; talk; and seek help. Young people also expressed "feeling good about themselves" as a result of some of the IHHP activities. However, the evaluation recommended that Beyond Blue should consider programs complementary to IHHP that specifically target education about depression and anxiety.
Rating of evidence: Weak — the evaluation sought to use a range of qualitative methods but was let down by a lack of quantitative data and an overreliance on anecdotal evidence.

10. Culture — Our Men Healing: Creating hope, respect and reconnection — executive summary of evaluation (Healing Foundation, 2015)
Our Men Healing comprises three pilot healing projects in the remote Northern Territory communities of Maningrida, Ngukurr and Wurrumiyanga.
Type of evaluation/methodology: Evaluation focuses on implementation and early development of the programs. The executive summary did not describe the type of methodology used in the evaluation.
Key findings: Key program achievements include a reported decrease in the incidence of family and domestic violence and less violence generally in communities. In one community there was a reported 50% reduction in the number of men registered with the NT Department of Correctional Services and a significant reduction in rates of recidivism and reoffending, as well as increased levels of cultural practice.
Rating of evidence: Weak — methodology not described and only executive summary available. Program achievements were very positive but more evidence is needed to show causality. No control group.

11. Culture — Army Aboriginal community assistance program — audit (Australian National Audit Office (ANAO), 2010)
The Army Aboriginal Community Assistance Program (AACAP) aims to develop and upgrade environmental health infrastructure in remote Aboriginal and Torres Strait Islander communities.
Evaluation/methodology: Audit. The audit assessed FaHCSIA's management of AACAP and how the Department monitors the contribution the program is making to the improvement of primary and environmental health, and living conditions, in remote Indigenous communities.
Key findings: Since 1997, the program has implemented projects in 35 discrete locations covering 20 communities in the Northern Territory, South Australia, Western Australia and Queensland. Overall, the audit found the program represented good value for money and that the program was consistent with COAG's objectives.
Rating of evidence: N/A

12. Education — Animal Management in Rural and Remote Indigenous Communities Inc. (AMRICC) Animal Management Worker Program — evaluation report (Regina Hill, Regina Hill Effective Consulting Pty Ltd, August 2014)
The project aimed to shift responsibility and capability back into the hands of the local community by employing Aboriginal animal management workers (AMWs) to help improve companion animal health and control in those regions.
Evaluation/methodology: The type of data was almost entirely quantitative, comprising tables and figures showing the number of jobs/positions created, the number of animals treated, and how long each person had retained their position/employment.
Key findings: The report suggests some improvement in skill development, although long-term employment opportunities/occurrences are lacking, mainly due to personal issues (as reported by the participants). It was noted that there were often problems with too little work to actually keep participants busy during the week.
Rating of evidence: Moderate. The data captured seems appropriate and accurate and was reported on a continuous basis.

13. Education — AIME Outreach Program — evaluation (KPMG, 2015)
The AIME (Australian Indigenous Mentoring Experience) Program was established in 2005. The goals of the program are to improve the retention rates of Aboriginal and Torres Strait Islander high school students to year 12 and to connect Aboriginal and Torres Strait Islander students to post-school university and employment.
Evaluation/methodology: An independent evaluation was undertaken in 2012 to evaluate the AIME Outreach Program in comparison to the Core Program. The evaluation included a mixed-method design incorporating: observation of program delivery; interviews with program facilitators, mentors and mentees; review of AIME documentation; and a quantitative survey of mentees.
Key findings: The evaluation found that AIME and AOP are achieving positive results. In its first year of operation, the AOP reached its objective of encouraging better school grade progression rates for Aboriginal and Torres Strait Islander students, compared with the national average. In 2015, AIME connected approximately 5700 high school students with 1900 university student volunteers across 18 universities, in all mainland Australian states and the ACT.
Rating of evidence: Moderate. A mixed-method evaluation design utilised a range of data sources to enable findings to be cross-checked and verified with other data.

14. Education — Let's Start: Exploring Together — final evaluation report (Charles Darwin University, 2009)
An early intervention program which aims to help parents and young children deal with emotional issues and challenging behaviour. Therapeutic support is provided to help children's social and emotional development before they start school.
Evaluation/methodology: Strong and thorough analysis of both quantitative and qualitative data. Full sources of data attached in the appendix. No control group.
Key findings: The report states that there were marked improvements in the behaviour of both children and parents. The most improvement was seen in urban indigenous girls and urban non-indigenous boys.
Rating of evidence: Moderate. The evaluation methodology has a sound framework. However, it does not list the percentage of participants who completed the entire 12-week program; rather, it only lists participants who completed four or more sessions, which is 43% for Tiwi people and 10% for urban indigenous people. The original sample size is 110 (of which 47 are non-indigenous), so it is a very small sample pool. In addition, the program was delivered using different methods in the different locations. However, many types of analysis have been applied in the report, resulting in the moderate rating.

15. Education — The national Care for Kids' Ears Campaign — evaluation report (CIRCA, 2013)
This program is funded by the federal government under a four-year agreement to improve eye and ear health services for Indigenous people.
Evaluation/methodology: The evaluation used the following methodology: interviews and an online survey with education and health professionals; case studies based on focus groups/interviews with parents/carers; and a quantitative survey conducted with mothers and female carers of children aged 0-5 years (baseline in July 2011 and follow-up from November 2012 to February 2013), with a sample size of n=200 in each round.
Key findings: The evaluation found that the Campaign had a positive impact on awareness of ear disease among Aboriginal and Torres Strait Islander communities, including increased knowledge of symptoms and prevention, and increased help-seeking behaviours, as evidenced through a follow-up survey of 200 mothers/carers 18 months after the campaign launch.
Rating of evidence: Moderate. The methodology was sound, with a baseline survey and a control group used to measure the impact of the national campaign. However, there was potential for bias due to the non-representative nature of the sampling; in particular, the choice of locations was not randomised. As a result, the findings are not generalisable to the wider target population of Aboriginal and/or Torres Strait Islander mothers and carers of children aged 0 to 5 years in Australia.

16. Education — Australian Electoral Commission's (AEC) Indigenous electoral participation program — evaluation, Volumes 1 and 2 (AEC, 2012); Volume 2 contains the case studies
The IEPP program is aimed at empowering Aboriginal and Torres Strait Islander Australians in exercising their right to vote.
Evaluation/methodology: The evaluation methodology included: literature scan and document review; semi-structured interviews with staff; focus groups; case studies with a cross-section of communities; and analysis of available data from the Queensland and Northern Territory elections.
Key findings: The evaluation found variation in the degree to which IEPP's stated objectives and outcomes have been achieved. The evaluation recommended basing future changes to the program on evidence of 'what works', and harnessing the experiences of other government agencies and programs working in an Indigenous context. This includes the adoption of a robust monitoring and evaluation system and routine analysis of performance data.
Rating of evidence: Moderate. Methodology was robust and included a Monitoring and Evaluation Framework and triangulation of qualitative and quantitative data. Information on the number of people who participated in interviews/focus groups was provided, and interview guides were provided in the appendix. However, behaviour changes were difficult to identify, as ethnicity is not recorded on the electoral roll or when people vote. Performance data for the program was also not entered uniformly or consistently by States and Territories. Lack of preparation, capacity and training impacted on qualitative analysis of Field Officer journals.

17. Education — Australian Indigenous Education Foundation (AIEF) — annual reports/reviews
The AIEF is a private, non-profit organisation that provides scholarships to enable Indigenous students to attend private boarding schools. The programs also provide mentoring and career support to encourage Indigenous students to attend university. There are two core programs: the AIEF Scholarship Program and the AIEF Pathways Program.
Evaluation/methodology: Annual reports/reviews.
Key findings: In 2015 the number of students attending school and university on AIEF Scholarships exceeded 500 for the first time. School students supported by the program achieved a 93% retention and Year 12 completion rate, and 96% of tertiary scholarship students continued or completed their university studies during the year.
Rating of evidence: N/A

18. Education — Ganbina — Impact Assessment (SVA, 2016)
Ganbina was established in 1997 to help improve school completion rates and further education and 'real' job prospects among about 6000 Indigenous people in the Goulburn Valley in Victoria. Ganbina is unique in that it is 100% funded by corporate and philanthropic foundations. The cost per participant averages $3,300 a year.
Evaluation/methodology: The purpose of the impact assessment was to assess Ganbina's cumulative impact since 2005. The methodology consisted of stakeholder consultations and client interviews, a review of Ganbina's data, desktop research on relevant contextual and environmental factors, and data collected from previous evaluations and assessments of Ganbina.
Key findings: Year 11 to Year 12 retention rates increased from 62% in 2009-10 to 73% in 2015-16, far surpassing the Greater Shepparton Indigenous and national Indigenous rates. All participants aged 25 to 34 years who had been with Ganbina for five years or more had attained Year 12 or an equivalent qualification. University participation increased from two Ganbina participants in 2009 to 15 in 2016.
Rating of evidence: Strong. A range of data sources was utilised and a review of previous evaluations was undertaken to provide a longitudinal assessment of the program's impact.

19. Education — Creative Recovery Pilot Program — an evaluation report (for the Department of Science, Technology, Innovation and the Arts, conducted by NSF Consulting, December 2012)
The Creative Recovery project teaches arts skills to Indigenous people with mental health issues to help improve their emotional well-being.
Evaluation/methodology: The methodology comprised solely qualitative data: stakeholder consultations, in-depth interviews, participation in the Creative Recovery National Forum, and case studies. The evaluation was conducted one month before program completion.
Key findings: The conclusions were that the pilot project contributed to a "growing body of evidence" that arts-led processes are beneficial for recovery in the community. However, with the evaluation data coming solely from program workers and associated artists, the conclusion is on shaky grounds. There was no data gathered from the wider community.
Rating of evidence: Weak. Although the evaluation concluded that the program met its objectives, there was no cost-benefit analysis provided, other than measuring success as producing the desired number of events and other programs. In other words, by completing/producing enough programs, it was stated to be a success. The appendices clearly show that the only stakeholders consulted were internal (workers, consultants and others who personally benefitted from the project).

20. Education — Indigenous Youth Leadership Program — an evaluation report (Jobs Australia Foundation, prepared by EMS Consultants, October 2011)
The program aims to help increase young Indigenous people's leadership skills and for them to become positive role models in their communities. The evaluation covers a period of two years, 2010 and 2011.
Evaluation/methodology: A broad variety of quantitative and qualitative methods was used to evaluate. However, the sample pool is small (22 participants; mentees and mentors). A drawback is that most of the evaluation is based on "self-reporting" — asking the participants questions at the end of the program about their attitudes and feelings of satisfaction.
Key findings: The evaluation report concludes that the program was a success. However, it states that more effort needed to be put into finding appropriate mentors, as many of them were ill-equipped to perform their duties as outlined in the program. While leadership skills were one of the main objectives of the program, not all participants reported that their leadership skills had improved.
Rating of evidence: Weak. Due to a heavy focus on self-assessment, and no follow-up, many of the learned outcomes cannot definitively be attributed to the activities of the program. Examples of the questionnaires were attached as an appendix, but not the answers/data.

21. Education — The Sporting Chance Program — audit report (Office of Evaluation and Audit (Indigenous Programs), OEA, 2009) and an evaluation (Lonsdale et al, 2011)
The program uses sport and recreation to help improve educational outcomes for Indigenous students.
Evaluation/methodology: The objective of the performance audit was to assess the performance of academies funded under the Sporting Chance Program and DEEWR's management of the program. The audit did not assess the education performance of the program's engagement strategies component, as this part of the program was not introduced until 2008. An evaluation of the Sporting Chance Program (Lonsdale et al, 2011) was also conducted.
Key findings: The audit concluded that the academies' efforts were directed toward achieving intermediate outcomes — that is, helping students to come to school and improving their behaviour and engagement in the classroom. Anecdotal evidence suggested the program had a positive impact on the educational experiences of Indigenous students.
Rating of evidence: Weak. As there was limited performance data available, the auditors were unable to comment on the extent to which academies had improved students' enrolment, attendance, retention and engagement. The evaluation did not include comparison results for schools and students not in the program.

22. Education — School Nutrition Program — stakeholder survey (Dep. of Education, Employment and Workplace Relations, 2009)
The program provides breakfast and/or lunch to school-aged children from remote communities of the Northern Territory, to help encourage them to attend school and to engage in learning.
Evaluation/methodology: Qualitative stakeholder survey only. No control group.
Key findings: Of the 143 schools contacted as part of the evaluation, 87 (61 per cent) participated. School staff considered the program to have a moderate impact. The conclusion of the survey is one of overall satisfaction with the program, as measured by the self-assessment of various stakeholders.
Rating of evidence: Weak. The data included in the report is purely qualitative. It emphasises statements made by stakeholders (providers, parents, etc) such as 'attendance has improved'; however, it provides no statistical data as evidence.

23. Education — The MoneyMob Talkabout (MMT) program — an evaluation (Garner, S and Pryor, A, 2015)
The MMT program was established in 2012 in the Amata, Mimili and Pukatja communities of SA and assists people in remote Aboriginal communities to manage their money. Teams also visit other remote communities in SA, WA and the NT, providing services on an outreach basis.
Evaluation/methodology: No methodology section provided in the report; however, the methodology appeared to consist of surveys administered through field trips and analysis of program data.
Key findings: The evaluation found that, while many community members still had complex barriers to improving their financial literacy, MMT clients were more likely than non-MMT clients to have developed basic financial management skills.
Rating of evidence: Weak. The evaluation would have been improved by a detailed methodology section in the report and a discussion of the limitations of the methodology.

24. Health — Growing Strong — Feeding You and Your Baby (Queensland Health, 2009)
The program provides information on healthy food to eat during pregnancy.
Evaluation/methodology: Quantitative: distribution and promotion of created content. Qualitative: surveys with health workers and client discussion groups. The goal of the evaluation was to evaluate the usage of Growing Strong resources by health workers and other health professionals throughout Queensland, and to obtain client feedback. No control group.
Key findings: Good feedback was generally given on the utilisation of the Growing Strong material, with 9 out of 10 health workers responding that they had used the materials. Positive feedback was also given on the quality of the materials, and the organisation had 65 individual requests for more resources/materials, mainly from other indigenous-run NGOs.
Rating of evidence: Moderate. Health worker surveys had a 39% response rate and 69 clients participated in discussion groups. The feedback is analysed thoroughly; however, it must be emphasised that the evaluation is of the quality and utilisation of the materials, rather than whether the material made a positive difference amongst recipients.

25. Health — EON Thriving Communities Program — An Evaluation in Six Kimberley Communities report (KPMG, 2013)
The Edge of Nowhere Foundation (EON) delivers the program to promote healthier lifestyles in remote communities. The Thriving Communities Program is currently being delivered in 16 communities across WA — 13 in the Kimberley region and 3 in the Pilbara.
Evaluation/methodology: A process and outcomes evaluation of the program in six remote Indigenous communities. The evaluation used a case study approach to describe the evolution and impact of each of the Program components in all locations.
Key findings: The evaluation found 'EON takes a genuine community development approach that values long term engagement over rapid delivery…' Evidence suggested the program had increased people's knowledge of horticulture and healthy eating. However, the evaluation found that poor local governance and lapses in the administration of councils had made engagement with residents in some communities difficult.
Rating of evidence: Moderate. The evaluation design was relatively good, but community level data was not available for three of the communities.

26. Health — Petrol sniffing strategy — evaluation report (Origin Consulting, 2013)
The strategy aimed to reduce petrol sniffing and other types of substance abuse among Indigenous young people.
Evaluation/methodology: A Whole of Strategy Evaluation (WOSE) — a high-level review of the strategic implementation of the Petrol Sniffing Strategy (PSS) since its establishment in 2004/05. Literature and data review. Stakeholder consultations. Field visits to five of the six communities (with wet weather preventing the visit to the sixth community).
Key findings: Overall, and particularly through the roll out of low aromatic fuel (LAF) and youth services, the PSS has achieved a dramatic reduction in the prevalence of sniffing across much of remote Australia.
Rating of evidence: Moderate. The focus of the evaluation was on the management and implementation of the PSS; it did not assess the current level of sniffing to illustrate impact.

27. Health — Filling the Gap Indigenous Dental Program — final evaluation report (L. Pulver et al, University of NSW, December 2009)
'Filling the Gap' recruits volunteer dentists to address the chronic shortage of dental professionals available to Indigenous people in northern Queensland.
Evaluation/methodology: Literature review. Quantitative data from patient databases. In-person, telephone and group interviews.
Key findings: The report is very in-depth and analyses the program thoroughly, both in terms of service delivery and outcomes (with what data was available). The report states that the increase in the numbers of indigenous people accessing the services confirms the pressing need for publicly funded dental care in remote locations.
Rating of evidence: Moderate. Although the report does not utilise a control group or benchmark data to evaluate effectiveness, it does provide a very good overall, and in some areas detailed, analysis of the program, using several different metrics.

28. Health — Tri-State HIV/STI Project — Evaluation (Australian Institute for Primary Care, 2004)
The project aims to reduce the incidence and impact of sexually transmissible infections on Indigenous people living in remote areas on the borders of South Australia, Western Australia and the Northern Territory.
Evaluation/methodology: This project has been subjected to several evaluations since its implementation in 1994. The most recent evaluation included a quantitative analysis of data from the STI/HIV surveillance system, combined with a qualitative process evaluation.
Key findings: Key identified successes of the program were: enhanced integration of sexual health into local primary health care delivery; expanded and coordinated comprehensive case finding; sustained active regional commitment/approaches to STI control; and cross-border standardisation of sexual health procedures. However, more effective links to policy and planning development in the region were needed.
Rating of evidence: Moderate. Although the evaluation methodology was robust, the model for program performance was missing a set of key indicators against which activities could be assessed.

29. Health — Women's development project — evaluation (Fred Hollows Foundation, 2012)
The project aims to increase self-determination among Indigenous women living in East Katherine in the Northern Territory.
Evaluation/methodology: Strength-based evaluation methodology. Review of documentation and literature. Initial field trip and stakeholder interviews. Second field trip and stakeholder interviews.
Key findings: The evaluation found the program had 'increased the self-determination of women in the Jawoyn region'. The most important impact on the women's self-determination was the establishment of the Banatjarl Women's Council and the election of office bearers, which formalised the role of women's centres in speaking up for women and taking control of issues that affect women in the region.
Rating of evidence: Moderate. Although the methodology used was sound, as this was the first qualitative evaluation there was no formal baseline information regarding women's empowerment to draw on. High population mobility meant some of the key women were away from their communities during the time of fieldwork.

30. Health — Wurli-Wurlinjang diabetes day program — evaluation report (Centre for Remote Health, 2011)
The program aims to improve outcomes for clients with Type 2 Diabetes at Wurli-Wurlinjang Health Service by encouraging greater self-management of diabetes.
Evaluation/methodology: A program logic was used to help develop an evaluation framework. This framework allowed the program to be evaluated in terms of process, impact and outcome.
Key findings: The program logic enabled effective evaluation of the program and demonstrated that the program model is basically sound and generalisable to a number of other services. However, issues were identified with the program, such as a poor recall system, education sessions that participants were unable to understand, and medication compliance and transport issues.
Rating of evidence: Moderate. Although an evaluation framework was used, there was no discussion of the limitations of the methodology. Participant numbers were also not included.

31. Health — The Victorian Aboriginal Spectacles Subsidy Scheme — an evaluation (Vic. Dep. of Health & Australian College of Optometry, July 2012)
This program provides subsidies to Indigenous Victorians for spectacles and other visual aids.
Evaluation/methodology: Quantitative: service delivery data (from databases). Qualitative: service provider interviews, patient stories, community awareness. No control group.
Key findings: The number of consultations increased by 116% within the first year of operations. Anecdotal evidence from the interviews seems to confirm the increase in demand for the services.
Rating of evidence: Moderate. The evaluation framework is sound, but in-depth comparisons of the level of service provision pre- and post-commencement of the Scheme were not possible, as other providers had not been required to identify the Indigenous status of people accessing the Scheme. A key limitation of the evaluation was the amount of service delivery data, including baseline data, available to analyse whether the Scheme had achieved its intended outcomes.

32. Health — The Family Wellbeing Program — an evaluation report (Closing the Gap Clearinghouse, 2013)
The program aims to help Aboriginal people find new ways to cope with grief and loss. The program is being delivered in 56 sites across Australia.
Evaluation/methodology: A synthesis of seven formative evaluations of the program, which involved a total of 148 adult and 70 student participants.
Key findings: The synthesis found that the program had increased the capacity of participants to exert greater control over their health and wellbeing. There was no evidence presented of positive changes occurring at the broader, community level.
Rating of evidence: Moderate. Reviewing the seven evaluations allowed for trends over time to be identified. However, some of the methodology for the seven evaluations was limited; for example, for one evaluation the primary source of data was analysis of unedited reflective video footage from a one-day workshop for FWB participants.

33. Health — Strong Fathers Strong Families program (SFSF) — descriptive analysis of the program rather than an evaluation (Urbis, 2013)
The SFSF program aims to promote the role of Aboriginal fathers and other men in supporting their children and families, with a particular emphasis on antenatal care and early childhood.
Evaluation/methodology: Descriptive analysis rather than an evaluation. Document review of program data. Interviews and focus groups with staff, participants and stakeholders.
Key findings: Participation in the program delivered a number of positive outcomes for men, including increased self-confidence and education and employment opportunities. The report recommended developing a reporting template to capture data against KPIs regularly and consistently. This would also allow sites and the Department to monitor implementation and outcomes across sites, and measure change in participants over time.
Rating of evidence: N/A

34. Health — Good quick tukka: cook it, plate it, share it — evaluation of initial concept (QAIHC, no date)
The project teaches cooking skills with the aim of increasing the number of meals Indigenous people make at home.
Evaluation/methodology: Not a proper evaluation as such; more an evaluation plan. Data primarily came from feedback forms from people who had participated in the project, and from group interviews.
Key findings: From the qualitative data collected, there was an increase in knowledge about other people's cultural cuisines, an increased skill level, and a reported increase in confidence and self-esteem. One person found a job working in a kitchen after taking part in the project. Respondents from the group interview stated that they liked the recipes and wanted more sessions; 33% of the participants interviewed had 'cooked more often' and 71% had cooked the recipes again.
Rating of evidence: N/A

35. Health — Balunu Foundation Cultural Healing program — case study (Muru Marri, School of Public Health and Community Medicine, UNSW Australia, 2013)
The program provides Aboriginal young people with the opportunity to reconnect with their cultural and spiritual identities.
Evaluation/methodology: The case study used qualitative methods of data collection and analysis, including documentary review, in-depth interviews and participant observation of one of the healing camps.
Key findings: The program is delivered by a strong team of Aboriginal people who have amongst them a range of experiences, skills and knowledge, and a deep commitment to making the program work. The young participants relate well to the Aboriginal staff, who have had similar life experiences to them in the past.
Rating of evidence: N/A

36. Health — Ngala Nanga Mai pARenT Group — Case Study (Muru Marri, School of Public Health and Community Medicine, UNSW, and The pARenT Group Program, Children's Hospital, n.d.).
Description: The program aims to improve the health of young Aboriginal parents and their children by encouraging parents to take part in educational activities.
Methodology: The case study/review involved a participatory mixed methods approach including quantitative research processes, analysis of routinely collected program data and the collection of survey data. Qualitative research processes included analysis of focus groups, semi-structured interviews and testimonial data. Baseline quantitative data was compared with qualitative findings.
Key findings: Findings suggest the parents perceive the program to be valuable for their own personal development as well as their child's. The program incorporates important elements of successful programs for Aboriginal and Torres Strait Islander young people, such as creating a safe place and providing opportunities for people to develop their own strengths and skills etc… The authors argue more funding would enable the program to reach its potential and allow its impacts to be sustained.
Rating of evidence: N/A — however, the methodology used compares very favourably with those used in many of the evaluations. At the same time, a disproportionally large section of the report is spent justifying the methodology used rather than analysing the findings from the research. The study was also limited by the lack of robustness in such a small sample size, and by the fact that the review only assessed a single point in time. Ideally for an evaluation, measurements would be taken prior to starting (a pre-program baseline) and then at regular intervals after commencing, in order to detect changes in response.

37. Health — Aboriginal Maternity Group Practice Program (also known as Moort Boodjari Mia) — an evaluation (Christina Bertilone and Suzanne McEvoy, 2015).
Description: The AMGPP provides free antenatal and postnatal clinical care to pregnant Aboriginal women. Each client is supported by a team of health professionals during pregnancy and for four weeks after they have given birth. Support provided includes clinical care and cultural, social and emotional care and support.
Methodology: Non-randomised intervention study using data from the Western Australian Midwives Notification System. Regression models were used to analyse data from 343 women (with 350 pregnancies) who participated in the AMGPP and gave birth between 1 July 2011 and 31 December 2012. Methodology included historical and contemporary control groups of pregnant Aboriginal women matched for maternal age.
Key findings: Participation in the AMGPP was associated with significantly improved neonatal health outcomes. Babies born to AMGPP participants were significantly less likely to be born preterm: 9.1% versus 15.9% for historical controls.
Rating of evidence: Strong — regression analysis and matched control group.
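The 'Strong' rating for the AMGPP evaluation rests on comparing outcome rates against a matched control group. As a rough illustration only (the evaluators actually used regression models on matched data, not this test), a pooled two-proportion z-test on the reported preterm rates can be sketched in Python. The 9.1% rate over 350 pregnancies implies roughly 32 preterm births; the control-group size of 700 is a purely hypothetical assumption, since it is not given here:

```python
import math

def two_prop_ztest(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # standard error under H0
    z = (p1 - p2) / se
    pval = math.erfc(abs(z) / math.sqrt(2))        # two-sided normal tail
    return z, pval

# ~32/350 preterm births in the program (9.1%) vs 15.9% among matched
# historical controls; the control n of 700 is an illustrative assumption.
z, p = two_prop_ztest(32, 350, 111, 700)
```

With these assumed counts the difference is statistically significant at conventional levels, which is consistent with the evaluation's reported finding.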

38. Health — Family Violence Partnership Program (FVPP) — two evaluations: an evaluation of FVPP (Courage Partners, 2005) and an evaluation of FaHCSIA Family Violence Programs (Department of Finance and Deregulation, Office of Evaluation and Audit (Indigenous Programs), 2007).
Description: The program supports practical, grassroots initiatives to address family violence, sexual assault and child abuse.
Methodology: The first evaluation looked primarily at FVPP; the second looked at FVPP as part of a suite of Family Violence Programs delivered by FaHCSIA. The first evaluation involved a document review, key administrative data, stakeholder interviews, ten site visits and interviews/focus groups with participants. The second evaluation involved a literature review (meta review of government reports), consultation with key stakeholders, a survey of FaCSIA project managers, and program data analysis.
Key findings: The first evaluation found very few projects were directly based on documented evidence or knowledge of approaches to dealing with family violence. People did not appear to know about good practice, and some projects were counterproductive, ie the risks of a safe house in communities that are not yet ready to manage the challenges involved. The second evaluation found that while there were qualitative and quantitative performance indicators for FVPP projects, there were no identified program-wide performance indicators for FVPP.
Rating of evidence: Strong. Together both evaluations provide a strong evidence base about the effectiveness of the FVPP program. In particular, the meta-analysis of documentation and the triangulation of qualitative and quantitative data provided some revealing insights about the issues with the FVPP program. Utilisation of this information would have enabled significant improvements to be made to the FVPP programs.

39. Health — Bumps to Babes and Beyond — evaluation report (by Avrille Burrows, Beverley Allen and Sharon Gorton, December 2014).
Description: This program is designed to provide support for young and vulnerable mothers during pregnancy and the first 18 months afterwards. It was a 2 year program.
Methodology: Literature review, interviews with participants, and quantitative data (demographics and surveys). 9 mothers participated in the program. At the time of writing the report, a few of them had not yet completed the program as their children were under the 18 month milestone.
Key findings: The report stated many positive outcomes for the mothers and their babies (lower rates of depression, all babies residing with their mothers, etc.). However, without a control group it is not possible to attribute these findings to the program.
Rating of evidence: Weak. As a result of low participation, focus group interviews were dropped. Furthermore, some interviews were done by telephone, due to time and/or resource constraints on behalf of the mothers. There is a distinct lack of tying the positive outcomes specifically to the study. There was no control group.

40. Health — Deadly Teeth — project report, Phase 1 (Winda-Mara Aboriginal Corp. & Health Promotion Unit at Portland District, May 2012).
Description: The program takes a holistic approach to oral health by focusing on oral health literacy and hygiene as well as access to dental services, for families with children aged 0-5 years.
Methodology: The report does not include a specific methodology section (!). The report lists a literature review and pre and post-program telephone surveys. The emphasis was on the framework itself and the resources utilised, rather than on measuring outcomes. No control group.
Key findings: The data collection is very weak and the survey does not ask participating clinics specific questions related to the program activities. The only recorded outcome was that the resources given were (1) culturally appropriate and (2) that the clinics would use them again. There is very little in the report about actual program activities.
Rating of evidence: Weak. Mainly data showing whether the dental clinics were utilising the provided resources, plus anecdotal evidence. Without a control group, it is difficult to assess the success of the program.

41. Health — Deadly Ears Kids Communities — evaluation report (J. Durham, L. Schubert & Vaughn, March 2015).
Description: The program aims to reduce high rates of conductive hearing loss from middle ear disease among Indigenous children.
Methodology: Literature review, stakeholder interviews (the content of which was independently analysed), community visits, and data from medical records. No control group.
Key findings: The conclusion is that stand-alone ventures, such as this program, are unlikely to effectively combat CSOM (chronic ear inflammation), as the causes and determinants are varied and interwoven with many other lifestyle aspects. It also highlighted the need for a standardised method of collecting data across clinics.
Rating of evidence: Weak. The report says data collected from the Deadly Ears ENT clinics "indicated" that the prevalence of CSOM had been reduced, though senior officials from the clinics said they had no data on the prevalence of the disease. Data from other clinics were not included, so there was no control group. An accurate assessment of the overall reduction in the incidence of ear inflammation was not possible, due to the lack of population level data.

42. Health — Healthy Weight Program (Living Strong Program) — evaluation report (Queensland Health Promotion Unit, 2005).
Description: The program encourages participants to have a healthy lifestyle through good nutrition and physical exercise.
Methodology: Quantitative participant data analysis and in-depth interviews with key stakeholders. Baseline measurements obtained prior to the program were available.
Key findings: Post-program outcomes were generally positive, as shown through the individual participant screening data and qualitative data. Most participants were 'at risk' at the start of HWP; most reduced their body weight and waist/hip circumferences; and showed some modest positive change in terms of their lifestyle behaviours.
Rating of evidence: Weak. Despite evidence suggesting that a total of 432 people participated, full screening data was only available for 34 individuals. Furthermore, data from this participant population may not be representative of participants who attended screening sessions or workshops during the 12-month data collection period for this evaluation (!). Comparative effectiveness (as measured against similar programs) was not investigated. The post-program data was collected through questionnaires. Self-reporting of healthier lifestyle choices is notoriously over-stated, and there was no confirmation of answers by the program workers.

43. Health — Family Violence Regional Activities Program — evaluation (Morgan Disney & Associates, Courage Partners, Success Works, May 2005).
Description: This program provides support to grassroots projects aimed at addressing family violence, sexual assault and child abuse.
Methodology: The purpose of the report was to assess the effectiveness of the program. Quantitative (data analysis of how the program had run up until 2005) and qualitative data (site visits, interviews), conducted in 3 phases.
Key findings: The conclusion of the report is that more funding is needed, along with integration of programs across regions.
Rating of evidence: Weak. The report does not offer much data; it only summarises what has worked to date and makes suggestions as to how the program should be carried out in the future (service delivery models, funding models, etc.).

44. Health — Yarning It Up, Don't Smoke Up — Aboriginal Tobacco Control Project — final evaluation report (South Metropolitan Health Service, June 2014).
Description: The project involves workshops to help people stop smoking, and information sessions for those working in the tobacco cessation field.
Methodology: Literature review, project reporting (at 6 month intervals), stakeholder interviews and surveys. No control group.
Key findings: The evidence for program effectiveness was largely descriptive, with only a handful of intervention trials. This was in contrast to the established evidence of effective tobacco control programs amongst the general community.
Rating of evidence: Weak. Despite showing attendances at workshops of almost 5,500 people, the data gathered (questionnaires and surveys) shows numbers of only 20-30. The small sample pools prohibit effective analysis. The stated objective of the project was not necessarily to reduce the number of smokers, but rather to raise general awareness of the negative effects. In this sense, the evaluation shows that the program was a success, as the post-workshop surveys show an increase in awareness.

45. Health — HEALInG program: healthy eating activities (and) lifestyles (for) Indigenous groups — final evaluation report (QLD Health for the Healthy Weight Program (HWP), 2004).
Description: The HEALInG program is a 10-week healthy eating and lifestyle program, which aims to provide realistic and practical advice to participants.
Methodology: Evaluation methodology included pre and post program evaluations, though the number of participants who took part in interviews was small: 8 in the first round and 11 in the second.
Key findings: Overall, the participants liked the program and reported learning more about nutrition and that their physical activities had increased. However, they reported experiencing difficulty maintaining a healthier lifestyle due to issues in their lives.
Rating of evidence: Weak. Although there were pre and post interviews with participants, the sample size was small and the evaluation relied primarily on qualitative data. For example, there was no evidence that participants' health statistics were recorded to verify any reported improvements in physical activity.

46. Health — Central Australian Youth Link Up Service (CAYLUS) — stakeholder feedback report (CAYLUS, 2013).
Description: This program's mission is to support community initiatives that improve quality of life, and to address alcohol and other drug use issues affecting young people.
Methodology: Telephone interviews with 39 stakeholders (across 7 remote communities in NT): youth workers, regional staff, etc. No participants interviewed. The questions revolved around the conditions before a program arrived and what kind of services the program delivered. No questions or measurement regarding effectiveness. No control group.
Key findings: The report generally supports the services provided by CAYLUS. A large majority of respondents felt that the provision of youth programs was key for their communities because it prevented crime and gave young people more positive ways of spending their time.
Rating of evidence: Weak. The stakeholders that were interviewed were all professionals working with the delivery of CAYLUS. It was also purely qualitative (and subjective) data. The statements made were not backed up by any quantitative statistics.

47. Health — Aboriginal Perinatal Service Expansion — final evaluation report (WA Perinatal Maternal Health Unit, February 2008).
Description: The project aims to develop and trial a 'culturally appropriate' perinatal mental health service framework.
Methodology: Both qualitative and quantitative methods were utilised. A baseline report was developed in 2008. 5 KPIs were identified and results measured for these.
Key findings: The report lists 4 out of 5 KPIs as having been fully or partially met. The KPIs mainly revolve around increasing awareness of the health unit's services. Staff turnover during data collection points hampered progress.
Rating of evidence: Weak. The report states that data collection proved to be a challenge. With sampling techniques and sizes inadequate for conclusive results (i.e. validity, reliability, transferability) and service provider data being initially scarce, finding alternative ways of measuring and reporting outcomes of the HPHM Service became crucial. The researchers had to think outside the confines of their empiricist training and employ methods that were culturally sensitive and flexible. Out of over 1,000 births registered, only 5 mothers completed a post-questionnaire.

48. Health — Akeyulerre Healing Centre — final evaluation report (Charles Darwin University, April 2010).
Description: The Angkwerre-iweme (traditional healing) project helps Elders to practise traditional Arrernte healing and pass it on in their community.
Methodology: Literature review. Qualitative: interviews, photographs and videos. No control group.
Key findings: The report is mainly descriptive of 'healing' and 'healing processes' in the Indigenous sense. The only actual findings are anecdotal quotes from completed interviews.
Rating of evidence: Weak. None of the original sources (answered interviews, etc.) are attached to the report. The report mainly comprises descriptions of the various options that the Healing Centre offers (bush trips, cooking classes, etc.) but it fails to explain results.

49. Health — Alcohol and Other Drugs — Indigenous Communities Project — final evaluation report (Centre for Remote Health, May 2011).
Description: The project seeks to reduce substance use and associated harms among Indigenous communities from the Top End region of the Northern Territory.
Methodology: Quantitative: attendance statistics (though self-reported by the kids). Qualitative: stakeholder interviews.
Key findings: The focus of the report is largely on 'building capacity' and 'increasing awareness'. Unfortunately there is no evidence of actual effectiveness in terms of a drop in substance abuse. Anecdotal evidence makes up the bulk of the findings, with some of it leaning towards support for the program (from parents and other carers).
Rating of evidence: Weak. Although the report is generally of high quality, the fact that the evaluation framework lists the 1st objective (reducing the number of harms) as 'not to be measured' raises serious questions about whether the program was effective or not.

50. Health — Jalaris Kids Future Club — evaluation report (M. Haviland, Side by Side Consulting, 2010).
Description: The project aims to improve the educational outcomes of Indigenous children by helping them to prepare for mainstream schooling.
Methodology: Qualitative: stakeholder data from communication logs and reports. No control group.
Key findings: There was a 1% increase in school attendance for the kids involved in the program, but this is based on a small sample size and was self-reported.
Rating of evidence: Weak. It is highlighted that one of the core data sources was abandoned in the 2nd year due to lack of staff and participation. That meant that individual child outcomes could not be measured.

51. Health — National Empowerment Program — national summary report (J. Millroy et al., 2014).
Description: The project aims to reduce the rate of suicide in Aboriginal and Torres Strait Islander communities through the promotion of social and emotional wellbeing initiatives.
Methodology: Qualitative: community focus group, stakeholder feedback, interviews. No control group.
Key findings: The report covers 'Stage 1' of the NEP, which focused on identifying causes for the poor condition of participating communities. Although the documentation for that is very good, the report does not provide any evaluation of how the 1st stage improved the communities.
Rating of evidence: Weak. The report mainly contains descriptions of the past history and current conditions of the Indigenous communities that participated in the program. It also contains many recommendations on where to go from here. However, there are very few mentions of the outcomes of the program.

52. Housing — Nyoongar Outreach Patrol Service — Keeping People Safe evaluation report (John Scougall Consulting Services, March 2012).
Description: This program provides social and welfare services to Aboriginal people who are homeless and/or affected by alcohol or drugs.
Methodology: Focus group meetings, surveys, literature review, document reviews, and meetings with a few key stakeholders. Mainly qualitative.
Key findings: The report states that the available evidence suggests the program is having a positive effect on Indigenous youth and street crime. The main conclusion drawn is that the organisation needs more funding in order to expand its administrative capabilities. The surveys were geared towards stakeholders' perception of the organisation/program.
Rating of evidence: Moderate. The report stresses that an evidence-based approach was utilised. Program providers keep a database that records the instances. This data is included in the report and is clearly labelled and analysed.

53. Housing — Indigenous Community Volunteers Program — social and economic impact/case studies (KPMG, 2015).
Description: ICV is a registered charity and non-profit community development organisation that matches a volunteer's experience and skills with different Indigenous communities' needs to help address Indigenous disadvantage.
Methodology: Case study (but actually a social and economic impact assessment). Assessment of activities in two communities involved stakeholder consultations and document and data analysis, including assessing the impacts of the activities in economic terms.
Key findings: Evidence that ICV was invited into communities and involved in discrete, well defined projects, and that volunteers were providing a positive impact and building on existing work that had been done in the community. Evidence that ICV had also developed positive partnerships with other organisations and was collaborating with them on activities.
Rating of evidence: Moderate. The study involved triangulation of data from multiple sources, including analysis of economic data. However, the study only looked at two communities, so it is difficult to extrapolate about overall program impact.

54. Housing — Indigenous Home Ownership Program — audit report (Australian National Audit Office, December 2015).
Description: The IHOP is administered by Indigenous Business Australia and aims to help Indigenous Australians into home ownership.
Methodology: The objective of the audit was to assess the effectiveness of IBA's management and implementation of the IHOP.
Key findings: The ANAO found that IBA's management of the program had been inefficient and lending did not fully align with the program objectives for which IBA is funded. In particular, loans were provided to people who would have been able to access loans from mainstream lenders.
Rating of evidence: N/A

55. Housing — Remote areas essential services program — audit (Office of the Auditor General Western Australia, 2015).
Methodology: The audit assessed how well the WA Department of Housing delivers power, water and waste water repair and maintenance services to selected remote Aboriginal communities through the Remote Area Essential Services Program. The scope of this audit did not include the provision of all services to remote communities or their sustainability.
Key findings: Quality of drinking water often falls short of Australian standards. Testing of wastewater systems was irregular or incomplete between January 2012 and 2014, so Housing could not be sure if they were working effectively. Housing's current arrangements for managing the Program limit its effectiveness and efficiency. The criteria to determine eligibility for the Program have not been applied since 2008, and as a result Housing did not know if the right communities were in the Program.
Rating of evidence: N/A

56. Housing — Fixing houses for better health projects (Housing for health) — audit (Australian National Audit Office, 2010).
Description: The program is targeted at individual households and aims to promote a healthier living environment by making repairs to homes.
Methodology: The audit's objective was to assess the effectiveness of FaHCSIA's management of the Fixing Houses for Better Health Program since 2005.
Key findings: Health related improvements were made to over 2000 houses in 34 communities between July 2005 and June 2009. However, FaHCSIA's program management arrangements did not cater for the collection of data that provided a means of linking improvements made to houses in communities with improvements in health indicators in those same communities.
Rating of evidence: N/A

57. Jobs and economy — Aboriginal Mental Health Worker Program — final evaluation report (G. Robinson & A. Harris, Charles Darwin University, 2004?).
Description: The program promotes and supports the role of AMHWs as mental health workers in remote communities.
Methodology: Stakeholder interviews, site visits (patient file audits, policies and procedure checks), and health centre data evaluation. The data is clearly labelled and explained.
Key findings: The report highlights the difference in reporting and data collection standards across mental health campuses as a key problem for the evaluation. It also cites the high turnover of staff and insecure funding. The focus was on developing a framework for AMH workers.
Rating of evidence: Moderate. The evaluation report is solid, with very thorough analysis of both the qualitative and quantitative data. There are many suggestions for improvement based on the findings. However, the report also concludes that its emphasis is on program design and service delivery, rather than measuring effectiveness and outcomes.

58. Jobs and economy — The Working on Country (WoC) program — evaluation (Urbis, 2012).
Description: The WoC (national) program provides employment and training to Indigenous Australians living in regional and remote Australia to work on Indigenous Protected Areas (IPA) and care for and manage their 'country'.
Methodology: Methodology involved 18 case studies, a review of program and policy data documentation, and consultations with key departmental personnel.
Key findings: The evaluation found that the program had a range of economic, social, cultural and environmental benefits. Rangers saw the jobs as 'real jobs' that provided better income and conditions, more interesting work and ongoing employment, compared to the CDEP alternative.
Rating of evidence: Moderate. A range of qualitative and quantitative data sources were utilised to enable triangulation of data.

59. Jobs and economy — Cape York Welfare Reform — evaluation report (Performance and Evaluation Branch, FaHCSIA; independent consultants from the Social Policy Research Centre at the University of New South Wales; and academics from the Australian National University, 2012).
Description: The program aims to restore social norms and promote community leadership and economic development in five welfare reform communities: Aurukun, Coen, Hope Vale, Doomadgee and Mossman Gorge.
Methodology: An implementation evaluation and an outcomes evaluation. Methods included stakeholder interviews in the four communities, document analysis, and examination of administrative data relating to education, child protection, housing, crime and employment, as well as the FRC's database. The evaluation did not include an economic evaluation, although the welfare reform program design report recommended that 'an economic evaluation should assess the cost effectiveness of the Interventions.'
Key findings: The findings of the evaluation were very difficult to interpret and attribute. While the quantitative analysis showed improvements on a number of different dimensions, including school attendance and achievements and reductions in crime, there was no progress in other areas.
Rating of evidence: Moderate. While there was a perception that things were improving in the communities, as evidenced by a social survey, the survey was not conducted in other communities, so it was not clear whether this was part of an overall trend. Many of the benefits to the communities appeared to have arisen from the Alcohol Management Plans which, although initiated by the Cape York Institute, were not part of the reforms and preceded them by a couple of years.

60. Jobs and economy — Jawun — executive summary of evaluation report (KPMG, 7 November 2015).
Description: Jawun is a non-government organisation which places people from companies and government agencies into Indigenous organisations. It currently operates in nine Indigenous communities across Australia: Cape York; East Kimberley; Goulburn Murray; Inner Sydney; Central Coast; Lower River Murray; North East Arnhem Land and the NPY Lands.
Methodology: Impact evaluation. The evaluation sought to validate Jawun's 'Theory of Change' by conducting surveys and interviews with Indigenous organisations, corporate and government partners and secondees, and by analysing available socio-demographic data.
Key findings: The evaluation found the overall assistance offered by Jawun secondees was appropriate and met Indigenous organisations' needs, by strengthening their capacity and by leveraging the expertise of corporate and government partners to support Indigenous-led projects. However, the evaluation suggested the model could be improved by better assessing organisations' capabilities and capacity to host secondees effectively.
Rating of evidence: Weak. Only four of the eight Indigenous regions in which Jawun operates took part in the evaluation (Cape York, Central Coast, East Kimberley and Inner Sydney), and only the executive summary of the evaluation is publicly available on Jawun's website.
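The Cape York rating above notes that the social survey was not run in comparison communities, so improvements could not be separated from an overall trend. Had comparable pre- and post-reform measures existed for non-reform communities, a simple difference-in-differences estimate could net out that trend. A minimal sketch, with entirely hypothetical attendance rates (not figures from the evaluation):

```python
def diff_in_diff(treat_pre, treat_post, comp_pre, comp_post):
    """Change in the reform communities net of the comparison-community trend."""
    return (treat_post - treat_pre) - (comp_post - comp_pre)

# Hypothetical school-attendance rates before and after the reform period:
# reform communities rose from 62% to 70%, comparison communities from 60% to 64%.
effect = diff_in_diff(0.62, 0.70, 0.60, 0.64)  # 4 percentage points net of trend
```

The point of the sketch is the design, not the arithmetic: without the two comparison-community measurements, the estimate collapses to a raw before-after change that cannot distinguish program impact from a general trend.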

Appendix B: Evaluation Toolkit

There are many different reasons for, and benefits in, conducting evaluations (see Table 12).

Table 12: Reasons for, and benefits in, conducting evaluations

Government — potential benefits:
• More efficient resource allocation
• Highlights what is and is not working
• More informed decision-making
• Encourages greater public trust in government

Service providers — potential benefits:
• Improved service delivery/client satisfaction
• Stronger basis for recurrent funding
• Opportunity for continuous improvement processes

Society — potential benefits:
• Improved government services
• More open and accountable government
• Public money used more efficiently
• Increased confidence in government

Source: Adapted from ACT Government Evaluation Policy and Guidelines, 2010.87

There are also different types of evaluations, depending on the stage of a program's implementation and what the evaluation is seeking to measure. Generally speaking, different types of evaluation are used at different stages of a program's implementation. These include:

• Formative evaluation — generally used at the design stage of a program and before it is implemented. Can be useful to inform decision-making about whether a program should proceed or not. Types of questions asked at this stage include: what is the problem, is government intervention appropriate, and how will we measure success?

• Process evaluation — used during the program delivery process. Focuses on processes and what can be done to improve the operation of projects and programs. These types of evaluations are also known as performance evaluations. Questions asked in these evaluations tend to focus on how well an activity has been executed, and on inputs and outputs.

• Summative evaluation — focuses on the outcomes and achievements of projects and/or programs; also referred to as outcomes evaluation. Questions asked include: what kind of change has occurred as a result of the intervention?

• Impact evaluation — looks at how a program has affected the people participating in the program. Often not available until towards the end of the project, and often relies on pre- and post-program data. Similar in many ways to summative and outcomes evaluations.

• Development evaluation — a non-linear approach, not specific to a particular point in the roll out or delivery of a program. The main focus of this type of evaluation is understanding the activities of a program and how the program operates in a dynamic environment. The principal focus is on learning and feedback rather than achieving a set of predetermined outcomes. Development evaluation also recognises that positive outcomes can sometimes occur unintentionally.

Ideally, evaluation should be embedded into program development and implementation. The Queensland Government Program Evaluation Guidelines have adapted the Gibbs Reflective Model to illustrate how evaluation should inform program design and implementation (see Figure 6).

The first step in undertaking an evaluation generally involves having a clear understanding of the outcomes the program is hoping to achieve and how those program outcomes will be measured — what evaluators sometimes call a program logic model or theory of change. However, although these two terms are sometimes used interchangeably, they are actually two different approaches. A program logic or logic model seeks to illustrate how the needs or issues the program is seeking to address link with the intended activities, outputs and outcomes of the program (see Figure 7). A theory of change model seeks to link outcomes and activities to explain how the desired change will occur and what factors contributed to that change. While logic models do not always identify the indicators that will be used to measure whether outcomes have been met or not, theory of change models do. For instance, the program logic for a program that seeks to improve students' reading ability would identify the program as an activity and improved reading scores as an outcome, but it would not tell you that students need to attend the program at least three days a week for a minimum of x number of days, and that the course material must include a focus on phonics, for students' scores to rise. As a result, a program logic model based on an underlying theory of change will have a lot more rigour than one that does not.88

An evaluation plan or framework sets out the information contained in a program logic model in more detail and generally includes a hierarchy of outcomes from inputs and process outcomes to ultimate outcomes, with key evaluation questions, indicators and potential data sources for each stage (see Figure 8).89 A hierarchy of outcomes recognises that change can take time, and that certain outcomes need to be achieved in order to progress to a new level.

Ideally the objectives of the program should be specific, measurable, realistic and relevant to the overall objectives the program is trying to achieve. For example, a specific and measurable objective would be to increase the number of children who enrolled by 10% (from x to y) by a certain date. If a percentage increase is part of a measurable objective then it is important to have baseline data that provides a comparison for assessing program impact.

Evaluating Indigenous programs: a toolkit for change | 47

Figure 6: Incorporating evaluation into program development and implementation

Source: Queensland Treasury (2014).

Figure 7: Program logic model

Figure 8: Hierarchy of outcomes

Ultimate outcome
Program logic statement: e.g. Indigenous people are able to achieve their goals and improve their quality of life.
Key evaluation question: To what extent has the program contributed to Indigenous people achieving their goals?
Indicators: Number and percentage of program participants surveyed who report improvements in the quality of their life and their ability to achieve their goals.
Potential data sources: Interviews with people, their families and program staff; longitudinal case studies; quality of life assessment/survey.

Longer term outcomes
Program logic statement: e.g. Indigenous people are actively pursuing their goals.
Key evaluation question: To what extent has the program contributed to Indigenous people being more able to determine and pursue their goals?
Indicators: Level of improvement in people's ability to set their own goals, as reported by Indigenous people, their families and staff.
Potential data sources: Interviews with people, their families and program staff; longitudinal case studies; quality of life assessment/survey.

Intermediate outcomes
Program logic statement: e.g. Indigenous people are aware of and access the program.
Key evaluation question: To what extent are Indigenous people accessing the program?
Indicators: Number and reach of participants.
Potential data sources: Program data; population data.

Inputs and process outcomes
Program logic statement: e.g. Support is provided to help Indigenous people identify the steps they need to take to pursue their goals, and appropriately skilled and experienced staff are recruited.
Key evaluation question: Have staff been recruited within agreed timelines?
Indicators: Number and percentage of funded services that recruit staff within agreed timeframes.
Potential data sources: Program data.

Source: Adapted from an Urbis evaluation framework, 2014
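A minimal sketch of the baseline comparison described in this section: checking a measurable objective (such as a 10% increase in enrolments from a known baseline) against pre- and post-program data. The enrolment figures and the 10% target below are hypothetical, for illustration only.

```python
# Illustrative only: hypothetical figures for a "10% enrolment increase" objective.

def percent_change(baseline: float, observed: float) -> float:
    """Percentage change of the observed value relative to the baseline."""
    return (observed - baseline) / baseline * 100

def objective_met(baseline: float, observed: float, target_pct: float) -> bool:
    """True if the observed change meets or exceeds the target percentage."""
    return percent_change(baseline, observed) >= target_pct

baseline_enrolments = 200   # hypothetical pre-program baseline (the "x")
observed_enrolments = 224   # hypothetical post-program figure (the "y")

print(percent_change(baseline_enrolments, observed_enrolments))   # 12.0
print(objective_met(baseline_enrolments, observed_enrolments, 10))  # True
```

Without the baseline figure, neither calculation is possible, which is why the text stresses identifying baseline data (often from existing administrative data sets) before the program starts.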

data can be hard work, it is a myth that access to baseline data is always a problem. There are often existing administrative data sets on health, education and crime statistics that could be used to give a baseline. In addition, while 'data is not the plural of anecdote', a good evaluation involves a mixture of qualitative and quantitative data.90 Qualitative data assesses people's perceptions of a program and often provides the 'how' for why the program has or has not achieved its objectives.

Broadly speaking, there are three main areas of focus when conducting an evaluation to assess whether a program has achieved its objectives:91

• Appropriateness
• Effectiveness
• Efficiency

Appropriateness

Evaluating the appropriateness of a program involves considering whether there is a need for the program, given the social, economic and environmental context, and how the program aligns with the government's policies and priorities. Assessments of appropriateness should focus not only on the individual program, but on how the policies underpinning the program and other government policies and instruments interact with each other.92 In considering the appropriateness of a program, policy makers also need to look at whether there is an a priori evidence base for the interventions. Questions related to appropriateness to consider before implementing a program are:

• Is the program needed?
• Is there community support for the program?
• Is there an evidence base for the interventions used in the program?
• Is there an existing program already addressing a similar need?
• What is different about this program?
• Who will implement the program?

Effectiveness

Evaluating the effectiveness of a program involves considering whether it is achieving the set objectives and producing worthwhile outcomes. A key challenge in illustrating the effectiveness of a program is having valid measurement in place to determine whether there would have been a difference in outcomes without the program; what some term 'estimating the counterfactual'.93 However, estimating the counterfactual when it comes to Indigenous programs can be difficult, given the myriad programs in Indigenous communities. Determining the impact of a single program in a particular Indigenous community is virtually impossible because so many programs are being delivered simultaneously. If another community is used as a counterfactual or 'control group', then that community is likely to be already receiving similar programs. Another factor making it difficult to assess the impact of programs is the uniqueness of Indigenous communities. For example, when the Northern Territory Emergency Response (NTER) was evaluated, it was difficult to find comparable communities that could act as a type of control group, given that the NTER covered so many of the Indigenous communities in the Northern Territory, and the varied nature of those communities.94

Efficiency

Evaluating the efficiency of a program involves identifying whether the program represents value for money, how a program's resources are being used to achieve outputs of the desired quantity and quality, and whether the use of the resources could be improved to achieve the desired outcomes.95

Economic evaluation identifies, measures and values a program's economic costs and benefits.96 Two methodological approaches used to measure the effectiveness and efficiency of programs are Cost Benefit Analysis (CBA) and Social Return on Investment (SROI).

Cost Benefit Analysis expresses the costs and benefits of a program in monetary terms and focuses on community-wide rather than individual benefits.97 Values are aggregated using a discount rate that represents trade-offs between current and future consumption.98 Then, the discounted costs and benefits are compared using specific criteria. Limitations of the CBA methodology are that the benefits of some programs are very difficult to quantify due to their subjective nature. For instance, when measuring the effectiveness of Indigenous social programs, it can be difficult to place a monetary value on concepts such as social capital, wellbeing, quality of life, and cultural attachment.99 It is also difficult to quantify causal factors behind flow-on benefits, such as improved health outcomes or decreased crime rates. The CBA methodology can also be limited to 'first round' impacts, and as a result indirect effects can be excluded.100 Another limitation is that results can be skewed if an 'improper' rate for discounting future flows is used.101

The Social Return on Investment methodology originally began as a specialised form of cost-benefit analysis but has grown to incorporate many aspects of evaluation practice, such as qualitative interviews with stakeholders. Like CBA, the SROI methodology places a monetary value on the social impact of an activity and compares this with the costs involved in implementing that activity. However, SROIs place a greater emphasis on the social purpose of activities and how to measure the social impact.102 Although the SROI approach utilises aspects of evaluation practice, it is not a comprehensive evaluation framework.103 As SROI analysis is specifically tailored to individual organisations, it is not always possible to do cross-organisational comparisons. However, an SROI ratio can be used as a benchmark to enable organisations to measure changes in performance over time.104 One of the biggest issues with the SROI methodology is the tendency for people to misunderstand what the SROI ratio means. SROI is about value, rather than money. The SROI ratio represents the social value created for each $1 invested, rather than an actual financial return. As a result, care needs to be taken with how the SROI ratio is communicated.105
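The discounting arithmetic behind CBA, and the ratio reading of SROI, can be illustrated in a short sketch. All cash flows, monetised benefits and the discount rate below are hypothetical; a real CBA or SROI analysis involves much more than this calculation (valuing intangible outcomes, attributing causality, sensitivity-testing the discount rate).

```python
# Illustrative only: hypothetical program cash flows and discount rate.

def present_value(flows, rate):
    """Discount a list of annual flows (year 0 first) back to today."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

costs    = [100_000, 20_000, 20_000]   # hypothetical program costs per year
benefits = [0, 60_000, 120_000]        # hypothetical monetised benefits per year
rate = 0.07                            # hypothetical discount rate

pv_costs = present_value(costs, rate)
pv_benefits = present_value(benefits, rate)

# A benefit-cost ratio above 1 means discounted benefits exceed discounted costs.
bcr = pv_benefits / pv_costs

# An SROI-style ratio is read the same way: social value created per $1
# invested -- not an actual financial return to the funder.
print(f"PV costs ${pv_costs:,.0f}, PV benefits ${pv_benefits:,.0f}, ratio {bcr:.2f}")
```

The sketch also shows why an 'improper' discount rate skews results: raising `rate` shrinks the later-year benefits faster than the up-front costs, pushing the ratio down.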

Appendix C: List of Tobacco cessation programs

Name | Provider | Reach | Objectives

Alcohol, tobacco and other drugs program | Tasmanian Aboriginal Centre Inc., funded by the federal government | TAS | The Alcohol, tobacco and other drugs program, run by the Tasmanian Aboriginal Centre, provides alcohol and other drug (AOD) support to the Aboriginal community across Tasmania.

Apunipima tackling smoking and healthy lifestyle program | Apunipima Cape York Health Council | QLD | The program aims to raise awareness about the impacts of tobacco smoking and to help facilitate smoke-free environments.

Beyond today — it's up to you | Australian Capital Territory Department of Health | ACT | A social marketing campaign to encourage Indigenous Australians in the ACT to stop smoking.

Don't let your dreams go up in smoke | Nunkuwarrin Yunti of South Australia | SA | To encourage young Aboriginal people in Adelaide to share their ideas, stories and videos on smoking and how it is harmful.

Feet first (Thoolngoonj bowirn) | Australian Council on Smoking and Health (ACOSH) | WA | Aims to reduce the number of people smoking in Kununurra and to teach Indigenous people about the harmful effects of smoking.

Good sports program | Australian Drug Foundation | National | To address risky drinking, smoking, obesity and mental health through community sports.

Healthy lifestyle & tobacco cessation program | Central Australian Aboriginal Congress | NT | The program provides services to help raise awareness of chronic disease resulting from smoking.

Heart health 'for our people, by our people' | Derbarl Yerrigan Health Service (DYHS), the National Heart Foundation, Royal Perth Hospital (Cardiology Department) | WA | A cardiac rehabilitation program, concentrating on health, medications, oral health and quitting smoking.

It's your choice, now! | South Eastern Sydney Local Health District | NSW | Encourages young Indigenous people to give up smoking by teaching them new skills and making their own films.

Kick the butt | A partnership between Bunurong Health, Quitline and the Cancer Council | VIC | Aims to limit the uptake of smoking tobacco within the Southern metropolitan region of Melbourne. Provides a 24-hour hotline, social marketing campaign and advertising on SBS.

Maternal health tackling smoking program | Aboriginal Health Council of South Australia | SA | Aims to reduce tobacco smoking among pregnant Aboriginal women and to increase the birth weight of babies.

No more boondah | Winnunga Nimmityjah Aboriginal Health Service | ACT | The program looks at what triggers people to smoke.

No more nyumree | Wheatbelt Aboriginal Health Service | WA | This program aims to provide 'culturally appropriate' support to help Aboriginal people stop smoking.

Primary Prevention Capacity Building Project | The Queensland Aboriginal and Islander Health Hub | QLD | This program seeks to develop the capacity for Aboriginal organisations to offer interventions to address high rates of smoking among Indigenous people living in Queensland.

Puyu blaster (Keep it corka) | Aboriginal Health Council of South Australia (AHCSA) | SA | A healthy lifestyle and anti-smoking campaign which seeks to promote local role models to encourage people to give up smoking.

Quit for new life program | Hunter New England Local Health District | NSW | This program aims to reduce the rate of smoking among pregnant women and their family or household members.

Regional tackling tobacco and healthy lifestyles program | Wuchopperen Health Service | QLD | The aim of this regional program is to reduce the onset and risk of chronic disease developed through tobacco use, poor nutrition and lack of physical activity.

Rewrite your story | Nunkuwarrin Yunti Inc. | SA | This program aims to help people break the cycle of smoking and to quit for good.

Smoking cessation program | Derby Aboriginal Health Service | WA | The Smoking cessation program provides information about services to help people quit smoking, including nicotine patches at no cost to participants.

Stepping Stones AOD Day Centre Ceduna | Aboriginal Drug and Alcohol Council | SA | This program provides free confidential treatment, counselling and referral services for Aboriginal peoples concerned about alcohol, tobacco and other drug issues.

Substance use, social and emotional wellbeing | Katherine West Health Board Aboriginal Corporation | NT | This program focuses on the harmful effects of alcohol, tobacco and cannabis use on Indigenous people in the Northern Territory.

Tackling Indigenous smoking | Australian Drug Foundation | National | This program aims to reduce smoking among Indigenous Australians.

Tackling smoking and healthy lifestyle program | South West Aboriginal Medical Service in Western Australia (SWAMS) | WA | This program aims to tackle chronic disease risk factors including smoking, poor nutrition and lack of exercise, and to deliver community education initiatives to reduce the prevalence of these risk factors in Aboriginal and Torres Strait Islander populations.

The Gnumaries hurt program | Southern Aboriginal Corporation | WA | This program was developed to reduce the uptake and prevalence of tobacco smoking among the Noongar people of the Great Southern region of Western Australia.

Time to quit | Kambu Aboriginal and Torres Strait Islander Corporation for Health | QLD | This program takes a holistic approach to tobacco cessation and provides people with practical suggestions to help them stop.

Tobacco and healthy lifestyles | Ngaanyatjarra Health Service | WA | This program aims to reduce the risk of chronic disease from smoking and other unhealthy lifestyle choices among the Indigenous people living on the Ngaanyatjarra Lands, in Western Australia.

Tobacco cessation team | Victorian Aboriginal Community Controlled Health Organisation (VACCHO) | VIC | This program provides support to VACCHO member services to develop and implement programs and policies to reduce smoking.

Tobacco resistance and control (A-TRAC) program | Aboriginal Health and Medical Research Council of NSW | NSW | This program aims to reduce smoking rates for Aboriginal people in New South Wales.

Yarning it Up — Don't Smoke it Up | South Metropolitan Population Health Unit | VIC | The project runs workshops to help people quit smoking that aim to be 'culturally appropriate' and non-judgmental.

Young Aboriginal drug and alcohol service (YADAS) | Young Aboriginal Drug and Alcohol Service | TAS | This program aims to provide 'culturally relevant' anti-smoking and drug programs in partnership with other health service providers.

Evaluating Indigenous programs: a toolkit for change | 51 Endnotes

1 Hudson, S, 2016. Mapping the Indigenous 12 Antonios, Z, 1997. The CDEP Scheme and Program and Funding Maze. Sydney: The Centre Racial Discrimination: A Report by the Race for Independent Studies. Discrimination Commissioner. Sydney: Human Rights and Equal Opportunity Commission; and 2 Hudson, 2016. Mapping the Indigenous Program Hudson, S, 2012. No more dreaming of CDEP. and Funding Maze The Centre for Independent Studies, available 3 SCRGSP (Steering Committee for the Review of at: https://www.cis.org.au/commentary/articles/ Government Service Provision) 2016, Overcoming no-more-dreaming-of-cdep/ accessed 17 May Indigenous Disadvantage: Key Indicators 2016, 2017; and Morris, N and Tomllin, S, 2016. Canberra: Productivity Commission, p.iii Cultural incompatibility hindering public policy 4 The Prime Minister of Australia. 2017. Closing the in remote Western Australia, indigenous leaders Gap Report Statement to Parliament. Available say. ABC News, 22 September 2016 available at: https://www.pm.gov.au/media/2017-02-14/ at http://www.abc.net.au/news/2016-09-22/ closing-gap-report-statement-parliament. cultural-incompatibility-of-public-policy-in-remote- Accessed 17 April 2017 australia/7868810 accessed 17 May 2017 5 Austender Australian Government. 2017. Entity 13 ATSICOEA (Aboriginal and Torres Strait Island Reports for complying with the Senate Order on Commission Office of Evaluation and Audit), Procurement Contracts and use of Confidentiality 1997 Evaluation of the Community Development Provisions. Available at: https://www.tenders.gov. Employment Projects Program: Final Report, au/?event=public.senateOrder.list accessed 17 Canberra: ATSI April 2017 14 DEWR (Australian Government Department of 6 Empowered Communities, 2015. Empowered Employment and Workplace Relations) 2005. Communities Design Report, Wunan Foundation Building on success: CDEP discussion paper, Canberra: Australian Government Inc, available at http://empoweredcommunities. 
org.au/f.ashx/EC-Report.pdf accessed 17 April 15 Hudson, S (2008) Leadership can make all 2017 the difference in a CDEP organisation, Sydney Morning Herald, 11 October, 2008, available at 7 Antonios, Z, 1997. The CDEP Scheme and https://www.cis.org.au/commentary/articles/ Racial Discrimination: A Report by the Race leadership-can-make-all-the-difference-in-a-cdep- Discrimination Commissioner. Sydney: Human organisation accessed 17 May 2017 Rights and Equal Opportunity Commission; and Hudson, S, 2012. No more dreaming of CDEP. 16 Aikman, A, 2016. Widen welfare rules to all The Centre for Independent Studies, available jobless, says Tony Abbott. The Australian, at: https://www.cis.org.au/commentary/articles/ 6 October 2016 available at http://www. no-more-dreaming-of-cdep/ accessed 17 May theaustralian.com.au/national-affairs/indigenous/ 2017; and Morris, N and Tomllin, S, 2016. widen-welfare-rules-to-all-jobless-says-tony- Cultural incompatibility hindering public policy abbott/news-story/8720125387b8731d8dd6cd731 in remote Western Australia, indigenous leaders cc18395 accessed 13 December 2016 say. ABC News, 22 September 2016 available 17 Australian National University. 2016. Researchers at http://www.abc.net.au/news/2016-09-22/ call for urgent rethink of remote policy, cultural-incompatibility-of-public-policy-in-remote- available at: http://cass.anu.edu.au/news/ australia/7868810 accessed 17 May 2017 research/20161202/researchers-call-urgent- 8 Senate Finance and Public Administration rethink-remote-policy accessed 17 May 2017; and Committee, 2017. Commonwealth Indigenous Jordan, K, and Fowkes, L, 2016. Job Creation and Advancement Strategy tendering processes. Income Support in Remote Indigenous Australia: Canberra: Commonwealth of Australia, available at Moving forward with a better system. Centre http://www.aph.gov.au/Parliamentary_Business/ for Aboriginal Economic Policy Research, CAEPR Committees/Senate/Finance_and_Public_ Topical Issue NO. 
2/2016, available at: http:// Administration/Commonwealth_Indigenous/Report caepr.anu.edu.au/sites/default/files/Publications/ accessed 17 May 2017 topical/CAEPR%20Topical%20Issues%202_2016. pdf accessed 17 May 2017 9 Australian National Audit Office, 2017. Indigenous Advancement Strategy, Canberra: Commonwealth 18 Australian National Audit Office, 2017. Indigenous of Australia available at https://www.anao.gov.au/ Advancement Strategy, Canberra: Commonwealth work/performance-audit/indigenous-advancement- of Australia strategy accessed 17 May 2017 19 As above 10 As above 20 Australian National Audit Office, 2017. Indigenous Advancement Strategy, Canberra: Commonwealth 11 Australian Public Service Commission. 2016. of Australia Providing robust advice, available at: http://www. apsc.gov.au/publications-and-media/current- 21 The Senate Finance and Public Administration publications/learning-from-failure/providing- Committee, 2016, Commonwealth Indigenous robust-advice. accessed 17 May 2017 Advancement Strategy tendering process, page 56

52 | Evaluating Indigenous programs: a toolkit for change 22 Young, E, 2016. Senator Rachel Siewert condemns available at http://www.pc.gov.au/research/ Indigenous Advancement Strategy after report. supporting/better-indigenous-policies/better- WA Today, 22 March 2016 available at http://www. indigenous-policies.pdf accessed 17 May 2017 watoday.com.au/wa-news/senator-rachel-siewert- 33 Jordan, K, and Fowkes, L, 2016. Job Creation and condemns-indigenous-advancement-strategy- Income Support in Remote Indigenous Australia: after-report-20160321-gnnp74.html accessed 17 Moving forward with a better system. Centre May 2017 for Aboriginal Economic Policy Research, CAEPR 23 As above see also Parke, E and Martin, L, 2015. Topical Issue NO. 2/2016, available at: http:// Funding cut for remote Aboriginal domestic caepr.anu.edu.au/sites/default/files/Publications/ violence shelter will ‘put lives at risk’. WA Today, topical/CAEPR%20Topical%20Issues%202_2016. 18 May 2015 available at http://www.abc.net. pdf accessed 17 May 2017 au/news/2015-05-17/funding-withdrawal-puts- 34 FSG. 2016. The Case for Developmental indigenous-womens-lives-at-risk/6476132 Evaluation, available at: http://www.fsg.org/blog/ accessed 17 May 2017 case-developmental-evaluation accessed 17 May 24 Havnen, O, 2012. Office of the Northern Territory 2017 Coordinator-General for Remote Services Report. 35 Neave, C, 2016. Submission by the Darwin: The Office of the Coordinator-General for Remote Services available at www.territorystories. Commonwealth Ombudsman to the Department nt.gov.au/bitstream/handle/10070/241806/ of Prime Minister and Cabinet Consultation Paper: NTCGRS_fullreport_2012.pdf?sequence=1 Changes to the Community Development Program, accessed 17 May 2017 available at: https://www.dpmc.gov.au/sites/ default/files/cdp-submissions/commonwealth- 25 Gosford, R, 2012. Anderson sacks Havnen – a case ombudsman-submission.pdf accessed 17 May of two strong hens in the NT henhouse is one too 2017 many? 
Crickey, available at: https://blogs.crikey. com.au/northern/2012/10/09/anderson-sacks- 36 Havnen, O, 2012. Office of the Northern Territory havnen-%E2%80%93-a-case-of-two-strong-hens- Coordinator-General for Remote Services Report. in-the-nt-henhouse-is-one-too-many/ accessed 17 Darwin: The Office of the Coordinator-General for May 2017 Remote Services available at www.territorystories. nt.gov.au/bitstream/handle/10070/241806/ 26 Australian Indigenous Healthinfonet. 2017. NTCGRS_fullreport_2012.pdf?sequence=1 Programs and Projects available at: http://www. accessed 17 May 2017, pages 57-58 healthinfonet.ecu.edu.au/key-resources/programs- projects. accessed 17 May 2017 37 As above 27 The Commonwealth Department of Health. 2014. 38 McDonald, E, Cunningham, T, and Slavin, N 2017. Tackling Indigenous Smoking and Healthy Lifestyle Evaluating a handwashing with soap program in Programme Review 2014, available at: http:// Australian remote Aboriginal communities: a pre www.health.gov.au/internet/main/publishing.nsf/ and post intervention study design. BMC Public Content/indigenous-tis-hlp-review. accessed 17 Health 15, available at: https://www.ncbi.nlm.nih. May 2017 gov/pmc/articles/PMC4662811/ accessed 14 May 2017 28 The Commonwealth Department of Health. 2017. Tackling Indigenous Smoking (TIS), available 39 Audit Office of New South Wales, 2017. at: http://www.health.gov.au/internet/main/ Implementation of the NSW Government’s publishing.nsf/Content/indigenous-tis-lp.accessed program evaluation initiative. available at: http:// 17 May 2017 www.audit.nsw.gov.au/publications/latest-reports/ 29 Taylor, P and Laurie, V, 2016. Fitzroy Crossing at nsw-government-program-evaluation accessed 17 the Crossroads. The Australian, 16 November 2016 May 2017 available at http://www.theaustralian.com.au/ 40 New South Wales Department of Premier and news/inquirer/fitzroy-crossing-at-the-crossroads/ Cabinet, 2016. 
NSW Government Program news-story/c46c7b0ee66be2ac3c67354e056c7e7b Evaluation Guidelines, available at: http:// accessed 17 May 2017 www.dpc.nsw.gov.au/programs_and_services/ 30 Australian Human Rights Commission, 2017. From policy_makers_toolkit/evaluation_in_the_nsw_ community crisis to community control in the government accessed 17 May 2017 Fitzroy Valley. Social Justice Report 2010, available 41 Victoria Department of Planning and Community at: http://www.humanrights.gov.au/publications/ Development, 2008. Evaluation Step-by-step chapter-3-community-crisis-community-control- Guide, available at: http://www.dhs.vic.gov.au/__ fitzroy-valley-social-justice-report-2010 accessed data/assets/pdf_file/0011/769943/Evaluation- 17 May 2017 Step-by-Step-Guide.pdf accessed 17 May 2017 31 Moran, M, 2016, Serious Whitefella Stuff: When 42 Victorian State Government. n.d. Funded solutions become the problem in Indigenous Organisation Performance Monitoring Framework. affairs, Melbourne University Press, Melbourne available at: http://www.dhs.vic.gov.au/facs/ 32 Australian Productivity Commission, 2012, bdb/fmu/service-agreement/4.departmental- Roundtable Proceedings. In Better Indigenous policies-procedures-and-initiatives/4.10-funded- Policies: The Role of Evaluation. Canberra, 22–23 organisation-performance-monitoring-framework October 2012. Canberra: Productivity Commission accessed 17 May 2017

Evaluating Indigenous programs: a toolkit for change | 53 43 Queensland Government, Queensland Treasury. 55 Gruen, N, 2016. Why we accept travesties of 2014. Queensland Government Program ‘evidence-based’ policymaking. The Mandarin, 9 Evaluation Guidelines available at: https://www. May 2016 available at http://www.themandarin. treasury.qld.gov.au/publications-resources/qld- com.au/64557-nicholas-gruen-evidence-based- government-program-evaluation-guidelines/. policy-part-one/ accessed 17 May 2017 accessed 17 May 2017 56 Hunt, J, 2017. The Cashless Debit Card trial 44 Tasmanian Government. 2015. Planning, evaluation: A short review. Centre for Aboriginal evaluation and procurement, available at: http:// Economic Policy Research, ANU College of Arts and www.communications.tas.gov.au/policy/planning_ Social Sciences, available at: http://caepr.anu.edu. and_procurement, accessed 17 May 201 au/Publications/topical/2017TI1.php 45 South Australian Government. 2016. Managing accessed 15 May 2017 a Community Organisation Evaluation, available 57 New South Wales Department of Premier and at: https://www.sa.gov.au/topics/family-and- Cabinet, 2016. NSW Government Program community/community-organisations/managing- Evaluation Guidelines, available at: http:// a-community-organisation/evaluation accessed 17 www.dpc.nsw.gov.au/programs_and_services/ May 2017 policy_makers_toolkit/evaluation_in_the_nsw_ 46 Western Australia Department of Treasury. 2017. government accessed 17 May 2017 Program Evaluation, available at: http://www. 58 Senator Nigel Scullion. 2016. Media Release: treasury.wa.gov.au/Treasury/Program_Evaluation/ Overcoming Indigenous Disadvantage Report Program_Evaluation/ accessed 17 May 2017 highlights progress, available at http://www. 47 The Government of Western Australia. 2017. 
nigelscullion.com/overcoming-indigenous- Program Evaluation Western Australia, available disadvantage-report-highlights-progress/ accessed at: http://www.programevaluation.wa.gov.au/ 17 May 2017 accessed 17 May 2017 59 Siminski, P, 2016. How to get a better bang for the 48 The Government of Western Australia. 2015. taxpayers’ buck in all sectors, not only Indigenous Evaluation Guide, available at: http://www. programs. The Conversation, 23 August 2016 treasury.wa.gov.au/uploadedFiles/Treasury/ available at https://theconversation.com/how- Program_Evaluation/evaluation_guide.pdf, to-get-a-better-bang-for-the-taxpayers-buck-in- accessed 17 May 2017 all-sectors-not-only-indigenous-programs-64296 49 Northern Territory Government, n.d. Aboriginal accessed 17 May 2017 Affairs Monitoring, Evaluation and Reporting 60 As above Framework (MERF) available at https://dlgcs. 61 Holman, D, 2014, A Promising Future: WA nt.gov.au/interpreting/?a=228592 accessed 17 Aboriginal Health Programs Review of performance May 2017 with recommendations for consolidation and 50 ACT Government. 2010. ACT Government advance Government of Western Australia Evaluation Policy and Guidelines, available at: Department of Health available at http://www.cmd.act.gov.au/__data/assets/ http://ww2.health.wa.gov.au/~/media/Files/ pdf_file/0004/175432/ACT-Evaluation-Policy- Corporate/Reports%20and%20publications/ Guidelines.pdf accessed 17 May 2017 Holman%20review/a-promising-future-wa- 51 Victorian Government Department of Primary aboriginal-health-programs.ashx accessed 17 May Industries, 2017, Gathering evidence using data: 2017 the implications and advantages of undertaking 61 Bauer, M, Damschroder, L, Hagedorn, H, Smith, evaluation research in-house. In Australian J and Kilbourne, A. 2015. An introduction to Evaluation Society Conference. Canberra, 2009 implementation science for the non-specialist. available at https://www.aes.asn.au/images/ BMC Psychology, 3:32, 1-12. 
available at: stories/files/conferences/2009/documents/ https://bmcpsychology.biomedcentral.com/ Narelle%20Fitzgerald.pdf accessed 17 May 2017 articles/10.1186/s40359-015-0089-9 accessed 14 52 Oxfam blogs. 2012. Getting evaluation right: a five May 2017 point plan, available at: https://oxfamblogs.org/ fp2p/getting-evaluation-right-a-five-point-plan/ 62 Cobb-Clark, D, 2012. The case for making public accessed 17 May 2017 policy evaluations public. In Australian Productivity Commission, 2012, Roundtable Proceedings. In 53 Cobb-Clark, D, 2012. The case for making public Better Indigenous Policies: The Role of Evaluation. policy evaluations public. In Australian Productivity Canberra, 22–23 October 2012, pages 81-91 Commission, 2012, Roundtable Proceedings. In available at http://www.pc.gov.au/research/ Better Indigenous Policies: The Role of Evaluation. supporting/better-indigenous-policies/07-better- Canberra, 22–23 October 2012, pages 81-91 indigenous-policies-chapter5.pdf accessed 17 May available at http://www.pc.gov.au/research/ 2017 supporting/better-indigenous-policies/07-better- indigenous-policies-chapter5.pdf accessed 17 May 2017 54 As above

54 | Evaluating Indigenous programs: a toolkit for change

63 Griffiths A, Zmudzki F, Bates S, 2017, Evaluation of ACT Extended Throughcare Program: Final Report (SPRC Report 02/17). Sydney: Social Policy Research Centre, UNSW available at https://www.sprc.unsw.edu.au/media/SPRCFile/Evaluation_of_ACT_Extended_Throughcare_Pilot_Program.pdf accessed 17 May 2017

64 SCRGSP (Steering Committee for the Review of Government Service Provision) 2016, Overcoming Indigenous Disadvantage: Key Indicators 2016, Productivity Commission, Canberra available at http://www.pc.gov.au/research/ongoing/overcoming-indigenous-disadvantage/2016/report-documents/oid-2016-overcoming-indigenous-disadvantage-key-indicators-2016-report.pdf accessed 17 May 2017

65 As above

66 SCRGSP (Steering Committee for the Review of Government Service Provision) 2016, Overcoming Indigenous Disadvantage, as above

67 As above

68 Victoria Department of Treasury and Finance, 2011, Guide to Evaluation: How to plan and conduct effective evaluation for policy and programs available at www.dtf.vic.gov.au/files/41bba0b5-9ef3-4232.../DTF-Guide-to-Evaluation-2005.doc accessed 17 May 2017

69 Fitzpatrick, S, 2016. Ganbina a shining light for indigenous accountability, The Australian, 24 August 2016 available at http://www.theaustralian.com.au/national-affairs/indigenous/ganbina-a-shining-light-for-indigenous-accountability/news-story/201b36dbe8655cf83c18d2964b93ea3e accessed 17 May 2017

70 Victoria Department of Planning and Community Development, 2008. Evaluation Step-by-step Guide, available at: http://www.dhs.vic.gov.au/__data/assets/pdf_file/0011/769943/Evaluation-Step-by-Step-Guide.pdf accessed 17 May 2017

71 Fred Hollows Foundation, 2012. The Women's Development Project: Indigenous Australia Program Evaluation Report available at http://www.healthinfonet.ecu.edu.au/uploads/resources/23392_23392.pdf accessed 17 May 2017

72 FSG. 2016. The Case for Developmental Evaluation, available at: http://www.fsg.org/blog/case-developmental-evaluation accessed 17 May 2017

73 Urbis, 2015, Ability Links NSW Evaluation Interim Report 4

74 Social Ventures Australia, 2017. The Martu Leadership Program: Evaluation of a pilot program using the Social Return on Investment (SROI) methodology, available at https://static1.squarespace.com/static/54e3fc54e4b08f2ad4349a65/t/58fd4da9d1758eec4bd06271/1492995520775/170420+Kanyirninpa+Jukurrpa+Evaluation+of+the+Martu+Leadership+Program.pdf accessed 17 May 2017, page 20

75 James, M, 2012. Designing evaluation strategies. In Australian Productivity Commission, 2012, Roundtable Proceedings. In Better Indigenous Policies: The Role of Evaluation. Canberra, 22–23 October 2012. Canberra: Productivity Commission available at http://www.pc.gov.au/research/supporting/better-indigenous-policies/09-better-indigenous-policies-chapter7.pdf accessed 17 May 2017

76 Sabel, J, and Jordan, J, 2015. Doing, Learning, Being: Some Lessons Learned from Malaysia's National Transformation Program. 1st ed. Washington: The World Bank Group available at http://www2.law.columbia.edu/sabel/papers/CS-LSJ--DLB%20Malaysia%20PEMANDU--Final-190115.pdf accessed 17 May 2017

77 As above

78 Jordan, K, and Fowkes, L, 2016. Job Creation and Income Support in Remote Indigenous Australia: Moving forward with a better system. Centre for Aboriginal Economic Policy Research, CAEPR Topical Issue No. 2/2016, available at: http://caepr.anu.edu.au/sites/default/files/Publications/topical/CAEPR%20Topical%20Issues%202_2016.pdf accessed 17 May 2017

79 James, M, 2012. Designing evaluation strategies. In Australian Productivity Commission, 2012, Roundtable Proceedings. In Better Indigenous Policies: The Role of Evaluation. Canberra, 22–23 October 2012. Canberra: Productivity Commission available at http://www.pc.gov.au/research/supporting/better-indigenous-policies/09-better-indigenous-policies-chapter7.pdf accessed 17 May 2017

80 New South Wales Department of Premier and Cabinet, 2016. NSW Government Program Evaluation Guidelines, available at: http://www.dpc.nsw.gov.au/programs_and_services/policy_makers_toolkit/evaluation_in_the_nsw_government accessed 17 May 2017

81 See Entity Reports for complying with the Senate Order on Procurement Contracts and use of Confidentiality Provisions 2015/2016 Financial Year from the AusTender website

82 Morley, S, 2015. What works in effective community managed programs, Child Family Community Australia, CFCA Paper No. 32: Canberra available at https://aifs.gov.au/cfca/publications/what-works-effective-indigenous-community-managed-programs-and-organisations accessed 17 May 2017

83 As above

84 Moran, M, 2016. How community-based innovation can help Australia close the Indigenous gap. The Conversation, 4 March 2016 available at https://theconversation.com/how-community-based-innovation-can-help-australia-close-the-indigenous-gap-54907 accessed 17 May 2017

85 As above

86 FSG. 2016. The Case for Developmental Evaluation, available at: http://www.fsg.org/blog/case-developmental-evaluation accessed 17 May 2017

87 New South Wales Department of Premier and Cabinet, 2016. NSW Government Program Evaluation Guidelines, available at: http://www.dpc.nsw.gov.au/programs_and_services/policy_makers_toolkit/evaluation_in_the_nsw_government accessed 17 May 2017

88 Clarke, H, and Andersson, A, 2004, Theories of Change and Logic Models: Telling Them Apart. Presentation at American Evaluation Association, Atlanta, Georgia, November 2004 available at https://www.theoryofchange.org/wp-content/uploads/toco_library/pdf/TOCs_and_Logic_Models_forAEA.pdf accessed 17 May 2017

89 Victoria Department of Planning and Community Development, 2008. Evaluation Step-by-step Guide, available at: http://www.dhs.vic.gov.au/__data/assets/pdf_file/0011/769943/Evaluation-Step-by-Step-Guide.pdf accessed 17 May 2017

90 Australian Productivity Commission, 2013, Summary of Roundtable discussions. In Better Indigenous Policies: The Role of Evaluation, Roundtable Proceedings, Productivity Commission, Canberra, page 3 available at http://www.pc.gov.au/research/supporting/better-indigenous-policies/02-better-indigenous-policies-summary.pdf accessed 17 May 2017

91 United States Government Accountability Office, 2012. Designing Evaluations 2012 Revision. Washington available at http://www.gao.gov/assets/590/588146.pdf accessed 17 May 2017

92 As above

93 James, M, 2012. Designing evaluation strategies. In Australian Productivity Commission, 2012, Roundtable Proceedings. In Better Indigenous Policies: The Role of Evaluation. Canberra, 22–23 October 2012. Canberra: Productivity Commission available at http://www.pc.gov.au/research/supporting/better-indigenous-policies/09-better-indigenous-policies-chapter7.pdf accessed 17 May 2017

94 As above

95 United States Government Accountability Office, 2012. Designing Evaluations 2012 Revision. 2nd ed. Washington: http://www.gao.gov/assets/590/588146.pdf

96 New South Wales Department of Premier and Cabinet, 2016. NSW Government Program Evaluation Guidelines, available at: http://www.dpc.nsw.gov.au/programs_and_services/policy_makers_toolkit/evaluation_in_the_nsw_government accessed 17 May 2017

97 Ackerman, F., 2008, Critique of Cost-Benefit Analysis, and Alternative Approaches to Decision-Making, available at: http://www.ase.tufts.edu/gdae/pubs/rp/ack_uk_cbacritique.pdf, accessed Feb 1 2016

98 Daly, A. and Barrett, G., n.d., Independent Cost Benefit Analysis of the Yuendumu Mediation and Justice Committee, University of Canberra, available at www.centraldesert.nt.gov.au/sites/centraldesert.nt.gov.au/files/attachments/yuendumu_cba_0.pdf, accessed Jan 26 2016

99 Daly and Barrett, n.d. as above and Access Economics, 2006, Opal Cost Benefit Analysis, available at: http://www.npywc.org.au/wp-content/uploads/2011/09/OpalReport2006_02_23.pdf, accessed Jan 26 2016

100 Daly and Barrett, n.d. as above

101 Daly and Barrett, n.d. as above and Allen Consulting Group, 2011, Assessment of the economic and employment outcomes of the Working on Country program, available at: http://www.environment.gov.au/indigenous/workingoncountry/publications/pubs/woc-economics.pdf, accessed Jan 26 2016

102 Nicholls, J., Lawlor, E., Neitzert, E. and Goodspeed, T., 2009, A guide to Social Return on Investment, available at: http://socialvalueuk.org/publications/publications/doc_download/241-a-guide-to-social-return-on-investment-2012, accessed Feb 1 2016

103 Social Ventures Australia (SVA), 2012, Social Return on Investment: Lessons learned in Australia, available at: http://socialventures.com.au/assets/SROI-Lessons-learned-in-Australia.pdf, accessed Feb 4 2016

104 As above

105 Social Ventures Australia (SVA), 2012, Social Return on Investment: Lessons learned in Australia, available at: http://socialventures.com.au/assets/SROI-Lessons-learned-in-Australia.pdf, accessed Feb 4 2016


About the Author

Sara Hudson

Sara Hudson is a Research Fellow and Manager of the Indigenous Program at the Centre for Independent Studies. She has published widely on Indigenous policy for the CIS, with a particular focus on Indigenous programs, economic development, health and criminal justice.

Research Report 28 (RR28) • ISSN: 2204-8979 (Printed) 2204-9215 (Online) • ISBN: 978-1-922184-87-0
Published June 2017 by The Centre for Independent Studies Limited. Views expressed are those of the authors and do not necessarily reflect the views of the Centre's staff, advisors, directors or officers.
© The Centre for Independent Studies (ABN 15 001 495 012), 2016
This publication is available from The Centre for Independent Studies. Visit www.cis.org.au.

Level 1, 131 Macquarie St, Sydney NSW 2000 • phone: +61 2 9438 4377 • fax: +61 2 9439 7310 • email: [email protected]