Indigenous evaluation findings: delayed publishing prevented contest of policy

By Stephen Easton

Tuesday February 12, 2019

The federal government has finally published a long-awaited evaluation report on the special work-for-the-dole scheme for remote communities, but it’s unclear where it fits into the policy process.

While it was planned to be finished in mid-2018, the appraisal of the Community Development Programme’s first two years – a period that ended over eighteen months ago – was only made public last week.

The report makes some worthy observations of how the program works and the particular challenges it seeks to address by applying special conditions to 35,000 people receiving welfare benefits in designated remote areas. Unfortunately, the delayed process has significantly reduced the practical public benefit of publishing it.

No doubt the process, part of a wider Indigenous affairs evaluation program over recent years, has influenced departmental thinking. But it seems out of sync with the policy cycle, coming out well after the consultation phase on a set of changes the government now seeks to implement.

The evaluation found only marginal improvements in some measures compared to the CDP’s immediate predecessor, the Remote Jobs and Communities Program, alongside some rather worrying results, especially among the “community voices and stakeholder perspectives” surveyed by the Indigenous engagement consultancy Winangari and market researcher Ipsos.

“Of the almost 1,000 community members surveyed across the eight remote communities, 21 per cent felt the community was better off since CDP was introduced, 32 per cent said the community was the same, and 36 per cent said the community was worse,” reports the Department of the Prime Minister and Cabinet.

Many critics zeroed in on the parts that best supported their existing views that the program is ineffective, excessively punitive and racially discriminatory, but the official findings are not so dramatic.

When asked, Minister for Indigenous Affairs Nigel Scullion has said his new policy adjustments broadly address many of the evaluation's findings, without going into much detail. He has generally refrained, however, from proactively addressing the report or explaining its role in the policy process.

Instead, the minister got out in front and announced his reforms had been “well-received” by a group of six Indigenous leaders, who provided enthusiastic and supportive quotes suggesting the changes address most of the main criticisms that have been lobbed at the CDP since 2015.

Better late than never

While scholars, pundits and politicians awaited the report, policy development marched on. The government proposed its changes to the CDP in the last budget, but it is hard to join the dots from there back to the evaluation.

The reforms followed a brief consultation process run over the preceding Christmas holidays, based around a discussion paper presenting three options, released on December 14, 2017, with submissions accepted until the end of the following February.

Scullion has announced some reforms that begin this March, but other proposals require the Senate to pass a bill that was introduced last August. Some Senators wanted to see the evaluation first.

Queensland-based social science researcher and former state public servant Zoe Staines, who has previously run evaluations for the Cape York Institute for Policy and Leadership, keenly awaited the PM&C report on the CDP for over half a year. She commends the department for running the project, especially since an RJCP evaluation stalled due to an election, but questions the timing.

“It’s too late; consultation’s already wrapped up,” she told The Mandarin.

“Any of the findings in these reports would have been incredibly useful for communities, going into the consultation process, and for providers and participants. As far as I know they didn’t have access to these reports prior to the consultation or during the consultation, although I imagine those leading the consultation may have referred to some of these outcomes, but I’m not sure.”

“That’s problematic. If you’re going to do consultation, in a way that’s supposed to really inform policy development and you’re really taking into consideration what people think about the program and their views and perspectives on what should change, then those views and perspectives need to be as informed as possible. And having access to these evaluation documents is a big part of that.”

Along with insufficient monitoring and evaluation, "Clayton's consultation" (the consultation you have when you're not having one) is among the most common criticisms of public policy processes in general.

Staines observes that Indigenous affairs consultations – including for development of the CDP and the reforms in recent years – are often publicly criticised for being “tokenistic and superficial” and having limited impact on policy.

She sees a need for “better interfaces for government to interact with Indigenous people in a more representative way” but the latest attempt to build such a mechanism, the Indigenous voice proposed in the Uluru Statement, was instantly rejected by the government.

Writing in The Conversation just after the reform bill came out, Staines suggested the evaluation was late right from the get-go, according to the department’s own best practice guidelines:

“The Community Development Programme evaluation design was only developed and signed off between seven and 10 months after the program was implemented (rather than forming part of the program design).

“This contradicts one of the best practice principles for evaluation in Indigenous affairs. There was also no consideration of the initial design by an evaluation reference group.”

A month before that, PM&C told Senator Malarndirri McCarthy it was still “being finalised” and a decision on “the appropriate release” of the information would be made when that was done.

When Senate estimates came back around in October, Scullion told Senator Rachel Siewert the PM&C evaluation report had been “completed and finalised” but had to go back to Indigenous communities, where the research was conducted, for final consultation on its contents.

Update: The Mandarin asked PM&C on Monday evening when the two reports were "finalised and provided to the government". A spokesperson responded on Wednesday morning that the final steps took until "late January" to complete.

For something that took so long to produce, the evaluation does not delve very deep into the CDP or provide particularly conclusive or specific findings about what works and what doesn’t.

Success in the eye of the evaluator

Staines is not surprised the fairly inconclusive results were seized on by the government to demonstrate success, and by critics as proof they were right all along. She says interpretation depends on how one defines success, and notes the evaluation focused heavily on simple measures like work-for-the-dole participation rates.

“But it doesn’t necessarily then speak to whether that is particularly meaningful or not in terms of actually moving people into work. And not just moving them into work, but also the nature of that work and what other implications that attendance can have on people in terms of their overall wellbeing, their temporal autonomy and things like that.”

The report finds little evidence of better outcomes under the CDP’s first two years. Enrolment in “work-like and community activities” increased from 58% to 85% but work-for-the-dole attendance rates remained steady.

Based on modelling, the evaluators estimated a one-percentage-point increase in the share of people reaching 26 weeks in some form of employment under the CDP, up from 5.7% under the RJCP. "This result is consistent with the greater weight placed on 26 consecutive-week employment outcomes in provider and employer payments under the CDP," the report notes.

It is not clear what happened after 26 weeks. “The evaluation report mentions that it is impossible based on the available data, because job seekers are not tracked after that point,” observes Staines.

It also says little about the quality of jobs and the income they provide. According to Scullion, the CDP has helped people into 29,000 paying jobs of some kind since 2015, and about 10,000 of those lasted over six months.

“Certainly under past programs it’s been found the jobs that people move into are often part-time, they’re often flexible, they don’t last very long, they aren’t secure in the long term and they don’t provide a great deal of income – in some cases, even less income than has been received under income support on CDP,” said the social science researcher.

“So in fact having that 26-week outcome, it’s on the surface level a good thing but we have to dig deeper to really understand the implications of that as well.”

The greatest improvement in the 26-week measure was among participants with "low barriers to employment", who are a small minority of participants. Showing its age, the report states the CDP had only been running for "a short time" and it might take "a few years" for outcomes to improve for those facing "extreme barriers" to employment or living in places with limited job opportunities.

That may be so, but the CDP is hardly revolutionary; it shares fundamental approaches with a long line of previous policies that similarly failed to achieve major results. Staines points out that many academics, including herself, would argue this is because such programs do too little to address major "structural factors" present in remote communities and focus too much on modifying the behaviour of individual citizens.

Like its predecessors, the CDP is not just a labour market program akin to the mainstream jobactive system; it also aims to create new jobs in the designated remote areas through employer subsidies: 6000 in the next round of implementation, with the first 1000 announced for next month.

The evaluation confirmed the number of people who dropped out of the program was beginning to increase under the CDP, but doesn’t shed new light on why. The report states it is not clear whether this was linked to the quality of activities, breaches and financial penalties, or because they got jobs and just did not report this to their CDP service provider or the Department of Human Services.

Staines finds the latter suggestion highly unlikely, given the financial incentives for CDP providers to report clients getting jobs. She thinks it more likely that many forgo payments altogether because they find their CDP obligations unhelpful and difficult to manage.

“And that’s pretty scary when you’re considering remote populations are already living in incredible entrenched poverty. In some of my research I’ve found individuals that are spending a year or more without any income or income support because they’ve disengaged with CDP.”

Giving up on Centrelink payments like this obviously leads to deeper poverty and, she adds, this burden is typically shared by other community members who then distribute their income among a larger number of people.

“That’s really really problematic, and it’s what everyone had suspected based on anecdotal evidence, but it’s interesting to see it reported.”

One interesting finding is that while there is much evidence of a far higher “burden of disease” in the remote Indigenous communities where the CDP applies, only 5% of its participants had medical exemptions from their obligations to Centrelink, which is half the rate for recipients of unemployment benefits in the rest of the country.

Monitoring, evaluation, small-scale pilots: who has the time?

As it should, the evaluation report points to its limitations, and in doing so highlights big blindspots in the department’s understanding of the program’s actual outcomes and what is driving them. It also reflects the tension between textbook policy processes and the realities of government.

The reforms involve adjusting the incentive structure of payments to employment service providers and employers, but these policy settings are tricky to get right without some testing. As the issues with jobactive demonstrate, aligning the interests of private businesses with those of participants is hard.

Among six official findings, the report calls for better monitoring and evaluation via new measures of the quality of activity and longer term outcomes beyond the 26-week milestone.

It also suggests small-scale pilots could be a useful way to try out “innovative payment models and employment programs” as these are particularly useful for programs whose precise impacts are uncertain, allowing them to be “refined and evaluated” and reducing the chance of full-scale failure.

However, the report adds an interesting caveat to this, recognising governments might not have the patience to wait for public servants to methodically test policy in this way:

“Where pilots require extensive time to set up, run and evaluate, the benefits of refining policy through pilot programs needs to be balanced against the potential benefits of providing earlier access to new policy arrangements.”

