A royal commission is investigating how Australia’s automated debt-raising tool – which issued 500,000 incorrect debt notices and led to the biggest class-action legal case in the country’s history – was allowed to cause so much damage. Beckie Smith looks at the story so far

Justice Murphy’s description of the Australian government’s failed Robodebt programme is one that is both distinctively Antipodean and familiar to civil servants across the globe.

As he rubber-stamped a settlement of nearly two billion Australian dollars over the automated debt-raising system that had wrongly attempted to claw back payments from benefit recipients, he said: “I am reminded of the aphorism that, given a choice between a stuff-up, even a massive one, and a conspiracy, one should usually choose a stuff-up.”

The use of algorithms, data analytics and automated systems in public services has come under the spotlight in recent years, particularly their potential for harm if safeguards are not built in at every stage. For those who have been following the story of Robodebt, it has become painfully apparent that such safeguards were not in place.

Introduced in 2016, the Online Compliance Intervention programme – to use its formal name – was administered by Services Australia, which is broadly equivalent to the UK’s Department for Work and Pensions. It used an automated tool to calculate and recover overpayments and other money owed by recipients of benefits through Australia’s Centrelink welfare programme.

But the programme used a flawed method of income averaging to compare welfare recipients’ reported incomes with their income as recorded by the Australian Tax Office. Where it determined that there had been an overpayment, recipients were automatically issued with debt notices.

In the four years it was in operation, Robodebt issued nearly 500,000 incorrect debt notices. The Australian press was flooded with stories of people being shocked and pushed into hardship by unexpected and aggressive demands for large repayments. 

The case led to what is believed to be the biggest class-action suit in Australian legal history, which the government settled in 2021 to the tune of AU$1.8bn (£1bn) in refunds, legal fees, cancelled debts and compensation.

Stuff-up or conspiracy?

It should have been “obvious” to ministers and public servants, the judge who approved the 2021 settlement said, that the income-averaging calculations on which Robodebt was built were flawed.

“Ministers and senior public servants should have known that income averaging based on [Australian Tax Office] data was an unreliable basis upon which to raise and recover debts from social security recipients,” Justice Murphy said.

The scheme did not account for variation in people’s working hours from week to week. Instead, it averaged annual tax data into an “assumed” fortnightly income, smearing earnings evenly across the year and producing false overpayment estimates for anyone whose income fluctuated.
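The effect of averaging on a fluctuating income can be illustrated with a simplified sketch. The figures and the income-free threshold below are hypothetical, chosen only to show the mechanism, and are not taken from the scheme itself:

```python
# Hypothetical illustration of the income-averaging flaw.
# A casual worker earns all of their annual income in 13 fortnights and
# nothing in the other 13, legitimately claiming benefits while not working.

annual_income = 26_000          # total income reported to the tax office
fortnights_per_year = 26
income_free_threshold = 500     # hypothetical: earnings below this keep the full benefit

# Actual fortnightly earnings: 13 fortnights of work, 13 with no work at all.
actual = [2_000] * 13 + [0] * 13

# Averaging smears the annual total evenly across every fortnight.
averaged = [annual_income / fortnights_per_year] * fortnights_per_year  # 1,000 each

# Under averaging, every fortnight appears to exceed the threshold, so the
# system "finds" overpayments even in fortnights with zero earnings.
flagged_actual = sum(1 for e in actual if e > income_free_threshold)
flagged_averaged = sum(1 for e in averaged if e > income_free_threshold)

print(flagged_actual)    # 13 fortnights genuinely over the threshold
print(flagged_averaged)  # 26 fortnights falsely appear over it
```

The averaged figure is identical in every fortnight, so the calculation can never reflect the weeks in which a recipient earned nothing and was fully entitled to support.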

This oversight led to a “shameful chapter in the administration of the commonwealth social-security system and a massive failure of public administration”, Murphy said.

There are a number of ways the harm and controversy that arose from the use of Robodebt could have been avoided, according to Rich Wentel, a deputy director in the UK’s Public Sector Fraud Authority.
Speaking at a recent CSW webinar on fraud detection and fairness in the public sector, Wentel said:

“There are two key lessons that come out of Robodebt and others, which are: if you don’t have the expert people in the beginning to set the framework correctly, you will get rogue outcomes or you have a higher propensity for that. And secondly, if you just believe what the computer says, you’re then hostage to fortune.”

While data analytics and algorithms can be useful tools in helping to identify fraud, they should be used to highlight anomalies, “never to give an absolute ‘pay/no-pay’ decision,” he said.

“Data flags – or the reliance on data flags – should never be the be-all and end-all,” he explained. “It’s highlighting things that don’t fit a pattern. And if you’ve done your risk rules correctly, then it will throw out a population of high-risk cases that need review. But that’s the key thing: it needs review.”
Robodebt lacked this critical review stage: debt notices were issued automatically without any intervention from officials, or any further investigation of the “overpayments” flagged by the system.

By contrast, the UK’s PSFA uses a multi-step approach to identifying and tackling fraudulent activity for the public bodies it works with. The authority conducts large-scale data-matching exercises for local government bodies, NHS trusts and other organisations, using analytics to flag cases that need reviewing.

“We provide the matches to the customer and then the customer does compliance work, investigation work, and then comes back and tells us if it’s fraud or not,” Wentel says. “There are false positives in there; there always will be. It’s a question of managing the false positive rate against the impact.”

Hostage to fortune

While the decision to rely solely on an automated system to detect fraud and recover overpayments left the Australian government a hostage to fortune, evidence continues to emerge that there were several opportunities to change Robodebt’s outcome.

Hearings have been taking place in recent weeks as part of a royal commission set up last August to probe the design and implementation of the scheme. Witnesses’ testimony is beginning to reveal the extent to which warnings about the programme’s potential to do harm were seemingly ignored.

In November, the commission was told that the Department of Social Services, Services Australia’s parent department, had received legal advice on Robodebt from a private law firm in 2018 that cast doubt on the scheme’s lawfulness. Despite the advice finding that the use of income averaging to raise welfare debts was unlawful, the scheme continued to run until a legal challenge was launched in November 2019.

In a February hearing, the inquiry then heard that a review commissioned by Services Australia in 2017 was discontinued despite finding “a lot of flaws” in the scheme. 

“Our view was that there were things [in the scheme] that should have been considered but weren’t,” Shane West, a partner at PwC, told the commission. West said the consulting giant had “put considerable time” into developing a 70-page draft report that it was later told was no longer required.

In the course of that work, PwC found “room for improvement” in the scheme and “clear evidence… savings weren’t going to be achieved”.

The commission will continue to draw out lessons from the botched Robodebt scheme in the coming months. But one lesson is clear: public authorities must be rigorous in ensuring that their automated systems are implemented fairly. As Wentel says: “You have to be very, very clear about fairness. The mantra that we have is that we should be our own biggest critic.” 
