Sarah Sacher, Economic Justice Australia, Law Reform Officer
Evidence provided to the Robodebt Royal Commission has highlighted the many systemic failings that led to the introduction of the Robodebt scheme, and the extensive human and financial costs that were subsequently incurred.
Now is the time to learn lessons from these failings, in order to prevent similar problems from recurring in the future. EJA is currently assessing the broader implications of automated decision-making (ADM) and artificial intelligence (AI) in social security systems, as part of a new long-term project on ‘Automation in Social Security’. This project will draw on the insights of EJA member organisations and provide evidence-based recommendations for reform.
For the moment, there are immediate lessons which are clear and require prompt action. EJA has made contributions to the Royal Commission through a Statement of Evidence and a submission. We highlighted a number of forward-looking insights for reform of social security systems, including regarding the use of ADM and AI. Below is a summary of the key takeaways.
Bringing our Social Security system into compliance with not just the law, but also ethical and human rights standards
Much of the Royal Commission’s inquiry has focused on the lawfulness of the Robodebt scheme. This may distract from the crucial point that, regardless of lawfulness, it was known from an early stage that the system was inaccurate and unfair.
The Australian Government is obligated to comply with the rule of law, public law principles and human rights standards when developing policy and making administrative decisions. These standards apply regardless of how a system operates or the kind of technology that is used in the decision-making process. Social security systems, including those that utilise ADM and AI, must be brought into line with these standards.
Importantly, these standards should be applied when considering the suitability of the use of ADM and AI in administrative systems, noting that they will not be appropriate in all contexts. Factors to be considered include:
- the risk of breaching administrative law requirements or human rights obligations;
- whether discretionary decision-making is involved – noting that ADM systems are inherently rigid, and are unsuited to making discretionary decisions or taking into account individual circumstances;
- whether accuracy of decision-making may be compromised; and
- whether there is a high level of vulnerability in the affected cohort, and a risk of harm to that cohort through use of the proposed system.
Where ADM/AI tools are judged suitable for use, they should then be carefully designed to be compliant, including through early consideration of rights impacts, consultation, and rigorous testing. Wherever ADM and AI systems are adopted for administrative decision-making, it is essential that they have a basis in legislation – so that there is a clear legal authority for the use of the tool, and transparency around its use in the particular context.
Compliance systems must also be lawful under the Social Security Act. To bring Services Australia into alignment with the Act, it must meet its obligation to bear the onus of proving that a person has been overpaid a social security or family assistance payment, instead of requiring recipients to disprove a debt upon notice. The basis on which any Centrelink debt has been calculated must be explained to the person affected and to any reviewer in a way that is clear, intelligible, and transparent.
Considering the vulnerability of the social security cohort
Some of the most vulnerable people in our communities are social security recipients. These include elderly people, people with disability, people who are homeless, and First Nations people in remote communities. The Robodebt scheme was deeply inaccessible and not fit-for-purpose for these cohorts (or any cohort). Given the financial instability of many social security recipients, it is absurd that the scheme was designed without considering how its averaging procedure would affect people without stable income or consistent working hours.
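To illustrate why averaging fails people with uneven earnings, the following sketch compares a fortnight-by-fortnight income test with a Robodebt-style recalculation that smears annual income evenly across the year. The payment rate, income-free area and taper used here are hypothetical placeholders, not actual Centrelink parameters; the point is the structural error, not the figures.

```python
# Hypothetical, simplified income-test parameters (not real Centrelink rates).
FORTNIGHTS = 26       # fortnights in a year
BASE_RATE = 500.0     # maximum fortnightly payment
FREE_AREA = 150.0     # fortnightly income-free area
TAPER = 0.5           # payment reduction per dollar above the free area

def entitlement(fortnightly_income: float) -> float:
    """Payment for one fortnight under a simple means test."""
    excess = max(0.0, fortnightly_income - FREE_AREA)
    return max(0.0, BASE_RATE - TAPER * excess)

# A common Robodebt scenario: 13 fortnights on payment with no earnings
# (e.g. full-time study), then 13 fortnights of full-time work off payment.
on_payment_income = [0.0] * 13
off_payment_income = [1300.0] * 13

# Correct assessment: apply the income test to each fortnight actually
# on payment. The person reported accurately and was paid accordingly.
actual_paid = sum(entitlement(i) for i in on_payment_income)  # 13 x 500 = 6500

# Robodebt-style recalculation: take the annual ATO income figure and
# average it evenly, attributing that income to every on-payment fortnight.
annual_income = sum(on_payment_income) + sum(off_payment_income)  # 16900
averaged = annual_income / FORTNIGHTS                             # 650 per fortnight
recalculated = sum(entitlement(averaged) for _ in on_payment_income)

# Averaging manufactures a "debt" even though every report was accurate:
# income earned while OFF payment is smeared into fortnights ON payment.
spurious_debt = actual_paid - recalculated  # 6500 - 3250 = 3250
```

Under the fortnightly test the person was entitled to everything they received; the averaged recalculation halves their assessed entitlement and so asserts a debt of thousands of dollars that never existed.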
The vulnerable groups that make up the social security recipient cohort should be a core consideration for Services Australia across all of its operations, since its raison d’etre is to provide a service to those cohorts. Social security compliance processes need to be designed to take into account their particular vulnerabilities, disadvantages and accessibility needs.
Testing administrative systems to ensure they are fit for purpose
Income reporting compliance processes relying on ADM/AI must be rigorously tested and audited by an independent, expert agency prior to implementation, and routinely thereafter. This should be mandatory under legislation.
These systems should be subsequently implemented with appropriate human oversight, and include a ‘human in the loop’ to ensure accountability over decision-making – decisions that affect people’s rights and entitlements should not be made solely through automation with no human involvement. Training for Centrelink staff is necessary to maintain the level of skill required to provide effective oversight.
All automated systems used by government in administering the law to determine individual interests, obligations and rights must be fully transparent and explained in a way that is comprehensible to the public. If this is not possible with a particular system, it should not be used.
Ensuring individuals can challenge decisions and access review
There are a number of barriers affecting access to internal and external review of social security compliance decisions. In relation to Robodebt, these included:
- Inadequate debt notices that provided no detail about how the debt was incurred and calculated, which meant they were difficult to challenge.
- Centrelink compliance staff responding to queries about Robodebts were generally unable to explain the basis of the Robodebt calculation.
- EJA members had to resort to Freedom of Information requests to obtain basic information about how debts were incurred and calculated.
- Internal Centrelink reviews were routinely denied unless Robodebt recipients provided payslips for the period in question (which many recipients were unable to do).
- Where the AAT Tier 1 made decisions to set aside Robodebts because they were not lawfully raised, it was the practice of Services Australia not to appeal to the AAT General Division. As a result, legitimate scrutiny of the Robodebt scheme was avoided.
Efficient, fair, accessible, independent review of income reporting compliance decisions must be available. The announcement of the abolition of the AAT in favour of a new model is an opportunity to address some of the issues related to external reviews. Additionally, people affected by income reporting compliance decisions need access to free and independent specialist social security legal assistance, which requires resourcing of community legal centres.
Ending unethical debt recovery practices
Vulnerable Robodebt recipients were exposed to increasingly coercive threats made by Centrelink debt recovery staff and by externally contracted debt collectors, and were charged debt recovery fees.
Existing debt recovery practices must be reformed, through the introduction of a time bar (EJA suggests a maximum of six years after any payment was received), the abolition of debt recovery fees, and obligations on third party debt collectors to act in accordance with public sector standards.
You can read our full submission, and a summary submission with recommendations here.