8. Contestability

8.1    Notification of AI affecting rights

You should notify individuals, groups, communities or businesses when an administrative action materially influenced by an AI system has a legal or similarly significant effect on them. This notification should state that the action was materially influenced by an AI system and include information on available review rights and whether and how the individual can challenge the action. 

An action produces a legal effect when it affects the legal status or rights of an individual, group, community or business, and includes:

  • provision of benefits granted by legislation
  • contractual rights.

An action produces a similarly significant effect when it affects the circumstances, behaviours or choices of an individual, group, community or business, and includes: 

  • denial of consequential services or support, such as housing, insurance, education enrolment, criminal justice, employment opportunities and health care services
  • provision of basic necessities, such as food and water.

A decision may be considered to have been materially influenced by an AI system if:

  • the decision was automated by an AI system, with little to no human oversight
  • a component of the decision was automated by an AI system, with little to no human oversight (for example, an AI system determines the first 2 limbs of a decision, with the final limb determined by a human)
  • the AI system's output is likely to influence the decision (for example, the AI system recommended a decision to a human for consideration or provided substantive analysis to inform the decision). 

‘Administrative action’ is any of the following:

  • making, refusing or failing to make a decision
  • exercising, refusing or failing to exercise a power
  • performing, refusing or failing to perform a function or duty.

Note: this guidance is designed to supplement, not replace, existing administrative law requirements pertaining to notification of administrative decisions. The Attorney-General’s Department is leading work to develop a consistent legislative framework for automated decision making (ADM), as part of the government’s response to recommendation 17.1 of the Robodebt Royal Commission Report. The Australian Government AI assurance framework will continue to evolve to ensure alignment as this work progresses.

8.2    Challenging administrative actions influenced by AI

Individuals, groups, communities or businesses subject to an administrative action materially influenced by an AI system that has a legal or similarly significant effect on them should be provided with an opportunity to challenge this action. This is an important administrative law principle. See guidance on section 8.1 above for assistance interpreting terminology.

Administrative actions may be subject to both merits review and judicial review. Merits review considers whether a decision made was the correct or preferable one in the circumstances, and includes internal review conducted by the agency and external review processes. Judicial review examines whether a decision was legally correct.

You should ensure that review rights that ordinarily apply to human-made decisions or actions are not impacted or limited because an AI system has been used.

Notifications discussed at section 8.1 should include information about available review mechanisms so that people can make informed decisions about disputing administrative actions.

You will need to ensure a person within your agency is able to answer questions in a court or tribunal about an administrative action taken by an AI system if that matter is ultimately challenged. Review mechanisms also affect the obligation to provide reasons. For example, the Administrative Decisions (Judicial Review) Act 1977 gives applicants a right to reasons for administrative decisions.

9. Accountability
