Introduction
The Pilot Australian Government artificial intelligence (AI) assurance framework (the framework) guides Australian Government agencies through impact assessment of AI use cases against Australia's AI Ethics Principles. It is intended to complement and strengthen – not duplicate – existing frameworks, legislation and practices that touch on government’s use of AI.
The draft framework should be read and applied alongside the Policy for the responsible use of AI in government and existing frameworks and laws to ensure agencies are meeting all their current obligations. Above all, Australian Government agencies must ensure their use of AI is lawful, constitutional and consistent with Australia’s human rights obligations and reflect this in the planning, design and implementation of AI use cases from the outset.
Assurance is an essential part of the broader governance of government AI use. In June 2024, the Australian Government and all state and territory governments endorsed the National framework for the assurance of artificial intelligence. The national framework establishes a nationally consistent, principles-based approach to AI assurance that places the rights, wellbeing and interests of people first. By committing to these principles, governments are seeking to secure public confidence and trust that their use of AI is safe and responsible.
This pilot assurance framework is exploring mechanisms to support Australian Government implementation of these nationally agreed principles. Evidence gathered through the pilot will inform the DTA’s recommendations to government on future AI assurance mechanisms, as part of next steps for the Policy for the responsible use of AI in government.
The framework will continue to evolve over time. Please email the Digital Transformation Agency (DTA) at aistandards@dta.gov.au if you have any questions regarding the framework.
AI use cases covered by the framework
For the purposes of the framework, agencies should apply the Organisation for Economic Co‑operation and Development (OECD) definition of AI:
Check OECD.AI for any updates to this definition.
The framework applies regardless of whether an AI model or system is procured, built or otherwise sourced or adapted.
An AI use case is a covered AI use case if it has reached the ‘design, data and models’ AI lifecycle stage (see section 1.2 below and in the guidance) and if any of the following apply.
- The estimated whole‑of‑life cost of the project or service incorporating the use of AI is more than $10 million. This coverage is consistent with other whole-of-government digital policies to ensure agencies understand that AI is classified as a digital and ICT investment and must meet Investment Oversight Framework obligations.
- It is possible the use of AI will lead to more than insignificant harm to individuals, communities, organisations or the environment.
- The use of AI will materially influence decision-making that affects individuals, communities, organisations or the environment.
- It is possible the AI will either directly interact with the public or produce outputs that are not subject to human review prior to publication.
- It is deemed a covered AI use case by the DTA.
For the avoidance of doubt, you are not required to undertake an assessment under this framework if you are doing early-stage experimentation which does not:
- commit you to proceeding with a use case or to any design decisions that would affect implementation later
- commit you to expending significant resources or time
- risk harming anyone
- introduce or exacerbate any privacy or cybersecurity risks
- produce outputs that will form the basis of policy advice, service delivery or regulatory decisions.
However, you may still find the framework useful and choose to apply it. Agencies are encouraged to apply the framework as appropriate for AI use cases that are not covered. Early and iterative engagement with the framework throughout the lifecycle of an AI use case will assist you to identify, manage and mitigate risks more effectively.
Assessment process
Each use case assessment should be completed by a designated assessment contact officer who oversees the end-to-end assessment process. Assessment contact officers should first complete sections 1 to 3 and then, if the outcome of the threshold assessment at section 3 requires it, complete a full assessment.
The assessment sections do not need to be completed in the numbered order – assessment contact officers may choose to work on sections concurrently and revisit sections as needed. Assessment contact officers will likely need to seek input from other staff in their agency to complete some sections, and in some cases may need external advice. Refer to the accompanying draft guidance material, which is intended as an interpretation aid.
Below is a simplified process map for completing an assessment under this framework.
1. Is it a ‘covered AI use case’ and have you progressed beyond early experimentation to the design, data and models stage or beyond?
   - If no, you do not need to complete this assessment.
   - If yes, complete sections 1 to 3 of the framework and proceed to question 2.
2. Does the threshold assessment at section 3 of the framework indicate that you need to complete a full assessment?
   - If no, you have completed the assessment.
   - If yes, complete the full framework assessment and proceed to question 3.
3. Your agency’s responsible governance body must conduct an internal review of the assessment. After applying the body’s recommended risk treatments, is the use case’s residual risk high?
   - If no, the residual risk is medium or below. Your agency may choose to accept this residual risk and proceed with the use case. You have completed the assessment.
   - If yes, proceed to question 4.
4. Consider whether the assessment would benefit from external review, which may recommend further risk treatments or adjustments to the use case. Your agency must:
   - consider any recommendations from external review and decide which to implement
   - decide whether to accept any residual risk and proceed with the use case.
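The decision flow above can be traced as a short illustrative sketch. This is not part of the framework itself, which is a manual governance process; the function name, inputs and step descriptions are hypothetical, chosen only to mirror the four questions in the process map.

```python
def assessment_path(covered: bool,
                    full_assessment_required: bool,
                    residual_risk_high: bool) -> list[str]:
    """Trace the simplified process map as an ordered list of steps.

    Inputs correspond to the answers at questions 1, 2 and 3 of the
    process map. All names here are illustrative, not from the framework.
    """
    steps = []
    # Question 1: covered use case past early experimentation?
    if not covered:
        steps.append("No assessment required under this framework")
        return steps
    steps.append("Complete sections 1 to 3")
    # Question 2: threshold assessment outcome.
    if not full_assessment_required:
        steps.append("Assessment complete")
        return steps
    steps.append("Complete full framework assessment")
    # Question 3: internal review and residual risk after treatments.
    steps.append("Internal review by responsible governance body")
    if not residual_risk_high:
        steps.append("Agency may accept medium-or-below residual risk and proceed")
        return steps
    # Question 4: external review and final agency decision.
    steps.append("Consider external review; agency decides which "
                 "recommendations to implement and whether to accept residual risk")
    return steps
```

For example, a covered use case whose threshold assessment does not trigger a full assessment yields only the first two steps and ends there.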
Remember to review completed assessments when use cases move to a different lifecycle stage or significant changes occur to the scope or function of the use case.