Standard for accountable officials
Version 1.1
Use the following information to support your agency’s implementation of the policy for responsible use of AI in government.
Your responsibilities
Agencies must designate accountability for implementing the policy to accountable official(s) (AOs), who must:
- be accountable for implementation of the policy within their agencies
- notify the Digital Transformation Agency (DTA) where the agency has identified a new high-risk use case by emailing ai@dta.gov.au
- be a contact point for whole-of-government AI coordination
- engage in whole-of-government AI forums and processes
- keep up to date with requirements as they evolve.
The policy does not make AOs responsible for the agency’s AI use cases; however, an agency may decide to apply additional responsibilities to its chosen AOs.
How to apply
Choose a suitable accountable official
Agencies may choose AOs who suit the agency context and structure.
The responsibilities may be vested in an individual or in the chair of a body. The responsibilities may also be split across officials or existing roles (such as Chief Information Officer, Chief Technology Officer or Chief Data Officer) to suit agency preferences.
— Enable and prepare, Policy for the responsible use of AI in government
As policy implementation is not solely focused on technology, AOs may also be selected from business or policy areas. AOs should have the authority and influence to effectively drive the policy’s implementation in their agency.
Agencies may choose to share AO responsibilities across multiple leadership positions.
Agencies are to email ai@dta.gov.au with the contact details of all their AOs both on initial selection and when the accountable roles change.
Implementing the policy
AOs are accountable for their agency’s implementation of the policy. Agencies should implement the entire policy as soon as practicable, with consideration of the agency’s context, size and function.
The mandatory actions of the policy must be implemented within their specified timelines.
The policy provides a coordinated approach for the use of AI across the Australian Government. It builds public trust by supporting the Australian Public Service (APS) to engage with AI in a responsible way. AOs should assist in delivering its aims by:
- uplifting governance of AI adoption in their agency
- embedding a culture that fairly balances AI risk management and innovation
- enhancing their agency’s response and adaptation to AI policy changes
- facilitating their agency’s involvement in cross-government coordination and collaboration.
AOs should also consider the following activities:
- developing a policy implementation plan
- monitoring and measuring the implementation of each policy requirement
- strongly encouraging implementation of AI fundamentals training for all staff
- strongly encouraging additional training for staff in consideration of their role and responsibilities, such as those responsible for the procurement, development, training and deployment of AI systems
- encouraging the implementation of further actions suggested in the policy
- encouraging the development or alignment of an agency-specific AI policy
- reviewing policy implementation regularly and providing feedback to the DTA.
In line with the Standard for Transparency Statements, AOs should provide the DTA with a link to their agency transparency statement, whenever it is published or updated, by emailing ai@dta.gov.au.
Reporting high-risk use cases
AOs must send information to the DTA, by emailing ai@dta.gov.au, in the event their agency identifies a new high-risk use case. This includes when an existing AI use case is re-assessed as high-risk.
The information should include:
- the type of AI
- the intended application
- why the agency assessed the use case as high-risk
- any sensitivities.
This is not intended to prevent agencies from adopting the use case. Instead, it will help government develop risk mitigation approaches and maintain a whole-of-government view of high-risk use cases.
While the risk matrix explains how to approach a risk assessment, it is the responsibility of agencies to determine which risks to assess.
Acting as the agency’s contact point
At times, the DTA will need to collect information and coordinate activities across government to mature the whole-of-government approach and policy.
AOs are the primary point of contact within their agency. They should facilitate connections to the appropriate internal areas and enable information collection and agency participation in these activities.
Engaging with forums and processes
AOs must participate in, or nominate a delegate for, whole-of-government forums and processes which support collaboration and coordination on current and emerging AI issues. These forums will be communicated to AOs as they emerge.
Keeping up to date with changes
The policy will evolve as technology, leading practices and the broader regulatory environment mature. While the DTA will communicate changes, AOs should keep themselves and stakeholders in their agency up to date on:
- changes to the policy
- impacts of policy requirements.
Questions about policy implementation
AOs can contact the DTA with questions about policy implementation by emailing ai@dta.gov.au.