6. Privacy protection and security

6.1   Minimise and protect personal information

Data minimisation

Data minimisation is an important consideration when developing and deploying AI systems, for reasons including protecting privacy and improving data quality and model stability. In some cases more data may be warranted (for example, when training some large language models), but it is important that you follow good practice in determining the data your use case actually needs.
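For illustration, the sketch below shows one way data minimisation might be applied in practice: filtering records against an allowlist of fields approved for the use case before data enters an AI pipeline. It is a minimal sketch in Python; the field names and the allowlist are hypothetical, and the fields your use case genuinely needs should be determined through your own analysis and privacy advice.

```python
from typing import Iterable

# Hypothetical allowlist of fields approved for this use case. The fields
# actually needed should be determined through your own analysis and in
# consultation with your agency's privacy officer.
APPROVED_FIELDS = {"case_id", "service_type", "outcome"}

def minimise(records: Iterable[dict], approved: set[str] = APPROVED_FIELDS) -> list[dict]:
    """Retain only the fields approved for the use case, dropping
    everything else, including personal information that is not needed."""
    return [{k: v for k, v in record.items() if k in approved} for record in records]

# Example: fields such as 'name' and 'date_of_birth' are discarded
# before the data reaches the AI pipeline.
raw = [{"case_id": 1, "name": "Jane Citizen", "date_of_birth": "1980-01-01",
        "service_type": "housing", "outcome": "approved"}]
print(minimise(raw))
# [{'case_id': 1, 'service_type': 'housing', 'outcome': 'approved'}]
```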

Privacy requirements for personal information under the Australian Privacy Principles (APPs) are an important consideration in responding to this question. Ensure you have considered your obligations under the APPs, particularly APP 3 (collection of solicited personal information), APP 6 (use or disclosure of personal information) and APP 11 (security of personal information).

For more information, consult the APP guidelines, your agency’s privacy officer, and your agency’s internal privacy policy and resources.

Privacy enhancing technologies

Your agency may want or need to use privacy enhancing technologies to help de‑identify personal information under the APPs, or as a risk mitigation and trust-building measure. Under the Privacy Act 1988 (Cth) and the APPs, information that has been appropriately de‑identified is no longer personal information and can be used in ways the Privacy Act would otherwise restrict.

The Office of the Australian Information Commissioner’s (OAIC) website provides detailed guidance on De-identification and the Privacy Act that agencies should consider. You may also wish to refer to the De-identification Decision-Making Framework, jointly developed by the OAIC and CSIRO Data61.
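As an illustration of the kinds of techniques these resources discuss, the sketch below combines pseudonymisation (replacing a direct identifier with a keyed hash) with generalisation (reducing a date of birth to a year). This is a minimal, hypothetical example only: pseudonymised data may still be personal information, and whether a dataset is de‑identified is a contextual judgement to be made using the De-identification Decision-Making Framework, not by applying any single technique.

```python
import hmac
import hashlib

# Illustrative only. Pseudonymised data may still be personal information;
# de-identification is a contextual judgement under the Privacy Act, not
# the result of any single technique. The secret key must be generated and
# stored securely, separately from the data -- the value below is a placeholder.
SECRET_KEY = b"replace-with-a-securely-managed-key"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash so records can be
    linked without exposing the identifier itself."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

def generalise_dob(date_of_birth: str) -> str:
    """Generalise a full date of birth (YYYY-MM-DD) to the year only."""
    return date_of_birth[:4]

record = {"name": "Jane Citizen", "date_of_birth": "1980-06-15", "suburb": "Parkes"}
deidentified = {
    "person_ref": pseudonymise(record["name"]),
    "birth_year": generalise_dob(record["date_of_birth"]),
    # Quasi-identifiers such as suburb may need further generalisation
    # or suppression depending on the re-identification risk.
}
print(deidentified)
```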

6.2   Privacy assessment

The Australian Government Agencies Privacy Code (the Privacy Code) requires Australian Government agencies subject to the Privacy Act 1988 to conduct a privacy impact assessment (PIA) for all ‘high privacy risk projects’. A project may be a high privacy risk if the agency reasonably considers that the project involves new or changed ways of handling personal information that are likely to have a significant impact on the privacy of individuals.

A Privacy Threshold Assessment (PTA) is a preliminary assessment to help you determine your project’s potential privacy impacts and give you a sense of the risk level, including whether it could be a ‘high privacy risk project’ requiring a PIA under the Code. 

This assurance framework does not determine the timing for conducting a PIA or PTA – it may be appropriate to conduct a PIA or PTA earlier than your assessment of the AI use case under this framework.

If no PIA or PTA has been undertaken, explain why and what consideration there has been of potential privacy impacts.

Privacy assessments should consider whether relevant individuals have, where required, provided informed consent to the collection, sharing and use of their personal information in the AI system’s training or operation, or in outputs used to make inferences about them. Also consider how any consent obtained has been recorded, including a description of the processes used to obtain it.
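By way of illustration, a consent record might capture the elements described above in a structure like the following. This is a hypothetical sketch only; the field names are assumptions, and your agency’s record-keeping requirements should determine what is actually captured and where it is stored.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Illustrative structure for recording consent. Field names are assumptions,
# not a prescribed schema; your agency's record-keeping requirements govern
# what is actually captured.
@dataclass
class ConsentRecord:
    individual_ref: str      # pseudonymous reference to the individual
    purpose: str             # what the personal information will be used for
    scope: str               # e.g. "training", "operation", "inference outputs"
    obtained_via: str        # description of the process used to obtain consent
    obtained_at: str         # ISO 8601 timestamp of when consent was given
    withdrawn: bool = False  # consent may be withdrawn later

record = ConsentRecord(
    individual_ref="a1b2c3",
    purpose="eligibility assessment support",
    scope="operation",
    obtained_via="online form with plain-language privacy notice",
    obtained_at=datetime.now(timezone.utc).isoformat(),
)
print(asdict(record))
```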

For more information, you should consult the guidance on the Office of the Australian Information Commissioner’s website. You can also consult your agency’s privacy officer and internal privacy policy and resources.

If your AI system has used or will use Indigenous data, you should also consider whether notions of ‘collective’ or ‘group’ privacy of First Nations people are relevant and refer to the guidelines in the Framework for Governance of Indigenous Data (see 5.2).

6.3   Authority to operate

The Protective Security Policy Framework (PSPF) applies to non‑corporate Commonwealth entities subject to the Public Governance, Performance and Accountability Act 2013 (PGPA Act). 

Refer to the relevant sections of the PSPF on safeguarding information and communication technology (ICT) systems to support the secure and continuous delivery of government business. 

Under the PSPF, entities must effectively implement the Australian Government Information Security Manual (ISM) security principles and must only use ICT systems that the determining authority (or their delegate) has authorised to operate based on acceptance of the residual security risks associated with their operation.

In addition, the Australian Signals Directorate’s Engaging with Artificial Intelligence guidance outlines mitigations for organisations to consider. It is highly recommended that your agency engages with the guidance and implements the mitigations it describes.

AI systems that have already been authorised, or that fall within existing authorisations granted by your agency’s IT Security Adviser (ITSA), do not need to be re‑authorised.

It is recommended you engage with your agency’s ITSA early to ensure all PSPF and ISM requirements are fulfilled. 

7. Transparency and explainability
