Unintended outcomes

This section outlines findings in relation to trial participants’ views of the unintended outcomes of Copilot and generative AI more broadly.

Key insights

Generative AI tools could improve staff inclusivity and strengthen employee value propositions, which could positively affect ways of working and staff satisfaction across the APS.

Risks such as potential job loss, the perpetuation of unconscious bias and western norms, and changes to workers’ skill composition must be effectively managed and monitored to ensure the effective and responsible use of generative AI.

The intended outcomes of the trial were to increase users’ confidence in the use of Copilot, improve productivity and sentiment through regular access and use, and identify and catalogue unintended outcomes.

Trial participants identified a range of unintended benefits and risks which are detailed below.

Generative AI could improve inclusivity and the attractiveness of the APS as a workplace

Generative AI could improve inclusivity and accessibility.

A user can alter the style and language of text in Copilot to their preferred format. For example, focus group participants often raised the benefit of Copilot being able to convert technical concepts and subject matter into simple terms, as well as translate documents into an employee’s preferred language.

Several trial participants highlighted the benefits of Copilot in ensuring their content used an appropriate tone, testing their thinking and refining their outputs, checking the grammar and structure of their content, and explaining concepts in a way they found individually helpful, which enabled them to deliver outputs more effectively.

The potential of Copilot to improve accessibility was also noted. One trial participant in the post-use survey highlighted that they could more easily rewrite their Word documents using accessible language, such as plain English, which could enhance the government’s communication with the public.

The adoption of Copilot and generative AI more broadly in the APS could help the APS attract and retain employees.

The APS must actively compete in the labour market to attract and retain top talent, especially in high-demand fields such as technology and data analytics. The APS Employee Value Proposition aims to address attraction and retention issues at the enterprise level by raising awareness of the APS as an employer and sending a positive message about working for the APS.

Two focus group participants believed that the APS’s adoption of Copilot was important in attracting employees. They believed the APS could be seen as a leader in generative AI and could attract younger generations to choose the APS as a place where they could learn generative AI skills.

The potential importance of generative AI adoption to prospective employees suggests that the APS should further consider the role of generative AI in its Employee Value Proposition.

There are broader concerns regarding generative AI’s impact on jobs, industry and the environment

There are concerns regarding the impact of generative AI on jobs.

Concerns about job loss are common in discussions of AI. These fears stem from the idea that AI will become a more cost-effective, efficient and effective worker than humans. The World Economic Forum’s Future of Jobs Report (2023) argues that the number of clerical or secretarial roles is likely to decline, while roles for AI and machine learning, data analysis and digital transformation specialists are expected to grow. In Australia, 75% of administrative and clerical staff are women.

If we're going to understand the impact [of generative AI], we need a baseline understanding of workforce and gender.

Interviewee from an Australian Government agency

Focus group participants articulated a concern that Copilot would lead to efficiencies that would reduce the number of employees needed to complete the same amount of work across the APS. Two focus group participants expressed concerns that job losses would be concentrated in certain job types and demographics. It was feared that time savings from administrative tasks, such as note-taking and producing minutes, could lead to job loss among administration staff. 

Interviews with Australian Government agencies highlighted that women could be disproportionately impacted as they currently comprise most APS administration staff. Interviewees remarked that entry-level administration jobs are important pathways for women and other marginalised groups to enter the APS and as a result were concerned that generative AI may further create barriers for disadvantaged cohorts to enter the APS.

The potential for generative AI to reduce employment opportunities or significantly alter job roles elicited strong responses. Agencies should engage with staff to manage reactions and provide assurances about generative AI's impact. Additionally, generative AI's impact on employment opportunities across different demographics should be regularly monitored to ensure that specific groups do not experience unfair rates of job disruption or displacement.

Copilot outputs may be biased towards western norms and may not appropriately use cultural data and information.

Generative AI models can inadvertently amplify societal biases present in their training data. This could lead to outputs that reflect historical inequalities and stereotypes.

Trial participants were cognisant of the risks associated with the misappropriation of cultural data and the lack of non-western norms in outputs. For example, one trial participant perceived that Copilot was prioritising western thinking and did not acknowledge the existence of non-western ideas or frameworks. Other trial participants highlighted the challenges Copilot had with using First Nations words often leading to misspelt places or names.

It's difficult to account for a bias that you are yet to identify

Trial participant from the ICT and Digital Solutions job family, pre-use survey

Informal observations made by Home Affairs staff participating in a Hackathon indicated that a weakness of Copilot was the introduction of invisible bias (Department of Home Affairs 2024:10). Further, a pre-trial survey conducted by CSIRO found that some CSIRO trial participants had ethical concerns about potential biases in the LLM model (CSIRO 2024:8). 

Generative AI may change the skills composition of workers.

As users turn to generative AI to complete more of their work – in particular summarising and rewriting content – there are concerns that staff may ‘forget’ how to perform these tasks without Copilot.

[Copilot could] cause myself and colleagues to lack deep knowledge of topics

Trial participant in focus group

Across focus group participants, there were mixed views regarding the effect of Copilot on skills development. Two focus group participants were concerned that Copilot may result in APS staff not needing to develop and maintain their subject matter expertise, as they can instead rely on Copilot. They also commented that a reliance on Copilot for writing could diminish participants’ writing skills and ability to construct logical arguments.

However, two focus group participants remarked that reductions in competencies in some areas are acceptable, as generative AI will fundamentally change the skills that are needed in the APS. For example, one focus group participant commented that taking notes will become less important, whereas critically assessing outputs will become more relevant.

The concerns around potential skill decay suggest that there should be ongoing analysis into key workplace competencies as generative AI becomes increasingly incorporated in the APS. This analysis could also identify how to best support skill development in these competencies.

There are concerns related to vendor lock-in.

Vendor lock-in refers to a situation where a customer becomes overly dependent on a particular vendor's products or services, making it difficult, costly or impractical to switch to another vendor or solution. Vendor lock-in can happen through proprietary technologies, incompatible data formats, restrictive contracts, high switching costs, complex integrations and specialised knowledge requirements.

There’s a risk of vendor lock-in as the government becomes more dependent on this tool.

To use Copilot, an organisation must have a Microsoft 365 licence that allows it to use a range of Microsoft products. Two focus group participants noted that Copilot’s easy integration with existing Microsoft licences could mean agencies are likely to remain with Copilot even if better products become available, and that this could reduce competition.

The potential for Copilot to result in vendor lock-in suggests that agencies should look to monitor alternative generative AI options that may better suit their needs. This is particularly pertinent for products that may be more effective but also more difficult to implement.

Generative AI use could increase the APS’s environmental footprint.

Most generative AI products have a high carbon footprint. This is because the data centres that train generative AI models, and the processing of each individual request, require vast amounts of energy. Some image-generation tools use as much energy to produce a single image as it takes to fully charge a smartphone (Luccioni et al. 2024).

APS Net Zero Emissions by 2030 outlines the APS’s commitment to achieve net zero in government operations by 2030. One focus group participant was concerned that Copilot will increase the APS’s carbon footprint as more licences are distributed and adoption continues.

This suggests that future uptake should also consider the carbon footprint of potential models, in line with agencies’ own emissions-reduction targets.

References

  1. Commonwealth Scientific and Industrial Research Organisation (2024) ‘Copilot for Microsoft 365: Data and Insights’, Commonwealth Scientific and Industrial Research Organisation, Canberra, ACT, 8.
  2. Department of Home Affairs (2024) ‘Copilot Hackathon’, Department of Home Affairs, Canberra, ACT, 10.
  3. Luccioni S, Jernite Y and Strubell E (2024) ‘Power Hungry Processing: Watts Driving the Cost of AI Deployment?’, FAccT ’24: Proceedings of the 2024 ACM Conference on Fairness, Accountability and Transparency, https://arxiv.org/pdf/2311.16863 (accessed 3 September 2024).
