Appendix

Methodological limitations

Evaluation fatigue may have reduced participation in engagement activities.

Several agencies conducted their own internal evaluations over the course of the trial and did not participate in the Digital Transformation Agency’s overall evaluation.

Mitigations: where possible, the evaluation has drawn on agency-specific evaluations to complement its findings.

The non-randomised sample of trial participants may not reflect the views of the entire APS.

Participants self-nominated to be involved in the trial, contributing to a degree of selection bias. The representation of APS job families and classifications in the trial differs from the proportions in the overall APS.

Mitigations: the over- and underrepresentation of certain groups has been noted. Statistical significance and standard error were calculated, where applicable, to ensure the robustness of results.

There was an inconsistent rollout of Copilot across agencies.

Agencies began the trial at different times, meaning they did not have an equal opportunity to build capability or identify use cases. Agencies also used different versions of Copilot due to frequent product releases.

Mitigations: the evaluation distinguishes between functionality limitations of Copilot and features that were disabled by individual agencies.

Measuring the impact of Copilot relied on trial participants’ self-assessment of productivity benefits.

Trial participants were asked to estimate the scale of Copilot’s benefits, and these estimates may naturally under- or overestimate its impact.

Mitigations: where possible, the evaluation has compared productivity findings against other evaluations and external research to verify their validity.

Statistical significance of outcomes

The trial of Copilot for Microsoft 365 involved the distribution of 5,765 Copilot licenses across 56 participating agencies. Through engagement activities, including consultations and surveys, the evaluation gathered the experiences and sentiment of over 2,000 trial participants representing more than 45 agencies. Insights were further strengthened by the findings of internal evaluations completed by certain agencies. The sample size was sufficient to ensure that 95% confidence intervals of reported proportions (at the overall level) were within a margin of error of 5%.
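As an illustration of the margin-of-error claim above, the following sketch (not drawn from the evaluation’s own workings) applies the standard normal approximation for a proportion; the function name and the worst-case proportion of 0.5 are assumptions for demonstration only.

```python
# Minimal sketch, assuming the normal approximation for a proportion:
# margin of error = z * sqrt(p * (1 - p) / n), with z = 1.96 for 95% confidence.
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of the approximate 95% confidence interval for a proportion
    p observed in a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) with roughly 2,000 respondents gives about +/- 2.2%,
# comfortably within the 5% margin of error quoted above.
print(f"{margin_of_error(0.5, 2000):.3f}")  # ~0.022
```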

Three questions in the post-use survey were originally included in either the pre-use or pulse survey. These questions were repeated to compare trial participants’ responses before and after the trial and to measure the change in sentiment. A t-test was used to determine whether changes were statistically significant at a 5% level of significance.

A t-test is a statistical method for testing whether the difference between 2 groups, such as ‘before’ and ‘after’ samples, is statistically significant.
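As a rough illustration of the comparison described above, the sketch below runs a two-sample t-test on made-up coded responses; the assumption that the pre-use and post-use samples are treated as independent groups, and the example data, are illustrative only and not taken from the evaluation.

```python
# Minimal sketch, assuming survey items are coded numerically (e.g. a 1-5 scale)
# and that the 'before' and 'after' samples are treated as independent groups.
from scipy import stats

pre_use = [3, 4, 2, 3, 4, 3, 5, 2, 3, 4]    # illustrative pre-use responses
post_use = [4, 4, 3, 5, 4, 4, 5, 3, 4, 5]   # illustrative post-use responses

t_stat, p_value = stats.ttest_ind(pre_use, post_use)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
print("Statistically significant at the 5% level:", p_value < 0.05)
```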

The survey aligned with the APS Job Family Framework, and APS job families and classifications were aggregated in survey analysis to reduce standard error and ensure statistical robustness. Post-use survey responses from the Trades and Labour and Monitoring and Audit job families were excluded from family-level reporting as their sample sizes were less than 10, but their responses were still included in aggregate findings.

For APS classifications, APS 3-6 have been aggregated.
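The sketch below illustrates the aggregation and small-sample suppression rules described in this section; the mapping covers only a subset of Table A, and the response counts and variable names are assumptions for demonstration rather than figures from the evaluation.

```python
# Minimal sketch of the rules described above: job families are rolled up into
# the Table A groups, and any family with fewer than 10 post-use responses is
# suppressed from family-level reporting while still counting towards its
# group's aggregated total. All counts below are illustrative, not real data.
from collections import Counter

FAMILY_TO_GROUP = {  # subset of the Table A mapping
    "Policy": "Policy and Program Management",
    "Portfolio, Program and Project Management": "Policy and Program Management",
    "Service Delivery": "Policy and Program Management",
    "Data and Research": "Technical",
    "Intelligence": "Technical",
}

post_use_counts = Counter({  # illustrative response counts only
    "Policy": 140,
    "Portfolio, Program and Project Management": 75,
    "Service Delivery": 40,
    "Data and Research": 80,
    "Intelligence": 8,
})

reportable_families = {f: n for f, n in post_use_counts.items() if n >= 10}
group_totals = Counter()
for family, n in post_use_counts.items():
    group_totals[FAMILY_TO_GROUP[family]] += n

print(reportable_families)   # families with n < 10 are dropped from reporting
print(dict(group_totals))    # but still contribute to their group's total
```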

Survey participation by APS classification and job family

Table A: Aggregation of APS job families for survey analysis

Group | Job families
Corporate | Accounting and Finance; Administration; Communications and Marketing; Human Resources; Information and Knowledge Management; Legal and Parliamentary
ICT and Digital Solutions | ICT and Digital Solutions
Policy and Program Management | Policy; Portfolio, Program and Project Management; Service Delivery
Technical | Compliance and Regulation; Data and Research; Engineering and Technical; Intelligence; Science and Health

Table B: Participation in surveys according to APS level classification

APS classification | Percentage of all APS employees | Percentage of pre-use survey respondents | Percentage of post-use survey respondents
SES | 1.9 | 4.7 | 5.3
EL 2 | 9.0 | 20.0 | 20.2
EL 1 | 20.8 | 36.9 | 34.0
APS 6 | 23.4 | 23.4 | 22.3
APS 5 | 14.7 | 8.5 | 9.6
APS 3-4 | 26.0 | 6.0 | 7.4
APS 1-2 | 4.2 | 0.5 | 1.1

Table C: Participation in surveys according to job family

Job family | Percentage of all APS employees | Percentage of pre-use survey respondents | Percentage of post-use survey respondents
Accounting and Finance | 5.1 | 5.3 | 3.5
Administration | 11.4 | 9.0 | 8.9
Communications and Marketing | 2.5 | 4.9 | 5.8
Compliance and Regulation | 10.3 | 6.6 | 6.5
Data and Research | 3.7 | 9.9 | 8.3
Engineering and Technical | 1.8 | 1.3 | 1.5
Human Resources | 3.9 | 5.3 | 5.0
ICT and Digital Solutions | 5.0 | 19.6 | 22.3
Information and Knowledge Management | 1.1 | 2.5 | 1.6
Intelligence | 2.4 | 0.9 | 2.1
Legal and Parliamentary | 2.6 | 4.1 | 3.5
Monitoring and Audit | 1.5 | 1.1 | 1.0
Policy | 7.9 | 13.7 | 14.4
Portfolio, Program and Project Management | 8.3 | 8.6 | 7.5
Science and Health | 4.2 | 1.6 | 2.1
Senior Executive | 2.1 | 2.3 | 1.5
Service Delivery | 25.5 | 2.7 | 4.0
Trades and Labour | 0.7 | 0.9 | -

Participating agencies

Table D: List of participating agencies by portfolio

Portfolio | Entity
Agriculture, Fisheries and Forestry | Department of Agriculture, Fisheries and Forestry; Grains Research and Development Corporation; Regional Investment Corporation; Rural Industries Research and Development Corporation (trading as AgriFutures Australia)
Attorney-General’s | Australian Criminal Intelligence Commission; Australian Federal Police; Australian Financial Security Authority; Office of the Commonwealth Ombudsman
Climate Change, Energy, the Environment and Water | Australian Institute of Marine Science; Australian Renewable Energy Agency; Department of Climate Change, Energy, the Environment and Water; Bureau of Meteorology
Education | Australian Research Council; Department of Education; Tertiary Education Quality and Standards Agency
Employment and Workplace Relations | Comcare; Department of Employment and Workplace Relations; Fair Work Commission
Finance | Commonwealth Superannuation Corporation; Department of Finance; Digital Transformation Agency
Foreign Affairs and Trade | Australian Centre for International Agricultural Research; Australian Trade and Investment Commission; Department of Foreign Affairs and Trade; Tourism Australia
Health and Aged Care | Australian Digital Health Agency; Australian Institute of Health and Welfare; Department of Health and Aged Care
Home Affairs | Department of Home Affairs (Immigration and Border Protection)
Industry, Science and Resources | Australian Building Codes Board; Australian Nuclear Science and Technology Organisation; Commonwealth Scientific and Industrial Research Organisation; Department of Industry, Science and Resources; Geoscience Australia; IP Australia
Infrastructure, Transport, Regional Development, Communications and the Arts | Australian Transport Safety Bureau
Parliamentary Departments (not a portfolio) | Department of Parliamentary Services
Social Services | Australian Institute of Family Studies; National Disability Insurance Agency
Treasury | Australian Prudential Regulation Authority; Australian Securities and Investments Commission; Australian Charities and Not-for-profits Commission; Australian Taxation Office; Department of the Treasury; Productivity Commission
