• Establish internal processes to support performance data analysis and reporting

    Collect and report meaningful data: Make sure the performance monitoring frameworks and data analytics tools are fit for purpose and provide meaningful reporting data. While there are numerous metrics, calculations and methods to collect data, your choice should prioritise ‘real-time’ user-centric approaches and align with the criteria in the Digital Performance Standard. The data gathered should reflect the true user experience to gain valuable insights. Agencies are required to report ongoing performance data for digital services delivered via IOF-tracked ICT investments once the service is implemented. 
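    As an illustration only, the sketch below shows one way an agency might derive a user-centric metric, such as task completion rate and median time to complete, from raw service event data. The event fields and the definition of a completed task are hypothetical assumptions for this example, not requirements of the Digital Performance Standard.

```python
# Illustrative sketch only: deriving user-centric metrics from
# hypothetical service event data. Field names and the completion
# definition are assumptions, not a prescribed format.
from collections import defaultdict
from statistics import median

events = [
    # (session_id, event, seconds since session start) - sample data
    ("s1", "start", 0), ("s1", "complete", 180),
    ("s2", "start", 0),
    ("s3", "start", 0), ("s3", "complete", 240),
]

sessions = defaultdict(dict)
for session_id, event, seconds in events:
    sessions[session_id][event] = seconds

started = [s for s in sessions.values() if "start" in s]
completed = [s for s in started if "complete" in s]

completion_rate = len(completed) / len(started)
median_time = median(s["complete"] - s["start"] for s in completed)

print(f"Task completion rate: {completion_rate:.0%}")         # 67%
print(f"Median time to complete: {median_time:.0f} seconds")  # 210 seconds
```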

  • Report on progress during Investment Oversight Framework states and post-implementation performance

    Use data to identify the benefits: Use the data collected to identify service benefits, for example by uncovering service inefficiencies through analysis of digital service performance data, gaining deeper insights into users’ experience, segmenting data by user group to better understand their needs, and working in partnership with users to develop solutions. Qualitative metrics, complementing the quantitative data, can add a rich layer of information about the underlying factors influencing the user experience.

  • Analyse your performance results and act on any improvements to the digital services

    Use data-driven insights to continuously improve: Look for ways to continuously improve the digital service and the quality of the data. Use automated reporting tools where possible to streamline processes and reduce manual effort, allowing agencies to dedicate more resources to analysing the data.
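    As a minimal, hypothetical sketch of the kind of automation described above, the snippet below aggregates daily metric records into a monthly summary. The input layout (date, metric and value columns), metric names and file names are assumptions for illustration, not a prescribed reporting format.

```python
# Minimal illustrative sketch: automating a recurring performance summary.
# The input layout (date, metric, value) and file names are assumptions.
import csv
from collections import defaultdict
from statistics import mean

def monthly_summary(in_path: str, out_path: str) -> None:
    """Average each metric per calendar month and write a summary CSV."""
    buckets = defaultdict(list)  # (month, metric) -> list of values
    with open(in_path, newline="") as f:
        for row in csv.DictReader(f):  # expects columns: date, metric, value
            month = row["date"][:7]    # 'YYYY-MM'
            buckets[(month, row["metric"])].append(float(row["value"]))

    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["month", "metric", "average"])
        for (month, metric), values in sorted(buckets.items()):
            writer.writerow([month, metric, round(mean(values), 2)])

# Hypothetical usage:
# monthly_summary("daily_metrics.csv", "monthly_report.csv")
```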

  • Establish processes to support performance data analysis and reporting

    Establish internal processes to support performance data analysis and reporting: 

    • Set up a dedicated team responsible for collecting, analysing and interpreting your digital service data. The team should be equipped with tools and software to make sure data reporting is accurate and comprehensive. 
    • Select monitoring tools with built-in analysis and reporting features that integrate easily with existing software applications. This will make analysis and reporting easier.
    • Make sure regular analysis and reporting are part of your monitoring framework. See also: Document how you will implement the monitoring framework and Develop processes for continuous improvement
  • Report progress during Investment Oversight Framework states and post-implementation performance data

    Reporting through existing reporting mechanisms will make sure your proposal or project complies with the Performance Standard. This occurs as part of the DTA’s Digital and ICT Investment Oversight Framework process. Depending on your ICT investment, your agency will be requested to report information in the following states:

    • Strategic Planning and Prioritisation: within your proposal or business case, report how you intend to implement a monitoring framework for your digital service.
    • Contestability: report that the Performance Standard has been, or will be, applied to the digital service, for example as part of the Digital Capability Assessment Process.
    • Assurance: report how you have applied the Performance Standard to the digital service and your progress against delivery milestones.
    • Operations: using the DTA’s existing data collection mechanisms, such as the Approved Programs Collection, report on how your digital service continues to meet customer needs post-implementation.
  • Analyse performance results and make improvements to the digital service
  • Develop a business case for change

    Be outcomes-focused: Consider what problems the service needs to solve and why they are important. Share early-stage assumptions, gather diverse perspectives from stakeholders and take advantage of pre-existing data and resources. Clearly state the risks of action and inaction, who might be impacted, potential barriers to success and any knowledge gaps.

    Frame the problem: Form a simple, clear problem statement from the evidence that’s already available. Use it as the basis of further research and validation, and to identify the users agencies need to engage with.

    Don’t jump to solutions: Don’t anticipate a technical or design solution before validating the problems identified. Evaluate the rest of the Digital Service Standard criteria to understand what else could drive the problem. Consider if a new solution is required or if an existing platform or service might achieve the best outcome.

    Align stakeholders to a vision: Engage key stakeholders to establish a shared vision for success. Set clear expectations for the project and make sure everyone knows why change is necessary.

  • Survey the policy and service landscape

    See the bigger picture: Assess how the problems identified play out in the broader policy and government service ecosystems. Use resources such as the Australian Government Architecture and Delivering Great Policy Toolkit to understand the landscape and the intentions of different policies.

    Align to government priorities: Have a clear understanding of how the service will contribute to government priorities, including the achievement of the Data and Digital Government Strategy 2030 vision.

  • Understand the service’s lifecycle

    Invest for the future: Consider whole-of-life investment costs, including maintenance and upgrades, to ensure proper investment across short, medium and long-term horizons. Make sure the team are familiar with the Investment Oversight Framework and its thresholds. Get in touch with the Digital Transformation Agency for questions about the ICT Investment Approval Process and work with the relevant area of the Department of Finance to understand ongoing costs.

  • Adopt an agile methodology

    Use a multidisciplinary team: Consider tools and techniques based on agile values and principles. Engage a multidisciplinary team to understand the whole problem and create an effective solution. Monitor time and effort expended to understand and refine whole-of-life investment costs from the outset.

  • Guidance to have a clear intent

  • Understand the problems your service will solve

    Understand your current digital landscape and the problems or gaps your service will solve: 

    • Conduct needs assessments and collect qualitative and quantitative data through surveys, interviews and focus groups to understand user experiences and challenges.
    • Clarify the problem that needs to be addressed by creating user journey maps to visualise the user experience. Include pain points, problems that slow or halt progress and areas of confusion. 
    • Gather insights into challenges by engaging with a diverse range of stakeholders, including users, staff and community organisations.
    • Periodically review and analyse trends in user behaviour and service performance to identify recurring problems and new challenges. 
  • Understand government initiatives that address the problem

    Research and identify existing programs or initiatives that address the problem. Leverage these initiatives to enhance service delivery or fill the gaps. For example:

    • Review government policies and existing initiatives and map the problem to relevant strategic objectives and priorities.
    • Collaborate with other government agencies and departments facing similar issues to facilitate resource sharing, knowledge exchange and coordinated effort.
    • Engage with stakeholders involved in related government initiatives to gain insights into government approaches and successes. 
    • Periodically review and update strategic objectives, and ensure plans remain flexible in response to new information, the changing landscape and user needs.
  • Document your findings

    Document your findings and recommendations to apply criterion 1:

    • Develop clear problem statements that articulate the issues identified. This will communicate the problem to stakeholders and set a focused direction for efforts.
    • Make sure the data is collected and documented in a centralised knowledge repository. 
  • The supporting standards

  • Understand the users of the service

    Listen carefully for implicit and explicit needs: During user research, discuss users’ daily lives and observe their real-world actions to contextualise their needs. Use a discussion guide to capture all aspects of their experience. While some needs or pain points will be stated explicitly, pay attention to small or seemingly superfluous details to recognise the implicit ones. Use at least 2 methods of user research, for example open-ended interviews and observing users completing relevant tasks, to make sure what they say matches what they do.

    Begin with pain points: Identify the most common pain points the service should address. Prioritise them by impact, which is not necessarily the number of users affected. Adopt continuous improvement to address pain points that emerge after launch or upgrades.

    Observe usage patterns: Use various data sources to identify how often different users use the service. Stress test any solutions for pain points along task journeys and assess the service’s capacity to handle load during peak periods.
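    As a simplified, hypothetical illustration of a peak-period check, the sketch below sends a batch of concurrent requests to a placeholder endpoint and reports response times. The URL and concurrency level are assumptions, and a dedicated load-testing tool would normally be used for a real assessment.

```python
# Simplified illustration of checking behaviour under concurrent load.
# The endpoint and concurrency level are placeholders.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TEST_URL = "https://example.gov.au/service/health"  # placeholder endpoint
CONCURRENT_USERS = 50                                # assumed peak concurrency

def timed_request(url: str) -> float:
    """Return the response time for a single request, in seconds."""
    start = time.perf_counter()
    with urlopen(url, timeout=30) as response:
        response.read()
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
    durations = sorted(pool.map(timed_request, [TEST_URL] * CONCURRENT_USERS))

print(f"Median response time: {durations[len(durations) // 2]:.2f}s")
print(f"Slowest response time: {durations[-1]:.2f}s")
```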

    Map experiences: Use visual aids to make sure the breadth of user interactions is captured and that the team works from a shared understanding. Build, test and refine journey maps and job stories. This will help to understand the end-to-end user journeys and behind-the-scenes processes. It will also reduce unintentional duplication and help agencies communicate findings.

  • Conduct user research

    Test any assumptions: Validate assumptions made in Criterion 1 (‘Have a clear intent’). Qualitative user research conducted directly with people who may be impacted by the service will either confirm that agencies are on the right track, or show that they are solving the wrong problem and need to adapt their approach.

    Gather different perspectives: Undertake ethical and inclusive user research to capture a breadth of needs and capabilities. Zoom out and consider how the digital service interacts with the agency’s wider methods of service delivery. It is helpful to zoom in and out of the problem space to observe the different perspectives and impacts of the service being designed, and to explore how the problem may manifest at macro and micro levels.

  • Test and validate designs

    Embed co-design: Where appropriate, use co-design to involve users and stakeholders and demonstrate transparent, equitable decision making. Avoid tokenism by meeting people’s physical, cultural and psychological safety needs in consultations. Maintain ongoing user engagement to keep the service fit for purpose and address changing needs over the course of people’s lives.

    Engage designers: Make sure the team has the expertise to capture and interpret useful information from users’ personal experiences. Use service designers and user experience (UX) designers to conduct user research, map experiences and design the service to meet and surpass the needs of all users.

