
Incentive Programs quality checking in QST 111-22140000



This document outlines the quality checking process for Practice Incentives Program (PIP), Workforce Incentive Program (WIP) and the Indigenous Health Incentive (IHI).

Objectives

The Incentives quality checking process aligns with Services Australia's Enterprise Quality Framework and the 7 quality principles. The quality checking process:

  • helps Team Leaders and Quality Checkers to identify Incentives process improvements, and
  • ensures consistency in quality checking

Roles and Responsibilities

Incentives Quality Portfolio Holders are responsible for:

  • national implementation of Incentives quality practices
  • analysis of national quality reports to identify learning needs or process improvement opportunities
  • escalation of national quality issues to the program area and/or to Quality Operations & Technical Support Team

Managers/Team Leaders must make sure:

  • Quality Checkers complete the process for Quality Checker accreditation
  • Quality Checkers have the right skill tags in the Quality and Support Tool. Email the Quality Operations & Technical Support Team for more information
  • error feedback is discussed with Service Officers as part of the coaching conversation. This supports any learning and development needs
  • they analyse any errors, for example, whether the errors occurred because of limited training, limited understanding of the business rules, or environmental or system-based issues
  • positive quality checking results are discussed with the Service Officer
  • they analyse data to identify learning needs or process improvement opportunities
  • quality checks are done as specified in the sampling plan
  • they notify senior management of a serious error which places the agency’s reputation at immediate risk
  • they escalate systemic issues
  • they lodge quality checking outcome disputes
  • they escalate serious quality issues, such as reporting all privacy breaches using the Privacy Incident Notification form

Quality Checkers must:

  • carry out quality checks as outlined in the sampling plan
  • update and maintain technical knowledge, procedures and policies
  • maintain a correctness standard for personal accuracy
  • complete root cause analysis and identify systemic issues
  • report and escalate trends or concerns to the Team Leader
  • give constructive and non-judgemental feedback. Include reference materials such as the Operational Blueprint
  • explain the impact of errors and action to be taken to correct the error (where applicable) when giving feedback
  • consult subject matter experts for procedural or policy related queries
  • report results of quality checks in the QST
  • make sure they do not check their own work. If a Quality Checker’s work is selected, it must be passed to another accredited Quality Checker to check

Quality Team must:

  • provide support for quality checking in the QST
  • provide relevant reports as needed
  • arrange QST training for Quality Checkers
  • support the business with flexible and fit for purpose quality approaches

Quality checks for the month must be actioned within the first 3 calendar days of the following month. After this time, any remaining checks are abandoned.

Accreditation of Quality Checkers

To be accredited, Quality Checkers must:

  • have knowledge of relevant policies and procedures
  • maintain a high correctness standard in the program they are checking
  • understand the quality checking procedure for Incentive Programs
  • meet accreditation standards annually
  • renew Quality Checker accreditation annually

Team Leaders contact the Quality Operations & Technical Support Team to request a skill tag be added for a staff member.

Types of quality checks

Pre-checks

Pre-checks apply:

  • to new starters
  • when new processes are introduced

New starters have 100% of their work reviewed and corrected until deemed proficient. Any errors made in pre-check must be recorded as errors. Results are used to support ongoing training.

Programme/post-checks

Programme/post-checking randomly selects work completed the previous business day in Processing and National Demand Allocation (PaNDA). The results are provided to management and external stakeholders as needed.

Targeted checks

Targeted checks apply to:

  • staff when there is a substantial new process introduced or an identified quality issue
  • the Aim for Accuracy process (Check the Checker), which is completed every 12 months. This confirms the accuracy of quality checking results. The process involves rechecking a sample of previously checked work to decide if quality checking procedures have been followed

Sampling plan

Pre-checks

New starters will have 100% of their work checked until they reach the program accuracy KPI, with a minimum of 40 activities per program checked. Please contact the OPH Service Enhancement and Deployment team for the training and consolidation plan.

Post checks

Two percent of all incentives work types will be checked. This includes:

  • Practice Incentives Program (PIP)
  • Workforce Incentive Program (WIP) - Practice Stream
  • Indigenous Health Incentive (IHI)

The sample is prioritised according to the risk of the work type (a worked example follows this list):

  • High risk work items checked at 2.5%
  • Medium risk work items checked at 2.0%
  • Low risk work items checked at 1.9%
  • PIP Practice Bank Details work type checked at 100%
  • PIP/WIP – Practice Stream (PIP consenting) Practice Bank Details work type checked at 100%
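
As a worked example only, the Python sketch below applies the sampling rates above to hypothetical daily volumes to show how many items in each risk tier would be selected for checking. The volumes, tier labels and rounding rule are assumptions for illustration and are not taken from the sampling plan.

  import math

  # Sampling rates from the plan above; tier labels are shorthand for this sketch.
  SAMPLING_RATES = {
      "high risk": 0.025,
      "medium risk": 0.020,
      "low risk": 0.019,
      "PIP Practice Bank Details": 1.0,
  }

  # Hypothetical daily volumes of completed work items (illustrative only).
  daily_volumes = {
      "high risk": 200,
      "medium risk": 400,
      "low risk": 300,
      "PIP Practice Bank Details": 15,
  }

  for work_type, volume in daily_volumes.items():
      rate = SAMPLING_RATES[work_type]
      # Round up so a tier with any completed work gets at least one check.
      sample_size = math.ceil(volume * rate)
      print(f"{work_type}: check {sample_size} of {volume} items ({rate:.1%})")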

Targeted checks

To arrange targeted checks, the Team Leader sends an email request to the Quality Operations & Technical Support Team. Include:

  • the staff member’s details
  • the reason and length of time required for the targeted check

Aim for Accuracy will be developed as a separate process.

Source documentation

Incentives transactions selected for quality checking will be obtained from work types allocated through the Processing and National Demand Allocation (PaNDA) tool. Source documents will be obtained through PaNDA, and/or the Practice Profile.

Quality checking is also performed on the following documentation (where applicable):

  • scanned applications
  • supporting documentation, for example, accreditation certificate for a practice relocation
  • faxes
  • letters

Note: any notations made in PaNDA become part of the source (supporting) documentation and are subject to quality checking.

Process for undertaking checks

Time frames

Checks are undertaken daily on a random sample of work processed during the month.

All quality checking results must be completed and submitted in the Quality and Support Tool (QST). Quality checks are completed within 1 to 2 business days to ensure timely feedback. All quality checks for the month must be actioned within the first 3 calendar days of the following month. After this time, any remaining checks are abandoned.

Resources

Quality checkers must have access to the following to undertake quality checks:

  • Program specific processing systems
  • Processing and National Demand Allocation (PaNDA)
  • Practice Profile
  • Drive where QST is located

Quality checking enquiries

Quality Checkers must consult Local Peer Support (LPS) if they have any questions about quality checking.

Enquiries about quality checking business rules or issues with submitted checking results that cannot be resolved on site are directed to the Quality Management Team by email.

Error definition

Definition of what is considered an error

For the purpose of quality checking, an error is information processed/keyed into the system which does not match the documentation submitted or the supporting documents uploaded to systems.
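
As an illustration only, the Python sketch below treats an error in the sense above as any keyed field whose value does not match the value taken from the source documentation. The function, field names and values are hypothetical and are not part of any agency system.

  def find_errors(keyed_data: dict, source_data: dict) -> list:
      """Return the names of fields where the keyed value differs from the source documentation."""
      return [field for field, source_value in source_data.items()
              if keyed_data.get(field) != source_value]

  # Example: the account number was keyed incorrectly, so it is flagged as an error.
  source = {"practice_name": "Example Practice", "account_number": "123456"}
  keyed = {"practice_name": "Example Practice", "account_number": "123465"}
  print(find_errors(keyed, source))  # ['account_number']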

Definition of critical and non-critical errors

The error codes are categorised as critical and non-critical.

An error is defined as critical when there is a potential risk to:

  • a customer’s:
    • privacy
    • eligibility
    • payments
    • application assessment being finalised
    • record
  • a process affecting payments and/or information of Services Australia and its customers
  • the reputation of Services Australia

A non-critical error is one that does not impact the above criteria. Generally, this relates to administrative errors. Non-critical errors will be recorded as feedback and any action required will be sent to the Service Officer for correction.

Only critical errors are used to calculate the following (an illustrative sketch appears after this list):

  • health status of the program
  • accuracy of processing results
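
This procedure does not set out the formula, but a minimal Python sketch of the idea is below, assuming accuracy is the proportion of checked items with no critical error; non-critical errors are recorded as feedback and do not reduce the figure. The function name and the numbers are hypothetical.

  def processing_accuracy(check_results):
      """check_results: a list of dicts, e.g. {"critical_errors": 0, "non_critical_errors": 1}."""
      if not check_results:
          return None
      # Only critical errors count against accuracy; non-critical errors are ignored here.
      correct = sum(1 for result in check_results if result["critical_errors"] == 0)
      return correct / len(check_results)

  # Example: 50 checked items, 2 with a critical error, gives 96% accuracy.
  results = ([{"critical_errors": 0, "non_critical_errors": 1}] * 48
             + [{"critical_errors": 1, "non_critical_errors": 0}] * 2)
  print(f"{processing_accuracy(results):.1%}")  # 96.0%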

Recording Results

Each time a quality check is completed, the Quality Checker records the outcome in the Quality and Support Tool (QST). A system-generated email is sent in real time to the Service Officer and their Team Leader advising them of the outcome. Individual results are captured in QST.

Feedback

After the Quality Checker performs quality checking on selected transactions, an email is sent to the Service Officer and the Team Leader with details of the check outcome.

Feedback on errors must:

  • be direct and to the point, non-judgemental and constructive
  • describe the error, and what was incorrect
  • explain the impact the error has, for example, on the recipient/claimant or the agency
  • explain the action that is to be taken to correct the error (where applicable)
  • include any relevant reference material that supports the action, such as an Operational Blueprint file
  • provide the opportunity for development and not be used for punitive purposes

Error documentation

Error documentation is electronically stored for coaching and development.

Method for managing and storing feedback

Feedback is to be recorded in the QST. Discussion notes about the feedback should be detailed in coaching sessions.

Analysing Errors

Error analysis is conducted by the:

  • Team Leader or Local Peer Support (LPS) as required for local issues
  • Assistance Programmes Branch for analysis of the programs

Correction of Errors

If errors are identified, action must be taken to correct them.

Critical errors

Quality Checkers must correct critical errors immediately as they are found.

Non-critical errors

Quality Checkers provide instructions in the feedback email. Service Officers correct non-critical errors.

Enquiries

Quality Checkers must consult their subject matter experts for questions about the Incentives program.

Enquiries about the quality checking process or issues with checking results that cannot be resolved on site should be directed to the Incentives Quality Portfolio Holder Team by email. Team Leaders can raise unresolved concerns with Incentives Policy.

Change Management

Quality checking procedures are reviewed and updated by the business and program area in consultation with the Quality Operations & Technical Support Team. The review ensures error codes are current and approved policy changes are included.

Any changes that require immediate action will be advised to staff and the procedures updated accordingly. Any changes not critical to quality will be incorporated in the next review of the procedures.

The Resources page contains the Enterprise Quality Framework intranet pages, a link to the Privacy Incident Notification Form, contact details and a list of error codes.

Related links

  • Incentive Programs
  • Quality checking dispute process for Health Service Delivery Division
  • Quality checking using the Quality and Support Tool (QST)
  • Quality Checker accreditation