Interrelate Client Outcomes Framework – the ongoing journey of collecting outcomes data

Background to Interrelate’s Client Outcome Measures

In 2013, Interrelate began designing outcome tools to gather data on how clients benefited from attending our services and programs. The rationale for this work was underpinned by two key principles reflecting the organisation’s primary values: as a service provider we are accountable to our clients, and as a funded organisation we are accountable to our funders. We developed a framework to guide the creation of tools, structures, and procedures to better understand and demonstrate the effectiveness (or otherwise) of our programs and services, and to support ongoing practice improvement.

Around this time, the Department of Social Services developed the Data Exchange framework to collect key information from providers, in a consistent and standardised way, about changes in clients’ circumstances arising from service provision. The domains of interest to the Department matched and extended those we had been planning to measure. To make the transmission and translation of the data, and its uptake by practitioners, as smooth as possible, we aligned our measures to the constructs of interest to the Department: knowledge, skills, behaviours, connectedness, and confidence. The tools are collectively known as Client Outcomes Measures (COMs) and are an integral component of Interrelate’s Clinical Governance Framework.

Early Development

The first version of the framework comprised both client-rated and practitioner-rated measures. Through extended discussions, the constructs and response scales outlined by the Data Exchange were re-worked to achieve a close fit between the intention of each construct and wording that clients across all Interrelate service types could readily interpret.

Key Actions:

  • Drew on administrative data to determine the timing of data collection intervals.
  • Worked with a diverse service practice group with broad representation of service types to design the tools and processes.
  • Undertook Pilot 1, supported by ongoing contact with managers and monitoring of use of the COMs. The most significant issue emerging from the pilot was the challenge of integrating collection of COMs into established service delivery processes. This stemmed from individuals not fully understanding how to apply the framework, some concern about interrupting the therapeutic relationship between practitioner and client, and a view that some questions did not resonate with clients.
  • COMs tools and processes were slightly altered.
  • Undertook Pilot 2 in a different region, building on the lessons from the first pilot. This pilot was more successful in the volume of data recorded, the feedback provided by practitioners regarding the usability of the materials, and the clients’ response to the process.
  • COMs tools and processes were again slightly altered to develop the final version.
  • Final version of COMs was rolled out to all regions through site visits and training workshops for practitioners so that they understood the why and the how.
  • Developed a range of support resources for practitioners, including information sheets, FAQs, webinars, and role plays demonstrating ways in which the COMs could be used in clinical practice across different service settings.
  • Clinical Supervisors were actively engaged to support the process.

Current COM Review

A year after its implementation and following the first reports extracted from the Data Exchange, a review of the COMs was undertaken.

The key issue emerging from the Data Exchange reports related to the collection of both pre- and post-measures, which varied across regional offices. During discussions with practitioners, a number of procedural issues were identified, reflecting the questions that typically arise when any new process is implemented. In addition, some helpful improvements and changes for the next version were suggested:

  • client circumstances will now be rated by the client rather than the practitioner
  • some goal questions will be re-worded for clarity, with examples provided to aid client understanding
  • the client satisfaction survey question will be included in the final post COM

Dissemination of the outcomes of the COMs review is in progress, with the new forms and processes soon to be rolled out through regional trainings.

… and the journey of improvement in collecting valuable evaluation data continues …