Organisational Outcomes Reporting Case Studies
Interrelate Client Outcomes Framework – the ongoing journey of collecting outcomes data
Background to Interrelate’s Client Outcome Measures
In 2013, Interrelate began designing outcome tools to gather data to examine how clients benefited from attending our services and programs. The rationale for this work was underpinned by two key principles reflecting the organisation’s primary values: as a service provider we are accountable to our clients, and as a funded organisation we are accountable to our funders. We developed a framework to guide the creation of tools, structures, and procedures to better understand and demonstrate the effectiveness (or otherwise) of our programs and services, and to support ongoing practice improvement.
Around this time, the Department of Social Services developed the Data Exchange framework to collect, in a consistent, standardised way, key information from providers about changes in a client’s circumstances through the provision of the service. The domains of interest to the Department matched and extended those we had been planning to measure. To make the transmission and translation of the data, and its uptake by practitioners, as smooth as possible, we aligned our measures to the constructs of interest to the Department: knowledge, skills, behaviours, connectedness, and confidence. The tools are collectively known as Client Outcomes Measures (COMs) and are an integral component of Interrelate’s Clinical Governance Framework.
The first version of the framework comprised both client-rated and practitioner-rated measures. Through extended discussions, the constructs and response scales outlined by the Data Exchange were re-worked to achieve a close fit between the intention of each construct and wording that clients across all Interrelate service types could readily interpret. The development process involved the following steps:
- Drew on administrative data to determine the timing of data collection intervals.
- Worked with a diverse service practice group with broad representation of service types to design the tools and processes.
- Undertook Pilot 1, supported by ongoing contact with managers and monitoring of use of the COMs. The most significant issue emerging from the pilot was the challenge of integrating collection of the COMs into established service delivery processes. This stemmed from individuals not fully understanding how to apply the framework, some concern about interrupting the therapeutic relationship between practitioner and client, and a view that some questions did not resonate with clients.
- COMs tools and processes were slightly altered.
- Undertook Pilot 2 in a different region, building on the lessons from the first pilot. This pilot was more successful in both the volume of data recorded and the feedback provided by practitioners regarding the usability of the materials and the clients’ response to the process.
- COMs tools and processes were again slightly altered to develop the final version.
- Rolled out the final version of the COMs to all regions through site visits and training workshops for practitioners, so that they understood both the why and the how.
- Developed a range of support resources for practitioners, including information sheets, FAQs, webinars and role plays demonstrating ways in which the COMs could be used in clinical practice in different service settings.
- Clinical Supervisors were actively engaged to support the process.
Current COM Review
A year after its implementation and following the first reports extracted from the Data Exchange, a review of the COMs was undertaken.
The key issue emerging from the Data Exchange reports related to the collection of both pre- and post-measures, which varied across regional offices. During discussions with practitioners, a number of procedural issues were identified, reflecting the kinds of questions expected when any new process is implemented. In addition, some helpful improvements and changes for the next version were suggested:
- client circumstances will now be rated by the client rather than the practitioner
- some goal questions will be re-worded to increase clarity, with examples provided to aid client understanding
- the client satisfaction survey question will be included in the final post-service COM
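The pre/post comparison at the heart of the review can be illustrated by pairing each client’s earliest (pre) and latest (post) ratings per outcome domain and computing the change. This is a minimal sketch only; the field names and the rating scale are illustrative assumptions, not Interrelate’s actual data schema.

```python
# Sketch: pair each client's earliest (pre) and latest (post) COM
# ratings per outcome domain and compute the change between them.
# Field names and ratings are illustrative assumptions.

from collections import defaultdict

def pre_post_change(records):
    """records: dicts with client_id, domain, session, rating.
    Returns {(client_id, domain): post_rating - pre_rating} for
    every client/domain pair that has both a pre and a post measure."""
    by_key = defaultdict(list)
    for r in records:
        by_key[(r["client_id"], r["domain"])].append(r)
    changes = {}
    for key, recs in by_key.items():
        recs.sort(key=lambda r: r["session"])
        if len(recs) >= 2:  # need both a pre and a post measure
            changes[key] = recs[-1]["rating"] - recs[0]["rating"]
    return changes

records = [
    {"client_id": 1, "domain": "confidence", "session": 1, "rating": 2},
    {"client_id": 1, "domain": "confidence", "session": 4, "rating": 4},
    {"client_id": 2, "domain": "skills", "session": 1, "rating": 3},  # no post measure
]
print(pre_post_change(records))  # {(1, 'confidence'): 2}
```

A pairing like this also makes the review’s key issue visible: clients with only one measure (such as client 2 above) drop out of the pre/post comparison entirely, which is why varying collection rates across offices matter.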
Dissemination of the outcomes of the COMs review is in progress, with the new forms and processes soon to be rolled out through regional trainings.
… and the journey of improvement in collecting valuable evaluation data continues …
Relationships Australia NSW’s Data Journey
Recently the Relationships Australia NSW (RA NSW) leadership team committed to new resources and associated education and training to build data confidence and accountability among staff. A key impetus of this project has been the organisation’s recognition of the importance of creating and maintaining a robust data collection system that is meaningful for both internal and external reporting purposes; that supports its continuous improvement systems and future planning; and ultimately results in improved outcomes for clients.
Staff confidence and competence in data collection and reporting remain significant challenges for organisations, and RA NSW recognises that while its practitioners are passionate about making a difference to people’s lives, they are not equally passionate about data. In response, the leadership team has implemented a comprehensive change management approach to raise data awareness among staff and equip them with the skills to record the work they deliver in a consistent and timely manner.
A number of complementary strategies have been implemented to support the change management agenda. These include new, data-focused reports for each practitioner, reflective practice on data quality, and tailored supports.
Practitioners are resourced to enter their data straight after a client session, with at least half an hour scheduled between clients for this purpose. A key data message for staff is: if the work is not recorded, or not recorded correctly, in our system, it is as if it never happened. The data entered by practitioners into the system is used to generate engaging and relevant reports. The new ‘My Story’ report is a representation of the data relating to each practitioner’s activities. These individualised reports are emailed to each practitioner fortnightly, about a week after entry of that fortnight’s data has been completed. The report format is structured by Data Exchange SCORE variables. RA NSW has found that the My Story reports have helped practitioners make better sense of how the data relates to their actual work.
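A fortnightly per-practitioner summary structured by SCORE variables might be aggregated along these lines. This is a sketch under stated assumptions only: the record structure, field names and domain labels are hypothetical, not RA NSW’s actual data model or the Data Exchange schema.

```python
# Sketch: count one practitioner's recorded sessions per SCORE-style
# domain for a reporting fortnight. Record fields and domain names
# are illustrative assumptions, not the actual Data Exchange schema.

from collections import Counter

SCORE_DOMAINS = {"circumstances", "goals", "satisfaction"}  # assumed labels

def my_story_summary(sessions, practitioner_id):
    """Return {domain: session_count} for one practitioner."""
    counts = Counter(
        s["score_domain"]
        for s in sessions
        if s["practitioner_id"] == practitioner_id
        and s["score_domain"] in SCORE_DOMAINS
    )
    return dict(counts)

sessions = [
    {"practitioner_id": "p1", "score_domain": "circumstances"},
    {"practitioner_id": "p1", "score_domain": "circumstances"},
    {"practitioner_id": "p1", "score_domain": "goals"},
    {"practitioner_id": "p2", "score_domain": "goals"},
]
print(my_story_summary(sessions, "p1"))  # {'circumstances': 2, 'goals': 1}
```

Grouping by the practitioner’s own identifier is what makes such a report individual: each person sees only the data that reflects their recorded work, which supports the “if it is not recorded, it never happened” message.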
The organisation supports reflective data collection and reporting practices, and the fortnightly production of the ‘My Story’ reports helps the leadership team identify individual training needs. The reliability, consistency and quality of the data entered by practitioners are audited during fortnightly supervision meetings. A new data coach role has been resourced to provide targeted and tailored support where required.
The Data Exchange outcomes framework has been helpful in supporting RA NSW’s change management agenda because it provided an understanding of what data was most important to collect. Opting into the partnership approach resulted in the establishment of rigorous processes across the organisation for collecting that data, which is reported internally to the organisation’s Board of Directors and externally to the Department.
RA NSW has developed robust data collection and reporting systems that support the organisation in its business planning. By drilling down into the data, RA NSW identified three key client help-seeking categories, ranging from short-term crisis responses to longer-term self-improvement goals. Through understanding the underlying needs and mindsets of clients, the organisation is better able to meet their needs and respond with appropriate, tailored language, content and services.