Data Exchange Client Survey Webinar
Good afternoon everyone, and welcome to our webinar on the Client Survey for the Data Exchange.
My name is Marita Baier-Gorman. I am the lead trainer for the Data Exchange, working out of Canberra. I'd just like to welcome all of you and thank you for taking some time out to be with us here today. I'm taking a moment to welcome you all and make sure everyone can hear me okay, while also allowing a few extra people to log in. It's always one of those things; in the first five minutes of every webinar you can see everyone bustling in to the virtual space.
But today we’re going to be discussing the client survey and where it’s up to in terms of the pilot. So we’re quite excited about the client survey; it’s something that we’ve been piloting since January this year. We’ve done a lot of work and research to get it to where it is at this point and this is what we want to discuss and share with you today.
So just a few things to let you know about in terms of the webinar technology and what you can see here today. You’ll notice that you have a control panel that’s on the right hand side of your screen. So first off you’ll notice that muted is against all of you today so you will not be able to ask questions using your voice, however you can send in questions to us throughout the webinar today. So there is a questions box on the bottom of that control panel that you can expand open and you’ll be able to type in questions this way or comments to us. With me in the room today I have my colleagues Rose, Brendon and also Angela as well, so they’ll be here to answer your questions as we go through the webinar.
Now if at any point you find that this box disappears, and it can sometimes if you don’t use it for a little while, it will shrink to a small size in the corner of your screen and look very much like this. So, if you find that it minimises and you want to expand to send in your question, all you need to do is click on that orange arrow and what will happen is that the full screen will then expand and come out and you’ll be able to then see that.
Alright, so let's discuss the agenda today. What we're having a look at in this webinar is: what is the client survey? Why are we piloting it? How did we get to this point? How was the survey developed and tested, and what was the research and evaluation that we did there? What have we learned from the pilot so far? What feedback have we got from providers that are participating in the pilot, as well as our stakeholders? What are we proposing as a result of this feedback, so where are we moving next within the pilot and what are we anticipating, which I'm very excited to share with all of you. And also the time frames and next steps: what are we anticipating for the next phase of the pilot? Where do we see the client survey moving toward in the future? And I think one of the best things about this webinar, and your ability to type in information and questions to us, is that we can have a bit of a discussion about that. We'll also be collecting those comments and questions to get your perspective on what you think about the client survey and how it's evolving.
Now I notice that there's a question that's popped up from Linda asking about the webinar on Tuesday that was cancelled. 'Yes', we had some technical difficulties, and I do apologise to anyone who's logged in today who tried reaching us on Tuesday for the partnership approach. It has been rescheduled, and we'll be sending out information about that, so I'm happy to be taking down your name. Linda, thank you, and we'll be sure to share that with you. Otherwise, if anyone who's tuned in is interested in learning about the partnership approach via webinar, if you go on to our Data Exchange website and click on the button on the front screen that says 'Find Training' and has a little graduation cap next to it, you'll be able to find the registration link for the webinar on the partnership approach there.
Oh, excellent! Sue, thank you for your comment which I can see has come through. ‘Yes’ we are happy to do that, but please do go through onto our website to find training and register that way so that you have the details and you’re able to log in to the webinar.
Okay, so what is the client survey? The client survey is intended to be an independent mechanism that we want to use to capture the client's point of view, but also the value that they place on the services they have received. So it's supposed to be an independent source of capturing that client voice on how clients feel they went with their service: What did they get out of that service? What did they achieve? What sort of outcomes occurred because they had access to a particular service?
So why have a client survey? The main reason for the client survey is that we wanted to increase the number of clients with outcomes that are captured through Data Exchange reporting. Now for organisations that are within the partnership approach, and some of you listening in today may be from those organisations, the idea of having client survey outcomes data would be that you could compare and contrast the outcomes collected through the survey with your own partnership approach information, to get a sense of what's happening there and what the full picture is. It also allows organisations that have not opted into the partnership approach to still gather client outcomes information, which means that they'll still be able to meet key performance indicators, or KPIs for short, four and five for their grant agreements, which are about collecting those client outcomes.
It’s also an opportunity for clients to share their service experience. And this is something the department is quite interested in as well, so, we do not have direct contact with clients of course, that’s not our business, that’s your business, but getting a sense of how clients feel they’re achieving things or how things are going on the ground we think is beneficial generally for the department but more specifically important to yourselves to get that sense of how clients are moving through your services and the benefits of those services.
Now this screen that is presented in front of you shows the Data Exchange framework. This is a screenshot that we use fairly often, and if you've come along to other Data Exchange training or tuned into other webinars you're probably familiar with it. Basically it's a visual representation of the different parts of the Data Exchange, and I bring it up here because I'd like to talk about where we see the client survey fitting. On the left hand side in blue you can see the priority requirements and also the client survey listed there, and this is because we very much expect that the client survey will be part of the priority requirements as we move forward, once it's released to the network, which won't occur for a little while longer. Essentially it will be mandatory to deliver the client survey, to offer it, to talk to your clients about whether or not they'd like to complete a survey, but of course it's absolutely voluntary as to whether or not a client would actually participate and complete that survey. And we'll talk a little bit more about that as we move forward, and some of the methodology that we're creating and looking at for the client survey.
Now in terms of where the client survey fits with outcomes, it's not about replacing the need for, or the value of, outcomes data collected through the partnership approach; it's one piece of the outcomes puzzle. The partnership approach is fantastic for collecting Score data, for collecting that information from the organisation's perspective, particularly if you're using Score from your practitioners' point of view. Some organisations are using Score in a variety of different ways, but it's just a different way of collecting outcomes. The client survey is that very separate, client-centric way of collecting outcomes, where the client actually has their own separate voice to be able to share that.
Now I am noticing that we've got a few questions coming up asking about the presentation and whether or not it will be available online. 'Yes'. We're actually recording this webinar as we speak, and we will be editing that and putting it up on our website, so it will be available at a later stage. It takes about two weeks, so ten business days, for us to turn that around and pop it up on the website, but certainly we will be sharing this webinar and this presentation with the public as well. So if you're curious, or perhaps have other staff members who are curious and want to learn about the client survey, do not fret, we will have this available for you.
Now just to remind you as well, because I notice that we have a number of people who have come in a bit late, about using the webinar control panel. There is an ability to ask questions in that question box on the bottom right hand side of your control panel on your screen. Also you may notice that your control panel could minimise, especially if you’re not using it throughout the webinar today, if this happens all you need to do is click on the orange arrow, it will restore the full panel there for you. So thank you for your patience others who’ve already seen that notice, it’s always good to have a reminder there.
So how was the survey developed? The survey was actually developed over four waves of research that spanned an 18 month period, so there was a significant amount of research and evaluation that took place for the survey, and many iterations of the survey as well, before we got to the current web tool, which I will actually be demonstrating and sharing with you today. These different iterations of the survey were developed using control or focus groups, if you like, of clients from a variety of different demographic backgrounds. These clients were of various ages, from 18 all the way through to 88. They were from different regions as well, so we did a lot of different focus groups in regional and metro areas, but also out in communities in the Northern Territory, and they'd also accessed a range of social and health services, either with the Department of Social Services or the Department of Health. It was really interesting to hear from a broad variety of people in different settings, because we wanted to get a sense of how the survey would work across those settings and areas.
We're also developing this survey in consultation with a few other areas, such as the Australian Institute of Family Studies, otherwise known as AIFS for short. We worked very closely with AIFS on selecting the survey questions themselves: what survey questions are being used elsewhere in Australia in other popular surveys? What questions are being used internationally in client outcome surveys and similar surveys? We worked with them to get a sense of those question sets and how they behave. We've also been working with the Department of the Prime Minister and Cabinet's Behavioural Economics Team. Bit of a long name; we usually just call them BETA. We've been working with BETA to get a better sense of the user experience with the survey, so how the survey tool actually looks, how it functions moving through it, and so on and so forth.
The key findings from this research phase, and the feedback we got directly from these focus groups, were that clients welcome and value the opportunity to tell their story, and they strongly indicated that if they knew the information would improve the service, or make it easier for people like them to access that service at a later stage, they would be more likely to participate and want to share that feedback. They did not want their individual results shared back with the provider, and this was largely in case they had anything critical or something that could be perceived as negative feedback. They didn't want the provider to then exclude them from a service or stop them accessing that service. Now of course this would not happen in real life, but certainly there was a perception that their service might be taken away, and they really value that service, so anonymity of their results was a very strong opinion that we heard in that phase. They also let us know that they were more likely to engage with the survey if it was delivered on the premises. This was very much because when they were there, in that frame of mind, they were more likely to give accurate or specific answers. Once they'd gone home, gone to pick up the kids, back to their very busy lives, they were less likely to want to share that information, because they're out of that head space and getting on with things. So that was quite interesting for us. And it was very interesting in terms of safety issues as well: some of the clients in our focus groups expressed that they were accessing the service and their family members didn't know, or may not know, they were accessing a service. It could actually compromise their personal safety if they were to do a survey, or expose the fact that they were seeking this service, outside of the premises.
So these are all things we found out in the research and evaluation phase and things we heard and things we wanted to incorporate with the first phase of the client survey pilot.
So a few key assumptions that we had for the pilot starting out, and this was our starting point in January when we first launched the pilot with a few pilot organisations and these were our key assumptions. First of all the survey is online only, so at the moment it is a web based survey only available online. A client must complete both a pre and a post survey, so very similar to the Score methodology where there is a pre Score somewhere towards the beginning of service delivery and a post Score somewhere toward the end of service delivery, or if it’s a long service where you’re seeing someone for multiple months or years at regular points, very much focussed on the same, a pre and post survey to be able to capture where a client was at toward the beginning of their service, where did they end up, what was that measurable difference between the two.
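The pre/post methodology described above, capturing where a client was at toward the beginning of their service and where they ended up, can be sketched as a small example. This is a hypothetical illustration of the idea (the function name, data shapes and field names are made up for this sketch, not drawn from the actual Data Exchange systems):

```python
def outcome_changes(surveys):
    """Pair pre and post surveys by an anonymised linkage key and return
    the change in a 0-10 satisfaction item for each linked client.

    surveys: list of dicts with keys 'slk', 'stage' ('pre' or 'post'),
    and 'satisfaction' (hypothetical shape for illustration only).
    """
    pre = {s["slk"]: s["satisfaction"] for s in surveys if s["stage"] == "pre"}
    post = {s["slk"]: s["satisfaction"] for s in surveys if s["stage"] == "post"}
    # Only clients with both a pre and a post survey contribute a
    # measurable outcome; unmatched surveys are left out of the comparison.
    return {slk: post[slk] - pre[slk] for slk in pre.keys() & post.keys()}
```

In this sketch a client with a pre-survey satisfaction of 4 and a post-survey satisfaction of 7 would show a measurable change of +3, while a client who only completed the pre survey contributes nothing to the comparison.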
The survey comprises approximately 50 questions drawn from existing population-wide surveys that are mapped across to Score, looking at very similar areas to those within Score. Now interestingly, not every client going through this survey would actually see all fifty questions. The questions actually expand and contract depending on the answers the client has given to previous questions. So it's a pretty interesting tool in how it works, but certainly one of the things we are looking at is reducing those question sets. It was mandatory for our pilot organisations to offer the survey to all eligible clients, but of course voluntary for clients to participate. The idea for those pilot organisations was to really get a sense of clients' willingness to participate in the survey and what the take-up would be. And it was a single survey for all programs, including most Score domains, so it was not a program-specific survey; as it stands now, in the pilot, it is one general survey that looks the same and has the same questions no matter what program the client is accessing.
Another key assumption was that the survey would be completed at the location of service delivery. This was based on that research and evaluation phase, what we learned there, and what clients were telling us, but we also wanted to see how that would work on the ground, which was a big part of the pilot. We assumed the survey would take about ten minutes to complete, because this is the average time we found during the testing phase, through that research and evaluation of the different iterations of the survey. And the survey would only be available to clients who were 18 years of age and older, and only in English, with exemptions where significant barriers to participation arose, such as mental impairment, language, literacy, technology or safety. But these were the main constraints coming in to the pilot: clients had to be 18 years and older, and all the questions in the survey tool itself were in English.
Now we released a client survey discussion paper last year going over all of this: how we saw the pilot being launched, the sort of methodology we had at that time, and what we learned from our research phase. We put it all into our discussion paper and then released it publicly on our website. It was open for comment until the 31st of March this year. We did receive a variety of different comments and feedback, which was excellent, and we received 34 formal responses to the discussion paper. Seven key themes emerged from those formal responses, and I think you're going to get a sense of those key themes as we move through the material in this webinar today. There were very clear directives, I think, in terms of how providers and also clients anticipate the client survey developing.
So, piloting the client survey. The pilot for the client survey began in January of this year, and the purpose of the pilot was to trial the survey implementation with the pilot organisations. When we started off in January we had around 30 organisations that wanted to trial the implementation, so it was all about getting a sense of how it would actually work on the ground: how would they fit it into their business as usual, how would they train up staff, and so on and so forth. What kind of resources could they target to deliver it? And then of course they would report back to us, and have regular meetings with the other pilot organisations to share those stories, the successes but also the challenges. We also wanted to trial clients' willingness to participate in the surveys. While our focus groups had seemed quite positive and keen, we wanted to get a sense of whether that would stay the same once it was out in the network being delivered on the ground. Would clients be equally as positive and willing to participate?
During the first phase of the pilot we have grown to over 60 organisations now participating, which is excellent. They're from a variety of different programs, targeted to a variety of different clients from different age groups, different backgrounds and so on. Twelve training workshops were delivered at the end of last year to train up our pilot organisations and give them a sense of what they would need to share with their staff, how they might want to get set up, and the support materials available to them. We'll be doing another follow-up training workshop with our new pilot organisations as well to keep them on track. The survey was offered to clients from late January, although we found that some organisations picked up their client survey delivery after Australia Day because, of course, staff were still coming back from leave and their services weren't picking up until a little later. 296 surveys have been completed as at the 23rd of May.
So I’d actually like to demonstrate the existing client survey for you now. And this is just to give you a sense of what it looks like in its current form. Now this does not mean it’s going to stay this way, we anticipate a variety of changes. Certainly we’ve had some very interesting, varied and powerful feedback on this particular survey, but I’d just like to walk you through it so you can get a little sense here.
Now I have a question from Cindy asking, “296 completed. How many offered to clients?” An excellent question there Cindy, so as part of the pilot, our pilot organisations have actually been gathering data on how many clients they ask versus how many clients actually proceed and go through with the survey, and this is really important information for us to get a better sense of the ratio of clients as to who says ‘Yes’, who says ‘No’, but also anecdotal evidence as well to get a sense of those clients that are saying no to the survey, what are their reasons? Is it an accessibility issue? Is it a time issue? Is there something there that we need to know about? Because of course we want to make this survey as accessible and easy as possible for as many different types of people as possible to ensure that really broad, well just that breadth of voice really.
Okay so I’ve just brought up here on screen for you all the client survey tool itself. So the client survey sits very separately to the Data Exchange, it’s not even hosted near the Data Exchange, it’s on its own website which you can see here, which is myservicemystory.com.au.
Now here you can see there are a number of items. This is for our pilot organisations to fill out to prepare the survey for their client. Our pilot organisations have been given an organisation ID that is unique to them. They would enter the outlet, so where the survey is actually being completed, and the program activity that the client was accessing. The reason we front-loaded the program activity at this screen is because we felt and found, and providers let us know too, that it was unlikely that clients would really understand the strict grant name of the program they were accessing. So for example they might know it as "Giggle and Wiggle", but they wouldn't necessarily know that that came under Children and Parent Support Services for the organisation. Here the organisation would put in a password that is unique to them, just the one password so they can get in to their area, and they could select login.
Now you'll notice here as well there's a section, and I'm just wiggling my mouse around it so you can see it, with the ability to generate a QR code. By ticking this box and then selecting the login button, this would generate a QR code that the client could scan using a QR code reader on their mobile phone, and if they wanted to they could complete the survey away from the location, at a different time, at their leisure. What we're finding, though, is that the QR code does not seem to be the most popular or the most user-friendly option. QR codes are probably less popular in Australia than in some other countries at the moment, but what we're finding with the QR code is that it's more about the QR code reader; it's not something a lot of people have on their phone, and of course it depends on the person having a smartphone too.
So I'm going to select login here. And you'll notice we've come to a Welcome to My Service My Story survey page. This has a little bit of reference material that pilot organisations can use, which you can see me hovering over here with my mouse. You'll also notice, on the top right hand side, a feedback report button. This enables our pilot organisations to run a report to get a sense of how many surveys have been completed by their clients, what sort of devices they are completing the survey on, and a little more information about what sort of questions are being answered and the results that are coming back.
Here, by selecting Begin Survey, this moves into the next section, ready to pass on to the client to complete the survey. Now, how you would pass that on to your client has been very different depending on the organisation, we find. Some organisations have the ability and the resources to set up a laptop or a PC in a separate room on the premises, so clients can go in there after their service, have a bit of privacy, and complete the survey that way. Other organisations have set up a laptop in the corner of their waiting room, sort of like a little kiosk, and their clients are able to go over and enter a survey that way. Others find using a tablet works incredibly well because tablets are very flexible as a tool, so if they're lucky enough to have a few tablets they can have a couple of clients sitting in the waiting room filling out surveys at the same time, unassisted. But how that works is of course very different depending on the program and the pilot organisations we've been speaking to. Certainly there have been a lot of different options they've been sharing with us.
So this first page here, for the client, explains a little about the survey, do they have to take part? What happens to their answers? The privacy statement as well, and then the ability here to start survey. So by clicking the start survey button it launches the survey for them and that’s how they demonstrate that they are providing consent to then complete the survey. Now, if the client does not wish to complete the survey all the way through, they may get halfway through and say, “No, I’ve run out of time,” or “I’m not interested.” and they may drop out. They of course are absolutely able to do that, just because they’ve said ‘Yes’ to doing the survey does not mean that they have to follow it all the way through to the end.
So this next screen here is where the provider would pass over to their client, and the client would then be able to go through and complete the survey. There are a few details asked for here, and they are very similar to what you see within the Data Exchange. The reason we have these client identifiers at the beginning of the survey is because of the pre and post aspects of the survey. We needed a way to link pre surveys to post surveys so that we could get that measurable outcome between the two. All of this information is kept anonymous; it gets stripped out at the moment of creation, and just generates a statistical linkage key, or SLK, in the background so that the two surveys can be linked. This information is certainly part of all our training materials and things we've discussed with pilot organisations, but that is the reason for that screen.
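To make the anonymised linking idea concrete, here is a minimal sketch of how a statistical linkage key can be derived from client details. This assumes an SLK-581-style key (selected name letters plus date of birth and a sex code); the padding rules and function names here are simplified for illustration and are not the actual survey tool's implementation:

```python
from datetime import date

def _letters_at(name, positions):
    # Keep alphabetic characters only; pad missing positions with '2'
    # (a simplified padding rule for this illustration).
    letters = [c for c in name.upper() if c.isalpha()]
    return "".join(letters[p - 1] if p <= len(letters) else "2" for p in positions)

def make_slk(family_name, given_name, dob, sex_code):
    """Build a 14-character linkage key from selected surname letters
    (2nd, 3rd, 5th), given-name letters (2nd, 3rd), the date of birth
    as DDMMYYYY, and a 1-digit sex code. The raw identifiers are never
    stored; only this derived key is kept, so pre and post surveys can
    be matched without retaining the client's name."""
    return (_letters_at(family_name, (2, 3, 5))
            + _letters_at(given_name, (2, 3))
            + dob.strftime("%d%m%Y")
            + sex_code)
```

For example, a hypothetical client "Ben Citizen" born 7 March 1985 would produce the same key on both the pre and the post survey, allowing the two to be linked while the name itself is discarded.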
Now once we've entered this information and clicked next, it will bring up the survey for us. I've actually gone into the system already and done a pre survey separately, under "Ben", my fictitious person that I've made up today to use as an example in this webinar. The reason for that is there are different questions in the pre survey than the post, so I wanted to give you a sense of the post survey questions, because they include service experience questions. Very similarly to the Data Exchange, the client survey has mandatory and non-mandatory questions. Most of the questions in the survey are non-mandatory, so clients can move through and answer what they would like and what resonates with them. All of the questions we're looking at here are actually non-mandatory questions, and I'll point out the others as we move through this demo today.
Each section of the survey is put into a different theme, to give a little more context on the sort of questions being asked or about to be asked. This section here is About you, and asks, what is your Country of Birth? There's a drop down here. What is the main language spoken at home? And are you of Aboriginal or Torres Strait Islander heritage? This next question asks, do you currently have a permanent place to live? Now this is a mandatory question that requires an answer, because it's a question that expands or retracts based on its answer. So for example, if I select 'Yes', nothing happens. However, if I selected 'No' to this particular question, it asks me a few other questions about the permanency of my living arrangements, and these are all optional questions. Now interestingly, these are questions relating to homelessness, and what we found through anecdotal evidence but also through our research phase is that clients who had never experienced homelessness moved through these without thinking too much about them, or without seeing the relevance, but clients who had experienced homelessness were very positive about these questions being included and having that visibility of those issues here.
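The expand-and-contract behaviour described here can be sketched as a simple data-driven rule. The question identifiers and data structure below are hypothetical, made up for this sketch rather than taken from the actual survey tool:

```python
# Map each branching question to the answer that triggers its follow-ups
# and the follow-up question ids it reveals (hypothetical ids).
FOLLOW_UPS = {
    "permanent_home": ("No", ["nights_without_home", "current_arrangement"]),
    "living_with_anyone": ("Yes", ["adults_in_household"]),
}

def visible_questions(base_questions, answers):
    """Return the question ids a client would actually see, given their
    answers so far. Follow-up questions appear only when the triggering
    answer was given, so the survey expands and contracts as you go."""
    shown = []
    for q in base_questions:
        shown.append(q)
        trigger, follow_ups = FOLLOW_UPS.get(q, (None, []))
        if answers.get(q) == trigger:
            shown.extend(follow_ups)
    return shown
```

In this sketch, answering 'No' to the permanent-home question reveals the optional homelessness follow-ups, while answering 'Yes' leaves them hidden, mirroring the behaviour demonstrated on screen.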
The next question in About you asks you to provide the following details about your permanent residence, so a State and also a Suburb, and more particularly a postcode; you'll notice all the postcodes are listed here and can be easily selected. Then: apart from now, was there any time over the last 2 years when you did not have a permanent place to live?
And this next section asks more questions aligned to the Personal Wellbeing Index: thinking about your own life and personal circumstances, how satisfied are you with your life as a whole? Now this is actually a sliding scale; you can click on the button here and move the slider across, and it will change the numbers in the box on the left hand side, allowing the client to scroll between "No satisfaction at all" up to "Completely satisfied". Interestingly, what we're seeing in the survey metadata coming back is that this is where we are seeing a really interesting shift for clients in terms of the pre and post surveys. And here are some more questions regarding service experience: "Now we would like to ask you some questions about the service you are receiving", "How did you find out about this service?" These are all items sourced directly from the Data Exchange referral pathways. "And what was your main reason for seeking assistance?" Now if I wanted to select a few of these as a client I can; there is the ability to select multiple options.
This next page has more questions about your service experience: "Overall, how satisfied are you with the service that you received?" It then asks a breakdown of other questions: "How satisfied are you that the service was easy to access?", "Was friendly and welcoming?", "Listened to your needs?", "Understood your needs?", "Met your needs?", and "Helped you deal with issues you needed help with?" This is just to provide a further breakdown of the aspects of that satisfaction with the service, because it can come from a variety of different things. The next question asks, "Thinking about the issues you sought assistance for, did you learn anything that will help you with the issues?" The answers here are 'Yes', 'No' and 'Not applicable'. The reason there's a Not applicable option is that, of course, not all services being provided are about learning something new or gaining a particular skill.
This next section of the survey asks questions about living arrangements. These are very similar to Census questions; if you've trawled around the Census website and had a look at community profile information, or ever sourced that in the past, you'll probably recognise some of these questions. So, "Do you feel safe in your neighbourhood?" and also "Are you currently living with anyone?" Now if I selected 'Yes' to this particular question it would ask me how many adults aged over 18 I live with; if I selected 'No', not too much changes here.
This next question is also looking at your living arrangements. So, "Are you the parent or guardian of any children under the age of 18?" If I select 'Yes' it asks me "How many of these children, if any, are under the age of 18 and living with me?"; if I selected 'No', that question of course does not present. So you can get a sense of how questions can expand and contract based on our answers. And I say 'our' because we're going through it today together, but of course a client would be going through this independently.
So this next section here is asking about the people around you, and asking in general, “How would you rate your family’s ability to get along with one another?” Now, none of these are mandatory questions, so as a client I could answer some of these and leave others blank. It’s really for the client to select what resonates with them and what they wish to share. “How often do you feel that you need support or help but can’t get it from anyone?”, or “Do you help someone who has a long-term health condition or disability, or who is elderly, with everyday types of activities?” And this last one is, again, a question very similar to the Census, to get a better sense of those in the community who are unpaid carers.
So these next questions you’re going to see are within the ‘Your recent history’ area of the survey. Now, all these questions and items are actually sourced from the Personal Wellbeing Index, which AIFS recommended we include in the survey. We do make it very clear up front that a client can skip any questions they do not wish to answer, but they all look at stressful life events, so: “In the last 12 months, have any of the following happened to you or your partner?” Now interestingly, depending on the program, we are getting some very interesting and, I think, insightful feedback from our pilot organisations here in terms of which of these resonate with some clients and which don’t. For example, with that first one there, we have a lot of anecdotal evidence from our aged care providers that their clients usually have a bit of a chuckle at that particular question, asking about a birth of a child or a pregnancy, because of course no one’s been pregnant very recently, so it gives them a little smile there, and this is something we’re taking on. You’ll notice there are a number of options there, and the client can select any of them if they want to. Then there is “In the last 12 months, due to shortage of money, have any of the following happened?” and also this last question: “Do you participate in any community service activity such as volunteering at school, coaching a sports team, or working with a church or neighbourhood association?” This is to get a sense of community participation, networks and engagement in the local community.
So this next section of the survey is asking about thoughts and feelings. These statements are all non-mandatory, and the client can move through and select what resonates with them: asking things about how much control they have over their lives and how strongly they agree or disagree with the following statements, “During the past four weeks, how often did you feel the following?”, and “During the past four weeks, have you had any of the following problems with your work or other regular daily activities as a result of any emotional problems (such as feeling depressed or anxious)?”
So this next section is Your health and wellbeing, and it asks a few questions around that. Again, these are non-mandatory questions: “In general, would you say your health is:” Now, if I selected ‘Good’, for example, it asks me whether I “Have any medical conditions or disabilities that have lasted, or are likely to last, for 6 months or more?” If I selected ‘Yes’, it would actually ask me what those medical conditions or disabilities are, and I can select them if I want to; if I don’t want to, I don’t have to, or if there are a number here, I can select a combination as well.
So this next section, Health and wellbeing, asks questions about activities done in a typical day, so “Does your health now limit you in these activities and if so how much?” and, “During the past 4 weeks, how much of the time have you had any of the following problems with your work, or other regular daily activities as a result of your physical health?”
These last few questions in Health and wellbeing are around smoking and drinking. Now, interestingly, when we were going through our first iterations of the client survey, some of our stakeholders were concerned that these questions would not be answered accurately, or that clients might be quite affronted about being asked them. What we found with our focus groups was that clients were actually pretty unfazed by these particular questions; they’re asked in a lot of different surveys, so we thought we should include them for the pilot to get a sense of whether or not they are being answered, and what the take-up there is from a metadata point of view. So it asks, “Do you currently smoke cigarettes?” ‘Yes’ or ‘No’, and “Does anyone smoke inside the house?”
And this last question is about alcoholic drinks; it is a drop-down. Some of the feedback we have received about this particular question is that perhaps the drop-down could include an option stating something such as “I choose not to answer”, if there is a particular reason why a client does not want to share that information; or, of course, the other option is that we could always turn this into a non-mandatory question, which means the client could choose not to answer it entirely and skip forward. We have also had other feedback from providers in our pilot organisations saying that the question being there is actually quite useful for getting a better sense of their client cohorts, but that having a visual reference on this page showing what a standard drink actually is, or what that looks like, would be particularly useful as well, because not everyone, of course, understands or recognises the concept or measurement of a standard drink. Now you can see that I’ve selected ‘Monthly or less’, and it then asks me a few other questions that I can choose to answer should I wish to.
So this last section of the survey asks about Work and studies: “How many paid jobs do you currently have?”, “The highest year of primary or secondary school”, “Highest qualification”, and “Approximately which value best describes the current annual household income (before tax)?” All drop-downs, all non-mandatory. And that last question is asking about assistance. What we have done, what we have aimed for, and what we are still aiming for with the client survey is to make it as easy and accessible as possible, so that clients can go through it independently and autonomously and give that true, authentic feedback. But of course, being only in English and only web-based at this stage, we anticipated that staff at service providers in our pilot organisations would probably need to provide assistance, either to step their clients through how to use the actual tool itself, or perhaps to translate some of the questions into another language, or to help explain some of the questions if need be. So we wanted to get a sense here of how often that was happening and the level of assistance that was occurring, so it simply asks, “Was there any time that you required assistance from the service provider to help you complete the questions in the survey?” ‘Yes’ or ‘No’.
Now when I select submit, it comes up saying “Do you wish to submit your survey answers now?” And then a Thank you page comes up. Now this Thank you page has actually been updated, so this is our training environment that I’ve walked through and shared with you but the ‘live’ client survey tool itself actually has a slightly different wording here, and discusses that, should any feelings or issues have arisen from completing the survey, the client may wish to talk to their service provider about it, they may wish to seek out someone that they trust, or possibly contact Lifeline as well.
So that is just a little bit of a walk-through of the client survey tool and how it is looking at the moment and what we have been piloting for the last couple of months. I can see a lot of questions coming through here as well which is excellent and I am definitely wanting to use a few of these because I notice from your questions they are very similar to the content we are about to cover together. But basically, let’s go through the pilot feedback together and get a sense of what people have said about the pilot tool, how it is working on the ground, and what we have learned from the pilot so far.
There we go, we are on full screen now, the joys of technology, okay.
So, feedback from our providers that are participating in the pilot – those organisations that have been piloting this tool for the last few months. The feedback about their clients is that they are time poor – so often, if they do decline the survey, it is because they just don’t have the time, or they would actually like to complete it off the premises, which is something we are seeing more and more. They want to provide feedback on the service they received, so we are still seeing a pretty good take-up of the client survey, and certainly there is still that positivity, that willingness to share their story, happening on the ground.
They would like to complete the survey outside of service delivery. While initially, in the research phase, we were told clients would prefer to do it on premises, there is certainly a larger and broader cohort of clients who would like to do it in their own time, and we are certainly seeing this for programs delivered outside of an office space. For those particular services delivered in people’s homes, or perhaps over the phone, emailing a link to the survey would be much more beneficial in terms of allowing clients to complete one off premises.
Clients are surveyed for a lot of other reasons. Certainly in the Network, different providers are at different levels in terms of looking for client outcomes themselves, so some organisations have their own surveys that they like to run to get a sense of client satisfaction. Of course, surveys come from a lot of different areas as well. A personal experience I will share with you all: my aunt was recently in hospital for a few things and she received a survey about her hospital experience; she then touched base with her GP, and the health centre wanted to run a survey as well. So when you think about it from the client’s point of view, surveys can come from a lot of different areas and for a lot of different reasons, so keeping it short, concise and to the point is very powerful in terms of making it easy for a client to want to share.
Clients also question the need to provide similar data to what is already asked at intake, so that first page where you notice we are popping in the name, date of birth, and gender and so on, if they were doing it very close to when they had intake or just started using a service, then they do stop and say, “why are you asking me these same questions again, I have already given you this information”. So that is quite interesting for us to know.
So, feedback from providers in the pilot regarding their staff. There can be difficulty in terms of resistance to change: getting their staff prepped and ready for the survey, knowing about the survey, feeling comfortable discussing it with their clients or offering it to them, and getting a sense of how it might fit into business as usual. There are also issues in terms of clients with low levels of English or IT literacy requiring assistance; this has an impact on staff as well, in terms of assisting those clients to move through the tool as it currently exists. An interesting fact for you: the survey was actually designed to be at a year 10 reading level, but we believe, and certainly the feedback we are getting is, that this can be dropped and reduced even further.
Vulnerable clients will likely require staff support to complete the survey, and this is something we would really like to minimise where possible. These factors will impact on the length of time required to complete the survey and will have repercussions for both staff and clients. There is certainly an impost there that needs to be minimised; this is something we are aware of and were anticipating, and we are seeing differing levels of it in the feedback from our pilot organisations.
They have also spoken to us a lot about logistics, so things like finding space on premises to administer the survey; do we set up that laptop in the corner of the waiting room, can we use tablets and do we have the resources for a tablet. Some of our pilot organisations have been sharing with us different ways that they have found they can overcome that hurdle or things might work on the ground. Is it something as easy as just letting a client know to come in 15 minutes earlier for their next appointment, setting them aside in the counselling room with the tablet so they can go through and complete the survey, and then if anything comes up or they think about anything through that survey they can then discuss it with that counsellor in their appointment.
So these are all things we are hearing and discussing throughout the pilot, including the financial impacts on human and IT resources to support the survey delivery. Access to reliable internet is something often raised, and something we were anticipating as well, particularly in rural and remote locations. Now, the survey in its current form is actually quite light in terms of bandwidth – our designer is fantastic and has made it a very clean, easy-running tool – so even in areas with intermittent access it runs well. But of course this is Australia; it is a very large country, and there are spots with intermittent internet or no internet at all, so we are having a look at how survey delivery could work in those areas. On providing and investigating alternate methods for completing a survey, there is certainly a strong level of feedback here because, as I was just discussing, in some areas or just for some client cohorts a web-based format may just not be appropriate; so what other or alternate methods could work in the sector, and what do we want to explore next? We have also received feedback from our pilot organisations about the survey methodology: is having a pre and post survey the best option for all services? For some services that have an ongoing relationship with clients, we are hearing that the pre and post survey works quite well for their business model; but for those organisations whose services are sporadic, one-off or unpredictable, or who just might have a client they are unsure they are going to see again, we are certainly hearing feedback that a single, one-off survey would be very valuable for those particular programs.
So the feedback that we have received from our pilot organisations, but also other stakeholders, has very common themes that I think you will see here, and I am sure many of you are considering these things yourselves. Clients may feel over-surveyed and survey fatigued; surveys are coming from a lot of different angles for a lot of different reasons – how can we minimise that as much as possible? The survey may duplicate existing data collection under DEX – very similar questions in terms of name, age, gender and so on – so can we remove this?
Some of the survey questions could be considered too personal or intrusive, and may lead to clients disengaging with the survey or even the service. So I am referring here particularly to those questions regarding stressful life events and the personal wellbeing index. Interestingly the survey data we are seeing is that clients are answering these questions but certainly there is the possibility that some clients may disengage and we of course don’t want this to happen.
Survey questions could be more program specific and could be entirely non-mandatory; so first of all looking at a survey tool where all questions are non-mandatory and a client could move through and select what they wanted and where. Also having a look at and thinking about survey questions being more program specific, so if I am delivering an Emergency Relief service is there a way where the questions could be filtered to something more specific to what I deliver on the ground.
The feedback we are getting from the pilot organisations and stakeholders is also about having that client survey data available back to them. It would of course be in aggregate format, so that clients are kept anonymous, but certainly sharing that information back with you is so important, because this is the main aspect of the client survey: that you yourselves as organisations can use this information.
Current Survey Data
So, what the survey data is telling us. Of the 296 surveys that have been completed, only about 3 per cent have both the pre and post surveys done. Now, this is something we expected, because not quite enough time has passed for the post-survey to be done at an appropriate time for a lot of the participating pilot organisations. In terms of the devices being used to complete the survey, you can see about 40 per cent are done on a PC or a laptop, and almost 58 per cent on tablets; certainly, for a lot of different programs, being so mobile and so easy to use, the tablet does seem to be the preferred method at this stage. Completion on a mobile phone is only 2.3 per cent, so clients using that QR code and doing the survey on their phone is a less popular option than we initially thought it might be.
The survey is being completed 92 per cent of the time in business hours – business hours being 8 AM to 6 PM, Monday to Friday – and only about 8 per cent outside of those timeframes. Now, the average time to complete the survey is more than we originally anticipated at the beginning of the pilot. You can see there that the average for the pre-survey is about 14 minutes, and for the post-survey it is about 15 and a half minutes.
Now what is interesting is where it has been flagged that assistance was required to complete the survey; so where a survey required assistance you can see that that average extends out to 24 minutes. We are seeing a significant amount of time there for staff but also for the client to move through and complete a survey.
Now, 86 per cent of clients who start a survey are completing it – moving all the way through and submitting their answers – which is excellent. We are also finding that 94 per cent of clients who complete those surveys are actually answering the non-mandatory questions. We certainly do not have quite enough of a pool yet to get a sense of which non-mandatory questions those are.
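Figures like these – completion rate and device share – fall straight out of the survey metadata. As a minimal sketch (the field names here are illustrative assumptions, not the Data Exchange's actual schema):

```python
from collections import Counter

def survey_stats(surveys):
    """Compute completion rate and device share from survey metadata records.

    Each record is a dict like {"device": "tablet", "submitted": True}.
    Illustrative field names only; the real collection's schema may differ.
    """
    total = len(surveys)
    completed = sum(1 for s in surveys if s["submitted"])
    devices = Counter(s["device"] for s in surveys)
    return {
        "completion_rate": round(100 * completed / total, 1),
        "device_share": {d: round(100 * n / total, 1)
                         for d, n in devices.items()},
    }
```

Running this over a batch of survey records gives the same kind of percentages quoted above, broken down by device.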
We will be looking at and trying to determine if there are patterns, so depending on the program selected are there particular non-mandatory questions that clients feel comfortable in answering and are choosing to answer over others and of course we will be sharing that information with our pilot organisations and sharing that as we move forward in the pilot.
Webinar Participant Questions
Now, I have a question here from Paula asking, “How can surveys be completed if there are over 3,000 clients aged 65 years and over, service is delivered in their home, and 66 per cent of them are from a non-English speaking background?” That is an excellent question, Paula, and you are not alone in asking it; this is something we are certainly getting a real sense of in the pilot so far. We already know that we want to translate the survey into a number of different languages, and we also know that, for the time being, in the aged care sector, using a web-based tool is a bit of a barrier in terms of IT literacy.
They say it is a generational issue – in twenty years’ time we probably won’t be as concerned – but for now it is a very real issue, and we are looking at how that might work best. We do have a few aged care providers participating in the pilot, some of whom deal exclusively with multicultural clients. Some of the successes and challenges they have been sharing with us have been quite interesting. One of the organisations was actually sharing with us that they used the client survey as a training tool for some of their clients on how to use a tablet: because the survey has slide scales, radio buttons and a scroll bar on the right-hand side to move up and down the page, they used it to guide their clients through using a tablet for the first time. Apparently that was really, really positive – the clients absolutely loved it and it worked really well – but certainly these are all things we are learning and hearing about as we continue with the pilot.
Client Survey Reports
So let’s now take a moment to look at the existing report for the survey. So the feedback report that pilot organisations can run and look at. This is just a screenshot of that and how it currently looks so it gives them a sense of the average time their clients are taking to complete, the different devices that are being used, and how many surveys are being completed as well.
For reports that are in development, I’ve got a few screenshots here on screen that I would like to share with you. This is information we have heard our pilot organisations would also like to know: the number of completes versus the number of incompletes; whether there are particular age brackets, or a particular gender, that might be filling out the survey more than the other; and, if they have multiple outlets, whether there is a particular outlet delivering higher numbers of surveys than the others. So this gives them information that can be useful to them in their business.
This next page is another survey report aspect in development, and this is to give organisations a sense of the average time to complete a survey versus completing when assisted, device used, and other client identifiers.
This next screenshot that you are looking at here is something that we have certainly been monitoring in the metadata of the client surveys, but this is something that we are hearing from our pilot organisations that they would also like to see, and this is looking at drop-outs by question, but also getting a sense of time.
So this is something that has been really interesting for us for the development and moving forward through the pilot: getting a sense of whether there is a particular point in time where clients decide to drop out of the survey – they have had enough, it is taking too long, or whatever their reason may be – and they disengage. Or is there a particular question that we find clients just really don’t want to answer, or a particular question that all of them are answering and relating to? This is certainly information our pilot organisations want to get a better sense of, too, for their own organisations and their own client survey delivery.
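A drop-out-by-question report of the kind described here boils down to finding, for each abandoned survey, the last question answered before the client disengaged, and tallying those points across surveys. A minimal sketch, assuming each incomplete survey is just a dict of the answers given (illustrative data shapes, not the actual reporting code):

```python
from collections import Counter

def dropout_by_question(incomplete_surveys, question_order):
    """Tally the last question answered in each abandoned survey.

    incomplete_surveys: list of dicts mapping question ID -> answer.
    question_order: question IDs in the order they appear in the survey.
    Illustrative sketch only.
    """
    counts = Counter()
    for answers in incomplete_surveys:
        last = None
        for q in question_order:
            if q in answers:
                last = q  # keep updating; ends as the last answered question
        counts[last] += 1
    return counts
```

The resulting tally shows which question (or point in the survey) clients most often abandoned it at.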
So what are we proposing as a result of this feedback? I would like to share with you now, from the feedback we have received, what we know we need to change and what we want to do with that feedback next. Firstly, what we really want to look at for the client survey is simplifying the language. A ‘plain English’ review will be completed to identify ways that questions can be changed, lessened or shortened, to make them easier to understand and more client-focused and client-friendly.
So, as I mentioned before, at the moment the survey tool is at a year 10 reading level, and we want to reduce this even further. We will also be doing another cultural sensitivity review to further consider topics that could be sensitive to clients of various backgrounds, and particularly Indigenous clients. I might actually take you back to one of the questions we looked at that was asking about community participation – “Do you volunteer with a school or a sports group, a neighbourhood association or church?” Interestingly, one of the pieces of feedback I hear fairly often at the moment, certainly on the Roadshow when we have been going around talking about the survey and meeting a lot of you in person, is about the term ‘church’, because of course we have many different citizens from many different backgrounds and faiths, and not everyone expresses or celebrates their faith in a church.
So certain things like this as well that we will be looking at in terminology and things used within the survey. We will also be looking at alternative question sets, identifying new questions or clearer defined questions, rephrasing of questions, altering questions to target best results, and to minimise confusion and to minimise any language barriers.
Alternate ways to deliver the survey that are under consideration at the moment, you can see now on screen. One of the main ones we hear a great deal, and what we are hearing back during the Roadshow is looking at a paper-based survey. Now, we are very much committed to investigating this option on what the survey might look like and how it might work if it was paper-based.
While this will address some barriers and some client cohorts may be much more comfortable in sharing their point-of-view and their service experience on paper, it also raises a few issues in terms of privacy and this is something we need to think about as well, because of course you have got a client’s details and their point-of-view on a piece of paper; where does that piece of paper go and how do you protect that piece of paper, how do you get it back to a central place that it can then be digitised and given back in a report that will be useful and meaningful for the organisation.
So we do want to investigate this fully and all of its flow-on effects. We are also looking at and thinking about an off-line survey, so for those areas that may only have intermittent internet access, is there a way that the survey tool can store results in the background somewhere and then once a month the org could then hook it up to the internet and it would then upload that information separately, or other offline options are definitely what we are looking at there.
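The offline option described above is essentially a store-and-forward pattern: completed responses are queued locally, then uploaded in bulk whenever a connection is available. A minimal sketch under assumed names (the queue file and the `send` callback are hypothetical; the real tool has no published offline mechanism yet):

```python
import json
import os

def save_offline(response, path="pending_surveys.json"):
    """Append a completed survey response to a local queue file."""
    queue = []
    if os.path.exists(path):
        with open(path) as f:
            queue = json.load(f)
    queue.append(response)
    with open(path, "w") as f:
        json.dump(queue, f)

def upload_pending(path="pending_surveys.json", send=lambda r: True):
    """Try to upload each queued response; keep any that fail for next time.

    `send` stands in for whatever upload call the real tool would make,
    returning True on success. Returns the number uploaded.
    """
    if not os.path.exists(path):
        return 0
    with open(path) as f:
        queue = json.load(f)
    remaining = [r for r in queue if not send(r)]
    with open(path, "w") as f:
        json.dump(remaining, f)
    return len(queue) - len(remaining)
```

The design choice here is that failed uploads simply stay in the queue, so an organisation with intermittent access could run `upload_pending` once a month, as described, without losing anything.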
Also telephone assisted surveys, so this would be more of an automated telephone service where a client could ring a number and they could select the language of their choice and the questions would then be read out to them via the telephone and they could then select their answers using the keypad. Now it would be an automated telephone service though and not a real human on the other side. Certainly depending on where we are, I hear very stark and different feedback on this point.
Some people say, “For our visually impaired clients this would just be amazing, they would love this”, but then equally I also hear providers saying, “Our clients, talking to a fake person on the phone? Forget it, they will hate it.” So it is quite interesting hearing that; certainly, if anyone has any comments on that, I encourage you to send them through to us today.
Webinar Participant Question
Now, I have a few other questions that have popped up, so I thought we might take a moment to discuss them while we are here. One of them, from Jane, asks, “Has the survey been trialled in group settings? How could we best encourage participation in this environment?”
Excellent question, Jane. The survey itself is really designed for clients to feel safe, independent and separate enough to be able to pop in their point of view and their story, so in a group setting we are getting a different sense of how this works.
One of our pilot organisations in particular raised that they have a men’s shed, where they see a regular group of men who get together, hang out and build things together. What they have done is actually brought a tablet in with them and said, “Look, we have a survey over here; here is a bit of information about it. If you would like to sit over in the corner with a chair and your cup of tea in our break and go through and answer that survey, you are more than welcome to.” They are finding that that is working quite well in that particular group setting. It has also been quite nice for the men involved, because some of them are not very tablet-savvy, so some of them have actually been walking their friends through it and saying, “Oh no, no, that is a slidey scale, you’ve got to click on it and slide”, and they are finding that to be a really nice aspect of the group service. But certainly we are hearing different things.
For some other group-focused and group-delivered services, such as Playgroups or child-focused services, we are definitely getting the feedback that having the survey delivered separately, or given to the client to complete in their own time at home, would be more appropriate, because of course you do not want to distract the parents, practitioners or other people there who are supervising and looking after the kids at the time. So we are looking at a number of different ways it may work in terms of group participation. Thank you, Jane, for that excellent question.
Now, some other aspects of the client survey that we are looking at include a review of the policy behind the survey: getting a sense of whether the policy we anticipated and the methodology we created in our discussion paper need to be adapted as we learn more throughout this pilot phase. We are open to changing or updating that if we need to.
We are also looking at language barrier improvements. One of the things we will be implementing for the second phase of the pilot, where we can, is adding an audio loop to the survey tool, because this was something raised in feedback: some clients, while they may not be literate in English, may not actually be literate in their first language either, but hearing their own language, or hearing English, they are very comfortable and able to answer the questions this way.
So we think adding an audio loop would be quite effective for those particular client cohorts.
Also we definitely want to translate the survey, so we are looking at which languages we would choose first for this next phase of the pilot, but certainly this is something we absolutely want to do, it is something we always knew we would want to do and have to do, but of course wanted to start that pilot and get a sense of how the tool is working as it exists at the moment.
We are also investigating targeting the questions to program objectives. So, it being a general survey with very similar questions across the board and the feedback we are getting on how we want to target that to particular programs or something a bit more specified, we are certainly having a look at. Now in some ways this could cause barriers for us that we need to think about.
So, you know, at the moment the Data Exchange is used by 60 different programs, and of course it would be quite difficult for us to monitor and maintain 60 separate surveys. But could they actually be grouped in similar ways? If it is a child-focused service, a counselling service, or more of an Emergency Relief service, could we group surveys in particular ways that could be more adaptable to particular programs? We are certainly investigating. We are also getting a sense of whether there is a way we could ask particular questions that expand or contract into questions that are more program-specific; all things that we are investigating fully and playing with during this pilot.
Now, there are alternate ways to deliver the survey that we have looked at and have also had approved, which is wonderful. One of those is actually sending the survey by email. This has recently been approved by our ethics committee, and it is something we will be exploring in the second phase of the pilot, from July to December this year. So, for those clients who are perhaps in a Playgroup or a group setting, or other clients who do not have time then and there but are happy to do it in their own time at home, it is the ability for the organisation to send the client an email link to the survey, so they can go and complete that survey separately. We are also looking at creating a one-off version of the survey, and I have a demo version I would actually like to share with you during this webinar today, showing how we anticipate it might look.
Part of the Roadshow that we are doing at the moment, going out and travelling around Australia and meeting a lot of you in person where we can, has been about sourcing feedback on this one-off version of the survey: providers' comments, any changes they would like to see, or things they feel would be particularly beneficial for them. But essentially the one-off survey looks at basic demographics of a client; client reasons for seeking assistance; combining the pre and post outcomes together in a single survey, so rather than the pre and post, what would these two aspects look like if they were combined together; and also client-rated satisfaction with services, so much more heavily focused on client satisfaction.
One-Off Survey Demonstration
So I am just going to demonstrate the one-off survey with you at the moment. It is a demo, so please keep that in mind as we move through, but this is to sort of give you an idea of how this may look and this is something we are certainly looking to build and pilot as well, to get a sense of how popular and useful it will be on the ground.
So I will just walk you through and explain a few parts of it, and please know as well that no graphic designers were consulted in the making of this demo; it is very much a demo, so please enjoy the clipart. Okay, so this first screen here is asking for these details again. Now, one of the pieces of feedback we have gotten on the one-off survey is that if it is a one-off survey and the pre and post do not need to be linked, then ideally this section might actually be able to be removed, because of course we would not need all this information to create the SLK to link the pre and post surveys together. So this is something we are considering.
Now, the feedback that we have heard is that clients do enjoy having the survey in sections, so we have tried to replicate that as well, but also by providing a bit of a prompter screen at the beginning to give a little more context about why these particular questions are being asked. You will also notice as we move through that all of these questions are non-mandatory, so a client could select any that resonate with them or any that they would like to share information on.
So this first section is Your story; ‘the following questions about you help us understand who is accessing the services that were delivered, and how different people get different things out of the activities that are offered’. Your housing situation; do you currently have a permanent place to live? ‘Yes’ or ‘No’. Please provide the following information about your permanent residence or the place that you stay at most of the time. And again this would be a state dropdown list and a suburb dropdown list. Now a key piece of feedback that I get, and I think it is a really powerful point, is that we are thinking of changing suburb to postcode, particularly in regional areas where they do not have suburbs and the terminology could be misleading. Your cultural identity; what is your country of birth? What is the main language spoken at home? Are you of Aboriginal or Torres Strait Islander heritage?
Your health and wellbeing. Now this screen would only present that top question which asks; do you have medical conditions or disabilities that have lasted, or are likely to last for 6 months or more? ‘Yes’ or ‘No’. If the client selected ‘Yes’, this bottom half would pop-up and present on screen, very similar to how it works in the original tool and how it behaves at the moment. There are a number of options there and they could select multiple options.
Now some of the feedback that I have received on the road is that some of these particular medical conditions and disabilities could be better described or could be better grouped, if you have any comments on that please do send them through to us I would be very interested to hear your perspective on that.
Your work and studies; now we would like to ask you some questions about your work and study, how many paid jobs do you currently have? What was the highest year of primary or secondary school that you have completed? And, which value best describes the current annual household income (before tax)? Now this is certainly a sentence that I think requires a plain English review. Also, some of the feedback we have had is, what about clients that may be living at home with others, they may not actually be aware of the annual household income. So having an option such as ‘I don’t know’ listed here might be quite beneficial as well.
And something I would like to share with you too in terms of our ethics approval for the client survey is that we recently received approval for the second part of the pilot to be able to offer the survey to clients who are 16 and 17 years of age as well. This is something that we are quite excited about, because there are a number of young people accessing programs across Australia who are making quite adult decisions and leading adult lives, some of them young parents, so being able to share their voice as well we find particularly important. This next section asks Your Service; we would like to ask you some questions about the service you are receiving, how you found out about it and what you got out of it. So, ‘how did you find out about this service?’
It is very, very similar, in fact exactly the same and linked to those referral pathways that you would see in DEX, and the client can select any or all that apply. Now this is an area where we would really like to do a bit more of a plain English review, or perhaps change some of the terminology to be much more client focused. A good example is actually about halfway down that list. At the moment within the Data Exchange you will notice an option for whether a client was self-referred. But of course from a client point of view, this would be something much more along the lines of ‘I found the service myself’, whether that be via social media, the internet, brochures and so on. Now, these two questions are actually new questions that we are thinking of including in the one-off due to the feedback we have gotten from the pilot organisations so far. And this is to get a sense of the intensity of service from clients, so ‘how long have you been accessing this service?’, ‘Less than one week’ through to ‘More than one year’. But also, ‘how many times have you accessed this service?’, ‘Once’ through to ‘More than 20 times’. So this is to get a real sense of whether this is a client who comes along maybe once every 12 months, or a client who actually accesses the service quite regularly.
Interestingly, the feedback I am getting about this is certainly about using the word ‘accessed’ twice but differently in both sentences, so this is definitely something we will be looking at in terms of our plain English review. But also there are services where access may not be easily countable; for example a Meals on Wheels service, where a client could have a delivery of 20 meals but they get that delivery once every week. So perhaps rephrasing, or having an option here to say how many times you access this service weekly, monthly, annually and so on.
I have a question that has come through here from Miff asking, ‘are you exploring a survey option for 9 to 16 year olds?’ This is a really interesting question Miff, thank you for sending that through. Ideally to have a survey be appropriate for children and to collect a child’s perspective and their voice for things, I think is something that would be fantastic but something that is quite far off at this stage for the client survey. It would require a great deal of consultation, experimentation and testing.
We would also probably need to consult on the specific age brackets and on how the language of the questions might work for different age groups. But this is something that we do regularly hear about in feedback on the survey, so it is certainly something we are trying to get a sense of in terms of popularity and accessibility, and how we might want to explore it as we move forward.
Webinar Participant Comment
I also have a comment here from Leanne in terms of the annual household income: ‘it would be good to indicate what the aged pension amount is for older people that are completing the survey’. And that is an excellent comment Leanne, and actually one that I think is quite important if that question is included in the one-off, and that is that often people do not actually know what their household income is before tax. A lot of the time people are much more confident about what they get in their hand, whether that be a pension, a salary or something else. Certainly there are questions about where the income comes from in things like the Census; I do not think it is something that we would want to replicate here, but often there are questions received on that.
Webinar Participant Question
I have got another question here from Jane too, ‘thank you’; you are welcome Jane. ‘I was wondering more about if there was not a clear date for a service, like a Playgroup, then how do we determine the time to offer the post-survey; can we offer several post-surveys and so on.’ This is actually a great question Jane. At the moment with the pre and the post, this is really why we are exploring this one-off survey idea, because it is not always appropriate depending on the service and it can be difficult to get a sense of when a post-survey might actually occur. Certainly multiple post-surveys, like having multiple post-SCOREs, is definitely an option, and it would be at the organisation's discretion to get a sense of when that would happen and, on the ground, when would be the most appropriate time to do that.
One-Off Survey Demonstration
I might just draw everyone's attention – oh, I have gone ahead – back to the one-off survey, and I will pause at the end to take some more questions and discuss a few more items of feedback as well. So this next area here is asking, ‘What was your main reason for seeking assistance or using this service? (Please select all that apply)’. Now these have been taken directly from the Circumstances domain of SCORE.
We certainly want to change the language to make it more client focussed or client friendly, such as, instead of ‘managing money’, perhaps putting it into a phrase such as ‘seek improvements in managing my money better’ or something along those lines, so we are getting a sense of that and thinking through that at the moment.
But what’s interesting about this particular screen in the one-off survey is that whatever a client selected here as their reason for seeking assistance or why they were using the service then directly influences the questions that are coming in the next screen, which is looking at their ‘life before receiving assistance from the service and also after receiving assistance’.
So ‘Thinking about the impacts of the issues you were seeking help for before getting help from this organisation how much was’ and if I had selected for example ‘I wanted to improve my physical health’ it would ask ‘how much was your physical health impacting on your life’ and then we have our radio button options of ‘Not at all’, all the way through to ‘All of the time’. Now as a client if I had selected at the previous screen that I was seeking assistance in terms of managing my money better it would ask something along the lines of, ‘before getting help from this organisation how much was managing your money impacting your life’; ‘Not at all’, all the way through to ‘All of the time’.
This next page then asks about the client’s life after receiving assistance from the service, so; ‘now that you have received assistance how much is your physical health impacting your life? How much is managing your money impacting your life?’
This next screen asks ‘What has changed as a result of your service?’, so: ‘Thinking about the issues that you sought assistance for: Did you learn anything that will help you with the issue(s)?’; ‘Yes’, ‘No’, ‘Not applicable’. And we are looking at determining, based on what was answered in this particular question, whether or not the others that you see on screen here then also present, or if they are already there as options as well. ‘How confident are you that you can use what you have learned to help you with the issues?’, ‘Did you make any change or do anything differently as a result of the new things that you learnt?’ and ‘Did making these changes help resolve/improve the issue(s) that you sought assistance for?’
So some of these questions, you can see, are different to the current survey tool, and are really trying to get a better sense of attribution, which can be quite difficult from a data perspective; you know, being able to really establish that because a client accessed this service, this is what directly improved because of that service.
‘What has changed as a result of your service?’ ‘Thinking about your experience with this organisation and the issue or issues you sought help for, did you: Receive useful information, Develop relevant skills, Improve your access to relevant support services, Increase your confidence to make decisions about the issue, or issues or Gain the ability to help yourself with the issue or issues in the future’. So you can see here these are questions that are very similarly targeted to the goals domain of SCORE and the client could answer ‘Not applicable’ if they wanted or anything from ‘Not at all’ through to ‘A lot’.
‘Your satisfaction with service experience’; ‘How satisfied are you that the service: Was easy to access, Was friendly and welcoming, Listened to your needs, Understood your needs, Met your needs, or Helped you deal with the issues you needed help with’. These are very similar questions to what is asked in the current survey, and we wanted to replicate that. But interestingly, something that has come back through the Roadshow, and I think it is a really interesting point in terms of interpretation, concerns the question ‘easy to access’. Depending on the client or the service this could actually mean a number of things: is it asking me whether or not the service was easy to find, and whether I knew the service existed, or is it asking whether the service is easily accessible in terms of me getting into the building? So having a look at those interpretations, those different readings, and what language is used is certainly something we are very interested in and want to improve.
Now this bottom question on the screen is actually a new question that we are thinking of including in the one-off survey: ‘Would you be willing to recommend this service to others based on your experience?’ The reason we wanted to include this question, and we have received a great deal of feedback that including it would be very beneficial for organisations, is because a client being willing to refer a service on to someone they know is a very strong determinant of how satisfied they are with that service.
‘Is this the only organisation helping you with the issue you sought assistance for?’ So this particular question is again another new one, and it came out of feedback from pilot organisations, particularly around getting a better sense of how often clients are being asked about the survey and whether they are accessing services elsewhere; so if they go down the road to another organisation, will that organisation also ask them to do the survey? Getting a better sense of where crossover occurs is something that pilot organisations are really interested in, and to a level we are too, so this one has been included. It is a simple ‘yes’ or ‘no’ answer, and if ‘yes’ is selected it will then pop up the bottom half of the screen which you can see here, which is ‘what were the other sources you were seeking assistance from?’, and there are a few options we have listed there. Now I would be interested to hear from those of you who have tuned in today: are there other options not listed there that you think would be quite relevant to yourselves? Are there other sources out there that you know your clients might also be accessing alongside your services? We would like to know that, because while this is a general listing, we also want to make sure it is as comprehensive as it can be.
And then of course the last page, again asking about assistance in completing your survey, so; ‘We understand there may be various reasons why you may need assistance from your provider to complete this survey. Was there any time you required assistance to help you complete the questions in the survey?’, ‘Yes’ or ‘No’. So continuing to get that sense of assistance or where staff may have needed to help out in that.
And lastly our thank you, so: ‘Thank you for telling us about your service and your story. Information provided from people like you assists us in ensuring that services are available and appropriate’. And essentially that is how we are anticipating the one-off survey might look.
Webinar Participant Questions and Comments
Now I have a great question here that I have noticed come up which was asking about the timeline for the one-off survey and when that might be released or available to be used for the pilot. So at the moment we are really just getting feedback about the one-off survey; what people like, what people don’t like, what organisations think we should include or exclude. This will help us finalise the design of our one-off survey.
We will then have to actually build it, so at the moment we are anticipating closer to the end of the year, possibly September/October, as the timeline we are thinking about and aiming for at this stage. But we would ideally like to get the one-off survey out there for the second phase of the pilot before the end of the year so we can adequately test its take-up and see how it works on the ground. But excellent question, and thank you so much for sending that through Claire.
I have a question here too from Donna asking, ‘does having neither satisfied nor dissatisfied as an option add unnecessary ambiguity?’ At this stage I do not think we have quite enough data from the current client survey tool to make that determination, but this is something we definitely want to have a look at and something we are trying to minimise with the one-off survey. Certainly we will keep an eye on that to get a sense of whether it is actually an adequate response, because as you say Donna there is a bit of ambiguity there: how do you know the level of satisfaction for the client, or whether they have an opinion on that at all? So that is something we will definitely be investigating once we know more.
Generally speaking though I would really like to get a sense from all of you, what do you think of that short one-off survey? Is this something you think would be very useful to yourselves delivering services on the ground? Do you think this is something that your clients would be open to participating in and sharing their story? I am just going to pause here for a minute because I understand that people will be frantically typing at this point.
So thank you Brendon and Rose for being my lovely facilitators, oh and Angela as well, in answering your questions that have been coming through today; it has been wonderful.
Yes, so I am seeing a lot of positive responses about the one-off survey; people are enjoying it, thinking it is quite appealing and would be very useful – much more relevant than the first survey. Oh, that is from a Marita; that is quite exciting, I think I have only ever met one other Marita in my whole life, so it is good to know there are more of us out there. ‘Once-off survey, love it, great session’; good, good, excellent. Well, that is really good to know, because this is certainly something we are hearing across the network and we really want to respond to that feedback.
As you can see we are taking on all feedback, good, bad or indifferent, negative or positive. We always say feedback is an absolute gift, and it is certainly something that we want to take on because, of course, at the end of the day this is a tool that you yourselves will be offering to your clients; it needs to be something that you believe in, that you know the benefit of, something that will be useful for yourselves. Yes, I am seeing some comments too here that the one-off survey seems more relevant and client friendly, and ‘certainly more appropriate for our service’ – excellent. Thank you everyone for your comments, I really appreciate that.
So feel free to keep sending in those comments, but I think we might just finish up the main content of the presentation today looking at timing and next steps for the client survey. So I have just brought up that slide here now so that we can have a closer look. As I mentioned before, the first phase of the pilot started in January. We are finishing up that first phase with the Roadshow: touring around Australia, meeting a lot of you in person where we can, getting your feedback about where we are up to with the pilot, what we have learned, where we are heading, and also getting a sense of what you think is going to work best within your business and for your clients and your services. So we are moving to a soft launch now for the client survey, or the second phase of the pilot, which is going to span from the 1st of July through to the 31st of December.
Now, what we mean by soft launch is that it is available for anyone who would like to trial implementation in their own business. So if you want to see how it is going to work for you in your organisation, this is a perfect time, as of the 1st of July, to get a sense of how that is going to work, to prepare your staff, give it a go, and get a sense of what the survey will mean for you on the ground.
We are hoping this period will really give organisations time to prepare for the launch of the client survey, which we are anticipating will happen across the network in 2018. So this is just a little bit of extra play room to get a sense of how the email link works on the ground and whether that is a more popular and more accessible option for clients; translating the survey; adding that audio loop, and whether that improves accessibility and whether clients take it up; shortening the survey; and so on and so forth. It will also be a great opportunity for us to test the one-off survey and any additional enhancements to the current survey tool as it stands, so what is more popular, what is less popular, and how we might move forward with that. It is quite an exciting time, I think, looking at all of that.
Now, in terms of next steps, if you would like to be part of the soft launch from the 1st of July, you are absolutely welcome. If you would like to be part of the pilot itself, you are more than welcome to join in as well; even though we are part way through it, it does not mean you cannot join in. If you would like to, we would love to have you, and please send through that enquiry to the email address you see on screen: MyServiceMyStory@dss.gov.au.
Otherwise I think I might close the webinar for today. I want to thank all of you for taking time out of your busy schedules to be here. We are very excited about having this Webinar technology available to us now; it certainly means we can see a lot more of you, and I know it is a lot more accessible for many of you – a lot easier tuning in from the office for a little bit to get a sense of what is happening in the DEX world. If you have any other questions please send them through. I think for the last bit of this Webinar we might have a little bit of a Q and A for the next 15 minutes and get a sense of any other outstanding questions you have about the client survey, and we will discuss those together. Otherwise, for those of you who would like to tune out, you are welcome to, and let's go through some of those questions before we finish today.
So we have got questions here about the slides you have had a look at today and how we will distribute them. So I just want to let you know, Gillian, and for those who may be interested in sharing this presentation with their staff, that we have actually recorded today's presentation, so we will be preparing that for our website and will be releasing it and having it hosted on the DEX website so you can watch it at your leisure at any time. If your staff want to have a look at it, they are absolutely welcome to, and of course you will be able to access the slides that way. So thank you for that question Gillian, always a good question to raise for the benefit of the group.
I have a comment here from Claire saying, ‘the second survey comments are more relevant from a services perspective to gauge outcomes and client satisfaction. Leaving it to after the service has been provided seems to waste an opportunity to improve the service being provided for that client though, and when would be the ideal time to ask for the survey to be completed if at the end of the program?’ That is an excellent point Claire, and thank you for raising that. I think this is certainly something the pilot will explore. Certainly at the moment, with the survey in its current pre and post format, we are finding that, depending on the program and the organisation, our pilot organisations are figuring out when the pre and post-surveys take place in their service delivery.
For example, if it is more of a counselling service and they know they do about three counselling sessions on average, a few have then decided, ‘okay, we will go for a post-survey at that third counselling session and see where we go from there’. I think it will be very individual depending on the organisation, and I think we are going to see very varied responses to that as the pilot progresses. In terms of the one-off survey, because it is that one-off instance of asking for information, it does lean toward happening at the end of service delivery with a client. But if you are continuing to have contact with that client, continuing to have that relationship, it could be more appropriate to offer it during the middle of that service delivery, particularly if they are going to be accessing that service over multiple months or years. It could simply be an appropriate time and place; you offer the survey and it just so happens that at that time they are ready and willing to do it. As I mentioned before, I do think we are going to see very, very different results as the pilot continues – thank you for that question.
Another question here from Jane: ‘Am I right in thinking that the trial is ongoing with the current survey, pre and post? And the soft launch gives the option of trying out the one-off survey and audio options, etc.’ Yes Jane, that is absolutely what we are moving toward. Interestingly, some of our programs really like the pre/post client survey; it seems to be working really well for a couple of pilot organisations, and we definitely want to continue with that for a little bit longer to see how it works, particularly because with some services you need weeks or months before a post-survey would be relevant or appropriate, so we will definitely be continuing that throughout the trial. But this soft launch is definitely more about having the email option; allowing the survey to be completed by those 16 and 17 years of age, as well as 18 and over; and also the audio loop and the one-off survey, which are things we want to try during the second part of the soft launch. Absolutely, you have that right.
I have a question here from Kim wondering how regularly the survey data reports would be available to service providers; is it live streaming? Excellent question Kim. At the moment for our pilot organisations the results are actually updated about every two weeks; it is a pretty manual process extracting that data for them. But ideally, moving forward with the client survey report, it is something we would like to have on more of a live basis. I do not know if you are part of the Partnership Approach or if you have been playing with the Qlik reports that were released in March, but they are incredibly fast, really easy to use, and they update in real time. If we could replicate that for the client survey reports for providers, this is something we definitely want to have available for them – thank you, great question.
I am just pausing for a moment in case there are any other comments any of you would like to share. Are there any other questions about the client survey you might want to ask while you are here? Otherwise, if you have any other questions that come to you at a later stage, you are more than welcome to send them through to MyServiceMyStory@dss.gov.au.
As I mentioned at the beginning of the Webinar, my name is Marita Baier-Gorman; I am the lead trainer of the Data Exchange training team, and it has been an absolute pleasure spending a little bit of time this afternoon with all of you to share our client survey, how it is looking, but also where we want to take it in the future. There will be a short survey emailed to you once we complete this Webinar today, and if you would not mind taking the time to send in those answers, we would greatly appreciate it.
Otherwise, I hope you have a wonderful and safe afternoon and a really excellent weekend, and I am sure I will meet many of you in person in the future. Have an excellent day and I hope you have enjoyed the Webinar – thank you very much.
This webinar aims to provide a deeper understanding of the intent of the client survey, where the Department is looking to adjust its approach in response to feedback, and the next phase of the pilot moving forward.
Please note: This webinar was recorded on Thursday, 25 May 2017 and that the content of the client survey questions discussed for both the post and one-off survey is as at this date.