Listening to what you think
Danny Langley reflects on results from this year’s ONS customer satisfaction survey and how it is evolving.
Every year ONS used to run the same Customer Satisfaction Survey, asking our users what they think of us, what we do well, and what we could do better.
The Bean Review of economic statistics challenged us to better understand our users’ needs. It noted that our annual process regularly suggested enviably high satisfaction scores, yet also highlighted known limitations to our outputs that numerous users had commented on, whether publicly or privately. Within ONS, too, we pondered whether the process was really giving us the most meaningful insight possible into what our users thought about us and what their needs were.
Single survey
So this year we tried something a bit different. Instead of running two different surveys, one for Government departments and another for users more generally, we ran a single survey that aimed to feed views into the ongoing business planning process. We stopped using the same sample frame that had returned almost identical results year-on-year and instead asked colleagues to share with their contacts and put the survey out on social media. We also asked a slightly different set of questions.
Disappointingly, we received many fewer responses this year. That could be for a number of reasons. Changing the way we advertised the survey, relying less on direct mail-outs than previously, no doubt contributed. So, perhaps, did the way we labelled it. A ‘business priorities survey’ perhaps sounds more daunting than a simple satisfaction survey!
The low response rate means that the results need to be treated with some caution. Nonetheless, the headline results suggest generally very high satisfaction levels across the board, with the most room for improvement in the areas of communications and ensuring our statistics are as helpful as possible to decision or policy-makers:
- 88% trust ONS statistics and analyses
- 83% are satisfied with the quality of ONS statistics and analyses
- 76% are satisfied with ONS’ communication
- 75% consider ONS statistics to have been helpful in providing an evidence base for decision-making or policies over the last year
- 84% are satisfied with the overall performance of ONS as a National Statistics Office
Because of the different questions, different overall approach, and the much smaller number of respondents, this year’s results are not comparable with previous years and so we haven’t included them here, although for those interested previous years’ results are available on the archive website.
Perhaps more valuable than the quantitative results was the qualitative feedback respondents gave us. This was supplemented by a series of in-depth anonymous interviews we commissioned to give us additional detailed feedback.
High quality outputs, but more to do
Areas where users felt we were doing well included being responsive to requests and being helpful and courteous. Many commented on the generally high quality of our outputs, and it was felt that our external communications and engagement have improved since the publication of the Bean Review.
The ONS website received both praise for having improved and calls for further improvement. We’ve shared comments on the website with our digital colleagues, who are working constantly to improve it further. You can keep up with their progress over at the ONS Digital blog.
Another area where many respondents called for improvement was our regional statistics offering. We know that in an age of increasing devolution there is a growing demand for more data at a local level. We’ve launched an ambitious programme of improvements, and in recent weeks we’ve opened a series of consultations on some of our detailed proposals: one on regional Gross Value Added, another on sub-national public sector finances (closes 11 September) and one on sub-national service sector exports (closes 8 September). We think all of these consultations represent tangible steps towards improving our local offering, and I’d encourage you to take a look if these topics are of particular interest to you.
Some people have also said they would like to sign up to receive e-mail updates about our releases, which you can now do here. We have also heard loud and clear those of you who don’t think we consult in a coordinated way. We have now set up a one-stop shop for ONS consultations, and are thinking more generally about how we can run consultations better.
In terms of our business priorities, you told us that the three major output-based transformation programmes – covering the census, economic statistics and social statistics – were roughly equally important to you. Fewer respondents expressed a strong interest in how we are transforming data collection, although in truth transforming the way we collect data is central to all other areas of transformation, so this will continue to be a key priority.
For more detail, you can find a presentation summarising the survey results here.
Next steps
This year we tried something new. It worked well in some ways and not so well in others, which is almost inevitable when evolving how we do things. Rather than revert to what we did before, we want to keep trying new things until we are confident we have the best system in place to make sure we are meaningfully engaging all our users in everything we do.
If you have any comments or suggestions, or if you know of an organisation that does this particularly well, please get in touch – we’d love to hear from you. Is there a way you think the survey could be done better? Is a survey even the best way to get your input at all, or would we be better off running a series of events, for example? We’re open to all ideas.
Even though much of what you told us via the survey chimed with what we expected, it’s always valuable to receive timely reminders of what we do well and – more importantly – what we need to do better. So the final thing for me to say is thank you to those who took the time to tell us what you think – please continue to do so.
Danny Langley is ONS’s Head of User Insight and Engagement