Learning to listen

If you’ve visited our website you might be aware of our exit poll. It’s a small box that pops up after you’ve spent some time on the site and it’s one of the ways we collect feedback from our users.

In this piece I’d like to explain a bit about how we’re using these comments to improve our digital service.

Constructive criticism

Each month thousands of our users complete exit polls, and a large percentage tell us they are happy with the service provided. A smaller group tell us about the difficulties they experienced using the website. These comments range from design issues (“search facility needs improving”) to personal queries (“more information for self-employed mothers”).

The Insight and Evaluation team shares these comments with me and I categorise them for the editorial team. Common themes include consumers asking for more ‘real life’ examples and complaints about articles being too complicated.

The initial scope of the project was very wide, and we were looking at comments across the entire website. For example, a frequent criticism is that we use too much jargon. This has been levelled at us across most of our content; however, breaking these comments down showed us that the issue was much greater in some areas of the website than in others.

Having a better understanding of our users’ needs across the different sections means we can prioritise the areas we need to improve. In this case, the long-term solution is to integrate a tooltip/glossary function into the website, but the priority is to do so in the areas where jargon is most frequently described as a problem.
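
To give a feel for what that breakdown looks like in practice, here is a minimal sketch (in Python) of counting jargon-related comments per site section. It assumes the exit-poll comments have been exported with a tag for the section of the site they relate to; the sample data, section names and keyword list are purely illustrative rather than a description of how our Insight and Evaluation team actually works.

from collections import Counter

# Hypothetical sample of exit-poll comments, each tagged with the
# section of the site the user was commenting on.
comments = [
    {"section": "pensions", "text": "Too much jargon, what is an annuity?"},
    {"section": "pensions", "text": "Please explain terms like 'drawdown'."},
    {"section": "budgeting", "text": "Search facility needs improving."},
    {"section": "budgeting", "text": "More real life examples please."},
]

# Very rough theme detection: flag a comment as a jargon complaint
# if it mentions jargon or asks for a term to be explained.
def mentions_jargon(text: str) -> bool:
    keywords = ("jargon", "explain", "what is", "term")
    return any(k in text.lower() for k in keywords)

# Count jargon complaints per section, so the worst-affected areas
# can be prioritised for a tooltip/glossary rollout first.
jargon_by_section = Counter(
    c["section"] for c in comments if mentions_jargon(c["text"])
)

for section, count in jargon_by_section.most_common():
    print(f"{section}: {count} jargon-related comments")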

A segmented approach to improvement

This layered approach also means we are able to assess feedback on individual articles. For example, one piece received a relatively high number of comments from users saying they couldn’t find our Retirement Adviser Directory (RAD). There was already a link to this content in the article, but clearly it wasn’t visible enough.

To help users find what they were looking for, an extra link to the RAD was added higher up the page, as well as an internal menu. The latter shows users exactly what’s in the article and lets them skip to the section they’re interested in. After just a few weeks of monitoring, it became clear the experiment was a success, with the majority of users interacting with the new menu or clicking through to the RAD.

This type of quick win enables the content team to drive the business case for larger scale improvements, such as a site-wide glossary.

Building on success

Finally, although each comment is valuable, it’s worth pointing out that we don’t make changes based solely on customer feedback. Each request or comment is analysed and assessed as part of our overall content strategy before it’s actioned. It’s easy to get carried away when implementing feedback, but it’s important to remember that our content strategy is built on several pillars, and user feedback is only one of them.
