How to Measure the Effectiveness of Your Self-help Content
Do you know how much money you spend each time a customer contacts you for support?
I’m not talking about the money spent on the issue resolution. I’m just talking about an instance of a customer approaching your support staff.
If you aren’t sure, refer to the following chart of approximate cost per contact, as established by Forrester Research:
As you can see, the cost of calls and chats ranges from $5 to more than $12 per contact, while web self-service channels start at just $0.10 or less.
In addition to bringing down the cost per contact, self-service channels also delight customers: 65% of consumers feel good about themselves and the company they’re engaging with when they’re able to resolve issues on their own.
In most of the popular web self-service channels like forums, knowledge bases, and FAQs, CONTENT plays the BIGGEST role. So to maximize these self-help channels, it’s important to keep improving their content.
Of course, it’s not possible to improve the help material without knowing what customers think about it and how they interact with it. So let’s look at three simple ways to find this out:
1. Ask for user feedback
The simplest way to gauge the quality of your self-help content is to ASK your customers for feedback.
This tactic might sound like a no-brainer, but you may be surprised: Forrester research found that more than 75% of the websites surveyed failed to ask their customers for even basic feedback. Forrester calls this a “huge missed opportunity”:
Ask customers a Yes or No question “Did this FAQ/search/recommendation solve your problem?” Only 23% of Web sites in Forrester’s July 2009 eBusiness Customer Service And Support Benchmarks invited customers to provide feedback on how effectively a question was answered – a huge missed opportunity to improve answer relevance.
User feedback doesn’t just score the quality of your help content; it also highlights what’s wrong with it.
If you use a solution like our KnowAll knowledge base theme, you can easily add feedback forms to your knowledge base content.
Look at our site, for example. At the end of every support article, we ask users a simple yes/no question: did this article help? The objective answers help us spot whether anything’s grossly wrong with our content.
To get more insight into how the customer found the content, we follow up the yes/no question with a simple feedback form and a prompt. (Introducing the longer feedback form at a later stage keeps those initial one-click responses coming.)
Over time, the upvote and downvote data becomes available through the analytics dashboard.
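To make the idea concrete, here is a minimal sketch of how those one-click votes can be rolled up into a per-article helpfulness score. The article slugs and the vote data are hypothetical, and the tally logic is an illustration, not KnowAll’s actual implementation:

```python
from collections import defaultdict

# Hypothetical one-click feedback events: (article slug, vote) pairs.
feedback = [
    ("reset-password", "up"),
    ("reset-password", "down"),
    ("reset-password", "down"),
    ("install-theme", "up"),
    ("install-theme", "up"),
]

def helpfulness_scores(events):
    """Return {article: (upvotes, downvotes, helpful_ratio)}."""
    tally = defaultdict(lambda: [0, 0])  # article -> [upvotes, downvotes]
    for article, vote in events:
        tally[article][0 if vote == "up" else 1] += 1
    return {
        article: (up, down, round(up / (up + down), 2))
        for article, (up, down) in tally.items()
    }

print(helpfulness_scores(feedback))
# "reset-password" scores 0.33 helpful, a strong rewrite candidate
```

Sorting articles by the helpful ratio gives you a simple rewrite priority list.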
2. Use built-in insights from your support content solution
After user feedback, the second most straightforward and data-backed approach to measuring the performance of your support content is to use the analytics of your support content solution.
Self-service solutions like our KnowAll knowledge base theme come with a full-blown analytics dashboard that calculates key performance statistics for the entire knowledge base.
Here are some of the insights that KnowAll offers:
- Self-support content to contact ratio: This metric tells you when your support content failed to resolve an issue and the user had to raise a support ticket or contact you. (Knowledge Base records all such instances as “Transfers”.)
In your analytics dashboard, you can view the transfer percentage for each knowledge base article. With this metric, you can identify all the poorly written articles that lead to tickets.
- Search analytics: Sometimes a user’s query isn’t addressed anywhere in the knowledge base. Measuring such instances helps you discover the topics users typically struggle with but can’t find help on.
Knowledge Base records these as queries with “NULL” results and reports them under the search tab. Since null queries are reported along with the number of times they’ve been searched for, you can decide whether a topic has generated enough of them to warrant new help content.
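If your search solution only gives you a raw log of zero-result searches, the same prioritization can be sketched in a few lines. The queries and the threshold below are hypothetical assumptions, not values from any particular product:

```python
from collections import Counter

# Hypothetical log of in-site searches that returned zero results ("NULL" queries).
null_queries = [
    "export invoices",
    "cancel subscription",
    "export invoices",
    "export invoices",
    "dark mode",
]

THRESHOLD = 3  # assumed cutoff: write an article once a topic hits this many NULL searches

def topics_needing_content(queries, threshold=THRESHOLD):
    """Return (query, count) pairs frequent enough to justify a new article."""
    counts = Counter(queries)
    return [(q, n) for q, n in counts.most_common() if n >= threshold]

print(topics_needing_content(null_queries))
```

Only “export invoices” crosses the threshold here, so that topic would go to the top of your content backlog.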
- Feedback analysis: The feedback analysis measures overall user feedback. It factors in the upvote/downvote data and the other detailed feedback. (This is the same user feedback we talked about in the first section.)
The benefit of using a solution with built-in analytics is that it eliminates the guesswork from measuring your self-help content’s performance.
While these metrics tell you exactly how users interact with your content, there are a few more stats worth tracking with a site traffic monitoring tool like Google Analytics or Google Search Console.
3. Dig into Google Analytics/Search Console
Even if you don’t have a knowledge base management system in place, you can still measure the effectiveness of your self-service content using Google Analytics and Google Search Console.
As the first part of this evaluation, you need to identify if you’ve covered all the issues that your users need help with.
There are various ways to do this, but I recommend the following method:
Step #1: Access your Search Analytics Report
If you’re already logged into your Google account, you can open the report directly from Search Console.
Step #2: Click on the “Clicks” grouping criteria and select “Queries”.
Step #3: Download the list of queries and the associated clicks data.
Now sort these queries and identify the ones about support. For example, look for queries like “your product name + reset password”. Such a query shows that a user is stuck on resetting their password. Note all the queries where users seem to be seeking help.
Next, identify the queries that get zero clicks. Zero clicks may mean you haven’t created help content around those topics, or that your self-service content isn’t optimized for those queries.
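The sorting and filtering steps above can be sketched with a short script. The CSV content and the support-keyword list are hypothetical placeholders; swap in your real Search Console export and your own product terms:

```python
import csv
import io

# Hypothetical CSV export from Search Console's Queries report.
csv_export = """query,clicks
acme reset password,0
acme pricing,42
acme how to install,0
acme review,7
"""

# Assumed markers that a query is a support/help query.
HELP_KEYWORDS = ("how to", "reset", "error", "fix", "install")

def zero_click_support_queries(raw_csv):
    """Return support-looking queries that earned zero clicks."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    return [
        row["query"]
        for row in rows
        if int(row["clicks"]) == 0
        and any(keyword in row["query"] for keyword in HELP_KEYWORDS)
    ]

print(zero_click_support_queries(csv_export))
```

Each query this returns is either a missing help article or an existing one that isn’t ranking for the phrasing users actually type.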
Alternate Method: Log in to your Google Analytics account and go to Acquisition > Search Console > Queries.
Once you’ve identified the help content topics, your next step is to determine whether your users find the content relevant and easy to follow.
The average time on page metric is a good indicator of whether users can follow your support content. Low average time on page values might indicate that your support content is either off the mark or too overwhelming.
For all the support queries you identified in the above section, you now need to look at the average time on page metric for the associated landing pages.
To do so, go to Behavior > Site Content > All Pages
If you don’t use a solution that comes with smart insights, you may want to further analyze user behavior on your support pages. For example, you might want to see which page they visit when they leave your support content.
To find out if they visit your support ticket or contact page from the help article page, click on the Secondary Dimension > Behavior > Next Page
For articles whose next page is almost always the ticket or contact page, analyze whether they can be improved. Better articles will help users resolve issues themselves, and the number of tickets around those topics will drop.
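This next-page check is essentially a do-it-yourself version of the “transfer” metric from the previous section. Here is a minimal sketch over hypothetical navigation data; the article paths and the set of contact pages are assumptions for illustration:

```python
from collections import Counter

# Hypothetical (help article, next page) pairs from GA's Next Page dimension.
navigation = [
    ("/help/reset-password", "/contact"),
    ("/help/reset-password", "/contact"),
    ("/help/reset-password", "/help/install-theme"),
    ("/help/install-theme", "/"),
]

# Assumed pages that mean the article failed and the user escalated.
CONTACT_PAGES = {"/contact", "/submit-ticket"}

def transfer_rates(pairs):
    """Return {article: share of exits that went to a contact page}."""
    totals, transfers = Counter(), Counter()
    for article, next_page in pairs:
        totals[article] += 1
        if next_page in CONTACT_PAGES:
            transfers[article] += 1
    return {a: round(transfers[a] / totals[a], 2) for a in totals}

print(transfer_rates(navigation))
```

An article sending two out of three readers to the contact page is failing at self-service; one sending none is doing its job.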
We have also explored some effective knowledge base articles and what you can learn from them; they can help you spot errors in your own knowledge base articles.
Conclusion
For a well-rounded evaluation of your support content, define a strategy that combines input from your users (user feedback), insights from your support content solution (built-in analytics), and the broad-level picture that a tool like Google Analytics shows.
Once you have data from all the above sources, you can identify:
- The content that’s poor and mostly leads to support requests (which needs complete rewriting)
- The content that’s missing (and which generates enough search queries to be covered)
- The content that’s not great (which the users rate badly and find confusing but can be improved with basic rewriting)
As you improve your support content, go back to your solution’s dashboard and Google Analytics and see how your improvements are impacting the overall support content performance.
If the numbers improve, your tweaks are working; if not, keep revising.
Which of the above methods do you use to measure your help content’s performance?