Feedback from our customers is vital to us. It enables us to understand how effective our service delivery is, to spot where improvements can be made and to identify the factors that make our service great.
We use a Net Promoter Score (NPS) to gather feedback from all of our clients. Our NPS survey asks clients how likely they would be to recommend us as a service provider, indicating how positively they feel about us in general. We also ask them to rate each support request for quality and speed.
This approach gives us a fantastic insight into how well we're doing both tactically and strategically. A client may have had a great experience with our support team but still give a passive NPS, or vice versa: they may give us a great NPS even though a particular support request wasn't handled as well as it could have been. Either way, the NPS lets us know whether we are on the right track.
For some time now we've had a Net Promoter Score in excess of 50. Other leading hosting companies proudly market scores in the 40s.
The score itself is calculated by taking the percentage of promoters (those scoring 9 or 10) and subtracting the percentage of detractors (those scoring 0 to 6). One school of thought holds that any score above zero is good. Another holds that culture plays an integral role in how highly clients score companies: positive scores are much harder to earn outside of the USA, and in the UK and Europe passive scores (7 or 8) are more likely to be given even when great service has been delivered.
It is worth pointing out that passive scores add nothing to your NPS, although they still count towards the total number of responses. It could be argued that positive scores in the UK represent exceptional or outstanding service!
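The calculation described above can be sketched in a few lines of code. This is a minimal illustration of the standard NPS formula, not UKFast's actual tooling; the function name and sample data are hypothetical.

```python
def nps(scores):
    """Compute a Net Promoter Score from a list of 0-10 ratings.

    Promoters rate 9-10, detractors rate 0-6; passives (7-8)
    count towards the total but add nothing to either side.
    """
    total = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / total)

# Hypothetical month: 78% promoters, 8% detractors, 14% passives
sample = [10] * 78 + [3] * 8 + [7] * 14
print(nps(sample))  # 70
```

Note how the 14 passive responses dilute the promoter percentage without subtracting anything, which is why a score above 50 is considered strong.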
At UKFast we take feedback very seriously, inviting clients to ‘rate us’ after every support request rather than just once or twice a year.
Every month we receive hundreds of responses from our customers. In January alone we received almost 600 scores. What's particularly interesting is that if we looked at that month's feedback alone, with 78% rating us positively at a 9 or 10, our NPS would have been closer to 70. But before we get carried away, we prefer to calculate our NPS properly, incorporating the scores from every customer regardless of when they were provided. What this does tell us, however, is that our approach is working and we are improving.
We pay as much attention to the passive feedback as we do to negative feedback. We look for trends and for root causes. If we receive feedback that suggests we were slow, we look at resourcing and skill-levels within the team responsible for that customer. If we see negative feedback suggesting quality issues, we look for skills gaps and training opportunities.
Even where the feedback is passive, we look at each request critically, reviewing the background to see where we could have done better. In most cases we contact our customers to let them know what we are doing to improve and how their feedback is helping.
This approach may sound like an arduous and intensive task for an army of quality and service managers, but that is not the case at all. Having sustained a commitment to this approach over a number of years, we now receive negative feedback from only a tiny percentage of our customers, and only a slightly larger percentage give passive feedback.
For us, feedback is about being thorough and about our commitment to seeking out opportunities to continuously improve.