I analyzed reviews of top survey platforms to test whether Keatext could surface business insights from user feedback. My goal in this post is to show that insights from unstructured text can be quickly translated into a customer experience strategy.
UPDATE: If you want to do a similar project, check out the tutorial post.
Survey Websites Project Overview
Working on a project like this for the first time, I was amazed at how much there was to learn from user reviews. Each one tells an individual story, but together they give a very clear picture of where survey platforms should focus.
- Time is the most essential component that contributes to a good user experience.
- The payment systems need to be flawless when processing reward money.
- Expectations need to be clear from the beginning (length of survey, length of payment processing time).
I am a first-time user of our data analytics platform and therefore not a specialist. However, our software made it easy to draw some top-level insights that are applicable to improving the user experience.
The data set I used has 3.3k entries, a large enough sample to draw meaningful patterns, and represents feedback about 14 top incentivized survey platforms. Each review has a structured component (a five-star rating) and an unstructured component (the review text). I won't be naming any platforms because the aim is to get a general view of the landscape.
What can we learn from user reviews?
I wanted to do this as scientifically as possible, so I wrote down some ideas I had before looking at the data. It’s important to check assumptions. Here is what I started with:
- Users only write reviews when they want to complain
- Users appreciate surveys that reflect their interests
- Complaints would mainly have to do with money
Let’s see how I did!
Users only write reviews when they want to complain: FALSE
Isn't this a reasonable assumption to have? Reviews are a good venting place, and if your review might hurt the company that upset you, even better (or is it just me?). "No sane professional would pay attention to these bloodthirsty users" turns out not to be a valid assumption.
Have a look below:
We can see that the data I collected is well balanced overall: a 2.92-star average sits right around the midpoint, which is not the score of a pure venting space. This validates the value of the data and means it will give a balanced account of what companies are doing well and what they can improve on.
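A balance check like this takes only a few lines. Here is a minimal Python sketch; the ratings list is a small hypothetical sample, since the actual 3.3k-review dataset isn't shared in this post:

```python
from collections import Counter

# Hypothetical star ratings standing in for the real dataset.
ratings = [5, 1, 4, 5, 2, 1, 3, 5, 4, 1, 2, 5]

average = sum(ratings) / len(ratings)
distribution = Counter(ratings)  # how many reviews at each star level

print(f"average rating: {average:.2f}")
for stars in sorted(distribution):
    print(f"{stars} stars: {distribution[stars]} reviews")
```

If the distribution were dominated by one-star reviews, the "venting space" hypothesis would hold; an average near the midpoint with mass at both ends suggests genuinely mixed feedback instead.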
My next two hypotheses had to do with things users don't like and things that upset them. Three topics dominate: survey, site and money.
I chose to focus on "survey" and "money" and to exclude "site" from my analysis. "Site" mostly relates to brand perception, while I am looking for practical insights on ways to improve the user experience.
Users appreciate surveys that reflect their interests: MAYBE
The data hints that matching interests is an important aspect of the customer experience, but it is not the main factor. Any second guesses? Hint: it's not money. I looked at the overall topics that show up in positive mentions.
We can see that users use "site" to mean the company itself. I didn't spend much time here since this term is linked to overall brand perception, and I'm looking for actionable business insights.
Based on the sentiment bubbles we can see that users like it when surveys are easy, good and interesting. While obviously positive, they don’t mean much in themselves. This is where I chose to look at some of the actual reviews that use these words. Keatext already highlighted the positive and negative expressions for me so that I don’t need to spend too much time looking for them.
A lot of reviews mention "quick and easy", "fast and easy" and other variations: ease is consistently paired with speed. People respond well to clear surveys that don't take too long to complete.
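Spotting recurring phrases like these can be sketched with a simple word-pair count. The reviews below are hypothetical examples, not records from my dataset:

```python
import re
from collections import Counter

# Hypothetical positive reviews illustrating the "quick and easy" pattern.
positive_reviews = [
    "Quick and easy surveys, paid out fast.",
    "Fast and easy to complete, very clear questions.",
    "The surveys were quick and easy and quite interesting.",
    "Easy site, quick and easy surveys every day.",
]

def bigrams(text):
    """Yield consecutive lowercase word pairs from a review."""
    words = re.findall(r"[a-z]+", text.lower())
    return zip(words, words[1:])

counts = Counter(pair for review in positive_reviews for pair in bigrams(review))
print(counts.most_common(3))
```

A dedicated tool does far more than this (sentiment, negation, synonyms), but even a raw frequency count surfaces how often ease and speed travel together.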
Another word I found intriguing is “interesting”. What makes a survey interesting? I once again looked at records to get an idea.
The above example offers some hints. The point about expectations and promises is very important: transparency seems key to this customer's experience. Another valuable point is "my inputs will make a difference", which shows that people love providing companies with useful feedback. This may support my hypothesis about matching interests, but further testing would be needed.
We can see from the above example that the most frequent positive mentions of money pair it with "quick". Users want to receive their rewards quickly, and this dramatically outweighs amount qualifiers such as "good" or "decent".
Complaints would mainly have to do with money: FALSE
Once again I intuitively guessed an aspect of the complaints, but once again it's not the main issue. The same three words top the list, in a different order: survey, site, money.
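Tallying which topic keywords appear in negative reviews can be sketched as a simple keyword scan. The reviews and keyword list below are hypothetical placeholders for what Keatext extracted automatically:

```python
from collections import Counter

# Hypothetical one-star reviews; topics mirror the top terms from the analysis.
negative_reviews = [
    "The survey kicked me out halfway through.",
    "Money took weeks to arrive and support never answered.",
    "Site kept crashing, survey after survey disqualified me.",
    "Never got my money, the payment just disappeared.",
]
topics = ["survey", "site", "money"]

topic_counts = Counter()
for review in negative_reviews:
    text = review.lower()
    for topic in topics:
        if topic in text:
            topic_counts[topic] += 1  # count reviews mentioning the topic

print(topic_counts)
```

Keyword matching misses synonyms and context ("payment" counts as money in practice), which is exactly the gap a text analytics tool fills, but the ranking logic is the same.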
The word problem can refer to many different things. Here’s an example:
We can see that the user had a problem with the payment, but thanks to customer service it was fixed, resulting in a neutral review. Other mentions of problems relate to the volume of surveys available or users not being eligible.
Like a mirror image of the positive feedback (isn’t data amazing?), we see that money was slow to arrive or there was a problem.
Wrapping up the feedback
We learned that users appreciate when a survey is fast and easy to complete. Questions need to be clear, short and simple. Consider adding more open-ended questions and analyzing them with a text analytics solution, as I have done in this post. This leaves it up to customers to write about what matters to them, resulting in a better experience. We also saw that users like giving meaningful insights that help companies succeed; their free input can surface features or problems that would otherwise stay invisible.
Make sure to fully utilize user profiles to hand-pick relevant surveys. This will make surveys more enjoyable and keep users engaged until the end. Some users also mentioned appreciating clarity and transparency; A/B testing can help check whether this holds generally and what it means to different customers. Are you stating the incentives clearly? Is the completion time accurately measured?
It seems that both the complaints and the praise were not about the amounts of money but mostly about the speed of the transactions. Making sure the payment platform performs well is key to user satisfaction.
We saw above that the three hypotheses I started with were not necessarily right. It’s important to let the customers and users say what’s important to them. Using a text analytics tool can enable you to discover their needs quickly. Another key takeaway was that analyzing both the positive and the negative data conveyed the same ideas. At a statistical level it was clear that most users need the same things; companies just need to listen.
Finally, if any of the features I used in this post could help with your customer experience strategy please book a demo with Jay, he would be happy to show you around.