Note to the reader:
This part of the field guide comes from our 2019 version of the UX Research Field Guide. Updated content for this chapter is coming soon!
Collecting user feedback continuously can help inform future UX research studies, improve product designs, and speed up iteration cycles.
And one of the best ways to collect this feedback is with surveys.
To help you not only collect survey data, but also act on it, we’ve rounded up the best tips, tools, and examples from experts around the web.
Ongoing user feedback surveys (such as NPS, CSAT, or CES—more on these in the next section) are a method for collecting information directly from users or customers about their product experiences, preferences, and usage. These surveys are a continuous, always-on process, allowing teams to keep a pulse on customer satisfaction and respond to user needs as they arise.
The value of regularly collecting customer feedback can't be overstated—it's what drives every decision in your company. As Product Designer Noah Shrader says:
"User feedback in customer-centric companies is the fuel that drives every internal working part. Every process and every business decision is powered by a deep understanding of who the user is, what it is they're needing to do and ensuring they can do it successfully."
Collecting this feedback continuously can also help speed up and improve iteration cycles in continuous delivery models. In the research study “How do Practitioners Capture and Utilize User Feedback during Continuous Software Engineering?”, the authors concluded:
“We conclude that a continuous user understanding activity can improve requirements engineering by contributing to both the completeness and correctness of requirements.”
Net Promoter Score segments users based on how likely they are to recommend your product to a friend. It’s a simple way of assessing whether your experience is good enough to spread by word of mouth, which counts for a lot. Users will often also see an option to explain why they scored the way they did. A little quant, a little qual. Nice.
The Customer Satisfaction Score measures customer satisfaction by asking users to rate their experience based on a predetermined scale. CSAT is simple and easy to use, but since the question is so broad, the reason behind the responses can be hard to decode. Still, in some ways it is a more direct question than NPS and can help gauge overall satisfaction in a more straightforward way.
Customer Effort Score measures how much effort it takes for users to complete certain tasks, such as contacting support to resolve an issue.
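All three scores above boil down to simple arithmetic. Here’s a minimal sketch in Python, assuming a 0–10 NPS scale, a 1–5 CSAT scale where 4 and 5 count as satisfied, and a 1–7 CES scale (these thresholds are conventional, not universal):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

def csat(ratings):
    """CSAT: % of respondents choosing 4 or 5 on a 1-5 satisfaction scale."""
    return round(100 * sum(1 for r in ratings if r >= 4) / len(ratings))

def ces(ratings):
    """CES: average self-reported effort on a 1-7 scale."""
    return round(sum(ratings) / len(ratings), 1)

# nps([10, 9, 9, 8, 7, 7, 6, 5, 10, 3]) → 10 (4 promoters, 3 detractors, 10 responses)
```

Most survey platforms compute these for you, but knowing the formulas makes it easier to sanity-check dashboards and to recombine raw responses by segment.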
Website intercept surveys are essentially modals or similar pop-ups that appear at key points in the user journey to assess sentiment. This may seem like an annoying addition to your site, but when implemented properly, they can be relatively frictionless for your user, and provide key ongoing feedback for you.
Email surveys are surveys that you send, well, via email. You can deliver NPS, CSAT, or CES surveys using email, or you can send more in-depth qualitative surveys.
Ideally, email surveys allow respondents to answer questions embedded directly in the email; this offers the cleanest, easiest user experience. However, you can also collect responses by asking respondents to reply to the email or by including a link to a survey on an external site or platform.
Like any research method, user feedback surveys come with their own pros and cons. It’s up to you to determine the situations in which they’re most appropriate for your team.
“Through continuous user feedback loops, you are ensuring at every point that you are working on something useful and valuable to customers. This reduces risk and cost throughout the development cycle massively.” – Kerstin Exner on The Product Manager
The main benefits of user feedback surveys for UX research include:
“It is too easy to run a survey. That is why surveys are so dangerous. They are so easy to create and so easy to distribute, and the results are so easy to tally. And our poor human brains are such that information that is easier for us to process and comprehend feels more true. This is our cognitive bias. This ease makes survey results feel true and valid, no matter how false and misleading. And that ease is hard to argue with,” Erika Hall writes.
The main risks of user feedback surveys include:
Ongoing surveys are a great way to collect user feedback post-product launch. Here are some best practices for user feedback surveys to help you get started from scratch (or more likely build on systems you may already have running within your organization):
As always, the best place to start is with your high-level goals.
Consider the overall goals of your product, business, and users—goals that have probably been identified in prior research.
Do you want to:
Ongoing customer surveys can help you track important metrics and collect qualitative feedback related to how your users feel about the experience. Knowing your goals ahead of time will help you align your efforts with those goals and ask the right questions—to the right people, at the right time, with the right survey methods.
Whether your research team acts as a service arm at your organization or works hand-in-hand with the product team to make decisions, it’s essential that you get the right people on board from the beginning.
Gaining the support and participation of the right people makes it easier to execute your ideas and make the right decisions about product changes. Collaborate with key stakeholders and internal team members to assess the biggest priorities and determine how to address ongoing survey feedback in general.
As we mentioned earlier, there are many different types of survey methods you can choose from, including:
Many organizations prefer Net Promoter Score (NPS), which divides your users into promoters (those who would recommend the experience), passives (those who are neutral), and detractors (those who have substantial problems).
However, NPS can easily fall into the camp of “vanity” questions, or those that tell you little about users’ needs or any real insight into their product experiences—so don’t just choose NPS because it’s the most popular. Make sure the survey assessment you choose is intentional, based on your goals, and allows you to ask meaningful questions.
As with many things in user experience, when it comes to deploying an ongoing user feedback survey, friction is your enemy. You have to make it easy for users to give feedback.
Be sure to:
If you’re looking for more qualitative feedback than 2 open-ended questions will allow, consider running continuous user interviews instead.
Writing great answer choices is just as important as asking great survey questions.
There are several different types of response options for surveys, including:
Choosing the right survey answer option depends on the type of question you’re asking and the type of data you’re hoping to collect in return. If you want to explore more survey answer types and when to use them, check out this great article from SurveyMonkey.
As you’re creating your survey, you may have the option of making survey responses anonymous. There are pros and cons to anonymized surveys, and the best option (as always) depends on your survey goals.
As a rule of thumb:
Is there an ideal moment in your user journey where it makes sense to ask for feedback?
As a general rule, the best time to ask for feedback is while the experience is still fresh in the user’s mind. Depending on your goals, you might deliver surveys:
For example, it makes sense for Google Maps to ask for a restaurant rating shortly after you’ve added the restaurant as a destination in the app or for Uber to prompt a driver rating right after a ride.
Whatever you do, though, do not ask for feedback:
Next comes where to deploy your survey. Your options may be limited to the survey tools you’re using, but here are a couple things to consider when you decide where to deploy surveys.
If there’s a key moment to get feedback when someone is actively using your web or mobile app, ask them directly in the app through a modal or similar experience. If they miss a prompt, you could follow up with an email, judiciously.
Email-based support conversations are a natural fit for email feedback follow-ups. In the same vein, a chat or messenger conversation can easily end with a chat or messenger feedback request.
If you’re sending proactive surveys (as opposed to those triggered by user behavior), try to send them in the channels your users prefer.
Email is a classic choice here! But for your users who aren’t subscribed to email or are more responsive via other channels, you may find chat, in-app messaging, or push notifications to be more appropriate. Keep in mind push is likely your most aggressive option, and you should closely watch your negative KPIs—like opt-outs—when using it to request feedback.
Nothing messes with survey results quite like having to edit a survey post-launch.
Make sure your survey works end-to-end and is free of errors by testing it with your team before you send it live.
You can also download these free survey quality assurance checklists to use as reference.
As Roberta Dombrowski, former VP of Research at User Interviews, and Chris Lee, Product Designer at Sprig discuss in this webinar, the best way to know if your designs will resonate is by testing early, often, and with the right users.
If you can, automate the delivery of these early surveys to save time and effort. Depending on the platform(s) you’re using for your survey(s), your ideal state may be seamless, or may take a little clever Zapier-ing, but you should be able to get these kinds of ongoing surveys to a largely autopilot state, freeing you to spend more time uncovering insight and making better product and business decisions.
And please make sure you’re not bombarding your users. Look at your ongoing survey program holistically, and take advantage of any frequency capping or other options available through the platforms you're using to make sure you aren't overwhelming anyone.
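On the frequency-capping point: even if your platform doesn't offer it, the underlying logic is simple to sketch yourself. Record when each user was last prompted, and suppress any prompt inside a cooldown window. A minimal illustration in Python (the 30-day window and in-memory store are assumptions; a real system would persist this per user):

```python
from datetime import datetime, timedelta

COOLDOWN = timedelta(days=30)   # illustrative cap: at most one survey per user per 30 days
last_surveyed = {}              # user_id -> datetime of last prompt (would live in a database)

def should_prompt(user_id, now=None):
    """Return True (and record the prompt) only if the user is outside the cooldown window."""
    now = now or datetime.utcnow()
    last = last_surveyed.get(user_id)
    if last is not None and now - last < COOLDOWN:
        return False
    last_surveyed[user_id] = now
    return True
```

Because the cap is global per user rather than per survey, it also protects against multiple teams (support, marketing, product) each prompting the same person in the same month.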
This may seem simple enough, but in many organizations, surveys may be owned by a variety of teams like support or marketing, so you’ll need to work across teams to make sure your surveys are implemented in such a way that gives you valuable feedback without annoying users.
The thing about ongoing listening methods is that they’re running all the time, so when should you stop and analyze what’s happening?
Again, it depends on your goals and the context of the survey.
If you’ve rolled out a brand new experience for the first time, you should be checking in very frequently, whatever that means for you. If you haven’t released anything major recently, you might review on a less regular cadence.
Many services now integrate with email or Slack, so you can stay on top of the day-to-day somewhat passively, while doing a deeper dive on a more set, less frequent cadence.
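One way to support that rhythm is to bucket responses by week, so trends stand out during your periodic deep dives. A sketch, assuming NPS responses stored as (date, score) pairs (the data shape is an assumption; adapt to however your survey tool exports results):

```python
from collections import defaultdict
from datetime import date

def weekly_nps(responses):
    """responses: iterable of (date, score) pairs; returns {(iso_year, iso_week): nps}."""
    buckets = defaultdict(list)
    for day, score in responses:
        buckets[day.isocalendar()[:2]].append(score)  # group by ISO (year, week)
    trend = {}
    for week, scores in sorted(buckets.items()):
        promoters = sum(1 for s in scores if s >= 9)
        detractors = sum(1 for s in scores if s <= 6)
        trend[week] = round(100 * (promoters - detractors) / len(scores))
    return trend
```

Plotting this weekly trend alongside release dates makes it easy to spot whether a recent change moved the needle.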
As the feedback rolls in, you might identify quick wins in the form of bugs or small usability issues that can make a big difference to the user experience.
Building processes and relationships to turn this insight into action is critical to the success of your ongoing user surveying initiatives.
Make sure to document how the changes you've made based on ongoing survey feedback have improved the key metrics you're tracking, both for customer satisfaction and for broader business goals (like retention or revenue).
The insights you discover from user feedback surveys can illuminate key questions or areas of exploration for future UX research, and vice versa. Pairing surveys with other user research methods can help you develop a more nuanced understanding of your customers, ultimately leading to better products and experiences.
You might also pair survey data with feedback from:
One way of easily accessing and aggregating this data is to develop what Sachin Rekhi, founder and CEO of Notejoy, refers to as a “feedback river,” or an open channel for anyone to get direct access to primary feedback on the product from across various teams and channels:
“This has typically taken the form of an internal company mailing list in Gmail or Outlook, but I’ve also seen it as a feedback channel in Slack or HipChat. I typically require all product managers, designers, and engineering leads to be on the list, but encourage anyone across R&D, marketing, sales, customer service, and more that’s interested to subscribe as well. The list usually has open write access as well for any internal stakeholder to contribute.
All feedback that is gathered across the various feedback sources is then encouraged to be shared in a reasonable aggregate form on this channel. For example, let’s say the product team conducted a set of customer interviews. They are encouraged to provide both links to interview recordings as well as summarized feedback on the channel. As another example, the customer support team usually has a designated person who sends a weekly customer feedback report on the channel with details of top issues that customers have been facing as well as links to reports for further details.”
This is a great way to start identifying patterns in user insights across your various feedback channels, including surveys.
The best user feedback survey questions are always the ones that help you answer your key questions and reach your particular goals—but if you’re looking for inspiration, here are some great templates, examples, and samples of great survey questions for collecting ongoing user feedback.
Your options for collecting user feedback in the form of continuous surveys are vast. Some of the most popular tools for continuous surveys include:
In some cases, you might also consider using conversational tools like Drift or product marketing tools like HubSpot to implement a continuous survey practice.
Continuous feedback surveys are a great way to keep a pulse on user sentiment at every touchpoint throughout the user lifecycle.
By choosing the right type of feedback survey and implementing it carefully and efficiently according to your goals, you can gain a deep and adaptable understanding of your users. Ultimately, continuous surveys done right can help speed up product iteration cycles, leading to better, more competitive products and highly loyal customers.