Designing Customer Feedback Surveys

By Janet M. Six

Published: October 22, 2012

Send your questions to Ask UXmatters and get answers from some of the top professionals in UX.

In this edition of Ask UXmatters, our experts discuss how best to design and utilize customer feedback surveys to obtain data from customers.

In my monthly column Ask UXmatters, our panel of UX experts answers our readers’ questions about a variety of user experience matters. To get answers to your own questions about UX strategy, design, user research, or any other topic of interest to UX professionals in an upcoming edition of Ask UXmatters, please send your question to: ask.uxmatters@uxmatters.com.

The following experts have contributed answers to this edition of Ask UXmatters:

  • Steve Baty—Principal of Meld Studios; President of IxDA; UXmatters columnist
  • Caroline Jarrett—Owner and Director at Effortmark Limited; UXmatters columnist
  • Whitney Quesenbery—Principal Consultant at Whitney Interactive Design; Past-President, Usability Professionals’ Association (UPA); Fellow, Society for Technical Communication (STC); UXmatters columnist

Q: Do you ever use customer feedback surveys? What questions do you ask? What weight do you give the responses in measuring the success of a product?—from a UXmatters reader

“If you want to obtain useful data from a survey, you have to ask questions that people can answer meaningfully and that are phrased in ways that people can understand, and pose questions that they are willing to answer.”—Whitney Quesenbery

“If you want to obtain useful data from a survey, you have to ask questions that people can answer meaningfully and that are phrased in ways that people can understand, and pose questions that they are willing to answer,” replies Whitney. “Most of all, make sure that you find a way to let people tell you what they want to tell you—rather than restricting them to simply answering your questions. You might be surprised by what you find out!

“I like the qualitative data we get from asking questions about why a person has come to a site or used a product. For example, to get users’ help with categorization, you can pair an open-ended question with a closed question that includes a none of the above option, as long as you are sure that the categories you offer are meaningful to them!

“It can also be helpful to ask users to categorize themselves in terms of broad audience categories. I find this approach to gathering data more useful when doing research for specific sections of a site than for a home page. This is especially true if you are looking at a feature aimed at a particular market segment. You might be most interested in learning why people are using an application or site. Is there an audience or user story that you haven’t thought of?

“I know that marketing folks are really fond of quantitative responses to questions about how much people like a product and whether they would recommend it,” continues Whitney. “But, unless you have a very strong context, I tend to wonder whether those types of responses are of any value in themselves. However, if you are careful to ask the same questions of all research participants, the data can be very valuable as a comparative indicator of trends.”

Proper Surveys Versus Routine Feedback

“When conducting true surveys, companies survey a random sample of customers, asking them a specific set of questions and analyzing the results.”—Caroline Jarrett

“It seems that people use the term customer feedback survey in two different ways,” replies Caroline. “People use this term to refer to:

  1. routine feedback—When gathering routine feedback, companies offer a feedback form to every single customer, after each encounter.
  2. proper surveys—When conducting true surveys, companies survey a random sample of customers, asking them a specific set of questions and analyzing the results.

“A well-conducted, proper survey is a powerful thing, giving you robust data that you can really trust. As for routine feedback: well, an occasional customer might respond as if it were a proper survey and give you good-quality data. But regular, repeat customers see these surveys too often and so learn to ignore them, unless they particularly want to vent about something. Even occasional customers may have encountered similarly bad feedback-survey practices in other contexts and been trained to react the same way. Should you pay attention to these routine feedback surveys? Certainly. Anything a customer takes time to tell you is important data. Should you regard them as robust, trustworthy data? Definitely not.

“There is also a very sad phenomenon that is becoming more common: the use of customers’ routine feedback as a basis for rewarding—or even worse, punishing—staff who deal with customers. There are examples popping up all the time of staff shamelessly—or perhaps desperately—asking customers to give them good marks in a routine feedback survey, because ‘otherwise my job may be on the line.’ Is that the sort of impression you want your brand to give? I hope not.”

A Conference Feedback Survey

“Our questions were very specific, and we phrased them in ways that would encourage people to reflect on particular elements of the conference and rate just those elements.”—Steve Baty

Here’s an example of one type of customer feedback survey: a conference feedback survey.

“We use customer feedback surveys each year to obtain feedback on our UX Australia conference,” responds Steve. “When we set out to design the conference, we had some very specific objectives and results we wanted to achieve—around the content of the conference, the atmosphere, the scheduling of the program, and a variety of other facets. We made some very particular design decisions along the way in order to meet those objectives.

“A conference is a very difficult thing to prototype and test. So, in 2009, the first year of our conference, we needed to learn as much as we could about

  • whether we had successfully delivered on those design ideas
  • whether that resulted in a conference experience people really enjoyed

“As a result, our questions were very specific, and we phrased them in ways that would encourage people to reflect on particular elements of the conference and rate just those elements. Also, we didn’t ask questions about anything we weren’t really interested in knowing. Doing that would just make the survey longer and provide data we weren’t going to use. Plus, it would lower the response rate we would get from our customers.

“We sent the survey out fairly soon after the conference, but not during the event or even a day or so afterward. It went out a fortnight later. This allowed people time to digest the overall experience, get over the hype and excitement of being at the conference, or any extreme frustrations if they had them, and give us more balanced feedback. We were then able to fold those results in with our own observations and iterate our design for 2010.”

A message to our readers—Please participate in our forthcoming UXmatters Reader Survey. Every year in November, we conduct our annual survey to find out how we’re doing and what our readers want and need. This year’s survey is particularly important because we’re embarking upon a re-envisioning of the UXmatters Web site, and we’d like to do some user research to find out about your needs before making any decisions. Please participate in our survey and help us to ensure that our design and development efforts target real user needs. Thank you!—Pabini Gabriel-Petit, Publisher of UXmatters
