In this episode, we’ll hear from insight professionals at top brands and consultancies on their top tips for asking a good research question to participants as well as common mistakes and how to avoid them.
Stay tuned in the following weeks to hear the individual episodes with our referenced guests.
Harry Brignull, Head of UX Innovation at Smart Pension
Emma Craig, UX Research Manager at Shopify
Zoe Dowling, SVP of Research at FocusVision
Josh LaMar, Principal Researcher & Co-Founder at Authentique UX
- Email: email@example.com
- Web: http://www.authentiqueux.com
- LinkedIn: https://www.
- Facebook: https://www.
Find Jamin Online:
- Email: firstname.lastname@example.org
- LinkedIn: www.linkedin.com/in/jaminbrazil
- Twitter: www.twitter.com/jaminbrazil
Find Chueyee Online:
- Email: email@example.com
- LinkedIn: www.linkedin.com/in/chueyeeyang
- Twitter: www.twitter.com/chueyee15
Find Us Online:
- Twitter: www.twitter.com/happymrxp
- LinkedIn: www.linkedin.com/company/happymarketresearch
- Facebook: www.facebook.com/happymrxp
- Website: www.happymr.com
- “Clap Along” by Audionautix: https://audionautix.com
- Leading Questions – Yes Prime Minister: https://www.youtube.com/watch?v=G0ZZJXw4MTA
- Epidemic Sound: https://www.epidemicsound.com/
- Double-barreled questions: https://en.wikipedia.org/wiki/Double-barreled_question
- Dark Patterns: https://www.darkpatterns.org/
[00:02] Jamin Brazil: In this episode, we’ll hear from insight professionals at top brands and consultancies on their top tips for asking a good research question to participants as well as common mistakes and how to avoid them.
[00:25] Emma Craig: So, a really simple example would be, “How often do you picture yourself using this?” Maybe in the interview, you have exposed that this is something they are interested in, and they think it would be very helpful. It would ease all of these pains and challenges that they have. And then you want to say, “OK, well, how often do you think you would use it?” But people cannot give you a realistic idea about the future; they don’t know, they will make it up.
[00:54] Jamin Brazil: Thanks for tuning in! You’re listening to the Happy Market Research podcast. I’m Jamin Brazil, the show’s host. In this episode, we’ll hear from insight professionals at top brands including Shopify, as well as leading User Experience and Market Research professionals. If you are involved in consumer insights from either a practitioner or buyer’s perspective, this episode is for you. I’m joined by our Executive Producer, Chueyee Yang. Chueyee, how are you?
[01:20] Chueyee Yang: For some reason, I keep yawning and I don’t know why.
[01:47] Jamin Brazil: Before we get started, I want to give a big thank you to our sponsor, Lookback. This episode is brought to you by Lookback. Lookback is the leading software that enables researchers to interact with users, in real-time, and in context. Built from the ground up by some of the original Spotify engineers, Lookback is the best in class video screen share platform for User Experience and Market Researchers. Check them out at lookback.io.
[02:12] Chueyee Yang: Last bit of housekeeping. It would be amazing if you’d stop this episode… right now…and rate us on whatever app you use to listen. Great! Let’s get started. For this episode, we interviewed four research professionals who have thousands of research projects under their collective belts.
[02:31] Jamin Brazil: Sorry executives and heads of revenue, we gave the real researchers the spotlight for this one. Before we get into the elements of a good question, let’s talk about some of the biggest mistakes researchers make when composing a question.
Mistake 1: Leading Questions. Leading questions came up a lot. Here is Josh LaMar, a well-known User Experience Researcher.
[02:56] Josh LaMar: I think that the biggest mistake is to either ask a leading question or to frame it too narrowly first. We’ve talked about framing narrowly first, so I guess we could talk about leading questions now. Which are things like, “Tell me how amazing this product is.” That’s an over-exaggeration, but it can be much more subtle too. Like, if you’re only asking about the positive aspects of something, or you’re saying, “Oh, this is a really great feature, isn’t it?” Well, what did you just do there? You told the user - you primed them, number one, to say, “I like this feature” - and then you created this tag question: “Isn’t it? Don’t you agree with me? You should agree with me because I’m the smart one here.” You just made the user feel dumb, and then you also told them exactly what you want to hear. So what are they gonna do? They’re gonna tell you what you want to hear, because they want to make you happy. And it’s so important, as a researcher, to be very neutral and to ensure that you’re not letting too much of your own feelings ever come out. Because as soon as you start letting on like, “This is really dumb, isn’t it? Yeah, I don’t really use this, but we need to test this for our client. Can you just tell us that thing?” - you’re throwing out the whole study’s data if you do that, because it’s too leading. You don’t want to lead them on to the answer. The answer is what they think, not what you think.
[04:15] Jamin Brazil: All of our guests mentioned leading questions as one of the most common mistakes made when interviewing a participant. For kicks, let’s listen to a two-minute excerpt on leading questions from Yes, Prime Minister, a British TV series originally airing in the 1980s. This is a comical bit that serves as a clear example of the importance of being thoughtful about your study design and the questions you ask.
[07:04] Jamin Brazil: This excerpt is especially on point considering 2020 is an election year in the United States. As you are exposed to data from both sides, try to be thoughtful about processing it versus blindly adopting its implications. Mistake 2: Double-Barreled Questions. Connected to leading questions are double-barreled questions. Zoe Dowling, SVP of Research at FocusVision, talked about these. As described in Wikipedia, “It is committed when someone asks a question that touches upon more than one issue, yet allows only for one answer. This may result in inaccuracies in the attitudes being measured for the question, as the respondent can answer only one of the two questions, and cannot indicate which one is being answered.”
[07:49] Zoe Dowling: Double-barreled questions. How can you really answer that? You’re leading me into this response; I can’t respond to it the other way. Or it’s like I can’t respond to that at all - none of those apply - and we’re constructing these questions in a way that doesn’t allow for that. This is actually more of an issue on the quantitative side, because on the qualitative side you at least get to some sort of response, whether it’s what you want or not. People will give their opinion because it’s open-ended. Whereas in a closed-ended survey question, you’re dictating the whole frame of it: the question you’re asking and the responses they get. And it’s like, no, that doesn’t apply to me; you’re not getting to my opinion. I think those are some of the things you see frequently, and we’re all guilty of it, because you, the person that’s designing the instrument, are bound by your own parameters and how you’re viewing it and how you’re framing it.
[08:38] Chueyee Yang: Tip 1: Keep It Conversational. In addition to double-barreled questions, Zoe outlines the need for us to talk in conversational, human terms.
[08:57] Zoe Dowling: I think the fundamentals remain the same, whether you’re asking a question in a survey or constructing it for an interview - I mean, obviously there are some fundamental differences. The first thing is: are you going to be understood? Talk in everyday language. Too often we bring in the world that we’re in, be it the actual industry - we’ve got particular language, jargon, that we’re using. Or you might think that you need to be so incredibly specific that you end up creating a question so convoluted in the way it’s constructed that nobody is going to follow it. We read in headlines. So do our participants. They scan. In fact, very often in a survey they actually go straight to the answers to determine what the question was and how they’re going to respond. So it’s being clear. It’s being concise. And I think that works for both sides, qualitative or quantitative. Because if you’re qualitative, you’re gonna take the question and you can probe; you can go deeper and take it from there. But if you start with something that’s very convoluted, then you’re probably not gonna get to where you really wanted to go in the first place. That would be my overarching thought: we sometimes over-engineer our questions.
[10:07] Chueyee Yang: Keeping things simple can be the best way to connect the intent of your question to the participant. Here is an example. If we want to know how much people like my new electronic coffee mug that keeps the liquid hot, we could ask, “Most coffee makers produce a cup of coffee that is 170 degrees. After you pour the coffee into a mug or other preferred drink container of choice, how does the change in temperature over that coffee’s life cycle impact your enjoyment level?” Versus, “Thinking about your last cup of coffee, what did you think about it cooling off?”
[10:43] Jamin Brazil: I have programmed so many surveys that use the first option over the second. Part of the issue here is that surveys and discussion guides are often written by committee, and as my late grandfather used to say, a camel is a horse designed by a committee. Okay, let’s look at some more tips to frame a good question. Tip 2: Use Common Terms. In full disclosure, I made a big mistake in my first interview with Emma of Shopify. Here was the question I asked her: “Key elements of a good question?” The way I framed the question was ambiguous. This created confusion. She didn’t know what we were talking about. Were the elements specific to research objectives, or a question in a survey, or a question in a focus group, or a user experience study? It is so easy to assume your participants are starting with the same framework as you are. Nomenclature, colloquialisms, phraseology, mindset … these are just some of the things we need to think about.
[12:01] Chueyee Yang: Tip 3: Know Why You Are Asking Each Question. So, the updated question we asked was, “What are the key elements of a good participant question?”
Having only been in research for a year, I found this episode really useful, especially when listening to Harry Brignull’s interview. For those that don’t know, Harry is the UX specialist who first coined the term “dark patterns.” Dark patterns are tricks used in websites and apps that make you do things you didn’t mean to, like buying or signing up for something. Harry mentioned that in order to ask the right questions, you have to ask yourself why you’re running the study and what your goal is. After you answer that question, you can move on to creating a good question for participants.
Tip 4: Start Broad, Then Go Narrow. Another point Harry made was about starting diagnostic questions broad and then narrowing things down…
[13:02] Harry Brignull: I think it’s very easy to focus on the small details of the research. And researchers can feel very safe when they focus on small things, like the recruitment specification or the exact wording of the questions. But in my opinion, what defines good research - and it then sort of cascades into the questions - is the overarching research objectives. So what are you doing the research for in the first place? If you don’t get that right, the questions are inconsequential. And if you do get it right, the questions become much easier to write anyway. So what do I mean by that? Basically, it’s very common, particularly when you’ve got a new job or you’re a junior researcher, to have someone come along - for example, a product manager or product director or someone in management - and try to tell you the objectives in advance of what you should be doing your research on. And managers tend to be very feature-focused, so they’re probably going to be very specific and have a very narrow brief about the one thing that they care about at that point in time. So for example, imagine you’re a researcher, you’ve got a new job, and the team you’re joining has never done any user research. And your manager, or product owner, or whatever, comes along and says, “I want you to do some research on this particular dashboard that we’re building for [INAUDIBLE]. This dashboard is used by this one particular user type.” Let’s say you’ve got six user types and it’s used by one of them. So if you go and do that research, you’ll probably make that person happy. But you’ll still be kind of in the dark about the big picture. What about the other five user types? What about the broader user needs? What were the most worrying or the least understood things about the problems that your product is trying to solve for users? And besides, often these sort of senior manage-y type people don’t really know what good user research is anyway.
So really, like I was saying earlier, a lot of the job of the researcher is to teach the people around them how they can be engaged with in a constructive way, so they don’t get approached with very tightly defined research questions that are overly scoped, basically. So I’ve got a metaphor here. Think of your problem space as being like a dark cave. User research is a bit like a flashlight that shines a beam into the cave so you can see what’s going on. If you did go climbing or exploring and found a big dark cave, the first thing you’re going to want to do is shine your torch, your flashlight, around the cave to try to work out what’s in there. You’ll probably do it quite quickly, right, just to make sure that you’re safe and there are no big surprises, like a bear or something. And then once you’ve done that, you might take a more focused beam and shine it at something else. You might feel like, okay, we’ve covered all that, we’ve done our first pass. Now we can focus on that really exciting structure over there - the stalagmites and stalactites or something like that - and get really interested and focused on it. So I guess it’s a bit of a tenuous metaphor, but I think it’s really, really important to always start broad. Otherwise you can end up getting really deep into something and missing the point. Because human life is multilayered, and it’s always good to start out with the broadest possible view and then zoom in gradually, rather than zoom in first and kind of miss out on some big thing that you should be working on.
[16:14] Chueyee Yang: Similarly, Josh LaMar, the former research manager for Outlook, says…
[16:27] Josh LaMar: I think that the way that you frame a question is very, very important because you have to be at the right level. And what I mean by level is that, if you start off an interview by saying like, well, tell me how you check your email on the weekends, you’ve just scoped it so narrow and really, you might be interested in something else. I was the research manager at Outlook for several years, so I can use email as a really easy example of things that I’ve done research on a lot. So it’s really important to start very broad and then move into the specific. And an example of a broad question might be, tell me how you communicate with your friends and family, much more broad than just email. And then as you start getting into it, you’ll find more interesting things. The framing is so important because when you frame too narrowly, you put this box around the user. And the user thinks, I think that they want to hear just this part, and so they only share the things that are in that box. But when you add a broader box from the beginning, then everything else is open. And you might find something that’s even more interesting just by asking a broader question.
[17:30] Chueyee Yang: Harry’s framework of a cave is exactly how we should think about our research. When writing your discussion guide or survey, start with your assumptions and then get rid of them. The less you assume, the more you will understand the context of the participant and their opinions about your research topic.
[17:48] Chueyee Yang: In line with starting broad and then narrowing in on your research question, I loved the tactical example of how Emma Craig, of Shopify, breaks this point down.
[18:00] Emma Craig: I think good interview questions - these direct questions that you’re asking a participant or a respondent - start with your bigger question, your research question. And I don’t want it to get confusing here about what’s what. But before you can start to formulate your discussion guide and understand exactly what it is you want to ask these people when you’re face to face with them, you have to have your research question and your research objective in mind. So, the research question here is, essentially, seeking to understand why something is happening, or what is happening. You’re looking to uncover a process, or a need, or a challenge that someone is experiencing. So, an example would be, “What are the biggest challenges people experience when it comes to taking public transit?” And that would be your research question, from which you derive all of your interview questions. And you had a really good point about not asking these pointed, direct questions, because, half the time, people won’t actually know the answer or they won’t have the answer. But I have learned over the years that if you ask somebody a question, they will answer your question - whether they make it up, or they exaggerate, or whatever it might be. If you ask them something directly, they’ll give you a direct answer. And you can’t always be certain that it is true, or that they are not just telling you what they think you want to hear. So, your interview question is there for you to collect evidence, and you have to take different angles. You have to go sideways or, like you said, take the back door. If your research question is around the biggest challenges people experience when it comes to taking public transit, your interview question shouldn’t just be asking somebody if they like to take the bus; your interview question could be asking them to walk you through how they got to work last week.
And to kind of take these roundabout ways to understand the environment that you’re researching.
[20:04] Jamin Brazil: The context of the participant when consuming or experiencing the thing you are measuring is 100% vital. In the same way that brakes don’t matter without a car, we have to put our participants in the context of their consumption and then drill down. This is much harder than simply reducing your research to a Net Promoter Score or similar Likert scale. While NPS is far easier, it is less effective at uncovering hidden truths.
[20:29] Chueyee Yang: Tip 5: Protect Your Participants. It is also important to protect your research participants from your internal stakeholders.
[20:42] Harry Brignull: I remember once doing some research where you have the stakeholders in the room. And one of the stakeholders would rap his fingers on the table like this when the user didn’t answer the question. Yeah, we were doing some research on time tracking companies in Munich, because the tech was a stumbling piece of tech that you kind of had to be in the room to see working. So that didn’t go well. Basically, sometimes you need to keep the stakeholders far away. And I often find that - I know some researchers like to have a chat window open and let some people ask questions during the research. I absolutely will not abide that: as the researcher, they can all get lost. They can write notes and stuff and I’ll talk to them afterwards. But having that extra channel of input while you’re trying to run an interview is just mind-meltingly annoying.
[21:29] Chueyee Yang: Exactly. While Harry’s example sounds like it came straight out of an episode of The Office, this is a real issue. If I were a participant, I’d want the interviewer to treat our conversation like a date where I have their full attention… not me sitting across the table from someone who is swiping on their phone after asking me a personal question.
[21:53] Jamin Brazil: Tip 6: Leverage Past Behaviors to Inform Future Usage. Now, we are going to cover the importance of leveraging past behavior to inform future outcomes.
[22:05] Emma Craig: So, a really simple example would be, “How often do you picture yourself using this?” Maybe in the interview, you have exposed that this is something they are interested in, and they think it would be very helpful. It would ease all of these pains and challenges that they have. And then you want to say, “OK, well, how often do you think you would use it?” But people cannot give you a realistic idea about the future; they don’t know, they will make it up. Like I said, if you ask somebody a question, they will answer that question. But it probably won’t be true because they don’t know well enough if they’ll use something or if they’ll do something in the future. I think an example I use a lot is if somebody asked me what I was going to eat for lunch tomorrow, I can’t actually- I can give them a guess but I can’t actually tell them. But if they asked me what I have eaten for lunch every day this past week, I’ll give them a much better indication of what I might eat for lunch tomorrow or what my lunches look like. So, yeah, probably that, asking people to predict instead of basing the question in past behavior.
[23:08] Jamin Brazil: It is so easy, as a researcher, to ask a simple question around projected or expected usage and then extrapolate that for your client to use in their models. Tip 7: Run a Pilot Before Your Study. Before we end this episode, one of the tips that came up is from the old days of research. Prior to the internet, recruiting people for research was very, very hard and expensive. To avoid a project going into a FUBAR state, we’d often run a pilot project. This took a bit of time and cost about 10% of the total budget, but it served to ensure that we were asking the right questions in the right way to answer our research objectives.
[24:31] Chueyee Yang: In the next episode, we’re releasing the long-form interview with Emma Craig, UX Research Manager at Shopify.
[24:40] Emma Craig: The interviewer, I think, it’s easy to be unaware of the signals we send because they are so subtle. But just things like nodding your head, or shaking your head, or your facial expressions; every tiny, tiny movement, the other person, often unconsciously, is watching as a way of receiving feedback about how well they’re performing.
[25:01] Chueyee Yang: Happy Market Research is hosted and produced by me, Chueyee Yang and Jamin Brazil.
[25:08] Jamin Brazil: Special thanks to our referenced guests…
Emma Craig, UX Research Manager at Shopify
Harry Brignull, Head of UX Innovation at Smart Pension
Zoe Dowling, SVP of Research at FocusVision
Josh LaMar, Principal Researcher & Co-Founder at Authentique UX
To subscribe to the podcast, go to iTunes or check out the Happy Market Research website at happyMR.com. You can follow us on Twitter at @happyMRxP. Thank you for listening and see you next week.