My guest today is Zoë Dowling, SVP of Research at FocusVision.
Founded in 1990, FocusVision offers a technology suite that enables both qualitative and quantitative research.
Prior to joining FocusVision, Zoë served as an executive at Kantar and the US Census Bureau. Additionally, she was an Associate Lecturer at the University of Surrey.
Find Zoë Online:
Find Jamin Online:
Find Us Online:
“Clap Along” by Audionautix: https://audionautix.com
This Episode’s Sponsor:
This episode is brought to you by Lookback. Lookback provides the tools to help UX teams to interact with real users, in real-time, and in real contexts. It’s Lookback’s mission to humanize technology by bridging the gap between end-users and product teams. Lookback’s customers range from one-man teams building web and app experiences to the world’s largest research organizations, collectively ensuring that humanity is at the core of every product decision. For more info, including demos of Lookback’s offering, please visit www.lookback.io.
Jamin: Tell me a little bit about your parents, where you grew up, and how that’s impacting what you’re doing today.
Zoë: So my parents are philosophers, which obviously made for interesting dinner conversation. One of the big things that shaped exactly where I am today, and my strong interest in culture and technology, which of course I bring together with research and understanding people and life, is that when I was about five or six, we moved to South Africa. One of the interesting things about moving so far away from family in those times, in the ’80s, was that we didn’t have a landline at home. And even if we did, to actually make an international call, you had to go to one of the hotels. We lived in a very rural area; there were only two hotels, and we went to one of them to place a call with the international operator to speak to my grandmother on her birthday. So this was the most exciting event. Birthdays and Christmases, we would go to the local hotel to place this long-distance, international call. And it was incredibly exciting, but it brought this wonder of communication: wow, I’m speaking to somebody that’s 5,000 miles away. And you can look at how technology has changed. Think about that. In the ’90s, email: I remember I was in Scotland and my parents were in South Africa, and I was emailing them. It was like, this is incredible. I’m not waiting three weeks for this blue airmail letter. And then the first time that I ever did instant message chat, it was mind-blowing. That sounds crazy today. You think of young people growing up, and even young people in the workforce: the internet’s always just been there. Whereas I remember how pivotal it was to actually bridging that gap in communication. Which is a long-winded way of saying all of these experiences, and also the culture element, come into it.
I lived in a very, very different culture, so looking at South African culture, British culture, and now, having been in the US for 15 years, American culture, and how the world and technology all come together, is fascinating. And that’s what brought me to where I am today: all of these different ways of understanding the world. How do we understand people? Brands are a fascinating element within that, because you think of brands, you think of the products and services that have changed our lives, and how technology has driven that. So it’s about bringing it all together and continuing to dig into it and understand it. And also, from my more methodological background: how can we do that in better ways? How do we ask the right questions? How do we make sure that we’re getting the right answers because we’ve asked the right questions?
Jamin: And it does start with a question, right? You have an overarching objective for a study. Almost every survey I’ve written, I start with the objective right at the top in the title, and then try to control survey creep, or discussion guide creep, if something’s not attached to that objective, or if the direction of a conversation isn’t addressing it. We just don’t have time for nice-to-knows.
Zoë: Well, I think that’s a shift, though. If I think of when I joined the market research community, I joined Kantar 12, 13, 14 years ago, and I started in copy testing, which is still fun, actually. But there was this mindset of: we’re going in to speak to people, and this is our opportunity to get the answers, so let’s ask everything. Let’s throw in the kitchen sink, to be cliché about it. And I think it came from the fact that it took time to get that information. A copy test back in those days took four to six weeks, which is just unthinkable today. Can you imagine? You need to do an ad test, and it’s like, actually, we’d like it in four to six hours, never mind three to four days. So I think we’ve got this legacy of, this is our chance to speak to people, so why not just add that? What if that’s interesting? And I think that’s a mindset we still need to get out of. When our surveys are all mobile-friendly and around that ten-minute mark, perhaps we’ll have reached that point. But we’re a long way from it.
Jamin: Speaking of philosophy, Occam’s razor has always been one of my favorite frameworks for life in general. Any piece of correspondence I send, I always try to reduce the words and the content to the point where the intent is clear, and the rest is cut out just to get through the noise. With respect to survey design, that cuts both ways, because you can be too brief, thereby missing the real intent. At the same time, we’ve always been headline consumers, but even more so today; people simply don’t read. So what do you see as the key elements of a good interview question? And this can be framed in terms of qualitative or quantitative or both, however you want to think about it.
Zoë: I think the fundamentals remain the same, whether you’re asking a question in a survey or constructing it for an interview, though obviously there are some fundamental differences. The first thing: are you going to be understood? Talk in everyday language. Too often we bring in the world that we’re in, be it the industry’s particular jargon, or we think we need to be so incredibly specific that we end up constructing a very convoluted question. Is anybody going to read that? You’ve just said it: we read in headlines. So do our participants. They scan. In fact, very often in a survey, they actually go straight to the answers to determine what the question was and how they’re going to respond. So be clear, be concise. And I think that works for both sides, qualitative and quantitative. Because if it’s qualitative, you’re going to take the question and you can probe, go deeper, and take it from there. But if you start with something that’s very convoluted, well, you’re probably not going to get where you really wanted to go in the first place. That would be my overarching thought: we sometimes over-engineer our questions.
Jamin: So this is something I’ve never actually heard anybody say before. I wish there were a counterpoint person on the show right now, because it would be really fun to have the other side of the table represented. But respondents go straight to the answer and try to figure out what the question is. It’s so funny. I’ve literally been thinking this, but never said it out loud, for a few years now: those two or three or four lines of text that sit on top of your answer set inside a survey, nobody’s reading that, unless maybe you have a few bold or underlined words that stand out from the general text. People go right to the content of the answer choices.
Zoë: So I’ve been fortunate enough, I can back this up with data, I have to say. I’ve been in usability interviews. We had an extensive program when I was at Kantar, probably a good ten years ago, when we were trying to improve web questionnaires. We looked at a number of different things. Obviously, the technology and how it’s displayed on screen was an essential part of it, but we actually took time to think about how we’re constructing questions, what people are doing, and how we’re getting to understanding. So we did usability testing, and we also did some eye tracking. And what was fascinating, you could literally see it on screen, was that the dominant area where people’s eyes were going was the response options. There was only a very fleeting glance at the question. You can see this mapped out. And the longer your question is, the more daunting it is, and the less people read, because people don’t have time and they’re not invested in it. We could go all the way back to some of the theory around this. If you look at the psychology of survey response, keeping to the quantitative side, people go through comprehension, recall, judgement, and then response. Comprehension: what are you trying to ask me? Recall: where does my answer come from; how can I answer this? Judgement: what is the appropriate answer? And then your actual response. You’re doing that in split seconds; it’s all just part of your cognitive process, and we make it harder for people if the questions are extremely lengthy. Smaller screens amplify this grossly. Jakob Nielsen, people are familiar with him, talks about this: on a smaller screen, it is harder to comprehend, and it takes us longer to do it. And ironically, on a mobile device, people are probably less focused on the task.
They might be doing it as a quick stopgap in their lunch break, or on the bus, or sitting on the sofa whilst the adverts are on. So we have all of these things. We understand what people are doing when they’re actually responding to a survey and to these questions, but then we’re not necessarily building that into how we’re asking them and how we’re constructing our questionnaires. One last thing on the theory behind it: Don Dillman, a renowned professor who started out perfecting telephone and mail surveys, has since, over the last decade or two, done work on the internet, starting with email and then the web. He had this whole idea of the value exchange. There are a lot of different reasons why people take surveys and different things they’re trying to get out of it, and of course we think all the time about the financial incentives, the panels, what points people are getting. But there is also a value element here. People need to understand the value of the research, that their opinions are wanted. Culturally, as we’ve already talked about, we’re in this generation of reviewers, and people want to believe that the information they’re giving is going to be used. And I think we don’t always convey that in the survey: how is this information going to be used, or that it really is incredibly valuable, beyond just saying so. So there are a lot of different things we could do in how we construct our surveys to make it easier, and one of them, going back to where we started, is shorter questions. Keeping to ten words: that’s my rule of thumb. Hey, we’ve all got to have a goal to aim for, and I think a ten-word question would be a great way to start.
Jamin: And now all of a sudden you have to become highly disciplined, because it’s easy to write a long question. Wasn’t that Hemingway? He said, sorry, I don’t have time to write you a short letter, so this long one will have to do.
Zoë: Mark Twain?
Jamin: Mark Twain. Mark Twain. Sorry, yes, right. Thanks for correcting me on that; that would have been catastrophic if it had gone out. Anyway, it is hard. It’s really hard to reduce a thought to fewer words. Getting back to Mr. Occam.
Zoë: Absolutely. Because you’re having to clarify within yourself: what is it that I really want to know? What is it that I’m trying to get at? I always likened it to the early days of composing a tweet. You had 140 characters, and it’s like, I can’t write everything that I want in there. And then it turns out you actually can, by simplifying your language, by simplifying the concept, and then you can always follow up. And I think that’s something we can think about doing within our surveys.
Jamin: Have you seen the Twitter surveys?
Zoë: Yes, I have.
Jamin: What do you think?
Zoë: I like them. They’re short, and they’re kind of interesting. And it gets to that in-the-moment question: have you seen this? Or are you interested in XYZ?
Jamin: I screen-captured one and then posted it on Twitter with #MRX, asking researchers what they thought about it. One researcher who’s well known responded very negatively, saying no professional researcher could use this approach to gather consumer insights. He would be fun to have on the show, by the way, because I think he would offer a nice counterpoint. But I just couldn’t disagree more. It’s not an industry killer or anything like that, but it is a nice supplement, another arrow in the quiver of consumer insights.
Zoë: Sure. And I think this is all part of our evolution. I’m classically trained, and I come from this; you’ve heard, I was almost getting on my soapbox there, talking about how we should be thinking about designing our questions. Let’s go back to the fundamentals and the best practices; we can’t go wrong with those. But yes, we do need to apply them to today’s world and the different mediums people are using to take our surveys, or even to speak to us qualitatively, and of course there are adaptations. There are also adaptations in what kind of data we’re collecting, and when, and how we’re using it. Businesses can’t wait. Everything is far too fast, and so sometimes having that quick answer, that litmus test: has my brand been noticed? Can people recall XYZ? Or just, what is somebody’s opinion on ABC? That might be enough as a starting point to then go into something a little more in depth. And I think that more iterative approach to research, thinking about how we can just get some data to work with, has value. Of course, when you’re getting that data, if you’re asking on Twitter, think about who the audience on Twitter is: the demographics, what their makeup is, because that’s obviously going to influence how you look at that data. You’re not going to say, hey, this is representative of the United States or of the world. So we can use these, dare I say, quick-and-dirty mechanisms, and I know that’s controversial in and of itself, to get information, and as long as you’re looking at it with the right lens, OK, I know this is limited by ABC, then it might be good enough to help you move to the next step. And I think that’s how we need to keep progressing and evolving as an industry.
Jamin: I like that. So going back to Twitter surveys: framing your answer, or by that I mean your objective, in the context of the community that’s answering the tweet, right? Or the call to action, whatever the profile is, if it’s a paid advert, and the context of Twitter utilization. Who is that audience and what do they look like? That obviously isn’t the whole pie, but it is a sliver of varying size, depending on who your audience is.
Zoë: Absolutely. The other thing you could do, just to pick up on what you said there, is framing the question. That’s something we don’t have time for, and I don’t know how long ago we gave up on doing pilot tests of the surveys we’ve constructed, due to the time and financial implications. But using something like this for, I don’t know if this question’s going to be answered in the way that I want it to be answered, or is it going to get the response, or just the interest, or whatever it may be. It’s a way of doing a quick test on, is this the wording I’m after, or, there are two different ways I could frame this, why don’t I quickly put that out there and see what I get back? And qualitatively as well, you could follow up on [INAUDIBLE] again. I’m kind of mixing things around here, but there’s a lot we can do to improve the questions we’re asking, and ultimately the data we’re getting, by using some of these quick-fire ways of exploring whether or not to go down a particular path with the way we construct a question, or the data we’re getting, and so on.
Jamin: We’re in a little bit of a rabbit hole. I’m going to try to bring us back in, but I do feel like this is really important for the audience, because a large portion of my audience hasn’t been doing this for 20+ years. What Zoë’s talking about here is that in the old days of market research, when we did things through mail and mall intercepts and phone, it was very expensive, time- and money-wise. We would do a soft launch of a project and then get the data back. It would cost about 10% of the project fee, and from that we would refine the survey mechanism, or discussion guide, or whatever, and then launch the full study. The objective of that first part is less about understanding the consumer and more about understanding whether your instrumentation is correct for gathering the right stimuli, the right feedback, from consumers. We have completely walked away from that as an industry as insights has become more democratized. And so the question I want to ask you, Zoë, is how do you see automation impacting this, maybe I’ll call it traditional, rigor? Is that pilot not necessary now that we have so many automated solutions and templates for research?
Zoë: One would hope that the pilots were done as those templates were created, so that you perfect your instrument before commercializing it, before making it widespread. And that’s one of the things I do like about some of the automation and existing approaches: you can use robust approaches, methodologies that have been tried and tested, and start from there. I think ad testing is probably a great example of it. It’s not going to get you all the way; if you really want to understand how somebody’s interpreting that ad, there are lots of different approaches you could layer in. But is it going to be good enough? Yeah. And so automation can bring that expertise to a broader audience, going back to that democratization of research. Established researchers, established approaches, established companies that have been doing this for however long, with all their expertise: they put it into this templated solution, and then you, the person in, say, the marketing department, or wherever you may be coming from, can benefit from that expertise without having to go down the path of either learning it yourself or engaging somebody for one small thing that you have.
Jamin: What do you see as common mistakes in framing questions to participants? And again, I’ll broaden it: you can talk to either discipline, qual or quant, or more broadly.
Zoë: The thing that comes to mind when I’m thinking about this is probably the word jargon, which I’ve already talked about, and trying to be very specific. I’ve reviewed countless questionnaires; there’s just something in terms of language and the experience we’re trying to deliver to our participants. That’s something that I do, and this comes up over and over again, and I really do understand why it’s there. It’s also incredibly difficult; let’s not make any bones about it. It’s very easy to pick holes in somebody else’s survey when you’re looking at it and you’re not the one that constructed it, not the one that’s been writing it, because it’s a difficult thing to do, and it has its challenges, and it’s a skill. Like anything: writing is a skill, and writing your survey questions is a skill, as are your qualitative interview questions or your online activity questions. It’s a skill, and it gets better and better over time. One of the traps I see people falling into is trying to be too specific about the situation. So a scenario that stood out, and this is probably from five, six, seven years ago: I was reviewing a diary study taking place in Africa, I think in a set of North African countries, and they wanted to understand snacking moments, which again, nobody actually says, I had a snacking moment today, or several of them. But this was the language being brought into the questions within the diary. And then they wanted to get more specific about it, and I wish I could remember the details. It’s not to unfairly call out the people that constructed it, because this is fairly typical: we’re interested in this particular event, and they had their whole definition of it.
And it was incredibly important to them, and I just kept saying, but that doesn’t make sense to an everyday person who’s just going to their cupboard and picking out the snack on hand. [CROSSTALK] And we do. We use language like the snacking occasion, or, I’m trying to think of other moments, your morning routine. People might have other words for that, but we get so immersed in our world, thinking this is how we define it. OK, but how does your customer define it? Or how do people just going about their everyday lives define it? I understand why we do it, I really do. But from a participant’s perspective, the person responding to the question, it makes things challenging.
Jamin: What is the worst question you’ve ever seen?
Zoë: Honestly, I’m going back to that one as a reference, just because it was so convoluted. The other thing is, I’ve seen some doozies. I mean, everything. Double-barreled questions: how can you really answer that? Leading questions: it’s the basics, you’re leading me into this response, and I can’t respond the other way. Or it’s, I can’t respond to that at all; none of those apply. And this is actually more of an issue on the quantitative side, because on the qualitative side, you get to some sort of response, whether it’s what you want or not; people will give their opinion because it’s open-ended. Whereas with a closed-ended survey question, you’re dictating the whole frame of it: the question you’re asking and the responses they can give. And it’s like, no, that doesn’t apply to me; you’re not getting to my opinion. I think those are some of the things you see frequently, and we’re all guilty of it, because you, the person designing the instrument, are bound by your own parameters and how you’re viewing it and framing it.
Jamin: Do you remember the show Yes, Prime Minister?
Zoë: Yes, I do.
Jamin: You may have seen it: the leading questions episode?
Zoë: Oh, I think I might have a long time ago.
Jamin: I’ll share it with you right after this. It is epic. On the fly, he writes a survey that is designed to produce a specific answer, and then he creates a counterpoint survey that comes out with a completely different answer, simply by making the questions leading. I think it should be part of every researcher’s experience to watch this episode. It’s so artfully done. Anyway. Yeah, I hear you on that point. So, Zoë, you are gainfully employed at my alma mater, FocusVision. Are you guys doing anything particularly interesting that you want to talk about at the moment?
Zoë: We have a couple of things in the pipeline. But I think I’m just gonna say stay tuned. I think it’s all gonna be exciting.
Jamin: Good. I will certainly stay tuned. I continue to be a big advocate for the company and its leadership, so well done with what you guys are doing. My guest today has been Zoë. Zoë, thank you so much for joining me on the Happy Market Research Podcast today.
Zoë: Thank you very much for having me. It was a pleasure talking to you.
Jamin: Zoë, thank you so much for joining me today on the Happy Market Research Podcast. Everyone else, I hope you found value in this episode. As always, screen capture, share on social, tag me, and I will send you something very special: a Happy Market Research T-shirt. Have a great rest of your day.