Happy MR Podcast Series

Ep. 302 – Michele Ronsen on How the Key to Asking Better Questions Is Tied to Developing Better Listening Skills and What You Can Do About It

My guest today is Michele Ronsen, principal and founder of Curiosity Tank. 

Founded in 2010, Curiosity Tank is a design research and strategy firm that identifies customer insights and puts them into action by digging into your problem space and charting a path forward.

Prior to founding Curiosity Tank, Michele worked for top design firms and gained experience working in Fortune 500s, academia, and start-ups.

Find Michele Online: 

LinkedIn: https://www.linkedin.com/in/michele-ronsen-0a55233/ 

Website: https://www.curiositytank.com/ 

Find Us Online: 

Twitter: www.twitter.com/happymrxp 

LinkedIn: www.linkedin.com/company/happymarketresearch 

Facebook: www.facebook.com/happymrxp 

Website: www.happymr.com 


“Clap Along” by Audionautix: https://audionautix.com

This Episode’s Sponsor: 

This episode is brought to you by Lookback. Lookback provides the tools to help UX teams interact with real users, in real time, and in real contexts. It’s Lookback’s mission to humanize technology by bridging the gap between end-users and product teams. Lookback’s customers range from one-man teams building web and app experiences to the world’s largest research organizations, collectively ensuring that humanity is at the core of every product decision. For more info, including demos of Lookback’s offering, please visit www.lookback.io


Jamin: Hi, everybody, this is Jamin. You are listening to the Happy Market Research podcast. Hope you’re having a wonderful week. My guest today is Michele Ronsen. Michele, how are you?


Michele: I’m great. Thanks so much for inviting me.


Jamin: Michele is the principal of Curiosity Tank, a UX specialty shop based out of the heart of San Francisco. We’re gonna dive in a little bit more about her background momentarily, but before we do, Michele, I’ve got this standard question that we ask. Tell us a little bit about where you grew up, your parents, and how that’s informed your career.


Michele: Sure. I grew up right outside of Manhattan. Both of my parents were fourth generation native New Yorkers. And they very much influenced both my brother and me. I’m a classically trained designer and the apple doesn’t fall far from the tree, as they say. My mom was an interior designer, my father studied architecture and later became an entrepreneur. And my earliest memory is of driving down the freeway, or back then it was highway because I was on the East Coast, with my dad and he asked me to look at the McDonald’s sign. And he handed me a crayon and asked me to draw what I thought the gear inside the little golden arches looked like, that little device that made the sign spin. And I was about 3 years old. And I just remember, my brother and I talk nowadays that we were just tortured with that kind of stuff. We were tortured to think about how things work and to build things and to take things apart, and we didn’t have coloring books but we had all the blank paper we could ever imagine. And it was a delightful way to grow up, looking back, and now I delightfully torture my 6-year-old daughter in the same way.


Jamin: Market research, user experience research, customer experience, all of these, all three of these disciplines are focused on primary research, so getting to the heart of the consumer. And of course they incorporate external data to help supplement and understand and provide context for our insights. But what do you see as the differences and the overlaps across these three disciplines?


Michele: Well, first of all, I don’t consider myself a market researcher. And market research was like, aahh. But you definitely explore and interact with people to help figure out directions and strategies. Yes, I absolutely do that. But I see market research as more, right or wrong, as more focused on the sales portion, as more focused on purchase decisioning or pricing, the best ways to create awareness of a product or a service. I see marketing as focusing more on buying and serving the company or the entity, the organization that is producing whatever is being sold. I see user research as being more focused on the goals, the design, the context, the ease of use, the product market fit or the mental model fit. Both the attitudes and the behaviors, if you will, of the individual user. So I see user research as working in service to the user and market research as working in service to the larger organization. When it comes to customer research, I think it includes both market research and user research and the broader kind of support models like customer service or sales.


Jamin: So if you were to rename Happy Market Research so that it incorporated these three disciplines, what would you see as like an umbrella term?


Michele: You know, they’re so heated. There’s such a debate out there. It’s like this massive turf war. Why don’t you draw a heart around them all and call it the stuff that drives customer engagement.


Jamin: There you go. I know, it’s so funny. And it is, you’re right, it’s like there’s not–and you answered that like a perfect–I hope to see you out there in 2020. It was the perfect political answer. Because you’re right, it is such a heated conversation. I had somebody from Shopify on earlier and I tried to nail her down on this exact question and she wouldn’t answer it.


Michele: Well, not that I don’t, it’s not that I’m shying away from it. It’s like, I don’t have a good answer. But I can tell you that I would not ever say that I’m a market researcher. And not that there’s anything wrong with market researchers. In fact, my stepfather was a market researcher and he also was very influential on me and my career, clearly, with what I do today. I probably didn’t realize it 20 years ago. But there’s–I think of user research as very different.


Jamin: Yeah, and it is very different, right? At a minimum, one of the big differences is it sits in a unique spot inside of the organization. It seems to me that they sit closer to if not right beside product.


Michele: Exactly. And that’s my first language. My first language is design. So I speak design fluently and that might be one reason why I identify with that so much more than marketing. Whereas maybe if I had come from the sales side, I would–and I became a marketer, I would identify with marketing more or that terminology more. Maybe it comes, I think it comes from your background and it comes from the context that you bring to the table.


Jamin: And that’s interesting how you’re kinda hitting on this nomenclature or terms that we use, because as you know, language is culture and if I say survey, my head jumps into one thing, whereas somebody else’s might to a different, right? They might think about a geological survey, for example. So the context is really important for understanding the words. You do consulting for large companies. I don’t know if you can divulge some of the companies that you do work for.


Michele: I do. I have worked with Facebook and Zillow and Gusto and I’m at Slack right now running their rolling research program. I’ve worked with PayPal and Next Vacay and a whole bunch of really, really great companies. And they’re primarily tech-driven companies, but I wouldn’t say that I am a–those are my clients because I’m so tech-savvy myself. I like to kind of put like a bumper around that. Those are the companies that tend to understand user research and know how to leverage it the best.


Jamin: And at the same time, you’ve spent time, a fair amount actually, in your career in academia, from curriculum design to art–


Michele: Yes.


Jamin: –and instructing. So we know that terms are actually really important in academia. I was struck by, and I think our first introduction was in conjunction with a class that you teach, I think it’s Berkeley but I apologize if I’m wrong, with respect to defining a common set of terms across consumer insights.


Michele: Yes. So it just, it sort of struck me over and over again, and I think it’s ironic, for those of us who work in UX, user researchers included, and designers and product managers, we truly aspire to create terrific user experiences. That’s what we do. That’s what gets us out of bed every morning and that’s personally what I love to do. But what we’ve done with our own terminology is create just an absolute mess. It’s a perfect example of the cobbler’s children. And that UX terminology problem really followed me like a cloud. There were all of these different instances, like you mentioned, I’ve taught all over town, I love teaching, but in the classroom, a student, a PhD student asked me to explain the difference between ethnography and design thinking. And I was like, what? Like, you could’ve heard a pin drop. And I first thought to myself, ooh, that’s a really bad question, like you shouldn’t ask that out loud. Of course I didn’t say that, but that was my immediate–and this woman was clearly very bright and very educated, but wow. If she’s that confused about our terminology, hm, there’s a problem here, there’s a gap. And then it just kind of kept coming up. It came up in interdisciplinary conversation with a marketing strategist. A colleague of mine I’ve known for 15 years, I had asked her for feedback on one of my teaching tools, and she was adamant that workflow analysis was not a legitimate user research method and it should be removed from this tool that I was developing. And this was actually like a heated debate. And I was like, huh, that’s interesting. And she’s also not a very argumentative person, but she was like hellbent on this. And then also among professional researchers, I was in a conversation with–this all happened within like a two-week span. I was in conversation with two very esteemed research colleagues and we had a half-hour conversation about whether journey mapping was indeed a user research method. 
And we couldn’t come up with an agreed-upon definition. Does a method just gather data? Because journey mapping doesn’t gather data; journey mapping is a visualization technique to plot either assumptions or data that’s already been gathered. So how are we defining this? And if the three of us couldn’t come up with a shared term, then there’s a lot of confusion out there. And then last is with customers. You joked about whether, if you say the word survey, someone might think of a geological survey. I was on a discovery call within this same two-to-three-week period with a real estate client that was looking to get into shared workspaces. And they had never really done any user research before, but they wanted to get themselves some of that. They’d read about it, they heard it was really important, and they wanted my feedback on how to ask better questions. Because they had been surveying their customers, but they didn’t know if they were asking the right questions or what to do with the data and yada-yada. So there were four people on the call, and 45 minutes later I realized that they’d actually been conducting one-on-one interviews. But to them, the words survey and interview were completely interchangeable. And it was such an enormous waste of time. And it was there that I was like, wow, there’s really, there’s confusion everywhere. And depending upon your background, you likely use the language differently.


Jamin: Right.


Michele: And it’s wasting time. It’s leading to poor proposals. It’s leading to the wrong hires. It’s leading to the wrong methods being used. So what can we do about this? And that’s when I started the crowdsourcing effort.


Jamin: And so talk to us a little bit about what that looked like and then we’ll move into the public, where it is in context of the public domain.


Michele: So I am part of one Google group and I just kind of threw it out there to this Google group that said, hey, would anybody be interested in helping me define a whole bunch of terms that relate to UX and user research? And then I put that same kind of call to action on my LinkedIn profile. I’m very active on LinkedIn. I have become very active on LinkedIn since then, actually. And the response was just really overwhelming. At that point I had just jotted down the terms that either I had used or that I had heard within a couple of weeks; I easily came up with 100 terms. And then I held a kickoff call or two kickoff calls to just get feedback from people of how they think something like this might work, what they expected, what they think would be helpful, how should we define them, what criteria we should use. I had never crowdsourced anything and I didn’t know these people that were participating. Most of them I did not know. So I kinda threw a wide net out there and then held these two calls, and fast forward, we divided into five teams, we had almost 60 people participate in that first round, and we defined, at that point it was 150 terms within a 10-day period. It was super–


Jamin: That’s amazing.


Michele: It was riveting. It was like front car of the rollercoaster, “I can’t believe this is actually working.” And the beautiful thing was it all took place in Google Docs.


Jamin: Right.


Michele: All of it.


Jamin: It’s funny how you would think–so you have Word, it’s this old and still very use–I use it every day. But then you have Google Docs for this whole different use case, right? That’s such a dominant part of my day, is spent in both of those tools.


Michele: Well, and these were people from all over the world. I mean, I heard from people in places, it’s like wow. I heard from people in Nigeria. I heard from people in Serbia and Sudan and Russia and Singapore and Taiwan and Vietnam. I mean, you name it. Chile and Brazil and Argentina. And this really struck a chord. People really felt, people wanted to participate either because they had been in a heated debate recently with their team, they found themselves not knowing where to go for a trusted source to get a definition, and/or they felt that this would be a great way for them to expand their vocabulary and broaden their network.


Jamin: Yeah, I think that that’s an interesting point in terms of the collab, the motivations around collab. I think there is this–and I do it, like 100% do this, where–and I’m part of similar groups to you, the same groups, some of the same groups as you. And the opportunity to be able to contribute is in a lot of ways I don’t want to say selfish on my part, but I’m getting a lot out of that by adding to the body of work.


Michele: Absolutely. I mean, the response was amazing. And the language matters. And then they went back and forth and they commented on each others’ terms: “Really? I interpret it like this,” or, “Have you thought about it from this standpoint?” And there was also just, it really highlighted the difference between how applied researchers think about something versus academic researchers. And we communicate in totally different ways.


Jamin: So we put together or you put together in this collaborative environment a–how many terms were there? It was 200, right?


Michele: It’s up to 200 terms. So once we finished those existing or those first 150, I then turned it over to my class at UC Berkeley and I introduced those students to secondary research and fact-checking. Then they went through it as new practitioners to review the terms for consistency and tone and clarity and accuracy. I completely think the word “consistify” should be a word.


Jamin: I love that.


Michele: They went through and they fact-checked, clarified, consistified each of the terms, and then they added the term used in a sentence and then related terms. And then they were each asked to add one more term of their choice from a list that I had gathered. So then it grew to 200 terms.


Jamin: Language is also in evolution, so it changes with time. And I don’t mean to say that like you don’t know that; of course you know that. But my question is really, with that understanding, coming up with a definitive dictionary of common terms and what they mean, how are you gonna deal over time with changes to this? I presume some sort of Wikipedia-type page?


Michele: So you bring up a really good point, and I think that what was fascinating about this, there’s so many fascinating aspects about it, but one aspect that we haven’t touched on yet is that I think it also serves as sort of a chronicle of the evolution of our industry. And I don’t think that whatever–we’re calling this UX Lex, by the way, UX Lexicon. I don’t think that it should be static. I think that it should be a living chronicle and it should evolve over time. And I think as we see more marketing converge into user research, converge into data analytics and all of these spheres kind of blending into one, I think what’s gonna happen is that those terms are gonna evolve as well.


Jamin: That’s super interesting. So where does it exist now in context of access? Can we get access to that?


Michele: A dozen or a few dozen of the terms should be available by mid-February on curiositytank.com. And I’m actively looking for sponsors who would like to build out a more robust site. The first portion of my students’ work last term at UC Berkeley was to, quote unquote, consistify these terms, add them in a sentence, and kind of build out new terms. But then they spent another 10 weeks doing generative research and evaluative research on what kind of home these would best live in. So they interviewed aspiring researchers and practicing researchers of all different levels to find out more about where this should live, how it should live, and what types of experiences would deliver the material in the most meaningful ways. So we had some great MVPs there.


Jamin: Got it, got it. So you’re releasing a subset of the lexicon and then looking for a sponsor–which I think is very important, by the way–to come alongside, add value/cash, and really unlock the rest of the value.


Michele: Exactly. And maybe it’s not cash, maybe it’s talent and development talent.


Jamin: Interesting. Good. Well, gosh, insights nation, there you go, there’s a really clear call to action. If you have desire to get in front of and really help define an industry–gosh, that’s kind of powerful, huh–then check the show notes for contact information. I know somebody that would like to talk with you. I actually have a few people in mind already.


Michele: Yeah, or sponsor a collection of words. Maybe you’re in the recruiting industry or maybe you’re in the platform industry or maybe–there’s a myriad of collections that we would like to present. One of the things that we learned by doing all of these studies with our target users is how they would like that information to be presented and what collections make sense to them.


Jamin: Did you, were you surprised by anything that’s come out of the research?


Michele: Yeah. A lot. Beginning practitioners and people that have been in the field are looking for two very different things. Beginning practitioners don’t know where to start and they don’t necessarily know the word that they’re going to look up, so they want to be told and almost guided, and they want context of where that term fits into the overall cycle or development cycle or design process cycle. Whereas more experienced practitioners know the word that they’re looking up and they’re probably going there to type in the exact words to maybe share a definition with a coworker who is confused. Or to maybe create some sort of shared understanding within a broader team: “When we refer to, say, a persona or an archetype, let’s agree to always use this definition of it or that definition of it.” So what they look for and how they look differs considerably.


Jamin: So that actually was not my question, but I think that is really interesting from–and this is me as a marketing person–from an SEO perspective. So if I want–and this has been part of my ongoing thesis, is that the new generation of researchers does not have the same vocabulary as the previous generation. And so to that end, it’s very important if we’re gonna be–if you’re a services business, then you need to make sure that you’re talking, your voice is–you’re using the right words in order to be discoverable by the next generation of researchers.


Michele: Well, that’s a great segue. So in addition to sponsors, I’m also looking for people to have conversations with me where we can videotape a conversation about a term. So you and I would be talking about the term “persona” and what it means to me as a user researcher versus what it means to you as a marketer so we could provide real practical value in context. The aspiring researchers today, our people that are up and coming in our industry, want to learn by video. They don’t read nearly as much. So they also, they want this context provided in a different way.


Jamin: That’s super interesting. I didn’t even think about that, but you’re right. It’s ironic that you don’t give your kid YouTube, but YouTube is the go-to for knowledge.


Michele: Exactly. Exactly. So I’ll be looking for people–and I think it’ll just be fascinating to have a five-minute conversation with someone about, “What does this term mean to you? When was the last time you used this term? Can you give me an example of this term in use?” And just create a series of those so that people can really learn theory from practice and be able to apply that to maybe what they’re doing today.


Jamin: So words have evolved. The role of insights has evolved tremendously, insight in the context of a major brand. You’ve worked both inside and outside of leading brands. How do you–historically, how have things changed from a user experience researcher’s perspective?


Michele: You know, I think that I work in a little bit of a bubble because my clients, as I mentioned earlier, clients like Square and Microsoft, those guys are really UX mature. Slack, Facebook. So when they’re calling me in or asking, when I’m in discussions with them, they know what I do, they understand the value, and they’re there because they know it’s important and they want it. So I don’t do a lot of sales, so to speak. I don’t do a lot of convincing people why it’s important. But I am getting more and more calls from people like the real estate company; they are not UX mature. They’re not a tech company. But they’re hearing more and more about the value of user research, and they think it’s important and they want to get some of that, quote unquote, but they don’t know where to start. So I think that the UX mature companies are becoming more mature. Research ops are becoming more savvy. They’re becoming more efficient. They’re becoming more effective. In those tech-savvy companies, I’m also being asked more and more to build internal research programs to help upskill the, quote unquote, non-researchers. I call those the temporary researchers or the accidental researchers: people that are finding that research is becoming a bigger portion of their job, or they have a specific question to explore, but they don’t necessarily have the skills or the confidence to do that. So I’m seeing lots of shifts. Nine years ago when I set off on my own, it was a big sales job. Now, my clients have always come by word of mouth, but I’m not selling so much. Where it comes to sales, it’s really in those discovery calls with people that are not UX mature. And quite frankly, I have difficulty and a lack of patience, if you will; I haven’t figured out how to bridge that gap.


Jamin: Yeah, it’s a–you have like known pain in the sales cycle that you’re addressing, but if it’s not necessarily known, then you can–but they just feel like they need it, almost FOMO, then you wind up in a lot of cycles of education.


Michele: Right. And it’s such a longer kind of sales cycle for me, whereas Slack, it was literally a half an hour conversation, send us samples, and boom, you seem like a really good fit.


Jamin: Yeah. Super easy.


Michele: Where the real estate company, I don’t mean to pick on them but it was just a kind of recent example, was like, OK, well, what are you looking to learn? And their questions are, we want to become more profitable. We want to maximize our revenue. And I’m like–


Jamin: Sounds like it’s owned by a private equity company.


Michele: I know. Well, my first response is like, get in line, sister. But my second response is like, you know what, that’s not a user research question, right? There’s a million ways we can become more profitable, but that’s not a user research question. And user research is not intended to boil the ocean. So then we go through this whole series of, OK, let’s peel back the onion, like what is an appropriate user research question, and what kinds of data do you have that we can leverage? Because we don’t want to start from ground zero if we don’t have to. If you’ve already done, you already have some data, let’s triangulate and use that to inform kind of our starting point. There’s no need to reinvent the wheel.


Jamin: All right, my last question. Actually, I have to ask this other one first. I’m sorry, I know we’re over on time. What is the biggest issue from your vantage point that’s facing market researchers? And I know you don’t consider yourself a market researcher, but facing market researchers, and I guess we’ll broaden it to user experience researchers if you feel more qualified there, currently?


Michele: You know, I think I mentioned this before, I truly think we’re becoming data obsessed and we’re collecting data for data’s sake. I mean, how many times can you be asked to rate your Lyft or Uber driver? Are you even looking at the app anymore when you open it? Right?


Jamin: I literally had this conversation with Shopify this morning on the podcast.


Michele: Oh, no way.


Jamin: We were like, she said, “We’re drowning in NPS.”


Michele: Exactly. Same thing. It’s like we’re so, we are Linus, right? We are this culture with this gray cloud of data around us that A) we don’t need and B) we’re not using. So I think that that’s a concern. Another concern or another opportunity or challenge I think is helping to upskill these, what I call these temporary or accidental researchers. So those people should feel comfortable and should be confident in knowing how to ask good questions and follow up to dig deeper. But we also need to upskill these people to share that knowledge back, to get it back into a repository, so we’re not asking the same questions over and over.


Jamin: Yeah, that’s interesting. I’ll say this on this episode because I don’t think it’ll actually make it on the Shopify conversation that I had because it’s a little off-topic, but one of the things I was impressed with that she was telling me about, Emma is her name, Emma was telling me about, is that she actually was in a managerial role inside of Shopify and then she wanted to move into UX and so she had to start basically her career over as an intern in the UX department and then get a mentor that then helped her navigate her career, and now she’s a lead UX researcher. But to me, I really, I think that this whole area around mentorship, which used to exist and in a lot of ways has gone away, it’s gonna have big opportunity in corporations.


Michele: So I definitely agree with you. There is a huge gap in user research education. In an ideal model, there would be a series of apprenticeships, but that just doesn’t exist today. It just doesn’t exist for our culture. But the classes that I’m offering hope to address this specific gap, because I truly think that you need really, really hands-on practice and you need a mentor to guide you on your way and to give you feedback along the way. One of the best quotes that I was able to uncover from one of my students when I was trying to learn more about this gap, I think he said something along the lines of: User research is a long and lonely road. There’s no one there to let you know how well you’re doing along the way. And it’s very true. So I think that improving research education and making it more accessible and providing more hands-on experiences and practice for the newer generations is gonna be really paramount.


Jamin: That’s super–I love that. Opportunity for education in our space, especially considering the rate of growth that it’s going through right now, it’s like material. Somebody should seize that opportunity. Last question: What is your personal motto?


Michele: Oh my gosh, I have a lot of little -isms. In regard to user research or in regard to life, I would say there’s a couple. Start where you are. No matter where you are, just start where you are. If you’re a bartender and you want to learn about user research, do a study about who you think, what assumptions you have, about people who order Shirley Temples. And then go out and interview those people while they’re drinking their Shirley Temple. Just start wherever you are. And the more you ask questions and the more you improve your listening skills, the better you’re going to become. Do the best you can until you know better, and then when you know better, you’ll do better. But don’t wait. Don’t wait for the perfect class. Don’t wait for the perfect mentor. Don’t wait for the perfect opportunity. Because there’s never going to be one. Our culture just isn’t set up for those sorts of apprenticeships now. And life is moving. But you can practice research in your day-to-day life. Are you looking to buy a new washer and dryer? Great. Make a research project about that. Are you looking to move into a new apartment? Great. Make a research project about that. Are you dating? Great. Make a research project about that.


Jamin: That’s an interesting way to think about it, actually.


Michele: Or make a research project about something at work. Maybe there’s something that’s not working very well. Maybe there’s a process or a procedure or an intake something. Make a research project around it. How are people doing that same thing at other companies? Do you need to upgrade something or purchase new software? Great. Make a research project out of it.


Jamin: And then after you’re done with that, check out Curiosity Tank to make sure that you’re using the right terms.


Michele: That’s right. That’s right. And pilot everything. Pilot everything including your pilots. If you’re gonna do an interview, pilot your interview. If you’re going to run a survey, pilot your survey. If you’re gonna run a card sort, pilot it. Pilot everything. I guarantee you’re gonna learn from every single pilot you do.


Jamin: My guest today has been Michele Ronsen. Michele, thank you very much for being on the podcast today.


Michele: Thank you.


Jamin: Curiosity Tank is the name of the company. If you’d like to get in contact with her, Michele, what is the easiest way for people to get in contact with you?


Michele: Probably on LinkedIn, actually. Michele with one L, and Ronsen, R-O-N-S-E-N.


Jamin: Perfect. And of course, as always, you can find her contact information in our show notes and on the blog. I hope you have a wonderful rest of your day.


Issue 6 – Happy MRx Podcast Newsletter

My hope for you all is happiness, growth, and success. 

I’ve been a husband (granted to a few different wonderful ladies), dad, and business operator for 20 years. Here are some practical points from successfully weathering tough times…twice. 


On April 1, 2000, I left a well-paying job to start an online survey platform with no customer commitments, no technology, and two months of earned PTO. 

On March 10, 2000, the NASDAQ Composite stock market index peaked at over 5,000. 

Over the following 12 months, my co-founder and I bootstrapped the company to over $1 million in sales and hired six souls. 

By the end of the stock market downturn of 2002, stocks had lost $5 trillion in market capitalization since the peak and we lost 93% of our business. 

During the following six months we made no changes to our cost structure and subsequently ran out of cash and put $100,000 on credit. 

We were forced to rightsize the business and rebuild. It took two years for us to beat our previous highest revenue month.

Great Recession

Fast forward to 2007. When the severe worldwide economic crisis hit, Kristin Luck, Jayme Plunkett and I acted swiftly by cutting executive salaries by a third, lowering everyone’s salary by 10%, and paying sales commissions in stock options as opposed to cash. 

In 2009, we grew by 156%, corrected salaries and started hiring. 

COVID-19 Pandemic

From an economic perspective, COVID-19 is and will be hard. But we will overcome. 

According to McKinsey, “The resulting demand shock cuts global GDP growth for 2020 in half, to between 1 percent and 1.5 percent, and pulls the global economy into a slowdown, though not recession.”

Besides economic impact, this pandemic is redefining us as a culture. 

  1. Companies have been forced to work and operate remotely. 
  2. Families are stuck at home.
  3. Many of us are re-learning to cook.
  4. Our kids are having to adjust as well. 

This morning I got a note from a friend, “Many of you know that my son was supposed to have a wedding in April, but that has now been postponed.”


Last Friday I woke up, looked in the mirror, and realized I looked like Tom Hanks in Cast Away during his time on the island. My productivity for the week was crap: there were no workouts, no morning routine, and I went through more than my fair share of whiskey. 

As in times past, I reinstated my morning routine with modifications and my workday was a million times better…but I was still missing the moments when you “bump into someone” at work. 

So! For this week, I’ll be hosting a virtual lunch from 11:00-11:30 PT via Zoom. All are welcome. This is just a chit chat time for us to connect. 


I’d love you to join. 

I hope you have a great week! As always, I’d love to hear what you think about this content or if you have ideas on stuff you’d like us to cover. You know where to find me, @jaminbrazil

Peace to you! 


About the Happy Market Research Newsletter: 

A highly editorialized recap of the week in consumer insights. We cover trends, happenings, and tips in market research, user experience research, and customer experience.

Happy MR Podcast Series

Zoë Dowling, SVP of Research at FocusVision on Elements of a Good Participant Question

My guest today is Zoë Dowling, SVP of Research at FocusVision.

Founded in 1990, FocusVision offers a technology suite that enables both qualitative and quantitative research. 

Prior to joining FocusVision, Zoë served as an executive at Kantar and the US Census Bureau. Additionally, she was an Associate Lecturer at the University of Surrey.

Find Zoë Online:

LinkedIn: https://www.linkedin.com/in/zoedowling/ 

Twitter: https://twitter.com/zoedowling

Find Jamin Online:

Email: jamin@happymr.com 

LinkedIn: www.linkedin.com/in/jaminbrazil

Twitter: www.twitter.com/jaminbrazil 

Find Us Online: 

Twitter: www.twitter.com/happymrxp 

LinkedIn: www.linkedin.com/company/happymarketresearch 

Facebook: www.facebook.com/happymrxp 

Website: www.happymr.com 


“Clap Along” by Audionautix: https://audionautix.com

This Episode’s Sponsor: 

This episode is brought to you by Lookback. Lookback provides the tools to help UX teams to interact with real users, in real-time, and in real contexts. It’s Lookback’s mission to humanize technology by bridging the gap between end-users and product teams. Lookback’s customers range from one-man teams building web and app experiences to the world’s largest research organizations, collectively ensuring that humanity is at the core of every product decision. For more info, including demos of Lookback’s offering, please visit www.lookback.io


Jamin: My guest today is Zoë Dowling, SVP of Research at FocusVision. Founded in 1990, FocusVision offers a technology suite that enables both qualitative and quantitative research. Prior to joining FocusVision, Zoë served as an executive at Kantar and the US Census Bureau. Additionally, she was an Associate Lecturer at the University of Surrey. Tell me a little bit about your parents, where you grew up, and how that's impacting what you're doing today.


Zoë: So my parents are philosophers, which obviously made for interesting dinner conversation. And how they impacted- one of the big things that impacted exactly where I am today, and my strong interest in culture and technology, which of course I bring together with research and understanding people and life, is that when I was about five or six, we moved to South Africa. And one of the interesting things was moving so far away from family. In those times, in the '80s, we didn't have a landline at home. And even if we did, to actually do an international call, you had to go to one of the- we lived in a very rural area. There were only two hotels, and we went to one of them to place a call with the international operator to speak to my grandmother on her birthday. So this was the most exciting event. On birthdays and Christmases, we would get in the car, go to the local hotel, to actually place this long-distance, international call. And it was incredibly exciting, but it brought this kind of wonder of communication- wow, I'm speaking to somebody that's 5,000 miles away. And you can look at how technologies change. Think about that. In the '90s, email- I remember I was in Scotland and my parents were in South Africa, and I was emailing them. It was like, this is incredible. I'm not waiting three weeks for this blue air mail letter. And then the first time that I ever did instant message chat, it was mind blowing. That sounds crazy today. You think of young people growing up, and even young people in the workforce- the internet's always just been there. Whereas I remember how pivotal it was to actually bridging that gap in communication. Which is a long-winded way of saying all of these experiences, and also the culture element, come in. 
I lived in a very, very different culture, and so looking at South African culture and British culture, and now I've been in the US for 15 years, so American culture, and just the world and how all this comes together with technology, is fascinating. And that's what brought me to where I am today, with all of these different questions of how do we understand the world. How do we understand people? And brands are a very fascinating element within that. Because you think of your brands, you think of the products and services that have changed our lives, how technology has driven that, and so it's kind of bringing it all together and trying to keep digging into it and understanding it. And also, from my more methodological background: how can we do that in better ways? How do we ask the right questions? How do we make sure that we're getting answers that are the right answers, because we've asked the right questions?


Jamin: And it does start with a question, right? You have an overarching objective for a study. Almost every survey I've written, I start with the title stating what the objective is, and then try to control survey creep, or discussion-guide creep, if a question, or the direction of a conversation, isn't attached to addressing it. We just don't have time for nice-to-knows.


Zoë: And I- well, and I think that’s a shift though. Because I think that there was- if I think of when I joined the market research community. I joined Kantar 12, 13, 14 years ago, and I started- well, I started in copy testing, which is still fun actually. But there was this thing of we’re going in to speak to people, and so it was almost like this is our opportunity to get the answers so let’s ask everything. Let’s put in the kitchen sink to be cliche about it. And I think it was this thing of because it took time to get that information. It took- I mean a copy test back in those days took four to six weeks, which is just unthinkable today. Can you imagine? You need to do an ad test, and it’s like actually we’d like four to six hours, never mind three to four days. So I think that we’ve got this little bit of a legacy of this is our chance to speak to people so let’s- why not just add that. What if that’s interesting? What if that’s interesting? And I think that’s a mindset that we still need to get out of, and when our surveys are all mobile friendly and in that ten minute mark, perhaps we’ve reached that point. But we’re a long way from it.


Jamin: Speaking of philosophy, Occam's razor has always been one of my favorite frameworks for life in general. In any piece of correspondence I send, I always try to reduce the words and the content to the point where the intent is clear, and the rest is cut out just to get through the noise. And with respect to survey design, that cuts both ways, because you can be too brief, thereby missing the real intent. And at the same time you have this overall headline dynamic. We've always been headline consumers, but even more so today; people simply don't read. So what do you see as the key elements of a good interview question? And this can be framed with qualitative or quantitative or both, however you want to think about it.


Zoë: I think the fundamentals remain the same, whether you're asking a question in a survey or constructing it for an interview. I mean, obviously there are some fundamental differences. But the first thing: are you going to be understood? Talk in everyday language. Too often we bring in the world that we're in, be it the actual industry- we've got particular jargon that we're using. Or you might think that you need to be so incredibly specific that you end up creating this very convoluted question, and is anybody gonna- you've just said it. We read in headlines. So do our participants. They scan. In fact, very often in a survey, they actually go straight to the answers to determine what the question was and how they're going to respond. So it's being clear. It's being concise. And I think that works for both sides, qualitative or quantitative. Because if you're qualitative, you're gonna take the question and you can probe. You can go deeper and take it all from there. But if you start with something that's very convoluted, then you're probably not gonna get to where you really wanted to go in the first place. That would be my overarching thought. We sometimes over-engineer our questions.


Jamin: So this is something I've never actually heard anybody say before. I wish there was a counterpoint person on the show right now, because it would be really fun to have the other side of the table represented. But respondents go straight to the answers and try to figure out what the question is. It's so funny. I've literally been thinking this, but never said it out loud, for a few years now: those two or three or four lines of text that sit on top of your answer set inside a survey- nobody's reading them, unless maybe you have a few bold or underlined words that stand out from the general text. People are going right to the content and the answer choices.


Zoë: So I've been fortunate enough- I can back this up with data, I have to say. I've been in usability interviews. We had an extensive program when I was at Kantar, and this was probably a good ten years ago, when we were trying to improve web questionnaires. So we looked at a number of different things. Obviously, the technology and how it's being displayed on screen was an essential part of it, but we actually took time to think about how are we constructing questions, what are people doing, how are we getting to understandings. And so we did usability testing, and we also did some eye tracking. And what was fascinating- you could literally see it on screen- was that the dominant area where people's eyes were going was the response options. It was a very fleeting glance at the questions. You can see this mapped out. And the longer your question is, the more daunting it is, and the less people- because people don't have time, and they're not invested in it. And we could go all the way back to some of the theory around this, if you want to look at the psychology of survey response, again keeping to the quantitative side. People have got comprehension, recall, judgement, and then response. So comprehension: what are you trying to ask me? Recall: where is this answer coming from- how can I answer this? What is my answer to this? Judgement: what is the appropriate answer? And then your actual response. And you're doing that in split seconds. This is all just part of your cognitive process, and we make it harder for people if the questions are extremely lengthy. Smaller screens amplify this grossly. Jakob Nielsen- people are familiar with him- talks about this. On a smaller screen, it is harder to comprehend. It takes us longer to do it. And ironically, on a mobile device, people are probably less focused on the task. 
They might be doing it just as a quick stop gap in their lunch break, or on the bus, or sitting on the sofa whilst the adverts are on. So we have all of these things. We understand what people are doing when they're actually responding to a survey and responding to these questions. But then we're not necessarily building that into how we're asking them and how we're constructing our questionnaires. And then one last thing, in terms of going back to the theory behind it: Don Dillman, who's a sort of renowned professor that started out perfecting telephone and mail surveys and has since done work over the last decade or two on the internet, starting with email and then into web- he had this whole thing about the value exchange. There are a lot of different reasons why people take surveys and what they're trying to get out of it, and of course, we think all the time about the financial incentives- the panels, and what points people are getting. But there is also a value element here. People need to understand the value of the research, that their opinions are wanted. I think culturally, as we've already talked about, we're in this generation of reviewers, and people want to believe that the information they're giving is going to be used. And sometimes, I think we don't always convey that in the survey- how is this information going to be used, or that it is really incredibly valuable, beyond just saying that. So I think there are a lot of different elements in how we're constructing our surveys that we could do to make it easier, and one of them, just going back to where we started, is shorter questions. So keeping to the ten words. That's my rule of thumb. I mean, hey, we've all got to have a goal to strive for, and I think a ten-word question would be a great way to start.
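Zoë's ten-word rule of thumb is easy to check mechanically when drafting a questionnaire. A minimal sketch in Python (the `over_limit` helper, the threshold, and the sample questions are illustrative, not from the episode):

```python
# Flag draft survey questions that exceed a ten-word rule of thumb.
def over_limit(questions, max_words=10):
    """Return the questions whose word count exceeds max_words."""
    return [q for q in questions if len(q.split()) > max_words]

# Illustrative draft questions:
drafts = [
    "How satisfied are you with our service?",
    "Thinking back over the last six months, how often would you say "
    "you contacted customer support about a billing issue?",
]

for q in over_limit(drafts):
    print(f"Too long ({len(q.split())} words): {q}")
```

A check like this only counts words; it can't tell you whether a question is clear or leading, so it supplements rather than replaces the pilot testing discussed elsewhere in the episode.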


Jamin: That’s- and now all of a sudden, you have to become highly disciplined around- because it’s easy to write a long question. Wasn’t that Hemingway? Right? He said sorry, I don’t have time to write you a short letter, so this long one will have to do.


Zoë: Mark Twain?


Jamin: Mark Twain. Mark Twain. Sorry, yes, right. So- thanks for correcting me on that. That would have been catastrophic if that would have gone out. Anyway. So- and it is hard. It’s really hard to reduce a thought to fewer words. Getting back to Mr. Occam.


Zoë: Absolutely. Because you’re having to clarify within yourself, what is it that I really want to know? What is it that I’m trying to get at? I always likened it as well to the early days of composing a tweet. You had 140 characters. It’s like I can’t write everything that I want in there. And then it turns out you actually can by simplifying your language, by simplifying the concept, and then you can always follow up. And I think that’s something that we can think about doing within our surveys.


Jamin: Have you seen the Twitter surveys?


Zoë: Yes, I have.


Jamin: What do you think?


Zoë: I mean, I like them. They're short. And they're kind of interesting. But yeah, I like them, and it gets to that in-the-moment "have you seen this?" or "are you interested in XYZ?"


Jamin: I screen captured one and then posted it on Twitter with #MRX, asking researchers what they thought about it. And actually one researcher who's well known responded very negatively, saying no professional researcher could use this approach to gather consumer insights. And he would be fun to have on the show, by the way, because I think he would offer a nice counterpoint. But anyway, I just couldn't disagree more. It's not an industry killer or anything like that, but it is a nice supplement- another arrow in the quiver of consumer insights.


Zoë: Sure. And I think this is all part of our evolution. I'm classically trained, and I come from this- you've heard; I was almost getting on my soapbox there, talking about how we should be thinking about designing our questions. Let's go back to the fundamentals and the best practices. We can't go wrong with those. But yes, we do need to apply them to today's world and the different mediums that people are using to take our surveys, or even speak to us qualitatively, and of course, that means adaptations. I think there are also adaptations in what kind of data we're collecting and when and how we're using it. Businesses can't wait. That is not where we are today. Everything is far too fast, and so sometimes having that quick answer, having that litmus test- has my brand been noticed? Are people able to recall this? Or just, what is somebody's opinion on ABC?- might be enough as a starting point to then go into something that's a little bit more in depth. And I think that kind of more iterative approach to research means thinking about how we can just get some data to work with. And of course, when you're getting that data, if you're asking on Twitter, think about who the audience is on Twitter- the demographics, what kind of makeup they have. Because that's obviously going to influence how you look at that data; you're not gonna say, hey, this is representative of the United States or of the world. So I think we can use these, dare I say, quick-and-dirty mechanisms- and I know that that's controversial in and of itself. We can use this way to get information, and as long as you're looking at it with the right lens- OK, I know this is limited by ABC- then it might be good enough to help you move to the next step. And I think that's how we need to progress. We are progressing and evolving as an industry.


Jamin: I like that. So you know, going back to Twitter surveys, framing your answer or by that, I mean your objective in the context of that community that is answering the tweet, right? Or the call to action, which is again whatever sort of profile, if it’s a paid advert and then the context of Twitter utilization. So who is that audience and what do they look like? And that obviously isn’t the whole pie, but it is a varying size of the sliver, depending on who your audience is.


Zoë: Absolutely. The other thing you could do, just to pick up what you said there, is framing the question. That's something that we don't have time for, and I don't know how long ago we gave up on doing pilot tests of "I've constructed this survey," due to the time and the financial implications of it. But when I don't know if a question's gonna be answered in the way that I want it to be answered, or whether it's gonna get the response, or the interest, or whatever it may be- this type of quick test on "is this the right wording that I'm after?" or "there are two different ways I could frame this; why don't I quickly put that out there and see what I'm getting back?" And qualitatively as well you could follow up on [INAUDIBLE] again. I'm kind of mixing things around here, but there's a lot that we can do to improve the questions that we're asking, and ultimately the data that we're getting, by using some of these quick-fire ways of exploring whether I want to go down that path or not, with the way that I construct this question or the data I'm getting and so on.


Jamin: We're in a little bit of a rabbit hole. I'm gonna try to bring us back in, but I do feel like this is really important for the audience, because a large portion of my audience hasn't been doing this for 20+ years. What Zoë's talking about here is, back when market research was done through mail and mall intercepts and phone, it was very expensive, time- and money-wise. We would do a soft launch of a project and then get the data back. It would cost about 10% of the project fee, and from that, we would refine the survey mechanism or discussion guide or whatever, and then launch the full study. The objective of the first part is less about understanding the consumer and more about understanding whether your instrumentation is correct for gathering the right stimuli or feedback from consumers. We have completely walked away from that as an industry as insights has become more democratized. And then the question I want to ask you, Zoë, is: how do you see automation impacting this- maybe I'm gonna call it traditional rigor? Is that pilot not necessary now that we have so many automated solutions and templates for research?


Zoë: One would hope that the pilots were done as those templates were created, so that you perfect your instrument before commercializing it, before you're making it widespread. And I think that's one of the things that I do like about some of the automation and existing approaches: you can use robust approaches, methodologies that have been tried and tested, and go in and start. I think ad testing is probably a great example of it. It's not gonna get you all the way- like if you really want to understand how somebody's interpreting that ad, there are lots of different approaches you could layer in. But is it going to be good enough? Yeah. And so I think there's something interesting here: automation can bring that expertise to a broader audience. So going back to that democratization of research. Established researchers, established approaches, established companies that have been doing this for however long, with all their expertise- they put it into this templated solution, and then you, as the person in, say, the marketing department or wherever you may be coming from, can benefit from that expertise without having to go down the path of either learning it yourself or engaging somebody for one small thing that you have.


Jamin: What do you see as common mistakes in framing questions to participants? And again, I’ll just kind of like broaden it. You can talk to either discipline, qual or quant or more broad.


Zoë: The thing that comes to mind when I'm thinking about this is probably the jargon that I've already talked about, and trying to be very specific. I've reviewed countless questionnaires. There's just something in terms of language and the experience that we're trying to deliver to our participants- that's something that I do, and this comes up over and over again. And I really do understand why it's there. It's also incredibly difficult; let's not make any bones about it. It's very easy to pick holes in somebody else's survey when you're looking at it and you're not the one that constructed it, not the one that's been writing it. Because it's a difficult thing to do, and it has its challenges, and it's a skill. It's like anything. Writing's a skill, and writing your survey questions is a skill. Your qualitative interview questions or your online activity questions- it is a skill, and it gets better and better over time. One of the traps I see people falling into is trying to be so specific about the situation. So a scenario that stood out, and this is probably from five, six, seven years ago: I was reviewing a diary study, and it was actually taking place in Africa- I think it was a set of Northern African countries- and they wanted to understand some snacking moments, which again, we don't talk about; I don't say I had a snacking moment today, or several of them. But this is the language that was being brought into the questions within the diary. And then they wanted to get more specific about it, and I wish I could remember the details. And it's not to unfairly call out the people that constructed it- this is fairly typical. "We're interested in this particular event," and they had their whole definition of it. 
And it was incredibly important to them, and I just kept going, but that doesn't make sense to an everyday person that's just going into their cupboard and picking out the snack on hand. [CROSSTALK] And even just- we do. We use language- even if it's just "the snacking occasion" or- I'm trying to think of other moments- "your morning routine." We might have other words for that, but we get so immersed in our world and thinking about how we define it. OK, but how does your customer define it? Or how do people just going about their everyday lives define it? That's something that- I understand why we do it. I really do. But from a participant's perspective, the person responding to the question, it makes things challenging.


Jamin: What is the worst question you’ve ever seen?


Zoë: Honestly, I'm going back to that one as a reference, just because it was so convoluted. And the other thing is, I've seen some doozies. I mean, everything. Double-barreled questions- how can you really answer that? Leading questions- it's the basics. You're leading me into this response; I can't respond the other way. Or it's like, I can't respond to that at all- none of those apply, and we don't give any way out. And this is actually more on the quantitative side, because at least on the qualitative side, you get to some sort of response, whether it's what you want or not; people will give their opinion because it's open ended. Whereas in a closed-ended survey question, you're dictating the whole frame of it: the question you're asking and the responses they get. And it's like, no, that doesn't apply to me; you're not getting to my opinion. I think those are some of the things you see frequently, and we're all guilty of it, because you, the person designing the instrument, are bound by your own parameters and how you're viewing it and framing it.


Jamin: Have you- do you remember the show Yes, Prime Minister?


Zoë: Yes, I do.


Jamin: So they have- you may have seen it. The leading questions episode?


Zoë: Oh, I think I might have a long time ago.


Jamin: I'll share it with you right after this. It is epic. On the fly, he writes a survey that is designed to produce a specific answer, and then he creates a counterpoint survey that comes out with a completely different answer, simply by making the questions leading. I think it should be part of every researcher's experience to watch this episode. It's so artfully done. Anyway. Yeah. I hear you on that point. So, Zoë, you are gainfully employed at my alma mater, FocusVision. Are you guys doing anything particularly interesting you want to talk about at the moment?


Zoë: We have a couple of things in the pipeline. But I think I’m just gonna say stay tuned. I think it’s all gonna be exciting.


Jamin: Good. I will certainly stay tuned. I continue to be a big advocate for the company and the leadership, so well done with what you guys are doing. My guest today has been Zoë. Zoë, thank you so much for joining on Happy Market Research podcast today.


Zoë: Thank you very much for having me. It was a pleasure talking to you.


Jamin: Zoë, thank you so much for joining me today on the Happy Market Research Podcast. Everyone else, I hope you find value in this episode. As always: screen capture, share on social, tag me, and I will send you something very special- that is a Happy Market Research T-shirt. Have a great rest of your day.

Happy MR Podcast Series

Harry Brignull, Head of UX Innovation at Smart Pension on Elements of a Good Participant Question

My guest today is Harry Brignull, Head of UX Innovation at Smart Pension. 

Find Harry Online:

Web: https://www.brignull.com 

LinkedIn: https://www.linkedin.com/in/harrybrignull 

Twitter: https://twitter.com/harrybr  

Find Jamin Online:

Email: jamin@happymr.com 

LinkedIn: www.linkedin.com/in/jaminbrazil

Twitter: www.twitter.com/jaminbrazil 

Find Us Online: 

Twitter: www.twitter.com/happymrxp 

LinkedIn: www.linkedin.com/company/happymarketresearch 

Facebook: www.facebook.com/happymrxp 

Website: www.happymr.com 


“Clap Along” by Audionautix: https://audionautix.com

This Episode’s Sponsor: 

This episode is brought to you by Lookback. Lookback provides the tools to help UX teams to interact with real users, in real-time, and in real contexts. It’s Lookback’s mission to humanize technology by bridging the gap between end-users and product teams. Lookback’s customers range from one-man teams building web and app experiences to the world’s largest research organizations, collectively ensuring that humanity is at the core of every product decision. For more info, including demos of Lookback’s offering, please visit www.lookback.io


Jamin: Hey everybody. This is Jamin. You’re listening to Happy Market Research podcast. My guest today is Harry Brignull. Harry, thanks for joining me on the podcast.


Harry: My pleasure.


Jamin: So we're talking about the anatomy of a research question- a question in the context of one you'd ask a participant. So give us a little bit of background. Tell us how you wound up in research.


Harry: You know what, I went through a very traditional route, but that's because I'm a bit old. In the old days, the only way to get into research was pretty much the formal route of studying something like psychology and doing it through academia, because at the end of the '90s and the early 2000s there was no UX community, and there were no user research roles you could get in industry. So yeah, I was an academic researcher back in the day. And when we did research, we had to record it onto VHS cassettes. I remember setting up our first lab where we had S-VHS recorders and being really, really proud of it, because they were slightly higher fidelity than regular VHS. It's quite funny to look back on it now. So I've been in the business for quite a few years, as you can tell. I became a usability consultant because in those days there wasn't really a term "user experience." If you go into Google Trends and have a look at the term, it wasn't really around in the early 2000s at all. Usability was a thing, though. So that's how I got into user research: doing a lot of lab research, a little bit of eye tracking and ethnography and that sort of thing. And then I ended up going a bit design-side. I think most people in the UX industries move around a bit. So I started out with research, then went into more design, then back into research again, and now I run a design team with a mixture of all of those skills. I guess the bit that listeners are probably most interested in is when I went to work at Spotify a few years ago. I was using Lookback really intensively at Spotify. I would work from home in Brighton on the south coast, my product squads were in Stockholm, a totally different country, and the end users I was working with were in the USA. 
So it was just the perfect tool for that sort of thing, where I could sit at home in my pajamas and do research and deliver the research to my team and still be near my family and everything. The great thing about remote research, particularly in the States which is just so big, is that you can do one interview and speak to a college kid on some amazing college campus. And the next one will be someone in the trailer park eating their breakfast cereal or something, and the next one will be someone with a totally different accent and a totally different way of life. So it’s a really nice way, and really a cheap way to be able to interview lots of different people from different walks of life in a very short span of time.


Jamin: That is, sorry, my voice. Apologies. So I’m laughing about the VHS comment still. I started my career in ’96, and I did a study, I guess it would be equivalent to a Netflix sort of study, where for the company that commissioned it we had a bunch of VHS recorders, or players I guess, in the back room of the focus group facility. And they were all wired into this television set. The participant would point to a show that was on the television set, and then we would quickly try to swap out the cords so that we could play that particular show for them.


Harry: [INAUDIBLE] methodology. Right yeah.


Jamin: Yeah, it was so funny. And so, what’s really interesting to me is that in those days you basically just had market research as this broader category. User research, I guess, was a thing, but maybe a little bit more underground, at least from where I sat. But as companies created their own labs and design teams needed access to insights faster, it feels like there was this birth of, I don’t want to call it a new discipline, but essentially a new discipline: user experience research.


Harry: The amazing thing is, even now you can find yourself getting a job in a reputable company as a user researcher, and in that role you’ll meet people who have no idea what user research is and think it’s some sort of variant of market research where we use focus groups. So despite how much the world has moved on, the understanding of the subtleties and the different kinds of research, like what market research is for and what user research is for, is not widespread. I think you have to spend quite a lot of your career just explaining to people what you do and how it differs from old-fashioned 1980s market research.


Jamin: See that’s, so how do you make those distinctions to the non-professional?


Harry: I guess you’ve got attitudinal research, where you’re trying to talk to people about their attitudes about a thing. And I think when you’re doing product research, you’re looking at interaction design and things. I mean you have to observe behavior, it’s all about observed behavior. I mean that’s what Lookback is for too right? Lookback is a tool for you to screen share. So you can see people using the product, you don’t just sit there talking to them about what they would do. You give them activities and then you watch them, you quietly watch them get on with it. And user behavior is far more telling than the attitude and stuff in the context of interaction design which you’re going to look at to see how to make a thing designed better.


Jamin: Right. Kind of going back to the origination in the fighter cockpit right? Where it’s all about streamlining the controls so that the pilot would have easy access to the right stuff that’s important like speed, and altimeter and what not.


Harry: Yeah. And it turns out it’s much better to sit and watch them doing stuff and watch all the subtleties that are [INAUDIBLE] than to get them all to the market research lab and give them a paper and ask them how they feel about the cockpits. You might get some useful stuff from them, but hey, if you’re designing for behaviors, doing behavioral observation is the way to find out about it. Which is kind of obvious when you put it like that.


Jamin: Do you see, gosh, I’m so glad that we’re having this conversation, because to be quite honest with you I’ve been struggling with how, I know there are differences, like material differences. But I’ve been having a hard time articulating them. So when you think of it like that, do you see it’s almost like a Venn diagram, right? There are some interdisciplinary commonalities, such as how to ask a question. So to segue into that piece, what do you see as the elements of a good question?


Harry: Well, I made some notes here after I saw your question earlier. And my answer is going to take you a little bit off topic, but hopefully you’ll like this, because I know you’re interviewing different people and at least I have a slightly different perspective to report. I think it’s very easy to focus in on the small details of the research. Researchers can feel very safe when they focus in on small things, like the recruitment specification or the exact wording of the questions. But in my opinion, what defines good research, and then sort of cascades into the questions, is the overarching research objectives. What are you doing the research for in the first place? If you don’t get that right, the questions are inconsequential. And if you do get it right, the questions become much easier to write anyway. So what do I mean by that? Basically, it’s very common, particularly when you’ve got a new job or if you’re a junior researcher, to have someone come along, for example a product manager or product director or someone in management, and try to tell you in advance the objectives of what you should be doing your research on. And managers tend to be very feature focused, so they’re probably going to be very specific and have a very narrow brief about the one thing that they care about at that point in time. So for example, imagine you’re a researcher and you’ve got a new job, and the team you’re joining has never done any user research. And your manager, or product owner or whatever, comes along and says, “I want you to do some research on this particular dashboard that we’re building for [INAUDIBLE]. This dashboard is used by this one particular user type.” Let’s say you’ve got six user types and it’s used by one of them. So if you go and do that research, you’ll probably make that person happy. But you’ll still be kind of in the dark about the big picture. 
So what about the other five user types? What about the broader user needs? What are the most worrying or the least understood things about the problems that your product is trying to solve for users? And besides, often these sort of senior manage-y type people don’t really know what good user research is anyway. So really, like I was saying earlier, a lot of the job of the researcher is to teach the people around them how they can be engaged with in a constructive way, so the researcher doesn’t get approached with very tightly defined research questions that are overly scoped, basically. So I’ve got a metaphor here. Think of your problem space as being like a dark cave. User research is a bit like a flashlight that shines a beam into the cave so you can see what’s going on. If you went climbing or exploring and found a big dark cave, the first thing you’re going to want to do is shine your torch, shine your flashlight, around the cave to try to work out what’s in there. You’ll probably do it quite quickly, right, just to make sure that you’re safe and there are no big surprises like a bear or something. And then once you’ve done that, you might take a more focused beam and shine it at something else. You might feel like, OK, we’ve covered all that, we’ve done our first pass. Now we can focus in on that really exciting structure over there, the stalagmites and stalactites or something like that that you’re really taken with, and get really interested and focused on it. So I guess it’s a bit of a tenuous metaphor, but I think it’s really, really important to always start broad. Otherwise you can end up getting really deep into something and missing the point, because human life is multilayered. It’s always good to start out in the broadest possible way and then zoom in gradually, rather than zoom in first and miss out on some big thing that you should be working on.


Jamin: That’s the best metaphor I’ve ever heard by the way for research. That’s so on point, because you could miss, by starting narrow you could miss the most important thing that could kill you at a product level. And by starting broad, you’re able to get rid of those biases that we naturally bring into our conversations.


Harry: Yeah. Product teams are always going to [INAUDIBLE] you about the thing that they are currently working on. And there’ll probably be some unsexy thing that’s really broken that actually really matters to users, but the way in which. I’ve got my Slack open. How naughty of me to do that [INAUDIBLE] it now. So what was I going to say? If you open any UX textbook, you’ll read about types of research. You’ll probably read that there are two types: generative research and evaluative research. Generative is where you want to discover user needs before you’ve got a product, or when you’re looking at some of the big discovery-phase stuff. And evaluative is when you’ve got a design and you want the user to evaluate it. But one of my points is that I think you should always try to merge the two types a little bit, so you’re not always doing all of one or all of the other, and don’t phase it out too much. So for example, if you’re doing evaluative research, say you do have a design of the thing and you’re taking it into a research session on Lookback or whatever. It’s good to start out with some open discussion about their lives, the problem you’re trying to solve for them, how they might fit into it, and just shoot the shit a bit, basically, and talk to them in a really broad way. And equally, when you’re doing generative research, where that’s kind of the whole point of it, do let them show you the competitive products they use, do let them get into the interface and show you the things that annoy them, and sort of [INAUDIBLE] those two worlds, and just let the conversation shift naturally, and you’ll learn things that you wouldn’t have otherwise found out. So I think that, yeah, the thing about research is that there are other layers of the person’s life, other layers of context, that you can really, really miss out on. 
And if you’re really thinking about how to ask that perfect question, it may be that asking a load of fumbling, badly worded questions, but in a really relaxed environment, can get you the answers you want. I guess that’s kind of my point, actually.


Jamin: That makes a lot of sense, especially given the metaphor that we started with. When you think about the common mistakes that researchers make, if you could really kind of condense it. I know we all make mistakes. I make mistakes all the time. Every time I listen to one of my interviews, by the way, I’m like, gosh darn it. But anyway. What do you see as some of the more common interview mistakes?


Harry: I think what can often happen when you go in for an interview is you’ll have an interview script you put together. And then some stakeholders will have come along and gone, can you also ask them these things too. So you’ll have a load of tasks, and suddenly the number of tasks gets to be too many to fit in the time, and they’re loaded questions, and there are too many, and then some spurious nonsense will come in from somewhere and you’re like, well, I might as well put that in too. And then you have this script with just tons of stuff that you just don’t have time to cover. And I think a really common mistake is to cut users off and keep hurrying them up to try and get it all in. And obviously it’s not really an interview then. In fact, if you look in Zencastr here, you’ve got two audio tracks: there’s the track with me talking, where you can see the waveform, and there’s a track with you talking. Generally, what you should have is a situation where the user is talking the most and the track with the interviewer is relatively quiet most of the time, just kind of encouraging them and helping them. But if you do an interview where the interviewer is talking all the way through, all you’ve learned is more about the interviewer’s voice, which is very little. So you have to give them the time to talk, and having a very loose and open interview script is good, as is booking slots that are longer than you need. For example, if you book a bunch of 90-minute interviews and know that most of them will only take an hour, then you’ll never need to rush anyone. Obviously you’ll have to pay a bit more and you’ll get a few [INAUDIBLE], but that lack of rushing means that you’ll get more of a natural conversation with them. 
I remember in one agency I worked at, we had this running joke that you’d always learn the single most important insight from a participant after the session ends, on their way out of the building. It’s weird how often that came true. I once did some research for a finance company. They were building a [INAUDIBLE] for financial advisors. I think we’d done something like 20 hours of interview time, and we were all exhausted. And one of the participants on their way out went, “I heard that financial advisors would never use this tool anyway.” So somebody asked them about it, and it turned out that most of the people we had recruited were too senior and wouldn’t actually use the software we were designing at all; they’d just get someone else on their team to do it. We were trying to design a product for the users, but the people we’d recruited weren’t the users, they were a different kind of stakeholder entirely. So we’d failed to go broad first. We didn’t really understand the organizations that we were trying to sell this product to, we didn’t understand who was going to use it, and we basically had to start again. But in a way it was good. That person had the opportunity to have that little conversation with us, and it changed the direction. Maybe we did waste a couple of weeks, but it allowed us to end up with a useful thing, knowing who it was for and selling it to the right people. All of that cascaded from having the space to have that conversation in the first place.


Jamin: Oh my God. Yes, that is so true. The post-recording phase, right? All of a sudden, the truth comes out. It’s crazy.


Harry: Exactly, exactly.


Jamin: So common mistakes. What do you see as common mistakes, whether it’s with colleagues or peers or as we’ve been in the industry doing research for more than a little while. What do you see as sort of the common pitfalls in asking a question?


Harry: I think just zeroing in, the kind of thing I’ve been saying all the way through: zeroing in too specifically on very specific questions, asking closed questions when you could be asking open questions. I mean, closed questions are OK, but you shouldn’t just ask a series of closed questions with nothing else, because then you could just do a survey or something if you’re going to be like that. If you’re going to do interviews, the beauty of an interview is the richness of insight that you can get. The whole point of research is to be surprised and to have your mind changed about something. So you have to structure the interview in a way that allows that to happen. You can’t ask people a load of minute questions where you itemize everything so small that you’re really just guiding them through a series of thoughts. And typically when you’re doing product research and people are trying out a product, instead of giving them one open-ended task like, try this out and see what you think, if you give them a dozen very small things, and each of those things corresponds with one of the features, you’re basically telling them what to do. And if you tell them what to do, you’re guiding them through the interface. It’s a bit like Socratic questioning, where you’re educating them and guiding them through a process by asking a series of highly structured interview questions where each task corresponds with another bit of the interface. So it’s much better not to ask them very much at all and let them muddle through of their own accord. Have a look at this product and see what you think. And then I would go, oh, you want to look at that? Try it and see, try it and see, and let them make their own way.


Jamin: It’s so hard. It’s so hard.


Harry: Yeah, I know. It’s very easy to sit here and talk about how you ought to do research. But once you’re in the research session, it all sort of changes. Especially when you’ve got lots of people watching.


Jamin: Yeah and you want that participant to feel successful right? So there’s this natural human inclination to help them, to try and aid them in their-


Harry: Letting them fail, that’s a really good point. Yeah. It’s that horrible awkwardness and the really pregnant pauses you get where someone is totally struggling with the interface and they have no idea what to do next. And your natural inclination is to help them or to give them a tiny little tip as to what to do next. But seeing somebody fail completely and give up is absolutely vital feedback for everyone involved in the product. It’s better to let them fail completely and say, “That’s it. I would give up at this point.” Take a note and go, that task basically [INAUDIBLE]. After you’ve got that data point, you can then step in and say, right, OK, in that case I’ll explain to you X, Y, and Z, and you can continue the interview. But if they can’t use the product, if it’s signing up for a new credit card and they couldn’t complete the sign-up process, for example, everything that happens subsequently in the interview is kind of a moot point, really. You’ve got to go and fix that one thing.


Jamin: Right. That’s such an important point that is so easily missed. That could be the most important point, it could happen in the first 5 minutes of the interview. You’ve still got 55 minutes left, so you feel obligated to kind of push through.


Harry: Yeah. And you don’t want to make them feel sad. You don’t want to end the interview after 10 minutes going, we’re all done here.


Jamin: Especially for the client.


Harry: Yeah, exactly. They’re paying for it as well, so you do have a duty to soldier on through the interview for everyone’s sake. But when it comes to the actual research findings, if it’s a big long task and they fail it near the beginning, everything else they say thereafter comes only after you’ve given them some guidance. And the guidance you give them as the researcher won’t be there in real life for the hundreds of thousands of users that are using that product. The user has to stand on their own two feet, and if they can’t, that’s the absolute most vital feedback there is, and everybody on the product team needs to know it.


Jamin: Do you have a favorite bad question that maybe you’ve seen maybe recently?


Harry: I don’t know. I remember once doing some research where you have the stakeholders in the room, and one of the stakeholders would rap his fingers on the table like this when the user didn’t answer the question. Yeah, we were doing some research on time tracking companies in Munich, because the tech was a stumbling piece of tech that you kind of had to be in the room to see working. So that didn’t go well. Basically, sometimes you need to keep the stakeholders far away. And I often find that, I know some researchers like to have a chat window open and let people ask questions during the research. I absolutely will not abide that; as the researcher, they can all get lost. They can write notes and stuff and I’ll talk to them afterwards. But having that extra channel of input while you’re trying to run an interview is just mind-meltingly annoying.


Jamin: You’ll appreciate this, considering I think we started around the same time. The note under the door, right?


Harry: Well, as the manager now, I sometimes have to go into the research room when, say, the screen is broken or some weird thing has happened where we can’t hear the audio or something. Sometimes that happens in old-fashioned labs. So I have to go in there and pretend I’m not watching it live from the screen. You just have to go, oh, I heard there are some technology issues, can you try speaking into your brooch, or some nonsense like that. But yeah, it can be quite awkward. Even if the viewing room door is just left open when the participant comes in and they see a massive audience of people all sitting there bolt upright with clipboards, it doesn’t really look good [INAUDIBLE] 


Jamin: Harry, so tell me about your current business.


Harry: So, let’s see, I’ve actually got a few things that I do at the moment. I think probably the thing that your listeners would find most interesting is my work on dark patterns. I basically invented dark patterns, well, no, that’s not the right word. I didn’t invent them, I discovered them and gave them a name, sort of, in 2010. So I set up this website, darkpatterns.org, and it’s become a bit of a meme. Everybody uses that term now when they talk about deceptive interfaces and deceptive [INAUDIBLE]. For quite a few years I thought it was just a hobby thing; I’d go and do talks on it and run the website and the Twitter feed a bit, and I thought that was that. But just recently, last year, I started providing expert witness services, which is fascinating. It’s the intersection between psychology, UX design, and the law. So if someone’s doing a big class action lawsuit and they need an expert to analyze and describe the nature of the deceptive interface, I would get hired to write a report and then give my position.


Jamin: So trick questions, sneak into basket, roach motel, pricy. The website is fascinating; I cannot wait to dive in.


Harry: Yeah, yeah. So I’m going to be doing a little bit of user research. I’m looking for a couple of agencies to partner with, actually, but I’m going to be doing some research looking closely at how dark patterns can work. So for example, how you can hide something in plain sight in the user interface, and design it in such a way that users don’t notice it, but because it’s on the page it sort of makes it legal. One of the most famous examples of that was Ryanair. I don’t know if you’ve heard it, so I’ll describe the example. On Ryanair, which is a low-cost airline in Europe, you could go into the checkout as you’re buying your ticket, and it would ask you a question like, let me just see. Oh yeah. What is your country of residence? And if you answered the question directly, it would buy insurance for you, because somewhere around it, it said that selecting a country of residence will cause you to buy insurance. If you didn’t want it, you had to go into the dropdown and pick some sort of option between [INAUDIBLE] that said please don’t insure me. You’d have to pick that option and it [INAUDIBLE]. So they came up with this design of hiding the real nature of the question in plain sight on the page; it was written right there. But just the way people scan, they scan right past it and don’t see it. So I’m really interested in doing a bit of user research, or a few bits of user research, on getting really under the skin of how dark patterns actually work and the psychology of them.


Jamin: That is so Ryanair. Right? That’s like such a cultural fit for that brand. That’s hilarious and terrifying at so many levels. Yeah well, thank you very much for being on the show. If somebody wants to get in contact with you, how would they do that?


Harry: Well, you could google me. If you go to sort of my personal [INAUDIBLE].com, you’ll see various different ways of contacting me and various different things that I do. And I would welcome anyone to get in touch. I’m quite active on Twitter, actually, so if anyone Tweeted me or DM’d me, I’d probably talk to them at length if they wanted to.


Jamin: My guest today has been Harry Brignull. Harry, thank you so much for being on the Happy Market Research Podcast.


Harry: My pleasure.


Jamin: Everyone else, please take the time to screen capture and share this on social media; I’d really appreciate it. I’d love it if you would tag Harry and myself; we would enjoy it on Twitter.


Issue 5 – Happy MRx Podcast Newsletter

Last week I had the honor to give the keynote at the Insights Association’s combined chapter event in Las Vegas.

My longtime friend Ellen Pieper, the Chief Client Officer at Research Results, gave me a pretty tall order:

  1. State of the industry
  2. Biggest challenges facing researchers
  3. Highlight trends in marketing research
  4. Practical tactics for growth

And, finally, “Jamin promises that each attendee will walk away with a takeaway that they can integrate into their company right away!”

The good news is that I had one hour to cover all this … this, of course, was also the bad news as I could build an entire two-day workshop around these important issues.

This week’s post is a highlight of my talk along with a link to the actual presentation. So, here we go!

Slide 1: The single biggest growth hack is listening. I realize how this sounds. It assumes we, you, prefer talking to listening. In sales, we feel like talking is our superpower. Instead, open each customer conversation up with this, “What is the biggest problem you are facing today?”

Slides 2-4: Companies are prioritizing consumer insights like never before. Why? Because good customer relationships are worth a fortune. And, the only way you can build that relationship is to listen.

Slides 5-8: Market research is now part of our everyday language, humor, and culture.

Slides 10-11: The key to sales is trust. How do you create trust? Prioritize the customer over your quota. If you can’t solve the problem, connect the customer to someone who can.

Slides 12-14: Triangulating truth is at the core of consumer insight success. Survey data is just part of what needs to be layered into your insights story. And, the better the story the more the insights impact actions.

Slides 15-17: A major point of differentiation is to operate in full transparency. With the rise of black-box research through automation, end-users need to understand how it all fits together. In the end, the company that wins the automation race will be the one that provides the clearest view into the assumptions, processes, and financials of their model.

Slides 18-20: Better, faster, cheaper. Pick three. We have to be in a constant state of improving.

Slides 21-23: Reduce your research into a repeatable story. The measure of a good story is how well it can be retold at the water cooler.

Slides 24-26: Bringing customers close to executives increases empathy. There is a movement to have executives involved in the actual research. Not the whole thing … but a few customer interviews or in-home visits are a great way to help them connect.

Slide 27: Introduce four ways to increase engagement in your customer and stakeholder meetings.

Slide 28: Highlight Reel: Give a face and voice to your insights. There are many free tools you can use, such as iMovie, to stitch together testimonials.

Slide 29: Tufte-Style Meeting: ~6-page paper. Start meeting with 10 minutes of reading in silence. The remainder of the meeting is discussing the implications.

Slide 30: Posters > PPT: Use a flip chart instead of PowerPoint. Secure each “slide” on the wall. Have people walk around the hung pages to discuss.

Slide 31: Code Open-ends: Split up into groups and have stakeholders code a few open ends. Tools: Excel, Dovetail, etc. Use the same set of codes.
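For teams that want to script a first pass before the group session, open-end coding with a shared codebook can be sketched in a few lines of Python. The codebook, keywords, and responses below are hypothetical examples, and a keyword match is only a rough starting point; the real value comes from stakeholders applying the same set of codes by judgment.

```python
# A minimal sketch of first-pass open-end coding with a shared codebook.
# The codes and keyword rules here are hypothetical examples.

CODEBOOK = {
    "price": ["expensive", "cost", "price", "cheap"],
    "usability": ["confusing", "easy", "hard to use", "intuitive"],
    "support": ["support", "help", "service"],
}

def code_response(text):
    """Return the sorted list of codes whose keywords appear in a response."""
    lowered = text.lower()
    return sorted(
        code
        for code, keywords in CODEBOOK.items()
        if any(keyword in lowered for keyword in keywords)
    )

responses = [
    "The app was confusing and customer support never replied.",
    "Great value, not expensive at all.",
]

# Map each raw response to its codes for review in the stakeholder session.
coded = {response: code_response(response) for response in responses}
```

The important design choice is the shared `CODEBOOK`: whether the coding happens in Excel, Dovetail, or a script, every group tags against the same code list so the results can be merged.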

That’s it for this week! Now, on to the news. 

Big News!

We have started a podcast network called the Market Research Mafia Podcast Network.

The Market Research Mafia is a podcast network that offers the market a one-stop-shop for market research and user experience podcasts. The podcast network came online to help listeners find consumer insight specific podcasts.

So far, we have 5 podcasts and expect to add another 20 in the coming months.

I hope you have a great weekend! As always, I’d love to hear what you think about this content or if you have ideas on stuff you’d like us to cover. You know where to find me, @jaminbrazil, on any network … now including TikTok. LOL!


GreenBook has offered us, Insights Nation, a unique 30% discount on any of their events. I’ll be at IIeX North America … podcasting. Along with a few friends! If you are going, please use the discount code HAPPYMR and let me know if you’d like to grab a drink! Maybe I should host something one night? 

Have a fun weekend! 


About the Happy Market Research Newsletter: 

A highly editorialized recap of the week in consumer insights. We cover trends, happenings, and tips in market research, user experience research, and customer experience. 

Happy MR Podcast Podcast Series

Josh LaMar, Co-Founder of Authentique UX on Elements of a Good Participant Question

My guest today is Josh LaMar, Principal Researcher and Co-Founder at Authentique UX.

Find Josh Online:

Email: josh@authentiqueux.com 

Web: http://www.authentiqueux.com 

LinkedIn: https://www.linkedin.com/in/joshlamar/

Facebook: https://www.facebook.com/authentiqueux/ 

Find Jamin Online:

Email: jamin@happymr.com 

LinkedIn: www.linkedin.com/in/jaminbrazil

Twitter: www.twitter.com/jaminbrazil 

Find Us Online: 

Twitter: www.twitter.com/happymrxp 

LinkedIn: www.linkedin.com/company/happymarketresearch 

Facebook: www.facebook.com/happymrxp 

Website: www.happymr.com 


“Clap Along” by Auditionauti: https://audionautix.com

This Episode’s Sponsor: 

This episode is brought to you by Lookback. Lookback provides the tools to help UX teams to interact with real users, in real-time, and in real contexts. It’s Lookback’s mission to humanize technology by bridging the gap between end-users and product teams. Lookback’s customers range from one-man teams building web and app experiences to the world’s largest research organizations, collectively ensuring that humanity is at the core of every product decision. For more info, including demos of Lookback’s offering, please visit www.lookback.io


Jamin: Hey everybody, this is Jamin. You’re listening to the Happy Market Research Podcast. My guest today is Josh LaMar. He is live from Brazil. Josh, how are you today?


Josh: Hello. I am doing very well. It’s a nice warm afternoon in Brazil.


Jamin: San Pablo?


Josh: Yes.


Jamin: Specifically?


Josh: São Paulo.


Jamin: So tell us a little bit about yourself. You’re a UX researcher, how in the world did you wind up in that job, and what kind of customers are you working with?


Josh: Well, I didn’t initially set out to be a researcher. And it’s funny, because I did my undergraduate degrees in music composition and English poetry, which are seemingly as far away from research as you can get. But then I pivoted a little bit and went to grad school in human-centered design and engineering. And I’ve been playing with this idea throughout my career of the role of creativity in science, and how important it is to use creativity to create research methodologies and to answer questions that we have about our users. It ended up being a really great fit for me.


Jamin: That’s a big chasm between those two things, right?


Josh: Yeah.


Jamin: To say the least. How did you bridge it professionally?


Josh: I started out thinking, what can I do with my English degree? And I started out by getting a certification in technical editing, thinking that I would become an editor. And then afterwards, I found this master’s program at the University of Washington, like I mentioned, human-centered design and engineering. And when I finished the program, I was like, oh, I’m qualified to do research now, and people are way more interesting than commas.


Jamin: That’s true, I guess.


Josh: They’ll always surprise you, and you just have to be open to seeing, what’s gonna happen when you meet this new person? And the act of doing research and going and talking to people, and especially visiting people in their homes, it’s really fascinating for me. And it’s interpersonal and it’s empathetic. And that’s, I think, what draws me and keeps me doing research.


Jamin: The topic for today’s discussion is centered around questions, the anatomy of a question as it relates to research. It’s funny, because as I’ve done this interview now a few times, I’ve realized that my first question is actually a flawed question. It’s hilarious. The type of question that we’re talking about here is really at an interview level, right?


Josh: Correct.


Jamin: So, how you would ask or frame a question in such a way that it would be considered useful to gather information from a participant.


Josh: I think that the way that you frame a question is very, very important because you have to be at the right level. And what I mean by level is that, if you start off an interview by saying like, well, tell me how you check your email on the weekends, you’ve just scoped it so narrowly and really, you might be interested in something else. I was the research manager at Outlook for several years, so I can use email as a really easy example of things that I’ve done research on a lot. So it’s really important to start very broad and then move into the specific. And an example of a broad question might be, tell me how you communicate with your friends and family, much broader than just email. And then as you start getting into it, you’ll find more interesting things. The framing is so important because when you frame too narrowly, you put this box around the user. And the user thinks, I think that they want to hear just this part, and so they only share the things that are in that box. But when you add a broader box from the beginning, then everything else is open. And you might find something that’s even more interesting just by asking a broader question.


Jamin: So it’s thinking of it in a way like a funnel, is how you’re framing it. And so you don’t start with the specifics, you start with the broader. And do you find that helps it-? Let me reframe the question. What are some of the primary benefits of starting broader and then narrowing through discovery?


Josh: I think that the biggest benefit is that you don’t limit the response into the mental model or mindset of the user, or of your own mental model and mindset. As soon as you start limiting yourself, you’re limiting what you’re gonna hear and you’re focusing too narrowly. I think that by broadening it first, you really start to understand the landscape a bit better. And then from there, you’re able to dig into the things that are interesting. And it’s not necessarily the things that are interesting to you, it’s the things that are interesting to the user as well. And that’s what’s more interesting and more important when it comes to doing research. It’s what they think, not what I think and how I compartmentalize all these different ideas together in my brain.


Jamin: How have you framed this out with your customers or your constituents inside of-thinking about like Microsoft?


Josh: Actually, similarly, I think I explain it a lot just like that. Typically, when I’m explaining how I come up with a research methodology, I’ll be talking about the approach, the research approach. And I’ll start broad, like I mentioned, and then narrow in on the things that the team is more interested in. So for example, in doing research on email, it would start with, hey, how do you communicate with people? And then maybe it might go to, where does email fit into the rest of your communication mediums that you use? And then it might go into more specific things that the team is interested in, like testing this particular feature in the email application itself.


Jamin: That makes a lot of sense. And I can imagine that you’ve had some interesting-like, understanding the context of the usage, thinking about email, probably has been as insightful as some of the initial objectives of the more narrow research.


Josh: Yeah. And I think that another really interesting part about this, when you’re thinking about how you’re framing a question, is that, when you start framing it too narrowly, you can end up priming the user to talk about specific things that they think that you want to hear. Whereas, if you’re starting more broadly, you’re not priming them in any way. You’re starting with like, how do you communicate? In general, and not, what do you think about this particular feature? And that way allows for the unexpected to happen in the interview, which is the most interesting part.


Jamin: So that’s the piece to dive into, right? It’s when the unexpected happens in an interview, that’s the most interesting part. That’s exactly right. Because I’ve done a fair amount of qualitative and quantitative and you sort of, after-I failed, I really failed. I did terribly at research for like three years. And what would happen is, we would do these focus groups around the nation or whatever. And after about three, it kind of felt the same. Like, every focus group started feeling like the same conversation over and over again. My failure in that was I wasn’t starting with this broader context to then narrow it down. Instead, I sort of like developed my assumptions or story and then after that, I basically started framing my questions to self-fulfill this narrative. Sorry to everybody I charged 25 years ago. They did get some value, but you get my point.


Josh: Yeah, I think that that’s what happens too, is you’re essentially telling the user, this is what I think, tell me what I think and in your own way. Or you’re saying, tell me, just reinforce what I already think. And that’s a huge problem as a researcher because then we just start reinforcing our already held beliefs. And like I mentioned earlier, this idea of what’s interesting is what is brand new and what is unexpected. Like, you always know in the back of your mind, well, we want to find out about these five things, or we have this feature that’s gonna do this thing, so we want to understand if people care about this. But then when you’re open to allowing people to share about other related things and you’re not priming them to say, oh, he cares about this feature. It’s more about, well, what do you care about? What are the problems that you have when you’re doing this, in general? There could be problems that we don’t even know about, that if we don’t ask, then we’re spending all this time trying to build this feature that does this thing that people don’t really have that much of a problem with. And so it allows us to really find the latent needs a lot better.


Jamin: It takes a lot of energy, right, when you’re in the field doing the interviews and then trying to find-discover the new or the unknown. It’s an amazing drain.


Josh: It really is.


Jamin: It’s funny, it’s even physically exhausting, which is like-the pre-researcher Jamin, would have had no idea that that was true. So, do you have any tips or ways that you maintain your energy?


Josh: One of the things that I think is really important is to not overload a research schedule. So, yeah, you could do like eight interviews in a day, eight one-hour interviews, right? That’s eight hours. That’s a day of work. But honestly, it’s ten times more work than I would do. And it’s so much better to just pace things out in such a way that you are able to have a break to think about nothing, to check social media and decide that you’re going to look at silly cat videos. Things like that are actually really good for your brain because your brain needs a break between all of this really intense thinking that you’re doing. I think that junior researchers are often surprised by how difficult it is. For many years, I was coaching a usability study course at the University of Washington, my former grad program. And I’d be there in the lab with them for the first time that they’ve ever done a usability session ever, ever. And we’d talk about it afterwards and they’d always say, oh, wow, that was so much harder than I thought it was gonna be.


Jamin: Right.


Josh: You can make all the mistakes you want because it doesn’t really matter, this is just like practice. But they always are surprised at how difficult it really is, because you do have to be on top of so many things all at once. You have to know what the goal of the research is. You have to know about the different features that you’re trying to test, what decisions the team is trying to make in order to-like, this research has to inform these business decisions. You have to know what those are and then follow up on them and ask around them. That’s my other favorite technique: ask around things, but don’t ask about them directly. And so there’s this back and forth that goes on where you’re like, well, what do you think about this? And, tell me about how this works for you or not. And you have to kind of like dance around the topic until at some point, if they just never get it or never think about it, then you can ask it directly at the end. But there’s this dance that you’re dancing around the, what matters to you, without telling them what you want to hear about.


Jamin: That is so hard to do because it feels like the wrong question when you’re in the moment and having that. And there’s such a risk-a perception of risk, I should say, because your client may be present or may look at the transcripts and they care about the answer to their business question. Not-thinking about Outlook, not broader communication themes that may involve things like, in the old days, mail or what have you, or faxes. I’m definitely dating myself. So the risk there is that-or the perception of the risk, I should say, is that, gosh, am I gonna be able to meet the objectives in the allotted time that I have to meet with a participant? But really the risk of not doing that is much greater, because you ultimately may not get to the undiscovered truth.


Josh: I think that over time, I’ve realized that I trust the system, I trust the process. That when you create this opportunity for people to talk about what’s interesting to them, they will start doing it. And if they go off-topic, a little off-topic is fine and you can always bring them back. My favorite trick for that is to ask a yes or no question, and then take control of the conversation and ask about something different.


Jamin: Oh, that’s so clever. Thank you for that hack.


Josh: It’s like, oh, do you do that every day? Yeah, I do. Great. Let’s shift gears for a minute, and I want to talk about this other area now.


Jamin: That is the quote of the episode right there, that is gold.


Josh: It is. Just even understanding like, this is an open-ended question, this is a closed-ended question. I’m gonna use them very intentionally. So I’m gonna ask an open-ended question when I want to just hear you talk about anything that you have to say about whatever topic it is. And then a closed-ended question could be used to close things off and say, yes, no, that’s it. And then you take control, shift gears, and then you’re on to the next topic. There’s also a really interesting moment when you’ve done several interviews in a row, I think it’s around five interviews. If you did like a usability test of ten people or an interview with ten people, by around five or six people, you know pretty much what they’re gonna say. It starts off with like, the first few are, oh, yeah, that’s really interesting. Around three and four, you’re like, these are some themes that are coming out. Around five to seven, you’re like, this is exactly what you’re gonna say. And then by seven through ten, you know exactly what they’re gonna say. And there is this moment where they say enough and you’re like, you said the exact same thing, we can move on. And so that’s why the final interviews always go so much faster, because you can already hear the same thing and you’ve heard them say everything that everybody else said.


Jamin: It is uncanny. And then the temptation there is, for me anyway, is to treat that as a quant study because you feel like you’ve definitively answered for that particular segment, right?


Josh: Yeah, that external validity point is really important to reiterate.


Jamin: For sure. We’ve covered this a bit already, but what are some common mistakes that you’ve seen in framing questions for an interview?


Josh: Yeah. I think that the biggest mistake is to either ask a leading question or to frame it too narrowly first. We’ve talked about framing narrowly first, so I guess we could talk about leading questions now. Which are things like, tell me how amazing this product is. That’s an over-exaggeration, but it can be much more subtle too. Like, if you’re only asking about the positive aspects of something or you’re saying, oh, this is a really great feature, isn’t it? Well, what did you just do there? You told the user-you primed them, number one, to say, I like this feature, and then I created this tag question like, isn’t it? Don’t you agree with me? You should agree with me because I’m the smart one here. You just made the user feel dumb, and then you also told them exactly what you want to hear. So what are they gonna do? They’re gonna tell you what you want to hear because they want to make you happy. And it’s so important as a researcher, to be very neutral and to ensure that you’re not letting too much of your own feelings ever come out. Because as soon as you start letting on like, this is really dumb, isn’t it? Yeah, I don’t really use this, but we need to test this for our client. Can you just tell us that thing? You’re throwing out the whole study’s data if you do that because it’s too leading; you don’t want to lead them on to the answer. The answer is what they think, not what you think.


Jamin: You’ve seen a lot of questions, what is one of the worst?


Josh: So tough. It’s gonna be anything where it’s too, too leading like that. Like, oh, this is great, isn’t it? I used to coach PMs and designers at Microsoft on my former team on how to talk to users, and I have a whole presentation that I would give on how to talk and not talk to users. Because really it’s about listening, it’s not about talking. And whenever I’ve seen these examples, I will call it out after the session and say, oh, hey, when you said this, it was pretty leading. So the best thing that you can do, if you find yourself asking a leading question, is to ask the opposite next. So if you said like, oh, tell me how great this is, you could say, is there anything that isn’t as great about this? Or, how was this difficult to use? The better thing would be to say, how easy or difficult was this to use? Or, tell me about your experience using this thing. Those are much more neutral responses and questions. So you can try to be as neutral as possible, but then if you ever find yourself being a little bit more leading, just ask about the opposite end of the scale. It’s really all about being neutral, that’s so important.


Jamin: That’s probably, for me, the biggest energy drain in interviews because I like people, fundamentally. I just like people and I want to connect with them, so it’s like-I don’t know, sort of an innate desire for me. I’m sure it’ll take a lot more counseling before I get to the root cause of that. And so one of my big energy drains in interviews is maintaining neutrality as opposed to building a relationship or some level of rapport.


Josh: I think it’s really important to develop just enough rapport in order to have that person feel comfortable with talking to you. And you’ll have a little back and forth or chit chat about the weather from the elevator to your lab or something like that, it’s totally fine. You want them to be comfortable in sharing their deepest, darkest secrets with you. And if they don’t feel comfortable with that, there’s no way they’re gonna share anything that really matters to them. But I think that it’s also important to keep the focus on that other person. Even if you’re thinking, oh my god, me too. I feel like that all the time. You, uh-huh, tell me more about that. Neutral, and you can think it without sharing that. And as a really empathetic person, I feel that same thing that you just talked about, where I really want to connect with the person because I feel their pain. And I feel like that’s one of the best things that a researcher can be is just empathetic with people’s pain. Because it’s not easy, using technology is not easy and we try to make it easier, and that’s really important. And then when you see people, you see them struggle, usually the worst is in a usability study where you see like, you know how the team decided to build the feature, and you know why they didn’t put the button in the place that the user is looking for the button, and then you just see them go in circles. I’ve seen people go in circles looking for things and never find it. And it’s like, it’s sad, but this is the kind of thing that the team needs to see in order to understand, oh, hey, this is really confusing. And then after the session is over, there’s always that moment after you leave, after the cameras are off and you’re like, thank you so much. I felt the same way about these things that you talked about too. And then you can have that human moment when it’s all over, when it’s not getting in the way of the research.


Jamin: That’s so funny that you say that. It’s almost like there’s delayed gratification and that we just need to make sure that we install it as a discipline. And you’re right, that level of rapport that you build at the beginning is critical because you need to be able to get to the deep stuff. But you have to refrain from congratulatory responses to answers that they give, because all of a sudden you start creating that feedback loop of fishing, I guess.


Josh: And I’ve seen that thing where-in a usability study, I’ve seen other researchers say like, oh, yeah, that was very good. Great, next, and they’re always too positive and they start getting to this level of congratulating the user for their feedback. Or when they say something negative, always saying, oh, thank you so much, that was so helpful. It’s maybe a thing that we’re thinking, but you have to tone it down a little bit like, thank you for your feedback. Very different than, oh my gosh. And I find that I have to tone down my exaggeration quite a bit.


Jamin: That’s why being a podcast host is way more fun, that’s why I can be. So, you’re a UX consultant, right?


Josh: Yes.


Jamin: If somebody has a UX job to offer, what type of jobs do you usually take? What does the ideal customer look like, and how can they get in contact with you?


Josh: Well, my husband and I just started a UX research company: we are Authentique UX. Authentique is a combination of authentic plus boutique. Plus, it is the word authentic in French, and so that’s where it came from. authentiqueux.com is our website. To get in touch with us, my email is josh@authentiqueux.com. That’s A-U-T-H-E-N-T-I-Q-U-E-U-X.com. And yeah, we do research, UX research. We’re based in Paris, and so we can easily do stuff in Paris. But my husband is from Brazil, and so we’ve done research in Brazil as well. And yeah, we’re pretty open in terms of methodologies, but we’re really looking at trying to find the interesting depth of whatever question it is that we come across, because there’s always more there. And we have a team of people actually that are-they all have over ten years of experience. And so we’re really looking at, how can we do these things very, very well and answer those deeper needs that people have and also address the business questions?


Jamin: My guest today has been Josh LaMar. Josh, thank you so much for joining me on the Happy Market Research Podcast.

Josh: Thank you. It’s been great to be here.


Jamin: Everybody else, if you would, please take time to screen capture, share, and tag us on LinkedIn and Twitter.

MRx News

Maru/Blue Launches a Panel Management Research Tool

Maru/Blue, a premium data services firm, launches Maru/Blue Forums, a value-driven panel management research tool powered by Maru’s proprietary technology, Maru/HUB. This gives clients the option to either manage and execute community research themselves or have assistance from Maru/Blue’s insights team.

In today’s mergers and acquisitions, Clarivate Analytics, an insights and analytics company, completes its acquisition of Decision Resources Group, a healthcare research and consulting company, for $950 million. 

In human capital news, EMI Solutions hires former Abt Associates and Kantar executive, Beth Teehan, as its new Chief Operating Officer. 

Interpublic Group appoints Chad Engelgau as Chief Executive Officer of Acxiom, a data, technology and marketing services company. 

In jobs, Cranbrook Search Consultants is looking to place a remote senior product manager in a SaaS market research company.



Clarivate Analytics: https://clarivate.com/news/clarivate-analytics-closes-acquisition-of-decision-resources-group/

Maru/Blue: https://www.marublue.com/in-the-news/launch-of-forums-proprietary-panel-technology-platform-to-clients

EMI Solutions: https://emi-rs.com/2020/02/26/emi-adds-to-executive-team/

Interpublic Group: https://www.globenewswire.com/news-release/2020/02/28/1992901/0/en/Acxiom-Transitions-to-New-Global-CEO-Following-Successful-Integration.html

Cranbrook Search Consultants: https://www.cranbrooksearch.com/open-jobs/#!/47abc9b0-f1da-417c-9b76-9d714dfa4480/detail 
