My guest today is Harry Brignull, Head of UX Innovation at Smart Pension. 

Find Harry Online:

Web: https://www.brignull.com 

LinkedIn: https://www.linkedin.com/in/harrybrignull 

Twitter: https://twitter.com/harrybr  

Find Jamin Online:

Email: jamin@happymr.com 

LinkedIn: www.linkedin.com/in/jaminbrazil

Twitter: www.twitter.com/jaminbrazil 

Find Us Online: 

Twitter: www.twitter.com/happymrxp 

LinkedIn: www.linkedin.com/company/happymarketresearch 

Facebook: www.facebook.com/happymrxp 

Website: www.happymr.com 

Music:

“Clap Along” by Audionautix: https://audionautix.com

This Episode’s Sponsor: 

This episode is brought to you by Lookback. Lookback provides the tools to help UX teams interact with real users, in real time, and in real contexts. It’s Lookback’s mission to humanize technology by bridging the gap between end users and product teams. Lookback’s customers range from one-man teams building web and app experiences to the world’s largest research organizations, collectively ensuring that humanity is at the core of every product decision. For more info, including demos of Lookback’s offering, please visit www.lookback.io


[00:00:03]

Jamin: Hey everybody. This is Jamin. You’re listening to the Happy Market Research podcast. My guest today is Harry Brignull. Harry, thanks for joining me on the podcast.

[00:00:10]

Harry: My pleasure.

[00:00:11]

Jamin: So we’re talking about the anatomy of a research question, a question in the context of one you’d ask a participant. So give us a little bit of context. Tell us how you wound up in research.

[00:00:23]

Harry: You know what, I went through a very sort of traditional route. But that’s because I’m kind of a bit old. In the old days, the only way to get into research was pretty much the formal route of studying something like psychology and doing it through academia, because at the end of the 90s and the early 2000s there was no UX community, there were no user research roles that you could get in industry. So yeah, I was an academic researcher back in the day. And when we did research, we had to record it onto VHS cassettes. I remember setting up our first lab where we had S-VHS recorders and being really, really proud of it because they were slightly higher fidelity than regular VHS. It’s quite funny to look back on it now. So I’ve been in the business for quite a few years, as you can tell. I became a usability consultant because in those days there wasn’t really a term “user experience.” If you go into Google Trends and have a look at the term “user experience,” it wasn’t really around in the early 2000s at all. Usability was a thing, though. So that’s how I got into user research: doing a lot of lab research, a little bit of eye tracking and ethnography and that sort of thing. And then I ended up going a bit design side. I think most people in the UX industry move around a bit. So I started out with research, then I went into more design, then back into research again, and now I run a design team with a mixture of all of those skills. I guess the bit that listeners are probably most interested in is when I went to work at Spotify a few years ago. I was using Lookback really intensively when I was at Spotify. I would work from home in Brighton on the south coast, my product squads were in Stockholm, a totally different country, and the end users I was working with were in the USA.
So it was just the perfect tool for that sort of thing, where I could sit at home in my pajamas, do research, deliver it to my team, and still be near my family and everything. The great thing about remote research, particularly in the States, which is just so big, is that you can do one interview and speak to a college kid on some amazing college campus, and the next one will be someone in a trailer park eating their breakfast cereal or something, and the next one will be someone with a totally different accent and a totally different way of life. So it’s a really nice way, and a really cheap way, to interview lots of different people from different walks of life in a very short span of time.

[00:02:36]

Jamin: That is, sorry, my voice. Apologies. So I’m laughing about the VHS comment still. I started my career in ’96, and I did a study, I guess it would be the equivalent of a Netflix sort of study, where the company that commissioned it had us set up a bunch of VHS recorders, or players I guess, in the back room of the focus group facility. They were all wired into this television set. The participant would point to a show that was on the television set, and then we would quickly try to swap out the cords so that we could play that particular show for them.

[00:03:19]

Harry: [INAUDIBLE] methodology. Right yeah.

[00:03:22]

Jamin: Yeah, it was so funny. And so, what’s really interesting to me is that in those days you basically just had market research as this broader category. I guess user research was a thing, but maybe a little bit more underground, at least from where I sat. But as companies created their own labs and design teams needed access to insights faster, it feels like there was this birth of, I don’t want to call it a new discipline, but a new discipline, which is user experience research.

[00:03:55]

Harry: The amazing thing is, even now, you can find yourself getting a job in a reputable company as a user researcher, and in that role you’ll meet people who have no idea what user research is and think it’s some sort of variant of market research where we use focus groups. So despite how much the world has moved on, the understanding of the subtleties of the different kinds of research, like what market research is for and what user research is for, is not widespread. I think you have to spend quite a lot of your career just explaining to people what you do and how it differs from old-fashioned 1980s market research.

[00:04:30]

Jamin: So how do you make those distinctions to the non-professional?

[00:04:38]

Harry: I guess you’ve got attitudinal research, where you’re trying to talk to people about their attitudes toward a thing. But when you’re doing product research, you’re looking at interaction design, and you have to observe behavior; it’s all about observed behavior. I mean, that’s what Lookback is for too, right? Lookback is a tool for you to screen share, so you can see people using the product. You don’t just sit there talking to them about what they would do. You give them activities and then you quietly watch them get on with it. And user behavior is far more telling than attitudes in the context of interaction design, where you’re trying to work out how to make a thing better designed.

[00:05:17]

Jamin: Right. Kind of going back to the origination in the fighter cockpit, right? Where it’s all about streamlining the controls so that the pilot would have easy access to the important stuff, like speed and the altimeter and whatnot.

[00:05:32]

Harry: Yeah. And it turns out it’s much better to sit and watch them doing stuff and catch all the subtleties that are [INAUDIBLE] than to get them all into the market research lab, give them all a paper, and ask them how they feel about the cockpits. You might get some useful stuff from them, but hey, if you’re designing for behaviors, doing behavioral observation is the way to find out about it. Which is kind of obvious when you put it clearly like that.

[00:05:53]

Jamin: Gosh, I’m so glad that we’re having this conversation, because to be quite honest with you I’ve been struggling with this. I know there are differences, like material differences, but I’ve been having a hard time articulating them. And when you think of it like that, it’s almost like a Venn diagram, right? There’s some interdisciplinary commonality, such as how to ask a question. So to segue into that piece, what do you see as the elements of a good question?

[00:06:23]

Harry: Well, I made some notes here after I saw your question earlier. My answer is going to take you a little bit off topic, but hopefully you’ll like this, because I know you’re interviewing different people and at least I have a slightly different perspective to report. I think it’s very easy to focus in on the small details of the research. Researchers can feel very safe when they focus in on small things, like the recruitment specification or the exact wording of the questions. But in my opinion, what defines good research, and then cascades into the questions, is the overarching research objectives. What are you doing the research for in the first place? If you don’t get that right, the questions are inconsequential. And if you do get it right, the questions become much easier to write anyway. So what do I mean by that? Basically, it’s very common, particularly when you’ve got a new job or if you’re a junior researcher, to have someone come along, for example a product manager or product director or someone in management, and try to tell you in advance the objectives of what you should be doing your research on. Managers tend to be very feature focused, so they’re probably going to be very specific and have a very narrow brief about the one thing that they care about at that point in time. So for example, imagine you’re a researcher, you’ve got a new job, and the team you’re joining has never done any user research. And your manager, or product owner or whatever, comes along and says, “I want you to do some research on this particular dashboard that we’re building for [INAUDIBLE]. This dashboard is used by this one particular user type.” Let’s say you’ve got six user types and it’s used by one of them. If you go and do that research, you’ll probably make that person happy. But you’ll still be kind of in the dark about the big picture.
So what about the other five user types? What about the broader user needs? What are the most worrying or the least understood things about the problems that your product is trying to solve for users? And besides, often these senior manage-y type people don’t really know what good user research is anyway. So really, like I was saying earlier, a lot of the job of the researcher is to teach the people around them how they can be engaged with in a constructive way, so they don’t get approached with very tightly defined research questions that are overly scoped, basically. So I’ve got a metaphor here. Think of your problem space as being like a dark cave. User research is a bit like a flashlight that shines a beam into the cave so you can see what’s going on. If you did go climbing or exploring and found a big dark cave, the first thing you’d want to do is shine your torch, your flashlight, around the cave to try to work out what’s in there. You’d probably do it quite quickly, right, just to make sure that you’re safe and there are no big surprises like a bear or something. And then once you’ve done that, you might have a more focused beam and shine it at something else. You might feel like, OK, we’ve covered all that, we’ve done our first pass. Now we can focus in on that really exciting structure over there, the stalagmites and stalactites or something like that, and get really interested and focused on it. So I guess it’s a bit of a tenuous metaphor, but I think it’s really, really important to always start broad. Otherwise you can end up getting really deep into something and missing the big picture. Because human life is multilayered, it’s always good to start out in the broadest possible way and then zoom in gradually, rather than zooming in first and missing some big thing that you should be working on.

[00:09:49]

Jamin: That’s the best metaphor I’ve ever heard for research, by the way. It’s so on point, because by starting narrow you could miss the most important thing, the thing that could kill you at a product level. And by starting broad, you’re able to get rid of those biases that we naturally bring into our conversations.

[00:10:15]

Harry: Yeah. Product teams are always going to [INAUDIBLE] you about the thing that they are currently working on. And there’ll probably be some unsexy thing that’s really broken that actually really matters to users, but the way in which. I’ve got my Slack open. How naughty of me to do that, [INAUDIBLE] it now. So what was I going to say? If you open any UX textbook, you’ll read about types of research. You’ll probably read that there are two types: generative research and evaluative research. Generative is where you want to discover user needs before you’ve got a product, or when you’re looking at the big discovery phase stuff. And evaluative is when you’ve got a design and you want the user to evaluate it. But one of my points is that you should always try to merge the two types a little bit, so you’re not always doing all of one or all of the other, and don’t phase it out too much. So for example, if you’re doing evaluative research, say you do have a design of the thing and you’re taking it into a research session on Lookback or whatever, it’s good to start out with some open discussion about their lives, the problem you’re trying to solve for them, how they might fit into it, and just shoot the shit a bit basically and talk to them in a really broad way. And equally, when you’re doing generative research, where that’s kind of the whole point of it, do let them show you the competitive products they use, do let them get into the interface and show you the things that annoy them, and sort of [INAUDIBLE] between those two worlds and just let the conversation shift naturally, and you’ll learn things that you wouldn’t otherwise have found out. So the thing about research is that there are layers of the person’s life, layers of context, that you can really, really miss out on.
And if you’re really thinking about how to ask that perfect question, it may be that asking a load of fumbling, badly worded questions, but in a really relaxed environment, can get you the answers you want. I guess that’s kind of my point, actually.

[00:12:00]

Jamin: That makes a lot of sense, especially given the metaphor that we started with. When you think about the common mistakes that researchers make, and I know we all make mistakes. I make mistakes all the time. Every time I listen to one of my interviews, by the way, I’m like, gosh darn it. But anyway. What do you see as some of the more common interview mistakes?

[00:12:22]

Harry: I think what often happens when you go in for an interview is you’ll have an interview script you’ve put together. And then some stakeholders will have come along and gone, “Can you also ask them these things too?” So you’ll have a load of tasks, and suddenly there are too many tasks to fit in the time, and they’re loaded questions, and there are too many of them, and then some spurious nonsense will come in from somewhere and you’re like, well, I might as well put that in too. And then you have this script with just tons of stuff that you don’t have time to cover. A really common mistake is to cut users off and keep hurrying them up to try to get it all in. And obviously it’s not really an interview then. In fact, if you look in Zencastr here, you’ve got two audio tracks: there’s the track with me talking, and you can see the waveform, and there’s a track with you talking. Generally, what you should have is a situation where the user is talking the most, and the track with the interviewer should be relatively quiet most of the time, just encouraging them and helping them. If you do an interview where the interviewer is talking all the way through, all you’ve learned is more about the interviewer’s voice, which is very little. So you have to give them time to talk. Having a very loose and open interview script is good, and so is booking slots that are longer than you need. For example, if you book a bunch of 90-minute interviews and know that most of them will only take an hour, then you’ll never need to rush anyone. Obviously you’ll have to pay a bit more and you’ll get a few [INAUDIBLE], but that lack of rushing means that you’ll get more of a natural conversation with them.
I remember in one agency I worked at, we had this running joke that you’d always learn the single most important insight from a participant after the session ends, on their way out of the building. It’s weird how often that came true. I once did some research for a finance company. They were building a [INAUDIBLE] for financial advisors. I think we’d done something like 20 hours of interview time; we were all exhausted. And one of the participants on their way out went, “I heard that financial advisors would never use this tool anyway.” So somebody asked them about it, and it turned out that most of the people we’d recruited were too senior and wouldn’t actually use the software we were designing at all; they’d just get someone else on their team to do it. We thought we were designing a product for the users, but they weren’t the users, they were a different kind of stakeholder entirely. We’d failed to go broad first. We didn’t really understand the organizations that we were trying to sell this product to, we didn’t understand who was going to use it, and we basically had to start again. But in a way it was good. That person had the opportunity to have that little conversation with us and it changed the direction. Maybe we did waste a couple of weeks, but it allowed us to end up with a useful thing, knowing who it was for and selling it to the right people. All of that cascaded from having the space to have that conversation in the first place.

[00:15:04]

Jamin: Oh my God. Yes, that is so true. The post-recording phase, right? All of a sudden, the truth comes out. It’s crazy.

[00:15:12]

Harry: Exactly, exactly.

[00:15:15]

Jamin: So, common mistakes. What do you see as the common pitfalls in asking a question, whether with colleagues or peers, now that we’ve been in the industry doing research for more than a little while?

[00:15:33]

Harry: Kind of what I’ve been saying all the way through: zeroing in too specifically on very specific questions, asking closed questions when you could be asking open questions. I mean, closed questions are okay, but you shouldn’t just ask a series of closed questions with nothing else, because then you could just do a survey or something if you’re going to be like that. If you’re going to do interviews, the beauty of an interview is the richness of insight that you can get. And the whole point of research is to be surprised and to have your mind changed about something, so you have to structure the interview in a way that allows that to happen. You can’t ask people a load of minute questions where you itemize things so small that you’re really guiding them through a series of thoughts. Typically, when you’re doing product research and people are trying out a product, instead of giving them one open-ended task like “try this out and see what you think,” if you give them a dozen very small things, and each of those things corresponds with one of the features, you’re basically telling them what to do. And if you tell them what to do, you’re guiding them through the interface. It’s a bit like Socratic questioning, where you’re educating them and guiding them through a process by asking a series of highly structured interview questions, where each task corresponds with an extra bit of the interface. So it’s much better not to ask them very much at all and let them muddle through of their own accord. Have a look at this product and see what you think. And then I would go, oh, you want to look at that? Try it and see, try it and see, and let them make their own way.

[00:17:05]

Jamin: It’s so hard. It’s so hard.

[00:17:06]

Harry: Yeah, I know. It’s very easy to sit here and talk about how you ought to do research. But once you’re in the research session, it all sort of changes. Especially when you’ve got lots of people watching.

[00:17:18]

Jamin: Yeah and you want that participant to feel successful right? So there’s this natural human inclination to help them, to try and aid them in their-

[00:17:30]

Harry: Letting them fail, that’s a really good point. Yeah. It’s that horrible awkwardness and the really pregnant pauses you get where someone is totally struggling with the interface and has no idea what to do next. Your natural inclination is to help them or give them a tiny little tip as to what to do next. But seeing somebody fail completely and give up is absolutely vital feedback for everyone involved in the product. It’s better to let them fail completely and say, “That’s it. I would give up at this point.” Take a note and go, that basically tasked [INAUDIBLE]. After you’ve got that data point, you can then step in and say, right, OK, in that case I’ll explain X, Y, and Z to you, and you can continue the interview. But if they can’t use the product, if it’s signing up for a new credit card and they couldn’t complete the sign-up process, for example, everything that happens subsequently in the interview is kind of a moot point. You’ve got to go and fix that one thing.

[00:18:25]

Jamin: Right. That’s such an important point, and it’s so easily missed. The most important point could happen in the first five minutes of the interview. You’ve still got 55 minutes left, so you feel obligated to push through.

[00:18:39]

Harry: Yeah. And you don’t want to make them feel sad. You don’t want to end the interview after 10 minutes going, we’re all done here.

[00:18:44]

Jamin: Especially for the client.

[00:18:50]

Harry: Yeah, exactly. They’re paying for it as well, so you do have a duty to soldier on through the interview for everyone’s sake. But when it comes to the actual research findings, if it’s a big long task and they fail it near the beginning, everything they say thereafter comes after you’ve given them some guidance. And the guidance you give them as the researcher won’t be there in real life for the hundreds of thousands of users using that product. The user has to stand on their own two feet, and if they can’t, then that’s the most vital feedback there is, which everybody on the product team needs to know.

[00:19:25]

Jamin: Do you have a favorite bad question that maybe you’ve seen recently?

[00:19:31]

Harry: I don’t know. I remember once doing some research where you had the stakeholders in the room, and one of the stakeholders would rap his fingers on the table like this when the user didn’t answer the question. We were doing some research on time tracking for companies in Munich, because the tech was a stumbling piece of tech that you kind of had to be in the room to see working. So that didn’t go well. Basically, sometimes you need to keep the stakeholders far away. And I know some researchers like to have a chat window open and let people ask questions during the research. I absolutely will not abide that as the researcher; they can all get lost. They can write notes and stuff and I’ll talk to them afterwards. But having that extra channel of input while you’re trying to run an interview is just mind-meltingly annoying.

[00:20:24]

Jamin: You’ll appreciate this, considering I think we started around the same time. The note under the door, right?

[00:20:33]

Harry: Well, as the manager now, I sometimes have to go into the research room when, say, the screen is broken or some weird thing has happened where we can’t hear the audio or something. Sometimes that happens in old-fashioned labs. So I have to go in there and pretend I’m not watching it live on the screen. You just have to go, “Oh, I heard there are some technology issues, can you try speaking into your brooch,” or some nonsense like that. But yeah, it can be quite awkward. Even if the viewing room door is simply left open when the participant comes in and they see a massive audience of people all sitting there bolt upright with clipboards, it doesn’t really look good [INAUDIBLE]

[00:21:18]

Jamin: Harry so tell me about your current business.

[00:21:21]

Harry: So let’s see, I’ve actually got a few things that I do at the moment. I think probably the thing that your listeners would find most interesting is my work on dark patterns. So I basically invented dark patterns. Well, no, that’s not the right word. I didn’t invent them; I discovered them and gave them a name, around 2010. I set up this website, darkpatterns.org, and it’s sort of become a bit of a meme. Everybody uses that term now when they talk about deceptive interfaces and deceptive [INAUDIBLE]. For quite a few years I thought it was just a hobby thing; I’d go and do talks on it and run the website and the Twitter feed a bit, and I thought that was that. But just last year, I started providing expert witness services, which is fascinating. It’s the intersection between psychology, UX design, and the law. So if someone’s doing a big class action lawsuit and they need an expert to analyze and describe the nature of the deceptive interface, I would get hired to write a report and then give a deposition.

[00:22:30]

Jamin: So trick questions, sneak into basket, roach motel, pricy. The website is fascinating; I cannot wait to dive in.

[00:22:46]

Harry: Yeah, yeah. So I’m going to be doing a little bit of user research. I’m looking for a couple of agencies to partner with, actually, but I’m going to be doing some research looking closely at how dark patterns work. For example, how you can hide something in plain sight in the user interface, and design it in such a way that users don’t notice it, but because it’s on the page it sort of makes it legal. One of the most famous examples of that was Ryanair. I don’t know if you’ve heard of it; I’ll describe the example. On Ryanair, which is a low-cost airline in Europe, you could go into the checkout as you’re buying your ticket, and it would ask you a question like, let me just see. Oh yeah: “What is your country of residence?” And if you answered the question directly, it would buy insurance, because somewhere around it, it said selecting your country of residence will cause you to buy insurance. If you didn’t want it, you had to go into the dropdown and pick an option somewhere between [INAUDIBLE] that said please don’t insure me. You’d have to pick that option and it [INAUDIBLE]. So they came up with this design of hiding the real nature of the question in plain sight on the page; it was written right there. But just the way people scan, they scanned right past it and didn’t see it. So I’m really interested in doing a bit of user research, or a few bits of user research, on getting really under the skin of how dark patterns actually work and the psychology of them.

[00:24:05]

Jamin: That is so Ryanair. Right? That’s like such a cultural fit for that brand. That’s hilarious and terrifying at so many levels. Yeah well, thank you very much for being on the show. If somebody wants to get in contact with you, how would they do that?

[00:24:27]

Harry: Well, you could google me. If you go to my personal [INAUDIBLE].com, you’ll see various different ways of contacting me and the various different things that I do. And I would welcome anyone to get in touch. I’m quite active on Twitter, actually, so if anyone Tweeted me or DM’d me, I’d probably talk to them at length if they wanted to.

[00:24:47]

Jamin: My guest today has been Harry Brignull. Harry, thank you so much for being on the Happy Market Research podcast.

[00:24:53]

Harry: My pleasure.

[00:24:55]

Jamin: Everyone else, please take the time to screen capture and share this on social media; I'd really appreciate it. We’d love it if you would tag Harry and myself; we would enjoy that on Twitter.