Ep. 236 – Ray Poynter on How Understanding the Business Stakeholder’s “Why” Creates More Actionable Research

My guest today is Ray Poynter, ESOMAR Council Member and Founder of NewMR. Founded in May 2010, NewMR organizes online events and the LinkedIn NewMR group. NewMR is managed by Ray Poynter and Sue York, Research Strategist and Chief Curator.

Ray has spent the last 40 years at the intersection of research, innovation, and business, having been involved in the development of CAPI, online systems, online surveys, and social media research. 

Find Ray Online:

LinkedIn: www.linkedin.com/in/raypoynter/ 

Twitter: www.twitter.com/RayPoynter

Website: www.newmr.org

Find Jamin Online:

Email: jamin@happymr.com 

LinkedIn: www.linkedin.com/in/jaminbrazil

Twitter: www.twitter.com/jaminbrazil 

Find Us Online: 

Twitter: www.twitter.com/happymrxp 

LinkedIn: www.linkedin.com/company/happymarketresearch 

Facebook: www.facebook.com/happymrxp 

Website: www.happymr.com 

This Episode’s Sponsor: 

This episode is brought to you by HubUx. HubUx reduces project management costs by 90%. Think of HubUx as your personal AI project manager, taking care of all your recruitment and interview coordination needs in the background. The platform connects you with the right providers and sample based on your research and project needs. For more information, please visit HubUx.com.


[00:00]

On Episode 236 I’m interviewing Ray Poynter, ESOMAR council member and founder of NewMR. But first a word from our sponsor. 

[00:11]

This episode is brought to you by HubUx. HubUx is a productivity tool for qualitative research. It creates a seamless workflow across your tools and team. Originally, I came up with the idea as I was listening to research professionals in both the quant and qual space complain about, and articulate the pain around, managing qualitative research. The one big problem with qualitative is that it’s synchronous in nature, and it requires 100% of the attention of the respondent. This creates a big barrier and, I believe, a tremendous opportunity inside of the marketplace. So what we do is take the tools that you use and integrate them into a workflow so that, ultimately, you enter your project details, that is, who you want to talk to, when you want to talk to them, and whether it’s a focus group (in-person or virtual), IDIs, or ethnos; and we connect you to the right people at the times that you want to have those conversations or connections. Push-Button Qualitative Insights, HubUx. If you have any questions, reach out to me directly. I would appreciate it. Jamin@HubUx.com

[01:35]

Hi, I’m Jamin Brazil, and you’re listening to the Happy Market Research Podcast. My guest today is Ray Poynter, ESOMAR Council Member and founder of NewMR. Founded in May 2010, NewMR organizes online events and the LinkedIn NewMR group. NewMR is managed by Ray Poynter and research strategist and chief curator, Sue York. Ray has spent the last 40 years at the intersection of research, innovation, and business, having been involved in the development of CAPI, online systems, online surveys, and social media research. Ray, thanks for joining me on the Happy Market Research Podcast today. 

[02:12]

Pleasure to be here. Thanks for having me. 

[02:14]

You are an industry sage is how I would cast you, right? Anytime I see you pop up on my LinkedIn feed, I always click on it, read it, whatever, digest it. And you’re prolific in terms of your visibility in the marketplace. I’m really interested. Tell us about the young Ray. Like where did you grow up? What’d your parents do? How has that impacted your career? 

[02:42]

So, I’m from Nottingham in the United Kingdom, and I grew up in a mining village. My father worked at the local coal mine. He was first a lorry driver, then later a manual worker on the surface. And my mother by the time that I was born was a full-time mother and a part-time cleaner. And so, that’s where I hail from in the UK.

[03:05]

I love the humble beginnings aspect of that. It’s an interesting juxtaposition because ESOMAR Council, at least from a brand perspective, is fancy, right? You’re the face of the industry to the governance of the world. What lessons did you learn growing up in that environment that you’ve since applied to help propel your career? 

[03:33]

I guess the most important one is that most people are not like market researchers. In the UK, amongst younger people, nearly half go to university, but we need to remember that, across the whole country, the majority don’t, if we take in the older groups. The vast majority do not look at advertising and marketing and think the way we think about things. So coming from a background and a culture where most people did not go to university (I think three people in my class at high school went to university), it’s a useful reminder that there are lots and lots of different kinds of people, and it’s one of the reasons why you need market research. You can’t understand people if you start thinking they’re like you. 

[04:28]

One of the things that I found interesting when I started learning about sampling is that about 30% of the U.S. population lives in 90% of the land. And it really put a finer point on where we need to make sure that we have a representative sample. So if we have all of our sample, or a large proportion, coming from a specific geography, then we could be missing out on a meaningful point of view, because there are massive differences both psychologically, in the way people view the world from their upbringing, and socioeconomically, and that has such a big impact on new product adoption or ad resonance or what have you. I think it is a really important point that it’s easy to get into our silos when you think about market-research focus-group locations and forget about, I’ll call it, the common man. 

[05:20]

Yeah, Main Street or whatever you want to call it, absolutely. 

[05:25]

What is the biggest challenge that you’ve overcome either personally or professionally?

[05:29]

Well, actually, it is related to that issue, and it was understanding that other people are not necessarily the same as me. So, as an employer or as a team leader in my early years, I would create environments that I would have wanted if I were them. And for some of them that really was quite distressing. So I would change the layout of the office two to three times a year because people like change. No, Ray likes change. People generally don’t. And it was applying my research skills to the people around me, as opposed to just when I was being paid to use them. That really was quite a breakthrough. 

[06:12]

How long did that take? That isn’t an overnight discovery.

[06:17]

No, no. As the people who’ve ever been in my teams would attest, I’d got to my thirties before I really got ahold of that. 

[06:27]

So the thirties are this interesting point in the careers of a lot of the people that I’ve interviewed, where, broadly speaking, it feels like a decade of self-discovery, and that tends to be the point of inflection where their careers move up directly after that. The thing that I’ve seen is that people who decide not to develop that self-awareness often wind up paying for it at their own expense long term. But going through that process can be a really uncomfortable situation, at least from my vantage point. I realized that when I first did the… Do you remember UberConference? They show the percentage of time each speaker speaks. And for like six months, I just started analyzing how much time I was speaking, and I worked consciously on decreasing that amount of time; 40% was my goal, which was a big, epic challenge for me. And it still is, which is why I’m still talking. OK. So, tell me a little bit about the research project that you’re most proud of.

[07:42]

Well, this is kind of a curious one. I don’t use the word “proud.” Blame it on my Methodist upbringing. I lost my faith in my teens, but I’ve kept most of the habits: a sort of distrust of gambling and alcohol, with an avoidance of the word “proud” being in that. So let’s think about “pleased.” Most times, I’m most “pleased” with the project I have just done, and I’m most looking forward to my next one. So, characteristics: it’s solved a problem. Quite often these days, if I’m brought in on a project, it’s because the project has gone wrong. I don’t do normal projects where they’re set up; people contact me and say, “Ray, this has gone adrift. Can you come and help us?” And I’m just doing one at the moment, which was a really nice piece of work by an agency for a large international company, but they’d got some crossed wires between them, and they just needed to put a few things straight, and then that unfolds. And when you see it coming together, that’s fantastic, or when you discover something that the client didn’t know and you’ve got these new facets. I can remember one many, many years ago, so I can talk about it now – a project for Whirlpool in Europe. We did a multi-country conjoint study of the white-goods market, and we identified that what they thought was the structure wasn’t the structure, and what they needed to do was this. And that was really useful. I should add, it’s only really pleasing when they accept your advice. I can think of a project for a candy manufacturer where we found out something that was really useful. We were able to give them a warning: “If you do this, the product will fail.” They ignored it. They did it, the product failed, and they’ve never asked me for advice again. 

[09:39]

Oh, that’s funny. What kind of tips could you give our audience on how to help the client turn their insights into action?

[09:49]

Find out what the business problem is. We focus too much on the research problem, and you can answer the research problem without helping the business. So if you’ve been told, “Can you test these three ads?” find out why they want to test them and what action they are thinking of taking when they’ve got the results. Why are they testing them now and not previously or later? Why these three and not more? How do they think these ads are going to perform? With whom are they going to be successful? So the more you can find out about what the stakeholders further up the chain, further up from the insight manager, really need to do, the more you can help them. Now, sometimes when you do that, you end up telling them, “You know what? I don’t think you need this research, because actually either way you’re going to have to do this, or either way it isn’t going to work, or we’ve done something similar before and we can show you what the likely outcomes are.” So the more you understand, the more likely it is that you will be able to give advice that results in action, as opposed to describing the data that you’ve collected. 

[11:03]

So, I started my career in the nineties, mid-nineties. Do you feel like there’s been a trend of moving more towards actual intent of discovery and actionability in research, as opposed to maybe a few decades ago? It seemed, from my vantage point (but I’m not casting it across the world), to be a little bit more about just supporting an existing decision that had already been made. 

[11:32]

No. I go back to the seventies, and quite often then you’d have massive discovery because of new products, new categories. I remember when cooking sauces were launched, and one of the brands in the UK spent more on advertising than the sector took in revenue, because it was all about, “Can we create a sector which is fundamentally different?” And they were able to, and they used research to plot their way through that process and guide what was going on. There was a difference in how research was done then to how it’s done now, and more decisions probably had to be made with that research. It was more expensive, and it was more time-consuming, and it took longer in terms of elapsed time. So somebody would write you a letter saying, “Can we have a meeting?” And you would write back and say, “Yes, how about this date?” because we didn’t all have telephones on every desk. You had telephones in every office but certainly not on every desk. And then you would start to arrange the meeting, and you’d find out they wanted to test something, and then you would agree a price, and then you would start printing questionnaires and posting questionnaires around the country. And that elapsed time allowed you to get a much broader understanding of the context for the research. So I think there has probably always been a mix of validation research, discovery research, innovation research. People like Peter Cooper, who was such an innovator in qualitative research into the nineties… You’d look at Wendy Gordon and Colleen Ryan, enormously important qualitative researchers, who were doing fantastic discovery about what people want, what the real motivations and drivers are. Things like need states were developed in the nineties as a method of understanding people. So I think we’ve always had that mix. What one is exposed to in any particular period of time changes, but the whole industry has got most of this going on somewhere. 

[13:45]

That’s actually really interesting when you think about the speed of discovery that’s happening, or the speed to insight, right now. We’ve gone from (again, my framework is the nineties) CATI and in-mall intercepts as the primary two data collection methods that we used. And in that framework a project would take, in the fastest scenario, three weeks, maybe six; three to six weeks was pretty normal for ad hoc research. And now it’s gone down to literally days; Zappi is a great example of hours. Right? How is that changing the decision-making process? Does it mean that the researcher is subsequently just being cut out more and more from the decision-making process and the integration into that decision-making process? Or do you think they’re still involved in that for the important stuff, but the internal brand manager, or whoever’s commissioning and conducting the research at speed, is doing sheer volumes more? 

[14:44]

There are certainly sheer volumes more. There are some interesting patterns around that. When I talk to insight managers who are pretty comfortable about their future, which is not all of them by any means, they are mostly switching from being order takers to being planners. So they’re going around the business saying, “We think you need to look at this. We think we’ve got an answer to your problem.” “I don’t have a problem.” “You do.” “Let me tell you what the problem is, and then I’ll tell you what the answer is.” So there’s that sort of pro-action going on with client-side insight managers in the best organizations: the shift from order takers to planners. Along with that, there is the facilitation of really quick answers. We need to know how many people drink cappuccinos in the afternoon in the major metropolitan areas. You don’t want to wait a week for that. 

Ideally, you Google it and find out whether that information already exists, or you use your KnowledgeHound and your information system. And if you don’t, then you want some way of getting that answer quickly. And you don’t care whether that is a piece of data analytics or a super-fast survey; whatever it is, provided it gives a sensible answer to the process. And then the next step is to use that speed and those lower costs to allow iterative development. So rather than telling somebody to develop an ad or develop a concept, and at the end of it, we will tell you “pass” or “fail,” we will sit alongside you as you go through and allow you to test more ideas quickly, so you can keep changing the product through to something that’s actually going to fly and is going to win. 

So we can reduce the risk of failure by working alongside people all the way through. And that sort of iterative process is growing. And the other thing which is happening, and it’s most obvious where people are using online insight communities is that every single project is fast. But if you’re working with the same team of researchers and client-success managers for year after year, they start to really understand your business, and you can go back to this community of people and ask additional short questions and then build those answers back into the bigger picture. So the way that we gather the information is increasingly between the projects, not during the project. 

[17:19]

I really like this point that you’re making. I think it’s vital that we, as researchers, understand this. And the other part that supports the methodology is that a human being just can’t give 15 minutes of solid focus to completing a project. It’s really, really hard to get that kind of attention in the context of a survey; I do think it works in video-based IDIs, that sort of thing. But the benefit of touching people over time and then building out that respondent record, almost creating a longitudinal point of view on the human, definitely turns that one-off into an ongoing asset for further leverage. Are you seeing companies combine technology like community management and knowledge-management platforms like KnowledgeHound? Is that becoming part of the arsenal of the internal brand researcher? 

[18:22]

It is. It’s coming in. It’s still got quite a way to go, so more data goes into the system than comes out, and there is more work to do here. KnowledgeHound is another good example; there are plenty of platforms out there. We are going to see, I think, some good developments in those over the next few years, so that your first port of call will be to ask, “Do we already know the answer to that?” And if we don’t, what additional thing do we need to know? Because it’s unlikely that we’re going to need a full survey’s worth, since we should already know quite a bit about that. 

[19:07]

Yeah, data visibility and accessibility across your organization is probably one of the biggest problems that we have, because data really is not treated as a renewable resource inside of the organization. It’s largely treated as a one-off. The big problem is that every platform (and there are, according to Lenny Murphy, over 600 now) has its own data schema. So we don’t have a common language across research platforms, data-collection platforms (and I’m including qual and quant there, by the way), to systematically know that this question is a gender question, for example, or age, or a diagnostic. Do you think that’s one of the big white spaces that exists in the space?

[19:54]

It is. And it’s not just in the research space. Take Tim Berners-Lee’s Semantic Web: he thought that was going to be the next big breakthrough, because it would make the web so much more useful if every piece of data told you what it was. The difficulty is that it would have required the people putting the data in to describe it for the benefit of people later on who would use it, which is a bit like timesheets: it requires the people entering the data to do it for the benefit of other people who will use the information. So it didn’t happen. What we’re beginning to see are AI tools being used for the tagging and the coding that will make sense of that data. But we will need to see market researchers move away from their preferred data structures, because their preferred data structures tend to be rectangles, tables of data.

And if you look at the way the world is changing, it’s mostly JSON. It’s to do with strings of information where you have a lot of information from one person and not very much from another, and it’s lots of different things. So we’re going to have some interesting work, probably over the next 20 to 30 years, in creating structure that unifies unstructured information and then uses appropriate analytical techniques. Because once you get near the marketing scientists, they try to cram the data back into their rectangles, where everybody has answered the same questions in the same format, and it’s just not going to be available in that format.
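Ray’s contrast between ragged JSON-style records and the rectangular tables researchers prefer can be sketched in a few lines of Python; the field names here are invented purely for illustration:

```python
# Two respondents as JSON-style records: per-person "strings of information"
# rather than a fixed set of columns. (Hypothetical fields.)
raw = [
    {"id": 1, "age": 34, "events": [{"type": "purchase", "item": "coffee"},
                                    {"type": "review", "stars": 5}]},
    {"id": 2, "events": [{"type": "purchase", "item": "tea"}]},
]

def to_rectangle(records, columns):
    """Cram unstructured records into a fixed table, padding gaps with None."""
    return [[r.get(c) for c in columns] for r in records]

# The traditional view: a rectangle where everyone has the same columns,
# at the cost of gaps where data was never collected.
print(to_rectangle(raw, ["id", "age"]))  # [[1, 34], [2, None]]
```

Note that the variable-length `events` detail is dropped entirely by the rectangle, which is exactly the loss Ray is describing.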

[21:38]

It’s fascinating when you think of it like that, especially with the timeframe that you’ve laid out, which is pretty long, but you’re seeing it now with NLP and machine learning supporting the structuring of unstructured data, helping with social listening, for example. One of the things you’ve done is come out pretty strongly on surveys decreasing, from a methodology perspective, relative to global market research turnover, and, at the same time, qualitative increasing. How have you seen that play out over the last, what has it been, I guess, four-ish years?

[22:20]

Probably slightly more. The decline in surveys has been a little bit slower than I thought it would be. It’s been quite a clear decline, but it’s been slow. And I think that is because the panel companies have done an awesomely good job at driving down price and driving up flexibility, speed, convenience, targetability, and so on. So, surveys are what you do when there isn’t a better, faster, cheaper option. And their decline is because other things are coming in. Now the most obvious one is passive data, which has come through really, really strongly, where we want to know what did you watch and when did you watch it, and where did you travel and how many journeys have you had. That is increasingly being collected as passive information, as it should be.

We’re seeing all sorts of quite interesting things with facial recognition, where poster boards are looking at people as they pass to see how many people look at the poster board. Because in the old world we’d do a survey to ask, “Did you notice that poster board?” Well, it’s not a great methodology. You’re going to give that methodology up as soon as you’ve got a better alternative. So that’s the survey side. The qual side is really interesting. Initially, it was a hypothesis, but we’re seeing it come about. Qual has traditionally been about 15% of all market research spend, but if you look at the people who fill in the GRIT survey, who are not a representative sample of the industry (they tend to be at the front end of the industry), they have a much bigger proportion of their time devoted to qual. If you go to a data analytics conference, you will find an amazing amount of the conversation is around this qualitative issue of understanding, because the more data you have about what people do, the more you scratch your head and say, “Why? Why on earth are they doing that?” They are buying electric cars and then not using them very much, and the car is depreciating. They’d be better off doing this. They’re financially looking after this, but they’re not looking after that. Why? How would we create that message? And for that you need qualitative tools. So, I saw a fantastic presentation at ESOMAR APAC last week in Macau, where a British researcher, Crawford Hollingworth, had been working with the Australians, because every time there is a big typhoon or something, there’s an outpouring of sympathy, and lots of wonderful people send tangible things: beds, tents, buckets, teddy bears. And the problem with that is they clog the airports. They then have to be contained; somebody has to pay to put them into containers; and then they usually end up in landfill. They’ve got lots of data about who did it and why they did it and so on.

But to understand how to create a message that was going to say, “Send cash, not stuff,” they needed to use qualitative research and behavioral economics to get that insight. So the data told them what the problem was, what triggered the problem, how to recognize the problem. And that was relatively inexpensive and relatively straightforward. But what was more difficult (it would have been more expensive, but I think there was a lot of pro bono in it) was the qualitative assessment. And so, we will see good qualitative; we will see semiotics and ethnography and this real understanding develop. I was talking to someone recently about what AI analysis of video is going to do for qualitative researchers. And I said it’s going to be fantastic, because what will happen is you will get thousands of hours of video and you will ask the AI, “OK, I want you to do some unsupervised topic modeling to find out the main types of breakfast eating situations that exist.” “OK, boss, I’ve done that.” “Right, OK, now I want you to show me some clips which are absolutely typical of each of those.” And I will then work out what they mean, because that’s the division of labor between the machine and the human.

[26:55]

We’re seeing this emerge right now. I’ll be at the Insights Association’s next conference in Chicago in a few weeks, and one of the speakers, whom I’ve recently interviewed, actually talked about this issue. With AI-empowered insights, it’s like the Google algorithm, where it can start self-reinforcing bad behavior, which, fortunately, is part of the dialogue now as it relates to machine learning: thinking about what’s important, what you should show, and what insights should be taken out of a vast amount of data based on user behaviors. In the Google example, it’s, obviously, click rate. And there’s a famous white paper which illustrated that there was a clear bias towards men CEOs versus women CEOs. It was just a function of the algorithm reinforcing what user behavior was signaling as important, as opposed to what actually is important, which is truth. Anyway, I think this is a very fascinating point of view. Besides AI and machine learning, how is voice going to play in the market? When I say voice, I actually mean Alexa and Google Home. 

[28:17]

I think there are probably a few unknowns. The first one is: are people going to get freaked out? Is it listening to stuff that it shouldn’t be listening to? So that has got to be reconciled. The really nice use one could envisage of it, if we get all the trust in place, is a question about… So, when I go to bed I say, “Hey, Google, turn off the lights,” and it’s just actually turned them on. 

[28:51]

I know. It’s awesome. 

[28:52]

It will know what time I go to bed normally. So there’s no reason why we shouldn’t have a voice-activated question in there, where I’ve signed up to have a certain number of these in return for some sort of benefit: “Why are you up later today?” or “Why are we turning the lights out earlier today?” Or my temperature control is connected to my internet of things; so is the carbon-monoxide detector. We can envisage putting in really simple questions like that. Not too many per person. You know, I would probably be tolerant of a question most days, but I wouldn’t want to get several questions each day, like “Did you have a cappuccino today, Ray?” and, if so, “Where did you get it from?” But those questions are going to be super short, so they will have to be part of a program of collecting data from people through their devices, through some surveys, through some voice activation. So it is probably only going to be a small part of the existing mix. I know that some people think surveys are going to be voice-activated. That only works for… We have an English expression, “Billy no-mates,” for somebody who is on their own. If you’re in a family setting, you are not going to sit there and talk to Google Home and answer a four-minute survey. You are not necessarily going to do it when you’re watching TV. Yes, you might do it when you’re eating. So it’s going to be a marginal piece. We’ll probably see it as an option. 

Where I have seen it produce something quite interesting is where you are looking for open-ended text. Getting people to speak into the right sort of device at the right sort of time gives you more, and quite interesting, text, but the right time is not when you’re on the train. If you look at the time of day when most surveys are completed, they’re completed during the working day, when people are cyberloafing, and that is not amenable to voice data collection. So it’s going to be a useful addition. I don’t think it’s going to be a mainstream game changer. 

[31:26]

It’s interesting how, as you’re casting it, it really becomes a part of this whole micro-survey or data-collection point of view where you’re building out that respondent record, because a one-off is a lot less interesting than a thousand data points in context, where this is just a few more that we’re appending to that particular respondent record. 

[31:49]

That’s right. If you imagine you get a compliant person who is happy to take part in this, then every time they leave a restaurant, they get a beep and they speak into their phone very quickly: “Yeah, I’d give that a five out of seven. The food was good, but the service was lousy.” And you build that up over time so that you find out, “OK, this guy always says the service was lousy,” so that’s pretty much irrelevant to the advice we’re going to give the management, but “the food was really good” matters.

[32:28]

I mean that’s really interesting. 

[32:29]

One of the things that we tend to get wrong when we oversimplify quant research is assuming that everybody’s opinion is equally valuable. But, actually, think about when you look at TripAdvisor and you see somebody who has been really negative: you have a look at a couple of other places they’ve reviewed to see whether they are just a negative person, or whether they are normally positive and thought this was really bad, in which case that’s a really strong warning. And I will often say to people, when they’re looking at customer-satisfaction data: if you’ve got somebody that thinks you are terrible at everything, pretty much ignore that data, because you can’t learn from it. You want people that say, “This was OK, this was OK, that was terrible, and that was terrible,” because now you’ve got something you can work with; you can conceivably make those people happy by getting it right. People who think everything is terrible you are not going to make happy, and, what’s more, you’re going to spend money on something that nearly everybody else thinks is OK.
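Ray’s heuristic (discount raters who are negative about everything, keep mixed raters) can be sketched in a few lines of Python; the names, scores, and cutoff here are invented for illustration:

```python
# Hypothetical customer-satisfaction ratings on a 1-5 scale.
ratings = {
    "alice": {"food": 1, "service": 1, "price": 1},   # terrible at everything
    "bob":   {"food": 5, "service": 1, "price": 4},   # mixed: actionable
}

def is_actionable(scores, floor=2):
    """A respondent is informative if at least one rating rises above the floor,
    i.e. they are not uniformly negative."""
    return any(v > floor for v in scores.values())

# Keep only respondents whose pattern of scores we can actually learn from.
actionable = {who: s for who, s in ratings.items() if is_actionable(s)}
print(list(actionable))  # ['bob']
```

The design choice mirrors the TripAdvisor example: a uniformly negative record carries no diagnostic signal, whereas a mixed record points at what specifically to fix.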

[33:30]

I mean you’re circling around the same point, and data privacy actually becomes part of it, right? So, when you think about… I’ve said this on the podcast numerous times, so I apologize. But you are still asking gender in surveys. And in some anecdotal research that I’ve done, by the time a respondent qualifies for a survey, they’ve probably been asked gender six times in that one instance, because they’ll get screened out or be over quota, whatever, and then they’ll again get shown a set of screening questions, so on and so forth. And really what you’re talking about is that longitudinal point of view on the respondent. How does data privacy, GDPR, start playing into that? Does it then become limited to controlled communities, or do you think we’ll be able to have a broader point of view at the respondent level through companies like Dynata? 

[34:27]

I think there’s a fascinating battle to be had about whether we get to see it through companies like Dynata, whether we get to see it through companies like the banks leveraging their information, or whether we get to see it through databases accumulated by clients. So will Procter & Gamble, or Unilever, for example, want to have a digital connection with 1 billion, that’s B for Bertie, 1 billion of their customers? That is one heck of a walled garden. And every one of those 1 billion will have their privacy settings, from people set to “Yeah, you can use all of my data, but I want 1% off the price of the stuff I buy,” through to people who are totally locked down and sharing almost nothing. So it is going to be about asking additional questions. And that is why it’s so important for the Dynatas; it’s why it’s so important for the Experians and people like this who hold data. The credit card companies and the retailers all want to turn their data into an asset, and they’re trying to work together in collectives, and then companies are trying to create those. So, today’s online communities are typically 5,000 to 50,000 people and are used exclusively for market research; in the future, for a big brand, it’s more likely to be a million, 5 million, even 100 million people. And it will be used for all the communication purposes of the organization, not just for research. 

[36:11]

So, you’re seeing this Venn diagram of market research converging into marketing, or even brand, at a big B level. You know, I did some research for Intuit a long time ago, and it was basically just a count of how many times each person in their database had been solicited for research. The volume was remarkable because Intuit, being a consumer-centric organization, was hitting their people a lot for ad hoc research. There was one guy who, in a single week, had been solicited for research, I think, 16 times by all the disparate stakeholders within the company. He was just checking all the boxes, I guess. And so, all of a sudden, you as a brand need to become very cognizant of the mechanisms or the experience that you’re creating through the research for the end consumer because that has an overarching… He was being contacted more for research than he was to upgrade or buy or whatever, right? So the value piece wasn’t necessarily there. 

[37:18]

And Scott Miller, at Vision Critical, spoke quite eloquently on this. So many of these, as he calls them, spam surveys… You’re not sending these out to the general public. You’re sending them to your customers. They are part of the customer experience. And if they are not good, they’re a very negative part of the customer experience. 

[37:39]

When you think about market research today—and you’re exposed to most of the tech, and then also at an ESOMAR Council Member level, and then from your brand exposure—what do you see as the biggest issue that we are facing as an industry? 

[37:56]

The difficulty of knowing whether research is good research or not has been growing, and probably underlying that is the decline in rigor. We’ve got a lot of things out there like facial coding, which has very little good validation and may or may not work. Now, if it works, it’s fantastic, and maybe it works in some circumstances. We saw an attempt in the early days of neuroscience: The ARF did some side-by-side research, but there hasn’t been much since then. Or some of the biometrics, where people are looking at galvanic skin response. Well, if you show me a baby playing around with a razor blade, you are going to get a psychophysical response that’s really straightforward to measure. But if you show me a slightly different color for the floor detergent, you’ll be lucky to get a physical response, even if the difference is big enough to change my purchase choice. So I think we need a lot more rigor in that process. But I’m not sure anybody is brave enough to do it. I don’t think the Insights Association or ESOMAR or any of these bodies are going to come in and say, “This piece of research from company A, uh, it’s rubbish. We’d suggest you don’t buy it.” And it’s really hard now for clients because people are coming along and saying, “Nobel prize winner Daniel Kahneman recommended this,” or, “Nobel prize winner Daniel McFadden generated this type of conjoint analysis.” And it’s really hard for clients to know what the true value of that is, and things are moving so fast that it’s going to get more and more difficult.

[40:03]

I’ve had a couple of people reach out to me—one just yesterday—interested in moving careers, in jumping into market research. If you were interested in entering market research or user experience (kind of rewinding the clock, but in the context of today), what would you do in order to bridge that gap? The problem this individual had was that she didn’t have direct primary research experience she could point to, which basically opted her out even though she had a clear desire and interest in the category. 

[40:39]

You want to find a place where your skill is relevant. So if you are a linguist, you would use that as your entry point. Or find an area like customer-success management, which doesn’t require a research background, but does require a background in working with clients, developing markets, and so on. You might find that usability testing was a relatively straightforward entry point. If you’re a data scientist, then you’d come in through the data science route. So you want to find out what your strength is because, once you’re in the industry, the phrase that’s become fashionable over the last few years is the T-shaped employee. The most important thing about being a T-shaped employee is having that specialty, the stem that sticks out. Once you’ve got your feet under the table, you then want to make sure that you’ve got the general skills, the flat part of the T: So you know about the research industry; you know what qual adds to the picture; you know why we don’t ask leading questions, how to recognize biased data, and things like that. But you want to come in via your specialty.

[41:54]

Yeah, that’s interesting. Do you see social media like LinkedIn, etc. playing a material part in that transition? In other words, do you think somebody could start blogging on that platform and then connecting—so, at almost a very tactical level, leveraging whatever overlap their existing talent may have with market research and then trying to reach beyond it, but not in an inappropriate way?

[42:22]

I have seen individuals do it. Most people who try it can’t actually keep up that level of material. It’s hard work for most people to generate a lot of social media presence. It’s a little bit like keeping a diary. Lots of people want to keep a diary, but they can’t. Lots of people thought they were going to blog, but they didn’t, and it’s similar here. So I have seen people use it to help move their careers, to help them develop, to become more aware. But unless you are a native of social media and that is where you want to be, it may not be the easiest way. So it is a possible route, but I wouldn’t think it’s the easiest. 

[43:11]

My first blog had one post. I think I’d committed to a weekly post, and then six months later came my second post: “Sorry, I haven’t posted in a long time.” You described it perfectly: it is a major commitment. But one of the things we’re seeing within the LinkedIn platform itself is that there’s this mix of Facebook content getting intermingled with business-related content. Are you seeing that in your feed, or is it just unique to the people that I’m following? 

[43:46]

I actually don’t see much of it in my feed. I see people complaining about it in my feed, but I don’t really see much of the puppies and “Here is me running up a mountain; here is the celebration of my anniversary.” I have a Facebook feed, and I can see some really important things in people’s lives happening on Facebook. And I’m connected with the same people on LinkedIn, and what they are talking about on LinkedIn is different. So whether the algorithm is working better for me or whether my cross-section is different, I’m not sure. 

[44:29]

Interesting. Gosh, I tell you, it’d be nice to see behind the curtain in terms of how it prioritizes what we’re seeing and whether we could help inform that content. But anyway, that’s another subject. How do you see social media playing a role at a corporate level in advancing companies’ individual agendas—and I’m talking about market research companies in this case? How’s that going to evolve over the next couple of years? What should they be paying attention to?

[44:59]

It should be part of their broader strategy. So, there are two things that you want to do with your brand image. If you’re a market research company, then you want to establish the right image with clients, and you want to establish the right image with current and future employees. So part of what you’re using social media for is to get the right sort of people to apply, for people to want to work for you, and for people who do work for you to understand what your mission is because most people who work for a medium- to large-sized company have no idea what the mission of the company is. So you need to try to get that message across on posters in the bathrooms, in messages on their computers, in social media, and in what you say to the press—at every possible opportunity. 

So social media is one part of that. And then you have the same thing when you’re talking to clients. So when somebody like Tom Ewing does a very thoughtful piece on LinkedIn, that is part of the message that says System1 Group is going to give you this level of thoughtfulness. There’s also a sub-message: “We’re not that cheap. So don’t bother; only come to us if you’re looking for something extra.” You’ll see a lot of the stuff from InSites Consulting, the Belgian company, which again talks about some of the really exciting things they’ve done with clients. Recently I saw Tom De Ruyck talking about one of them. They’d done a project with a transport company, and he’d insisted that the client team all travel by bus for the few days before the debrief because most of them didn’t travel by bus—some were so senior they had drivers—and he wanted them to do that. 

So when he came to make the presentation, people would go, “Oh yeah, I recognize that. Yes, I recognize that.” Well, there’s a level of chutzpah, or arrogance, in telling your senior stakeholder clients to do something. And so, when you say that in social media, you are creating that expectation: don’t come to us if you’re looking for a really cheap, quick service, because actually that’s not the sort of business we are. And similarly, if that is the sort of business you are, then you will message that through social media. But nothing in social media should be unique. It should simply be part of your broader communication plan. 

[47:43]

Got it. Are you seeing it as an increased… I’m thinking now about the new entrants into market research, the new buyers. Are you seeing it play an increasing role in their buying decisions? 

[47:58]

So, we’re talking about brands buying research from agencies, or people buying from agencies. Right? 

[48:05]

Right. Yeah, in this case, it’d be part of that $46 billion market research turnover piece.

[48:12]

Most clients are not active in social media. There is a section who will be very aware, and they will be aware for those situations where they have a different problem. So somebody comes along and says, “We really want to do some form of implicit association testing.” That’s when they’re going to say, “Well, yeah, I saw that nice presentation at the conference, and these people have been talking about that in social media. I’d better give them a call.” That, of course, is something that doesn’t happen every year for that particular insight manager. They may not get any questions from their team that fall outside of their regular suppliers. So it is in use. It’s part of the communication strategy, but it’s not a major force for most people. 

[49:12]

One of the things I’ve been trying to wrestle with: I heard a podcast where Brian Halligan, one of the founders and CEO of HubSpot, was talking. He actually claims that telesales is net negative for a company’s brand if you’re selling into enterprises or B2B. This is his point of view, and he gave a reason why, which, of course, all of us know, right? Nobody likes that person on the other end of the phone. However, on the other side of it is exactly what you’re describing: you have this specific need in the context of time, and it isn’t regular. How are you finding companies building that brand so that they are top of mind when the buyer has the specific itch to scratch?

[50:02]

It depends what you’re selling, but my experience of telesales is that the companies that use it well do better than other companies. Now that maybe isn’t just because of telesales, because maybe they’re doing lots of other things well too. But think about how telesales works: somebody rings you up and it’s a complete nuisance; somebody else rings you up and it’s a complete nuisance. Then somebody in your company says, “We’ve got to do a project with chatbots,” and 20 minutes later somebody rings you up and talks about chatbots. It’s brilliant. So telesales has an immense amount of throwaway, but if it falls in at the right time, it’s like fly fishing; then it’s successful. And I think that is part of what is happening. It is different in different markets. So telesales is much less effective in Asia Pacific than it is in North America.

But generally speaking, I think the research industry should be using more telesales in the mix. Europeans, in particular, are very squeamish. They think, “Oh, people don’t want to receive those calls.” But look at companies like Matrix Consulting and so on, who have done really well using telesales as an important part of the process. Otherwise, you very, very rarely break out of whichever circle you’re in. If you look at the people who’ve done the social media mapping, what you notice is that everybody’s in groups, and they’re in echo chambers, and we all talk to each other. If your marketing is going to break out of there, you have to do cold calling. Now that cold calling can be knocking on doors. It can be going up to people at conferences. But for a lot of clients it has to be telephone. There isn’t another method of cold calling that we have today. You can’t email people you don’t know. Geographically, it’s too time-consuming to knock on a door on the off chance they’ll let you in. So, telesales cold calling is terrible, but better than all the other cold calling options.

[52:29]

Qualtrics is another example. They’re rumored to have had a phone room of over 400 people, just dialing. And I know that I actually lost one project to Qualtrics, and the reason given to me was, “They just keep calling me, so I feel like I have to use them,” which I thought was really interesting.

[52:49]

And it is. And there will be lots and lots of people who will not respond to those calls, but it’s still giving them a mechanism to break out into somewhere else. And generally speaking, this comes back to my point about other people not being a lot like you. I could not be a telesales person, but it forces me to realize that there are people who can be.

[53:12]

That’s funny. Yeah, my previous chairman, Dennis Malamatinas, got a telesales call from Richard Branson. He was the CEO of Burger King Global at the time, and Virgin was reaching out to global CEOs to solicit use of Virgin airlines. So it was a really interesting thing: the billionaire doing cold calling. Anyway, there you go. Last question: What is your personal motto?

[53:49]

Have fun, keep learning, help people and, hopefully, make some money along the way. 

[53:54]

My guest today has been Ray Poynter, ESOMAR Council Member and founder of NewMR. Thank you, Ray, so much for joining me on the Happy Market Research Podcast today.

[54:03]

Pleasure.

[54:04]

Everyone else, if you found value in this, as I certainly did, I hope you’ll take the time to screenshot this episode and share it on social media. Your feedback, as always, helps other people like you find this content. Have a wonderful rest of your day.

[54:22]

This episode is brought to you by HubUx. HubUx is a productivity tool for qualitative research. It creates a seamless workflow across your tools and team. Originally, I came up with the idea as I was listening to research professionals in both the quant and qual space complain about, and articulate the pain around, managing qualitative research. The one big problem with qualitative is that it’s synchronous in nature, and it requires 100% of the attention of the respondent. This creates a big barrier and, I believe, a tremendous opportunity inside the marketplace. So what we do is take the tools that you use and integrate them into a workflow so that, ultimately, you enter your project details—that is, who you want to talk to, when you want to talk to them, whether it’s a focus group, in-person or virtual, or IDIs or ethnos—and we connect you to the right people at the times that you want to have those conversations or connections: Push-Button Qualitative Insights, HubUx. If you have any questions, reach out to me directly. I would appreciate it. Jamin@HubUx.com. Have a great rest of your day.