
In defense of empiricists: blindly following numbers is leading us from reality – by Ogilvy’s Joe Lipscombe

We’re fortunate to have such accessible quantitative data sources and tools today, but by abandoning qualitative research we are operating without headlights, writes Memac Ogilvy's regional director of content strategy, Joe Lipscombe

If you follow the English Premier League on beIN Sports, you may have noticed the win-probability updates, powered by Oracle Cloud, that frequently appear beneath the score icon in the top-left corner. In short, during the game, the global enterprise firm Oracle pops up to tell you the likelihood of the outcome based on a combination of numerical factors.


This is good fun. But sports fans will know it doesn’t take a computer to tell you that if your team is three goals down, the probability of a favourable result is low. More importantly, regardless of what the historical data is telling you (because even “real-time” data is historical in reality), almost anything can happen. That unpredictability is what we love about sport: when humans are physically competing, anything can happen.

Now, this isn’t all that important. It’s a platform to tease a much deeper and more complex Oracle tool. But it highlights an interesting insight: we’ve developed an over-reliance on quantitative data as the source of all our knowledge.

Quantitative data, and the ability to derive insight from it, is critical to decision-making. And the democratisation of data tools today has changed our industry for the better. But the more we make it the core source of insight for strategy, the further we drift from reality, and the tougher we make it for ourselves to generate and execute original, illogical, unconventional ideas that create disproportionate impact.

We replace the need for lateral thinking with logical solutions that often produce mediocre results.

There is, fittingly, logic behind this. Numbers soothe our risk aversion. They guide us and offer us an avenue of rationalisation. They provide us with a failsafe. If our strategy pays off, the data is validated. If it fails, we can point to the data and say it should have worked. For the C-suite, this makes all the difference. After all, it’s important we understand the potential of an idea before we press go.

Quantitative data is not enough

The trouble with quantitative data alone is that it has extreme limitations in terms of what it can tell us about humans. And humans are inherently unpredictable, irrational, inconsistent, and emotionally driven. Take the troubling ‘Green Gap’ (the disparity between attitudes towards sustainability and actual behaviour towards sustainability) as an example. A study by Nielsen showed 73 per cent of consumers claimed they would shift to green products, but only 41 per cent of those were actually willing to do so. Someone is lying.

In another survey, by First Insight, more than 90 per cent of respondents claimed they would be willing to spend extra on sustainable FMCG products. Yet Unilever claims some 70 per cent of its emissions come as a direct result of its customers’ product choices. Again, someone is lying.

Too often we see brands and agencies follow, or create, leading surveys that produce misleading answers, then use them as the bedrock of all their future strategies. It’s lazy and unambitious.

As quantitative data becomes increasingly accessible and complex, we must encourage the qualitative, observational, empiricist’s view of the world. Otherwise, we risk losing sight entirely of the very people we’re paid to understand. As Y&R beautifully put it as far back as the 1960s: “Numbers are inert. People are dynamic. Numbers behave. People do not.”

Watching spreadsheets, not people

In 2014, Dubai was attempting to increase the number of caregivers using child booster seats, to reduce the devastating impact of road accidents on young children. The conclusion was that too few people owned booster seats, and that an increase in ownership would result in an increase in usage. So it deployed a rational solution: booster seats were offered free of charge to new mothers leaving the maternity ward in select hospitals. But according to research from a UAE university, it failed. People simply didn’t use them. The seats were quickly tossed under the stairs alongside suitcases and cleaning products. Quantitatively, more people had booster seats than before; in that sense, it could be deemed a success. In terms of actual behavioural change, it had almost zero impact.

A few years later, legislation was introduced making it mandatory for children under four to be buckled into a booster seat. Retailers claimed sales went up from 50-100 a month to 300-400 a day.

So, what can we learn from these two campaigns? To drive long-term, sustained behavioural change, communicators need to affect not only the audience’s capability (the ability to alter the behaviour) but also its motivation (the willingness to alter the behaviour). In the case of the first campaign, motivation was missed entirely. Although individuals were given the resources to commit to a behaviour (based on the quantitative findings), their motivation to uphold that behaviour wasn’t affected. This created what’s known as an absence of self-determined motivation. Secondly, because there was no legislation, there was also an absence of pressured motivation. The result was almost zero compliance, voluntary or pressured. The saying “you can lead a horse to water, but you can’t make it drink” springs to mind.

The broad numerical insight of poor booster uptake was valuable, but the motivational insight that would have come from empirical research never surfaced.

There were significantly more seats in the market than before, but the core objectives of the campaign were barely impacted. The very human reasons why caregivers were not buckling up their children were missed entirely. None of the data was qualitative.

Trying to induce change depends upon understanding the reasons behind the choices that people make. These are commonly a combination of shared factual beliefs (it’s safer to hold my child in the car), interdependent expectations (what other drivers or family members think is appropriate), and normative barriers (good parents carry their children on their laps). Understanding each of these requires empirical, qualitative field research.

Because we figure we can build successful campaigns on numerical insight alone, we ironically often end up executing on assumption. This isn’t always our fault. As eye-opening as quantitative data can be, numbers are also very good at hiding things in plain sight; much of the time, that is precisely the publisher’s objective (I’m looking at you, media). Moreover, humans are inherently good at finding patterns and correlations in numbers that aren’t necessarily helpful. And we often aren’t cynical enough towards the data we’re using, which makes us highly susceptible to poorly designed graphs and charts that skew our perceptions.

It’s tricky. I remember when data scientists worked for NASA. Now, every junior marketing job description I see asks for people who can intelligently decipher accurate insight from data. It’s a critical skill, but let’s not kid ourselves: it takes years of practice and a deep understanding of the principles of numbers.

Future public-sector impact will rely on understanding the behaviour of citizens

Providing solutions that drive long-term change and success will be critical for public and private sector entities in the fight against climate change, inequality, and the other social, cultural, and economic challenges the region faces today. Such solutions require a deep understanding of, and trust in, the psychology behind human behaviour. And that behaviour needs to be measured qualitatively as well as quantitatively. It sounds obvious, but increasingly we see entities opting to blindly follow inconclusive quantitative data that points them in other, less fruitful directions.

Humans are feeling machines. They’re illogical, emotional, and irrational. Our fear of hunches will continue to be a barrier to success unless, as an industry, we’re prepared to take some risks. For that to happen, we need to teach, encourage, measure, and praise qualitative, behaviour-led, empirical research, coupled with solutions that tackle the illogical.

Oh, and let’s stop segmenting people generationally. It’s absurd.