The philosopher of science Paul Feyerabend famously wrote that “the only principle that does not inhibit progress is: anything goes.”

As advances in our ability to collect and manage data transform marketing, the industry is becoming more ‘scientific’. That is exactly why it becomes more important every day for marketers to heed Feyerabend’s advice.

A hypothesis about data

The crucial element in the recent evolution of marketing has been data. The collection of comprehensive data about customers and their behaviour promised marketers unprecedented insight into the effectiveness of their efforts, including of course where they should spend more and where they had been wasting their budget.

Consequently, marketing began to worship at the altar of data, eventually giving rise to the fascination with the nebulous “big data.” Marketers now have the ability to collect data on almost anything they want.

The fact that the underlying principles of marketing have remained much the same throughout this process (sell more stuff by putting what you’re selling in front of the right people in the right way) therefore raises the question: Why aren’t marketers doing better?

How not to do things with data

Marketers have been getting their relationship with data the wrong way round. Put simply, the answer is never in the data. In fact, the best way to get answers is, at first, to forget about the data.

In scientific inquiry, trawling through existing data is rarely conducive to innovation. Trying to piece new things together from the mass of what you already know is an aimless, hopeless endeavour. You become a prisoner of conventional wisdom, reaching ever narrower, less original conclusions, with an increasing likelihood of being wrong.

Scientific research shares at least this much in common with marketing. For example, we have data on the most shared headlines for content marketing. (Buzzsumo collated 100 million of them.)

According to the data the top three-word phrases to use in article headlines for maximum shares are “will make you,” “this is why,” and “can we guess.” Widely-shared articles also begin with “X reasons why” or “X things you,” and very frequently include appeals to emotion.

However, as Marketing Profs’ Ann Handley correctly noted in response, marketers should not “take this information and conclude that the best headline to use forever and always is something like 10 Ways That Will Make You a Better Headline Writer (and You Won’t Believe What Happens Next!).”

What this demonstrates is a problem with attempting to draw useful conclusions from data alone. While there are many things we can conclude from Buzzsumo’s impressively comprehensive analysis, not many of them are useful for content marketers attempting to come up with headlines.

In fact, Handley gets it absolutely right when she urges marketers to “get a little creative with headlines.” Not only will different types of headlines work differently in different contexts (we cannot all be Buzzfeed, and we definitely should not try to be) but it is only by being creative that we actually end up writing better headlines.

Simply mimicking the headline formats that currently work well not only creates an artificial ceiling on how successful content can be, but also inevitably suffers from regression towards the mean. This is what happens when marketers limit themselves according to convention.

If the answer is clickbait, you asked the wrong question.

How to do things with data

A marketer trying to come up with more effective headlines for her content does not need an answer to the general question, “what are the most popular phrases in headlines?” She needs an answer to a specific question: “is my content going to perform better if I use this phrase or that phrase?”

These questions are easy to confuse. The crucial difference is that our hypothetical marketer cannot use the answer to the first question to make any sort of conclusion about how to act. She will simply learn more about what has worked for others and be restricted to coming up with derivative ideas.

Just because something worked for somebody else, it does not mean it will work for you. And when it comes to the over-saturated world of online content, the fact that something worked for somebody else means precisely that it is less likely to work for you.

It is the second question, a specific one about some actual ideas, that represents the best way to go about dealing with this problem. It is a practical question that makes data useful and this is because it puts new ideas ahead of old conventions.

What does genuinely experimental marketing look like?

A particularly clear recent example of this is AS Roma’s successful approach to social media video. In an industry where all the major football clubs (and a lot of the minor ones) are stepping up their digital marketing and where almost every player transfer is announced with slick professional video on social media, Roma succeeded by doing something different.

These idiosyncratic videos embody Feyerabend’s “only principle that doesn’t inhibit progress.” Where their competitors acted like sheep, Roma chose goats. They forgot about the data on what worked for their competitors and instead asked “what if we do something else?” They chose to experiment.

As the categories of data available to marketers have multiplied, the possibilities for experimentation have grown exponentially. However, in practice this has not led to the proliferation of a diverse range of experimental approaches to marketing. Instead, there has been a succession of “next big things” (such as AI), which seem to sweep the industry each year. The prospective benefits of each of these potential innovations and the specific uses for them end up being submerged by the hype. Brands frantically attempt to emulate their competitors to avoid being seen as technological laggards. The appearance of innovation trumps real experimentation.

This is because too much marketing data is not collected with a specific purpose, it is simply collected in a way that encourages marketers to emulate their competitors and reinforce the status quo. A successfully experimental approach to marketing therefore requires marketers to put their own creativity first.

How to experiment in marketing

Professor Byron Sharp recently mentioned how important it is for marketers to learn how to run “proper controlled experiments,” something which most formal business educations sorely lack. He is correct that experiments are only useful if they are carried out according to rigorous scientific principles (with control variables and so on).

This emphasises the connection between the scientific and creative aspects of experimentation; marketers cannot truly have one without the other. They therefore require a consistent experimental method that can be applied repeatedly and which maintains a complementary relationship between data and innovation.

First, an experimental method requires marketers to come up with hypotheses, i.e. “I think our content might perform better with this sort of headline” or “I think our social media engagement would be improved with this sort of video.”

It then requires marketers to collect data for the specific purpose of testing a hypothesis. Generally this is done through A/B testing, often using Bayesian rather than frequentist statistical inference, since Bayesian methods are better suited to producing usable answers quickly. Used this way, data informs marketers’ hypotheses in a way that complements their creativity rather than inhibiting it.
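To make this concrete, here is a minimal sketch of a Bayesian A/B test for two headline variants, using only Python’s standard library. All of the numbers (click and view counts) are invented for the example, and a uniform Beta(1, 1) prior is assumed; a real test would use your own tracking data and might choose a more informed prior.

```python
import random

# Hypothetical data (assumed for illustration, not from the article):
# variant A = the current headline, variant B = the new idea being tested.
clicks_a, views_a = 120, 2400
clicks_b, views_b = 145, 2400

def prob_b_beats_a(ca, na, cb, nb, samples=100_000, seed=42):
    """Estimate P(click_rate_B > click_rate_A) by Monte Carlo sampling.

    Each variant's click rate gets a Beta(1 + clicks, 1 + misses)
    posterior (uniform prior); we count how often a draw for B
    exceeds a draw for A.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(samples):
        rate_a = rng.betavariate(1 + ca, 1 + (na - ca))
        rate_b = rng.betavariate(1 + cb, 1 + (nb - cb))
        if rate_b > rate_a:
            wins += 1
    return wins / samples

p = prob_b_beats_a(clicks_a, views_a, clicks_b, views_b)
print(f"P(variant B outperforms A) is roughly {p:.2f}")
```

The appeal of this framing for marketers is that the output is a direct probability statement about the hypothesis (“how likely is it that B is better?”), which can be read at any point during the test, rather than a p-value that only makes sense at a pre-committed sample size.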

This process of testing hypotheses can then be repeated in an iterative cycle that allows marketers to try out as many new ideas as possible in order to increase the chances of a major breakthrough. This process aligns neatly with the concept of agile marketing, which perhaps goes some way towards explaining the current vogue for that term.

The balance of power

Technological advances have given marketers access to invaluable quantities of information, and as a result marketing and data have become inextricably linked. However, the outstanding question about this relationship is simple: Who is in charge?

Is marketing led by the hackneyed conventional wisdom represented by existing data or is it led by marketers’ own creativity and critical thinking? Where the balance of power leans towards the data, marketers are inhibited. Where it lies with the marketers, the data can yield genuinely useful conclusions and help marketers to come up with their next great idea.


Published 20 September, 2017 by Frederic Kalinke

Frederic is managing director of Amigo and a contributor to Econsultancy. You can connect with him via LinkedIn.

Comments (4)

Pete Austin, CINO at Fresh Relevance

Re: "Marketers have been getting their relationship with data the wrong way round". Yes, but not entirely as the author suggests.

Data analysis is a problematic way to find good marketing - as the author correctly says, you can't “take this information and conclude that the best headline to use forever and always is something like 10 Ways That Will Make You a Better Headline Writer (and You Won’t Believe What Happens Next!)”. What makes for effective marketing can change quickly, not least because of factors like the novelty effect and advertising blindness.
See https://www.freshrelevance.com/blog/5-ways-your-ab-tests-are-going-wrong

But it works great when used the opposite way around - for finding (and rejecting) relatively bad marketing - because what fails today is very likely to keep on failing. We are all bad at seeing the faults in our own work, but the data doesn't lie and quickly shines a spotlight on them.

about 1 month ago

Carlos Abler, Manager of Online Content Strategy at 3M Enterprise

I love this article. And not just because it quotes Feyerabend, but that helps. I'm so exhausted by people using aggregate statistics to provide a rationale for point solutions. It's an intellectual regression to the mean(ingless). To be fair, aggregate analysis can be insightful; it can help us ask better questions and guide us on where to look, or where (over our shoulder) to aim. But a solution is always case-specific, and should have variable possibilities to test, which may or may not reflect variants that are hypotheses derived from the mean. Likely not, if something innovative or thoughtfully targeted is in flight.

about 1 month ago

Frederic Kalinke, Managing Director at Amigo

Thanks for the comment Pete. I am not arguing that we should not analyse data. As you say, data never lies and can shine a light on ineffective marketing. What I am saying is that marketers must start with posing questions and setting hypotheses; marketing should never start with data analysis, otherwise campaigns are derivative and homogeneous. Path-dependency stifles creativity and innovation.

Your point about the novelty effect is bang on - this article unpacks why the novelty effect is an issue for the validity of A/B tests: http://blog.sumall.com/journal/optimizely-got-me-fired.html

about 1 month ago

Frederic Kalinke, Managing Director at Amigo

Glad you like the article Carlos and I fully agree with your points on using aggregate statistics for point solutions. Feyerabend is brilliant, along with Richard Feynman who was so good at explaining the scientific method in layman terms.

about 1 month ago

