

Boring people know the most interesting things


Image courtesy of boo cru

This is my first NMA column of the year, and it's about measuring the effectiveness of digital campaigns. There is obviously a tongue-in-cheek theme about bringing the digital geeks and the research geeks together, but the serious points are about looking beyond intermediate metrics, the folly of accountability and the need for greater ambition in digital campaigns.

As ever, enjoy.

It goes without saying that two of the words digital people seem to loathe the most are ‘advertising’ and ‘research’. Advertising because digital-kind believes itself to be above such a grubby and discredited approach to selling things. And research because this hardly suits the buccaneering spirit of the digital frontier where gut feel and instinct get results.

So it was no great surprise that at a recent conference on advertising research run by the eminent people at the World Advertising Research Centre there wasn’t a single soul from the digital fraternity on the delegate list.

Now, the truth is that attending a conference like that wouldn't have even begun to flicker across the minds of most of the London digerati. I mean, who in their right mind would swap a slap-up lunch at Shoreditch House for listening to a bunch of boring research people droning on about how you evaluate engagement online? However, this sort of indifference is symptomatic of the chasm that exists between the digital world and the research community. Which is a shame, because you could be so good for each other.

Sure, many research people carry the pallid complexion of those who spend too long sitting ruminating in dank basements, emerging blinking into the light merely to collect data or submit a paper for peer review. Sure, they tend to wear suits that were mildly fashionable in the years preceding decimalisation. And sure, they have a curious way with PowerPoint that involves cramming so many words onto a slide that you'd have difficulty reading them with the Hubble Telescope. But they might, just might, have the answers that you are looking for, or at the very least a way to help you to find them.

And of course number one on the list of questions that need answering is how to prove the effectiveness of digital campaigns and specifically the value of engagement. Not with terminally intermediate metrics like click-through, pass-on rates, average dwell time and the like. Nor the entirely reprehensible habit of multiplying visits by length of visit, finding out the cost of buying that ‘engagement’ with conventional media and calling the result of this sordid little calculation ‘Return on Investment’. But real attempts to prove the commercial value of immersing people in a brand’s world and having them interact with this world and share it with others. Not to mention the means by which to model the sales effect of digital activity and prove its contribution to the client’s bottom line.
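To see exactly what that sordid little calculation does, here is a minimal sketch in Python. All of the figures and the assumed cost of a "media minute" are invented for illustration; nothing here comes from a real campaign.

```python
# A minimal sketch of the pseudo-ROI arithmetic criticised above.
# All figures are invented for illustration.

visits = 100_000               # unique visits to the campaign site
avg_dwell_minutes = 3.0        # average time spent per visit
cost_per_media_minute = 0.05   # assumed cost (in pounds) of buying one minute
                               # of attention with conventional media

total_engagement_minutes = visits * avg_dwell_minutes
media_equivalent_value = total_engagement_minutes * cost_per_media_minute

campaign_cost = 50_000.0
pseudo_roi = media_equivalent_value / campaign_cost

print(f"'Media value' of engagement: £{media_equivalent_value:,.0f}")
print(f"Claimed 'ROI': {pseudo_roi:.2f}x")

# The flaw: this values time, not effect. It says nothing about whether a
# single extra unit was sold, which is precisely the point being made.
```

The output is a plausible-looking multiple, which is exactly why the habit persists: it manufactures a number without ever touching sales.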

Now you may feel that you are happy with your click-throughs, pass-on rates and average dwell times. After all, aren't they proof of digital's accountability? And isn't that what clients are looking for?

Well, in my book, accountability is rather overrated. What clients really want is an effect – a real sense that the marketing activity they undertake is selling goods and services. Not shifting a few here and there, but manifestly affecting the momentum of their business. Advertising agencies have always understood this: not only have they historically valued bigger, longer-term effects over short-term movements in the metrics, but they have set out to get them and developed the research tools and models to prove that they have been delivered.

If digital wants to move beyond mere accountability and prove that it can deliver the real results clients are looking for, it must engage properly with the research community. Digital folk, it is time to discover that the boring people sometimes have interesting things to say.

Comments

This is an interesting argument you raise. I, for one, as a digital media developer, need to stop and look at the bigger picture from time to time. Collaboration with the dark side could be very worthwhile for bringing a stronger long-term strategy to fruition, rather than chasing after the next 'big thing' in digital media. In such a fast-moving sector it is all too easy to be swept up with the flow, much like amateur football, where the whole team chases the ball rather than holding strategic positions. Thank you for the insight; I will try to couple up with an advertising guru.

Posted by: DigiKev Digital Media at January 24, 2008 09:48 PM

Great article. I am fascinated by the general assumption in the media and digital world that digital metrics are somehow more intelligent than previous metrics. Click-throughs and GRPs are equally dumb stats. Each tells you the number of impressions but nothing of connection, engagement or influence, let alone likability in the most rudimentary sense. Digital agencies will start to get the respect they crave when they grasp and act on adding real value to brands through the digital experience to enhance pricing power, rather than simply facilitating the closing of transactions when you've got the lowest price listed on Bizrate.com.

Posted by: Brett at January 25, 2008 12:13 AM

Ouch.

While your argument is absolutely valid, I think you wasted too many lines on biting the digerati's butts rather than on the actual benefits of doing better digi-research.

Marketing comms is undergoing massive changes, with digital slowly maturing from the fringe into the mainstream and budgets slowly following. I'm not well versed in the history of advertising research, but I guess that companies like Hall & Partners and the like grew out of, or from inside, advertising through a mix of need, demand, frustration, vision and business opportunity.

It will take some time until companies like H&P come up with metrics that start making sense in this brave new world. Currently we are in an interim stage where we just still don't know how to measure stuff. There are too many new things going on, and no one has yet come up with something convincing to measure engagement, involvement and their effectiveness in bringing good business.


And finally, yes - isn't it as tedious to generalise about the digi-world as it is to generalise about 'advertising people'? Some parts of the digital world have evolved, in parallel with the rest of the world, as a counter-discipline to traditional marketing/advertising with its tons of ego, bullshit and hubris, and still carry an air of rebellion against everything that advertising represents. Part of that is resisting the dry, cold rationalism of research, which is (wrongly?) perceived as a creativity killer.

So we in digital still enjoy doing simply great stuff that people are happy to play with and pass on to their friends, without the sword of effectiveness above our necks... When we have £1 million for a digital campaign, we'll probably have something to spare on research as well...

Posted by: Asi at January 25, 2008 11:05 AM

Hi Richard

Burp. Sorry, a bit gassy after my 1/2 free-range boar upstairs at 'the house'. ;-)

I agree with pretty much everything you've written. I also have some sympathy with Asi's whining about budgets. But it's really not about money; it's about the culture of the digital industry (as much as there is one).

Of course we should be looking to prove real, genuine ROI. The digi-novelty is going to wear off. And people will stop doing digital stuff just because everyone else is. And when that happens, agencies that aren't able to prove their work works are going to get screwed (and rightly so).

As usual I've not got any answers, but here are a few observations:

Not all digital activity can be lumped together. I think the brand/comms side of things suffers most heavily from a lack of decent metrics. It's only gut feeling and GCSE maths that tells me someone playing a branded game for 20 minutes has to be 40 times more effective than a 30-second TV spot ;-)

There are clients and agencies who've always been first and foremost about delivering real, tangible results. These aren't the companies you'll see in Campaign or the front bit of NMA. These guys are too busy building stuff, making money and running successful businesses. They probably don't even have time to go to Shoreditch House (or research conferences, for that matter) - poor things. But there are definitely things we can all learn from them.

Similarly, if you've spent time working on 'commerce' projects you become painfully aware of just how much difference something seemingly tiny can make. How putting 'that button' in one place rather than another can generate thousands of pounds a week in extra sales. It makes you wonder what we could do if we'd just thought about things a little bit more... even, perhaps, done some research of one kind or another...

Which throws up another complication: the difference between research and testing. Quite often clients will get us to research comms concepts (alongside other advertising), and we'll also want to put a site through some kind of usability testing to validate the user experience. Which all gets confusing and expensive, as these things are currently carried out by different kinds of companies.

As a digital person I'm going to make a couple more excuses for our failings:

1. Historically, a lot of digital projects were serial projects rather than longer-term relationships with clients. This short-termism of both budgets and involvement, on both the client and agency side, has meant that investment in research has been low. In retrospect, a mistake. But we've had a bumpy ride over the years, and investment in the future was hard when you didn't know if you had a future.

2. Obsession with the wow! On both client and agency side people have been looking to do the 'new cool thing' - because it makes them look good (in the eyes of the mainstream marketing and creative communities).

Unfortunately in the absence of good, solid metrics people have started to believe their own lies. Big traffic = success (or at least easy to measure pseudo-success). Therefore objective = big traffic. How do you get big traffic? See (2) above. 400,000 web designers visit site. That's big traffic. Job done.

I don't pretend to know how we get to the killer measurements. But I reckon we'll all get some surprises when 'the clever folk' manage to figure it out.

Personally, I'd love to spend more time hanging out with research heads and trying to untangle some of the mess we've made. I'll even take them to Shoreditch House (we'll have to get them a suit makeover first though, obviously...).

[Sorry I didn't intend to ramble so much]

Posted by: Iain Tait at January 25, 2008 03:27 PM

Your observations are sound. The research industry hasn't done a great job so far of measuring the world of online communications. Most researchers are desperately trying to adapt their offline metrics to an online world, and failing for obvious reasons.
There's a diversity of brand experiences you can have online (somewhat as in the offline world), so it is fairly obvious that no simple metrics are going to provide us with relevant measures of effectiveness. You have to start with an understanding of how the online brand experience has changed consumers' relationships with the brand. If we then know what exposure these people have had to the brand online, we can start to understand the contribution it is making.
It's relatively simple - the problem is that research companies have such a vested interest in protecting the value of their offline tools that they are a little reluctant to acknowledge that these methods are becoming less and less relevant.

Posted by: David Alterman at January 28, 2008 06:16 PM
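[Ed: to make David's suggestion concrete, here is a minimal sketch, in Python, of the exposed-versus-unexposed comparison he describes. The survey records, field names and figures are invented assumptions for illustration, not data from any real study.]

```python
# A minimal sketch: measure a brand metric among people who encountered the
# brand online and among those who did not, and read the difference as the
# online contribution. All data and field names are invented.

from statistics import mean

# Each record: did the respondent encounter the brand online, and do they
# now say they would consider buying it?
respondents = [
    {"exposed_online": True,  "would_consider": True},
    {"exposed_online": True,  "would_consider": True},
    {"exposed_online": True,  "would_consider": False},
    {"exposed_online": False, "would_consider": True},
    {"exposed_online": False, "would_consider": False},
    {"exposed_online": False, "would_consider": False},
]

def consideration_rate(group):
    return mean(1.0 if r["would_consider"] else 0.0 for r in group)

exposed = [r for r in respondents if r["exposed_online"]]
control = [r for r in respondents if not r["exposed_online"]]

uplift = consideration_rate(exposed) - consideration_rate(control)
print(f"Exposed: {consideration_rate(exposed):.0%}, "
      f"control: {consideration_rate(control):.0%}, uplift: {uplift:+.0%}")

# Caveat: in practice you would also need to control for self-selection -
# people who already like a brand are more likely to seek out its site.
```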

Agree with what was said, particularly from Iain. I would add that in the many instances when we have tried to convince clients to evaluate digital in new and clever ways - namely, getting them to evaluate the consumer, not the medium, in order to understand what 'effects' or 'impact' on sales digital has - you quickly realise that there is a mountain to climb. Even clients' consumer insight people still think their tracking study or, god forbid, their econometric model is all that they need.

Secondly, anything new or challenging in digital evaluation means that somebody 'equally clever' on the client side needs to get their head around stuff like correlating their online reputation with their brand equity, never mind actually building a model that might use that.

I actually think it is not just culture and budgets that are the problem. The potential for evaluating intelligently, and indeed in revolutionary new ways, hasn't even begun to be realised. We have a long way to go, somewhere between the vaguely interesting tools the CIA uses to keep tabs on unsavoury individuals online and the highly suspect algorithms of Google and Facebook, but we can certainly get there given a half-willing client.

Posted by: speed at February 4, 2008 11:09 AM

To say nothing of the user experience...

Not only is there a dearth of considered traditional web metrics in digital propositions, there is also a lack of understanding about how the even-more-ambiguous user experience can be measured. When agencies and clients pore over clicks, visits, dwell time and conversion, I can't help but feel they're simply looking for good numbers. The quick fix is to provide data without insight; a spreadsheet of numbers speaks loudly in a room of suited clients, especially when the graphs are going in the right direction.

I would like to see both quantitative and qualitative user research book-ending projects, demonstrating through verbatims, video clips of user engagement and rigorous UX heuristics that the work has a demonstrable benefit. To look at pure numbers is akin to assuming that having 98% of calls in a call centre answered and resolved within 30 seconds is good. It may look good on paper, but if every customer goes away thinking 'that call was rubbish' then the metrics are giving false positives.

Posted by: John at February 5, 2008 10:11 AM
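[Ed: John's call-centre analogy is easy to make concrete. A minimal sketch, in Python with invented numbers, of how a quantitative metric can read as a success while the qualitative experience fails:]

```python
# A minimal illustration of the false-positive point: a quantitative metric
# can look excellent while the qualitative experience is poor.
# All numbers are invented.

calls = [
    # (seconds_to_answer_and_resolve, caller_rated_call_as_good)
    (25, False), (28, False), (22, False), (29, False), (95, True),
]

answered_fast = sum(1 for secs, _ in calls if secs <= 30) / len(calls)
rated_good = sum(1 for _, good in calls if good) / len(calls)

print(f"Answered and resolved within 30s: {answered_fast:.0%}")  # looks great
print(f"Callers who thought the call was good: {rated_good:.0%}")  # it wasn't

# Reporting only the first number is the false positive John describes;
# book-ending it with a qualitative measure catches the problem.
```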

Along the lines of what John said:

Where my dad used to work (a big, big company), they used to pick up and hang up on hundreds of support calls a day so they would show up in the data as "answered within 20 seconds".

Posted by: Rob Mortimer at February 5, 2008 03:13 PM