Stupid Marketer Watch: Worse than Pointless
By Ken Magill
An ad-agency-executive colleague called me yesterday and asked: “Do you have the Direct Marketing Association’s Statistical Fact Book?”
“No, I don’t,” I answered. “Why do you ask?”
“Because Bob [an account executive; not his real name] wants to know what average direct response rates are for [client’s] industry.”
“Just for [client’s] industry?”
“No, for everything.”
“Why is he asking for this information?”
“Because [client] is insisting on it.”
“Have you explained to him that even if that information exists, it’s worse than useless?”
“Have you explained to him that if he gives an answer other than ‘no such figures exist’ he’s setting your team up for failure?”
“Yes. But as usual, he’s not managing his client properly. I keep telling him these things, and I keep getting pushback.”
Later, Bob apparently found some numbers and sent them over as demanded.
Not managing his client properly, indeed. What’s more, by insisting on getting so-called industry-average response rates “for everything,” the client is self-identifying as an incompetent boob.
Here’s why: The only response rates that should matter to a marketer are that particular marketer’s response rates. A “response rate” in and of itself is meaningless.
A campaign can get a high response rate but draw some really crappy customers. Likewise, a campaign can draw comparatively low response rates but result in the acquisition of some customers with exceptionally high average order sizes and lifetime values.
For example, I once worked for a small, business-to-business cataloger in a suburb of Buffalo, NY.
If I remember correctly, response rates of between four and five per thousand were acceptable to us. That’s less than half a percent.
But because our customers’ average-order dollar amounts were north of $200 and their lifetime values were also comparatively high, the company made money.
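The arithmetic behind that trade-off can be sketched in a few lines. The numbers below are purely illustrative (the column gives only a roughly half-percent response rate and a $200-plus average order for the cataloger; the other campaign is a hypothetical for contrast):

```python
def revenue_per_thousand(response_rate, avg_order):
    """Expected gross revenue per 1,000 pieces mailed."""
    return 1000 * response_rate * avg_order

# Campaign A: high response rate, small orders (hypothetical).
a = revenue_per_thousand(response_rate=0.02, avg_order=25)    # 2% response, $25 orders
# Campaign B: low response rate, large orders (like the cataloger).
b = revenue_per_thousand(response_rate=0.005, avg_order=200)  # 0.5% response, $200 orders

print(a)  # 500.0
print(b)  # 1000.0
```

Campaign B’s response rate is a quarter of Campaign A’s, yet it brings in twice the revenue per thousand pieces, and that is before counting the higher lifetime value of its customers. Judged on response rate alone, B looks like the failure.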
Benchmarking against industry-average response rates, without the context of related metrics, is worse than pointless. It holds people accountable to a number that can actively work against their efforts.
If my colleague’s team sends a campaign that doesn’t meet the magical industry-average response rates Bob sent over, the team will be taken to task for a “failed” campaign that may have worked just fine once other metrics are taken into account.
Bottom line: The client was being a marketing ignoramus for demanding industry-average response rates and Bob was stupid to supply whatever it was he found.
Whatever the number was, nothing productive will come of it.