Marketing’s Weekly Dose of the Truth

Ken Magill


A Big Step Toward a Common Language

8/16/11

By Ken Magill

A years-long industry-wide effort at standardizing definitions of email metrics is getting a major boost this week as Responsys becomes the first large, enterprise-focused email service provider to formally adopt them.

Dubbed the SAME Project, or Support the Adoption of Metrics for Email, the initiative dates to 2007. Over the years it has involved a who’s who of email marketing executives.

Most notably, the effort began with Loren McDonald, vice president of industry relations at Silverpop, and David Daniels, founder of The Relevancy Group, as co-chairs. Stephanie Miller, formerly of Return Path and now vice president of digital messaging at Aprimo, also played a leadership role.

The current co-chairs are Luke Glasner and John Caldwell of Red Pill Email.

The project—undertaken by the Direct Marketing Association’s email experience council’s Measurement Accuracy Roundtable—standardized the definitions of metrics in three key areas: delivery, opens/render, and clicks.

“Responsys … is pleased to support the eec’s SAME Project in the adoption of standardized metrics for email marketing,” the company said in an emailed statement.

Glasner said he believes Responsys’ adoption of the standards may lead other large enterprise ESPs to adopt them. He said that before Responsys, the vendors that adopted the standards in the year since their release were those aimed at small- to medium-sized businesses.

“Responsys lends a certain level of credibility to our project,” said Glasner. “We feel that with the addition of one of the leading ESPs, maybe that will encourage some of our other ESPs that have been sitting on the fence” to adopt them.

The need for standards became glaringly apparent in a study the eec conducted in 2007.

For example, there was no standard meaning for “delivered.” The study found that 79 percent of email service providers surveyed defined “delivered” by deducting all failures from total mailed, while 21 percent calculated it by deducting only hard bounces, failures where the address no longer exists.

Email marketers were in even greater discord on the “delivered” metric: 63 percent defined it as total mailed minus total failures, 11 percent defined it as simply total mailed, and 10 percent defined “delivered” as only those emails that reached recipients’ inboxes rather than their spam folders.
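To make the discord concrete, here is a minimal sketch, with invented numbers, of how a single campaign can report three different “delivered” figures depending on which of the definitions above a sender or vendor uses. The counts are purely illustrative and are not drawn from the eec study.

```python
# Hypothetical campaign counts (illustrative only, not from the eec study).
total_mailed = 100_000
hard_bounces = 2_000    # addresses that no longer exist
other_failures = 3_000  # soft bounces, blocks and other failures
reached_inbox = 85_000  # as reported by a third-party inbox-placement panel

# "Delivered" under the three competing definitions described above.
delivered_minus_all_failures = total_mailed - (hard_bounces + other_failures)  # 95,000
delivered_minus_hard_bounces = total_mailed - hard_bounces                     # 98,000
delivered_inbox_only = reached_inbox                                           # 85,000

print(delivered_minus_all_failures, delivered_minus_hard_bounces, delivered_inbox_only)
```

The same send thus yields a “delivered” figure anywhere from 85,000 to 98,000, which is exactly the benchmarking problem the SAME Project set out to remove.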

After several years of debate, the SAME Project determined that even the term “delivered” was a misnomer, because what happens to an email after it is accepted by the recipient’s server can be difficult to know.

As a result, the SAME Project has adopted the terms “accepted,” “accepted rate,” “bounce,” and “inbox placement rate” as more accurate terms for metrics to be used related to email delivery.

According to the SAME Project, “accepted rate” is: “The total amount successfully delivered to the server divided by the total email deployed (unique records). The amount successfully delivered is the total amount attempted minus all failures, including hard bounces.”
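Read literally, the quoted definition amounts to a simple ratio. The following is a minimal sketch of that calculation using invented figures; it is one interpretation of the definition above, not code from the SAME Project.

```python
def accepted_rate(total_deployed: int, total_failures: int) -> float:
    """Accepted rate per the quoted definition: emails accepted by receiving
    servers (total attempted minus all failures, including hard bounces)
    divided by total email deployed to unique records."""
    accepted = total_deployed - total_failures
    return accepted / total_deployed

# Illustrative figures: 100,000 unique records deployed, 5,000 total failures.
print(f"{accepted_rate(100_000, 5_000):.1%}")  # 95.0%
```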

The standards were developed to address two main concerns: without standards, benchmarking is difficult, and when marketers migrate from one vendor to another, it is not uncommon to find that the new vendor calculates metrics differently than the previous one did.

“If we can’t agree on these metrics, email will never get any respect,” said Glasner.

Heather Blank, vice president of strategic services at Responsys, said standardized metrics will help email marketers and vendors get higher-quality benchmarking and trending data.

“Right now everyone tracks things differently so there’s a lot of confusion,” she said. For example: “When MarketingSherpa surveys clients, a lot of those clients are looking at things differently so there’s not a lot of quality to that data. Also, when clients move from one provider to the other, there are a lot of challenges around legacy reporting and trending. We think this will help out there, as well.”

Blank added that she wants to see a SAME Project-like effort undertaken for social media.

“I’d like [the industry] to quickly get ahead of the charge on normalizing the types of metrics we want to look at on some of the emerging channels such as social and mobile,” she said.

Comments




Posted by: Call it the way it is
Date: 2011-09-03 21:41:34
Subject: BS

Please. Let's see an out of the box report with the Accepted metric. This is BS PR and you are being hoodwinked.
Posted by: Thomas Grimes
Date: 2011-08-23 13:43:29
Subject: A step in the right direction.

This is a good step, in my opinion. In my role I spend a lot of time explaining email metrics and concepts to editorial and sales staff. The sales team may want to compare the performance of our campaigns with that of a potential client, but I have no idea if that client (or the client's ESP) is using the same calculations as I am. Standardization of metrics would allow us to better compare email performance across platforms.
Posted by: Luke Glasner
Date: 2011-08-17 17:02:46
Subject: Standards do provide value.

Actually, ESPs do measure things differently, based upon our research in the field; for example, few people realize that ESPs use four methods to calculate the click-through rate. We have studied this in the field for a few years now.

However, the standards provide value in many ways, for example industry benchmarking. Doesn’t that provide insight and value to marketers? How can we have industry benchmarks when we can’t even agree on how to measure CTR?

Standards also allow more accurate creative testing using the render rate and/or CTO, because we use metrics that filter out people who did not see image-based creative changes that may have been blocked, and so actually test the factor we are changing. Otherwise, we are testing both the image change and the effects of image blocking. What really increased your response rates: the fact that the CTA moved up because images were blocked, or the fact that we changed the button graphic? Better testing controls produce more accurate tests, which should lead to better business decisions.

Saying there was confusion in the industry does not imply disrespect for experienced email marketers; rather, this helps newer industry professionals get up to speed by providing clarity and ending that confusion for people who are less knowledgeable about email marketing.

Luke Glasner, Measurement Accuracy Board, eec
Posted by: Steve Henderson
Date: 2011-08-17 12:43:31
Subject: Sorry, but...

I agree with SJO. When we are only talking about the basic concepts of email (sent, delivered, not delivered, inbox folder, bulk folder), how can statements be made like “everyone tracks things differently” and “a lot of confusion”? Does this not display a certain lack of respect for the people in our industry, if the consensus is that marketers are confused by the terms sent, delivered and bounced?

If these new standards are aimed at helping people currently confused by the terminology, is introducing a range of new terms really going to help them? Would it not have been easier to clarify the meaning of delivered? Maybe clarification and standardisation is needed for PAYG-type email providers where marketers frequently switch providers. I can only speak for my company, but at Communicator Corp, email marketing software is not just given to someone without training and support.

This is purely my opinion, but I can’t see how changing the names on a couple of columns on some summary reports will generate clarity across a newly enlightened client base. Instead, it will cause months of support calls: “Oh, so accepted means delivered... okay... well, my manager wants the old reports back for his board meeting, and my analyst says his Excel spreadsheet is now broken.”

Steve Henderson, www.communicatorcorp.com
Posted by: SJO
Date: 2011-08-17 11:23:59
Subject: True Insight?

I'm all for standard metrics that provide true insight, but I'm less convinced that this is a giant leap forward. Really? Delivered is now Accepted? Opened is now Rendered? That drives actionable insight? Was there really confusion over the delivered metrics? Changing the name to Accepted advances the game? Sorry. This is first grade stuff. Maybe that's as good as it gets in the "new school."
