Well, I'll be Damned: Queue Up the Violins
By Ken Magill
Okay folks. I take back every critical word I’ve said about privacy advocates over the last 15 years. Every. Single. One.
Why? Because one of them has risen to a challenge I’ve repeatedly issued: the challenge of actual harm.
Oh, the writer certainly wasn’t responding to me. She very likely has no idea who I am.
But she did it. For years I have been saying that no one has ever been hurt by data-driven marketing and for that reason, privacy advocates always talk hypothetically. You know the drill: Personal data could be used to deny insurance, or employment, or whatever.
Never mind that the stuff people put online, such as photos of themselves pregnant and holding a lit cigarette (a real example), can be a pretty decent indicator that they’re not necessarily the most responsible humans and maybe not a good risk.
People must be protected from evil insurance companies, employers and financial institutions.
Otherwise, heart-wrenching terribleness like the following may happen. [Warning: After I read what you’re about to read, I literally fell out of my chair, curled up on the floor in the fetal position and began to sob uncontrollably. … Okay, well the sobbing started after reading it and consuming three double martinis, but that doesn’t make it any less tragic.]
In a piece in The New York Times on Sunday headlined “Facebook Is Using You,” law professor Lori Andrews recounted the following:
“Stereotyping is alive and well in data aggregation. Your application for credit could be declined not on the basis of your own finances or credit history, but on the basis of aggregate data — what other people whose likes and dislikes are similar to yours have done. … When an Atlanta man returned from his honeymoon, he found that his credit limit had been lowered to $3,800 from $10,800. The switch was not based on anything he had done but on aggregate data. A letter from the company told him, ‘Other customers who have used their card at establishments where you recently shopped have a poor repayment history with American Express.’”
That poor man.
His credit limit was lowered because he apparently used his card where deadbeats tend to frequent. All because of stereotyping.
Stereotyping. How evil. After all, not everyone who rings up big charges at Skanky’s Liquor and Lap Dance is a bad credit risk.
Sniff. I’m … I’m … shaking as I type this. I … I … Oh, Gaawwwwddddd!
I’m soooooo sorry … sniff … privacy advocates. How could I … sniff … have doubted you?
I mean, there are no other credit card companies, right? He has nowhere else to go, right?
And he is entitled to borrow as much money from whomever he wants for as long as he wants, right? And we should ignore the fact that credit cards are unsecured debt and that if he defaults, the card issuer has no way to recoup its losses other than to ding everyone who makes their payments, right?
How dare Amex use predictive-behavioral information to protect its own financial interests.
I wonder: Will he ever father children after this? Be able to tell his wife he loves her? Kiss a puppy?
Geez. I’m almost too emotionally drained to finish this column.
Message to Andrews: It’s not stereotyping. It’s profiling. And it works.
For example, fairly accurate generalizations can be made about people based on seemingly trivial behavior, such as, say, whether they get their morning coffee from Dunkin’ Donuts or Starbucks.
Guess which establishment’s patrons are more likely to shop at Whole Foods? And guess which establishment’s patrons are more likely to drive Priuses or Subarus?
See how that works? Database marketing is simply a sophisticated extension of the above common-sense exercise. No one gets hurt. Risks and waste are mitigated. It’s all very simple, except in My Little Pony Land, where using established predictive techniques to mitigate risk is called “stereotyping.”