To marketers, comprehensive, accurate consumer data is key to attracting customers, increasing sales and building long-term loyalty. The better you ‘know’ your customer, the more efficient your advertising spend and the happier the customer, who is presented with a product or service that he or she actually wants. It’s a model that has been around for ages and has only gotten more sophisticated in the digital age.
Even when hiccups arise in the processing and interpretation of data for advertising purposes, they haven’t historically caused too many waves. Anyone who has bought a book as a gift on Amazon in the past decade is familiar with being marketed baseball books despite never having touched a bat, or with getting an enthusiastic recommendation for the My Little Pony movie because no one could be bothered to change the Netflix family account profile before pressing ‘play.’
Sometimes these ‘algorithmic fails’ can be more insidious. I once received an email from LinkedIn recommending I connect with my departed grandparents. Apparently, after LinkedIn convinced my mother to share her contacts, ‘profiles’ were created for my grandparents, who had passed away years before. But even then, only brand loyalty was damaged. I was disgusted, but I didn’t feel ‘played.’
So far, so easy, right?
Well, the recent saga surrounding Facebook, Cambridge Analytica, Russia and Trump has produced a much more dystopian scenario.
Facebook’s much-lauded ‘social graph’ is more personal and more nuanced; it goes beyond what you buy, eat and ‘like’ to make assumptions about your values, your beliefs and the like-minded ‘tribe’ you must belong to.
It’s no longer just cereal makers, auto insurance brokers and the Gap looking to target you; it’s now political campaigns, movements and the like. And while it’s not too controversial to assume I like shoes or need some wrinkle cream, trying to figure out what I believe in and how to get my vote has turned the process into a dark art.
So what is going awry, and what do consumers need to know? It’s not just the data being collected as we interact with social media (although it’s really important to be highly aware of this); it’s how we, as humans, are being ‘categorized’ and pitted against one another.
While it’s unlikely that branding me a ‘shoe person’ could stimulate much argument between friends who may disagree on the utility of a pump, things have become much more complex and (I’d say) disturbing.
Here’s an example that has given me much to consider: within the advertising settings in Facebook (Settings -> Account Settings -> Ads -> Your Information -> Review and Manage Your Categories), I’ve found myself tagged by Facebook as Liberal (Extreme).
Is that accurate? Well, no. I’d say I’m more centrist – fiscally conservative, socially liberal. But that’s not the problem, really. What’s upsetting is this: I’ve posted (with below-average frequency) about women’s issues, sensible gun laws, support for Parkland, for Newtown. I’ve shared lots of mother-related content and the occasional piece expressing my concern about President Trump. And Facebook considers some mix of these issues, or all of them, as both liberal and extreme.
And this is where data collection, and its exploitation, have gone very, very wrong. It seeks to get us to line up, pick a side and fight with ‘others.’ It marginalizes, it coerces and it makes us question who we are. Support for sensible gun control legislation is widespread across the US and increasingly hard to argue against, and yet it’s a view that puts me in a bucket: troublesome, extreme, liberal, ‘that person’; a clear target of a paying Facebook client’s spend (regardless of which side of an issue that client is on — I don’t get to choose). And that, folks, is how it’s all gone down of late.
The issue is wider, deeper and more worrisome than can be communicated in a simple blog post, but the key thing to remember is this: social media sites like Facebook are ‘free,’ but nothing in life comes without a cost. If we don’t stay alert to what we’re giving up, the consequences will be unexpected, unintended and far wider than what we are seeing today.