The BBC recently aired an episode of Panorama that aimed to lift the lid on what Facebook knows about you. It’s an intriguing discussion point, and one that I would hasten to guess the majority of users have no idea of. We create an account, and apart from tweaking a few minor privacy settings, we post all sorts of details of our everyday lives, ready for Facebook to cash in on.
Yep, you read that right. Facebook holds personal data about us all, which is then used to target ads at people based on our preferences.
Does that make it a marketeer's dream or a privacy nightmare?
Both, I guess. As long as you are aware of what might be used for marketing purposes, then on a superficial level I personally don't have an issue with it. Is that because I am a marketeer? Maybe, but I do genuinely find some of the ads I see useful and interesting. And that's all because of how I interact online.
Let’s look at this in a bit more detail:
We – the Facebook user community; all 32m of us in the UK – produce Facebook's content for free while advertisers queue up to sell their products. We are essentially crowd-funding Facebook. A closely-guarded, top-secret algorithm powers Facebook, tracks our online activity, and holds considerably more data on us than any government agency. Facebook knows everything you do online.
Everything!
Antonio Garcia-Martinez, a former Product Manager at Facebook, told Panorama that Facebook matches its targeting for advertisers against data from your real online life, including your credit card purchases. As soon as you go online, Facebook is gathering data for advertisers’ use.
Did you know this? It’s kind of scary the first time you learn it.
What’s even more scary – or mind-blowingly clever if you’re a marketeer – is the fact that Facebook’s algorithm drives a quarter of all internet traffic. Yes folks, we are visiting web pages because Facebook told us to, and we don’t even realise it.
So how do you feel about that? Possibly a little affronted, right?
Starting to feel a little like you’re being brainwashed?
I am still in awe of the marketing possibilities, but it starts to concern me when we get to politically- and hate-related advertising. This is a whole lot more serious than accidentally spending £200 on a pair of trainers because you somehow found yourself back on the shoe shop's website…
Take Britain First, for example. If you don't know who they are, they're a far-right group who stir up hatred and aggression through their Facebook page. It's subtler than that in practice, but that's the basic premise. Thanks to Facebook's data-mining and relatively relaxed advertising rules (anyone can have an account and advertise what they like, as long as that specific advert doesn't flout the rules of the automated advert-checking algorithm), Britain First have built a following of over 1.6m, and openly admit they promote their videos to people who may sympathise with a post's specific message, based on each individual's online activity. Panorama reported that the Brexit vote had also been influenced in this very way.
Imagine you had been reading some articles and statistics online about, let's say, current immigration laws. Facebook collected this data, which Britain First-type advertisers then used to promote a video to you. Next thing you know, your innocent questions about immigration are being answered by a nice man in a suit, and the answers point you towards leaving the EU as the sensible vote.
All because you looked something up online.
See how there are concerns about brainwashing now? Scary, huh?
It's not just in the UK where selective political advertising has influenced a vote. The Trump campaign spent $70m on Facebook ads pre-election, according to Gary Coby of the Republican National Committee. Seventy million dollars! Again, as a marketeer, I completely applaud any business using readily available data to target potential customers, but the issue with Facebook advertising is that it is not regulated like advertising elsewhere on the internet. This leads to propaganda, fake news, and, yes, brainwashing. And that's where it starts to become dangerous.
As Damian Collins, Chair of the Culture, Media and Sport Committee, said, Facebook advertising is exploited because it's not regulated. People can share what they like – fake or true – and an ad will basically only be removed if a Facebook user reports it (and even then there is no guarantee it will be removed; it all depends on what the "community guidelines" algorithm decides).
Interestingly – or terrifyingly, depending on your user/marketeer bias – Simon Milner, Facebook's monosyllabic, straight-answer-evading Policy Director, said that Facebook will never ban fake news (i.e. misleading stories) because this would constitute censorship, which Facebook is vehemently against. Part of me does wonder if they have a duty of care to their users, though, especially when it comes to the spreading of far-right propaganda and messages of hatred.
The simple answer to this dilemma, and a way to protect yourself, is to leave Facebook altogether. If you're not on there, you can't be influenced by any advertisers, political or shoe-selling. But in this day and age of internationalism, we need social media to keep in touch with our families and friends.
The other option is to stay on Facebook but never to like, comment on, click, or share anything. Even your friend’s baby pictures. Again, not entirely a practical solution.
I hope you're now more aware of what Facebook knows about you, and that you'll think twice about some of the Facebook content you engage with – to curb this unethical brainwashing, and to keep Facebook the useful and fun communication tool it always was.