Tuesday, 4 June 2013
Please don't 'like' this post.
Watching the people we know interact on Facebook can be reminiscent of watching one's colleagues at an office Christmas party that's about to close its doors: the way they all suddenly seem willing to chat up the bartender, tell stupid jokes to get the boss's attention and show naked photos of their lovers (or their dogs and cats) to the waiter... all in the hope of getting another round of free drinks and food. But it's not mince pies and beer that drive Facebook users - you and me included - to share and bare our most shocking, pathetic or downright annoying sides; it's the free dopamine and serotonin that the site offers.
Dopamine is a neurotransmitter that rewards seeking behaviour with a sense of exhilaration. It's an inbuilt learning mechanism that precedes emotion and supersedes thought, ensuring that we learn what we need to do to survive, no matter what. As one writer at The Cerebral Cortex blog puts it:
"[When] the person first realizes that a specific reward occurs after an event, his [dopamine] neurons fire based on their forecast of when the reward will be given. These neurons are crucial in learning new processes because they fire as they practice their prediction skills and as the learning is acquired."
In the same article, the author describes an experiment, carried out by Dr. Wolfram Schultz at the University of Cambridge in 1997, which helped establish the link between dopaminergic surges and unexpected 'rewards':
"In the experiment a light was flashed, several moments passed, and then drips of apple juice were fed to the monkey. Schultz observed the monkey’s neuronal response during the period of time in between the flash of light and the apple juice treat. Schultz discovered the dopamine response [occurred] in the monkeys immediately after he flashed the light. In this case, the monkeys’ dopaminergic neurons were predicting when the treat would be received."
That seems pretty standard: enjoyable reward = happy feelings. But because dopamine is associated with learning, the response quickly fades when the pattern stays the same; there's nothing new to learn. "Over time," writes the same author, "this dopamine response decreased. But when the monkeys found apple juice that was not preceded by a flash of light, their dopaminergic neurons were excited again."
How excited, you ask?
"The unpredictability and surprise of the reward accounted for a dopamine response that was three to four times greater than the response that occurred during the [initial] learning period. The monkeys thus experienced far more pleasure when the apple juice reward arrived at an unpredicted time than when it arrived on schedule right after the flash of light."
I don't know about you, reader, but this pattern of uncertain rewards reminds me of a certain small red box that I can see in the top left-hand corner of my screen as I write this, with its ever-changing tally of 'notifications'. I'm not the only one who's made the connection...
"Facebook’s notification system may be synonymous to the randomly occurring apple juice reward," the author of The Cerebral Cortex writes. "As Facebook users, we log onto the site to check for notifications. Often times, our guesses are just as inaccurate as the monkey’s random predictions for the apple juice reward. It is an exciting feeling to check for Facebook notifications, but the gratification from actually receiving a notification is always greatest."
And this reaction kicks into overdrive when a greater array of learning possibilities are available. "It is possible for the dopamine system to keep saying 'more more more', seeking even when we have found the information," writes Susan Weinschenk, Ph.D, a contributor at Psychology Today. "Research also shows that the dopamine system doesn’t have satiety built in," she adds.
Dopamine has had all sorts of mud slung at it because experts blame addiction on that lack of satiety. But that isn't dopamine's fault, is it? It's the fault of the people who exploit it. People like Facebook's design team.
When E.B. Boyd of Fast Company met with the Facebook design team last year, she summed up their current design strategy this way: "Facebook doesn't just want to catalyze interactions. It wants to catalyze emotions." Boyd went on to describe how, "A sticky note with the word ['serotonin'] scrawled on it is tacked on the wall of a design meeting". Design team manager Julie Zhuo explained why it was there: "[Serotonin is] our term for those little moments of delight you get on Facebook". The vice president of product at Facebook, Chris Cox, added, "It's the science of things you can't reason about, that you just feel." Anyone looking for evidence that emotional manipulation is the endgame of Facebook's interface design need look no further than the designers themselves.
"It wants you to have the same feelings--the positive ones at least--that you have when you cuddle up to friends and family in person," summarizes Boyd. And if the fact that people are having those feelings for an automated interface, rather than for the people themselves, has led to addiction and alienation from real friends & family, then who really cares? All that matters is that it sells.
Serotonin may be responsible for the feeling of happiness when a person has found what they want, but what motivates them to seek it out is dopamine. And dopamine, says Susan Weinschenk, is stimulated by three things: anticipation, unpredictability and incomplete satisfaction. Facebook has all three in spades.
The 'like' button and the 'add friend' button are perfect examples of dopamine and serotonin manipulation. For a start, both limit interaction to a relentlessly upbeat range of options. To the receiver, 'friend' and 'like' sound positive, and to the sender they sound nice too, yet neutral enough to be bandied about pretty recklessly. Anyway, it's not as if people have much choice... if they want to leave feedback or add a connection in a hurry, there sure as hell aren't many other buttons they can click.
The word 'like' is usually not even an accurate reflection of what a person wants to convey when they click the 'like' button. 'Like' might not look as strong as a word like 'love', but how often do you hear a phrase like "I like you," "I like what you think" or "I like how you look"? Not every day, that's for sure. That is why, if I post a link to an interesting article on Facebook and one of my 'friends' clicks the 'like' button just to show me that they read it, I get a little kick of dopamine: it's the inbuilt positive connotations of the word 'like' at work. The same is true when I see a photo of a friend at a party that we went to together, and 'like' it to let her know that I saw it.
In the real world, telling someone that you read the same article or saw the same photo as them just doesn't arouse such strong feelings as it does when you 'like' them or 'friend' them on Facebook. Basically, Facebook has managed to turn the banal act of saying, "Hey, I read/recognized that too" into a little seal of approval, even when there isn't any approval going on. Who wouldn't like that? Well, honest people, for a start. The 'like' button is a lie, a misinterpretation by the middleman of the Facebook interface. And like any middleman, it wants us to like it so badly that it never stops fishing for opportunities to flatter our egos.
That may be why, when a Facebook user meets one of her Facebook 'friends' in person, the meeting sometimes feels anticlimactic and the rapport less positive than it did on Facebook... unless they happen to be good friends in real life, that is. Without the super-positive filter of Facebook people seem less nice, less supportive, and less ideal. The only way to interact with the ego-stroking version of them is to stay on Facebook... forever. Creepy, no?
Now for the 'add friend' button. In real life, people tend to consider who their friends are very carefully. They still like being called a 'friend' by somebody else, though, regardless of how close they feel to that person. Maybe that is why Facebook decided that everyone you connect to on the site has to become a 'friend'. Once you've added them as 'friends' you can divvy people up into acquaintances, colleagues, family members and close friends, etc, but from the moment that they send you a 'friend' request (or vice versa) they are first and foremost a 'friend'.
The word 'friend' is similar to the word 'like'. It doesn't sound very intense, but it does have an inbuilt emotional charge, full of positive connotations. Whenever you log onto your Facebook account, the first thing you see will always be "368 friends". You will never see "16 friends, 52 acquaintances and 300 people-that-you-met-just-once-and-will-almost-definitely-never-see-again". Facebook wants it that way so that, every time you see that figure (368 friends, wow), you'll feel the same emotional charge that you get from being called a 'friend' by another person, multiplied by 368. A dopamine and serotonin rush, multiplied by 368. You might feel it for just a second, but that's long enough to elicit a neurochemical response and bond you to Facebook on a slightly deeper, subconscious level. Which is the real aim of the interface: Facebook is not designed to enable bonding with real people; it's designed to enable bonding with the unconditionally positive version of them that is only available via Facebook™.
Some Facebook users reading this may be thinking, "Well, I already know all about that... in fact, I posted that same article on my Timeline three days ago and 21 friends liked it, so they know about it, too." They are all missing the point: knowing something is a conscious, secondary action, while the dopaminergic response is a subconscious, primary one. It's an involuntary reaction that reprograms one's brain at a neurological level... and it happens so fast that the brain has already been changed by the time its owner 'knows about' it. Maybe it's possible to undo the effect with greater mental control, but is that really desirable?
"With mental control over neuronal firing," writes the author of Cerebral Cortex blog, "Facebook would no longer be so addictive. This is the plus side. But what about other enjoyments? Would they still be as pleasurable if we could personally control how much pleasure we felt?" Never mind about pleasure - if we could control dopamine's action, would we even still be able to learn? To survive? In an experiment in which rats had their dopamine exhausted through chronic overstimulation, they simply died of starvation. They had food but no desire to eat, since it gave them no pleasure.
By triggering a dopaminergic and serotonergic response in nearly every interaction, Facebook is effectively drowning out negative feedback and instilling a false sense of acceptance. It is silencing the social alarms that tell people when an attitude adjustment is necessary to survive. At the same time, it is also deadening them to less stimulating realities. Take a Facebook addict away from his or her computer and witness how apathetic they can be without the constant reinforcement of the 'like' button.
Until the invention of Facebook, the only way to get that same sense of stimulation and enjoyment artificially was to engage in obviously antisocial, self-destructive behaviour: compulsive shopping, drug abuse, joining cults, gambling, etc. The difference with Facebook is that it cleverly disguises its non-stop dopamine buzz as actual feedback from actual people on a 'live' social media site. None of the usual warning bells are set off by this set-up, despite the fact that the usual addictive mechanisms are fully at work. As anyone who's lost a friend to Facebook knows, spending hours on it per day can be every bit as self-destructive and isolating as any other bad habit out there.
By now, a few readers are probably thinking, "So what if Facebook makes people feel better about doing silly stuff online? That's hardly a crime, is it?" And it may not be a crime... as far as we know. There may be nothing at all wrong with 'liking' posts of someone's cat in a hat on their timeline, just to give them that random dopamine hit, like the monkey getting his apple-juice windfall. But what if this is a harm that takes a long time to show its adverse effects? What if it's damaging them, and us, and we just haven't realized it yet? After all, dopamine is the very same neurochemical that rewards a person when they first learn how to speak, read, make food, have sex, listen to friends, fix a bike, graduate. It's the same chemical that programs a person with every necessary survival skill, embedding it deep within their physical brain. It may be that dopamine is just a tad too important for us to be tampering with it gratuitously. I would be especially reluctant to mess around with it on Facebook, where the only demonstrable benefit of my dopamine buzz is to get me hooked on the site so that Mr. Zuckerberg can brag about its potential advertising revenues to his shareholders.
If manipulating dopamine and serotonin somehow prevents our species from dealing with its immediate survival concerns - anything from environmental and political problems to our own emotional well-being - then messing with those systems now, when humanity's survival is poised on a cliff's edge, IS a crime.
Plenty of financial commentators have been singing Facebook's praises for the sneaky ways in which it taps the dopaminergic and serotonergic systems of its users. Never mind that it's been done in much the same way that bile is tapped from a live tiger for the commercial Chinese medicine market, or that it's led to much the same result: a force of nature isolated and rendered useless by technology, for profit.
We're talking about the first computer interface ever deployed to manipulate huge masses of people, and it's doing so in such an indiscriminate way that it supports every kind of behaviour, whether good, bad, malignant, insignificant or benign. Never mind whether it furthers our survival as a species or not. All in all, it's a social experiment far too huge, and too risky, to be carried out by an utterly mindless, self-promoting commercial interface.
I asked what the social consequences of manipulating dopamine gratuitously could be; the advance polls seem to be saying, 'not good, not good at all'. Take the example of smartphones and mobiles: almost everyone now owns one, which I strongly suspect is largely down to the combined dopaminergic allure of having text messages, Twitter, email, voicemail and Facebook all in one place. And yet, every single mobile phone manufacturer on the planet is, somewhere along the line, doing business with a genocidal, rapist warlord from the Congo, simply because he controls the world's biggest and cheapest coltan mines (coltan being a crucial mineral in mobile phone production). So yeah, basically, all of those stories on the six o'clock news about child soldiers being jacked full of meth, trained to use AK-47s, encouraged to gang-rape girls and women, using old people and POWs for target practice... essentially, they are stories about how smartphones and cell phones get made.
That isn't half as shocking as the fact that most smartphone users know where smartphones come from... and still continue to buy the damn things. Even left-wing, humanitarian types who have been resistant to consumerism for decades. What could possibly be so special about owning a smartphone, or a top-of-the-line mobile, that it would cause a person to ignore their deeply-held morals? Perhaps this is yet another unstudied, adverse effect of activating that teacher-neurotransmitter at every new message, tweet or like we get. Perhaps people who have had their dopamine gratuitously tweaked, again and again, actually start to believe that keeping up to date with their notifications is as life-and-death as genocide in the Congo.
When clearly, it is not.
Without any hard evidence, many people will probably continue to believe that Facebook, which uses nature's strongest bio-reward to reinforce trivial behaviour on a mass scale, is not that big of a deal, in the grand scheme of things. Maybe those same people would change their tune, though, if they considered the effect that the site has had on reinforcing the attitudes of woman-haters, gay-bashers, neo-Nazis and all the other right-wing nut jobs that are also using the site.
The Rapebook scandal was another little insight into the possible consequences of Facebook mind-f*ckery: dozens of 'fan pages' were discovered where men (probably all abusers) were posting violent images of and comments about women on Facebook. And with very few exceptions, every she-was-asking-for-it rant, beaten-wife joke and dead-prostitute picture was flooded with 'likes'. Because Facebook is first and foremost a private networking site, anybody wanting to challenge such rabid bigotry had to put their private account in the firing line to criticise the bullies directly. By comparison, clicking 'like' is virtually anonymous. The result is that 'likes' are guaranteed, by default, to outnumber the critical comments left on hate pages.
It seems never to have crossed Facebook's mind that handing out digital serotonin, unconditionally, to whoever signs up for an account is a really bad idea. Especially when that person has a habit of setting up pages like “Hitler was right”, “How to kill a batty boy” or “Get on your knees bitch... and beg!” The site's relentlessly positive interface guarantees that even the creators of such hateful pages will receive far more positive reactions than negative ones, even if those reactions only come from the demented minority that actually agrees with their bigotry. I wonder what this says to the creators of said pages? And to the casual visitors who, accidentally or out of curiosity, end up on their fan pages? Perhaps it makes unconscionable attitudes seem supportable, and vice versa.
Facebook's interface is not only too positive, but it's too easy to use. It favours simplicity over complexity, agreement over debate and superficiality over depth. Just what we need.
Not only has Facebook unleashed a program that allows some severely f*cked up people to reward one another's f*cked-up-ness with innocuous 'friend' and 'like' buttons, but it has also been painfully slow to act in blatant cases of hate speech and bullying. Maybe it's more than just a coincidence that people holding right-wing and extreme attitudes are becoming stupider, louder and brasher by the day. Or maybe it's Facebook's fault for offering hate groups the near-total illusion of support. Or maybe not... but, at any rate, somebody needs to be asking these questions. At present, the only people talking about it are big business and bodies wanting to boost their following, like the Church. Yes, even they are falling over themselves to copy the Facebook formula.
Plying psychopaths, neo-Nazis, misogynists and bullies, or even just weak or uneducated people, with neurochemicals that make them feel loved and accepted by all is a dangerously irresponsible thing to do. And no, I don't think that Facebook activism will solve the problem, because Facebook IS the problem. Rapebook exposed just a few of the ignorant attitudes that have found almost-unconditional support on Facebook, but thousands more will probably grow out of that almost-unconditional illusion.
Websites that make everybody feel great about being whatever they want to be can only work in Utopias full of perfect people. But the irony is that Utopia will never exist so long as Facebook keeps 'liking' our every move.
I should add that I have tried to quit Facebook a few times before, but I always came back, because there were a few friends I never heard from when I wasn't using my account. Friends I knew well and was able to keep in touch with for years before Facebook was invented. As I cancel my personal account this time around, I know there's a chance I may never hear from them again, but hey, everybody I know has email and a phone, so that's their issue. There is no practical reason for them to be so attached to a single site, but clearly there are plenty of irrational reasons at play that need to be addressed.
And my case is hardly unique: nearly everyone I've asked has tried to avoid using the site at some point, and struggled for the same reasons given above: addiction, and alienation from people they care about. Facebook has a monopoly on certain people's affections that mere individual friendships cannot match, which is why the site can act as a dividing line, keeping real-life friends apart instead of bringing them together. Keeping you in touch with your friends is not Facebook's real goal; keeping you on their site is.