Some years ago, I was travelling on the underground in Taiwan, and a woman opposite me was messaging someone on her phone. At the next stop her friend got on, and they seemed really excited to see each other. They talked for about 30 seconds, and then something strange but increasingly normal happened: the excitement and conversation just died, and after a short pause, their heads fell forward, and they looked down at their phones and began that familiar robotic action of flicking up their screens with their index fingers. It was like someone had turned off their aliveness switches.
I find the sight of so many people glued to their phones quite depressing – people walking down the street with their heads down, young couples in restaurants staring at their screens, kids trying to get attention from a mum who’s not looking up. I’m not sure why it bothers me so much, but there is something sad about so much distraction, and about how it seems to pull people further away from more fulfilling face-to-face interactions. Many people also don’t seem to realise that it’s happening, or that they are being manipulated by very clever algorithms.
Much of this technology is not neutral: most social media apps and websites are competing for your attention, and for many of the big companies this is driven by the advertising model – the more time you spend on site, the more revenue the company gets from advertisers. In a 2017 TED talk, Tristan Harris, who studied how technology steers people’s thoughts while working at Google, explained how engineers at these tech companies study how your psychology works to figure out the best ways to get your attention. He claimed that these companies steer the thoughts of a billion people, since any notification on your screen prompts thoughts you didn’t intend to have.
Beyond the distraction and wasted time, one problem with this competition for your attention is the type of thoughts these companies lead you to have. News sources have long known that stories that shock people, or make them angry or scared, attract more viewers and readers, so they have tended to give those stories prominence. Harris says that in the online world, outrage is especially good at capturing attention: the stories that make people feel outraged are the ones they engage with and share most, which means the clever algorithms will show you more of them, because that increases time on site. Think for a moment about what that does to your view of the world, and to the things you spend time reading, thinking about and talking about. Harris believes this is “changing our ability to have the conversations and relationships that we want with each other.”
Social media is great for keeping in touch with people, but I have noticed something unsettling about how my behaviour changes when I post a photo on Facebook: likes start popping up, and it feels good. Then I find myself compulsively checking for more likes and more comments, as the good feeling turns into wanting more: ‘20 likes; I hope it gets to 30’, my thinking goes, as if I will feel okay then, and I keep checking who has liked it and who hasn’t.
That compulsive checking and wanting more is a common experience, and again, the technology involved here is not neutral: Facebook have designed their likes system to make you want to keep checking. In an article in the Guardian, Sean Parker, founding president of Facebook, admitted that Facebook engineers designed the system to give you a little hit of dopamine when somebody comments on or likes your picture or post. Dopamine is a feel-good neurotransmitter connected to happiness, motivation and desire, and each time you get a like, it not only gives you a little bit of pleasure but also drives you to seek more.
Social media has brought a lot of benefits, but the amount of distraction and the effect on people’s ability to connect is worrying. This, along with the way that we are targeted with news stories, and even lies, leads Harris to say that he doesn’t know of a more urgent problem right now.
I hope there will be changes – tech companies designing more ethical systems, perhaps, or maybe some kind of waking up, as people become more aware of how they’re being manipulated and realise they have more of a choice about what they pay attention to.