Andrew Przybylski, a professor of human behaviour and technology at Oxford University, is a busy man. It’s only midday and already he has attended meetings on “Skype, Teams, in person and now FaceTime audio”. He appears to be switching seamlessly between these platforms, showing no signs of mental impairment. “The erosion of my brain is a function of time and small children,” he says. “I do not believe there’s a force in technology that is more deleterious than the beauty of life.”
Przybylski should know: he studies technology’s effects on cognition and wellbeing. And yet a steady stream of books, podcasts, articles and studies would have you think that digital life is lobotomising us all. So much so that, in December, Oxford University Press announced that its word of the year was “brain rot” (technically two words, but we won’t quibble) – a metaphor for trivial or unchallenging online material and the effect of scrolling through it. All this has sown widespread fears that the online world that we – and our children – have little choice but to inhabit is altering the structures of our brains, sapping our ability to focus or remember things, and lowering our IQs. Which is a disaster, because another thing that can significantly impair cognitive function is worry.
It may come as some relief to hear, then, that for every alarmist headline there are plenty of neuroscientists, psychologists and philosophers who believe this moral panic is unfounded. “Since 2017, there has been a constant drumbeat of: ‘Screens and tech and social media are a different universe that is bad for you and bad for your kid,’” says Przybylski. “And two things happen. The first is low-quality research that confirms our biases about technology. It gets immediate press because it’s consistent with our existing biases. It’s really easy to publish low-quality research that kind of shows a correlation, and then exaggerate it, because it’ll get attention and it’ll get funding.”
No one is denying that dangers lurk online, but that doesn’t mean you’re guaranteed to come to harm. “Living is risky, leaving the house is risky, crossing the street is risky,” says Przybylski. “These are all things that we have to help young people learn to do – to size up risks and act anyway. The internet is risky.”
There has also been, he says, “a real push in opinion pieces and popular-press books that are sloppy scientifically but stated so confidently. The ideas in these books are not peer-reviewed.” The published studies they cite tend to have small samples and no control groups, and to be based on associations rather than proving cause. “People will say: ‘The iPhone was invented in 2007 and Instagram became popular in 2012 and, oh my God, look, tech use has gone up at the same time mental health has gone down!’ It seems like common sense – that’s why you have this kind of consensus. But it just isn’t scientific.”
In 2023, Przybylski and his colleagues looked at data from almost 12,000 children in the US aged between nine and 12 and found that screen time had no impact on functional connectivity (“how different parts of the brain kind of talk to each other”, he explains), as measured with fMRI scans while the children completed tasks. They also found no negative impact on the children’s self-reported wellbeing. “If you publish a study like we do, where we cross our Ts, we dot our Is, we state our hypotheses before we see the data, we share the data and the code, those types of studies don’t show the negative effects that we expect to see.”
And of course no one talks about the positive effects of tech, such as finding connection and community. “If we zoom out, we find that if young people have access to phones that can connect with the internet, if they have high-speed internet at home, their wellbeing is higher. They say they’re happier across a wide range of metrics of wellbeing.
“When the Lancet commission on self-harm does an evidence review, when the National Academy of Sciences in the US does an evidence review, when academic researchers do their meta scientific research, these things don’t come out in line with this tech panic,” he says. “That’s because this tech panic is not based on evidence. It’s based on vibes.”
The “study” that spearheaded this cascade of concern in 2005, and is still quoted in the press today, claimed that using email lowered IQ more than cannabis. But Shane O’Mara, a professor of experimental brain research at Trinity College Dublin, smelled a rat when he couldn’t find the original paper. It turns out there never was one – it was just a press release. That finding was the result of one day’s consultancy that a psychologist did for Hewlett-Packard. He would later state that the exaggerated presentation of this work became the bane of his life.
Alongside a survey on email usage, the psychologist conducted a one-day lab experiment in which eight subjects were shown to have reduced problem-solving abilities when email alerts appeared on their screens and their phones were ringing. He later wrote: “This is a temporary distraction effect – not a permanent loss of IQ. The equivalences with smoking pot and losing sleep were made by others, against my counsel.”
The studies finding changes to brain structure sound particularly alarming, even if they are looking specifically at people with “problematic internet use”, as opposed to the general population. The trouble with these studies, says O’Mara, “is that they can’t determine cause and effect. It may be that you go on the internet [excessively] because you’ve got this thing there already. We simply don’t know, because nobody has done the kind of cause-and-effect studies that you need, because they’re too big and too difficult.”
Besides, brain structures change throughout life. Grey matter has been observed to decrease during pregnancy, for instance, and start regrowing after, along with other brain changes. “The brain is remarkably plastic,” agrees O’Mara.
He also thinks we’re being deeply ahistorical when we berate ourselves for scrolling cute animal reels, celebrity regimes or cup-winning goals on social media. “Humans have always been distractible. We’ve always sought solace in the evanescent. If you look at the history of media in the UK, just as a simple example, back in the 1940s, 1950s, 1960s, how many millions of tabloids were sold every day? Staggering numbers, because people indulged in that stuff. This is something people have always done, and we’re being a bit moralistic about it.” Has the internet age led to greater numbers of plane crashes or patients dying on operating tables? “The answer is no: we’re much better at all of those things.”
We’ve always had to watch out for our “attentional bottleneck”, he says. “For as long as I’ve been reading and researching in psychology, we’ve always taught our students: ‘Don’t do two things at once. You can’t.’” Multitasking and its associated dilution of efficacy were not invented by the internet. As Przybylski alluded to, having children is a classic route to task-juggling, with constant interruptions leading to intelligent adults not being able to string a sentence together. Similarly, if you use your smartphone while driving, of course you’ll increase the likelihood that you’ll crash.
What about the terrifying proclamations that tech is on the rise while IQ is in decline? I call Franck Ramus, the head of the cognitive development and pathology team at the École Normale Supérieure in Paris. Mercifully, he says it’s not yet clear whether IQ is truly going down. Scores rose globally during the 20th century but growth started slowing towards the turn of the millennium. This plateau effect had long been expected, as we neared the limits of the human brain. “Height has been increasing over decades, but we’re never going to reach three metres, are we? So there are limits to human physiology, including brain size.”
Any small IQ decreases that do seem to have been detected, Ramus says, aren’t considered conclusive at this point – the studies would need further replication. “There’s a meta-analysis of all the data until 2013, and the score seems to be progressing at least until 2010 or so. At the same time, it is true that some studies have documented a slight decrease in some countries. For example, there is a widely discussed Norwegian study that found a slight decrease in the last two decades. But there are also a greater number of studies that continue to observe increases.”
As for screen exposure, he says, what do we even mean by that? “It could be anything. The screen is just a medium but what matters is content. So when you talk about screen, you might as well talk about paper. Paper is another medium, and anything can be written on paper.”
This brings us neatly to Plato, who wrote about brain rot in relation to the invention of writing, says Tony Chemero, a professor of philosophy and psychology at the University of Cincinnati, whose 2021 paper in Nature Human Behaviour asserted: “Technology may change cognition without necessarily harming it.” “This worry that people are having, Plato had as well, 2,500 years ago or so,” Chemero says, “writing about how the written word will make people stupid because their memories will be worse and they’ll be worse at telling stories.”
Chemero does not love smartphones or AI – and laments the hassle the latter has created for professors like him, who must now find new ways to check that their students aren’t handing in ChatGPT-generated work. “But the one thing that they don’t do is make us stupid,” he says. “Over the history of hominids, many of our biggest challenges have involved adapting to new kinds of environments – and that’s being smart. This is just a new environment we’re in.” So while he can still remember the phone numbers of high-school classmates, younger people’s brains are simply freed up for other activities. “What we really want from technology is to do the things that are difficult and boring, such as lots of complex calculation, rote memorisation: humans just aren’t very good at that without technology.”
The relevant question, he says, is what is memory in this situation, when we’re outsourcing some of it to tech? “Is it something that your brain does or is it an ability that you have? If [technology helps you] remember more things while your brain does something different, I don’t think that your memory is worse. It’s just different. What really matters is what we are able to do.” After all, the secret to human success has always hinged on our use of tools. “Being smart is being able to do lots of stuff. And I don’t think our phones are making us less able to do many things.”
Gary Small, the chair of psychiatry at Hackensack University Medical Center in New Jersey, has studied potential harms and benefits of digital technology use. He too steers clear of studies based on mere associations. “To my knowledge,” he says, “there’s no compelling evidence that using digital technology or using devices is going to cause permanent brain damage.”
In terms of the negatives, he believes that certain platforms and content can be addictive. “It could be porn, shopping, gambling. This technology heightens human behaviour, puts it on steroids, accelerates all these issues.” He mentions a study from two years ago in which he and colleagues sent a group of 13-year-olds off to nature camp, assessing their emotional intelligence (reading emotions on faces) and social intelligence (describing a social interaction) before and after. “And we found that five days away from screen time led to significant improvements in both, and we had a control group on that.” This showed, he says, that the negative effects of phone use are temporary, and go away when we put our phones away.
And there are positives. “In our work, in our social lives, screens keep us connected. We can be much more efficient. We can get information much more rapidly. I can collaborate with people across the globe.” While the fatigue you can get from not taking breaks is real – “you can get physical symptoms, headache, neck pain, shoulder pain, mental fatigue, no question about it” – using the internet can be stimulating brain exercise in itself. “The study we did, for me at least, was cause for some optimism.” His team taught older people to search the internet while they were in an fMRI scanner and found that their neural activity increased.
The study also states: “Certain computer programs and video games may improve memory, multitasking skills, fluid intelligence and other cognitive abilities. Some apps and digital tools offer mental health interventions providing self-management, monitoring, skills training and other interventions that may improve mood and behaviour.”
So rather than hampering your cognitive (and possibly even parenting) abilities by fretting about brain rot, he says, “be smart about how you use your devices. Manage the devices – don’t let them manage you. I try to practise what I preach” – namely, by taking regular breaks and choosing appropriate modes of communication. “So many times I see these long email threads trying to deal with complex, nuanced issues. Best response is: ‘Call me or let’s meet.’”
Przybylski, meanwhile, doesn’t shield his young children from smartphones or games consoles. “It’s perfectly OK to have some time dedicated to leisure, and why not some screen activities,” he says. Content quality is a consideration, and, “like every activity, it should be only a reasonable amount of time. I think many of the negative effects that are attributed to screen exposure are not intrinsic to screen exposure. They just reflect the fact that time can be lost for other activities that would have positive effects.”
Likewise, O’Mara isn’t worried about spending most of his working day using a computer. “We just need to develop new ways of thinking about how we interact with these media. Go out for a walk. Get up and get moving. That’s very, very good for you, if you want to relieve a feeling of incipient anxiety caused by these things.” It’s all about balance and avoiding the temptation to multitask too much. O’Mara suggests making time for reading a book uninterrupted, or leaving your phone in another room while you’re watching TV. “Be intentional about your media choices,” he says.