
Latest news with #AndrewPrzybylski

Grow a Garden: The surprise Roblox gaming hit

BBC News

8 hours ago



If people discover they love virtual gardening, might they be encouraged to take up the real thing? Andrew K. Przybylski, a professor of human behaviour and technology at the University of Oxford, said it was possible the game could "plant a seed" that could lead to a passion for plants. But, overall, he's sceptical. "It is unlikely that a game like this will encourage real-world gardening any more than Super Mario Wonder encourages plumbing," he told the BBC.

Prof Sarah Mills of Loughborough University has carried out research into young people's experiences of gaming. She highlights that a key appeal of Grow a Garden is that it is free to play, but notes that the in-game currency is important. "This wider landscape of paid reward systems in digital games can impact children and young people's experiences of gaming and financial literacy," she said. "It can also cause challenges for many families to navigate, changing the nature of pocket money."

Gardening podcaster and BBC presenter Thordis Fridriksson, meanwhile, is hopeful that any interest in gardening is a good thing. "Obviously the whole process is pretty different to real life, but it taps into the same thing which makes gardening so addictive, and that's planting seeds and watching your garden grow. Fingers crossed some of the people who love the game will try growing something at home."

Outside the living room in Edinburgh where they play the game is Owen and Eric's actual garden, which both boys help to tend. "I like gardening – and gardening in Grow a Garden," says Owen. But asked which one he prefers, he's emphatic: "Grow a Garden!"

Touchscreens have taken over our lives

Telegraph

06-03-2025



Imagine the following routine. You jump into your car and drive to the high street, where, after paying for your parking, you do a top-up supermarket shop and buy a cup of coffee before an appointment with your GP, after which you return home and put your shopping in the fridge. What binds every stage of this everyday scenario, beyond wondering whether World War Three has broken out yet? Touchscreens. The above hypothetical situation would involve six interactions with swipeable screens, more if you check your phone again once you've used it to shell out for parking.

Touchscreens have colonised every aspect of our lives. Our cars, supermarkets, coffee shops, GP surgeries, banks, airports and kitchen appliances now rely on them, not to mention the apps we need for everyday tasks, from listening to a song or seeing who's at the door to checking the cricket score. This supposed rush for convenience and efficiency – the reasons corporations and councils cite for touchscreen tyranny – has brought with it the depersonalisation of society.

'We prefer people to impersonal systems. We evolved to look people in the eye and to talk and interact with them,' says Andrew Przybylski, professor of human behaviour and technology at the University of Oxford. 'There are many positive, amazing things about technology that we get through a screen: we connect with communities; we can learn; we sometimes can save time. But there's a lot of dissatisfaction with what happens when you can't talk to a bank teller and you have to use an app on your phone… It is not a good thing to have AI and screens shoved into every corner of your life when you just want some milk.'

Mark Carrigan, senior lecturer in education at the University of Manchester, says that efficiency gains from better digital technology always carry a risk for interaction between actual humans. The key question, says Carrigan, is 'whether those gains can be used to free up people to interact in richer and more engaging ways, or whether they're used to process more people at a lower cost. Unfortunately it's almost always the latter.' In other words, do supermarkets use self-service tills so bosses can free up staff to help us in the aisles? Or are they used to save that supermarket time and money? Swipe left for the answer. As Carrigan points out, none of this is the fault of touchscreens per se – it's all about how organisations decide to use the technology.

Cars

Physical buttons in cars are being replaced by touchscreens at an alarming rate, and it's a huge bugbear, according to Telegraph readers. A whopping 97 per cent of new cars released after 2023 contain at least one touchscreen, a survey by S&P Global Mobility found. Many drivers find them fiddly, distracting, not user-friendly and potentially dangerous. A recent poll of Telegraph readers found that 91 per cent prefer physical buttons over screens.

Reader Oliver Burns drives a Kia EV6. He loves the car but he doesn't like its touchscreen, particularly when he has to lean over to switch between climate control and other functions. 'You have to lean right over to the left-hand side to change that function. So you are inevitably taking your eye off the road. I really think there are [safety concerns],' he says. Another reader, Peter Green, says that touchscreens' position to the left of the driver in a right-hand-drive car means that right-handed people – 90ish per cent of the population – 'struggle even more to prod them with our useless left hands'. Meanwhile Colin Reed is happy with his Ford Mondeo – apart from the touchscreen. 'I spend my life working on a computer. The last thing I need is a computer experience in the car I'm driving,' he says. Car industry insiders say touchscreens allow manufacturers to save money on the tooling of switches and knobs, one of the reasons they're so common.

Supermarkets and coffee shops

Is there a more irritating phrase than 'unexpected item in the bagging area'? I think not. There are an estimated 80,000 self-service tills in UK supermarkets, up from 53,000 in 2019, according to data from RBR Data Services. Customers aren't impressed. A 2022 petition calling for Tesco to 'stop the replacement of people by machines' was signed by 245,000 people. Some retailers report that self-scanning has increased shoplifting, with frustrated customers feeling that the effort involved means they're owed the product.

But there are signs that the tide is turning. In 2023 Preston-based supermarket Booths, dubbed the 'Waitrose of the north', axed nearly all the self-service tills in its 26 shops after customers labelled them 'impersonal' and 'unreliable'. At the time a Booths spokesperson said: 'We believe colleagues serving customers delivers a better customer experience.' Booths retained self-service checkouts in just two of its Lake District stores, for customers during busy periods. Other retailers are following suit. Last year the boss of Morrisons, Rami Baitiéh, said the chain had gone 'a bit too far' with self-checkouts, while Asda said last year it would put more staff on checkouts.

A Tesco spokesman says the chain offers shoppers 'both types' of checkout: 'We are proud to offer customers choice when it comes to checking out and customers can always ask a colleague for a manned till to be opened.' A common complaint is that people with poor eyesight struggle to use the machines. Tesco has a 'zoomed-in' feature on its self-service tills to enlarge the typeface, helping partially sighted people pay for their groceries, the spokesman says.

Marks & Spencer, meanwhile, has launched a specific till in its shops where people can chat to a cashier without being rushed. The 'it may take a little longer' initiative is designed to give the elderly, the lonely or those who may have no family or friends the opportunity to interact with a friendly human in their own time. Some people might argue that this used to be called 'going to the shops'. But in the rushed 2020s, it's something to be cheered.

But how about this for dystopian? I recently went to a coffee shop called Black Sheep Coffee on the Strand in London. Even though baristas were standing behind the counter (and not busy at the time), I still had to order my coffee via a touchscreen between us. I contacted Black Sheep Coffee online for this article to ask about the screens in its shops. I received an automated email: 'Your request 90608 has been received and will be reviewed by our team.' Sign of the times. Nice flat white, though.

GPs and hospitals

One area where touchscreens have arguably increased efficiency for the better is in the NHS. GP surgeries have touchscreens in their waiting rooms to allow people with appointments to check in, thereby freeing up receptionists to answer phones or triage walk-in patients. These can cut the time a patient spends in the waiting room by half. The NHS has also introduced 'urgent care self-service' kiosks in some A&E and UTC (Urgent Treatment Centre) departments of its hospitals. On arrival, patients answer questions about their symptoms by scrolling through a screen. The idea is that they can be prioritised or redirected elsewhere if needed, according to the NHS England website. However, one newspaper reported last year that patients had to answer 14 pages of multiple-choice questions before being asked whether they were losing lots of blood, which is either admirably thorough or somewhat risky, depending on your outlook and stage of exsanguination. And, as with all touchscreens, elderly people may find this tricky.

In a related issue, Age UK highlighted last year how smartphone apps and online booking systems are increasingly required to book GP and hospital appointments. 'Many [older people] are struggling with the rapid shift to online communication in the NHS,' the charity said, pointing out that one in three over-65s lack the basic skills to use the internet successfully, while one in six don't use the internet at all. So-called digital exclusion is clearly a problem in the touchscreen era, and Age UK said it can contribute to 'severe anxiety' amongst the elderly. On the flipside, however, touchscreens in the home have been found to help people with dementia by keeping them engaged and relieving stress. The first reported use of touchscreens for people with the disease was as far back as 1986.

In the home

Touchscreens are starting to appear on everyday household appliances such as ovens, washing machines, dishwashers, coffee machines and lawn-mowing robots. But at the forefront of this technology seem to be 'smart fridges': fridges that are connected to the internet, with a high-definition swipeable LED screen on the front. Many such fridges have cameras inside, so they can see when you're low on milk and send you a text telling you so. The camera also allows users who are out and about to check what they've 'got in' so they don't buy items they already have.

There's more. Many of the fridges can read barcodes, meaning they can compile recipe ideas based on what you do have, or shopping lists based on what you don't. Some also track expiration dates and send you reminders to eat the food before it goes off, thereby reducing food waste. A smart fridge can even talk to a smart oven, so the oven preheats to the correct temperature depending on what recipe you've chosen. Around a quarter of UK households currently have 'smart appliances', according to recent research from Statista. This is expected to rise to more than half of all households by 2029.

But the big question here is whether such technology actually makes life simpler or is just whizzy for the sake of it. Clever marketing is at play here, argues Przybylski. He says we use all this technology 'because we're sold the promise of time savings. And that is not the same as realising those time savings.' Take a banking app, for example. Paying for something with your phone is undeniably quicker than writing a cheque. But what happens when you have to 'do ten authentications' with one-off text message codes in order to use that app? 'The question on balance that you have to ask yourself is: "Are these things really making it easier for me to live my life the way I want to live my life?"' says Przybylski.

It would be depressing enough if, thanks to the all-pervasive touchscreen tyranny, we'd sacrificed community for convenience. But the truth is far worse. In many cases, we've sacrificed community for inconvenience.


All in the mind? The surprising truth about brain rot

The Guardian

29-01-2025



Andrew Przybylski, a professor of human behaviour and technology at Oxford University, is a busy man. It's only midday and already he has attended meetings on 'Skype, Teams, in person and now FaceTime audio'. He appears to be switching seamlessly between these platforms, showing no signs of mental impairment. 'The erosion of my brain is a function of time and small children,' he says. 'I do not believe there's a force in technology that is more deleterious than the beauty of life.'

Przybylski should know: he studies technology's effects on cognition and wellbeing. And yet a steady stream of books, podcasts, articles and studies would have you think that digital life is lobotomising us all, to the extent that, in December, Oxford University Press announced that its word of the year was 'brain rot' (technically two words, but we won't quibble) – a metaphor for trivial or unchallenging online material and the effect of scrolling through it. All this has sown widespread fears that the online world that we – and our children – have little choice but to inhabit is altering the structures of our brains, sapping our ability to focus or remember things, and lowering our IQs. Which is a disaster, because another thing that can significantly impair cognitive function is worry.

It may come as some relief to hear, then, that for every alarmist headline there are plenty of neuroscientists, psychologists and philosophers who believe this moral panic is unfounded. 'Since 2017, there has been a constant drumbeat of: "Screens and tech and social media are a different universe that is bad for you and bad for your kid,"' says Przybylski. 'And two things happen. The first is low-quality research that confirms our biases about technology. It gets immediate press because it's consistent with our existing biases. It's really easy to publish low-quality research that kind of shows a correlation, and then exaggerate it, because it'll get attention and it'll get funding.'

No one is denying that dangers lurk online, but that doesn't mean you're guaranteed to come to harm. 'Living is risky, leaving the house is risky, crossing the street is risky,' says Przybylski. 'These are all things that we have to help young people learn to do – to size up risks and act anyway. The internet is risky.'

There has also been, he says, 'a real push in opinion pieces and popular-press books that are sloppy scientifically but stated so confidently. The ideas in these books are not peer-reviewed.' The published studies they cite tend to have small samples and no control groups, and to be based on associations rather than proving cause. 'People will say: "The iPhone was invented in 2007 and Instagram became popular in 2012 and, oh my God, look, tech use has gone up at the same time mental health has gone down!" It seems like common sense – that's why you have this kind of consensus. But it just isn't scientific.'

In 2023, Przybylski and his colleagues looked at data from almost 12,000 children in the US aged between nine and 12 and found no impact from screen time on functional connectivity ('how different parts of the brain kind of talk to each other', he explains), as measured with fMRI scans while the children completed tasks. They also found no negative impact on the children's self-reported wellbeing. 'If you publish a study like we do, where we cross our Ts, we dot our Is, we state our hypotheses before we see the data, we share the data and the code, those types of studies don't show the negative effects that we expect to see.'

And of course no one talks about the positive effects of tech, such as finding connection and community. 'If we zoom out, we find that if young people have access to phones that can connect with the internet, if they have high-speed internet at home, their wellbeing is higher. They say they're happier across a wide range of metrics of wellbeing. When the Lancet commission on self-harm does an evidence review, when the National Academy of Sciences in the US does an evidence review, when academic researchers do their meta-scientific research, these things don't come out in line with this tech panic,' he says. 'That's because this tech panic is not based on evidence. It's based on vibes.'

The 'study' that spearheaded this cascade of concern in 2005, and is still quoted in the press today, claimed that using email lowered IQ more than cannabis. But Shane O'Mara, a professor of experimental brain research at Trinity College Dublin, smelled a rat when he couldn't find the original paper. It turns out there never was one – it was just a press release. That finding was the result of one day's consultancy that a psychologist did for Hewlett Packard; he would later state that the exaggerated presentation of this work became the bane of his life. Alongside a survey on email usage, the psychologist conducted a one-day lab experiment in which eight subjects were shown to have reduced problem-solving abilities when email alerts appeared on their screens and their phones were ringing. He later wrote: 'This is a temporary distraction effect – not a permanent loss of IQ. The equivalences with smoking pot and losing sleep were made by others, against my counsel.'

The studies finding changes to brain structure sound particularly alarming, even if they are looking specifically at people with 'problematic internet use', as opposed to the general population. The trouble with these studies, says O'Mara, 'is that they can't determine cause and effect. It may be that you go on the internet [excessively] because you've got this thing there already. We simply don't know, because nobody has done the kind of cause-and-effect studies that you need, because they're too big and too difficult.' Besides, brain structures change throughout life. Grey matter has been observed to decrease during pregnancy, for instance, and start regrowing after, along with other brain changes. 'The brain is remarkably plastic,' agrees O'Mara.

He also thinks we're being deeply ahistorical when we berate ourselves for scrolling cute animal reels, celebrity regimes or cup-winning goals on social media. 'Humans have always been distractible. We've always sought solace in the evanescent. If you look at the history of media in the UK, just as a simple example, back in the 1940s, 1950s, 1960s, how many millions of tabloids were sold every day? Staggering numbers, because people indulged in that stuff. This is something people have always done, and we're being a bit moralistic about it.' Has the internet age led to greater numbers of plane crashes or patients dying on operating tables? 'The answer is no: we're much better at all of those things.'

We've always had to watch out for our 'attentional bottleneck', he says. 'For as long as I've been reading and researching in psychology, we've always taught our students: "Don't do two things at once. You can't."' Multitasking and its associated dilution of efficacy were not invented by the internet. As Przybylski alluded to, having children is a classic route to task-juggling, with constant interruptions leading to intelligent adults not being able to string a sentence together. Similarly, if you use your smartphone while driving, of course you'll increase the likelihood that you'll crash.

What about the terrifying proclamations that tech is on the rise while IQ is in decline? I call Franck Ramus, the head of the cognitive development and pathology team at the Ecole Normale Supérieure in Paris. Mercifully, he says it's not yet clear if IQ is truly going down. Scores rose globally during the 20th century, but growth started slowing towards the turn of the millennium. This plateau effect had long been expected, as we neared the limits of the human brain. 'Height has been increasing over decades, but we're never going to reach three metres, are we? So there are limits to human physiology, including brain size.' Any small IQ decreases that do seem to have been detected, Ramus says, aren't considered conclusive at this point – the studies would need further replication. 'There's a meta-analysis of all the data until 2013, and the score seems to be progressing at least until 2010 or so. At the same time, it is true that some studies have documented a slight decrease in some countries. For example, there is a widely discussed Norwegian study that found a slight decrease in the last two decades. But there are also a greater number of studies that continue to observe increases.'

As for screen exposure, he says, what do we even mean by that? 'It could be anything. The screen is just a medium, but what matters is content. So when you talk about screens, you might as well talk about paper. Paper is another medium, and anything can be written on paper.'

This brings us neatly to Plato, who wrote about brain rot in relation to the invention of writing, says Tony Chemero, a professor of philosophy and psychology at the University of Cincinnati, whose 2021 paper in Nature Human Behaviour asserted: 'Technology may change cognition without necessarily harming it.' 'This worry that people are having, Plato had as well, 2,500 years ago or so, writing about how the written word will make people stupid because their memories will be worse and they'll be worse at telling stories.'

Chemero does not love smartphones or AI – and laments the hassle the latter has created for professors like him, who must find new ways to check their students aren't handing in ChatGPT-generated work. 'But the one thing that they don't do is make us stupid,' he says. 'Over the history of hominids, many of our biggest challenges have involved adapting to new kinds of environments – and that's being smart. This is just a new environment we're in.' So while he can still remember the phone numbers of high-school classmates, younger people's brains are simply freed up for other activities. 'What we really want from technology is to do the things that are difficult and boring, such as lots of complex calculation, rote memorisation: humans just aren't very good at that without technology.' The relevant question, he says, is what memory even is in this situation, when we're outsourcing some of it to tech. 'Is it something that your brain does or is it an ability that you have? If [technology helps you] remember more things while your brain does something different, I don't think that your memory is worse. It's just different. What really matters is what we are able to do.'

After all, the secret to human success has always hinged on our use of tools. 'Being smart is being able to do lots of stuff. And I don't think our phones are making us less able to do many things.'

Gary Small, the chair of psychiatry at Hackensack University Medical Center in New Jersey, has studied potential harms and benefits of digital technology use. He too steers clear of studies based on mere associations. 'To my knowledge,' he says, 'there's no compelling evidence that using digital technology or using devices is going to cause permanent brain damage.' In terms of the negatives, he believes that certain platforms and content can be addictive. 'It could be porn, shopping, gambling. This technology heightens human behaviour, puts it on steroids, accelerates all these issues.' He mentions a study two years ago for which he and colleagues sent a group of 13-year-olds off to nature camp, and assessed their emotional intelligence (reading emotions on faces) and social intelligence (describing a social interaction) before and after. 'And we found that five days away from screen time led to significant improvements in both, and we had a control group on that.' This showed, he says, that the negative effects of phone use are temporary, and go away when we put our phones away.

And there are positives. 'In our work, in our social lives, screens keep us connected. We can be much more efficient. We can get information much more rapidly. I can collaborate with people across the globe.' While the fatigue you can get from not taking breaks is real – 'you can get physical symptoms, headache, neck pain, shoulder pain, mental fatigue, no question about it' – using the internet can be stimulating brain exercise in itself. 'The study we did, for me at least, was cause for some optimism.' His team taught older people to search the internet while they were in an fMRI scanner and found that their neural activity increased. The study also states: 'Certain computer programs and video games may improve memory, multitasking skills, fluid intelligence and other cognitive abilities. Some apps and digital tools offer mental health interventions providing self-management, monitoring, skills training and other interventions that may improve mood and behaviour.'

So rather than hampering your cognitive (and possibly even parenting) abilities by fretting about brain rot, he says, 'be smart about how you use your devices. Manage the devices – don't let them manage you. I try to practise what I preach' – namely, by taking regular breaks and choosing appropriate modes of communication. 'So many times I see these long email threads trying to deal with complex, nuanced issues. Best response is: "Call me or let's meet."'

Przybylski, meanwhile, doesn't shield his young children from smartphones or games consoles. 'It's perfectly OK to have some time dedicated to leisure, and why not some screen activities,' he says. Content quality is a consideration, and, 'like every activity, it should be only a reasonable amount of time. I think many of the negative effects that are attributed to screen exposure are not intrinsic to screen exposure. They just reflect the fact that time can be lost for other activities that would have positive effects.'

Likewise, O'Mara isn't worried about spending most of his working day using a computer. 'We just need to develop new ways of thinking about how we interact with these media. Go out for a walk. Get up and get moving. That's very, very good for you, if you want to relieve a feeling of incipient anxiety caused by these things.' It's all about balance and avoiding the temptation to multitask too much. O'Mara suggests making time for reading a book uninterrupted, or leaving your phone in another room while you're watching TV. 'Be intentional about your media choices,' he says.
