
Latest news with #JohnBurn-Murdoch

Thinking is a luxury good

Observer

30-07-2025

  • General
  • Observer


When I was a kid in the 1980s, my parents sent me to a Waldorf school in England. At the time, the school discouraged parents from allowing their kids to watch too much TV, instead telling them to emphasise reading, hands-on learning and outdoor play. I chafed at the stricture then. But perhaps they were on to something: Today I don't watch much TV and I still read a lot. Since my school days, however, a far more insidious and enticing form of tech has taken hold: the internet, especially via smartphones. These days I know I have to put my phone in a drawer or in another room if I need to concentrate for more than a few minutes.

From the invention of so-called intelligence tests around a century ago until recently, international IQ scores climbed steadily in a phenomenon known as the Flynn effect. But there is evidence that our ability to apply that brain power is decreasing. According to a recent report, adult literacy scores levelled off and began to decline across a majority of OECD countries in the past decade, with some of the sharpest declines visible among the poorest. Kids also show declining literacy. Writing in the Financial Times, John Burn-Murdoch links this to the rise of a post-literate culture in which we consume most of our media through smartphones, eschewing dense text in favour of images and short-form video. Other research has associated smartphone use with ADHD symptoms in adolescents, and a quarter of surveyed American adults now suspect they may have the condition. School and college teachers assign fewer full books to their students, in part because students are unable to complete them. Nearly half of Americans read zero books in 2023.

The idea that technology is altering our capacity not just to concentrate but also to read and to reason is catching on. The conversation no one is ready for, though, is how this may be creating yet another form of inequality. Think of it by comparison with patterns of junk food consumption: As ultraprocessed snacks have grown more available and inventively addictive, developed societies have seen a gulf emerge between those with the social and economic resources to sustain a healthy lifestyle and those more vulnerable to the obesogenic food culture. This bifurcation is strongly class-inflected: Across the developed West, obesity has become strongly correlated with poverty. I fear the tide of post-literacy will divide us along the same lines.

Long-form literacy is not innate but learned, sometimes laboriously. As Maryanne Wolf, a literacy scholar, has illustrated, acquiring and perfecting a capacity for long-form, 'expert reading' is literally mind-altering. It rewires our brains, increasing vocabulary, shifting brain activity towards the analytic left hemisphere and honing our capacity for concentration, linear reasoning and deep thought. The presence of these traits at scale contributed to the emergence of free speech, modern science and liberal democracy, among other things.

The habits of thought formed by digital reading are very different. As Cal Newport, a productivity expert, shows in his 2016 book, 'Deep Work,' the digital environment is optimised for distraction, as various systems compete for our attention with notifications and other demands. Social media platforms are designed to be addictive, and the sheer volume of material incentivises intense cognitive 'bites' of discourse calibrated for maximum compulsiveness over nuance or thoughtful reasoning.
The resulting patterns of content consumption form us neurologically for skimming, pattern recognition and distracted hopping from text to text — if we use our phones to read at all.

Tech notables such as Bill Gates and Evan Spiegel have spoken publicly about curbing their kids' use of screens. Others hire nannies who are required to sign 'no phone' contracts, or send their kids to Waldorf schools, where such devices are banned or heavily restricted. The class scissor here is razor-sharp: A majority of classical schools are fee-paying institutions. Shielding your kids from device overuse at the Waldorf School of the Peninsula will set you back $34,000 a year in the elementary grades. Even beyond Silicon Valley, some people are limiting digital stimulation (like social media or video games) for set periods of time as part of the self-improvement practice of dopamine fasting.

The ascetic approach to cognitive fitness is still niche and concentrated among the wealthy. But as new generations reach adulthood having never lived in a world without smartphones, we can expect the culture to stratify ever more starkly. On the one hand, a relatively small group of people will retain, and intentionally develop, the capacity for concentration and long-form reasoning. On the other, a larger general population will be effectively post-literate — with all the consequences this implies for cognitive clarity.

Lest you mistake me, there is no reason the opportunity to sideline the electorate or to arbitrage the gap between vibes and policy should especially favour either the red team or the blue team. This post-literate world favours demagogues skilled at code-switching between the elite language of policy and the populist one of meme-slop. It favours oligarchs with good social media game and those with more self-assurance than integrity. It does not favour those with little money, little political power and no one to speak up for them.

Brits Keep a Sweaty Upper Lip on Air Conditioning

Mint

18-07-2025

  • Climate
  • Mint


(Bloomberg Opinion) -- There's a somewhat gratifying TikTok trend at the moment where Americans visiting London in a heatwave realize that, yes, British heat does 'hit different.' One tourist says, 'it feels as if I'm in a sauna.' Another admitted that he always thought British people were lying, but 'for some reason it just feels like you are melting.' Inevitably, the talk turns to air conditioning. After all, parts of the US definitely get hotter than the UK and just as humid, but there's usually refuge to be taken in mechanically cooled homes.

In the UK, AC is rare — except in supermarkets and office buildings — and our housing stock, mostly built before climate change was a real and present threat, is designed to absorb and retain heat rather than keep it out. That's in part because many buildings here and across Europe were built before AC was available, and historically the main concern was keeping warm in frigid weather. As a result, we have a real problem with overheating that's only going to get worse as the climate crisis intensifies and elongates heat waves. More than half of homes in the UK currently suffer from overheating — meaning that the internal temperature exceeds a comfortable level for a certain amount of time, with the threshold depending on whether the room is a bedroom or not. Under a 2C (3.6F) warming scenario, which we could reach as early as 2045, that could rise to 90% of homes.

I used to live in a top-floor Victorian flat. There were multiple occasions when I'd watch, sweaty and distressed, as the mercury rose above 30C in the living room and bedroom for most of the day and evening, despite my attempts to follow good heatwave protocol: shut the windows and curtains during the day, open everything up when the sun goes down. Working from home was a struggle; getting a good night's sleep was impossible.

My colleagues and friends from hotter climes will likely scoff at the discomfort of Brits. Though heatwaves here are getting longer and hotter, they don't compare to the sweltering temperatures of other countries. Recent heatwaves saw temperatures exceed 45C in parts of Spain and Portugal, for example, a high mercifully not yet seen in the UK. Still, it's important to be attuned to the negative consequences of being poorly adapted to high temperatures — and the UK stands out on this front. For example, as John Burn-Murdoch recently highlighted in the Financial Times, sleep duration, work productivity and cognitive performance drop rapidly when indoor temperatures rise above the low 20s Celsius. Aggression and violence go up. Mental health suffers. People die. A recent study from Imperial College London and the London School of Hygiene & Tropical Medicine (LSHTM) found that an extra 263 Londoners likely died in the recent heatwave between June 23 and July 2, and that two-thirds of those deaths could be attributed to the climate crisis. Another study, from University College London and LSHTM, found that by the 2070s annual excess heat deaths in England and Wales could exceed 34,000 in the worst-case scenario of 4.3C of warming with minimal adaptation.

So, other than working hard to reduce emissions, what should we do? Adding green spaces and tree cover to cities makes a huge difference. Cities, with their sheer mass of asphalt, concrete and glass, are particularly vulnerable to overheating thanks to the 'urban heat island' effect. Studies have shown that adding tree canopy cover can reduce temperatures and heat-related mortality.
Meanwhile, good public information, such as heat health alerts, clear instructions on how to stay cool and information about local public spaces with air conditioning, creates resilience within communities.

But there's a key element missing. Government policy in Britain has focused almost exclusively on making homes warmer. You can currently get help with the cost of switching to a low-carbon heating system or get free or cheaper insulation. There's good reason for this. The cold has traditionally been of greater concern, and with home heating accounting for 18% of the UK's greenhouse gas emissions in 2021, these efforts are essential for meeting climate targets and energy security, as well as helping households cut expensive heating bills.

A couple of simple tweaks could help improve homes — and therefore the wellbeing of residents — in the summer months too. For example, a highly effective way of reducing the amount of heat that gets trapped in homes is to add shade via external shutters or awnings. Help ought to be available for installing these, particularly for low-income households, who tend to be more at risk of overheating. It'd certainly improve upon the advice to cover windows with yogurt, which may be surprisingly effective for the odd scorching day but turns into a drain of time and dairy in the long term.

Meanwhile, landlords aren't incentivized to improve their homes. New rules will mean that private landlords must meet a certain threshold for energy efficiency by 2030. While this is a very positive and necessary move, there could be scope to expand it to overheating too. The development of a new metric for overheating risk, similar to an energy performance certificate, could help renters and buyers alike better understand what they're getting into, while opening the door for policy to encourage landlords to add shading and cooling measures.

Though passive means of cooling should be prioritized, it's probably time to start embracing air conditioning too — something that is being actively discouraged by government policy. Air-to-air heat pumps are able to both heat and cool rooms efficiently, but they are excluded from the government's boiler upgrade scheme (possibly because they don't provide hot water like a conventional boiler). The UK should rethink that.

There is justified resistance to a wider adoption of air conditioning in the UK, from concerns about how the grid will handle extra energy demand in the summer to the idea that us Brits simply don't do air conditioning. A stiff but sweaty upper lip, you might say. But as the grid is upgraded and cleaned up, energy concerns become far less important — particularly if we're able to expand solar power, which handily generates electricity when the sun is shining and AC demand is high. Cultural beliefs may be harder to sway, but in the face of weeks of restless sleep and sticky skin, we might eventually come round to it.

This column reflects the personal views of the author and does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners. Lara Williams is a Bloomberg Opinion columnist covering climate change.

It is time for India to ban smartphones in schools and colleges like Finland has

India Today

05-05-2025

  • Science
  • India Today


First, it was all anecdotal. Over the years, as I have interacted with more and more people, I have come to believe — and I am using the word believe because this can also be construed as old-man-yells-at-moon — that young people seem to lack certain skills. And when I say skills, I am talking about work-related skills. I have noticed that, generally, young people are poor readers, not just of books but of anything that has printed text. They have difficulty with abstract ideas. They rarely notice subtext. They are mostly low on irony and sarcasm — likely because of their issues with subtext — and they struggle with problem-solving.

Now, I am not throwing shade on Gen-Z and Millennials. They also have positive qualities, the kind that in certain cases cover them in glory. But this piece is not about what Gen-Z and Millennials lack or what they have. Instead, it is about how what was anecdotal a few years ago is increasingly turning empirical in 2025. Study after study reveals that people in the world, young and old alike, but the young in particular, are no longer that good with text, numbers, ideas and attention span.

Some of these findings were recently covered by The Financial Times in an article titled 'Have humans passed peak brain power?' The article brings facts and figures to a feeling and notes that, since 2010, the cognitive decline among humans is real and not imagined. In general, we can no longer process numbers or read and comprehend in the same way as we used to do 30 to 40 years ago. Oddly, and encouragingly, The Financial Times article did not reach the conclusion that humans have biologically lost their mental faculties. Instead, it blamed the decline on the world we live in. 'The good news is that underlying human intellectual capacity is surely undimmed,' John Burn-Murdoch wrote in his piece. 'But outcomes are a function of both potential and execution. For too many of us, the digital environment is hampering the latter.'

In other words, our cognitive decline is the result of screen-time. Since 2010 — that is when smartphones arrived on the scene — screen-time has gone up and our ability to read text and numbers has come down. This is probably the reason why countries have started taking note of smartphones and the havoc they seem to have been wreaking on people's minds. Given their ubiquity — and, I must add, a degree of usefulness — it is not possible for them to disappear from our lives. But countries have started limiting their usage wherever they can. Finland is the latest to do so. Its parliament passed a new law a few days ago, banning smartphone use in classrooms. The Netherlands is another country where similar rules exist. Italy, too, has done so, and ditto for Brazil. At the same time, a number of countries have partial or conditional restrictions on smartphones in schools.

It is high time India too comes out with a proper regulatory mechanism and guidelines to ban smartphones in schools. Ban — that word again, which I dread. This necessitates that I explain myself a bit. I do not want a ban on smartphones in schools and colleges because youngsters might use them to scroll through reels, or watch pornography, or share lascivious clips, or might use them to cheat in exams, or may end up playing violent video games, or might end up compromising their privacy because of the cameras and microphones in their phones. My reasons are not the reasons that one calls puritanical.
I do not care about the prudishness or culture or 'corruption' of young minds. A ban is not even needed to keep children 'safe'. There are other ways to keep them safe. These are bad reasons to ban smartphones. My reason for suggesting a ban on smartphones is purely and plainly about what these devices are doing to our mental faculties. The world created by smartphones is a terrifying place for our brains. There is an information overload for which evolution hasn't wired us. It is this information — mostly junk and low-quality information — that led American writer John Naisbitt to quip decades ago, 'We are drowning in information, but are starved of knowledge.'

Smartphones are increasingly leading to a world that Thomas Bernhard — the great hater that he was — presciently called stultified at a time when smartphones did not even exist. Writing decades ago, Bernhard issued a warning against the primacy of images over text. 'The worldwide stultification was set in motion by photographic images and attained its present deadly momentum when the images began to move,' Bernhard wrote in Extinction. 'Humanity has for decades been staring brainlessly at these deadly photographic images and become more or less paralysed. Come the millennium, human beings will no longer be capable of thinking.'

The millennium has come, and while Bernhard is no longer alive to see the world he predicted, we increasingly get a sense that we are no longer thinking that sharply. A reversal is needed. Brought up feeding and feasting their eyes on screens, children and teens nowadays seem to be losing their sense of the world. And of the word. Again, I am reminded of a few lines from a writer, although he wrote them in a different context. In Red Birds, Mohammed Hanif writes: 'Without their mobile phones and access to the internet, it was as if they were bats that had lost the use of their ears, and hence their ability to find things as they flew in the dark.'

This, I believe, is what has happened to all of us — and even more so to the generations that have grown up with small screens always attached to their hands. They are like bats that have lost the use of their ears and radar; now, each time they go out into the world, the noise, the light and the chaos of it all render them immobile and dazed. They seem lost. We owe it to future generations to give them the same skills that were given to us decades ago. Or else we will be losing a little bit of that which made us dream big, create wonders and reshape our destinies.

(Javed Anwer is Technology Editor, India Today Group Digital. Latent Space is a weekly column on tech, world, and everything in between. The name comes from the science of AI and, to reflect it, Latent Space functions in the same way: by simplifying the world of tech and giving it a context.)

(Views expressed in this opinion piece are those of the author)

Is mine the first generation thicker than our parents?

Irish Times

27-04-2025

  • Politics
  • Irish Times


It may seem like a truism to suggest humanity is getting stupider. We achieved the incredible goal of making smoking almost extinct in the developed world, only to replace it with vaping. We pay €400 for tickets to a gig that were originally advertised at €86.50 and justify it to ourselves on the basis that the hours spent in the ticket seller's 'waiting room' would otherwise be wasted. Americans – or the 77,302,580 who voted for Donald Trump – elected a man who promised to bring down prices by imposing tariffs, which every sensible economist said would do the opposite. Now that he has actually gone ahead and done what he promised, with entirely predictable results, six in 10 are unhappy with both him and his tariffs. In Rome, pilgrims queue patiently for a fleeting glimpse of the body of the dead pope – and when the moment arrives, they lift their phones over their heads and crane their necks to experience it through a screen, which could have been achieved more easily by staying at home.

But while this kind of self-defeating irrationality is hard-wired into humanity, the kind of stupidity I'm referring to is more intrinsic. Teachers regularly sound warning bells about grade inflation and dumbing-down in education standards, but recent evidence suggests we've got much bigger problems than that. As John Burn-Murdoch, the Financial Times' chief data cruncher, puts it: 'Across a range of tests, the average person's ability to reason and solve novel problems appears to have peaked in the early 2010s and has been declining ever since.' Since it's not possible for our brains to have degraded in such a short space of time, attention has focused on Covid and school closures as the reason. But the trend is not just affecting schoolchildren.

To be clear, this is precisely the opposite of what is supposed to happen. Just as every generation is slightly taller than the last, each is mentally sharper than their predecessors. James Flynn, the late American-born, New Zealand-dwelling political scientist, found that from 1932 to 1978, average IQs rose by three points per decade. The 'Flynn effect' was never satisfactorily explained – it might have been better nutrition, access to education or the decline in childhood disease. But it's irrelevant now: the golden age of ever-increasing mental acuity seems to be over.

Research from around the world – including from Norway, based on the results of IQ tests given to military conscripts; and from US studies of 'composite ability scores' in adults – indicates a 'reverse Flynn effect'. My generation may be the first to be more stupid than my parents – and my children's generation may be thicker than mine. The Norwegian research found that the IQ of male conscripts rose six points between 1959 and 1979; two points the following decade; and 1.3 points the one after, before declining by 1.3 points. In 2009, Flynn discovered that tests carried out in 1980 and again in 2008 showed the IQ score of an average 14-year-old had dropped by more than two points. And Burn-Murdoch drew his conclusions partly from the latest round of analysis from PISA, the OECD's international benchmarking test for academic performance by 15-year-olds.

It's tempting to immediately blame the smartphone – I'm loath to rule out blaming it for pretty much anything – but the great mental slowdown was under way even before its arrival.
One theory suggests that because we are living longer and working memory declines with age, it might simply be a side effect of an ageing society. A more plausible explanation focuses on the way we process information generally. In the early years of the internet, we spent less time online overall, and more of that time engrossed in single topics or communicating directly with people we know. These days, we have morphed into dumb, passive consumers of 'the feed', an endless scroll of snatches of information and images unrelated to each other: someone's Penneys haul; puppies in need of rescuing; a haunted-looking child carrying the body of another child tightly wrapped in a white sheet in Gaza; Holly Willoughby flogging hair dye. Our brains are not designed to cope with information fired randomly in our direction like peas from a toddler's high chair.

This is where the research – much of it collated in Johann Hari's book Stolen Focus (which was both feted and criticised when it was published, but offers interesting food for thought) – gets pretty convincing, at least to anyone who has spent time in an office. Consider the following: if you're interrupted in the middle of a complex task, it takes 23 minutes to get back to it. We get less than one hour a day of uninterrupted time at work and can expect to be distracted on average every three to 11 minutes. The phrase 'multitasking' is meaningless for humans, but we convince ourselves we're doing it.

The answer seems to be that we either accept that humanity has reached the point of obsolescence and hand over to AI – a suggestion some in Silicon Valley take quite literally, as Mark O'Connell writes this weekend – or we finally take seriously all the research on the negative impacts of technology and resolve to meaningfully regulate big tech's influence over our lives. We need systemic solutions, but we can't wait for them either – the answer is in your own hands (for an average of 4.5 hours a day anyway, according to ComReg's 2022 data on smartphone usage).

Technology's relentless assault on our ability to focus is just one of eight causes Hari highlights – the others include stress, exhaustion and the collapse in sustained reading. But it is hard to argue with the impact of technology on our ability to think, because we have all experienced it. If you're reading this on a phone, it's a minor miracle you're still here.

AI, and the cost of ‘optimised' learning

Time of India

26-04-2025

  • Time of India


In a course I teach at a liberal arts university, I asked students to write a reflective essay about a personal cultural experience that changed them. What I got back was unsettling — far too many reflective pieces had the polished, impersonal sheen of AI. The sentences were smooth, the tone perfectly inoffensive, but missing the raw, uneven edges of real student writing. When I asked a few of them about their process, some admitted using AI tools like ChatGPT as a 'starting point,' others as an 'editor,' and a few simply shrugged, 'It gave me the answer.' The underlying sentiment was clear: why struggle when you can get it done by AI?

Not just in my classroom

Professors everywhere are facing a generation of students who carry instant 'answers' in their pockets, bypassing the struggle that deep thinking, reflection and real learning demand. AI isn't just helping with assignments anymore — it's writing discussion posts, solving problem sets, even drafting essays before class. What we're seeing is not just a technological shift — it's a cultural one. But what do we make of this shift — from thinking, to outsourcing thought?

Since 2012, standardised assessments across high-income countries have revealed a troubling phenomenon: a measurable decline in reasoning and problem-solving abilities among young adults. The data is stark: 25% of adults in developed economies, and a staggering 35% in the US, now struggle with basic mathematical reasoning. According to Financial Times journalist John Burn-Murdoch's piercing analysis, 'Have Humans Passed Peak Brain Power?', this decline is due neither to biology nor to environment. It's something more insidious: how technology is reshaping our cognitive capacities. Where once we immersed ourselves in deep reading and reflective analysis, we now live in the age of the scroll. Algorithmically curated feeds dictate our attention, fragmenting our thoughts into 280-character conclusions and ten-second clips. Fewer than half of Americans read a single book in 2022. This isn't just a change in habit; it's a shift in the architecture of our cognition. We are witnessing a silent, collective decline of attention span, memory, and conceptual depth. And this crisis is now bleeding into education.

Gyankunj Case

These concerns are not limited to elite university campuses. In a study that I conducted with my professor and a colleague in Gujarat, evaluating the Gyankunj program — a flagship initiative to integrate technology into govt school classrooms — we found that students exposed to smartboards and digital content actually fared worse in mathematics and writing compared to their peers in classrooms without digital tools. The reasons were sobering. Teachers had not been adequately trained in using these technologies. Mathematics, which requires cognitive scaffolding and immediate feedback, suffered because the teacher was reduced to a passive facilitator of pre-designed content. Writing, an intensely human process involving revisions, suggestions, and encouragement, became mechanical. What we observed was not enhanced learning, but the opposite — a disconnect between medium and method.

This points to a deeper malaise: techno-optimism. There's a growing belief, often fuelled by venture capital and consultancy jargon, that algorithms can fix education. That AI tutors, avatars, and dashboards can replace the 'inefficiencies' of human teaching. That every child's mind can be optimised, like a logistics chain.
Learning Is Human

But pedagogy is not content delivery. It is a relational, embodied, and context-rich process. It depends on trust, dialogue, spontaneity, eye contact, missteps and encouragement. No AI system, no matter how sophisticated, can replicate the chemistry of a teacher who senses a student's confusion and adapts — not by code, but by care.

AI is now entering primary education spaces as well. I have seen prototypes where storybooks are narrated by AI voices, children's drawings are corrected by algorithms and writing prompts are generated automatically. But what happens to play-based learning? To dirtying one's hands with clay, engaging with textures, shapes, and emotions? Indian educators like Gandhi, Tagore, and Gijubhai Badheka emphasised the necessity of experiential, tactile learning in the early years. Similarly, Sri Aurobindo emphasised that education must arise from the svabhava of the child, guided by the inner being — not imposed templates. Can an algorithm, however sophisticated, grasp this uniqueness?

J Krishnamurti, in his talks on education, famously questioned whether any system, however well-designed, could ever nurture freedom. For him, true learning happened in freedom from fear, not in efficient content delivery. If AI's omnipresence in classrooms creates an atmosphere where mistakes are quickly corrected, paths auto-completed, and creativity constrained by what's already been done, are we not curbing the learner's inward growth? In reducing learning to clicks, nudges, and 'correct answers', are we not slowly extinguishing the inner flame?

Walking the Tightrope

And yet — let me be clear — I am neither a techno-skeptic nor a techno-romantic. The use of AI in education, when done thoughtfully, has made certain forms of learning more accessible and visual. Diagrams, simulations, and language-support systems have helped many students grasp complex ideas. AI can assist teachers in planning. It can support students with special needs. But it should remain a tool, never the foundation. A servant of learning, not its substitute.

When we raise children in screen-first environments, we risk creating what Jonathan Haidt (2024) identifies as an anxious generation: digitally fluent but emotionally fragmented, constantly grappling with overexposure to screens, metrics, and digital surveillance. So we have to ask: Are we preparing students not to be wiser, but simply more optimised? Not more reflective, but more 'prompt ready'? Not more social, but increasingly isolated behind screens and 'smart' interfaces?

The challenge ahead is not technological. It is existential. Will we nurture depth, or distraction? Freedom, or feedback loops? A sense of self, or a sense of being constantly scored?

Disclaimer: Views expressed above are the author's own.
