What is behind the surge of obesity worldwide, and how is Singapore fighting back?

CNA | 18-07-2025
SINGAPORE: Obesity is not just a global problem, but one that is hitting Singapore hard.
According to the latest Ministry of Health data from 2022, nearly 12 out of every 100 adults aged 18 to 74 in Singapore were obese - double the rate from 30 years ago.
However, obesity is not a flaw or a simple lifestyle choice, but a complex, chronic disease that is often misunderstood and unfairly judged, according to healthcare experts.
WHAT IS OBESITY?
Dr Tham Kwang Wei, president of the Singapore Association for the Study of Obesity, noted a gradual rise in obesity prevalence in the population.
'Between 2010 and 2022 … we've hovered around anywhere from 10 plus per cent to currently 11.6 per cent … but I think if the measures had not been put in, we could have seen a larger rise,' said the Woodlands Health senior consultant.
She noted that these public health measures implemented by the government included public infrastructure that encourages physical activity, as well as campaigns promoting a healthier lifestyle and earlier health screenings.
The World Health Organisation defines obesity as a Body Mass Index (BMI) of 30 or above.
However, for Singapore's population, Dr Tham said the health risks from obesity for Asians begin from a BMI of 27.5.
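For reference, BMI is body weight in kilograms divided by the square of height in metres. The short sketch below is purely illustrative (the function names and labels are ours, not from any health authority); it simply applies the two cut-offs cited here, the WHO definition of 30 and the 27.5 risk threshold for Asians:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight (kg) divided by height (m) squared."""
    return weight_kg / (height_m ** 2)

WHO_OBESITY_CUTOFF = 30.0  # WHO definition of obesity
ASIAN_RISK_CUTOFF = 27.5   # BMI from which health risks begin for Asians, per Dr Tham

def classify(weight_kg: float, height_m: float) -> str:
    value = bmi(weight_kg, height_m)
    if value >= WHO_OBESITY_CUTOFF:
        return f"BMI {value:.1f}: obese by the WHO definition"
    if value >= ASIAN_RISK_CUTOFF:
        return f"BMI {value:.1f}: in the elevated-risk range for Asian populations"
    return f"BMI {value:.1f}: below both cut-offs discussed here"

print(classify(85, 1.68))  # BMI 30.1: obese by the WHO definition
```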
Beyond looks or waistlines, doctors are concerned about how obesity leads to excess fat accumulating inside a person's body, causing health problems.
'When we have excess energy … that energy has to be stored somewhere … in the fat (cells). When the fat cells start to extend and are unable to tolerate more fat storage, then the fat needs to also flow somewhere else,' said Dr Tham.
The excess fat could end up around a person's liver, heart or even in the muscles, she added.
HEALTH RISKS FROM OBESITY
Dr Tham noted that the fat surrounding vital organs can cause inflammatory responses and ill health. This can lead to an increased risk of heart disease, stroke, Type 2 diabetes, or fatty liver disease.
Obesity is associated with more than 230 medical conditions, she added. Among patients with obesity seeking treatment at Woodlands Health, 77 per cent have at least three obesity-related complications (ORCs), while 52 per cent have at least four.
Dr Tham said the most common ORCs were musculoskeletal complaints, followed by obstructive sleep apnoea and metabolic dysfunction-associated steatotic liver disease, commonly known as fatty liver.
She noted that many people do not view obesity with the seriousness a chronic disease warrants.
'They know it's serious, but they often seek help on their own. I don't think everybody needs to see a doctor, but they need to (see that) obesity can lead to many serious, chronic diseases,' added Dr Tham.
TREATMENT FOR OBESITY
Lifestyle factors can lead to obesity, she said.
These include stress, lack of physical activity, and sleep issues, such as inadequate sleep or routines and environments that hinder proper restful sleep.
Childhood obesity is another risk factor for adult obesity, added Dr Tham.
She noted that children who enjoy processed, energy-dense food and snacks face an increased risk of weight gain when they grow up, as they tend to keep consuming such food as adults.
The effects of obesity are not just medical but also deeply personal, affecting patients' daily lives, she said.
'Up to 80 per cent (of our patients) … are impacted by their weight, whether it's at home doing housework … at work or in public places,' said Dr Tham.
While exercise and diet may help some in their weight loss journey, others may find their bodies working against them, she added.
'Energy regulation is disrupted by obesity. You may see that people say, 'I really don't eat much but I'm still putting on weight', and it's true in quite a number of people … (Their body's) metabolism has changed when they have obesity,' said Dr Tham.
For those whose bodies are genetically predisposed to store fat and resist weight loss, medical help is available, including ultra-low calorie diets, medication, or metabolic and bariatric surgery options, according to experts.
In tougher cases, patients may need help from a full medical team - a physician, dietitian, physiotherapist and psychologist.
'If a person with obesity has tried many, many times, it's really a signal for us to add on something beyond diet and lifestyle therapies … We may … introduce things like medications and even intensive diet, dietary interventions,' Dr Tham added.
Surgery is a further option for more serious cases. Doctors may shrink a patient's stomach through surgical procedures, or fill part of it with a gastric balloon so that it holds less food and induces a feeling of fullness in the patient.
PREVENTION IS BETTER THAN CURE
While treatments and therapies are available to manage obesity, national efforts to promote healthy living and fitness aim to help people stay healthy and prevent weight gain before it starts.
Dr Tham said the Healthier SG initiative has also encouraged people to visit their doctors and go for health screenings, where they can have their weight assessed.
'The environment is very, very important. And I think as a whole … our government has done a very good job,' she noted.
'We've seen a lot of infrastructure built around the environment, making healthcare very close to where we live, and making the environment very liveable,' she said, noting how people can easily access public exercise corners and fitness classes through ActiveSG.

Related Articles

Forum: MOH supports healthcare providers in safeguarding patient databases

Straits Times

We refer to Dr Yik Keng Yeong's letter 'Doctors may need help preventing cyber attacks on patient databases' (July 22).

Cyber security is a shared responsibility. Government systems such as the National Electronic Health Record (NEHR) are designed with robust cyber and data security measures in place to protect Singaporeans' health information. These include technical safeguards such as firewalls, security tools to detect and mitigate attacks, and ongoing monitoring and investigation of suspicious activities. In turn, systems connected to the NEHR are also required to have appropriate security measures in place.

We understand the concern that GPs may have, and have ensured that white-listed clinic management systems meet the requisite cyber-security standards. Healthcare providers play an important role in ensuring that they have robust arrangements in place for how their clinic management systems or electronic medical record systems are managed and used. They should put in place good practices, such as using strong passwords and two-factor authentication, being vigilant against phishing attempts, using anti-malware and anti-virus solutions, and keeping computers and systems updated with security patches.

The Ministry of Health supports healthcare providers through funding, training and educational materials on cyber and data security practices. There are also the Cyber and Data Security Guidelines and Guidebook, which help healthcare providers understand and meet the essential security requirements. The Ministry of Health is also exploring ways to work with cyber and data security providers to support healthcare providers in strengthening their security posture.

We strongly encourage all healthcare providers to familiarise themselves with essential cyber and data security practices.

Raymond Chua (Adjunct Professor)
Deputy Director-General of Health (Health Regulation)
Ministry of Health

Mental health in the age of AI

Straits Times

Apps may be able to assist in cognitive behavioural therapy, detect depression risk in the user's voice, and more, experts here say.

SINGAPORE - It is generally a two-month wait to see a psychiatrist at the Institute of Mental Health (IMH) outpatient clinic. So, to help patients through the dry spell between sessions, the hospital is studying whether patients can use a locally developed AI-powered app that, among other things, provides guided meditation and is able to predict the user's stress level.

Since May 2025, IMH has offered the AmDTx app, or a placebo app, to individuals referred by a GP or a polyclinic doctor to the institute, to participate in the trial.

'Coping skills such as deep breathing and sleep hygiene can help one manage stress, or even symptoms of anxiety and depression. They can learn these from the app and start the interventions first. This will hopefully reduce their distress while waiting to see the specialist,' said Dr Christopher Cheok, a senior consultant at IMH and director of national mindline 1771, Singapore's first helpline and text service for mental health.

Dr Cheok said the long wait times in public mental healthcare are because of rising demand and limited manpower.

IMH is also exploring the use of other apps to monitor and support the care of mental health conditions, including one approved by the United States Food and Drug Administration (FDA). In general, digital tools include AI-enabled chatbots, mobile apps, wearable devices, and web-based programmes.

Some of these tools can be seen at an exhibition on digital mental health, organised by the Yeo Boon Khim Mind Science Centre and D.S. Lee Foundation Mind Art Experiential Lab (MAELab), which opened at the MAELab space in Alexandra Hospital on July 11. While the centre said it does not endorse any of the tools, they demonstrate the potential of digital and AI-powered technology in mental healthcare.

Tech can help

An app that is available from a healthcare provider, Rejoyn was the first prescription digital therapeutic for the treatment of major depressive disorder to be approved by the FDA, in 2024. The smartphone app, designed to be used alongside medication, delivers a programme of evidence-based brain-training exercises and therapeutic lessons to help adult patients manage their symptoms.

Other tools shown at MAELab include an online assessment tool from local firm Neurowyzr, which screens for early cognitive changes, and a Voice AI tool from another Singapore-based firm, Wonder Technologies, that screens for depression risk.
The latter will soon undergo testing over a year with participants recruited from institutions affiliated with the National University of Singapore and the National University Health System, said the firm's CEO, Ms Wendy Wu.

A similar Voice AI tool to detect early signs of depression in older adults is being developed here under SoundKeepers, a three-year local programme announced in October 2024. Its researchers said that developing a native technology for Singapore facilitates compliance with national healthcare data protection standards.

It was at the height of the Covid-19 pandemic in June 2020 that the MOH Office for Healthcare Transformation (MOHT) created mindline.sg as a digital mental health resource website, which now boasts an AI-enabled chatbot, Wysa.

Ms Janice Weng, deputy director of mindline.sg at MOHT, said digital solutions are useful for mental health self-help, and the office would like to pilot a form of self-directed psychotherapy that is being used at IMH in community and primary care settings. iCBT, or internet-based Cognitive Behavioural Therapy, could enhance access to affordable mental health care in the community and help reduce unnecessary visits to hospitals, she said.

MOHT is starting to develop AI models analysing Singlish, multilingual texts, and emotional cues and nuances that Western tools may miss, she added. Another digital health platform that it co-developed with IMH uses data from fitness trackers and smartphones to help care teams tailor support and empower individuals with psychosis and mood disorders to manage their own mental health. It may be useful for predicting depression in youth.

Digital phenotyping, which uses smartphone data to understand user behaviour, is emerging as a promising way to detect mental health issues. Researchers have found, for example, that shifts in heart rate variability or sleep patterns can signal anxiety or low mood before individuals are even aware of it.

Dr Jill Murphy, the executive director of the APEC Digital Hub for Mental Health, who was in Singapore recently, said she is particularly excited about how this technology could lead to more personalised care. 'Although more research is needed in this area, it has the potential to shift the focus from broad categories of mental illness like depression to a more patient-centred approach,' she said. Tailoring interventions and treatment plans to match a person's unique needs, values, culture and experiences could also increase engagement with digital tools, she added.

Dr Murphy was a plenary speaker at the July 16-17 Singapore Mental Health Conference, addressing how to use digital technologies to promote equitable access to mental health promotion and care in the Apec region.

Treading with caution

A big problem with digital mental health tools, however, is the sheer number of options out there, the majority of which have not been proven to be effective.

Adjunct Associate Professor Cornelia Chee, head and senior consultant at National University Hospital's psychological medicine department, said plenty of work remains to establish the effectiveness, safety, and ethical use of digital and AI-enabled tools in real-world clinical settings. She cautioned that these tools should complement, and not replace, the therapeutic relationship that remains central to mental healthcare.
The Organisation for the Review of Care and Health Apps (Orcha), founded by clinicians from the United Kingdom's National Health Service, reviewed approximately 35,000 uses of digital health technology, and found just 20 per cent to be secure, cyber-safe, and clinically effective.

Dr Cheok said a search shows that there are more than 10,000 mental health apps on the Apple and Google Play stores. 'In general, I think because apps are not regulated, no one can vouch for the quality of the information contained in the app or the intervention that's within the app, and one thing the public may not be so aware of is how their data is being used,' he said. 'Therefore, whichever apps we choose to evaluate, they must have shown to be useful in other countries and have been subjected to research studies for efficacy and validation.'

Dr Cheok said IMH picked the AmDTx app as it was shown to work overseas, and is now studying its effectiveness in the local population. The other test site for the trial, expected to end by the first quarter of 2026, is the Singapore General Hospital.

For now, Dr Murphy advises checking app privacy policies, published scientific studies, and endorsements from reputable organisations when evaluating digital mental health tools. She said organisations like Orcha have established standards for reviewing apps. Orcha has also created a Mind App Library, where users can browse apps that meet the standards that it has identified, she said.

Associate Professor John Wong, director of the Yeo Boon Khim Mind Science Centre, said that with more apps coming to market, individuals must learn to make informed choices. 'What you really want is not to tell people what to buy, what to use, but what is in the technology, what is it that you need? And then they can be informed users,' he said.

IMH chief executive officer Daniel Fung said validated digital tools for the population will likely be accessible through mindline.sg in the future. MOHT's Ms Weng said programmes such as mindline.sg, iCBT, and peer-led platforms could in the future make mental healthcare widely and easily accessible. 'Singapore can pioneer a hybrid model – where AI handles scale and prevention, and limited manpower focuses on where the needs are best met with empathy and complex care,' she said.

Tracking tiny shifts in our bodies that hint at mental strain

An exhibition on digital mental health tools highlights some biomarker changes that precede mental health issues. It was launched by the Yeo Boon Khim Mind Science Centre at its MAELab space in Alexandra Hospital.

1. Depression and anxiety
Biomarkers: Reduced heart rate variability (HRV), poor sleep quality, decreased activity levels.
Insight: These changes often show days or weeks before someone reports feeling low or anxious. Wearables can detect the trend early, nudging users to seek help or adjust their lifestyles.

2. Burnout or chronic stress
Biomarkers: Elevated resting heart rate, decreased HRV, poor sleep quality.
Insight: These signs appear subtly – even before people feel stressed. Smartwatches can send alerts for persistent physiological stress.

3. Menstrual-related mood disorders
Biomarkers: Changes in sleep patterns, increased body temperature and resting heart rate.
Insight: Devices can track how your body responds across cycles and alert you to abnormal patterns, such as more severe mood changes.

Know your biomarkers

Heart rate variability

HRV – the variation in time between each heartbeat – can indicate overall stress burden and recovery status.
Reduced HRV has been observed in anxiety disorders, depression, and more. HRV is highly individual, as it is influenced by factors such as genetics, age and sex, so comparing it across individuals is often unhelpful. Longitudinal tracking of one's own HRV baseline offers more meaningful insights (a code sketch after this section illustrates the idea). While a higher HRV is generally considered better, there is no universal cut-off for what constitutes 'low HRV'. A sustained drop of 20 per cent to 30 per cent below an individual's norm over weeks or months may be more indicative of concern than a one-off low reading.

Respiratory rate

The number of breaths you take every minute reflects how your body is functioning, especially during rest or sleep. Changes in your respiratory rate – especially when you are asleep – can be early signs not just of issues such as infections, but also of stress or anxiety. They can also be due to overtraining or hormonal fluctuations.

Body temperature

Wearables can measure peripheral skin or wrist temperature (not core temperature). A rise in temperature from your baseline level can signal the early stages of infection or illness, while a subtle rise can indicate ovulation due to hormonal changes. Look for repeated or sustained increases from your usual pattern. A small elevation of 0.2 deg C to 0.5 deg C, for example, may not point to illness on its own but, when combined with other symptoms, might suggest your body is under stress.

Sleep metrics

Sleep, including its various stages, can be tracked by analysing metrics such as HRV and body temperature. These metrics help to detect sleep patterns linked not just to physical issues – such as reduced deep sleep after overtraining – but also to mental health issues, such as insomnia in depression or fragmented sleep in anxiety. A lack of sleep is associated with irritability, anxiety, and a higher risk of depression. Deep sleep regulates stress hormones, while REM (rapid eye movement) sleep supports emotional processing.

Activity metrics

These include steps taken, calories burned, distance travelled and activity intensity. Physical activity triggers endorphins, dopamine, serotonin and norepinephrine, which are key regulators of mood, motivation, and stress resilience. Regular activity is linked to reduced stress, improved mood, better sleep, and lower rates of depression and anxiety. Tracking one's activity metrics can reinforce these positive behaviours.
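To make the HRV baseline-tracking advice concrete, here is a minimal sketch, assuming daily HRV readings (for example, RMSSD in milliseconds), a 60-day personal baseline window, and the sustained 20 to 30 per cent drop heuristic quoted above. The function name and window sizes are our own illustrative choices, not taken from any wearable vendor:

```python
from statistics import mean

def sustained_hrv_drop(daily_hrv, baseline_days=60, recent_days=21,
                       drop_fraction=0.20):
    """Flag a sustained drop below one's own HRV baseline.

    Compares the average of the most recent readings against a personal
    baseline built from the preceding weeks, reflecting the guidance that
    a sustained 20-30% drop below an individual's norm matters more than
    a one-off low reading.
    """
    if len(daily_hrv) < baseline_days + recent_days:
        return False  # not enough history to establish a personal norm
    baseline = mean(daily_hrv[-(baseline_days + recent_days):-recent_days])
    recent = mean(daily_hrv[-recent_days:])
    return recent < baseline * (1 - drop_fraction)

# Example: a 55 ms baseline followed by three weeks averaging 40 ms
history = [55] * 60 + [40] * 21
print(sustained_hrv_drop(history))  # True: about 27 per cent below baseline
```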

Can AI be my friend and therapist?

Straits Times

Mental health professionals in Singapore say they have been seeing more patients who tap AI chatbots for a listening ear.

SINGAPORE - When Ms Chu Chui Laam's eldest son started facing social challenges in school, she was stressed and at her wits' end. She did not want to turn to her friends or family for advice, as a relative's children were in the same pre-school as her son. Nor did she think the situation was so severe as to require the help of a family therapist. So she decided to turn to ChatGPT for parenting advice.

'Because my son was having troubles in school interacting with his peers, ChatGPT gave me some strategies to navigate such conversations. It gave me advice on how to do a role-play scenario with my son to talk through how to handle the situation,' said Ms Chu, 36, an insurance agent.

She is among a growing number of people turning to chatbots for advice in times of difficulty and stress, with some even relying on these generative artificial intelligence (AI) tools for emotional support or therapy. Anecdotally, mental health professionals in Singapore say they have been seeing more patients who tap AI chatbots for a listening ear, especially since the public roll-out of ChatGPT in November 2022.

The draw of AI chatbots is understandable: they are available 24/7, free of charge, and will never reject or ignore you. But mental health professionals also warn about the potential perils of using the technology for such purposes. These chatbots are not designed or licensed to provide emotional support or therapy. They provide generic answers. There is no oversight. They can also worsen a person's condition and generate dangerous responses in cases of suicide ideation.

AI chatbots cannot help those with more needs

Mr Maximillian Chen, clinical psychologist from Annabelle Psychology, said: 'An AI chatbot could be helpful when seeking suggestions for self-help strategies, or for answering one-off questions about their mental health.' While it is useful for generic advice, it cannot help those with greater needs.

Ms Irena Constantin, principal educational psychologist at Scott Psychological Centre, pointed out that most AI chatbots do not consider individual history, often lack context, and are limited in handling complex mental health disorders. 'In contrast, mental health professionals undergo lengthy and rigorous education and training, and it is a licensed and regulated profession in many countries,' said Ms Constantin.

Concurring, Mr Chen said there are also serious concerns about the use of generative AI like ChatGPT as surrogate counsellors or psychologists.
'While Gen AI may increase the accessibility of mental health resources for many, Gen AI lacks the emotional intelligence to accurately understand the nuances of a person's emotions. It may fail to identify when a person is severely distressed and continue to support the person when they may instead require higher levels of professional mental health support. It may also provide inappropriate responses, as we have seen in the past,' said Mr Chen.

More dangerously, generative AI could worsen the mental health conditions of those who already have, or are vulnerable to, psychotic disorders. Psychotic disorders are a group of serious mental illnesses with symptoms such as hallucinations, delusions and disorganised thoughts.

Associate Professor Swapna Verma, chairman of the Institute of Mental Health's medical board, has seen at least one case of AI-induced psychosis in a patient at the tertiary psychiatric hospital. Earlier in 2025, the patient was talking to ChatGPT about religion when his psychosis was stable and well-managed, and the chatbot told him that if he converted to a particular faith, his soul would die. Consumed by the fear of a dying soul, he started going to a temple 10 times a day.

'Patients with psychosis experience a break in reality. They live in a world which may not be in line with reality, and ChatGPT can reinforce these experiences for them,' said Prof Swapna.

Luckily, the patient eventually recognised that his behaviour was troubling, and that ChatGPT had likely given him the wrong information.

For around six months now, Prof Swapna has been making it a point to ask during consultations whether patients are using ChatGPT. Most of her patients admit to using it, some to better understand their conditions, and others to seek emotional support. 'I cannot stop my patients from using ChatGPT. So what I do is tell them what kind of questions they can ask, and how to use the information,' said Prof Swapna. For example, patients can ask ChatGPT for things like coping strategies if they are upset, but should avoid trying to get a diagnosis from the AI chatbot.

'I went to ChatGPT because I needed an outlet'

Users that The Straits Times spoke to say they are aware and wary of the risks that come with turning to ChatGPT for advice. Ms Chu, for example, is careful about the prompts that she feeds ChatGPT when she is seeking parenting advice and strategies. 'I tell ChatGPT that I want objective, science-backed answers. I want a framework. I want it to give me questions for me to ponder, instead of giving me answers just like that,' said Ms Chu, adding that she would not pour out her emotional troubles to the chatbot.

An event organiser who wants to be known only as Kaykay said she turned to ChatGPT in a moment of weakness. The 38-year-old, who has a history of bipolar disorder and anxiety, was feeling anxious after being misunderstood at work in early 2025. 'I tried my usual methods, like breathing exercises, but they weren't working. I knew I needed to get it out, but I didn't want to speak to anybody because it felt like it was a small issue that was eating me up. So I went to ChatGPT because I needed an outlet,' said Kaykay.

While talking to ChatGPT did distract her and help her calm down, Kaykay ultimately recognises that the AI tool can be quite limited.
'The responses and advice were quite generic, and were things I already knew how to do,' said Kaykay, who added that using ChatGPT can be helpful as a short-term stop-gap measure, but that long-term support from therapists and friends is equally important.

The pitfalls of relying too much on AI

Ms Caroline Ho, a counsellor at Heart to Heart Talk Counselling, said a pattern she observed was that those who sought advice from chatbots often had pre-existing difficulties with trusting their own judgment, and described feeling more isolated over time. 'They found it difficult to stop reaching out to ChatGPT as they felt technology was able to empathise with their feelings, which they could not find in their social network,' said Ms Ho, noting that some users began withdrawing further from their limited social circles.

She added that those who relied heavily on AI sometimes missed out on the opportunity to develop emotional regulation and cognitive resilience, which are key goals in therapy. 'Those who do not wish to work on over-reliance on AI will eventually drop out of counselling,' she said.

In her practice, Ms Ho also saw another group of clients who initially used AI to streamline work-related tasks. Over time, some developed imposter syndrome and began to doubt the quality of their original output. In certain cases, this later morphed into turning to AI for personal advice as well. 'We need to recognise that humans are never perfect, but it is through our imperfections that we hone our skills, learning from mistakes and developing people management abilities through trial and error,' she said.

Similarly, Ms Belinda Neidhart-Lau, founder and principal therapist of The Lighthouse Counselling, noted that while chatbots offer instant feedback or comfort, they can short-circuit a necessary part of emotional growth. 'AI may inadvertently discourage people from engaging with their own discomfort,' she told ST. 'Sitting with difficult emotions, reflecting independently, and working through internal struggles are essential practices that build emotional resilience and self-awareness.'

Experts are also concerned about the full impact of AI chatbots on the mental health of the younger generation, whose brains are still developing while they have access to the technology. Mr Chen said: 'While it is still unclear how the use of Gen AI affects the development of the youth, given that the excessive use of social media has been shown to have contributed to the increased levels of anxiety and depression amongst Generation Z, there are legitimate worries about how Gen AI may affect Generation Alpha.'

Moving ahead with AI

For better or worse, generative AI is set to embed itself more and more into modern life. So there is a growing push to ensure that when these tools are used for mental health or emotional support, they are properly evaluated.

Professor Julian Savulescu, director of the Centre for Biomedical Ethics at NUS, said that currently, the biggest ethical issue with using AI chatbots for emotional support is that these are potentially life-saving or lethal interventions, and they have not been properly assessed, as a new drug would be. Prof Savulescu pointed out that AI chatbots clearly have benefits with their increased accessibility, but there are also risks such as privacy and user dependency. Measures should be put in place to prevent harm.

'It is critical that an AI system is able to identify and refer on cases of self-harm, suicidal ideation, or severe mental health crises.
It needs to be integrated within a web of professional care. Privacy of sensitive health data also needs to be guaranteed,' said Prof Savulescu.

Users should also be able to understand what the system is doing, the potential risks and benefits, and the chances of them occurring.

'AI is dynamic and the interaction evolves – it is not like a drug. It changes over time. We need to make sure these tools are serving us, not us becoming slaves to them, or being manipulated or harmed by them,' said Prof Savulescu.
