Mobile phones to receive emergency alert in national test - here's what to expect


ITV News · 07-07-2025
A loud siren will sound from millions of mobile phones this September as the government tests its emergency alert system.
The emergency alert text will be sent to mobile phones across the UK at around 3pm on Sunday, September 7, in its second-ever nationwide drill.
Phones will vibrate for roughly ten seconds - even if they are set to silent - and a message will appear on phone screens, making it clear it is a test.
The system was first tested in April 2023, but some mobile phone users reported that their devices did not sound, with the problem traced to specific networks.
Why is it taking place?
The emergency alert system spreads information and advice rapidly, warning people when there is a danger to life nearby, such as during extreme weather.
Regular testing ensures the system is functioning correctly, should it be needed in an emergency, and familiarises the public with the alerts.
Who will and won't receive the alert?
During the test, the UK's approximately 87 million mobile devices will ring out with a high-pitched alarm and vibrate for around 10 seconds, while a message will appear on the screen making it clear the alert is only a test.
The system has already been used in several scenarios – including storms, flooding and in one case when an unexploded Second World War bomb was discovered.
Emergency alerts work on all 4G and 5G phone networks in the UK, and a mobile phone or tablet does not have to be connected to mobile data or wifi to receive them.
Alerts will not be received if a device is turned off, connected to a 2G or 3G network, wifi only, or not compatible.
Ahead of the national test, ministers are spearheading a public awareness campaign to ensure people understand when it is taking place.
Domestic violence charities and campaigners are working with victims of abuse to ensure they know how to switch off alerts on a concealed phone.
People who find themselves in this situation are being told to consult the guidance on opting out of alerts on gov.uk, the government website.
Disability charities and campaigners are also being consulted to support disabled people, and drivers are being encouraged to find somewhere safe and legal to stop before reading the message.
No personal data will be collected or shared as part of the test.
When has it been used before?
Since the first national test of the emergency alerts system in 2023, five alerts have been sent.
4.5 million people in Scotland and Northern Ireland received an alert during Storm Eowyn in January 2025.
3.5 million people across Wales and the South West of England received an alert during Storm Darragh, which killed two people, in December 2024.
Other activations included when an unexploded Second World War bomb was discovered in Plymouth and localised flash flooding in Cumbria and Leicestershire.
What else is the government doing?
The government is publishing a Resilience Action Plan on Tuesday to improve the way it prepares for and responds to emergencies, and will publish an update on action being taken to secure the country from biological risks.
Pat McFadden, Chancellor of the Duchy of Lancaster, is the Cabinet Office minister who has taken charge of efforts to boost national resilience against crises.
He said: "Emergency alerts have the potential to save lives, allowing us to share essential information rapidly in emergency situations including extreme storms.
"Just like the fire alarm in your house, it's important we test the system so that we know it will work if we need it."
Other countries, including Japan and the USA, also regularly test their emergency alert systems.

Related Articles

Children must not grow up at mercy of toxic algorithms, says tech secretary
Glasgow Times · 3 hours ago

Peter Kyle said the Government was laying the foundations for a safer, healthier, more humane online world, as he warned tech firms they 'will be held to account' if they fail to adhere to the measures.

The changes, as part of the Online Safety Act and set to be enforced by regulator Ofcom, require online platforms to have age checks in place – using facial age estimation or credit card checks – if they host pornography or other harmful content such as self-harm, suicide or eating disorders.

[Image caption: Technology Secretary Peter Kyle said the Government had drawn a line on online protection for children (Stefan Rousseau/PA)]

They also require platforms to ensure algorithms do not work to harm children by, for example, pushing such content towards them when online. Actions which could be taken against firms which fail to comply with the new codes include fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, and court orders potentially blocking access in the UK.

Campaigners have warned the measures must be enforced strictly, with the NSPCC urging Ofcom to 'show its teeth' if companies fail to make changes in line with the regulator's protection of children codes. But the Molly Rose Foundation – set up by bereaved father Ian Russell after his 14-year-old daughter Molly took her own life having viewed harmful content on social media – said there is a 'lack of ambition and accountability' in the measures, and accused the regulator of choosing to 'prioritise the business needs of big tech over children's safety'.

Mr Kyle insisted the Government has 'drawn a line in the sand' and that the codes will bring real change. He said: 'This Government has taken one of the boldest steps anywhere in the world to reclaim the digital space for young people – to lay the foundations for a safer, healthier, more humane place online.

'We cannot – and will not – allow a generation of children to grow up at the mercy of toxic algorithms, pushed to see harmful content they would never be exposed to offline. This is not the internet we want for our children, nor the future we are willing to accept.'

He said the time for tech platforms 'to look the other way is over', calling on them to 'act now to protect our children, follow the law, and play their part in creating a better digital world'. He warned: 'And let me be clear: if they fail to do so, they will be held to account. I will not hesitate to go further and legislate to ensure that no child is left unprotected.'

Ofcom chief executive Dame Melanie Dawes has previously defended the reforms against criticism, insisting that tech firms are not being given much power over the new measures, which will apply across the UK. Dame Melanie said: 'Prioritising clicks and engagement over children's online safety will no longer be tolerated in the UK.

'Our message to tech firms is clear – comply with age checks and other protection measures set out in our codes, or face the consequences of enforcement action from Ofcom.'

The regulator said X, formerly Twitter, and others including Bluesky, Reddit and dating app Grindr are among those to have committed to age assurances, and described its safety codes as demanding that algorithms 'must be tamed and configured for children so that the most harmful material is blocked'.

It said it has launched a monitoring and impact programme focused on some of the platforms where children spend most time, including social media sites Facebook, Instagram and TikTok, gaming site Roblox and video clip website YouTube. The sites are among those which have been asked to submit, by August 7, a review of their efforts to assess risks to children and, by September 30, scrutiny of the practical actions they are taking to keep children safe.

Chris Sherwood, chief executive at the NSPCC, said: 'Children, and their parents, must not solely bear the responsibility of keeping themselves safe online. It's high time for tech companies to step up.' He said if enforcement is 'strong', the codes should offer a 'vital layer of protection' for children and young people when they go online, adding: 'If tech companies fail to comply, Ofcom must show its teeth and fully enforce the new codes.'

Echoing this, Barnardo's children's charity said the changes are 'an important stepping stone' but 'must be robustly enforced'. England's Children's Commissioner, Dame Rachel de Souza, said Friday 'marks a new era of change in how children can be protected online, with tech companies now needing to identify and tackle the risks to children on their platforms or face consequences', and said the measures must keep pace with emerging technology to remain effective in the future.

But Andy Burrows, chief executive of the Molly Rose Foundation, said: 'This should be a watershed moment for young people but instead we've been let down by a regulator that has chosen to prioritise the business needs of big tech over children's safety.' He said the 'lack of ambition and accountability will have been heard loud and clear in Silicon Valley'. He added: 'We now need a clear reset and leadership from the Prime Minister. That means nothing less than a new Online Safety Act that fixes this broken regime and firmly puts the balance back in favour of children.'

Earlier this week, Mr Kyle said children could face a limit on using social media apps to help them 'take control of their online lives'. He said he wanted to tackle 'compulsive behaviour', and ministers are reportedly considering a two-hour limit, with curfews also under discussion. The Cabinet minister said he would be making an announcement about his plans for under-16s 'in the near future'.

