People do feel like strangers in Britain - but it's not just because of migration, polling finds


Sky News, 19 May 2025
Last week, Sir Keir Starmer voiced his worry that Britain could become an "island of strangers" if immigration were not tackled.
Some claimed this was a controversial and dangerous stance - drawing parallels with Enoch Powell's Rivers of Blood speech.
But research released today suggests close to half of those in Great Britain feel like "strangers" in their own country.
The survey, carried out by pollsters at More In Common, asked 13,464 people in Great Britain for their feelings on the matter.
And what is even more surprising is that the survey was carried out over a month before Sir Keir's speech.
The research is only being released today, and it is understood that Downing Street had not seen it before the prime minister's speech.
However, it will likely be welcomed as justification of a position aimed at an audience outside Westminster.
Isolation linked to wealth
The prime minister's concerns about Great Britain being an "island of strangers" were inextricably linked to rising immigration.
But the research out today shows the isolation felt by many is strongly linked to wealth - with the poorest in the country more likely to feel like strangers.
The cost of living was mentioned as a contributory factor by many of those asked.
And when it comes to the ethnic breakdown of those saying they feel like strangers, Asian or Asian British people were more likely than either white or black British people to say they felt separate.
Amy, a teacher from Runcorn, told researchers that when "your money's all going on your bills and the boring stuff like food and gas and leccy and petrol" there is nothing left "to do for ourselves".
Who is Starmer targeting?
Those who criticised Sir Keir for his "strangers" speech tended to accuse the prime minister of appealing to supporters of Reform or the Conservatives.
Suspended Labour MP Zarah Sultana went as far as to claim the speech was a "foghorn to the far right".
The analysis from More in Common found that people who supported Reform and the Conservatives last year are indeed much more likely to feel like strangers in the UK.
While Labour, Lib Dem and Green supporters are all less likely to feel like strangers, around a third of them do still agree with the statement that they "sometimes feel like a stranger in my own country".
And the polling also found that Reform and Conservative voters are much more likely to think that multiculturalism threatens national identity, while supporters of the other three parties tend to largely believe multiculturalism is a benefit.
Across the board, supporters of all parties were more likely than not to think that more needs to be done to encourage integration between people of different ethnic backgrounds - and a majority believe that responsibility falls on everyone.
Luke Tryl, the UK director of More in Common, said: "The prime minister's warning that we risk becoming an 'island of strangers' resonates with millions who say they feel disconnected from those around them.
"But it would be a mistake to say that immigration and lack of integration are the sole causes of our fragmenting social fabric."
John McDonnell, another former Labour MP, now suspended, told Sky News that having politicians "exploit" resentment fuelled by economic circumstance to shift "the blame onto migrants just exacerbates the problem".
He said the government needs to "tackle the insecurity of people's lives and you lay the foundations of a cohesive society".
With Reform now leading in the polls and support for Sir Keir having collapsed since he became prime minister, it is unsurprising that what he says seems to match up with what turquoise voters feel.
Work from home alone
The post-pandemic shift to working from home and spending more time alone has also been blamed for an increased feeling of isolation.
Ruqayyah, a support worker from Peterborough, said the shift to home offices had "destroyed our young generation".
But there are many other reasons that people feel separate from the rest of their country.
Young people are less trusting of strangers, and there is also a deep discontent with the political system.
Many think the system is "rigged" in favour of the wealthy - although this belief is less common the higher the level of education someone has completed.
The tensions that exploded during last year's riots are also highlighted, and many people are worried about religious differences - a situation exacerbated by foreign conflicts such as those in the Middle East and between India and Pakistan.
The research was carried out alongside the campaign group Citizens UK and UCL.
Matthew Bolton, executive director of Citizens UK, said: "We all saw what can happen last summer when anger and mistrust boil over and threaten the fabric of our society.
"The answers to this don't lie in Whitehall.
"By listening to people closest to the ground about what causes division and what builds unity in their neighbourhood, we can build a blueprint for cohesion rooted in local leadership and community power."

Related Articles

Government must stop children using VPNs to dodge age checks on porn sites, commissioner demands

The Independent
England's children's commissioner has demanded that the government stop children from using virtual private networks (VPNs) to get around age verification on porn sites.

Calling for change, Dame Rachel de Souza warned it is "absolutely a loophole that needs closing" as she released a new report, which found the proportion of children saying they have seen pornography online has risen in the past two years, with most likely to have stumbled upon it accidentally.

VPNs are tools that connect internet users to websites via remote servers, enabling them to hide their real IP address and location - including making them appear to be online in another country. This means the Online Safety Act, which now forces platforms to check users' ages before granting access to some adult content, can be dodged.

After sites such as PornHub, Reddit and X introduced age verification requirements last month, VPNs became the most downloaded apps, according to the BBC. A government spokesperson told the broadcaster that there are no plans to ban VPNs as they are legal tools for adults.

Dame Rachel told Newsnight: "Of course, we need age verification on VPNs - it's absolutely a loophole that needs closing and that's one of my major recommendations." She called on ministers to look at requiring VPNs "to implement highly effective age assurances to stop underage users from accessing pornography".

More than half (58 per cent) of respondents to the commissioner's survey said that, as children, they had seen pornography involving strangulation, while 44 per cent reported seeing a depiction of rape - specifically someone who was asleep.

Made up of responses from 1,020 people aged between 16 and 21, the report also found that while children were on average aged 13 when they first saw pornography, more than a quarter (27 per cent) said they were 11, and some reported being aged "six or younger".
The research suggested four in 10 respondents felt girls can be "persuaded" to have sex even if they say no at first, and that young people who had watched pornography were more likely to think this way.

The report, a follow-on from research by the Children's Commissioner's office in 2023, found a higher proportion (70 per cent) of people saying they had seen online pornography before turning 18, up from 64 per cent of respondents two years ago. Boys (73 per cent) were more likely than girls (65 per cent) to report seeing online pornography. A majority (59 per cent) of children and young people said they had seen pornography online by accident - a rise from 38 per cent in 2023.

Dame Rachel said her research is evidence that harmful content is being presented to children through dangerous algorithms, rather than them seeking it out. She described the content young people are seeing as "violent, extreme and degrading" and often illegal, and said her office's findings must be seen as a "snapshot of what rock bottom looks like".

Dame Rachel said: "This report must act as a line in the sand. The findings set out the extent to which the technology industry will need to change for their platforms to ever keep children safe.

"Take, for example, the vast number of children seeing pornography by accident. This tells us how much of the problem is about the design of platforms, algorithms and recommendation systems that put harmful content in front of children who never sought it out."

The research was done in May, ahead of new online safety measures coming into effect last month, including age checks to prevent children accessing pornography and other harmful content.

A Department for Science, Innovation and Technology spokesperson told the BBC that "children have been left to grow up in a lawless online world for too long" and "the Online Safety Act is changing that".

However, responding to Dame Rachel's remarks on VPNs, they added that there are no plans to ban them, "but if platforms deliberately push workarounds like VPNs to children, they face tough enforcement and heavy fines".

Londonderry: US flag saved from bonfire returned to school

BBC News

A historical US flag stolen from the grounds of a school built on the site of a former American naval base has been returned after it was recovered from a bonfire in Derry.

The flag was taken from Foyle College on the city's Limavady Road. On Monday, independent Derry City and Strabane District councillor Gary Donnelly said he believed the flag had been removed from the bonfire after efforts to have it returned.

Foyle College confirmed on Tuesday that the flag had now been handed back to the school. The school thanked those involved in securing its safe return. "We hope its safe return will play a part in improving mutual understanding across our shared society and assist efforts to build a more peaceful future," a statement said. The school said that "given the sensitivity surrounding this process", it would be making no further comment.

The flag was gifted to the school by members of the former US naval communications base. It was last officially flown at the base in November 1963 to mark President Kennedy's death and, more than half a century later, in 2019, was presented to Foyle College, which had moved to the site the year before.

Police have said they are investigating the placing of materials, including flags and wreaths, on the bonfires, which were lit in the Creggan and Bogside areas of Derry on Friday night, as sectarian hate crimes and sectarian hate incidents.

On Monday it was reported that a last-ditch attempt to save a flag stolen from Londonderry's Protestant cathedral from being burned on a bonfire in the city had failed.

Labour 'playing gesture politics' with online safety, says Molly Russell's father

The Telegraph

Labour is "playing gesture politics" over protecting children from online harms, the father of Molly Russell has said.

In an exclusive article for The Telegraph, Ian Russell warned that eight years after his 14-year-old daughter's death, social media platforms were still bombarding children with the same kind of suicide and self-harm content that led her to take her life.

He claimed he had been met with "radio silence" from Sir Keir Starmer and Peter Kyle, the Technology Secretary, over the six months since they personally assured him they would look again at toughening up the Online Safety Act.

'Sticking plaster ideas'

And he accused the Government of being "more interested in playing performative gesture politics with sticking plaster ideas" - such as two-hour caps on children's app use that would do little to tackle their exposure to online harms.

Mr Russell is demanding the Government strengthen the Act by replacing Ofcom's "timid" codes of practice with clear outcomes and targets for the social media companies to wipe their sites clean of harmful content such as suicide material.

Research published on Tuesday by the Molly Rose Foundation, the charity set up in his daughter's memory, found TikTok and Instagram were still deluging teenagers with "industrial levels" of dangerous suicide and self-harm content. The study claimed more than 90 per cent of the videos recommended to potentially vulnerable teenagers were promoting or glorifying suicide or self-harm.

Molly took her own life after being bombarded with 16,000 "destructive" posts - including 2,100 on Instagram - encouraging self-harm, anxiety and even suicide in her final six months. The coroner at her inquest concluded she died from an act of self-harm while suffering from depression and "the negative effects of online content", which had "more than minimally contributed" to her death.

Mr Russell, who chairs the foundation, said: "It is staggering that eight years after Molly's death, incredibly harmful suicide, self-harm and depression content like she saw is still pervasive across social media.

"Ofcom's recent child safety codes do not match the sheer scale of harm being suggested to vulnerable users, and ultimately do little to prevent more deaths like Molly's.

"For over a year, this entirely preventable harm has been happening on the Prime Minister's watch and where Ofcom have been timid, it is time for him to be strong and bring forward strengthened, life-saving legislation without delay."

The researchers used accounts opened with a registered age and identity of a 15-year-old girl who had previously engaged with suicide, self-harm and depression material. The study was conducted in the weeks leading up to the implementation of the Online Safety Act, which requires companies to prevent and remove such content. However, the research suggested the platforms' algorithms were still driving content deemed harmful towards teenagers.

Videos were classified as harmful if they either promoted or glorified suicide or self-harm, referred to suicide or self-harm ideation, or otherwise featured highly intense themes of hopelessness, misery and despair.

Almost all (96 per cent) of the algorithmically recommended videos watched on TikTok's For You Page contained content that was likely to be harmful, particularly when viewed cumulatively or in large amounts. Some 97 per cent of Instagram short-form videos (known as Reels) contained themes likely to be harmful, particularly when being recommended and consumed in large amounts.

"The findings suggest safeguards were still not in place on either TikTok or Instagram, and that in the immediate period before regulation took effect, children could still be exposed to a substantial risk of reasonably foreseeable but preventable harm," said the charity's report.

More than half (55 per cent) of recommended harmful posts on TikTok's For You Page included references to suicide and self-harm ideation, and 16 per cent referred to suicide methods. The charity said the harmful content was achieving "disturbing" levels of interest. One in 10 of the videos deemed harmful by researchers on TikTok's For You Page had been liked at least one million times. On Instagram Reels, one in five harmful recommended videos had been liked more than 250,000 times.

The Technology Secretary said: "These figures show a brutal reality - for far too long, tech companies have stood by as the internet fed vile content to children, devastating young lives and even tearing some families to pieces.

"But companies can no longer pretend not to see. The Online Safety Act, which came into effect earlier this year, requires platforms to protect all users from illegal content and children from the most harmful content, like promoting or encouraging suicide and self-harm. Forty-five sites are already under investigation.

"Ofcom is also considering how to strengthen existing measures, including by proposing that companies use proactive technology to protect children from self-harm content and that sites go further in making algorithms safe."

Meanwhile, a study by the Children's Commissioner found that children are more likely to view porn on Elon Musk's X than on dedicated adult sites. Dame Rachel de Souza found that children as young as six are being exposed to more porn since the Online Safety Act became law than they were before. That included illegal violent porn, such as strangulation and non-consensual sex. Social media and networking sites accounted for 80 per cent of the main sources by which children viewed porn. Dame Rachel said this easy access was influencing children's attitudes towards women, meaning nearly half of them believed girls who said no could be persuaded to have sex.

X, formerly Twitter, remained the most common source, outstripping dedicated porn sites. The gap between the number of children seeing pornography on X and those seeing it on dedicated porn sites has widened (45 per cent versus 35 per cent in 2025, compared with 41 per cent versus 37 per cent in 2023). Snapchat accounted for 29 per cent, Instagram 23 per cent, TikTok 22 per cent, and YouTube 15 per cent.

The research, based on 1,020 young people aged 16 to 21, found 70 per cent of children had seen porn before the age of 18, an increase from 64 per cent in 2023, when the Online Safety Act received royal assent.

In her report, published on Tuesday, Dame Rachel said: "Violent pornography is easily accessible to children, exposure is often accidental and often via the most common social media sites, and it is impacting children's behaviours and beliefs in deeply concerning ways.

"This report must be a line in the sand. It must be a snapshot of what was - not what will be."

A TikTok spokesman said: "Teen accounts on TikTok have 50+ features and settings designed to help them safely express themselves, discover and learn, and parents can further customise 20+ content and privacy settings through family pairing.

"With over 99 per cent of violative content proactively removed by TikTok, the findings don't reflect the real experience of people on our platform, which the report admits."

A Meta spokesman said: "We disagree with the assertions of this report and the limited methodology behind it. Tens of millions of teens are now in Instagram teen accounts, which offer built-in protections that limit who can contact them, the content they see, and the time they spend on Instagram.

"We continue to use automated technology to remove content encouraging suicide and self-injury, with 99 per cent proactively actioned before being reported to us. We developed teen accounts to help protect teens online and continue to work tirelessly to do just that."

Is Starmer ready to take decisive measures to save our children?

By Ian Russell

Eight years on from my daughter Molly's death, we continue to lose the battle against the untold harm being inflicted by tech giants. Every week in the UK, we lose at least another teenager to suicide where technology plays a role. However, the unfathomable reality is that I'm less convinced than ever that our politicians will do what's necessary to stop this preventable harm in its tracks.

This week, the Molly Rose Foundation released deeply disturbing new research showing that in the weeks before the Online Safety Act took effect, Instagram and TikTok's algorithms continued to recommend the type of toxic material that cost my daughter her life. Vulnerable teenagers continue to be bombarded with suicide, self-harm and intense depression material on a near industrial scale.

We should be in no doubt why this is still happening. The harms on social media are the direct result of business models that actively prioritise user engagement and a race for market share. Children's safety continues to be seen as an optional extra, and Ofcom's desperately unambitious implementation of the Online Safety Act will do little to change the commercial incentives that continue to cost children's lives.

This preventable harm is happening on this Government's watch. Six months ago, I met the Prime Minister and told him that further urgent action was necessary.

Ofcom 'timid and unambitious'

I told him that parents were heartened by Labour's commitment, in opposition, to strengthen the Online Safety Act. That they were encouraged by the Technology Secretary's recognition that the Act was "uneven and unsatisfactory".

Crucially, I explained that swift and decisive action was necessary to fix structural issues with the Online Safety Act - still the most effective and quickest way to protect children from widespread harm while also enabling them to enjoy the benefits of life online - and to arrest the sticking plaster approach that Ofcom has adopted to implementation.

I don't have confidence in the regulator's approach. Ofcom has proven to be desperately timid and unambitious, and seems determined to take decisions that are stacked in favour of tech companies rather than victims. For all the regulator's breathless claims to be "taming toxic algorithms", buried in the detail of its plans is an expectation that the likes of TikTok and Instagram will only need to spend £80,000 fixing the algorithms that helped kill Molly, and that our research shows are continuing to cause widespread and pervasive harm today.

This is pocket money to platforms making tens of billions every year. It sends the clearest of signals to the tech giants that the current regime expects them to pay lip service to online safety but doesn't really expect them to implement the achievable changes to prioritise safety over profit.

Six months after I met the Prime Minister and Technology Secretary Peter Kyle, and received a personal assurance from them that they would look again at this issue, all I have received from Number 10 is radio silence. Meanwhile, the Technology Secretary appears more interested in playing performative gesture politics with sticking plaster ideas like two-hour app caps, which those who work in online safety immediately recognise will do little to meaningfully move the dial.

Public support for Act is strong

Despite the recent predictable howls of protest from free speech activists and tech libertarians, public support for the Online Safety Act remains strong. Our polling suggests that 80 per cent of adults want the Act to be strengthened, with growing despair from parents and the public that our politicians seem unable or unwilling to protect our children, families and wider society from preventable harm.

With all this in mind, my message to Sir Keir Starmer is clear. Is he prepared to take the decisive measures necessary to strengthen regulation and take on the tech companies which are a threat to children's safety? Will he listen to the public, bereaved families and civil society to deliver a comprehensive strengthening of the Online Safety Act, knowing that the majority of people in this country will be cheering him on?

The alternative is that he leaves children and families at risk from largely unchecked but inherently preventable harm. Regulation that is well-meaning but isn't up to the job will not save the lives that it must.
