
UK will handle US security concerns over new Chinese embassy plan, says Kyle
Technology Secretary Peter Kyle indicated the UK would offer a 'fulsome response' to any concerns raised by allies amid suggestions that US opposition could undermine transatlantic trade negotiations.
Mr Kyle sought to reassure the public that the Government deals with similar 'infrastructure issues' relating to embassies 'all the time'.
'These issues will be taken care of assiduously in the planning process,' he told Sky News's Sunday Morning With Trevor Phillips programme.
He added: 'These are the issues that we talk about as two countries all the time… we're in the Five Eyes agreement, America and Britain share intelligence… If people raise security issues even though it relates to planning, then I'm sure we will have a fulsome response for them.
'But look, the key thing is these are issues which are quite routinised in the way that we deal with the security of our country.'
A senior US official had told the Sunday Times: 'The United States is deeply concerned about providing China with potential access to the sensitive communications of one of our closest allies.'
The matter is believed to have been discussed during US-UK trade talks, with diplomats saying the Trump administration would have reservations about intelligence sharing with the UK if the building went ahead.
More than a thousand demonstrators gathered earlier this year for a rally against the proposed Chinese 'super-embassy', citing concerns about its proximity to Canary Wharf and the City of London.
The redevelopment plans at the former site of the Royal Mint were 'called in' last year, which means the Government will make the final decision following a report from the Planning Inspectorate.
The plan was initially refused by Tower Hamlets Council in 2022.

Related Articles


BreakingNews.ie
Badenoch says organisations should be able to decide if staff can wear burkas
Conservative leader Kemi Badenoch has said employers should be able to decide if their staff can wear burkas in the workplace. Mrs Badenoch also said people who come to her constituency surgeries must remove their face coverings 'whether it's a burka or a balaclava'.

Mrs Badenoch posted a video on X of part of her interview with the Telegraph, in which she said: 'My view is that people should be allowed to wear whatever they want, not what their husband is asking them to wear or what their community says that they should wear.

'I personally have strong views about face coverings.

'If you come into my constituency surgery, you have to remove your face covering, whether it's a burka or a balaclava.

'I'm not talking to people who are not going to show me their face.

'Organisations should be able to decide what their staff wear for instance, it shouldn't be something that people should be able to override.'

She added that France has a ban and has 'worse problems than we do in this country on integration'.

On Wednesday, Reform's newest MP Sarah Pochin asked Sir Keir Starmer during Prime Minister's Questions whether he would support such a ban. Reform UK deputy leader Richard Tice said his party has 'triggered a national discussion'.

Asked if he wants to ban burkas, Mr Tice told GB News on Sunday: 'We've triggered a national discussion. I'm very concerned about them (burkas).

'Frankly, I think they are repressive. I think that they make women second-class citizens.

'We're a Christian nation. We have equality between the sexes, and I'm very concerned, and if someone wants to convince me otherwise, well come and talk to me.

'But at the moment, my view is that I think we should follow seven other nations across Europe that have already banned them.'

He called for a debate on the topic to 'hear where the country's mood is'.

Meanwhile, shadow home secretary Chris Philp said 'employers should be allowed to decide whether their employees can be visible or not' when discussing face coverings.

Asked on the BBC's Sunday With Laura Kuenssberg programme if the Conservative Party's position is not to speak to people who cover their face, Mr Philp said of Mrs Badenoch: 'Well she was talking specifically about her constituency surgery I think, and it is definitely the case that employers should be allowed to decide whether their employees can be visible or not.

'But I don't think this is necessarily the biggest issue facing our country right now.

'There's a legitimate debate to have about the burka.

'You've got, obviously, arguments about personal liberty and choice and freedom on one side, and arguments about causing divisions in society and the possibility of coercion on the other.

'That is a debate I think we as a country should be having, but as Kemi said, it's probably not the biggest issue our nation faces today.'

Asked if he would talk to people who would not show their face, the Croydon South MP said: 'I have in the past spoken to people obviously wearing a burka – I represent a London constituency – but everybody can make their own choices, that's the point she was making, each employer should be able to make their own choices.'


The Guardian
Campaigners urge UK watchdog to limit use of AI after report of Meta's plan to automate checks
Internet safety campaigners have urged the UK's communications watchdog to limit the use of artificial intelligence in crucial risk assessments following a report that Mark Zuckerberg's Meta was planning to automate checks.

Ofcom said it was 'considering the concerns' raised by the letter following a report last month that up to 90% of all risk assessments at the owner of Facebook, Instagram and WhatsApp would soon be carried out by AI.

Social media platforms are required under the UK's Online Safety Act to gauge how harm could take place on their services and how they plan to mitigate those potential harms – with a particular focus on protecting child users and preventing illegal content from appearing. The risk assessment process is viewed as a key aspect of the act.

In a letter to Ofcom's chief executive, Dame Melanie Dawes, organisations including the Molly Rose Foundation, the NSPCC and the Internet Watch Foundation described the prospect of AI-driven risk assessments as a 'retrograde and highly alarming step'.

'We urge you to publicly assert that risk assessments will not normally be considered as 'suitable and sufficient', the standard required by … the Act, where these have been wholly or predominantly produced through automation.'

The letter also urged the watchdog to 'challenge any assumption that platforms can choose to water down their risk assessment processes'.

A spokesperson for Ofcom said: 'We've been clear that services should tell us who completed, reviewed and approved their risk assessment. We are considering the concerns raised in this letter and will respond in due course.'

Meta said the letter deliberately misstated the company's approach on safety and that it was committed to high standards and complying with regulations.

'We are not using AI to make decisions about risk,' said a Meta spokesperson. 'Rather, our experts built a tool that helps teams identify when legal and policy requirements apply to specific products. We use technology, overseen by humans, to improve our ability to manage harmful content and our technological advancements have significantly improved safety outcomes.'

The Molly Rose Foundation organised the letter after NPR, a US broadcaster, reported last month that updates to Meta's algorithms and new safety features will mostly be approved by an AI system and no longer scrutinised by staffers.

According to one former Meta executive, who spoke to NPR anonymously, the change will allow the company to launch app updates and features on Facebook, Instagram and WhatsApp more quickly but would create 'higher risks' for users, because potential problems are less likely to be prevented before a new product is released to the public.

NPR also reported that Meta was considering automating reviews for sensitive areas including youth risk and monitoring the spread of falsehoods.


BBC News
Former 1920s cinema Weybridge Hall could become Equippers Church
A former 1920s cinema, which has been empty since 2014, could be turned into a church. Elmbridge Borough Council sold Weybridge Hall to Equippers Church for £1.2m in November. On Tuesday, the planning committee at Elmbridge will decide an application to turn it into a community facility, primarily for religious use. Officers have said permission should be granted.

The venue, in Church Street, Weybridge in Surrey, had previously been granted permission to become a cinema with flats above it, but the council said the development "never materialised". Currently, the building includes a vacant shop and community hall on the ground floor and a vacant four-bedroom flat across the second and third floors.

Meeting documents said that under the plans the main auditorium could be used by local schools and community groups when not in use by the church, and multi-purpose studios on the upper floors would also be available.

The application received 32 letters of objection, many of which said there were other religious venues in the area and that the building should be used for the benefit of the community of Weybridge as a whole.

At the time of agreeing the sale in November, the council said Equippers Church would "preserve the historical essence of Weybridge Hall" and "breathe new life into the building". The council said no offer to buy the building was received from a theatre or arts organisation.

Among the objections to the current application was that a cinema, theatre or youth group would be preferable to the church. According to its website, Equippers Church is a global movement of local churches across 16 countries.