WhatsApp tells BBC it backs Apple in legal row with UK over user data
WhatsApp has told the BBC it is supporting Apple in its legal fight against the UK Home Office over user data privacy.
The messaging app's boss, Will Cathcart, said the case "could set a dangerous precedent" by "emboldening other nations" to seek to break encryption, which is how tech firms keep their users' data private.
Apple went to the courts after receiving a notice from the Home Office earlier this year demanding the right to access the data of its global customers if required in the interests of national security.
Apple and other critics of the government's position say the request compromises the privacy of millions of users.
The BBC has approached the Home Office for comment.
It has previously declined to comment directly on the Apple case.
But it has told the BBC the government's "first priority" is "to keep people safe" and the UK has a "longstanding position of protecting our citizens from the very worst crimes, such as child sex abuse and terrorism, at the same time as protecting people's privacy."
WhatsApp has applied to submit evidence to the court which is hearing Apple's bid to have the Home Office request overturned.
Mr Cathcart said: "WhatsApp would challenge any law or government request that seeks to weaken the encryption of our services and will continue to stand up for people's right to a private conversation online."
This intervention from the Meta-owned platform represents a major escalation in what was an already extremely high-profile and awkward dispute between the UK and the US.
Apple's row with the UK government erupted in February, when it emerged ministers were seeking the right to be able to access information secured by its Advanced Data Protection (ADP) system.
The argument intensified in the weeks that followed, with Apple first pulling ADP in the UK, and then taking legal action against the Home Office.
It also sparked outrage among US politicians, with some saying it was a "dangerous attack on US cybersecurity" and urging the US government to rethink its intelligence-sharing arrangements with the UK if the notice was not withdrawn.
Tulsi Gabbard, the director of US National Intelligence, described it as an "egregious violation" of US citizens' privacy.
Civil liberties groups also attacked the UK government, saying what it was demanding had privacy and security implications for people around the world.
Apple's ADP applies end-to-end encryption (E2EE) to files such as photos and notes stored in iCloud, meaning only the user has the "key" required to view them.
The same technology protects a number of messaging services, including WhatsApp.
That makes them very secure but poses a problem for law enforcement agencies.
They can ask to see data with lower levels of protection - if they have a court warrant - but tech firms have no way to provide access to E2EE files, because no mechanism exists to unlock them.
Tech companies have traditionally resisted creating such a mechanism, not just because they say it would compromise users' privacy, but because there would be no way of preventing it from eventually being exploited by criminals.
In 2023, WhatsApp said it would rather be blocked as a service than weaken E2EE.
When Apple pulled ADP in the UK it said it did not want to create a "backdoor" that "bad actors" could take advantage of.
Further complicating the argument around the Home Office's request is that it is made under the Investigatory Powers Act, the provisions of which are often secret.
When the matter came to court, government lawyers argued that the case should not be made in public in any way for national security reasons.
However, in April, a judge agreed with a number of news organisations, including the BBC, and said certain details should be made public.
"It would have been a truly extraordinary step to conduct a hearing entirely in secret without any public revelation of the fact that a hearing was taking place," his ruling stated.
At the time, the government declined to comment on the proceedings but said: "The UK has robust safeguards and independent oversight to protect privacy and privacy is only impacted on an exceptional basis, in relation to the most serious crimes and only when it is necessary and proportionate to do so."
Related Articles


Gizmodo
After 18 Years, Apple Is Killing Its 9-Minute Snooze—That Can Only Mean One Thing
For years, it's always been nine more measly minutes. If you don't know what the hell I'm talking about, you've probably never owned an iPhone, or you're one of those freaks who wakes up without a device screaming in your face to do so. If you are in one of those camps, let me explain: for 18 years, Apple has maintained a vice grip on its alarm snooze feature, which grants nine more minutes to your alarm. No more, no less. Just nine minutes. And there's no adjusting that in settings. No adjusting that until now, that is.

As noted by MacRumors, iOS 26, which was just introduced at Apple's WWDC 2025, finally lets you manually set your snooze time, which means one thing: it's time to sleep the f**k in, at least for as much as 15 whole minutes. In normal, non-sleep-related time, six minutes more isn't a lot, but when it comes to waking up, if you're anything like me, six minutes is basically a lifetime. Imagine all the horrible stress dreams about your teeth falling out you could have had in that time. Or heck, you might even luck out and get the one where you're driving a car and the brakes go out. The possibilities are really endless, or at least endless within a 15-minute span.

Not only that, but you can even—if you're a total masochist—set your snooze time to be shorter. As noted by MacRumors, the developer beta allows you to choose anywhere between one and 15 minutes. The world is now your sleepy little oyster, and you are able to shuck it into the future up to 15 minutes at a time.

On one hand, it's kind of wild that it's taken this long to give people the option to extend or retract their snooze times, but also very Apple-like. For many years, Apple was known for its definitive design that locked people in, though that's changed as the years have gone by. In today's iOS, you can change app icons, customize wallpapers, and—soon in iOS 26—choose backgrounds for your threads in Messages, and much more.
Those are all things that iOS users of yore only dreamed about, and now they're a reality. It's a shift for Apple, but in this case, probably one that most people will welcome.

As for the 9-minute default, well, it'll still have its place as the iOS default and also its own place in history. The 9-minute snooze, if you'll allow me a quick reverie, is a vestige of alarm clock history, originating from GE's Model 7H241 from 1956, which was the first alarm clock with a snooze feature. Why nine minutes exactly? Well, back in the day, clocks had gears, and that meant you had to work around the physical constraints of said gears. GE wasn't able to set 10 minutes exactly due to those constraints—it had to choose nine minutes and change or 10 minutes and change, and ultimately it went with nine. Clearly that decision lasted a lot longer than nine minutes in the long run.

If you're ready to break out of the 9-minute prison Apple has kept you in, you'll have to wait a little bit, though. Currently, iOS 26 is only available via a developer beta, and the first public beta launches next month. The non-beta software should launch in full in the fall, along with Apple's newest-generation iPhones, and once that happens, we can all rest easy—at least for 15 more minutes.


Tom's Guide
Copilot Vision just launched on Windows — here's what it actually does
Microsoft just flipped the switch on one of its most ambitious Copilot features yet. Copilot Vision with Highlights is now rolling out to Windows 11 users in the U.S. The new tool allows Copilot to 'see' what's on your screen and provide contextual help — a move that puts it in direct competition with Google's Gemini Live and Apple's upcoming Apple Intelligence. Essentially, it's Microsoft's answer to the next generation of AI assistants: ones that are proactive, ambient and deeply integrated into your device.

At its core, Copilot Vision gives the AI the ability to 'see' whatever you're currently doing on your PC — whether you're browsing, editing a document, watching a video or working in Excel — and offer help based on that screen content. Copilot can now view the apps and windows on your screen (with permission), making the AI smarter and more responsive in real time.

Highlights is a companion feature that automatically surfaces useful content from your apps, browser and documents. Think of it like an AI assistant that notices what you've been working on and suggests relevant files, reminders or actions — no prompt necessary. Highlights appear in a refreshed Copilot interface, which now docks to the side of your screen for quick access.

These features are now available to U.S. users running Windows 11 version 23H2 on Copilot+ PCs, or select devices that meet the hardware requirements. You'll need to have screen reading enabled in the Copilot settings, and Vision only activates when you give it permission. You can try it today by opening Copilot from the taskbar and clicking the new Vision icon in the corner. A pop-up will confirm screen access and let you toggle Highlights on or off.
Microsoft's move signals a major push to stay competitive, giving its AI assistant capabilities similar to those of rivals like Google's Gemini and OpenAI's ChatGPT. With OpenAI powering Copilot, and Meta and Apple launching their own ambient AI tools, we're entering the age of 'AI that sees.' Whether that's helpful or a little creepy may depend on how well it works — and how much you choose to share.


CNET
I Need Apple to Make the iPhone 17 Cameras Amazing. Here's What It Should Do
Apple's WWDC was a letdown for me, with no new hardware announced and few new features beyond a glassy interface for iOS 26. I'm pinning my hopes on the iPhone 17 to get my pulse racing, and the best way it can do that is with the camera.

The iPhone 16 Pro already packs one of the best camera setups found on any phone, and it's capable of taking stunning images in any conditions. Throw in its ProRes video, Log recording and the neat 4K slow-motion mode and it's a potent video shooter too. It even put up a strong fight against the other best camera phones around, including the Galaxy S25 Ultra, the Pixel 9 Pro and the Xiaomi 14 Ultra.

Read more: Camera Champions Face Off: iPhone 16 Pro vs. Galaxy S25 Ultra

Despite that, it's still not the perfect camera. While early reports from industry insiders claim that the phone's video skills will get a boost, there's more the iPhone 17 will need to become an all-round photography powerhouse. As both an experienced phone reviewer and a professional photographer, I have exceptionally high expectations for top-end phone cameras. And, having used the iPhone 16 Pro since its launch, I have some thoughts on what needs to change. Here are the main points I want to see improved on the iPhone 17 when it likely launches in September 2025.

An accessible Pro camera mode

At WWDC, Apple showed off the changes coming in iOS 26, which included a radical change to the interface with Liquid Glass. That simplified style extended to the camera app too, with Apple paring the interface down to the most basic functions of Photo, Video and zoom levels. Presumably, the idea is to make it super easy for even the most beginner of photographers to open the camera and start taking Instagram-worthy snaps.
The new camera app is incredibly barebones. (Apple/Screenshot by Joe Maldonado/CNET)

And that's fine, but what about those of us who buy the Pro models in order to take deeper advantage of features like exposure compensation, Photographic Styles and ProRaw formats? It's not totally clear yet how these features can be accessed within the new camera interface, but they need to not be tucked away. Many photographers -- myself very much included -- want to use these tools as standard, using our powerful iPhones in much the same way we would a mirrorless camera from Canon or Sony. That means relying on advanced settings to take control over the image-taking process and craft shots that go beyond simple snaps. If anything, Apple's camera app has always been too simple, with even basic functions like white balance being unavailable. To see Apple take things to an even more simplistic level is disappointing, and I want to see how the company will continue to make these phones usable for enthusiastic photographers.

Larger image sensor

Though the 1/1.28-inch sensor found on the iPhone 16 Pro's main camera is already a good size -- and marginally larger than the S24 Ultra's 1/1.33-inch sensor -- I want to see Apple go bigger. A larger image sensor can capture more light and offer better dynamic range. It's why pro cameras tend to have at least "full frame" image sensors, while really high-end cameras, like the amazing Hasselblad 907X, have enormous "medium format" sensors for pristine image quality.

Even on pro cameras, sensor size is important: even the full-frame image sensor in the middle is dwarfed by the medium format sensor on the right, and phone camera sensors don't come anywhere near this size. (Andrew Lanxon/CNET)

Xiaomi understands this, equipping its 15 Ultra and previous 14 Ultra with 1-inch type sensors.
Those sensors are larger than the ones found on almost any other phone, which allowed the 15 Ultra to take stunning photos all over Europe, while the 14 Ultra was heroic in capturing a Taylor Swift concert. I'm keen to see Apple at least match Xiaomi here with a similar 1-inch type sensor. Though if we're talking pie-in-the-sky wishes, maybe the iPhone 17 could be the first smartphone with a full-frame image sensor. I won't hold my breath on that one -- the phone, and the lenses, would need to be immense to accommodate it, so it'd likely be more efficient just to let you make calls with your mirrorless camera.

Variable aperture

Speaking of the Xiaomi 14 Ultra, one of the other reasons that phone rocks so hard for photography is the variable aperture on its main camera. Its widest aperture is f/1.6 -- significantly wider than the f/1.78 of the iPhone 16 Pro. A wider aperture lets in a lot of light in dim conditions and more authentically achieves out-of-focus bokeh around a subject.

The streetlight outside this pub has been turned into an attractive starburst thanks to the variable aperture of the Xiaomi 14 Ultra. (Andrew Lanxon/CNET)

But Xiaomi's 14 Ultra aperture can also close down to f/4, and with that narrower aperture, it's able to create starbursts around points of light. I love achieving this effect in nighttime imagery with the phone. It makes the resulting images look much more like they've been taken with a professional camera and lens, while the same points of light on the iPhone just look like roundish blobs. Disappointingly, Xiaomi actually removed this feature from the new 15 Ultra, so whether Apple sees value in implementing this kind of technology remains to be seen.

More Photographic Styles

Though Apple has had various styles and effects integrated into the iPhone's cameras, the iPhone 16 range took it further, with more control over the effects and more toning options.
It's enough that CNET Senior Editor Lisa Eadicicco even declared the new Photographic Styles her "favorite new feature on Apple's latest phone." I think they're great too. Or rather, they're a great start. The different color tones, like the ones you get with the Amber and Gold styles, add some lovely warmth to scenes, and the Quiet effect adds a vintage filmic fade, but there's still not a whole lot to choose from and the interface can be a little slow to work through. I'd love to see Apple introduce more Photographic Styles with different color toning options, or even with tones that mimic vintage film stocks from Kodak or Fujifilm.

I like the warmer tones produced by the iPhone's Amber style in this image, but I'd definitely like to see more options for getting creative with color tones. (Andrew Lanxon/CNET)

And sure, there are plenty of third-party apps like VSCO or Snapseed that let you play around with color filters all you want. But using Apple's styles means you can take your images with the look already applied, and then change it afterward if you don't like it -- nothing is hard-baked into your image. I was recently impressed with Samsung's new tool for creating custom color filters based on the look of other images. I'd love to see Apple bring that level of image customization to the iPhone.

Better ProRaw integration with Photographic Styles

I do think Apple has slightly missed an opportunity with its Photographic Styles, though, in that you can use them only when taking images in HEIF (High Efficiency Image Format). Unfortunately, you can't use them when shooting in ProRaw. I love Apple's use of ProRaw on previous iPhones, as it takes advantage of all of the iPhone's computational photography -- including things like HDR image blending -- but still outputs a DNG raw file for easier editing. The DNG file typically also offers more latitude to brighten dark areas or tone down highlights in an image, making it extremely versatile.
Previously, Apple's color presets could be used when shooting in ProRaw, and I loved it. I frequently shot street-style photos using the high-contrast black-and-white mode and then edited the raw file further.

I do a lot of street photography in black and white, and I'd love more flexibility to take ProRaw shots in monochrome. (Andrew Lanxon/CNET)

Now using that same black-and-white look means shooting images only in HEIF format, eliminating the benefits of using Apple's ProRaw. Oddly, while the older-style "Filters" are no longer available in the camera app when taking a raw image, you can still apply those filters to raw photos in the iPhone's gallery app through the editing menu.

LUTs for ProRes video

And while we're on the topic of color presets and filters, Apple needs to bring those to video, too. On the iPhone 15 Pro, Apple introduced the ability to shoot video in ProRes, which results in very low-contrast, almost gray-looking footage. The idea is that video editors will take this raw footage and then apply their edits on top, often applying contrast and color presets known as LUTs (look-up tables) that give footage a particular look -- think dark and blue for horror films, or warm and light tones for a romantic drama vibe. But Apple doesn't offer any kind of LUT for editing ProRes video on the iPhone, beyond simply ramping up the contrast, which doesn't really do the job properly. Sure, the point of ProRes is that you would take that footage off the iPhone, put it into software like DaVinci Resolve and then properly color grade it so it looks sleek and professional.

ProRes footage looks very low contrast and desaturated; Apple needs to introduce ways to help you do more with ProRes files on the iPhone. (Andrew Lanxon/CNET)

But that still leaves the files on your phone, and I'd love to be able to do more with them. My gallery is littered with ungraded video files that I'll do very little with because they need color grading externally.
I'd love to share them to Instagram, or with my family over WhatsApp, after transforming those files from drab and gray to beautifully colorful. With the iPhone 17, or even with the iPhone 16 as a software update, I want to see Apple create a range of its own LUTs that can be applied directly to ProRes video files on the iPhone. While we didn't see this software functionality discussed as part of the company's June WWDC keynote, that doesn't mean it couldn't launch with the iPhone in September. If Apple were able to implement all these changes -- excluding, perhaps, the full-frame sensor, which even I can admit is a touch ambitious -- it would have an absolute beast of a camera on its hands.