
Latest news with #KevinBeaumont

Microsoft introduces terrifying new AI tool, angers users

Yahoo

05-05-2025


The world of artificial intelligence (AI) just took a shocking turn, one that seems to be terrifying consumers. For many people, the AI revolution is reshaping entire industries and markets in a frightening way. Companies are creating the type of technology that science fiction writers have long predicted could overpower humanity. Others are more worried about their jobs being rendered obsolete, as AI systems demonstrate clear abilities to perform human tasks with extreme precision. Companies have quickly moved to eliminate positions that center around tasks such as data entry and customer service. One company helping usher in the next phase of AI is Microsoft (MSFT). But the tech leader recently released a controversial feature that is sparking strong concern among its user base, which seems extremely worried about the consequences. Over the past few years, Microsoft has worked hard to remain competitive in the AI market as startups such as OpenAI and Anthropic have made significant advances. Its Copilot AI chatbot remains a popular choice among users, although it has not caught up with ChatGPT. Recently, however, Microsoft has made multiple AI-related announcements that many consumers likely find frightening. Last week, it released a detailed report describing a future in which most teams of office workers are composed of AI agents managed by humans. A few days ago, though, the company announced plans to bring back an AI tool that generated a lot of controversy the last time it tried to roll it out. Originally introduced in May 2024, Recall is a Windows 11 feature that takes a screenshot of everything a user does, then indexes and stores it in a matter of seconds. Concerns about security and privacy violations quickly arose, prompting the company to push back the launch, originally intended for June 2024. 
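As described above, Recall screenshots what is on screen, runs optical character recognition over each snapshot, and stores the extracted text in a locally indexed, searchable database. A toy sketch of that "index everything, search anything" pattern, using SQLite's FTS5 full-text index (an illustration only; the table layout and sample data are invented, not Microsoft's actual schema or code):

```python
import sqlite3

# Toy stand-in for Recall's store: OCR'd snapshot text in a local
# full-text index. This is NOT Microsoft's implementation.
db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE snapshots USING fts5(captured_at, app, text)")

# Whatever was on screen -- including a message later deleted by the
# sender -- ends up as searchable text.
db.executemany(
    "INSERT INTO snapshots VALUES (?, ?, ?)",
    [
        ("2025-04-28 09:00", "Signal", "meet at the usual place at noon"),
        ("2025-04-28 09:05", "Browser", "logged in to patient portal"),
    ],
)

# Anyone with access to the index can search it by keyword.
rows = db.execute(
    "SELECT app, text FROM snapshots WHERE snapshots MATCH 'noon'"
).fetchall()
print(rows)  # → [('Signal', 'meet at the usual place at noon')]
```

The point of the sketch is why deletion elsewhere doesn't help: once the text is in the recipient's local index, the original message's lifetime is irrelevant.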
But a year later, Microsoft is pushing forward with its plans to bring Recall to its Copilot+ PCs. In a recent statement, Microsoft attempted to frame Recall as a powerful new feature that can save users time by letting them search more efficiently. 'You are always in control of what snapshots are saved and can pause saving snapshots at any time,' it promises users. However, this isn't necessarily going to ease people's anxiety. 'Even if User A never opts in to Recall, they have no control over the setting on the machines of Users B through Z,' Ars Technica reports. 'That means anything User A sends them will be screenshotted, processed with optical character recognition and Copilot AI, and then stored in an indexed database on the other users' devices.' The outlet adds that this could allow the AI tool to collect any sensitive material from the user in question, including passwords, medical data, and messages sent through encrypted services. Cyber expert Kevin Beaumont recently confirmed that Recall can capture messages sent through encrypted apps such as WhatsApp and Signal, even after they have been deleted. 'I would recommend that if you're talking to somebody about something sensitive who is using a Windows PC, that in the future you check if they have Recall enabled first,' he advises. Beaumont's point highlights an important underlying truth about Recall: it isn't just PC users who may be at risk from this technology, it is anyone who communicates with them whose private messages may be stored. Encrypted messaging platforms such as WhatsApp and Signal are popular because they offer users more privacy and allow them to exchange sensitive information and materials. Now, if Microsoft moves forward with introducing Recall, these protections could be compromised. Many social media users are voicing their displeasure at Microsoft's decision, making it clear they do not approve. 
'If I were to have a Windows PC, this would instantly void that device as trash material for me,' one Reddit user states. 'Why would anyone let this literal spyware run on their device is beyond me.' Another adds that they can't see who might benefit from this, aside from 'data brokers and cyber criminals,' who could certainly reap the benefits if they find a way to hack into the system. Other Reddit users have professed plans to switch over to using an Apple computer, citing the Recall news as the reason why.

Microsoft introduces terrifying new AI tool, angers users

Miami Herald

03-05-2025


The world of artificial intelligence (AI) just took a shocking turn, one that seems to be terrifying consumers. For many people, the AI revolution is reshaping entire industries and markets in a frightening way. Companies are creating the type of technology that science fiction writers have long predicted could overpower humanity. Others are more worried about their jobs being rendered obsolete, as AI systems demonstrate clear abilities to perform human tasks with extreme precision. Companies have quickly moved to eliminate positions that center around tasks such as data entry and customer service. One company helping usher in the next phase of AI is Microsoft (MSFT). But the tech leader recently released a controversial feature that is sparking strong concern among its user base, which seems extremely worried about the consequences. Over the past few years, Microsoft has worked hard to remain competitive in the AI market as startups such as OpenAI and Anthropic have made significant advances. Its Copilot AI chatbot remains a popular choice among users, although it has not caught up with ChatGPT. Recently, however, Microsoft has made multiple AI-related announcements that many consumers likely find frightening. Last week, it released a detailed report describing a future in which most teams of office workers are composed of AI agents managed by humans. A few days ago, though, the company announced plans to bring back an AI tool that generated a lot of controversy the last time it tried to roll it out. Originally introduced in May 2024, Recall is a Windows 11 feature that takes a screenshot of everything a user does, then indexes and stores it in a matter of seconds. Concerns about security and privacy violations quickly arose, prompting the company to push back the launch, originally intended for June 2024. 
But a year later, Microsoft is pushing forward with its plans to bring Recall to its Copilot+ PCs. In a recent statement, Microsoft attempted to frame Recall as a powerful new feature that can save users time by letting them search more efficiently. 'You are always in control of what snapshots are saved and can pause saving snapshots at any time,' it promises users. However, this isn't necessarily going to ease people's anxiety. 'Even if User A never opts in to Recall, they have no control over the setting on the machines of Users B through Z,' Ars Technica reports. 'That means anything User A sends them will be screenshotted, processed with optical character recognition and Copilot AI, and then stored in an indexed database on the other users' devices.' The outlet adds that this could allow the AI tool to collect any sensitive material from the user in question, including passwords, medical data, and messages sent through encrypted services. Cyber expert Kevin Beaumont recently confirmed that Recall can capture messages sent through encrypted apps such as WhatsApp and Signal, even after they have been deleted. 'I would recommend that if you're talking to somebody about something sensitive who is using a Windows PC, that in the future you check if they have Recall enabled first,' he advises. Beaumont's point highlights an important underlying truth about Recall: it isn't just PC users who may be at risk from this technology, it is anyone who communicates with them whose private messages may be stored. Encrypted messaging platforms such as WhatsApp and Signal are popular because they offer users more privacy and allow them to exchange sensitive information and materials. Now, if Microsoft moves forward with introducing Recall, these protections could be compromised. Many social media users are voicing their displeasure at Microsoft's decision, making it clear they do not approve. 
'If I were to have a Windows PC, this would instantly void that device as trash material for me,' one Reddit user states. Another adds that they can't see who might benefit from this, aside from 'data brokers and cyber criminals,' who could certainly reap the benefits if they find a way to hack into the system. Other Reddit users have professed plans to switch over to using an Apple computer, citing the Recall news as the reason why.

Microsoft's AI Secretly Reads Your WhatsApp, Signal Messages

Forbes

29-04-2025


Be very careful what you send. Update: Republished on April 28 with news that Meta's AI will also read messages. Timing is everything. Just weeks after America's NSA warned about the hidden dangers of secure messaging platforms like WhatsApp and Signal, especially when users link phone apps to PCs and other devices, everything is suddenly worse — much worse. Microsoft has decided to release its controversial Recall to Copilot+ PCs, which then continually screenshots and optically reads everything on screen, to be saved behind a simple PIN. It doesn't matter how secure you think you are; if you message someone who has a Windows PC with this feature enabled, all that security falls away instantly. As Ars Technica explains, 'even if User A never opts in to Recall, they have no control over the setting on the machines of Users B through Z. That means anything User A sends them will be screenshotted, processed with optical character recognition and Copilot AI, and then stored in an indexed database on the other users' devices.' Recall captures anything Users B through Z see on screen, bar some specific data types Microsoft will try (and sometimes manage) to redact, such as passwords. Ars Technica warns that this will 'indiscriminately hoover up all kinds of User A's sensitive material, including photos, passwords, medical conditions, and encrypted videos and messages.' Unlike with new options to record phone calls, there is no warning here that your content is being saved and stored by someone else, or that your secrets now depend on the security of countless Microsoft Windows PCs to stay secret. That's the operative word: for User A, this all takes place secretly, without warning or opt-out. Cyber guru Kevin Beaumont put all this to the test and has found security and privacy holes galore. 
While Recall's screenshots are stored locally and secured by the infamous TPM 2.0 requirement that stops so many Windows 10 users from upgrading, once set up, the only security protecting all that data is a simple PIN, to say nothing of the risk from hackers. 'To test this,' Beaumont says, 'I tasked my partner with using my device while I was away from desk to use Recall to find out who I'd been talking to the previous day in Signal and what I'd been saying.' She guessed the PIN and was in. 'So, in 5 minutes, a non-technical person had access to everything I'd ever done on the PC, including disappearing Signal conversations (as Recall retains anything deleted). That isn't great.' Recall is an easy target. It was withdrawn when Microsoft first unleashed it on the world, and was put through a privacy and security sheep dip before its second coming. Now it's here again, with better opt-outs and security wraps, but with the same very basic flaws. The idea that every interaction you have with a Recall user is screenshotted and kept forever without you knowing feels — at its core — very wrong. But this is just another example of AI bringing unlimited scale to dangerous activities with ease. Your messages — disappearing or otherwise — have always been subject to a recipient screenshot. But not at industrialized scale. Similarly, targeted phishing attacks, better-written spam, and brand ripoffs are all now being industrialized by AI. Put together, the linked-device warning and Recall's launch mean it's time for Signal, WhatsApp, and others to end their linked-device options or provide some way for messages to be tagged so that they appear only on primary devices — meaning phones. The simple truth is that secure messaging and staccato screenshotting don't mix. 
In the meantime — and this is a serious warning — do remember that anything you send may not disappear into the chat archive on a phone, but may be analyzed, indexed and stored by AI in an easily searchable database on a device you do not control. As Beaumont says, 'Recall still captures and stores things after deletion. Disappearing Signal and WhatsApp messages are still captured, as are deleted Teams messages. I would recommend that if you're talking to somebody about something sensitive who is using a Windows PC, that in the future you check if they have Recall enabled first.' Ironically, just as Recall starts optically reading WhatsApp (and other secure messages), WhatsApp itself has stepped in to create even more AI-fueled confusion for its 3 billion users. Meta's engineers have suddenly announced that its AI will process messages after all, despite saying that it won't, but with assurances it's all done privately. So, nothing to worry about then? 'We're sharing an early look into Private Processing,' the team posted, 'an optional capability that enables users to initiate a request to a confidential and secure environment and use AI for processing messages where no one — including Meta and WhatsApp — can access them. To validate our implementation of these and other security principles, independent security researchers will be able to continuously verify our privacy and security architecture and its integrity.' Per Wired, 'the whole effort raises a more basic question, though, about why a secure communication platform like WhatsApp needs to offer AI features at all. Meta is adamant, though, that users expect the features at this point and will go wherever they have to to get them.' That's the crux of this new debate for billions of users. 'What makes me more nervous,' crypto expert Matthew Green posted on X, 'is what comes after these systems? Will these AIs stay strictly private? 
Or will they begin to share summarized private data with providers like Meta, for example to improve search results? There's a huge risk of a total privacy unraveling here.' Despite assurances that 'Private Processing will allow users to leverage powerful AI features, while preserving WhatsApp's core privacy promise,' there are clear privacy concerns here. While Meta insists 'no one except you and the people you're talking to can access or share your personal messages, not even Meta or WhatsApp,' this is the grey area where AI is currently changing how we think about our privacy. And even if Meta's engineers achieve this level of private processing, Recall will take its snapshots of all these private messages and will store them outside WhatsApp. For users this is becoming overly complex. You have been warned.

Microsoft's AI Starts Secretly Copying And Saving Your Messages

Forbes

28-04-2025


Be very careful what you send. Timing is everything. Just weeks after America's NSA warned about the hidden dangers of secure messaging platforms like WhatsApp and Signal, especially when users link phone apps to PCs and other devices, everything is suddenly worse — much worse. Microsoft has decided to release its controversial Recall to Copilot+ PCs, which then continually screenshots everything on a user's screen, to be saved behind a simple PIN code. It doesn't matter how secure you think you are; if you message someone who has a Windows PC with this feature enabled, all that security falls away instantly. As Ars Technica explains, 'even if User A never opts in to Recall, they have no control over the setting on the machines of Users B through Z. That means anything User A sends them will be screenshotted, processed with optical character recognition and Copilot AI, and then stored in an indexed database on the other users' devices.' Recall captures anything Users B through Z see on screen, bar some specific data types Microsoft will try (and sometimes manage) to redact, such as passwords. Ars Technica warns that this will 'indiscriminately hoover up all kinds of User A's sensitive material, including photos, passwords, medical conditions, and encrypted videos and messages.' Unlike with new options to record phone calls, there is no warning here that your content is being saved and stored by someone else, or that your secrets now depend on the security of countless Microsoft Windows PCs to stay secret. That's the operative word: for User A, this all takes place secretly, without warning or opt-out. Cyber guru Kevin Beaumont put all this to the test and has found security and privacy holes galore. While Recall's screenshots are stored locally and secured by the infamous TPM 2.0 requirement that stops so many Windows 10 users from upgrading, once set up, the only security protecting all that data is a simple PIN, to say nothing of the risk from hackers. 
'To test this,' Beaumont says, 'I tasked my partner with using my device while I was away from desk to use Recall to find out who I'd been talking to the previous day in Signal and what I'd been saying.' She guessed the PIN and was in. 'So, in 5 minutes, a non-technical person had access to everything I'd ever done on the PC, including disappearing Signal conversations (as Recall retains anything deleted). That isn't great.' Recall is an easy target. It was withdrawn when Microsoft first unleashed it on the world, and was put through a privacy and security sheep dip before its second coming. Now it's here again, with better opt-outs and security wraps, but with the same very basic flaws. The idea that every interaction you have with a Recall user is screenshotted and kept forever without you knowing feels — at its core — very wrong. But this is just another example of AI bringing unlimited scale to dangerous activities with ease. Your messages — disappearing or otherwise — have always been subject to a recipient screenshot. But not at industrialized scale. Similarly, targeted phishing attacks, better-written spam, and brand ripoffs are all now being industrialized by AI. Put together, the linked-device warning and Recall's launch mean it's time for Signal, WhatsApp, and others to end their linked-device options or provide some way for messages to be tagged so that they appear only on primary devices — meaning phones. The simple truth is that secure messaging and staccato screenshotting don't mix. In the meantime — and this is a serious warning — do remember that anything you send may not disappear into the chat archive on a phone, but may be analyzed, indexed, and stored by AI in an easily searchable database on a device you do not control. As Beaumont says, 'Recall still captures and stores things after deletion. Disappearing Signal and WhatsApp messages are still captured, as are deleted Teams messages. 
I would recommend that if you're talking to somebody about something sensitive who is using a Windows PC, that in the future you check if they have Recall enabled first.' You have been warned.

New Security Warning After 1 Billion Windows Users Told Do Not Delete

Forbes

27-04-2025


That mystery Windows security update could block new security updates. As if users of the world's most popular operating system (although I use that term with some caution) don't have enough security issues to worry about, Microsoft appears to have introduced one of its own making. With dangerous infostealer malware on the hunt for Windows passwords and cookies that bypass 2FA codes, and a record number of vulnerabilities reported, the last thing a billion Windows users want to hear is that an update meant to solve security issues may have introduced a new one of its own. As regular readers will know, I'm something of an advocate, almost evangelical in fact, when it comes to security updates. Whether it is the latest Google Chrome browser emergency update, or the monthly Patch Tuesday rollout of fixes, often relating to zero-day vulnerabilities that are actively being exploited and impacting Windows users, my advice is always the same: update now. Sometimes, however, the early bird that gets the worm discovers it's a rotten one. Who can forget the recent security update that killed Microsoft's Windows Hello security feature, for example. Or, even more recently, the disastrous April 8 update to protect against the CVE-2025-21204 vulnerability, which installed a mysterious folder and got everyone's collective conspiracy-theory panties in a bunch. Microsoft had to issue a notice explaining that the folder was critical protection against threat actors exploiting the vulnerability in question and, contrary to the advice spreading across social media platforms, should not be deleted under any circumstances. That folder is called inetpub, and it's at the heart of this latest warning from a highly respected security researcher who used to work for Microsoft itself. 'I've discovered this fix introduces a denial of service vulnerability in the Windows servicing stack that allows non-admin users to stop all future Windows security updates,' the researcher, Kevin Beaumont, said. 
I have reached out to Microsoft for a statement, but in the meantime this is some of the response that was sent to Beaumont after he contacted Microsoft about the issue: 'After careful investigation, this case is currently rated as a Moderate severity issue. It does not meet MSRCs current bar for immediate servicing as the update fails to apply only if the 'inetpub' folder is a junction to a file and succeeds upon deleting the inetpub symlink and retrying.' Microsoft told Beaumont that it had shared the report with the relevant Windows security team, which would consider a potential fix, but for now, the case was closed.
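Microsoft's reply pins down the mechanism: the update fails to apply only when inetpub already exists as a junction (a link) pointing at a file rather than being a real directory, and succeeds once the link is deleted and the update is retried. A toy Python sketch of that failure mode (an illustration under stated assumptions, not Microsoft's servicing-stack code; the `apply_update` helper and paths are hypothetical, and a POSIX symlink stands in for a Windows junction):

```python
import os
import tempfile

def apply_update(protected_path):
    """Toy stand-in for an update step that expects to own a real
    directory at a fixed path (illustrative only, not Microsoft code)."""
    # A link that resolves to a file is the case Microsoft described:
    # the update "fails to apply" instead of proceeding.
    if os.path.islink(protected_path) and os.path.isfile(protected_path):
        raise OSError("servicing stack: protected path is a link to a file")
    os.makedirs(protected_path, exist_ok=True)
    return "update applied"

root = tempfile.mkdtemp()
decoy = os.path.join(root, "decoy.exe")
open(decoy, "w").close()

# Attacker step: a non-admin pre-creates "inetpub" as a link to a file.
inetpub = os.path.join(root, "inetpub")
os.symlink(decoy, inetpub)

try:
    apply_update(inetpub)
except OSError as err:
    print("update blocked:", err)  # the denial of service Beaumont reported

# Microsoft's suggested recovery: delete the link and retry.
os.remove(inetpub)
print(apply_update(inetpub))  # → update applied
```

The sketch shows why this is a denial of service rather than a data breach: nothing is stolen, but a low-privilege user can keep the machine from ever taking the update.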
