
From 10 feet away, AI can now listen to your phone calls using..., researchers say their goal is to make people...; Is our privacy in danger?
According to a report by Interesting Engineering, researchers at Penn State University's computer science department have found a surprising new way to eavesdrop on phone calls. They used millimeter-wave radar along with artificial intelligence to detect tiny vibrations coming from a phone's earpiece and convert them into speech. This technology can track conversations from 10 feet away with about 60 per cent accuracy.

Millimeter-wave radar turning vibrations into words
The researchers used millimeter-wave radar, the same type of technology used in self-driving cars, motion sensors, and 5G networks. When we talk on the phone, the sound from the earpiece causes tiny vibrations in the phone's body. This radar can detect those vibrations.
With the help of AI, the vibrations are then analyzed and converted into text. Researcher Suryoday Basak explained that by capturing these vibrations with radar and using machine learning, it's possible to figure out the conversation. While the technology isn't 100 per cent accurate yet, it is accurate enough to leak sensitive information.

60 per cent of the conversation was correct
In their trials, researchers placed the radar about 3 meters (roughly 10 feet) away from the phone. From this distance, the radar detected tiny vibrations from the phone, and AI converted the data into about 10,000 words of text. Around 60 per cent of the conversation matched accurately. While it's not fully precise yet, if this technology improves in the future, it could become a serious threat to privacy.

Big privacy concerns
This development has sparked serious privacy questions. Researcher Suryoday Basak explained that it works a bit like lip reading: just as lip readers can guess what's being said, this system can do the same with vibrations. The real danger comes if someone with bad intentions uses it. The researchers say their goal is to make people aware of the risk so they can be more careful during sensitive conversations.
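The reported pipeline — radar measures tiny earpiece vibrations, machine learning turns them into text — can be illustrated with a toy feature-extraction step. Everything below is a simulated stand-in (the sampling rate, the tones, and the spectrogram parameters are assumptions for illustration), not the Penn State team's actual data or model; a real system would feed such spectrogram frames into a trained speech-recognition network.

```python
import numpy as np

def stft_magnitude(signal, frame_len=256, hop=128):
    """Split a 1-D vibration signal into overlapping windowed frames
    and return the magnitude spectrogram (frames x frequency bins)."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([
        signal[i * hop : i * hop + frame_len] * window
        for i in range(n_frames)
    ])
    return np.abs(np.fft.rfft(frames, axis=1))

rng = np.random.default_rng(0)

# Simulated earpiece vibration: two tones buried in sensor noise,
# standing in for the micron-scale surface motion the radar measures.
fs = 8000                      # Hz, assumed rate of the displacement estimate
t = np.arange(fs) / fs         # one second of signal
vibration = (0.5 * np.sin(2 * np.pi * 220 * t)
             + 0.3 * np.sin(2 * np.pi * 440 * t)
             + 0.05 * rng.standard_normal(fs))

features = stft_magnitude(vibration)
print(features.shape)  # (time frames, frequency bins) — the model's input
```

The point of the sketch is only that once the radar's displacement signal is digitized, it can be treated exactly like low-quality audio: the same time-frequency features used in speech recognition apply, which is why off-the-shelf machine learning gets partial transcripts out of it.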

Related Articles


Hindustan Times
2 days ago
- Hindustan Times
Deepfakes are really dangerous but this small process could make them easier to catch
Worried about deepfakes and how they can impact our societies, systems, and political processes? This simple fix could be the key to addressing the issue. A team at Cornell University has shown that you can watermark reality itself by using light, not software. Instead of embedding a signature in a file that a bad actor can strip or ignore, they embed a quiet code in the scene while you record. Lamps or panels fitted with a tiny controller nudge brightness in patterns that people do not notice, yet cameras do. The camera captures those fluctuations as part of the image. Later, anyone with the matching key can recover a low-fidelity code stream from the footage and check whether it lines up with the scene. If a face was swapped, an object pasted in, or a section reshot, the code in that region will not match. What you get is a built-in authenticity check that travels with the frames and does not rely on downstream cooperation from platforms or models.

The lights look ordinary yet they quietly embed a code the camera sees, letting later edits reveal themselves. (Unsplash)

How the light code works and why it helps

At capture, the system gently modulates one or more lights with a pseudorandom sequence. The variations sit below human perception, so the scene looks normal to viewers in the room and on camera. Because the camera sensor integrates that light, the code becomes part of every frame. During verification, software extracts a reference signal from the footage and compares it with the expected pattern. A clean match says the scene was recorded under the coded lights. A mismatch highlights regions that do not belong. The clever twist is that you can run different codes on different fixtures in the same scene. That makes life difficult for forgers, because any edit has to respect multiple overlapping light signatures, frame by frame, across moving subjects and changing shadows, according to Interesting Engineering.
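The embed-then-correlate idea can be sketched in a few lines. This is a toy model, not the Cornell team's actual pipeline: the frame count, modulation depth, and detrending window are all hypothetical, and real verification works per region of the image rather than on whole-frame brightness. Genuine footage correlates strongly with the key's code; footage shot under ordinary light does not.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pseudorandom +/-1 code, one chip per video frame (hypothetical parameters).
n_frames = 240
code = rng.choice([-1.0, 1.0], size=n_frames)

# Simulated per-frame mean brightness of a scene lit by the coded lamp:
# a slowly drifting baseline plus a tiny, imperceptible coded component.
baseline = 128 + 2 * np.sin(np.linspace(0, 3, n_frames))
depth = 0.5  # brightness nudge, far below what viewers notice
frames = baseline + depth * code + rng.normal(0, 0.2, n_frames)

def code_score(observed, code, win=15):
    """Remove slow lighting drift with a moving average, then
    correlate the residual against the key's code. Genuine footage
    scores near the modulation depth; unlit forgeries score near 0."""
    smooth = np.convolve(observed, np.ones(win) / win, mode="valid")
    k = win // 2
    detrended = observed[k : k + len(smooth)] - smooth
    return float(np.dot(detrended, code[k : k + len(smooth)]) / len(smooth))

genuine = code_score(frames, code)

# A "reshot" clip lit by ordinary, uncoded lights carries no signature.
fake = baseline + rng.normal(0, 0.2, n_frames)
forged = code_score(fake, code)

print(genuine, forged)  # genuine scores well above the forged clip
```

The design choice worth noting is that the verifier needs only the secret code sequence and the footage; nothing has to be attached to the file itself, which is exactly why stripping metadata or re-encoding does not defeat it.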
File-based watermarks and metadata have never solved this. They depend on compliant software and can be stripped, re-encoded, or never added. A light-borne signature raises the bar in settings where truth matters most, such as interviews, debates, press briefings, and courtroom recordings. It does not stop every attack, and it will not fix content that was never lit with the system, but it shifts trust earlier in the chain and makes convincing edits costlier and slower to produce.

Where this could land next

The practical upside is that you do not need to replace cameras. You retrofit lights. A postage-stamp-sized controller can live inside a studio panel, a conference room downlight, or a stage fixture. Newsrooms can light sets with coded patterns that look normal on air. Event organisers can enable coded lighting for high-stakes appearances without changing run sheets. Fact checkers can ask sources to supply a short verification clip alongside raw footage, which speeds reviews and reduces guesswork. Standards bodies can define open keys and audit trails so that verification scales beyond a single lab and works across vendors.

None of this is a silver bullet. Lighting can drift. Keys can leak. Outdoor scenes are harder to control, and the method needs care around skin tone rendering and flicker. The Cornell team frames it as a layer, not a lock. Pair it with provenance logs, capture-time attestations, and robust forensic models, and you get defence in depth that prioritises trust at the moment of recording rather than a late scramble after a video goes viral. In a year when election content will be tested by cheap synthesis, a watermark carried by photons is a refreshingly simple way to make fakes work harder and truth easier to prove.


India.com
2 days ago
- India.com
From 10 feet away, AI can now listen to your phone calls using..., researchers say their goal is to make people...; Is our privacy in danger?
There are many software tools around the world that can detect what people are talking about on the phone. Cases of phone tapping are common, and the Pegasus spyware controversy made headlines a few years ago. But now, technology has advanced so much that you don't even need spyware to listen to someone's phone conversation. With the help of AI, conversations can be picked up from as far as 10 feet away, just by detecting the vibrations from the phone. This has raised serious concerns about privacy.


Time of India
22-07-2025
- Time of India
MVA govt collapsed as MLAs honey-trapped with hidden cameras, Pegasus-like system: Sena (UBT)
Hidden cameras and a Pegasus-like surveillance system were used to honeytrap Maha Vikas Aghadi (MVA) MLAs and MPs, leading to the collapse of the Uddhav Thackeray-led government in 2022, the Shiv Sena (UBT) claimed on Tuesday. A Saamana editorial stated that some MLAs of the undivided Shiv Sena and the NCP switched loyalties due to pressure from Central agencies. At least 18 MLAs and four MPs were "honeytrapped", prompting them to join hands with the BJP to save their image.

It further said Congress leader Vijay Wadettiwar had alleged that MPs and MLAs were blackmailed, and as a former leader of the opposition, his remarks must be taken seriously. "Hidden cameras and a Pegasus-like system from Israel were used (for surveillance) to its fullest extent. It is now clear that the MVA government collapsed due to this (honey) trapping," the editorial said. Pegasus is spyware developed by Israeli cyber-arms company NSO Group.

The BJP had a system to honeytrap, and even policemen conducted surveillance on the opposition, the Saamana said. When a pen drive containing evidence of honeytrapping, which involved Shiv Sena MPs and MLAs, was handed over to Eknath Shinde, they set out on their journey to Surat, Guwahati and then to Goa. This is all like a suspense thriller, the editorial said. The Sena (UBT) was alluding to the split engineered by Shinde in the undivided Shiv Sena in 2022. The Saamana stated that Shinde initially lacked the numerical strength and had the backing of only nine or ten MLAs at the time. However, people from the home department and the then leader of the opposition, Devendra Fadnavis, blackmailed MPs and MLAs. The editorial stated that Shiv Sena ministers Sanjay Shirsat, Yogesh Kadam and Dada Bhuse, and their NCP colleague Manik Kokate should be sacked from the state cabinet. Some ministers were honeytrapped, and they will also have to go. The conduct of some ministers indicates that a cabinet reshuffle is on the cards in Maharashtra, the Saamana added.