
Latest news with #AIPRM

A call that sounds like a loved one? Decoding the dangers of AI voice cloning scams

Indian Express

23-05-2025



On a quiet Delhi afternoon, Laxmi Chand Chawla's phone rang. Since the call was from an unknown number, Chawla was reluctant, yet he answered. On the other end of the line, a man claiming to be a police officer told Chawla that his nephew, Kapil, had been taken into custody over a sexual assault case. 'We are putting Kapil on the line,' the officer said, and moments later Chawla could hear the panicked, frail voice of his nephew. Kapil said he was innocent and did not know what to do. The officer took over the call and told Chawla that they could hush up the matter by paying Rs 70,000 to the other party for not pressing charges.

Chawla and his wife, Santosh, managed to collect around Rs 50,000 and transferred the money in a bid to help their nephew. At the same time, they tried reaching Kapil's parents, but they were unavailable. Later, the caller rang again, this time demanding Rs 2 lakh. The couple sensed foul play and again reached out to Kapil's family. Much to their relief, they found that Kapil was safe at home, unaware of any police case or phone call. The couple realised they had been scammed: the callers had cloned Kapil's voice to dupe his family.

AI voice cloning is becoming increasingly common, with many bad actors using it to scam unsuspecting users. These scammers are not only stealing money but also manipulating people's fears and vulnerabilities through advanced technology.

In a similar case, Mumbai resident KT Vinod got a call from someone claiming to be from the Indian Embassy in Dubai. Seconds later, Vinod could hear the cries of his son Amit, who sounded scared. 'Please, bail me out,' Amit said. Fearing for his son, Vinod did not hesitate. The caller insisted that Vinod pay Rs 80,000 immediately. Vinod paid the sum and only realised it was a scam after finding that his son was safe at home. The voice he heard on the call had been generated using AI to sound like Amit.
Even though Vinod reported the incident, the emotional toll persists.

AI voice cloning has gained momentum in the last few years; today, there are 23,000 monthly searches for AI voice cloning. According to AIPRM, a company specialising in AI prompts, AI voice cloning was among the fastest-growing scams of 2024, and 70 per cent of adults are not confident they could distinguish a cloned voice from the real one. Perhaps this explains the spate of AI voice cloning scams in recent times.

'Scammers need just three seconds of audio to clone a person's voice and use it for a scam call,' said Christoph C Cemper, founder of AIPRM.

'Even something as simple as repeatedly saying 'hello' during a blank call can give scammers enough data to replicate your voice. It is that easy and that dangerous,' opines Sagar Vishnoi, co-founder of Future Shift Labs and an AI and cybersecurity expert.

🎯 The caller will typically claim to be a friend, family member, colleague, or someone you know. Ask the caller a question that only they would know the answer to, or create a secret phrase that only you and the caller would know. If they cannot answer correctly, it is likely a scammer.

🎯 If you only hear your friend or loved one's voice for a brief period, it could be a warning sign, as scammers often use the voice clone briefly, knowing that the longer it is used, the higher the risk of the receiver catching on.

🎯 A call from an unknown number can be a strong indication of a scam, as AI voice scams often use unknown numbers to make unsolicited calls. If the caller claims to be a company or someone you know, hang up and dial them back on a known number, either from your contact list or the company's official website.

🎯 Be mindful of what you share. Avoid sending voice notes or personal videos to strangers online, because once your voice is out there, it is incredibly easy to misuse.

'Whenever you receive a suspicious call, stay calm. Talk to your family about a secret code or phrase only you know. It's one of the simplest ways to stay a step ahead of voice-based scams,' says Vishnoi.

'AI scams have seen a huge rise in recent years, but 2025 may prove to be the most dangerous year yet, with developments in AI and scammers' tactics growing more sophisticated. As a result, understanding how to detect and avoid falling victim to these scams is crucial to prevent fraud and financial loss. It is crucial to follow the above advice and take caution if you receive any unexpected calls or texts that seem too 'urgent' or don't feel right. However, some people will unfortunately be caught out by fraudsters,' Cemper added.

🎯 Register a complaint: Report the scam to a government agency dealing with scams and cybercrime. Register a complaint on the National Cyber Crime Reporting Portal or call the 1930 helpline, providing as much information as possible about the scam.

🎯 Halt transactions: Freeze your bank cards immediately; this quick, essential step ensures scammers cannot access your financial accounts or apply for loans in your name.

🎯 Change passwords: Change your passwords, especially if you reuse the same password across multiple accounts, and make sure they are all unique and strong. It is also a good idea to enable two-factor or multi-factor authentication to add extra layers of security.

🎯 Report AI scams: It is crucial to report AI scams, even if you feel embarrassed or think the amount is too small to warrant action. No matter how big or small the scam, reporting it helps not only you but also contributes to building data on scams, which allows authorities to take action against fraudsters.

'These scams work not because people are careless but because the emotional weight of hearing a loved one in distress can override logic. Scammers are exploiting this with increasing precision. AI tools can now recreate voice tone, emotion, and even pauses with frightening accuracy, and they only need a short clip to do it. This changes how we think about trust and verification. From everyday people to businesses, we need new rules of engagement. Simple callbacks, identity confirmation, and extra checks may feel inconvenient, but they'll become essential,' suggests Apurv Agrawal, co-founder and CEO of SquadStack.

According to Agrawal, better detection tools and safeguards in critical workflows are a must, along with greater public awareness of AI-generated voices and impersonation scams. 'In particular, companies providing AI-driven customer experience solutions must recognise that the key to combating voice scams lies in real-time detection and multi-layered identity checks that go beyond traditional authentication. AI gives us powerful tools, but it also lowers the cost of deception. We need to keep up not just technologically, but emotionally and socially too.'

The Safe Side: As the world evolves, the digital landscape does too, bringing new opportunities and new risks. Scammers are becoming more sophisticated, exploiting vulnerabilities to their advantage. In our special feature series, we delve into the latest cybercrime trends and provide practical tips to help you stay informed, secure, and vigilant online.

Warning to parents over 'sharenting' social media trend putting children at risk of danger

Daily Record

20-05-2025



Social media is a great way to connect, but oversharing can be a risk for all involved.

Social media is inescapable in this day and age: around 1.3 billion photos are shared on Instagram every day, and there are around 600 million monthly users on X. But sharing pictures and information online can be dangerous, particularly with the rise of artificial intelligence. It is vital to be careful about what you post online, particularly if you have children, and AI experts at AIPRM have shared some information on why you should avoid oversharing on social media. Here are five risks associated with posting pictures and information online, and what you can do to keep yourself and your children safe.

1. Dangers of oversharing about your child

Often known as 'sharenting', it is becoming increasingly common for parents to share screeds of information about their children via social media. But Christoph Cemper of AIPRM advises parents to be vigilant: 'Cyber crime is rising rapidly, and exposing your child's details online could see them fall victim to fraud or other crimes.

'If you want to share images or details of your child's activities, it is always best to do this in private group chats with people you trust. I would always advise avoiding sharing this information on social media, due to the risks involved.'

2. Identity theft

All social media pages and accounts are a gold mine of personal information for criminals: names, dates of birth, home locations, places of work, and even the details of our family, friends and kids.

Cemper advises: 'The host of readily available personal information on social media has made it even easier for criminals to carry out identity theft, and with the rise of AI's capabilities, this is even quicker to do.

'By combining real data with fabricated details, AI can be used to generate realistic IDs, official documents, or utility bills. This makes identity theft much harder to detect.

'That's why it's crucial to think carefully about what you share online - not just to protect your own privacy, but also the security of your family and friends, who could be targeted by a fake version of you.'

3. Social engineering and financial scams

Social engineering often involves attackers creating fraudulent social media profiles and using them to impersonate a trusted or legitimate individual or organisation. Through this, they can psychologically manipulate victims into sharing information or clicking on unsafe links that contain malware or scams.

Cemper states: 'Always be cautious when engaging with people on social media, and if they claim to be someone you know, be sure to ask them personal questions that only they would know the answer to.'

4. Deepfakes and voice cloning

Scammers need just three seconds of audio to clone a person's voice, and with so many videos available on social media, it is becoming even easier for criminals to generate voice clones. Similarly, the wide array of selfies on social media has fuelled AI-powered scams. From a single image of your face, fraudsters can use AI to create a full photo, complete with a natural-looking background and other elements that appear authentic.

'Deepfakes are becoming widespread, and with our voices and images often available on social media, millions are at risk of becoming victims,' says the expert. 'To protect yourself, make sure that those you follow on social media and allow to view your content are people you know and trust. Having a publicly open profile increases your risk of being targeted.

'If you think you have been a victim of a deepfake, contact your local fraud centre as soon as possible to report it.'

5. Reputation damage

Oversharing on social media also carries the risk of personal or professional reputation damage. Engaging with the wrong things on social media can create a negative image of the user.
The AI expert says: 'Many people have fallen into the trap of engaging with this content, whether it be an AI-generated image or a fake article.

'Always fact-check any news or sources you see on social media via trusted and reputable sites, as unverified online information could be fake, and may even include fraudulent links.'
