Latest news with #DavidMaimon


WIRED
04-06-2025
- Business
- WIRED
Deepfake Scams Are Distorting Reality Itself
Jun 4, 2025 6:00 AM

The easy access that scammers have to sophisticated AI tools means everything from emails to video calls can't be trusted.

Imagine you meet someone new. Be it on a dating app or social media, you chance across each other online and get to talking. They're genuine and relatable, so you quickly take it out of the DMs to a platform like Telegram or WhatsApp. You exchange photos and even video call each other. You start to get comfortable. Then, suddenly, they bring up money. They need you to cover the cost of their Wi-Fi access, maybe. Or they're trying out this new cryptocurrency. You should really get in on it early! And then, only after it's too late, you realize that the person you were talking to was in fact not real at all. They were a real-time AI-generated deepfake hiding the face of someone running a scam.

This scenario might sound too dystopian or science-fictional to be true, but it has happened to countless people already. With the spike in the capabilities of generative AI over the past few years, scammers can now create realistic fake faces and voices to mask their own in real time. And experts warn that those deepfakes can supercharge a dizzying variety of online scams, from romance to employment to tax fraud.

David Maimon, the head of fraud insights at identity verification firm SentiLink and a professor of criminology at Georgia State University, has been tracking the evolution of AI romance scams and other kinds of AI fraud for the past six years. 'We're seeing a dramatic increase in the volume of deepfakes, especially in comparison to 2023 and 2024,' Maimon says. 'It wasn't a whole lot. We're talking about maybe four or five a month,' he says. 'Now, we're seeing hundreds of these on a monthly basis across the board, which is mind-boggling.'

Deepfakes are already being used in a variety of online scams. One finance worker in Hong Kong, for example, paid $25 million to a scammer posing as the company's chief financial officer in a deepfaked video call. Some deepfake scammers have even posted instructional videos on YouTube, which carry a disclaimer that they are for 'pranks and educational purposes only.' Those videos usually open with a romance scam call, in which an AI-generated handsome young man talks to an older woman.

More traditional deepfakes—such as a pre-rendered video of a celebrity or politician, rather than a live fake—have also become more prevalent. Last year, a retiree in New Zealand lost around $133,000 to a cryptocurrency investment scam after seeing a Facebook advertisement featuring a deepfake of the country's prime minister encouraging people to buy in.

Maimon says SentiLink has started to see deepfakes used to create bank accounts in order to lease an apartment or commit tax refund fraud. He says an increasing number of companies have also seen deepfakes in video job interviews. 'Anything that requires folks to be online and which supports the opportunity of swapping faces with someone—that will be available and open for fraud to take advantage of,' Maimon says.

Part of the reason for this increase is that the barriers to creating deepfakes are getting lower. There are many easily accessible AI tools that can generate realistic faces, and many more that can animate those faces or create full-length videos out of them. Scammers often use images and videos of real people, deepfaked to slightly change their faces or alter what they're saying, to target their loved ones or hijack their public influence.
Matt Groh, a professor of management at Northwestern University who researches people's ability to detect deepfakes, says that point-and-click generative AI tools make it much easier to make small, believable changes to already-existing media. 'If there's an image of you on the internet, that would be enough to manipulate a face to look like it's saying something that you haven't said before or doing something you haven't done before,' Groh says.

It's not just fake video that you need to be worried about. With a few clips of audio, it's also possible to make a believable copy of somebody's voice. One study in 2023 found that humans failed to detect deepfake audio more than a quarter of the time. 'Just a single image and five seconds of audio online mean that it's definitely possible for a scammer to make some kind of realistic deepfake of you,' Groh says.

Deepfakes are becoming more pervasive in contexts other than outright scams. Social media has been flooded over the past year with AI-generated 'influencers' stealing content from adult creators by deepfaking new faces onto their bodies and monetizing the resulting videos. Deepfakes have even bled over into geopolitics, like when the mayors of multiple European capital cities held video calls with a fake version of the mayor of Kyiv, Ukraine. People have started using deepfakes for personal reasons, like bringing back a dead relative or creating an avatar of a victim to testify in court.

So, if deepfakes are everywhere, how do you spot one? The answer is not technology. A number of technology companies, including OpenAI, have launched deepfake detection tools. Researchers have also proposed mechanisms to detect deepfakes based on things like light reflected in a person's eyes or inconsistent facial movements, and have started investigating how to implement them in real time. But those models often cannot reliably detect different kinds of AI fakes. OpenAI's model, for example, is designed to flag only content generated with the company's own DALL-E 3 tool, not content from other image generation models. There's also the risk that scammers can abuse AI detectors by repeatedly tweaking their content until it fools the software. 'The major thing we have to understand is that the technology we have right now is not good enough to detect those deepfakes,' Maimon says. 'We're still very much behind.'

For now, as video deepfakes become more common, the best way to detect one still relies on humans. Studies on deepfake detection show that people are best at distinguishing whether videos are real or fake, as opposed to audio-only or text content, and are in some cases even better than leading detection models. Groh's team conducted one study which found that taking more time to determine whether an image was real or fake led to a significant increase in accuracy, by up to eight percentage points for just 10 seconds of viewing time. 'This sounds almost so simple,' Groh says. 'But if you spend just a couple extra seconds, that leads to way higher rates of being able to distinguish an image as real or fake. One of the ways for any regular person to just be a little bit less susceptible to a scam is to ask, "Does this look actually real?" And if you just do that for a few extra seconds, we're all going to be a little bit better off.'

Deepfakes' popularity could be a double-edged sword for scammers, Groh says. The more widespread they are, the more people will be familiar with them and know what to look for. That familiarity has paid off in some cases.
Last summer, a Ferrari executive received a call from someone claiming to be the CEO of the company. The person convincingly emulated the CEO's voice but abruptly hung up the call when the executive tried to verify their identity by asking what book the CEO had recommended just days earlier. The CEO of WPP, the world's biggest advertising agency, was also unsuccessfully targeted by a similar deepfake scam. 'I think there's a balancing act going on,' Groh says. 'We definitely have technology today that is generally hard for people to identify. But at the same time, once you know that there's a point-and-click tool that allows you to transform one element into something else, everyone becomes a lot more skeptical.'
Yahoo
12-05-2025
- Business
- Yahoo
People's identities for sale; prices start at just $1
People's identities are up for sale. In some cases, prices start at just $1. We look at how this is happening and what you can do to protect yourself today on News Center 7 Daybreak from 4:25 a.m. until 7 a.m.

Prices start at $1 on online marketplaces to buy a Social Security number and more. Justin Gray, from our sister station WSB-TV in Atlanta, reports he has even seen a spreadsheet of identities released for free after a company refused to pay a ransom in a data breach.

'You don't have to be very skilled. You just need to know where to look,' said Professor David Maimon.

We will update this story.
Yahoo
24-03-2025
- Business
- Yahoo
Social Security retirement accounts are being sold online – What you need to know
An identity theft researcher at Georgia State University has found Social Security retirement accounts for sale online. 'They take over the SSA accounts. Then they change the account details on the Social Security Administration website, and then they funnel the payments,' said GSU Professor David Maimon.

Maimon showed Channel 2 consumer investigator Justin Gray a video an identity thief posted bragging about his access to a retiree's account. 'You can see the name, date and the balance,' Maimon said. 'The whole point of the scam is to try to take over those individuals' Social Security payments.'

The Trump administration said new changes it is making this month to require more in-person identity verification are designed to prevent just this type of fraud. 'Americans deserve to have their Social Security records protected with the utmost integrity and vigilance,' said Lee Dudek, Acting Commissioner of Social Security. 'For far too long, the agency has used antiquated methods for proving identity. Social Security can better protect Americans while expediting service.'

Under the new SSA policy, being rolled out over the next two weeks, beneficiaries will no longer be able to verify their identity by telephone. People looking to claim benefits or change direct deposit information who cannot use their personal 'My Social Security' account, which requires online identity proofing, will need to visit a local Social Security office to prove their identity in person.

Former Social Security Commissioner Martin O'Malley counters that staff cuts by the new SSA leadership will hurt, not help, efforts to protect against fraud. The SSA plans to cut 7,000 jobs even though its current staffing is already at a 50-year low. 'In one day, on one Friday, the entire leadership of the Cyber Security office left. What the public will see is longer and longer and longer and longer waiting times for everything,' O'Malley said.

Maimon told Gray that his team started seeing an increase in SSA accounts for sale before the change in administration, and that it has continued since. 'Usually, we see things rising and falling, and because of the absence of control, criminals and fraudsters thrive in times of crisis and times of change,' Maimon said.

Another step the SSA says it is taking to limit fraud is using the Treasury Department's Account Verification Service for any direct deposit changes. The Treasury system is designed to verify the existence, status, and ownership of bank accounts before making payments, reducing the risk of improper or fraudulent payments.