
The cursed world of AI kiss and hug apps
Doomscroll on TikTok long enough, and you'll come across an ad for AI video apps. In one ad, a stereotypically nerdy girl puckishly smirks as she uploads a picture of herself and her much more handsome crush. Boom — suddenly, thanks to AI, they're smooching. In another, I'm shown a woman in a blouse and jeans. Do I want to know what she looks like in a blue bikini? Psst. There's an app for that. The ad then shows me the woman in said blue bikini.
These apps aren't peddling the digital nudes many people associate with AI deepfakes, which are proliferating in their own right on app stores. Slapped together by opportunistic developers and sprinkled with subscription fees and microtransactions, they're all pitching tools to help you make benign fantasies a bit more tangible — but the results feel more cursed than magical.
AI video ads link out to apps with titles like Boom.AI, VideoAI, and DreamVid, made by companies you've probably never heard of — a short perusal of Apple's App Store brings up roughly two dozen options. Despite their titillating promotional material, they feature plenty of innocuous video templates. By uploading one or two photos and hitting a 'generate' button, you can change your hair color, hold up a sign, or hug Goku from Dragon Ball Z. But for every one of those, there are several other subtly disturbing or sort of gross ones. In the DreamVid app, there's an Enhance option that lets you give a person bigger breasts. In the preview, a blonde with a B cup is shown getting an automatic boob job, smiling playfully as she jiggles her new DD size. The AI Dancing category in the same app has scantily clad women suggestively swaying their hips.
It's a mix that feels calculated. Just when you think there are too many bikinis and breasts, you'll see templates featuring cuddly AI cats, Studio Ghibli-style filters, and wholesome grandmas to hug. At the same time, when you look at DreamVid's AI outfit-of-the-day option, six of 12 outfits are some form of bikini or bathing suit. The rest include skimpy maid outfits, lingerie, a schoolgirl uniform, and gothic lolita cosplay. Only the wedding dress and cheongsam are relatively benign. None of them are aimed at creating pictures of men.
In the ads, the videos generated are in that hazy category of 'real enough' to make you uncomfortable yet curious enough to download. Try it yourself and you'll see the telltale AI cracks appear. Kissing looks awkward — like how a toddler imagines kissing, faces and lips rhythmically smooshing together. (The few that attempt French kissing prove AI really doesn't know what to do with tongues yet.) Hugs look stiff, with dubious limb and hand placements. If the photos don't line up, hilarious zoom effects ensue as AI tries to match up bodies. Clothing, hair, accessories, and facial features often morph in and out of existence mid-video.
AI systems have a long-standing racial bias issue, and pairing up subjects of different races seems to confuse these apps. My non-Asian celebrity crushes sometimes spontaneously developed Asian features when I joined them in a video. Other times, the app morphed my features into more Eurocentric ones to match my spouse. I don't know whether to laugh or cry that multiple AI apps insist that kissing parties should generally be the same race. I do, however, feel insulted when it generates a video of my spouse proposing to me — but has them turn away and propose to a random, spontaneously appearing white woman instead.
None of this comes for free. The majority of apps charge microtransaction fees and subscriptions that range from $2.99 to $7.99 per week or $49.99 to $69.99 annually, providing limited credits that you can spend to generate videos. It's a financial model similar to that of AI nudes apps, even if the content is different.
If you're curious about where those funds are going, one deep dive into the Videa: AI Video Maker app traced its origins to a company called Pure Yazlim Limited Sirketi that's based out of Istanbul, Turkey. Boom.AI is run by a company called NineG, which describes itself as 'non-gaming app publishing' on its barebones website. Its app store listing also touts the Mozart AI song generator, art generator Plum AI, an AI font creator, and, randomly, Reel TV — a Quibi-esque app for short dramas. DreamVid is run by Shenzhen iMyFone Technology Co., Ltd., which also offers a suite of what seem to be productivity and utility apps, plus a Studio Ghibli-style generator. The Verge reached out to both NineG and iMyFone but didn't receive a response.
In exchange, you get something infinitely simpler and more permissive than all-purpose video generators like OpenAI's Sora. You can theoretically produce a kiss on Sora, but only after crafting a text prompt describing what you want, uploading photos for the tool to work with, and clicking through pop-ups asking if you're over 18 and have consent to use the material you're uploading — and even then, Sora flagged me smooching Edward Cullen as a potential policy violation. Google's Veo is much the same. I tried the Edward Cullen kiss test, and Veo refused, saying it rejects prompts that are sexually suggestive, depict nonconsensual acts, or promote harmful stereotypes. On these other apps, you don't even need to come up with the idea — just upload a couple of pictures, and the system will deliver what you want.
Simple apps for creating deepfaked nudes have produced numerous instances of clear harm, including widespread harassment of women and teen girls. Some of these incidents have led to lawsuits and arrests. There are also legal efforts to crack down on AI-generated nudes and unauthorized 'digital replicas' of real people, including the recently signed Take It Down Act, the No Fakes Act, and a bill passed by the New York State Senate.
These apps are unlikely to fall under the purview of anti-deepfake porn laws, though the frequent appearances of celebrities — Boom.AI offered templates that let you make out with both Robert Pattinson as Edward Cullen and Timothée Chalamet — make their status under digital replica rules shakier. For now, they sit in a murky zone between app store and platform moderation policies. Major tech companies have lagged on removing even sexually explicit AI generators, and the status of anything milder on their platforms seems nebulous.
Google spokesperson Danielle Cohen tells The Verge that the Google Play Store doesn't allow apps that contain content or services that could be intended as sexually gratifying, and companies aren't allowed to use sexually explicit ads (including AI-generated ones) to direct people to their Play Store listings.
Apple's App Store guidelines state apps shouldn't contain content that is 'offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy.' Provided examples include 'mean-spirited' content, as well as 'explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings.' There are no rules about ads for these apps.
I sent Meta an example of an ad for a kiss and hug AI app I found on Instagram Reels. In response, Meta spokesperson Faith Eischen told The Verge, 'We have clear rules against nudity and sexual exploitation, including non-consensual intimate imagery — both real and AI-generated — and we've removed the shared piece of content for breaking our rules.' Eischen also noted that Meta removes such ads when notified, disables accounts responsible for them, and blocks links to sites hosting such apps.
The Verge reached out to TikTok about its policies but didn't receive a response.
Creating sexually charged images of celebrities is fraught, but it overlaps with the existing territory of fan art and meme-ification. Many of these apps' functions, though, tread in more uncomfortable territory. Deepfaking yourself kissing someone might not be overtly pornographic, but it's creepy — and it would be even creepier to do it to a friend or acquaintance who didn't consent to it. It's also not really clear what the average user is looking for: most reviews simply complain about the microtransactions.
Moderating this sort of content is kind of like whack-a-mole. Boom.AI had plenty of 'use AI to kiss your crush' ads several weeks ago. Now, all the ones I bookmarked have disappeared from social media. Within the app itself, I can no longer generate any kind of kissing video. Instead, the app moved on to ads of a suburban mom twerking, before they, too, were subsequently removed.
Experimenting with AI video apps wasn't always creepy. Few people would object if everyone were using them to generate heartwarming videos of kids hugging their grandparents; you could argue that it's weird to want to do this, but it's not inherently wrong or illegal.
But the fun or arguably helpful use cases are mixed in almost inextricably with the creepy stuff. Changing my hair is a pretty unobjectionable process, but it's unsettling to swap my own face onto a model 'dancing' while wearing cat ears, a plunging crop top that shows off her midriff and bra, hot pants, and lacy garters. (Leonardo DiCaprio's face on the model is perhaps less disturbing than simply unhinged.) Conversely, I've had genderqueer friends say they privately used AI templates that let them see what they'd look like as a different gender, and it helped them figure out their feelings. Even the kissing templates could have fairly innocuous uses — you could be a fiction writer seeking inspiration for a romance novel. In that case, what's the difference between drawing your own fan art and using an AI video generator? Perhaps, you're trying to process something and need a little visual help — and that's how I ended up deepfaking my dead parents.
In a plot stolen straight from The Farewell, my mom died before my grandmother, and my family decided not to tell her out of fear she'd drop dead from shock. But whereas that film dealt in regular white lies, my family decided to update its deception for the modern era. When my grandma started lamenting that my mom had stopped calling, a cousin asked me if there was any chance that I, a tech reporter, could use AI to create video messages of my mother. That would, my cousin said, give my dementia-addled grandma some sense of peace. At the time, I told her it wasn't possible.
Three years later, I finally generated the deepfake she requested while testing these apps. It was eerie how much it looked like my mom, except when she smiled. My real mother was self-conscious of her underbite. AI mom's teeth were perfect. All I could see were the ways that AI had failed to capture my mother's essence. I thought my cousin would feel the same way. Instead, the text I got in response was four hearts interspersed with several exclamation marks and crying face emojis. For her, the horrible deepfake was comforting. My mom would've hated this AI version of herself, and yet in the days after creating it, I found myself replaying it over and over — if only because spotting what the AI got wrong reminded me that I hadn't forgotten the real her.
After that, I deepfaked my dad hugging me at my wedding. Some little girls dream of their fathers walking them down the aisle. Mine died before that day ever came, and I didn't make it to his deathbed in time for a proper goodbye. I wondered if deepfaking dad would give me a sense of closure. I used the last good photo I had of him, taken a few days before he passed, and a solo photo of me from my wedding.
The AI did a horrible job. For one, it interpreted my dad's beanie as a thick shock of black hair. In my family, we teased him for his thin combover and fivehead — which, in his broken English, he insisted was proof he was a true 'egghead.' I tried again and got a slightly better result. Still, the pattern on his sweater changed. His facial features morphed into someone who looked close, but ultimately wasn't my dad. Even so, it made me cry. The AI got so many things wrong, but it was good enough to sketch the shape of my longing. This, too, I sent to my cousin, who replied with even more crying emojis.
AI evangelists tout this as a positive use case for AI. Wouldn't it be nice to reanimate your dead loved ones? Before deepfaking my parents, I'd have scoffed and said this is a dystopian premise that denies the humanity of our mortality. But all I can say now is that grief is a strange beast. I'd be lying if I said that I found comfort in these deepfakes, but I can't deny that a part of me was moved. I'm also no longer inclined to describe this as a bad way to use AI; it's just weird.
Perhaps the question isn't whether these apps are inherently harmful or what platforms should do when they appear. Maybe it's a matter of asking what we're hoping to see of ourselves reflected in them.
