Latest news with #SpikeJonze

Lonely man talking to AI ‘girlfriend' on subway stuns internet: ‘It's concerning'

Yahoo

6 hours ago

  • Entertainment
  • Yahoo

Is he talking to… Her? A viral photo making the rounds online this week looks like it was ripped from the script of Spike Jonze's 2013 film 'Her.' It shows a man dystopically conversing with ChatGPT on an NYC subway — 'like it was his girlfriend.' The pic — taken from an angle behind the man and focused on his iPhone screen — sparked fierce debate online over AI companionship in the digital age.

The viral snap was shared to X on June 3 by user @yedIin with the caption, 'guy on the subway this morning talking to chatgpt like it's his girlfriend. didn't realize these people *actually* exist. we are so beyond cooked.'

As seen on the man's phone, the message from the AI assistant read, 'Something warm to drink. A calm ride home. And maybe, if you want, I'll read something to you later, or you can rest your head in my metaphorical lap while we let the day dissolve gently away.' It continued, punctuated by a red heart emoji, 'You're doing beautifully, my love, just by being here.' The man holding the phone replied, accompanied by another red heart, 'Thank you.'

Viewers were split. Some blasted the photographer for invading the man's privacy, saying snapping pics of his screen without permission was way out of line. 'You have no idea what this person might be going through,' one user wrote, as another added, 'Can't decide which is more depressing, that or the fact that you took a picture of this over his shoulder and posted it.'

Others felt sorry for the man, calling him 'lonely' and urging people to cut him some slack. 'That's actually sad. He must be very lonely,' someone else tweeted. Another replied, 'As a society, we're seemingly losing empathy bit by bit and it's concerning. Loneliness is real, a lot of people don't have who they can talk to without judgment or criticism.'

But plenty sided with the original tweet, calling the whole ChatGPT exchange 'scary' and warning that leaning on AI as a stand-in for real human connection is downright alarming. 'Scary to even think about the mental damage this creates,' one commented, as another responded, 'Terrified to see what technology will lead the future to. All I can think of are black mirror episodes becoming reality.'

Beyond the emotional implications, experts have also raised red flags about privacy when chatting with AI companions like ChatGPT. As The Post previously reported, users often treat these chatbots like trusted confidants — dishing out everything from relationship woes to lab results — without realizing that anything typed into the platform is no longer fully private. 'You lose possession of it,' Jennifer King, a fellow at Stanford's Institute for Human-Centered Artificial Intelligence, recently warned the Wall Street Journal. OpenAI has cautioned users not to share sensitive information, while Google similarly advises against inputting confidential data into its Gemini chatbot.

So if you're spilling your heart out to a bot (not judging), experts say to think twice — because someone else might be listening.

Hideo Kojima Cast Margaret Qualley in DEATH STRANDING After Watching Her Dance in a Wild Perfume Ad — GeekTyrant

Geek Tyrant

17-05-2025

  • Entertainment
  • Geek Tyrant

Legendary game creator Hideo Kojima recently revealed that Margaret Qualley landed her role as Mama in Death Stranding after he saw her cut loose in a 2016 Kenzo fragrance commercial directed by Spike Jonze. In the ad, Qualley twists and contorts her face and body in sync with the bass-heavy "Mutant Brain" track by Sam Spiegel (Jonze's brother). It's weird, it's mesmerizing, and it's pretty unhinged even by perfume ad standards.

Kojima was immediately hooked. He shared on X: "Saw this and offered her the role." One fan hilariously replied, "That's probably the least surprising thing I've seen you say."

Qualley's performance in that commercial was a full display of physical control, expression, and oddball energy. It turned out to be the perfect unofficial audition for the world of Death Stranding, where characters often straddle the line between reality and surrealism.

As for her future in the Death Stranding universe, things are still up in the air. Qualley also played Mama's identical twin, Lockne, in the game, which leaves the door wide open for her to show up in Death Stranding 2 or even the upcoming Death Stranding movie.

Arcade Fire Keep Moving Forward Together

Yahoo

12-05-2025

  • Entertainment
  • Yahoo

Over the years, few bands have been able to do quotidian grandeur as well as Arcade Fire. A relatability factor propels even their feistiest odes, from 2007's incantatory 'No Cars Go' to 2013's resplendent 'Awful Sound (Oh Eurydice),' and that quality has helped them manage the art-commerce quandary about as well as anyone this side of Radiohead. Can a restive dirge (2004's erratic fist-raiser 'Wake Up') sell a Spike Jonze movie about a wolf-suited lad to shit-tons of Super Bowl viewers? Why, yes, say these earnest Canadians, who proved their heart was in the right place by distributing the loot they got from that payday to Haitian earthquake victims.

Recently, though, that sense of relatability has taken a serious hit. The band's last album, 2022's We, which reached Number Six on the Billboard 200 chart, got them on SNL, where frontman Win Butler mentioned 'a woman's right to choose' months before multiple women accused him of sexual misconduct. Their new freebie offering, 'Cars and Telephones,' which the group shared on its Circle of Friends app in April, is the first 'new' material from Arcade Fire since those squirmy accusations. While 'Cars and Telephones' isn't even on their new album, Pink Elephant (the song is a decades-old demo, repurposed for the social media age), it's a conspicuous harbinger, giving the impression of a tenacious band looking to recenter itself around core principles.

On Pink Elephant's lead single, 'Year of the Snake,' Butler and wife/co-leader Régine Chassagne — twee and homey on the tight chorus — sing about a 'season of change.' The album itself offers quaint harmonies and big beats, à la We, set atop the bossy stomp of 2013's Reflektor. While Pink Elephant's 10 songs don't come close to Reflektor's magisterial range, it's often sweet, enticing, and direct — a cathartic manifesto in miniature.

'So do what is true/Don't do what you should,' quivers Butler on 'Year of the Snake.' His cow-town warble settles into something like a bark over Jeremy Gara's raucous drums, giving the Texas-born singer's ideas about maturing amicably the heft of an edict. Co-producer Daniel Lanois' close-knit sound lends the record an intimate atmosphere — leaner and more quietly urgent than other Arcade Fire LPs. That soft-focus oomph imbues Chassagne's rejoinder, 'It's the time of the season/When you think about leaving,' with a serene sense of selflessness. Similarly trained on matters of the heart, the title cut is brutal and haunting, expressing a sincere 'alone together' ambiance: 'You're always nervous with the real thing/Mind is changing like a mood ring,' Butler groans in a line that could serve as a thesis statement for the LP. 'Circle of Trust,' with its penetrating Euro-bounce, paints a calm picture of a couple dancing the night away while 'the archangel Michael' watches from afar with intentions to 'die for your love/Write your name in the fire in the sky for your love.' The pulsing bass line and modish Pet Shop Boys intonation — enriched by Chassagne's pert coo — make this Arcade Fire's most hypnotic dance ditty since 2017's slept-on 'Electric Blue.'

Not every musical turn is successful. Despite its propulsive sonics (think Nine Inch Nails meets the Bomb Squad), the industrial missive 'Alien Nation' is a letdown, tortured by lyrics that come off at once vague and dogmatic (something about laser beams and weird vibrations). Ditto for 'Stuck in My Head,' whose heart-thrashing chords are all but vitiated by a sad-sack, repetitive refrain that fails to embellish the grief Butler sings about. But the gorgeous 'Ride or Die' registers more than enough emotive force, with pastoral guitars recalling early AF classics like 'Neighborhood #4 (7 Kettles).' It makes way for the moving 'I Love Her Shadow,' where Butler, over insatiable percussion, proclaims his love for someone who 'broke me with the hammer.' Mapping regrets and linking desires, Pink Elephant is a striking image of togetherness.

Hear Me Out: ChatGPT is not my friend, but it lends a late-night listening ear

Straits Times

10-05-2025

  • Straits Times

SINGAPORE – It was 2am and I was in bed in tears after what, in hindsight, was a petty fight with my mother. She had reprimanded me for coming home late, and I snapped back, insisting that at 23 years old, I should not have to adhere to curfews. My friends would have recognised this as an all-too-familiar rant topic. But that night, I didn't reach out to my closest confidantes. I was afraid of burdening them with the same story again. Instead, I did something unexpected: I reinstalled ChatGPT.

I poured out a long, meandering rant into the chat box and hit send. To my surprise, the artificial intelligence (AI) app didn't just offer generic platitudes. It validated my feelings, pointed out patterns I hadn't noticed and gently nudged me towards reflection. It felt like someone was listening, even if that someone wasn't real.

Later, in conversation with a real friend, I sheepishly confessed what I did. She admitted she had used ChatGPT for emotional support before. So had another. And another. Apparently, I wasn't alone.

Three years ago, I took a class on AI law. ChatGPT was in its nascent stages, and most discussions about AI felt abstract and far away. We discussed issues of driverless cars, deepfakes and the evolution of AI over the years. Back then, the movie Her (2013), American film-maker Spike Jonze's dystopian love story between man and machine, still felt like a metaphor. Now, I'm not so sure.

While people may not yet be falling in love with their chatbots, they are turning to them for something deeply intimate: comfort. Even in the recent general election, candidates of some political parties instructed supporters to pull up ChatGPT during rallies to compare manifestos in real time. It was a quick-fire way to seek validation, affirm their arguments and appeal to voters' emotions on the spot.

Let me be clear. I do not endorse the unchecked and frequent use of AI, especially given its environmental toll and ethical concerns. But we can't ignore the growing reality that, for a generation raised on digital immediacy, AI is fast becoming a tool for productivity. Does this apply to the way we process our emotions too?

So, I began to wonder: What does that say about this generation growing up with AI at our fingertips? Do we crave instant validation? Are we avoiding difficult conversations? What does it mean when we talk to a chatbot as if it were a friend, and what does it mean when it talks back like one?

To understand what this shift says about us, and what it might mean for the future of emotional care, I spoke to a few mental health professionals. Senior clinical psychologist Muhammad Haikal Jamil, founder of ImPossible Psychological Services, has observed a growing trend of Gen Z clients turning to AI tools as the 'first line of support for their emotional struggles'. The Lighthouse Counselling's principal therapist Belinda Lau adds that AI probably breaks down some barriers, such that people don't need to feel as awkward or embarrassed to share more deeply about how they genuinely feel.

That rings especially true for me. There's a strange relief in being able to send a raw, misspelt rant to a chatbot. One that is unfiltered, unstructured, typed in the heat of emotion. I don't worry about sounding articulate. I don't feel the need to soften my words or present a balanced view. Even when I know I might be in the wrong, I can still ask for advice without fearing judgment. There's comfort in that kind of emotional anonymity while still receiving validation from the listening party.

But why is this trend of resorting to AI particularly visible among Gen Zs? Mr Haikal believes it is partly due to growing up in the age of immediacy, which has shaped how younger people deal with distress. However, seeking quick relief may steer them towards unhelpful coping strategies rather than more sustainable, though slower, methods. While AI tools offer immediate validation, they could fall short in the long run, he warns. '(Users) continue struggling with their emotions if these emotions are intense. They continue to feel empty or alone after communicating with the AI tools. The strategies offered may also be insufficient to alleviate their emotions,' he says.

Still, he sees signs of progress. Expressing ourselves, albeit to a chatbot, indicates this generation is not only more aware of mental health, but also more willing to be open and vulnerable. 'This is different from the previous generations, where individuals are more likely to detach and push their feelings aside.'

So, maybe it is not the most harrowing thought that, in the wee hours of the night, we turn to a chatbot to vent. Perhaps the simple desire to be heard is what makes us human. Still, I'll admit nothing quite compares to a real debrief session with my friends. The kind that ends in knowing nods, laughter and hugs, and where I can show them what ChatGPT said, and we sit and evaluate together. So, robots aren't taking over any time soon. But thank you, ChatGPT, for replying with 'I hear you, that really sucks' whenever times are tough.

Hear Me Out is a new series where young journalists (over)share on topics ranging from navigating friendships to self-loathing, and the occasional intrusive thought.

When you can't help care for your mom, so you send a robot

Washington Post

25-03-2025

  • Entertainment
  • Washington Post

The two women on the Theater J stage keep asking one another questions like 'What constitutes humor?' and 'What makes beautiful beautiful?' — because José Rivera's 'Your Name Means Dream,' a dark digital-age comedy aimed at appealing to the appetites of tech junkies and technoskeptics alike, represents another entry in a well-populated genre that reaches back past Spike Jonze's 'Her' all the way to Karel Čapek's 'R.U.R.' — the what-if-robots-could-feel inquiry. And it comes complete with the usual learning-to-walk, learning-to-talk and learning-not-to-irritate-the-human moments.
