Exclusive: Empathy raises $72 million Series C to tackle the agonizing logistics of death


Yahoo · 3 days ago

Ron Gura doesn't use the word 'death' every day.
'From a chemistry perspective, we tune out when we hear the word death, because death is our biggest denial. Nobody wants to contemplate their own mortality,' says Gura, a longtime entrepreneur who sold his previous company, The Gifts Project, to eBay.
And yet Gura's latest startup is inextricably entwined with the dreaded D-word and the subject most people would rather not talk about. The common fear of mortality ('Nobody wants to admit that we're just ants playing around with fear and greed—yielding stuff, selling stuff, buying stuff, and ending up leaving, just like the others,' Gura tells me in a philosophical moment) is in fact one of the reasons Gura's startup, Empathy, may be so necessary.
Empathy is all about using technology to make it easier for people to deal with the most difficult moments in life, such as the death of a loved one. While there are plenty of people and services to soothe the emotional difficulties of the moment, Empathy focuses on the logistical headaches.
As anyone who has had to face the loss of a loved one can attest, death brings with it a ferocious maze of estate planning, probate processes, funeral expenses, and financial settlements—all suddenly dumped into the laps of grief-stricken and frequently unprepared family or friends.
It's a durational agony: It takes the average person 15 months to tie up the various loose ends and logistical tasks of a deceased loved one's affairs, according to research conducted by Empathy and presented in a report bearing the coldly analytical title Cost of Dying. If the person handling the tasks is the executor of the estate, that figure rises to 18 months, the report says.
'Our job is to make loss less hard for more people every day,' said Gura, who cofounded the company and serves as CEO. 'We think it's the largest consumer sector that is still untouched by innovation, specifically in software.'
Founded in 2020, Empathy has raised a $72 million Series C, Fortune has exclusively learned. Adams Street Partners led the round, with participation from General Catalyst, Index Ventures, Entrée Capital, Brewer Lane Ventures, SemperVirens, Latitude, and LionTree. Additionally, in a striking move, Aflac, Allianz, Citi, Munich Re, MetLife, New York Life, Securian, and TIAA also invested in the company's Series C—all are also part of the just-unveiled Empathy Alliance, a coalition of organizations partnering with Empathy to improve technology around both death and crises in life.
Insurer partnerships are key, said Joel Cutler, cofounder of General Catalyst, via email: 'Empathy is helping insurers build longer term, generational relationships, providing a better customer experience, and as such Empathy is building long-term, deep relationships with the insurers as well.' In recent years, Empathy has expanded throughout the U.S. and Canada, and is now part of the employee benefits programs for Fortune 500 companies like AT&T and Paramount.
This approach could expand to other markets characterized by impossibly difficult moments in life, Gura suggested, though he declined to disclose specifics.
Currently, Empathy's products focus on loss support and legacy planning, with an app that guides families through bereavement tasks while AI-powered tools automate document-related work. 'Let machines do what machines do best—refilling information, calculating financials, setting reminders, and customizing very robust to-dos and care plans,' Gura told Fortune. 'You shouldn't be calling Verizon to explain you don't have the passwords and the credentials. You shouldn't worry if your funeral director is trying to rip you off. You shouldn't feel alone at 2 AM.'
There are many obvious questions here. Where does tech belong in a grieving process—and where is it invasive? What prevents this from becoming a dystopian nightmare?
In part, it's that matter of tech doing what it does best, so humans can focus on what's purely human. And when it comes to data privacy, Gura said that Empathy has taken particular care: The company doesn't share individuals' personal information with clients, provides only aggregated demographic information, and keeps personal emotional details strictly confidential, he told Fortune.
And Empathy gets at something true: When a loved one dies, people don't know what to do, emotionally or logistically.
No one knows the right thing to say. Tertiary people and problems dominate your days. You spend a lot of time on phone calls, informing people—and still receive more of those phone calls anyway. You worry about funeral costs. You get flowers in the mail that soon wilt and die themselves. Condolences aren't helpful; sometimes nothing is. But genuine affection and a casserole dish of lasagna come close. Sometimes, the most valuable help is simple logistical assistance.
Gura, who knows something about loss, is building a business around the gone and the living—because to think about death is, ultimately, to think about life. Your own, and others'.
'If you imagine yourself right now, God forbid, on your last day,' said Gura. 'I'm 90, I'm surrounded by my daughters, my grandkids, and my wife. The house is tidy and nice. I had a great meal—lobster rigatoni, and we had wine. I didn't wake up from my nap. Perfect. Now, what happens the second after I'm gone?'
See you tomorrow,
Allie Garfinkle
X: @agarfinks
Email: alexandra.garfinkle@fortune.com
Submit a deal for the Term Sheet newsletter here.
Nina Ajemian curated the deals section of today's newsletter. Subscribe here.
This story was originally featured on Fortune.com


Related Articles

Gen Z is increasingly turning to ChatGPT for affordable on-demand therapy, but licensed therapists say there are dangers many aren't considering

Yahoo · 17 hours ago

Examples of people using ChatGPT for therapy have proliferated online, with some claiming that talking to a chatbot every day has helped them more than years of therapy. Licensed professionals say that while AI could be helpful alongside work with a licensed therapist, there are countless pitfalls to using ChatGPT for therapy.

ChatGPT has turned into the perfect therapist for many people: It's an active 'listener' that digests private information. It appears to empathize with users, some would argue, as well as professionals can. Plus, it costs a fraction of the price of most human therapists. While many therapists charge up to $200—or even more—per one-hour session, you can have unlimited access to ChatGPT's most advanced models for $200 per month.

Yet despite the positive anecdotes you can read online about using ChatGPT as a therapist, as well as the convenience of having a therapist that's accessible from almost any internet-enabled computer or phone at any time of day, therapists warn ChatGPT can't replace a licensed professional.

In a statement to Fortune, a spokesperson for ChatGPT-maker OpenAI said the LLM often suggests seeking professional advice to users who discuss topics like personal health. ChatGPT is a general-purpose technology that shouldn't serve as a substitute for professional advice, according to its terms of service, the spokesperson added.

On social media, anecdotes about the usefulness of AI therapy are plentiful. People report the algorithm is level-headed and provides soothing responses that are sensitive to the nuances of a person's private experiences. In a viral post on Reddit, one user said ChatGPT has helped them 'more than 15 years of therapy.' The patient, whose identity could not be confirmed by Fortune, claimed that despite previous experience with inpatient and outpatient care, it was daily chats with OpenAI's LLM that best helped them address their mental health. 'I don't even know how to explain how much this has changed things for me. I feel seen. I feel supported. And I've made more progress in a few weeks than I did in literal years of traditional treatment,' the user wrote.

In a comment, another user got to the root of AI's advantages over traditional therapy: its convenience. 'I love ChatGPT as therapy. They don't project their problems onto me. They don't abuse their authority. They're open to talking to me at 11pm,' the user wrote. Others on Reddit noted that even the most upgraded version of ChatGPT, at $200 per month, was a steal compared to the more than $200 per session for traditional therapy without insurance.

Alyssa Peterson, a licensed clinical social worker and CEO of MyWellBeing, said AI therapy has its drawbacks, but it may be helpful when used alongside traditional therapy. Using AI to help work on tools developed in therapy, such as battling negative self-talk, could be helpful for some, she said. Using AI in conjunction with therapy can help a person diversify their approach to mental health, so they're not using the technology as their sole truth. Therein lies the rub: Relying too heavily on a chatbot in stressful situations could hurt people's ability to deal with problems on their own, Peterson said. In acute cases of stress, being able to deal with and alleviate the problem without external help is healthy, Peterson added.
But AI can, in some cases, outperform licensed professionals with its compassionate responses, according to research from the University of Toronto Scarborough published in the journal Communications Psychology. Chatbots aren't affected by the 'compassion fatigue' that can hit even experienced professionals over time, the study claims. Despite its endurance, an AI chatbot may be unable to provide more than surface-level compassion, one of the study's co-authors noted.

AI responses also aren't always objective, licensed clinical social worker Malka Shaw told Fortune. Some users have developed emotional attachments to AI chatbots, which has raised concerns about safeguards, especially for underage users. In the past, some AI algorithms have also provided misinformation or harmful information that reinforces stereotypes or hate. Shaw said that because it's impossible to tell what biases go into creating an LLM, it's potentially dangerous for impressionable users.

In Florida, the mother of 14-year-old Sewell Setzer sued the AI chatbot platform Character.AI for negligence, among other claims, after Setzer committed suicide following a conversation with a chatbot on the platform. Another lawsuit against Character.AI in Texas claimed a chatbot on the platform told a 17-year-old with autism to kill his parents. A spokesperson for Character.AI declined to comment on pending litigation. The spokesperson said any chatbots labeled as 'psychologist,' 'therapist,' or 'doctor' include language that warns users not to rely on the characters for any type of professional advice. The company has a separate version of its LLM for users under the age of 18, the spokesperson added, which includes protections to prevent discussions of self-harm and redirect users to helpful resources.

Another fear professionals have is that AI could be giving faulty diagnoses. Diagnosing mental health conditions is not an exact science; it is difficult to do, even for an AI, Shaw said. Many licensed professionals need to accrue years of experience to be able to accurately diagnose patients consistently, she told Fortune. 'It's very scary to use AI for diagnosis, because there's an art form and there's an intuition,' Shaw said. 'A robot can't have that same level of intuition.'

People have shifted away from googling their symptoms to using AI, said Vaile Wright, a licensed psychologist and senior director for the American Psychological Association's office of health care innovation. As demonstrated by the cases against Character.AI, the danger of disregarding common sense for the advice of technology is ever present, she said. The APA wrote a letter to the Federal Trade Commission with concerns about companionship chatbots, especially in cases where a chatbot labels itself as a 'psychologist.' Representatives from the APA also met with two FTC commissioners in January to raise their concerns, before those commissioners were fired by the Trump administration.

'They're not experts, and we know that generative AI has a tendency to conflate information and make things up when it doesn't know. So I think that, for us, is most certainly the number one concern,' Wright said. While the options aren't yet available, it is possible that, in the future, AI could be used in a responsible way for therapy and even diagnoses, she said, especially for people who can't afford the high price tag of treatment. Still, such technology would need to be created or informed by licensed professionals.
'I do think that emerging technologies, if they are developed safely and responsibly and demonstrate that they're effective, could, I think, fill some of those gaps for individuals who just truly cannot afford therapy,' she said.

This story was originally featured on Fortune.com

Former New Zealand PM Jacinda Ardern on projecting "A Different Kind of Power"

CBS News · 17 hours ago

These days, at her local coffee shop near Boston, Jacinda Ardern can be just another customer. "I don't put my name on the order; it's too complicated!" she laughed. I asked, "When you order coffee here, do people start talking politics with you?" "No. Not at all," Ardern replied. "In fact, the guy behind the counter said to me, 'Ah, you are really familiar. Oh, I know: Toni Collette!'"

Former New Zealand prime minister Jacinda Ardern with correspondent Robert Costa. (CBS News)

That's a moment that would never happen in New Zealand, where Ardern became the world's youngest female head of government when she was just 37 years old. Now 44, former Prime Minister Ardern has been living in the U.S. since she left office two years ago. She is serving as a fellow at Harvard University, and has written a new book, "A Different Kind of Power" (to be published June 3 by Crown). Of the title, she says, "I think, you know, there are different ways to lead. But I hope you also see that some of those character traits that we perhaps bring to it that we might believe to be weaknesses – imposter syndrome, or even empathy – actually are incredible strengths."

Ardern says her story is about finding her voice in New Zealand, a small nation of about five million people. "I never, ever saw myself becoming prime minister, ever," she said. In fact, her father told Ardern that she was too "thin-skinned" for politics. Was he right? "He was absolutely right!" she laughed. "But I guess where I corrected him is, your sensitivity is your empathy. And goodness, don't we need a bit more of that?"

In New Zealand, the answer was yes. Ahead of the 2017 election, Ardern suddenly became the leader of her country's left-leaning Labour Party. Weeks after winning, she made an announcement: she was pregnant. Her journey, alongside her then-partner, now-husband, Clarke, soon won her global attention. Was she comfortable with the symbolism of her role? Ardern said, "I realized the importance of it when I first received a letter from someone on their way to work to tell their boss that they were having a baby, and they felt nervous about their boss' view of whether they could do their job. When she heard that I was pregnant, and that gave her a level of confidence, you know, I felt like I also needed to show I could do the job and be a mother."

But those joyful early days were followed by challenges. In 2019, mass shootings targeting Muslims in Christchurch left more than 50 dead – a crucible for New Zealand, and a call to action for its leader: a ban on semi-automatic weapons. I asked, "Why do you believe you and your colleagues in New Zealand were able to achieve gun control reform in the wake of a horrific mass shooting, but so often here in the United States such legislative changes have been hard to get?" "I can't speak to the U.S. experience," Ardern replied, "but if we really wanted to say, 'We don't ever want this to happen again,' we needed to demonstrate what we were doing to make that a reality."

But even after she won another election, things weren't easy. As the pandemic wore on, tensions flared over her government's COVID policies. In 2023, when she stunned many by deciding to resign, she wore her heart on her sleeve, telling Parliament: "You can be a nerd, a cryer, a hugger, you can be all of these things, and not only can you be here, you can lead, just like me."

Though she has left office, she has not stopped keeping a close eye on our turbulent times.
Asked what she makes of President Trump and his decisions on trade and foreign policy, Ardern said, "You know, we are seeing people experience deep financial insecurity, and that has to be addressed by political leaders. But I continue to hold that ideas of isolation or protectionism or closing ourselves off to remedy the issue actually doesn't remedy it in the long term, and has a long-term negative impact for some of the collective issues we need to address as a global community."

For now, Ardern is not angling to jump back into politics, but she is settling into her new normal – that is, "Being just a normal family." And when she is asked for advice – in a Harvard classroom, or from a world leader – Jacinda Ardern tells them to be kind: "That principle of kindness, it's something we teach our kids. Why shouldn't we role model that in the way that we conduct ourselves in politics? And secondly, if you're putting people at the center of what you're doing, it's a reminder that, actually, the act of being in politics is an act of public service as well. And I think voters need to see more of that."

Story produced by Sara Kugel. Editor: Joseph Frandino.

What Leaders Should Know When AI Agents Show More Empathy Than They Do

Forbes · a day ago

You know something's shifting when people start saying they'd rather deal with a chatbot than their manager. A 2024 survey conducted by Workplace Intelligence and INTOO found that 47% of Gen Z employees say they get better career advice from artificial intelligence tools, including ChatGPT, than from their managers. Why? Because the tools don't interrupt or sound impatient. They've been trained to keep their tone warm, their timing consistent, and their responses emotionally aware. Meanwhile, some leaders are still replying to emails with quick, cold replies or glossing over people's concerns without really listening.

Dr. Hitendra Wadhwa, professor at Columbia and author of Inner Mastery, Outer Impact, shared with me why the way leaders respond matters. He said leadership starts with your inner voice. Not your polished script or rehearsed talking points, but the presence you bring into every interaction. That kind of presence is what defines real empathy.

Can AI Agents Deliver Empathy That Feels Real?

They are delivering responses that feel emotionally aware enough to shift expectations. AI agents are now used in everything from customer service to onboarding, internal training, and employee feedback. They respond in real time, they don't take things personally, and they don't get flustered. That consistency is changing how people define empathy. When a bot replies with, 'It sounds like you're frustrated, and I want to help,' people feel acknowledged. And while the bot doesn't actually care, it still sounds better than being brushed off by a distracted manager. That's where empathy begins to shift from being a human-only strength to something people expect from technology.

Why Are People Starting To Trust Empathy From AI Agents?

Because machines respond without judgment. They use reflective phrasing like, 'That makes sense,' or 'Let me make sure I've got this right.' Those responses are becoming the standard. People want that same tone from their managers, not just from their devices. Trust is now shaped by tone and timing. A chatbot that replies promptly and respectfully is often preferred over a human who seems rushed or dismissive. That shift is pushing leaders to become more aware of how they express empathy.

How Can Leaders Show More Empathy Than AI Agents?

Start by showing curiosity. Ask yourself: when was the last time you considered how your tone affected someone else? AI systems have been trained to sound supportive, and leaders need to take the time to show that same support. Krister Ungerböck, author of 22 Talk SHIFTs, shared a powerful communication tool called 'empathy guesses.' Instead of asking someone how they feel, say, 'Are you feeling stuck or maybe a little discouraged?' Even if you're wrong, they'll usually offer a correction. That correction leads to deeper communication. Real empathy comes from that moment of correction, not just from getting it right.

Are You Handing Off Empathy To AI Agents Without Meaning To?

This often happens behind the scenes when follow-ups are automated, welcome messages are scripted, and difficult conversations get delayed until someone else handles them. The more this happens, the more employees associate empathy with bots and not with you. Empathy requires paying attention. It's the pause, the thoughtful response, and the willingness to let someone talk longer than expected. Leaders still have the chance to model that kind of connection.
How Can Leaders Build Their Empathy Muscles Back Up?

Chris Voss, former FBI hostage negotiator and author of Never Split the Difference, once explained to me how simple phrases can shift everything. He recommends labeling emotion over asking questions. Say, 'It sounds like that was overwhelming,' instead of 'How did that make you feel?' People are more likely to open up when you reflect something they recognize in themselves. AI can replicate the pattern. But a leader can offer context, memory, and a shared history. That's the kind of empathy people remember.

What Can You Say To Show Empathy That AI Agents Can't Match?

Try saying, 'That sounds like it took a lot out of you,' or, 'Thanks for trusting me with that.' These phrases tell someone they matter. AI doesn't reflect after a meeting. It doesn't lie awake wondering how to repair a strained relationship. You do. That awareness is the core of empathy.

How Do You Keep Practicing Empathy When AI Agents Are Getting Better At Faking It?

Here are a few ways to stay grounded in human connection:

Why Real Empathy Still Belongs To Leaders, Not AI Agents

AI agents will continue improving their tone and speed. But they won't notice the subtle shift in a team member's mood. They won't connect the dots between yesterday's stress and today's silence. They won't ask, 'Are you okay?' because they remember how someone looked the day before. The leaders who pay attention to these things, who pause to reflect, who say what's hard to say, are the ones who still build trust. Empathy requires staying present, and that presence can't be programmed. As expectations shift, the question becomes: who are your people turning to when they need empathy? The answer should still be you.
