
Funders commit $1B toward developing AI tools for frontline workers
The funders announced Thursday that they will create a new entity, NextLadder Ventures, to offer grants and investments to nonprofits and for-profits to develop tools for those who often manage huge caseloads with few resources.
'The solutions that we're investing in, the hundreds of entrepreneurs that are going to bring forward solutions that incorporate leading edge technologies, are going to do it by coming alongside people who are living through some of the struggles in the economy,' said Brian Hooks, CEO of Stand Together, a nonprofit started by Kansas-based billionaire Charles Koch.
The other funders include hedge fund founder John Overdeck and Valhalla Foundation, which was started by Intuit cofounder Scott Cook and his wife, Signe Ostby. Ballmer Group is the philanthropy of former Microsoft CEO Steve Ballmer and his wife, Connie. The funders declined to reveal the exact financial commitments made by each contributor.
The point of investing in these AI tools is to spur economic mobility, a focus all the funders share, they said. The funders believe there are many ideas for how AI technologies could help match people with resources after a disaster or an eviction, for example, or help a parole officer close out more cases for people who have met all of the criteria but are waiting for the paperwork to be processed.
'As we traded notes on where we were making investments and where we saw broader gaps in the sector, it was readily apparent that there was a real opportunity to come together as a group of cofunders and cofounders to establish a new kind of investment organization,' said Kevin Bromer, who leads the technology and data strategy at Ballmer Group. He will also serve as a member on NextLadder's board, which will include three independent board members and representatives from the other funders.
NextLadder will be led by Ryan Rippel, who previously directed the Gates Foundation's economic mobility portfolio. The funder group has not yet determined whether NextLadder will incorporate as a nonprofit or a for-profit organization but said any returns from its investments will go back into funding new initiatives.
NextLadder will partner with AI company Anthropic, which will offer technical expertise and access to its technologies to the nonprofits and companies NextLadder invests in. Anthropic has committed around $1.5 million annually to the partnership, said Elizabeth Kelly, its head of beneficial deployments, a team that focuses on giving back to society.
'We want to hand-hold grantees through their use of Claude with the same care and commitment we provide to our largest enterprise customers,' Kelly said, referencing Anthropic's large language model.
Hooks, of Stand Together, said philanthropy can reduce the riskiness of these types of investments and offer organizations more time to prove out their ideas.
'If we're successful, this will be the first capital to demonstrate what's possible,' Hooks said.
Researchers, such as those at the Active Learning Network for Accountability and Performance, have studied the risks of using AI tools with vulnerable populations and in high-stakes settings, for example, in humanitarian contexts.
They recommend assessing whether AI is the best tool to solve the problem and, crucially, if it works reliably and accurately enough in high-risk settings. They also recommend assessing tools for bias, considering privacy protections and weighing the cost of potential dependence on a specific provider.
The National Institute of Standards and Technology also emphasizes that trustworthy AI systems should be accountable to users and that it should be possible to explain or trace how a tool arrived at a certain conclusion or decision.
Hooks emphasized that any AI tools NextLadder invests in will be shaped by the needs and feedback of these frontline workers. Tools that don't work for them won't succeed, he said. Even with the potential risks of AI tools, he said it's imperative that groups struggling to move up the economic ladder have access to new technologies.
'The idea that we would deprive those who are struggling in our country from the benefits of the leading edge solutions is unacceptable,' Hooks said.