'AI can't be your friend and pretending so is dangerous': LinkedIn co-founder Reid Hoffman
Reid Hoffman, co-founder of LinkedIn and a prominent investor in artificial intelligence, has cautioned against the growing trend of portraying AI systems as emotional companions, arguing that such framing risks undermining human relationships and emotional well-being, reported Business Insider.
Speaking on a recent episode of the Possible podcast, Hoffman asserted that no current AI tool possesses the emotional depth required to qualify as a friend, and that suggesting otherwise could be psychologically harmful, added the publication. 'I don't think any AI tool today is capable of being a friend,' he said. 'And I think if it's pretending to be a friend, you're actually harming the person in so doing.'
As per the report, his remarks follow a wave of AI companion rollouts by Meta across its platforms, including Facebook, Instagram, WhatsApp and Ray-Ban smart glasses. Meta CEO Mark Zuckerberg recently claimed that AI chatbots might help address the United States' growing loneliness crisis, pointing to research indicating that many Americans have fewer than three close friends.
However, Hoffman drew a clear distinction between companionship and authentic friendship. He stressed that genuine relationships are built on mutuality — both giving and receiving emotional support. 'Friendship is a two-directional relationship,' he explained. 'It's not only, "Are you there for me?", but also "I am here for you".'
Reportedly, the tech entrepreneur warned that blurring the line between simulated and real emotional bonds risks diluting the value and understanding of human connection. While AI systems can imitate empathy and responsiveness, he noted that they lack the true reciprocity required to sustain meaningful relationships.
He praised Inflection AI's design of its 'Pi' assistant, which explicitly refers to itself as a companion rather than a friend. Hoffman commended the tool's emphasis on encouraging users to engage with real-world relationships. 'Helping you go out into your world of friends is, I think, an extremely important thing for a companion to do,' he added.
Although he acknowledged that AI tools can offer utility and support — particularly for individuals facing social isolation — Hoffman warned against using them as a replacement for real-life human interaction. He expressed particular concern about the impact on children and vulnerable users who may struggle to differentiate between digital simulation and genuine emotion.
Hoffman also called for stronger industry standards and possible government intervention to address what he views as a growing ethical concern. 'We as a market should demand it, we as an industry should standardise around it,' he said. 'And if there's confusion around this, I think we as government should say, "Hey, look, if you're not stepping up to this, we should."'
He concluded with a broader reflection on the implications of emotionally misleading AI systems. 'I think that's a degradation of the quality of elevation of human life,' Hoffman said. 'And that should not be what it's doing.'