In an age where loneliness is rising and digital companionship is just a click away, Meta CEO Mark Zuckerberg believes he has a solution: artificial intelligence. In a recent conversation on the Dwarkesh Podcast, Zuckerberg painted a vision of a world where AI friends help fill the emotional void for millions. “The average American has three people they would consider friends,” he observed. “And the average person has demand for meaningfully more. I think it’s, like, 15.”
But behind this techno-optimism lies a growing unease among psychologists. Can the glowing screen truly stand in for the warmth of a human connection? According to a report from CNBC Make It, experts like Omri Gillath, a psychology professor at the University of Kansas, don’t think so. “There is no replacement for these close, intimate, meaningful relationships,” he cautions. What Zuckerberg sees as an opportunity, Gillath sees as a potentially hollow, even harmful, substitute.
The Temptation of a Perfect Companion
Zuckerberg’s remarks come at a time when AI-powered “friends” — always available, ever-patient, and endlessly affirming — are gaining popularity. For those feeling isolated, the allure is undeniable. No judgment, no scheduling conflicts, and no emotional baggage. Gillath acknowledges these momentary comforts: “AI is available 24/7. It’s always going to be polite and say the right things.”
But therein lies the problem. While these digital entities may seem emotionally responsive, they lack true emotional depth. “AI cannot introduce you to their network,” Gillath points out. “It cannot play ball with you. It cannot introduce you to a partner.” Even the warmest conversation with a chatbot, he argues, cannot compare to the healing power of a hug or the spark of spontaneous laughter with a friend.
Love, Simulated
Still, people are beginning to develop strong emotional attachments to AI. Earlier this year, The New York Times reported on a woman who claimed to have fallen in love with ChatGPT. Her story is not unique, and it reflects a growing trend of people projecting real feelings onto these artificial companions.
Yet these connections, Gillath insists, are ultimately “fake” and “empty.” AI may mimic empathy, but it cannot reciprocate it. The relationship is one-sided, a digital mirror reflecting your emotions back at you — but never feeling them itself.
A False Promise with Real Consequences
Beyond emotional shallowness, replacing human interaction with AI may carry more serious psychological consequences. Gillath points to troubling trends among youth: higher anxiety, increased depression, and stunted social skills in those heavily reliant on AI for communication. “Use AI for practice, but not as a replacement,” he advises.
The concern isn’t just about emotional well-being — it’s also about trust. “These companies have agendas,” Gillath warns. Behind every AI friend is a business model, a data strategy, a bottom line. Meta’s recent unveiling of a ChatGPT-style app was the backdrop for Zuckerberg’s remarks. It’s not just about technology — it’s about market share.
The Human Need That Tech Can’t Fill
Zuckerberg is right about one thing: people are craving more connection. But the answer may not be more sophisticated algorithms — it might be more vulnerability, more community, more effort to connect in real life.
“Join clubs, find people with similar interests, and work on active listening,” Gillath recommends. In other words, pursue messy, unpredictable, profoundly human relationships. Because no matter how convincing AI becomes, it will never know what it means to truly care.
Can an algorithm be your best friend? Maybe. But it will never be your real friend.