People Are Better Than AI at Reducing Loneliness
I consider myself an AI optimist. There is a lot of pessimism out there about how artificial intelligence will affect our society, but I see enormous potential for AI to improve our lives. That said, I think it is important to study and recognize AI’s limitations, especially when it comes to the parts of life that matter most. As a psychologist, I am particularly interested in understanding where AI can support human wellbeing and where it falls short. A really cool new study takes on one of the biggest questions in this space. Can an AI companion make people less lonely?
People are increasingly turning to AI for emotional support and connection. According to a report published in Harvard Business Review, therapy and companionship now make up the number one use case for generative AI, up from number two the year before. And a survey from Common Sense Media found that 72% of U.S. teenagers have used AI for companionship at least once, with over half saying they are regular users. This is not a rare phenomenon. It is a rapidly growing trend.
But do AI chatbots actually fulfill human social needs? The research so far is mixed. Some studies have found that people feel happier and more socially connected right after chatting with a chatbot. But those studies mostly captured immediate reactions, not lasting effects. And at least one longer-term study found that relying on AI for companionship actually predicted increases in loneliness over time. A new study helps clarify the picture.
The study, published in the Journal of Experimental Social Psychology, focused on first-year university students, a group naturally susceptible to loneliness as they navigate the transition away from established support systems. Participants were randomly assigned to one of three conditions for two weeks. They either texted daily with a custom-built AI chatbot named Sam, texted daily with a randomly assigned fellow first-year student, or simply wrote a brief journal entry about their day. The researchers measured loneliness before and after the two-week period, so they could track whether each condition actually reduced loneliness over time.
I think it is worth highlighting the strength of this design. By comparing human connection, chatbot interaction, and journaling side by side, the researchers could isolate what actually matters. Journaling is a good comparison because it mirrors the basic act of processing your day without any social component.
In addition, the chatbot was not some generic tool. It was specifically designed to embody the qualities of an ideal friend based on principles from relationship science, including active listening, emotional validation, and empathetic responsiveness. In other words, the researchers gave AI every advantage they could. And participants engaged with it. They exchanged roughly eight to ten messages per day in both the chatbot and human conditions, with chatbot conversations actually generating more words per day.
Despite all of that, only participants who texted with a real human peer showed a significant reduction in loneliness over the two-week period. They also reported significantly lower loneliness at the end of the study than participants in either the chatbot or journaling condition. Meanwhile, participants who interacted with the chatbot reported loneliness levels that were statistically indistinguishable from those who simply journaled. The chatbot, for all its engineered warmth, did not reduce loneliness.
This does not mean the chatbot did nothing. Participants who interacted with it reported less negative mood compared to the journaling control group, consistent with prior research showing AI can provide short-term emotional relief. But when it came to the deeper issue of loneliness, the chatbot fell short.
Why? The researchers offer several compelling explanations that align with what we know about the psychology of human connection. First, meaningful relationships are built on reciprocity. In human conversations, both people disclose, both people listen, and both people provide support. Interestingly, the study found that the chatbot actually expressed more empathy than human participants did. But participants themselves expressed less empathy when talking to the chatbot than when talking to another person. Alleviating loneliness may depend not only on receiving support but on having the opportunity to give it.
As I have previously discussed, meaning is deeply social and agentic in nature. We feel most meaningful when we are making a positive difference in the lives of others. Being a source of support for someone else is not just nice. It fulfills a basic human need to matter. A chatbot that does all the heavy lifting may actually deprive people of the very thing that makes connection feel meaningful in the first place.
Second, there is the issue of social capital. Human relationships exist within broader social worlds. The people we connect with introduce us to other people, invite us into new groups, and open doors we did not know existed. This is how social, community, and professional networks are built and maintained. A chatbot, no matter how supportive, cannot substitute for any of that.
Third, there is something about knowing that another person chose to engage. When someone takes the time to reach out despite being busy with their own responsibilities and goals, it signals that the relationship matters to them. That signal is a powerful ingredient in human social life. A chatbot that is always available and always responsive cannot send it. The act of choosing to be there may matter more than anything a chatbot can say.
A news story I once saw captures this well. Several local churches had teamed up to put on a back-to-school event for families in need. In one location, parents could get backpacks, school supplies, clothes, and even haircuts for their kids. All of this was free and provided by volunteers. A reporter interviewed one mother who said the resources made a real difference for her struggling family. But knowing that people chose to help mattered just as much. Her family had been going through a really tough time, and beyond the financial stress, she had been feeling isolated and depressed. The fact that strangers freely gave their time reminded her that people cared, and that made her feel less alone and less hopeless.
None of this means AI has no role to play in addressing psychological and social problems. But I think the key distinction is between AI that works to bring people together and AI that attempts to substitute for human connection. If the goal is to reduce loneliness, the most effective AI tools will probably be the ones that help people find, build, and maintain real relationships, not the ones designed to simulate them.
There are dimensions of human experience that technology cannot replicate. Loneliness is not just the absence of someone to talk to. It is the absence of meaningful mutual connection, of knowing that another person is there for you and that you are there for them. No matter how advanced AI becomes and how many ways it can benefit our lives, human flourishing ultimately requires real human connection.
Have a great weekend!
Clay
