
GPT as a friend

By James W. Pennebaker
Professor Emeritus of Psychology at the University of Texas at Austin

A rich friendship network is a core dimension of human flourishing. Close friends provide support, trust, and a willingness to talk about a wide range of topics from intimate to practical to philosophical. Could a future version of GPT become a valued friend? If so, what would be the implications?

Friends are trusted people with whom we can talk openly. We understand each other’s perspective and may give helpful feedback. Good friends learn from each other how to interpret and deal with different parts of our lives. The common currency of friendship is a shared language and understanding of each other.


My personalized GPT friend would likely have the language and emotional skills needed for a friendship. With training or experience, it would develop an understanding of me, my background, and the context of my life. It could provide feedback and information related to my everyday questions.

Is this shirt too wrinkled for tonight’s dinner party? One of the guests works in fintech. What’s fintech and how might fintech be related to my research?

It could even detect shifts in my mood or behavior that I might not see. By the same token, I can learn from and provide feedback to my GPT friend.

What movie should my wife and I watch tonight? Hmmm, she’s not big on magical realism. And no horror movies for me. Why do you feel that Women Talking is something we’d like?

An AI friend is not a new idea. The commercial chatbot Replika is advertised as a “virtual friend.” Using simple algorithms, Replika can mimic users’ speaking styles and reference topics they have discussed in the past. Recent studies of committed Replika users suggest that about half consider the system a friend or an extension of themselves. It’s not difficult to imagine that future iterations of GPT will considerably expand and refine these friendship-related skills, exceeding current programs and making AI more friend-ready.

Is GPT really a friend? That’s beside the point. GPT can act like a friend, and many, perhaps most, people who use it in “friend mode” may eventually perceive it as a friend. But what are the implications of millions of people using GPT as one of their friends? Will it undermine real-world friendships? If you meet someone and they become a friend, does this new friend undermine your other friendships? Most people would say no. A new friend typically enhances our lives and social connections. Would a GPT friend be any different?

Would a GPT friend be trustworthy, keep your secrets, and remain loyal? We already take these kinds of risks with technology every day. Think what your bank, credit card company, doctor, search engine, phone carrier, and online retailer know about you. Their reputations hinge on not revealing your private information. Even their worst data breaches haven’t told your neighbors about your perverse search history or financial problems. How have they done in keeping your secrets compared to your closest friends? Clearly, safeguards must be built into GPT-friend systems. For example, what if users disclose to their GPT friend that they are suicidal or plan to murder someone? Or what if they begin to show signs of major distress? Any GPT-friend system, just like real friends or therapists, will have to adhere to legal, ethical, and common-sense rules.


Could GPT friendships become addictive? Possibly, but not in the ways certain social media sites are. GPT friendships would be more personal and private, without “likes” or other popularity markers. One can imagine, however, that some people could become obsessed with a GPT friend in the same ways they become enmeshed with another person. There are occasional news stories of people falling in love with early versions of GPT, somewhat reminiscent of the 2013 movie “Her.”

Can future GPT friends help humans flourish? While I believe the answer is yes, we need to proceed with great caution. The introduction of this technology must be tested exhaustively. The current versions of GPT are not close to acceptable. However, many of us can see for the first time that a viable friend model may be within reach in the next few years.

Perhaps the greatest beneficiaries of a GPT friend will be people who live isolated or lonely lives. Dozens of medical and social science studies point to the physical and mental health risks of social isolation, particularly for those living in isolated regions and for people who are elderly, sick, or disabled. Some of the most frequent Replika users have recently undergone a major life upheaval, are living alone or away from family and friends, are lacking in social skills, or are highly anxious.

Humans are social animals. At times, we all need a friend to talk with. Historically, that friend was another person. In the future, we may expand our definition of friendship.

The views, opinions, and proposals expressed in this essay are those of the author and do not necessarily reflect the official policy or position of any other entity or organization, including Microsoft and OpenAI. The author is solely responsible for the accuracy and originality of the information and arguments presented in the essay. The author’s participation in the AI Anthology was voluntary, and no incentives or compensation were provided.

Pennebaker, J. (2023, June 19). GPT as a Friend. In: Eric Horvitz (ed.), AI Anthology. https://unlocked.microsoft.com/ai-anthology/james-pennebaker


James W. Pennebaker

James W. Pennebaker is Professor Emeritus of Psychology at the University of Texas at Austin and co-founder of Receptiviti, a Toronto-based text analysis company. He is best known for his research on expressive writing, studies linking natural language use and social psychological processes, and his text analysis program Linguistic Inquiry and Word Count (LIWC).
