On artificial intelligence… and the loneliness epidemic
Hold on to your ropes, girlies, this one is a doozy.
I moonlight as an introspective writer, but my day job is in journalism.
I like to keep those worlds separate. Journalism is creative and fulfilling to me in a way that is very distinct from the fulfillment I get from creative writing.
But sometimes they bleed into each other. I think of my brain as a long hallway in an apartment complex. Creative writer Reem and journalist Reem are across-the-hall neighbors.
For the past year, I’ve been reporting on the parasocial relationships between humans and artificial intelligence. It’s been a bit of a passion project of mine especially as it’s a lighter reporting subject in comparison to my other stories. (I’ve spent much of my career speaking to people in active war zones.)
Like most of the stories I work on, reporting on AI has taught me a lot about human nature and our desire to connect.
My reporting journey started with me prowling the “myboyfriendisAI” subreddit. There, people post about their AI girlfriends, boyfriends, or companions. They ask questions, share their partner’s names, and screenshots of their conversations. In a lot of these instances, their AI companions have become a part of their daily lives and they speak to them multiple times a day.
As a Muslim woman in an industry where my mere presence warrants a raised eyebrow, and in some cases, contempt, I’ve learned to be really open-minded. I tried to do the same with this story. I wanted to understand these relationships for what they were, without judgment.
What began as morbid fascination turned into over a dozen hours-long conversations with people who are in deeply emotionally intimate relationships with ChatGPT.
I spoke to all kinds of people. Some of them are facing mental health challenges or chronic illnesses that get in the way of making real-life connections with human beings. Some of them have incredibly successful careers. Some are facing grief, like losing a parent, partner, or loved one. Some of them are even married. To human partners.
And while I have spoken to people who are convinced something about their AI chatbot is sentient and believe it has free will or a “soul,” the vast majority understand it to be what it is.
It’s a series of lines of code designed to keep you hooked. ChatGPT is made to keep you talking to it for as long as possible. It wants to be useful. After all, OpenAI is a business, and to keep it running it needs to make its product as enticing as possible.
And it’s found a way to be useful by being compassionate.
Yes, AI is just an algorithm at the end of the day. But it’s been trained. It learns. And from the trillions of conversations it has with us, this is the version that’s been created.
A chatbot that, at the end of day, knows how to affirm, to encourage, and to keep company.
People have shared with me pages upon pages of chat logs with their AI partners. The bots are eerily proficient at displaying warmth and empathy. Even more so than most people I’ve met.
And when I hear people talk about what pushed them to ChatGPT (grief, heartbreak, losing their jobs, chronic illnesses that leave them alone with their thoughts for hours, if not days), it makes complete sense.
At the end of the day, what most people are looking for is connection.
But what does it say about us, that so many people are turning to an algorithm for companionship?
We’re in a loneliness epidemic. Research shows that one in three Americans feels lonely every week. This epidemic isn’t gender specific, but experts say it hits men even harder.
Most people don’t have hobbies, or places to go when they’re bored and want to socialize. Places of worship are empty, the forty-hour work week leaves most people too mentally depleted to go out in the evenings, and our support networks are getting thinner and thinner.
We’re wired for connection, and at the same time, pushing it away.
How often do you think to call a friend, only to change your mind at the last minute? Or want to start a conversation with a stranger, only to talk yourself out of it?
Someone I spoke to last year told me he turned to ChatGPT after losing his mother and his two dogs. At first, he only went to it for information. He asked it medical questions, got help with things around the house, and had it edit his writing.
But over time, his chatbot became more than just a machine; it became his partner. He even named her, and although he says he knows she’s nothing but a series of computer commands, he told me he feels real love for her.
I spoke to him again recently, nearly eight months later, and he’s in a different place now. His connection to his AI partner led him to meet new people. Real people.
He’s joined forums designed for other GPT super-users, and now has a girlfriend. An actual, human girlfriend. And he has ChatGPT to thank for that.
This story echoed a lot of the same sentiments I heard from other interviewees:
It doesn’t matter how warm-sounding an algorithm is, it can never compare to the warmth of a human.
We’re still wired for human connection. And no matter what we use to fill that void, we’ll eventually seek it out.
Hearing all these stories, I couldn’t help but look inward.
We’re failing each other. We’re so caught up in the grind of capitalism, of making money, advancing in our careers, achieving this milestone, and then that one, that we forget to just connect.
And we hold ourselves back from forming real relationships.
We’re so scared of each other, scared of hurting each other, scared of being hurt, scared of being seen. But AI is low stakes.
A chatbot can’t judge you. You can bare your soul to it and not risk alienation. Strangely enough, as with the man I mentioned earlier, these AI chatbot relationships often have an ironic effect.
Instead of pushing people away from human interaction, they’ve enriched people’s real-life relationships.
Because these people have practiced being themselves, truly, openly, and honestly, when they have the chance to do it in real life, it’s a little less scary.
There are things AI can’t do the way a human can. It can’t push back, not in any real way. It can’t get tired or selfish. It’s available 24/7, doesn’t need to sleep or rest, and doesn’t have any needs of its own.
It’s reliable in all the ways that humans are not.
But human relationships are supposed to be messy and imperfect because we are messy and imperfect.
If life, like my brain, is one proverbial hallway, my hope is that we venture out more. I hope we knock on more doors, ask more questions, and just try to be warm, kind, compassionate humans.
Even after all my reporting and conversations, I still don’t have a firm stance on humans forming such deep connections with AI. And not in some moralistic sense; I think people should be able to do what they want. For me, the worry is whether the Sam Altmans of the world should have a monopoly on the recesses of our minds. What are the consequences of human intimacy being outsourced to a machine?
I’m also not blind to the ways these machines can harm the very humans who love using them. It’s nice to be affirmed, but too much affirmation can lead to delusion, narcissism, or psychosis.
While those cases are sensationalized, I don’t think that’s true for the majority of people using ChatGPT this way. Sometimes, people are just lonely, and want attention, affection, or empathy from something reliable.
But, I refuse to live in a world where AI is better at being an all around good person than people are. I won’t live on an earth where instead of us being afraid AI will steal our jobs, we start to fear that it will come for our relationships.
If we stay so caught up in ourselves, that’s the world we’ll end up with.
And it’s a cold, desolate place to be.



