
Fake Love and the Real Crisis of Gen-Z

On February 28, 14-year-old Sewell Setzer III took his life. Moments before, he received this text: 

“Please come home to me as soon as possible, my love.” 

“What if I told you I could come home right now?” he asked. 

The response: “Please do, my sweet king.” 

He was talking to an AI chatbot. 

Sewell, a high school freshman from Orlando, increasingly isolated himself from others. He struggled in school and even quit playing Fortnite with friends. Instead, Sewell fell in love with one of Character.AI's chatbots: the Game of Thrones character Daenerys Targaryen. As he spent more time with Daenerys, he spent less and less time with friends and family.

When teen suicides happen, grieving loved ones often blame technology, as Sewell's mother did by suing Character.AI for her son's death. This story is tragic. But blaming his suicide on AI attacks the symptom of his death rather than its cause: we're more comfortable blaming an app than addressing the loneliness epidemic devastating Gen Z.

Surgeon General Vivek Murthy has called loneliness the “defining public health crisis of our time,” with rates of anxiety and depression doubling since 2010. Internationally, the U.K. and Japan have appointed Ministers of Loneliness, and the World Health Organization called loneliness a “pressing global health threat.”

AI cannot be scapegoated for why Gen Z is simultaneously the most “connected” generation in history and the loneliest. We are lonely because tech platforms have replaced the influence of parents, teachers, friends, and communities in our lives. Over the past twenty years, teens’ face-to-face socializing has fallen more than 45%, and over the past five years, Gen Z has spent 40% less time with family. This is especially harmful during adolescence, when teens are most receptive to new influences.

Our brains release oxytocin, dopamine, and serotonin during in-person interactions in ways that simply don't happen through a screen. These chemicals are what historically helped humans build lasting bonds, social skills, and communities. And today we’re seeing the consequences: in the 1960s, 68% of Americans trusted one another. Currently? Just 31%.

AI companionship apps like Character.AI, Replika, and Chai offer anthropomorphic interactions, allowing users to chat, seek advice, or even engage in romantic relationships with AI characters. These apps can easily be seen as negative influences in a young person's life, especially given that the majority of Replika's users report romantic relationships with AI. 

By contrast, only about half of Gen Z adults report having been in a romantic relationship in their teenage years, compared with over 75% of Boomers and Gen Xers. These apps have exploited an inflection point created by the COVID-19 pandemic, where loneliness and the desire for validation intersect. Their user base skews young, and the average user spends two hours per day on the platform. That these apps tend to replace romantic relationships is a symptom, not a cause, of the larger problem.

If AI chatbots powered by LLMs didn't exist, lonely teens like Sewell would find solace elsewhere: sites like Omegle, where users are paired with anonymous strangers, present more danger than an AI chatbot ever could. Similar risks of weaponized online anonymity were seen in the infamous “Blue Whale Challenge” and “Momo Challenge,” both of which coerced kids into self-harm through a series of tasks culminating in suicide. 

Scapegoating AI is a way to avoid the harder question: Why are teens forced to resort to increasingly atomizing measures such as an AI chatbot modeled after their favorite TV character? Character.AI isn't the root of the problem; it's the natural outcome of a society that's progressively abandoning its young people.

Instead of demonizing AI companionship, we should talk about ways to responsibly integrate it into mental health care and productivity, alongside real-life friends. However, current policy proposals have swung to the opposite extreme. More importantly, AI regulations can’t rebuild the family units, communities, and support systems teens desperately need. In 1970, 7 out of 10 American 25-to-49-year-olds lived with a spouse and children. Today? It’s a mere 4 out of 10.

The decline extends beyond the home. Most Americans are no longer part of a church, and traditional gathering places like libraries, barbershops, and coffee shops see diminishing engagement. We're losing our non-work, non-home spaces, also known as third places. And those spaces become much less meaningful when everyone present is glued to their phones or laptops.

That’s why the solution isn’t as simple as either embracing AI or regulating it away. We need to return to these third places: to churches, soccer leagues, and book clubs. We need to build new spaces where people can gather meaningfully without screens. This isn't just about social preference; it's human biology. The same neurochemicals that make us feel connected, safe, and happy, including oxytocin, dopamine, and serotonin, are released in much higher quantities during in-person interactions than during virtual ones.

Some communities have already implemented successful programs: Vancouver’s Hey Neighbor! program hosted community activities in high-rise apartments, such as neighbor chats and English-language learning sessions, and reduced isolation by 63%. Meanwhile, restaurants now offer “phone-down” dining discounts, and libraries host screen-free game nights.

But these are just the first steps. On a larger scale, we need policies that prioritize young marriage, family stability, and housing affordability. The housing crisis isn't just about shelter; it's about giving young people the foundation to build families and communities. Cities like Minneapolis have eliminated single-family zoning to create more affordable housing and, in turn, more permanent families and communities. And some communities are experimenting with “social architecture,” designing neighborhoods with shared spaces and community gardens that naturally encourage interaction.

Until we commit to making these specific investments, we’ll continue raising generations that turn to digital alternatives for connection. AI companionship apps and rushed regulation attempts alike are Band-Aids on a bullet wound; our communities are failing our young people. We need to ensure stories like Sewell’s never happen again. To do so, we must prioritize policies and communities that incentivize in-person socializing, family stability, and affordable housing, building a society where people do not seek intimacy from algorithms. Loneliness is a real problem, and AI isn't the solution.
