
The Growing Trend of Parasocial Relationships Among Younger Generations

  • Writer: Connor Banks
  • Jun 17, 2024
  • 3 min read

The digital age has given rise to a new form of relationship—one that is largely one-sided and deeply emotional. Parasocial relationships (PSRs), where individuals form strong connections with media figures like YouTube creators or fictional characters, are becoming increasingly prevalent among younger people. A recent study from the University of Essex, published in Scientific Reports, sheds light on why these relationships are not just a passing fad but a fundamental shift in how emotional needs are met in the 21st century.


[Image: A young girl constantly on her phone.]

Why Are Parasocial Relationships Booming?

Let's face it: today's youth are digital natives. From the moment we wake up to the time we go to bed, our lives are interwoven with the internet. Platforms like YouTube, TikTok, and Instagram are more than just entertainment—they're lifelines. The study reveals that PSRs are more effective at fulfilling emotional needs than casual acquaintances, though they still fall short of the intimacy provided by close friends and family. These virtual connections can be incredibly comforting for young people, who often face social challenges and a constant quest for belonging.


The Role of Accessibility and Consistency:

Unlike traditional relationships, PSRs don't require reciprocation. A YouTube creator is always there, posting new content regularly, sharing their thoughts, their lives, and sometimes, their vulnerabilities. This consistency creates a sense of reliability and emotional safety that can be harder to find in real-life relationships, especially during tumultuous adolescent years.


[Image: A mobile phone with social media apps on it.]

High Self-Esteem and Social Rejection:

Interestingly, the study points out that individuals with high self-esteem find these parasocial bonds particularly satisfying when facing social rejection. In an age where bullying and social exclusion can extend into the digital realm, having a dependable, albeit one-sided, emotional support system can make a world of difference.


Why Is This Happening?

The rise in PSRs can be attributed to several factors. First, the pervasive influence of digital media means that young people are constantly exposed to media figures. These figures often share content that feels intimate and personal, creating an illusion of friendship. Additionally, societal shifts, including increased social isolation and the fragmentation of traditional community structures, have left a void that PSRs can fill.


A Double-Edged Sword for Mental Health:

While PSRs can provide much-needed emotional support, they also come with potential drawbacks. Over-reliance on these relationships might impede the development of real-life social skills. The key is balance—using PSRs as a supplementary support system rather than a replacement for actual human interaction.


Is This a Good or Bad Thing?

The answer isn't straightforward. On one hand, PSRs offer a vital source of support and connection in a world where traditional social bonds are weakening. They can be particularly beneficial for those who struggle with social anxiety or lack a strong support network. On the other hand, there's a risk that these relationships might discourage people from seeking out and nurturing real-world connections, which are essential for a well-rounded emotional life.


Final Thoughts:

Parasocial relationships are more than just a quirky byproduct of our media-saturated world; they're a testament to the evolving nature of human connection. For younger generations, these relationships can offer significant emotional support, filling gaps left by traditional social interactions. As we navigate this digital landscape, understanding and integrating the positive aspects of PSRs could be crucial for fostering emotional well-being in an increasingly connected yet isolated world.


The full study provides a deeper dive into these dynamics and can be accessed [here](https://www.nature.com/articles/s41598-024-58069-9).

Have We Become Too Reliant on AI?


13 June 2025

Paul Francis


The ongoing unrest in Los Angeles has escalated, with President Donald Trump deploying the National Guard and Marines in an attempt to clamp down on protests. The move drew swift criticism, particularly after images surfaced showing Guardsmen sleeping on cold floors in public buildings—images that quickly sparked outrage. But this article isn’t really about that. Well, not directly.


What’s more concerning is what happened next.

As these images began circulating online, a troubling trend emerged. People started questioning their authenticity, not based on verified information or investigative journalism, but on what artificial intelligence told them. Accusations of “fake news”, “AI-generated images”, or “doctored photos” spread rapidly. Rather than consulting reputable sources, many turned to AI tools to determine what was real.


And they trusted the answers without hesitation.


These AI models, often perceived as neutral, trustworthy, and authoritative, told users that although the images were real, they weren’t recent. According to the models, the photos dated back to 2021 and were taken overseas. The implication? They had nothing to do with the situation unfolding in Los Angeles.


People believed it. Anyone suggesting otherwise was dismissed as misinformed or biased. The idea that these images were being used to fuel an anti-Trump agenda gained traction, all because an algorithm said so.


But there’s one major flaw: the AI was wrong.


These images didn’t exist online before June 2025. They aren’t from 2021. They weren’t taken abroad. They are, in fact, current and accurate, just as the original reports stated. But because AI tools misidentified them, many dismissed the truth. This isn’t just a harmless mistake; it’s a serious issue.

We are placing too much trust in machines that cannot offer certainty. These tools don’t rely on real-time data or fact-checking methods; they generate responses based on probabilities and patterns in the data they’ve been trained on. And when those outputs are flawed, people can be dangerously misled.


So what happens when more and more people begin to trust AI over journalists, subject matter experts, or even their own eyes?


We risk entering a reality where truth is no longer defined by facts, but by algorithms—where something can be deemed false not because it lacks evidence, but because a machine didn’t recognise it. If we reach that point, how do we challenge power? How do we uphold accountability? How do we know what’s real?


AI is a remarkable tool. But it is just that—a tool. And when tools are treated as infallible, the consequences can be far-reaching. If we blindly trust AI to define our reality, we may find ourselves living in a world where facts are optional, and truth becomes whatever the machine decides it is.
