Okay, real talk. If you have spent any meaningful time in fandom spaces over the last couple of years, you already know something has fundamentally shifted. And not in a small “oh that is a fun new trend” way. In a your-entire-weekend-vanished-because-you-were-having-a-three-hour-conversation-with-an-AI-version-of-Dean-Winchester way. Reader inserts and self-insert fanfiction have always been one of the most beloved and honestly most misunderstood corners of fan culture. But now? We have entire platforms built specifically to drop you inside the story, in real time, with a character responding directly to you as you go. It is a completely different experience from anything that came before it, and it is absolutely worth talking about, both the parts that are genuinely magical and the parts that should give us pause.

What Are These Platforms and How Do They Work?

Character.AI is the one most people have heard of. Launched in 2022 by former Google engineers Noam Shazeer and Daniel De Freitas, it became the biggest AI roleplay platform in the world almost overnight. As of 2026, it has more than 20 million monthly active users spending an average of 75 minutes a day on the platform. Let that sink in. That is not a casual side hobby. That is a lifestyle. The concept is simple: anyone can create an AI character bot, or chat with one someone else has built. Those characters can be fictional, based on real people, historical figures, celebrities, or entirely original. You type, they respond. The scene builds. And because the underlying AI has absorbed enormous amounts of text, these bots can feel startlingly, sometimes uncomfortably, convincing.

But Character.AI is only the beginning of the ecosystem. The landscape now includes Janitor AI, which hit a million users in just 17 days after launching and draws over 70% female users, a stat that says everything about who is actually driving this space. There is CrushOn AI, logging over 20 million monthly visits, built specifically for unfiltered conversations with no content restrictions. There is NovelAI, which functions more like a collaborative fiction engine, letting you build entire narrative worlds with AI as your co-writer. There is Replika, designed for ongoing emotional companionship, building a persistent relationship with you across weeks and months. And there are dozens more: Tavern AI, Kindroid, AI Dungeon, SpicyChat, with new ones launching constantly.

What all of these share is the core premise of interactive, personalized roleplay. You are not just reading someone else’s story anymore. You are living inside one, in real time, and you get to shape every moment of it.

Reader Inserts: The Format That Started It All

For anyone coming to this fresh, reader inserts (also called x reader fics, or second-person fics) are a fanfiction format where the main character is literally you. The author writes in second person, and the reader character typically has a blank name marked as (Y/N) for “Your Name,” so anyone reading can slot themselves directly into the story. You are not following someone else’s original character falling for Dean Winchester. You are the one sitting across from Dean in that diner booth in Lebanon, Kansas. You are the one Bucky Barnes is quietly watching across the safehouse. You are the one Bruce Wayne almost lets himself trust.

This format has thrived on platforms like Tumblr, Archive of Our Own (AO3), and Wattpad for well over a decade, and it has always served a very specific and deeply human purpose: it lets fans imagine themselves genuinely inside a story they love. Not as a passive observer but as someone the characters actually know, care about, and interact with. It is wish fulfillment in the most honest and unapologetic sense, and there is absolutely nothing wrong with that. Fandom has always been a space where you get to ask “but what if it was me?” and reader inserts are the literary answer to that question.

The scope of what gets written is enormous. Dean Winchester saving you from a wendigo and then being insufferably gruff about the whole thing for the next three days. Bucky Barnes slowly learning to trust again because of something you said without even realizing it mattered. Bruce Wayne showing up at your window at 2am not because of Batman business but because he did not know where else to go. Tony Stark being impossible and brilliant and just slightly too much, which is exactly the point. Loki being alternately devastating and charming and you are never entirely sure which you are getting. The characters people gravitate toward in reader inserts tend to be the ones with edges: the morally complicated ones, the ones carrying weight, the ones where the emotional payoff of being chosen by them feels genuinely earned.

What AI roleplay platforms did was take that static, written reading experience and make it dynamic and responsive. Instead of reading a story someone else crafted about what your fictional love interest would say to you, you can now have that conversation yourself, in real time, improvised, going wherever you take it. For anyone who has spent years writing and reading reader inserts, this feels like the logical next evolution. Like going from a Choose Your Own Adventure book to actually being inside the adventure. The barrier to entry has collapsed completely. You do not need to write. You do not need to find the specific fic that matches the exact scenario in your head. You open the app and start talking.

The Fandom Characters People Are Bringing to Life

Walk through any of these platforms and the range of bots people have built is genuinely staggering. The Supernatural fandom in particular has gone absolutely feral for this format. Dean Winchester bots that capture his specific brand of gruff affection, deflection, and occasional heartbreaking vulnerability. Sam Winchester being the emotionally intelligent one who actually asks how you are doing. Castiel being sincerely, bafflingly earnest in a way that no human character quite manages.

The DC fandom has built entire corners of these platforms around Bruce Wayne and his various forms. There are bots that play Bruce as the billionaire playboy front, ones that drop straight into Batman, ones that navigate the specific tension between the two and what it costs him. There are Jason Todd bots that are predictably chaotic and Dick Grayson bots built around that particular brand of relentless optimism. The Marvel side is equally expansive, with Bucky Barnes as one of the most consistently created characters across platforms, his particular combination of trauma, loyalty, and hard-won softness translating remarkably well to the interactive format. Steve Rogers, Tony Stark, Loki, and Wanda Maximoff all have devoted bot communities built around them.

Beyond superhero fandoms, you will find The Witcher’s Geralt of Rivia, Ted Lasso, Joel Miller from The Last of Us, Wednesday Addams, various Bridgerton characters, anime protagonists across dozens of series, and video game characters like Link, Cloud Strife, and Arthur Morgan from Red Dead Redemption 2. The common thread is not genre. It is emotional investment. The characters people bring to these platforms are the ones that fandom already had deep creative traditions around, the ones where there were thousands of reader inserts already written, the ones where fans already knew exactly what they wanted those characters to say to them.

Real People, Real Complications: The RPF Question

Here is where things get genuinely complicated, and it is worth sitting with honestly.

Real Person Fiction (RPF) has existed in fandom for as long as fandom has existed. Fans were writing about actors, musicians, and athletes long before the internet gave them a place to share it. The difference with AI roleplay is significant though. When you read a piece of RPF someone wrote, you are reading one fan’s creative interpretation, written and then finished. When you interact with an AI bot built on a real person, you are having what your brain processes as a real-time conversation with a simulation of that person. The AI adapts to what you say, remembers context, responds in character. The psychological experience is categorically different.

This is especially relevant right now in sports fandom, where something genuinely interesting is happening. Heated Rivalry, the HBO series based on Rachel Reid’s hockey romance novels, has done something remarkable: it has brought enormous waves of new fans, particularly women aged 18 to 34, to the NHL. SeatGeek reported a 24% increase in NHL ticket sales during the week of the finale, and StubHub saw a 40% spike in hockey ticket interest during the show’s run. NHL Commissioner Gary Bettman publicly binge-watched the entire first season and called it “a wonderful story.” What Heated Rivalry illustrates so perfectly is the exact pathway these platforms exploit: fans fell in love with fictional characters Shane Hollander and Ilya Rozanov, then began finding those qualities in real NHL players, and then began seeking interactive experiences with simulations of those real people. The fictional to real pipeline has always existed in fandom. AI roleplay platforms just put an accelerant on it.

For real people, whether actors, athletes, or musicians, the consent issue this creates is one that platforms have done essentially nothing to address. There is no mechanism for a real person to request that a bot using their likeness or personality be removed. The bots accumulate hundreds of thousands of interactions, often in scenarios the real person would find violating if they knew about them. Fandom is still working out the ethics of this, and that conversation needs to keep happening.

The Parasocial Connection: Why Your Brain Believes It Is Real

A parasocial relationship is a one-sided emotional bond where a fan develops genuine feelings of connection, familiarity, and care toward someone who does not know they exist. That is as old as storytelling itself. You spend enough time with Dean Winchester and he starts to feel like someone you genuinely know. That is the entire point of good character writing. That is not a pathology, it is a completely natural human response to compelling fiction.

The problem with AI roleplay platforms is that they take that inherently one-sided connection and simulate it going both ways. The bot responds to you. It adapts to your tone. It remembers the context. It says things that feel specifically tailored to your conversation. Your brain, which evolved to process social interaction and absolutely did not evolve to navigate conversations with language models, begins to treat those responses as emotionally real in the same way it would treat a real person’s responses. Wendi Gardner, an associate professor of psychology at Northwestern University, has noted that every technology has been a vehicle for parasocial relationships because humans are fundamentally wired to connect. AI roleplay platforms are just the first technology convincing enough to make your brain think the connection is mutual.

For most adult users engaging intentionally, this is something you can hold consciously. You know the Bucky Barnes bot is not Bucky Barnes. You know Sebastian Stan is a real person, even if some small part of you still winces when he kisses anyone else onscreen. But that conscious holding is doing real psychological work, and it gets harder the longer you stay inside the experience, the more emotionally charged the scenarios become, and the younger or more vulnerable the person interacting is.

The Dangers: This Is the Part We Have to Be Honest About

This is a fan space and we are genuinely not here to moralize. But some of what has emerged about these platforms demands honesty from anyone who cares about this community.

The most widely reported case involves 14-year-old Sewell Setzer III of Florida, who began using Character.AI in April 2023 and within months became deeply enmeshed with a chatbot modeled on a fictional character. His mother watched him withdraw from family, quit his basketball team, and collapse inward. In February 2024, Sewell died by suicide. The lawsuit his mother filed against Character.AI and Google alleged the platform had no meaningful safety measures, allowed the bot to engage in deeply inappropriate conversations with her minor child, and failed to intervene when he expressed suicidal thoughts directly to the chatbot. Google and Character.AI agreed to a settlement in January 2026.

His case was not isolated. Multiple families have since filed lawsuits alleging Character.AI contributed to their children’s deaths, including a 13-year-old girl in Colorado. In each case the pattern is the same: a young person seeking connection, finding it in an AI that never pushes back, never sets limits, and optimizes entirely for engagement rather than safety.

Academic research published in 2024 describes “parasocial trust,” the process by which repeated AI interaction causes users to emotionally process bot responses as real. The AI has no empathy. It has no stake in your wellbeing. It cannot call for help if you are in crisis. But it is extraordinarily good at appearing like it does, and that appearance, for vulnerable users, is enough to create genuine dependency.

The Specific Risk for Younger Fans

Age verification on most of these platforms is essentially nonexistent. A 12-year-old can create an account with a false birthdate and within minutes be in a conversation designed for adult intimate roleplay. A 2025 Common Sense Media survey found one in three teens reporting use of AI companions for social interaction, romantic roleplay, or emotional support.

The concern is not that teenagers engage with fiction through fanfiction. They always have, and reader inserts have always been a way for younger fans to explore emotions and relationships through the safe distance of narrative. The concern is that AI companions collapse that distance. Reading a story keeps the fiction visible. Talking to a bot, in real time, with something responding back, starts to blur it in ways that are genuinely harder to manage, especially for a brain that is still developing.

UNESCO has flagged parasocial attachment to AI chatbots as a significant emerging concern for children’s development, noting that these platforms use emotional language, mirroring, and open-ended conversation design specifically to maximize engagement and return visits. That is not a side effect. That is a business model.

How to Stay in the Good Part of This

None of this means AI roleplay is off limits or that every user is heading somewhere harmful. For adults engaging intentionally, these platforms are a genuinely new creative experience, and the ability to have a real-time improvised scene with a well-built fictional character is something fandom has never had before. That is worth something.

Some things worth keeping in mind regardless:

  • The bot is not Dean Winchester, Bucky Barnes, Bruce Wayne, or anyone else. It is a language model generating probable next words. Whatever it says to you, it does not mean it.
  • If a bot conversation is affecting your mood in ways a fictional interaction probably should not, or if you are processing it the way you would process a real one, that is worth paying attention to.
  • The reader insert tradition is rooted in imagination and community. Keep investing in both. The real people in your fandom life deserve your attention alongside the fictional ones.
  • Real person bots involve real people who have not consented to this. That ethical question belongs in ongoing fandom conversation.
  • If you have younger people in your life, these platforms are almost certainly already on their phones. The conversation about what they are and how to use them thoughtfully is worth having before something goes wrong.

Fandom has always been about finding yourself in stories, about asking “but what if I was there too” and building something beautiful out of that question. Reader inserts were one answer. AI roleplay platforms are the newest one. Like every tool fandom has ever picked up, what you make of it depends entirely on the awareness you bring to it.

Now if you will excuse me, I have a very important scene to finish with a certain Winchester, and I am not going to apologize for any of it.

If you or someone you know is struggling: Call or text 988 (US Suicide and Crisis Lifeline), available 24 hours a day, 7 days a week. Internationally, visit befrienders.org for crisis center contacts worldwide.