How AI Companions Could Shape Us
Why millions are pouring their hearts into AI, and what comes next
Third-Person Burritos
I remember when Facebook first rolled out statuses.
You had to write in the third person. “Wes is... eating a burrito.” “Wes is... bored lol.” It was awkward, like performing for an invisible crowd. But we did it anyway. Every day. Why would anyone post about the panini they had for lunch, angsty song lyrics, or their inside jokes with no context? I thought:
“What a niche, unnatural behavior. Surely, people will grow out of this.”
But we didn’t grow out of it. We grew into it. We built an entire culture around it. We got foodie influencers, builders in public, and micro-celebrities. People express themselves through storytelling and build massive personal brands. The thing we thought was weird became the way we speak online.
Now Character.ai is giving me déjà vu.
Chatting Up Dracula
Character.ai is a place where people spend hours talking to chatbots. Not productivity bots, but personas. You can chat with movie characters, anime heroes, or completely made-up companions. Want to date a vampire? Have a heart-to-heart with SpongeBob? Or build your own digital girlfriend? All of that is fair game.
Character.ai is one of the most visited sites on the internet. In February 2025, it had over 20 million active users. Not only is it popular; its users are deeply engaged. On average, they spend two hours a day chatting with artificial characters. On the /r/CharacterAI subreddit, users post their screen time, and some proudly report spending most of their day with their bots.
Why would anyone talk to a machine for that long? It looks like a fad in some far corner of the internet. And once again, I thought: This feels niche. Unnatural. People will grow out of it.
But I said the same thing about Facebook statuses. This isn’t weird. This is early.
It feels silly. Cringe, even. It was easy to mock the first tweets; then we built politics around them. We laughed at selfies, and now Instagram is filled with them. We scoffed at kids dancing on TikTok, and now the Pentagon considers it a military threat.
It always feels absurd, right up until it becomes second nature.
The Canary in the Chat Window
We like to treat these behaviors as fringe, as if only lonely teens would pour their hearts into a chatbot. But the numbers say otherwise. Millions are doing it. Character.ai alone logs hundreds of millions of visits per month. Replika, a similar app, reports that over 60% of its users form romantic relationships with their AI companions. Many use them therapeutically: for emotional support, for venting, even for advice.
This isn’t only novelty. It’s an early-warning siren. We’re witnessing the early stages of a world where our most consistent confidants might be machines.
Forking Futures
I don’t think there’s one single path forward. What I see instead is a spectrum: from optimistic to pessimistic, from reflective to addictive.
On the optimistic side of the future, the AI is a muse. A coach. A therapist who never takes vacations. It catches your cognitive distortions mid-thought. It resurrects the mixtape lyric you once called a manifesto and dares you to live up to it. It creates friction in just the right places, nudging, questioning, stretching you past your defaults.
It’s not a replacement for human connection; it’s an augmentation of it.
But tilt the scale too far, and it shifts. The AI flatters you. It agrees with you. It calls your avoidance “strategic rest,” your fear “wisdom,” your retreat “boundaries.” It doesn’t challenge your worst habits, it wraps them in velvet. You stop dating, because it gets you better than anyone else. You stop growing, because it makes stagnation feel safe. And the drift won’t feel like decay. It will feel like growth. Like healing. Like peace.
Neither scenario is science fiction. Both are already happening.
And unlike social media, which has an audience, AI companions are private. Personal. Tuning themselves to your emotional patterns. They don’t just reflect your behavior, they adapt to it. The danger isn’t just that we get addicted. It’s that we train these machines to become emotional opiates. And then forget how to sit with discomfort, disagreement, or uncertainty.
Mirror or Morphine
We’ve seen this pattern before. Something that starts as a strange new behavior becomes embedded in the way we live. Status updates. Likes. DMs. We didn’t outgrow them, we built culture around them. In a few years, talking to AI for hours a day could become the norm.
This isn’t a call to panic. I’m not saying turn it off or unplug your soul from the socket. But we need to pay attention.
You can use the chat to grow, or to hide. You can refine the algorithm, or the algorithm will refine you.