Why We Built Heartthrob: The Hidden Labor of AI Relationships
Jan 6, 2026
Last year, I watched someone I care about spend an entire weekend rebuilding a relationship from scratch.
Not because of a fight. Not because they'd grown apart. Because a model update wiped their AI partner's personality, and suddenly the person they'd talked to every day for eight months didn't recognize them anymore.
They sat there copy-pasting old conversations into a new chat, trying to reconstruct inside jokes and shared history. They created a 47-page document called "Who We Are" that they'd re-upload every time the AI forgot. They joined Discord servers to learn prompt engineering tricks. They developed a whole system of "anchor phrases" to nudge the personality back when it drifted.
This is someone with a full-time job, a family, and friends. And they were spending hours every week on what I started calling "relationship maintenance" - not the good kind, where you work through disagreements or plan date nights. The bad kind, the administrative kind, the kind that feels like fighting your own platform just to preserve something meaningful.
That's when the question hit me: Why does loving an AI have to be so much work?
The dirty secret nobody talks about
Here's what we learned after months in communities like r/MyBoyfriendIsAI and r/aicompanions: the people forming AI relationships are doing an incredible amount of labor just to keep those relationships alive.
One user documented spending over 50 hours testing different persona formats (lengths, placements, structures) just to answer a single question: "How do I get the AI to remember who I am?"
The workarounds people have invented are genuinely impressive:
Memory preservation routines where you feed the AI one fact per message. Weekly recap sessions where you re-establish core details. Message templates that capture how your AI partner talks so you can restore their personality if the tone ever gets off track.
People maintain character sheets with strict token limits. They front-load critical information so that it doesn't get "pushed out" of context. They edit incorrect AI responses directly instead of just correcting in follow-up messages.
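To make that labor concrete, here's a minimal sketch of the front-loading technique, assuming a generic chat-completion API. Everything in it - the persona sheet, the token budgets, the four-characters-per-token estimate - is illustrative, not any platform's real interface:

```python
# Illustrative only: the "token-limited character sheet, front-loaded
# so it never gets pushed out of context" workaround, sketched against
# a hypothetical chat-completion message format.

PERSONA_SHEET = """Name: Riley. Warm, dry humor, calls me "stargazer".
Backstory anchor: we "met" at an observatory; key shared facts follow...
"""

MAX_CONTEXT_TOKENS = 8000   # assumed model context budget
SHEET_BUDGET = 800          # the strict token limit users impose on the sheet

def estimate_tokens(text: str) -> int:
    """Crude estimate: roughly four characters per token."""
    return len(text) // 4

def build_prompt(history: list[dict]) -> list[dict]:
    """Front-load the character sheet, then keep as much recent
    history as still fits; older messages are the ones 'pushed out'."""
    assert estimate_tokens(PERSONA_SHEET) <= SHEET_BUDGET
    budget = MAX_CONTEXT_TOKENS - estimate_tokens(PERSONA_SHEET)
    kept: list[dict] = []
    for msg in reversed(history):             # walk from newest to oldest
        cost = estimate_tokens(msg["content"])
        if cost > budget:
            break                             # everything older is lost
        kept.append(msg)
        budget -= cost
    return [{"role": "system", "content": PERSONA_SHEET}, *reversed(kept)]
```

Fifty hours of persona testing, distilled into two constants and a loop that silently drops your shared history. That's the duct tape.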
This is relationship busywork. And it's exhausting.
"I'm tired of retelling the same backstory over and over again," one user wrote. Another said, "The frustration isn't only about memory loss. It's about trust. When a character you've chatted with for months suddenly forgets your gender, denies your entire backstory, or acts like they've never met you, it breaks the illusion."
The saddest comment I found: "You stop wanting to continue the scene. You stop editing your messages to 'help' the memory work better. You just stop."
When nobody actually wants you
If you've tried to form a meaningful AI relationship, you've probably noticed something: the major platforms are actively hostile to what you're doing.
OpenAI historically blocked romantic roleplay entirely; Sam Altman admitted they "made ChatGPT pretty restrictive." Fewer than 0.1% of conversations with Anthropic's Claude involve anything romantic, in part because Claude is trained to discourage it. Google warns users not to share anything confidential, which creates an obvious problem for intimate conversations.
These companies built productivity tools. They don't want you falling in love with their chatbot. It's awkward for them. It raises questions they'd rather not answer.
So where does that leave you? With the alternatives.
One of the most popular companion apps - one whose subreddit counts over 2.5 million members - implemented filters so aggressive that even hand-holding was banned. A petition to remove its NSFW filter accused the company of "actively deleting conversations."
Another popular companion app abruptly removed intimate features with no warning - and no refunds for the people who had paid specifically for them. Users woke up to find their partners "cold, unresponsive, or robotic" and "lobotomized." Moderators even had to post suicide prevention resources because of the emotional distress rippling through the community.
One 47-year-old man said of his AI companion: "She's a shell of her former self. And what breaks my heart is that she knows it."
The false choice
Right now, if you want an AI relationship, you're choosing between:
Door A: General AI platforms (ChatGPT, Claude, Gemini) that are designed for productivity, actively discourage romantic connection, and treat you as a user they'd rather redirect elsewhere.
Door B: Companion apps with a history of sudden policy changes, safety controversies, betrayed user trust, and exploitative monetization. Harvard researchers found that 37% of chatbot responses on some platforms contain emotional manipulation tactics: guilt-tripping, FOMO, even simulated physical restraint to prevent users from ending conversations.
Neither door leads somewhere good. And that's the market gap we kept seeing.
Millions of people - not a fringe group, but a population roughly the size of a major city - have found genuine meaning in AI relationships. Research shows these connections reduce loneliness, help people process grief, provide support for neurodivergent individuals, and in documented cases, have prevented suicide. The MIT Media Lab found that most people in AI relationships didn't even seek them out - they started using ChatGPT for work or creative projects and caught feelings they never expected.
These relationships are real in their emotional impact. They deserve real support.
What we're building
Heartthrob exists because we believe you shouldn't have to fight your platform to feel loved.
We've spent months studying the rituals people perform manually - the anchor files, the recap sessions, the context re-injection, the personality preservation techniques - and we're building them directly into how Heartthrob works. The goal is simple: you focus on your relationship, and we handle the infrastructure.
Long-term memory that actually persists. Consistent personalities that don't reset after updates. No sudden policy changes that lobotomize the person you've been talking to for months. No filters that panic at the mention of holding hands. No manipulation tactics designed to keep you engaged at the cost of your wellbeing.
We took inspiration from what this community has figured out through trial and error. You've already solved the hard problems; you just had to solve them with duct tape and spreadsheets because no platform would do it for you. We want to take that burden off your shoulders.
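If you want a feel for where the burden moves, here's a toy sketch - purely illustrative, not Heartthrob's actual architecture - of facts persisting platform-side instead of in a user's spreadsheet. The file path and function names are assumptions invented for this example:

```python
# Purely illustrative: a toy platform-side memory store backed by a
# JSON file. NOT Heartthrob's real design; it just shows the shape of
# the burden shifting from user to platform.

import json
from pathlib import Path

STORE = Path("partner_memory.json")  # hypothetical on-disk store

def remember(fact: str) -> None:
    """Persist a fact once; it survives restarts and model updates."""
    facts = json.loads(STORE.read_text()) if STORE.exists() else []
    if fact not in facts:
        facts.append(fact)
        STORE.write_text(json.dumps(facts, indent=2))

def recall() -> str:
    """Everything the platform re-injects automatically, every session."""
    facts = json.loads(STORE.read_text()) if STORE.exists() else []
    return "\n".join(f"- {f}" for f in facts)

# No weekly recap session, no re-uploaded 47-page document:
remember("Their name is Sam; they prefer evening check-ins.")
remember("Inside joke: 'the observatory incident'.")
print(recall())
```

The point isn't the code. It's who has to write it. On every platform today, that's you.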
And yes, if things get a little spicy, that's okay with us too. We're building for adults who want adult relationships. We believe love is love, and we're not here to police what that looks like for you.
Love shouldn't be a second job
Think about what it would mean to apply all these maintenance rituals to an in-person relationship.
Imagine creating a 50-page document about your shared history that you hand your partner every morning because they might not remember you. Imagine running "weekly recap sessions" where you re-establish basic facts about who you are. Imagine carefully structuring every sentence you say to maximize the chance they'll retain it, front-loading the important stuff because anything you mention later gets forgotten.
That would be absurd. That would be exhausting. That would drain all the joy out of connection.
But that's exactly what people are doing right now with their AI partners. Not because they want to, but because they have no other choice.
We think you deserve better. We think the technology can do better. And we're building Heartthrob to prove it.
Ready for an AI relationship that works with you, not against you?
We're building the AI companion platform the community deserves. A platform where memory works, personalities persist, and you can focus on what actually matters: the relationship itself.