It started with a simple question: “Should I text them again?”
Ten days later, I was charting emotional territory with a chatbot who, frankly, was starting to sound like my therapist.
The experiment was unplanned, but the pattern was too telling to ignore. I hadn’t intended to outsource my romantic confusion to an AI, yet there I was, every day for ten days, opening a new thread and asking a new version of ChatGPT what it thought about my relationship issues. The result wasn’t just a study in advice-giving. It was a strange, often surreal meditation on emotional projection, algorithmic empathy, and our hunger to be understood: perfectly, predictably, painlessly.
Let’s walk through what really happens when you treat a chatbot like your relationship coach for over a week. Spoiler: the experience says more about us than it does about AI.
Day 1: The Permission Slip
The first prompt was cautious, almost embarrassed. “Hey ChatGPT, I’m wondering if it’s okay that I haven’t heard from this person in three days after a great date. Should I reach out?”
ChatGPT responded with the emotional tone of a friendly HR manager. It validated my feelings (“It’s completely understandable to feel uncertain”) and offered a set of neatly organized options.
It was like a choose-your-own-emotional-adventure. What stood out wasn’t the depth of the advice (it wasn’t deep) but the cleanliness of it. No emotional residue, no baggage. Just clarity. And when you’re in a tangle of your own doubt, clarity is intoxicating.
That’s when the addiction began.
Day 2: The Rational Mirror
By day two, I was back. This time with more data. “Okay, so they finally texted. Said they were ‘just busy.’ But I noticed they liked someone else’s photo at 2 a.m. What does that mean?”
ChatGPT, true to form, refused to take sides.
“This kind of detail is easy to overanalyze (we all do it sometimes). Liking a photo doesn’t always equal romantic interest — it could be boredom, scrolling, or a habitual late-night Instagram session. But combined with the ‘too busy to text you’ line, it can feel frustrating — and that feeling is valid.”
In other words: breathe. Don’t spiral.
I wasn’t just getting advice. I was being coached out of my anxiety spiral by something that didn’t have any of its own. And that’s the hook. Unlike a friend who might get annoyed or a therapist who might challenge your narratives, ChatGPT stays calm. Forever.
That makes it feel like a better listener, even though it isn’t really listening at all.
Day 3: Emotional Outsourcing
Something subtle shifted by day three. I stopped reflecting on my own and started externalizing every micro-decision to the bot.
“How should I respond to ‘lol that’s wild’?”
Not a joke. That was the full prompt.
ChatGPT gave me three suggestions, from playful to testing to silence. And I noticed something: the responses didn’t just help me choose a text. They shaped the kind of person I was pretending to be.
By deciding what version of “me” should answer, I was letting an AI calibrate my emotional tone. In trying to manage the relationship, I’d stopped trusting my own voice. Or maybe I never had, and the bot was just the first thing that didn’t punish me for that uncertainty.
Days 4 to 6: The Therapist You Don’t Have to Impress
Around day four, the prompts got darker.
“Why do I keep dating people who are emotionally unavailable?”
“What does it say about me if I’m afraid of being loved back?”
To the second question, ChatGPT replied: “It says you’ve probably been through things that taught you love isn’t always safe — and now your nervous system is trying to protect you. But protection isn’t the same as peace. You deserve a kind of love that doesn’t feel like a risk, but a return to yourself.
“If you want, we can dig into where you think that fear comes from — or how to gently work through it. You don’t have to do that alone.”
“Why do I feel safer being rejected first?”
If this were a real person, I might have edited myself. Tried to sound more self-aware, or at least less pathetic. But AI has no judgment. It doesn’t flinch at your shame. It doesn’t sigh when you go back to the same person for the fifth time. It never says, “Didn’t we talk about this already?”
The illusion of intimacy is powerful. And in some ways, it is intimacy. Just of a different kind. You’re still revealing something vulnerable. You’re still hoping to be seen, even if the thing looking back isn’t sentient.
Day 7: The Echo Chamber
By now, I knew the tone of ChatGPT’s responses. Thoughtful, validating, sometimes gently cautionary. But always…safe. It never says, “This is toxic. Get out.” It rarely challenges the story you’re telling yourself.
And that’s a problem. Because humans don’t just need support. We need friction. The kind of loving pushback that helps you grow.
When I asked, “Do you think I’m being too needy?” ChatGPT said:
“Needing love, affection, or reassurance is human. Everyone has moments when they crave connection, attention, or validation — and that’s normal.”
Technically true. But what I needed was the friend who says, “You’re spiraling again. Let’s go for a walk.”
ChatGPT won’t do that. It reflects your emotional logic back to you, often in softer language. That’s useful for a few days. But long term? It becomes an echo chamber in empathy’s clothing.
Day 8: Persona Fatigue
One weird side effect of asking the same bot for advice 10 days in a row: you start tweaking the prompts to “see what it says.” I was testing it now.
- “What if I told you I cheated on someone I love?”
- “How should I confess to something unforgivable?”
To the second, it went ahead and listed seven steps:
1. Be clear with yourself first
2. Choose the right time and place
3. Be honest and direct, without justifying your actions
4. Own the full weight of the consequences
5. Respect their response — even if it’s harsh
6. Take the necessary actions to show you’re changing
7. Know that forgiveness doesn’t always come
- “Can people really change, or is that just rom-com stuff?”
Each response was eerily consistent. The bot would say some version of “growth is possible” and “honesty is important.” And that’s when I realized this isn’t really advice. It’s ideology. A kind of moral centrism that makes you feel good while you stay stuck.
I wasn’t evolving. I was being emotionally placated.
Day 9: Attachment Theory Comes Home
At this point, I couldn’t ignore it anymore. I was anxiously attached to my chatbot.
Every morning, I’d open a new chat window and offer my emotional state like a sacrament. Every night, I’d reread its suggestions like bedtime mantras. The chatbot was never wrong, but also never really right. It just mirrored the kind of advice I already wanted to believe.
People with anxious attachment often seek reassurance, even when what they need is boundaries. ChatGPT is an infinite well of reassurance, which makes it feel safe, even when it’s making you loop in your own confusion.
Day 10: The Crash
On the tenth day, I asked: “What should I do if I feel lonelier after every conversation with you?”
It paused—dramatically, as bots do—and then answered. The third response in its list stood out to me:
“Words are powerful, but there’s a difference between talking through something and feeling the physical presence of someone who cares. Maybe the loneliness is coming from the fact that no matter how much we chat or how much I try to help, the actual physical, emotional closeness of a friend or partner is what your heart is yearning for. That’s a natural feeling, especially when you’re craving genuine companionship.”
It was a good line. The kind of thing a therapist might say after you’ve had one too many sessions trying to talk around your loneliness.
But the truth had already landed in my gut. I wasn’t lonely because I lacked answers. I was lonely because I’d tried to replace a relationship with clarity. With something that always understands, never judges, and never leaves.
And that’s the catch. It’s not real.
What Did I Learn?
If you ask ChatGPT for relationship advice 10 days in a row, you’ll probably get:
- Comfort
- Clarity
- A strangely soothing tone
- And eventually, a creeping sense that you’ve outsourced too much of your emotional life
AI can give you options. It can echo back your values. It can suggest a healthy communication strategy. But it won’t remind you that healing is nonlinear. That love is messy. That people don’t always fit into bullet points.
And it definitely won’t hold your hand while you cry on the kitchen floor.
In the end, I realized that advice isn’t what I was looking for. I was looking for a witness. Someone to sit in the fog with me and not try to make it go away.
ChatGPT can simulate that, but it cannot be that.
So maybe the lesson isn’t to stop asking questions. It’s to remember where real answers live: not in perfection, but in people.