The New Red Flag Is Asking ChatGPT Before Asking Your Partner


Photo: Andrea Piacquadio on Pexels

Something shifted quietly in the last few years, and most of us barely noticed it happen. The moment we started treating our phones as the first stop for emotional clarity, a subtle rewiring began. We stopped turning toward the person across the table and started turning toward a chat window. Not because we don't love our partners, but because the chat window doesn't sigh, doesn't get defensive, and never makes us feel guilty for bringing something up at the wrong time.

A 2025 survey of 1,000 married Americans found that 64% of couples turn to AI tools for relationship advice before turning to each other. That sounds almost too strange to be true until you think about the last time you typed something into ChatGPT that you probably should have just said out loud. The behavior is common enough now that it deserves an honest look, not because using AI is inherently wrong, but because of what it quietly signals when we do it reflexively and repeatedly.

The Comfort of a Witness Who Can't Hurt You

The appeal is obvious once you name it. AI is fast, available around the clock, and costs nothing compared to therapy. More than that, it offers something rare: a space to test your thoughts without consequence. You can describe your partner's behavior in the least flattering terms, work through your anger, and craft your argument with zero risk of escalation. For people who grew up in households where conflict felt dangerous, that safety isn't trivial.

The problem is that what feels like preparation is often substitution. Relationship therapist Lucy Frank has noted that intimacy is built on reciprocity, the act of being vulnerable and then offering support in return, and that turning to AI removes that emotional give-and-take. When we process a conflict through a chatbot first, we arrive at the actual conversation already resolved, already certain, and a little detached from the messy human on the other side. We've rehearsed a verdict, not a dialogue.

MIT social scientist Sherry Turkle saw this structural problem coming over a decade ago. In Alone Together, she described what she called the Goldilocks Effect: our tendency to engineer relationships that are not too close and not too far, mediated through technology to avoid the emotional risk of real contact. Asking an AI to navigate our most intimate conflicts is that dynamic fully realized.

When Avoidance Gets a Productivity Makeover

One reason this habit is so hard to examine is that it disguises itself as self-improvement. Asking ChatGPT how to phrase a difficult conversation or whether your feelings are reasonable sounds like emotional maturity. Research by Collins et al. (2025) found that most people reported more benefit than risk from using ChatGPT for mental health issues, noting it is particularly effective at helping people word difficult messages. That's real. The tool can help. The danger is when clarity-seeking becomes a routine way of avoiding the discomfort of being uncertain in front of someone else.

The same survey found that 33% of married respondents felt AI tools understand their relationship struggles better than their spouse does. An algorithm doesn't understand anything, but it also doesn't interrupt you, challenge your framing, or bring its own emotional needs to the table. When that feels preferable to talking to your actual partner, the issue isn't the technology. The issue is that the relationship has developed a significant avoidance pattern, and the AI is making it easier to maintain.

The same survey also found that 28% of respondents had made a financial decision based on AI advice without telling their spouse. Financial decisions made alone, conflicts processed alone, emotional labor outsourced to a machine that won't remember the conversation tomorrow. Each feels minor in isolation. Together, they describe a relationship where two people are increasingly solving the problem of each other rather than solving problems together.

What We Lose When We Optimize for Ease

Men make up roughly 85% of ChatGPT's user base and are nearly three times more likely than women to use it for relationship advice, a pattern that tracks with data showing men are significantly less likely to seek therapy or discuss emotional struggles with friends. For a generation that was never taught that vulnerability is a strength, an AI that listens without judgment can feel like a miracle. What it actually offers is practice for a conversation that still has to happen with a real person who might push back.

Among Gen Z adults, 41% report having used AI to navigate their romantic lives. That number makes sense when you consider that this generation has always had the option to edit, delay, or avoid difficult conversations rather than have them in real time. A text version of a vulnerable thing is almost always a smaller version of that thing, and real intimacy requires patience, vulnerability, and a willingness to be inconvenienced. Optimizing that process out of existence doesn't make relationships easier. It makes them shallower.

Using a tool to get clearer on what you feel before a hard talk is reasonable. The red flag is when the chatbot becomes the relationship's de facto therapist and confidant, when the AI knows more about what's going wrong than your partner does. At that point, you're not using technology to get closer to your person. You're using it to maintain a comfortable distance, and calling it communication.