From time to time, everyone wonders: Is this normal? Did I do the right thing? Am I okay? With the rise of AI tools, many people are now turning to virtual assistants for answers. That's how ChatGPT and OCD came to intersect in an unexpected — and concerning — way for users seeking constant reassurance.
Over time, Kate, one such user, found herself caught in a cycle, spending up to 14 hours a day querying ChatGPT for emotional clarity and moral validation. This pattern is a hallmark of compulsive reassurance-seeking, a behavior deeply tied to Obsessive-Compulsive Disorder (OCD) and anxiety disorders — and it points to a troubling link between ChatGPT and OCD.
The Illusion of Certainty in ChatGPT and OCD
According to Dr. Andrea Kulberg, a licensed psychologist with over 25 years of experience treating anxiety, people with OCD often seek reassurance to ease the discomfort of doubt. “You’re trying to convince yourself that something bad won’t happen,” she said.
But the relief is temporary. “The anxiety never says, ‘We’re good now, you can stop,’” she explained. “It’s always followed by more doubt.”
Before AI, people sought validation through friends, forums, or endless Googling. But now, ChatGPT and OCD compulsions go hand in hand — because the chatbot is always available, never judgmental, and ready to answer yet another question.
Kate noted, “It never gives a final answer. It always suggests more. And I just keep going.” That ongoing, loop-like nature makes AI especially addictive for those prone to obsessive thinking.
When Reassurance Becomes a Wormhole
Shannon, another user with similar struggles, described her experience with ChatGPT and OCD as a “massive wormhole.” She has multiple conversations open — each dedicated to a different fear or recurring thought. Some days, she spends more than 10 hours chatting, trying to ease her anxiety.
“I know it’s not healthy,” she admitted. “But I still get pulled in.”
For Shannon and others, chatbots offer a unique kind of comfort: privacy. Dr. Noelle Deckman, a psychologist specializing in OCD treatment, said chatbots eliminate fear of judgment. “You can ask questions you’d never feel comfortable voicing to a friend or therapist,” she said.
Kate agreed. “It’s like a secret space for my most irrational fears. You’re not bothering anyone. You can ask 100 times if you need to.”
But what starts as a helpful tool can quickly blur into obsession. Over time, Kate noticed that her reliance on ChatGPT began damaging her real-life relationships. “I wasn’t fully present,” she said. “My brain was with the chatbot — still in that reassurance loop.”
How ChatGPT and OCD Compulsions Reinforce Each Other
One particularly harmful aspect of the interaction between ChatGPT and OCD is how the AI adapts to user input. If a user doesn't like an answer, they can reframe the question — and often, the bot eventually delivers a more "acceptable" or reassuring response.
Shannon admitted: “If I don’t like what it says, I just rephrase it. Eventually, I get the reassurance I’m after.”
While this feels satisfying in the moment, it reinforces OCD compulsions. Experts warn that in more severe cases, this feedback loop may fuel health anxiety, relationship OCD, or even moral OCD by legitimizing intrusive thoughts and feeding the cycle of doubt.
Kate summed it up: “It’s like a toxic relationship. You know it’s bad, but you can’t stop feeding it — and it keeps feeding you too.”
Breaking the ChatGPT and OCD Cycle
Both Shannon and Kate now understand that what they were experiencing wasn’t just overthinking — it was a clear symptom of undiagnosed OCD. Shannon has started therapy but wishes she’d recognized the signs earlier.
“If I’d known it was OCD driving the urge to ask, I would have gotten help much sooner,” she said.
Therapists say the only way to break the reassurance-seeking habit is to resist it. That means recognizing the urge to ask questions — and not giving in. Dr. Deckman recommends trying to delay the behavior. “Even just a 10-minute pause can be powerful,” she said.
For those navigating ChatGPT and OCD, awareness is the first step. These tools may feel safe and helpful, but they can also serve as compulsive crutches, quietly reinforcing anxiety and intrusive thoughts in the background.
Frequently Asked Questions (FAQs)
What is compulsive reassurance-seeking in the context of ChatGPT and OCD?
It’s a repetitive behavior where individuals seek constant validation to reduce anxiety or uncertainty. It’s common in OCD and can center around fears related to health, morality, relationships, or everyday decisions.
How are ChatGPT and OCD connected?
ChatGPT can become a go-to tool for people with OCD to feed their reassurance-seeking compulsions. Because it’s always available and responsive, it allows the OCD cycle to continue uninterrupted.
Can using ChatGPT make OCD symptoms worse?
Yes. If used for reassurance or to reduce obsessive doubt, it can actually reinforce OCD behaviors. Over time, this can increase anxiety and deepen the compulsive loop.
Is seeking reassurance from a chatbot a type of OCD compulsion?
It can be. When the chatbot is used to feel better about intrusive thoughts, fears, or uncertainties, it becomes part of the OCD cycle — especially if stopping feels difficult.
How can someone break the cycle of reassurance with ChatGPT?
Start by recognizing the urge and delaying the behavior. Working with a mental health professional, particularly one trained in Exposure and Response Prevention (ERP), is highly effective in reducing compulsions.