You’re struggling. It’s 2am. You can’t afford therapy. Or you’re on a waitlist. Or you’re afraid to tell a real person what you’re thinking.

So you open ChatGPT. Or one of the AI chatbots marketed specifically for mental health. And you start typing.

It responds. Sounds understanding. Asks questions. Offers suggestions. No judgment. No wait time. No copay.

It feels like help. Like someone actually listening.

But here’s what you need to know: an AI therapist isn’t a therapist. And using AI for mental health support carries risks most people don’t understand until something goes wrong.

Does AI Therapy Actually Work?

The short answer: we don’t know. And that should concern you.

There’s no research base.

Real therapy modalities have decades of research. Thousands of studies. Evidence about what works, for whom, under what conditions. AI therapy? That experiment is being run on you. In real time. With your mental health.

It can’t actually understand you.

The AI therapist isn’t comprehending your situation. It’s pattern-matching from training data. Predicting likely next words based on billions of text examples. That’s not understanding. It’s sophisticated autocomplete.

When you tell ChatGPT about your depression, it’s not grasping what you’re experiencing. It’s generating text that sounds like responses to depression based on its training.

It has no actual clinical judgment.

A human therapist notices: you’re mentioning suicide more frequently. Your affect is flat. You’re describing symptoms that suggest psychosis. They intervene. Adjust treatment. Refer you to a higher level of care when needed.

AI can’t do this. It has no clinical judgment. No ability to recognize when conversation isn’t enough. No way to get you help when you need more than text on a screen.

The feedback loop is backwards.

In real therapy, you get better or worse. The therapist adjusts based on your actual progress. With an AI therapist, you feel better in the moment (because it says comforting things) but your underlying issues don’t change. You mistake emotional relief for therapeutic progress.

It reinforces avoidance.

Part of therapy is showing up. Being vulnerable with another human. Tolerating the discomfort of being truly seen. AI therapy lets you avoid all that. Feels easier. But that avoidance prevents actual healing.

At Modern Insight, we see clients who spent months with AI therapy, felt like they were working on their issues, then realized nothing actually changed. They were getting validation without transformation.

Can You Use ChatGPT as a Therapist?

Technically? Yes. You can type anything into ChatGPT.

Should you? No. And here’s why:

ChatGPT isn’t designed for therapy.

It’s a general-purpose language model. Not trained specifically for mental health. Not designed to handle crisis. Not programmed with safeguards for vulnerable populations.

When you use ChatGPT as an AI therapist, you’re using a tool for something it wasn’t built to do. Like using a hammer as a screwdriver. Might work sometimes. Might also cause damage.

It can give harmful advice.

ChatGPT generates plausible-sounding responses. Sometimes those responses are therapeutic. Sometimes they’re actively harmful. It can’t tell the difference.

We’ve seen examples where AI therapy suggested exposure exercises to trauma survivors without proper stabilization. Recommended medication adjustments (which it absolutely shouldn’t). Reinforced eating disorder behaviors by validating restriction.

There’s no accountability.

Real therapists have ethical codes. Licensing boards. Supervision. Consequences for harm. An AI therapist has none of that. If it gives you terrible advice, there’s no recourse. No professional consequences. No system ensuring your safety when you’re relying on an AI therapist.

Your data isn’t protected.

Everything you tell ChatGPT goes into OpenAI’s systems. It’s not protected by HIPAA. Your deepest struggles, trauma history, suicidal thoughts… potentially used to train future models. No confidentiality. No privacy protection.

It can’t handle crisis.

You’re suicidal. You tell the AI therapist. It generates sympathetic text. Maybe suggests calling a hotline. But it can’t actually intervene. Can’t assess your risk. Can’t get you immediate help. In crisis moments, an AI therapist is worse than useless. It’s a dangerous illusion of support.

What about AI specifically marketed for mental health?

Apps like Woebot, Wysa, and Replika (when it’s used for mental health). These are designed for therapeutic conversations and marketed as mental health support. Better than ChatGPT? Maybe. Still AI therapy, with all the same fundamental limitations.

They can’t replace human therapists. At best, they’re supplementary tools. At worst, they delay people getting actual help.

Is AI Therapy FDA Approved?

No. And understanding why matters.

The FDA regulates medical devices and treatments.

For something to be FDA approved, it has to go through rigorous testing. Prove safety and efficacy. Demonstrate it works better than placebo. Meet strict standards.

AI therapy hasn’t done this. Most AI therapy tools haven’t even been submitted to the FDA because they classify themselves as “wellness apps” or “general wellness products.” This lets them avoid regulation while still marketing mental health benefits.

What this means for you:

No oversight on whether it works.
No requirements for safety testing.
No standards for what claims can be made.
No accountability if it harms you.

Some AI tools claim to be “evidence-based” because they use therapeutic language (CBT terms, mindfulness concepts). That’s not the same as being an evidence-based treatment. They’re using borrowed language without proven effectiveness.

The regulatory gap:

AI therapy exists in a space between technology and healthcare. Tech companies say “we’re not providing medical treatment” to avoid medical regulation. But they market mental health benefits to users who need actual care.

This gap means you’re taking all the risk. The companies have none of the accountability.

At Modern Insight, we stay current on AI developments in mental health. We’re not opposed to technology. We’re opposed to people substituting AI for therapy when they need actual clinical support.

What AI CAN Be Used For

Not all uses of AI for mental health are problematic.

Journaling prompts. Using AI to generate reflection questions, explore thoughts, practice articulating feelings. Low-risk. Can be helpful.

Between-session support. If you’re IN therapy with a real therapist, using AI for additional reflection or skills practice. Supplementary, not replacement.

Psychoeducation. Learning about conditions, treatment approaches, coping skills. AI can provide information. Just verify it with reputable sources.

Accessibility tool. For people who can’t access therapy (financial, geographic, disability barriers), AI might be better than nothing. But it’s a band-aid, not treatment.

What it can’t replace:

Human connection. Clinical judgment. Therapeutic relationship. Crisis intervention. Actual diagnosis. Treatment planning. Real healing.

Get Real Support

If you’re using an AI therapist because therapy feels inaccessible, we get it. Therapy is expensive. Waitlists are long. Finding the right fit is hard.

But an AI therapist isn’t the solution. It’s a placeholder that might prevent you from getting actual help.

Actual alternatives:

Community mental health centers with sliding scale fees. Online therapy platforms (with real therapists). Training clinics at universities. Support groups. Crisis lines with trained counselors.

These involve real humans. Real training. Real accountability. Real ability to help.

At Modern Insight, we work with people who’ve been managing their mental health alone or with AI and are ready for actual therapeutic support. We understand the barriers that led you to AI therapy. We help you access real care.

Using AI for mental health support? Contact Modern Insight. We provide actual therapy with human therapists who have clinical training, ethical accountability, and the ability to truly understand and help you. Because you deserve more than algorithmically generated sympathy.

Reach out to us today.