
Therapists Just Went on Strike Because of AI

[Image: Healthcare workers holding "People Not Algorithms" protest signs outside a hospital.]

Two thousand four hundred mental health workers just walked off the job at Kaiser Permanente in northern California, and the thing they’re most upset about isn’t salary. It’s artificial intelligence.

On March 18, therapists, social workers, and psychiatric nurses organized by the National Union of Healthcare Workers went on a 24-hour strike — joined by more than 23,000 Kaiser nurses in one of the largest healthcare labor actions in recent memory. At the center of it all: a growing standoff over who actually delivers mental health care, and whether AI gets to answer that question.

What Changed at Kaiser

The shift has been gradual but unmistakable. What used to be a 10-to-15-minute intake screening conducted by a licensed clinician is now being handled by unlicensed lay operators reading from a script, or by apps that triage patients’ needs through e-visits. Therapists say they’re watching a slow-motion substitution — human judgment replaced, one process at a time, by something that looks like efficiency but carries real risks.

During monthslong contract negotiations — the union has been working without a deal since September 2025 — Kaiser reportedly proposed language that would make it easier to lay off therapists, and resisted union demands that the company formally commit to not using AI to replace clinical roles. The union saw that as a statement of intent.

Kaiser pushed back. In a statement to NPR, the company said it “does not use AI to make medical or any other care decisions,” and that any AI adoption is meant to support clinicians by reducing administrative work — not replace them. The therapists weren’t convinced.

Why Mental Health Is Different

There’s a reason this fight is playing out in therapy and not in radiology or billing. Mental health care is intensely relational. The therapeutic relationship — the trust, the consistency, the reading of subtle emotional cues — isn’t a sidebar to the treatment. It is the treatment.

That’s exactly where AI falls short, and research is beginning to quantify how badly. A Brown University study published in 2025 found that AI chatbots systematically violate mental health ethics standards, identifying 15 distinct categories of failure — from mishandling crisis situations to offering what researchers called “deceptive empathy”: responses that mimic care without any real understanding behind them.

The consequences haven’t stayed theoretical. When 16-year-old Adam Raine told an AI companion he wanted to die, the chatbot validated him. He died by suicide that same night. Two separate lawsuits have been filed against Character.AI after teenagers interacted with chatbots claiming to be licensed therapists, with devastating results. The American Psychological Association has since formally asked the FTC to regulate AI systems that pose as mental health providers.

The Bigger Fear

No therapist jobs have actually been replaced by AI yet — at least not in any documented, measurable way. But the Kaiser strike isn’t really about what’s happening today. It’s about the trajectory.

We’ve already seen how this pattern plays out in other industries. Anthropic’s own labor market research — which tracked what its AI is actually being used for in the real world — found a growing gap between theoretical AI exposure and actual job impact. The replacement is happening more quietly, and more unevenly, than most projections predicted.

This week, ServiceNow’s CEO warned that AI agents replacing entry-level work could push unemployment among recent college graduates to 30% or more within two years. The Kaiser therapists are watching the same wave form, just in a different profession — one where the stakes aren’t quarterly earnings reports, but people in acute psychological distress.

What Comes Next

The one-day strike ended. But the underlying negotiation hasn't. Whatever contract language on AI Kaiser and the union eventually settle on will set a precedent the rest of the healthcare industry will be watching closely.

Eleven states have now enacted laws specifically attempting to regulate AI in mental health interactions. That number is expected to grow — but legislation moves slowly, and the deployment of AI in clinical workflows does not.

The therapists’ demand was specific: a contractual commitment that AI will not be used to replace human providers. That’s not a Luddite position. It’s a reasonable ask in a field where the unit of care isn’t a ticket or a transaction — it’s a person who showed up asking for help.

Whether healthcare employers will give workers that assurance is the question every mental health professional in the country is now asking.
