Nancy De Jesus, RPh-IPA, MSc, MPH, PhDs; Mobile Crisis Unit and Research Intervention Coordinator, CNRS – INSERM – WHOCC France
One evening, in a quiet suburb near Paris, a mobile crisis unit received a digital alert. An 82-year-old man was flagged “moderate risk.” No major history. No red flags. A routine visit, according to the system. In reality, he was curled on the floor: dehydrated, disoriented, barely able to speak. The algorithm hadn’t malfunctioned. It had done exactly what it was built to do: assess inputs, cross-check history, calculate risk. But it couldn’t detect the tremors. Hear the hesitation. Sense the quiet accumulation of isolation. This is the paradox we face in psychiatric crisis care: the more precise our tools become, the more we risk overlooking the signals that matter most.
The Blind Spots of AI
Digital tools promised faster, smarter, more connected care. And in some ways, they have delivered impressively. But they have also introduced a dangerous illusion: that if we track enough metrics, we will fully understand the human behind the screen. In practice:
- A young woman with bipolar disorder sends overly polite emails at the onset of mania. The system flags “improved engagement.”
- An autistic adult, institutionalized for years, tells the system what it wants to hear. His file reads “stable.”
- An elderly widow’s home shows deep self-neglect: dust, expired food, unopened mail. Yet her self-assessment logs read “compliant.”
These are not bugs in the system. They are features of a design that was never built to listen.
The Human Cost of Digital Efficiency
In a recent survey, 60% of psychiatric nurses reported spending a third of their shifts on documentation. Digitalisation isn’t just missing patients; it’s draining professionals:
- Veteran staff are quitting, citing burnout and hours spent on fragmented platforms.
- Suicide attempts have been missed… not from clinician negligence, but because patient warning signs didn’t appear in the data.
The irony is striking: technology meant to enhance connection too often sidelines the human touch that is central to mental health care. Psychiatric work has become dashboard management. We are no longer just health and care providers; we have become data clerks, ticking boxes in systems that barely reflect the complexities of the field. Spending more time clicking than caring isn’t just a complaint – it’s a warning.

Is there a better way forward? We are not anti-tech, but we question the illusion that care can be automated. Here’s what we’ve learned from the field:
- Co-design with frontline workers: Build systems with us, not for us.
- Train for judgment, not just compliance: Teach caregivers to see beyond alerts, and trust them when they say, “The system says X, but I feel Y…”
- Redefine success: Prioritize connection, not just speed. Ask: Did this prevent a crisis? Did the patient feel seen? Did the caregiver feel supported?
- Measure what really matters: Stop counting how fast we respond or how many boxes we tick. What about fewer relapses? More trust? Fewer staff breakdowns? Those things matter too.
- Protect time for presence: Sometimes healing means just being there. No form, no app, no checkbox… just presence.
Don’t Mistake the Tool for the Treatment
EPHA’s (2025) policy paper captures the growing crisis in the health and care workforce – a reality we see daily on the ground. Burnout is rising and teams are stretched thin. Technology won’t solve this on its own, but designed right, it could actually lighten the load. That means tools that connect instead of isolate, that listen rather than merely record – systems that recognize care isn’t only about data points but about what is essential: presence, empathy and nuance.

In psychiatry, the most critical symptoms don’t appear on a dashboard. They emerge in silences, in subtle changes of tone, in hesitations, in what is not said. Digital mental health without humanity is just expensive machinery. As Europe’s technological health ambitions grow – and rightly so – we must ground them in real-life frontline experience. Otherwise, we risk building systems that look smart but feel cold.
P.S.
Looking back, let’s not forget: some wounds are invisible, and some cries for help are silent… That old man near Paris didn’t need a better app. He needed someone to notice him and hold his hand.
Disclaimer: The opinions – including possible policy recommendations – expressed in this article are those of the author and do not necessarily represent the views or opinions of EPHA. The mere appearance of the article on the EPHA website does not imply endorsement by EPHA.