AI in therapy: Expert warns replacing ‘human touch’ with technology could miss the mark

Imagine a chatbot, itself incapable of feeling emotions, asking its human patient, “How do you feel about that?”

It could soon be a reality for patients seeking mental health care, depending on the path they choose. For some, perhaps, it already is. 

The rise of A.I. in recent years has forever transformed the human experience – infiltrating the workforce, helping design new products and even aiding physicians by recognizing certain ailments before humans might notice them on their own. 

Now some A.I. platforms are promoting software that claims to be a pocket therapist, which invites questions: Can the technology connect with humans the way other humans can? Can it tailor support to each individual? Can it handle more complex or even life-threatening emotional needs? Or could people misuse it to self-diagnose or avoid professional care in the long run?

Depression is one of the most prevalent mental health issues people face, and the problem is on the rise. (iStock)

“While A.I. has made significant strides in understanding and processing human emotions, replicating the genuine human touch, empathy, and emotional connection of a counselor is a profound challenge. The subtleties of human communication and empathy are difficult to encode into algorithms,” Sergio Muriel, a licensed mental health counselor, certified addiction professional and COO of Diamond Recovery Group in Florida, told Fox News Digital.

People suffering from conditions like depression or anxiety might turn to the technology out of fear of judgment, hoping to avoid the stigma society commonly attaches to mental health care.

All it takes is a simple Google search for a “chatbot therapist” or an “A.I. therapist,” and the results populate with apps like Wysa, which bills itself as an anxiety and therapy chatbot, or Elomia Health’s mental health chatbot.

Artificial intelligence has transformed the way humans think about technology and go about their lives, ranging from school and work to the products they consume. (iStock)

Muriel thinks the rise of A.I. in mental health care can yield benefits both for those seeking care and for those providing it, but he says it should be used with caution and as a complement to professional care from an experienced human.

“It’s an exciting evolution, offering new pathways for support and intervention,” he said. “The integration of A.I. into mental health care has potential benefits but also requires caution. A.I. can offer immediate, anonymous support, making it a valuable tool for those hesitant to seek traditional therapy. However, it’s essential to ensure these technologies are used responsibly and complement, rather than replace, human care.”

As he mentioned, A.I. platforms tout their ability to be available at users’ fingertips 24/7, a point the companies themselves emphasize.

“No appointments or waiting rooms. Instant replies even on weekends and at 4 A.M.,” Elomia Health’s website reads. The company also notes that 21% of its users said they would not have spoken to anyone but the A.I. for fear of being judged.

Safeguards are also built into the platform, including redirecting users to human therapists or crisis hotlines if they appear to need additional help.

Muriel said such A.I.-based approaches can be useful for triaging “low-risk” clients, helping professionals manage their caseloads.

“[It can] offer new insights into mental health through data analysis. It can extend the reach of mental health services to underserved areas,” he elaborated.

“[But] there’s a risk of over-reliance on AI, potential privacy concerns, and the loss of the nuanced understanding that comes from human interaction. A.I. cannot yet fully replicate the empathy and depth of a human therapist.”

Using such software can also lead to misdiagnosis, and sole reliance on A.I.-powered mental health tools for those with a history of self-harm or suicidal ideation is especially “dangerous,” he continued.

Though some speculate the technology is the future of mental health care, the message he wants everyone to take home is that it should be viewed not as a potential replacement for humans but as just another tool to help make life easier.

“A.I. should at most be a supplementary tool, not a replacement for human care,” he said.
