Kaiser Health News
An AI Chatbot May Be Your Next Therapist. Will It Actually Help Your Mental Health?
by Elisabeth Rosenthal
Wed, 17 May 2023 09:00:00 +0000
In the past few years, 10,000 to 20,000 apps have stampeded into the mental health space, offering to “disrupt” traditional therapy. And with the frenzy around AI innovations like ChatGPT, claims that chatbots can provide mental health care are proliferating.
The numbers explain why: Pandemic stresses led to millions more Americans seeking treatment. At the same time, there has long been a shortage of mental health professionals in the United States; more than half of all counties lack psychiatrists. Given the Affordable Care Act’s mandate that insurers offer parity between mental and physical health coverage, there is a gaping chasm between demand and supply.
For entrepreneurs, that presents a market bonanza. At the South by Southwest conference in March, where health startups displayed their products, there was a near-religious conviction that AI could rebuild health care, offering apps and machines that could diagnose and treat all kinds of illnesses, replacing doctors and nurses.
Unfortunately, in the mental health space, evidence of effectiveness is lacking. Few of the many apps on the market have independent outcomes research showing they help; most haven’t been scrutinized at all by the FDA. Though marketed to treat conditions such as anxiety, attention-deficit/hyperactivity disorder, and depression, or to predict suicidal tendencies, many warn users (in small print) that they are “not intended to be medical, behavioral health or other healthcare service” or “not an FDA cleared product.”
There are good reasons to be cautious in the face of this marketing juggernaut.
Decades ago, Joseph Weizenbaum, a professor at the Massachusetts Institute of Technology who is considered one of the fathers of artificial intelligence, predicted AI would never make a good therapist, though it could be made to sound like one. In fact, his original AI program, created in the 1960s, was a psychotherapist named ELIZA, which used word and pattern recognition combined with natural language processing to sound like a therapist:
Woman: Well, my boyfriend made me come here.
ELIZA: Your boyfriend made you come here?
Woman: He says I’m depressed much of the time.
ELIZA: I am sorry to hear that you are depressed.
Woman: It’s true. I am unhappy.
ELIZA: Do you think coming here will help you not to be unhappy?
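That exchange hints at how little machinery is involved. The sketch below is a loose, modernized illustration of ELIZA’s approach, not Weizenbaum’s original script: a few hypothetical patterns that match keywords in the user’s sentence and reflect the words back as a question.

```python
import re

# A minimal, hypothetical sketch of ELIZA-style pattern matching.
# Weizenbaum's original program used a much larger script of
# decomposition and reassembly rules; these three are only illustrative.
RULES = [
    (re.compile(r"\bmy (.+) made me come here", re.I),
     "Your {0} made you come here?"),
    (re.compile(r"\bi'm (\w+)", re.I),
     "I am sorry to hear that you are {0}."),
    (re.compile(r"\bi am (\w+)", re.I),
     "Do you think coming here will help you not to be {0}?"),
]

# Swap pronouns so reflected phrases read naturally ("my" -> "your").
REFLECT = {"my": "your", "me": "you", "i": "you", "am": "are"}

def reflect(phrase: str) -> str:
    return " ".join(REFLECT.get(word.lower(), word) for word in phrase.split())

def respond(utterance: str) -> str:
    # Return the first rule's template filled with the reflected match.
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."  # stock reply when nothing matches

if __name__ == "__main__":
    print(respond("Well, my boyfriend made me come here."))
    print(respond("He says I'm depressed much of the time."))
    print(respond("It's true. I am unhappy."))
```

Even this toy version can reproduce the exchange above, which is the point: the program recognizes surface patterns in text and understands nothing about depression or loneliness.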
Though hailed as an AI triumph, ELIZA’s “success” terrified Weizenbaum, whom I once interviewed. He said students would interact with the machine as if ELIZA were an actual therapist, when what he’d created, in his words, was “a party trick.”
He foresaw the evolution of far more sophisticated programs like ChatGPT. But “the experiences a computer might gain under such circumstances are not human experiences,” he told me. “The computer will not, for example, experience loneliness in any sense that we understand it.”
The same goes for anxiety or ecstasy, emotions so neurologically complex that scientists have not been able to pinpoint their neural origins. Can a chatbot achieve transference, the empathic flow between patient and doctor that is central to many types of therapy?
“The core tenet of medicine is that it’s a relationship between human and human — and AI can’t love,” said Bon Ku, director of the Health Design Lab at Thomas Jefferson University and a pioneer in medical innovation. “I have a human therapist, and that will never be replaced by AI.”
Ku said he’d like to see AI used instead to reduce practitioners’ tasks like record-keeping and data entry to “free up more time for humans to connect.”
While some mental health apps may ultimately prove worthy, there is evidence that some can do harm. One researcher noted that some users faulted these apps for their “scripted nature and lack of adaptability beyond textbook cases of mild anxiety and depression.”
It may prove tempting for insurers to offer up apps and chatbots to meet the mental health parity requirement. After all, that would be a cheap and simple solution, compared with the difficulty of offering a panel of human therapists, especially since many take no insurance because they consider insurers’ payments too low.
Perhaps seeing the flood of AI hitting the market, the Department of Labor announced last year it was ramping up efforts to ensure better insurer compliance with the mental health parity requirement.
The FDA likewise said late last year it “intends to exercise enforcement discretion” over a range of mental health apps, which it will vet as medical devices. So far, not one has been approved. And only a very few have gotten the agency’s breakthrough device designation, which fast-tracks reviews and studies on devices that show potential.
These apps mostly offer what therapists call structured therapy — in which patients have specific problems and the app can respond with a workbook-like approach. For example, Woebot combines exercises for mindfulness and self-care (with answers written by teams of therapists) for postpartum depression. Wysa, another app that has received a breakthrough device designation, delivers cognitive behavioral therapy for anxiety, depression, and chronic pain.
But gathering reliable scientific data about how well app-based treatments function will take time. “The problem is that there is very little evidence now for the agency to reach any conclusions,” said Kedar Mate, head of the Boston-based Institute for Healthcare Improvement.
Until we have that research, we don’t know whether app-based mental health care does better than Weizenbaum’s ELIZA. AI may certainly improve as the years go by, but at this point, for insurers to claim that providing access to an app is anything close to meeting the mental health parity requirement is woefully premature.