AI could fill gap left by shortage of mental health professionals

By Tony Poland, LegalMatters Staff • Chatbots offering mental health support could be “the wave of the future” depending on the help provided and how they are used, says Ontario disability insurance lawyer Courtney Mulqueen.

Mulqueen, principal lawyer of Mulqueen Disability Law Professional Corporation, says artificial intelligence (AI) has evolved to the point where a chatbot may be able to help fill the gap left by a critical shortage of human therapists.

“To be clear, it can be one more tool to help those suffering from a mental health issue,” she tells LegalMattersCanada.ca. “Chatbots cannot replace human interaction, whether we are talking about a psychiatrist, a psychologist or a social worker.

“In my area of practice there are many people in need of a little extra support,” Mulqueen adds. “Because of an increasing demand for mental health services and the shortage of practitioners, these people are waiting months before they can see someone. AI could be an essential bridge that provides help in the meantime.”

Millions struggle with mental illness

She says the Centre for Addiction and Mental Health (CAMH) referenced a World Health Organization study that found about 450 million people struggle with mental illness worldwide.

According to CAMH, only half of Canadians who experience a major depressive episode receive “potentially adequate care.”

“It’s fair to say that most people still don’t have a true understanding of just how big, onerous, and potentially damaging the crisis really is – on both a societal and personal level,” the Centre states. “Mental illness is a leading cause of disability in this country, preventing nearly 500,000 employed Canadians from attending work each week. 

“To make matters worse, the cost of disability leave as a result of mental illness is about double the cost of leave due to physical illness. All in, the economic burden of mental illness in Canada is an estimated $51 billion per year including health care costs, lost productivity and reductions in health-related quality of life.”

Mulqueen says although a shortage of mental health professionals has been frequently chronicled in the media and medical studies, it appears to be a problem with no foreseeable solution.

Light at the end of the tunnel

However, she says artificial intelligence could provide some light at the end of the tunnel, pointing to a recent 60 Minutes report.

The television news magazine states that chatbots are providing mental health support 24/7 on people’s smartphones. But, “like human therapists, not all chatbots are equal. Some can help heal, some can be ineffective or worse.”

60 Minutes spoke with research psychologist and entrepreneur Alison Darcy, who founded Woebot Health.

“We know the majority of people who need care are not getting it,” she told the newsmagazine. “There’s never been a greater need, and the tools available have never been as sophisticated as they are now. And it’s not about how can we get people in the clinic. It’s how can we actually get some of these tools out of the clinic and into the hands of people.”

According to the report, Woebot is specially trained to “recognize words, phrases, and emojis associated with dysfunctional thoughts … and challenge that thinking, in part mimicking a type of in-person talk therapy called cognitive behavioural therapy.”

The company reports that 1.5 million people have used Woebot since it went live in 2017.

Woebot is rules-based, so its responses are largely predictable; chatbots built on generative AI are not, the report noted.
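To make that reported distinction concrete, the following is a minimal, hypothetical sketch of how a rules-based chatbot works. The patterns and responses here are invented for illustration only and are not Woebot’s actual rules; a real system would use clinically vetted content.

```python
import re

# A rules-based bot maps recognized patterns to pre-written responses,
# so the same input always produces the same output. (Hypothetical
# example patterns; not Woebot's actual rules.)
RULES = [
    (re.compile(r"\b(always|never)\b", re.IGNORECASE),
     "That sounds like all-or-nothing thinking. Can you think of an exception?"),
    (re.compile(r"\b(hopeless|worthless)\b", re.IGNORECASE),
     "That's a heavy thought. What evidence supports it, and what contradicts it?"),
]

DEFAULT = "Tell me more about what's on your mind."

def rules_based_reply(message: str) -> str:
    """Return the first matching canned response; deterministic by design."""
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    return DEFAULT

if __name__ == "__main__":
    # Identical prompts yield identical replies, which is the predictability
    # the report attributes to rules-based systems. A generative model, by
    # contrast, samples from a probability distribution over possible words,
    # so the same prompt can yield different, unvetted replies.
    print(rules_based_reply("I always mess everything up."))
```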

Challenge is ‘protecting people from harmful advice’

The challenge for Darcy and those looking to develop these types of chatbots is “protecting people from harmful advice,” 60 Minutes states.

“There are going to be missteps if we try and move too quickly. And my big fear is that those missteps ultimately undermine public confidence in the ability of this tech to help at all,” Darcy says. “But here’s the thing. We have an opportunity to develop these technologies more thoughtfully.”

Mulqueen says a responsibly developed chatbot can serve several functions.

“It can be helpful for people who do not need extensive psychotherapy. People who just need some advice,” she says. “Someone can have questions and get some answers or a referral for more help.

“Mental health issues can be stigmatized and some people may feel embarrassed about seeking help. But talking to a chatbot could make them more comfortable about seeking further support,” Mulqueen adds. “It can also provide some relief as people wait to see a psychiatrist or psychologist in person.”

While chatbots are a useful tool, they should not be viewed as the “be all, end all,” she says.

“It is unreasonable to think a chatbot would be able to treat severe depression or bipolar disorder, for example,” says Mulqueen. “That takes ongoing therapy and specific medical treatment.

Risk of doing additional harm

“As well, there is the risk of people receiving incorrect advice that could cause additional harm,” she adds.

Mulqueen says employers could embrace AI as a way to address mental health issues in the workplace.

“Many companies offer Employee Assistance Programs that provide a set number of therapy sessions,” she says. “This could be another tool. For instance, an employer may identify someone who is feeling stressed or anxious. The issue has not developed into a long-term disability, and they might benefit from the extra support that chatbot therapy could provide.

“Insurance companies may even find AI to be a cost-effective way of supporting a disability claimant in their return to work.”

However, Mulqueen says she doubts insurers would be willing to rely solely on chatbots any time soon.

“We are likely far off from that,” she explains. “In practice, if your condition is such that you are unable to work due to a mental health issue, insurance providers are going to expect that you have seen a psychiatrist or been referred to one, or that you are getting ongoing, recognized therapy.

‘Insurers want to see a diagnosis and treatment plan’

“Insurers want to see a diagnosis and treatment plan from a medical practitioner to ensure the claimant is satisfying the terms of their policy and receiving the appropriate care,” Mulqueen adds. “But even using a chatbot could conceivably prove to the insurance company that people are doing everything possible to get better, which they have a duty to do. This could show that they are not simply sitting back waiting for an insurance payout.”

In the end, there need to be safeguards in place to ensure people are suitable candidates for AI mental health support, she says.

“It is clear that with a shortage of treatment providers, we must adapt the way we have been traditionally doing things,” Mulqueen says. “But with the right checks and balances chatbots can perhaps be the wave of the future in mental health care, bridging a gap for people who are suffering.”