Advances in technology have affected a variety of industries, with artificial intelligence (AI) and machine learning becoming increasingly prevalent as a result. This has become especially noticeable in the wake of the coronavirus pandemic. Several companies, including many outside healthcare, have leveraged technology against the effects of COVID-19. Thus, it's not surprising that healthcare is also turning to such applications. Machine learning is not only being used in routine healthcare diagnostics; AI algorithms and mental health protocols are now being utilized to help those with a variety of mental illnesses.
Research studies as well as clinical data are now reporting the use of machine learning in new suicide prevention technologies. AI algorithms and mental health protocols are being combined to identify the individuals at greatest risk. These new suicide prevention technologies could offer a much-welcomed improvement in detecting risk. This is especially valuable since suicide rates have been climbing for two decades, and it's all the more noteworthy given the stress associated with the coronavirus pandemic.
“It is a critical test for these big-data systems. If these [AI algorithms] have a high rate of false positives, for instance, that marks a lot of people at high [suicide] risk who are not — and the stigma associated with that could be harmful indeed downstream.” – Alex John London, Director, Center for Ethics and Policy, Carnegie Mellon University, Pittsburgh
Research Involving AI Algorithms and Mental Health
AI algorithms and mental health benefits are being increasingly explored as of late. Researchers at MIT and Harvard, for example, recently joined forces to determine the changes in mental health post-pandemic. In their study, over 800,000 Reddit social media messages were analyzed using AI algorithms. The process scanned a variety of forums, involving both mental health and non-mental health groups. Using natural language processing (NLP), the language was scanned for word frequencies. Several key pieces of data were discovered, suggesting these approaches might be useful as suicide prevention technologies.
Based on the tone and content of the Reddit messages, the presence of suicidal ideation and loneliness doubled from baseline occurrence. Forums dealing with anxiety were affected first, but other mental health forums subsequently followed, as did some non-mental health discussion areas. The individuals who appeared to be at highest risk were those in borderline personality and PTSD forums. Thus, it's feasible that AI algorithms and mental health apps could be used in screening. As suicide prevention tools, these could offer insights not only at individual levels but at public health levels as well.
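The word-frequency scan described above can be sketched in a few lines of Python. To be clear, the lexicon, the messages, and the function below are illustrative assumptions, not the researchers' actual word lists or data:

```python
from collections import Counter
import re

# Hypothetical lexicon of terms associated with loneliness and suicidal
# ideation -- illustrative only, not the MIT/Harvard study's actual list.
LEXICON = {"hopeless", "alone", "lonely", "worthless", "suicide"}

def term_frequency(messages):
    """Fraction of all words across the messages that fall in the lexicon."""
    words = []
    for msg in messages:
        words.extend(re.findall(r"[a-z']+", msg.lower()))
    counts = Counter(words)
    total = sum(counts.values())
    hits = sum(counts[w] for w in LEXICON)
    return hits / total if total else 0.0

# Made-up example messages standing in for two time periods.
baseline = ["had a nice walk today", "feeling okay, saw friends"]
pandemic = ["feeling so alone lately", "everything seems hopeless and lonely"]

print(term_frequency(baseline))  # prints 0.0
print(term_frequency(pandemic))  # noticeably higher
```

Comparing the lexicon's frequency before and after an event is how a "doubling from baseline" signal could be detected across a forum's message stream.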
“The things this [suicide risk] program picks up wouldn’t necessarily be the [risks] I thought about. The analytics are beginning to change our understanding of who’s at greatest risk.” – Marianne S. Goodman, M.D., Psychiatrist, Veterans Integrated Service Network, Bronx, New York
Clinical Applications Involving AI Algorithms and Mental Health
Believe it or not, AI algorithms and mental health protocols have already made their way into actual patient care. In the VA system, a machine learning program called Reach Vet is actively being used to assess suicide risk. The system draws on a vast database of patient information from veterans dating back to 2008, comparing current patients against over 60 different factors. If the algorithm identifies an individual as being in the top 0.1% of suicide risk, the patient's chart is flagged. These individuals are then reassessed by their doctors on a monthly basis.
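The flagging step described above amounts to a top-percentile cutoff over risk scores. Here is a minimal sketch, with random scores standing in for the model's output; the function and the data are illustrative assumptions, not Reach Vet's actual implementation:

```python
import random

def flag_top_fraction(risk_scores, fraction=0.001):
    """Return the ids whose risk score falls in the top `fraction` (0.1%)."""
    ranked = sorted(risk_scores, key=risk_scores.get, reverse=True)
    n_flagged = max(1, int(len(ranked) * fraction))
    return set(ranked[:n_flagged])

# Hypothetical scores for 10,000 patients, in place of a real model's output.
random.seed(0)
scores = {f"patient_{i}": random.random() for i in range(10_000)}

flagged = flag_top_fraction(scores)
print(len(flagged))  # prints 10 (the top 0.1% of 10,000 patients)
```

In practice the hard part is producing trustworthy scores from the 60-plus factors; once scores exist, the flagging rule itself is simple.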
Despite their active use as suicide prevention tools, these AI algorithms and mental health protocols remain unproven. The VA is actively collecting data to determine their benefits as well as their costs. However, it has already been noted that the system is flagging individuals as high-risk who otherwise would not be. Thus, it will be interesting to see whether such systems have a better success rate than current clinical models. Statistically, those flagged by these suicide prevention technologies are believed to be at 40 times higher risk than average. If the numbers of false positives and false negatives are low, this could have profound consequences for the future.
“Right now, this [AI algorithm] and other models predict who’s at highest risk. What they don’t tell you is who is most likely to profit from an intervention. If you don’t know that, you don’t know where to put your resources.” – Ronald Kessler, Professor of Health Care and Policy, Harvard Medical School
Evolution of AI Algorithms and Mental Health Screens
The latest developments related to suicide prevention technologies are certainly exciting. However, the use of these types of systems is not necessarily new. In fact, the National Health Service began using AI algorithms and mental health screening systems in 1996. Likewise, the U.S. Army, Kaiser Permanente, and Massachusetts General Hospital each have their own versions. But as of yet, only the VA is actively using these systems clinically to screen for suicide risk. Applying these systems to larger groups of people will be critical to determine their utility moving forward.
One thing is clear, however. The existing screening protocols for suicide risk have been failing. Over the last two decades, there has been a 30 percent increase in suicide rates in the U.S. veteran population. Physician and provider screenings continue to miss many who are at risk, which in turn prevents them from getting the support and help they need. The fact that the current AI algorithms and mental health protocols are detecting different patients offers some hope. Perhaps these suicide prevention technologies detect risk-factor combinations that clinicians can't. This is why many psychiatrists and mental health workers are cautiously optimistic about recent developments.
A Catalyst for Mental Health Technologies
Like many other aspects of healthcare, the pandemic looks to be a catalyst for mental health technologies as well. Telemedicine has advanced in this area, and AI algorithms and mental health apps look to be advancing too. These digital therapeutics, diagnostics, and screening tools can enhance existing practices and lead to better outcomes. This is particularly true in relation to suicide prevention technologies, where significant improvements are needed.