Research shows doctors and clinicians are often unable to accurately predict suicide, which makes an already sad and humbling situation even harder for families and helping professionals. We all want to do our best to help someone in crisis or in need. Enter AI for suicide prevention to give us some hope.
*Note that we do not recommend using AI applications as a method for suicide prevention (e.g., via chatbots). Research and understanding in this area were still at an early stage at the time of this report.
Suicide prediction is difficult for professionals for a number of reasons. Self-harm can be an impulsive and unpredictable behavior. Emotional or mood-based indicators of suicide are often internal and not visible (e.g., a worsening depressed mood). Helping professionals are also merely human, and so are subject to bias. For example, we may be hopeful about a patient or client's progress (nothing wrong with that!), and that very hope can lead us to underestimate risk.
Recent studies suggest AI may provide a more objective and comprehensive view. A prediction model applies the same criteria to every case, without the emotional investment a clinician naturally brings. Studies use this consistency to look across multiple data sources at once and uncover patterns that no single human professional could see in one place.
AI language models typically do this by compiling and analyzing text from collections of treatment notes and social media posts, then comparing that analysis against outcome records to predict risk for suicidal ideation, suicide attempts, and/or suicide completion. This offers a powerful possibility: AI for suicide prevention as an addition to doctor and therapist resources.
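To make the idea concrete, here is a minimal sketch of what this kind of text-analysis pipeline can look like. Everything in it is hypothetical: the note excerpts and labels are invented for illustration, and the model is a deliberately simple stand-in (TF-IDF features plus logistic regression via scikit-learn). Real research systems are far more sophisticated and carefully governed.

```python
# Illustrative sketch only: invented text and labels, not clinical data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical de-identified note excerpts paired with chart-review labels
# (1 = documented suicidal ideation, 0 = none documented).
notes = [
    "patient reports feeling hopeless and withdrawn",
    "mood improved, engaged well in group session",
    "expressed passive thoughts of not wanting to be here",
    "denies ideation, sleeping and eating normally",
]
labels = [1, 0, 1, 0]

# Turn the text into numeric features and fit a simple classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(notes, labels)

# The fitted model can then score new text for risk-related language.
print(model.predict_proba(["states life feels meaningless lately"])[0][1])
```

The key idea is the same one the studies describe: language patterns in the text are matched against known outcomes, so the model can flag similar patterns in new text.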
AI language models appear highly effective at predicting suicidal ideation in particular. They appear somewhat less effective at predicting suicide attempts and suicide completion, partly because attempts and completions are much rarer events in study datasets, leaving models with fewer examples to learn from. More research may clarify the picture.
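A quick, invented example helps show why rare outcomes are so hard to predict. At a 1% base rate, a model that never flags anyone still scores 99% accuracy while catching zero true cases, which is why researchers lean on measures like recall rather than raw accuracy. The numbers below are made up purely for illustration.

```python
# Toy illustration of the class-imbalance problem; all numbers invented.
from sklearn.metrics import accuracy_score, recall_score

# 1,000 hypothetical cases, only 10 true attempts (1% base rate).
y_true = [1] * 10 + [0] * 990
# A model that simply predicts "no attempt" for everyone...
y_pred = [0] * 1000

print(accuracy_score(y_true, y_pred))  # 0.99 -- looks excellent
print(recall_score(y_true, y_pred))    # 0.0  -- catches zero attempts
```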
So when can clinicians start using AI to assess suicide risk? Although the research provides a lot of hope, AI is still not perfect. Unfortunately, much more work needs to be done to fully understand and develop clinical AI tools for suicide prevention.
All of this occurs amid considerable controversy, following reported instances of individuals using AI chatbots to obtain information on suicide methods. Many believe AI applications need to do more to refuse such queries and to direct users to suicide prevention resources.
It does not yet appear to be time for helping professionals to use AI to predict suicide. But the research does provide a lot of hope! Hopefully we'll soon see tools that improve risk assessment and prediction using AI for suicide prevention.