There is a new rash of AI apps for therapy session notes on the scene. Transcription software for mental health and medical professionals has been around for a while, but newer apps use AI (of course…) to listen in on an entire session and then pump out notes at the end, all ready to go, for therapist convenience.
Magic? There must be little gnomes listening, turning cranks in the machine….
These apps are part of a new subindustry of AI in mental health diagnosis and treatment. They build on traditional dictation software with added multi-voice recognition, then use AI language models to summarize the important topics discussed in therapy. Apps like Mentalyc, DeepScribe, and Lindy boast the ability to produce notes in specific commonly used formats (e.g., SOAP notes).
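None of these vendors publish their internals, but the general shape of the pipeline is easy to imagine. Here is a minimal, purely illustrative Python sketch, assuming a speaker-labeled transcript is already in hand; the `call_llm` stub and the SOAP prompt are hypothetical stand-ins for whatever model and prompt a real product uses, not any vendor's actual API.

```python
# Illustrative only: the general "diarized transcript -> language model -> SOAP note"
# pattern. `call_llm` is a hypothetical stand-in, not any vendor's real API.
from dataclasses import dataclass

@dataclass
class Utterance:
    speaker: str  # "therapist" or "client", as labeled by multi-voice recognition
    text: str

SOAP_PROMPT = (
    "You are a clinical documentation assistant. From the session transcript "
    "below, write a SOAP note (Subjective, Objective, Assessment, Plan).\n\n{transcript}"
)

def call_llm(prompt: str) -> str:
    # Stand-in for a real model call (hosted API or local model).
    return "S: ...\nO: ...\nA: ...\nP: ..."

def soap_note(transcript: list[Utterance]) -> str:
    # Flatten the speaker-labeled turns into plain text and hand it to the model.
    lines = "\n".join(f"{u.speaker}: {u.text}" for u in transcript)
    return call_llm(SOAP_PROMPT.format(transcript=lines))

demo = [
    Utterance("therapist", "Last time we set up thought records for those spirals."),
    Utterance("client", "I used one at work when the panic started building."),
]
print(soap_note(demo))
```

The point of the sketch is just that there are no gnomes: multi-voice recognition turns audio into labeled turns, and a language model condenses those turns into a templated note.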
SimplePractice and TherapyNotes even promote integrating notes into a whole practice management system. Adjacent healthcare fields, like medicine, have already started to adopt similar apps; you may have seen one in action if your doctor has used it with you lately.
Some dictation apps are designed to understand the context of mental health in particular, identifying the specific approaches a clinician is using in therapy. For example, working with clients on challenging negative thoughts might allow the AI to recognize that a therapist is using CBT, or mindfulness and breathing techniques may be picked up and written into the note.
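How that recognition works under the hood isn't public either, but in spirit it is a tagging problem. The toy sketch below uses made-up keyword lists purely to show the idea; real products presumably rely on trained language models rather than anything this crude.

```python
# Toy illustration of spotting a therapeutic modality in a transcript.
# The cue lists here are invented for the example, not taken from any product.
MODALITY_CUES = {
    "CBT": ["negative thought", "thought record", "cognitive distortion", "reframe"],
    "Mindfulness": ["breathing exercise", "body scan", "grounding", "present moment"],
}

def detect_modalities(transcript_text: str) -> list[str]:
    # Return every modality whose cues appear anywhere in the transcript.
    text = transcript_text.lower()
    return [m for m, cues in MODALITY_CUES.items() if any(c in text for c in cues)]

print(detect_modalities(
    "We used a thought record to challenge that negative thought, "
    "then closed with a short breathing exercise."
))  # -> ['CBT', 'Mindfulness']
```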
These apps may also provide an effective way to capture sessions without human error. It is easy to forget details that might be important, or the discussions that led to certain clinical decisions. Catching those details can remind therapists of what they did during each session and why, which is a nice boost to continuity of care.
Although this all sounds “convenient AF,” I’m sure many are skeptical upon hearing about these apps. What about HIPAA? And what are the AI and app companies doing with this data? Fair questions. After all, data breaches and data sharing have been persistent problems in the tech and mental health space, and many have called for greater regulation of mental health apps in general.
Many of these AI apps claim HIPAA compliance and state that they do not share data beyond internal use or product improvement. However, the ethical standards of the mental health professions usually stress due diligence on the part of each therapist to understand the tools they use. Such due diligence is especially emphasized for new technologies without established track records. So mental health professionals will want to research any AI note-taking product, its degree of HIPAA compliance, and its data-sharing policies before using it.
AI therapy note-generating apps represent another mixed bag within the AI movement: highly promising for convenience and even effectiveness, but treading new ethical ground. It is another area to approach with hope, tempered by a healthy dose of professional diligence.