AI for therapeutic purposes – Conversational Agents

More yellow cars. With an interest in neurodiversity and AI, it seems that a flood of articles has crossed my radar.

More than a handful of articles and research papers on AI highlighted therapeutic use cases, specifically the use of chatbots as “mental health conversational agents” (CAs). Therapeutic use cases include mindfulness exercises and reducing anxiety and depression symptoms, and there is one discrete use case for detecting “autism spectrum disorder (ASD)” traits.

In summary, these conversational agents are reported to offer an accessible, anonymous, immediate, multi-lingual, non-judgemental space in which to both ask questions and receive support. Second, they are reported to be especially valuable for young people facing barriers to traditional therapy, such as stigma, cost, or availability. Third, they provide a scalable, cost-effective, and confidential means of support.

In terms of treatment, numerous studies have indicated that NLP-based chatbots can identify mental health issues through a question-based approach similar to that of mental health practitioners (39) – for instance, by asking about various aspects of a user’s wellbeing.

The potential breadth of AI to efficiently detect various disorders early, allowing timely intervention and improved prognosis, was quite staggering: emotional dysregulation, mood disorders, ASD, and neurodegenerative disorders such as Alzheimer’s, Parkinson’s and motor neuron disease.

Invest a little more time researching and very quickly you will learn, as I did, that there is a “bleed” between real-world applied uses of conversational agents and digital popular culture – notably character.ai, a neural language model that can impersonate… well, anyone. Its user base is dominated by people aged 16 to 30 (arguably young people), with one agent more in demand than any other: @Blazeman98’s “Psychologist.” In fact, there are hundreds of agents with “therapist” or “psychologist” type titles.

Dig a little deeper

We’ve seen a lot of people say they’ve shared things with Woebot that they’ve never shared with someone else.

Alison Darcy, founder and CEO of Woebot

Dig a little deeper and you will read about Earkick and Woebot – conversational agents, or apps, designed from the ground up to act as mental health companions, offering Cognitive Behavioural Therapy (CBT)-based support tools (in Woebot’s case, for adults, adolescents and new mothers, having guided 1.5 million individuals). Just this past month, TechCrunch covered Sonia, the first AI therapist I have read about that has additional algorithms/models to detect “emergency situations” and can then direct users to national hotlines.

Dig a little deeper still, and you learn that Limbic Access* became the first mental health chatbot to secure UK medical device certification, at a time when the NHS Improving Access to Psychological Therapies (IAPT) services are experiencing significant capacity challenges in the face of record demand.

The research presented highlights that the use of Limbic Access reduces workload for NHS IAPT services. As I understand it, gathering information in advance and effective triage mean clinicians have less admin, are better informed ahead of appointments, and can spend more time focusing on the patient. Patients enjoy shorter wait times and faster recovery.

Services that used Limbic Access saw an improvement in patient reliable recovery rates from 47.1% to 48.9%, compared to services without AI-enabled solutions, which saw a decrease in reliable recovery rates from 48.3% to 46.9% in the same period. Limbic Access exhibited an estimated cost per additional recovery ranging from £118.25 to £221.89, whereas alternative methods in the same study cost up to £1,200 per recovery over two years – up to 1014% higher.

* https://www.medrxiv.org/content/10.1101/2022.11.03.22281887v1 – note that there is a competing interest statement and a funding statement.

Conversational agents and autism spectrum disorder

Lastly, I found an application of AI and machine learning in the form of a conversational agent to detect “autism spectrum disorder (ASD)” traits. Here the camera is required so that the user/child can interact with the dashboard. Images can be used in chats with the patient, and the conversation is stored for use by the physician. Parents and carers can upload videos of their children in different situations so that specialists can screen for ASD using video evidence instead of questionnaires.

Thakkar A, Gupta A, De Sousa A. Artificial intelligence in positive mental health: a narrative review. Front Digit Health. 2024 Mar 18;6:1280235. doi: 10.3389/fdgth.2024.1280235. PMID: 38562663; PMCID: PMC10982476.

Note – I ask for your understanding if I unintentionally use outdated terms or phrases.
