Can You Trust Your AI Therapist?
There’s an app for that – and that includes therapy, of course. Smartphone apps purporting to offer the comfort of a psychologist’s couch are on the rise, as startups around the world design tech to help with mental health issues such as anxiety and depression.
Many of these apps work via virtual conversation agents – a.k.a. chatbots – that analyse user responses to produce (usually) appropriate suggestions for how to deal with one’s emotions.
Woebot, for instance, is a smartphone app – or “your charming robot friend” – that checks in daily for a short conversation about how users are feeling, tracking moods and offering quick lessons for managing feelings, based on the methods of cognitive behavioural therapy (CBT). Wysa – which calls itself your 4am friend and life coach – offers a similar message-based experience, via a cute penguin avatar that asks users how they’re doing, then offers a combination of breathing exercises, mindfulness exercises and CBT techniques suited to the current mood. Both apps are free, although for $29.99 a month, Wysa additionally offers messaging with a real-life coach.
Artificial intelligence has been hailed as a potential solution to the global need for affordable, reliable mental healthcare. One in four people will be affected by a mental health condition at some point during their lives, while two-thirds of the 450 million people with a mental disorder never seek treatment, whether because of cultural stigma, a lack of services, or treatment that is too costly.
A smartphone chatbot could be a handy answer to all three issues.
“Computers have strengths that humans don’t – they’re available 24/7, their voice is consistent, they don’t have personal prejudices, and they are good at finding patterns in large volumes of data,” says Dr. Fjóla Helgadóttir, clinical psychologist, researcher and co-founder of AI-Therapy, an online CBT program for social anxiety.
Less private than therapy?
One potential issue is that the majority of chatbot-based therapeutic apps are technically categorised as health and fitness apps. This means that their claims do not need to be approved by regulators, and data protection is not necessarily held to the same standards as typical medical records.
Indeed, Woebot is careful to flag exactly what it offers in one of its first messages to a new user: “I don’t do open-ended conversation and I don’t do therapy either. My super smart creators made sure that I adapt to what you say.”
While it’s unclear whether the Woebot apps comply with HIPAA, the US regulatory framework for securing healthcare data, the company says it is compliant with the European GDPR – messages exchanged with Woebot are encrypted in transit and at rest; users’ data is anonymised; and users are able to request and delete their data.
“Users should certainly look for explicit confirmation that such a program is not going to be sharing your data with a third-party,” says Helgadóttir. “As well, who created the program? You pay for your privacy in a way, so if the program is for free, I would be extra aware of potential problems.”
Even paid-for services can run into potential privacy snags if they’re accessed through less secure platforms. Tess.ai is a mental health chatbot “who delivers emotional wellness coping strategies”. It’s HIPAA-compliant, but only on the company’s own platforms: x2.ai, tess.ai and karim.ai. If users communicate with Tess via SMS or WhatsApp – and this convenience is perhaps part of the attraction for its four million paying users – the company does not guarantee the same protection for user data.
The effectiveness of human-free interaction
The conversational abilities of today’s chatbots aren’t top-notch yet – telling Woebot about a racing heart and impending doom, for example, prompts the bot to ask what type of problem this could be categorised as – but the CBT techniques they espouse have been proven effective at helping patients rationally think through emotions and thought patterns.
Many studies have found that CBT delivered over the internet, including via apps, offers numerous benefits for patients, in part because they can access treatment whenever they like, and with an anonymity that can particularly benefit anxious or depressed patients.
Though research on the efficacy of therapy chatbots is thin on the ground, especially for any particular program’s methods, the field is young and fast-developing. Wysa is putting out a call on its website for partners to run research trials, while a small study of Woebot users found that depression symptoms were reduced over a two-week period.
With the need for mental health services continuing to outstrip their availability in many parts of the world, a greater focus on researching effective applications of CBT in AI-powered apps and online programs could build an effective complement to existing treatments.
The role of technology in health is already expanding – the first medical-grade wearables are emerging for commercial use, while the developer of the ultra-popular meditation app Headspace is currently working on prescription-grade mindfulness tools to launch in 2020.
These could pave the way for tech innovators in the mental health field to seek regulatory approval for their products, rather than operating in the health and fitness category. For anyone needing support with their mental health, an AI therapist is always in – the next step is for that doctor to be regulated and privacy-assured too.