How people are using AI for therapy
This blog is a snapshot of how people are using generative AI for mental health support, based on anecdotal evidence from my own therapy practice and published accounts. Although research around the use of AI is limited, the technology is already part of our lives, with ‘therapy/companionship’ currently one of its main uses (according to the Harvard Business Review). Below are some thoughts that might be useful when considering whether or how to use AI for therapeutic purposes.
Information technology
Therapy often involves developing a new understanding of ourselves. Recognising how we think, feel and behave allows us to develop new habits that better serve us. AI is being used to identify such patterns by applying statistical analysis to what and how we communicate with it. Understanding ourselves also often involves psychoeducation, which is learning about how the human mind works. This knowledge allows us to recognise brain functions (e.g. common responses to threat or reward), better interpret messages (e.g. why we compare ourselves to others), and know when something needs attention (e.g. signs of burnout). AI has proved effective in providing information and tools for common mental health conditions, such as depression and generalised anxiety. However, other studies have shown its limitations in understanding context, especially around complex issues and risk of harm.
Digital relationships
Emotional support is another substantial part of any kind of therapy. AI is being used to help people manage their feelings by being available to listen, normalise experiences and provide encouragement. Having someone on hand when feeling distress can be invaluable. However, there's a question of whether the absence of time limits when using AI, which would be a boundary in in-person therapy sessions, might result in dependency on external support. Our need for connection is demonstrated by our ability to project human attributes onto non-human forms (anthropomorphism). This allows us to experience meaningful relationships with AI and to feel supported by it. However, this same phenomenon makes it difficult for us to hold in mind that AI can't interact in a human sense, and to distinguish formulated responses from 'felt' responses.
Online security
Conversations with AI are a way to separate our personal development from the outside world. Holding the questions, experiences and self-reflections we have in an online space can provide a sense of security. However, it's difficult to guarantee privacy when interacting with technology, and gaining the benefit of accumulated AI conversations often requires permitting our data to be stored and potentially shared. Exploring ourselves with AI can also help us to share things we fear others may judge us for. The feeling of giving voice to something we've been holding in can bring a sense of relief and allow us to gain a new sense of perspective. However, it's unclear whether this way of working helps us learn to show our vulnerability in person-to-person relationships, to develop trust in others and to let go of shame.
Combining AI and IRL therapy
Many people are using AI alongside work with a trained therapist, as support to reflect on and prepare for sessions. This can help to maintain engagement with the process, encouraging us to put into words and document what we've taken away from a session. However, for some there is benefit in letting thoughts and feelings settle between sessions, observing what rises to the surface or emerges over time. AI is also being used to practise and reinforce therapeutic techniques explored during sessions. This might include how to reframe or challenge unhelpful thoughts, or how to embed new ideas or perspectives. The usefulness of such tools will depend on our ability to consider our individual needs. For example, in the moment, someone feeling panic would likely benefit from grounding techniques rather than exercises that address underlying fears.
Further Information
This piece, written by AI expert Bernard Marr for Forbes, summarises some 'Groundbreaking Mental Health Tools'. And here Eleanor Lawrie writes for the BBC on 'Can AI therapists really be an alternative to human help?'