How therapists are using AI in their work
This blog is a snapshot of how therapists are using AI to support their work, based on anecdotal evidence from my own therapy network and from published accounts. This technology is already embedded in many mental health services, and individual therapists are also using automated tools in a variety of ways. Below are some thoughts that may be useful when considering whether, and how, to consent to your therapist using AI in your work together.
Automated therapy
Delivering therapeutic services involves administering sessions in a way that offers clients clear boundaries and communication. Some therapists use AI to handle tasks such as bookings and payments, providing greater consistency and efficiency. Automating these services has also proved effective in reducing administrative pressure on therapists. This can give therapists greater resilience in supporting their clients and can model the importance of self-care in sustaining good mental health. It also gives clients more control over their experience, promoting self-determination and empowerment. However, studies have shown the limitations of AI in recognising when a client might need human intervention. Maintaining a person-to-person dialogue on practical issues allows clients to consider the therapist’s perspective and allows therapists to be responsive to what clients might need.
AI in a supporting role
Therapists also support clients by reflecting on what they’ve discussed and noticed within sessions and using these reflections to inform the work. Some therapists use AI to document the theme and focus of each session, to identify points for personal exploration and learning, and to formulate ideas. While AI is useful for summarising, extracting key themes, and detecting patterns, evidence suggests that it’s currently unable to capture emotional nuance, accurately apply theoretical concepts, or identify risk to clients. As such, you should be able to rely on your therapist to remain accountable for keeping a record of your work that’s accurate, confidential, and made with your consent. Used in this way, AI can provide valuable support while preserving the trust that forms a key part of any therapeutic relationship.
Therapist AI literacy
Much therapy training now recognises the presence of AI in the lives of therapists and clients. Increasingly, course criteria respond to the need for an understanding of the opportunities, risks, and regulations around the role of AI in therapeutic work. Therapists can also develop their practice by using AI to simulate client conversations, role-play interventions, model communication skills, and review their work. However, at the time of writing, many professional bodies in the UK advise against therapists sharing audio or written transcripts of client sessions with AI due to ethical and legal concerns. When choosing a therapist to work with, it might be useful to discuss how they incorporate AI into their work and what measures they have in place to ensure they use it safely and effectively.
AI therapy resources
Therapy often involves work that happens between sessions, where clients continue to self-reflect, apply newly developed tools and strategies to everyday life, and gather further information. Many therapists recommend AI-supported mental health resources for this purpose. Evidence suggests that AI apps can be used effectively for journalling, mood tracking, and guided exercises (e.g. breathing exercises and sleep techniques). Current guidance from UK public health bodies recommends against using AI chatbots as a therapist, due to a lack of clinical research and safeguards. However, many therapists invite clients to share and discuss their experiences of AI chatbots within sessions, recognising that respecting each individual’s use of AI is part of maintaining a non-judgemental therapeutic relationship.
Further information
This piece by AI expert Lance Elliot discusses how mental health therapists are, perhaps surprisingly, making use of generative AI during therapy sessions. And here, human rights advocate Dwight Decker summarises plans to regulate AI mental health apps in the US and Europe.