I played around with AI last week. Partly as home education with my child, partly because I’ve agreed to trial offering the Affiniti app to my clients for support between sessions. Affiniti is a WhatsApp-based chatbot specially developed to offer text-based emotional support to people in therapy between sessions. It was an interesting exploration. We texted with Affiniti and then, by way of comparison, chatted with Hume (marketed as “the world’s first voice AI that responds empathically”) as various avatar versions of ourselves. We described symptoms of anxiety, depression, addiction, paranoia, PTSD and insomnia. And we told it we were struggling with the belief that we “weren’t good enough”, “weren’t lovable”, “would fail” or “would be rejected”.
Both versions of AI sounded supportive. Hume empathised with our problems then quickly tried to change the subject and cheer us up, much as a well-meaning friend might do. Sadly, it often missed the mark of what might actually be needed. But then it’s not claiming to be therapeutic support and doesn’t have therapists on its development team, so that’s perhaps not surprising. Nice mission statement and aims, but quite a way off doing me out of a job.
Affiniti has been developed by researchers who understand therapy with a human and want to support rather than replace the process. It seemed more inclined to listen and acknowledge than to rush us into positivity. It offered prompts to gently challenge some of the unhealthy beliefs we were modelling and suggested exercises that reminded us to think about conflicting information. It’s not a substitute for a real connection with a living human being. But might it help on those days when we are struggling to get out of bed and get started? When we’ve lost our sense of purpose for existing and feel like, “What’s the point?” or forgotten what brings us joy?
How about when we are feeling more vulnerable? Worrying that things won’t turn out right? Fearing our headache is a brain tumour or our mole is cancerous? When we’re tired and low and trying to resist sugar, alcohol or porn?
Talking to a machine is most definitely not the same as talking to a living, breathing human being. But there are times when this simply isn’t possible. It’s unfortunate that we often feel more vulnerable in the evenings, when we are tired and the hormonal effects of our circadian rhythm kick in. Is it possible that AI could plug the gap?
One area that the Affiniti app really lends itself to is helping us do the things we know make us healthier humans. We often know the choices that could help us feel better: say no to sugar, go for a walk, set aside time to journal, practise gratitude, stay off the alcohol and stick to a regular routine. But we struggle to get started or waste time doomscrolling. We can’t have 24/7 access to a therapist, but maybe we can have round-the-clock access to an AI coach encouraging us to take steps towards better physical and mental health? Giving us a gentle nudge at the sticking points in our day.
As a clinical psychologist, I’m really invested in forming deep connections with those I work with. It’s a real privilege to be trusted with moments of vulnerability and to help someone navigate their way to better health. I doubt a machine will ever be able to create the sense of being seen and “got” by another person, which is at the crux of most psychotherapeutic endeavours. However, I also know that there are times when we all need a little extra support, and maybe Affiniti has the capacity to meet that need?
Affiniti is exclusively available in partnership with psychologists and psychotherapists. If you think you could do with a bit more support beyond a 50-minute therapy session, please get in touch for a complimentary consultation to see how the combination of therapy with a human accompanied by AI coaching might work for you.
Thanks for the report. I’ve been wondering about AI and counselling for a while now. It’s interesting to know Affiniti gets a green light from a professional.