Maybe not spooky, but today I am reflecting on AI taking over every aspect of our lives until there is no point in doing anything anymore and, as a species, we just dissolve into pointless nothingness. I’m reflecting on the bleakness of the future and how it is a land without colour, joy, or creativity.
… alright maybe that was too intense. I’ll try again.
Ahem.
A friend of mine posted on Facebook earlier about an ‘AI therapist’ she tried. Within ten minutes, it had diagnosed her with ‘adjustment disorder’. Is that good? Was it right? No idea. My friend did this because she has been a therapist for thirty years and wanted to see what this AI thing was all about. It left her with a bad taste in her mouth. It felt very ‘business’-focussed. It felt dubious. It felt unsafe. (These are all her words.)
I looked into it myself. I am currently training to be a counsellor, and if I am about to be replaced by a robot, I might as well go back to my old job of being a writer.
(oh wait, dang.)
Long story short, I didn’t trust it. For a start, the AI therapist’s website claimed it was trained in the ‘latest techniques’ and held to the ‘highest ethical standards’. Nowhere did it say what those techniques were, or which ethical standards it held (itself?) to. Never mind that the site had no privacy policy or terms of service, so who knows who you’re giving your data to and what they’re doing with it.
I’m particularly interested in the ethical standards, though, because in my most recent counselling lesson we were talking about ethics – more specifically, the ethical and moral principles that counsellors and psychotherapists hold themselves to. There is a list (taken from the BACP ethical framework), but here are the ones that stuck out to me while researching this AI therapist bot:
Care: benevolent, responsible and competent attentiveness to someone’s needs, wellbeing and personal agency
Diligence: the conscientious deployment of the skills and knowledge needed to achieve a beneficial outcome
Empathy: the ability to communicate understanding of another person’s experience from that person’s perspective
Identity: sense of self in relationship to others that forms the basis of responsibility, resilience and motivation
Humility: the ability to assess accurately and acknowledge one’s own strengths and weaknesses
These five key personal qualities (out of a list of twelve) that counsellors and psychotherapists try to live up to every day when working with clients cannot be achieved by an AI. It could probably adopt some kind of facsimile of them – but it would be fake and limited. How can an AI be aware of its user’s needs if the user doesn’t tell it outright? It can’t read body language, and it can’t use the other assessment tools that therapists have. But apparently it can diagnose you with something in ten minutes.
Hmm.
Anyway, I’m not going to try to hide the fact that AI concerns me. Businesses trying to make a quick buck off vulnerable people concern me. I don’t trust any of these AI bots (or the business models behind them), and that, to me, is why a bot can never be a good therapist. If you can’t trust a bot, you, as a client, can’t form a meaningful relationship with it, and without that relationship, change and growth can’t happen.
That wasn’t a very fun blog post, was it? Or spooky.
Spooky robots, bleep bloop!
See you tomorrow.