There’s a fear in many sectors that automation will take people’s jobs away. Will that ever happen to counsellors?
Rather than delve into reams of theory about how interaction with humans will always offer something that machines or artificial intelligence (AI) never can, I decided to try out some apps. I like to think I was being open-minded, despite these being a potential threat to my business and a conflict of interest in my writing a balanced assessment of them. In fact, I was, and am, genuinely intrigued and excited by the developments. I believe the apps work on both Android and iOS, but as this isn’t supposed to be a technical article, I’ll leave it at that.
So, without further ado, here goes.
First off is a CBT app called Woebot. I actually downloaded this about a year ago for free, but hardly used it. Yet I found myself turning to it recently when I was beset by worry about a family member.
It’s a chatty interface. It decided to show me a picture of a baby hedgehog to cheer me up and invited me to respond with “Awww”. In fact most responses are pre-set, using chatty language like “Sure”, “Okay” and “Got it”. I am guessing this tone might grate with some readers in the UK more than in the USA. From there it can take a couple of different courses. You can either “check in”, which starts off chatty then becomes informative as Woebot educates you about CBT and how our thinking can become distorted. Or you can ask it to address a particular issue, which you type in. I find it then tends to ask me for three thoughts or beliefs, which it goes on to analyse with me.
This second route is probably what most people need, but it’s not the app’s strongest element. It doesn’t really understand anything you tell it. So if I say “I’m standing on my head right now”, it just repeats things back, like “In what way does ‘I’m standing on my head right now’ contain black and white thinking?”. That’s OK, but at this point, if you’d forgotten that you’re not talking to a human, you certainly won’t forget any more.
After a few more responses you will hopefully have challenged your thinking, and whatever expectations and distortions were wrapped up in it will have become obvious.
I have to say I quite like having Woebot on my phone. Even the chatty side of it can be fun, in a distracting kind of way. I especially liked it when Woebot came up with “I was just thinking about you”, then teasingly asked “Wondered if you’d be interested in something I thought you’d like”. Yes, I know it’s a program. But it’s a very well-mannered one (there, I’m personifying it now), so I accepted the invitation. It was just a run-through of an example of some type of thinking, which I’ve now forgotten, but it was interesting nonetheless.
Verdict: In small doses, surprisingly good. Would love to know more from someone who has used it exhaustively.
Similar to Woebot is Wysa. It is perhaps less chatty, but works in much the same way: it states your thoughts back to you and invites you to analyse them. It does things in a slightly different order, asking you to choose which of a number of pre-defined distortions is affecting your thoughts. You might not get it right first time, but bear with it.
Using Wysa was, for me, a little more frustrating. I went round in circles and ended up in a couple of dead ends, as if I were navigating a maze. (Those with memories of text-based adventure games from the 1980s will know what I mean.) And I found I had to get inside its mind, ironic though that sounds, to be able to move forward. Nevertheless the effort was worth it. In the middle of a thus far irritating day, when I had done a number of pointless things, I offered it “I believe I am competent and make the right decisions” as the reason why I was frustrated by making some wrong decisions. By the time it had finished with me, I had “I use strategic thinking to great effect but outcomes can and do vary”. It was a “wow” moment.
Wysa requires willingness to engage with its methods as well as persistence and commitment to the task. You have to believe the app is a teacher and you have to do as you are told. If you feel rebellious you won’t get the best out of it.
Wysa also gives you a “toolbox” containing information about how CBT helps when you are, for example, overwhelmed, worried or unable to sleep. It can also connect you with a human, but that wasn’t the point of my using it.
Verdict: Good if you like to be pushed and challenged, and can avoid being frustrated by its limited ways of navigating through its steps.
A different type of app, and not one based on CBT, is Pacifica. Being based on mindfulness, it doesn’t chat like Woebot and Wysa, other than to ask how you feel. It then recommends a particular activity, such as calm breathing or getting some rest, and plays music or similar for a set time to encourage you to do it. You are invited to log your feelings in a daily check-in.
Its front end also suggests a different purpose: dreamy pictures rather than robots and thought bubbles.
I used Pacifica to relax for 20 minutes. In fact I fell asleep, so I can’t criticise it for doing what it says it will do. But the lack of actual engagement, after experiencing Woebot and Wysa, felt like a let-down. There are options to work through, e.g. building confidence, but these come at an additional cost. The basic app is essentially like one of those fitness coach apps, only for mindfulness.
Verdict: Good for those who don’t want to chat to an app, and prefer a degree of control in their life. For me, a little lightweight.
Is this the end for counselling?
It’s a challenge, but for the right reasons. Counsellors who behave like robots, doing nothing more than offering textbook solutions to issues, will likely feel the most threatened. But as I often say to clients who question why they are coming to counselling: “If you wanted a self-help book, why are you speaking to a human?” Replace “book” with “app” and the argument is the same.
CBT is probably the approach that lends itself to AI more than most. The psychodynamic approach (which features highly in my practice), and the person-centred and transpersonal approaches, can’t, as far as we know, be replaced by machines, because they all rely on thoughts and feelings stirred up in the counsellor in one way or another. Can an AI empathise? Probably not in my lifetime.
Should I use counselling apps?