Welcome to the Business of Psychology podcast. I'm really excited to be interviewing Dr Rachael Skews, a psychologist, coach, trainer, supervisor, speaker, advisor, researcher, and author. She is an internationally recognized subject matter expert in acceptance and commitment coaching and has a really interesting background working with tech companies, including Headspace, to develop effective and ethical behaviour change initiatives. I saw Rachael giving a webinar for the International Society for Coaching Psychology, and I knew I had to ask her to be a guest on this podcast because I found her insight into how the emerging AI technology could support and enhance our work, so refreshing and so fascinating. I get kind of scared by the unbridled enthusiasm for tech that the tech community often has. But I'm also really uncomfortable with the alarmism and pessimism that the mental health world often defaults to when we're faced with new stuff. So it was really great to hear a balanced view from somebody that really understands the ethical issues and potential pitfalls, but also embraces the excitement of the new technology.
Full show notes and a transcript of this episode are available at The Business of Psychology
Links for Rachael:
LinkedIn: Rachael Skews
Website: www.cognuscoach.com
Other Links:
Reading Our Minds: The Rise of Big Data Psychiatry by Daniel Barron
Rosie on Instagram:
Ready to grow your practice beyond one person and a laptop?
We are here to support you to build a thriving, impactful and profitable business.
Invest in our growth pack to confidently grow your service with associates, organisational work or passive income.
Our unique package includes strategy and marketing training from Dr Rosie Gilderthorp, Founder of Psychology Business School, and legal contracts from Clare Veal, Commercial Lawyer from Aubergine Legal.
Together, we will ensure that you have the strategy and documents you need for growth so you can expand your impact and income while maintaining your work-life balance.
Sign up now: The Business Growth Pack
Thank you so much for listening to the Business of Psychology podcast. I'd really appreciate it if you could take the time to subscribe, rate and review the show. It helps more mental health professionals just like you to find us, and it also means a lot to me personally when I read the reviews. Thank you in advance and we'll see you next week for another episode of practical strategy and inspiration to move your independent practice forward.
SPEAKERS
Rosie Gilderthorp, Rachael Skews
Rosie Gilderthorp:Hello and welcome to the Business of Psychology. And today I'm really excited to be interviewing Dr Rachael Skews. Rachael is a psychologist, coach, trainer, supervisor, speaker, advisor, researcher, and author. She is an internationally recognized subject matter expert in acceptance and commitment coaching and has a really interesting background working with tech companies, including Headspace, to develop effective and ethical behaviour change initiatives. I saw Rachael giving a webinar for the International Society for Coaching Psychology, and I knew I had to ask her to be a guest on this podcast because I found her insight into how the emerging AI technology could support and enhance our work, so refreshing and so fascinating. I get kind of scared by the unbridled enthusiasm for tech that the tech community often has. But I'm also really uncomfortable with the alarmism and pessimism that the mental health world often defaults to when we're faced with new stuff. So it was really great to hear a balanced view from somebody that really understands the ethical issues and potential pitfalls, but also embraces the excitement of the new technology. So I could not be more excited to bring this interview to you today. So without further ado, here's my interview with Dr Rachael Skews.
Hello and welcome to the podcast, Rachael. It's so good to have you here.
Rachael Skews:Thank you for inviting me on. I'm quite excited to be talking about this topic today, so…
Rosie Gilderthorp:Yeah, me too. And obviously we're going to dive all into AI and the positives and the pitfalls and the stuff to look out for. But before we jump into all of that, can you just say a little bit about who you are and your professional background?
Rachael Skews:Certainly, I have perhaps a less typical journey into becoming a chartered psychologist than some people. My first degree was in philosophy. So one of the things that tells you about me is I'm a massive nerd for philosophy of science and those kinds of things, and that's also been a little bit helpful as we're moving into more technology-orientated eras, because, you know, philosophy has explored some of those existential questions. So I wasn't a psychologist originally; I retrained as a psychologist in my thirties. I was a consultant in business working in employee assistance programs and human resources-type service provision. So I was doing a lot of psychological stuff, but I was on the business side, and I really enjoyed it, and I thought, you know, when you're in your twenties you've got a long career ahead of you, and I decided that I would retrain as a psychologist. My mum was a psychologist and she was also a teacher, and she said, don't ever be a teacher. So I actually became an academic accidentally, but I was like, I'm still not teaching, mum. You know, it's not like I didn't listen to you. So, in brief, I meant to do a conversion diploma, but I had enough credits to do a full psychology degree if I did one extra little course in biological psychology. So I do actually have a BSc in psychology. And then I went to Goldsmiths and did my master's in occupational psychology, and I did my PhD there. My PhD was looking at acceptance and commitment coaching. So acceptance and commitment therapy, which I know a lot of people are getting quite interested in; what I did is I applied that to the workplace and did a randomised controlled trial of a coaching intervention with senior leaders in the civil service. So coaching is something I can talk about for a very long time.
I know that's not what we're talking about today, but, you know, that kind of behaviour change, but also looking at mental wellbeing in the workplace. As a psychologist, that's my kind of area: the intersection between performance and wellbeing. How do you get better performance, but in a way that's sustainable? How do organisations create environments that really foster that, and how do leaders create environments that really foster that? So that's my very nerdy little area of psychology. But what we're talking about today is more to do with, you know, building psychological interventions in technology, using AI, and this whole blossoming area which is affecting all of us. I think one of the things that I've been trying to do over the last few years is talk about what that means as a psychologist, particularly in the UK. I think there's a lot more support for psychologists in this space in the US because there are more companies employing what they would call behavioural scientists. That was my role when I was at Headspace. So, how did I get to that point? In my PhD work, I was looking at the coaching relationship, which is, you know, often assessed with a similar measure to the therapeutic relationship. And so it's this kind of understanding of how do we create behaviour change? That was one of the things that I was really interested in. I did a couple of analyses looking at processes of change within the coaching intervention, and was very surprised to find that the coaching relationship wasn't accounting for any change. And I've seen a couple of subsequent studies that have had a similar kind of finding, so…
Rosie Gilderthorp:Wow, that's so surprising.
Rachael Skews:Really interesting. Which then left me pondering the extent to which you could put a really effective behaviour change intervention into something where there's not a human involved. So it doesn't, it doesn't sound so kind of unfeasible if you think about self help literature, which has been around for a long time. You know, CBT has been using self help for quite a long time. And so it's just this idea of can we change the mode of delivery so that we're using an app or we're using technology or something like that.
Rosie Gilderthorp:That's really interesting. I'm just wondering, was that an emotional question for you? Because I think a lot of the time when we're talking about technology and I'm talking to other mental health professionals about it, there's a real emotional attachment to the idea that human interaction is core to it, that that therapeutic relationship is central. And the therapy literature, I think is a bit different in that area, the therapeutic relationship does seem to be important. But I think research that might challenge that slightly, I think might be kind of upsetting for people. Did you feel anything or was your perspective different because you were coming from a business background? Because you describe it in such a pragmatic way.
Rachael Skews:To be honest, pragmatism is very much who I am. I'm quite, I'm very, you know, if you, if you come across Hogan, one of the personality measures, I am very low on agreeableness and I'm very low on kind of emotional activation.
Rosie Gilderthorp:I love that. Proudly not agreeable.
Rachael Skews:Proudly is probably, I'm not quite sure I'm that far, but it's just, when you know yourself, it's something that I know about me. It's one of the things that can be a little bit of a challenge when I'm working with people, because I am very much at that end of things, and so it takes a little bit of flexibility to try and meet people where they are in terms of their emotional reactions to things. But for me, as a human being, that's just my natural way. I don't know if that's because I'm northern; I don't know if it's just a personality thing. But I also very much identify as a scientist. And I kind of think we have a view on the world, and that is helpful because it helps us to create these questions and to, you know, design the studies and everything. But I do want to try and reduce my own bias, and it's really hard, almost impossible; you can't take yourself completely out of the research. So I was surprised, but what I thought is: this is really interesting, and it opens us up to different ways of creating behaviour change. Now, one thing I do want to say about this, and I often say this to people, is that just because the relationship isn't in and of itself accounting for the change, it doesn't mean that it's not important; what it means is that there are other options. And there are not enough of what we would call care providers to meet the need. The area that I work in, and how I positioned myself when I was working in tech, is around that preventative mental health space. So how do we help people to learn how to self-care more effectively? How do we help people to have alternative strategies that are more workable for them in terms of their mental health? You know, technology is not necessarily going to be the right thing for somebody who's in crisis.
But not because there's this magical interaction between us and the client; it's because of the things that we are doing in session. That's what makes the difference. And those things you can kind of replicate through technology. That's the idea behind the research that I was doing. So it's not saying that the relationship is not important. In fact, if you look at the research coming out of spaces like David Moore's lab over in the US, what they're finding is that, although you can take the human relationship out in terms of actually getting the change, like you can have alternative interventions delivered through the technology, what you get is a lot lower engagement. When you've got the human involved, because of that relationship, there's stronger engagement with the intervention, and you get greater treatment adherence. So lots of these things that are really helpful increase, and it's, you know, a pretty robust relationship: if there's a human involved, you are going to get better engagement. If you looked at the evidence for CBT delivered through books and things like that, I imagine you'd see a similar thing because, you know, behaviour change is hard.
Rosie Gilderthorp:Yes. And I was thinking, you know, in the coaching space, we tend to, or the way that I tend to approach my coaching work is we start from the idea that the client already has what they need and we're helping them to use the skills and resources and abilities that they have to get them wherever they want to go. Whereas in therapy, often you really have to scaffold somebody through developing new resources. So it may be that they never had an attachment relationship that taught them what they needed to learn before. And so that relationship has a different function often in therapy. So it kind of makes sense to me about how, for some people, that relationship aspect might be so much less important. And for others, they may not be able to engage without that scaffold, that person saying, you know, I see you, I understand you, I've got this with you. And they're just different people at different moments of their journey, aren't they?
Rachael Skews:Or dealing with different things. So again, a big question around, you know, what's an appropriate intervention, and I would imagine for most of us, it's a mix. And of course, you know, the UK system, we do use a step care model. We use a lower level intervention, and then if that's not gonna provide what somebody needs, then we go up in intensity. And that was kind of the way that I think about it. If you think about coaching, and I'm thinking specifically around like health coaching rather than occupational coaching or leadership coaching, which I think is, you know, we're often working in a slightly different domain. But the closer one to therapy, the sort of, the closer comparison would be something like mental health coaching or health coaching, more broadly. And so I used to say, you know, I think it's a level, it's about the level of intensity. It's how, how difficult are these emotions? How intense are these emotions? How difficult is it for somebody to do something different? How entrenched is this behaviour? And to what extent do we need to have a really strong, robust relationship with somebody so that they feel safe enough to do something that is very different to what they've been able to do before. With coaching, it's just that, like you say, people have kind of, you know, they haven't got that intensity in the emotions that they're feeling, which is not to say that people, I mean, especially in leadership coaching, you know, there's a lot of emotion there, but it's not, we're not working in that therapeutic domain. We're not dealing with things that are so deeply painful for people that they've just struggled for probably their entire lives, you know. So all of that to say, one of the things that I think technology is really good for is when you've got that lower intensity, almost like helping people to learn how to look after themselves more effectively. 
And I think the technology can do some things that, like a care provider, a human care provider wouldn't necessarily be able to. And I often talk about that. There's some really cool stuff that is coming out about just in time interventions. And this is something I'm really interested in is how do we bridge the gap between what I might be able to do in a coaching session with somebody and then into their own lives. So I don't know if I'm explaining that. Does that make sense?
Rosie Gilderthorp:Yeah, it does. ‘Cause you know, often one of the things we know from research and practice is that people can understand something when they're working with us and they can have the best intentions and the best plan, but then they go home and suddenly it doesn't make sense anymore. And that's because of the way we learn. And we know that there is this bridge that we need to cross between understanding in one context and generalising that understanding. So am I right in thinking that's sort of what you're talking about?
Rachael Skews:Absolutely. And also that difference between talking about something so it kind of makes sense in your head, and actually going out and doing it and experiencing it.
And of course, as an ACT practitioner, I'm very into helping people to have those experiences because we know that's where change happens. You don't need to think differently in order to act differently. And I do think one of the reasons that I'm quite interested in the role of technology is probably because of the philosophical background of ACT, and the way that we kind of work as ACT practitioners. So one of the reasons that I got involved in tech is because of that research that I'd done, and, unfortunately, I left academia. Oh, I loved Goldsmiths. I was so sad to leave, but Headspace were looking for somebody who was a coaching specialist in acceptance and commitment coaching to help build an intervention looking at chronic pain. ACT has got a really strong evidence base for chronic pain, and this was going to be a combined meditation and ACT intervention. So they asked me to help them to build the coaching intervention and then train the coaches. And then, after a few discussions, I said, you know, I'd be really happy to come on board. They were looking for somebody to basically manage this team and to help them with the randomised controlled trial testing the intervention. So that was how I made the move over. It wasn't really planned; it was like my research had kind of taken me to this place and then there was an opportunity to start to explore that a little bit more. But it was a hybrid: there were human coaches who were using the technology to interact with these clients, and it was personalised. I think that's something that's quite important if we are going to build interventions in technology: thinking about how you structure them, how you fit them to your client. Because I think that's one of the benefits, that you can actually work to a high standard, high quality, based on what it is that you're doing. So you can make some really nice content, which is the tech term for it all.
Creating content. But you can have some really nice, very evidence-based content that you're able to send to people, and you can probably do it at a lower cost because you're not paying for my time. You know, you and I are expensive because we're highly trained, whereas if we build content, that content becomes less expensive the more times it's used. It's that price-per-wear kind of principle. So I think that's one of the benefits of it, but I think how you set interventions up is quite important. That has a bearing on the quality of experience that somebody might be having, and on being able to personalise it to them. When I finished on that project, I was asked to be the scientific lead for innovation at Headspace, and this is when I started working a little bit more with AI. We had a really amazing team, and all of my work within technology was interdisciplinary. I was working as part of a team; I don't get the credit for anything because it's very much an environment where people are all coming together with their own expertise and just collaborating hugely, which I really loved. But as I said, I'm quite low in agreeableness and I was working with a lot of Californians, so there was a requirement to, you know, meet them where they were. And that's always interesting, isn't it? You always have to flex your style a little bit, but it was…
Rosie Gilderthorp:And I wonder as well, if coming from that scientific and psychological background, must've been quite different to some of the other people in the room, which can be hugely positive and probably creates a better end result, but might not always have been comfortable?
Rachael Skews:Yeah, and I would say for psychologists this is a really interesting point, because, especially coming from an academic background, what I found is that there was a need for pragmatism, which, you know, I'm quite prone to. Even so, you have this discomfort of, does this still have integrity? But there was a really interesting thing that one of my colleagues said to me, who had quite a solid background in health technology; you know, this wasn't his first rodeo, he'd done it before. And he said, you can have the perfect intervention, but if nobody uses it, it's useless. And I was like, yeah. As an academic, my work was not focused on that, which is not to say my work wasn't engaging, but with a one to one, face to face coaching intervention, given for free because it's part of a research program, nobody really dropped out. Oh, it was a bit of a problem actually in my PhD; I was knackered. But like we were saying, the technology has to be engaging; you have to find ways of making it very accessible. One of the things that I learned, which has stayed with me, is that all of the content that we created, you had to create it for a reading age of nine.
Rosie Gilderthorp:Wow.
Rachael Skews:So when I'm talking to people just doing my one to one coaching, you know, I'm not really thinking on that level, because they can ask, they can tell me to stop. They can say, you know, oh, can you explain that to me? I don't quite know what you mean. But we were having to do this for something that was accessible for the general population, that was going to be supportive of pretty much everybody. And we'll talk about safeguarding a little bit later, because that's a big issue here, but I'll park that for the moment. What we were trying to create was something that was, as one of my colleagues in the design team put it, delightful. I mean, how many times do you think about your therapy or your interventions being delightful? But this was a new part of the lexicon, you know?
Rosie Gilderthorp:Yeah, well, it's so interesting because I, I've been doing a lot of research and looking into the customer journey literature from marketing science and I've found myself saying to people, what are you doing to delight your customers? Which, you know, feels a bit jarring sometimes when you're used to that NHS background where it's like, no, I'm, I'm delivering the best evidence based care, I don't really mind if they like it. So actually that's a bit messed up, isn't it? These are human beings. It'd be quite nice if they felt like they were getting good value from us and maybe even enjoying their experience, which, you know, it can be really jarring.
Rachael Skews:And also I think there's something about that, that if you make it enjoyable, if you make it engaging, people will stick with it when it's hard, when they, when they're going through something hard. So, you know, it is really, I think there's so much that's coming out of the technology space that we can learn from, and it's not a one way. It's not that we're the experts and we can just put our stuff out there and everybody should pick it up because it's wonderful. You know, I've got my deep expertise, but I was working with, you know, a decent sized team, maybe 10 people, each of whom was an expert in their own area. And it was wonderful and humbling. And I loved it. And it was hard and it was frustrating. And, you know, all of these things, but what you're doing is you're creating and collaborating together. And, you know, there's somebody who's going to make the ultimate decision around how you move forward. And if that doesn't fit with what you think we should be doing, you know, then you kind of have to, you know, roll with it. You have to be willing to say the extent to which this is really important from a clinical perspective. And that's one of the reasons, I mean, I talk a lot about leadership, but the leadership role within that context is very important, both for me as an individual to be the leader that I want to be when that's appropriate. But also for, for everybody in that space to be able to kind of manage themselves well. It's no good chucking your toys out the pram because you haven't got your own way, there'll be a reason. And you know what you're trying to do when you're building this sort of technology is, is have something that people like.
Rosie Gilderthorp:It's so crucial, isn't it? Because we know that engagement is key to everything. And yeah, it is a real mindset shift to go, Okay, how do we meet people where they are today? Rather than maybe where we hope they will get to. You know, I talk about it a lot with language, you know, how, how we might market ourselves as therapists, or if we're reaching out to corporates to offer wellbeing programs, we have to use the language that they understand. And that can be a real challenge, especially if you're a specialist in your field, and maybe the literature has moved on and isn't using certain terminology anymore. But you know, people are 20 years in the past in the real world, in terms of the gulf between academia and real life. And we have to acknowledge that. And I guess it's sort of a similar struggle where it's like, okay, I might really want them to do this thing, but if they're not going to do it, what's the point? Let's make this, you know, suitable for where they actually are. So would it be all right if we have a think about some of the opportunities you see on the horizon for mental health professionals and AI? What's coming down the track that we should be excited about?
Rachael Skews:Well, I don't know how valuable my predictions are, but I'll caveat them just as that: these are my predictions, this is what I think we might do. And I am quite an optimist, so I'm going to put my happy head on and really focus on what I think the real opportunities are. One of the big ones for me is scalability. You know, there is a lot of evidence that there are issues worldwide. This is not something that's unique to the UK or to Europe or to any part of the world: there are mental health challenges that people are experiencing, and I think there are not sufficient trained care providers to meet that challenge. And one of the things that I'm very passionate about is prevention, helping people. We all know that we're going to go through challenging times in our lives, pretty much everybody. The majority of people may experience a mental health challenge, but they're unlikely to have it at that chronic level, you know, where it's affecting them day in, day out for the entirety of their lives; there is a small percentage of our population who are in that kind of space. And I think technology has got limited impact there, just because of the nature of that lived experience. But for the majority of us, I think we could benefit from learning how to look after ourselves a little bit better. And then when things aren't going well, as we were talking about, with that intensity of need going up, you've already got a really decent set of skills that you can draw on. Now, as an ACT person, of course, one of the things that I really help people to develop is psychological flexibility, but there will be others, you know, other things that people can do, health-related self-efficacy, those kinds of things are what I'm thinking about. So I think the technology allows us to build relatively low-intensity preventative interventions that we can scale.
And that makes it more accessible for people. One of the other things is that there are challenges around rural areas. Maybe it's not so much in the UK, but, you know, in places like America, for example, the access to healthcare is not equivalent. And I think it allows people to access health care more effectively.
Rosie Gilderthorp:I would argue we do have that problem in the UK, and it's not so much one of geography as it is of perhaps stigma in accessing a particular service that's in a particular building. Or it might be around people who have caring responsibilities that mean that they can't get out of the house, or they have physical health problems that mean it's very difficult to come out of the house. And often the services that we do offer are, because we moved to this hub and spoke model many years ago, are very far away from people's homes, are not well served by public transport, and we do have these massive health inequalities because of that. So I, you know, I think it's shameful actually that we don't necessarily have the hundreds of miles problem that you might have in the US or Australia, but we do have massive health inequalities despite that, because we have so poorly invested in both our transport infrastructure and our mental health care system. I don't know why I'm laughing, it's sad.
Rachael Skews:It's a massive challenge, but I would say that I now live in a rural area and access to healthcare is actually significantly better than when I lived in South London, where getting access to any healthcare was hard; once you were in the system, great, but getting into the system was difficult because of the volume. So all I mean is that we have our own access challenges. It might not be exactly the same as some of those more rural areas, but what technology can do is hopefully reduce that a little bit. Now I've got my happy head on, so I'm not going to start talking about all of the issues around that, but we'll come back to this idea of a two-tier healthcare system and the challenges that that poses too. So, other happy, happy, joy, joy things: this idea of being able to personalise people's treatment. One of the things that I did is I led on a project looking at wearable technology, because wearable technology is becoming more common. I mean, most of us have got like a Fitbit or… I have a Garmin cause I'm a little bit…
Rosie Gilderthorp:I've got a Garmin too.
Rachael Skews:Garmin. I love it. Garmin buddies. Apple Watch, you know, all of these different bands, Oura Rings. I was in London with a friend of mine who works at Oura, chatting to her about some of the things that they've got going on. She's a proper scientist. And, you know, there is a great opportunity there to get more evidence-based predictive algorithms. Now, you need AI for that because it's too complicated for us; my little head, bearing in mind I do have dyscalculia as well, would not be able to manage that level of data. So, you know, to actually find those predictive signals within large quantities of wearable data, that's something that can be really helpful. There's a really interesting book, which is up on my shelf somewhere and I can't remember the name of it. I'll have to send you a link that you can share for people to follow up on. It's by a psychiatrist in America, and it's basically talking about how, for mental health, wearable technology could be a really helpful indicator, both for people in terms of their self care, and for healthcare practitioners who are supporting them. One of the examples used in that book is looking at GPS data. And I'm not advocating this; I'm just sharing it as an example to get people thinking about the ways that this data could be helpful. So basically, you might have a larger kind of span, where you're going further away from home, you're going to different places, versus having a much smaller one. So that could be an indicator of somebody's behaviour, but you've got an actual evidence base. You know, the reliance on self-report within psychology is something where we should always be looking at, are there alternatives? Are there different ways that we could try and get some indicators?
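[Editor's note: the GPS "span" idea Rachael describes can be made concrete with a small sketch. This is a minimal illustration, not anything from the episode or from the book she mentions: the haversine distance is a standard formula, but the "span" definition (the maximum distance of any GPS fix from home) and the example coordinates are purely illustrative assumptions.]

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius, km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def mobility_span_km(home, fixes):
    """'Span': how far the furthest GPS fix strays from home.

    A span that shrinks week on week could be one (noisy, consent-dependent)
    behavioural signal worth discussing with a client; a growing span might
    accompany re-engagement with valued activities.
    """
    return max(haversine_km(home[0], home[1], lat, lon) for lat, lon in fixes)

# Illustrative data only: a hypothetical home in south-east London and
# three GPS fixes recorded over one week.
home = (51.4613, -0.0357)
week = [(51.4613, -0.0357), (51.5074, -0.1278), (51.4700, -0.0400)]
print(f"weekly span: {mobility_span_km(home, week):.1f} km")
```

A single number like this is obviously a crude proxy; in practice such signals would be combined with self-report rather than replacing it, which is the balance Rachael describes.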
Rosie Gilderthorp: I must admit, I got really excited about that when you mentioned it on the webinar where I met you. I read the book, I was just looking for it in my Audible catalogue, and it's Reading Our Minds by Daniel Barron. I literally finished it while I was on holiday last week, and I think this is really exciting, because so many clients can't detect their progress; they can't feel that they're getting any better. It's the ACT side of things, isn't it? Your thoughts might still be telling you that this is terrible, and your emotions might still be a real struggle, but actually, if you're living more of the life that you want to live, doing more of the things that you value, that is progress and we are going somewhere. For me, it was really exciting to read that, because I've had a few people over the years where I have known as a therapist that they were making progress, but they couldn't feel it. We got over that hill eventually, but that is where a lot of people might drop out, so that was really exciting to me in terms of the potential for engagement. But also, one of my special interests is hyperemesis gravidarum in pregnancy, a very severe morning sickness which is not in the morning. One of the core problems contributing to the mental health consequences of that condition is poor communication with health providers; it comes out through all the interviews, the focus groups and all the research that we've done in this area. And I see a way for wearable technology and AI-driven applications to help people communicate more than in that once-a-month or once-a-quarter appointment they have with their health provider, so you can get a real picture as a clinician of how this person is doing, and maybe even get alerts when things are taking a turn for the worse.
So it's really exciting to me, but I can also see the problems and the challenges with it.
Rachael Skews: I think that's it. There's potential within these things, and what they allow us to do is personalise the experience in a way that's actually quite beneficial for both the client and the practitioner. That's the area of technology and AI I'm really interested in: how do we improve that? How do we share information? How do we help practitioners? Because what you could have is an app that summarises some of the highlights from that information, so that you as a practitioner, with your client's consent, obviously, can see some of that data, and both of you can then see the impact and the behavioural changes much more clearly. Self-reports are great, I'm not saying let's chuck them away and never use them again, because they're so important, but they're part of the story. I think that's what this is about: is there more that we can add to the story?
Rosie Gilderthorp: Absolutely, a really exciting new frontier. And actually, this is already happening for personal trainers, for example; they already have this technology and are able to use it. But I guess, and maybe this is coming onto the challenges a little bit, when we're talking about mental health there is this extra layer of care that we need to take with data and with getting secure platforms to communicate together. That's probably going to be one of the biggest challenges, I'm guessing?
Rachael Skews: Yes, I think so, and we will definitely talk about that. Let's think: we've talked about the wearable data, we've talked about personalisation, we've talked about scaling, and I think the other thing we can do is use technology to make this part of people's general life. You can do gamification, and you can get immediate feedback or access, depending on the model, and there are lots and lots of different models for how you actually put this into the technology. One of the other things, and I'm going to talk about the flip side of this too, is that you can actually see the impact of an intervention. You can see how many people are engaging with a piece of content; you get this immediate feedback on the intervention itself. And one of the things that, oh, if I was still in tech, this is one of the things I would love to do, but I'm now doing slightly different work for my own enjoyment, I suppose. One of the things I would be really interested to do is really isolate the processes of change. I think this is something where, with human care practitioners, we're like water: we follow where our clients go. And it's interesting, isn't it? Should we be following protocols? I'm quite into this idea of process-based therapy and process-based coaching; I follow a lot of the literature around it, and I find it very compelling. It's the idea that instead of delivering a protocol, you are activating different processes of change, different mechanisms of change, through the intervention. The technology allows us to follow that story for each individual client and to make those analyses, so we're less dependent on these aggregates of change.
Is somebody's psychological flexibility increasing? Is somebody's self-efficacy increasing? We can follow that pathway of change much more individually. As a researcher and a scientist, I think for us as psychologists that could be a really interesting addition to our current suite of research methods. I did some research on this, and I'm going to have to take my happy head off when I talk about it, because it was just a nightmare.
Rosie Gilderthorp:That's okay, happy head can come off.
Rachael Skews: We're now moving into the issues. As with a lot of psychology, I often say to people that this distinction between hard and soft science is rubbish. Psychology is one of the most difficult sciences, because with these phenomena we're interpreting everything; we're making logical assumptions about what's going on; we're creating psychometrics to measure latent psychological phenomena that don't even exist. I cannot think of a more difficult way of trying to do science. So I'm always saying to people: if it feels hard, that's because this is hard; it's so difficult to explore. One of the things we did was an EMA, are you familiar with this? Ecological Momentary Assessment. There is so much scope here; this is a really exciting area, but we are really dependent on having good measures. At the minute, the main way you would set one up is to take the best psychometric for whatever it is you want to measure. We were looking at stress, so we used the highest-loading item from an established, robust stress scale and sent it out. The way you do this kind of study is, again, you use an app and have the question pop up on people's phones, and they have to answer it; you can set it up so it's quite random, which is really fantastic. And you know what? No change, in not one of our participants. The measure wasn't sensitive enough to detect changes in stress over a seven-day period. So people were answering this one item, this one question, for seven days; I think we set it up so it asked them eight times a day.
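[Editor's note: the sampling design Rachael describes, random prompts within each day over a week, can be sketched like this. It's an illustrative toy, not her study's actual code; the 9 am–9 pm waking window and fixed seed are assumptions for the example.]

```python
import random

def ema_schedule(days=7, prompts_per_day=8, start_hour=9, end_hour=21, seed=42):
    """Randomly-timed EMA prompts: draw n distinct minutes per day from the
    waking window, uniformly at random, and return them sorted as HH:MM."""
    rng = random.Random(seed)  # seeded so the schedule is reproducible
    schedule = {}
    for day in range(1, days + 1):
        minutes = sorted(rng.sample(range(start_hour * 60, end_hour * 60),
                                    prompts_per_day))
        schedule[day] = [f"{m // 60:02d}:{m % 60:02d}" for m in minutes]
    return schedule

for day, times in ema_schedule().items():
    print(f"Day {day}: {times}")
```

The point of randomising the prompt times is to stop participants anticipating the question; Rachael's observation is that even with that design, a single scale item was not sensitive enough to pick up within-week change.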
Rosie Gilderthorp: That's so interesting. It makes you wonder, doesn't it? If we were measuring heart rate variability, or breathing rate, or even sleep quality, would there be a difference there that wasn't detected by the question?
Rachael Skews: Yeah, you've hit the nail on the head. That's what we were comparing it to: looking at the EMA stress metric alongside heart rate variability and working out whether we could get a signal from heart rate variability. And you can; it's a really good measure of stress, and I think a more valuable one, because it's taking in those physical components: have you been sleeping well? Have you got a lot of environmental factors causing you strain, as well as the psychological factors? I was lucky enough to collaborate on this project with one of the world's experts on heart rate variability, Ulrich Kirk, who is an amazing colleague to work with. So we've got these ways of doing things and measuring these different things, and if you're innovating in this space with the right collaborations, you learn so much. I think that's the other side of it: we can get data from the technology that we wouldn't be able to get otherwise, and that's a very big positive. We've barely scratched the surface with that, I think. But should we talk about some issues? I feel like, uh…
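[Editor's note: for readers curious what "a signal from heart rate variability" looks like in practice, one common time-domain HRV metric is RMSSD, the root mean square of successive differences between beat-to-beat (RR) intervals. A minimal sketch, with entirely made-up RR series; this is not the metric or data from Rachael's study.]

```python
import math

def rmssd(rr_ms):
    """RMSSD over a series of RR intervals in milliseconds: higher values
    indicate more beat-to-beat variability, which at rest is generally
    associated with lower physiological stress."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Illustrative only: a variable (relaxed-looking) series vs a flat one.
relaxed = [812, 845, 790, 860, 805, 838]
strained = [650, 655, 648, 652, 651, 649]
print(rmssd(relaxed))   # noticeably larger
print(rmssd(strained))  # noticeably smaller
```

A wearable samples these intervals continuously, which is why, unlike a once-a-day questionnaire item, it can register within-day fluctuations, the sensitivity the single EMA item lacked.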
Rosie Gilderthorp:Yeah, I think we better had. I can imagine people at home being like, but this sounds scary. There's lots of problems. So yes, let's speak to that.
Rachael Skews: Well, let's talk first about data. One of the biggest issues is the management of sensitive data. There are lots of platforms coming out which are compliant with things like GDPR, or, in North America, equivalents like HIPAA, and you can check that they are GDPR compliant. However, a lot of psychologists are independent practitioners. If you're part of an organisation, the organisation is going to make those decisions about what technology it interacts with; it will have experts who can judge whether a piece of technology meets all of those criteria. For us as psychologists, bearing in mind we are usually dealing with health data, you can ask, and I wouldn't be frightened to ask, if you are thinking about using a platform. I don't know the extent to which different psychologists might be using AI like ChatGPT; I don't know many who are using it in their practice. But it's more about data storage and how we make sure we keep our client data safe. Usually as a psychologist you've had some training in that, and even if you don't feel confident judging a platform's compliance with GDPR and things like that, those platforms should have that documented somewhere; you should be able to request it from them, so I wouldn't be frightened to ask. Another organisation that may be helpful is the ICO, the Information Commissioner's Office. As an independent practitioner I'm registered with the ICO and use a lot of their guidance on best practice, so that's a great resource.
And I know, Rosie, you mentioned that this is something people ask you about a lot, and I just don't think at the minute that there is a simple yes or no. This is partly about your appetite for, and view on, risk. It's unlikely that an independent practitioner's business is going to be targeted, but the different platforms might be; that's just a reality of information and cyber security, that people targeting platforms are going to go for the bigger ones.
Rosie Gilderthorp: It's interesting, because these new tools get chucked into my inbox all the time: here's a note-taking tool, here's this or that that's going to revolutionise your practice, and they are usually US-based companies. So what I tend to do, not that I use many of them at the moment, is read the small print in as much detail as I can, and then maybe ping a message over; it's usually still the founders that will come back to you, because these are quite new companies, so you can have quite an interesting dialogue if it's not clear to me what's going to happen. But something I've noticed is quite common is that they say they will use an anonymised version of your data to train their models, and talking about that with my community, there are really varying levels of comfort with the idea. So what's your view on that: on transcripts from sessions, for example, being used to train these language models?
Rachael Skews: Well, firstly, I think without client consent it's not something that we should be doing. At the end of the day, those sessions are paid for by our clients. It's not something that I personally would choose to do, partly because I don't know what the utility of it would be, but that's for me, in my practice, and as a coach I'm not working in the same way that a therapist would. Again, I think this is a decision to make based on your view of what's going to be helpful for you. There is always going to be a risk, but there are ways we can deal with data and safeguard our clients. I feel like, as psychologists, we might not be that confident in this space, but it surprised me when I was working at Headspace that, because of my research background and psychological training, I had quite strong training in ethics, in dealing with data and in getting consent. Those things are familiar to us, and within the collaborative teams I was working in, other people were not as familiar with them and were not as used to dealing with health-related data as I was, and the rest of my science team, who were all psychologists too. So we are probably going to know more of the questions to ask. And if you're not comfortable with what those founders or the representative of a particular software provider tell you, then at this point you can just make the decision to say, I'll stick with what I've got now. Having said that, often the risk is quite low, and I don't think we should be frightened to adopt technology if it's going to be really helpful for us.
Rosie Gilderthorp: Yeah, and I think sometimes it's about the confidence to make your own ethical decision, because the way I always approach ethical decision making is to seek out peers to debate with; it is not going to be clear cut. For example, these note-taking systems seem to work by needing that data, which they say they will anonymise, in order to learn how to summarise a transcript into useful notes. Similar to you, I'm not really up for my clients being used as guinea pigs in that way, so it's a no from me at this point. But I think it's reasonable to debate it, and we had a debate about it in our membership community at Psychology Business School. Hearing different perspectives and going backwards and forwards is, I think, the cornerstone of ethical decision making, and we should, as professionals, be confident that we can do that; we don't need the decision to always be made for us. I think often people want to know: is this a yes or a no? And it isn't going to be like that in this space for a while. It might become like that one day.
Rachael Skews: Yeah, to be honest, my partner is a cyber security expert, and I talked to him about it when I set up as an independent practitioner. He said the only way you can take all the risk out is to lock everything down: you wouldn't even have email, you would have no permeability at all. So often within cyber security, what you're doing is just managing risk. It's not a binary yes or no; it's: what's the usefulness of this particular piece of software, and has it got all of the protection it needs to have? If you feel it's the right piece of software, then go for it. But it is really difficult if you don't have that technology background. Luckily, I just go to him and chat about what level of security I should be using for these different things. A lot of my training materials, for example, I have stored within Google, and Google is pretty safe, but for my client notes I've got an extra level of security on my laptop. When you're uploading things to the cloud, where are they going? That's one of the things we need to be a little bit careful of if we're using these transatlantic systems: where is the data stored? And of course, we've also Brexited, so it's likely that our information security legislation is going to change, and we're trying to stay on top of all of that. But I agree with you: I think as a community we can support each other. That doesn't take the personal responsibility away, but it means we can support each other and share. And I do a lot of supervision work for coaches, and it's something I often talk to my supervision clients about: what are the ethical elements of your business practice?
And I think information security and the influence of technology are part of that, too. It's a much, much bigger question for the organisations providing healthcare through technology. There are some big ones in the US; Woebot, for example, are providing therapy using a large language AI model, completely transparently: they let people know that's what's going on. But all of that data, how are they storing it? There have been some horror stories about technology organisations selling client data. None of us would do that; as chartered psychologists, we would know it's completely unethical. So sometimes, with these technology organisations, their view on how to handle data is not as developed as ours, and I think it's really important to ask the questions and ascertain your own level of comfort. You've got organisations like the ICO that can be helpful and supportive, and the peer community and supervision community that we have. But, yeah, like you say, it's going to be a little while before there is a really strong sense of the right way to do things, the right legislation to be complying with, and whether or not using a particular piece of technology would mean taking an unnecessary risk.
Rosie Gilderthorp:I mean, this stuff is just fascinating and I could talk to you about it all day. But I'm conscious we've taken up a lot of your time already. And so I imagine that people listening to this will want to learn more from you and will want to find you on the internet. So where's the best place for people to go?
Rachael Skews: So LinkedIn is always a good place to find me. My surname is pretty unusual; there are not many Skewses around, and if there are, I'm probably related to them. So LinkedIn is always a good bet, and I post little bits and pieces of information there, so people can get in touch that way. And then I have my own business, Cognus Consulting, where I do coaching, training and supervision and other bits of content, shall we say.
Rosie Gilderthorp: Amazing. All right, I'll make sure those links are in the show notes so that people can come and find you easily. Thank you so much, Rachael. This has just been fascinating.