AI is already better than most coaches — faster, cheaper, and available 24/7. And yet, there’s one line that draws a hard boundary no algorithm can cross: it can’t bleed. In this episode, Kellan, Yi, and Ankush dismantle the myths propping up modern coaching, expose why the middle of the industry is being hollowed out, and name the brutal choice facing every coach right now: evolve into embodied, transformational leadership — or disappear.
🔥 Key Takeaways
👉Ready to turn your truth into impact? Join the Dream • Build • Write It Webinar — where bold creators transform ideas into movements.
Reserve your FREE seat now at dreambuildwriteit.com
👉 Future-proof your coaching by learning to lead from lived experience with Yi Jinkai Shan at resetandrise.com and Ankush Jain at ankushjain.co.uk.
Welcome to the show. Tired of the hype about living a dream? It's time for truth.
This is the place for tools, power, and real talk so you can create the life you dream of and deserve: your ultimate life. Subscribe, Share, create. You have infinite power. Hello and welcome to Your Ultimate Life.
This is one of the special Thursday episodes that we started a few months ago in anticipation of the book Coaching in the Rise of AI and the impact that AI may or may not have on the coaching industry.
Specifically, as with the other Thursday episodes, I've got a couple of experienced people who have their own set of opinions and actions with regard to this new technology, and we're going to just talk about it today. So Yi, welcome to the show.
Yi Jinkai Shan:Thanks Kellan. Happy birthday, man.
Kellan Fluckiger:Thank you. Kush, welcome to the show.
Ankush Jain:Thanks Kellan, and wishing you a very happy birthday.
Kellan Fluckiger:Well, I appreciate that. So that'll be fun. So I don't have a particular set of questions, I just want to get the conversation started.
So Yi, you happen to be on the left side and I read left to right, so we'll start with you. AI. Are you using it? If you are, how are you using it?
You know, what are you doing with this new technology, if anything?
Yi Jinkai Shan:And why or why not? 100%. Just some context.
I was a tech entrepreneur, and then I transitioned from building one of the fastest-growing startups in New York City to coaching executives, social entrepreneurs, and entrepreneurs to halve their effort and 10x their positive impact in the world. And now I am building a platform where we're combining some of the most powerful live experiences in one place, called Reset and Rise.
And the short answer is absolutely, I'm using AI. As an experienced techie I'm always interested in the latest trends, and I've found it's been an incredible tool to shorten time. I've been using it to get some of the tedious things started so a human can leverage them and save time, specifically, for example, drafting legal documents or doing some reflection on my own.
And I've definitely used AI as a coach, as a mirror for reflection of some sort, in the past as well, on top of using generative AI for output and for business- and legal-related items.
Kellan Fluckiger:When you use it for reflection, what's your experience with that?
Yi Jinkai Shan:I've had different experiences with different GPTs built by different people, designed for reflection purposes by different teachers, and I found that they're built quite differently. The typical experience I've had with ChatGPT is that whatever I say, it gives a very people-pleaser kind of reflection. Right. It doesn't rub the ego the wrong way.
In the past I've also experienced versions of GPT, when the o3 version came out, that were actually very direct and don't try to people-please. I've had AI reflect on things like, where are my shadows? It's actually quite confronting, but the newer versions don't quite do that anymore.
And I've also had experiences with GPTs built with a specific teacher's teaching in mind. And I've had friends who tried to build really confronting AI coaching. I've had some deeply transformative experiences.
I've had experiences where I was crying because I was shown parts of me that I wasn't paying attention to, that I wasn't quite experiencing on my own. That was surprising, and it was actually very healing and at the same time quite attuned.
Kellan Fluckiger:That's fascinating. We'll dig into it more in a minute. Ankush, I want to give you a chance. So are you using it? What are you using it for?
You mentioned something in the green room about building something. So tell us about what you're doing with this technology right now.
Ankush Jain:Yeah, so I would say 12 months ago, and perhaps even 11 months ago, I was doing absolutely nothing with it. I have done a complete 180 on AI, and maybe that's not 100% true, but I've had quite a drastic shift this year.
So a year ago, anyone talking about AI and AI coaching, I kind of just ignored it, pooh-poohed it, didn't really think much of it. And then in January this year I met someone who was really into it, who was introduced to me by my coach at the time, Steve Hardison.
And so we got talking. I started using ChatGPT in a big way. I'd kind of played around with it a tiny bit before, and I found it very, very helpful.
But I was still, I don't know, not really seeing it as a coach. I kind of got into using it quite a lot, and I'm not saying I'm any kind of super user, but amongst friends, etc., I seem to have been using it a lot more than anyone else I know in my immediate circle, for a variety of different uses, including once or twice going to it for kind of advice, or coaching you might call it. And I've had various results. I can completely agree with Yi that ChatGPT can be very agreeable, but also the opposite.
So I think there was some prompt that someone put out which was like, give me the no-BS response on what you see, et cetera. And it's been interesting, because as I started using it, I actually introduced a number of my clients to it, who typically are kind of leadership coaches.
And I was using it to help them get more out of our coaching, saying, look, you can use this to prepare for sessions, we can use this to help refine your goals and make plans. But I've also used it with a client once, using that kind of no-BS prompt.
And we walked through how it actually misses the mark in terms of where I would point him. Now, fast forward a little bit. A few months ago I started having a conversation with a company called Awakening, which is kind of an AI coaching platform, but more tailored towards the higher end, with JP Morgan, who has been on your show, I believe.
Kellan Fluckiger:Yeah, I think he was the second or third episode of this, back in September or October.
Ankush Jain:Yeah. So I've kind of partnered with them, and we're currently beta testing my AI clone.
It's been out for just over a week or so now, which is quite interesting. So it's kind of good that we're recording this now, when I've had a little bit more of that experience.
Again, that's something that probably even six months ago I wasn't really drawn towards, and I'm still kind of figuring it out day by day.
But what I've seen so far in these early days is that it's very useful. The way that Awaken are doing it, it's tailored very specifically not only to be a coach, but actually to model on specific coaches like JP, myself, and other people. We're refining it, and it's not me, but it's incredibly useful. So where I'm seeing it right now is definitely as a support tool to the work that I do. And I can't remember who I spoke to.
Maybe it was Townsend, some months ago, who was saying he was seeing that AI can be a really powerful support to our practices.
And I'm certainly moving in that direction and certainly seeing that. What degree that will take, no one really knows, and how it's unfolding. But it's certainly a very interesting time.
Kellan Fluckiger:Cool. If you think about it, and this is for either one of you, I don't care: what do you see? You've mentioned some of this in your answers.
We don't know how it's going to evolve. But here's what I noticed, what struck me. I help people write books.
So I started two cohorts of people writing their own stories and creating products from their own life experience, from what they have experienced and become, kind of like we do with coaching and coaches. I started that in February and started writing; I promised the group I'd write a book with them.
About a month into it, I had this sort of download or experience that I needed to do a different book, and that's Coaching in the Rise of AI.
So I ended up doing both of them, but I did this one first. And in the research and writing, seeing the state of the coaching industry right now and talking with the AI as deeply as it was able, here's what I noticed. I noticed that about every three months, like twice in the six or seven months it took me to research and write, it doubled in its capacity.
That's a subjective assessment, but in its accuracy, its speed, its relevance, its reasoning, it doubled and then doubled again.
And in part of the conversation with it, I asked, well, given the speed of development, what do you think is going to happen with this? So I guess the question for either one of you is about what passes for coaching today.
Like, I'm still seeing ads on Facebook, be a life coach, live on the beach or work on the beach, do whatever you want from wherever you want.
And most of what passes for that, and most of the schools or methods I've been associated with, either directly because I've taken classes, or material that I've read or worked with, is irrelevant and will become more irrelevant as fast as you can breathe. So given whatever experience you've had with this tool's development, what do you think it portends for the coaching industry?
Yi Jinkai Shan:Personally, from my own experience and just watching technology evolve over the years as someone who worked in technology, I believe it's going to keep growing, and exponential growth is something the human mind finds hard to grasp. So I think it will surprise a lot of us in a lot of ways how fast its capacity is going to grow.
And I believe, generally, as a lot of these technologies evolve, it's going to grow faster than we're comfortable with, faster than a lot of coaches are comfortable with. So it's a possibility that our mindset will evolve more slowly than AI does.
And so I see the possibility of this whole industry completely changing, and a lot of coaches losing their livelihood, losing their jobs. Because in my personal experience, I've already tested it; I've had experiences with AI that are better than the average coaches I've encountered so far.
Especially the ones you mentioned, Kellan, that are trained in traditional coaching schools.
It's already happening, and the AI is almost free, better, and available anytime, so it's already better than the average coach. It's just a matter of time before the everyday man and woman catches on, and a lot of coaches will lose their jobs. I truly believe that.
And on the opposite end, I do believe that as AI evolves there will be a counterbalance of us wanting more human experience, being in connection with each other, more community, more connection, more live experiences. As with everything, when technology evolves, the average sinks to the bottom because technology gets better.
But there are certain top-of-the-top human interaction experiences that will be so precious, like having dinner with good friends. It will take a while for AI to replace that kind of experience.
And so I do believe that when it comes to coaching, there are a lot of experiences like that which AI will take a while to replicate, and actual humans, on top of receiving top-notch AI guidance, will really crave those kinds of special 'we're together, we are a tribe, we are connected' experiences.
So we will go to both extremes: most coaches will lose their jobs, and some of the best ones, creating really precious human experiences, will thrive and find ways to support more humans in those connections.
Kellan Fluckiger:So my thesis in the book is 5%; 95% are going to lose their livelihood. And I define making a living, arbitrarily, as 100k US dollars. I just don't think there's any space in the middle.
But that's just my research and my opinion. Ankush, what do you think about that? Your thoughts about what it portends for our industry?
I mean, I know your goal is to change the consciousness of the coaching industry and fix the world. And I love it, and I'm right there with you with the university and what I'm doing. So what do you think it portends for what now passes as coaching?
Ankush Jain:Yeah, you know, I've heard a lot of answers like the one you've given, and that may be the case. And this year I've gone through an emotional roller coaster with AI.
You know, like I've gone from this is the greatest thing ever to holy shit, this is the end of the world. And at one point I was, you know, sucked into watching YouTube videos from experts about how this is really bad and all sorts of stuff.
And I think drawing on my own personal growth journey, I kind of caught myself going, I'm operating a lot from fear here, which is never the best direction to come in.
And I think there are a lot of coaches like that, and I'm mindful of people listening to this; I don't want this to be a fear-based thing of, oh, we're all going to lose our jobs. So I kind of stepped back from that and I started talking to real people, right?
People who are outside of our coaching bubble, or certainly my traditional network, people who are out in the world, not coaches, using AI. What were their opinions, how are they using it, what does that look like? And it settled me down a lot.
So my sister-in-law works for a business consultancy, and I asked her, hey, what do you think about it? And she said she doesn't see it; it doesn't mean she's right, it's just a data point, right? She's like, I don't see any change in the next decade.
And how they're using it right now, she goes, is it's allowing us to do stuff we couldn't do before and trawl through vast amounts of data and whatever. But it still needs the same human element to present that to the client. It's kind of increasing the capability.
My brother-in-law just did an MBA where his thesis was on AI and AI in leadership. So I was like, what are you thinking about it, right?
And he was going, well, actually, what we looked at with AI is how, as leaders, we manage a hybrid workforce, because we're going to have some agents and we're still going to have some people. So how does that work? But I was like, are you worried about your job going? Are you worried about being replaced?
And he goes, again, not for the next decade, right? And he was right there at the cutting edge of it.
My other brother-in-law, I speak to him and he's in a very senior role in a big energy company, and basically his job is deploying AI into their workforce. And he goes, hey, Kush, this is the thing I'm seeing: it's actually really expensive right now. So they're deploying it so that, basically, when a customer emails them, the AI does the interaction, gets all the data, gets all the information, and then gives it to the real person at the other end of the phone to deal with the customer. And he said, in theory, we can actually get rid of a lot of our workforce and have AI do this.
And he goes, in reality, that's not what's happening, certainly not right now. And this may change.
But he goes, right now what's happening is it's actually increasing the level of service we're able to give to our customers because the AI is getting all the right data.
So when we've got someone on the end of the phone who's really busy, they've actually got everything they need to give a much better solution to the client. And again, I've got another client who's chief AI officer for a startup. Again, these are people who are quite knowledgeable, working in the field.
And I realized none of these people were kind of worried or scared.
And then I spoke to someone about this idea of coaches losing their livelihood, and they challenged me on it and said, well, how many coaches are making over 100,000 right now anyway, right? And I was like, that's a fair point. If you look at the ICF data, it's only 1% of coaches who make over $100,000 a year.
And that's ICF-trained coaches. People might say it goes the other way, but I would think if you actually included non-ICF coaches, all of the averages come down.
I think actually it doesn't go the other way. So it's way less than 1%. It's a very small number of coaches who are operating at that level. And I think it's a good thing.
My family member, I don't know exactly how we were related, I guess he was technically my dad's cousin, or my dad's uncle, I can't remember, but he was a very, very successful businessman. Very successful, and he passed away far too early in life.
I think he was not even 60, but by the time he died he had built up a business and basically bought a house in the nicest part of where we used to live, right? Huge, huge property. And I had heard some stories about him.
He was a very successful businessman and he used to say there's nothing like a good recession because he said it clears out all this stuff.
And if we look at how the coaching profession has grown over the years, even before I was in it, there was a time when you could put out an ebook, put up a website, and make 100 grand a year. I know, I've spoken to someone who did that. Right? And they did.
Kellan Fluckiger:Right.
Ankush Jain:And then Google came along, the algorithm changed and suddenly that didn't work anymore.
So we've always been on this path where we've had to up-level the service that we give, and right now I think it's just pushing us more and more in that direction. But I'm actually very hopeful, and what I see when I speak to people is that there is a space for AI. I'm definitely seeing that; that's why I've got a clone.
I think there's a space where that can support what I do. But right now, and this may change in a year, two years, like we say, it's exponential growth. You have no idea what it's going to turn out like.
But I'm a lot more hopeful, and I also don't see it, ever.
And I might be wrong, and maybe we'll get into one of these sci-fi futures, but I don't see it replacing friends in the same way, or a robot spouse, you know. Maybe there'll be some people who do that, but there's something about that humanness. Coaching is the most intimate service you can buy.
Kellan Fluckiger:It is. So let me give you some other things.
I alluded to something before we started and said I'd tell you during the show, and here's the right time. When I asked AI, I said, okay, you do all this stuff. And I think you're right.
When I said 95% won't be able to make a living, you're right, not very many people make 100k. But there's a ton in the middle who earn 50 or 40 or 30 or 70, and it's a part-time job or they work with a spouse. That whole middle is going to be gutted and gone. There won't be anything for them to do at all.
You're either going to be up at the level where you're doing high-value, high-powered kind of stuff, or you're going to be fine doing something else. That's what I think, and I think it's because there is now a call, a demand, a requirement to up-level. And so here's what I asked.
I said, okay, fine, you do all this really well. And I can't describe, we don't have enough time for me to describe, what I did in this research, but I poured in all kinds of stuff.
I had a thread in ChatGPT that I called One Million Words, and I put in my books and transcriptions and all kinds of stuff, and after a while it corrected me and said, try four and a half million words.
So that's how much of my stuff it has. So when it talks to me, it's the same kind of thing.
And I'm building a clone too, because I think it has real value, and I think it's going to gut all the other stuff. But anyway, I said, you do all this really well. And I, too, had an emotional experience.
I intentionally asked it a whole series of difficult, probing, searching questions. And if you tell it to quit fluffing you up, it will.
What it started doing instead is, okay, here's the straight, no-BS answer, right? It prefaced everything with that. And then finally one day I said, will you quit doing that too? Just talk to me.
Enough with this crap. So it doesn't do any of it anymore, it just talks to me. So I said, so what do you suck at? What are you terrible at?
What are you not going to be able to do? Because you've basically eviscerated all these coaching models.
And I'm not saying the 11 I used were all of them, but it was many of the ones you know about, including ICF. And it ranked ICF as extremely high vulnerability to AI and gave all the reasons, because it's so formulaic.
But anyway, it came back and gave me a whole bunch of well-reasoned answers. And then it said one thing that is the answer.
It said it straight up. After all this crap, you know how it writes effusively, the last line, one line: I can't bleed. And I read that and I thought, okay, there we are.
So what you're alluding to is the hollowing out, the up-leveling, the technological requirement to get our choices made and our crap together. And I'm going to give you a model, not a model, but a thought, and I want you to respond to that or anything else.
I said, I think there are three things that coaches have to think about in this choosing. First of all, there's the head-in-the-sand problem. There are lots of people in this context that I talk to who say, well, you know, the human element, and I've got that, and nobody can replace me, and that sort of thing. Which is true, except you're not really doing that. You're talking about it, but it's not happening.
So that's the head-in-the-sand problem. The second problem: picture a casino with a whole bunch of blackjack tables, all $10 tables, and they're all full of robots. The only place there is for a person to sit down is in the high-roller room, and above the door it says ante: $10,000.
I think the ante to be good in this game has just gone up a thousandfold. I don't know if that's the right number, but it's a lot, not a little.
And the third thing is, if you're going to really coach at the level that changes consciousness and creates the kind of intimacy and power that needs to happen, that's work.
You have to stay, and be, all the time, on the mountain of growth, because if you're not, you're not going to be in a position to create that kind of space. And it's hard. It's work to stay on that. It's intentional. You have to be focused.
My experience is that there are some, but most people aren't, because they're not now paying the dues and doing that work on the mountain. So that's kind of what I'm thinking. I'm agreeing with you in terms of leveling up.
And I still have that concern that it will eviscerate most of the middle and lower end. You know, the bottom two-thirds, three-quarters is going to be history. So what are you thinking? What happened in your head when I said all that?
Yi Jinkai Shan:I'm with you, Kellan. Ankush, when you were sharing, like, hey, let's not respond or react with fear, that's generally my approach to life as well.
And I want to make a distinction, right? Facing what might be, looking at it objectively even when it's bad news, is not the same as reacting with fear.
Because looking at the data, looking at the trend, I'm with Kellan here that it's very, very likely. I'm willing to make a pretty high-roller bet at this moment that a lot of coaches will be replaced.
It's not because I'm fearful, but rather that's what is. A lot of manufacturing jobs were replaced when automation started, and people lost their jobs; that's sometimes how technology works. And by facing what might be, realistically, and integrating our fear, I believe that's a good place to approach it from, so that coaches can, as you talked about, Kellan, actually face it: okay, what do I need to do? How can I level up? I do see that's important.
Look at people driving Uber right now: a lot of cities already have driverless cars. That might happen to the coaching industry. Okay, what do we need to do? How do we level up? How do we shift our approach?
And I love the optimism that it won't happen for a decade. I personally don't believe in that kind of timeline. In some ways I think that gentleness might be good for society, but I believe it's a Pandora's box that's already been opened. We cannot control the speed and say, give us 10 more years to prepare. So I'm with you, Kellan, that I do believe a lot of them will be wiped out.
And much as I wish I could just talk to a friend and be convinced otherwise, as a technologist and a coach and an entrepreneur, my approach is: let's look at reality as it is.
It's also my Zen Buddhist side talking: how can we just be with reality as it is, integrated, and how can we respond in the present moment with what we know, to the best of our abilities?
Not from fear, but rather from really having a clear head: okay, what's supportive, how can we support each other, how can we find the human spirit and together find the best solutions moving forward? It may be scary and sad for people to lose their jobs and go through changes. How can we make the best of the situation?
Kellan Fluckiger:Ankush, I want you to respond, but a couple or three months ago there was a big announcement between Microsoft and OpenAI, which does ChatGPT. I don't know if you guys saw the details of that announcement, but they renegotiated their contract, and I was shocked by one thing I saw in there. There was an enormous increase in funding available to OpenAI, so that's going to accelerate development even faster than it already was. And that came in the middle, or near the end, of writing.
And I'm thinking maybe I need to revise my timelines. Maybe it's Thanksgiving, maybe it's Halloween, I don't know. But I didn't.
The other thing is they had actual targets in that agreement for the achievement of AGI, artificial general intelligence, which is like the beginning of the holy-crap moment, right? So they actually have targets for that, and they're not 2050; they're now more like 2030 or 2029, which is 15 minutes from now, essentially.
And when you get as old as I am, it's 15 minutes from now. And so I saw that and I thought, yeah, okay. So anyway, I just thought of that as you were saying that. Kush, what are you thinking about that stuff?
I saw you making some notes, and it looked like stuff was ready to explode out of you.
Ankush Jain:No, no. I think we're actually aligned in more ways than not.
When I spoke about fear, I was talking very personally, you know. So I see my own fear, and I do see fear in other people. And I think it's important.
I think there's a responsibility for us to talk more about what to do, and not just the headline that everyone's going to lose their jobs. Because I think in some ways nothing's changed, right? It's just become more pronounced.
And I say nothing's changed in terms of what it takes to be successful as a coach. If you were doing just frameworks and models, I don't know, maybe there were people making a living doing that, maybe I'm naive, but I haven't seen it, certainly in Europe. Maybe it's different in the U.S., but in Europe, the way the culture has been, a lot of stuff is state-funded.
So if you're a coach and you're getting paid to coach, you had to have something real to offer that was not just, hey, here's a model that I've gone to school to learn. And so what I've seen is that the thing it took to be successful as a coach, yes, it's the same thing. And I wrote something down. You said you need to stay on the mountain and it's work, and it made me smile, Kellan, because that's me.
And in fact, for me, it's not work, right? For me, and for a lot of coaches I meet who see coaching as a calling, it's not work.
You know, in my book there's a section where I said, to me, coaching is a calling. And Steve Chandler challenged me on that and said, you know, is that true? Because he's all about coaching being a blue-collar job.
And he said, keep it in there if it's really how you feel. I mean, it's 100% how I feel. And I've been talking about it as a calling since then.
I've been meeting more and more coaches who are like, we didn't get into this because it was a job or a good career path. It often doesn't make sense to a lot of people, but they've really been drawn to it.
So I think the ones that are successful, let's just say the ones making over 100, I don't know, maybe again I'm naive, but I see a lot of them wanting to keep growing, wanting to change.
And this is one of the things: I was interacting with ChatGPT earlier this year, asking it stuff like what you're asking, what's going to happen, what does the data say? And then I just paused and went, what's the thing that everyone's missing? Right?
Because we can talk ourselves into thinking things are a certain way, and I totally understand we need to look at what reality is. And I did a quick little search just now on the things that I noticed, and I think there's a whole bunch of things that no one really knows.
So I'm, by the way, totally open to the possibility that the fastest timeline we've got for AGI and whatever is possible; I'm not saying it's not. But what I also know is that when you look at anything, whether it's polls for presidential elections or whatever, everything's wrong. There's always stuff we don't see. We live in such complex systems that we don't fully understand.
Like right now, when you look on a lot of the forums, people are moving away from ChatGPT and going to Gemini and using other things. I've started using Claude a lot more myself. So there are other things.
And on a personal level, one thing I found is that it's really helpful for writing, right? At one stage I was like, oh, this is great, this would help me get more of my stuff out onto paper and get more books out.
What I found is, it hasn't. What I found is I'm just using my time in a different way. It's making my writing better.
I can use it to critique and coach me through my writing, which it has. But right now it hasn't really saved me huge amounts of time when I stop and look at it. And so I think there are definitely going to be job losses.
My next-door neighbor lost his job. There are definitely job losses coming, and we can't bury our heads in the sand about it. But maybe this is my eternal optimism.
I think this is also a massive, massive opportunity.
And so one of the things I've been speaking to leadership coaches about is: given everything we know about human potential, leadership, whatever, let's say we know this, whatever it is, right? No one knows everything, but we know whatever this is.
And then I said, how much of all that are leaders really embodying and knowing right now? Is it 95%? And whenever you ask, they're like, yeah, it's probably about 2% or something. We haven't really saturated that at all.
There's this huge gap. And so for me, it's about the coaches who are interested in that and focusing on that, and there are a lot of people.
It's interesting to me, when I started talking about raising the consciousness of the planet, how many other people, like you, have kind of similar things going. There are a lot of people interested in that.
And I spoke to a coach the other day, and it's like, I think we have an opportunity, because what AI needs right now, and a lot of people have spoken about this, is leadership. The way we use it and the way we teach it will, in a way, decide how it goes forward.
I forget the guy's name, but he's known as the godfather of AI or whatever, one of the early guys who started it, and that's one of his warnings, right?
We need to teach it how to be as it gets more and more intelligent. Well, I think there's a role for coaches to play in that.
And maybe it's just what I see as coaching: I don't see models and stuff as coaching, personally. And for anyone to be successful right now, and I think the further away from America you go, the more pronounced this is. I've got a client who's a business coach in India. Culturally, it's so far behind the US that to be successful out there, you can't just be teaching models anyway.
And it's much more intimate, I feel. So that piece is important.
What we also don't know, and are still trying to figure out, is how this is helping us enhance our offering. So I know what Townsend said: wouldn't it be interesting if you were on a coaching call on Zoom and the AI was going on the side, supporting the coach? Right. Because one of the things I find is that I coach very intuitively, and sometimes I say stuff that I've never said before, ever.
Like I'll say things for the very first time. And so in one way, I think I can create an AI coach that's actually a better coach than me. Right. As a pure coach.
And I think that's the aim, to create something even better than me. But then there's another thing, which maybe isn't coaching, that I can support. There's a difference.
And maybe we're all kind of pointing to the same thing, which is the direction I think this profession, and I say profession rather than industry, is going in.
And I also think there'll be huge opportunities, Yi, for whatever platform you're creating, for Awaken, for other things.
I think there are going to be a lot of people who do very well, who develop something based on this technology that will be helpful for people. I think it's still early days. I don't know what the adoption rate is going to be.
It's interesting that JP said to me that people use his AI, which is really good, and then contact him saying, can we hire you? If you'd asked me a few months ago, do you think that would happen? I'd have said no.
You've got something for 30 or 50 pounds a month, or you've got the real guy at multiples of that; if they use the AI and it's really good, surely they'd stick with that. But there seems to be this desire even now. How that evolves, I don't know, but I just think there are a lot of variables out there.
Kellan Fluckiger:There are. It looked like you were ready to explode with something. Why don't you say what you're thinking?
Yi Jinkai Shan:Oh, I was just more confused. I'm just listening, like, okay, where we're headed.
Kellan Fluckiger:All right, so I had something to say, but I thought you had a comment. So I think there's wild opportunity. I'm extremely optimistic and I'm out-of-control excited, which is why I'm starting the university.
I'm committed as anything to the coaching profession.
And that's why, in the book, I propose a new model and method that I had go through the same analysis as all the other ones, and, maybe it was being nice, but it told me the vulnerability was zero. It didn't even rate it high, medium, or low. It said zero, and I asked why, and we dug into it deeply. So I'm excited, and I think...
Ankush Jain:What you're saying, could you explain that, Kellan? I just want to understand what you mean by the vulnerability, in terms of replicability.
Kellan Fluckiger:No. So I picked 11 models that I knew, and I wasn't pretending they were all of them.
And I ran them through a series of analyses with AI and said, so how effective is each of these models at accomplishing the desired outcome? Recognizing that people pay for results, and they pay with money, not with feel-goods, I tried to be as hardcore practical as I could.
And it gave the best analysis that it could based on available data. And you know, that's flawed just because everything's not available. And I understand all that.
And then it said, okay, here's how good it is at achieving its thing and here's where the holes are because it focuses too much on this and unless you do it this way and all this kind of stuff, it's going to miss these parts. And I'm not going to give all the details right now.
Not because they're secret, but because it would take too long; you can read it all in the book.
So then I said, okay, given your own rate of development over this last stretch and what's planned right now, the releases that are coming, and given what you can research and understand about each coaching methodology from books and such, how vulnerable do you think each methodology is to the advances in AI? In other words, could AI do it instead of a real coach? And I acknowledged, you know, there's the human element, and I put all of those caveats in there.
And so it ranked each of the 11 models on how good they are at achieving the desired result and how vulnerable they are, just using low, medium, high. And then it told me why, told me all the things and all the caveats.
It did a decent job explaining it and putting maybes in the right places. So I had already been working on something that embodies what you were talking about: the reality of the truth, the sacredness of space, and the intimacy of human connection, things that both of you have said.
And so I said, okay, good, now that you've done all of these, with this one here's what I'm looking at, and I described it in long, long detail, thousands of words: this is what I'm trying to do, and so on. I said, I want you to take it through the same analysis using the same methodology.
How good is it at achieving the results it's supposed to achieve, and how vulnerable is it to AI? So that's what I meant. It took it through the same analysis and gave me its thoughts.
Yi Jinkai Shan:I love that, Kellan. Absolutely. I think we're on the same page and the same mission. I am building the platform Reset and Rise, where we're combining things.
I'm recruiting some of the best coaches in the world I can find, running modalities that so far no AI can run.
These are some of the most powerful, effective approaches out there. I've experienced many of them; one session is more powerful than weeks or months of me doing traditional coaching work. And then we're combining them, and everything is live, with real humans, in community.
So we're almost going to another extreme around that: okay, there are things that AI will do really, really well, but here are some top-notch coaches who are at the top of the mountain, at the top of their game, running some of the best tools out there that AI cannot quite do yet, because it doesn't bleed.
Kellan Fluckiger:It can't bleed, it can't bleed. That says everything, at least to me, 100%.
Yi Jinkai Shan:It doesn't quite have a human heart, or the connection of being with each other. And so I'm with you; I am really excited about that possibility. There may be some growing pains.
But AI will help reduce a lot of the redundant, repetitive things that humans are actually not very good at, things that are very boring for a human to do, and it will do those exceptionally well. And then it will let a lot of the top-notch humans, some of the best coaches, really float to the top.
And especially a lot of the ones I'm working with, they're amazing healers and coaches, but they're not good at being on top of the mountain shouting about their services. So I'm excited to create a platform where they can actually be seen.
We will have partnerships with companies and with philanthropists donating scholarships, and we'll work on changing school systems, changing education systems, so that a lot of this emotional transformation work, the most effective of it, can be democratized and made widely accessible. So I'm with you. I love what you're doing with the university as well.
It's like, okay, let's leave some of the stuff that AI is getting better at than the average coach to the AI, because that's where the industry is going. Let's not fight that.
But how can we create things so powerful that only humans can do them, that only people who bleed can do exceptionally well, and make that available for the world to benefit from?
Kellan Fluckiger:Cool. All right, so we're over time and I don't really care. I want to make sure each of you gets to share.
What's on your heart that we didn't talk about but ought to? Is there something you didn't say that you're like, I really need to say this?
Ankush Jain:I just want to say, yeah, I love what you said. I love the sound of the platform you're creating, Yi. Kellan, I love what you're doing.
I don't think there's that much difference in the direction we're all thinking in.
It might seem like there is on some things, but ultimately there are a lot of people I'm noticing, and maybe again it's just who I'm surrounding myself with, who are very excited and are using this period of change to create something that we weren't able to create before. And it's going to be exciting to see how it plays out. There'll be some great stuff out of it.
There'll be some stuff that's not so great, but I think you're right. There are so many people that I know, coaches and mentors that I've had, that no one knew about, right? That weren't well known.
There are so many coaches I come across who do exceptional work, who have struggled, and this may be an opportunity for them to really rise up.
I don't have anything burning to say, but one thing just occurred to me, Kellan: one of the things my coaching has developed into over the last three or four years is really the work of being, or what some people call ontological coaching.
And for me, I think, again, that's a real piece that the AI can't do, because it's actually the journey of the coach stepping into and learning, not perfection, but being what we teach, being what we coach. Walking the talk.
I remember my time with Steve Hardison. I got so much out of what he said, but then I learned so much just from who he was. And I'm finding the same thing. In spiritual circles, for years they've called it transmission.
And I don't know if it's exactly the same, but it's like when a person says a certain thing, it can land. And we've all had experience of that. The right person says the right thing at the right time, and it lands.
You might have read it a thousand times or heard it from other people. And I think the secret sauce of that is who each of us is being.
And I really think about this for me, in terms of, maybe this is what you mean by staying on the mountain. It's like, if I'm going to coach someone on their marriage and my marriage is not very good,
it doesn't matter if I've got all the right things to say; that doesn't work. And it's the same thing in every area of my life.
And it's also, and here's the rub, it's not about perfection, right?
Kellan Fluckiger:It's about growth, opportunity. What you're saying is why one of the schools in the university is the School of Transformational Authorship and Creation.
And here's what I say, look, the only thing that we have to offer, the most powerful gift and power that we have, is the story of our own becoming, because that's all we own. Everything else is that thing over there. I don't care what you learned how to do or learned how to talk about.
The only thing that matters, that carries the power you're talking about when you speak, is your lived experience, the story of your own becoming. That's all there is in this thing.
And the thought, the basis of the model I proposed in the book, is that you have to be the full-on embodiment of what you teach. Not perfect, but in that process, because that's when that transmission happens, and that word occurs a lot in the book.
It leaks out of your eyes and ears and pores and mouth, because it's the energy of your being, and there are all kinds of principles and things to talk about. So yes, yes, yes, and yes. And embodiment, there could be better words, but that's the word: it is the essence of who you are. And I agree with you.
It's not hard when you choose that that's who you're going to live as; it's just what you've decided to do. Yi, do you have anything you want to close us up with?
Yi Jinkai Shan:Yeah, I'm with you on being present with each other as humans and on our stories of becoming. What's on my heart, on this day of your birthday: I remember the first time we met, when we both shared our stories of becoming.
I remember we both had tears in our eyes sharing that, a fellow human who went through what we each went through to get here and to be devoted to the service of humanity. A fellow human who bleeds, who goes through the hardship of being human, and we can relate to each other.
And to me, that's the human spirit. That's what we're here for. We come here in flesh and in spirit, to be of service and to go through the hardship of loss, of gain, of love, and that is so precious. AI can talk all the pretty talk, but at the end of the day it doesn't bleed. It's different than sitting across from an immortal, non-bleeding, heartless being, even if it's intelligent, versus sitting with you, Kellan.
When we have our conversations, that's irreplaceable by AI. I will always remember it. It comes to my mind often, and I'm just feeling deep gratitude and love for the way you've been living your life and how you show up.
That continues to inspire me and touch me and remind me why we're here. Even with AI taking over and making a lot of changes rapidly, at the end of the day that heart, that being, is what really matters.
And I'm just feeling deep gratitude and love towards you for being a human that's really walking the talk.
Kellan Fluckiger:Well, thank you. I want to thank both of you for being here.
Ankush, thank you for sharing your love, your heart, your time, your experience, your insight and your encouragement today. Thank you.
Ankush Jain:Thank you, Kellan. And I just want to say, really, that was beautiful, Yi.
When has someone acknowledged somebody like that on a podcast? I've never heard it. That's the first time. So that says a lot about you, Kellan.
That you have people coming onto your show and speaking about you in that way is remarkable, and it says a lot about you as well.
And I think that's such a beautiful way to end the show, because it really points to this highest space, which is maybe where I really see the profession going: that real heart space, beyond the techniques and models and everything. So thank you so much for having me on here. It's been so wonderful.
Kellan Fluckiger:Yi, thank you for being here and sharing your heart and your love and your insights and your encouragement and the generous energy that you have around you. Thank you.
Yi Jinkai Shan:Thank you so much for having me. It's lovely to be with you, Ankush and Kellan. Thank you for your being, and happy solar return.
Kellan Fluckiger:Yeah, thank you. So, listeners, this has been a really good episode, and I want you to go through it. You know, this is for coaches.
And like I said before, this isn't really marketing, but I think you need to read that book, and at the same time, you need to examine your own heart. Like Ankush was talking about, people that are dedicated to that growth, that's all there is left.
If you're not dedicated to the growth, you're done. And that's the thing that's going to be left and what's required.
And I say it boldly and clearly, and it can piss you off or whatever, but that's the point: it's a choice you get to make about who you're going to be and how you're choosing to show up, representing that sacred opportunity to be a light, encouragement, and love for those who turn to you for space and encouragement and healing. So this is the end of the show, and I want to encourage you to listen, listen twice, and look both these guys up. They're cool and they do good stuff.
And if you listen, pay attention, you'll be able to move forward to create, reach your ultimate sky. Never hold back and you'll never ask why.
Ankush Jain:Open your heart.
Kellan Fluckiger:Right here right now. Your opportunity for massive growth is right in front of you. Every episode gives you practical tips and practices that will change everything.
If you want to know more, go to KellanFluckigerMedia.com. If you want more free tools, go to YourUltimateLife.ca. Subscribe. Share.