AI Can Give the Answer — But It Can’t Give the Transformation
Episode 1077 • 19th February 2026 • Your Ultimate Life with Kellan Fluckiger • Kellan Fluckiger
Duration: 00:52:23

Shownotes

AI is accelerating faster than anyone predicted.

It can write. Research. Design. Strategize. Coach you through decisions. Even mirror your thinking back to you without judgment.

So what happens to coaching?

In this powerful Thursday coaching edition, Kellan is joined by Dr. Matt Markel and Lucia Rester to explore the uncomfortable truth: AI can replicate frameworks, tools, and even insight — but it cannot embody lived transformation.

This conversation cuts through the hype and exposes the illusion of competence, the danger of polished but inaccurate output, and the difference between giving answers and facilitating real change.

If you're a coach, consultant, therapist, or leader — this episode isn’t optional.

Key Takeaways:

  1. The real difference between coaching and consulting
  2. Why AI gives the illusion of insight
  3. Hallucinations and the danger of polished misinformation
  4. The 80/20 rule in AI accuracy
  5. Why “information” is not the same as transformation
  6. Embodiment vs. borrowed expertise
  7. The future of coaching in an AI-driven world
  8. Consciousness, energetics, and what machines cannot replicate
  9. The illusion of competence in low-tier coaching
  10. What will remain when 95% of coaching disappears

🔥 Ready to turn your truth into impact? Join the Dream • Build • Write It Webinar — where bold creators transform ideas into movements.

👉 Reserve your free seat now at dreambuildwriteit.com

Mentioned in this episode:

Visit www.dreambuildwriteit.com

Transcripts

Kellan Fluckiger: Subscribe. Share, create. You have infinite power.

Kellan Fluckiger: Hello, this is your Ultimate Life, the podcast I created to help you live a life of purpose, prosperity, and joy by serving and creating with your gifts, your life experience, and all the good stuff that you have inside. This is one of the special Thursday coaching editions.

Kellan Fluckiger: ...taking over the world here, but...

Kellan Fluckiger: Welcome to the show, Matt.

Dr. Matt Markel: Hey, uh, Kel, it's great to be here again. It's great to see you. And, uh, Alicia, great to see you as well. It's, uh, it's, uh, fun being on a show where we're friends talking about fun topics.

Lucia Rester: You bet,

Kellan Fluckiger: Lucia, welcome to the show.

Lucia Rester: Thank you so much, Kelli. It's really fun. And I'm delighted to be here with one of my favorite P persons, Matt, and as well as you What a, what a fun idea.

Kellan Fluckiger: All right, cool. So I've set a timer to kind of keep track of where we're at and let's just kinda get started. So our, our topic today is about coaching in the context of AI, and I wrote a book about that and all that jazz. But I don't want to, we'll, we'll explore that as we go along. I guess the first question I have, both of you do some kinds of coaching.

Kellan Fluckiger: ...and so forth, and that's wonderful...

Dr. Matt Markel: Lucia go ahead

Lucia Rester: Um, I use AI every day, um, and for several hours a day actually. So, um, it is, I could say it's, I, I rely on it quite a bit, but I rely on it in a very particular way and to me, I like to call it kind of that 80/20 rule applied to AI.

Lucia Rester: ...amazing capability in certain...

Lucia Rester: Uh, generally speaking, you can expect about 80% of it to be, um, somewhat accurate. And I think for me that's, uh, that's how I use it. I use it as a very faithful, uh, 24/7 assistant, but I don't rely on it as the subject matter expert

Kellan Fluckiger: when you. Matt, I'll get you in a sec about how you use it. Um, when you do that, when you say you don't rely on it as the subject matter expert, tell me a little bit more about that.

Lucia Rester: ...is really to please us. And...

Lucia Rester: Expertise so that I always have a crosscheck of, is this valid, is this accurate? Based on my experience. And to me, that 80/20 is also the gap within which, going back to your topic, Kellan, within which people who are experts in their field can really rely, because it, it is not at this point, um, AI isn't something we can depend on as we can a subject matter expert, somebody who truly understands.

Lucia Rester: ...or discipline or topic, um...

Kellan Fluckiger: talk more about, about that later.

Kellan Fluckiger: Yeah, that's another place to go. Matt, I wanna give you a chance now to jump in and tell me how you use it and, and whatever makes sense there.

Dr. Matt Markel: Sure. So I think one thing that, that we can perhaps make a distinction on here is the difference between coaching and consulting. Okay. Because I think we may broad, broadly lump those two together, but they, they serve different.

Dr. Matt Markel: They serve ultimately different roles, and I think each one of those will use, uh, ai just like they use tools differently depending on what the applications are. When I think of consulting, that is perhaps strategic business advising. The where you are, you're imparting certain types of knowledge and experience to someone, um, to a, achieve a goal.

Dr. Matt Markel: ...whereas coaching, it's more of...

Dr. Matt Markel: But specifically for your, your question, I use AI in a variety of methods. And I, I like the idea of saying that this is an assistant that can, that can assist me in tasks where it can do things either quicker or better or without the, the cognitive drain for me that, uh, I could do myself. In most cases, I've got an exception here, I'll show you in a second, that I could do myself, but it's, but I don't want to because I can go much faster without it.

Dr. Matt Markel: ...going to a library, right? And...

Dr. Matt Markel: You know, a lot of, you know, very slow things. Then we moved in this situation of the, of the internet where we can type things into Google and Google will provide information. Most of, most AI is now integrated into, into the modern search engines so that, so when you type in Google, you're getting AI helping you there as well.

Dr. Matt Markel: So whether you type it into ChatGPT, or Claude, or, uh, Google itself, it's using some aspect of AI in the research there. So that's, that's one way that we use it. But I think specifically for me, the way I use it the most right now is on, on writing, where it will help me edit and help me, uh, make changes on things much faster than I can type them.

Dr. Matt Markel: ...can talk pretty fast and we...

Dr. Matt Markel: With, uh, whether that's in documents, whether that's in, in content for social media, et cetera. That's how I use it a lot. And I also use it as a graphic designer. Now, this is the area where it is better than me. I can envision what I want from a, from a product a lot easier than I can make it in Canva or PowerPoint or, uh.

Dr. Matt Markel: Harvard Graphics or whatever, so I can describe what I want to see and have it iterate on the drawing, uh, a lot quicker and actually, uh, with higher quality than I can, 'cause I'm the, I'm the worst artist ever. So those are a lot of the ways I'm using it. I think one more thing that is a, uh, an interesting aside on that is when you think of a...

Dr. Matt Markel: ...coaching or strategic advising...

Dr. Matt Markel: That's just a fact. It's not better, it's not worse, but it is different. But you can tell things to the computer. You can tell things to the AI about, Hey, I'm struggling with this, or I need help with this. And basically use it as a way for you to talk yourself through something. It's like a really, really good friend that wouldn't be judging you.

Dr. Matt Markel: But you, but can also, you know, forget exactly what you said, you know, at the, at a moment's notice so they don't think the worst of you. Um, that is a really useful tool. I've used it for myself. And for example, like I was, you know, during the Christmas break, I was struggling with some things about the, how to set up this, this one facet of my, uh, of my business.

Dr. Matt Markel: ...had been burned by some, some...

Dr. Matt Markel: So just hours of just pouring myself out into that and really getting the feedback back in a way that no one's gonna judge me. No one's going to, to think, wow, you know, Matt looks like he's got it all together. I thought Dr. Markel always knew everything. No, he doesn't. He's, he's, he's human like us. Well then no one's gonna judge you with that.

Dr. Matt Markel: And that's, uh, that I found to be very, very therapeutic. So that's probably one. Now we don't do that every day, but that is one area where I personally used it in the business.

Kellan Fluckiger: Cool. So I, I liked that. And thank you both for sharing that. I think you're both right. You said something, Matt, that I wanna hone in on.

Kellan Fluckiger: ...I was a consultant for...

Kellan Fluckiger: And as a consultant expert, you know, you're paid a lot of money and you frigging better know the answer. Like you're supposed to come in and know stuff. At least that's how it was in the work that I did. And as a coach, you're not, that isn't part of the skillset to know the answer. It is the discovery process that makes it valuable.

Dr. Matt Markel: Yeah. Just so you know, I, I agree. It's a, it's a, there's a substantial difference between, between them. Sometimes the lines get blurred, especially if you're doing Yeah. Uh, advising or consulting. And the, the CEO has personality issues that you need to, you know, you need to address before the, before the company can move on.

Dr. Matt Markel: Then you kind of, the lines get blurred a little bit.

Kellan Fluckiger: Absolutely. Absolutely.

Dr. Matt Markel: ...yes, as a consultant you're...

Dr. Matt Markel: And that timetable is kind of pre-negotiated. It's usually, that's usually not a. Uh, most co coaches, like, you know, I'm gonna do this for you and it's gonna take x number of months. You, it might take x months to get through your program, but the, the results are a little bit more loosey goosey on, on what that's going to, what that's going to give you.

Dr. Matt Markel: Um, you know, coaches are supposed to know things too, but they're supposed to know how to get you on the path as opposed to how to get you to, um, you know, okay, we need to turn this around. We need to be profitable by this quarter. We need to have free cash flow of whatever metrics. So.

Kellan Fluckiger: Yep. Agreed. Hundred percent.

Kellan Fluckiger: ...that I've done also, I talk...

Kellan Fluckiger: And I'm gonna invite both of you to it. It'll be a Zoom concert, it's got some music and stories and stuff, and I put the lyrics into the chat and told the stories and you know, talked to it and asked it to give me some, you know, stuff to say in between songs and all that kind of stuff. And did it in the way you described sort of just.

Kellan Fluckiger: Talking to it. And what I really appreciate about it is, uh, it's, you know, I could say, no, no, no, forget that, uh, do it in this order, and I don't have to stop or do any of it. And it untangles it all really well. And I hear lots of people talking about hallucinations and all these mistakes that it makes.

Kellan Fluckiger: ...because how I'm using it is in a...

Kellan Fluckiger: Because when you described its ability to talk you through something. That's the function, or at least one of the functions of coaching. And so when I say that, that sort of lower and mid tier of what passed for coaching is gonna be gone, you know, that's an example of, of what I'm talking about. So tell me some about what are the problems that you've seen?

Dr. Matt Markel: ...hallucination gets into areas that...

Dr. Matt Markel: Does a, uh, probabilistic matching of those, and those go into like the standard LLM, right? That's kind of like the, probably the way that most people, when they, when they make their own custom thing, that's actually what they're doing: they're loading in documents and instructions into, into making a custom LLM in Claude or ChatGPT or whatever, and that, because that is probabilistic in its matching, but will do a matching.

Dr. Matt Markel: ...quality of the answer. You are...

Dr. Matt Markel: And, and you're, you're assessing that that quality. Um, but when it's more factual based then, and it's, and it's making an, an assignment on that where there is more of a right or wrong, wrong answer, then uh, it can get it wrong and it is struggling currently with the ability to accurately say, Hey, I am not very sure about this.

Dr. Matt Markel: That is one of the differences between the AI and a human coach that I think we will continue to see, uh, to give the humans an advantage. There's a lot of advantages for the AI-based, you know, either assistants or tools or whatever, but for the humans, I think number one is a good human will confidently say.

Dr. Matt Markel: ...applicable to this situation...

Dr. Matt Markel: We don't get that with the, uh, with the AI. And that's the, that is one of the issues. So when we talk about hallucinations, hallucination is basically providing something that was a probabilistic match, but not really correct, uh, at all. Um,

Kellan Fluckiger: you know, so, so Alicia, what do you think about that? Where have you seen some, uh, things that are in that 20% that, that bother you, that drag you, like, ah, you know, whatever, what have you seen?

Lucia Rester: ...problematic because we all have a...

Lucia Rester: Of a person with six fingers, we, even if we're not a graphic artist, we'll be able to say, what are you doing with those six fingers? Like, stop. Where it gets tricky is actually in the simpler levels, which is the text-based things, when we aren't the subject matter experts. So here's a case in point to answer your question, Kellan. I was doing some research for, um, the curriculum for Blue University, and I really wanted to extend some additional resources for students.

Lucia Rester: ...But I asked it, like, give...

Lucia Rester: And the links were broken and, um, the, there were no, the references it cited were not actual references, um, that were accurate on the internet. So then my response was, only give me links that work for the general public and that are not broken. Did it again, still broken. So that's kind of a very specific example that just kind of shows.

Lucia Rester: What I'm talking about. If I had just assumed it was correct, I could have put those links into a book or, you know, whatever, some ancillary materials, and they would've been inaccurate. And the, the thing, the, I'll just say one more thing. The issue that I, that concerns me, I mean, I love AI. I'm a real proponent of using it, but using it intelligently.

Lucia Rester: It all sounds so polished.

Kellan Fluckiger: You know, that strikes me as not a small thing. That's a catastrophic failure as far as I'm concerned, for anything to come back with things that simply aren't there or don't work. And, and I, you know, the sky isn't falling and I don't mean that and, but it does absolutely illustrate.

Kellan Fluckiger: ...in terms of someone using...

Dr. Matt Markel: Well, no, I, I think that, but that's because we are still nascent in our ability to understand the relationship between us and the, and the computer on this or us and the, in the, in the ai.

Dr. Matt Markel: Right. You know, Kellan, if you and I are having a conversation and I tell you something about, uh, about finance, or I tell you something about engineering, you know, the things that I'm, you know, either a world class expert in or, uh, pretty darn good in, then you'll, you'll pretty much trust me, right? Um, if I tell you something about.

Dr. Matt Markel: Uh, I don't know about, um, bobsledding or about, um, Taylor Swift. You know, I, you know, and, and I sound very authoritative. Authoritative in that, you know, you're probably gonna say like, you know, Matt is probably not an expert on Taylor Swift, by the way. My wife is, you know, you should ask her. But for me, I'm like, I'm not.

Dr. Matt Markel: ...me. You know, I'm, you know...

Dr. Matt Markel: So we haven't yet learned, because again, the relationship is nascent. We haven't yet learned how to trust, how to not trust, basically the, the, the outputs of a conversation we have with ai like we have with humans.

Kellan Fluckiger: And I wanna make two observations. One's funny and then one's about what you just said. Uh, I have a friend who uses a lot of AI tools in creating paintings, and it did create a picture of someone playing a saxophone that had six fingers.

Kellan Fluckiger: Just so that you'll, just so that you know, that was a funny when you said that, I thought, oh yeah, I've seen that. Um,

Dr. Matt Markel: ...one of the LinkedIn groups I'm...

Dr. Matt Markel: We're very much into, into AI and AI regulation and AI, secondary and tertiary effects on the environment and so forth. And they, they basically posted one which has like some AI thing, a very underground with like, you know, six, six fingers. Uh, yesterday my wife, you know, there's a lot of things going on.

Dr. Matt Markel: This is, we're taping this the beginning of February. There's a big thing going on, uh, on social right now about how you can make an AI sort of version of yourself, which kind of has like a, remember like when you used to go to like a carnival or something like that? There's the, there's the guy doing the caricatures of you.

Dr. Matt Markel: Yeah, yeah, yeah, yeah. And they put some of the stuff around you like, are you into music? Are you into sports? Or fin, you know, finance, whatever. And they put some of those things in the background. Well, those are very popular. Well, they did one of my wife and two of their friends at the, at the gym after a workout, and the original picture had one of the friends had her arm around my wife.

Dr. Matt Markel: ...she had her, her hand up there...

Dr. Matt Markel: We're still seeing things like, like that. Uh, so the, so the, the six fingers is like an easy tell, but there's other things as well that are, you know, a little bit less on the, on the just ridiculous humor side.

Kellan Fluckiger: So that drives me back to what I said. You gave a very articulate and specific example of people talking about their area of expertise and being believable and then talking about other things that they're, that are not related.

Kellan Fluckiger: ...know exactly jack and yet are...

Kellan Fluckiger: And yeah, that's what we ought to do. I question the ability of humans to effectively differentiate between areas of expertise and wild ass opinions.

Dr. Matt Markel: Yeah, there's, there's, there's definitely the. We always are subject to the halo effect. Right?

Yeah,

Dr. Matt Markel: that's what I was, so that, that, that's, we always, we always fall prey to that.

Dr. Matt Markel: Right? Uh, but, but I think in general we have a pretty good, and I like how you said the, the spidey sense of it.

Kellan Fluckiger: We do, we do. And, and we ought to know it and stuff, but I just noticed that, and that's far as I'm gonna go on social commentary right now, but, all right. So let's, let's get into a couple of other things.

Kellan Fluckiger: ...new possibility and change...

Kellan Fluckiger: I asked in the research, and this was using the AI tools, how good is this method at achieving its results? And obviously it's gonna go look for whatever it looks for and give an opinion, but it gave me a quite detailed analysis that as I read, it felt accurate in terms of what I know about each of these systems.

Kellan Fluckiger: ...about doing that and ask it for...

Kellan Fluckiger: It makes me ask the question, okay, what is left? If AI can write better, faster, research more, give you frameworks, tools, prompts, accountability questions, call you out on pattern deviation and all those kinds of things that it does really well, what do you think is left for the truth of real, the, the essence of what coaching is, which is causing, uh, people to have a new insight, make long-term sustainable behavioral change, to use of the Dan is.

Kellan Fluckiger: What do you think is left?

Lucia Rester: Well, from my point of view, the most important thing.

Kellan Fluckiger: Good. What is it? Talk about it

Lucia Rester: ...there's another angle of this...

Lucia Rester: So a lot of coaches mistake themselves as therapists. I am both a licensed therapist as well as a coach and a consultant. Why I bring that into the, into this conversation is because, there's, there's four, from, from my point of view, there's four different levels of consciousness and levels of transformation.

Lucia Rester: ...and who can facilitate deeper...

Lucia Rester: So certainly on the behavioral level, on even the mental level, there may be equivalencies here. I would say, though, that the, the ability to facilitate and to create, uh, bespoke responses based on what is happening in that moment, the energetics that are happening in that moment, the relationship between the, the coach and their client, whether it's single or group.

Lucia Rester: ...don't see that. Where I see...

Lucia Rester: Of their coaching career who don't, haven't created that level of fluency and facility. Um, and more importantly, for the general public who doesn't even know that that's a possibility. And so there, there, you know, this, this area of transformation is one that has a lot of misunderstanding in it and, um, a lot of hype from my point of view.

Lucia Rester: So,

Kellan Fluckiger: yeah, I agree with that. So what do you think's left?

Dr. Matt Markel: So I think just a couple observations before we kind of jump into this one. Many coaches that are coaching today really don't have the expertise to truly be a coach. And I think that that's important that we recognize that. I think, uh, Lucia, you were, you were hinting at that towards the end of your, uh, of your answer there, that, you know, if they're early in their, in their journey on that, most likely they're not qualified.

Dr. Matt Markel: ...seen many, many people that...

Dr. Matt Markel: You know, something, something along the lines like that. And

Kellan Fluckiger: I see ads on Meta, specifically with some dude in a beach chair on a laptop, with the ad saying, uh, make a good life, make a difference in people, and make a living being a life coach. So I'm still seeing that nonsense floating around.

Dr. Matt Markel: Yeah, exactly.

Dr. Matt Markel: That's, that's by the way, that's why that sort of just come on, you know, that sort of feeling that we get from that. That's why I built the interpreneur coal concept was like, no, that's a, this, that's some moron who's trying to sell you on something so he can hopefully have, be able to take a vacation this year.

Dr. Matt Markel: ...let's get back to your, your...

Dr. Matt Markel: The proliferation of the ability even to read. So books weren't as, you know, we didn't, uh, didn't have, uh, books like they are, like they are today. You know, we all, uh, these are real books behind me and I'm, I'm a, a prolific reader, and I think both of you are as, as well. But books didn't solve the problem, right?

Dr. Matt Markel: Uh, YouTube videos, very, very cool, you know, and I have now found myself, if I wanna learn how to do something, I had to fix a lock on my door recently, I looked it up on YouTube and I watched videos and I did everything and, uh, uh, surprise, except for this one little part that was broken and I couldn't actually fix.

Dr. Matt Markel: ...the lock on my door because of...

Dr. Matt Markel: That will be what remains because people want. Someone to go down that journey with them, someone who's qualified, someone who can guide them, and someone who can, they can look to for advice and inspiration and see in their eyes that yes, they understand me, they get me, and they have my best interests at heart.

Kellan Fluckiger: ...have planted a better, a stooge...

Dr. Matt Markel: did he just call us a stooge and a shill? I said, uh, what, what was that? Oh,

Lucia Rester: I think, what is the show?

Lucia Rester: Honestly, I think that was the 80/20 issue. I think that was the,

A plant in the audience,

Lucia Rester: right? It was the off, like it was, it was the Kellan bot going, going a little.

Kellan Fluckiger: Well, no, I just meant a plant in the audience to say the right thing. The volunteer, right. The magician calls for volunteer and the right person say that.

Kellan Fluckiger: Right. So you said, you said a very leading thing and that Is this the conclusion I came to after all that analysis? And in the book I also propose a coaching model that is based specifically on what you've talked about, and I also ran it through the same analysis and everything else to see how vulnerable it came out to be in order to validate, at least to a degree my own thinking.

Kellan Fluckiger: ...if you will, 'cause I said...

Kellan Fluckiger: ...and questions and tools, and...

Kellan Fluckiger: And the only thing that's left for us is the most important thing, as you said to start with, which is the truth of the lived experience. So that's fundamentally what I came to as a conclusion. And then I also proposed a coaching process and methodology that adheres to that. So when you look at coaching and AI, what's, uh, what scares you, if, if anything?

Dr. Matt Markel: I think that it's,

Kellan Fluckiger: or a scare, that may be the wrong word,

Dr. Matt Markel: ...you wrote up something on a...

Dr. Matt Markel: You might be able to look like, you know, what you're doing without really knowing what you're doing. Right. The, so AI gives the, gives the illusion of competence, gives the illusion of quality, gives illusion of insight, uh, without really, without really being able to back that up. So it's just one more tool that supports that level of, of, um, you know, you could say aspiration for someone who's on the, uh, who's not very good or very early in the career or, um, um.

Dr. Matt Markel: Uh, deception, if you wanna be a little bit more, uh, negative about it. So it helps out with that. You know, you can definitely sound a lot smarter with AI, but, um, and people will, you'll, what's the phrase? One's born every minute. Uh, then, you know, there's, you know, there are people that'll, that you'll be able to make some money off of that.

Dr. Matt Markel: ...it really is. You know, the...

Dr. Matt Markel: I think it'd be gonna be, it's gonna be about the same. 'cause the people that want to do that just now have another tool to do it.

Lucia Rester: That's a i I, that's, I think, very insightful, Matt. I, I don't have much concern about this because in a way, it's kind of a good thing. It's kind of like Hogwarts sorting hat, you know, like,

Kellan Fluckiger: I'm, I'm with you.

Kellan Fluckiger: I think it's gonna have a big impact, and I'm grateful.

Lucia Rester: ...it's the intelligence, it's...

Lucia Rester: For their magnificence, whatever line they're doing. All of those things take a lifetime to develop and one has to be basically a, uh, devoted to that as their path. If they're really going to be in that space, that higher level of space. And so this kind of instantaneous, you know, I've, I've read three books and now I'm gonna hang up my shingle as a coach.

Lucia Rester: ...consciousness. So for those people...

Lucia Rester: Um, I would say that Matt's explanation during Christmas of him working with AI is because Matt brought his consciousness and AI reflected it, as opposed to AI bringing the consciousness and Matt gleaning from it, you know, you know what I mean? Right. It's like this amazing mirror kind of thing. So it, people are gonna find wherever they are in their journey.

Lucia Rester: And you know, for all of us, I think we're at the place in our career, in our life that we say, yeah, well we're, we're, we're really serving. We wanna be part of the 5%, um, as you said, Kellan. And we also want to serve from that level.

Kellan Fluckiger: ...just ask a, a question or, or...

Kellan Fluckiger: I, when I said the whole middle of all of that's gonna be gone, I, I don't know if there's gonna be, I mean, like, how do you make a living? I define make a living as a hundred K, just because that was an arbitrary number. But if you can't make a hundred K, you can't make a living, or else you better have another job or.

Kellan Fluckiger: Partner that works or whatever, and because I think AI is gonna be so available and so good, everybody that did something at a less than that level will be replaced by a $49 a month, you know, bot. And Tony Robbins just released one for 99 bucks a month, and you know, everybody's doing that. I'm building a Coach by Kellan bot myself, and it's full of all the stuff that I have, and I'm gonna make it available to clients for in between sessions and things.

Kellan Fluckiger: ...not truly devoted to this. And...

Kellan Fluckiger: It's not that thing, ah, I've heard this, believe me. Nah, I'm not worried. I've already got that. It's a human connection. I've already got that. Because the, the, that, that person who talks like that about that sort of thing, in my mind, doesn't even get what they're talking about. The second is what I call the ante problem, and that is, imagine a casino where there's all these blackjack tables and they're all $10 tables, and you go in to play blackjack and all the tables are full and they're all robots, and the only place for me or you to sit down is in the high roller room where the ante is 10,000 bucks.

Kellan Fluckiger: ...devoted to that practice...

Kellan Fluckiger: So I see those three things as the, the barriers for the transition, that people are gonna have to make a choice and decide about if they wanna stay in this business. And, and maybe I'm pessimistic, but I don't think there's gonna be a lot of room for that other stuff. Now, I, I think it will, as this improves, be replaced by these apps and bots.

Kellan Fluckiger: And I don't necessarily think that's a bad thing because it'll get rid of a lot of the nonsense that we see.

Dr. Matt Markel: But someone could make a, make a business out of the, the apps and bots, right? Mm-hmm. And you get a, of course you get a good marketing, get good, good digital marketing, uh, set up. You do a good funnel.

Dr. Matt Markel: ...is you see, you find people...

Dr. Matt Markel: 'cause what they, they, what they're not doing is they're not really thinking about building a business. They're trying to basically build themselves a job, and they think, find that their job can be a hundred grand. If they can make a hundred grand a year, they can pay their bills, and for that, I've got a four grand product and I can sell it.

Dr. Matt Markel: I can, I need to sell two a month and I'm there. You know, so, because they're basically there just, they've, they're not building a business that's scalable and, and can operate without them. They've basically bought themselves a job. But I think that the, um, the, the concept of, of marketing out these little, you know, Kellan bots or Tony Robbins bots or whatever, yeah.

Dr. Matt Markel: ...that's kind of the same...

Dr. Matt Markel: You know, you could make the, make the comment that maybe that's sort of like socialized medicine. It's pretty cheap. Um, you know, at least in terms of like what you pay, if you exclude your taxes. Um, but it's usually better than nothing. And that may be adequate for, for many people, especially to get started and to give them the, the level that they want.

Dr. Matt Markel: So yeah, spending 50 bucks a month or a hundred bucks a month on some product, maybe that's, maybe that's great. It's a lower price point is a easier, uh, easier path to entry, easier path to exit if you don't wanna continue it. So there's, there's nothing necessarily wrong with that, and I don't think that necessarily is gonna.

Dr. Matt Markel: Necessarily replace coaches trying to fill that gap also, because they're gonna put, they're gonna put advertising, they're gonna put marketing out there and try to fill in that gap with their own little bot.

Lucia Rester: ...have the ability for people...

Lucia Rester: Spend $10,000 or $20,000 or $30,000 on a high-end coach, they now have the ability to have some level of service, albeit AI bots, but still have a level of service, because I can only assume, Kellan, that you and Tony Robbins and everyone else who's doing this bot, I'm doing one myself, are gonna cross check it right before they send it out.

Lucia Rester: They're gonna, it's not gonna be like, oh, I know. I just trust that this bot's gonna sound like me and be as wise as me. You're gonna be cross checking that thing and making sure. It's going to represent you in a fairly accurate way. So how wonderful is it that somebody who's a five figure person who really suddenly caught, found out about you, gets your wisdom in a way that they may never be able to reach?

Lucia Rester: The other thing is...

Lucia Rester: I mean, he doesn't even coach a couple of people right now. He's, he's beyond that. So now we've found ways to be able to bring that kind of, of insight and training. And with AI it can actually, with good prompting and good engineering, it can actually walk people through a sequence that could really serve them.

...done it for a long, long time...

Dr. Matt Markel: What if the, I don't know, 75, 250 books that Kellan's written. I don't know, how many have you written? Up to, come on, 23. Let's get, I lose. I don't talk to you for a week and it goes up by seven. I don't, you know.

Lucia Rester: Exactly,

Dr. Matt Markel: exactly.

Lucia Rester: I think he just wrote a book while we were on the show.

Dr. Matt Markel: He did. I saw him. I saw, I think, I think he was off typing on it or, uh, dictating to AI.

Dr. Matt Markel: He went on mute for a second. Anyway, so the, so yeah, the, how we consume information is, is always changing. Like, when I write on LinkedIn, that is a different way of writing than when I write for a, a journal, or if you were to write for, you know, HBR, write for, uh, a, a corporate environment. It's a different way of writing because this is a different way people consume content, and how we consume things is always.

Dr. Matt Markel: ...nobody used PowerPoint in...

Dr. Matt Markel: So how we communicate and how we consume information continues to evolve. The bot may be a way that people can be more comfortable with and more efficient with consuming information today than reading a book would've been, you know, 10 years ago.

Kellan Fluckiger: So we've come to the end of our time.

Kellan Fluckiger: I really wanna appreciate, uh, express appreciation. It's been a very textured and rich discussion, and like always, it's completely different, uh, than other ones. And I, you know what would be fun? I would, uh, suggest you guys wanna, might wanna go back and listen to some of the other Thursday episodes just to hear some of the other things that have been talked about.

Kellan Fluckiger: I don't know if you do that sort of thing, but you ought to

Dr. Matt Markel: Just send me the transcript.

Lucia Rester: Yes. I think Matt and I are gonna write a book. I think, do you have 15 spare minutes, Matt? We'll just write it out,

Dr. Matt Markel: I've already got a draft, but I'll just, I'll just email that to you.

Kellan Fluckiger: Alright. Anyway, at least thank you. Is it Alicia or Licia?

Lucia Rester: Lisa,

Kellan Fluckiger: Lisa, cia. Yes, Lisa. I wanna say it right. Anyway, thank you. Thank you. Thanks for being here with me today and sharing your heart. Matt, thank you for being here too and sharing your heart.

Lucia Rester: It was great. Super, super fun.

Kellan Fluckiger: ...want you to take...

Dr. Matt Markel: Have a great day, everybody.

Dr. Matt Markel: Thank you so much.

Kellan Fluckiger: I, I, so just, uh, your listeners, I want you to take this to heart because like the other Thursday episodes, especially if you're a coach, lots of ways to think about this. Lots of opportunities and like anything, AI is gonna create a mess and a lot of opportunities. And each one of those can be used.

Kellan Fluckiger: ...to create your ultimate life...

Kellan Fluckiger: Open your heart right now. Your opportunity for massive growth is right in front of you. Every episode gives you practical tips and practices that will change everything. If you want to know more, go to kellanfluckigermedia.com. If you want more free tools, go here: yourultimatelife.ca. Subscribe, share.
