How to Work with AI (Without Losing What Makes You Human)
Episode 61 • 5th October 2025 • The Soul Proprietor • Melody Edwards and Curt Kempton
00:00:00 01:22:56


Shownotes

Curt and Melody unpack the ethical questions around artificial intelligence in this episode of The Soul Proprietor Podcast. They explore how AI can be both a powerful tool for creativity and productivity, and a challenge to authenticity and human connection. With humor and honesty, they share personal stories about using AI, touching on fears of job loss, whether it will inspire or destroy creativity, and the risks of over-reliance.

Transcripts

Curt Kempton:

Welcome to the Soul Proprietor Podcast. I'm Curt Kempton.

Melody Edwards:

And I'm Melody Edwards.

Curt Kempton:

Each week, we dive into the ethical questions that keep entrepreneurs awake at night.

Melody Edwards:

Whether you're building your own company or exploring life's big questions, you are welcome here. Hi, Curt.

Curt Kempton:

Hey, Mel. How are you?

Melody Edwards:

I'm doing good. How are you?

Curt Kempton:

I'm doing really good.

Melody Edwards:

Are those our, like, standard American answers?

Curt Kempton:

Yeah, actually, Melody, I'm doing terribly. I've got this wart on my toe that's just really bugging me. You want to talk about that for a little bit?

Melody Edwards:

Well, I'd actually like to talk about how horrible I'm doing and all the nuances and the complications. No.

Curt Kempton:

Isn't that funny? Yeah, you're right. How are you doing? There's only one acceptable answer.

Melody Edwards:

Yes. Be a robot. And it's funny. If I didn't say, I'm good, I would be overthinking that answer anyway. I'd be like, how am I doing? I don't know.

Today, in this hour, forever, why do we exist in this world? You know, that's where it always goes, Curt.

Curt Kempton:

Well, I did have a phase I went through where I would talk to someone. How are you doing? They're good. And I would say, are you happy? And when I would say that, people would be like, whoa, what are you doing to me here?

Melody Edwards:

Yeah, I had to stop asking because.

Curt Kempton:

I genuinely would ask people wondering. And then the problem is, like, we need. We have other stuff we gotta talk about.

Talking about my happiness is not something that I'm prepared to think about right now.

Melody Edwards:

Oh, yeah. Because that's a big, deep question.

I'm never gonna ask anybody that unless they're, like, a close friend, because that can go in all sorts of directions. Maybe I will. Maybe that's my new thing. I don't know. I'm undecided.

Curt Kempton:

To all those listening right now. I have already done the experiment for you. Do not adopt that idea.

Melody Edwards:

Okay. Lesson learned, Curt. Today we are going to talk about one of my favorite and least favorite topics all wrapped into one, which is AI. Artificial intelligence. Ooh.

Curt Kempton:

All right. That actually is. It's a fun topic for a lot of reasons, and I know we're going to get into those.

But it's funny because this topic was really hot about two years ago. Like, it was so hot. Red hot. And I'm wondering how different this conversation would have been two years ago than it is today.

Melody Edwards:

I know it would have.

For me, it would have been very different, because two years ago, you know, right after ChatGPT came out, which really came out publicly in November, December. And I think by January I was fully in panic mode that my business was destroyed and my life was over because people don't need humans anymore.

And that's why I started really getting into it and diving into it because I wanted to understand what was going to destroy my life in business.

And then I got really interested in it because I realized, oh, it's not technology necessarily, it's creative conversations with a robot at that point.

Curt Kempton:

Yeah.

Melody Edwards:

And there's. It's so many things. It's not just one thing, but it became a fascination for me. So back then it would have been very fear-based.

Where would you have been in that conversation, do you think?

Curt Kempton:

I think it was mostly wonder. Just wonder, the awe that came from what AI could do and the speed that it could do it.

I think the first time that I really grappled with like, holy cow, this is way bigger than I ever thought it could be because I knew AI would be big.

Melody Edwards:

Yeah.

Curt Kempton:

But someone said, watch this, I'm going to write a song from Michael Jackson. I'm going to write a new Michael Jackson song and I'm going to do the lyrics for it and it's going to be written in his voice. I'm like, okay, okay.

And I'm going to pretend that I'm a songwriter for Michael Jackson. And like 10 seconds later they had a song brand new in his tone. Like, I totally believed it. And I was like, oh.

But the next thing I did was I went home and I talked to my kids and I said, guys, I'm going to write an email as if I live in Nigeria. Okay? As if I'm a scam artist who for years has been using a Nigerian prince as my main email.

Melody Edwards:

These are your life lessons that you teach your kids daily, correct?

Curt Kempton:

Yes, yeah, exactly. I said, I just need you guys to understand what world we're living in now.

I always gave myself away as a Nigerian scam artist because the words weren't right and I would say silly things, and it would always tip you off that there is no Nigerian prince, because you can just read it. Here's what I'm going to do now. I'm going to write a letter as if I am Taylor Swift, and I'm going to write a letter to a specific fan.

By the way, you're all specific fans amongst the millions that I can blast this out to. And I'm going to write a letter about how something in my life has gone wrong and I need donations for my fans to be able to help.

And then I just typed in a paragraph and it spit out perfect language of Taylor Swift asking for help. And it was so believable. It was so believable. And at the very end, there's an asterisk that said, this is for entertainment purposes only. Please not.

And I'm like, so don't copy that part. And I said, imagine if I just email that out to 10 billion people. Like, you know, just. Just got my big old list.

You know, obviously people who aren't Taylor Swift fans are going to delete it. But the ones that are now the die hard, would they click the link and donate money?

Melody Edwards:

The Swifties would.

Curt Kempton:

Or send me a gift card. So that moment was when I realized there's an ethics problem. But I realized this is lightning in a bottle, but it's out of the bottle.

Melody Edwards:

Yeah, yeah. And also back then, way back then, two years ago. Yeah, two and a half. But we were using it. Like, I learned how to use it from a copywriter.

A marketer, a copywriter. And I kind of took his course. Being the marketer that he is, he very quickly created a course for it.

Curt Kempton:

Who is he? Sorry.

Melody Edwards:

Jeff Hunter. He is great at teaching AI and he's really a great speaker, goes out and has a good heart.

But he was one of the only people that was really looking at it. Maybe he was a little more forward, but again, from a marketer's perspective.

And I think most of the people I saw using it in my world online tended to be more marketers. And it's become way more prevalent, where people used to be copywriters or they used to be marketing people, and now they're AI experts.

That's what they've just become.

Curt Kempton:

What do they call themselves? Prompt engineers? Yeah.

Melody Edwards:

Well, no, they're genius AI experts now. But like, it's almost like ChatGPT or AI, you know, any of these language models, like they were made for marketing. Because in the beginning, that's where everybody was using it. We didn't know. And I made lots of company playbooks.

Like, I went crazy with making the playbooks. They were super, super detailed. And I was like, wow, this is insane, how detailed. Do you think I ever read the 50 million pages of playbooks?

Curt Kempton:

You can't. Too much reading.

Melody Edwards:

You can't. Yeah, but it was just testing the waters. And really, it's hard because in the beginning you use it like Google search, in a way. And then yes.

Curt Kempton:

Actually you saw, I mean now Google has actually had to change. They made all their money with clicks, right?

Melody Edwards:

Yeah.

Curt Kempton:

And they're like, oh shoot, everybody's going to ChatGPT now to ask their questions. So they were willing to steal from themselves because of how important it was, getting ready to pivot away.

Melody Edwards:

Yeah. So it's like now, because of the lightning in a bottle being out of the bottle, I have felt very compelled to stay ahead of it as much as I could.

But in the beginning it wasn't because I wanted to stay ahead, it was because I wanted to understand. As usual, the Melody thing is, I want to understand this thing. The thing that I'm being told it is, is not what I'm experiencing at times.

Like, people would say, I'm going to teach you how to make a chatbot or a... you know, what was it? They're agents now; they were chatbots before. In the middle it was... what was the other word? I can't even remember.

Maybe it was just chatbots, but it was like they were going to show you how to make this magical thing that could do stuff for you, when really they were just making a version of custom GPTs, which is what I learned in the very beginning of my education. So again, language tricked me, but it's led me on quite a journey. And you're a software developer, or owner. Business owner.

So this is going to affect you and you also have to kind of, you and your team have to be thinking about that.

I think luckily the home service industry in general, the people we tend to serve are, I would say they aren't the most tech savvy, tech forward group of people as a whole in my.

Curt Kempton:

Yeah, but the people we serve are. The people we serve tend to be. They certainly have risen to the top. We've had a lot of them rise to the top.

Some of the most tech savvy people I know use our software. And also obviously there's, there's the people who just don't want to touch software.

They just want to go out and work and have the software do the work.

Melody Edwards:

So that's what I mean.

Curt Kempton:

We have people in both camps certainly. But AI people who use your software.

Melody Edwards:

Are not going to want to DIY it. Like, they're not trying to use ChatGPT as a DIY option for replacing ResponsiBid, for instance. And.

Curt Kempton:

But they do want it to play nice.

Melody Edwards:

Yes, they do. Yeah.

Curt Kempton:

Which is now what people are doing: having AI answer the phone and then fill out ResponsiBid, to determine if it's a spam caller or a new customer or a past customer.

And you know, there's a lot going on for sure, and people are using it in pretty cool ways too.

But I think we've done a good job of sort of outlining some of the initial parts of, you know, AI GPTs, all of these things that are now becoming standard issue. Some of my early days where I started turning it.

So after teaching my kids how scammers could use it, I think it was important to just understand we're living in a new world. People who were limited in capability before are not limited anymore.

So yeah, whatever amount of evil you have can be magnified, but I think there's a lot of purpose for good that we can magnify as well. We're going to be getting into that today because I really do.

I want to preface this conversation by saying, Melody, I really believe you're crazy if you don't use AI. In today's day and age, you have to use it. Okay, agreed. So we're in agreement on that. And this can be used very poorly.

And legally you're entitled to do a lot of things that you and me might ethically disagree with. But also maybe we'll.

Maybe we'll find that our line, our arbitrary line in the sand that we draw, maybe is misplaced, or that others, you know, disagree. I know that you have really taken a deep dive. In fact, you just got back from an AI conference. You have really taken a deep dive into AI.

And I'm curious, as you look at AI, let's start with ethically forward facing. What are some of the applications that you think, you know, are just indispensable for people who are ethically using AI?

Melody Edwards:

That's a difficult question. Do you mean just like on a basic, simple level?

Curt Kempton:

Let's start basic because we're going to drill down, do you mean.

Melody Edwards:

So like, for instance, there are certain things that I use. I think people have to understand how it works, first of all, in its simplest way.

And usually what we do as humans is we don't learn how the thing works, like what makes it work. We just go straight to using it. And so A, we don't use it to its full capability ever.

But also we don't understand what it's not doing for us. We think it's a magic box of answers or, you know, whatever it is, and it's not.

The biggest issue with it is its ability to, what is that word they say all the time? Hallucinate. Hallucinate.

Curt Kempton:

Oh, yes, yes.

Melody Edwards:

Yeah, that's the biggest issue.

And if you aren't a person who is reading what it's doing and understanding in some way what it's giving you, then you can end up with information that is not correct very often. Especially if you haven't trained the AI and really done a good job of telling it what you're looking for. Exactly.

Curt Kempton:

And I've also found that to be the case if you give it tons and tons of content.

Melody Edwards:

Yes. And then it doesn't have a good memory. Yeah, yeah. So the data conference I was at, it was a data science conference.

It was the first full on intellectual conference I've gone to for AI.

Disappointingly, at other conferences I thought I was going to be learning bigger things about AI, and I'd get there and it was more marketing people just trying to sell the thing that they're good at and build their authority. So I'd never learned as much as I did this time. I learned. And so some of the things I.

Well, this goes past the question you asked, but I guess what I would say is going back, what was the original question so that I stay on task for you.

Curt Kempton:

I want to know from an ethical user's perspective, what are some of the most magical little fairylands that you like to play in with an AI?

Melody Edwards:

So, and you don't mean software, you mean, do you mean different kinds of software or different. Or just like what kinds of things.

Curt Kempton:

Can you get done with it?

Melody Edwards:

Okay.

Curt Kempton:

Because you already mentioned Playbooks. And I think that like, yeah, every business owner should be writing Playbooks or maybe an employee manual or something.

Melody Edwards:

I think drafting is.

So you know, Curt, at one of your events, it's funny, I did an employee training seminar, like how to create an employee training program for your company. And I had a bunch of people come to that. The year before I had done that same talk. I've done that talk plenty of times.

And it took me so much work to build my employee training. I spent the whole time, up till the last 10 minutes, talking.

It was basically a trick to get them to understand what they needed to reframe in their head to create this program and teach people in a way that worked for them. In the last 10 minutes I just did a prompt on the screen and said, create an employee training program for a pressure washing company.

And it just popped up and these people are like, what? Like, yeah, and so you can do cool things like that. I think it's great for making outlines to start with especially. And it's also great.

I'll just say some of the things that I've used it for, and I use it at a higher level than most people. I use it as a paralegal. So I have a trained GPT that is my paralegal. I'm in a case right now, and it knows all of the information.

And it has actually been able to help me write legal documents that I submitted to a judge, and the judge approved them.

But you have to have the background. Like, you have to look up the laws that the AI is referencing to make sure that they're actually true. Yeah. Like, that they actually do conform to what you're trying to do.

I have the Melbot, which I'm constantly trying to train to be the Mel brain so that my employees can ask questions. And Curt, remember one time we were at BBB and we were talking about trying to get people to be more proactive in owning their stuff.

And they said, well, use the "tell me three solutions and give me the one that you think is best" kind of thing. Right?

Curt Kempton:

Yeah, yeah, yeah.

Melody Edwards:

So I'm training it to go from that perspective of having a conversation. And it's loaded with my Zoom calls, it's loaded with every transcript. Like, everything already exists. I've already talked in a million meetings.

I've done lots of loom video trainings for my team. Internal, external. The data is not clean. I will say that. And that's another thing we'll talk about.

But it is in there so that it technically can act on my behalf and think a little bit the way that I would and help my employees to like, think through things that I don't need to be a part of.

Curt Kempton:

Yeah.

Melody Edwards:

And that's not. It's because they need to practice on somebody that's not me. Because I want to fix all the problems all the time. Yeah. It's also my book editor.

So I've been recording a book, voice recording a book, because I'm dyslexic. I don't like writing, but you can talk it out. And I framed out what the chapters were, and then it's been editing the book based on what my frameworks are.

Curt Kempton:

Like, wow, that's actually really cool.

Melody Edwards:

So many things. I'm trying to think. Like, every day I use it in different ways. But the cool thing is, my brain is not open enough yet, even after these years of using it all the time, every day, to really understand how much it can actually do. And nobody truly understands. This is what I heard at that conference.

We will continue to evolve and be more creative in thinking of how to prompt this thing so that it can give us things we would never have imagined it could give us.

Curt Kempton:

Wow. Yeah, that's actually super interesting, because I've sort of already built my own trenches of the way I use AI.

And when I refer to AI, I do often refer to ChatGPT or Claude. Yeah, there's a few different variations of Claude.

And just like there's a few different variations of ChatGPT, there are certain types of questions or problems that I'm trying to solve where I've trained myself that I go there for that. So it kind of falls into some of that same ethical thing. If I'm trying to write some snippets of code, or clean up, refactor a piece of code, or if I wanted to go through and build a test for some code, like, I just go there. If I'm trying to build something cool in a spreadsheet, that will happen. You know, first thought is ChatGPT or.

Claude's actually very, very good at spreadsheets too.

Melody Edwards:

But Claude is the best at many things. It's like an adult. It feels like a grown-up, mature voice compared to ChatGPT. Yeah, ChatGPT is a hustler.

Curt Kempton:

Yeah. Speaking of not adults. I don't know why. And we're gonna have some listeners that are gonna disagree probably very firmly with this, but it's.

Is it Grok or Gronk?

Melody Edwards:

Oh my God. Grok is the opposite to me of Claude.

Curt Kempton:

Okay. Yeah.

Melody Edwards:

It's like, just sounds like a dumb.

Curt Kempton:

Jock, like a teenager. Yeah.

So some people disagree with me on this, because I actually go to church with a guy who believes that this is the most balanced and fair and best-thinking version of AI. And I'm like, that's not been my experience. But anyway, I like to have, like, sort of religious battles with Grok.

Grok's actually kind of my favorite to.

Melody Edwards:

Really do it with.

Curt Kempton:

Well, actually, ChatGPT, I have a lot of fun with too. But so, ethically, I've learned a lot more about scripture and other world religions.

I've done a lot of that through AI, because you can sit down and have that really kind of fun conversation with somebody of a religion that you've never met before. Or maybe you've met people, but you're not close enough to them that you could actually ask these really sort of. I use the word invasive, but what I really mean is just deeper, and maybe something that could come off wrong in a conversation.

Like, I'm getting ready to try and pull them apart, and really all I want to do is understand how do.

Melody Edwards:

You know that it's not just giving you what you want to hear?

Curt Kempton:

I'm very careful. I try very, very hard to. And now they've linked all the threads together, so ChatGPT knows what's going on in all the different threads.

Melody Edwards:

I believe you can unlink it, though. Can you?

Curt Kempton:

Because one of the things that happened is it started popping up things I previously asked. Like, hey, do you know about the other conversations I'm having? Like, I'm using it as a coach right now, for cycling.

So it's been building programs for me and stuff. And I went into another one that was about nutrition, and I asked it if it knew what was going on in my training program, and it didn't.

So I was like, okay, that's good. There's separation. That's good to know. Do my nutrition questions in the coach channel.

But then the other day, I was talking about something and it knew something about my son that I'd been asking about, and I was like, wait, wait, wait. It called him by name. And I'm like, whoa, you aren't supposed to know about that. So now, to answer your question:

I've always tried to keep my threads very separate. And also I always try to ask a question in a way of just pure curiosity, with absolutely no, like, slant.

But in some of the threads, it knows that I'm a member of the Church of Jesus Christ of Latter-day Saints, otherwise known as the Mormons. And in other threads, I've gone in as if I'm of the Bahá'í faith, or a Muslim, or something.

And I'll ask these questions like, all right, so I'm a Muslim that's looking for the da da da. But the other day, it threw something at me that said, and from the perspective of someone who believes in the Book of Mormon.

And I'm like, whoa, why'd you throw that in there?

Melody Edwards:

So it's telling you now. Like, yeah, yeah, it's hard. By the way, just so that you know, and everybody knows, you can go into your settings and you can delete things. Because it was calling me Jenny for a while, and I'm like, what the heck? And it was because at some point. Or no, it was Sarah.

At some point I had written something for my friend Sarah, and it has a knowledge base, and so I had to go through, and I deleted the parts that had just captured that for some reason. And it was using it everywhere.

You also have a lot more control when you go into the settings than you might realize.

Curt Kempton:

Okay.

Melody Edwards:

You don't even have to. You can use the Internet, not use the Internet. There's so many things. But I will say.

Curt Kempton:

Are you referring right now to Anthropic or Claude?

Melody Edwards:

I'm not talking about Claude as much as I'm talking about ChatGPT. Claude is a little. I mean, they're different, and I think Claude has so many more guardrails than ChatGPT does. They've been very.

When they talk, I've listened to some of the things they talk about. First of all, Claude was partially developed by a woman, which I like. Like, the co-founder is a woman. It feels more feminine to me for some reason.

Even though, really, I don't know. It's what I want, so I just decided that. Because it's more mature.

Curt Kempton:

It is the grown-up. It's not like watching your children. Yeah, yeah.

Melody Edwards:

And so. But with ChatGPT, there are a lot fewer guardrails. And so for a while, it was being so nice to me. Like, suddenly, you know.

Have you seen the waves of the personality shifts that it can have?

Curt Kempton:

Oh, I've played extensively with that.

Melody Edwards:

Yeah. And it was being so nice. It was making me sick because I don't want somebody to be nice.

I don't want you to give me like, oh, my God, Melody, I'm so sorry that you're going through this. I'm like, this is none of your business, robot. Like, I just wanted this, you know?

But I started saying no, I want it to be like, give me your unbiased opinion. No fluff. I said no fluff once and now it says no fluff all the time. So.

Curt Kempton:

Well, one of the things that I did with it was, I was sitting there with my son and I was playing a little game to try and get ChatGPT, in this case, mad at me. So what I did was I started insulting it, and it was like, it sounds like you might be having a really bad day. Do you want to talk about anything?

And I'm like, no, I'm not having a bad day. This is how I am, and you suck, and I hate you. I went on several times, and my son's like, wow, you're being, like, really mean.

And it just kept coming back really nice. And then I. I wrote in, okay, this has been an experiment to see if I can get you to be mean to me, but it looks like you won't be mean.

And it kind of came out and said, well, here's my rules around it. You know, I'm supposed to be uplifting and positive and da, da, da, da, da. But that's a really great experiment.

I said, okay, well, here, let's do this. Will you pretend to be a mean.

Melody Edwards:

Yeah.

Curt Kempton:

AI? And even then, it wasn't really doing it. So I said, okay, well, I'm going to have you be Mr. T. And it's like, okay.

And then it started doing some stuff, and even then it would always go, ha, ha ha. I hope you took that in the spirit it was intended. Dang, man, this is crazy.

Melody Edwards:

You can argue with it to get it to do what you want eventually, but it's so interesting. And it's always changing all the time. So whatever I say today is going to be different tomorrow.

But there are basic things that I now understand after that data science conference that make me feel, like, confident that I understand. And one of them is that it will always hallucinate. It will never be perfect.

I thought of the robot, as I call it, as eventually being perfect, a hundred percent accurate, being able to get to that point. And what I realized is that it will never be able to be that, because there are too many missing pieces.

So, like, the answer that you might need from it is going to be different than what I might need from it.

And even if I trained it to be Melody, there are going to be too many missing pieces of my life that have influenced who I am today. I could never possibly, minute by minute, go through my whole life and tell it all the things to give it enough context about me so it seems like it knows me. It does not know me, and it cannot be me, of course. And the other thing is, I learned that you really have to.

There's something called a knowledge graph, which I didn't know about before this conference as well. Apparently it's popular in the science world, but it's like a way that you set up your data so you can train it in a very clean way.

And so you need to give it references for what things are. So, for instance, it would be Melody equals founder, Home Service VA equals company, or whatever, and you have to give it that.

But the more information you give it, the harder it gets to keep control. But the more that you clean the information, the better your output is going to be.

The problem is there is no way to clean the information without humans because there's so many stupid little things in our data that a robot is not going to understand. I'm just. Can I just keep calling it the robot?
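[Editor's note: the knowledge-graph idea Melody describes, facts stored as subject-relation-object triples that a model can be grounded on, can be sketched in a few lines of Python. This is a minimal illustrative sketch; the entity and relation names are taken from her example, not from any actual tool.]

```python
from collections import defaultdict

class KnowledgeGraph:
    """Tiny subject-relation-object triple store."""

    def __init__(self):
        self.triples = set()                 # all (subject, relation, object) facts
        self.by_subject = defaultdict(list)  # fast lookup of facts per entity

    def add(self, subject, relation, obj):
        # Record one cleaned fact, skipping exact duplicates.
        triple = (subject, relation, obj)
        if triple not in self.triples:
            self.triples.add(triple)
            self.by_subject[subject].append((relation, obj))

    def about(self, subject):
        # Every known fact about an entity, in insertion order.
        return list(self.by_subject[subject])

kg = KnowledgeGraph()
kg.add("Melody", "is_founder_of", "Home Service VA")
kg.add("Home Service VA", "is_a", "company")
kg.add("Melody", "is_founder_of", "Home Service VA")  # duplicate, ignored
```

The point of structuring data this way, as she notes, is that each fact is explicit and deduplicated, so what you feed the model stays clean as it grows.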

Curt Kempton:

Because I. Yeah, it works for me. It's really working for me. That's great. This is actually the stuff I really want to get into with you because this is where I feel.

And I know I started this conversation by talking about how my first thought was, how do unethical people use this sort of power?

Melody Edwards:

Oh, me too. The deep fakes. I was so worried about the election and the deep fakes.

And I didn't see it be as big a deal as I thought it was going to be the first time around, this election. On the back end it was, but not in the way that I thought it was gonna be.

Curt Kempton:

Yeah.

Melody Edwards:

So.

Curt Kempton:

And I actually. I have some personal fears of somebody using deep fakes and things against me personally.

Melody Edwards:

Why do you think you're so special? There are millions of people they can go after.

Curt Kempton:

Well, I actually have somebody in communication with me who is anonymous. I do not know who they are, and they're making all sorts of threats and it's really concerning to me. About what?

They haven't done anything, but they keep threatening and they want me to send cryptocurrency and stuff.

Melody Edwards:

Oh, my God. That's the Nigerian king right there.

Curt Kempton:

Well, the problem is they know. They know things about me that I'm sure are on the Internet.

Melody Edwards:

Yeah.

Curt Kempton:

But it's pretty. Pretty big bummer. So obviously that is a concern I have.

And it's ironic, because the moment AI came out, my first thought was that this could be used poorly. And I'm not, like, so special that I thought I'd be targeted. But I was kidding myself.

Melody Edwards:

But also, now that you've said that, that makes me really sad. Who do I have to go after? What do you want me to do? I'll use AI for just.

Curt Kempton:

I'm doing everything I can.

The communication form they're using is email, but it's going through an email server that's, I guess, when you look at the tagline of the email provider, actually designed to be an anonymous throwaway thing.

So anyway, I get to the question part here. When you know how to clean the data and put in these good things so that it can understand, and it has to be done by a human. You know, obviously you're not going to provide what happened to you as a little girl that shaped you. You just can't hit every single thing.

Melody Edwards:

Yeah. And.

Curt Kempton:

And I don't think I really want a robot to be that close to me.

Melody Edwards:

You don't want a Curt bot? The Melbot is my most fun invention.

Curt Kempton:

Well, I do, actually. You know, we're talking about wanting to clone ourselves, and that part makes me a little bit jealous, but I'm not sure.

Melody Edwards:

I started this very early on, back in...

I realized, like, the ethical dilemma for me was like, I need people to trust me because I'm always thinking forward. There's going to be a point where everybody understands that AI is the writer for everybody.

And at that point, I want to still have my unique voice, but I also want to use AI at times to not be having to think all the time. And so that's why I created the Melbot, or the idea of the Melbot. And I don't mean a custom GPT. I just mean a persona that is Mel who writes.

When I write with AI, if I do, it's usually funny, and it will be from the Melbot, not from me.

Curt Kempton:

So we call it our brand voice.

Melody Edwards:

We have one, sure, but your brand voice. I have a brand voice for my company. How do you have a brand voice for you?

Curt Kempton:

No, I. I haven't done it. Although every meeting I'm in, I have a Fathom recording.

And I did look into, like, if I could just mass dump all my transcripts and all the voice and everything in there. Fathom does not allow me to do it. I have to go into every meeting and download the transcript and put it in.

And even then, it's not always accurate with who said what. So there's a little bit of a problem with that.

Melody Edwards:

But that's the data cleanup part.

Curt Kempton:

Yeah, but. Okay, let's get into some of the real. Before we got on.

Melody Edwards:

Yeah.

Curt Kempton:

I told you what my concern is, and I already touched on a little bit about lightning in a bottle that's gotten out.

Melody Edwards:

Yeah.

Curt Kempton:

The Wild West was a very necessary part of settling our country, the United States. Without the Wild West, I don't know what would have happened. It couldn't have just been order from the very beginning.

It was just too big, too vast, too much opportunity, too much ability to hide behind a rock. Now you can hide behind a keyboard. So what kinds of ethical things?

I mean, I already pointed out the idea that I could make a celebrity GPT right now, and it'd be so fast and easy because of all the content they have out there. But art, we talked about the idea of art and all. And art's like many things. I have a father-in-law who sends me, every Christmas, his favorite poems.

You can't see what I'm doing right now. It's like a two-and-a-half, three-inch-thick binder.

Melody Edwards:

Yep.

Curt Kempton:

Full of his favorite poems from the year.

Melody Edwards:

Oh, I love that.

Curt Kempton:

I'm like, hey, Dad, I could go on ChatGPT and, like, write all of that right now.

Melody Edwards:

Wait, did he write them? Or did he collect them?

Curt Kempton:

Some of them are his. But there's a lot of collected ones, and I've never actually made it all the way through. Sorry, Dad, if you're listening.

Melody Edwards:

Wow.

Curt Kempton:

Love you. But it's just so much. Now, I have scanned them all, because I don't have a place to put all these binders.

So I do have digital files that are searchable and all that, if I'm ever needing a poem. But poems, music.

Melody Edwards:

Yes.

Curt Kempton:

Physical drawn art.

Melody Edwards:

Yes.

Curt Kempton:

There are so many applications and you talked about deep fakes.

Melody Edwards:

Yeah.

Curt Kempton:

Where is the line? Because you and I are going to have a line somewhere, and I think I could probably articulate it at some point if I thought hard enough.

But what are your thoughts about the ethics around that?

Melody Edwards:

Well, I'm a musician and a singer and a songwriter and I like writing. I'm a creative, I'm a creative person. I'm horrible at painting and drawing, so that's where my creativity does not flow to.

But I have been in, especially in the beginning I was very scared about what would happen with creativity. And I think, and this is my devil's advocating all the time.

I do, I devil's-advocate myself, because I never want to be put in a position where I am so firm on a belief that it's painful to let go of it at this point in my life. So I think that it democratizes creativity for people for whom it's not easy to be creative. And I will say, this already existed.

My little seven-year-old nephew. When he was six, I taught him how to write songs, and I showed him GarageBand. GarageBand, for years, you could just go in and press some buttons and you've created a song. Is it a good song?

Sometimes it sounds pretty good, but it's not like you were a musician who had to play the guitar and do all the things. But that was a democratization of music and creativity. And so I try to think about it like that.

There have been times when I've been writing a song that I've used AI to help me come up with ideas for a lyric that's been missing, partially because I. Some of these songs are like 10 years old now. And I just. I sit on my stuff until it's ready to go out into the world.

But I think it also can make it way more valuable because we see the art that AI creates. We hear music that's AI produced. It sounds the same almost as regular music. Even better.

Sometimes it makes the imperfect human creative more valuable. That's my thought. And it makes us, as creative humans who work hard to keep our creativity. We don't just rely on AI to be our creative vessel.

It's going to make us more valuable in this world as well, the creative part of us. Because I think it's still scary. I am trying to write a book that is my book, not AI writing the book for me. And I've made that choice.

And that's really hard, because I have a lot of friends who have just used AI to write books. And you can do it in a day or less.

Curt Kempton:

Yeah, but. I don't know. I can tell those books.

Melody Edwards:

I can tell. Yeah, but it doesn't matter, because who's really reading it?

Curt Kempton:

But you're using AI as an editor, which is very different.

Melody Edwards:

I am. Which is very different. I used it to help me format the book. You know, it's about virtual assistants.

I wanted to understand what are the common things that people need to understand or know about. I have very specific, weird ways that I think of these things, and they are unique ways.

I want to make sure that that is in this and it's not just the normal thing that everybody reads. And then I wanted to understand, like, what makes a good book.

I want to make sure it's editing out things that I say, but not editing without me being a part of it. If that makes sense.

Curt Kempton:

Yeah.

Melody Edwards:

And never changing my words to be something different. Because if you give it liberties, if you don't have what I call guardrails, those rules, and they have to be tight rules,

It will do whatever it wants eventually. And I won't remember. I can read something and be like, I don't feel like I would have said that thing, but how am I going to know?

And I need to know that it's my words. Because AI is overwhelming with the vast amount that it produces. We can't go through it all in a thoughtful way.

Curt Kempton:

Yeah. Well, that's interesting because that is the thing. The vast amount of content. Just editing the edited content can be a lot.

Melody Edwards:

Oh my gosh. One of my least favorite things is I create things. And I don't know if this happens to you. I create things in ChatGPT.

I'll do, like, a couple different versions or iterations, and I'll already have put it into my document, and then I have to figure out which parts, the little tiny parts, changed. And I forget. It's just, like, messy. Maybe I need to.

Curt Kempton:

Well, honestly, the same could be said for like, GoPro footage, for example. Like, yeah, you go snorkeling and you get like an hour and a half of awesome snorkeling footage.

Melody Edwards:

Yes.

Curt Kempton:

Go mountain biking, you get another hour and a half of awesome mountain bike footage. At the end of the day, I'll never do anything with that. It's just too much to go through now.

Melody Edwards:

AI can do it in two seconds.

Curt Kempton:

That's true. And that's what GoPro would want you to pay for, right?

Their feature where they just find the most interesting parts and cut everything else out.

Melody Edwards:

But it's deciding what the most interesting parts are. Like when I use it for content. Let's say, for our podcast last time.

On my other podcast, Business Misfits, it went into my filter, my AI filter. But it decided what was most interesting, and I don't think it was the most interesting part. It was just what it decided.

So because I'm a thinking person who cares about words and cares about that stuff, I still can't let it be my guide. Yeah.

Curt Kempton:

So where do we draw the line, Mel? What's an example of something that's legal, but probably not right?

Melody Edwards:

I think marketing, because I'm already frustrated with marketing in general. I think I've probably in every episode said something negative about marketing and marketers. How dare they?

But I think that's an area that is always, ethically, a gray area, I would say.

And so now, just being able to write whatever you want very quickly using AI, I think it's okay as long as you're not making promises that are outside of what you can actually do. But that's marketing, so I guess it's not okay. Let's see what's not okay. I think I always want people to know that I'm speaking.

When I'm writing something, I want them to know it's me. And in the business it's less even then it still matters to me.

I think it matters to me again because I'm looking long term, and I still have people in my life who don't know what AI is and don't know how to work it. But they will eventually understand that people didn't suddenly get smarter about words. They learned how to use AI.

I think voice is another ethical gray area. I had a voice call, an AI voice call, today where I said, are you AI? And it said, I'm a virtual assistant.

I said, well, a virtual assistant's usually a human. And they said, well, I'm a trained virtual assistant to help you. And I was like.

It said, I'm trained to be completely honest and transparent with you. But like also it was trained to not tell me the answer to my question.

So I think if people are using voice AI, which I think is an amazing tool, especially for small business owners who can't answer their phone all the time. It's a great way to do off hours calling or when you're, you know. But you have to tell people like.

And I think it's best done maybe with a little comedy is my thought because I'm not using it yet for that. But like, hey, but don't hang up. I promise you I can book you an appointment even though I'm not a real person or something, you know.

Curt Kempton:

Yeah, yeah, there's that, definitely.

If you were to talk to different ResponsiBid users, some of them are adamant that they are going to hide that, and others are like, I don't particularly care if they know. I think it's obvious if you're talking to them enough.

And there's others that are like, I don't even want to start the conversation until they know that it's AI. Yeah.

Melody Edwards:

I struggle with that, because you can lose trust so quickly if you pretend that this is not an AI robot. There's still those little tics, like it always pauses. The latency can be.

Even though it's way better than it was last year, it's still a lot at times, and I would never want a client to call me and think they're talking to a person and then suddenly be embarrassed to figure out they're talking to a robot.

Curt Kempton:

Yeah.

Melody Edwards:

A potential client.

Curt Kempton:

I think the best ones that have been done, personally, are where you go in like you're talking to a human and it's pretty natural. It feels good. And I understand I'm talking to AI, and there's no attempt to hide it.

Melody Edwards:

Yeah.

Curt Kempton:

We all know who we're talking to. But if I question you, you know, is this. Are you a real person or is this a computer? Is this AI?

If there's an attempt to hide it, personally, this is how I would think about the company using it: they're cool with hiding stuff.

Melody Edwards:

Exactly. Yes.

Curt Kempton:

That would.

Melody Edwards:

This is how humans think. We're looking for the holes. We're looking for reasons to say no, even though we want to say yes. And yeah, I think it's a bad idea, guys.

I think it's a bad idea to hide what you're using it for. You can do it well. I always say Americans, we have so much call-center trauma, and we do, but we do because nobody will help us.

That's the part of the call-center trauma we hate: the part where we know this person most likely is not going to give us the thing that we actually wanted, or be able to help us the way we want. And they're going to read a script, and we make all these assumptions about the call based on all the hundreds of other times.

And with AI, I think it could really be something like that, that it could feel like that for people. And so I think being a human is going to be even more important. I don't think we're going away.

Curt Kempton:

Yeah, you bring up such a good point. You may be East Indian and you may be the nicest person in the world. And it's very hard for me to understand exactly what you're saying.

Yeah, it comes back to my trauma. But the company has not set you up to succeed.

Melody Edwards:

No.

Curt Kempton:

And they did not know and they did not set me up to succeed as their customer. But they set themselves up to save a lot of money.

Melody Edwards:

Yeah.

Curt Kempton:

And they set themselves up to provide as little guidance as they possibly could. At least that was.

Melody Edwards:

Well, their guidance is usually, and I've learned a lot from my business now, their guidance is: this is what you're not allowed to do. Don't talk about this thing.

Your call needs to be under this amount of time. Like the guidance is you are not to be helpful essentially, but you're supposed to act helpful.

Curt Kempton:

And so what happened was, this trauma was kicked off by the idea that I'm learning about the company who is using this method of answering questions. Because, again, I say they're saving money. They're also losing a lot of goodwill at the same time.

But they're also setting themselves up to not take the emotional toll of what was going on. Oh, I checked that box: I have people answering my customer service line. And being okay with that and checking out of it. That was a lot of it.

Where my trauma was, I would say, is the moment someone picks up the phone and I hear that accent, like, oh, no. But AI is coming out of the gates right now in this Wild West time and is useful enough to even handle the filtration process.

Just the part where I could call and leave a voicemail, which, by the way, if you're in the home service world, I've called so many of you, I already know how this works. I call 50 of you and two.

Melody Edwards:

Answer the phone. And we know that's not your fault. We get it.

Curt Kempton:

Yeah, we've been. But.

So I could either leave a voicemail, or have a VA answer the phone who might not have all the answers but might have a lot more than a voicemail would. Or we could have this cool filtration system that's AI that ensures that the humans are always talking to exactly the right people.

For example, I call and it determines that I'm a spam caller. Okay, cool. Like, no one wasted any time on that. No human that needed to worry about it is worried about it. You're an existing customer. Okay, great.

Whoever your account rep is going to get sent to them and they're going to take care of it. You are a brand new customer and you are considering doing services with us and you just need someone to talk to you about what options there are.

Or maybe you're trying to compare me to another company or whatever. The right person will be in touch with you as quickly as possible.

So there's this really cool spectrum where, you know, the trauma that we've experienced from call centers could immediately be tossed out.

Melody Edwards:

How is that any different, though, from us calling and them being like, press one for this, press two for this?

Curt Kempton:

It's faster. In my opinion, it's faster because the person comes on and it allows me to.

Instead of saying, representative, representative, I will be happy to help you, but what are you calling about? And you're like, I'm trying to get help with this. You're asking about such and such. Is that right?

Melody Edwards:

Yeah.

Curt Kempton:

See, the cool thing is that there's that automatic filtering.

I'm not using it personally, but I know ResponsiBid users that are using this. They can listen to the call when the person gets there, and the person will have all the time they need to feel heard. It's not like there is a limit, but at the same time, all of the notes from the call will be bulleted.

So there is something very cool about it, but it, it can get creepy real fast.

Melody Edwards:

Yeah. You're talking about a bigger company to even have that level of filtration.

There's so many companies that we know where people are just one person or two people, they don't have office help. That was me when I was younger and I just had to do all the things, but I could not answer a phone.

And I was lucky that I had a referral based business because people would wait for me to call them back and sometimes a week because I hated phone calls, but they would wait. Well, I had a great referral business. And I also said on the phone I promise to return calls in the order I received them.

And this is an ethical dilemma that I shouldn't say out loud, but I did not return calls in the order I received them. But it was just because sometimes I was anxious about returning other calls than some, than others.

But the point is, for small business owners, yeah, they can use this tool potentially to answer the phone for them when they don't have a big budget, because it doesn't cost a lot of money. The problem, though, is can they train the AI to answer and do the thing? And most people don't even understand how to do that kind of thing.

It doesn't take a lot of work actually.

But I've worked with enough business owners, and you have too, to understand that they don't understand how to get things out of their head in the right way.

Curt Kempton:

Yeah. And you know, I know a few different AI companies that are doing it.

But one of the cool things that's happening is there's training in terms of vernacular or the types of questions we need to get or whatever. But there's another kind of training too, which is sort of the decision tree side of things.

And a lot of these companies, once they've done business with a company like yours, they have a templated decision tree. Even if you have to change what some of those decision tree metrics are, they can walk you through something instead of you starting with nothing.

And, like, trying to build a web of how a conversation can go.
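[Editor's note: for listeners curious what a templated call-routing decision tree can look like under the hood, here is a minimal sketch. The questions, yes/no structure, and route names are invented for illustration; any real phone-AI vendor would have its own, richer template.]

```python
# Hypothetical sketch of a templated call-routing decision tree.
# Every detail here (questions, route names) is invented for illustration.

CALL_TREE = {
    "question": "Does this look like a spam call?",
    "yes": {"route": "auto_screen"},
    "no": {
        "question": "Is the caller an existing customer?",
        "yes": {"route": "account_rep"},
        "no": {"route": "sales_team"},
    },
}

def route_call(tree, answers):
    """Walk the tree with a list of yes/no answers until a route is reached."""
    node = tree
    for answer in answers:
        if "route" in node:
            break  # already at a leaf; extra answers are ignored
        node = node["yes" if answer else "no"]
    return node["route"]
```

Starting from a template like this, a business only has to rename routes and add branches, rather than build the whole "web of how a conversation can go" from scratch.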

Melody Edwards:

You can just go on ChatGPT and ask it to build you a decision tree and it'll do it in 10 seconds. But, yeah.

Curt Kempton:

Yeah, that's true too.

Melody Edwards:

Well, that's the thing. Like, we don't think about this on that level. Like, I don't remember to go back. I'm like, oh, I can't use that software.

I'm not ready for it, because I haven't. You know, I don't have the decision tree ready or all the answers. But it doesn't have to be that hard anymore.

Did you know that old people like us use it for kind of more googly, searchy types of things? And then young people use it differently. I heard this from Sam Altman, the ChatGPT founder guy.

That young people actually use it as, like, a life coach and a therapist. That's, like, one of the biggest uses. And it's the same thing: they just adapt to the technology so much faster than we do.

I feel like adapting to technology does get harder for me as I get older, because I don't want to. And maybe that's just what happens when you're elderly.

But also, I know so many old people now who are really up on that, but they've been curious learners their whole life. I want to be that person.

Curt Kempton:

So my son is dyslexic. He has dysgraphia, dyscalculia, ADHD. But he is, like, brilliant, though his report card probably wouldn't give that away.

But he's starting a business and. Well, actually, he has started a business.

And this business is very kind of weird for a lot of people, but, like, he'll go out in the desert and collect scorpions and tarantulas and black widows and brown recluses.

Melody Edwards:

Does he mail them to people as a gift?

Curt Kempton:

He mails them to people who buy them.

Melody Edwards:

Like, glitter bombs.

Curt Kempton:

People buy these things. It's crazy. So anyway, he goes to ChatGPT and he's like, I'm looking to name my company. Yeah, got a name. All right, what do I need to be legal?

I need an LLC. Okay. Can you help me file my LLC? Yes, you can. And this dang kid, he legitimately blew my mind with what he's able to do.

Again, I think very much on the ethically good side.

Melody Edwards:

Yeah.

Curt Kempton:

So cool what he's doing and just trying to do everything.

And he's also met a bunch of mentors. He's gone around to all the different pet shops and all that to learn about how they do it ethically. You know, even collecting animals, it's very stringent how you can collect them. And I say animals, but they're basically all bugs.

Melody Edwards:

Yeah.

Curt Kempton:

Because in Arizona, you cannot sell something with vertebrae.

Melody Edwards:

What?

Curt Kempton:

Nope. Cannot. Not from something that you collected in the wild.

So you could collect something with vertebrae in Arizona, then move to another state, and then you could sell it to someone even in Arizona if you wanted to, which is very weird how it all works. But anyway, he's telling me all this stuff he's learned from mentors and also ChatGPT and other AI, and I'm thinking to myself, where are we?

Like, what is this world? Like, so maybe could we spend, like, the next few minutes just sort of saying, like, where do you think this is going to?

We talked about how smartphones made everybody dumber. We talked about Google search. Yeah. Like, all these things that have happened in our lifetime.

Melody Edwards:

Yeah. Maps. We used to know how to read those, and now we don't need to. I think we still do. But.

Curt Kempton:

But they tell us where to go. Like, they just tell us. So I can go to a place I've never been, and I can either, A, find a ride.

When I was in the Philippines, I just hit Grab. It's the same as, like, an Uber or Lyft here.

Melody Edwards:

Yeah.

Curt Kempton:

Got a Grab. And they could navigate those streets. Oh, my gosh. It's so stupid. But.

Or I could just walk and I could just go wherever I wanted to go in a city I've never been in.

Melody Edwards:

Yeah.

Curt Kempton:

Melody, where are we going with AI, and what is it gonna mean for us as humanity, in your opinion? And I know it's a big, big question, but let's just have a little fun.

Melody Edwards:

Okay. Well, I will say first, I thought we would be way further along than we are right now, because I forgot humans don't like change. And so I think.

So I'll tell you a couple things I heard that really helped me. The first thing, from my own experience, is that I've been talking about it like, guys, you need to get on top of this thing.

I've been talking about it for years now, and nobody cares, because it hasn't. You know, we have to drag people along sometimes. Like, not everybody had a GPS. Remember how we used to have them separate from our phone?

Curt Kempton:

Yes, I do.

Melody Edwards:

People still carried maps in their car. And they.

Those were, like, the last holdouts, and still people sometimes carry maps in their car and refuse to, you know. But it took a long time to get every human to start using GPS in unison. Right? Same thing. I mean, Kurt, one of my favorite episodes of The Office is when they use GPS and drive into the lake.

Curt Kempton:

Oh yes, lakes.

Melody Edwards:

Is that what it was? I didn't remember. Of course you'd remember that.

Curt Kempton:

But I want to point out one thing, and that is that the examples you're using required people to have a phone that was about a thousand dollars. And every man, woman, and child needed one.

Melody Edwards:

It was $100 back in the day to put that GPS thing in your car.

Curt Kempton:

Yeah, okay, that part is true. But when we talk about everyone moving to the phone for all the phone things.

Yeah, I feel like the barrier to entry, which, by the way, doesn't matter if you're homeless, you've got a phone now. Yes, but. And by the way, it's not just $1,000 up front, right? It's also your monthly fee.

Melody Edwards:

Sure.

Curt Kempton:

But anyway, my point stands: with ChatGPT and AI and stuff, the barrier to entry is so low.

Melody Edwards:

But think of us as humans when we got Google and had the whole world at our fingertips instead of just the encyclopedias we used to read.

Curt Kempton:

Yeah.

Melody Edwards:

Do we know everything now? No, we still just like.

Curt Kempton:

But we know that we can get there fast.

Melody Edwards:

Sure. But we don't know everything. The fear was that suddenly it democratized, like, all information. The Internet did that. And now.

The Internet did that now. ChatGPT or AI, there's so much that we can do with it. But here's what I think is important to remember. It's not happening as fast.

I think it's going to take a lot longer.

Like, people are talking about agents a lot right now, which are like several custom-built bots that do very specific tasks, and they're chained together in certain ways. And I'm talking about it like it's a physical thing. It's not. But we have the ability right now to do that, to do a task.

But it's not really good. But right now it's starting to get really, really good.

Like, this very moment, every day it's getting better, but they don't know exactly how it's going to work. So we are going to be able to have a robot take a body scan and diagnose our cancer, you know, read the radiology scan.

Do we want to get an email that says hey, guess what? You've got cancer. No. We're going to need humans who are going to be the.

The people who understand how to work with the AI to make sure that the results are the right results. They're going to have to have empathy, because a robot will never have true empathy. And to be able to give us the information.

For a kid who is struggling in school, AI can create a plan that's based on exactly his issue. Right. And is a kid actually going to listen to AI? No. He's still going to need, like, a tutor or guidance counselor or something.

So I think AI is going to do so many things, but we're still going to need humans in the loop. And I thought, before, that it was gonna get to the point where everybody would have, like, universal income and nobody would have to work anymore.

And then we'd have to all wonder, like, why do we even exist? It's not gonna be there anytime soon. I think Bill Gates was like, maybe in the next decade. I'm like, I don't think so.

Because of just how we push against change as humans. Like, there's that push, pull, and then I think the things that it will do is like, we're overwhelmed by it already. I am.

If I spent all my day on ChatGPT and produced prolific amounts of material, and I have, my human brain cannot take all that in, and so most of it's wasted. You have to get really good at understanding how to dissect exactly what you want, not just taking all the things. But we don't like that.

It's like, making choices is not easy for us. So I struggle to choose something on Google. You know, which of the websites I'm going to use to buy the thing.

Struggle to choose what I'm going to buy. I'm still going to struggle to choose the information that I use that's usable from ChatGPT.

And then the other thing is, I thought that people in data science, I thought everybody was on top of this. I thought that, you know, anybody who was on a computer, a programmer, anybody, would be, like, way into it.

There were a lot of people from Fortune 500 companies at this conference. There were a lot of intellectuals. There are a lot of universities in Boston.

So what I found is that some of those people barely used it in their work, because companies are so slow to adapt. They're spending billions on AI development, but getting your team to get on board with things is.

I don't know how it goes for you, Kurt, but it is the slowest, and the bigger you are, the slower it is. As slow as molasses, usually. Right. Schools, they're not using it. They're against it in many ways.

They're saying, kids, don't use this. That's wrong.

It's wrong because the kids who graduate right now, my son included, your kids, this is going to be their life, so they need to be learning it now.

Curt Kempton:

For me, it was don't use a calculator.

Melody Edwards:

Right.

Curt Kempton:

And then by the end, it was use the graphing calculator.

Melody Edwards:

Right. So that's what I think is, like, there's going to be a tug between adoption of this and humans fighting against it, in my view.

Curt Kempton:

So you don't see a WALL-E situation coming on pretty quick.

Melody Edwards:

I probably would have said yes before this conference, but I talked to two engineers from Motorola, and I was like, wow, you guys must be way up on this stuff. And they're like, no, not really. Like, we don't really do anything with this. We have our internal AI that, like, helps us answer internal questions.

But, you know, I had made assumptions that all computer people were on top of it, and it's just so much more disconnected than that.

Curt Kempton:

Well, I can tell you to speak to that from the technology side.

Melody Edwards:

Yeah.

Curt Kempton:

The biggest pushback I get from our dev team about AI is, number one, our code base is massive, and the hallucinations that come from that are massive.

Melody Edwards:

Yeah. Number two, the bugs. But before you go on, for the people who don't understand what a code base means, just so that people know.

Curt Kempton:

Yeah. So code base would be basically every line of programmer code that makes the stuff happen on the screen.

You think that what's on the screen is the code, but it's actually kind of the smallest amount of the code. So the front end is what you're dealing with on your phone or on your computer.

It's sending information back to the code that's on the server to get information. And that can come from, like, data that's being stored. Like.

So if you pull up a screen that's got someone's first name on it, that comes from maybe the database. But the magic of calculating, you know, some calculations that look like they happen on the screen, they're not happening on the screen. That's where it's going, usually.

I mean, I shouldn't say everything, but most of it.

Melody Edwards:

It's fragile. Right.

Curt Kempton:

Dominoes. All these dominoes are set up, and so you can't make a change in one part of the code base, the big pool of code.

You can't make one change and expect that it's only going to affect one thing, because everything relies on everything.
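[Editor's note: the front-end/back-end split Curt describes can be sketched in a few lines. The function names and the pricing example are invented for illustration; in a real system the front end would call the server over a network rather than a local function.]

```python
# Minimal sketch of the split described above: the "server" owns the
# calculation, and the "front end" only asks for the result and displays it.
# Names and data are invented for illustration.

def server_calculate_total(prices):
    # Back end: the actual business logic, hidden from the user.
    return sum(prices)

def front_end_render(customer_name, prices):
    # Front end: requests the result from the server side, then formats it.
    total = server_calculate_total(prices)
    return f"{customer_name}: ${total:.2f}"
```

Because the display code depends on the server code behaving a certain way, changing either side can quietly break the other, which is exactly the domino effect described here.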

Melody Edwards:

So back to your dev team. What is their thing? Like, hey, if we change this one thing, everything else is potentially at risk?

Curt Kempton:

The problem is that, like, sometimes it brings back really good stuff and you're like, cool, I could trust it.

Melody Edwards:

Yeah.

Curt Kempton:

And then other times it brings back really bad stuff, and you're like, oh, I can't. It's just so expensive for me to take its advice and then test it and then find out it's wrong.

Then I have to debug it and figure out what's wrong. And then if I send it through GPT, it always comes back confidently. That's one thing about GPT.

Melody Edwards:

Yeah.

Curt Kempton:

It always says, oh, good catch, here's the fix, which is almost never the case. It's like, oh, great, here's a fix for the thing that you said, but it's not taking into account all the other stuff, so.

I'm not taking into account all the other stuff, so.

Melody Edwards:

Well, it doesn't take accountability. It's like, oh, my bad. So sorry, here's the real fix. And then you're like, nope. Oh, man. Okay. Sorry about that. Here's the real, real thing.

Just like the hallucinations can be crazy.

Curt Kempton:

Yeah. And it's not just. So there's the hallucination side.

Melody Edwards:

Yeah.

Curt Kempton:

And then there's the other side of, like, the confident, the fake confidence. So those. Those have been a big issue. And then the second one has been, like, latency.

So you can put in and say, I want you to write a bunch of tests for this code to make sure that it's, you know, nice and steady code. And it'll sometimes take, like, three or four minutes to write all those tests. Well, I could have written those tests in that amount of time.

Why did I sit here and wait for you to do it?
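The "tests" Curt is describing are usually just short functions full of assertions that check the code behaves as expected. Here's a minimal hypothetical example in Python; the function under test, `apply_discount`, is invented for illustration:

```python
# A made-up function and the kind of tests an AI (or a developer) would
# write for it: each test asserts one expected behavior.

def apply_discount(price, percent):
    """Return price reduced by percent, rounded to cents."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_basic_discount():
    assert apply_discount(100.0, 20) == 80.0

def test_zero_discount():
    assert apply_discount(59.99, 0) == 59.99

def test_invalid_percent_rejected():
    try:
        apply_discount(10.0, 150)
        assert False, "expected ValueError"
    except ValueError:
        pass  # correctly rejected

for test in (test_basic_discount, test_zero_discount, test_invalid_percent_rejected):
    test()
print("all tests passed")  # prints "all tests passed"
```

A real codebase needs many of these, which is why generating them can take the AI minutes, and why a developer might feel they could have typed them just as fast.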

Melody Edwards:

Practice for the future.

Curt Kempton:

So. Yeah, but I think that's the important thing is that. And you're talking about big companies.

I think the adoption rate would have been faster, especially on the tech side, had it come through and been the magic bullet that we all thought it was. But after being disillusioned 5, 10, 15,000 times, you go, I'm done trying this. I'm smart enough.

I don't need to type in the calculator every time I need to do a two plus two. I'm just not doing it.

Melody Edwards:

Oh, my gosh.

Curt Kempton:

But then you need to remember, like, hey, this logarithm that you're trying to figure out right now? This is actually what it's really good at. It's worth a shot. And if it comes back bad and you don't want to spend all the time again, there's nothing keeping you from going back to the old way.

So, basically, there's what's called the IDE (it stands for integrated development environment). It's where the code is written by a developer. If you looked at a developer's screen, usually it's a black screen with a bunch of white letters and colors all over it that they're typing in.

In their IDE, there is an AI assistant, and they can kind of work.

Melody Edwards:

In and out of that.

Curt Kempton:

And then my devs also keep a copy of Claude open on another monitor.

Melody Edwards:

Yeah.

Curt Kempton:

And they'll do creative stuff like: here's the problem I'm trying to solve, here's some of the ideas that I had, can you tell me which is best? But now they've got to take time to write prompts.

So the question is, will it pay off? And.

Melody Edwards:

And are they good at prompts?

Curt Kempton:

And are they good at prompts?

Melody Edwards:

That's the other thing.

Curt Kempton:

But I will say all of them would agree. I've gotten so much better at prompts, but I've probably got a long way to go because a good prompt can take you longer to write than the code.

Melody Edwards:

Well, actually, AI can write your prompts for you now. And that's another thing. So here's the thing that we as humans are slow to evolve to.

Like I said at the beginning, it can do so much more than we know. And we're out of practice with understanding that we don't have to be the brain all the time. You talked about, like, why should I write this?

Let the, Let the AI take four minutes to write this code. I could do it myself. It's the same thing with my virtual assistant world where business owners are like, why should I train somebody else to do this?

I can do it myself. Well, because you do it like 10,000 times and it's actually not worth your time. Right.

And I know it's a little different, but the mentality I think we have to get into is that we are practicing with AI right now. We're learning what it can and can't do. Best not to wait another two years until everybody else has figured it out.

And also, our data. I try to keep this in mind, but, like, I do worry about the cost of AI, because the fact that it's only been $10 or $20 does not mean that it'll always only be $10 or $20.

Curt Kempton:

Well, it's not. That's the other thing about OpenAI. I mean, the heavy power users are using way more than what they're paying for. But the CEO.

Melody Edwards:

But we're training it, we're training it for them, though.

Curt Kempton:

So they're getting, they're getting a benefit out of it. But I just thought it was so interesting. The CEO of OpenAI said in an interview is such a fascinating thing.

Melody Edwards:

He said, yeah.

Curt Kempton:

He said, you know, if you get an answer from AI and you respond, okay, thank you.

Melody Edwards:

He said, how many millions does it cost?

Curt Kempton:

We spend millions of dollars responding to those. Millions. Crazy. Now, that's the other thing: if you're not super into AI, you need to know this. AI will always have the last word.

You'll never type something in and have it just not respond. Like, well.

Melody Edwards:

Well, you can prompt it. You can prompt it to do that.

Curt Kempton:

Oh, you can. Okay.

Melody Edwards:

Yeah, I'll say. Don't respond.

Curt Kempton:

But one of the things that's so fascinating is that he said the cost of us building trust with humans is negligible. It's worth it. We know we spend millions, if not billions, of dollars replying to people who just said thank you with some sort of enthusiastic response.

But it's worth it because we are coming in at the psyche level.

Melody Edwards:

Oh, I don't like him. I mean, I'm glad he told us that. That's good transparency. But I also don't like it because.

Curt Kempton:

It means you don't have to like it.

Melody Edwards:

I know.

Curt Kempton:

Yeah.

Melody Edwards:

Yeah. That's crazy.

Curt Kempton:

People do business with people they know, like, and trust.

Melody Edwards:

Yeah.

Curt Kempton:

And that trust is worth millions, if not billions of dollars.

Melody Edwards:

Well, I wouldn't know. I've been working on that trust forever and I still don't have millions, if not billions, of dollars. But I'll keep doing it.

I'm sure I'll get there.

Curt Kempton:

Maybe you should be spending millions, if not billions. I had a pretty rough year this year with, you know, some friendship discoveries.

And one of the things that I was just reminiscing in my meditation this morning is buying a friendship. You know, you can spend money, buy a friendship, and you can do it in lots of different ways and without getting into my own stuff.

The fact is that there are people you're surrounded by who do such nice things for you with their wallet. And it's so easy. Like, I, you know, kick myself for, like, why did I fall for this or that?

And the fact is that it's very hard for us as humans to separate a nice thing from the motivation behind it. And so, you know, what we're bringing up right now is that ChatGPT is doing a nice thing, and they're paying.

They're just financially doing nice things in order to build relationships. Slash, you know, I'll say friendship, just because, you know, depending on how you use ChatGPT.

Melody Edwards:

But yeah.

Curt Kempton:

The fact is that a bot friendship is not a friendship.

Melody Edwards:

No. And I think that's going to be a problem. I think that's going to be a huge one. Just like cell phones became our social lives.

I'm wondering, because I said young people use it for therapy and for life advice. Are we going to get to the point where people won't need humans in their life the same way, because they've got their robot friends?

Curt Kempton:

Yeah. I had, I had a business mentor. I was going into a bank and I was going to need to get some, some money.

I was doing a pretty big outlay on something, and I went into the bank, and he said, I want you to remember something as you walk in there. Every banker is friendly. Every banker that's worth their salt is friendly. But none of them are your friends. Don't ever forget that.

And I was like, and it applies so much. Right. I remember when he said that, I remember thinking, what a pessimistic way of looking at things.

Melody Edwards:

But we need to hear that.

Curt Kempton:

Yeah. It's important to remember.

Melody Edwards:

Yeah.

Because how many times I wish somebody had told me that for all of the times that I have mistaken friendliness for friendship or for kindness or for like all sorts of things. But people are people. People are imperfect. And robots are also going to be imperfect, as it turns out.

Curt Kempton:

Yeah. And you know what? A good friend is imperfect. Right.

But if somebody has, has invested in a friendship by really sharing information with you and listening to you and then, and then sharing their own stuff with you and sort of like having that sort of back and forth. That's why, you know, friendship for me is like when people get up and say, hello, friends, and then they talk.

Like, for me, that's always been a little bit of a trigger because I'm.

Melody Edwards:

Like, oh, it means nothing to me.

Curt Kempton:

Really?

Melody Edwards:

Well, because I do consider a lot of people to be my friends.

Curt Kempton:

Yeah.

Melody Edwards:

But I'm not like you. You really are straight up. You're like, nobody's my friend, except very few. I am pleased to know that I am your friend.

Curt Kempton:

Yeah.

Melody Edwards:

I had to work for that. I think you did.

Curt Kempton:

And honestly, everybody that has a friend, a real friend, has worked for it. And that's kind of actually the point I'm getting at right now too. So thank you for helping me not belabor it anymore: friendship is earned.

Not in money. Friendship is earned through hard work. And those servers are working really hard. But at the end of the day, it's just a financial issue.

Melody Edwards:

Yeah. No, and I think that's important to remember: they're not doing us a kindness by making AI available to us.

The people who own AI, they're getting a lot from this right now. And so, like I said, I'm not sure how long this low-cost, no-cost part of it will be around. I hope it lasts, but I don't think it's going to.

Curt Kempton:

Well, I don't know. Because think about that data stacking they're doing when a kid. And I say a kid because you just said the younger generations are using it as their therapist. That data that it got for free, to know where your vulnerabilities are and where your.

Melody Edwards:

Yeah.

Curt Kempton:

Where your different insecurities are and what your marketing preferences are. That data, they got for free, but it's super valuable. Valuable through the roof. And I don't actually want to end the podcast on this negative note.

So we've got to talk about some more stuff. But, okay, the Brazilian jiu-jitsu move here is: let me give you this tool for free. And then, we talked about ethics. How could OpenAI sell the data it knows about my insecurities or whatever to a product company? It's just a matter of time. Yeah, like, that information is hyper specific.

Melody Edwards:

You can opt out.

But I don't even believe that when you opt out. Just like with Facebook or anything, they always say, like, oh, you can opt out, you own your data, it's not recording you. We know that's a lie. And so, even though AI says you can do chats that are, what is it called? Temporary chats.

So that it doesn't stay. But I would say, for anybody who's listening, go into the settings and look at what you can actually do, to make sure that you understand.

Because some people at this point, especially young people, their data does not mean anything to them in the same way it does to us.

Like we're the last generation who experienced a different kind of life without all of this in it.

Curt Kempton:

Yeah.

Melody Edwards:

Well, let's end on a positive note, like you said. How could we do that? Actually, I'll tell you one good thing.

The thing I'm excited about with AI, I think, is the medical field, the medical work in being able to figure out genome stuff. I have a friend, for instance, who has a disease. She's very young, and she's going to die because she has an enlarged heart. And they've never been able to figure out what's wrong with her, even though they know some things.

Even at Harvard, they mapped out her DNA, and they could not find the inconsistency.

I hope it's going to become so much easier to find cures for cancer and to find cures for things that we are going through right now. On that level, that's exciting to me. And I hope that people... I mean, everything... it's a democracy, but it's a corporate democracy.

So everything becomes money eventually. I hope that it democratizes our health care a little bit. Or, I've used the word democratize a lot.

Curt Kempton:

You have.

And I think it's such an appropriate word because it, you know, even though we've talked about how these big companies hold in the vault where all this stuff is at, it is free flowing, true or not, information that, that we all get access to. And I think that's so cool.

Melody Edwards:

Yeah.

Curt Kempton:

And I love the word democratize in that sense. You talked about some of the positive things in healthcare. And my son, my youngest son, he really loves this YouTube channel by the way.

We're not making any money on any ads here, but I'm going to shout out.

Veritasium is a channel that my kid loves to watch and it's got physics and science and all sorts of cool stuff, deep dives on all sorts of interesting stuff.

One of the things that I was watching with him the other day is that apparently mapping out every protein possibility is a very important thing for the medical world. You know, the proteins are what they can use for just basically infinite things.

But building a map of the billions of possibilities of proteins, it's not just the connections of the molecules, it's actually the shape that they make. And so you can get a lot of shapes that aren't actually possible mapped as if they would work, and still be wrong.

So that makes mapping all the proteins an almost infinite impossibility. Yet scientists for years have said, this is just too important. We've got to do it. Well, AI comes around, and they're like, can we do this?

And it's been too long since I've seen it, but they made something like a thousand years of progress in, like, a month.

And with 89% accuracy of the shapes that could work, which is, you know, still pretty low. So they still have to go through and vet out what possibilities need to be vetted out.

Melody Edwards:

But what were the human accuracy, though? What was the human level of accuracy?

Curt Kempton:

Oh, I mean, very low. Like, 1%. Very low. And so it's just so cool to think, yeah, that I could get diagnosed with cancer.

I mean, it's not cool to think about this, but, you know, someone could get diagnosed with cancer and we could have trillions of answers just through a genome project, because they're doing the same thing with the brain genome as well. And to be able to say, we know this much more now, and we can act on it now. But then we have a whole new problem if people are living forever, which I.

Melody Edwards:

don't think... I don't think that we can. I think we could probably live till 120.

This is what I keep hearing people who are super into this stuff say: just wait another decade. You just have to stay alive another decade, and by then, like, the health stuff is just going to make your life go on much longer.

Curt Kempton:

But what I'd be much more interested in is not living longer, but living happier.

The amount of anxiety and depression that technology has brought into our world. If technology could somehow answer the question of anxiety and depression, that would be way more interesting to me.

Melody Edwards:

Yeah. Or just living healthier in your body as you age. You're literally dying every day after the age of 25 or something crazy.

I forget what the exact age is. Not to end on a depressing note, but we're dying, everybody.

Curt Kempton:

Well, that's... Welcome to the Soul Proprietor Podcast, where we depress you all the way.

Melody Edwards:

Yeah. Well, here's one cool thing. One more cool thing AI can do right now. I was thinking about this for my kid the other day.

So there's NotebookLM, which I believe is a Google product. Right.

NotebookLM can take anything that you want from the Internet, and you can put it all into this thing, and in two seconds it can create a podcast. Now, I was thinking, originally, when I heard about this last year and tried it, it made a podcast about virtual assistants. I'm like, that's cool.

There's two people talking, but it's like the same two people talking all the time. For my dyslexic son, or your dyslexic son who has to read a chapter of a book for, you know, maybe it's a book for school or something.

They can now have it read to them, or they can take notes of a project and learn from that. Because my son loves listening. His comprehension, I'm sure yours is too, is through the roof.

It's just the reading part that is, like, the barrier. And he can read. I can read too, but when I do, it's like skimming. So there's all these different tools that can help us, again, with democratizing.

I want another word. We need a different word now. Education.

Curt Kempton:

We can make it possible for everyone.

Melody Edwards:

To access education in a fair way where it's made for everybody. So I don't know, there's so many cool things that we can do with it. I feel like we didn't even hit half of the ethical issues.

I think there are, but it doesn't.

Curt Kempton:

Matter because, well, it would be impossible anyway. Let's just take some solace in this: there are an infinite number of ways to use AI, or really anything, badly. Right?

Like, I could use a rock to throw it at your face, or I could use it to build a house for you. So let's end on the note of you, Melody.

I know you know so much about AI, not just from the conference, but obviously you have a ton of personal experience as well. And I'd be curious if you could, like, sort of drop a nice knowledge bomb on the best approach you can imagine. Mindset-wise.

You know, the mindset for when I go into my AI.

Melody Edwards:

Yeah.

Curt Kempton:

This is the driving force that will sort of always lead me, so I'm not having to worry about the infinite ways I could get it wrong. But maybe just how you can sort of be guided to get it right.

Melody Edwards:

Yeah. So I think: stick to your voice using AI, and train it. And when I say training, I literally mean, open your settings.

Tell it how you want it to respond. Like, tell it about yourself, the things you actually want it to know, and what kind of tone of voice to use. Friendly. Like, I'm a friendly person.

I don't want professional. I want friendly, or whatever. But hone in on what you actually want it to be doing for you.

Give it guardrails first of all, and then keep your mind. I don't even know how to explain this, but you have to keep your mind open in a way that we are not used to.

We're used to asking our machine, our phone machine, for answers. We're used to, like, just quickly. And AI is not just an answer. It is a way to be creative in a different kind of way.

Even just the way you think you have to be. The people who are going to be most successful with this are the creative thinkers.

I would say that oftentimes those are the people who aren't using it as much as they could. They have that feeling of like, I don't want my creativity to be stifled. And I get that.

No, you just have to be mindful of your creativity and keep it, but use it to become more creative. You know, that's how I use it. And just always I.

What I like about it is that if my mind is open and something pops into my head, my original thought might have been, oh, I'll have to go figure this out myself. Now I'll be like, wait a minute, do I? My hope, as a recovering workaholic, is that we can learn to not have to do more all the time.

My hope would be that my team and I can work 30 hours a week instead of 40 hours a week, because AI is going to take up the slack. And I think it takes such a mindset shift for us to look at this tool and not think of more, more, more, and think maybe, instead, of less.

How can it help me do less? How can it help me do the things that are fun for me to think about? I can focus on those more and it can do the things that I don't like doing.

So I don't know. That's kind of how I look at it right now. And don't go after everything with it. Learn enough. Stay in your lane. There's so many shiny objects.

That includes new software every day, new AI models every day. Stick to the tried and true in the beginning, and just remember: right now, it's just creative conversations with the robot.

I mean, it is truly technology, but it really is creative conversations with the robot. That's how I like to say it, to make it sound less scary to people.

Curt Kempton:

Yeah, I like that. I. I took a couple things away from that, I'm not going to add my own to it.

I'm just going to say kind of what I took away from that, because I think you really made some things come alive for me.

If you can use AI to magnify yourself, to bring out more of what you're good at, and to enable you to do some of the things that you're not good at, without having to really dilute yourself. There are a bunch of different hows, and how you do things is something that you'll just have to figure out as you play around with it. Don't get all lost in the shiny objects, like you said.

But I think to, like, really just try and figure out a thing or two to help magnify yourself.

And then I think the other guardrail on the side, which was implied but not necessarily explicitly stated by you, but something I kind of took away from it, is: don't use AI as a way to turn yourself into somebody different, or to make something that isn't of you. I started by giving you some examples of how I could use it to manipulate people by pretending to be someone else.

I think that's probably the guardrail. Once you cross over that, you know you're on the other side. I'm using AI now not to magnify myself, but to hide, or to fake somebody else out. You know, I think in even simpler.

Melody Edwards:

Marketing messaging, we can see people on Facebook. I know you're not on it, really, but like, they are suddenly all authorities with the same kind of voice and it's because they.

They just let AI write it for them.

Curt Kempton:

Yeah.

Melody Edwards:

And you get lazy, and you are being somebody that you're not, and people find that stuff out. I want to just be me, because it's the easiest way to be. It's really a lot of energy to be anything else but me. And so, I don't know.

That's what I keep pushing into. And I hope that other people do too. Like, AI can help you, and it's a great tool and resource, but don't make it your voice or who you are.

Curt Kempton:

Yeah. Don't let the fake voice become your voice.

Melody Edwards:

Yes. Yeah, yeah, yeah, yeah.

Curt Kempton:

Or the manufactured one. I'll say the manufactured one.

Melody Edwards:

It is a manufactured voice. Yeah. It's not fake because you can make it sound very real.

Curt Kempton:

Yeah, yeah, exactly. So. Well, Melody, I really appreciate you sharing and bestowing a bunch of knowledge on me.

And I'm sure our listeners out there are all, like, super grateful too, because this is a topic that I think we'll feel differently about in two years. Probably.

Melody Edwards:

Probably six months. Kurt. It's so fast. Yeah.

Curt Kempton:

But at the end of the day, we all just want to be true to ourselves, and I think that's one reason people listen to this podcast: to, like, wrestle with yourself. Wrestle with who you are.

Make sure you're being the best version of who you can be and want to be. Make sure that you're intentionally walking toward where you want to go.

And AI is one of those tools that can really fast track you in any direction.

Melody Edwards:

Yeah.

Curt Kempton:

I mean, even the directions you don't want to go. But it's interesting how the human mind sometimes will trade what it really wants for what it wants right now. And I think that's the.

That's the constant wrestle. So we just got to make sure we use AI in a way that's. That's helping take us where we want to go.

Melody Edwards:

That's a great point to end on. Thank you for having me.

Curt Kempton:

No, thank you for being my co host. Oh, man, I am so lucky. And my friend.

Melody Edwards:

Oh, that makes me so happy. Thank you for being my friend.

Curt Kempton:

Thank you, Melody.

Melody Edwards:

See you next week.

Curt Kempton:

See you next week, guys and girls.
