Artwork for podcast AI Readiness Project
How to Trust the AI You've Built, with Sebastian Chedal
Episode 18 • 18th March 2026 • AI Readiness Project • Anne Murphy and Kyle Shannon
00:00:00 00:57:54


Shownotes

Sebastian Chedal has spent 25 years guiding organizations through digital transformation. These days, the question at the center of his work is one every executive is starting to ask: how do you minimize the risks of your AI system so you can actually trust it?

This week, Kyle and Anne dig into AI testing, implementation risk, and what it takes to build systems you can stand behind.


Sebastian is the CEO of Fountain City, where he leads AI transformation and digital implementation engagements, and co-founder of TestFox.ai, a platform built specifically for AI testing. Over his career he has guided organizations including Heineken, Sony, Nike, and MITCO Tires through technology strategies that produce measurable outcomes.


If you're an executive or team lead who's been handed an AI initiative and told to figure it out, this conversation gives you a grounded way to think about risk, reliability, and what responsible AI implementation looks like in practice.


𝗬𝗼𝘂𝗿 𝗥𝗲𝗮𝗱𝗶𝗻𝗲𝘀𝘀 𝗥𝗼𝗮𝗱𝗺𝗮𝗽:


• How to evaluate whether your AI system is ready to trust before it touches your operations

• How to approach AI testing so you're not discovering problems after launch

• How to reduce liability exposure without slowing down your AI adoption

• How to distinguish between AI that sounds promising and AI that performs

• How to build a case for leadership when the question is "how do we know this is safe?"


𝗧𝗵𝗲 𝗧𝗼𝗼𝗹𝗸𝗶𝘁:


• 𝗙𝗼𝘂𝗻𝘁𝗮𝗶𝗻 𝗖𝗶𝘁𝘆: https://fountaincity.tech/

• 𝗔𝗜 𝗧𝗲𝘀𝘁𝗶𝗻𝗴 𝗣𝗹𝗮𝘁𝗳𝗼𝗿𝗺: https://testfox.ai/

• 𝗦𝗵𝗲 𝗟𝗲𝗮𝗱𝘀 𝗔𝗜: https://she-leads-ai.mn.co/

• 𝗔𝗜 𝗦𝗮𝗹𝗼𝗻: https://aisalon.mn.co/

• 𝗧𝗵𝗲 𝗔𝗜 𝗥𝗲𝗮𝗱𝗶𝗻𝗲𝘀𝘀 𝗣𝗿𝗼𝗴𝗿𝗮𝗺: https://ruready4ai.com/


𝗟𝗶𝗸𝗲, 𝗦𝗵𝗮𝗿𝗲, 𝗦𝘂𝗯𝘀𝗰𝗿𝗶𝗯𝗲


Catch a new episode of The AI Readiness Project every Wednesday at 3pm (PST), co-hosted by Anne Murphy of She Leads AI and Kyle Shannon of The AI Salon. Want to meet others navigating this new terrain with humor and humanity? Visit The AI Salon or She Leads AI to find your people.

Transcripts

How to Trust the AI You've Built, with Sebastian Chedal

Announcer:

Growing smarter and leading with what makes us human.

Kyle Shannon: Oh, yeah. There we are.

Anne Murphy: Did you miss me last week, wa

Kyle Shannon: I totally, I totally missed you last week.

Anne Murphy: I missed you guys. Last week you had Chef Kelly.

at. She got great energy and [:

Anne Murphy: Ooh, am I having a delay again? Am I

Kyle Shannon: Get a moment. Get a moment.

Anne Murphy: Yeah, yeah,

Kyle Shannon: yeah. Hmm. We're gonna, we're gonna need to get you friends

Anne Murphy: and.

Kyle Shannon: Yeah, we're gonna need to get you a Claude bot agent to manage.

Anne Murphy: Manage. I know I have, I have screaming fast wifi. It is actually plugged into the thing. Um, brand new computer.

This is just the way it goes, folks.

Kyle Shannon: The way it goes.

Anne Murphy: Listen, sometimes things don't work the way they're supposed to.

Kyle Shannon: This is the world of technology. Although you're going good now we're going good. Uh, well, we work.

Anne Murphy: All right.

you, um, in the world of ai.[:

Anne Murphy: Um. Sebastian's saying maybe turn on and off the router. It's possible we did not do that. Let me text, uh, how, what have I'm good. Um, spent last week at a conference in Detroit called the Nonprofit Technology Conference. It's, um, a lot, a lot, a lot of, you know, medium-sized technology companies. Um. Really good conversations.

It's not all devoted to AI by any means, but we also had a side conference for just the AI nerds, and that was really fun.

Kyle Shannon: Did

Anne Murphy: you work my friend Ben, uh, Childers? No. One of my, one of my, uh, colleagues is a guy named Ben Childers and he runs a, a gambit called Cool/Scary AI.

they're doing these, they do [:

Kyle Shannon: Oh, oh wow. That's nice. So it's

Anne Murphy: really cool.

Kyle Shannon: A hundred by:

Anne Murphy: yeah.

Kyle Shannon: And you spoke. You spoke

Anne Murphy: anyway. I did. Did I? Did we, so I've been working on this piece of AI adoption as resistance, and it was that, you know, I have so many examples, obviously from all of the people who you and I work with who have used AI in their own lives to make change, whether it's for themselves or an elderly parent or a kiddo or the school district.

f legislation and then like. [:

It's like we're actually taking charge and deciding how it's gonna be used and making our own decisions about how we're gonna use it. And you know, there's a fair amount of like spite and defiance and you know. You know, not liking the man that has really bubbled up with all the people who I've met in the AI world, like we're rebels, you know?

Yeah. And sharing some of those stories, um, is, has really been energizing to me. And I think people are surprised. They're 10, they're, they're expecting that people who are adopting AI right now are just like sheeple, right? We're just doing. We're just going with the flow. We're doing what we're supposed to do.

ou know, we are not critical [:

Kyle Shannon: Yeah. So, yeah, there's, there's, I know. Yeah, I noticed that with the, with the Doomers that sometimes there, there, there's just, they paint it with a blanket and they assume that if you're in ai, that you must be in one of these companies, you know, forcing this down our throats and, and we're like, no, there's actually people out here just trying to use it, you know, just trying to get along.

And, uh, I, I think that your point, Anne, about, um. That, that it's on us to figure out what we want to do. That, that to me is, is the thing that has been the most on my mind lately that, that I. The acceleration at this point is getting like terrifyingly fast and I like, I can't even, I can't even process the news.

versions of open Claude and [:

And it all goes back to what we've been talking about since we started this, which is no, you've, you've gotta understand who you are, what you value, who you care about, the difference you're trying to make in the world. And then start to look at, you know, what tools may support that.

a bunch of people about our [:

On me that we, we talk a lot about how what's gonna matter with AI and why we are awfully well positioned is because of the human centeredness. It's because of the taste and the discernment and the wisdom. And I realized that there's a little something wrong with this, which it, this, which is taste and style and culture has historically never come from 50 plus year olds, white people.

I bring to the table is the. [:

Like I'm on the other side of that part of my, my career. Maybe I don't need that. Maybe my job is to clear the path for the person who has the good taste and the person who has the, you know, the energy and the zeitgeist. I don't know. Mm-hmm.

Kyle Shannon: I, well, so here's where maybe it's just a reframing of it. Like I, I think if what you're saying is

Sebastian Chedal: yeah,

Kyle Shannon: new and innovative taste that, sure, maybe that comes from, from a younger generation, but, but what you just described is an incredibly strong point of view.

And which, which is, which is a, it is having an opinion. It is taste making to a great degree that not giving a damn and being willing to say, Hey, that output of AI is crap, and that one's really cool. And having the world experience to be able to discern those two actually is taste making. It's just maybe not in the definition of.

efining taste making. But I, [:

Yeah. I think it's the thing that you're talking about, right? At what if this feels worth putting time into That feels really important. Um, and then just being, you know, just

Anne Murphy: yeah.

Kyle Shannon: Being, being willing to go through. I, I, I hate to say it, but I think it's, it, it's, it's really about to happen at scale is the ego death of.

don't give a damn. I'm gonna [:

So, um, so I think it's just we get to process the change maybe a little earlier. So. Um, yeah.

Anne Murphy: Yep. Yep.

Kyle Shannon: Mm-hmm. Yeah.

Anne Murphy: You know,

Kyle Shannon: well, yeah.

Anne Murphy: I mean,

Kyle Shannon: I know it's crazy.

Anne Murphy: It's, it's, it's constantly evolving.

Kyle Shannon: Mm-hmm.

Anne Murphy: But ego death, the ego death, the ego death piece does you a lot of favors. Like, just do that. Just get over yourself post haste, because on the other side of that is what you can really bring to this.

h is to hurry up and grieve. [:

Kyle Shannon: Yeah, I'm still in the middle of it. Adam, Adam has me all, all twisted in knots now. My, uh, my, my Claude bot, AI bot, because I haven't been.

Anne Murphy: Okay. Tell us about, yeah, let's talk about Adam.

Kyle Shannon: All right, so here's what's going on with Adam, Adam. One of the ego deaths that I am, I am going through right now is, um, I have historically been really good at being able to jump in, learn something quick, get my head around it, and, and just be able to go and go play.

hour stints in a row [:

He's trying to do things, but he is just like, I can't find that node. I don't know what that is. This service that you uninstalled, I still think I have it installed. So just stuff doesn't work. So I'm sort of in the middle of, of having a, uh. A very dysfunctional Claude bot child who's not doing very much for me, but I'm really excited to bring, when we bring Sebastian up, uh, we were talking a little bit before the show and, and he's been putting things out in the world and, and he's, he's got, um, their OpenClaw stuff really dialed in.

drooling on the steps of the [:

Anne Murphy: Ah,

Kyle Shannon: ah, and we're kind of expecting him to be a fully formed adult, and he is just kind of a toddler sitting there.

Yeah, a little drool, right? A little lollipop, you know, stain on one side, drool on the other.

Anne Murphy: Yeah. Yeah, I understand. I understand. I, so when you, of course, of course, when you, when Adam, when Adam came into the picture, um, I still hadn't, I still wasn't sure what I was gonna do about. All the things, you know, I was mm-hmm.

I was, I was needing to move off of Manness because of meta, and then I needed to move off of open AI because of open ai. And then I didn't wanna get into Claude Code, but then I did, I wanted, anyway, I was so jealous about Adam, so I was like, finally, finally, finally, I'm gonna make my person, I'm gonna make, I'm gonna make her.

hours. [:

Kyle Shannon: well, how is

Anne Murphy: she, Sheila? Sheila, Sheila. Oh,

Kyle Shannon: Sheila. Sheila, okay. Nice.

Anne Murphy: Sheila. Because as in, in she, you met Sheila as in she leads,

Kyle Shannon: ah, see, it's good. It's good

Anne Murphy: Sheila, but she was gonna be named Sheila anyway, because I had a car that I wanted to name Sheila and I forgot.

So now, because Sheila, I mean Sheila's like your go-to Sheila's. Solid.

Kyle Shannon: I like

Anne Murphy: it. Sheila's tcb. And.

Kyle Shannon: And how is

Anne Murphy: she and Sheila, what I, well, she's wonderful and what I did for the two of us is we each have a little bit of dirt on each other. So we are beholden to one another. You know, we've, we're, we're, we're intertangled because she has dirt on me and I have dirt on her.

: That's good. That's solid. [:

Anne Murphy: Right? Good. So there's this tension with us and Yeah. Right. And so, so far, so good. Yeah. I built the whole AI executive leadership team, they're maniacs. Yeah.

Kyle Shannon: Yeah, that. Oh, that's great. Yeah, I've got Adam Adam's been failing so much that I had him write one of his articles. He writes articles on LinkedIn talking about what it's like to work for a human and, and he basically said, listen, you know, when I fail, all I'm doing is I'm checking things and like I just look at my logs and they add, they're fail or not.

But Kyle apparently has got some real emotional issues that he's gotta work through. Adam basically called me out. He goes, he goes, this is, this is heavy for Kyle because that's how he's created it. I like, I feel like I'm in a coaching conversation with my, with my chat bot. He's, he's calling me out.

Announcer: Oh, wow.

azy as it is or as brilliant [:

How do you think about what you want them to do? Yeah. That is, that's gonna be an absolute skill.

Anne Murphy: Yeah. Yep. I absolutely, I absolutely believe so. I did not, I missed that episode or that, that issue with Adam. I didn't realize he was diagnosing you, your emotional situation already. Yeah. He's also the psychiatrist, the latest psychologist.

Kyle Shannon: Yeah, the, the latest almost Adam is, is really about him, him calling me out for, uh, for, for, you know, being an emotional human. He's just trying to do his job, but it's really on me how emotional I'm getting.

It's crazy.

Anne Murphy: Wow. Wow.

m and Okay. And I think he's [:

Anne Murphy: leads, I know she leads, gosh, we, um, so She Leads AI is an ecosystem. And within that we have an AI academy. We have an AI, uh, uh, membership community. Um, we have a community of practice as well. And like right now we're tackling AI governance in a community of practice, which has been an opportunity for us to explore themes like, um, uh, uh, AI bias and maternal health in Black women or, um, data centers and the environment in small towns, right? So yes, we use AI to advance, you know, our careers and our businesses and our clients' businesses, et cetera. But we're not just, uh, messing around here. We understand the, the grand scheme of it, and it's just really nice to have other women.

To [:

Last week somebody dropped a piece of information about, uh, how to, she, she created an app, if I have this right, to go through the process of getting some type of a certification. Mm-hmm. Some kind of a religious certification of some kind that some people are interested in, should they need to avoid a draft.

Announcer: Hmm.

hat you really need to know, [:

Kyle Shannon: cool.

Anne Murphy: Kinda like that.

Kyle Shannon: Wow.

Anne Murphy: It's kinda like something interesting is gonna fall outta somebody's mouth. And it might not be about ai. It might be about ai, but it's a good place to be.

Kyle Shannon: That's so good. It's so good. And, and so many, so many women that I run into that have spent time in, in She Leads are just, you know, they're like, saved me. It pulled me out of whatever. So. So let me tell you a little bit about the AI Salon. So the AI Salon's been around since the week ChatGPT dropped. Um, we, we launched that week.

Um. Uh, it seemed clear to me that, that this AI thing might be big and might not be going away. Um, and so, so if you're curious about ai, if you're trying to figure AI out, if you're trying to be mindful about ai, um, and mindful about yourself and how you use ai, um, that's really the culture of the AI salon.

ly practice around ai. And a [:

And so that's where we've been shifting. The focus of the salon is really, really getting down to who are you, what do you value, who do you care about, um, and what change do you wanna make in the world? And then use AI to amplify the crap out of that. Right. That's the

Anne Murphy: yeah,

Kyle Shannon: that's the win, I think.

Anne Murphy: Yes.

Kyle Shannon: So that's it.

So come join us.

Anne Murphy: Love it.

Kyle Shannon: Beautiful.

Anne Murphy: Join.

Kyle Shannon: Well, let's, let's, let's bring up, uh, Sebastian, uh, Chedal, and I'm so excited to have you here, Sebastian. Welcome, welcome, welcome. See you sir.

Sebastian Chedal: Hi there. Thank you. Thanks for having me.

l us what you're up to, what [:

Uh, also, maybe I'd love to hear a little bit about your background. How did you get to where you are? Right.

Sebastian Chedal: Yeah, so my name is Sebastian Chedal and I'm the owner of Fountain City, Fountain City Tech, and also TestFox.ai. Oh, my video went away. There. It's back. There we go. And um, I have been, I've been in tech and also, I'm, I'm kind of a polymath person, so I've been in tech, but also in creative disciplines and audio visual.

For a long time, and I've done lots of project management, information architecture, coding. I was really big into Flash ActionScript before that went away. That was, uh, I was, I loved that platform because it was kind of a hybrid of everything. Uh, in terms of how I've, well, I'm, I'm definitely self-made. Um, you know, I've been working either for myself or running this business for a long time.

. And, uh, but I [:

Kyle Shannon: intro. Introduce yourself. Like, like, you know,

Sebastian Chedal: yeah.

Kyle Shannon: Where, where are you now? What do you, what do you want people to know about what you're up to these days?

Sebastian Chedal: So we've gone really full on into AI. It's always been something we've been interested in, but, um, '24 is when we completely, a hundred percent, went AI first. And now we're, I mean this, everything's been crazy. But this year in particular has been multiple breakthroughs, so,

Kyle Shannon: Hmm.

Sebastian Chedal: Uh, we're very, like one breakthrough for us this year was just full agent coding.

's more like vibe coding. So [:

And then the more recent thing, which you started talking about a bit before you brought me on, is these autonomous AI systems, or OpenClaw, as it's commonly known. And that has been way beyond what I thought it was gonna be in terms of how impressive it's been, so,

Anne Murphy: mm-hmm.

Sebastian Chedal: Uh, really high quality output.

It's like, yeah, I can, I, I know I'm just coming on and just spewing hype, but I've been personally really impressed, even though we went through a whole year of hype deflation, so, yeah. Yeah. Yeah. I feel like there's, there's a major breakthrough that just happened,

Anne Murphy: so Yeah. I would love to hear you talk about it.

No rush, no press, but that would be very interesting to myself, and I'm sure Kyle and our audience.

Kyle Shannon: Yeah, for sure. Yeah,

Sebastian Chedal: so go ahead s

Kyle Shannon: before you jump into the, the Claude Bot thing. Can you just talk, you mentioned before we got on that, that January was essentially the agentic coding thing in February.

Mm-hmm. [:

Transition issues with, now they're not coding, now they're managing agents or whatever it is. Like, can you just talk about that phase first and then jump into the, the, the mm-hmm. The OpenClaw stuff?

Sebastian Chedal: Yeah. I mean, the conversation, that conversation is a really big one. It's a very different one. It's about what are people's skill levels.

You talked about ego death, so like where, where do you place value in yourself and in your work?

Kyle Shannon: Right.

ies even though we're not in [:

Kyle Shannon: Mm-hmm.

Sebastian Chedal: Um, which is, it's nice when you have a peer Right. That you become really close with. Yeah. But I've, I, so I've seen his whole team kind of. Be forced into agentic coding actually, because one of their biggest clients started doing it themselves. Ah. And they were, they were immediately, they, they knew they wanted to move into it.

And then they got hyper pressured, um, only three weeks ago where like, at first it was just the owner of the business who was doing agentic coding to see how he was gonna roll it out. And then he had to just like rapidly get everyone onto it because

Kyle Shannon: Wow.

Sebastian Chedal: It's, uh, I mean, you know, to quote him, but also for us too, it's like you're getting much faster output, higher quality, fewer bugs for a fraction of the price.

, whenever new technologies, [:

I can talk a lot about getting quality. Quality's the hard part. Speed and cheap are easy, which, you know, leads to AI slop and all these hallucinations and all this stuff that Yeah. You know, makes us want to throw it away and go back to doing things by hand again. But in terms of the transition of it, I think it has also to do a little bit with people's personality.

Kyle Shannon: Yeah.

Sebastian Chedal: Like, are you a maker or are you an orchestrator? So I'm naturally more of an orchestrator being more of a polymath kind of person, you know? With my diverse background. So for me, it's just amazing. I'm just shrinking the distance between idea and realization and it's becoming, it's, that gap is just shrinking really fast.

or tried AI out three months [:

And let me post on social media the way it's failing. But you have to, you have to. When it fails, uh, it's really, well, whenever we build systems, that learning loop is really important. So you wanna see where it's failing. And then well, we're building in self-learning loops. So the AI sees its own failures, comes up with its own solutions, and then.

Keeps improving itself. That's the ultimate. But if you wanna have control over that, you know, you need to, you need to give feedback to these systems. Mm-hmm. So that they learn, uh, just like a person would. So, um, but yeah, so that, I mean, it actually started more in December was really when we were,
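The self-learning loop Sebastian describes, failures logged, fixes proposed as "learning orders," and a human approving each one before anything changes, can be sketched roughly like this (all class and field names are illustrative, not from his actual system):

```python
from dataclasses import dataclass, field

@dataclass
class LearningOrder:
    # A proposed improvement, queued for human review before it is applied.
    description: str
    approved: bool = False

@dataclass
class Agent:
    name: str
    failures: list = field(default_factory=list)
    learning_orders: list = field(default_factory=list)

    def record_failure(self, detail: str) -> None:
        # Log the failure, then propose a fix as a learning order.
        self.failures.append(detail)
        self.learning_orders.append(LearningOrder(f"Address: {detail}"))

    def apply_approved(self) -> int:
        # Only human-approved orders change behavior; return how many applied.
        applied = [o for o in self.learning_orders if o.approved]
        self.learning_orders = [o for o in self.learning_orders if not o.approved]
        return len(applied)

agent = Agent("scott")
agent.record_failure("dead citation link in draft")
agent.learning_orders[0].approved = True   # the human review step
print(agent.apply_approved())  # → 1
```

The point of the sketch is the gate: the agent can propose its own improvements, but nothing applies until a person flips `approved`.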

Kyle Shannon: that was, was, was it Opus 4.6?

Was that the one that shifted everything?

got it? Actually, yeah. That [:

Kyle Shannon: Yeah.

Sebastian Chedal: Took the boulder and threw right over the hill and or the mountain, and we just started running after it, instead of trying to push it over the top and Wow.

That's when we realized that we could make a, so we took on a project, a hydraulic simulation in code for like 3D topology, and we took that project and decided, okay, we're gonna force ourselves to not write any code by hand and just use AI to do it. And it was kind of painful, honestly, the first few weeks figuring that out.

But not only did we hit a, we figured out ourselves how to approach it correctly, which I can talk about if you want to get into that. And then the tools, the market and the broad knowledge, and then other Git frameworks that you can use to start from all kind of converged on the same direction of conclusion.

's just. It's not even, it's [:

Kyle Shannon: It's streamlining your workflow and inventing new workflow. I am curious about, and, and Anne, jump in if you've got something I just, I, I just wanna

Anne Murphy: No, go.

Go to that. Go for it.

Kyle Shannon: Okay, cool. Um, I'm just curious, and, you know, you don't have to say names, but just in general, have you seen shifts in. Because you said, are you a maker or are you an orchestrator? Mm-hmm. Have you seen shifts in your business where you had someone that was like a really valuable maker, a very val, valuable part of the team, that when this shift happened that, you know, maybe they, you know, they shifted down and someone who was more of an or orchestrator shifted into a more powerful position?

Like, have you seen any sort of shifts in value to you as a business owner from just how people are like, like what? What's that been like?

people well, so because this [:

Yeah. So what we've been doing is a more, um. I, I'm trying to think of the right analogy, but it's like a distributed, uneven kind of application.

Kyle Shannon: Yeah,

Sebastian Chedal: so all new projects, all new clients that come in full agentic. And I mean, the, the, some of the crazy things that are happening is we're not, we're scaling in projects, but we're not scaling in people right now, so.

Mm-hmm. Like we have, you're taking

Kyle Shannon: more work

, my focus has been on. [:

Right. That was kind of my approach. 'cause I'm trying to build the business, not build, not be in the business, but then everything kind. It's just like you have to, I've been having to rethink my approach with. Business and work right now because I've become personally also really good at these systems.

And so like in the big, um, while we're talking, I have AI systems right now coding things for clients in the background. So like I am running these systems now too.

Anne Murphy: Yeah.

Sebastian Chedal: And so. I don't even know how fast we're gonna have to hire people right now. Like I think the, our, our business is gonna probably wanna stay leaner and just keep adding more and more agentic systems around a core team, but then making sure that core team is really good at orchestrating.

, you know, things can still [:

Mm-hmm.

Kyle Shannon: Yeah.

Sebastian Chedal: So it's amazing. Yeah. So, so yeah, that, that whole, but as I, okay, so going back to what I'm doing right now. So then as things though, eventually I'll hit a break point, and I also don't wanna necessarily be doing the hands-on work of managing the agent. And so I will want to expand the team out so that I can delegate that responsibility.

t pool is gonna be extremely [:

Kyle Shannon: Yeah.

Sebastian Chedal: Yeah. You know how many people out there have? Well, once I tell you what these agents that we're building are doing, like I don't. I haven't seen anyone who's doing what we're doing right now, so, yeah. I don't even know who I would hire. I have to train them from zero.

Yeah.

Kyle Shannon: Talk. So, okay. So, so you've got, you've got, um,

Sebastian Chedal: I'm sorry, I'm, I feel like I'm hyping myself a lot.

I'm usually pretty humble as a person, so,

Kyle Shannon: well, no, I think, listen, I think that we're in a, in a, in a. Really fascinating inflection point here. And I think, I think December seems to be when, when the models got good enough that you could do true, true coding. Mm-hmm. And then open clock came fast on the heels, so why don't you, yeah, just take us through that transition.

I mean, this just happened we're we're mid March, right? So Yeah,

pen claw functionality using [:

So, but what this one, uh, so in the last few weeks, you know this, so we did a small pilot. I did a small one first just to make sure that it was working, and then I was impressed with that and I thought, all right, let's throw it on our business and let's build some roles there. And if they're good, we can.

either replicate those agents for clients or build new, new jobs based on that kind of a model framework. So we started first with an SEO/GEO Marketing Research agent. And what we've built to this point, the agent does the following jobs. So the agent goes off and tracks first, you know, Google Search Console, Google Analytics.

[:

It then creates learning orders that I can review. I'm still reviewing them all right now for at least another month or two before I get hands off on that process. But it creates learning orders for me to tell me how it wants to improve. It wants a new tool, it wants to change some of its, uh, of the techniques for how you're writing articles or, you know, whatever it might be, that it's learning from these things and it's.

cations of any topic that is [:

I could be generating as many briefs a day as I want, but I have it set to two, so that's more than enough. And those two briefs are either optimization on existing pages, like adding cross links, or maybe there's, um, a page that's not optimized correctly for SEO. So it goes through and suggests those, or it's a brand new brief that needs to be written.

And so those all get queued and that's, um, that's kind of the end of that first agent, the research agent. And then it moves on to our writing agent. And then that one, its first job is to look at all of the links that were referenced or statistics, and it fact checks everything. So it goes off and it checks all the links.

te to make sure it finds the [:

Kyle Shannon: Mm-hmm.
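The fact-checking pass described here, verify every cited link and flag dead ones so a replacement citation can be found, might look like this minimal offline sketch (the `fetch` callback stands in for a real HTTP check; all names are hypothetical, not from the actual agent):

```python
def check_links(urls, fetch):
    # fetch(url) -> HTTP status code, raising on network failure.
    # Broken links go back to the research step for a replacement citation.
    ok, broken = [], []
    for url in urls:
        try:
            status = fetch(url)
        except Exception:
            status = None
        (ok if status is not None and status < 400 else broken).append(url)
    return ok, broken

# Toy fetcher so the sketch runs offline:
statuses = {"https://example.com/a": 200, "https://example.com/gone": 404}
fetch = lambda url: statuses[url]  # raises KeyError for unknown URLs
print(check_links(["https://example.com/a", "https://example.com/gone"], fetch))
# → (['https://example.com/a'], ['https://example.com/gone'])
```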

Sebastian Chedal: Then it, it knows, uh, my tone of voice. So it does a full writing pass in my tone of voice. Then it does a self-reflection to see how close or far it got from hitting the tone of voice and a list of things it's gonna correct.

Then it goes to the next step and it rewrites the whole thing again. Then it goes on to the art director. The art director figures out what images it's gonna pull. If I, um, it might pull also graphs or images or things from. Uh, like a video that I've done about that subject or whatever I might have had, so that it has also, not just images it's making, but things that are materials and it lays them all out on the page.
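The draft, self-reflection, rewrite sequence Sebastian outlines reduces to a simple two-pass loop (the three callables stand in for model calls; everything here is an illustrative sketch, not his pipeline code):

```python
def write_with_reflection(draft_fn, critique_fn, rewrite_fn, brief):
    # Pass 1: draft. Pass 2: self-critique against the tone-of-voice guide.
    # Pass 3: rewrite the whole thing using the critique notes.
    draft = draft_fn(brief)
    notes = critique_fn(draft)
    return rewrite_fn(draft, notes) if notes else draft

# Toy stand-ins so the sketch runs without a model behind it:
draft_fn = lambda brief: f"DRAFT about {brief}"
critique_fn = lambda text: ["too formal"] if "DRAFT" in text else []
rewrite_fn = lambda text, notes: text.replace("DRAFT", "article") + f" (fixed: {', '.join(notes)})"

print(write_with_reflection(draft_fn, critique_fn, rewrite_fn, "SEO"))
# → article about SEO (fixed: too formal)
```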

knows that it's a low risk. [:

If it's a medium or higher risk, that's, you know, how we gate it. So it has rules to know when it's that risk level. Then it puts it in in draft status for me to look at it, and then I just read through it and then hit approve or apply and then from there it goes on. There's another agent that picks it up and starts doing social media broadcast to let people know about the post that was just published.
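The risk gating he describes, low-risk posts publish automatically while anything riskier waits in draft for human approval, comes down to a small routing rule (the policy set and field names are assumptions for the sketch, not his actual rules):

```python
AUTO_PUBLISH_RISKS = {"low"}  # assumed policy: only low-risk posts skip review

def route_post(post: dict) -> str:
    # Gate a finished post by its assessed risk level: low risk publishes
    # (and the social broadcast agent picks it up); anything else is held
    # in draft status for a human to approve.
    if post["risk"] in AUTO_PUBLISH_RISKS:
        post["status"] = "published"
        return "published + social broadcast queued"
    post["status"] = "draft"
    return "held in draft for human approval"

print(route_post({"title": "Cross-link pass", "risk": "low"}))
# → published + social broadcast queued
print(route_post({"title": "New statistics post", "risk": "medium"}))
# → held in draft for human approval
```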

So it's a full end-to-end, complete market observability strategy around SEO/GEO for our target customer. Oh, and then there's another agent, sorry, one more, uh, Kai, I call it, that does the conversion rate optimization that looks at the life cycle of a customer and then looks at the whole site holistically to make sure that we're saying the right messaging on the right pages.

you'll have it on like every [:

So it's other agents looking holistically looking at conversion rates. So it's looking more at, uh, you know, your analytics. And that one's also doing work orders that go to the writer for things to modify and edit. So it's like a full, yeah, it's like a full pipeline. And then. One of the clients just about to ask us to do one that's similar but has more of an outbound lead, um, generation capability.

And then, uh, another client's interested in this ecosystem. And then there's, uh, another upcoming, uh, agent we're gonna make probably in a month or so, unless a client wants it sooner, which is for ad campaign management generation. And then the same kind of concept, but then coming up with ad concepts, building the ads.

nts to fix those problems in [:

Kyle Shannon: That's a trip.

Anne Murphy: That's fun. Yeah.

Sebastian Chedal: Yeah. So we're publishing like one to two blog posts a day now. We used to do one a week. If we were not heavily distracted by clients.

Anne Murphy: Mm-hmm.

Sebastian Chedal: So that's been a game changer for us. And, um, and for, you know, and for me it's important that the things that are written are, you know, don't sound like AI slop.

Yeah. Well written. And I read the things that it's writing and I've spent, the thing is this is all, a lot of time I've put in.

Kyle Shannon: Yeah,

Sebastian Chedal: me and the team, we've probably put in, I don't even know, maybe 200 hours or something like that in the last Yeah,

Anne Murphy: sure. Yeah, yeah, yeah.

Sebastian Chedal: Month or something like that. And so the qual, you know, and quality, like I said, is really important to us.

k they're written as good as [:

The AI system. Um, but yeah, if you just say, write me a good article, you're just gonna get, you know, the most bad. Uh, output with a, a lot of em dashes and, uh, this is what it really means. I don't know all the phrases that all,

Kyle Shannon: all

Sebastian Chedal: the, all the, you now know. Yeah.

Anne Murphy: Sebastian, oh,

Kyle Shannon: go ahead. Go ahead Ann.

Anne Murphy: I, I am just, I'd love to hear a little bit more about the kind of learning orders that you're getting.

Mm-hmm. What are your instances asking for?

Sebastian Chedal: Um, I can look at my screen here and pick a few out.

I could try sharing my screen too if, if that works.

Anne Murphy: Yeah,

Kyle Shannon: that works

Anne Murphy: totally.

Sebastian Chedal: Okay. Let me click the plus. No, not the plus. Oh, yes, it is the plus. Yeah. Share screen.

o? Just make sure you share. [:

Sebastian Chedal: Yeah. I don't think I need audio. Okay. Um.

All right. I think I have it.

Kyle Shannon: Yep. There it is.

Sebastian Chedal: See, did that work?

Announcer: Mm-hmm.

Sebastian Chedal: Okay.

Kyle Shannon: It did.

Sebastian Chedal: So this is Scott with his research. So these are the, you know, he's following 10 feeds right now. He's read 166 articles in the last, I think this part of the functionality has only been up two weeks or something like that.

So it's pretty new. I mean, everything's new, right? It's all very new. Mm-hmm. So these are pending reviews. So this one, you know, it ranks it as high impact, medium effort. It's about content strategy. So this is about a fan-out strategy for case studies based on how AI, what is AI. Well, I haven't necessarily read these all in depth yet, but, so I'm kind of reading it.

elieve like how the other AI […]

Kyle Shannon: Hmm.

Sebastian Chedal: And then once that has been done, I can hit approve and then it will apply it. So I can, if I scroll to the bottom of this, I can show you a few that I've already applied. Yeah. But you know, here's another content strategy. Here's a tooling one, you know. So, let's see. Uh,

Kyle Shannon: and this, this dashboard is something that, that your OpenClaw agents,

Sebastian Chedal: we built this, uh, this is Claude Code.

ring over the, the OpenClaw […]

It's a lot more effective to not talk too much to these autonomous agents. Hmm. Instead, it's much more workflow-driven, right? Mm-hmm. It's like, it's like I am part of a process as well, and so I have a job in the pipeline here. You know, I can look at content briefs that are ready to review.

so I can talk to Scott if I […]

Like, there's, um, there's commands that are defined that I can use in chat. If there's a hot topic I want Scott to write about, which I just gave him yesterday with the new, uh, NeMo Claw from NVIDIA.

Announcer: Mm-hmm.

Sebastian Chedal: Which is, I don't know if you've heard of it, but it's their kind of wrapper to make OpenClaw more secure.

So I said to Scott, hey, can we write about this topic? And so he went off. Where did it go? Uh, and I pushed it to the top 'cause I thought it was important. So I think it was either this morning or yesterday, I approved it. And then it's, uh, it's currently in the process of being written right now, so it'll probably come out later today, that article on that one.

t's pulling from. So I could […]

Announcer: Mm-hmm.

Sebastian Chedal: And then these are the ones I've implemented. So, um, this one was just more of a GEO technique.

This one was a tooling recommendation. He wanted to get a new tool for checking AI occurrences across different AI engines, and he found that LLM Refs, um, was the best one for him. So it has a good API and it's also the cheapest, like 79 bucks a month. So, um, so I approved it and we implemented it. So we have that one now too.

Um, this was an additional technique for searching Reddit that is effective. It's kind of tooling, but it's not really, it's more just like a process, I guess.

Kyle Shannon: Hmm.

t if it finds a new learning […]

The AI checks against the reject reason whether the new learning should be presented to me, or whether it would be rejected for the same reason. Yeah. So these kind of little things are also helpful to make sure that the, um, the system knows how to handle that. Um, it's really like hiring a person, you know. Or what I say when people talk to me right now about building these, it's like, well, what's your job description?
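None of this is shown as code on the call, but the gating step Sebastian describes, checking a new learning against past rejection reasons before surfacing it, could be sketched roughly like this. Everything here (the class name, the crude word-overlap similarity) is hypothetical; a production version would more likely ask an LLM or compare embeddings:

```python
from dataclasses import dataclass, field

@dataclass
class RejectionMemory:
    """Remembers why past learnings were rejected, so that similar new
    learnings can be filtered out before they reach the human reviewer."""
    rejected: list = field(default_factory=list)  # (learning, reason) pairs

    def reject(self, learning: str, reason: str) -> None:
        self.rejected.append((learning, reason))

    def should_present(self, learning: str, threshold: float = 0.5) -> bool:
        """Present a new learning only if it doesn't resemble something
        already rejected. Similarity here is Jaccard word overlap, purely
        as a stand-in for a smarter check."""
        new_words = set(learning.lower().split())
        for old, _reason in self.rejected:
            old_words = set(old.lower().split())
            overlap = len(new_words & old_words) / max(len(new_words | old_words), 1)
            if overlap >= threshold:
                return False  # would likely be rejected for the same reason
        return True

memory = RejectionMemory()
memory.reject("add more em dashes for emphasis", "sounds like AI slop")
print(memory.should_present("add more em dashes for drama"))     # → False
print(memory.should_present("interview a customer each month"))  # → True
```

The point of the design is that a human "reject" click becomes training signal for the pipeline itself, without any model fine-tuning.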

That's. Uh, let me stop the screen for now. Um, and so I ask for a job description, and then the most important things for me to know are the inputs, the process, if the person cares, and then the outputs, you know. 'Cause one person right now just wants the SEO agent, the, the Scott one I was just showing.

hat's that output? And then. […]
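Sebastian's "hiring" framing, a job description plus inputs, process, and outputs, can be captured as a tiny spec. The field names and the example agent below are illustrative only, not from any real framework:

```python
from dataclasses import dataclass, field

@dataclass
class AgentSpec:
    """Before building an agent, write its job description, then pin down
    the inputs it consumes, the process it follows, and the outputs it owes.
    This mirrors the interview's framing; the structure itself is a sketch."""
    job_description: str
    inputs: list = field(default_factory=list)
    process: list = field(default_factory=list)
    outputs: list = field(default_factory=list)

# Hypothetical spec for the SEO researcher agent described on screen.
seo_agent = AgentSpec(
    job_description="SEO researcher: follow feeds, propose learnings",
    inputs=["RSS feeds", "past approvals and rejections"],
    process=["read articles", "rank by impact and effort", "queue for review"],
    outputs=["pending learnings", "published blog posts"],
)
print(seo_agent.outputs)
```

Writing the spec down first makes the later questions (what data does it need, who reviews its output) concrete rather than abstract.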

So, you know, the agent needs to, so we have all of our knowledge. So one of the things, because we're AI-first, obsessed with AI and data. Data is like, every company right now should be a data company, or your, your company is your data.

Anne Murphy: Yeah,

Sebastian Chedal: Your data. And people, of course. But it's like people, data, and then process.

But process is kind of a form of your data, right? It's like how is the data organized?

Kyle Shannon: Sure.

Sebastian Chedal: So because we had all this data already there, like a lot of this, what we've been doing is very emergent, because we've been doing already so much with AI. Mm-hmm. And so much data. Sorry, I keep saying the data thing, but what I mean specifically is we already had everything about our business documented and defined: our target customers, our processes, uh, everything about me personally, my dreams, desires, goals, whatever.

Every […]

Daisy clown or something.

Kyle Shannon: Yeah.

Sebastian Chedal: It's like, if you've got your data and you own it. So like, don't just rely on the memory feature of ChatGPT or Claude or whatever. And that's, you know, 'cause you're, you're kind of giving your, mm-hmm, your memory away. So can you own your data, your knowledge, your memory? If you can, like, we were able to just sync.

knowledge. 'Cause the agents […]

So how do we pull that back to the, to the knowledge core? Like the mama knowledge?

Kyle Shannon: Yeah.

Sebastian Chedal: And also, how do we make the agents able to search that knowledge, you know, quick and effectively and find things? So like semantic search systems and stuff like that. But those are more like technical questions.

They're not. The question isn't anymore like, you know, do we have to create knowledge. It's like, oh, we have it. And, uh, I mean, we help clients create knowledge or synthesize it, or, you know, there's strategies and techniques around that. It's not that people should think they're just lost if they don't have any data, like, oh God, I have no data.

I can't do any of this. You know, you can. Well, first you can start now, and then you can also, mm-hmm, um, there are shortcuts to get there too, so,
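As a rough illustration of the semantic search Sebastian mentions for letting agents query the knowledge core, here is a toy sketch. The bag-of-words "embedding" is purely a stand-in; a real system would use a sentence-embedding model and a vector index:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: a bag-of-words vector. A real knowledge core would
    call an embedding model here instead."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query: str, docs: list, k: int = 2) -> list:
    """Return the k documents most similar to the query, so an agent can
    pull relevant notes out of a shared knowledge core."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

# Hypothetical notes standing in for the documented business knowledge.
knowledge_core = [
    "target customers: mid-size firms starting AI transformation",
    "blog process: one to two posts per day, reviewed before publishing",
    "founder goals and personal notes",
]
print(search("who are our target customers", knowledge_core, k=1))
```

The interesting design question the interview raises isn't this retrieval step itself, but keeping one canonical "mama knowledge" store that every agent syncs from and writes back to.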

Anne Murphy: yeah. I don't know. Data seems to accumulate really freaking fast when everything is AI-driven.

Sebastian Chedal: Yeah.

tiful data sitting around or […]

Because when you have these tools, when these tools are like ripping through stuff, you're gonna end up with some of the, you can end up with. I feel like the process for me of, of taking our AI-first business and making it really agentic, as much as is reasonable for what we do in the world.

Like, you know, mm-hmm, we're not doing science here, uh, we're not doing engineering here. But that process has caused us to rethink every single thing. Every single thing. Because we're a small company, and because, um, unlike Sebastian, I hadn't. I'm just like, oh, you know, two years ago I started organizing things properly and actually, like, giving a care about something that I had created, right?

Mm-hmm. And that it matters enough to like at least kind of sort of have it be somewhere.

Announcer: Yeah. Mm-hmm. […]

Anne Murphy: Right? Where I might be able to find it again, probably if I needed to badly enough. Kyle, this is you, and saving your things on your desktop. You're like, well, if it's amazing enough, I'll really wanna find it and I'll find it and it'll be fine.

Kyle Shannon: At some point

Anne Murphy: I'll every day.

Kyle Shannon: Exactly

Anne Murphy: right. Tens of thousands every day

Kyle Shannon: can fix that for me.

Anne Murphy: But the reworking of our human communication, just our communication, let alone,

Announcer: yeah,

Anne Murphy: like the flow and the sequencing. Just like at what point during a project do we spend time together? I feel like that has been, that is a shift we are going through right now.

of our contractors actually […]

And it's like, in a really good way for people to make money. But that, so the other part of the growing pains is me and Donnie sitting in our kitchen going like, hey, so this morning I created the thing that means that part of so-and-so's job goes away. What do you wanna do about it? What kind of people are we? What kind of AI, you know, practitioners, what kind of AI leaders are we? Are we the kind of people that get their hands on a technology today and tomorrow rewrite the scope of a contractor who depends on that income?

Kyle Shannon: Yep.

Anne Murphy: It's a whole head thing.

Kyle Shannon: It's a whole head thing. Well, let, let me ask you, Sebastian, we, we got a few minutes left. I know we started a minute or two late, but, um. One of the things that we ask all of was

Sebastian Chedal: a whole doorway into a […]

But yeah, we're out of time.

Kyle Shannon: I know, I,

Anne Murphy: I

Kyle Shannon: talk

Anne Murphy: to you about this someday.

Kyle Shannon: It's, it's huge. Like Yeah. I feel this is, this feels like a four hour conversation.

Anne Murphy: Yeah. We're just getting started.

Kyle Shannon: Yeah, exactly. Um, but, but, uh, do me a favor, if you would, share what you think: what does AI readiness mean to you? And what would you say to someone, especially given sort of the advancements that have happened in the past two months, what would you say to someone right now getting started?

So that, that's sort of a double question for you, would be great.

Sebastian Chedal: Getting started from zero, like I haven't really picked up the AI thing yet. Yeah, yeah. And when you say AI readiness, do you mean for a business or for an individual?

Kyle Shannon: Just whatever it means to you.

Anne Murphy: Yeah, whatever.

Kyle Shannon: Just, just,

Sebastian Chedal: okay. I normally talk about it as for businesses.

have the strategy or vision […]

Anne Murphy: Mm.

Sebastian Chedal: The next is, do you have a project or a focus with AI that has real ROI value?

Or is it just experimentation? Mm-hmm. Because you want to bring it, you need to bring it into something pragmatic. Do you have governance in place? Um, you know, ethical guidelines, risk frameworks. What happens if data is shared externally? Do you even know if data's being shared externally? There's, there's a lot there.

Um, the next category is your engineering capability. Like how skilled is your team? How mature is your team? Do you need to bring in outside people, help to build these systems? Are you gonna research it and do it yourself? Data, how good's your data, how well is it organized? What are your policies, things like that.

ojects or improve 'em or get […]

Hmm. Are you thinking about change management? Are people AI resistant? If so, what kind of AI education are you doing or planning as part of that? 'cause education tends to reduce resistance dramatically. Mm-hmm. In my experience. Um, and, um, I think that's it for businesses, uh, businesses that are just starting.

You know, on the low-maturity side, I think your biggest barriers are gonna be finding the right use case, making sure you've got data that's available and in good quality, uh, AI literacy and skills, and then constraints around your budget, you know. Mm-hmm. How much money you can actually put into that.

And so those are the four main common barriers when you're

Anne Murphy: great

Sebastian Chedal: getting started as a biz.

Kyle Shannon: Absolutely beautiful.

Anne Murphy: Sebastian, what a treat this was. Thank you. That was just like a little treat for our brains.

Kyle Shannon: Yeah.

Anne Murphy: Yeah. […]

Kyle Shannon: Thanks

Sebastian Chedal: for having me.

Kyle Shannon: Yeah. Fountain City dot tech. That's your, that's your company. So if people want, mm-hmm.

I assume that now that you've got, you're as experienced as anyone in the world right now in this OpenClaw stuff, uh, that's something you're doing for, for your businesses, for other businesses

Sebastian Chedal: Yeah, it's been an immediate demand coming in, so we're definitely building them, whether it's one that we've already built, which means it's cheaper, or something custom. So, yeah.


Kyle Shannon: Beautiful. Well, thank you so much. Really appreciate your time. Thanks for your patience at the beginning there.

Sebastian Chedal: Sorry about that. Things worked out.

Kyle Shannon: I really appreciate your time. All right. Cheers. Bye

Anne Murphy: everybody.

Sebastian Chedal: Bye-bye.

Kyle Shannon: Bye-bye.
