Revolutionizing Audio Production with AI with Peadar Coyle
Episode 93 • 1st October 2025 • The CTO Compass • Mark Wormgoor
00:39:29


Shownotes

Peadar Coyle, CTO and co-founder of AudioStack, shares insights on building an AI-driven audio production company before the AI hype. He discusses the importance of understanding customer needs, the impact of AI on workflows, and the significance of team dynamics in a distributed environment. Coyle also emphasizes the value of open source contributions and offers advice for startup founders navigating the competitive landscape of AI technology.

Timestamps

00:00 Building AI Before the Hype

03:41 Surviving the GPT wave

05:57 Defensibility without mega-rounds

07:30 What AudioStack actually does

08:11 One-to-many: turning unscalable into scalable

12:06 Thin wrappers vs thick platforms

14:58 Model churn & provider neutrality

19:34 Ship faster: the metric that matters

24:17 Open Source Communities vs. Business Customers

26:40 Scaling across locations

30:25 Government advice

36:05 CTO and Startup Lessons

About Peadar

Peadar Coyle is CTO and co-founder of AudioStack.

Together with the Head of Engineering, he is dedicated to making AudioStack’s audio production infrastructure flexible, reliable, secure and scalable to meet the diverse ambitions of agencies, brands, publishers and AdTech.

He is an authority on software engineering, data science and AI, and has been invited on multiple occasions as a contributor to steering groups on AI innovation by industry groups and the UK Government.

He has long been an involved member of the Open Source community, and his core role in developing PyMC3, a Bayesian Computing Library, has supported the development of numerous scaleups around the world.

With an AI-native, technical founder’s perspective, he is passionately involved in conversations about the culture of engineering, and has published a series of interviews with the teams behind data science innovations at some of the world’s best-known tech disruptors (Amazon, Spotify, Etsy).


Where to find Peadar

Transcripts

Mark:

Hello everyone. Today's guest started to build AI before the hype cycle even began.

Before that, he helped create PyMC3, a Python library for statistical modeling and probabilistic machine learning — quite interesting. He's the CTO and co-founder of AudioStack, an enterprise platform that takes audio production from weeks down to seconds. He's advised the UK government on scaling AI and creative tech. Absolutely amazing guy. Welcome, Peadar Coyle. Please tell us a little bit about yourself.

Peadar:

Thanks for having me, Mark. Yeah, that intro was pretty accurate. I've spent a lot of time writing code and, in the past couple of years, a lot of time building teams. And I was lucky to be part of the AI wave before the AI wave took off.

So sometimes you're in the right place at the right time.

Mark:

And tell me about that, because I think that's really interesting. You started AudioStack, an AI company around audio, in 2019 — way before the GPT hype, before anything else came out. What is it that you saw in 2019?

Peadar:

So I don't think it was possible to predict things like GPTs, right? But I think you could have a reasonable thesis that machine learning technology in general would improve. We were betting on things like synthetic speech technology just getting to human quality, in the same way that I think we saw in translation — translation was kind of a reference point. Now, there's risk involved in that, but you need to have a thesis, or at least a reasonable thesis, before you set up a company, about what kind of external trends will tend your way. And I think you see that a lot of the companies who are quite successful in this AI wave have been doing some version of ML, or some adjacent technology, for a long period of time — for, I think, over a decade now.

So, you know, I think the technological inflection point itself is something you probably can't predict, but you can predict that something will happen in a certain area. I think that's the only way you can approach these things.

Mark:

And your background before that — I saw that you even trained as a mathematician. What's your original background?

Peadar:

I studied mathematics and physics to various degrees. I've always been using computers — my father was a computer teacher, so I grew up programming, or hacking to some degree, for the last 20 years.

Mark:

So you had a really good understanding of machine learning models already, before the AI hype really came around.

Peadar:

Yeah, a lot of the stuff I did in math and stats was very kind of computational.

Mark:

Nice. So you started this company, AudioStack, which is really about applying AI and machine learning models to audio. You started the company in 2019, and then suddenly, two or three years later, this whole GPT hype and wave comes around. How did your company actually survive all of that?

Peadar:

Well, it wasn't just me — I have two co-founders, Timo and Bjorn. I think it's very difficult to build a company without co-founders. In terms of navigating the wave: we were building products, we were working, we had customers around that time. One thing we did notice was an inflection point in terms of interest around the time of the ChatGPT moment. People on the buyer side suddenly had the language to ask about it. A lot of that was exploratory at the time, and I think that's what we've seen over the last two to three years.

Mark:

That's been a huge benefit for you then, because buyers now actually understand what it is that you do.

Peadar:

I think buyers' level of understanding is evolving.

You see the rise of AI councils in organizations as they try to get their heads around these things. You see buyers at various levels of sophistication. Usually buyers are really worried about things like data protection and data residency, and even about sharing their data as training data with large technology companies. So I'd say they have a bit of a mental model — a mental model that this could be transformative — but there's still a kind of change management process ongoing in a lot of organizations.

Mark:

Yeah. Yeah, and a lot of learning for all the buyers that are out there.

Peadar:

Yeah, exactly. Yeah.

Mark:

And still — I've seen that you've raised some funding, but we live in a world now where some of your peers, the true AI tech giants, raise hundreds of millions in a single round. How do you create defensibility for your company?

Peadar:

I think defensibility often comes down to — as silly as it sounds — listening to your customers, right? Because by listening to your customers, you often discover the workflows that are actually important, as opposed to the superficial ones. Building up deep subject matter knowledge, both on the go-to-market side and on the creative side, allows you to bring those learnings into the product, and over time that fundamentally becomes a much stickier, more defensible moat, because you've worked with them, you've built skill, you've built trust, and you've understood their problems in a deep way. My argument is that if you're building a really great product, you should understand your customers' problems more deeply than they do. You can build stuff, but that understanding is quite difficult to copy overnight.

So that's a kind of moat, I believe.

Mark:

Just a better customer understanding than these really large companies, for which basically everyone is their customer — or that's what it seems like sometimes.

So what is it that AudioStack actually does? Give me a quick explanation of the kind of customers you have and what are the services that you provide to them?

Peadar:

Sure. We accelerate audio production.

So we're about 99 to 100 times faster than existing workflows, and existing workflows often involve humans in the loop, voice actors, or studios. Our customer base is largely media and entertainment companies, along with some of the larger brands in the world. We largely enable them to communicate with their customers through advertising, but we also have customers in the audiobook and podcasting space.

Mark:

Okay.

So basically, and I think what you said in your documentation is that you enable one to many plays. So you let a small team operate like a big one.

Yeah. How does that work and what does that mean?

Peadar:

I think, fundamentally, if you take an unscalable process and make it scalable, you open up net new activity.

So a concrete example: some of our customers use us to build ads where they're building an ad for each particular store. You couldn't do that sort of thing before, because it would be cost-prohibitive and tedious, and your error rate would be quite high.

So it's that ability to take one creative asset and turn it into multiple assets — and to do things like experimentation, which applies to the performance side. You're fundamentally doing net new activity that you weren't able to do before.

Mark:

Okay. That makes sense. I like that analogy, right? Taking an unscalable process and making it scalable through AI — audio in your case.

So how do you see AI doing that in technology? I mean, you're in technology, you're a CTO — where do you see the same method playing out? With your teams, maybe?

I mean, in your tech team, or in the wider tech industry?

Peadar:

Sorry, is your question like where do I see AI affecting kind of workflows?

Mark:

Yeah, in that same space, right? Where you have an unscalable process that suddenly becomes, like you said, scalable — but then in the business of technology.

Peadar:

Yeah — we experiment with and use a lot of the modern tools, Cursor, Lovable; I think someone in the team is using pretty much every new tool that comes out. And we do a lot of sharing of knowledge about that. Two things I think are now quite scalable that were hard to do before. One is spinning up testing quite quickly.

So we've had quite a lot of success at reclaiming older code that people didn't really understand — you put some tests on it, bring some stability — which previously would have been quite clunky and labor-intensive. The other is rapid prototyping. I was doing some stuff recently where, rather than going to the developer team and asking for a prototype, I was able to hack it together in Cursor and have a quick prototype that we could actually have a proper conversation about internally — "no, we wouldn't do it like that", or "the text is wrong". Normally that would go to a design team and probably onto a roadmap. And to be honest, some of this adjacent marketing or experimental stuff would just never get done. I've seen many a roadmap where this stuff just sat there and nothing happened.

So I think it's not just turning the unscalable into scalable, but turning the not-done work into done work, right? That, I think, is the underappreciated thing about a lot of AI.

We open up new possibilities where we're able to.

Mark:

So it's not just about speeding up the work we were already doing before and making it slightly more efficient, but doing the work that just didn't get done before.

Yeah, of course. Nice.

So, to get on — and this is one of your design philosophies, which I'd really like to hear more about — you talk about thin wrappers in the AI era, right? Versus thick wrappers, or a thick platform.

Peadar:

I think the real challenge becomes bringing in the business logic, or the complexity of the business logic. And then you become, I think, a thicker wrapper — thin versus thick.

I mean thick in terms of defensibility. I mean thick in terms of complexity. I mean thick as in harder to copy. But I also mean thick as in more value, right? Because I think that's one of the things you see with a lot of these simplistic, quick-moving copycats — and you see that throughout the tech space at the moment. They don't have that intricacy of guardrails, the more complex stuff that an enterprise will drive you towards, because it's not about the quick demo; it's about doing something reliably across a large team, with all their different challenges and opportunities internally — whether it's regulatory, whether it's their own brand guidelines, whether it's their own compliance needs. That's all quite complicated, and that's a thick thing, and I think that's where a lot of the value in the AI era is going.

Mark:

In the thick layer that you create on top of a given AI model, whether it's ChatGPT or something else. Okay?

Yeah. I understand.

Peadar:

Yeah. And you have the advantage that the underlying models keep improving, right? So you can have backup models — that's very important. But the underlying business challenge is still in that thicker work, where you build these other components on top.

And then I think there's a further question: I see a lot of companies using large language models — very complicated models — for things that large language models don't need to be used for. So there's a whole thing about pulling that out and replacing it, either with your own IP, which is one possibility, or with other IP that's out there, which can be more cost-effective. That matters for protecting your margins, which is a very important thing in itself.
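As an illustration of the backup-model idea Peadar describes, here is a minimal sketch of routing requests through a provider-neutral layer with fallback. The `Provider` type, the provider names, and the `complete` helper are all hypothetical stand-ins for this example, not AudioStack's actual code or any real SDK.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Provider:
    name: str
    call: Callable[[str], str]  # prompt -> completion text

def complete(prompt: str, providers: List[Provider]) -> Tuple[str, str]:
    """Try each provider in priority order, falling back on failure."""
    errors = []
    for p in providers:
        try:
            return p.name, p.call(prompt)
        except Exception as exc:  # real code would catch provider-specific errors
            errors.append((p.name, exc))
    raise RuntimeError(f"all providers failed: {errors}")

# Example: a flaky primary falls back to a stable backup.
def primary(prompt: str) -> str:
    raise TimeoutError("primary provider is down")

providers = [
    Provider("primary-llm", primary),
    Provider("backup-llm", lambda p: "echo: " + p),
]

used, text = complete("hello", providers)  # the backup handles the request
```

The business logic above the `complete` call never needs to know which vendor answered, which is what makes swapping or demoting a model a configuration change rather than a rewrite.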

Mark:

Yeah, I understand that.

So, like you said, these models and providers change almost monthly, if not every two or three months. How do you look at model churn and swappability? Should you just be ready to rip out ChatGPT and replace it with Claude 4.5 or the next Gemini model? Or do you see it more as sticking to one provider? How do you build when you look at these models — what would you advise someone?

Peadar:

We went on a bit of a journey on that. I don't think there was an easy answer at first — we did a lot of experimentation. One thing we've got a lot more robust about, and have built a lot of internal testing around, is making sure that the behavior for the customer is the same as before, right?

So it's all well and good, but if you're selling to a brand or an advertiser, a lot of the people we deal with don't want their thing to suddenly change overnight, because they've already figured out a certain way to use the tool, and change can be quite disruptive.

So what we've done is build an entire repository of tests and expected behaviors. We run everything through that and use it as our evaluation process. And from that we select — generally not the leading models, actually; probably models a few months after they're out in the wild, once there's some understanding of them — because too much change can be disruptive, and the benefit isn't always there.

So we have this robust testing framework, and that's what a lot of the companies I've talked to in the ecosystem are doing too. You see the rise of LLM evals and all these kinds of testing applications out there — there are even third parties you can work with. There's a whole ecosystem, and it's about reducing hallucinations but also increasing customer trust, because at the end of the day, it doesn't matter if it's AI or magic or whatever: customers need to trust the results.
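In miniature, the kind of golden-test repository Peadar describes — expected behaviors that gate any model swap — could look like the sketch below. The case IDs, outputs, and the exact-match check are illustrative assumptions; a real harness for audio would more likely compare duration, loudness, or semantic similarity rather than exact strings.

```python
# Hypothetical golden cases: input ID -> expected output after rendering.
GOLDEN = {
    "store_ad": "Visit our Main Street store today.",
    "promo_tag": "Fifty percent off, this week only.",
}

def run_candidate(case_id: str) -> str:
    """Stand-in for rendering a stored input through the candidate model."""
    outputs = {
        "store_ad": "Visit our Main Street store today.",
        "promo_tag": "Half price, this week only.",  # behavior drifted here
    }
    return outputs[case_id]

def evaluate(golden: dict, run) -> float:
    """Fraction of cases where the candidate matches the expected output."""
    passed = sum(1 for cid, want in golden.items() if run(cid) == want)
    return passed / len(golden)

pass_rate = evaluate(GOLDEN, run_candidate)  # one of two cases drifted
promote = pass_rate >= 0.95                  # gate the swap on the pass rate
```

The point of the gate is exactly what the interview emphasizes: a new model is promoted only when customer-visible behavior stays stable, not merely because it is newer.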

Mark:

True. With AI, that is sometimes difficult. Okay — no, and I understand, right?

I mean, even when the models get better: we had GPT-4o go to GPT-5, and it was much better — I still think it is a much better model — but there was a lot of noise around it, just because everything in the model changed. It upset quite a few people who had gotten used to the old one. That's going to keep happening, of course, as models evolve and adapt and change. A similar question — we've talked a lot about AI, so let's now go a bit more into the technology — and it's about the infrastructure and network providers that you use. How do you look at those?

I mean, I've seen you do quite a bit with Amazon. Do you make everything Kubernetes or Docker, so that you can move your hosted infrastructure from one provider to the next? Or do you go all in on one provider? What's your infrastructure strategy, if I may ask?

Peadar:

We're pretty much all in on AWS. There's a lot of talk about multi-cloud and the flexibility there, but I think there's a cost to building that — a cost to building the abstracted primitives.

That being said, we do use a lot of Docker and a lot of Kubernetes, and for backup systems we do look at experimenting with other platforms. But I would say that if you pick any of the major cloud providers and you have devs who are happy with it, you're probably going to be okay. The real reason we picked AWS was that a lot of our engineers were very familiar with it, and I could see it was quite easy to hire for those skills on the market — I don't know if that's changed now. There are other choices that would be more complicated, I guess, to operate.

Mark:

Price-wise —

I mean, they all look at each other, so they're never going to be that far apart. Yeah.

Yeah, copying each other constantly. Okay — and if we then go to your team, your development team: overseeing a team of developers is quite hard.

So what's the one metric that for you is most important with your team? What's the one thing that you look at with them?

Peadar:

I'm a big believer in shipping. We even have a picture in the office — "Yes, we ship" — actually, in all our offices. And I'm a big believer in accelerating that time to value.

So we look at it in a couple of different ways, but broadly it's that. I'm very interested in how we accelerate. For example, recently the team spent some time optimizing CI/CD pipelines, because going from a 20-minute build time to a 5-minute build time was a significant uplift, especially across a wider engineering team.

So there are those sorts of optimizations, but then there are other questions: how do you make your product team and designers interface with your engineering team in a more effective way? What processes do you bring in — design documents, better testing, all those sorts of things? But I'm a big believer that you have to just keep shipping, because as soon as you stop, you lose that iteration, and I think morale drops, which is a very big problem — and it becomes harder to respond to the market. Not everything we've shipped has been the right thing to ship; I don't think any engineering or product team gets that right. But by at least getting stuff out there, you're doing that test-and-learn, and of course you're keeping morale up, which is a big thing for us. Even when we have an intern join, we try to get them to ship in their first week, which I think is important — and which helps the systems improve, because if an intern can do it, the systems are probably quite easy to adopt.

So I think it's all about those sorts of things. And that's a multi-dimensional, evolving problem — it can get more complicated as your infrastructure gets bigger and as you add clients.

So it's all about figuring out ways to keep that momentum going.
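One concrete way to track the shipping cadence discussed above is lead time from commit to deploy. This is a hedged sketch with made-up sample data — the event tuples and function name are illustrative, not AudioStack's metrics or tooling.

```python
from datetime import datetime

def lead_times_hours(events):
    """events: (change_id, committed_iso, deployed_iso) triples -> hours each."""
    hours = []
    for _, committed, deployed in events:
        delta = datetime.fromisoformat(deployed) - datetime.fromisoformat(committed)
        hours.append(delta.total_seconds() / 3600.0)
    return hours

# Sample data: two changes, committed and deployed the same morning.
events = [
    ("feat-1", "2025-10-01T09:00:00", "2025-10-01T11:00:00"),  # 2.0 h
    ("fix-2",  "2025-10-01T10:00:00", "2025-10-01T10:30:00"),  # 0.5 h
]

hours = lead_times_hours(events)
avg_hours = sum(hours) / len(hours)
```

Trending this number downward — for example, via the CI/CD build-time work mentioned in the interview — is one way to make "keep shipping" measurable rather than a slogan.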

Mark:

And just to make sure I got that right — time to shipping, time to value, that makes a lot of sense. But did you just say that when you onboard an intern, they can actually ship code in their first week? Really?

Peadar:

Yeah — in the first week. We've had a couple ship in their first couple of days, but the last intern we had shipped something in his first week.

So I think some of that is making things small, but some of it is just making sure that there's stuff to work on, and that they can log into things and get up and running.

Mark:

That's impressive. And then something else you said that I found really interesting. One of the other hard problems in tech is of course keeping everyone aligned. How do you keep your tech team, product team, customer support, infrastructure, and go-to-market and sales teams all aligned on the same vision and direction, and use that as input to what you're going to ship next?

Peadar:

I think those are ongoing battles and challenges as you get bigger. Some things work for a period of time, and then you have to iterate. We have monthly and weekly meetings where we talk about what's being shipped on the product side, and we try to make sure we unblock each other. Quite a lot of the product team spends time on calls — watching customer videos, or with the customer-facing people on the team. We regularly communicate by Slack, by Zoom, by various other mechanisms. But with multiple time zones and new people coming in, you have to think about building out things like your off-sites, which are very important because you need to allow the team time to speak together away from operational load — making sure the team spends time with each other to build up trust, and that you spend some of that time on strategy and some on operations.

Mark:

Very down-to-earth, practical. Okay — you have a background in the open source community; you've contributed quite a bit. Delivering open source is quite different from working for a business. What have you learned from open source that influences how you lead your team, or how you work, today?

Peadar:

I think one of the things open source taught me is about making technological choices, because tech strategy is one of the things that I think is very important.

Betting on the communities that seem to be growing, or are active, is a good proxy for the future. The products, languages, or even libraries you're using don't need to be perfect now, but you need to be able to see fundamental development — that they will improve over time. That's one reason we've got React in our front end, one reason we've got Chakra in some of our front-end components, and one reason we use a lot of the Python community. Even Kubernetes is an open source community in itself.

So I think it's about betting on those things. The team are big believers in open source — sometimes I curse, like, "can't we just get a vendor to do that?", and that becomes a little discussion — and it's about being aware of what new things are out there, because the space is evolving. So, to summarize: look for a community, lean into that community, and be involved in it — sometimes the team will be contributing to some of these libraries. You have to think about the wider ecosystem and some of the trade-offs you're dealing with.

Mark:

And in that sense, your open source community isn't that different from the paying customers you have as a business? Of course, I mean, they provide input on their needs and their business cases.

Peadar:

Of course.

Mark:

Nice. That's cool.

And then — I mean, you started small, but now AudioStack is working across at least a couple of different locations. You have London, Barcelona, New York. How do you lead your team across the time zones, or is your team just in one? And how do you manage across the different cultures and locations that you're in?

Peadar:

Most of — no, all of — engineering and product is in Barcelona or London, and we have other people, more go-to-market and operations, in the U.S. I think it's about spending time together, and making sure there are some calls that people can sync on. One thing I learned is that as you add more time zones, you do have to lean more into async decision making. That's a bit of a challenge, because you have to write things in a much more detailed way — you don't get the back-and-forth of clarification — but it means you can hand things over to someone in a different time zone who you can't quite sync up with.

In terms of cultures, you have to be respectful of cultures, obviously, and a certain element of diversity is healthy, but you have to keep a bit of a monoculture corporately, if that makes sense, because you want to make sure it works across your different jurisdictions. I don't have all the answers to these — they're ongoing things as you grow. There are definitely those inevitable moments you're told about at startups — above 20 people this becomes an issue, at 50 people that becomes an issue, and so on — and they're generally correct; you run into those moments all the time as you grow, with organizational complexity and things like culture. The other thing is leaning into making sure your values are used in your recruiting process — not just the skill level, but really leaning into the values and training the team to be aware of them. That's one thing we realized in some of our debriefs when we were looking at hiring people.

Sometimes people couldn't put their finger on something, but it turned out to map onto one of our cultural values in itself.

Mark:

And I love that statement as well. So you bring in people from different backgrounds, different cultures, making sure you have a diversity of people, but then you enforce almost a corporate monoculture, so that you have one company culture across that diversity of people. That's nice.

And then you hire for values that align with that monoculture. Did I summarize that correctly?

Peadar:

Yeah — I haven't spent much time thinking about this lately, but that feels about right to me. And definitely, I think you want to make sure that each office looks the same, or feels the same. There can be differences, but there should be a relatively homogeneous experience. That helps with things like communication and understanding.

Mark:

Yeah, I understand. I want to go a bit more to the outside world now. Last year you spoke in front of the UK House of Lords on scaling up AI in creative tech.

Really interesting. What is it that you told them?

Peadar:

What did I tell them? It's all publicly available, so you can dig it out if you want. I talked a lot about R&D tax credits and some of the policy things I encountered as an entrepreneur — I think that was important to bring up. There are things governments can do, and governments often aren't aware of what's actually important and what's actually causing friction. I talked a lot about enabling great talent to come in, which of course is an issue for a lot of countries out there, particularly around AI talent. And things like some of the successful groups here — we were part of Digital Catapult, for example, a government-backed group, which was actually very helpful at a particular stage; maybe the ecosystem needs something for a bit of a later stage than that. I think that's broadly what I told them. Okay.

Mark:

Because from my perspective — I'm based in the Netherlands — it seems it's becoming harder and harder for the UK, the EU, and other countries to compete in the world against the US and China, which are investing so much money and moving at such speed.

So it's cool to see companies like yours in the UK. What should we do more of, or better, to compete — let's say the UK and the EU, either from a government perspective or from our perspective as startup founders? What can we do to compete and keep up globally?

Peadar:

How are you asking that question — from which point of view?

Mark:

With the US and China?

Peadar:

I think there are a bunch of good ideas out there, like making it easier to set up companies — the EU Inc proposal is quite popular at the moment, quite topical. Making it easier for great talent to come in can be beneficial. There are some things about stock options and the like that I think are a bit fragmented throughout Europe at the moment, and there could definitely be more alignment, because sometimes some countries jump ahead and others fall behind. Those are the main ones.

You know, deeper things about culture and like, you know like Gil. America makes you feeling like very acceptable socially. I think that's something in Europe that or not quite there.

Mark:

Yet. And if I'm an individual founder and I'm going to build my next startup in the UK or in Europe in the AI space, something like AudioStack, what would your advice be?

Peadar:

I think the ecosystem has a lot of positive aspects. So lean into whatever is good for you locally; figure out what your competitive advantage is. London has a lot of great AI talent, and that's one reason why we ended up setting up here. But I don't think I have a good general answer to a question like that. It depends. Sometimes "just get started" is the right answer, and sometimes "don't get started" is the right answer.

Mark:

Just get started. Ship value. Okay. And if you're a startup founder or a CTO who is just starting out; I mean, you've been in this for six years, but let's say somebody is in their first year. What would your advice be to a CTO in this day and age of AI being everywhere?

Peadar:

I would say figure out when you have to step away from the critical path, and step away at that point. You don't have to do it immediately, but there will come a point when your team is big enough or your product is complicated enough that the rational thing is to step away. And that's something I see with other executives: they hang on too long, and they end up causing more issues than they're solving.

Some of that is a mindset shift about going into a new job, be it executive or manager. And some of that is just being comfortable learning a new set of skills. But you have to do it, because otherwise you're probably severely inhibiting your team's ability.

Mark:

Knowing when to take a step back from being on the critical path everywhere. And somewhere in the beginning you said you have two co-founders, a team of three.

Yeah. You said that's really important for a startup. You now have quite a successful startup after these six years. What are the most important things you've learned about running a startup and building a successful startup or scale-up?

Peadar:

Most important thing? I'd say cultivating your talent and continuing to attract talent is probably your number one job, and you should probably never stop doing that.

You don't need to be doing it full time, and there will be points when you won't need to. But beyond that, it's really important to incentivize people who are going above and beyond and give them a path; that's where things like a career path become important. The smaller you are, the less important that probably is, but you can always look for growth opportunities. And I think this pays off over the long run, because if you can keep an engineer for four to five years, they build a lot of deep subject matter expertise and become very highly productive. That creates a compounding effect; to come back to the thick wrapper idea, that stuff doesn't happen overnight, and it involves a lot of intricacies.

But a big thing for me and my co-founders was making this a good place to get creative work done, and we think a lot about that. One thing that often happens when we're recruiting is that people say, "I left my last job because I couldn't get anything done." So you can often compete for talent that way. That's an important thing, vision-wise and culture-wise as well.

Mark:

Nice. And then for our listeners, there are a lot of CTOs, CIOs, and people leading in tech. What's your final piece of advice?

Peadar:

Play with all the new AI tools. They're very cool. Don't be scared of them. Enable your team to explore these things, because I think that's where a lot of the new, fresh ideas come from.

Mark:

Let your engineers play. It's been incredible having you on. A lot of very valuable insights. Really appreciate it. Thank you very much.
