E35 - AI's Impact on Personalised Advertising and the Future of Audio Branding with Steve Dunlop
Episode 35 • 19th January 2024 • Creatives With AI • Futurehand Media
Duration: 01:08:15


Shownotes

Join us on a fascinating journey with Steve Dunlop, the visionary behind A Million Ads, as we uncover the seismic shift from old-school broadcast to the digital storytelling revolution. Steve takes us through the transformative era of digital ad inventory sales and the birth of DAX, right up to his lightbulb moment that sparked the innovation of personalised digital advertising.

In a world where AI is rapidly becoming the new creative partner, we tackle how these advances are shaping the future of branding and advertising.

We also take a moment to highlight the voiceover industry's resilience in the age of synthesised voices, weighing the pros and cons of this evolving technology.

Lastly, we reflect on the nuanced human interactions with voice-activated AI devices, considering the gendering of AI and the media's influence on our expectations of these technologies.

Takeaways

  • Dynamic creative advertising can be more engaging and effective in communicating with audiences.
  • AI and machine learning are often used interchangeably, but AI refers to a broader concept of machines performing tasks that require human intelligence.
  • AI can be used in the creative process to generate ideas and improve workflow productivity.
  • The use of synthesised voice in advertising can provide cost-effective solutions and enable real-time personalisation.
  • Automation in newsrooms can streamline processes and generate content for repetitive tasks. AI can be used to generate templatised stories and news articles based on stock market performance.
  • AI-generated TV stations exist, but their viewership and popularity are uncertain.
  • The radio industry has evolved, with many shows being pre-recorded and automated.
  • AI is being used in music and radio to create personalised playlists and DJ-like experiences.
  • Dynamic creative optimisation in audio advertising is challenging due to the lack of strong signals for optimisation.
  • AI has the potential to optimise advertising and attribution modelling in the future.
  • AI is expected to have a significant impact on businesses, automating mundane tasks and improving efficiency.
  • Politeness and kindness in interactions with AI can shape its behaviour and responses.
  • Naming AI personal assistants can create a more personalised and human-like experience.
  • There is a balance between cynicism and excitement about AI, with both positive and negative implications to consider.


Thanks for listening, and stay curious!

//david

--

Subscribe to the AI Podcast Network Weekly Digest to receive links to new episodes every Friday morning.

--

Tools we use and recommend:

Riverside FM - Our remote recording platform

Music Radio Creative - Our voiceover and audio engineering partner

Podcastpage - Podcast website hosting where we got started

Transcripts

00:09 - David Brown (Host)

Hello everybody, welcome to the Creatives with AI Podcast. I'm your host, David, and on today's show, we have Steve Dunlop. I met Steve a few years ago. Well, it must have been five years ago.

00:18 - Steve Dunlop (Guest)

It's longer, I think it's about eight years.

00:20 - David Brown (Host)

Yeah, it might have been eight years ago now when he had just left Global Radio and was starting his own company called A Million Ads, and we're going to talk about that because that's part of the reason why I wanted to talk to him today. But Steve is fascinated by how we tell stories, and the more engaging and compelling the story, the more effective it is at communicating our needs, desires and fears. And his background as a radio producer, strategist and engineer at companies like Intel, the BBC and Global Radio has given him the empathy to apply technology to real-world human problems. His company, AMA, is the culmination of this and, at its heart, helps others tell more effective stories. So I'm hoping he's going to tell some effective stories today and keep us entertained, talking about AI. How are you doing, Steve?

01:06 - Steve Dunlop (Guest)

ually. So we met, I think, in:

01:48 - David Brown (Host)

you. At the maximum, we had 3,700 members in the Meetup group, and that was just before COVID, and we were. We had moved on from the place we used to meet in Covent Garden. You're thinking of the place in Covent?

02:02

Garden, where we used to meet downstairs. That's changed names about six times since we used to meet there, so I don't know what it's called today, but it is still there and I did go in there the other day, funnily enough. But we then graduated to go into the WeWork because it had a much larger area and was quite a good one. We went to Facebook. Facebook hosted us one time and we had a really popular Meetup there, but we were getting 200-300 people every month coming.

02:32 - Steve Dunlop (Guest)

And for me that was such an eye-opener to meet other people with similar problems and questions and stories and backgrounds and realise that there's a community there. I mean that's the whole point of Meetup, I guess, isn't it To see that you're not on your own? And certainly this Million Ads is the first time I've ever been a founder or an entrepreneur or started my own business. So feeling that there was a community out there to support, to commiserate, to lean on was great. So thank you for guiding us through that.

03:01 - David Brown (Host)

We were all trying to start our own businesses back then, and, unfortunately, GDPR got mine.

03:07 - Steve Dunlop (Guest)

That's right.

03:08 - David Brown (Host)

There's still a need for it. We still have the same problem that if you buy something online, you still see ads for it. Like Riverside: I use Riverside as the platform to record on (sorry, my voice is a bit weird today) and I constantly see ads for Riverside, and I'm like, surely.

03:26 - Steve Dunlop (Guest)

I'm a brains subscriber, you got me.

03:29 - David Brown (Host)

And I use it in the browser that I use to surf the web. So you would think that somehow they'd be able to work that out, because it would just save them money and really stop annoying me.

03:39 - Steve Dunlop (Guest)

So you go and click on their links, right, just so that it really costs them.

03:42 - David Brown (Host)

Yeah, yeah, of course, always, always, click the top result in.

03:47 - Steve Dunlop (Guest)

Google as well. Make them pay.

03:49 - David Brown (Host)

So I think maybe it might be good. I mean, obviously we don't want to pitch, but if you just give a quick sort of little bit of background of what you were doing and how you thought about A Million Ads, sort of what A Million Ads is, and then we can talk about how it uses algorithms and AI and machine learning to do what it does.

04:10 - Steve Dunlop (Guest)

It's back, back to that time,:

06:09

You know, radio creative, and we talked about this at the start of our conversation earlier. Radio creative is very characteristic, and it's been designed for the medium, so often it's quite loud, it's compressed quite hard, so that you can hear it when you're in a noisy car or from across a kitchen. Radio has built up over 50, 60, 70 years of understanding how that medium works, and advertising within radio has done the same thing. So you repeat the brand name often, you say the phone number three times, you use all of the tricks of music and drops and so on, and loud voiceovers. That's how radio has grown up.

06:42

Our very first clients back in:

07:36 - Voiceover (Announcement)

So that was the kind of a you know what would you call the.

07:39 - Steve Dunlop (Guest)

olders. And you know, back in:

08:22

In:

08:30

So we, we, we in my family moved back to the UK in August last year.

08:35

We now have 15, 16 people in our New York office and 17, 18 people in our London office: a team of engineers and product people, creatives, scriptwriters and campaign management people. We often provide our service as self-serve, so some publishers around the world can use our tool to design these dynamic creative ads, but we also provide it as a managed service, so we have in-house creative teams, which means we think about AI a lot, because we actually have a lot of human resources making creative at the moment.

09:14

And just to, you know, fill in the rest of the story: I talked about the publishers that we work with, but actually our customers are brands and agencies. So our clients are people like Starbucks and McDonald's, CVS and Lowe's and Home Depot, we've just gone live with Dunkin' Donuts in the US, and in the UK, Sky and the Lottery. So it's the kind of big blue-chip advertisers which, I think, when we start to talk about AI, will shape how we think about AI relative to the advertiser set that we're serving.

09:53 - David Brown (Host)

Well, that's probably a good point to jump into that. So thanks for the overview of that. So, yeah, let's jump into the AI part, because I think it's really interesting and back when you started the company, I don't think we thought about it as AI. Back then, nobody talked about any of this stuff. Like, my feeling is about 80% or maybe 90% of the stuff that's called AI today actually isn't artificial intelligence, it's just really good machine learning.

10:19 - Steve Dunlop (Guest)

Or even just a smart algorithm that's doing it: if-this-then-that, pretty standard tree flow, but it does it smart, does it fast and does it better than a human. Let's just call it

10:29 - David Brown (Host)

AI. Exactly, and I think, I know I've said this on the show before, and I don't know if I've said it to you or not, but I was at Oxford at an AI presentation last year, and there was a doctor talking about using AI in medicine, and he said something that I've repeated many times since: if you're trying to sell it, it's AI, but if you're a professional, it's machine learning. Right, yeah, yeah, sure.

10:54 - Steve Dunlop (Guest)

rning parts of our tool since:

11:47 - David Brown (Host)

Yeah, I was going to say they're everywhere.

11:49 - Steve Dunlop (Guest)

They're everywhere, and so AI helps us with that long list of 14,000, because otherwise you'd effectively be using a spreadsheet, going down one line at a time to try and match audio files. So we effectively use Amazon's machine learning tools to map audio clips to script and to expand the script out. We've been doing that for a long time. And the other tool we have is, if you start writing a script, it looks at whether you're using one of the kind of predefined rules that we have. When we say rule, that's basically a data variable, so weather or time, date, location.

12:27 - David Brown (Host)

Can I jump in there really quickly? Because I know what your tool does, but we haven't really talked about how it works and how you came up with the name A Million Ads. So it might be good just to quickly go: oh yeah, by the way, this is what the system does; oh, by the way, here's how it works. Yeah, so people know what it does.

12:45 - Steve Dunlop (Guest)

So there are two parts to our product. One side is a design tool to help audio producers, creatives, engineers design and build dynamic creative advertising. And when we say dynamic creative, we mean the creative itself, the voiceover, the music, the sound effects, changes based on who's listening, based on what we know about the listener at the other end. So, for each listener, we know roughly where you are. From location, we know what time zone you're in. From time zone, we know the date and the day and the time: is it light outside or dark outside, is it lunchtime or breakfast or dinner? And from location, we can work out what the weather's like where you are.

13:26

We have a partnership with AccuWeather, so wherever you are in the world, we know if it's sunny, rainy, cloudy, what the temperature's like, what's the wind speed, is it going to rain tomorrow. And so any of these data. Oh, and there's also the device you're using: are you on an Alexa speaker versus an Android phone, versus a Windows PC versus an iOS iPad, and everything in between?

13:47

And then any of those data triggers or rules can be used to change the components of the script. So you, David, would hear a different version to me, because we're in different parts of the country: it's sunny where I am and it's raining where you are. And so the client, the creative, the person writing that ad and putting that script together, can use any of those components to make the ad feel more personal, feel dynamic. And actually we've proven, over the eight years that we've been doing this and the thousands and thousands of campaigns that we've run, that advertising is significantly more effective when the listener feels like the creative is for them. Either they perceive it, they know that, oh gosh, that ad said something that is clearly personalised to me, or sometimes it's just in the background, just making it feel like it's more aware.
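As a minimal sketch (not A Million Ads' actual engine), the kind of data rules Steve describes, daypart and weather driving script components, might look like this. The copy lines and thresholds are invented:

```python
from datetime import datetime

def build_opening(now: datetime, weather: str) -> str:
    """Pick an ad opening based on time of day and weather."""
    if now.hour < 12:
        daypart = "this morning"
    elif now.hour < 18:
        daypart = "this afternoon"
    else:
        daypart = "tonight"
    # Weather is a second rule layered on top of the daypart rule
    mood = "Stay dry out there" if weather == "rain" else "Enjoy the sunshine"
    return f"{mood} {daypart}!"

print(build_opening(datetime(2024, 1, 19, 9, 0), "rain"))
# → Stay dry out there this morning!
```

Each listener's location, time zone, weather and device would feed rules like these, and the selected components are then stitched into the final audio.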

14:32 - David Brown (Host)

It's the subtle modification.

14:34 - Steve Dunlop (Guest)

One of the rules we use the most is called sequence, where we know how many times you've heard this ad and so we can keep it sounding fresh. One of the big complaints about radio advertising is you hear the same flipping ad every 15 minutes and it just drives you nuts. And actually then you get this peak of awareness and then a drop-off, because there's a negative feeling once you've heard the ad a thousand times and you hear it again.

14:57 - David Brown (Host)

Whereas we did that with songs as well.

14:59 - Steve Dunlop (Guest)

Oh, for sure. Yeah, the burn of songs on commercial radio, that was the bane of our lives back in the day. And so we have this rule called sequence, which means you can keep it sounding different every time you use it, and you can take the listener on a journey, you know, you can tell a story.

15:15

Back to storytelling: you can make impression number one part one of your story, impression number two part two, and keep the story moving on, and you can change the language you use like you would in a normal human conversation. Like the first time we met, we'd have said, oh, nice to meet you, what are you doing? But the second time we meet, we'd say, oh, great to see you again, and we'd shake hands differently and the words we use would be different. The third time would be bro-hugging and back-slapping, you know, whatever. That's what humans do, right? That's the way we naturally interact with each other, and so we can make advertising follow that form using something as simple as our sequence rule.
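The sequence rule reduces to picking a copy variant by impression count, as in this minimal illustration (the copy lines are invented, not real campaign scripts):

```python
# Variants heard on the 1st, 2nd, and 3rd-or-later impression.
SEQUENCE = [
    "Nice to meet you, here's what we're about.",
    "Good to see you again, remember that offer?",
    "You know the drill by now, last chance this week.",
]

def line_for(impression_count: int) -> str:
    """Return the copy variant for this listener's nth impression,
    settling on the final variant once the story has been told."""
    index = min(impression_count, len(SEQUENCE)) - 1
    return SEQUENCE[max(index, 0)]

print(line_for(1))  # first hearing
print(line_for(7))  # stays on the final variant
```

In a production system the impression counter would come from the ad server's frequency data, per listener, rather than being passed in directly.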

15:53 - David Brown (Host)

And what that does.

15:54 - Steve Dunlop (Guest)

So yeah, for advertisers that just gives them flexibility. And the reason the business is called A Million Ads is that it's just as easy with our tool to create a million versions of an ad as it historically was to create one. Today we use human voice actors to record all of those different versions, and we have a technique to create the script such that you read the smallest snippet that you need to then be able to concatenate all of those versions together to create the million versions. And more and more we're using synthesised voice to help with that as well.
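The arithmetic behind the name: versions multiply while recordings only add. With invented slot counts (nothing here reflects a real campaign), four dynamic slots of 32 snippets each already exceed a million combinations:

```python
import math

# Hypothetical script with four dynamic slots:
# greeting, offer, weather line, sign-off.
variants_per_slot = [32, 32, 32, 32]

versions = math.prod(variants_per_slot)   # versions multiply
recordings = sum(variants_per_slot)       # studio time only adds

print(f"{versions:,} versions from only {recordings} recorded snippets")
# → 1,048,576 versions from only 128 recorded snippets
```

This is why recording "the smallest snippet you need" and concatenating is so much cheaper than recording each finished version whole.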

16:25 - David Brown (Host)

Which actually is an amazing segue. There is one thing I want to pick up on, though, which is, I remember back before we had digital radio and all that sort of stuff, when everybody just listened to the radio, the pop stations would always just play whatever the record companies were paying them to play.

16:46 - Steve Dunlop (Guest)

And just 20 minutes.

16:47 - David Brown (Host)

You heard the same song over and over and just like, oh my God, I can't even stand this. But you had to listen because there wasn't anything else to do, you know so yeah.

16:57 - Steve Dunlop (Guest)

You're reminiscing about the good old days, the good and bad of radio.

17:02 - David Brown (Host)

But it was everything. It was TV, it was radio, it was all of that, like we didn't have the choice. And I think a lot of people today, certainly my son, can't even understand a world in which he doesn't have access to basically any song, anytime, anywhere he wants.

17:22 - Steve Dunlop (Guest)

So why would you limit yourself yeah?

17:25 - David Brown (Host)

How that even works. Maybe I need to take him off into the woods for like a year or so, move out there, and then I can go: this is sort of what it was like when I was a kid.

17:34 - Steve Dunlop (Guest)

Yeah, I think, I mean, we could debate the pros and cons of that way of running a music radio station, definitely, and also then debate what that does for your ability to form a preference. Because, again, my kids are the same. They can access any movie, any TV show (well, movies and TV work slightly differently), but certainly any music ever recorded, and it continues to be recorded, ever. So that catalogue's only getting bigger, and my worry there is that it's hard to form a preference, because as soon as you go, oh, I love this song, there's another one coming along tomorrow that you might love as well. And actually, like, we started this conversation talking about the Beatles.

18:17

My eight-year-old is learning to play Hey Jude on the guitar. And you know, the Beatles were music I grew up with, because my dad would put that on a cassette when we went on family holidays together, and he'd have three cassettes in the car and they would just rotate around. That built familiarity, built a preference in me for that kind of music. Trace is now learning Hey Jude on guitar, but next week he'll be learning something else, so that depth of preference won't be as strong. So let's see, when our kids are a bit older, whether they reminisce about their youth and the music they were listening to in the same way that we would.

18:52

But the point that's interesting, that I think about for our business, is familiarity, and one of the reasons why advertising is effective is that you do get used to the voiceover, you get used to the music. When I hear that little jingle, my brain automatically connects it with a brand, or connects it with an experience or a feeling. So when it comes to creativity, familiarity is actually one of the strongest mental links an advertiser can use to shortcut our brains to thinking about a brand. You might not have to say anything, you know, just play some music or create an atmosphere. And my worry about when we start to delegate creativity to computers is that they won't understand that. At least they don't today. All LLMs are good at is repeating stuff we've already done. There's no net new thinking.

19:46

And so, yeah, we're stuck right into this, but when it comes to writing creative, humans can still start from scratch and come up with something new, and I think that's going to continue to be very important in advertising.

20:00 - David Brown (Host)

Yeah, it's true. And I think the other interesting thing is, you know, the AI is designed to never give the same answer twice. It's just a big predictive model, and it works at such a scale now that it has billions and billions of variables and options it can choose from on sort of what direction to take.

20:21

But then there's a randomness coefficient that they put in there. So it says: don't always choose the most likely next word, choose something different sometimes. And that's what makes it feel human, because, I've read some stuff about it, and they basically say if you don't have that randomness added, it will just give you exactly the same answer every time, and it's super boring and really, really bland, and no one engages with it.
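The "randomness coefficient" David is describing is usually called temperature. Here is a toy sketch with made-up word scores, not how any particular model is implemented: at low temperature the sampler almost always picks the likeliest word; at higher temperature less likely words get through.

```python
import math
import random

def sample(logits: dict[str, float], temperature: float,
           rng: random.Random) -> str:
    """Sample one word from temperature-scaled softmax weights."""
    scaled = {w: math.exp(score / temperature) for w, score in logits.items()}
    total = sum(scaled.values())
    r = rng.random() * total
    for word, weight in scaled.items():
        r -= weight
        if r <= 0:
            return word
    return word  # guard against floating-point residue

# Invented next-word scores after "The weather today is ..."
logits = {"great": 2.0, "fine": 1.0, "weird": 0.2}
rng = random.Random(0)
cold = [sample(logits, 0.1, rng) for _ in range(20)]
warm = [sample(logits, 2.0, rng) for _ in range(20)]
print(set(cold))  # essentially always "great"
print(set(warm))  # a mix, the "human" variety
```

Temperature near zero reproduces the same bland answer every time; turning it up is exactly the trade-off between consistency of voice and feeling alive that the conversation goes on to discuss.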

20:51

So you need that human factor of being different all the time and coming up with something. But what that leads to, I think, building on what you're saying, is: let's say you're working for a brand, or you're a startup and you've got a new company, and you're trying to establish a brand for yourself, and a voice, and all of that. If you're using AI a lot, I suspect you might not get that consistent voice, because it's intentionally being random, and some of the things you want in there that are part of your brand and your sound don't get put in. What I'm worried about is that people rely on it too much, and they think it's just going to be right all the time, and they're not thinking about that aspect of it.

21:40 - Steve Dunlop (Guest)

Yeah, for sure. And so that leads to how we use LLMs and generative AI: we use them right at the top of our creative process. We only start working when a brand issues an RFP, a request for proposal. For example, we just started with Dunkin' Donuts. Three months ago they gave us a brief that said: here's how we want our brand to sound, here's the messaging we want to get across, here's the vibe we want to have, and here are the things we care about in terms of weather and location and all of these other things. We then went ahead and wrote a response to that brief, and to help us generate some thoughts around it, we used ChatGPT, as that was the tool we used then. We use various different tools all the time, just to give us a breadth of ideas, but we've already been briefed by this point. So the brand already has a preference, and that was the point I was making earlier.

22:40

The brands we work with, you know, Dunkin' Donuts, don't need a new tagline. America runs on Dunkin'. They do not need a new one, so they don't need to go and ask ChatGPT, hey, can you give us a new one? No. And also, they've been doing this for a long time. They have brand equity in the words, the colours, the font, everything that they use, the way their stores look, the way their coffee tastes. So they don't need to start being kind of out-there creative at all. But for specific implementations of their campaigns, having something like that is like having another person in the room.

23:13

You know, imagine a writers' room with five people around the table, and you're always like: I love it when Sarah's in the room because she always comes up with an idea about something, and I love it when David's in the room because he always throws in this. And now we've got a kind of robot in the room, and it can chuck in a few more little irons in the fire. But at the end of the day you then go: right, which one matches the brief the most? Which one will have the best outcome for the brand? Which one matches the audience we're trying to reach? So we'll have the robot in the writers' room with us, but still make very human decisions about how we respond to the brief, and then how we write a script and go and record it.

23:51 - David Brown (Host)

Have you… I've had this thought before, when the doctor was talking about AI and medicine, and again, I've talked about this loads of times, so if anybody's listening they know the story. But basically they were using AI to look at mammograms to see if they could find breast cancer, and in the beginning they had all these problems with it because it wasn't very accurate, because it was misdiagnosing all these cases and stuff like that, and so they were really worried that it wasn't going to work. And it wasn't until four years later that they realised it actually was correct, because it was seeing the cancer so much sooner than they thought was possible

24:30

that they thought it was wrong. And so, if you take that into other areas: sometimes, if I ask it to help me write something, like I'm doing this new podcast network thing, I use it, exactly like you said, to try and get some inspiration and to ask what things I should think about. It's very good at giving lists of things you might want to consider, and sometimes it comes up with stuff I never thought about, or that I'd forgotten about, which is genius, and I love it. But if I ask it for copy, sometimes it writes things that I wouldn't write. And then there's this little thing in the back of my head that says: but is it actually right?

25:08 - Steve Dunlop (Guest)

Yeah, yeah. Like, would that copy actually… Am I reverting to my old biases or traits, and actually this is trying to push me ahead? Yeah, for sure.

25:16 - David Brown (Host)

Exactly. And I don't have enough audience yet to A-B test, but as soon as I do… I mean, I'd love to work on a show that had millions of listeners, where I could A-B test, you know, the human messaging versus what you say to the AI.

25:31 - Steve Dunlop (Guest)

Well, I'm sure that's happening around the world. You know big publishers are doing that all the time.

25:34 - David Brown (Host)

They're certainly testing headline writing, whether that's… I just haven't seen any results of it yet. And I'm part of this other group that's all based around AI, called the AI Collective, if anybody listening wants to go find it. After we get off, this is Friday, Thursday, the 18th of Jan, I'll go post something on my LinkedIn, on the channel, so if you want to sign up, you can go and sign up there. But the point is… now I've lost my point. I got distracted.

26:15 - Steve Dunlop (Guest)

The AI coming up with ideas that pushes you out of your comfort zone, and A-B testing with big audiences.

26:22 - David Brown (Host)

Yeah, I don't know.

26:23 - Steve Dunlop (Guest)

I lost it. Okay, we'll cut this bit out.

26:27 - David Brown (Host)

No, I leave all these embarrassing bits in because it makes me try and be better next time. I'll remember it in a minute. So another reason that I wanted to have you on and if you remember when I reached out was because you'd been on a podcast another podcast recently and I can't remember the name. So I'll rely on you to give a shout out to that podcast as well.

26:50 - Steve Dunlop (Guest)

It was called Ad Tech Heroes. Someone somewhere thinks I'm an Ad Tech hero, which is very nice.

26:57 - David Brown (Host)

Absolutely. You're one of the big UK success stories in ad tech as well, so that's a cross you have to bear, Steve. But one of the things you said in there that I thought was really interesting, and you've touched on it a little bit here, is that it almost sounds like you've changed your story a little from when we spoke originally, when we were talking about voiceovers. And I'll let you tell it, but basically you were saying that the voiceover industry wasn't really at too much risk, because it's such a small portion of a budget. That was basically what you were saying, right? Yes, for sure.

27:32 - Steve Dunlop (Guest)

So again, using Dunkin' as an example: they've had a voiceover artist doing their creative for years and years, and they still want that voice on their new creative. That person will be on a retainer with them. So on a per-campaign basis, if Dunkin's spending… actually, let's not talk about Dunkin', just in case; I don't want to talk about their numbers or anything like that.

27:54

Let's say an example client might spend $10 million a month on a campaign, let's call it a digital audio campaign. Of that $10 million, probably roughly 70 or 80% will be on the media. So that will go to Spotify, Pandora, iHeart, the places where they want to go and buy the inventory, the actual spots that then get played out to listeners. So that cost would be there whatever happens. Even if you were just playing one version, the same version to everybody, you'd still have to pay the media cost. And the more you want to target that media, the more you want to add data to it, et cetera, or the more premium the audience is, the more the price of that media will go up. And normally what happens if the price of the media goes up is the number of impressions you serve goes down, because your overall budget stays the same: $10 million I've got to spend, so if the unit cost is more expensive, I'll just buy fewer units.

28:45

So then there's the remainder of the budget to spend with people like us to add dynamic creative to the solution, and then there's a creative cost. So right now we need to go and record it; we need to use a voiceover, or maybe we use synthesised voice. But of that $10 million, we're talking probably in the order of a maximum of $5,000 to $10,000 for the creative. That's how much it costs to hire a studio and an engineer, the voiceover's time, the licensing cost for the music; add all that up and you'd be lucky to push $5,000 to $10,000. So, on a $10 million budget, if you're the person at Dunkin', are you really going to care? Within that, the voiceover cost is probably $500 to $800. If that went down to $50, who cares? We're talking decimal places here. So there's no downward pressure from the buyer to make sure the voiceover cost goes down, for the kind of clients we work with, who are spending $10 million a month on their campaign, which includes, as I said, a load of media. So no one is saying to us: oh, please can you reduce that cost.
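Steve's numbers can be worked through directly. All figures below are the episode's rough examples, not real client data:

```python
# Illustrative monthly digital audio campaign, per the episode's examples.
budget = 10_000_000          # total monthly spend
media = 0.75 * budget        # ~70-80% goes to Spotify, Pandora, iHeart, etc.
creative = 7_500             # studio, engineer, voiceover, music licensing
voiceover = 650              # roughly $500-$800 of that creative line

print(f"media:     {media / budget:.1%} of budget")
print(f"creative:  {creative / budget:.3%} of budget")
print(f"voiceover: {voiceover / budget:.4%} of budget")
# The voiceover line is thousandths of a percent of the budget:
# cutting it to $50 saves nothing the buyer would notice.
```

The point of the arithmetic is that cost pressure on voiceover is negligible at this scale; the pressure that does exist is on workflow and turnaround time, as the next section explains.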

29:53

The other factor of our business is that if campaigns are big enough, over a certain minimum, we'll actually deliver the creative as a managed service included in our cost. So the client doesn't even see, doesn't even break out, how much we've spent on a human voiceover versus any other kind of provision of voice, like a synthesised voice. So there's no downward pressure from a cost point of view. However, there is pressure from a workflow, productivity, time point of view.

30:25

If you want a voiceover to go and read 14,000 store addresses and names, that's a significant amount of time in the studio and, frankly, it's really boring. To get a human to do that, you'd probably have to do it in three or four sessions. By the third session, the human might have got a cold because it's raining today, or they might have just smoked a cigarette, so their voice sounds different, et cetera. All these kind of random human elements come in. So that is a perfect use case for synthesised voice.

30:57

The first one we did, by the way, was in:

31:21

There's been a massive development, an increase in quality and speed and so on, of synthesised voices. But in that time not a lot has changed in the workflow. So if we wanted to do 14,000 stores again, absolutely we'd use a synthesised voice to do it, but we'd use the human voice actor at the start and at the end, so the bit in the middle doesn't feel like it's standing out on its own. So there's no price pressure, there's a little bit of workflow pressure. But I think the biggest opportunity for synthesised voice in advertising is where the list is unlimited.

31:53

So if you imagine there's 14,000 stores, that's still only 14,000; we can see when we get to the end. But if you imagine tomorrow's news, we don't know what that is yet, so we can't go and record it today; we can't get a human to record it so it's ready to play tomorrow. If you imagine stock prices, there's a very large number of stock prices, not infinite, but large enough that you wouldn't be able to record them all ahead of time. Sports scores, too. So there are really good use cases, all of which would be really valuable in an advertising context, to make the ad sound more aware of its context, more aware of its real-time position in the world, and that's where synthesized voice, I think, will have a really good use case for our kind of advertisers. The other end of the tail, actually, looking at where most social advertising comes from, for example, is

32:46

not the big advertisers; it's not the blue chips, the Fortune 500. It's the local bakery or the local car dealership or the local coffee shop that wants to spend 50 bucks on a social payload that's often just a photo and a bit of text, which is very easy to create and obviously even easier to create using AI. But for audio there's always been this kind of hump, this minimum cost, to get over. I always joke that there's Bob's Bumpers in Boston, and if Bob wanted to go and record an audio ad he's got to spend that voiceover cost, that $250 to $500. And his budget was only $1,000, so he's already spent half of it on his creative. Whereas, remember, in the $10 million budget that $500 was still more or less the same, so it was a much smaller proportion of the overall; for Bob it's half.

33:37

So AI has an absolute role here, where that $500 becomes $5, $10, $15 for some kind of license that then creates an AI-generated version of an audio ad, and then of your, what did we say, $1,000 budget, $990 of that is being spent on the media, not on the creative. So in those use cases I think synthesized voice and generative AI are very helpful. Over time, I think the distinction between those two advertising sets, the Fortune 500s versus the Bob's Bumpers of Boston, will blur and merge. Big blue chips will find a use case for the Bob's example, and the Bob's example will find use cases for premium voiceover talent as well. So I think there's definitely going to be a blurring. It's not happening quickly, because all this stuff's been out there for a long time; now we're just all talking about it more.
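
The proportions in this argument are easy to sanity-check. A toy calculation, using only the illustrative figures mentioned in the conversation, none of which are real client numbers:

```python
# Creative cost as a share of total ad budget, using the illustrative
# figures from the conversation above; not real client data.

def creative_share(total_budget: float, creative_cost: float) -> float:
    """Return the creative cost as a percentage of the total budget."""
    return 100.0 * creative_cost / total_budget

# Big brand: $10M budget, ~$5,000 total creative cost ("decimal places").
print(f"Big brand:   {creative_share(10_000_000, 5_000):.2f}%")

# Bob's Bumpers: $1,000 budget, ~$500 voiceover cost (half the budget).
print(f"Bob today:   {creative_share(1_000, 500):.0f}%")

# Bob with a ~$10 AI-generated ad: almost everything goes to media.
print(f"Bob with AI: {creative_share(1_000, 10):.0f}%")
```

The asymmetry is the whole point: 0.05% of a big-brand budget versus 50% of Bob's, which is why the cost pressure comes from the small end of the market.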

34:32 - David Brown (Host)

I think it's been out there for a long time, but I also think that the voice technology has got so much better. I went on a rant not too long ago about this, but the instructional videos on YouTube that are obviously just...

34:50 - Steve Dunlop (Guest)

I heard your rant about that, David. You were very specific about your hate for these things.

34:58 - David Brown (Host)

Yeah, and it's because they're terrible. They're absolutely terrible. But what you were talking about: I spoke to a journalist in Poland, Joanna, and she was talking about how, in the newsroom, it's the same numbers you're talking about. It's the stock prices, the weather, the traffic, the sports scores. A lot of the stations are automating that now, because it's very dry and easy to automate. You don't need a lot of personality or a lot of context around it, and it's very templatable.

35:43 - Steve Dunlop (Guest)

We've been doing it for 50 years, so there's a lot of learning, a lot of training data. To be honest, again, you don't need AI to do this. There was a company I met really early on in my A Million Ads days that was doing effectively a templatized story generator using stocks, and it had if-this-then-that rules in it: if the stock has gone up, use the sentence "A great day for [insert stock name]"; if the stock has gone down, say "Disappointing news for [insert stock name]". So they were creating a rules-based version of a kind of templated story; in their case it was a text story.

36:20

I think it was just to plug in, you know, push out a story about each individual stock and how its performance has done. So it's a challenge again to say it's a great use of brand-new technology; it's kind of not, really, because we've done it all before. It's just inserting the right stock into the right template. I don't know what your previous guest's conclusion was, but that sounds pretty dry and pretty dull.
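
The if-this-then-that approach described here can be sketched in a few lines; the sentence templates and stock figures below are invented for illustration, not taken from the company Steve mentions.

```python
# A minimal rules-based ("if this, then that") story generator in the
# spirit of the templatized stock stories described above.
# Templates and price moves are invented for illustration.

def stock_story(name: str, change_pct: float) -> str:
    """Pick a sentence template based on the stock's daily move."""
    if change_pct > 0:
        return f"A great day for {name}, up {change_pct:.1f}%."
    if change_pct < 0:
        return f"Disappointing news for {name}, down {abs(change_pct):.1f}%."
    return f"A flat day for {name}."

# One templated story per stock; no machine learning involved.
for name, change in [("ACME", 2.3), ("Globex", -1.1), ("Initech", 0.0)]:
    print(stock_story(name, change))
```

Which is the point being made: plain branching on a data feed covers this use case, no AI required.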

36:45 - David Brown (Host)

It is. I think there's a whole TV station that, basically, is all AI-generated: it's avatars, not real people. Well, they do have some real people. They'll have a live presenter, but then that presenter might be talking to an avatar, and in some instances they have two avatars talking to each other. It's pretty crazy.

37:10 - Steve Dunlop (Guest)

Is anyone watching it? It's a TechCrunch article, it's a funny anecdote to talk about with computers. But think of the real people who are getting up in the morning, turning on the TV to get their kids to school with a bit of news on in the background, and it's Sky News or BBC Good Morning or whatever it is. Does anyone want to go to channel 387 and watch, or flick open their laptop and go to avatarnews.com?

37:35 - David Brown (Host)

Yeah, I don't know.

37:36 - Steve Dunlop (Guest)

I don't know. I genuinely don't know, and I'd be surprised if we saw that.

37:41 - David Brown (Host)

I think a lot of people would actually be surprised to know that most radio shows are recorded ahead of time. It only takes about half an hour for the host to record the show, and it's all pre-recorded and just uploaded; the music, the ads, everything is automated.

37:58

Maybe this is my age, but when I think of radio, I still think of a guy sitting in a booth with a microphone, like LBC, because you can watch LBC at the same time. I've been in there when James O'Brien was on, and there's someone physically sitting there in front of a mic, talking to people, and then the ads come in. I think that's what most people envision, but for, I would guess, the majority of radio stations, that's not actually how it works now. It did 20 years ago, but not anymore.

38:34 - Steve Dunlop (Guest)

Have you listened to Spotify's XDJ?

38:37 - David Brown (Host)

Yeah.

38:39 - Steve Dunlop (Guest)

It's interesting. I was actually quite pleasantly surprised by that, bearing in mind my ears are tuned to, as you say, the kind of old-school pros and cons of radio. I think they used a lot of the tricks and techniques that we expect from radio, but in this generated voice. Firstly, the voice, I thought, was very good.

39:04

They've rolled this out across the world now. It was in the US first, but I think it's worldwide now. The voice is very good, firstly. Secondly, it talks about the music that you've listened to. Unfortunately, my Spotify is utterly messed up at the moment, because my children listen to the Numberblocks soundtrack and the Bluey theme song whilst I'm trying to listen to what I want to listen to. So I'll get the Stone Roses followed by the Bluey theme song, and the poor old AI DJ is going, Steve, here's a song you played a lot last year: it's the Bluey soundtrack. Oh my God.

39:39 - David Brown (Host)

Family Glenn. Yeah, it's really interesting. I had a DJ on last week who started off DJing on vinyl and still prefers vinyl; he's very old school about the whole thing, and we talked about this a lot. One of the comments he made: he said he was talking to a friend of his, and they said, oh yeah, you know, the Spotify DJ. And he said, have you listened to my show lately? And they said, no, no, I just listen to Spotify, because the DJ on there is amazing. And he's like, but I'm a DJ. And they're like, yeah, but it's better. And he was like, yeah. And it's infinitely variable.

40:15 - Steve Dunlop (Guest)

Right, they can say anything and play any music, whereas your guest has a record box with 100 records in it. That's his finite set to choose from, and his personality and what he can say is the stories in his head. So it's definitely a scale challenge for humans now against that kind of service. I was ready to listen to it and feel it was quite impersonal, like a robot talking to me. I didn't feel that at all, actually. There was a niceness to the turns of phrase, the language used. Again, the voice was great, and, you know, the tricks from radio: being up close to the mic, so it feels like you're very close in someone's ear, the speed you talk at, the volume you talk at. They've obviously done a lot of testing. If people haven't tried it, it's called X DJ, or AI DJ, within the Spotify app; worth having a listen.

41:14 - David Brown (Host)

Yeah, for sure. So we've talked a lot about how you've been using it and how you use it every day in your business. Where do you see it going from here, particularly thinking about ad tech, the advertising industry, and your radio and digital platforms? Do you go to smart TVs and everything? Can you push ads into other tools? I mean, YouTube has its own advertising environment, and I assume you can't do anything there. Is it just audio that you work with? I guess there's a couple of questions there. So, yeah: where do you see it going from here in those environments?

42:04 - Steve Dunlop (Guest)

So we're audio only. We had a video product for a while, and it was pretty good actually, but then COVID happened and, various strategy shifts later, we're focusing just on dynamic creative for audio. There's definitely a space for us to operate here. We've got great partnerships with our audio publisher partners and the ad tech ecosystem within audio, so we're doubling down on audio, for sure. But dynamic creative as a thing is obviously deployed in every other advertising channel, or at least every digital advertising channel; it's quite hard to do in old-school press, for example. But even out of home now, with the number of digital screens, a lot of it is dynamically created. Connected TV, for sure: there are players who are starting to build dynamic creative video ads that can be played out, so different homes see different ads based on data about that home. That will only get more prolific.

43:10

The bit that's missing in audio, and I think this is relevant when we talk about where this is going, is the O in DCO.

43:17

Actually, funnily enough, back in:

44:26 - David Brown (Host)

Oh, we were doing that back in:

44:29 - Steve Dunlop (Guest)

Right. So DCO has been around forever. In audio it's still hard to do that O piece, because the signal strength from the listener is quite weak, and a lot of that is purely a function of audio. We are often listening on earbuds, and let's say the ad says, tap your screen now to purchase the thing, whatever the call to action might be. But your device is in your pocket, or the screen's off, or you're scrolling through Instagram and the app that's playing the ad is in the background. And that's just one use case, where you're on your mobile phone. Imagine you're on a Sonos device: in our office here we have 20 people sitting in a room listening to Sonos, and when the ad comes on and says, tap your screen... where's the screen? And there are 20 people in here. So audio has these innate measurement issues, which means it's quite hard to optimize against. So what

45:28

we do is what we call pre-optimize: we know you're more likely to be feeling a preference for breakfast items at a certain time, Monday to Friday, than for lunchtime items. So we pre-optimize our script, which means you don't need to run 50 versions of a script to tell you that people want to hear about breakfast items at breakfast time. There's some common sense we can apply here. Over time, what AI will help us with is things like attribution modeling, where, without those really clear signals from audio, we can still detect or infer which version of an ad has had the most impact and been most effective, and then we can start to optimize. Maybe not from us: we partner with everybody, all the measurement people like Nielsen and Foursquare and Ninth Decimal, everyone who's doing attribution modeling and footfall and traffic modeling and all that stuff, and so with any of those signals

46:29

I think over the next few years we'll start to see optimization come in a bit more prevalently, and what AI helps with is spotting patterns in measurement data to be able to make predictive suggestions for optimization. Within display and social, there are a few companies who've just started to do this kind of stuff, so I think you'll see that come on.
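
The "pre-optimize" idea, choosing the script version from common-sense context rather than A/B testing 50 variants, might look like this in miniature. The dayparts and copy lines are hypothetical, not A Million Ads' actual rules.

```python
from datetime import time

# Hypothetical daypart rules for pre-optimizing an ad script:
# breakfast copy on weekday mornings, lunch copy around midday,
# a generic line otherwise. All thresholds and copy are invented.

def pick_script(weekday: int, now: time) -> str:
    """weekday: 0 = Monday .. 6 = Sunday."""
    if weekday < 5 and time(6, 0) <= now < time(10, 30):
        return "Start your day with our breakfast menu."
    if time(11, 30) <= now < time(14, 0):
        return "Grab a lunch deal near you."
    return "Visit us any time today."

print(pick_script(1, time(8, 15)))   # Tuesday breakfast slot
print(pick_script(6, time(8, 15)))   # Sunday morning: generic line
```

The point of pre-optimization is that these rules are written up front from common sense, rather than learned from the weak post-hoc signals audio gives you.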

46:51 - David Brown (Host)

Yeah. I went to an event a couple of years ago where the head of data science for Number 10 spoke. They have their own data science team, obviously, to look into data for whatever the Prime Minister wants, to develop policy and those sorts of things. She said they'd been using an AI data analytics tool "for quite some time", was how she phrased it. What was interesting was her point that most of these tools are a black box: if you ask them to analyze something, you don't know how they came up with the answer, they just give you one. But the tool they have, she said, has been amazing, because you can take enormous data sets, say, here's the analysis I want to do on this data, and let it run.

47:45

She gave an example of one of her team setting off an analysis, something for the NHS or whatever. He basically set off the query, went to lunch, came back half an hour later and it was finished. She said that would probably have taken him a couple of weeks to do manually.

48:03

And the answer came back. The really interesting thing about the tool, she said, is that it gives you a full log: what statistical analysis and what algorithm it used, what variables it found were important, and why it found those variables important. It basically gives you the whole reasoning for everything it did, so you can go back and double-check it if you want. And she said, we did in the beginning, but we don't bother anymore, because we've never found a problem, and in fact sometimes it does things in a way we never would have thought of, which is much faster and easier. So, she said, we've learned from the tool anyway.

48:42 - Steve Dunlop (Guest)

But yeah, apply that to advertising. There's plenty of data of all sorts and shapes and sizes, so being able to analyze it fast and then use the output to optimize creative? Brilliant.

48:59 - David Brown (Host)

Going back to your example about reading 14,000 addresses: that's not something I'd ever thought of, but it totally makes sense. That's the perfect use for it. And phone numbers, all that sort of detail information. If you can slip it into a human voiceover in the same voice, chances are 99 percent of the time people aren't going to notice the difference, because the voice is so close, and that's a fantastic use. I think that's where the power of AI is initially going to come from for big business. Big business is going to use it to automate a lot of those really annoying, time-consuming tasks. Like you said, I would turn down the contract if I was doing voiceover and they came and said, you need to read 14,000 addresses. I'd be like, no, thank you.

49:56 - Steve Dunlop (Guest)

Yeah, a lot of copy is required. Do you know what I mean?

50:00 - David Brown (Host)

And I can see where that's going to hugely improve efficiency in business and that sort of thing, and I think that's really where it's going to start to get its foothold. I think there was an article that came out yesterday; Microsoft has been super active in the last week. I've posted probably four or five articles specifically from Microsoft talking about AI and how it's going to impact business. There have been stories in the FT and Forbes, everybody talking about it; the IMF came out and said AI is going to affect 40% of jobs over the next couple of years.

50:38

And with all these stories, what we're starting to see is big business starting to cotton on and saying, actually, maybe this tool is good for some things that are taking a long time, slowing everything down and taking a lot of manpower. So I think it is slowly going to start to erode in, but it might not be what we expect. I think initially there was a knee-jerk reaction and we thought, oh my God, all the creatives are going to be out, PR is going to be out the window, we don't need a CMO anymore and all that sort of stuff, and that's not where we're actually seeing the most traction.

51:12 - Steve Dunlop (Guest)

Where we're seeing the most traction is in doing really dull, boring stuff like reading 14,000 addresses. Yeah, and I think that's certainly where we're starting, and I think there'll be a sliding scale over time of using it more and more. But what do we want here? At the end of the day, we don't want humans to be doing boring tasks. So if the person that was doing the boring task isn't doing it anymore, let's hope they go and do something more interesting, like spend time with their family, or watch an algorithmically generated news channel, or go painting by the river.

51:52 - David Brown (Host)

So I have a couple of questions that I ask everybody, but this one is my new one for the year: are you polite to it?

52:03 - Steve Dunlop (Guest)

Do you say please and thank you and those sorts of things when you deal with ChatGPT or whatever, or is it just a machine and a tool, and you'll just ask it what you want?

So the interesting model here is how I speak to Alexa versus how my children speak to Alexa. Which, in the end, isn't quite the same, because, as we're all working out, Alexa is quite a way behind being a large language model; it's a rules-based response engine, and I think Amazon have suddenly gone, oh whoops, we need to make it into an LLM. Certainly some of the voice interactions you can do with ChatGPT are more sophisticated than Alexa's. However, I think the human-machine interface through speech could help answer your question. I will always say, Alexa, play the Beatles, and off we go. You know, the number one use case of all Alexas is to play an entertainment service, and more often than not it's music, or it's radio: play Capital FM.

53:02

The second is asking the time, the third is asking the weather. So these incredible devices have been in our homes for the last decade, and we're still just saying, put some music on, please. I will say, Alexa, play the Beatles, and if it plays a song I don't like: OK, Alexa, next. My kids, though: Tristan will go, Alexa, shut up. And I'm like, really? Whoa, hang on. He's like, it's only a computer.

53:31

So my brain just goes, well, be nice, be kind, just as a human, whatever you're talking to, be nice. But he's seen it for what it is, and it's really interesting that he goes, well, why do we care how we talk to it? There was a story, quite a while ago now, that I read about how voice-activated devices can change their answers based on how they're asked.

54:00

So if you do say, Alexa, shut up, then the answer could be, of course, I'll turn it down if you ask nicely, and it starts to preempt, or at least have an interaction. Just a slight tangent to this: I think it's been fascinating to see how these assistants are named. Alexa is a female name. Siri is a female name. Bixby is some kind of odd, maybe foot-servant name that Samsung chose to call their assistant.

54:38

Google chose to say, OK, Google. And I think there's a big mental leap here between talking to a thing that has a female name versus a thing that has the name of a company and a weird kind of

54:57

activation thing. And ChatGPT, of course, couldn't be more technical.

55:02

GPT is not a consumer-facing brand name; it's an acronym for the actual technology used in how LLMs are built: transformers. And I think they've got that wrong. Certainly, if they want us as humans to feel comfortable interacting with this thing, they've got to learn from the path we've already been down here. The most successful voice-activated device is Alexa, and I would argue it's actually called Amazon Echo, but everyone calls it Alexa, because we say "her": in our house we ask her to put some music on. Humans like familiarity; they like feeling like they're interacting with other humans. So whatever happens next with these kinds of models that we talk to, whether that's with voice, or through chatbots, or through prompts, or whatever, giving it some kind of personality, calling it something personal, will help.

It's interesting, because the question I asked all last year was: when you think of AI, do you think of it as being male or female?

56:09

What does ChatGPT say if you ask? I am neither, I am an "it". Actually, we did ask that. My kids asked it, and it said, I'm neither, I'm a computer, and my pronoun is "it".

56:19 - David Brown (Host)

Yeah, it says something like that. But my thought always was that, traditionally, through sci-fi and all that sort of stuff, it's always been a female voice. I think it's because it's less threatening, generally speaking, because then the next step is you put it in a robot; they always conflate the two things. So you end up with this Terminator thing, and you start going, OK, well, if you've got this massive, intimidating-looking robot and you put a male voice in it, that's scary as hell. Whereas if you soften it a little bit, give it some more feminine shapes and make it a little softer-looking, it's not so threatening, even though it can still kill you just as easily as the other one. But there's a psychology.

57:04

You're absolutely right, there's a psychology behind that and it's very interesting.

57:09 - Steve Dunlop (Guest)

And how media and entertainment have trained us to think about what a robot looks and sounds like, because we've all watched Star Wars and seen the androids, and actually a lot of the androids in Star Wars are either genderless or have male or female voices. I think this is all to play for, right? We shouldn't assume how we're going to react. If a robot walked past the window right now, I don't know how I'd go. Brilliant, they've arrived, now come and help me with my heavy lifting? Or, oh my God, don't talk to it? I don't know.

57:43 - David Brown (Host)

We're just learning. Yeah, I think it depends. Did you see Elon Musk's tweet where his robot was folding T-shirts?

57:52 - Steve Dunlop (Guest)

Yeah, I mean, it's incredible. His worldview is that we'll all have nothing to do within 10 or 20 years, whatever; his company will be the most valuable in the world, we'll have nothing to do, and actually we'll spend that time having fun with each other, eating and drinking, watching movies and skipping down the beach, because all of the boring, mundane stuff, as we were saying before, will be done by robots, whether that's computational robots or physical robots, or obviously both. Yeah, some of that's quite appealing.

58:25 - David Brown (Host)

So when you have a personal assistant that's a fully AI personal assistant, what are you going to name it?

58:37 - Steve Dunlop (Guest)

I'm just going to cough, excuse me. So another little story I'll tell you as an aside: we have a car that you can name; on the screen you can type in the name you want to call it. When we lived in the US we called the car Blastoff. The kids named it, because it had nice acceleration and it kind of looked a bit like a rocket, and kids like rockets. So we called it Blastoff. Then we moved back to the UK, so we had to sell the car in the US, and we bought a new car in the UK and had to name this car. So we've had a very recent conversation about what to name an inanimate object. Of course, when you give that task to an eight-year-old and a four-year-old... you'll never guess. Our car is called Rainbow Dave. So when you ask me what I'd call my virtual assistant, unfortunately the first thing that comes into my head is Rainbow Dave. I love it.

59:31 - David Brown (Host)

That's awesome. Kids are great at that kind of thing. The reason I asked, and just to go back, I don't think you and I talked about this before, and if we have you can stop me, but the reason I asked whether you should be nice to AI or not was that I read an article by a lady who said she uses it to help her with code. If she's writing code and has an issue she can't get past, a lot of times she'll paste it in and ask it for help to fix or diagnose the problem: what's wrong with this code, why doesn't it work? And then it gives her the answer, and she says she feels gratitude but doesn't know what to do with it. Which is a super empathetic response; I would never have thought about it that way or communicated it that way, but I know exactly what she means, and she said it perfectly. You ask it for something and it helps you.

::

Yeah, brilliant. Or you thought of something that I didn't think of, or you've solved that problem for me. You've added utility.

::

And if it was a person, you'd be like, oh my God, thank you, that's amazing, I'm so glad. You could express that; you'd want them to feel good because they helped you. And then you've got this thing on the screen and you're like, what do I do with that?

::

Yeah. Just as you were talking, I was thinking of it at the prompt level: asking with nice prompts. Clearly, every prompt that we write and every answer that it gives is going to be training the next iteration of the model. So if we start writing prompts in a nasty way, that will train the model that nasty prompts are OK, and then it might start spitting out nasty stuff because of that training feedback loop. Ideally, we still want to be treated nicely, like humans. We don't want robots shouting at us, so I think we should speak nicely to the robots so that they speak nicely back to us once they've learned through that feedback loop.

::

Most people I talk to agree, and my thought process is: when the AI starts to take over, I want it to go, but that guy was always nice to us.

::

But that guy said please and thank you.

::

That guy over there. He was really mean to us most of the time, so he's going to go first.

::

That lady said thanks when I gave her a good code prompt.

::

Exactly, exactly. Excellent. Steve, thanks very much for your time today. My pleasure. Is there anything else you feel is burning about AI that you wanted to talk about?

::

I have equal measures of cynicism and excitement about it. I think the cynicism comes from, oh my goodness, how many times do I get asked every day about what my business is doing in AI? Not with you: we've got into the detail and you've asked the intelligent questions around it, rather than the kind of soundbite questions. But it can get frustrating, particularly if the person asking just wants the soundbite so they can say, yep, that company does AI, tick, and put us in a category on a spreadsheet somewhere. I think that devalues the overall product and what we've been doing for ages, and there's a little bit of snake oil around all this stuff. So that's where my cynicism and frustration comes from.

::

And there certainly are companies and ideas and products out there, like your training videos, that are just headless. That is utter bullshit and wastes everybody's time; it's friction in the process and devalues the industry for everybody else, for all the humans. The excitement, though: I'm bullish on AGI, and I think we'll get there at some point. I must have told you about this before, but maybe your listeners won't have heard of it. There's a blog called Wait But Why, and about five years ago, so it's quite an old post, in three sections, this guy, I'm now blanking on his name, Tom somebody, wrote an essay about artificial intelligence, long before LLMs and ChatGPT and so on. He used all these little stick-figure pictures of people to explain what are otherwise reasonably complicated concepts.

::

And he uses a staircase, and I know this isn't necessarily his, but he redraws it with his lovely little stickmen. Imagine there's a staircase of intelligence: humans are on step seven, on step one is an ant, and on step six is a monkey.

::

And we look down at the monkey and go, oh, mate, you'll get there. Look at you, bless you, using your opposable thumbs to make little tools so you can eat nuts more efficiently. Great. And we look down at the ant and go, oh, I didn't even notice you there, pal, while I'm building this road through your home. AI today is probably on step one or two, and maybe with an LLM we're getting to step three or four. When we get to AGI, that's when AI joins us on step seven. And then there's ASI, which is superintelligence, and actually this will happen very quickly: as soon as you get to AGI, there's nothing to stop the computer, other than energy and compute cycles, from getting up that staircase and suddenly being at step a thousand, and then next week step 10,000, and the week after step infinity.

::

So if you use that analogy of us on step seven, and we ain't going anywhere, we aren't going to get to step eight, not a chance, a billion years of evolution would need to pass before we go up there. We're looking down at an ant on step one. Now imagine the computer that's on step 10,000 looking up at us on step seven. The deference that we have for the ant: what are they going to have for us? So that's the first model that he proposes in this blog, and he's kind of sitting there going, shit. And then the other bit that I really like, and it's definitely worth a read, is he has a two-by-two. I'm an ex-consultant, I love putting arguments on a two-by-two. One axis is: will AI be benevolent or will it be aggressive towards humans?

::

And then the other axis, the y-axis if you like, is: how soon will we get to AGI or ASI? And then on this two-by-two he plots all the big thinkers about AI. Where do they sit on that chart? Bill Gates, Elon Musk and hundreds of others, you know, academics and people who've written research papers, people who know way more than us about AI and have been thinking about it way longer than we have. Where is their general perspective? I.e., is it going to happen soon and be benevolent, or is it going to happen in

::

a long time, but when it does come, it's going to be nasty.

::

And the point is that no one knows, right. And actually I'd love him to redraw this grid in the light of all the recent LLM development, because you could probably argue that everyone who thought it was going to be 50 years out, you could bring that in a little bit now, based on the developments that have happened more recently. Anyway, I'll leave you with that.

::

It's a good little bit of offline reading to go and do. I found it very helpful for forming my own view on AI in general, but also for understanding what might be called AGI. All the OpenAI shenanigans: are they developing tools too fast or too slow? Should we have a general pause? Who owns the model? Who owns the training set? All that stuff. It's really helpful to have the general landscape in your mind, and I found this very helpful for doing that.

::

So, yeah, thanks for that. That's amazing, and I'll put a link in the show notes.

::

Did you find it all right? Wait But Why.

::

Yeah, I found it already. Great, so that's cool. So, yeah, I'll drop a link in. And if there are any advertisers out there who are looking for dynamic audio content and have 10 million pounds a month to spend, I'd love it if that was all with us.

::

I mean, we take a fraction and the rest is spent with the publishers, but yeah, come to amillionads.com.

::

That's it, Steve. Thanks very much. I appreciate your time today. It was really, really interesting, and you brought up some things that I hadn't thought of before, some positive uses for AI, some good examples that I can use in the future and I think maybe other people can as well. Examples that aren't quite so scary, where maybe people go, oh, I never thought about using it like that. So no, that's been amazing. So thanks.

::

Real pleasure, nice to see you again.

::

Yeah, I'll speak to you soon. Thanks, Mike. Bye bye.
