Destroying the World
Episode 38 • 4th March 2026 • Cognitive Engineering
00:00:00 – 00:40:30


Shownotes

A few things we mentioned in this podcast:

- Mirror life https://edition.cnn.com/2025/10/17/science/mirror-cell-life-dangers

For more information on Aleph Insights visit our website https://alephinsights.com or to get in touch about our podcast email podcast@alephinsights.com

Transcripts

Fraser McGruer:

Hello and welcome to the Cognitive Engineering podcast, brought to you by Aleph Insights and produced by me, Fraser McGruer. I'm here with Nick Hare, Chris Wragg and Peter Coghill of Aleph Insights. On this podcast, we look at a wide range of topics, and today we are discussing the dangers of mirror life.

Chris, what on earth is mirror life? I have no idea what we're talking about.

Chris Wragg:

Yeah, this isn't some reference to, you know, the narcissism of modern culture or something. This is actually a whole discipline of science, which involves mirror organisms, such as mirror bacteria, that are constructed from mirror images of molecules that are found in nature. So if you take our DNA — and all the DNA we know of on Earth — it is all what is called right-handed, so its helix spirals in a clockwise direction. And then, you know, proteins are all — that's because righty tighty, lefty loosey. Righty tighty, lefty loosey. Right-handed means that if you're going up a spiral staircase, right, and you would use your right hand on the inner bannister, you're going in that direction, right? I think, I think the —

Peter Coghill:

To sprinkle in a bit more technical jargon: certain molecules have isomerism, which means that you can have it looking one way, or you can get a mirror image of that —

Nick Hare:

An enantiomorph, if we're trying to use the proper term. Yes, exactly, yeah — an opposite chiral —

Peter Coghill:

Some molecules don't: simple sugars are just symmetrical in whatever way you look at them. Boring, yeah. Then some molecules, such as the molecules that make up proteins, and some sugars, some alcohols, etc., are handed: you can have one version or a mirror image of that version, and both can actually occur naturally.

Nick Hare:

But isn't it true that orange — the taste of orange — is actually caused by a molecule that is the mirror image of the one that causes the taste of

Peter Coghill:

lemon, something like that? Yeah. But it turns out that all life on planet Earth has, for some reason, sort of settled on one handedness. It is always right-handed.

Chris Wragg:

The implications of this being —

Fraser McGruer:

Before you go on — can you tuck it a little bit further under, or will it not go down further? No? There you go. Okay, I've got so many questions — so many questions — but keep going. Chris, yeah.

Chris Wragg:

I mean, the key thing, really — never mind why this occurs — the fact is, it's a phenomenon in nature, and it applies to all organisms. And the key issue is that human scientists have been experimenting with the creation of mirror images of naturally occurring molecules, and in some cases starting to consider how you might create a mirror-image bacterium, for various reasons. Now, there are, you know, some benefits around treatment of disease, being able to produce chemicals at scale and so on. But the potential massive downside to this is that you create bacteria which our immune system has no ability to recognise or contain. You know, if you think about it like a phone charger: our immune cells are a little bit like a phone charger that you're trying to plug into the bacteria, and if you've got the wrong phone charger, you're a bit buggered. And so the issue is we're potentially creating superbugs, I suppose, that no organisms have any natural defence against. And so the scientific community itself has sort of been increasingly realising this is probably something it shouldn't do. And so there was a large paper and a commentary on it in the journal Science last year, where hundreds of scientists came together — you know, Nobel Prize-winning scientists — and they said: "Unless compelling evidence emerges that mirror life would not pose extraordinary dangers, we believe that mirror bacteria and other mirror organisms, even those with engineered biocontainment measures, should not be created" — so effectively trying to create a moratorium on this element of science, because they feel it could pose an absolutely catastrophic risk.

Fraser McGruer:

Okay, look, a couple of things. First of all, we've done a lot of podcasts together, and I've often felt confused during these things, right? I have no idea what we're talking about.

Peter Coghill:

I don't think it matters. Good — I don't think it matters, because I think we're going to treat it as the biochemistry existential-risk equivalent of AI takeover. Fine.

Fraser McGruer:

I think that's okay, good. And also, lots of these words I've never heard of before. And sometimes when stuff gets explained to you, you understand it more — but sometimes you understand it less. We've made you stupider. Yeah — who knew, right, that this —?

Peter Coghill:

Yeah — intelligence without words, yeah. I can't

Chris Wragg:

help but feel this is my problem. But the fundamental question behind it — and also, why are we discussing this, right? Okay, so essentially, what we're looking at is whether or not our demise as a species is going to be brought about by stuff we do, or by some other means. Are we our own worst enemy?

Fraser McGruer:

Okay, gotcha. Sounds good. Well, that's a nice, happy topic. Who's — right, answer: are we the — what was the question again? The simple question: are we our own worst enemy? Okay, who's gonna take this? So, Peter, are we? Are you your own worst enemy? Are you

Peter Coghill:

Fraser's worst enemy? Yeah, yes, definitely — Fraser's got plenty of those. I'm gonna phrase it as the Promethean problem. So, you know, there's a paradox of progress: everything that drives our progress — cultural progress, technological progress, quality of life — is kind of driven by our own creativity, our ambition, our curiosity, and our ability to solve problems. But those drivers — ambition, curiosity, problem-solving and creativity — are inherently indifferent to moral progress. You know, the moral consequence is the thing you bolt on afterwards. We tend to create first and then understand later. So we build the bomb — we build an atomic bomb — and then we go, ah, actually, this kind of changes the whole power dynamic of the world; let's now have some accords and some treaties to try to put a lid on this, and have a whole dogma about mutually assured destruction, which means that we won't use them. So we have a tendency, as a species, to open Pandora's box — to clumsily, there are a lot of classical allusions here, clumsily bring in another classical allusion —

Nick Hare:

For me, yes — opening Pandora's box after the horse has bolted.

Peter Coghill:

So — back to the narcissist question earlier; yeah, we're back up to date. We are. So the question is: are we Prometheus, you know, getting fire, or are we the fire — the uncontrolled fire spreading from a spark? Do we have an innovation addiction?

Nick Hare:

So, I mean, I think there are two really quite distinct categories of thing we need to worry about, because they're caused by different mechanisms. One is the deliberate destruction of the human species by decisions we take, and one is the accidental, unintended destruction of the human species because of the side effects of decisions that we take, right? So, in other words, are we going to cause our destruction because we want to? I would put a nuclear war in that category: we're actually making the choice to kill everyone. And I think that's a whole different set of mechanisms to when we worry about accidentally, you know, creating mirror life, or letting an AI take over the world, or whatever. And I think Peter's thoughts about how we have evolved in such a way as to make this intrinsically problematic are a good place to start. So take the problem of us deliberately destroying the world. Just by analogy, if you think about fights: it is actually quite hard to kill someone. Now, I know you've killed many a person with your bare hands, trained boxer — yep, yeah, exactly. But it's actually quite hard; it is somewhat harder than people think. Attacks with a gun are 30 times more lethal than attacks with no gun, right, for example. Okay, now, if we'd evolved with guns built into our fists, it's entirely plausible to me that we would have evolved much less of a tendency to get into fights. We would be much better at standing down, because the risk of getting involved would be so much higher, right? So clearly we've evolved a level of aggression where it's quite hard to cause that much damage. Now, that level of aggression, extended to the country level, becomes extremely problematic when you've invented, you know, AI drones and nuclear bombs and bioweapons. So that, I think, is thinking about it from an evolutionary point of view: what Peter is talking about on the deliberate-destruction side is that we have an urge to destroy things which is way out of line with our capacity to actually destroy things. It's almost like our technology for destruction is catching up with our urge. And I sort of think, well, okay, so why haven't we destroyed the world? You might say, well, why are we all still here? Why is the murder rate going down? Why is there less war? Which is a good question. But if I said, you know, we're gonna give everyone in the world a button, and if anyone presses it, everyone dies — you know perfectly well some idiot would press that button, right? So it sort of feels to me like we know that someone out there would do it if they could, just for a laugh. So actually, it's not like we can rest assured that this won't happen. All we're saying is that it's just, at the moment, sufficiently hard for someone to do it. And if it gets to the level where a man in a shed can create a global killing thing and decide to just do it, I think we can all assume someone would, right? So that's the problem, I think: there is nothing in us that stops it from happening, is what I'm saying. We haven't got the correct level of destructiveness: if it becomes cheap enough to destroy the world, some wanker is going to do it.

Chris Wragg:

Well — and if you take the scale down, in terms of scale of lethality: it's a boring topic, but it's relevant here — the idea of gun control. If you give enough guns to people, some of them are going to go berserk and go and shoot lots of people. Now, if you scale that up to, you know, large-scale lethality — if everybody had, like you say, a nuclear bomb — then we already know that going into a school and shooting lots of people is already a pretty high threshold for destructiveness.

Nick Hare:

Or you've got people like, you know, Andreas Lubitz, who was that Germanwings pilot who crashed the plane — exactly — when you think, well, if he'd been able to crash a plane with a million people on it, it probably wouldn't have changed his mind; if anything, he might have been more up for it, you know. So, yeah, yeah.

Peter Coghill:

I think it doesn't really hinge on it being deliberate destruction, though. I think there may be the same tendency even with, you know, accidental destruction. So when we built the atomic bomb, and then the hydrogen bomb, we weren't deliberately setting out to put ourselves in a position where we could destroy the world. They were, at the time, tactically rational things to do. It's like: this is — at that time — a war-ending weapon, so we need to get it, and it will cause Japan to surrender. Then, because the technology existed, it became a sort of race to produce more and more and more than they had, and over a decade or so we got to the point where this actually was a world-ending weapon. But before that point, it was a rational thing to do.

Nick Hare:

Yeah — and that's partly the problem, you know. In a sense, it is rational, because we do want to kill people. We want to have the technology to kill people, so it makes sense to invent it. What I'm saying is that the methods you would use to stop it happening — the regime that you would have to impose to stop it from happening — just look very different if you're worrying about a deliberate choice to kill people versus accidentally killing everyone in the world. They're different kinds of mechanisms. And I think, you know, you can deter people in a different way, right? You need to impose different mechanisms. But I'm just saying: one key mechanism of deliberately destroying the world is — what we can't deny is that some people definitely do want

Peter Coghill:

to do that. Yeah, no, I agree. I agree, yeah.

Fraser McGruer:

I think you might be getting close to what I want to ask, right? Which is: okay, do we contain the seeds of our own destruction? More or less, that's the question we want to ask, right? And I can see that it's interesting. But why are we discussing this? What's the value? What's the —

Nick Hare:

Well, what do we do about it? Right? Because the problem is that destructive technology is going to get worse. And I think on a large scale — well, obviously, nuclear weapons technology hasn't really; it's certainly not become super cheap. And in fact, you look at the data, and there are fewer nuclear weapons around now than there were, you know, 30 years ago. Killing people has become easier — we've got murder drones — and yet murder rates and rates of deaths in war have fallen, right? So "what are we getting right?" is the question, because we want to be doing more of that. And I think we've only really just started seeing what the impact of, say, drones is; but actually, being able to kill a million people with drones might turn out to be much easier and cheaper than killing a million people with a nuclear bomb — and more efficient. And in fact you can then get the whole city, whereas —

Fraser McGruer:

It could be morally superior. Well —

Nick Hare:

We're not even worrying about that. The point is, you know, it's the latest thing to have to worry about. What do we do? What did we get right with nuclear weapons, or with any other technology, that means, you know, actually all of the trends in violence are positive, despite the fact that destruction is easier and cheaper than it was in the past? What have we got right that makes it — is that a question

Fraser McGruer:

you want to answer right now? Yeah.

Nick Hare:

Well, yeah — why not? Are you proposing this as a question? Well, I'm just — that's why you said, why are we asking the question? And the answer is: because — I don't know about you, Fraser, but I don't want everyone in the world to be killed. So the question is, what should we do about

Peter Coghill:

it? Just selected people, yeah.

Fraser McGruer:

Yeah, yeah, okay — where do we go with this? Who's next?

Peter Coghill:

I mean, before we do that — a thought occurred to me, and I wanted to extend this a little bit further: perhaps our innate capacity and our innate drive to produce ever more dangerous technologies is actually a potential solution to the Fermi paradox. Maybe it's the case that any sufficiently complex intelligence that evolves must necessarily have the curiosity, risk-taking and competitiveness that cause it to be dangerous — a sort of technological determinism means that you eventually strike upon atomic weapons and bioweapons and things, and that will, on average, lead to the extinction of the species. Yeah. So I think there's

Nick Hare:

a tragedy-of-the-commons kind of argument that, yeah, we just — you know, the possibility of cooperation is just too hard and unsustainable, and defection is always going to be better. It's the prisoner's

Peter Coghill:

dilemma, yeah. But it's hard for — that's the kind of intelligence that will win out in a crude biological evolutionary world. If you could prod it differently, and you could have more peaceful beings because they had guns on their arms, then maybe they'd find cooperation and coordination much easier. It's just that in the biological world, because of the limited amount of damage you can do person-on-person, that just doesn't scale very well as you get more and more complex.

Nick Hare:

Yeah, and there's also the problem that humans have a small number of children, which doesn't sound terribly relevant, but it actually means that we are intrinsically incentivised to cooperate with quite a small number of people. Yeah — we're only really related to a very small number of people, and hence we only really care about, and do things altruistically for, a small number of people, because, you know, we have got a lot of investment in our own offspring. Whereas if you take bees, they've all got one mum — all bees have got one mum, and they're all sisters, except for the drones, who don't matter; their only job is to mate with the queen and then die, right? So a whole beehive can all work together, because they're all essentially the same: they're all invested in the same set of genes. So option number one is we just have one queen — like Queen Camilla, or whoever — having all the children, and, you know, we genetically engineer things so that we're basically like bees. Well, I mean, you know — if we have enough biotechnology to engineer bioweapons, yeah, I think we could re-engineer humans.

Peter Coghill:

I mean, I think there are ways of doing it without turning us into bees, yeah. And I think the counter-argument to the idea that intelligence is the problem is that intelligence could also be the antidote to the problem, because we are capable of moral reasoning and empathy and things — we are capable of it. So maybe what we need to do is train ourselves to be better at doing those things. We know we can move away from the constraints of the biological intelligence we have: maybe when we start uploading our brains to computers, or maybe just by drilling home hard that moral reasoning is a thing you must do from an early age. We would then be capable of a sort of meta-evolution — able to drive our evolution in a cognitive space rather than in a —

Nick Hare:

You're saying the solution is: why can't we all just get along? Why can't we? Well, actually — I mean, Steven Pinker's argument in The Better Angels of Our Nature is that the fact that the trends in violence, and all sorts of other indicators, are all what we would regard as positive suggests that our kind of political and social technology is outpacing our destructive-technology development. And his view is that, you know, our ability to have, say, a strong state — to punish people, to reliably detect and punish wrongdoers — has actually outpaced the ability of those wrongdoers to cause death and mayhem. And I find that quite plausible. And in fact, if you think about, well, what technology does that rely on? It relies on a certain amount of ability to, you know, centralise power, but also to be able to detect things — to be able to identify that this person did that wrong thing, or, you know, to detect that someone has acquired the precursors to a bioweapon.

Peter Coghill:

We've — like, our social technologies, our institutions and the things we have now, are way more complicated and way more capable than — you know, when we were people living on the plains, travelling around for scraps and bonking each other on the head with mallets and things, we didn't need anything like that. It's amazing that we have got big state institutions and international treaties and things.

Chris Wragg:

Yeah, I just worry about this idea that the instinct to be peaceable and survive alongside one another is moving quicker than our ability to invent things to kill one another, and our drive to kill each other. At some point — like with, for example, mutually assured destruction — you only get one go, right? So — guns scale, you know? It's like: okay, back in the day we used to have lots of fist fights; now we have far fewer gun fights, right? That works. But when you get to the stage of cataclysmic risk, you only need to get it wrong once, right? So how can you outpace that? You can't.

Nick Hare:

But then, if you look at what we actually did, it was things like, you know, you'd make these commitments, but then they'd be monitored. You would actually count: has this country decommissioned these weapons? And then, you know, you're monitoring where the nuclear material is, and there's a kind of sophisticated global system of trying to make sure people aren't doing that. And you could imagine doing a similar thing for, you know, AI drones: you could say, well, we're gonna have a limit on the number of drones that each country is allowed to have. And it feels a long way away that we could imagine that happening now, but I think all it would take is the equivalent of another Hiroshima or Nagasaki, committed by drones, to make people go: we should do something, right?

Peter Coghill:

My concern with the social technologies we've got for controlling it is that they feel fragile — much more fragile than our drive to find ever more unpleasant ways to kill each other.

Fraser McGruer:

Go on, Chris — what were you going to say? Well, I was

Chris Wragg:

just going to say: thus far, we've only touched on the deliberate, yeah. Well, I was just going to bridge into that by saying — I'm not so sure it's totally binary like that: that there are things we do to kill one another, and there are things we do by accident. I sort of feel like, although nuclear bombs obviously, you know, have a purpose, and guns have a purpose, something like a knife is multi-purpose. And, you know, if I throw a knife at you in order to wound you, but it hits you in the heart and I kill you, I go, oh dear, you know — oops, that was manslaughter, not murder, or something. So I'm not quite so sure that we've got this really clear distinction. So, AI being an example: you might create AI and use it in a variety of circumstances. Some of those might be lethal; some of those might not be lethal; some of them might be intended to be, like, policing robots or something that suddenly become, you know, fully lethal killing machines. So isn't the distinction between what we accidentally do to kill one another and what we deliberately do — or neglect, I suppose — not quite so clear-cut? Yeah. Although, I mean, I think there's a different set of things you would want to do. Let's imagine that we could solve — that we were absolutely guaranteed that nobody would want to destroy the world. So we'll put that on a shelf and pretend that we've solved the problem of people wanting to destroy the world. Well, then we have the next category of thing to worry about, which is: what about accidentally doing it? Yeah. How do we stop that?

Fraser McGruer:

Okay, so that's what we're going to move on to next, right? Am I right? Like, how do we stop this stuff accidentally happening?

Peter Coghill:

Yeah — before we do: how do we foresee consequences that we didn't foresee? Which kind

Fraser McGruer:

of leads, which sort of connects to something

Fraser McGruer:

I've been thinking about and I've not been able to stop

Fraser McGruer:

thinking about the last 10 minutes, right? Which I can't

Fraser McGruer:

get off the bees, right? Bees, yeah. And the reason why is,

Fraser McGruer:

let's say they've got this system which works really well,

Fraser McGruer:

and it means they don't do sort of mad things like kill them,

Fraser McGruer:

keep sharks. They're all related, but I feel really bad

Fraser McGruer:

for them, because they live on a planet with a bunch of people

Fraser McGruer:

who, who they're going to get killed because of us, right? And

Fraser McGruer:

it's something outside of their control, please? Yeah, the bees

Fraser McGruer:

have figured it all out, but then these sort of highly

Fraser McGruer:

evolved monkeys have come along, and then we

Fraser McGruer:

destroy the planet, we destroy the bees along with us. And I

Fraser McGruer:

just feel bad because, you know, the bees have figured it out,

Fraser McGruer:

and, you know, they get sort of done in.

Peter Coghill:

There's a great deal more moral weight than just

Peter Coghill:

our own species. We have a sort of moral responsibility for the

Peter Coghill:

rest of the world, and not only the rest of the world, but all

Peter Coghill:

the potential species that might come about after we've gone,

Peter Coghill:

yeah. So we can't, we mustn't snuff out life completely. We

Peter Coghill:

mustn't, yes, no,

Unknown:

fish, fish, yeah, yes. And there are all the other

Peter Coghill:

we have to respect our peace

Peter Coghill:

accord with the fish that we settled upon on this podcast.

Fraser McGruer:

And we did do that, yeah, but it also, I'm

Fraser McGruer:

sure, that

Nick Hare:

this fragile ceasefire 'twixt land and sea.

Nick Hare:

Exactly, exactly.

Fraser McGruer:

And I'm sure in our sort of discussions, you

Fraser McGruer:

know, logically, this just fits into, you

Fraser McGruer:

know, external forces over which we have no control.

Fraser McGruer:

That's what we are to the bees, right? Anyway, I've diverted us.

Fraser McGruer:

Nick, oh no, so was it you? Peter, who was going to come in

Peter Coghill:

with some Yeah. So I think it boils down to

Peter Coghill:

like, it's unforeseen, so, unforeseen

Peter Coghill:

consequences that we want to be able to foresee, isn't it? So

Peter Coghill:

when we create a new technology, everything's, oh, great,

Peter Coghill:

brilliant. We've got nuclear power.

Peter Coghill:

Brilliant. We're getting into unknown unknowns here, this is

Peter Coghill:

what we're talking about.

Peter Coghill:

But it's like, we make a new technology, and it's

Peter Coghill:

only later that it becomes apparent, perhaps,

Peter Coghill:

through, you know, nuclear weapons, they were okay, not

Peter Coghill:

great things, but they were okay as long as there were

Peter Coghill:

only a very small number in the world; they were quite useful as

Peter Coghill:

a tactical weapon of war. But when you get several, when you

Peter Coghill:

get so many that you can destroy everything on the earth several

Peter Coghill:

times over, and not only that, but extinguish any hope of life

Peter Coghill:

regaining a foothold. The case is the same for AI, mirror

Peter Coghill:

life and various other existential risks, that's

Peter Coghill:

a problem. But we didn't at the time

Peter Coghill:

foresee that that was a problem. It's only when we started to

Peter Coghill:

scale up that people went, hold on, this situation now means that

Peter Coghill:

this could occur.

Nick Hare:

Yeah, and I think this is by analogy with, you

Nick Hare:

know, our aggression levels being misaligned to our

Nick Hare:

technology, our capability to destroy things. I think our

Nick Hare:

levels of fear, what we're scared of, is also, you

Nick Hare:

know, misaligned to how scary things actually are. You know,

Nick Hare:

because we've really only evolved to be scared of snakes

Nick Hare:

and things. Yeah, we haven't really

Nick Hare:

evolved the correct level of sort

Nick Hare:

of fear of things like, you know, cars, which we are

Nick Hare:

much less scared of than snakes.

Chris Wragg:

Yeah, although they do kill up to 75,000 people per

Nick Hare:

year, snakes. Still not as many as

Nick Hare:

cars, but more than hippos. Wow. Anyway, seems like

Nick Hare:

we should all be really scared of snakes all of a sudden, but,

Nick Hare:

but the point is that, you know, and particularly when it comes

Nick Hare:

to these sort of abstract risks, like, you know, the kind of AI

Nick Hare:

takeover problem, which you have to explain to someone. You've

Nick Hare:

got to sit down and tell them a story about how this could

Nick Hare:

happen. And then at the end, they'll go, yeah, that's not

Nick Hare:

gonna happen. Come on, you know? And I think at least nuclear

Nick Hare:

bombs come free with an enormous, terrifying fireball.

Nick Hare:

But AI takeover doesn't really look like anything scary until

Nick Hare:

it's way too late. And so, you know, it is those kinds of

Nick Hare:

risks. And same with mirror life. It's like, well, nothing

Nick Hare:

looks that scary to us. It's a bacterium. We're not

Nick Hare:

intrinsically scared. I mean, nuclear weapons are

Nick Hare:

intrinsically scary, but a lot of these things that could just

Nick Hare:

kill us all are just a picture in a magazine, aren't they?

Chris Wragg:

Yeah. I mean, I think it's quite interesting the

Chris Wragg:

extent to which awareness in the public consciousness is a

Chris Wragg:

deterrent to us developing technologies. So, so you know,

Chris Wragg:

if you, if you talk to most people about the threat of AI,

Chris Wragg:

they think of Skynet, right, and Terminators, scary robots,

Chris Wragg:

right? Exactly, or HAL, right, you know,

Chris Wragg:

psychotic robots. But the role of

Chris Wragg:

science fiction, dystopian, post-apocalyptic fiction, is to

Chris Wragg:

basically tell that story in a way that sticks in people's

Chris Wragg:

minds. And if you look at it, I would say there is a

Chris Wragg:

large portion of popular resistance to the development of

Chris Wragg:

AI that is based on this idea of killer weapon systems and, you

Chris Wragg:

know, Robocop and all that kind of stuff. But if you look at

Chris Wragg:

mirror life, it's not in anybody's public

Chris Wragg:

consciousness, right? Nobody's written a novel about it. There hasn't

Chris Wragg:

been a blockbuster film about it. But if you look at something

Chris Wragg:

like cloning, human cloning, and the the way, quite quickly,

Chris Wragg:

people were weirded out and thought, you know, that should

Chris Wragg:

be a sight, Whoa, don't. Don't go there. That's there's

Chris Wragg:

something partly, it's in the public consciousness. Partly,

Chris Wragg:

it's, like you say, inherently creepy. Creepy, right, exactly.

Fraser McGruer:

And pregnancy, too. It feels like a couple of

Fraser McGruer:

things I want to say we need to draw to a conclusion soon, but

Fraser McGruer:

it feels like we're talking about something we've talked

Fraser McGruer:

about before, which is perception of risk, maybe, which

Fraser McGruer:

is, you know, I'm probably more scared of sharks than guns. Four

Fraser McGruer:

people a year, really killed by sharks, exactly. That's

Fraser McGruer:

terrifying, right? Yeah, and I should probably be more scared

Fraser McGruer:

of cheeseburgers, right? Actually, more people die

Fraser McGruer:

choking on a cheeseburger. Well, I hadn't even thought of

Fraser McGruer:

cheese, but I was just thinking

Chris Wragg:

More sharks die from choking on humans.

Fraser McGruer:

Yeah, and no one talks about that. I'm glad you

Fraser McGruer:

brought that up. But the second thing is, it's all very

Fraser McGruer:

human centric and Earth centric, you know, because us worrying

Fraser McGruer:

about, you know, the destruction of our planet, Earth and so on.

Fraser McGruer:

But the universe kind of doesn't really care, right?

Fraser McGruer:

And it's, I don't know, it just occurs to me, you know, the

Nick Hare:

You'd be the kind of nihilist who pressed the red

Nick Hare:

button, yeah. Out there, there's probably some other

Nick Hare:

aliens. They'll be all

Fraser McGruer:

right, yeah. Now, to be fair, I get it. It's

Fraser McGruer:

what we've got. All we've got is ourselves, our lives, our

Fraser McGruer:

planet, etc. But, you know, it's sort of, ultimately, it's true,

Fraser McGruer:

though, isn't it? Depressingly, it kind of doesn't really

Fraser McGruer:

matter. But then, yes, well, if that doesn't matter, then

Fraser McGruer:

nothing does. Exactly, that's why I'm a

Fraser McGruer:

nihilist. So look, we do need to finish off. I'll go to Peter

Fraser McGruer:

first, and then, you know, we need to wrap this up.

Peter Coghill:

Yeah, so just picking up on Chris's point that

Peter Coghill:

considering existential risk is not in the public sort of

Peter Coghill:

discourse. I don't think it's nearly enough. It needs to be

Peter Coghill:

much more in the public discourse. The fact that Fraser

Peter Coghill:

hadn't heard of mirror life is a damning indictment of Fraser's

Peter Coghill:

sort of awareness of important things in the world.

Nick Hare:

Luckily, he's got absolutely no power to

Nick Hare:

do

Fraser McGruer:

anything if a

Unknown:

mirror Fraser was

Peter Coghill:

no one person does, but as your population

Peter Coghill:

gets more aware of this as a problem, then things start to

Peter Coghill:

change. So, like, the fact that everyone now worries about

Peter Coghill:

recycling and global warming is starting to shift the perception

Peter Coghill:

and the expectation on our political leaders. So the

Peter Coghill:

fact that it's an existential risk basically means that if it

Peter Coghill:

happens, the cost is infinite. The cost of that

Peter Coghill:

catastrophe is infinite, which means, if the probability is at

Peter Coghill:

all non-zero, which it is, because it could happen, then it

Peter Coghill:

should dominate all other discussions,

Unknown:

Pascal's Wager

Peter Coghill:

style, yeah. Whereas now,

Peter Coghill:

yeah, we still worry about who's going to be the next

Nick Hare:

trivial. It's not actually infinite, because we're not

Nick Hare:

going to devote an infinite amount of resources to stopping

Nick Hare:

it. So, you know, it has to be bounded. The existence of life

Nick Hare:

still has a bounded value. We're not really agreed on what

Nick Hare:

that is. Well, nothing, according to Fraser, yeah, zero,

Nick Hare:

according to him, but somewhere between zero and

Nick Hare:

infinity. So we pinned it down quite nicely. Yeah?
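
Peter's infinite-cost argument and Nick's bounded-value rebuttal are just expected-value arithmetic. A toy sketch, not from the podcast itself; the function name, numbers, and units are purely illustrative:

```python
# Expected-value framing of the existential-risk argument: the rational
# spend on preventing a catastrophe scales with its probability times
# the value of what would be lost.

def worthwhile_spend(p_catastrophe: float, value_at_stake: float) -> float:
    """Maximum spend worth paying to eliminate a risk of the given
    probability, given the (possibly bounded) value placed on what
    would be lost."""
    return p_catastrophe * value_at_stake

# With a bounded value, even a tiny probability yields a finite budget:
print(worthwhile_spend(1e-6, 1e12))          # 1000000.0

# Pascal's-Wager style: an infinite value makes any nonzero
# probability dominate every other consideration:
print(worthwhile_spend(1e-6, float("inf")))  # inf
```

Which is Nick's point: unless the value at stake is bounded, the argument licenses unlimited spending on any risk with nonzero probability.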

Fraser McGruer:

Okay, that feels like a very satisfying, nice

Fraser McGruer:

conclusion there. Yeah, nailed it. Okay, look, I got a

Fraser McGruer:

question, yeah, let's do it: from fiction, probably, your

Fraser McGruer:

favourite existential risk, or, I suppose,

Fraser McGruer:

favourite apocalypse.

Chris Wragg:

Yeah, I guess I'll go with Douglas Adams and the

Chris Wragg:

idea that we're a sort of traffic, or rather,

Chris Wragg:

roadworks being performed to clear a path through

Chris Wragg:

the universe.

Fraser McGruer:

Yes, so you're picking up on my point, that's, like, on

Fraser McGruer:

the sort of spectrum of where I am,

Chris Wragg:

I like, I like the idea that we sit here and we

Chris Wragg:

consider all of these serious type risks, and that in the end,

Chris Wragg:

we might be wiped out by something very trivial and

Chris Wragg:

amusing.

Fraser McGruer:

Maybe it's embarrassing, yeah, like

Fraser McGruer:

roadworks through our part of the solar system, galaxy, or

Fraser McGruer:

someone firing all the telephone sanitisers or

Fraser McGruer:

something like that, right? Good one.

Peter Coghill:

Chris, I just realised how we normally try

Peter Coghill:

to keep it light at the end. This is not the lightest of

Fraser McGruer:

questions. To be fair, it's a pretty heavy

Fraser McGruer:

subject to start with, but yeah,

Peter Coghill:

My favourite? Well, not my favourite, not

Peter Coghill:

the one I worry the most about, or the one that occupies the

Peter Coghill:

most time in my head, but the one I kind of think is

Peter Coghill:

potentially quite lols, is if humanity were to evolve

Peter Coghill:

into a less brainy, more fertile version of ourselves, kind of

Peter Coghill:

imagine the

Unknown:

Idiocracy, yeah.

Peter Coghill:

The movie, yeah. Or there's, kind of, like,

Peter Coghill:

WALL-E, the movie where we've advanced our technology

Peter Coghill:

to the point where all of our

Peter Coghill:

requirements are kind of taken care of automatically for us,

Peter Coghill:

which means we can just sit around watching, yeah,

Nick Hare:

that's like the kind of Brave New World type of

Nick Hare:

dystopia where everyone's actually really happy, but it's

Nick Hare:

dystopian,

Peter Coghill:

sometimes called a dysgenic situation. But yeah,

Peter Coghill:

and maybe AI will facilitate that for us, and we

Peter Coghill:

can just sit around eating and

Chris Wragg:

fucking. Yeah. So it doesn't

Chris Wragg:

kill us by actually killing us, it just makes us, yeah,

Peter Coghill:

it puts us in a little box, makes us dumb.

Fraser McGruer:

Sounds, all right, yeah, Nick,

Nick Hare:

I think mine, because mirror

Nick Hare:

life actually made me think of this, and it's really imaginative,

Nick Hare:

is ice-nine, which is Kurt Vonnegut, I

Nick Hare:

think, in Cat's Cradle, but it's a story about a new

Nick Hare:

type of water molecule that is ice at room

Nick Hare:

temperature, basically. And so all of the

Nick Hare:

water in the entire world just freezes, because this one

Nick Hare:

molecule kind of ends up converting all the other water

Nick Hare:

into itself, gradually. So obviously all the

Nick Hare:

water in your body freezes, everything; the Earth just turns

Nick Hare:

into a great big, you know, ice cube made of ice-nine. I think

Nick Hare:

it's a very cool idea, but it's not unlike the mirror life

Nick Hare:

thing.

Peter Coghill:

There is a version of that. There's a

Peter Coghill:

great Kurzgesagt video on it that's actually physically

Peter Coghill:

plausible.

Unknown:

Oh, okay, we should start worrying about

Peter Coghill:

Strange matter, which is something to do with the

Peter Coghill:

configuration of the quarks inside subatomic particles,

Peter Coghill:

which means you can get a special kind of quark, or too many of

Peter Coghill:

the special kind of quarks together. And the problem

Peter Coghill:

with that is that any other protons and neutrons that

Peter Coghill:

that weird proton interacts with get converted to

Peter Coghill:

the same sort

Nick Hare:

so we've got to keep an eye on these quarks to stop

Nick Hare:

them all getting together in the same

Peter Coghill:

I'll try and dig it out for the show notes. Yeah,

Peter Coghill:

worth a view. But, yeah, there's a sort of version of

Peter Coghill:

that. So it's like, it becomes, like the Midas touch, that

Peter Coghill:

anything a particle of that touches, turns into more of

Peter Coghill:

that.

Nick Hare:

Okay, yeah, well, it's like those prions. They're

Nick Hare:

another thing, that causes Creutzfeldt-Jakob disease, you

Nick Hare:

know, Mad Cow Disease. Okay? They're basically a, I think

Nick Hare:

they're either a mirror image or just a particular kind of

Nick Hare:

protein or something that your body can ingest but can't

Nick Hare:

then process or deal with, yeah? But they also, they're kind of

Nick Hare:

self replicating, which is why they're so scary. If they get

Nick Hare:

into the food chain, then they start spreading everywhere,

Nick Hare:

which is why you shouldn't do that.

Fraser McGruer:

That's why they're not a good thing. Yeah,

Fraser McGruer:

prions are bad. Yeah. So I have to say, I mean, I don't really

Fraser McGruer:

have one beyond what we've talked about. But I don't like

Fraser McGruer:

the sound of everything turning to ice.

Fraser McGruer:

I have to say, I did quite like the sound of Peter's one,

Fraser McGruer:

like just the eating and being kept in a box. Yeah?

Fraser McGruer:

yeah, that sounded all right. So I'll go with that one. Pretty

Fraser McGruer:

much your life at the moment. Well, that's what I try for, what

Fraser McGruer:

I aim for. Yeah? Okay, so we'll stop there. You've been

Fraser McGruer:

listening to the Cognitive Engineering podcast, brought to

Fraser McGruer:

you by Aleph Insights and produced by me, Fraser McGruer.

Fraser McGruer:

If you haven't already, please like and subscribe. We try to

Fraser McGruer:

release an episode every week or two. If there are any topics

Fraser McGruer:

you'd like us to cover, please do get in touch via email, and

Fraser McGruer:

you can find out more about Aleph Insights at

Fraser McGruer:

alephinsights.com. Thanks, as always, for listening. Until next

Fraser McGruer:

time. Goodbye.
