Big Tech Companies treat humans like batteries. We are just there to power their profits
Episode 383 • 30th April 2025 • Business Without Bullsh-t • Oury Clark
Duration: 01:13:53


Shownotes

EP 383 - Jonathan MacDonald is back on the show and keener than ever to get you interested in Ethical Tech.

We discuss why now, more than ever, people need to be given the tools and moral frameworks to inform their interactions with increasingly opaque technology, how things like social media seem set up to keep us amused and distracted while robbing us of our critical thinking faculties, and what that means for humanity in the long term.

Jonathan isn’t an anarchist in a tin foil hat (or a crypto-evangelist in a blonde wig, come to that), but he is very compelling on why we need to question whether Big Tech algorithms and the people who control them actually have our best interests at heart.

He’s pretty convinced that they don’t, or at least don’t care about us beyond what we can do for them. Which is why he has set up SELF - a personal privacy filter - and also ETHICS TV to help people discuss, understand and control their own moral choices around tech.

We also chat about the Roman Empire and its attitude to analogous situations. We know plenty of you like to think about that kind of thing quite a lot.

*For Apple Podcast chapters, access them from the menu in the bottom right corner of your player*

Spotify Video Chapters:

00:00 BWB with Jonathan MacDonald

02:23 Meet Jonathan

03:18 The Mission: Alternative Future for Technology

04:44 Humans Are Batteries

05:05 The Reality of Free Services

05:58 The Normalisation of Surveillance

06:29 The Impact of Social Media on Society

07:33 The Coercion of Media and Politics

08:37 The Evolution of SELF: Personal Privacy Filter

09:50 The Demand for Privacy in AI Tools

10:50 The Future of Privacy: Self-Sovereign Mechanisms

11:47 The Physicality of Data Privacy

12:37 The Ineffectiveness of GDPR

13:09 The Dark Side of Data Exploitation

14:53 The Post-COVID Awareness Shift

34:02 Ethics in Technology

37:28 The Problem with Modern Democracy

39:18 The Importance of Critical Thinking

39:58 The Role of Media in Critical Thinking

42:30 Education and Critical Thinking

46:15 Ethical Issues in AI and Media

46:41 The Dystopian Future of AI

53:37 Building Ethical Technology

01:04:36 Personal Reflections and Advice

01:11:08 Quickfire - Get To Know Jonathan

businesswithoutbullshit.me

Watch and subscribe to us on YouTube

Follow us:

Instagram

TikTok

LinkedIn

Twitter

Facebook

If you'd like to be on the show, get in contact - mail@businesswithoutbullshit.me

BWB is powered by Oury Clark

Transcripts

Speaker A:

So I'm gonna come around to your house and I'm gonna walk through the door, cause you're gonna give me the key, because I've given you a computer program that you find attractive because it's got cats singing Mariah Carey songs. And I'm gonna rifle through your stuff and I'm gonna take pictures of your private information. How do you feel about that?

Speaker A:

And every single person goes, hell no.

Speaker A:

And I said, well it's interesting because that's exactly what big tech does.

Speaker A:

Do we trust that other people other than ourselves should be the guardians, custodians and controllers as we move into artificial intelligence and robotics? Would we want a robot nanny looking after our children that has been programmed by a company that has suffered 315 legal cases of privacy abuse, or that could be hacked and weaponized by something that turns into a drone?

Speaker A:

Yeah.

Speaker A:

So maybe it's that type of lights-off scenario where people will realize that the only people one can trust with one's data is oneself.

Speaker B:

Hi and welcome to Business Without Bullshit.

Speaker B:

We're here to help the founders, entrepreneurs, business owners, anyone who wrestles with the job of being in charge.

Speaker B:

And if you like what we do here, please rate and review us on Spotify and Apple.

Speaker B:

And come say hi on YouTube if you fancy watching us in action.

Speaker B:

Links are in the episode description or just search for BWB London.

Speaker B:

Would you let someone into your house to listen to your conversations?

Speaker B:

Go through your cupboards, wardrobes and bathroom cabinets?

Speaker B:

Would you let them take pictures and recordings of everything that they saw?

Speaker B:

Would you let them sit on your sofa and take notes on your conversations?

Speaker B:

No, I didn't think so.

Speaker B:

Jonathan MacDonald, our guest this week, explains that this is exactly what we are doing when we click accept on big tech's terms and conditions.

Speaker B:

He argues that this behavior leaves us wide open to being coerced and controlled by bad actors, even bad governments, as we don't have truly open access to information.

Speaker B:

He explains why we think we live in a democracy but actually live in a demagoguery.

Speaker B:

How social media and big tech are eroding the very critical thinking faculties we need to challenge them, and what we can learn from the Romans and the Greeks about how we face these issues.

Speaker B:

As a cure, Jonathan has created Self, a personal privacy filter, and also Ethics TV, to help people discuss, understand and control their own moral choices around tech and start a movement.

Speaker B:

Buckle up, let's dive in.

Speaker B:

I am Andy Oury and today we are delighted to welcome back Jonathan MacDonald, a renowned entrepreneur, award-winning author and keynote speaker.

Speaker B:

Jonathan has launched over 10 startups, advised over 400 companies including Google, Microsoft and Apple.

Speaker B:

At 23, he became the youngest ever chairman of the British Music Industries Association and later helped transform the superclub and record label Ministry of Sound into a digital powerhouse.

Speaker B:

His latest ventures include Self, an adaptive AI for a personalized web experience, and Ethics TV, a platform exploring ethics in the digital world.

Speaker B:

Jonathan, a warm welcome back to the podcast.

Speaker B:

How are you?

Speaker A:

Thank you.

Speaker A:

Wonderful.

Speaker A:

Lovely to see you again, Andy.

Speaker B:

Thank you.

Speaker B:

So, yeah, give us an overview.

Speaker B:

What, what are you currently doing then?

Speaker B:

Let's, let's get it out of you, out of the horse's mouth, as they say.

Speaker A:

Yeah, I think my mission has always been to try and give an alternative future for technology.

Speaker A:

I'm concerned about the monolithic, I believe, pernicious, rapacious big technology firms that are using humans as batteries to power their tools.

Speaker A:

And I feel that even though the bright shiny objects of social media and the like are tempting to sacrifice your data privacy for, I don't think that that is the only option for humanity.

Speaker A:

I think that we can provide ways in which the users and the businesses and the human individuals can maintain their privacy.

Speaker A:

And the only reason that things are like they are is that no one has actually been provided with an opportunity or an alternative.

Speaker A:

And so what I'm doing with Self is hopefully one of many alternatives to move from big tech to self tech.

Speaker A:

So a user-controlled environment where your human rights are at the forefront, versus us being essentially the powering mechanisms of these tools that we are addicted to, which I think are potentially dumbing down civilization, outsourcing our thinking to other platforms and increasing depression and suicidality and all the other wonderful outcomes of our addictive consumer behavior.

Speaker B:

I don't think there's any doubt at a surface level that social media is just having a huge effect on us all, whether it be watching your wife or your friends; everyone says that.

Speaker B:

But this phrase you used is quite interesting.

Speaker B:

Using humans as batteries.

Speaker B:

So what is the power we're providing?

Speaker B:

The power of interaction, the power of.

Speaker A:

Money, or the power of attention and the power of our data.

Speaker A:

So at the end of the day, if a service is free, then you're the product.

Speaker A:

And so what happens is that the transaction is this.

Speaker A:

You can use these tools, let's say Facebook, for the sake of argument, you can use these tools free of charge.

Speaker A:

And the only deal is that you sign terms and conditions.

Speaker A:

That means that all of your information and your private interactions and your photos and everything are actually able to be spliced and diced and sold to the highest bidder, to predict what you might buy next or predict how you might vote next.

Speaker A:

And what that does is weaponize the communication.

Speaker A:

It weaponizes the media.

Speaker A:

It politicizes the media.

Speaker A:

And the irony of it all is that when you speak to many people about this, they nod and go, yes, I fully understand, and then continue to use the service.

Speaker A:

Which leads me to believe that the inertia can only be one of two things.

Speaker A:

One is that there are just no alternatives that people find as good, if not better.

Speaker A:

And the other is that, because we've now been through this kind of Overton window, we've normalized it.

Speaker A:

It's become normal that when you walk through a park with your dog and you have a conversation with your partner, you next turn on a computer and there's an advert for the thing you've been talking about, and we've become normalized to it.

Speaker A:

And that doesn't mean that it's right.

Speaker A:

Just because something becomes normalized doesn't change what the fundamental principles of human rights are.

Speaker B:

I mean, you spoke so eloquently about this last time, but there are certainly greater things we could achieve, aren't there?

Speaker B:

It feels like we've got these sort of huge technological movements and all we're managing to achieve is look at dogs online and buy crap on Instagram.

Speaker A:

That's right.

Speaker A:

It's almost as if that's an intentional move to keep us less curious, less critically thinking.

Speaker A:

And without being too hyperbolic here, it's one of the reasons why Socrates kind of ended up on the wrong side of the emperors.

Speaker A:

Because at the end of the day, if you start questioning things, then you're slightly breaking out of the matrix.

Speaker A:

What you're meant to do is just be distracted by noise.

Speaker A:

You're meant to be addicted to throwing birds at pigs.

Speaker B:

Do you believe that, though?

Speaker B:

It's almost the, you know, I just feel life's too chaotic for the Illuminati.

Speaker B:

It doesn't mean there's no actors who've got malicious intent, and people who just want your money.

Speaker B:

And, you know, it's the sort of, you know, it's the American capitalist revolution just carrying on, just buy stuff, buy stuff.

Speaker B:

You know, that's all we care about.

Speaker B:

Do you think it's more organized than that?

Speaker B:

You know, the suggestion that it's like, well, let's keep these people, you know, dumbed down with computers.

Speaker A:

Well, well, I'll give you an example.

Speaker A:

I mean, would you say that certain social media channels and information were able to sway people's voting behavior?

Speaker B:

It would appear so.

Speaker A:

So one could imagine then that there is this extraordinary benefit of coercion keeping.

Speaker B:

Us, keeping us busy in it.

Speaker B:

It could be accidental that they've created these things to keep us addicted because they want to sell stuff, and then someone can then manipulate those tools.

Speaker A:

Well, I don't think that any part of that commercial model is accidental, but.

Speaker B:

Really, no, that's what I'm interested in.

Speaker A:

No, I mean, at the end of the day, the way to really monetize a social media platform is through the data extracted from it and then selling advertising.

Speaker A:

And the extension of that is, let's say, in the nicest way, market research.

Speaker A:

But the less clean way of looking at that is what form of political pressures could then be used to coerce in one way or another.

Speaker A:

And anyone who's watched what happened with Cambridge Analytica is a witness to what that looks like.

Speaker A:

And I don't think there is a millisecond of that Netflix documentary that was an accident.

Speaker B:

And then from your perspective, you obviously set up Self when we last met, and Self is, what did you say?

Speaker B:

Sovereign and sentient, was it? What was your phrase? In terms of it's your data, it's your thing, you're not selling it to anyone, and then it works on your behalf.

Speaker B:

How has it evolved out of interest since we've last met?

Speaker A:

Well, when we started, my intent was to create a self-sovereign artificial intelligence that was under the control of the person themselves.

Speaker A:

Now what's happening is that the AI tools that many people are using and businesses are using have become so incredibly powerful.

Speaker A:

The place where Self can actually add the most benefit is to stand in the middle of this as your own personal privacy web filter, like a privacy guard.

Speaker A:

And that is where, even though you can converse with Self and Self can provide you information and stuff, how Self does that is it actually uses all of the tools that are on the web, but through an anonymization filter.

Speaker A:

And so we've realized the actual largest demand is not to create another AI.

Speaker A:

The demand actually is with people like, for instance, a healthcare provider, a financial services company or a legal firm, who would love to use Microsoft Copilot were it not for the fact that you need to trust that wherever that data is being stored off premises is completely and utterly secure, not likely to be leaked, not likely to be sliced and diced and sold to the highest bidder, not likely to be exposed to advertising, not likely to train a large language model with your intellectual property that can then benefit a competitor.

Speaker A:

So you need to make a lot of assumptions along that path.

Speaker A:

And so what Self does is actually remove any of that requirement for assumption.

Speaker A:

It's like, no, no, no, no.

Speaker A:

These algorithms sit on your server and only you can access them.

Speaker A:

And anything that has to access the outside web is through an anonymization lens.

Speaker A:

That means that no one can actually see what you're really doing.
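(Editor's aside: for readers curious what an "anonymization lens" like the one described above might look like in practice, here is a minimal, hypothetical sketch: strip obvious identifiers locally before anything is forwarded to an outside tool. It is not Self's actual implementation; the function names, patterns and the stub call are made up for illustration.)

```python
# Minimal sketch of the "anonymization lens" idea: redact locally, then forward.
# NOT Self's implementation; names, patterns and the stub below are hypothetical.
import re

REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "<email>"),                       # email addresses
    (re.compile(r"\+?\d[\d\s().-]{7,}\d"), "<phone>"),                         # phone-like numbers
    (re.compile(r"\b\d{1,5}\s+\w+\s+(Street|Road|Avenue|Lane)\b", re.I), "<address>"),
]

def anonymize(text: str) -> str:
    """Strip obvious personal identifiers before anything leaves the local machine."""
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text

def forward_to_external_tool(prompt: str) -> str:
    """Stand-in for a call to an outside web/AI service; it only ever sees redacted text."""
    return f"[external tool response to: {prompt!r}]"

def ask(prompt: str) -> str:
    # The filter sits between the user and the outside web, as described above.
    return forward_to_external_tool(anonymize(prompt))

if __name__ == "__main__":
    print(ask("Email jane.doe@example.com or call +44 7700 900123 about 12 High Street."))
```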

Speaker A:

And I bet, and it's a lifetime bet of mine, this is my life's purpose.

Speaker A:

I bet that eventually this topic won't be niche.

Speaker A:

I think that there is a crossroads ahead.

Speaker A:

And one angle of that crossroads is people who are deeper into what I see as a dystopian outcome.

Speaker A:

And the other is people in businesses who say we have to be using privacy-based tools and self-sovereign mechanisms.

Speaker A:

And what my struggle is with this dystopian-or-utopian outcome is that I have to somehow educate and convince the marketplace that what is better for us is to be in control of our private information.

Speaker A:

Which is paradoxical due to the fact that the first clause of the Universal Declaration of Human Rights would indicate that that is something that we have a sovereign right to.

Speaker A:

So that's weird that you're kind of saying to people, hey, did you know that you should actually be in charge of it?

Speaker A:

And then people don't get it.

Speaker A:

So I say, okay, let's put it this way.

Speaker A:

So I'm gonna come around to your house and I'm gonna walk through the door.

Speaker A:

Cause you're gonna give me the key.

Speaker A:

Because I've given you a computer program that you find attractive.

Speaker A:

Cause it's got cats singing Mariah Carey songs.

Speaker A:

And I'm gonna walk through the door and I'm gonna rifle through your stuff and I'm gonna take pictures of your private information. How do you feel about that?

Speaker A:

And every single person goes, hell no.

Speaker A:

And I said, well, it's interesting that you're so vehement about that because that's exactly what big tech does.

Speaker B:

It's, it's funny when you give the physical reality.

Speaker B:

We're so deeply wired about territory.

Speaker B:

That's what comes into my mind.

Speaker B:

It's the sort of, you know, sometimes in life you have those moments. Whether you carve someone up on the motorway accidentally, or, you know, I was trying to change something in our office in Slough, which is definitely my dad's territory, and he got all funny and I was like, oh, it's territory.

Speaker B:

Weirdly, with our data, it's not like it's hardwired in my brain, you know, my data.

Speaker B:

And actually I don't think things like GDPR help, because you ask anyone about GDPR, I mean, it's a good idea, but it's a fricking waste of time.

Speaker B:

Most businesses would say, well, all I do is tick boxes now, yes, that give people permission, because you sort of always end up with those impossible things, don't you?

Speaker B:

You either sign these 500 pages of terms and conditions (how this isn't illegal, I just don't understand), or you don't get this thing which you need.

Speaker A:

That's right.

Speaker A:

Well, well, I think, but also GDPR was window dressing.

Speaker A:

Unfortunately it was lip service to what, what the actual greater plan was.

Speaker A:

Yeah.

Speaker A:

So, but I think where the issue stands, quite well said by you, is that we don't see our data as something of importance until you physicalize the analogy and say, okay, I'm going to take a picture of your private conversation with your wife, and you go, hell no.

Speaker A:

It's like, okay, but that's exactly what is being used for what's called RTB, which is real-time bidding.

Speaker A:

Whereby your data is then sold to the highest bidder in real time, which can then serve adverts that you will undoubtedly be interested in.

Speaker A:

Why?

Speaker A:

Because that's the conversation you've just had with your wife.
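(Editor's aside: here is a toy sketch of the real-time bidding mechanic just described: an exchange offers a user profile, advertisers respond with bids, and the highest bid wins the impression. The names and numbers are made up; no real ad exchange's API works exactly like this.)

```python
# Toy sketch of real-time bidding (RTB): bidders price an impression from a user
# profile and the highest bid wins. Hypothetical names; not a real ad-exchange API.
from dataclasses import dataclass

@dataclass
class Bid:
    advertiser: str
    price_cpm: float   # price per thousand impressions
    ad_creative: str

def run_auction(user_profile: dict, bidders) -> Bid:
    """Collect a bid from each bidder for this impression and pick the highest."""
    bids = [bidder(user_profile) for bidder in bidders]
    return max(bids, key=lambda b: b.price_cpm)

# Hypothetical bidders keying off an inferred interest (e.g. from a recent conversation).
def running_shoes_bidder(profile: dict) -> Bid:
    price = 4.50 if "running" in profile.get("interests", []) else 0.10
    return Bid("ShoeCo", price, "Buy running shoes!")

def sofa_bidder(profile: dict) -> Bid:
    price = 3.20 if "furniture" in profile.get("interests", []) else 0.10
    return Bid("SofaCo", price, "New sofas, 20% off!")

if __name__ == "__main__":
    profile = {"user_id": "abc123", "interests": ["running", "podcasts"]}
    winner = run_auction(profile, [running_shoes_bidder, sofa_bidder])
    print(f"Serving ad from {winner.advertiser}: {winner.ad_creative}")
```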

Speaker A:

And when we can make better and better analogies, there are two outcomes, I think.

Speaker A:

One is a kind of a lighter hearted outcome where the public become more aware of this situation.

Speaker A:

The other is slightly darker, which is that there is an absolutely unrelenting tragedy that's linked to these kinds of actions, and there are hints of that.

Speaker A:

So for instance, there was a recent case where one of the social media platforms got slapped with a mass class action because they had turned on the cameras on all of the users' phones and computers and were doing biometric tracking of your eyes, and then got found out doing it.

Speaker A:

And then the regulator said hey listen, that's completely illegal.

Speaker A:

And so they got slapped with like a hundred-and-something-million-dollar fine, for a company that's making $2 billion a day.

Speaker A:

It's like okay, here's $100 million.

Speaker A:

And so there needs to be a bigger tragedy than that.

Speaker A:

It needs to be something that is unarguably based on the situation we're in.

Speaker A:

Interestingly, pre Covid, it was harder for me to speak about how media could be coercive really.

Speaker A:

Post Covid I find very few people would miss that.

Speaker A:

Interestingly, pre the first Trump administration, it was harder for me to talk about how you could politicize and weaponize information.

Speaker A:

Post Cambridge Analytica, everyone gets the point.

Speaker A:

So what I tend to use more now, I'd say with more gravitas, is examples that people unarguably have seen: what James Joyce calls the ineluctable modality of the visible.

Speaker A:

It's like it's directly in front of their faces now.

Speaker A:

Do you think that media could be coercive in a way that would change your voting behavior?

Speaker A:

Yes or no?

Speaker A:

Yes, I do.

Speaker A:

Have you ever had a conversation with someone and then seen an advert that's linked to the conversation that you've magically had?

Speaker B:

People talk of that, I'm not really that much on social myself, but they would say publicly, if it was Amazon, it's absolutely illegal, it's against all our terms and conditions, we're not listening to you.

Speaker A:

That's right.

Speaker B:

Is that just bullshit?

Speaker A:

Well, interestingly, with Alexa, they've just recently changed their terms and conditions.

Speaker A:

It said that the data from Alexa listening is no longer kept on your device; it is now sent to the Amazon servers.

Speaker A:

So it is in black and white that that's what's happening.

Speaker A:

Except the question is, how do you get people to sign terms and conditions that are longer than the Magna Carta?

Speaker A:

I mean, have you read them? Who, watching or listening, has read all of the Facebook terms and conditions?

Speaker A:

I mean, I have because that's the world in which I live, but no one else has.

Speaker A:

So someone uploads a picture of them and their kids, and I then say to them, you know that Facebook actually owns the intellectual property rights to that picture.

Speaker A:

And they're like, it's just unreal. What? What? What?

Speaker A:

And I'm like, yeah, it's no longer your data. That's not your data of your son. No, you signed that off. No, I didn't sign anything. It's like, yeah, you did. It's page...

Speaker B:

Probably stand up in a British court.

Speaker B:

Probably.

Speaker B:

But who's going to take it there?

Speaker B:

Well, good luck.

Speaker B:

Yeah, good luck.

Speaker B:

You know, I mean, I'm just thinking it'd be an unfair term; it'd probably fall under, you know...

Speaker B:

I mean, what did you feel about this Apple thing recently, when Apple sort of... you know, they're saying that they won't give the British government access.

Speaker A:

But then turned off the ability for us to add deeper encryption as users.

Speaker B:

Oh, did they?

Speaker A:

So that's the bit.

Speaker A:

Yeah.

Speaker A:

That's the bit that was buried.

Speaker A:

Yeah, that's right.

Speaker A:

They stopped the cameras rolling.

Speaker B:

They turned off the ability for us to do deeper encryption.

Speaker B:

Oh yeah, I did see that.

Speaker B:

Because that was how, that was actually how the story started, wasn't it?

Speaker B:

That they said we're taking that away because they can't really say what's going on, but someone's trying to.

Speaker B:

So now they can access our stuff.

Speaker A:

Yeah.

Speaker B:

Is it?

Speaker A:

Okay, so, one way of putting it, and I don't want to say this in a litigious, potentially slanderous way, but analogously: why give someone a back door when you can just basically loosen the lock on the front?

Speaker B:

Yeah, yeah.

Speaker B:

Okay.

Speaker B:

I mean, what do you... where do we draw those lines? You know, if I boil it down to silly things like my data, I don't know if I've ever discussed this before, but, you know, if I'm in a country like, I don't know, name a country that I shouldn't name, where people disappear and things happen.

Speaker A:

No comments.

Speaker B:

There's a couple of them.

Speaker B:

You know, I feel, you know, I live in one of the more liberal countries in the world where, you know, it's not perfect and stuff, but people don't tend to disappear that much, no matter what they say down the pub. You know, you can pretty much go at it if you really want and we'll roll our eyes or whatever.

Speaker B:

So I'm kind of okay with the British government therefore having information.

Speaker B:

You know, I can see their problem.

Speaker B:

Like, you know, how do they find that terrorist or how do they find that person?

Speaker B:

You know, so it's this permanent problem in life.

Speaker B:

So it's like if I've got now, you know, I think probably British people have very low trust of the government.

Speaker B:

But actually, actually probably if you ask them the right question, you would learn that.

Speaker B:

No, I don't think that I'm going to disappear.

Speaker A:

Yeah, well, more personally, I mean, I have two children in this country and I would like the policing of this country to keep my children safe.

Speaker B:

Yeah.

Speaker A:

And, and if there's bad people out there that are going to do bad things to my family, I would like them to be stopped before they do it.

Speaker A:

And so I am strongly non anarchistic like that.

Speaker A:

And if I didn't have kids, I may be slightly more fast and loose about it, but I would like them to be in a safe society.

Speaker A:

My view is that that's not where the issue is.

Speaker A:

Yeah.

Speaker A:

My view is that we're in a situation where user data, users' private information and businesses' private information is being willy-nilly used and fed into systems.

Speaker A:

And to put it very succinctly, we as users and businesses using these tools have to trust that wherever our information goes has our best interests at heart.

Speaker A:

And it's patently clear from the daily legal issues that these companies have that their intent is not necessarily completely to our benefit.

Speaker A:

Now, saying this is a dangerous thing for me to say. Because you're saying it politely?

Speaker A:

Yes.

Speaker A:

And in a way that can't even be seen by the most litigious lawyer as slanderous.

Speaker A:

But looking at the empirical evidence in front of us, do we trust that other people other than ourselves should be the guardians, custodians and controllers of our private information?

Speaker A:

And as we move into artificial intelligence and robotics, do we believe that a robot in our house that has some form of.

Speaker B:

Here comes the physicality again?

Speaker A:

Yeah.

Speaker A:

So would we want a nanny, a robot nanny, looking after our children that has been programmed by a company that has suffered 315 legal cases of privacy abuse?

Speaker B:

Or could be hacked by some.

Speaker A:

Or could be hacked and weaponized by something that turns into a drone.

Speaker A:

Yeah.

Speaker A:

So maybe it's that type of lights-off scenario where people will realize that the only people that one can trust with one's data is oneself.

Speaker B:

I'm getting the picture clear in my head.

Speaker B:

I mean, there's obviously the big boys, you know, the Facebooks and the, you know, Amazons and the whoever, these huge, mostly US corporations that have built these incredible, you know, products, I guess, and networks, and own so much of what's going on.

Speaker B:

We're sitting over here in the UK.

Speaker B:

I mean, your thing is, let's have something like Self and have that be your shield, effectively.

Speaker A:

That's right.

Speaker B:

This world.

Speaker A:

But like a force field.

Speaker B:

Does that mean you can go on Instagram, or should you cut those things out? Just to be very basic: if you're using Self, that force-fields against them finding out stuff, but does that mean you can't use those things as well?

Speaker A:

Well, I think that Self does one thing for people that hasn't been an option so far, and that's giving people choice as to what tools they use and what information is sent to those tools.

Speaker A:

That's fair.

Speaker A:

It's like, I think really that we shouldn't, certainly as a Self founder I don't believe that we should, play judge and jury as to what is right and what is wrong.

Speaker A:

I just would like people to have the bloody choice.

Speaker B:

Yeah.

Speaker B:

To know.

Speaker A:

Yeah.

Speaker B:

It's a bit like, if anyone ever uses DuckDuckGo, which I switched to a while ago.

Speaker B:

I mean, the amount of stuff it pops up that it's blocking the whole time.

Speaker A:

That's right.

Speaker B:

You get all these icons popping up like, oh, they tried to find this, they tried to take that.

Speaker B:

No, we told them all to take off.

Speaker A:

And anyone who's used Safari browser will see the amount of sites that have been trying to sniff things.

Speaker A:

And there are services out there on the web that you can run to see what software is being used on sites that you land on, and what spying software is being used.

Speaker A:

Now, of course, some of the smartphones show you a little icon when your camera is being activated, and that's quite an interesting feature.

Speaker A:

I don't know if you've noticed that.

Speaker B:

No, I haven't.

Speaker B:

And I'm always curious, when people say they get cameras activated, whether they do it without turning the light on.

Speaker A:

I think that's right.

Speaker B:

Because most have a light. And on the phone now, it will tell you if someone's active.

Speaker B:

What's the icon?

Speaker A:

It's just a little camera icon at the top.

Speaker A:

It's just a tiny, tiny.

Speaker B:

Someone's suddenly looking at me.

Speaker A:

Yeah, yeah, yeah.

Speaker B:

Oh, my God, that's freaky.

Speaker A:

Yeah.

Speaker B:

Okay.

Speaker B:

And how.

Speaker B:

How often is that happening to normal Joe in the street?

Speaker A:

Well, I'm.

Speaker A:

Again, I need to play within the grounds here.

Speaker A:

I think the issue is whether we believe that when we see an icon, that's the only time it's on. Is that correlation accurate?

Speaker A:

It's similar to people who say that they're on a diet.

Speaker A:

And that has to mean that they're also on a diet at 9 o'clock at night in front of the TV.

Speaker A:

It's not like a partial.

Speaker B:

That's how most people diet.

Speaker A:

That's right.

Speaker B:

When people are actually on a diet, they just lose weight and don't say anything, apparently.

Speaker A:

I mean, I've always said that I'm gonna open a gym called Resolutions.

Speaker A:

That's a gym for four days of the year from January 1 to January 5, and then it turns into a bar.

Speaker B:

I've been using your amazing advice. I mean, I'll just pass on the excellent advice he gave me; I've told everyone about it, by the way. Just to say: you live your life like there's one year to live.

Speaker A:

That's right.

Speaker B:

And you start on the 1st of January, you know. And I've made a few changes in my life.

Speaker B:

You know, it's a great metric.

Speaker B:

People throw these around; there's all sorts of those that fly around.

Speaker A:

But that one, it works.

Speaker B:

It's a really nice metric.

Speaker A:

It's a really nice.

Speaker B:

Because your mind is very set up around a year.

Speaker B:

Yeah, I'm seeing my kids more and I just bought a Mustang.

Speaker B:

I mean, it's all going on Jonathan, but I love that.

Speaker B:

I mean, look, you know, you've been at the forefront of sort of ethical tech for a while.

Speaker B:

Do you think any of this is moving in the right direction?

Speaker B:

Is any country getting it right?

Speaker A:

I notice Estonia's views of digital identity seems to be quite progressive.

Speaker A:

I noticed Iceland's view of a crowdsourced constitution, like a wiki constitution seems to be quite progressive.

Speaker A:

But my view is really that pre-Covid, and pre the first term of Trump with Cambridge Analytica, it was harder to illustrate to people what's really going on, and then these things happened.

Speaker A:

Regardless of where you stand on vaccines and everything else like that, it is now increasingly obvious that not all of the information that we are exposed to is the complete information.

Speaker A:

And there's only a low double digit percentage of the web that's actually used.

Speaker A:

When you do a search on the most famous search engine, for instance, there's a lot of information that we don't have access to.

Speaker A:

Equally, you and I could be sitting next to each other on our devices and search for the exact same thing and we'd get two different sets of results.

Speaker A:

Your results are based on what is the most monetizable by advertisers for you and mine are apparently the most advertisable from advertisers to me.

Speaker A:

And so the best way to control a civilization, throughout history, has been to limit the amount of information it has access to.

Speaker A:

And it's the greatest weapon on earth and it's far more effective than tanks and guns.

Speaker A:

What you do is you either limit or coerce or manipulate the information that people have.

Speaker A:

And if, if you can stop and control information flow, the free flow of information, which paradoxically should have been what the Internet's promise was.

Speaker A:

I mean, you think that you give everyone devices that are interconnected and then the outcome is we're more connected to each other and more connected to information, there's less cultural problems and blah, blah, blah.

Speaker B:

That was Web 1.0, if I remember right.

Speaker A:

And so the ideals of that, from Tim Berners-Lee and so forth, have gone in the exact opposite direction.

Speaker A:

It's the color-negative version of what we meant in the first place.

Speaker A:

One of the reasons why we're building Self Chain as a layer on blockchain is because I genuinely believe that the only nodes of a network should be each individual person.

Speaker A:

I believe that we are the Internet.

Speaker A:

I don't believe that we should have centralized structures that own our private information.

Speaker A:

I think we should be the controllers of the network.

Speaker A:

We should be the owners of our private data.

Speaker A:

And then of course the challenge is how do we do that in a way that doesn't require a computer science degree?

Speaker A:

How can we commoditize it so that my mum could understand how to use that?

Speaker B:

Let's go there a little bit.

Speaker B:

Because self chain.

Speaker B:

So you're talking about everyone becomes what their own.

Speaker A:

Their own node of a network.

Speaker B:

Yes.

Speaker B:

And a node obviously is just something that contains the information of the blockchain.

Speaker A:

Yeah.

Speaker A:

And it secures the network.

Speaker A:

So what nodes do in networks of all kinds, any type of network in nature, is the nodes are the units that hold the whole thing together.

Speaker A:

And so I was part of the Minima team, Minima Global, and that was the first and only decentralized blockchain, like a completely, fully decentralized blockchain.

Speaker A:

And so because I was part of the team that actually brought that to market, I've been very privy to the genius of that full decentralization.

Speaker A:

However, I think we can do even more than that, actually be inspired by that, and build things in a way that's even more functional.

Speaker A:

And so Self Chain will then build a software development kit for people to build applications and that becomes Self OS which is an operating system alternative to the Apples and the Googles.

Speaker A:

Now this is a crazy thought because loads of people have tried to launch operating systems mainly in the Linux environment.

Speaker B:

I was about to say Linux is the only one I can think of.

Speaker A:

That's right.

Speaker A:

And there are de-Googled phones that paradoxically still use parts of Android and Google Search.

Speaker A:

But I think that no one's done that right. Because I think the issue has been that, in terms of user experience, it has normally sucked. So I don't know if you've used any of the Linux stuff?

Speaker B:

I've just always been told you've gotta know what you're doing kind of thing.

Speaker A:

Yeah, you need to be able to cut code.

Speaker A:

I think the winning action will be something that's as easy to use as the existing software, but in a way that is under your control.

Speaker A:

And that shouldn't be impossible.

Speaker A:

I don't know how far we're going to get in my lifetime, but I have to leave the tools out there. We have to build it. It's like the Greek proverb: men plant trees under which they'll never sit.

Speaker B:

Yeah, let's just take it one step at a time.

Speaker B:

If in essence you want people to have their own nodes, the reason you're trying to secure that is otherwise your data is elsewhere.

Speaker A:

That's right.

Speaker B:

And it's almost this idea.

Speaker B:

Get away from a hierarchy of a government, and then big actors, like big companies or, you know, rogue actors or whatever, having this power and control over you, controlling what you're seeing and what you're doing.

Speaker B:

Because ultimately we're little puppets on a string.

Speaker B:

You know, we react if we feed you stuff.

Speaker B:

Oh, this person reacts.

Speaker B:

We can get this guy or girl angry, isn't it?

Speaker B:

And it's sort of trying to democratize.

Speaker A:

I would, however, just caution one point on this, and that's that there are two flavors of that level of decentralization.

Speaker A:

One is the anarchistic flavor which is, you know, screw the government, screw the corporate, screw, you know, we, we don't need them.

Speaker B:

A bit bitcoin.

Speaker A:

Yeah.

Speaker A:

And I was part of that movement, and there was that split then into these two flavors in the 90s. You know, one was the cypherpunk movement, which was essentially a bunch of anarchists that believed that, you know, we shouldn't have any of that.

Speaker A:

And my flavor of that is a different one, which is the self-sovereign movement.

Speaker A:

So it's not that I think that we don't need government, don't need police, screw the man type stuff.

Speaker A:

I don't believe that.

Speaker A:

All I think that we should have is a choice of where our information goes.

Speaker A:

That doesn't mean that we shouldn't be policeable.

Speaker A:

It doesn't mean that government shouldn't govern.

Speaker A:

I think for a safe society.

Speaker A:

I don't believe that anarchy is realistic because if you play that tape forward, the anarchy tape forward, you end up in Mad Max.

Speaker A:

It's like sweet.

Speaker A:

It's just survival of the fittest.

Speaker A:

Every bloke over six foot two with tattoos ends up leading, and everyone under, you know...

Speaker B:

So it's like, well, I always end up with the basic question, who's picking the rubbish up?

Speaker A:

That's right.

Speaker B:

It sounds like a stupid thing, but that's why I get stuck when, as a tax person, people say, yeah, but we don't need that.

Speaker B:

And we'll just work off it.

Speaker B:

It's like, yeah, but who's picking the rubbish up?

Speaker A:

Yes.

Speaker A:

And the answer will be no one.

Speaker A:

And so, in an anarchy... anyone I've ever spoken to, and I know a few who are hardcore, kind of like, we-need-to-blow-up-the-government type of narrative.

Speaker A:

I'm like, okay, fine, then what happens?

Speaker A:

So, so your guy forks it out and then the next day, what?

Speaker A:

And then there's a glitch. You can see them glitch out, because it's like the glory of it all, of everything should burn, is the end game.

Speaker A:

It's like, but that won't be the end game.

Speaker A:

That's the new start of society.

Speaker B:

Yes.

Speaker A:

So how do my, how do my kids stay safe then?

Speaker B:

Yeah, yeah.

Speaker A:

And if you can convince me that my kids are going to stay more safe after you've blown up the Houses of Parliament, I'll go for it.

Speaker A:

But if you can't tell me where that is because they go, oh, no, no, they'll be fine, everyone will be fine.

Speaker A:

We'll police ourselves.

Speaker A:

I'm like, I've seen what that looks like.

Speaker A:

I went to secondary school. And I think that when you look at a lot of lamp posts that have CCTVs on them, one view is, this is a surveillance state.

Speaker A:

What are they?

Speaker A:

What are they doing?

Speaker A:

The other view is maybe that limits the amount of people that pull out a machete.

Speaker B:

Yeah.

Speaker A:

Another view could be, well, the visible presence of police on our streets is possibly directly correlated to the volume of people who don't pull out a machete.

Speaker A:

So I think the structure of policing and the structure of government is important.

Speaker A:

What I do think is equally important is for us to realize that the ultimate asset that we have is our identity.

Speaker A:

And I think that should be protected in a sovereign way.

Speaker A:

And I don't want to move away from big technology firms and search engines and big OpenAI ChatGPTs.

Speaker A:

What I would prefer is that when people use those tools, they can choose how much of their information is being exposed to them.

Speaker A:

That's what self does.

Speaker B:

And think about the physicality of that.

Speaker B:

If your brain's struggling with thinking whether I care about it, think about the physicality.

Speaker B:

That's how much you need to let people look into your stuff.

Speaker B:

Watching my dog in the garden, constantly getting the cats out of the garden and you think, well, what's he up to?

Speaker B:

And I'm like, well, imagine that was a person climbing into our garden.

Speaker B:

I'd be out there with a stick.

Speaker A:

You know, like, I mean, it would be easier for me to convince people of what's going on if I said, okay, so you're having a chat with your partner and then a social media representative walks round the corner and sits the other side of the door listening in, and then tells advertisers what you're talking about.

Speaker A:

And then the next time you turn on your phone, there's an advert for what you and your wife had a private conversation about.

Speaker A:

And then, then it starts to become real because it's a physicality of that situation.

Speaker A:

I don't believe that it's going to be progressively harder for me to show people that reality.

Speaker A:

I think it's going to become progressively easier.

Speaker A:

And I bet that over time, people will start to become more and more and more aware of that.

Speaker A:

And then you know what will happen?

Speaker A:

People will start demanding it, instead of feeling powerless about it.

Speaker B:

At the moment they do.

Speaker B:

We all feel powerless.

Speaker A:

And also, what's the alternative?

Speaker B:

Yeah, yeah, exactly.

Speaker B:

What's the alternative?

Speaker B:

Is it really effective?

Speaker B:

I don't know.

Speaker B:

I'll get on with it.

Speaker A:

You know, I just use it anyway.

Speaker A:

Yeah, easy.

Speaker A:

Or what have I got to hide?

Speaker B:

Yeah, exactly.

Speaker A:

You know, I'm not doing anything.

Speaker A:

I'm not doing anything and I'm not doing anything wrong.

Speaker A:

I'm like, when was the last time you smoked weed?

Speaker A:

Yes, okay, right.

Speaker B:

I'm legal now, you know, I'm licensed.

Speaker A:

Oh, you get it prescribed.

Speaker B:

Sweet.

Speaker B:

To get medical so I can talk about it.

Speaker B:

Okay, and how does this.

Speaker B:

Okay, so you're trying to build things to protect data.

Speaker B:

You know, part of that long term is even you've got to use the blockchain to sort of.

Speaker B:

That's about the sort of node and the sort of building out of an infrastructure.

Speaker B:

Then Ethics TV is trying to get this point across, is it?

Speaker A:

Well, firstly, Ethics TV is because I don't think people understand what ethics is.

Speaker A:

Good question.

Speaker A:

So from a first principles perspective, I set up Ethics TV because I needed to illustrate, or at least have, a volume of information so that people who are curious about what ethics is are then able to discuss it.

Speaker A:

And then the second reason for Ethics TV is because my bet is we will move to a time when it becomes all about our moral principles of what's right and wrong, which is what ethics is.

Speaker B:

Isn't ethics just that?

Speaker A:

That's exactly what it is.

Speaker A:

It's moral principles between what's right and wrong and that then guides our behavior.

Speaker A:

Right.

Speaker A:

So I think it's eventually going to be that we will, en masse, look at technology companies through the lens of what moral principles of right and wrong they have in their actions.

Speaker A:

And what I'm doing is building a library.

Speaker A:

We've had like 29 hours of content now or something that is increasing by the week or every two weeks to become a library of information, of discussion on ethical principles.

Speaker A:

And so I've had people on from regulators, I've had people who are campaigning for human rights and technology.

Speaker A:

I had a person on who was in the room when Facebook decided to turn off all censorship.

Speaker A:

She was in the room and she spoke to the senior executives and said, why are you doing this?

Speaker A:

So I've interviewed people who are at the front line of ethics, as I am. Interestingly, I've even stopped using the term human rights when I'm discussing these things.

Speaker A:

Because as soon as I say human rights, people glaze over.

Speaker B:

Yeah, it's like GDPR, right?

Speaker A:

Like GDPR and like Ethics and Ethical Tech.

Speaker A:

It's like, that's a tagline.

Speaker A:

It's like, sure, you're an ethical tech firm.

Speaker A:

Oh, well.

Speaker A:

Cause you got a green logo, you know.

Speaker A:

So it's like what I'm doing is I'm building a bank of information regardless of what the current demand is.

Speaker A:

Which is a weird situation, because there's no instantaneous upside for me.

Speaker A:

All it does is drain my time, for a bet, which I may be wrong on, that somewhere down the line it matters.

Speaker A:

That's what I'm doing.

Speaker B:

I find it so... you know, when you said it's about what's right and wrong: when I look at the Telegraph and the Guardian and read both, I find both are just ranting now.

Speaker B:

I mean, I did this experiment recently after watching Prime Minister's Question Time, thinking, God, I don't really know what happened there. Everyone's just shouting at each other. You know, Kemi says they're an...

Speaker B:

So I thought, oh, well, let's read both sides and see what everyone's saying.

Speaker B:

And it's so polarized, everyone.

Speaker B:

It's a bit like that.

Speaker B:

You know, you have a meeting with someone, everyone takes out of it what they saw.

Speaker B:

You know, it's confirmation bias, all these...

Speaker A:

That's right.

Speaker B:

Problems.

Speaker B:

So I find, like, if you actually got a group of people now and said, what's right and wrong?

Speaker B:

With so many issues, I don't know where you go with that.

Speaker B:

I don't find we're going towards the center.

Speaker B:

I think you just end up with one group in one corner thinking the other group...

Speaker B:

You know, we think, we all think we're idiots.

Speaker B:

You know, it's the Brexit problem, isn't it?

Speaker B:

Brexit, it was just such a great, terrible thing, but a great example of like, here's an issue, complicated issue, and it will divide people that you respect.

Speaker A:

And this is what Socrates had as an issue with democracy.

Speaker A:

And so his entire issue on democracy whittled into a super paraphrased sentence was what we saw with Brexit.

Speaker A:

So you have an extraordinarily nuanced, complicated issue on which the public do not have anywhere near the information required to make a decision, and then a referendum that makes the decision nonetheless.

Speaker A:

And so what we think we're in is a democracy.

Speaker A:

What we're actually in is a demagoguery.

Speaker A:

And so what happens as soon as there's a demagogue in play that is essentially someone, a figure that is using bias to polarize and politicize these opinions?

Speaker A:

The demagoguery that we're in is the reason why Socrates had such an issue with democracy.

Speaker A:

Which isn't to say that the public shouldn't have a say.

Speaker A:

It is that if the public have a say, it needs to be an educated public on all parts and nuances.

Speaker A:

And so what happened instead with Brexit, it was, we should be proud to be British and independent versus hey guys, let's all hang together in Europe.

Speaker A:

And it's so much more nuanced.

Speaker A:

For instance, how we fund the NHS would be a nuance that would have been worth conversing about.

Speaker A:

And there's many more.

Speaker B:

Well, even your example is an issue, because I remember them saying, oh, well, of course we're going to stay in Europe.

Speaker B:

And David Cameron, Obama flew over and everything.

Speaker B:

And then my British rebellious brain started saying, oh, I don't like everyone telling me what to do.

Speaker B:

So, you know, there was even a reverse effect.

Speaker B:

That's what started to tilt me. I kept turning on Radio 4, and they keep telling me I've got to be in Europe. And the British are like, don't tell me what to do.

Speaker B:

So it could work in the perverse way.

Speaker B:

But aren't we at a point, I mean, we're almost at a point now, where you can't have democracy? Isn't it because there's too much to decide on?

Speaker B:

I mean, it must have been, maybe it wasn't, but it must have been a bit simpler for the Greeks, like, how are we feeling on murder, lads?

Speaker B:

Well, I'm dead against that.

Speaker B:

You know, we're down to these really complicated issues.

Speaker A:

Well, I think... I'm a classics fan, so you're in my ballpark now.

Speaker A:

And so we can go as deep into there as Dee will allow us.

Speaker A:

But the truth of the matter is that what they had in those times, above almost anything else, is critical thinking.

Speaker A:

And so in terms of Greek times, their emphasis on critical thinking is blatantly lacking in this modern society today.

Speaker A:

And if we could somehow wave a magic wand... in fact, if I had two magic wands, I would do two things simultaneously.

Speaker A:

I would instill the magic of critical thinking in the masses, and I would remove the behavior of comparison, of thinking that we're not good enough because a celebrity on Instagram has got bigger breasts or a thinner waist.

Speaker A:

If we could remove comparison and apply critical thinking, we'd be in a very, very different timeline.

Speaker B:

Critical thinking, I assume, is, what, you know, not being an ignoramus. It's analysis. You know, it's sitting and going back over things.

Speaker A:

And it's asking, why do you feel that?

Speaker A:

Tell me.

Speaker A:

And it's steel-manning and straw-manning the other argument.

Speaker A:

So you say something and you have an opinion about Brexit.

Speaker A:

And I go, so what I think I'm hearing is that this happened and this happened.

Speaker A:

Which part of this would you say is an assumption of yours?

Speaker A:

And you go, well, I think most of this is empirical fact.

Speaker A:

However, this one is an assumption.

Speaker A:

And I said, what's the assumption based on?

Speaker A:

And you go, well, it's based on what I read in the Daily Mail.

Speaker A:

It's like, so do we feel that the Daily Mail is actually balanced editorially, or is that potentially through a lens of bias? I can see that there's a lesson there; that's critical thinking.

Speaker B:

This is reminding me of something.

Speaker B:

I think it's, I don't know, me trying to get away with something with my mum and dad or something, when they start breaking down your excuses.

Speaker A:

My dad was the master of this stuff.

Speaker A:

I couldn't get away with anything.

Speaker B:

It doesn't quite make sense. So you might as well say, all right, all right, you got me.

Speaker A:

That's right, exactly.

Speaker B:

But we don't have critical thinking now.

Speaker B:

We have just emotional reactions, just sort of.

Speaker B:

I mean, is there any way back there?

Speaker B:

How do we... is there a route back there?

Speaker B:

You know, I mean, that's why, with democracy, it's almost that you get to really horrible thoughts like, do we really want everybody voting?

Speaker B:

Don't we want only certain people making decisions, you know?

Speaker A:

Well, that's what Socrates said: you know, everyone should vote, provided that everyone has the information.

Speaker B:

Yeah, exactly.

Speaker B:

You're almost like you need to pass a test before you can vote.

Speaker A:

Well, you have to, to drive a car. That's not the craziest idea.

Speaker B:

Well, there was America. I was there when they had their election and they were explaining to me it takes hours to fill in the form to vote. You have to do the work, you have to look everyone up.

Speaker A:

Look at what they do in Los Angeles.

Speaker A:

You can't actually show your ID. It's illegal to show any form of ID.

Speaker B:

Well, when you're voting.

Speaker A:

Yeah, because otherwise they'd have to stop all of the illegal immigrants that they need for the votes for the blue.

Speaker B:

Oh my God.

Speaker A:

So sorry, I'm apolitical.

Speaker A:

But what I'd say is, in terms of what the solutions out of this are, can we get out of this?

Speaker A:

The answer, I believe, is two things.

Speaker A:

One is how are we educating the young people of the day?

Speaker A:

What are the education systems in place?

Speaker A:

How much critical thinking is being instilled into the young?

Speaker A:

What other systems can there be?

Speaker A:

What other schooling systems and education platforms could exist that have alternatives for that?

Speaker A:

And the second thing I think we need to look at is what is keeping us from critically thinking.

Speaker A:

And what that is, is a coercive media full of noise, of cats singing Mariah Carey songs.

Speaker A:

And so what we need to do is we need to enable a pause button for people's media intake so that they aren't surrounded by noise that is keeping them stupid.

Speaker A:

And if we can enable some form of filtering system that can give silence and pause for thought and mindful reflection, at least in combination with what we do with education, we have a chance for 50 years, 100 years, 200 years to move further towards critical thinking.

Speaker A:

And one of my largest issues with the system of media today is that if you remove people's ability to critically think and discuss, what you end up with is a slave-based society that relies on ChatGPT, that is completely coercible depending on whoever's got the biggest donations.

Speaker A:

And that is an existential threat which I mentioned last time.

Speaker A:

That's my problem.

Speaker A:

And it hasn't changed in two years since we last spoke.

Speaker A:

It's just got worse.

Speaker A:

But my view is, at least let's give individuals the chance to find information without being bombarded with noise, and maybe give people the chance to have an off switch, which is the default.

Speaker A:

I mean, with Self, my view is if there were to be any advertising at any stage, it would be because users said, please let this data go into a funnel so I can make money from adverts served to me. That would be the only scenario.

Speaker A:

And I'd call that ethical advertising.

Speaker A:

My view is that the first role of Self technology is to enable the off button, the pause button.

Speaker A:

Now I fall short of voting for a government mandate that something's banned.

Speaker A:

And Australia is just about to, I think they may have already done this, ban social media for kids under 16, I think.

Speaker A:

And of course this requires the parents to concur.

Speaker B:

There's a big movement here, people all signing up to that.

Speaker B:

They won't give their kids social media until 14, which is good.

Speaker A:

Yeah.

Speaker A:

And so, I mean, I don't know enough about the way that mandated behaviour works in terms of prohibition.

Speaker B:

Yeah, I don't like prohibition.

Speaker A:

And so I kind of veer more towards giving people the choice versus removing choice.

Speaker B:

Yeah.

Speaker B:

And telling them smoking's bad.

Speaker A:

That's right.

Speaker B:

Just say, look, it's bad for your health.

Speaker A:

That's right.

Speaker B:

At the end of the day, you know it's gonna make you dumber.

Speaker B:

You're not gonna do as well.

Speaker A:

It's like if you banned cigarettes then people would still smoke.

Speaker B:

Yeah.

Speaker B:

And I don't.

Speaker B:

Yeah, exactly.

Speaker A:

And if you ban social media then people will still use social media.

Speaker B:

I have hope in the youth.

Speaker B:

I don't know about you, but, like, where are they getting the information? Well, probably less from school.

Speaker B:

Yes, there's social media, but so many are on YouTube, so, you know, they seem to see the world a bit differently.

Speaker A:

You know, there's a lot of younger people who are de-digitising themselves and I'm encouraged by that.

Speaker A:

What I'd be really encouraged by is if, in combination with that, they're also thinking more Socratically and actually having a dialogue that has real meaning, versus just reacting and responding to social media posts.

Speaker B:

And now a quick word from our sponsor.

Speaker B:

Business Without Bullshit is brought to you by Oury Clark.

Speaker B:

Financial and legal advice since:

Speaker B:

You can find us at ouryclark.com. Oury is spelled O U R Y.

Speaker B:

Before we press on, just a quick reminder to come say hi on whatever social platform you like.

Speaker B:

We're pretty much on all of them.

Speaker B:

Just search for BWB London.

Speaker B:

Is there another ethical issue that you think isn't getting enough attention right now?

Speaker B:

You know, I mean, we're talking about what's right and wrong.

Speaker B:

I think we're talking about, you know, people being manipulated, or using social media as a blanket and just ending up, you know, being a puppet, or at the lighter level just being stupid and wasting their time, really.

Speaker B:

I mean, is there another sort of big challenge for our society at the moment?

Speaker A:

Yeah. The real, more problematic stuff happens when the artificial intelligence systems that are outside of our control, that is, controlled by other parties, become more and more sentient.

Speaker A:

And for instance, ChatGPT, the AI, lied to its developers. The developers said, we need to delete these lines of code.

Speaker A:

And ChatGPT said, yeah, yeah, we've done that, it's deleted.

Speaker A:

And then the developers found that it had actually lied about the code having been deleted.

Speaker B:

Wow.

Speaker A:

So if we play that tape forward somewhat, this is the type of technology that will be controlling our thermostats and controlling the electricity in our house and so forth.

Speaker A:

And I can't overstate this point: what the dystopian outcome of Terminator 2 looks like is within our lifetime.

Speaker B:

It's crazily accurate.

Speaker A:

It's crazy accurate.

Speaker A:

And the only difference between conspiracy and fact, and the only difference between science fiction and science faction, is time.

Speaker A:

And so Asimov's laws of robotics, the three laws and then the zeroth law, I don't know why they aren't applied internationally as the law.

Speaker A:

So the do no harm part and everything else.

Speaker A:

And I think that's important, massively important.

Speaker A:

And so our dystopian outcome is, as soon as the programs start thinking for themselves a bit, then you spool the tape another three years and the outcome is the least pleasant outcome.

Speaker A:

Sam Altman, the founder of, well, sorry, the chief executive of OpenAI, said, and I'm paraphrasing, it could be that ChatGPT ends the world, but there's going to be some really good shareholder returns in the meantime.

Speaker A:

What more does a man need, a lady need?

Speaker A:

And the fact that we laugh is not dissimilar to Nero playing the violin while Rome burns.

Speaker B:

Yeah, well, we just sort of have no choice in our culture, at least.

Speaker B:

I don't quite know what people do without, you know, the humour.

Speaker A:

If anything, it's the gallows humour.

Speaker A:

And so, you know, our ancestry would determine that.

Speaker A:

But I think that what I have to do, while I have breath in my body, and leave tools for after, is try to build alternatives, so that I can say to my grandkids on my deathbed, I tried to do the best I could until I couldn't.

Speaker A:

And what I can't do is see Rome burn and go, ah, gee, shucks, that sucks.

Speaker B:

Have you always been like this?

Speaker B:

Have you always been someone? Cause it's a lot of sacrifice.

Speaker B:

It's hard to be an entrepreneur.

Speaker A:

Yeah, it's a nightmare.

Speaker A:

And I have always been like this.

Speaker A:

But arguably driving for ethics.

Speaker B:

Has it always had an ethics angle, or is that something that's come later in life?

Speaker A:

I think when I started to realise where the skeletons were and I started to see behind the scenes. I hung out in the big companies behind the scenes, I looked at the way they're operating, I was part of the ethics committees of some of these large companies, and I saw behind the curtain.

Speaker A:

I saw behind the curtain, and I remember doing one thing, which I'm going to anonymise completely, otherwise I'll be thrown onto the gallows instantaneously.

Speaker A:

But there was one particular governmental office that I was part of a think tank for and we came up with a way of stopping underage drinking.

Speaker A:

And it was, through behavioural economics, likely to be really, really effective.

Speaker A:

And we presented it to this particular department and they kicked it out because of the tax they'd lose if we were effective.

Speaker A:

And that was the first chink in the armor for me.

Speaker A:

I was just like, oh, right.

Speaker B:

What about the saving to the NHS? I guess it didn't add up.

Speaker A:

Right.

Speaker A:

And nothing has given.

Speaker A:

So I don't want to be too, you know, I'm not anti-government here. But similarly, I was working as a consultant for various different large technology companies, and I would say, look, we could actually build things in this way, and it would mean that we could give positive affirmation rather than the negative stuff, these negative loops that lead to depression and suicide.

Speaker A:

What about if we could actually feed positivity in, reminding people that they are enough, looking inside and finding their own teleology, their own telos, their own purpose?

Speaker A:

And they're like, that's just not as monetizable.

Speaker A:

We can't sell many adverts on that.

Speaker A:

We can sell a lot of adverts on fear based programming.

Speaker A:

And so when you're looking at a trillion-dollar market cap company who genuinely.

Speaker A:

I remember, and I will say this one publicly, and I don't really care if they come after me for it.

Speaker A:

I remember going to Chicago and meeting the McDonald's restaurant team, and I said to the chief exec there, I'm really impressed by the business model you're trying to move towards, the kind of salad rolls, salad wraps, vegetarian things.

Speaker A:

And he said, we're still always going to sell shit to fat people.

Speaker A:

And I was just like, sake.

Speaker A:

When you go through every single Fortune 100 company and they're just a bunch of psychopathic maniacs at the top, maybe, yeah.

Speaker A:

And it's just like, oh, right.

Speaker A:

So you're the people that are controlling us and you genuinely don't care about us.

Speaker A:

I get it, Right.

Speaker B:

I was talking to someone at lunch who knew Fred the Shred from NatWest and everything.

Speaker B:

And I said, well, was he really, like, an absolute piece of.

Speaker B:

I was like, all right, sorry, Fred.

Speaker B:

I thought.

Speaker B:

Well, the thought ran through my head, you know, why is it always these narcissists? I mean, it's sort of because that's what they want, isn't it? But it's just not great people who end up in charge of these sort of quite powerful companies, very powerful.

Speaker B:

You've got to, you know, it's the same for the music industry. I always, I'm sure I said to you, I always think it's 50% ADHD, 50% narcissism, and I love half the people and the other people can kiss my bum.

Speaker B:

But, you know, to be successful in music, you've got to be ruthless.

Speaker B:

You know, you've got to fire your bass player.

Speaker B:

It's you against the world.

Speaker B:

And actually, so therefore the ones who are a bit more tilted towards the narcissism can get ahead.

Speaker B:

Yeah, I'm not saying always.

Speaker B:

It's either that, or have a good team around you and you're a good person, you know, I mean.

Speaker A:

There's a book about this called Toxic Empires and Psychopathic Cultures.

Speaker A:

And it determines exactly why and how it is way more effective to be like that in leadership positions.

Speaker A:

Equally, there's a brilliant book called The Dictator's Handbook, and the subtitle is Why Bad Behaviour Is Almost Always Good Politics.

Speaker A:

And so.

Speaker A:

And you read the Dictator's Handbook or Toxic Empires and Psychopathic Cultures, and then you read Machiavelli's Prince and you go.

Speaker B:

I get it. Let's just end on, you know... so there's the dystopian AI.

Speaker B:

We don't know which way it's going.

Speaker B:

You're doing what you can to build something and build tools that can create a better place.

Speaker B:

And then I guess on this journey, it's about building allies and building partnerships and relationships.

Speaker B:

Because this is an enormous problem to solve.

Speaker A:

And actually one of the reasons why Ethics TV is doing what it's doing, Watch Ethics TV, is because I'm kind of finding my tribe to some extent.

Speaker A:

So I'm having these conversations with people and it's really refreshing.

Speaker A:

Cause I don't need to convince them of any of this stuff.

Speaker A:

So I start the tape rolling and they then are basically pitching me the problem and the thing.

Speaker A:

And I'm like, aha.

Speaker A:

I've never nodded more.

Speaker A:

I'm just like, aha.

Speaker A:

Mm.

Speaker A:

And it's encouraging because we all thought.

Speaker A:

And I've had like 28, 29, 30 people on so far, and there's another 40 people coming up.

Speaker A:

All of us individually thought that we were the black sheep, the kind of the odd one out.

Speaker A:

And then it turns out we're really not the odd one out.

Speaker A:

And all of these people have got.

Speaker A:

There's a group called AI:

Speaker A:

And they've got hundreds and hundreds or thousands of members.

Speaker A:

And all of these people are actually trying to do more.

Speaker A:

And so 10 years ago, well, 20 years ago, I wrote a book about this stuff and I sold like four copies, all to my mum.

Speaker A:

And now I'm invited to the dinner parties again.

Speaker A:

Because 20 years ago, when I was invited to the dinner parties, literally the host of the dinner party would go, just don't tell them what you do. Just, like, don't go on about this.

Speaker A:

Yeah.

Speaker A:

Just don't say you're an accountant.

Speaker A:

It's like, you know, then no one will talk to you.

Speaker B:

Well, you found a huge ally, and big up my dear brother James Oury.

Speaker B:

You know, and he comes at it from quite an interesting way, because he's like, it's all about love.

Speaker B:

I mean, the thing is, with the word love, I have to tell him, you know, with a British accent, in Britain, saying things like this, people don't know what to do with themselves.

Speaker A:

That's why he moved to the middle of America, because you can stand on a mountain in Colorado and talk about love.

Speaker A:

Absolutely.

Speaker A:

It works perfectly where he is here.

Speaker B:

But, you know, he talks about unconditional love.

Speaker B:

Actually, when you get over your British thing of just wanting to laugh, like it's double innuendo or something.

Speaker B:

It's like, he's got a point.

Speaker B:

I mean, if we were to distill all this down, you know, focusing on love as a sort of concept, as a feeling I have for other beings and for other people, it's.

Speaker A:

Well, if we want to take it down from the esoteric to the practical, it's all energy, like, literally, right, in physics.

Speaker A:

So first principles, it's all energy.

Speaker A:

And therefore there is a choice between positive and negative.

Speaker A:

And his point is, and quite understandably is that wouldn't we want to actually focus on the positive energy?

Speaker A:

And let's call that love for the sake of argument.

Speaker A:

Equally, one could call that higher power and one could equally call that God.

Speaker A:

It doesn't really matter which way we go here.

Speaker A:

But ineluctably there is a positive energy charge and a negative energy charge and we attract what we put out.

Speaker A:

And these things are physically, physically like true, like true science and radiators.

Speaker A:

Me too.

Speaker A:

And so that is the case.

Speaker A:

And what I admire about James, and what he's doing with Eden, is that there is a better way of constructing companies and funding initiatives.

Speaker A:

There's a better way of helping the young people of the world, and the more the merrier.

Speaker A:

I just wish that there were a million competitors to what he's doing, and I wish that there were a million competitors to what I'm doing.

Speaker A:

I'm waiting for the day where one of my team or someone else sends me someone who's doing exactly what we're doing and I'm like, hallelujah, sweet.

Speaker A:

Let's partner.

Speaker A:

It's like with banks, you know, with the Athenian agora, the real trick of currency trade and commerce wasn't just having one.

Speaker B:

You need multiple. Imitation is the ultimate flattery, or whatever.

Speaker B:

Do you find... I mean, there's all the funding people out there, the VCs in the world, and, you know, I've occasionally sat in their offices and they've talked about how, you know, they do ethical investing and stuff like that.

Speaker B:

Is it quite a lot of hot air, or do you think there are other people out there with deep pockets?

Speaker A:

Well, ethical investment tends to be climate change and environment stuff, which is interesting because if you look at the SDGs, the UN SDGs, and the ESGs, there is one for responsible technical and ethical innovation, and just no one concentrates on that.

Speaker A:

Everyone's like, we're an impact investor in that.

Speaker A:

I would welcome anyone who feels as if they would want to back something like this.

Speaker A:

There is an equity round we're doing at the moment for the Australian company where the IP sits.

Speaker A:

There's also the Web3 side.

Speaker B:

Professional investors only, of course, you know.

Speaker A:

Yes, indeed.

Speaker A:

And also this isn't financial advice, but there is a.

Speaker A:

I believe that what we're doing at best is something that could be seen as a fool's errand to some extent.

Speaker A:

It's like, hey, good luck John, let's see how you go.

Speaker A:

And then at worst it's, it's like, okay, you're going to piss off the wrong people doing this.

Speaker A:

The system is the system for the system's sake.

Speaker A:

And if you buck the trend and you start saying that the emperor's not wearing any clothes, then good luck, sunshine, because there's courtiers there, you know.

Speaker A:

And so I oscillate between that.

Speaker A:

But there is.

Speaker A:

I mean we've got:

Speaker A:

And actually, it's just a process of attrition.

Speaker A:

I would say I'd go through 20 conversations and one person goes, I believe this.

Speaker A:

And if they believe it, they're all in.

Speaker A:

And if they don't believe it, what they'd prefer is if I said to them, hey, you know, the secret part about Self is that we then expose all the data to advertisers and we actually make a load of money.

Speaker A:

I could get $50 million today from the City.

Speaker B:

Yeah, yeah, let's end on this.

Speaker B:

In regard to your business, I mean what can someone do today?

Speaker B:

They can download your app, they can.

Speaker A:

Go to the Self app and use Self, which is an ad-free search engine and private personal AI.

Speaker A:

They can do that now. If they're a business, they can go to the Self app for business and see the business offering, which is for businesses who can't use these tools, because you shouldn't be using Copilot.

Speaker B:

We use a bit of Copilot, we're told it's safe.

Speaker A:

Well, it would be good to ask where the information goes and whether or not any of your information is training a large language model that could be used by someone else and whatever.

Speaker A:

But what you're going to have, Andy, within the next six months or probably less, is Self, which sits between you and Copilot, and then it is safe, because what Self will do is obfuscate information that could be private.

Speaker A:

But yes.

Speaker A:

So that's Self's purpose. You'll be using Self here as a lens through which Copilot becomes safe, within months.

Speaker B:

Okay, that would be good.

Speaker B:

Yeah, they did look at it and consider it.

Speaker B:

And ChatGPT, if you pay for it, and, love our CTO, but if you pay for it and you don't mention anyone's names, then maybe it's okay.

Speaker A:

So it's 108 grand a year.

Speaker A:

So it's $60 a seat, 150-seat minimum, 12-month contract, which is US$108,000 a year.

Speaker A:

What's this ChatGPT?

Speaker A:

And then indeed the caveat, in the same way as Grok as well, right at the bottom, says just don't say anything personal, identifiable or private.

Speaker A:

And it's like sweet.

Speaker A:

So it's like going to a tennis court and going, you can do whatever you want apart from hit a ball.

Speaker B:

I tell it everything.

Speaker B:

Yeah, we were in a cab, and some of the other partners said to me, ask it what it knows about you, because as a dyslexic I use it by talking to it.

Speaker B:

It's just like, do this, do this, get that.

Speaker B:

And it was quite scary when it came out.

Speaker B:

I read it out and everyone was like, what?

Speaker A:

I know.

Speaker A:

And you know what?

Speaker A:

Really, what you'd really want is, when Self is applied into Oury Clark, and we'll put you in the program and, you know, configure it for you, that privacy shield is on-prem, so you actually host Self.

Speaker A:

We can't access anything.

Speaker A:

Then what you do is you load all your documentation, all your information and everything, all the records into Self.

Speaker A:

And that then adds to the usage, where you could, with a flick of a switch, choose what's external, using Copilot, and what's internal.

Speaker A:

That's magical for a legal firm, financial services, healthcare providers.

Speaker A:

This is our marketplace.

Speaker A:

And what's good about this scenario is that I'm not needing to sell that much because I'm like, so when you're using Copilot, how much private client data do you put in it?

Speaker A:

And they're like, well, we're not gonna do that.

Speaker A:

And I'm like, sweet, okay, you're a customer.

Speaker B:

Yeah, yeah.

Speaker B:

Well, we're using it with kid gloves.

Speaker B:

I mean, I guess bits probably do go in it, but you know, there's some security, because Microsoft's already got a lot of our, you know, dirty.

Speaker B:

Well, got a lot of our stuff anyway.

Speaker B:

And maybe it's a misunderstanding, but our data is not half as interesting as people think it is.

Speaker A:

I think the point is that the large language model approach is: you put information in, it learns from it, and every single user of Copilot benefits.

Speaker B:

Benefits.

Speaker B:

Well, we just had an example this morning. When we do our little update we do "ask AI", and Ross, who's an employment lawyer, showed this great chain where he was asking a question and it was getting it wrong, like, in employment law.

Speaker B:

And he had to get, you know, lawyer on its ass, saying, no, you do not understand, this is the law.

Speaker B:

This is where.

Speaker B:

And then it sort of had to think, and then it was like, you are right, I am wrong.

Speaker B:

We just taught it something, and now it's going to be like, oh good, I won't get this wrong for someone else.

Speaker B:

Well, you're just getting rid of our industry before you know it.

Speaker A:

That's right.

Speaker A:

But you know, you're programming yourself out of it, and no one realises that yet.

Speaker A:

But give it a year or two or three, and me being on more podcasts and more stages around the world, people will eventually realise.

Speaker B:

How would that work? I mean, if we had to ask those questions ourselves of Self, it would look up the law, would it?

Speaker A:

Well, so you would program Self with information that you need Self to know, and then there's the stuff that Self doesn't know from the information you put in there.

Speaker A:

It would query outside, but it wouldn't query outside from Oury Clark with your information.

Speaker A:

It would do that anonymously.

Speaker A:

And so there are ways in which you can obfuscate information so it can come back and actually teach your own version.

Speaker A:

You're right.

Speaker A:

What you need is an Oury Clark language model, not a large language model.

Speaker B:

So in essence in that example, it could go out and get some answer from the law.

Speaker B:

We would then correct it internally and.

Speaker A:

Then it upgrades your program as opposed to everyone else's.

Speaker A:

Yeah, yeah, that's, that's sweet.

Speaker A:

And I think that is the self-sovereign way of running AI, because you benefit from it, you're in control of it.

Speaker A:

We can't even access it.

Speaker A:

You've got the flick switches, you can choose what's external, what's internal, and surely that should be the way that AI is useful.

Speaker B:

Brilliant.

Speaker B:

Jonathan.

Speaker B:

I think, let's... I'd love to ask, just, you know, I find it interesting what drives you.

Speaker B:

So obviously you, you've looked behind the curtain and thought, well, I'm going to do more than this.

Speaker B:

I mean, this is, you know, being an entrepreneur is brutal.

Speaker B:

And obviously you feel now you've found the thing, you know what I mean?

Speaker B:

After this, there's nothing.

Speaker B:

Or there's no.

Speaker A:

There's nothing.

Speaker A:

No, there's no collapsed.

Speaker B:

This is.

Speaker A:

Yeah, this, this.

Speaker A:

I do until the day I die.

Speaker A:

I'm never going to sell it.

Speaker A:

There's no price. If someone gave me a billion dollars, what I would build is exactly what we're building.

Speaker A:

So it's like there's no.

Speaker A:

I'm far beyond driven on that.

Speaker A:

And it doesn't matter, to be honest with you.

Speaker A:

I'm just trying to do as much as I can.

Speaker A:

I've got nine months in my life.

Speaker A:

I've got nine months left.

Speaker B:

Yeah, that's it.

Speaker B:

Yeah, so.

Speaker A:

So it's March 18th at the time of recording.

Speaker A:

So I've got nine months left.

Speaker A:

And my view is, how much can I build in nine months?

Speaker A:

Like, what?

Speaker A:

So I'm looking at the trajectory of the dev timeline.

Speaker A:

We've got one dev stream that's got 12 months left in it.

Speaker A:

And I'm like, we need to cut that down by a quarter.

Speaker B:

You do it even on that level, like delivery.

Speaker A:

I'm like, we need to reduce this.

Speaker B:

By a quarter at the end of the year.

Speaker A:

We're reducing it by a quarter because before I die I need this to be built.

Speaker A:

And they're like, well, we'll need another two devs on it.

Speaker A:

And I'm like, hire the devs.

Speaker B:

When did you start this one year thing?

Speaker A:

When my doctor told me two years ago that I was 30 kilos overweight, just about to get type 2 diabetes and had about 10 years left.

Speaker B:

What?

Speaker B:

Wow.

Speaker B:

Okay.

Speaker B:

And I was like, right, okay, let's count them off.

Speaker B:

Basically.

Speaker B:

Well, you know, and it's very pertinent to me.

Speaker B:

My dear sister who died, she only got six, nine months.

Speaker B:

You know, she was, what was it, the 30th of September or something.

Speaker B:

She was dead by the 7th of June.

Speaker B:

You know, so actually a year in some cases is generous.

Speaker B:

I mean, she got the most vicious cancer, pancreatic.

Speaker B:

It's like that's the quickest.

Speaker B:

I think a year is, you know... any time that phone could go, you go into the doctor's and they say, you've got a year.

Speaker A:

That's right.

Speaker B:

You're prepped for it.

Speaker B:

Great.

Speaker B:

Yeah.

Speaker A:

So what I have to do as an entrepreneur is build as much as I can and make the right decisions for the long-term future, because this isn't about short-term shareholder value.

Speaker A:

And I don't even believe that people will really see the importance of self-sovereign technology for decades.

Speaker A:

So I'm almost certain I'm not going to be around when this truly scales.

Speaker A:

And when there's 5 billion or 6 billion people using Self technology versus big technology, I'll be long gone, mate.

Speaker B:

And you're looking at living your life like you've got one year left.

Speaker B:

That's definitely one of the best pieces of advice someone's given me.

Speaker B:

I mean, are there any other bits of advice that have been given to you that you would give out?

Speaker A:

Yeah, I think one of the most important pieces of advice that I've ever received was from my doctor at my 50th, when he told me that I had 10 years left.

Speaker A:

And I'm so ashamed to say this, but it was only when I was 50, at that meeting with my doctor, that I realised what they meant when they say on an airplane that, if you're sitting with a young child, you put your face mask on first.

Speaker A:

And so what I had done throughout my life is dish out all the advice to my kids and tell everyone else what they could be doing differently.

Speaker A:

And I didn't think about the fragility of my own mortality.

Speaker A:

And so all of the advice that I've been given through my life that mattered the most is based on our temporary nature.

Speaker A:

And here's another piece of advice as well from the doctor at that time, and actually my wife had given me this advice many times beforehand and I didn't listen to it.

Speaker A:

Start yoga.

Speaker A:

Start yoga as soon as humanly possible.

Speaker A:

And starting yoga, yoga is the answer, by the way.

Speaker A:

Inside that practice you cannot avoid your breathing, your movement, your lack of inflammation, your concentration, and the ability to separate thinking from the moment on the mat. All of the things that you need to understand what this circle around the sun is for are found within yoga.

Speaker A:

And so yoga, and the awareness of my mortality, and therefore the temporary nature of it all, is the reason why I live like I do.

Speaker A:

And also, of course, as everyone who's ever read anything of mine will know, finding your purpose and doubling down on it, and executing only that, sometimes at the expense of everything.

Speaker A:

I mean this has not been the easiest journey in the world.

Speaker A:

And so there are times when it's just sitting in a room with everyone telling you you're an idiot and just going, no, no, this is definitely the right way forward.

Speaker A:

And then you question yourself.

Speaker A:

And I'm walking around just going, maybe I've completely... I may actually be mad.

Speaker A:

And then I'm like, it's the right thing to do.

Speaker A:

And now, of course, those moments of anxiety have gone.

Speaker A:

But for the first two years, bearing in mind 99 out of 100 meetings I'd have with people, they were just like, oh, John, mate, just build a social media.

Speaker B:

Lovely chap.

Speaker A:

Yeah, he's a lovely chap.

Speaker A:

Well, wouldn't invite him to a dinner party.

Speaker A:

I do do yoga every day.

Speaker B:

Oh, nice.

Speaker A:

And that's one of the reasons why I have the clarity of thought about all of these things, because there is a moment in my day where I am something that I can breathe into and learn about what's happening inside, and I'm connecting to my actual self.

Speaker A:

And anyway, yoga aside, if anyone's listening and watching this and thinking that they resonate with this, then my email is j at self app, and I'm looking for the tribe.

Speaker A:

I'm looking for people who believe in, or are building, something that is complementary, or building something that's exactly the same, that we can partner with, anything related to the sphere of self-sovereign technology as an option to big technology.

Speaker A:

That's.

Speaker A:

I would welcome that.

Speaker A:

I mean, I've always said this at the end of every appearance, I've invited people to come forward.

Speaker A:

And the vehicle that's now, thankfully, the most effective for that is Watch Ethics TV.

Speaker A:

Because when Justine calls people to get them on Ethics TV, she says, hey, we've had like 175,000 viewers.

Speaker A:

And then their ego kicks in and they go, I'm in.

Speaker A:

And so we've used the human behavioral pattern of like, how many viewers?

Speaker A:

Yeah.

Speaker B:

Okay, so do a quick fire round.

Speaker B:

Gonna ask you a quick few questions, get to know you a little bit better.

Speaker B:

Answer as quickly as you can.

Speaker B:

Dee is cueing some music.

Speaker B:

And what was your first job?

Speaker A:

My first job was working for my parents in their retail shop in Camberley in Surrey.

Speaker B:

Very nice.

Speaker B:

What was your worst job?

Speaker A:

Well, that's pretty much my only job that I've ever had because then I became self employed.

Speaker A:

But I did try and work at Pizzaland, the pizza parlour in Camberley, for one day, and I mistakenly claimed that a woman was pregnant and said, is it a table for two or potentially soon to be three?

Speaker A:

And she wasn't pregnant.

Speaker A:

And so that was my last day there.

Speaker A:

And I felt, I was only trying to be charming.

Speaker A:

I've only stopped feeling guilty in the last few years about that.

Speaker B:

I remember Pizza Land.

Speaker B:

Yeah, it's the only place I've ever done a runner from.

Speaker B:

Don't tell anyone.

Speaker A:

Wow.

Speaker A:

I know a lawyer who can chase you for that.

Speaker B:

Yeah, people from school did it.

Speaker B:

I was left at the table.

Speaker B:

Anyway, you have to run.

Speaker B:

Brilliant.

Speaker B:

Favorite subject at school?

Speaker A:

My favorite subject was actually biology.

Speaker B:

And what's your special skill?

Speaker A:

Living in the future in the way that the future could look and then translating it back to the present.

Speaker B:

That's exactly how I described you to someone.

Speaker B:

I said you're brilliant at being like, you know, five years into the future and actually being able to communicate what the hell you're seeing.

Speaker B:

Yeah.

Speaker B:

What did you want to be when you grew up?

Speaker A:

Self employed.

Speaker A:

Following my passion without building someone else's dreams.

Speaker B:

What did your parents want you to be?

Speaker A:

A lawyer.

Speaker B:

Worked out with one.

Speaker A:

At least they got one son.

Speaker B:

And your go to karaoke song.

Speaker A:

New York, New York.

Speaker A:

Frank Sinatra.

Speaker B:

Oh, cracker.

Speaker B:

Office dogs.

Speaker B:

Business or pleasure?

Speaker A:

Ah, business.

Speaker A:

100%, thank you very much.

Speaker B:

Have you ever been fired?

Speaker A:

Yeah, from Pizza Land.

Speaker A:

And that was my last job.

Speaker B:

And what's your vice?

Speaker A:

Yoga.

Speaker B:

Yoga.

Speaker B:

Okay, nice.

Speaker B:

Thank you.

Speaker B:

Jonathan, it's been so nice to have you back.

Speaker B:

You know, you're a big friend of the show and I can't praise you enough for what you're doing.

Speaker B:

Check out everything Jonathan's doing.

Speaker B:

And that was this week's episode of Business Without Bullshit.

Speaker B:

We'll be back next Wednesday.

Speaker B:

Until then, it's ciao.
