Meta’s $16B Ad Scam & Guest Frank McCourt’s TikTok Bid
Episode 39 • 12th November 2025 • The AmberMac Show • AmberMac Media Inc.


Shownotes

On this week's episode of The AmberMac Show: We dig into how Facebook, Instagram, and other Meta platforms are raking in billions by scamming online users through deceptive ads. And, speaking of scams, Elon Musk is taking his Tesla board for a ride for a cool one trillion dollars; thankfully, not all billionaires are built the same. This week's guest is billionaire investor Frank McCourt, Executive Chairman of McCourt Global and Founder of Project Liberty, who discusses his bid to buy TikTok and his mission to build a better Internet for all. Finally, in this week's TheOtherMac, Jeff talks about dating apps, AI chatbots, and the loneliness epidemic.

Send a voice message to The AmberMac Show

Check out our guest on social…

  • Frank McCourt - Author & Executive Chairman of McCourt Global and Founder of Project Liberty: Book | Website

Be sure to follow our hosts...

Transcripts

00:02.99

Let's reclaim what's ours. Our data is our personhood, it's our being, it's our digital DNA. It's probably more valuable than our biological DNA, because it's our whole lived experience.

00:15.78

It's what we feel, what we think, how we emote. And the thought that a young person could go on a chatbot, share their vulnerabilities directly,

00:33.02

with a chatbot and then be manipulated by that, it's just morally wrong. We need technology that's in harmony with our democratic ideals and principles and basic human rights.

00:50.70

The AmberMac Show on Canada Talks, SiriusXM 167. Welcome to The AmberMac Show. Coming up in today's show: we're going to talk about Zuckerberg's greatest scam yet, one that is making the company billions of dollars.

01:01.93

And speaking of scams, Jeff, why Elon Musk is taking his Tesla board for a ride for a cool $1 trillion. Oh, yeah. And we're also going to talk about Apple's AI problem and Kim Kardashian's ChatGPT breakup. And Amber, I'm looking forward to your conversation a little bit later with the billionaire businessman who wants to buy TikTok and make the Internet a better place.

01:23.39

But first, let's get into the headlines. News Feed. Well, Amber, I talked about Meta being the cornerstone of the internet fraud economy in my The Other Mac segment about a month ago, but Reuters has just reported on a trove of previously unreleased documents that show not just the scale of Meta's scam ad empire, but Meta's startlingly casual attitude about trying to improve the situation.

01:49.36

Meta estimated that in:

02:06.23

been going on for years, and in:

02:26.16

s are just disgraceful. So in:

02:39.45

And in fact, regulators in the UK found that Meta was involved in 54% of all payment-related scam losses that year, more than double that of every other social media network combined.

02:53.82

And so what was one of Meta's main suggestions for how to deal with this? Why don't we charge scammers more for running scam ads? Which just obviously shows how little they care about users in all of this.

03:08.28

The bizarre thing to me is that here in Canada, as I think many people know, because of the Online News Act, Meta has complied with the act by blocking news. So I totally understand why they chose to do that.

03:23.13

But there are a lot of people saying, hey, you know, they're just adhering to the law. Well, it seems that Meta is one of these companies, Jeff, that only adheres to the law when it makes sense for them and they make more money, because this is a perfect example, when you talk about allowing what are in some cases unlawful advertisements and business activities.

03:44.60

You can see that they are making billions of dollars by acting in this really shady way. And Amber, I would also interject that one of their solution points was that we have to sort of prioritize where we deal with these fraudulent things, because we don't want to get sued by regulators. So they were saying, let's make sure that we focus on places where we might have to pay a fine, because that'll save us more money. It's very deeply insincere on their part.

04:19.09

It really is. And I will say, when I was posting about this on LinkedIn, Arlene Dickinson, who people will know, of course, from Dragon's Den, as well as other ventures and programs, she actually posted that she had a scam ad using her likeness within the Meta universe, and it was nearly impossible to get it taken down. So if Arlene Dickinson cannot get a scam ad taken down, all of us are in serious trouble. Indeed.

04:43.37

Now, speaking of scammers, it looks like Elon Musk has convinced Tesla shareholders to support his ridiculous $1 trillion pay package. Yeah, and this is such Kool-Aid-drinking nonsense. It's like a cult. People point to the fact that, oh, there are some really major accomplishments necessary to get to that $1 trillion amount.

05:03.54

Well, fair enough. I'm not sure that $1 trillion amount is the goal, because what they ignore is that Musk will still be grossly overpaid even if he just manages to secure a fraction of those goals, some of which are pretty easy.

05:15.80

And, you know, give him controlling shares and all this sort of stuff. It's just hard to imagine. And it's in the face of Tesla sales continuing their downslide in North America and just cratering in Europe. Like in Germany, sales are down by 50% this year.

05:33.17

That also ignores the fact that Tesla shareholders, in doing this, are diluting their own shareholdings in the process. So it's really hard to understand. But don't worry.

05:45.71

Elon's got another solution for this. He swears that Tesla drivers will be able to, quote, text and drive in the next month or two. And I guess he's forgotten that not only has the full self-driving mode for Teslas actually killed people, and they're in court for it, but texting while driving is illegal in most places.

06:08.58

But what does that matter? As we're increasingly seeing, laws are for the rest of us, not for billionaires, right, Amber? Yeah, absolutely. I mean, here are two great examples, right? Zuckerberg and Musk. They only adhere to the law when it is in their best interest, and otherwise they totally ignore it.

06:27.26

reach users in the spring of:

06:52.32

I mean, something has to be done about Siri. I mostly love Apple products, but Siri's a total embarrassment at this point. So it's good that they are doing something. It's getting to the point where, if they can't do it themselves, it makes sense for them to use a partner.

07:09.50

So the agreement specifies that the Gemini model will still be run on Apple's own servers. So no data is supposed to be shared with Google. So hopefully the privacy stuff remains intact.

07:21.26

And it also does seem that Apple will continue trying to develop its own solution in the meantime. And in terms of partnering with anyone, frankly, Google is not perfect, but I sure would trust Google over OpenAI at this point.

07:35.45

And Apple's obviously been working with Google for years already through the Safari browser arrangement and all that stuff. So fingers crossed that this works well, because Apple users deserve not to be left in the trail of dust by the other, much more functional AI assistants out there.

07:54.21

Absolutely. And speaking of OpenAI, CEO Sam Altman has had a really rough couple of weeks. For example, details emerging from Elon Musk's lawsuit against OpenAI have produced testimony saying that Sam Altman's brief removal as OpenAI's CEO was the result of Altman not being, and I'm quoting here, consistently candid with his communications with the board,

08:17.99

end quote, as well as pitting executives against one another, withholding information, and providing conflicting information about his plans for the company. This is something out of a dramatic series or movie, Jeff, including, in the face of safety concerns that were being raised, releasing ChatGPT to the public without even telling the board.

08:37.37

The board apparently found out on Twitter. Yeah, and on top of that, we've got OpenAI's CFO saying that, oh, the market should be more exuberant about AI, and let's stop all this silly bubble talk.

08:51.76

In addition to musing about a federal government backstop guarantee for OpenAI because of its investments and importance to the race for American AI dominance, which she tried to walk back a bit, but then Altman came back and said something pretty close to that. So it's no surprise to see where this is all going, when you've got a relatively corrupt government in power that can be taken advantage of like this.

09:19.11

And what's wild, speaking of companies getting too big to fail, something we should have learned our lesson about in the Great Recession, is that this is all happening alongside one of the biggest players in the AI race, NVIDIA.

09:36.16

Their CEO said at the Financial Times Future of AI Summit in early November that, and I quote, China is going to win the AI race, end quote, arguing that cynicism and excessive regulation in the United States are partly to blame for the situation.

09:52.16

Wow. All right. Well, we're going to keep up with this AI news. You might have already seen Coca-Cola's new AI Christmas ads, which some people hoped wouldn't be repeated after last year's backlash over Coke's AI-generated Holidays Are Coming commercial.

10:10.27

But as Coca-Cola's head of Generative AI has said, and, by the way, that is a surprising entry in the Coca-Cola company's org chart: There will be people who criticize. We cannot keep everyone 100% happy.

10:24.11

Yeah. And he tried to pin last year's criticism on the craftsmanship quality of the ad, saying, quote, but this year the craftsmanship is 10 times better.

10:34.77

It may be better, but there's still no mistaking the animation as being anything other than AI-generated slop. It's also strange that they never mention Christmas in this ad, despite it being full of Christmas trees and Santa Claus and all this stuff. But anyway, do you know who else is criticizing AI, Amber?

10:55.90

A very important person, none other than Kim Kardashian. I saw this, so I'm glad we're getting a chance to chat about it. In an interview with Vanity Fair, among other outlets that also covered this, Kim Kardashian, who recently hopped on board the fake moon landing conspiracy train, said she used ChatGPT to study for her first-year law student exams and blamed it, that being ChatGPT, for

11:21.53

her failing three times, saying, and I'm quoting here, that it was always wrong. It has made me fail tests all the time. And then I'll get mad and I'll yell at it and be like, you made me fail.

11:34.24

Why did you do this? And it will talk back to me. I'm kind of speechless on this one. I mean, who would ever depend on a tool like this to pass any type of exam? Yeah. I mean, I think a lot of people just don't understand how AI works, and Kim Kardashian is one of the least surprising of those.

11:54.72

But, you know, life is a struggle if you're a billionaire reality TV star. So poor, poor Kim. Anyway, in another instance of tech companies trying to get more training data out of their users, the Tinder dating app is now testing a new Chemistry feature.

12:09.93

r pillar of Tinder's upcoming:

12:20.76

product experience. Wow. Okay, so Tinder's user base has been shrinking for the past couple of years, so this seems like an attempt to provide better value. But let's face it, this is also a total privacy nightmare.

12:34.22

Thankfully, this only happens once you give the app permission to do so, which I don't think I'd encourage anyone to actually do, Jeff. And I know you're going to talk now about AI's role in dating, and that's coming up next in The Other Mac.

12:50.19

minutes a day in:

13:04.00

And maybe that's because in this year's survey, 78% of users reported sometimes, often, or always feeling mentally, emotionally, or physically exhausted by their dating apps.

13:16.14

In response, apps like Tinder, Bumble, Hinge, and Grindr have begun investing in AI-related features in an effort to re-engage users. In fact, Match's 14th Annual Singles in America Study, released this summer, reported that 26% of singles are using AI to enhance their dating lives, an increase of 333% from just last year. This includes things like improving dating profiles, screening matches for compatibility, coming up with conversation starters, and lots more.

13:45.48

And companies are starting to build all these kinds of tools right into their apps. But AI romance has moved far beyond dating apps with the rise of AI chatbots. Match's study found that 33% of Gen Z and 23% of millennials surveyed have interacted with AI as their romantic partner, and 40% of users reported that they think their partner having an AI boyfriend or girlfriend would be cheating.

14:10.07

Unfortunately, there are some real dangers to becoming romantically involved with an AI as well. For one thing, as many of us have experienced, chatbots are often comically, overenthusiastically supportive of the person they're chatting with.

14:24.16

And while that can be nice a lot of the time, a good romantic partner is one who will challenge you when needed. But it's also a privacy nightmare. The amount of information that can be casually collected as part of an ongoing romance with an AI is staggering.

14:39.36

And there are tons of scams out there that could part vulnerable humans from money that they sure didn't intend to spend. Finally, there are well-founded psychological concerns about how dark things could get with an AI partner where any human partner would have just opted out.

14:56.17

But the most concerning thing to me is how this relates to what some people call our loneliness epidemic. Since I was born, the world's population has doubled and urbanization has meant that more people are living in cities than ever before.

15:09.72

So today you're likely to be in physical proximity to more people than ever in history. And yet we're more likely than ever to feel alone. Social media has done a great deal of work, keeping us addicted and glued to our phones, to the exclusion of people around us, trying to convince us that digital relationships are just as legitimate as real ones.

15:29.43

And now AI is joining this fight, and we have no reason to think that it will do anything other than compound the many harms that social media has inflicted on society. There's nothing at all wrong with online dating. Just don't get sucked into it taking up huge swaths of your time, because you'd be better off devoting at least some of that time to meeting somebody in real life.

15:49.16

But when it comes to getting romantically involved with a chatbot, an arrangement that will inevitably evolve into full-on AI robot companions, I think there's a real danger to our humanity there, and it risks turning the empowerment that people in good relationships feel into something more akin to slavery.

16:07.11

And in time, it will get pretty hard to figure out whether the human or the AI is the real slave. I'm the Other Mac. Good luck out there. The AmberMac Show on Canada Talks, Sirius XM 167.

16:23.44

Founder of Project Liberty. In:

16:36.40

At the Attention: Govern or Be Governed Forum in Montreal, Amber spoke with Frank about why he wants to buy TikTok, what he learned about the problems with social media when he owned the LA Dodgers, and how to build a better internet for the next generation.

16:54.94

All right, so we have about half an hour for a conversation on this topic of what the future of tech could be. ah Let's start there.

17:05.22

Why is this a topic that interests you? Well, for lots of reasons. I think I'd start with the fact that I'm a dad. I have eight kids, four from my first marriage, four from my current marriage.

17:20.73

I know what it's like to raise children pre-internet, and I know what it's like to raise children in this internet age, this digital age. And I've had my own experiences over the years with technology, and I've watched how this entire beautiful internet, which was a decentralized

17:44.06

telecommunications apparatus, essentially became highly centralized. And that concentration of power really concerned me. And lastly, I would just add, I come from a family of builders.

17:55.22

We were five generations of them, starting in Boston. My great-great-grandfather started building roads when Henry Ford started building cars. And we build large-scale public infrastructure, including internet systems and so forth. So, although I'm not a computer scientist or a technologist, we have a fair amount of knowledge as to how this technology actually works from the ground up.

18:18.69

And we're going to get into that. One of the things that we really want to focus on in this conversation is some of the solutions. How could things be different? What could be done? I wanted to talk to you about some of the work that you're doing at Project Liberty, for people who maybe aren't as familiar with the organization.

18:34.74

Yeah, so I had an experience in LA when I owned the Dodgers. In that era, I went through a public divorce. And I watched how social media had been weaponized.

18:49.53

And I felt a little bit of that on the receiving end. And it really concerned me that here you are with this technology that had shifted in the '08, '09 timeframe from this innocent kind of internet, where we reconnected with people and shared photographs with our friends and family and so forth, to something that was very, very performance-based.

19:18.02

You know, Facebook had introduced the like button, the smartphone was in everyone's hands. And suddenly it wasn't something that was necessarily factually correct or really newsworthy or insightful that was shared. What happened is that whatever was getting likes and followers and clicks became kind of the holy grail of how the internet works. So it became totally performance-based.

19:44.48

And during my era in LA, going through that difficult time, the metaphor I use is that I felt like I was watching my home burn down, and I was handed a small garden hose with low water pressure while a thousand people were just pouring gasoline on the flames.

20:05.29

It was just a very helpless feeling, and one that I think a lot of people share. And this was going back now 15 years. So my first instinct, after selling the Dodgers and moving on with life, was to get policymakers, like many of you here, working, understanding how this technology works at the core and how this kind of surveillance machine really was extracting everything that makes us human

20:37.60

with Georgetown University in:

21:22.91

And we've seen how difficult it is, though, to actually get these big platforms to operate differently, because we wouldn't be sitting here talking about all the harms and the concerns that are only going to get amplified by AI if policymaking alone could fix the problem.

21:42.87

So in:

22:00.65

And that begins with putting human beings at the center of the technology, not having the platforms be at the center. And this goal or purpose that you have has become increasingly important, I think, especially over the past few years.

22:16.05

years ago, I wrote a book in:

22:41.56

Can you talk a little bit about the past 10 years and maybe that shift that you mentioned, as far as the goal, again, of the organization? Yeah, so the aha moment for me really was when I brought together some brilliant technologists to say, like, how would you address this problem if you were given white space?

23:03.12

You know, just unlimited resources, knowing now what we know about the internet, how ubiquitous it is, how necessary it is, right? We're all dependent on it. It's no longer a nice-to-have thing. We're all connected

23:19.35

to the internet, and we need it to operate our daily lives, for governments to operate and individuals to operate and businesses to operate. And we see, when AWS has a shutdown, how things suddenly get paralyzed. So we see how dependent we are. So, okay, we know we've built this incredible technology.

23:44.39

How could it work differently? And the insight really was, you know, this is technology that was really built from the ground up by simple, core-layer protocols. The first set of protocols connected devices; that was the internet.

24:02.63

And then in '89, with Berners-Lee, another protocol connected our data. And that was the World Wide Web. And everything was moving along just fine until some people figured out that data was gold.

24:17.48

The data was everything. And we saw the creation of these massive companies, this machinery that is really all about extracting personal data and accumulating it, aggregating it, applying algorithms and now AI.

24:33.02

Which, you know, is just a more powerful algorithm. And so they all discovered, whether it was social, shopping, or search, that it was our personal information that gave these platforms incredible insight about us. And when I say incredible insight, I'm not talking about them knowing a few things about us.

24:53.84

Each has a million data points on each of us. They know far more about us than we know about ourselves. And so, why not another core-layer protocol that actually connects us, so that we're the ones who are owning and controlling our data, permissioning its use, benefiting from it, and so forth?
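To make the permissioning idea concrete, here's a minimal toy sketch in Python. This is not Project Liberty's actual protocol (the episode doesn't cover its implementation); the class and method names are hypothetical, and the point is only the inversion of control: the owner grants and revokes access, and apps can read only while consent stands.

```python
from dataclasses import dataclass, field

@dataclass
class PersonalDataStore:
    """Toy model of a user-owned data store: an app can read a field
    only while the owner has granted (and not revoked) permission."""
    data: dict = field(default_factory=dict)
    grants: set = field(default_factory=set)  # set of (app, key) pairs

    def grant(self, app: str, key: str) -> None:
        # The owner, not the platform, records the consent.
        self.grants.add((app, key))

    def revoke(self, app: str, key: str) -> None:
        # Consent is reversible at any time.
        self.grants.discard((app, key))

    def read(self, app: str, key: str):
        # Access is enforced at the data layer, not by platform policy.
        if (app, key) not in self.grants:
            raise PermissionError(f"{app} has no permission for '{key}'")
        return self.data[key]

# Usage: the user decides who sees what, and for how long.
me = PersonalDataStore(data={"social_graph": ["alice", "bob"]})
me.grant("photo_app", "social_graph")
print(me.read("photo_app", "social_graph"))  # ['alice', 'bob']
me.revoke("photo_app", "social_graph")       # access ends when consent ends
```

The design choice being illustrated is that permission checks live with the data itself, which is the opposite of the surveillance model McCourt describes, where the platform holds both the data and the rules.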

25:15.26

I think, for me, what was a critical shift in my thinking was that, rather than just trying to control these huge companies that are uncontrollable,

25:29.12

what if we take Buckminster Fuller's adage and put it into operation? And I'm going to paraphrase what he said, and that is: when a model is so fundamentally broken, don't spend your time and energy trying to fix it.

25:43.87

Build the new model and make the old one obsolete. So really, that was the insight: to say, okay, let's start from the bottom of the stack and create a protocol or suite of protocols that give people ownership and control of themselves again.

25:58.52

And that's the technology that we built. And now there are 14 million people using it. So the question of can it scale, that's pretty clear. It can. It's no challenge yet, though, to the internet that billions of people are using.

26:13.86

So we need policymakers, governments to step in and say, there's another version of the internet. We're not stuck with the one we have.

26:24.71

And what I would ask people here to think about, and maybe leave here with, is a sense that, yes, we need better policies and so forth, but we don't need to just think about the internet as it is

26:39.05

and how we make it better. Let's think about an alternative that's designed from the ground up to be better and is in harmony with the exact policies and policy objectives that you'd like. In other words,

26:53.29

it's fine to have GDPR and DMA and DSA and so on and so forth, but those are trying to bend technology to comply. Why not have technology that actually bakes those principles into the technology, starting, as I said, by putting human beings at the center?

27:12.79

And so this was, for me, the big eye-opener, and that's when we really started focusing on rebuilding the stack from the ground up.

27:24.27

I want to say one last thing. There are also people here in the room who are builders and are putting solutions forward. I'm not suggesting that we have all the answers,

27:34.81

or that the tech we've built is the silver bullet. There are no silver bullets. But until we start to build the better version of this technology, we're not going to get to a better place. And so Project Liberty is focused not so much on the problem, because I think the problem set is well-defined now. I mean, I was listening this morning; it's pretty overwhelming, and it can get pretty depressing.

28:04.17

Let's shift and focus on solutions to the problem, because we now know what's wrong, and let's fix it before we make it more powerful. Yeah. And maybe I can just do a shout-out to Gander, a company here in Canada. I know they're in the room, and they're trying to build

28:20.90

a different type of social network for Canadians, and they've had a lot of momentum. So I agree with your point about builders. You talk about alternatives, and I became most familiar with your work when I was reading headlines around the bid for TikTok.

28:34.38

I would love to talk a little bit about this. I know the plan for TikTok could look different. It could operate differently. Can you talk a little bit about that? Yeah. So, you know, Project Liberty was never created to purchase TikTok.

28:49.59

It was just an opportunity that presented itself that we took advantage of, because we had built this whole alternative tech stack.

29:05.38

We didn't need the Chinese technology or the algorithm. We could buy the US TikTok without the Chinese technology, bring the user base over and the data over.

29:16.46

And then we'd have our 14 million people plus TikTok's users, 184 million people in this alternative. And then you're getting to a scale where you really do have an alternative, and people then have a choice. Do I want to use a surveillance-based stack, which mines my data and extracts it and monetizes it without regard for me, and actually uses that very information about me to manipulate me?

29:45.32

Or do I want to use a version where I own and control my data? I'm permissioning its use. I get all the benefits of the internet, so I'm not giving anything up, but now I reclaim my personhood, okay? My agency, my power, which is very fundamental to democracy.

30:04.84

And, by the way, benefit from the use of my data. So there's now technology available where you can actually compensate people for their data if it's used for a commercial use case. And you know this argument that we heard up until a few years ago, that your own data is not worth anything on an individual basis and is only worth something when it's aggregated, and only a few companies can do that? It's just not true.

30:34.92

You know, seven companies are now worth $21 trillion, and that's with a T, okay? It proves that our data is massively valuable. Why are we not getting value for it?

30:48.04

You know, data is a human right and a property right. And we need, I think, to think about it in that way and just reclaim what's ours.

30:59.35

And this is the moment to do it, with TikTok. We saw an opportunity and raised $20 billion, which, by the way, proves that there are people interested in an alternative to what we have.

31:12.17

And that was no problem, raising that money. And we put forward the bid, and now it's unclear what's going to happen. And we'll see, because it's still a fluid situation, but it appears that there may be a transaction in play that may not comply with the very legislation that was put forward to solve a very real national security issue. And by the way, the national security issue is not that China has information on 170 million Americans.

31:42.94

It's that China can manipulate what 170 million Americans think. And that's really a big threat. And so this is really, really powerful technology that knows everything about us and can manipulate us. And I just believe, at the core of all this, we need to reclaim what's ours.

32:05.20

One last point on this. It's really hard to change 30 years of entrenched technology. And maybe you catch lightning in a bottle and buy TikTok and move 170 million users over and suddenly you have scale.

32:19.45

But I'll tell you where something really, really interesting is happening right now. And this is, for me, the glimmer and the hopeful thing. When technology changes, that's when space opens and new designs can take hold.

32:37.87

So, we've had a certain version of the internet for the last 30 years, highly entrenched. It's now, sadly, an app-centric, highly centralized internet.

32:52.28

We're moving to the so-called agentic web, and this is a big shift in the technology. This is our moment, because it's much easier to actually change the design of how technology works when that technological shift happens than it is to impact the entrenched technology.

33:17.09

It's a short window. It's not one that we can be talking about in five years or three years. And by the way, the big tech platforms are moving so quickly because they know it's a vulnerability for them.

33:35.26

They know the shift in the technology opens up the doors for new ideas and better design and better technology. So they are flooding the zone with an unprecedented amount of capital to make this fork in the road one we inevitably follow them down. And what they're building is a more powerful version of the technology that has created all the harms that you all have been identifying and trying to regulate for.

34:08.85

Why not build the alternative? Because this shift is real. This is happening. It doesn't have to go in the direction that big tech wants and big AI wants.

34:21.82

As a matter of fact, I would say that that is exactly the wrong direction, because we shouldn't think of AI and social differently. Our social graph and personal information are what's been exploited in so-called Web 2.0, the current surveillance stack; the version of that in AI will be our AI context.

34:49.51

Okay? The context that we share through a chatbot with an LLM. LLMs know nothing about us. They're generic models, large language models.

35:00.28

Where the information comes into play is through conversations with chatbots, like GPT with OpenAI, Grok with X, et cetera.

35:13.19

That context is our personal information. And I would argue strenuously that that information should be ours, not the companies'.

35:27.36

Just like our social graph information should have been ours, but we let those horses out of the barn. We now have a chance to make sure that our AI context is ours. We can decide what we share with which LLM.

35:43.14

We should be able to move our context from one LLM to another. It's just called portability. So think of your data, or your AI context, like your phone number, right?

35:55.06

Years ago, it was owned by a telco. Now it's owned by you. Think of these platforms as telcos. Our information should be interoperable. You can have one phone carrier, I have another one.

36:10.19

We can communicate. It's just technology. It used to work one way. It works a different way now. We need portability. We need to own our context. We need interoperability. These are fundamental things, but it begins and ends with portability.
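To make the portability idea concrete, here is a minimal sketch of what a user-owned, portable AI context could look like in practice. Everything here is hypothetical: the `PortableContext` record, its fields, and the idea of exporting from one assistant and importing into another are illustrations of the principle, not any real provider's API or any format Project Liberty has specified.

```python
# Hypothetical sketch of "AI context portability": a user-owned bundle of
# conversational context that can move between LLM services, the way a
# phone number moves between carriers. No real provider formats are used.

import json
from dataclasses import dataclass, field, asdict


@dataclass
class PortableContext:
    """A context record the user stores and controls, not the platform."""
    owner: str
    preferences: dict = field(default_factory=dict)
    history: list = field(default_factory=list)  # prior conversation snippets

    def export_json(self) -> str:
        # Serialization the user controls: the thing they take with them.
        return json.dumps(asdict(self), indent=2)

    @classmethod
    def import_json(cls, payload: str) -> "PortableContext":
        # Any interoperable service could ingest the same payload.
        return cls(**json.loads(payload))


# The user exports context from one (hypothetical) assistant...
ctx = PortableContext(owner="amber",
                      preferences={"tone": "concise"},
                      history=["Asked about data rights."])
payload = ctx.export_json()

# ...and imports it into another, with neither provider owning the data.
migrated = PortableContext.import_json(payload)
print(migrated.owner, migrated.preferences["tone"])
```

The design point is the one McCourt makes about phone numbers: once the record has a standard, user-held representation, switching services is frictionless, and the data's value accrues to its owner rather than to whichever platform happened to capture it.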

36:25.80

Whose data is it? Where is it stored? Who gets to use it for what purpose? And who benefits from it financially? If our data is ours, we could have all the benefits of this beautiful internet technology and the advances that AI will surely bring.

36:44.36

But we have agency and power returned to us. We have technology that's in harmony with democratic principles and ideals. And then we can enjoy the benefits of AI and together go solve the other problems that we need to solve, globally and locally, and solve them at scale.

37:07.05

I fear we're not going to solve any of them. We're only going to create more problems if we go down the track that big tech wants to take us. Yeah, so maybe I can ask you a quick follow-up in terms of where big tech wants to take us. Because when you think about building a different, better version of digital, I want to believe this. I want to believe that it's possible. I think it is possible, and I think there's public support for building something like that.

37:31.04

You mentioned you raised $20 billion, so clearly there are people who want to see this better version of digital. But what about the momentum that we have right now? I mean, you're talking to people in the technology sector.

37:42.90

We so often hear about the Mark Zuckerbergs of the world, the Elon Musks of the world. What do you see about being able to build that? Are there enough people who believe in it? Yeah, absolutely, yes. The one thing I can say with a great deal of confidence is that the alternative technology can be built. As a matter of fact, it is built and is being built. And again, I want to emphasize it's not a silver bullet. We're not saying that our version is the only version, but we've demonstrated that there's an alternative available.

38:20.61

generations of technology. In:

38:38.92

And we took on the seven oligarchs of the time, which were the baby Bells that were the result of the breakup of AT&T. And what we saw coming was a shift in technology, just like the app-centric web to the agentic web.

38:54.99

It was copper wire to fiber optics, a shift in the fundamental technology, which created opportunity. And so we raised $8 billion at the time, and we built a new telco in the United States.

39:09.11

And why? We saw this shift, and we weren't constrained, as the baby Bells were, by being built on massive copper wire infrastructure.

39:26.84

They were not, you know, designed for high-speed internet. We started with squeaky clean infrastructure, put the fiber optics in, and then were the first in the world to give people a bundle of high-speed internet, cable TV, and phone.

39:43.75

It was kind of obvious, right? Because the internet needs fiber optics for high-speed broadband. And we knew over time everybody would want this internet in their homes.

39:57.16

And so off we went. We built it first and proved that there was an alternative to what the seven baby Bells had.

40:08.22

eventually worldwide, was the:

40:21.58

Why? It mandated phone number portability and telco interoperability. So suddenly, all the friction was gone. Before that, we actually had people showing up saying, where do I sign up for RCN? And then they'd say, and here's my phone number. And we'd say, you can't keep your phone number.

40:38.20

And they'd say, well, I'll come back when I can, because I want to keep my phone number. People felt so possessive of their phone numbers that we needed new regulation to make them portable, and yet people haven't connected the fact that their social graph information, and eventually their AI context, will be a million times more valuable than their phone number.

41:09.84

It really is not that complicated. We need to reclaim what's ours and build a stack that is actually in harmony with what we want to see happen, what we want to optimize for, rather than staying in this constant sparring match, which we'll lose, against these big tech behemoths.

41:30.19

So we have just about 10 minutes left. I want to focus on what the people in this room can do when we talk about building a better version of digital: government officials, policymakers, philanthropic organizations.

41:44.17

What can people do to support this better version of digital that you're speaking about? Well, first of all, you're doing it by meeting up and having these conversations.

41:55.36

And my goal here is simple: to ask you to imagine a different paradigm.

42:06.92

It's just technology. It works a certain way because it's been designed to work a certain way, and there are incentives.

42:20.47

It can work in a totally different way. Good policymaking and regulations can help that. And I want to point out that there are two types of regulations, right? You know better than I. One set actually constrains, or tries to constrain, bad behavior.

42:37.25

The other set of regulations, like the telecom act I referred to, actually creates new opportunities and new standards. So the way we've thought about this with Project Liberty, and the way I would answer your question, is that it's not one thing.

42:56.73

We've designed Project Liberty with three separate tracks. The tech we've talked about a lot; we definitely need another version of the technology. But we have a policy track, right? Which is what many of you have been deeply involved in for longer than we have. And as I said earlier, I really have deep admiration and respect for that.

43:20.20

And we're going to need better policies and so forth that enable this new world. And then the third track, and I'm not sure what to call it, is kind of the alliance or coalition building or movement type of track, where we just need people to understand what's at stake and demand it.

43:41.19

And that will help compress time and make sure there are better policies and better technology and so forth.

43:51.74

And I think we're beginning to see this happen now. So spread the word, keep doing what you're doing. Let's figure out how to coalesce and align, because I am concerned.

44:04.99

I said earlier that there's a moment, a real glimmer here of the tech changing, and we can fix this right now by implementing better design before it's a foregone conclusion, which is what big tech and big AI want. They just want that foregone conclusion.

44:22.59

So now's the moment. And what I'm concerned about is that our efforts fail because we're atomized and balkanized. We meet up here.

44:36.07

We meet up in Munich. We meet up in Brussels. We meet up in New York. And the conversations are rich, and the desire to change this is here.

44:48.45

But how do we connect the dots so there's a coalescing of this activity, in a way that gives courage to our political leaders? And if political leaders won't do it and don't have the courage, then people should go ahead and impose their will.

45:08.81

And this is, I think, an issue that's existential if your concern is democracy, which in turn is based on trust and truth, which we've lost.

45:22.84

I'm sure you don't have democracy without that. And it's just so sad to me as a member of a family of immigrants.

45:36.01

My family came from Ireland, and they came for a better life. And in our case, we've actually been builders of infrastructure. Roll up your sleeves, get your hands dirty, and build it.

45:49.55

We started in Boston, then across America, and now around the world. And I'm so proud of what my family, not me, my family has built over 135 years. To see it dismantled in the way it's being dismantled is something that, like all of you, I just can't sit idly by and let happen.

46:12.42

But I want to leave you with an optimistic sense of possibility here. Let's not completely dwell on all the problems and all the harms: the world is on fire, democracy is failing, kids are being preyed upon, trust and truth are gone, the information ecosystem is completely contaminated, there are all kinds of scams happening, and everything just seems dark and negative.

46:43.33

It is, but sometimes it has to get really bad for people to actually rise up and say, I'm not going to put up with this anymore. It needs to be different. We can do this. Building the alternative tech is the easy part, relatively speaking.

47:02.91

It's how do we mobilize to make this happen? How do we get Canada to show up and be part of the solution? Because I think Canada is a huge part of this. All of the middle powers, whether it's Canada or Australia or the UK or the EU as a whole or Japan, et cetera, are having the same conversations. I've been to these places; same conversations.

47:27.53

And you all can really pull together and align and make a big difference. If you, on the other hand, approach this from a place of fear,

47:39.37

there's no leverage in fear. Approach it from a position of strength. We're at risk of being put into a corner here. And what big tech wants this to be is a debate

47:56.92

between pro-tech and anti-tech, where anybody who doesn't agree with what the big platforms want is anti-tech. We need to change that. First of all, we must not allow ourselves to be forced into that false choice.

48:13.66

We need to create a choice for people. What are we for? We're for pro-human-centric technology. We're for pro-democracy technology. We're for a better version of technology. We're not anti-technology.

48:30.81

We're against centralized technology, surveillance-based technology, destructive and harmful technology, and platform-centric technology. And as many of you have talked about, we're watching the concentration of technological power and political power, watching them come together.

48:51.62

Nothing could be more dangerous to democracy than that. And so this is our moment to really fix this.

49:03.16

It's too bad that it's gotten to this place, but maybe that's what we needed to get the push. This is the issue: if we can return to people what is theirs,

49:15.31

return power and agency to people so that they feel like they're not helpless. We have a role to play here. There is hope. There is a way forward.

49:27.03

And that's empowering. And by the way, your data is very valuable, so people can be enriched. This could be the single biggest redistribution of wealth without a tax that's ever occurred.

49:44.34

The data is massively valuable, so let's get it working for people, so that we're all together again on this and solving problems together. Absolutely. We just have two minutes left, so I'm going to ask a terrible question with only two minutes left.

49:57.48

And I do want to end on an optimistic point, but when you talk about connecting the dots, the one thing that comes to mind for me is the Trump administration and the very real possibility of retaliation against individuals who do push back on tech.

50:13.00

Just, in two minutes, some final words on that very big question. As I said earlier, I think you can't approach things from a place of fear. Approach it from a place of strength. What better place to approach all of this from than what's good for human beings, what's good for people?

50:42.09

And how does big tech, or any politician, win that debate? I can't overemphasize this, because it sounds so simplistic, but the dirty secret here is that big tech doesn't want us to understand that our data is us.

51:13.73

It's everything about us, to such a degree that under the current version of the tech, the surveillance technology, we get profiled and manipulated.

51:26.04

And by the way, we're losing our free will, which is what makes us human beings and different from other animal species. We're losing it.

51:37.49

Let's reclaim what's ours. Our data is our personhood. It's our being. It's our digital DNA. It's probably more valuable than our biological DNA because it's our whole lived experience.

51:50.33

It's what we feel, what we think, how we emote. And the thought that a young person could go on a chatbot, share their vulnerabilities directly with a chatbot, and then be manipulated by that is just morally wrong. We need technology that's in harmony with our democratic ideals and principles and basic human rights.

52:23.74

And remember, it's also a property right. Your data is very valuable. Absolutely. Well, listen, thank you so much for leaving us on this optimistic note for the rest of the event. Really appreciate you taking the time. Thank you, Amber. Thank you.

52:42.47

Thank you, Frank and Amber. Coming up next, we tell you where you can see The AmberMac Show for our very first recording in front of a live audience. You're listening to The AmberMac Show on Canada Talks, Sirius XM 167.

52:57.81

in the area. We'll be at the:

53:22.47

talking about the AI trends of:

53:36.68

And thank you so much for listening to The AmberMac Show. As always, you can find us everywhere online and we'll see you next week. You're listening to The AmberMac Show on Canada Talks, SiriusXM 167.
