Can Social Media and Democracy Co-exist? A Conversation with Frances Haugen
27th May 2022 • The Rhodes Center Podcast with Mark Blyth • Rhodes Center

Shownotes

From 2019 to 2021, Frances Haugen worked as a Product Manager in Facebook’s Civic Integrity Department. During that time she got an inside view into how Facebook’s algorithms are deliberately designed to influence its users. She also saw something deeply worrying: that this influence was often used to grow Facebook’s profits at the expense of users' safety and wellbeing. 

In 2021 she anonymously leaked tens of thousands of internal documents to the Wall Street Journal. Since then she’s testified before Congress on the matter, and helped start a global movement to better understand and regulate Big Tech. 

On this episode Mark talks with Frances about her experience whistleblowing on one of the world’s most powerful companies, and what she thinks we need to do to create social media platforms that are compatible with a functioning democracy. 

Learn more about and listen to the Watson Institute's other podcasts.

Transcripts

[MUSIC PLAYING] MARK BLYTH: From the Rhodes Center for International Finance and Economics at Brown University, this is The Rhodes Center Podcast. I'm Mark Blyth, your host. From 2019 to 2021, Frances Haugen worked as a product manager in Facebook's Civic Integrity Department. During that time, she got an inside view into how Facebook's algorithms are deliberately designed to influence its users. She also saw something deeply worrying, that this influence was often used to grow Facebook's profits at the expense of user safety and well-being.

In 2021, she anonymously leaked tens of thousands of internal documents to the "Wall Street Journal." Since then, she's testified before Congress on the matter and helped start a global movement to better understand and regulate big tech. On this episode, I talk with Frances about her experience whistleblowing on one of the world's most powerful companies and what she thinks we need to do to create social media platforms that are compatible with a functioning democracy.

It's a little different than the kind of conversation we normally have on this show, but the truth is simple-- you can't understand global finance and economics today without understanding the role of big tech and social media, and that's actually where we started the conversation. Hello, Frances, and welcome to The Rhodes Center Podcast.

FRANCES HAUGEN: Thank you for inviting me. Happy to be here.

MARK BLYTH: So this is a bit of a change of pace for us in the sense that we usually have economists, political economy types, and your work and your influence, particularly since your Senate testimony, are really spilling over into these areas. For example, we have the Europeans bringing out the DSA, the Digital Services Act--

FRANCES HAUGEN: Yeah, it's exciting.

MARK BLYTH: Which really has your fingerprints all over it. And I want to get to that at some point. But let's start with the basics on this. Let's go back to Facebook. You decided that you had to say something because Facebook causes harm to vulnerable communities, especially young women, because its engagement-based, algorithmic rankings keep people on the site longer, and that way they can see more ads. That's the core of the business model.

Now, many people out there in political economy land will think of this as a markets problem and say, break them up, right? But you actually think it's much easier to kind of just change the algo and keep what's good about it rather than breaking up the firm. Can you explain that for us?

FRANCES HAUGEN: So I think it's less a question of like I am against breaking up the companies and more that-- I believe it's very important that when we talk about problems in the world, we should make sure we clearly articulate the cause of the problem before we try to articulate solutions.

And if we were to look just inside of Facebook, so we're not even comparing Instagram and Facebook, if we're looking at just Facebook, we see that engagement-based ranking-- that's the process of saying, I have these different objects, maybe there are different groups that I could recommend to you, or there are different friends I could recommend to you, or there are different pieces of content I could recommend to you.

When I have to choose from thousands and thousands, tens of thousands, hundreds of thousands of things, if the way that I prioritize what I show you first is based on how likely you are to interact with it-- you're going to put a comment on it, you're going to like it, you're going to reshare it-- it turns out that the shortest path to a click is anger.

And what researchers inside of Facebook saw was there were multiple forces. It's things like the angrier your comment thread is, the more likely you are to click back to the publisher, which means they get more ad revenue, which means they produce more angry content. The angrier things are, the more likely we are to share them, because it actually gets you over that activation energy.

So this question of should you break up the company-- inside of Facebook we have seen problems not just in the content. We've seen them in things like groups recommendation, where back in 2016, 65% of all people who joined neo-Nazi groups in Germany joined them because Facebook recommended them, right? So this is not a problem of like this company versus that company. It's that these systems-- any place you put engagement-based ranking is going to reproduce these problems.

And so the question is, what do we get by breaking up the company? So one, we do escape Mark Zuckerberg. Like that's the elephant in the room. Mark has 56% of the voting shares. He's basically impossible to dislodge. Yes, if you broke up the company, you could escape that center, like that black hole. But beyond that, it doesn't necessarily mean you're going to fix these problems because all the individual components are still going to have the problems.

MARK BLYTH: So how would you fix it?

FRANCES HAUGEN: So I'm a big proponent of chronological ranking. But you can't just take the current product form factor for Facebook and put chronological ranking in because Facebook has been pushing us into giant groups-- we're talking half a million person groups-- for years. And those groups are fire hoses of content.

And so I think there's an opportunity to take a step back and say, what would human scale social media look like? You know, we have hundreds or thousands of years of institutions around how do we organize people? Like how many people can be in a single conversation? How do you hold a conference for 20,000 people? We break it up into smaller conversations.

And I think there's some really good precedents we can look at. We can look at things like Slack. You know, on Slack it's chronological. There's sub feeds for comments. But it also has rooms. And I think right now people sometimes come in and say, well, I don't want to give up my group.

You know, I'm a cancer survivor. My cancer support group was so critical to me. Guess what? If you had a thing that was like a Discord server or a Slack server, you could still get community but a computer wouldn't be choosing what you focus on. And the biases of that computer or the biases of that company wouldn't be choosing it. It'd be other human beings that would be choosing what you focused on.
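
[Editor's note: a minimal, hypothetical Python sketch of the two ranking approaches discussed above, purely illustrative and not Facebook's actual code. Engagement-based ranking orders posts by a model's predicted interactions; chronological ranking simply sorts by recency.]

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime
    predicted_engagement: float  # a model's guess at likes, comments, reshares

def rank_by_engagement(posts: list[Post]) -> list[Post]:
    # Engagement-based ranking: whatever the model predicts you will react to
    # (often the angriest content) floats to the top of the feed.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def rank_chronologically(posts: list[Post]) -> list[Post]:
    # Chronological ranking: newest first, with no model deciding what you see.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)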

MARK BLYTH: So rather than just the user experience, it really puts users in charge. And in the aggregate, when you scale that out, that overcomes exactly these problems that you've been describing.

FRANCES HAUGEN: Because remember, right now when we have the algorithm choose what we focus our attention on, we lose the opportunity for good speech to counter bad speech, right? Freedom of speech people always say the solution to bad speech is not to ban it, it's more good speech.

Well, when we allow computers to focus our attention, we as humans no longer get to correct problems. So if we're sitting in a room and we have 30 people and someone says something that's just off the wall, you know, they're a flat-Earther or something, people can calmly talk to them and say, like, hey, why don't we talk through that, or how are you doing today?

MARK BLYTH: Any particular reason you wanted to reject all human knowledge, yes.

FRANCES HAUGEN: When we allow computers to focus our attention, when we allow engagement to be the thing that is the assessment of quality, it means that if you write a really angry, enraged post that gets a whole fight going in the comments and I write a very calm, detailed, methodical thing-- I'm like, hey, let's talk through the assumptions in your post-- I will get no distribution. And so we really need to think about how we design these products such that we can have constructive exchanges of information.

MARK BLYTH: It's funny you give that example of anger. So two years ago I did a book with Eric Lonergan called Angrynomics. I currently have a postdoc who is very much from your part of the world in the sense that he's very tech savvy, et cetera. And he said, let's scrape Italian Facebook posts for anger, right?

FRANCES HAUGEN: Ooh, I love this.

MARK BLYTH: And then we said, well, how are we going to ID anger? And we're scratching our heads. And it suddenly occurred to him and he said, oh, I've got it, we'll just search for caps lock. And it totally works. It totally works. It's amazing.

FRANCES HAUGEN: Oh, that's so funny.
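
[Editor's note: the caps-lock heuristic Mark describes could look something like this hypothetical Python sketch; the actual study's threshold and preprocessing are not specified here, so the details below are assumptions.]

def is_angry(post: str, min_caps_words: int = 2, min_word_len: int = 3) -> bool:
    # Crude anger heuristic: count fully upper-case words of a minimum length,
    # so that a stray acronym alone does not flag an otherwise calm post.
    caps_words = [w for w in post.split() if len(w) >= min_word_len and w.isupper()]
    return len(caps_words) >= min_caps_words

# Illustrative usage on made-up posts:
posts = ["che bella giornata al mare", "QUESTO GOVERNO E UNA VERGOGNA"]
print([is_angry(p) for p in posts])  # [False, True]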

MARK BLYTH: So yeah, exactly. And if that's what's driving engagement, you can totally see how it just burrows down this funnel. I mean, that's amazing. All right, elections, right? So we get very vexed about Facebook and elections and foreign interference and all that sort of stuff. From your vantage point, should we be worried about foreign interference?

And I believe the figure for the Russians was they spent $100,000 and managed to tip the election, which seems a little bit crazy. Or should we be just more worried about essentially these algorithms deciding what should be an election issue and what we should be focused on? What's the bigger threat?

FRANCES HAUGEN: So back in 2018, Facebook made a major change to their product. Up until then, for years and years they had just been trying to optimize for how long they could keep you on the site. So they were like, hmm, if we show you this, we predict you'll be on the site for this amount of time longer. And they switched and said, hey, we have this problem. You know, most people think of Facebook only in terms of consumption. You know, I sit here and I consume off my feed. But in reality, Facebook is a two-sided marketplace. When you write a post, you are a content producer.

MARK BLYTH: You're a producer, right.

FRANCES HAUGEN: And Facebook is a system for matching producers with consumers. And so--

MARK BLYTH: It's a shopping mall.

FRANCES HAUGEN: It's a shopping mall of ideas and memes. Have some pictures of my breakfast. The only problem is if you lose the producers, you can't have consumers. And so what Facebook was facing in 2018 was, over time, the amount of production was falling off.

MARK BLYTH: What was behind that?

FRANCES HAUGEN: I think part of it is-- so all social media platforms have an issue, which is that in the beginning you have pretty broad participation. Like people are trying out the platform. Over time, some people really get into it. So think of this as the people on Instagram who have drones, right? You're like, oh wow, like look at that drone video. It's stunning. I guess Instagram isn't for me. My photos look bad, right? It's self censorship.

And this is actually one of the big pushes towards things like Instagram stories, right? Like, Instagram stories creates a much, much lower threshold for what it means to produce content, because it's going to evaporate in 24 hours, right? So Facebook was experiencing this thing where people were making less and less and less content over time, and they ran a bunch of experiments on content producers.

So this is one of those things-- we always have to assume we're being experimented on. They're seeing how they can influence you, how they can manipulate you. And what they found was, you know, by artificially giving people distribution-- so they made you 10 times as popular as you were before, five times as popular as you were before-- they were like, oh, if you get more comments, more likes, more reshares, do you produce more content?

MARK BLYTH: You produce more stuff, right.

FRANCES HAUGEN: And it turns out when you give people little drips of dopamine, they give you more things in return. So a problem happened though which was when you optimize for engagement, yes you get more content production, but you also shift what content succeeds. So suddenly across Europe-- so they sent a bunch of researchers into Europe in preparation for the European parliamentary elections in 2018. And across Europe-- this is like less than six months after this change-- people were like, hey, we know you changed the algorithm.

And researchers love-- they love when people think they know how products work, because in reality it's like asking people about their myths, right? What is important to you? What are you afraid of? How do you perceive power? All these things are great. And so they're like, ooh, tell me more. Like, how did we change the algorithm?

And they were like, no, no, no, no. Don't mess with me. I know you changed the algorithm. It used to be that we could share the bread and butter of the democratic process, you know, a white paper on agricultural policy, like stuff that's-- it's not sexy. It's not riveting. It's not clickbait. But we could see people read it. We could look at the statistics and we knew people were consuming it. They just didn't leave comments. And now if we share that same thing, it's crickets.

We are having to change the content that we distribute because only extreme content now gets distributed. And we run on positions now that we know our constituents don't like because they're the only ones that get distributed. So by the time we show up at the ballot box, you know, we're standing there about to cast our ballot, what we don't realize is Facebook voted first. And the things that we even get to decide between have already been vetted by the algorithm.

MARK BLYTH: And it's a classic story of intended slash unintended consequences.

FRANCES HAUGEN: 100%, 100%.

MARK BLYTH: So I mean, to summarize that, right, basically you figure out how to do a dopamine hit, you put that out there, people figure out there is a dopamine hit, the dopamine hit really has an anger precursor, so unless you're doing that it doesn't work. You just change the substance of democratic politics, and nobody really set out to do that.

FRANCES HAUGEN: Yeah, no one at Facebook said, we want to give the most reach to the most extreme ideas. Like, no one did that. But they did do it. And it's fascinating. They call it meaningful social interactions. And the hilarious part is some of the documents in my disclosures show that when they polled users six months after it happened, they asked, how meaningful is your news feed? And this change made people's news feeds less meaningful. So we should not call it meaningful social interactions. It's just social interaction.

MARK BLYTH: Social interactions. So let's go back to the politics on this one. That's kind of undermining democracy from within, again, by algo-tweaks. But so much of the attention--

FRANCES HAUGEN: There's also a secondary thing, which is that same system-- like, high quality equals high engagement-- is also used in ads. So political parties run ads. When they're trying to decide should they show you this ad, a little auction takes place being like, how much is your attention worth? How much does this person want to get to you?

But there's also a secondary feature which is they say, how, quote, high quality is this ad? Because Facebook's only going to get paid if you interact with it. And so angry, polarizing, extreme divisive ads, they get reactions. And so you end up having a thing where extreme polarizing ads are 5 to 10 times cheaper than compassionate, empathetic ads.
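
[Editor's note: a stylized Python sketch of the auction effect Haugen describes, assuming ads are ranked by bid times predicted engagement. This is a hypothetical model, and the numbers are purely illustrative.]

def min_bid_to_win(competitor_score: float, predicted_engagement: float) -> float:
    # If the auction ranks ads by (bid x predicted engagement), an ad the model
    # expects people to react to -- often an angry, divisive one -- can win the
    # same slot with a much smaller bid.
    return competitor_score / predicted_engagement

calm_ad = min_bid_to_win(competitor_score=1.0, predicted_engagement=0.02)
divisive_ad = min_bid_to_win(competitor_score=1.0, predicted_engagement=0.10)
print(divisive_ad / calm_ad)  # 0.2 -- the divisive ad pays roughly 5x less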

MARK BLYTH: Wow.

FRANCES HAUGEN: And we cannot have a democracy if we subsidize division.

MARK BLYTH: Yeah, we're basically producing anger. "Anger is our product" is basically what it ends up being.

FRANCES HAUGEN: Yeah, division becomes our product. Yeah.

MARK BLYTH: Division, wow. So again, let's go back to the way that it's normally framed. There were foreign people who were manipulating our elections. Should we be bothered about that? Because everything you're describing is completely endogenous. It has nothing to do with the outside.

FRANCES HAUGEN: So I want to be real clear. Influence operations-- so this is the weaponization of the platform-- are an extreme, extreme concern. Extreme concern. And the reason for that is Russia, China, Iran, Turkey, Ukraine-- they're all investing in large information operations. And the reason why this matters is the platforms have tools for detecting coordinated behavior. The thing that we don't have right now is enough people who actually take the outputs of those systems and pull down the networks.

So when I worked in threat intelligence at Facebook-- the last eight or nine months I was there, I worked on the counter-espionage team, which is the sister team to influence operations-- our team only ever worked on a third of the cases we knew about, because we were so understaffed. And we never wrote detection software because we already couldn't handle the cases we had.

MARK BLYTH: Why wouldn't Facebook engage more in that and provide those resources? That seems to be, for a firm under pressure, an easy lift, right? Why would they not go for it?

FRANCES HAUGEN: I think part of it is, you know, these are cost centers and there's no external accountability. So the question of, like, how good a job is sufficient is not currently being negotiated with the public. It's Facebook deciding on its own.

MARK BLYTH: But with $40 billion in profits you could chuck $1 billion at it and buy yourself some really good press.

FRANCES HAUGEN: Totally. And I want to really flag for people that they're doing $75 billion-- billion with a B, nine zeros-- $75 billion worth of stock buybacks in this 12-month period. So they are lighting $75 billion on fire and then bragging to the public that they spend $5 billion on safety.

MARK BLYTH: Yeah, there's something deeply wrong on so many levels.

FRANCES HAUGEN: I think part of it is like-- I went and checked. So last August, I went and checked every single time the Facebook stock price declined versus the NASDAQ by more than 5% over a 10-day period for the last five years. And there was a very limited number of instances-- it was like 27 instances over five years.

A handful of them, maybe 20%, were like ones where just all of big tech sold off at the same time, you know, small correction. But in general, the things that drove the price down were either declines in users or increases in expenses. So when Facebook spends more on safety, even small amounts more, the stock price crashes.
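
[Editor's note: a hypothetical reconstruction in Python (pandas) of the check Haugen describes-- flagging 10-day windows where Facebook's stock underperformed the NASDAQ by more than 5%-- assuming you have daily closing prices for both series. This is not her actual script.]

import pandas as pd

def relative_drawdowns(fb_close: pd.Series, nasdaq_close: pd.Series,
                       window: int = 10, threshold: float = -0.05) -> pd.Series:
    # 10-day return of Facebook minus the 10-day return of the NASDAQ;
    # keep the dates where Facebook underperformed the index by more than 5%.
    fb_ret = fb_close.pct_change(periods=window)
    ndx_ret = nasdaq_close.pct_change(periods=window)
    relative = fb_ret - ndx_ret
    return relative[relative < threshold]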

MARK BLYTH: But if you have that type of balance sheet, you can just buy the stock back and do self correction.

FRANCES HAUGEN: That's what they're trying to do.

MARK BLYTH: And that's what they're trying to do, absolutely. Wow. So let's talk about the rest of the world. You once said that something we don't recognize as a problem is that for much of the global south-- a billion people or more-- Facebook is the internet, because it's there, it's free, quote unquote, and it works. Why is that a problem?

FRANCES HAUGEN: We need to unpack the "it's free," because the only reason why it's, quote, "free" is that Facebook made a choice. They chose. Twitter didn't choose to do this. TikTok didn't choose to do this. They chose to go into some of the most fragile places in the world, places like Myanmar, and pay for people's data.

So in most places in the world, for every megabyte of data you consume, you pay money. And Facebook went and said, if you use Facebook, the data is free. If you use anything on the open web, you're going to have to pay for the data yourself. And surprise, surprise, market forces work, right? And so for a majority of languages in the world, 80% or 90% of all the content in that language is going to be only on Facebook.

And so in the United States, we can step in and say, well, I made a choice. I stepped away, right? I don't want to support this. It's not aligned with my values. But for at least a billion people in the world-- to say, I don't want to participate with Facebook, they don't support my language, they don't invest in security resources, you know, I'm using the version of the product that Mark Zuckerberg himself said was dangerous in 2018, because I don't matter to Facebook-- they don't get to leave, because any local web page is actually just a page on Facebook.

MARK BLYTH: And if they have-- which I'm sure they do-- exactly the same problems we were talking about earlier in some of the most fragile parts of the world, let's think about, for example, languages. How would Facebook, or any company in that position, police content in languages where it's very hard to find specialists, let alone detect nuance, et cetera? How do you begin to solve those problems, even if you want to try?

FRANCES HAUGEN: Oh, I'm so glad you asked. So there's an implicit assumption in the way you framed your question, which is that the way to solve these problems is via content and not via product design. So one of the things I keep, keep, keep emphasizing, and I have an editorial coming out in the next couple of days that talks about this, is there's lots of choices that Facebook has made that have led to the situation we're in where the most extreme ideas get the most distribution.

So you could imagine things like, should you have to click on a link to reshare it? You know, if you do that, you get 10% or 15% less misinformation. But it works in every language. Or I'll give you another example. Let's imagine Alice writes something. She posts it. Her friend Bob reshares it. Carol reshares it. So now it's beyond friends of friends.

When it lands in Dan's newsfeed, if Facebook grayed out that reshare button and said, hey, it's beyond friends of friends now, you can totally share this content, you can do whatever you want, but you have to copy and paste it, that small change, like putting a little bit of human in the loop, has the same effect on misinformation that the entire third party fact checking program has. Only it works in every language in the world.
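
[Editor's note: a minimal, hypothetical Python sketch of the friction rule described above-- track how many times a post has already been reshared, and disable one-click resharing once it travels beyond friends of friends.]

def reshare_button_enabled(reshare_depth: int, max_frictionless_depth: int = 2) -> bool:
    # reshare_depth 0: the original post from Alice.
    # reshare_depth 1: Bob's reshare of Alice's post.
    # reshare_depth 2: Carol's reshare -- now beyond friends of friends, so when
    # it lands in Dan's feed the button is grayed out and he must copy and paste.
    return reshare_depth < max_frictionless_depth

print(reshare_button_enabled(0), reshare_button_enabled(1), reshare_button_enabled(2))
# True True False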

MARK BLYTH: Wow. So that's just one simple product design nudge--

FRANCES HAUGEN: Totally.

MARK BLYTH: One extra step.

FRANCES HAUGEN: And so there's lots and lots of little things like this where we need to be taking a step back and saying, are there opportunities for intentionality?

MARK BLYTH: I can't help but wonder if part of this is-- think about the current moment. So we're all obsessed with inflation in the current moment, right? So a study came out today that The Guardian did. I just posted it on Twitter, oddly, which said the following-- wages overall have risen by 1.6% in real terms, right? Whereas corporate profits for the median firm are up 49%.

FRANCES HAUGEN: Wow.

MARK BLYTH: So this is basically price gouging. And what they did is they went on investor calls, whatever. And there's housing firms that are like, yeah, we could build more houses but why should we when we're getting half a million a house if we restrict supply. So that's what's really driving it, right? And that sort of thing.

So let's bring that inflation example back to the whole notion of product design and tweaking, right? Maybe part of the problem here is that we live in societies where profits are regarded as absolutely sacrosanct. You cannot touch them. And what you're essentially doing is, it sounds innocuous, but you're asking them to earn less money. Ultimately that's it. Is that really the problem? We're just afraid to attack the holy grail of profits?

FRANCES HAUGEN: I think it's really, really important for us to understand how we got to where we are right now. And so one of the most critical problems is that we have lots of specialists you can hire who are lawyers or political scientists who have specialized in freedom of speech. We literally graduate zero people-- zero people in the United States-- who can talk about the things I just mentioned: these product safety changes, product design changes around how we make these systems.

And the reason for that is intentional, right? Right now the only place in the world you can learn about these things is to go to a large social media company, because right now we have zero classes in any university that really give students a firsthand ability to experience the structure of these products. And that's part of why I want to build, like, simulated social networks, so that we can start to graduate 50,000 people every year who could talk intelligently about this.

MARK BLYTH: But you then still have to confront the fact that I am an incredibly profitable business. Almost zero marginal cost. Nothing really is, but almost zero marginal cost. And what you want to do is complicate my business model.

FRANCES HAUGEN: I do, yeah.

MARK BLYTH: In the name of safety.

FRANCES HAUGEN: But, but, but, I want to make you more long-term successful, because I think part of what's happened at Facebook is they've deprioritized human judgment. So decisions to ship or not ship are so metrics driven, or goals are defined so numerically, that people get locked into a very narrow frame of reference for defining success. And this is the quarter-by-quarter kind of financial mindset. Putting some constraints on that system, if it makes the product more pleasant, I think they'll have more users 10 years from now than they would otherwise.

MARK BLYTH: And there's less regulatory risk for that as well.

FRANCES HAUGEN: Yeah. 100%.

MARK BLYTH: So a couple of things. I want to bring our conversation to a close, sadly, with two points. You mentioned freedom of speech. And one of the things I particularly enjoyed about your testimony and other things that you've said is you just debunk the idea that this is about freedom of speech. It is not about freedom of speech. It's about protecting a business model.

How do you think we can make more people understand that it's not going after freedom of speech? Because if profits are one of the third rails of our society, freedom of speech is the other. How do you make people understand that that's a canard? It's not really about that.

FRANCES HAUGEN: I usually bring them back to-- you know, we talked about this idea of should you put a human in the loop once a reshare gets beyond friends of friends. I always ask them, like, if you had to choose-- look, we have no idea who the fact checkers are.

We know that there's certain organizations but we don't know how many fact checks they do. We don't know which languages they do them in. Surprise, most of them are in English. So part of why Europe passed the DSA, which is the Digital Services Act, a couple of days ago is because they use the raw version of Facebook much more than Americans do, right? 87% of the misinformation budget for Facebook got spent on English even though only 9% of users speak English on Facebook.

MARK BLYTH: Wow.

FRANCES HAUGEN: Right. So I always ask them, would you rather say you have to copy and paste after a reshare gets beyond friends of friends, or do you want this mysterious group of people to say what's true and not true? And everyone goes, I think we should focus on product safety.

The secondary thing is, I always tell people, when you hear freedom of speech or censorship, whenever you hear censorship, instead think linguistic equity. Because when we focus on censorship, we have to do things language by language. And by definition, we are going to leave behind small languages.

I always like to joke like won't someone please think of the Norwegians? Right, there's 5 million Norwegian speakers. I feel like I have to rep them because they gave me the Norwegian American Heritage Award. But there's only 5 million speakers. Facebook is never going to protect the Norwegians. And so if we want to protect, say-- places that are at risk for ethnic violence often are linguistically diverse, they often speak smaller languages. If we want to have products that don't cause ethnic violence, we have to focus on product safety and not on censorship.

MARK BLYTH: Seems pretty convincing to me.

FRANCES HAUGEN: Yeah, that's my hope.

MARK BLYTH: Yeah, that's a good hope. Thanks very much. It's a fascinating conversation. And good luck with building the social network, one that will actually make us safer.

FRANCES HAUGEN: I'm not going to tilt on that hill, but I do want to make a simulated social network so we can at least train the next generation of experts who can go in and attack that problem.

MARK BLYTH: Thank you.

FRANCES HAUGEN: Thank you so much.

[MUSIC PLAYING]

MARK BLYTH: This episode was produced by Dan Richards and Kate Dario. I'm Mark Blyth. You can listen to more conversations like this by subscribing to The Rhodes Center Podcast wherever you listen to podcasts. We'll be back soon with another episode of The Rhodes Center Podcast. Thanks.
