Today our topic is social media platforms and the movie that got us talking about harmful algorithms, "The Social Dilemma." Special guest Julia Hoppock joins me with her perspective as the Partnerships Director for The Social Dilemma impact campaign.
Julia is a filmmaker and campaign strategist with 15 years of storytelling experience in documentary film, journalism, and the non-profit sector. She has spent her career finding, pitching, writing, and producing stories that connect audiences to the issues that impact their lives. Prior to Exposure Labs, Julia worked at The Pew Charitable Trusts, where she ran film and advocacy campaigns to advance environmental conservation, public health, and other public policy issues.
We covered a lot of ground in this episode. Here are some highlights:
You can watch "The Social Dilemma" on Netflix.
To learn more about ways to take action and access resources from The Social Dilemma, go here.
To access the free 40-minute educator version of the film and resources including discussion guides, go here.
To learn more about the UK's Age-Appropriate Design Code, go here.
Here are links to learn more about the 2021 Senate hearings on the Facebook whistleblower, the Facebook/Instagram Files, and TikTok, Snapchat, and YouTube.
What are you plugging into this week? What are you choosing to unplug from? Our guest shares her thoughts.
Thanks for plugging in with me today, friends. Today, I'm talking
Speaker:with Julia Hoppock, who leads the impact campaign for the Emmy award
Speaker:winning film, The Social Dilemma.
Speaker:You know, I've been telling you all to make sure and see this movie!
Speaker:It has been out for over a year.
Speaker:And today we get to talk to Julia about it.
Speaker:Welcome to the Unplug & Plug In show, Julia!
Speaker:Thank you so much, Lisa, I'm thrilled to be here and thrilled
Speaker:about all the work that you do at the Center for Online Safety.
Speaker:So I'm really grateful to be here now and have this conversation.
Speaker:Thank you.
Speaker:I'm excited to dive in.
Speaker:I first spoke with Julia at a Children's Screen Time Action
Speaker:Network event a while ago, and was fascinated with her perspective and
Speaker:wanted to continue this conversation.
Speaker:She's been involved with storytelling and increasing impact for
Speaker:years before The Social Dilemma.
Speaker:And before we dive in, I want to tell you a little more about her.
Speaker:Julia is the director of partnerships at The Social Dilemma.
Speaker:She is a filmmaker and campaign strategist with 15 years of storytelling
Speaker:experience in documentary film, journalism, and in the nonprofit sector.
Speaker:She has spent her career finding, pitching, writing, and producing
Speaker:stories that connect audiences to the issues that impact their lives.
Speaker:Prior to Exposure Labs, Julia worked at The Pew Charitable Trusts, where she ran
Speaker:film and advocacy campaigns to advance environmental conservation, public
Speaker:health, and other public policy issues.
Speaker:Julia, your resume is impressive.
Speaker:I'd like to start by hearing a bit more about how you ended up in
Speaker:your current role and what you do.
Speaker:Sure.
Speaker:Yeah.
Speaker:So I started out as a journalist at ABC News and Politico and, you know, I've
Speaker:always been a writer and a storyteller, and I learned all the tools on the job
Speaker:about telling stories and speaking truth to power under fast deadlines.
Speaker:And I really loved it, but I think looking back, working in the news at that
Speaker:time, which was 2005, 2006, was really my first exposure to the attention economy.
Speaker:And so, you know, social media, then wasn't what it is now.
Speaker:It didn't have news feeds, but I did witness firsthand so many of my
Speaker:colleagues who cared so much about telling important stories, telling
Speaker:the truth, kind of have their work reduced to these clickbaity headlines.
Speaker:And that was one aspect of it that I really didn't like.
Speaker:And I think a lot of reporters felt trapped.
Speaker:And I can only imagine it's probably gotten worse with Twitter and all of that.
Speaker:And so I moved on. And I don't want to speak ill of the news.
Speaker:It's so important to have good journalism.
Speaker:And I work with so many fine journalists, but it did just plant that seed of
Speaker:this attention economy that became a thread later in my career at
Speaker:Exposure Labs and The Social Dilemma.
Speaker:But after leaving, I think I found that breaking news wasn't really my speed.
Speaker:I loved storytelling, but I didn't want to be reacting to that pace every day;
Speaker:I found that to be quite difficult.
Speaker:And so I moved into storytelling for a nonprofit, The Pew Charitable Trusts,
Speaker:for many years, where I directed videos and campaigns around a lot
Speaker:of their environmental conservation and public health initiatives.
Speaker:And I really loved that work.
Speaker:And I kind of just flip-flopped.
Speaker:I saw this job at Exposure Labs for an impact producer. They
Speaker:didn't say what the film was, but they did mention it was about tech.
Speaker:And to be honest, I kind of learned all the tech reform part on the job.
Speaker:I had the background in film, impact and storytelling, but the
Speaker:role I have now as the director of partnerships is to really lead our
Speaker:work with partners on the campaign.
Speaker:And Exposure Labs is very unique in that it's both the film
Speaker:and impact production company.
Speaker:So they do original productions, but they have a team of advocates
Speaker:that ensure that the film is used as a tool to advance the cause.
Speaker:And so our mission with the impact campaign is really to leverage the power
Speaker:of storytelling, including the film, to help put pressure on big social and
Speaker:realign technology with the public good.
Speaker:So it's been a fascinating couple of years for me in this position and
Speaker:yeah, I've really lucked out in landing a job around The Social Dilemma.
Speaker:I love that.
Speaker:Say your mission again, could you?
Speaker:Sure.
Speaker:Sure.
Speaker:The mission of our campaign is to leverage the power of storytelling to
Speaker:put pressure on big social and to help realign technology with the public good.
Speaker:So good.
Speaker:So good.
Speaker:You and I are right there together.
Speaker:I love it.
Speaker:One of the things I love about The Social Dilemma is that it's aged well.
Speaker:And you can't say that about all movies that talk about tech.
Speaker:We're talking more about algorithms today than we were when the movie
Speaker:first came out and the movie insights, they still seem fresh.
Speaker:What do you think parents need to know right now about algorithms when
Speaker:it relates to social media platforms?
Speaker:Sure.
Speaker:I think what parents need to know is probably what they're already well
Speaker:aware of, and it's just that the business model is not built with children's
Speaker:wellbeing in mind; it's not built with any of our wellbeing in mind.
Speaker:I think what The Social Dilemma did well was really visualize the
Speaker:business model and show the ways in which these business models
Speaker:are optimizing for engagement.
Speaker:That's how they make money.
Speaker:The longer you stay on the platform, the more data is mined from you.
Speaker:That data can be served to advertisers so that they can serve you products
Speaker:that you're more likely to buy.
Speaker:And the unfortunate thing is that the content that keeps people engaged on these
Speaker:platforms tends to be damaging content.
Speaker:It tends to be the most outrageous thing that's going to get you to click,
Speaker:or, you know, misinformation, as we brought to light in the film with the
Speaker:study from MIT that found fake news spreads six times faster than real news,
Speaker:and even hate speech that can engage you and send you down into a rabbit hole.
Speaker:So even if you're not in favor of this negative content, you may be speaking
Speaker:out against it, saying, "oh, isn't this so horrible," as you reshare it.
Speaker:And that again is just bringing you down a rabbit hole that keeps
Speaker:you on those platforms longer.
Speaker:So these platforms aren't designed for our wellbeing, individually or at the societal
Speaker:level, and it just really needs to change.
Speaker:And so I think parents need to keep the critical eye they probably already have about
Speaker:how these platforms are being used and really think about ways they
Speaker:can continue to take action because these are systems level problems.
Speaker:It shouldn't be on parents to have to monitor every
Speaker:activity of their child online.
Speaker:There need to be guardrails in place,
Speaker:like what we're seeing a little bit with the UK children's design code.
Speaker:So there need to be guardrails.
Speaker:They need to keep fighting and they should be talking to their kids.
Speaker:I could expand a little on what, what I think parents should know
Speaker:just based on our learnings from the impact campaign if you'd like.
Speaker:Absolutely.
Speaker:Absolutely.
Speaker:We could talk about any of this all day.
Speaker:So yeah.
Speaker:So let's go.
Speaker:Yeah, I think the other thing just in our impact campaign,
Speaker:we work with so many partners.
Speaker:So I'll start with what's happening within the tech companies.
Speaker:When I think about insiders at the tech companies and young people,
Speaker:and we work with a lot of youth activists in the tech reform space,
Speaker:one common theme is everyone really feels stuck in these platforms,
Speaker:whether you work there or you're outside using them to interact with
Speaker:your friends or to run your business.
Speaker:Everyone feels a little stuck, we all want better.
Speaker:And you know, I think that's become clear with a lot of these whistleblower
Speaker:leaks, and we're seeing that, you know, on January sixth, on these internal
Speaker:message boards, Facebook employees were talking about the horror
Speaker:of what was happening and their platform's role in it.
Speaker:There are a lot of employees who don't want to work for this.
Speaker:They didn't come into these companies to divide society.
Speaker:They wanted to do good.
Speaker:And they're fed up with how things are going.
Speaker:And one thing we do hear, kind of through the grapevine, that gives me a little
Speaker:hope is all this external pressure: the work of organizations such as
Speaker:yours, of activists, researchers, the press covering these issues, the hearings.
Speaker:That pressure really does make a difference and it empowers employees
Speaker:within the companies to speak out more and it empowers them to organize.
Speaker:And so it really is making a difference on that side.
Speaker:I love that.
Speaker:I love that thought. You and I are approaching it the same way, which
Speaker:is that these businesses are run by people, and we want to believe people are
Speaker:inherently good and just need a little support
Speaker:so that they can organize from within, as we organize from outside, and all
Speaker:shine a spotlight on what needs to change.
Speaker:I love that idea.
Speaker:Exactly.
Speaker:Yeah.
Speaker:I interrupted your thoughts.
Speaker:Just to add, I think the other thing is, you know, young people feel
Speaker:trapped too. And, you know, forgive me, I don't have kids,
Speaker:so if I'm saying something your audience is already well
Speaker:aware of, feel free to stop me.
Speaker:But we work with a lot of Gen Z activists and had some
Speaker:candid panels and discussions.
Speaker:And many of them have just talked about what it's like to grow up
Speaker:on these platforms and have their entire life be a performance, really.
Speaker:And how, how as you're developing, who you are in your teen years and what you
Speaker:believe in, and you're making mistakes and testing things out, you don't
Speaker:have that freedom with social media.
Speaker:And so many of them will speak to us about just
Speaker:what that does to them:
Speaker:the need to perform, how that affects their mental health, and
Speaker:how it prohibits them from being their true, authentic selves.
Speaker:So I'd also just encourage anyone listening to this podcast, to talk
Speaker:to your kids and ask them what it feels like to be on these platforms.
Speaker:You know, maybe skip the discussion about the rules and
Speaker:tech use, but ask them what they want social media to bring them.
Speaker:We have a lot of discussion guides about the film on our website
Speaker:that parents can use as a guide.
Speaker:But I think you'd be really surprised.
Speaker:I mean, we all know this is a problem, and I think we can
Speaker:learn from each other; parents can learn from their kids and vice versa.
Speaker:So I'd really encourage that open dialog.
Speaker:Yes, we can learn back and forth.
Speaker:What I try to tell parents is that they don't need to be the
Speaker:experts to start the conversation.
Speaker:They can start where they are.
Speaker:If they don't know a lot about Instagram, that's okay.
Speaker:They can start by asking a question, like, how does it make
Speaker:you feel, or who do you follow?
Speaker:And let the conversation unfold organically without a lot of
Speaker:forethought into what the answer is or what the result needs to be.
Speaker:It's just a conversation.
Speaker:It's just a way to learn more and to share emotions more, to connect.
Speaker:Exactly.
Speaker:And that's how we approach our work.
Speaker:I mean, I know we've been studying these issues for a while, both in the
Speaker:making of the film and in the campaign.
Speaker:But I still learn so much from just asking someone who's 17, whose life
Speaker:is on these platforms, what it's like.
Speaker:And if you go into those conversations with humility, whether that's a
Speaker:parent/child thing or activist and person impacted by the
Speaker:platforms, you'll learn a lot.
Speaker:And so that'll take you far.
Speaker:And something you said a few minutes ago, I want to circle
Speaker:back to because it's so important.
Speaker:You said that the burden for cleaning this up can't be on parents trying
Speaker:to police this or monitor, filter, do all of the things on the backend.
Speaker:Right now, it's impossible to do everything to keep kids safe online.
Speaker:The internet is broken when it comes to kids and safety and to have
Speaker:parents feel guilty over not being able to keep them safe isn't fair.
Speaker:It's a bigger issue.
Speaker:It's a societal issue.
Speaker:We all need to be talking about this.
Speaker:I love that you mentioned that because that's something where
Speaker:parents have a lot of guilt and shame around the current situation.
Speaker:It's very hard to get a handle on.
Speaker:Yeah, absolutely.
Speaker:And they shouldn't. I mean, it's a giant business model that's
Speaker:working against you, and you have just thousands of engineers
Speaker:and these sophisticated algorithms.
Speaker:I mean, it can't be on you.
Speaker:I think we all need to realize that it's a system-wide change.
Speaker:And the best thing we can do is share our stories and take action.
Speaker:And when we talk about business models, something that's become really apparent
Speaker:since the whistleblower and the Facebook Files
Speaker:is how these platforms are looking at each other and saying, I'm not
Speaker:growing the way I want to because TikTok is taking our people, or our eyeballs.
Speaker:The teen eyeballs are going more over here instead of staying with
Speaker:us. How do we attract them back?
Speaker:How do we keep them on our platform?
Speaker:There's a limited number of hours in the day.
Speaker:And something that was very apparent in the Facebook Files is
Speaker:how they're studying each other.
Speaker:Trying to have the latest thing: if it's disappearing messages
Speaker:that kids like now, we're going to do disappearing messages.
Speaker:If it's, you know, quick videos, like what TikTok has, well, we've got to
Speaker:have Reels, our version of that.
Speaker:What can you say about that?
Speaker:Yeah, I think it is a little bit daunting because it's this constant testing in
Speaker:real time and you're able to benefit from it.
Speaker:And because we don't have such great antitrust laws, a company
Speaker:like Facebook can steal those ideas of disappearing messages
Speaker:from Snapchat and employ them.
Speaker:So it is a little bit daunting, but here's the other thing to think about, and I
Speaker:like to take it from the positive side: I think that
Speaker:the Facebook Files show how desperate Facebook is.
Speaker:Facebook is losing right now.
Speaker:They're losing young users.
Speaker:And so they're doubling down, and they were trying to do an
Speaker:Instagram for kids, which luckily, because of the work of so many advocates
Speaker:like yourself, was put on pause.
Speaker:But you could look at it from the other end:
Speaker:it's a sign of desperation.
Speaker:If we need to go younger, if we need to do the most addictive thing,
Speaker:it's a sign that that platform is potentially failing or on its way out.
Speaker:It's a bigger sign that we really do need regulation because what we
Speaker:don't want is a race to the bottom.
Speaker:And I think that's what you can see sometimes with these different companies.
Speaker:Amen.
Speaker:Amen.
Speaker:Yes.
Speaker:Yes.
Speaker:Well, we're talking about current events.
Speaker:Are you encouraged by what's happening with the Facebook
Speaker:Files and the whistleblowers?
Speaker:And you mentioned the UK and their age-appropriate design code.
Speaker:Are you encouraged right now, at this point in our history, with where we're headed?
Speaker:I am.
Speaker:I am encouraged.
Speaker:And I'd say I'm cautiously optimistic.
Speaker:I did like what Senator Blumenthal said in the hearing with Frances
Speaker:Haugen: he said that this is big social's big tobacco moment.
Speaker:And I do think that's true.
Speaker:I think these 10,000-plus documents that Haugen released laid bare
Speaker:the evidence that so many activists and researchers and others have been raising
Speaker:alarm bells about for years: that these platforms are profiting off of our pain.
Speaker:And now we see it in their own writing.
Speaker:We see it in the Facebook internal memos that say, we make body image issues
Speaker:worse for one in three teen girls.
Speaker:So there's just no denying it anymore.
Speaker:We know the companies know that they are causing this harm.
Speaker:It just cannot be acceptable anymore.
Speaker:So I'm hopeful in the sense that we are at an awareness tipping point.
Speaker:We are now all aware of the problem.
Speaker:We're in agreement that there is a problem, which really a few years
Speaker:ago, wasn't totally the case.
Speaker:And we're aware that the business model is the problem.
Speaker:And I do feel like policy makers are citing that more.
Speaker:And I think this last straw with the Haugen documents has really mobilized
Speaker:both sides of the aisle to move forward on legislation, especially
Speaker:as it comes to protecting kids.
Speaker:So in that sense, I'm very optimistic, but, you know, hearings aren't
Speaker:enough and talking about the problem isn't enough and we do need action.
Speaker:And so I just hope that this can translate into actual actions like
Speaker:we've seen with the UK design code.
Speaker:And I'm happy to speak about that because there are some
Speaker:hopeful notes in that as well.
Speaker:Well, yeah, let's dive into what's changed in the last year since the release
Speaker:of The Social Dilemma.
Speaker:I would love for you to speak for a minute on the design code and your thoughts
Speaker:around whether this is a viable model we can bring here.
Speaker:Absolutely.
Speaker:Sure.
Speaker:I'll start with the design code and then I can move on to the other question.
Speaker:So I think, you know, the design code essentially in the UK is really just a
Speaker:set of principles that says if you're designing a platform that kids are likely
Speaker:to be on, then it needs to be designed with children's wellbeing in mind.
Speaker:And that went into effect in the UK this fall, and it didn't get a ton of
Speaker:coverage over here on this side of the pond, but it has forced some changes
Speaker:already from big tech. They're small, but they're something, right?
Speaker:And sometimes change happens around the margins. Like TikTok, which is already
Speaker:controlling direct messages for those under the age of 18, and that was a
Speaker:recent change from the UK design code.
Speaker:And Google's offering the right to be forgotten for teenagers who find
Speaker:images uploaded by parents or guardians.
Speaker:And so there's already some small changes, and I think what is also great
Speaker:is that we're taking the learnings on this side from the UK design code.
Speaker:And there's a big move here to create a similar
Speaker:design code in the U.S., and I do think that would be effective.
Speaker:And I know that Representative Kathy Castor is looking at a lot of
Speaker:those elements of the design code and incorporating them into a privacy
Speaker:act that she has proposed right now that would increase privacy
Speaker:protections for kids under 18.
Speaker:So that definitely gives me some hope, and I think, you
Speaker:know, we should look for progress wherever it emerges.
Speaker:What's great about that is we don't have to reinvent the wheel.
Speaker:Exactly.
Speaker:If we can look somewhere else.
Speaker:And I know there's other countries doing this, if we can look to the UK
Speaker:and say, oh, you're already doing the privacy piece, you're already doing this.
Speaker:And companies like Google already know this.
Speaker:They know how to do it.
Speaker:It's working in your country.
Speaker:We can also say,
Speaker:well, why don't you just bring that over here, too?
Speaker:Exactly.
Speaker:Exactly.
Speaker:You mentioned what's changed since The Social Dilemma.
Speaker:Happy to speak to that as well.
Speaker:Yeah, let's go there.
Speaker:What's changed in the last year?
Speaker:How is the impact campaign going?
Speaker:Yep, absolutely.
Speaker:I think a lot of things have changed, in the sense that
Speaker:there's now more awareness about the problem and a consensus
Speaker:that this is a problem.
Speaker:And I think that like you said, there were so many things from the
Speaker:film that still ring true today.
Speaker:And I think that we're just seeing so many more news reports that really
Speaker:validate the thesis of the film.
Speaker:And, you know, just as a little sidebar anecdote: in the film, the
Speaker:director interviews the former head of monetization at Facebook, Tim Kendall, and
Speaker:asks him, "What are you most afraid of?"
Speaker:And he says, "Civil war from these platforms."
Speaker:And you know, internally there was some worry at the time
Speaker:that that was too extreme.
Speaker:And even some of the marketing teams around the film were
Speaker:worried that was too extreme.
Speaker:But then we see things like January sixth happen, and it doesn't really seem out of
Speaker:left field to say something like that.
Speaker:So I think that without the business model changing, we're just going to
Speaker:continue to see events that really prove that thesis to be true.
Speaker:But on the positive note, I don't think we've ever seen so much momentum and
Speaker:kind of mass awareness about this issue.
Speaker:And I do think the film helped play a role in that, but it's also
Speaker:years and years of work from activists and advocates that have been in the
Speaker:space long before we entered the scene.
Speaker:And I do think that you know, we're at a real moment where people are just
Speaker:fed up and it's time to take action.
Speaker:And so on our impact campaign, you know, we're excited: we've
Speaker:been working with many tech reform activists on campaigns calling
Speaker:to ban surveillance advertising.
Speaker:We're working with many groups on creating a children's design code in
Speaker:the U.S., so I'm very excited about that.
Speaker:And we're also pivoting to just telling more stories.
Speaker:You know, the film has been out for a year and we want to make sure we're not
Speaker:just running a campaign about the film, but we're running a campaign that can
Speaker:help support the tech reform ecosystem.
Speaker:And we think the best way that we can do that is through telling
Speaker:stories and supporting other tech reform storytellers in the movement.
Speaker:So we'll be shifting gears to doing that and are really excited about that direction.
Speaker:That is exciting.
Speaker:That is exciting.
Speaker:And I want to be sure to highlight, you've got an educator version, you've got
Speaker:classroom resources as well, right?
Speaker:Yeah.
Speaker:We have a lot of great resources for educators out there, including a 40-minute
Speaker:classroom cut of the film, which educators can access for free without needing
Speaker:the Netflix platform, and they can go to www.thesocialdilemma.com/educators
Speaker:to register their screening for that educators' cut.
Speaker:So we're hoping that that can help allow for a lot more access to the
Speaker:film and these important topics.
Speaker:This is one of those missing pieces.
Speaker:Educators have had a chance to watch it, but they haven't had
Speaker:a way to show it to their classes.
Speaker:This is awesome to give them the opportunity to have those
Speaker:deep conversations with kids.
Speaker:That's fantastic.
Speaker:Exactly.
Speaker:Exactly.
Speaker:Okay.
Speaker:My final question is always: what is one thing you'd like to unplug from,
Speaker:and/or one thing you'd like to plug into?
Speaker:Julia, what have you got for me?
Speaker:Yeah.
Speaker:I love this question, Lisa.
Speaker:And you know, we're recording this before the Thanksgiving holiday.
Speaker:And so what's on my mind is that a lot of us in this tech activist world have
Speaker:been watching and reading every headline, watching every hearing and taking notes
Speaker:and keeping up on the tech developments.
Speaker:And so I feel a little irresponsible saying this, but I'm going to give myself
Speaker:a break and unplug from technology for a few days and just shut it all out and
Speaker:give myself some space and just plug into time with family and friends and nature.
Speaker:I'm out in beautiful Colorado.
Speaker:So I hope to get in some hikes.
Speaker:Put a pause on, on the news for a minute and you know, heal and rest and be ready
Speaker:to take the fight up after the holidays.
Speaker:I love it.
Speaker:I love it.
Speaker:We could all do that, just a couple of days.
Speaker:Yes.
Speaker:And now my friends, it is time for us to unplug.
Speaker:And before we do, Julia, how can parents find out more about you
Speaker:and watch The Social Dilemma?
Speaker:Where can they go?
Speaker:Yes, they can go to our website, thesocialdilemma.com.
Speaker:And if they go to the take action page, they can have access to all sorts of
Speaker:actions they can take to help support the movement as well as resources
Speaker:and discussion guides about the film.
Speaker:And I saw there was a digital cleanse on there too.
Speaker:I'm all about taking a break for two days, seven days, whatever it is.
Speaker:Just clear your head.
Speaker:Start over.
Speaker:Exactly.
Speaker:Exactly.
Speaker:The social media reboot.
Speaker:You can take that there on the take action page.
Speaker:Yes.
Speaker:Yes.
Speaker:Julia, I want to thank you for your work and for making the world
Speaker:of social media algorithms and technology more accessible for parents.
Speaker:Thank you so much for being here.