Ep 300 - Algorithmic Warfare with Andy Lee Roth
Episode 300 • 26th October 2024 • Macro N Cheese • Steve D Grumbine MS, MBA, PMP, PSM1, ITIL

Shownotes

**Milestone 300!** We dedicate this, the 300th weekly episode, to our loyal listeners, and we wish to recognize the valiant work of our underpaid podcast crew – correction: our unpaid podcast crew – who have put in thousands of hours editing audio, correcting transcripts, writing show notes, creating artwork, and posting promos on social media. To have the next 300 episodes delivered to your inbox as soon as they’re released, subscribe at realprogressives.substack.com 

 

Project Censored has been a valuable resource for Macro N Cheese. This week, sociologist Andy Lee Roth talks with Steve about information gatekeeping by big tech through their use of AI algorithms to stifle diverse voices.  The discussion highlights historical and current instances of media censorship and looks at the monopolization of news distribution by corporate giants like Google, Facebook, and Twitter.  

 

In an economic system that is fully privatized, trustworthy journalism is another casualty. News, which should be treated as a public good, is anything but.  

 

Andy Lee Roth is associate director of Project Censored, a nonprofit that promotes independent journalism and critical media literacy education. He is the coauthor of The Media and Me (2022), the Project’s guide to critical media literacy for young people, and “Beyond Fact-Checking” (2024), a teaching guide about news frames and their power to shape our understanding of the world. Roth holds a PhD in sociology from the University of California, Los Angeles, and a BA in sociology and anthropology from Haverford College. His research and writing have been published in a variety of outlets, including Index on Censorship, In These Times, YES! Magazine, The Progressive, Truthout, Media Culture & Society, and the International Journal of Press/Politics. During 2024-2025, his work on Algorithmic Literacy for Journalists is supported by a fellowship from the Reynolds Journalism Institute. 

 

projectcensored.org 

 

@ProjectCensored on Twitter 

Transcripts

Steve Grumbine:

Have faith.

Andy Lee Roth:

Believe in your source.

Steve Grumbine:

Have faith.

Steve Grumbine:

I refer often to Martin Luther King in the context of the civil rights movement.

Steve Grumbine:

He said.

Andy Lee Roth:

He said, I have no claim for.

Steve Grumbine:

The tranquilizing drugs and for the tranquilizing drug of gradualism and incrementalism.

Andy Lee Roth:

Here's another episode of Macro and Cheese with your host, Steve Grumbine.

Steve Grumbine:

All right, folks, this is Steve with Real Progressives and Macro N Cheese.

Steve Grumbine:

Folks, I have a guest today who is part of the Project Censored team.

Steve Grumbine:

We have talked to a lot of the Project Censored folks largely because they have a lot of really important information that directly ties to things that we feel are very important.

Steve Grumbine:

As we're trying to educate folks about modern monetary theory, we're trying to educate folks about class and society.

Steve Grumbine:

And, you know, quite frankly, we have really been focused heavily on the lack of meaningful information about Gaza and the poor, poor understanding that rank and file Democrats and Republicans have about Gaza.

Steve Grumbine:

I mean, you see so much warped thinking, so much limited understanding of history, and quite frankly, a lot of censorship.

Steve Grumbine:

So Project Censored has been our friend.

Steve Grumbine:

They have really, really helped us understand situations that have, you know, been baffling or maybe people are looking for more empirical evidence that the things we're saying are true.

Steve Grumbine:

Today is no different than that.

Steve Grumbine:

We're going to talk about algorithms and artificial intelligence.

Steve Grumbine:

This is an extremely important topic to me personally, but I think to the larger real progressives nonprofit organization and the people that work with us and follow us.

Steve Grumbine:

So with that, let me introduce my guests.

Steve Grumbine:

My guest is Andy Lee Roth, and Andy Lee Roth is the associate director of Project Censored and is interested in the power of news to shape our understanding of the world.

Steve Grumbine:

And he's got some really important stuff going on that we're going to talk about today, in particular about algorithms and AI. But without further ado, let me bring on my guest, Andy Lee Roth.

Steve Grumbine:

Welcome to the show, sir.

Andy Lee Roth:

Hi, Steve.

Andy Lee Roth:

It's a pleasure to join you on Macro and Cheese.

Steve Grumbine:

Absolutely.

Steve Grumbine:

It's funny, when Misha was telling us, hey, you got to talk to him, you know, it was like I thought I did, and I realized that I had spoken with Mickey Huff and that at that point in time, there was a possibility of you both being on, but you had a conflict.

Steve Grumbine:

So we went ahead and went with Mickey Huff that time, and here we are.

Steve Grumbine:

All this time, I've been thinking in my head, I have interviewed this man, but I have not interviewed this man.

Steve Grumbine:

And I'm very happy to interview you today.

Steve Grumbine:

So thank you so much for joining me.

Andy Lee Roth:

You bet, Steve.

Andy Lee Roth:

Project Censored is a critical media literacy education organization that champions independent journalism.

Andy Lee Roth:

But my friend and colleague jokes that Project Censored is a little bit like the Grateful Dead, that we're kind of always on tour, but with a rotating cast of musicians, like on the stage.

Andy Lee Roth:

So, yeah, this is the version you get today.

Steve Grumbine:

Well, I am absolutely a Deadhead, too, so that works for me, man.

Steve Grumbine:

All right, so let's talk about this real quick on a more sober element here, and that is, I know we had a little talk offline.

Steve Grumbine:

I've been very concerned that not just the media itself but also the algorithms and social media have been skewing things, lifting up voices that are really not telling the truth or the full story.

Steve Grumbine:

They're always slanted in favor of what I consider to be a genocidal apartheid regime.

Steve Grumbine:

And Israel is always slanted with the focus of, oh, but do you condemn Hamas?

Steve Grumbine:

And they don't understand the entirety, going back to the founding of Israel, which was an astroturf country, really.

Steve Grumbine:

And go on further back.

Steve Grumbine:

I mean, my goodness, the Palestinians.

Steve Grumbine:

There are manhole covers that show, from before they gave it over to Israel, that it said Palestinian, whatever.

Steve Grumbine:

I mean, the fact is that we are not being fed information in a proper fashion.

Steve Grumbine:

And the kind of information that we need to hear, it's not making it to the people.

Steve Grumbine:

It's being stifled, pillows being put on top of it.

Steve Grumbine:

And that really troubles me.

Steve Grumbine:

It troubles most of the people that follow us.

Steve Grumbine:

And the other factor that I think is interesting is that when we go and ask something like a ChatGPT to tell us about what's going on in Israel and Gaza, it gives us what appears to be a very neutral response, but it's not neutral at all.

Steve Grumbine:

It's, quite frankly, very slanted and very biased and very intentional in elevating the Israeli occupation of Palestine.

Steve Grumbine:

So help me understand.

Steve Grumbine:

It's a big topic.

Steve Grumbine:

But this is what you're doing work on right now, is it not?

Andy Lee Roth:

Yeah.

Andy Lee Roth:

And thank you for the opportunity to talk about this.

Andy Lee Roth:

I should preface my answer to your fantastic question by saying that, yeah, the work I'm doing on algorithmic literacy for journalists is a project sponsored by the Reynolds Journalism Institute at the University of Missouri.

Andy Lee Roth:

And I've been fortunate, and the project's been fortunate, to have RJI, the Reynolds Journalism Institute, select us from among some 200 projects that they reviewed last year for this year's fellowships.

Andy Lee Roth:

And so I'm grateful for that support to make this work possible.

Andy Lee Roth:

So to your question about what kind of information are we getting?

Andy Lee Roth:

Are we getting sufficient information?

Andy Lee Roth:

Are we getting quality information?

Andy Lee Roth:

Are we getting disinformation or misinformation about Palestine?

Andy Lee Roth:

Let me break that question down into two parts and first talk about the establishment corporate media in the United States and then talk about what's happening in terms of social media and other algorithmically driven platforms.

Andy Lee Roth:

So in terms of the US corporate media, it's simply a fact of the matter.

Andy Lee Roth:

And Mickey Huff, the project's director, and I have written about this for Truthout. It's a matter of fact and record that the US corporate media have for decades treated Gaza's inhabitants as non-persons and daily life in Gaza as non-news.

Andy Lee Roth:

And this is an example of something that we see as a wider pattern, which is how news media omissions often function as tacit permission for abuses of power.

Andy Lee Roth:

So the corporate media, the US corporate media, didn't create the violent, inhumane conditions in Gaza, now expanding out to the West Bank and to Lebanon and beyond, but the corporate media, their shameful legacy of narrow pro-Israel coverage, indirectly laid the groundwork for the atrocious human suffering taking place now.

Andy Lee Roth:

And so the corporate media's extended erasure of Gaza and its inhabitants is, I think, certainly rooted in what is tacit and sometimes overt racism that distorts so much of what we have in terms of news coverage of the Middle East in general and Palestine in particular.

Andy Lee Roth:

But that misleading coverage is not just something like institutional racism.

Andy Lee Roth:

It's also how corporate news outlets define what counts as news and who counts as newsworthy.

Andy Lee Roth:

And the corporate media has a kind of myopic focus on dramatic events rather than long term systemic issues.

Andy Lee Roth:

le noted a book from the year:

Andy Lee Roth:

Right?

Andy Lee Roth:

And that kind of encapsulates a whole problem with a lot of the news and information we receive here in the United States through these kind of corporate legacy channels of traditional news.

Andy Lee Roth:

Right?

Andy Lee Roth:

Second part, if we shift and talk about what's happening in terms of social media, where people increasingly are turning for their news, right?

Andy Lee Roth:

If I go into a classroom and tell my students that I still read a paper newspaper at breakfast every morning, there's nothing I could tell them about myself that would make me seem more foreign and alien to them.

Andy Lee Roth:

So if we turn to social media and say, well, that's now the way that most people are getting their news about the world, whether it's Israel and Palestine, the current election, you name it.

Andy Lee Roth:

There we can start talking about algorithmic content filtering, and perhaps even censorship isn't too strong a word.

Andy Lee Roth:

In a study from a few years back, researchers catalogued online censorship of independent news outlets.

Andy Lee Roth:

So this is any kind of voices that by their account were dissident, counter hegemonic, alternative.

Andy Lee Roth:

And they looked from the libertarian right to the anarchist left, and they found dozens and dozens of cases where alternative independent news outlets that were covering these kinds of institutional issues, right, racism, environmental degradation, class struggle, illegal government surveillance, violence by state actors, whether domestically or abroad, corporate malfeasance, all the kinds of independent outlets doing that kind of reporting and using social media to reach their audiences, were all subject to a variety of forms of online censorship.

Andy Lee Roth:

With the advent of the web and social media on one hand, we're faced experientially with this kind of onslaught of information and what seems like an incredibly diverse range of perspectives.

Andy Lee Roth:

But it's incorrect to assume that, for instance, our social media feeds or our search engines are in any way neutral conduits of information and perspective.

Andy Lee Roth:

We're talking here about big tech platforms that are increasingly using algorithms and other forms of AI systems to basically filter what kind of news circulates widely.

Andy Lee Roth:

Now, the first kind of obvious thing to say is, oh well, not everything is throttled.

Andy Lee Roth:

Andy, come on.

Andy Lee Roth:

Right?

Andy Lee Roth:

You know, we have a First Amendment and there are two kind of, I think, quick, important answers to put on the table there.

Andy Lee Roth:

First, blockades of information and perspective don't have to be complete to be effective.

Andy Lee Roth:

The fact that if I go and look for alternative perspectives online, that I can find some doesn't mean that this kind of algorithmic throttling isn't taking place.

Andy Lee Roth:

And notice, like, if I'm just cruising around, not actively looking, I may never find those sources.

Andy Lee Roth:

So that's one kind of problem.

Andy Lee Roth:

The other is that I think if we're talking about censorship in the United States broadly, and we're talking about, you know, most Americans think of the First Amendment and Congress shall make no law, right?

Andy Lee Roth:

But the First Amendment is about what governments can do and can't do in terms of freedom of information and freedom of the press and freedom of expression.

Andy Lee Roth:

The First Amendment doesn't impose any restrictions on corporate control of those fundamental freedoms.

Andy Lee Roth:

And that's a crucial issue that is part and parcel, I think, of the kind of coverage we're seeing and what we're not seeing in terms of things like Israel, Palestine.

Steve Grumbine:

So let me ask you a question.

Steve Grumbine:

When I think of the First Amendment, and you stated it perfectly, obviously, but when I think about that, I'm thinking about it, that the government cannot do these things.

Steve Grumbine:

There's nothing.

Steve Grumbine:

And this is the key thing about really understanding the role of capital, role of our founding institutions in government with private property and the role that private property has in determining where laws begin and end.

Steve Grumbine:

And the fact that these platforms represent private property, private ownership, not government.

Steve Grumbine:

It's kind of like a freemium service.

Steve Grumbine:

And it's.

Steve Grumbine:

You play on their field, you play by their rules.

Steve Grumbine:

So where does the First Amendment come in when it comes to understanding these sorts of relationships with algorithms and AI?

Steve Grumbine:

I mean, people feel like, hey, my freedom of speech is being stifled, but it's being stifled through a private platform.

Steve Grumbine:

And I want to make this clear, I'm not for private platforms.

Steve Grumbine:

I'm very much for a public commons.

Steve Grumbine:

But with that in mind, you know, complaining that private entities are somehow or another squashing them: is this of their own volition, or is this with government coercion, or is it a combination?

Steve Grumbine:

How does that play out?

Andy Lee Roth:

Yeah, that's a.

Andy Lee Roth:

Thank you again.

Andy Lee Roth:

Another great question.

Andy Lee Roth:

I'm going to preface my answer by saying my training and background is as a sociologist.

Andy Lee Roth:

And this sociological concept from over a hundred years ago is actually quite useful.

Andy Lee Roth:

In 1922, the sociologist William Ogburn introduced the concept of cultural lag.

Andy Lee Roth:

And what Ogburn meant by cultural lag is the gap in time between the development of new technology.

Andy Lee Roth:

He used the more fancy social science word material culture.

Andy Lee Roth:

But basically the gap between the development of new technology and then the cultural development of norms and values and laws that will guide the proper, acceptable use of that technology.

Andy Lee Roth:

And where there's cultural lag, the idea is the tech develops before the norms and the values and the laws can catch up.

Andy Lee Roth:

And I think with AI right now, part of the big debate about AI manifests the ripples, the consequences, the turmoil that develop as a result of cultural lag.

Andy Lee Roth:

Right?

Andy Lee Roth:

AI tech is developing at such a fast pace now that there's no way that the laws and regulations, much less these larger cultural values and norms, have had a chance to adapt.

Andy Lee Roth:

And the result is in that lag, the people who control these tools, their power is enhanced.

Andy Lee Roth:

Right.

Andy Lee Roth:

If we're talking about kind of a shift to this new era of AI and the kind of algorithmic gatekeeping that my work is presently focused on.

Andy Lee Roth:

I think the most fundamental question to ask is how this new technology is shifting who holds and wields power.

Andy Lee Roth:

And we know historically that control over infrastructure confers power on those who have the infrastructure.

Andy Lee Roth:

So going back then to the First Amendment, Mickey Huff and Avram Anderson, who I should give a shout out to.

Andy Lee Roth:

Avram Anderson is an expert on information science who's on the faculty and works in the libraries at Cal State University, Northridge.

Andy Lee Roth:

One of my best colleagues. A little over a year ago, Mickey, Avram, and I wrote an article, there's a version of it on the Project Censored website, about what we call censorship by proxy.

Andy Lee Roth:

And it goes straight to the question that you're asking about the First Amendment and government and corporations.

Andy Lee Roth:

And the idea of censorship by proxy is that there are three defining elements to it. It's censorship that's undertaken by a nongovernmental entity; it goes beyond the kind of censorship that a government entity could undertake on its own; but, the third element, it serves governmental interests nonetheless.

Andy Lee Roth:

Right.

Andy Lee Roth:

And so I think that this is a 21st century.

Andy Lee Roth:

Well, this is in some ways not a new form of censorship.

Andy Lee Roth:

There's long been collusion, like going all the way back to the Spanish American War for instance, collusion between the press and the government to promote imperialist interests, say.

Andy Lee Roth:

But the 21st century version of censorship by proxy that Mickey, Avram, and I wrote about is all about how these big tech companies are playing, in effect, a gatekeeping role that in many cases serves government interests.

Andy Lee Roth:

And one of the examples we described that we'll just gloss briefly now and if you want to go into more detail, we can.

Andy Lee Roth:

But it is the closure a few years ago of RT America, which was the Washington, D.C.-based station of the RT (Russia Today) network.

Andy Lee Roth:

A number of fantastic American prize-winning reporters reported for RT America, including Chris Hedges, who had a show on RT America.

Andy Lee Roth:

Abby Martin had a show on RT America.

Andy Lee Roth:

RT America had been basically harassed by the government for some time.

Andy Lee Roth:

U.S.-based journalists, journalists who are U.S. citizens, were first forced to register with the government as foreign agents because they worked for RT America.

Andy Lee Roth:

But in the end, important press freedom organizations called this out and said this is a form of indirect harassment that makes the journalists' jobs harder to do.

Andy Lee Roth:

This was coming out of the furor around the 2016 election.

Andy Lee Roth:

So around that time, there's clear evidence in security reports and so forth that the government would like to shut down RT America.

Andy Lee Roth:

They'd like to shut down RT, but they can't.

Andy Lee Roth:

Then Russia invades Ukraine and there is a moral panic about Russian influence that is widespread in the US, and there are many factors for that that I won't try to explicate now.

Andy Lee Roth:

But one of the upshots of that is there's what appears to be a kind of public campaign pressuring the platforms that carry RT America to deplatform RT America.

Andy Lee Roth:

And that public campaign driven by that moral panic about Russia is successful.

Andy Lee Roth:

RT America is driven off the air.

Andy Lee Roth:

And that's an example of what we mean by censorship by proxy.

Andy Lee Roth:

The government didn't have to shut down RT America, but it very much suited foreign policy interests in the government and people just concerned about kind of quote disinformation in the United States to have RT America go off air and not be a presence in the United States any longer.

Andy Lee Roth:

So censorship by proxy.

Steve Grumbine:

I see guys like Ken Klippenstein here recently get suspended from X just by simply reporting out on the news of the day, if you will.

Steve Grumbine:

And yes, from what I understand, he aired some information that was already publicly available, but aired that and Musk deplatformed him.

Steve Grumbine:

YouTube has been deplatforming alt media like insanely lately.

Steve Grumbine:

And then, lo and behold, most recently, I think it was two days ago, a gentleman from The Grayzone was kidnapped, captured, beaten, and jailed in Israel for reporting out on the genocide that is being committed over there right now.

Steve Grumbine:

I mean, there is definitely some form of collusion going on.

Steve Grumbine:

The idea that police agencies can buy information from social media platforms: they can't directly monitor it, but because these groups are selling the information, because it's for sale publicly anyway, the authorities are getting it through that backdoor route as well.

Steve Grumbine:

Is that part of what you're talking about?

Andy Lee Roth:

Yeah, I mean, that's certainly an important part of the project or important part of the big picture.

Andy Lee Roth:

Not so much a part of my project, but we definitely have colleagues at the Electronic Frontier Foundation who have tracked that sort of use of tech in law enforcement for a long time.

Andy Lee Roth:

And you know, one of the things I learned from folks at EFF is about the idea of, quote, parallel investigations, where the information that law enforcement acts on is not necessarily the sort that would be allowed in court because it's oftentimes been acquired through inappropriate means.

Andy Lee Roth:

But once the information is held, law enforcement can use parallel investigations. They now know that they've got this.

Andy Lee Roth:

They can figure out another way to legitimately get the data, and then it's useful in court for a prosecution of someone who might otherwise not have, you know, not have been prosecuted.

Andy Lee Roth:

And I guess there's grounds for debate perhaps about, like, well, don't we want law enforcement to be able to go after crime when it occurs?

Andy Lee Roth:

But then that, of course, raises basic questions about who's defining what counts as criminality.

Andy Lee Roth:

Those are tricky areas where the tech, again, like, the basic thing is like control over tech, control over infrastructure confers power.

Andy Lee Roth:

Right.

Andy Lee Roth:

Thinking about this, like, I mentioned my background in sociology, and maybe I can dork out with you for a moment here.

Steve Grumbine:

Let's do it.

Andy Lee Roth:

I'm using this term algorithmic gatekeeping.

Andy Lee Roth:

And that term, gatekeeping, has a legacy in the sociology of journalism and critical communication studies of political discourse.

Andy Lee Roth:

It goes back to studies that were done in the 1950s.

Andy Lee Roth:

There were two studies in particular, one by a guy named David Manning White and another by a researcher named Walter Gieber.

Andy Lee Roth:

And they developed this concept, and for a while, it was the dominant paradigm in thinking about how news and information flows through mass media.

Andy Lee Roth:

Right.

Andy Lee Roth:

A term we don't talk about a lot anymore, but at the time it would have been appropriate.

Andy Lee Roth:

They both did studies of local newspaper editors who were getting stories coming over the wire service and making determinations about what stories to run and what stories not to run.

Andy Lee Roth:

And the kind of famous upshot of the first of these studies, the David Manning White study, was that newspaper editors exerted personal bias in how they chose those stories.

Andy Lee Roth:

And that personal bias was, specifically, as White's study was presented and subsequently remembered, political bias, personal political bias.

Andy Lee Roth:

The reality is that in White's study, 18 of the 423 decisions that he examined involved decisions like, oh, this is pure propaganda, or that story is too red.

Andy Lee Roth:

But the report that White published about his findings talked about how highly subjective, how based on the gatekeeper's own set of experiences, attitudes, and expectations, the communication of news was.

Andy Lee Roth:

A few years later, and I promise this will come back around to algorithms in a moment.

Andy Lee Roth:

There's an interesting, like return, a boomerang kind of quality to this.

Andy Lee Roth:

A few years later, that study was replicated, looking at multiple wire editors rather than a single editor, by a researcher named Walter Gieber.

Andy Lee Roth:

And his conclusions basically refuted White's conclusion that gatekeeping was subjective and personal.

Andy Lee Roth:

Gieber found that, basically, decisions about what stories to run were a matter of daily production and what he called, quote, bureaucratic routine.

Andy Lee Roth:

In other words, no political agenda to speak of.

Andy Lee Roth:

And lots of subsequent studies reinforced and refined Gieber's conclusion that professional assessments of newsworthiness are based on professional values more than political partisanship, and so forth and so on.

Andy Lee Roth:

That gatekeeping model was like the dominant paradigm for understanding news for decades.

Andy Lee Roth:

And it was finally displaced by others in kind of the, maybe the 80s or 90s.

Andy Lee Roth:

The sociologist Michael Schudson took the gatekeeping model to task in a later critique.

Andy Lee Roth:

It's a pristine model.

Andy Lee Roth:

The idea that news comes in already formed and an editor decides what to promote and what to leave on the floor.

Andy Lee Roth:

It's too simplistic for how we know the news production process actually works.

Andy Lee Roth:

So Schudson making that call, basically, maybe it gives him too much credit to say he put the final nail in the coffin, but he certainly gave voice to what was the sense among a lot of people studying these things at the time, that that model was kind of primitive and no longer relevant.

Andy Lee Roth:

The argument I've made more recently is that with the advent of the Internet and these big platforms like Meta and Microsoft and others, that the old gatekeeping model is actually completely relevant once again because the entities that are now controlling the flow and distribution of news, Google, Facebook, Twitter, X, et cetera, these corporations don't practice journalism themselves.

Andy Lee Roth:

And all they're really doing is facilitating the passing on of the content.

Andy Lee Roth:

So the Schudson critique, that news is sort of untouched except for how it's distributed in the old gatekeeping model, actually accurately describes what we have in a kind of algorithmic gatekeeping model of news now.

Andy Lee Roth:

And I think that's a really important thing.

Andy Lee Roth:

It's one way of saying, like, you know, we might wish we could go back to an era where editors who are themselves news professionals make the judgments rather than tech companies, or to get more pointed, rather than the algorithms and other AI systems of those tech companies.

Andy Lee Roth:

And there's one more point on this.

Andy Lee Roth:

You couldn't go and do today the kind of studies that White or Gieber did in the 50s, right? Because all the algorithms are treated by the companies that control them as proprietary information.

Andy Lee Roth:

There have been lawsuits to try to get, say, the YouTube algorithm cracked open, not for the public to see how it works, but for an independent third party to examine it for whether systemic biases are baked into the algorithm or not, or whether the algorithm is just being gamed by people.

Andy Lee Roth:

And those lawsuits have been class action lawsuits.

Andy Lee Roth:

And the courts have consistently decided in favor of the big corporations and the proprietary control over the algorithm.

Andy Lee Roth:

So part of what we're up against, again, is a form of corporate power.

Andy Lee Roth:

There's more to say about that.

Andy Lee Roth:

But maybe I should halt there for a moment.

Steve Grumbine:

That is very important to me because what I'm hearing, and this plays a part in much of what we do here.

Steve Grumbine:

This is directly, once again, placing the most important factor on the ownership of the algorithm.

Steve Grumbine:

It's saying, this is mine, my private thing.

Steve Grumbine:

I own this.

Steve Grumbine:

This is my intellectual property.

Steve Grumbine:

And it's not for you to know what we do or how we do it.

Steve Grumbine:

You just need to know that the news and the stuff that you get is a pure and pristine blah, blah, blah.

Steve Grumbine:

But in reality, right there in and of itself, we know fundamentally that this is a longstanding private property versus public interest kind of story.

Andy Lee Roth:

Yeah.

Steve Grumbine:

Once again, almost down the line, you can see that every one of these issues is an issue of private ownership versus the public commons and the public interest.

Andy Lee Roth:

Yeah.

Andy Lee Roth:

We at Project Censored have made strong arguments, and allies of ours like Victor Pickard at the Annenberg School at the University of Pennsylvania have made strong arguments, that we need to start thinking of journalism as a public good.

Andy Lee Roth:

Right.

Steve Grumbine:

Yes.

Andy Lee Roth:

And that just like clean air and clean water and other things that are fundamental to life, if we're going to have a democratic society, we have to have kind of a sense of news as a commons.

Andy Lee Roth:

Right.

Andy Lee Roth:

Trustworthy news as a commons.

Andy Lee Roth:

So a major blockade to that right now is when we have companies like Twitter and Facebook and their parents, if we're talking about like Instagram and others.

Andy Lee Roth:

Right.

Andy Lee Roth:

And the parent companies, Meta and Google.

Andy Lee Roth:

Zuckerberg won't even acknowledge that Facebook is a communication platform, much less anything like an outlet that provides journalism.

Andy Lee Roth:

And so there's a huge disconnect, just a yawning gap between journalism as a profession, as a discipline which is guided by a code of ethics that says good journalism.

Andy Lee Roth:

I'm now quoting from the Society of Professional Journalists' code of ethics, which is a brilliant document that anyone who cares about news should have a look at.

Andy Lee Roth:

Basically, in a nutshell, good journalism is independent, it's accountable and transparent, and it seeks to report truth and minimize harm.

Andy Lee Roth:

And I could go into more detail on each one of those individual points, but the key point I want to make now is that the big tech Platforms we're talking about are committed to none of those principles.

Andy Lee Roth:

They aren't committed to being accountable.

Andy Lee Roth:

They aren't committed to being transparent.

Andy Lee Roth:

They may talk about providing opportunities for people to seek the truth and pay lip service to the idea of minimizing harm.

Andy Lee Roth:

But on each and every one of those points, if we did a mock debate or a sort of imaginary courtroom session like you do in high school, I think it wouldn't be a big challenge to come up with a conviction of the major corporate tech companies, many of which are US based but have a global reach, in terms of those standards.

Andy Lee Roth:

Right.

Andy Lee Roth:

And so journalism is increasingly at the mercy of these big tech platforms, because the big tech platforms have basically swooped up most of the advertising revenue that historically has made journalism a viable commercial enterprise.

Andy Lee Roth:

And in the U.S., the fact that they've not only come in and swooped up the advertising revenue, but they're also now gatekeeping content.

Andy Lee Roth:

You know, any kind of content that doesn't fit with sort of a status quo understanding of the US, domestically or in the world, is a serious, serious problem.

Andy Lee Roth:

And it's another way, I think, for people who follow the work of Herman and Chomsky and their propaganda model as they outlined in Manufacturing Consent, this is a new wrinkle on those filters.

Andy Lee Roth:

Herman and Chomsky outlined those filters in Manufacturing Consent in 1988.

Steve Grumbine:

I'd like to touch back on that momentarily, but before I do, I want to just say this.

Steve Grumbine:

It seems to me like the concept of a dialectical perspective is completely lost.

Steve Grumbine:

Like when you hear news about the economy and how the economy is going, it's almost always told from the perspective of the investor.

Steve Grumbine:

It's almost always told from the position of the wealthy.

Steve Grumbine:

It's almost never told from the perspective of the working class.

Steve Grumbine:

And when it is, it might be some subheading or some sub statement deep in the article, but there just doesn't appear to be any balance in terms of whose interests are being advanced.

Steve Grumbine:

Because it's impossible to have an impartial news story.

Steve Grumbine:

It always depends on from what vantage point you're talking.

Steve Grumbine:

You could look at the French Revolution, and if you were just looking at it from Napoleon's perspective, I mean, you'd hear a whole different story than you would if you were talking to a peasant or someone who was a bread maker or maybe somebody who was part of a guild.

Steve Grumbine:

It's just a completely different perspective.

Steve Grumbine:

Before we go into those things that I want you to Dive deeper into.

Steve Grumbine:

I want to take a moment and just ask you, do you see a place for dialectical perspectives?

Steve Grumbine:

Because I don't see any representation of that in any of the outlets.

Andy Lee Roth:

Yeah, I mean, actually, a lot of my work going back to the dissertation I did at UCLA in sociology on broadcast news interviews, was about, basically, in blunt terms, the narrow range of people who are treated as newsworthy by the corporate press.

Andy Lee Roth:

And there's a longstanding tradition of this kind of research showing that the strongest bias in the establishment press is for official sources, so sources who represent some agency, whether it's a government agency or a corporate agency.

Andy Lee Roth:

And if you aren't affiliated in one of those ways, you're much less likely to be treated as newsworthy.

Andy Lee Roth:

There are much more restricted conditions under which you might be treated as a newsworthy source of information or perspective.

Andy Lee Roth:

And you're quite right that one of the most obvious examples of that is how, in conversations about the economy, working people, if not invisible, are more talked about than heard from.

Andy Lee Roth:

This goes to a concept that sociologists who study news use.

Andy Lee Roth:

It was developed by William Gamson called media standing.

Andy Lee Roth:

And Gamson's idea of media standing is not just who gets talked about in the news, but who gets to speak.

Andy Lee Roth:

Borrowing the term standing from legal settings, who is authorized to speak as a source.

Andy Lee Roth:

And again, there's just a massive and growing body of literature that shows this. It's one of the reasons Project Censored champions independent journalism: there's a wider range of people treated as newsworthy.

Andy Lee Roth:

There's a more inclusive definition of newsworthiness in terms of who might be a useful source on a given story.

Andy Lee Roth:

You can see this happening, to tie this back to algorithms.

Andy Lee Roth:

And here I want to put in a plug for a fantastic book that's just out in the last week or so.

Andy Lee Roth:

The book is called AI Snake Oil, and it's by a pair of information science tech researchers from Princeton, Sayash Kapoor and Arvind Narayanan.

Andy Lee Roth:

In that book, they talk about some of the pitfalls of journalism's coverage of AI.

Andy Lee Roth:

And one major area they talk about is uncritically platforming those with self interest.

Andy Lee Roth:

So treating company spokespeople and researchers as neutral parties, right?

Andy Lee Roth:

Repeating or reusing PR terms and statements, not having discussions of potential limitations, or, when you do talk about limitations, treating the people who raise those concerns as, quote, skeptics or, quote, Luddites.

Andy Lee Roth:

And all these are characteristics of, you know, a lot of the news reporting we're getting, at least from so-called mainstream establishment news outlets.

Andy Lee Roth:

And this is part of what motivates the Algorithmic Literacy for Journalists project that I'm working on is to reiterate these points, like when you talk to a company spokesperson, think about how they may be overly optimistic about the potential benefits of the tool that they're developing.

Andy Lee Roth:

Right.

Andy Lee Roth:

When you use terms from a PR statement, it's important to consider whether those terms might be misleading or overselling and so forth and so on.

Andy Lee Roth:

And also the idea that there are other kinds of sources who could well be newsworthy.

Andy Lee Roth:

So people who focus, for instance, on the ethics of tech development, or activists who understand how the use of algorithms, say by local law enforcement, is affecting minority members of the community, all these kinds of things are part of that issue that you raise. One of the things I learned in grad school was the idea that sources make the news.

Andy Lee Roth:

Journalists' understanding of the world is heavily shaped by the sources who they have regular contact with and who they turn to for information and perspective.

Andy Lee Roth:

And so the selection of sources is kind of a fundamental bedrock area where we can see whether news is inclusive and diverse or whether it's exclusive and reinforces status quo arrangements, even when those status quo arrangements are rife with systemic inequalities and injustices.

Steve Grumbine:

Absolutely.

Steve Grumbine:

That was well stated.

Steve Grumbine:

So I want to jump to your current work, the work that you're in the middle of right now.

Steve Grumbine:

Can you tell us about your project and give us a background in that?

Andy Lee Roth:

Yeah, yeah.

Andy Lee Roth:

So it's a kind of a year-long project that, if all goes well, will be complete in February or March of next year.

Andy Lee Roth:

It's supported by a fellowship from the Reynolds Journalism Institute, alongside my fellow fellows in this year's 2024-2025 class.

Andy Lee Roth:

So you can go to the RJI website, which is rjionline.org and follow some links and you'll get to the fellows in general and my reports in particular.

Andy Lee Roth:

So you can find the first kind of progress report that I published with RJI, called Big Tech Algorithms: The New Gatekeepers.

Andy Lee Roth:

And that covers many of the topics we've just been talking about now.

Andy Lee Roth:

But also more recently, I just published with Avram Anderson, my colleague, who I mentioned earlier, an article on recognizing and responding to shadow bans.

Andy Lee Roth:

So this is a social media phenomenon where content isn't taken down, it's not censored, it's just not made visible to anyone other than the person who posted it themselves.

Andy Lee Roth:

And lots of news organizations that use social media to promote their stories, especially news organizations that report on things like systemic racism, police violence, the situation in Palestine and beyond, are subject to shadow banning when they try to promote their reporting through platforms like Instagram.

Andy Lee Roth:

And I know that MintPress News, which is based in the Twin Cities Minneapolis area, has been subject to shadow banning.

Andy Lee Roth:

For the article on shadow banning, Avram and I talked to Ryan Sorrell, who's the founder and publisher of the Kansas City Defender, and he told us amazing stories about the restrictions that have been placed on their social media content and some of the strategies that they're using to get around them.

Andy Lee Roth:

So the point of the project is not to say, oh my gosh, the sky is falling, the sky is falling.

Andy Lee Roth:

We're all doomed.

Andy Lee Roth:

Journalism's screwed.

Andy Lee Roth:

The point of the Algorithmic Literacy for Journalists project is to give people, give journalists and newsrooms, the tools they need to push back against this kind of algorithmic gatekeeping.

Andy Lee Roth:

So the article about responding to shadow bans has five specific recommendations for how to recognize and respond to shadow bans if you're a reporter or a news outlet whose content is being restricted that way.

Andy Lee Roth:

I'm working now with another Project Censored colleague of mine, Shealeigh Voitl, who's our digital and print editor.

Andy Lee Roth:

Shealeigh and I are working on the next article, which will come out in a few weeks, about how we know a lot about what's wrong with horse race coverage of elections: how horse race election coverage actually makes people cynical about politicians and politics, how it actually demobilizes people from voting.

Andy Lee Roth:

And what Shealeigh and I are doing is looking at horse race coverage of elections, critiques of horse race election coverage, and seeing if there are lessons from that coverage for how journalists cover tech developments, AI tech developments especially. They don't get called horse races, they get called arms races.

Andy Lee Roth:

When it's two tech companies battling to see who can be the first to come out with some new AI tool, especially when those AI tools are for public consumption.

Andy Lee Roth:

So we're trying to look at, and this will be the next step in this project, lessons from flawed horse race coverage of elections to generate lessons for how journalists can better cover battles between competing tech companies, or international competition to develop new AI tech. And not to give the story away, but basically the idea is: when you focus on who's ahead and by how much, or whether they're losing or gaining ground, you're missing, as a journalist, valuable opportunities to teach people about what this AI tech can really do and what guardrails we need to make sure it's used the way it's intended to be used, that it's not being tested on populations, that it's not being developed in ways that will reproduce preexisting inequalities, and so forth and so on.

Andy Lee Roth:

So yeah, that's the project in a nutshell.

Andy Lee Roth:

If anyone checking out our conversation here today is a journalist and you'd like to help in the development of these tools, I am actively enlisting journalists in newsrooms to help vet the toolkit once it's ready, and that'll happen later this year.

Andy Lee Roth:

So we're kind of putting out pieces of it as it's in development, but also seeking feedback, especially from journalists and other news professionals, before the final product launches in February or March of next year.

Steve Grumbine:

That's really fantastic.

Steve Grumbine:

It's good to have a feedback loop and not be locked in your own little self-reinforcing bubble.

Andy Lee Roth:

Yeah, there's no point.

Andy Lee Roth:

One of the challenges.

Andy Lee Roth:

I'm used to developing media literacy materials for classroom use, and I've taught sociology courses of my own for years.

Andy Lee Roth:

But preparing something that will work in a classroom is really different than preparing something that will be useful to journalists.

Andy Lee Roth:

Of course, teachers and students all have time pressure on them, but the time pressures that journalists work under are extraordinary and the financial pressures on journalists are just as astonishing.

Andy Lee Roth:

And so to create tools that journalists might actually take the time to look at and consider using requires a special sort of directness.

Andy Lee Roth:

That's part of what's interesting and challenging for me about this project: the transition to making things just very straightforward and direct.

Andy Lee Roth:

So, as I mentioned, part of the motivation is that I think we can have better coverage of AI if journalists have more algorithmic literacy themselves.

Andy Lee Roth:

Of course, the ultimate goal of that is to have a better informed public, so that when we hear boasts about what AI can do, we can be resistant to hyperbole, whether that hyperbole is doomsday scenarios or whether it's positive in the sense that AI is going to save us, whether it's the doomers or the boomers.

Andy Lee Roth:

Just like any news, we need a kind of a media literate public that knows how to parse claims for whether they're trustworthy, whether they're supported by evidence or not.

Andy Lee Roth:

So hopefully if the project is successful, we might bump the needle a little bit in terms of how journalists do their work, which then in turn helps the public be more informed, better informed, more engaged around these issues.

Steve Grumbine:

You know, I look at AI and I want to ask you a couple questions about this before we get off of here.

Steve Grumbine:

This is really important to us.

Steve Grumbine:

You know, I frequently will put a question into ChatGPT just to see what the response will be.

Andy Lee Roth:

Yeah.

Steve Grumbine:

And almost invariably if I don't put qualifiers in my question, for example, please provide me with a class based understanding of the United States Constitution.

Steve Grumbine:

If I just say, please summarize the United States Constitution, I am going to get a glorification of this document.

Steve Grumbine:

It's going to literally salivate all over itself.

Steve Grumbine:

And I read it and I go, wow, this is the greatest thing ever.

Steve Grumbine:

My goodness, what a wonderful thing.

Steve Grumbine:

But if I ask it to answer from the perspective of an African American, it might say some things differently.

Steve Grumbine:

It might give me a perspective.

Steve Grumbine:

And it's always about whose class interests are being represented by the initial answer, the unfiltered initial answer.

Steve Grumbine:

I mean, it's filtered.

Steve Grumbine:

That's how it comes up with its answer.

Steve Grumbine:

But at some level, the first output isn't to say, well, you know, Karl Marx once said it was important for whatever. It always comes at it in a very, very matter-of-fact way.

Steve Grumbine:

So vanilla, you wouldn't think to question it.

Andy Lee Roth:

Well, there's a double barreled quality there, I think.

Andy Lee Roth:

Right.

Andy Lee Roth:

One of the barrels is obvious, one of the barrels is less obvious.

Andy Lee Roth:

Right.

Andy Lee Roth:

One is, you know enough.

Andy Lee Roth:

Right.

Andy Lee Roth:

Your use is fairly sophisticated in that you understand:

Andy Lee Roth:

If I don't specify these parameters, I get kind of a generic status quo response.

Andy Lee Roth:

So that's one level, and that's a kind of algorithmic literacy: if we're going to use ChatGPT and other kinds of generative AI programs the way the creators of those tools want us to use them, we have to know that.

Andy Lee Roth:

We have to know to make those distinctions, and to be alert that if you don't, that's going to shape the kind of feedback you get.

Andy Lee Roth:

But the second, less obvious barrel is even if you specify the parameters, the generative AI program is only going to give you what it can produce on the basis of the data it's been trained with.

Andy Lee Roth:

And if prejudice or inequality of one form or another is baked into that data from the start, then the program will reproduce that inequality, that injustice.

Andy Lee Roth:

Right.

Andy Lee Roth:

So this goes to another of the kind of like pitfalls in kind of how we think about AI.

Andy Lee Roth:

Right.

Andy Lee Roth:

The idea of attributing agency to AI or comparing AI to human intelligence.

Andy Lee Roth:

If you and I have an argument about what the founding fathers intended, each of us can, you know, have that argument in a positive sense of the term argument.

Andy Lee Roth:

Right.

Andy Lee Roth:

We would make propositions and support them with evidence.

Andy Lee Roth:

And if I said, well, why do you think that, you could explain to me why you think that. AI can't do that.

Andy Lee Roth:

Right.

Andy Lee Roth:

You can ask, why is that so?

Andy Lee Roth:

And AI will churn through its calculations and come up with something.

Andy Lee Roth:

But it's not a thinking, intelligent entity, right?

Andy Lee Roth:

Not in the sense that we tend to think of those terms or tend to use those terms.

Andy Lee Roth:

And I think that's really important.

Andy Lee Roth:

So it's the old thing that my high school chemistry teacher used to say: garbage in, garbage out, right?

Andy Lee Roth:

If the AI has trained on materials that reflect existing biases, then the AI is going to give us biases.

Andy Lee Roth:

The other component, the second and less obvious of the two barrels, is that, you know, ChatGPT has hoovered up lots of attention in terms of public discourse and pundit commentary and so forth and so on.

Andy Lee Roth:

And I think it's important to be concerned about it.

Andy Lee Roth:

It's profound in its potential consequences for us.

Andy Lee Roth:

But I'm more concerned about all the AI systems that are operating that we aren't consciously aware of.

Steve Grumbine:

Law enforcement, right?

Andy Lee Roth:

So, you know, a bunch of people maybe enjoy it when Netflix tells you what movies it thinks you might like.

Andy Lee Roth:

And for a long time when Netflix did that, they would say, we think there's a 94% chance that you'll enjoy this recommendation based on what you've watched.

Andy Lee Roth:

But when we use a search engine, similar kind of algorithmic filtering is going on.

Andy Lee Roth:

But Google doesn't tell you there's a 67% chance that you're going to enjoy the returns on your search that we're giving you.

Andy Lee Roth:

Right.

Andy Lee Roth:

The process is the same, but it's not flagged as being something that has been driven by an algorithmic assessment of your online behavior.

Andy Lee Roth:

And so it's less visible and therefore we're less conscious of it.

Andy Lee Roth:

And therefore, without a kind of algorithmic literacy, we're more vulnerable to manipulation of that sort.

Andy Lee Roth:

So I think it's right, it's good, to focus on ChatGPT and other high-profile generative AI programs.

Andy Lee Roth:

I think we also need to be conscious of all the other ways that AI is shaping the information we receive about the world, and therefore how we understand the world.

Andy Lee Roth:

And that's again, kind of an underlying motive for the work that I'm doing now.

Steve Grumbine:

Not to give away the farm here and give away your work here before it's time, but could you maybe give us some other ways that AI is used outside of, you know, algorithms for social media or news, but also outside of ChatGPT?

Steve Grumbine:

Obviously the military uses it.

Steve Grumbine:

I mean, we heard AI is being used to target the people in Gaza, for example, with drones and the like.

Steve Grumbine:

Help me understand other ways that we should be aware of.

Andy Lee Roth:

I think one of the most important ones, and this is actually a series of reports.

Andy Lee Roth:

The series is called Machine Bias.

Andy Lee Roth:

It was produced by ProPublica.

Andy Lee Roth:

ProPublica was looking at the use of algorithmically driven software that would predict the likelihood that someone, a specific person, might commit a crime again in the future.

Andy Lee Roth:

And what ProPublica exposed was how courts across the country were using these tools to influence judges' sentencing of people convicted of crimes.

Andy Lee Roth:

And because the data driving the AI in this case was biased against Blacks, the series, which was led by an independent investigative reporter named Julia Angwin, whose work is really impressive, exposed how, based on these algorithmic injustices, Blacks convicted of crimes had received longer sentences.

Andy Lee Roth:

And this was a direct result of the use of this AI technology.

Andy Lee Roth:

That study is important in its own right.

Andy Lee Roth:

It's also what kind of launched this idea of algorithmic accountability reporting.

Andy Lee Roth:

Taking the traditional watchdog role of journalists, which is a complicated concept.

Andy Lee Roth:

Whether journalism in the United States historically has functioned or lived up to that watchdog ideal is a topic for another conversation.

Andy Lee Roth:

But the idea of algorithmic accountability reporting is that the traditional watchdog role of journalism gets focused on the development of these new AI technologies.

Andy Lee Roth:

And Nicholas Diakopoulos is one of the pioneers of that work.

Andy Lee Roth:

And he often invokes in his own work the ProPublica Machine Bias series as an example. You think about crime and courts and the legal system, and you don't necessarily immediately think AI.

Andy Lee Roth:

And yet what ProPublica showed was that preexisting historical injustices, in terms of racism within the criminal justice system, were being amplified by the use of AI tech in the sentencing process.

Steve Grumbine:

Let me ask you, Andy, one last question.

Steve Grumbine:

If you had one opportunity to say everything that we missed in this call, and obviously probably missed a lot because we had an hour here, but what would be the one thing you'd want people to know that maybe we didn't cover or that you think is important about this conversation?

Andy Lee Roth:

Thanks so much for this conversation. First, I'm going to toot Project Censored's horn here.

Andy Lee Roth:

I mentioned earlier.

Andy Lee Roth:

Right.

Andy Lee Roth:

A major aim of the project is to help the public, and especially students, become more media literate, and especially to think about these issues of power that we've been talking about throughout the conversation today as a key component of media literacy.

Andy Lee Roth:

And the thing I would say is media literacy is itself a kind of umbrella term.

Andy Lee Roth:

We're talking about multiple literacies and one of them is algorithmic literacy.

Andy Lee Roth:

So being aware when you use a search engine that it's not a neutral conduit of information.

Andy Lee Roth:

And we don't know exactly how the Google search engine works; we can't pry open the so-called black box. But if people want to check this out for themselves, we can compare different search engines and see what kinds of results they produce.

Andy Lee Roth:

And you'll see then that search isn't neutral.

Andy Lee Roth:

Right.

Andy Lee Roth:

And that's one example of a host of things where in our daily lives we're depending on algorithms that we aren't necessarily conscious of.

Andy Lee Roth:

And so becoming more conscious of them and calling out problems when we encounter them is I think, one of the ways that we're going to push back.

Andy Lee Roth:

It's also really important, and this is something journalists have not done a good job of.

Andy Lee Roth:

When there is policy or regulation that might affect how these technologies are developed and how they're implemented, we should care about that, even when it seems like dry, difficult stuff.

Andy Lee Roth:

I'm probably overstepping the boundaries of the last question here, but last November a bunch of journalist organizations got together and issued the Paris Charter on AI and Journalism.

Andy Lee Roth:

You can find that online easily if you look for it.

Andy Lee Roth:

The Paris Charter on AI and Journalism.

Andy Lee Roth:

This got no news coverage in the United States except for news reports by a couple of the organizations that were signed on as members of the coalition that proposed this.

Andy Lee Roth:

But one of the basic proposals is that journalists and media outlets and journalism support groups should be invited to the table to talk about the governance of AI.

Andy Lee Roth:

Right.

Andy Lee Roth:

That journalists remain at the forefront of the field and have a say in how it develops.

Andy Lee Roth:

And probably we could expand that point to say, ordinary people too.

Andy Lee Roth:

Right.

Andy Lee Roth:

So that's a bigger issue in terms of policy, and that's probably an uphill into the wind battle.

Andy Lee Roth:

But I think people should be aware of those issues.

Andy Lee Roth:

And there are definitely resources out there, mostly produced by activists, for how to push back if you're a member of a community that is suffering from algorithmic bias.

Andy Lee Roth:

Often these are targeted at law enforcement use of AI, so facial recognition technologies and the like.

Andy Lee Roth:

And there is stuff out there, it just isn't widely covered by the establishment press, perhaps for reasons that are obvious.

Andy Lee Roth:

So, yeah, I think my key point, my key takeaway would be that algorithmic literacy is part of media literacy.

Andy Lee Roth:

Organizations like Project Censored and the Reynolds Journalism Institute are making resources available to journalists and to the general public.

Andy Lee Roth:

But, you know, it's on all of us to be informed.

Andy Lee Roth:

And on that note, thank you for having me on the show today, Steve, and for featuring these issues on Macro and Cheese.

Andy Lee Roth:

It's a valuable platform for getting the word out, for sure.

Steve Grumbine:

I really, really appreciate your time, folks.

Steve Grumbine:

My name's Steve Grumbine.

Steve Grumbine:

I am the host of Macro and Cheese, also the founder and CEO of Real Progressives, a nonprofit organization.

Steve Grumbine:

We survive on your donations.

Steve Grumbine:

So please, if you would like to make a donation, they are tax deductible.

Steve Grumbine:

We are a 501(c)(3).

Steve Grumbine:

Andy, thank you again.

Steve Grumbine:

I really appreciate it.

Steve Grumbine:

We will make sure when we publish this that there are complete show notes, extra links, you name it.

Steve Grumbine:

And there's going to be an edited transcript for you all who want to read along, or maybe just want to read.

Steve Grumbine:

Don't forget, every Tuesday night at 8pm we have something called Macro and Chill, which is a redo of the interviews that we do.

Steve Grumbine:

And they're done in video format and they're broken down in about 15 minute segments.

Steve Grumbine:

And we listen and we talk about them.

Steve Grumbine:

We build community, we share knowledge, we ask questions, we agree, we disagree.

Steve Grumbine:

Please, to make it successful, we need you.

Steve Grumbine:

So come on down and check us out.

Steve Grumbine:

You'll usually see an update on our substack.

Steve Grumbine:

Check it out.

Steve Grumbine:

realprogressives.substack.com. With that, Andy, I really appreciate you joining me again today, sir.

Steve Grumbine:

And for the team over here, we want to thank you and Project Censored for being great guests.

Steve Grumbine:

And with that, folks, on behalf of my guest and myself, Macro and Cheese, we are out of here.

Steve Grumbine:

with the working class since:

Steve Grumbine:

To become a donor, please go to patreon.com/realprogressives, realprogressives.substack.com, or realprogressives.org.
