This is a two-part show: first, a discussion about how to ensure independent researchers have access to data from technology platforms; and second, a book talk with the author of How Algorithms Create and Prevent Fake News: Exploring the Impacts of Social Media, Deepfakes, GPT-3 and More.
In the wake of the revelations brought forward by Facebook whistleblower Frances Haugen, lawmakers and regulators in many capitals are focused on figuring out how to see inside the platforms. Last week, Nathaniel Persily, a professor of law at Stanford Law School and co-director of the Stanford Cyber Policy Center, put forward a draft of potential legislation, announcing what he calls the "Platform Transparency and Accountability Act" in a Washington Post column.
We took the opportunity to invite Nate and two other experts on this subject: Rebekah Tromble, Director of the Institute for Data, Democracy & Politics and Associate Professor at George Washington University, and Brandie Nonnecke, Director of the CITRIS Policy Lab at UC Berkeley and a fellow at the Harvard Carr Center for Human Rights Policy. Together, they discussed how best to give researchers access to the vast troves of data the platforms hold on us.
Noah Giansiracusa is a mathematician and data scientist who is an assistant professor at Bentley University near Boston. Most of his papers are on topics like algebraic geometry and machine learning, but recently he wrote a book that looks at how algorithms are shaping our understanding of the world on social media. The book is called How Algorithms Create and Prevent Fake News: Exploring the Impacts of Social Media, Deepfakes, GPT-3 and More. We spoke to Noah about the challenges of our algorithmically driven information environment, and whether AI might help us fix it.