Shownotes
Facebook touts its multibillion-dollar investment in content moderation in its advertisements and every time a senior executive is asked to address the problems of mis- and disinformation, hate speech, abuse, bigotry, and other violations on its platforms. But what does that investment look like at the last mile? Who does the work? What conditions do they face?
Today we hear from Billy Perrigo, a journalist at Time magazine, who tells us of the plight of outsourced content moderation workers in Kenya, tasked with moderating the typical stream of gore and bigotry as well as screening for hate speech and incitement to violence emanating from war-torn Ethiopia.