Shownotes
Earlier this month, Getty Images, one of the world’s most prominent suppliers of editorial photography, stock images, and other media, announced that it had commenced legal proceedings in the High Court of Justice in London against Stability AI, a British startup that says it builds AI solutions using "collective intelligence." Getty claims that Stability AI infringed its intellectual property rights by including content owned or represented by Getty Images in its training data. According to Getty, Stability AI unlawfully copied and processed millions of copyright-protected images, along with their associated metadata, without a license, to the detriment of the content’s creators. The notion at the heart of Getty’s assertion, that generative AI tools like Stable Diffusion and OpenAI’s DALL-E 2 are in fact exploiting the creators of the images their models are trained on, could have significant implications for the field.
Earlier this month, I attended a symposium on Existing Law and Extended Reality, hosted at Stanford Law School. There, I met today’s guest, Michael Running Wolf, who brings a unique perspective to questions of AI and ownership as a former Amazon software engineer, a PhD student in computer science at McGill University, and a Northern Cheyenne man intent on preserving the language and culture of Native people.