Generative AI Lawsuit: News Nonprofit Sues OpenAI and Microsoft

Generative AI lawsuits are becoming increasingly prevalent as news organizations fight against unauthorized use of their content. The Center for Investigative Reporting (CIR) has recently sued ChatGPT maker OpenAI and its business partner, Microsoft, marking a significant development in this ongoing legal battle.

The Lawsuit Against OpenAI and Microsoft

The Center for Investigative Reporting, which produces Mother Jones and Reveal, filed the lawsuit in a New York federal court. The nonprofit claims that OpenAI used its content without permission and without offering compensation, violating copyrights on the organization's journalism. The lawsuit focuses on how AI-generated summaries of articles threaten publishers, a practice CIR called exploitative.

Monika Bauerlein, the nonprofit's CEO, emphasized the danger of this practice. "Our existence relies on users finding our work valuable and deciding to support it," Bauerlein told The Associated Press. She explained that if people no longer develop a relationship with the nonprofit's work and instead rely on AI tools, it could undermine the foundation of its existence as an independent newsroom and threaten other news organizations.

Broader Implications of the Generative AI Lawsuit

This lawsuit is part of a larger series of legal actions against OpenAI and Microsoft, which are already facing copyright lawsuits from The New York Times, other media outlets, and bestselling authors including John Grisham, Jodi Picoult, and George R.R. Martin. A separate case in San Francisco's federal court involves authors including comedian Sarah Silverman.

Some news organizations have opted to collaborate with OpenAI rather than fight. They have signed deals to get compensated for sharing news content that can be used to train AI systems. For instance, Time announced that OpenAI would get access to its extensive archives from the last 101 years. The AP, The Wall Street Journal, The New York Post publisher News Corp., The Atlantic, Axel Springer in Germany, Prisa Media in Spain, France’s Le Monde, and the London-based Financial Times have also made licensing deals with OpenAI.

The Fair Use Doctrine in Generative AI Lawsuits

OpenAI and other major AI developers have argued that taking publicly accessible online text, images, and other media to train their AI systems is protected by the “fair use” doctrine of American copyright law. However, this argument is being contested by news organizations and authors who believe their content is being exploited without proper compensation.

Last summer, more than 4,000 writers signed a letter to the CEOs of OpenAI and other tech companies, accusing them of exploitative practices in building chatbots. Bauerlein highlighted the issue, stating that news media content should not be a free resource for AI companies to ingest and profit from. "They pay for office space, they pay for electricity, they pay salaries for their workers. Why would the content that they ingest be the only thing that they don't (pay for)?" Bauerlein said.

The Role of News Organizations in the Generative AI Lawsuit

Mother Jones and CIR, both based in San Francisco, were founded in the 1970s and merged earlier this year. Their lawsuit against OpenAI and Microsoft is a significant step in addressing the challenges faced by news organizations in the era of generative AI. By taking legal action, these organizations aim to protect their intellectual property and ensure that they are fairly compensated for the use of their content.

Conclusion

The generative AI lawsuit filed by the Center for Investigative Reporting against OpenAI and Microsoft highlights the ongoing struggle between news organizations and AI developers over the use of copyrighted content. As this legal battle unfolds, it will have significant implications for the future of AI development and the rights of content creators.
