NewsNation

OpenAI asks The New York Times to prove its articles are original

FILE - The OpenAI logo is seen on a mobile phone in front of a computer screen displaying output from ChatGPT, March 21, 2023, in Boston. A barrage of high-profile lawsuits in a New York federal court, including one by the New York Times, will test the future of ChatGPT and other artificial intelligence products. (AP Photo/Michael Dwyer, File)

(NewsNation) — The New York Times sued ChatGPT-maker OpenAI back in December 2023, claiming copyright infringement. Now, the tech giant wants proof the Times’ articles were “original, human-authored content” in the first place.

The legacy paper claimed OpenAI, alongside Microsoft, used millions of its published articles to train ChatGPT.

OpenAI is defending itself. In a request filed Monday, the company’s lawyers asked the Times to prove its articles are original.

The company requested the paper provide “underlying reporter’s notes, interview memos, records of materials cited, or other ‘files’ for each asserted work.” OpenAI did not request access to confidential information, like sources’ names.

That request was overly broad, unprecedented and improper, according to a Wednesday filing from the Times.

“OpenAI cites no caselaw permitting such invasive discovery, and for good reason. It is far outside the scope of what’s allowed under the Federal Rules and serves no purpose other than harassment and retaliation for The Times’s decision to file this lawsuit,” the filing read.

The Times asserted that the AI powerhouse is “not entitled to unbounded discovery into nearly 100 years of underlying reporters’ files, on the off chance that such a frolic might conceivably raise a doubt about the validity of The Times’s registered copyrights.”

The Times is far from alone in its efforts against OpenAI. The nonprofit Center for Investigative Reporting sued the company for copyright infringement in late June, the most recent in a barrage of United States media organizations taking legal action against the AI powerhouse over its alleged use of their work.