A US federal court has ordered OpenAI to preserve all ChatGPT user conversation data, including those marked for deletion. The company calls this decision a "privacy nightmare" and is actively challenging the ruling.

In an ongoing legal battle between OpenAI and The New York Times, a federal judge has issued an unprecedented order requiring the company to preserve all ChatGPT output logs. The order covers even conversations that users intentionally deleted, believing the data would be permanently erased.

The Core of the Legal Dispute

The New York Times filed a lawsuit against OpenAI, accusing the company of copyright infringement by using "millions" of the newspaper's articles to train its language models without permission. During the litigation process, suspicions arose that OpenAI may have destroyed relevant evidence.

Magistrate Judge Wang denied OpenAI's motion to reconsider the order to preserve output log data. The company called this decision a "frantic" attempt at coercion and expressed serious concerns about user privacy.

Impact on Users

The court's decision means that OpenAI can no longer delete user conversation data, even when users themselves request deletion. This sets a precedent that could fundamentally change how data is managed across the artificial intelligence sector.

Previously, ChatGPT users could:

  • Disable chat history in settings
  • Delete individual conversations
  • Export their data
  • Permanently delete their account

Now all this data must be preserved for potential use in litigation.
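To make the tension concrete, here is a minimal, hypothetical sketch in Python of how a litigation hold can override user-initiated deletion in a conversation store. The names used here (Conversation, ConversationStore, apply_legal_hold, delete) are illustrative assumptions, not OpenAI's actual systems or API.

    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Dict, Optional

    @dataclass
    class Conversation:
        conversation_id: str
        user_id: str
        deleted_at: Optional[datetime] = None  # soft-delete marker, hidden from the user
        under_legal_hold: bool = False         # set while a preservation order applies

    class ConversationStore:
        """Illustrative in-memory store; not OpenAI's actual implementation."""

        def __init__(self) -> None:
            self._conversations: Dict[str, Conversation] = {}

        def add(self, conv: Conversation) -> None:
            self._conversations[conv.conversation_id] = conv

        def apply_legal_hold(self) -> None:
            # A court preservation order flags every record,
            # including conversations the user already tried to delete.
            for conv in self._conversations.values():
                conv.under_legal_hold = True

        def delete(self, conversation_id: str) -> bool:
            """Handle a user deletion request; return True only if data is purged."""
            conv = self._conversations[conversation_id]
            conv.deleted_at = datetime.now(timezone.utc)
            if conv.under_legal_hold:
                # The record must be retained for discovery, so no hard delete.
                return False
            del self._conversations[conversation_id]  # hard delete when no hold applies
            return True

In this sketch, a user's delete request works normally until apply_legal_hold() is called; after that, conversations are only soft-deleted and remain retrievable for discovery, which is essentially the conflict the court order creates.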

OpenAI's Position

OpenAI representatives argue that such broad preservation of user data creates serious privacy risks. The company emphasizes that many users share personal and sensitive information with ChatGPT, counting on the ability to delete it.

OpenAI also notes the importance of developing sophisticated discovery compliance protocols for AI companies, particularly for products like ChatGPT that process enormous volumes of user data.

Legal Implications

This case could set a precedent for the entire artificial intelligence industry. If the court sides with The New York Times, it could lead to stricter data preservation requirements for all AI companies and change approaches to training language models.

A hearing on the alleged spoliation of evidence is scheduled for May 27 at 2:30 PM. It will consider The New York Times' accusation that OpenAI destroyed output log data in a way that could amount to evidence tampering.

For more detailed information, visit the official OpenAI website.
