
Generative AI governance.
Why? Because a sweeping US court order in May 2025 changed the rules for generative AI governance.
In a legal case between The New York Times and OpenAI, a federal judge ordered the preservation of all user data – including deleted ChatGPT conversations. And that precedent is creating waves for security and privacy teams worldwide.
We asked Betania Allo (Cybersecurity Lawyer and Policy Strategist) why this matters so much, and she said:
“Even users who had disabled chat history or deleted conversations could no longer assume their data was erased. That data had to be preserved – not by corporate policy, but by judicial mandate.”
Allo explained: “The court’s preservation order introduced a new precedent in AI governance...overriding normal retention and deletion policies.”
This means CISOs can no longer assume that standard data minimisation practices will satisfy their legal or regulatory obligations: a court can order that even deleted data be preserved.
For European organisations in particular, this ruling conflicts with GDPR’s principles of data minimisation and the right to erasure. “This directly challenges GDPR principles,” Allo warned, “but this technical safeguard does not negate the broader conflict between jurisdictional privacy norms and extraterritorial legal mandates.”
According to Allo, the most important shift is in how CISOs define data control: even deleted inputs, logs, and experimentation data may now be treated as legal evidence.
“Deleted prompts may still be accessible. Experimentation logs may become subpoenaed records,” Allo noted. And while OpenAI has introduced a vetting system to limit internal access, that safeguard is not externally audited or governed.
“‘Vetting’ is not a legal standard,” she said, highlighting the lack of oversight around who accesses retained data.
This is a wake-up call for every organisation using generative AI. Legal discoverability has entered the AI risk matrix – and the only proven way to avoid unexpected exposure is to enforce Zero Data Retention (ZDR) wherever possible.
“ZDR is no longer a nice-to-have but an essential safeguard,” Allo said.
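Allo’s point about data control has a practical engineering corollary: whatever a vendor retains, your own telemetry should hold as little conversational content as possible, because anything you log can be swept up in a preservation order. Here is a minimal sketch (in Python, with hypothetical event field names such as `prompt` and `completion` that are our illustration, not any vendor’s actual schema) of stripping content from telemetry events before they are persisted:

```python
# Minimal data-minimisation sketch: strip conversational content from a
# telemetry event before it is written to any log store, so retained
# records carry metadata only. The field names below ("prompt",
# "completion", "messages") are illustrative assumptions.
SENSITIVE_KEYS = {"prompt", "completion", "messages"}

def minimise(event: dict) -> dict:
    """Return a copy of the event with conversational content removed."""
    return {key: value for key, value in event.items()
            if key not in SENSITIVE_KEYS}

raw = {"user_id": "u-123", "model": "example-model",
       "prompt": "confidential question", "latency_ms": 412}
print(minimise(raw))
# {'user_id': 'u-123', 'model': 'example-model', 'latency_ms': 412}
```

Filtering before the write is the part that matters: once content lands in a log store, a judicial mandate can reach it no matter what your retention policy says.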
Head to the blog to read more of our conversation with Allo.
And get your pass to attend Black Hat MEA 2025 – to make sure your organisation stays ahead of the curve.
Join the newsletter to receive the latest updates in your inbox.