The AI governance illusion

by Black Hat Middle East and Africa

Most organisations across industries now understand that they need visibility into their AI estate. Visibility has become a reassuring metric in boardrooms and on security dashboards: leaders report strong awareness of where AI is deployed, how it is used, and what data flows through it.

In a recent survey by the Purple Book, 90% of organisations said they have visibility into their AI footprint, and 86% claimed a complete inventory of AI systems.

But the same dataset shows that 59% of those organisations confirm or suspect the presence of shadow AI – tools and workflows operating outside formal governance. And separate research from Thoropass finds that 69% of organisations say AI adoption is moving faster than their security and compliance controls.

So visibility doesn’t automatically equal control. 

The attribution problem

Governance begins with knowing who did what. That foundation becomes unstable as soon as organisations can't distinguish between human and machine activity. According to 2026 research from the Cloud Security Alliance (CSA), 68% of organisations report they can't clearly differentiate actions performed by AI agents from those performed by humans.

Identity models add even more complexity. As CSA notes, AI agents operate through a mix of workload identities (52%), shared service accounts (43%), dedicated identities (36%), and even human user identities (31%). Each model carries different permissions and audit trails.

When attribution is inconsistent, it's hard to know whom to hold accountable.
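One practical way to make attribution consistent is to record the actor type, and any human principal an agent is acting for, in every audit event – so agent activity is never logged under a bare human or shared account. A minimal sketch (the `AuditEvent` structure and field names are illustrative, not from any framework cited above):

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import Optional
import json

@dataclass
class AuditEvent:
    """Audit record that always distinguishes human from machine actors."""
    actor_id: str                # e.g. "svc-finance-bot" or "j.smith"
    actor_type: str              # "human" | "ai_agent" | "workload"
    on_behalf_of: Optional[str]  # human principal an agent acts for, if any
    action: str
    resource: str
    timestamp: str

def log_event(actor_id, actor_type, action, resource, on_behalf_of=None):
    """Serialise one attributable audit event as JSON."""
    event = AuditEvent(
        actor_id=actor_id,
        actor_type=actor_type,
        on_behalf_of=on_behalf_of,
        action=action,
        resource=resource,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )
    return json.dumps(asdict(event))

# Even an agent running under a shared account stays attributable:
print(log_event("svc-finance-bot", "ai_agent", "read", "invoices/2024",
                on_behalf_of="j.smith"))
```

With events shaped like this, the 68% who can't tell agent actions from human ones have a straightforward query to answer the question, whichever of the four identity models the agent happens to use.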

Ownership without a centre

Governance depends on clear ownership. The CSA data shows responsibility for AI identity and access is spread across security (28%), engineering (21%), and IT (19%), with only 9% assigned to IAM teams and another 9% reporting no clear owner.

Another industry survey from ISACA reinforces this fragmentation. Responsibility for AI-related risk spans executives, technical leaders, and security teams, with a significant share of organisations unclear on ultimate accountability. 

Governance does exist, but it’s not all in one place. 

Confidence meets operational reality

In the CSA survey, 57% of organisations report moderate or high confidence in identity scoping and access control. 

But operational data tells a more complex story. One-third of organisations are unsure how often AI credentials are rotated or refreshed, 9% say credentials are rarely or never rotated, and only 22% apply access control frameworks consistently to AI agents.
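Closing the "unsure how often credentials are rotated" gap can start with something as simple as tracking last-rotation timestamps and flagging anything past a maximum age. A hypothetical sketch (the 90-day threshold and credential records are illustrative, not a recommended policy):

```python
from datetime import datetime, timedelta, timezone

MAX_CREDENTIAL_AGE = timedelta(days=90)  # illustrative policy threshold

def stale_credentials(credentials, now=None):
    """Return credential IDs that are overdue for rotation.

    `credentials` maps a credential ID to its last-rotation datetime.
    A value of None means rotation was never recorded, which is
    treated as stale: untracked is assumed unrotated.
    """
    now = now or datetime.now(timezone.utc)
    stale = []
    for cred_id, last_rotated in credentials.items():
        if last_rotated is None or now - last_rotated > MAX_CREDENTIAL_AGE:
            stale.append(cred_id)
    return stale

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
creds = {
    "agent-api-key": datetime(2025, 5, 1, tzinfo=timezone.utc),   # fresh
    "agent-db-token": datetime(2024, 11, 1, tzinfo=timezone.utc), # overdue
    "legacy-svc-key": None,                                       # never tracked
}
print(stale_credentials(creds, now))  # ['agent-db-token', 'legacy-svc-key']
```

The point is less the threshold than the posture: "unsure" and "never rotated" both surface as the same actionable list.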

The problem here is that AI agents already operate inside production systems – interacting with internal APIs (56%), SaaS platforms (49%), and cloud infrastructure (44%). So gaps in tracking and control create real exposure. 

At the same time, Thoropass finds that AI-related data exposure has become the most likely trigger for regulatory or customer fallout (55.2%). 

Governance at machine speed

The underlying issue is speed. 

AI adoption introduces new workflows faster than governance frameworks evolve: while security teams are still identifying risks and defining controls, development and business teams are already deploying AI capabilities into production.

The Purple Book describes this as a gap between awareness and the ability to act at the pace AI demands.

Compliance functions already reflect this shift. As Thoropass notes, audit programmes are moving from periodic certification toward continuous risk management as AI becomes embedded in operational workflows. 

Moving from visibility to control 

The path forward centres on execution:

  • Treat AI agents as first-class identities
  • Apply consistent access controls across environments
  • Align ownership across security, engineering, and IAM
  • Build real-time monitoring and revocation capabilities
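The steps above can be sketched as a minimal agent-identity registry with per-agent scoped grants and immediate revocation. All names here are illustrative; a real deployment would back this with an IAM platform rather than an in-memory class:

```python
class AgentIdentityRegistry:
    """Toy registry treating AI agents as first-class identities."""

    def __init__(self):
        self._grants = {}    # agent_id -> set of (resource, action) pairs
        self._revoked = set()

    def register(self, agent_id):
        """Give each agent its own identity, not a shared service account."""
        self._grants.setdefault(agent_id, set())

    def grant(self, agent_id, resource, action):
        """Scope access per agent, per resource, per action."""
        self._grants[agent_id].add((resource, action))

    def revoke(self, agent_id):
        """Revocation takes effect on the very next access check."""
        self._revoked.add(agent_id)

    def is_allowed(self, agent_id, resource, action):
        if agent_id in self._revoked:
            return False
        return (resource, action) in self._grants.get(agent_id, set())

registry = AgentIdentityRegistry()
registry.register("summariser-agent")
registry.grant("summariser-agent", "internal-api:tickets", "read")

print(registry.is_allowed("summariser-agent", "internal-api:tickets", "read"))
registry.revoke("summariser-agent")
print(registry.is_allowed("summariser-agent", "internal-api:tickets", "read"))
```

Checking every request against a live registry, rather than a credential issued once and forgotten, is what turns visibility into the real-time control the list above describes.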

Visibility provides information. But it’s control that determines whether that information reflects reality at any given moment.
