Why is it still so hard to secure mission-critical operations?

by Black Hat Middle East and Africa

In July 2025, IBM’s annual Cost of a Data Breach report delivered a rare glimmer of good news: for the first time in five years, the global average cost of a breach dropped – from $4.45M to $4.44M. Breach lifecycles also shrank, with organisations taking an average of 17 fewer days to identify, contain, and recover.

It’s progress. But it’s definitely not the full picture. 

Despite modest improvements in breach metrics, the day-to-day experience of cybersecurity leaders tells a more complex story. According to new research by BitSight, 90% of surveyed cyber and risk professionals say their jobs are harder than they were five years ago. And AI is a significant driver of that pressure – cited by 39% of respondents as a core factor making cyber risk management more difficult.

So, what’s going on? Why does everything still feel so hard, even when some metrics improve?

A new Ponemon Institute report, titled The State of Mission-Critical Work, offers one explanation: our most vital workflows are still vulnerable – and we’re struggling to secure them.

Cyberattacks are the top cause of mission-critical failures

According to Ponemon, 64% of organisations experienced at least one disruption or failure in their mission-critical workflows in the past 12 months. Most of them experienced more than one.

And the leading cause of those failures was cyberattacks, cited by 50% of respondents – slightly edging out system glitches (49%). These aren’t minor outages: 62% of respondents say that workflow failures led to the leakage of high-value information assets, while 58% experienced data centre downtime, and 46% said the incident directly affected their organisation’s survivability.

A million-dollar mistake (or worse)

The financial impact of these disruptions is significant. While not every incident results in a public breach, the internal costs are still high. According to Ponemon’s data:

  • The most commonly used metric to assess incident cost is downtime of critical operations (63%)
  • Reputational recovery and remediation efforts follow closely behind
  • A separate Ponemon study in 2020 placed the average cost of a single data centre outage at over $1M

But only 53% of organisations actually measure the cost of mission-critical workflow failures – which suggests that many might not fully grasp the risks they face.

Who owns the risk? It’s not the CISO

One of the more surprising findings is that only 16% of organisations say their CISO is primarily responsible for ensuring mission-critical workflows are executed securely. The most common answer is ‘Business unit leaders’ (26%).

This disconnect may explain why security isn't fully embedded in the design and execution of vital operational systems. When we speak to CISOs at Black Hat MEA, they remind us over and over again that when they’re sidelined from mission-critical functions, risk visibility decreases – and so does preparedness.

Lack of real-time info is killing workflow resilience

Only 34% of the organisations surveyed by Ponemon rate themselves as effective at prioritising critical communications during incidents. The top barriers to secure, continuous operations appear to be:

  • Lack of real-time information sharing (60%)
  • Lack of secure information sharing (58%)
  • Ineffective coordination across functions (53%)

It’s a reminder that cybersecurity resilience is as much an organisational challenge as a technical one. Collaboration, communication, and structure matter.

AI adoption is up, but it comes with risks 

51% of organisations say they’ve adopted AI to support mission-critical workflows, most often to automate repetitive tasks and to secure data processed by LLMs. But this adoption is not without risks:

  • 53% cite data leakage or theft as their top AI-related concern
  • 48% are worried about backdoor attacks on their AI infrastructure
  • 45% highlight legal and compliance risks

With the use of LLMs rising across operations, securing the AI layer is fast becoming a mission-critical task in itself.

Dedicated teams make a difference, but not everyone has one

Only 56% of organisations have a dedicated team to manage and secure mission-critical workflows. Those that do report:

  • Fewer incidents (average of 5 per year, versus 6 for others)
  • Higher efficiency in workflow management (57% vs. 36%)
  • Better use of secure collaboration tools

Yet 44% of organisations still operate without such a team – and most of them say it’s very or highly difficult to manage their critical workflows effectively.

Metrics aren’t the mission 

We can take comfort (at least for a second or two) in declining breach costs or shortened response times. But if cyber and risk leaders are telling us their work is harder, and the systems they protect are still failing under pressure, then the story is far from simple. 

To truly improve security, we have to protect what matters most: the work that keeps our organisations running. And that means embedding cybersecurity into the heart of our operations consistently and diligently. 
