Are you ready for vishing attacks?

by Black Hat Middle East and Africa
Welcome to the new 211 cyber warriors who joined us last week. 🥳 Each week, we'll be sharing insights from the Black Hat MEA community. Read exclusive interviews with industry experts and key findings from the #BHMEA23 keynote stage.

Keep up with our weekly newsletters on LinkedIn. Subscribe here.

This week we’re focused on…

AI-enabled vishing.📲


Because cybersecurity researchers at WatchGuard predict it’ll be on the rise in 2024.

What is vishing?🤔

Think phishing, but with a human voice. Except the emergence of generative AI means the voice doesn’t have to be a real human – it can even be a deepfaked copy of a voice the victim already trusts. 

Voice over Internet Protocol (VoIP) combined with automation tech already enables scammers to mass-dial thousands of people. But once someone has answered a call, a human voice has to be there to talk to them and carry out the social engineering component of the scam.

It’s expected that highly convincing deepfake audio, created with generative AI tools, will replace the need for scammer-to-victim chats – allowing threat groups to engage in hundreds or thousands of artificial conversations at a time. And this will vastly increase the scale and impact of vishing. 

Have you ever been fooled by a vishing call?

1. Yes 😞

2. No 😬

It’s already happening: Vishing as a service

The Photon Research Team recently told Information Security Media Group (ISMG) that cybercriminals are now actively using deepfake audio or video tech to deliver convincing, credible impersonation scams.

And worryingly, their research found that cybercriminals are leveraging commercially available tools to do this. 

A specialised attack group sharing information on a Russian-speaking cybercrime forum is offering vishing services to other threat actors for fees of USD 1,000 and up; leveraging those commercially available tools to create and deploy voice clones and customised voice robots.

And they have plenty of customers. 

Vishing isn’t new, but AI makes it more convincing 

  • One global survey found that nearly seven in 10 respondents experienced an attempted vishing attack in 2022 – a 54% increase on 2020
  • More than 59.4 million people in the US were victims of vishing in 2021
  • Increasingly, smartphones are targeted for vishing calls, while vishing over landlines is decreasing

Impersonation using deepfake tech is becoming incredibly convincing – so much so that grandparents can be convinced they’re on the phone to their grandchild when it is, in fact, an AI bot. 

One increasingly common attack identified in the Photon research targets bank customers. The victim picks up the phone and hears a prerecorded message that seems to be from their bank, asking them to provide their account details – and their voice response is recorded and used to bypass the bank’s biometric security measures, giving the attacker access to the victim’s account. 

The more credible the voice, the higher the cost

We love a tiny silver lining: one possible barrier to entry for vishing hopefuls is that customised, credible voice impersonations can cost more to create than lower-quality audio bots.

But it really is a very small problem for attackers. Because as Photon told ISMG, 

"If an attacker obtains the right sample during the reconnaissance phase of an attack, they may not need voice impersonation tools, which are too expensive and complex for many cybercriminals. Instead, an attacker may edit their sample to produce whatever sounds they are looking for."

💡The takeaway?

More vishing attacks are coming in 2024. 

Prepare yourself, educate your team, and speak to cold-callers with caution. 

Do you have an idea for a topic you'd like us to cover? We're eager to hear it! Drop us a message and share your thoughts. Our next newsletter is scheduled for 20 December 2023.

Catch you next week,
Steve Durning
Exhibition Director

Join us at Black Hat MEA 2024 to grow your network, expand your knowledge, and build your business.
