The Register: Deepfake detectors are slowly coming of age, at a time of dire need

News | August 14, 2025 | Calandra Smith
Summary: AI-powered deepfakes are making fraud harder to detect, with realistic fake videos, voices, and images bypassing traditional checks. Experts, including Fortify’s Karthik Tadinada, warn that even strong detection tools leave gaps. Financial institutions must pair technology with robust verification and monitoring to counter this rapidly growing threat.

Just as AI is enabling breakthroughs in fraud detection, it’s also arming criminals with new ways to deceive – and deepfakes are fast becoming one of the most dangerous tools in their arsenal.

In this article from The Register, security experts, including Fortify's founder Karthik Tadinada, explore how advances in generative AI are driving a surge in convincing fake video, audio, and images – and why current defences are struggling to keep up. Deloitte estimates deepfake fraud could cost the US $40 billion by 2027, though many believe that figure is conservative.

Karthik, drawing on his years in banking fraud prevention, warns that even a 90% detection rate still leaves fraudsters with ample opportunity. Manipulated IDs, utility bills, and even “live” video calls can bypass electronic onboarding checks. New detection methods, such as metadata analysis, edge detection, and pixel variance, offer promise – but voice cloning in particular remains hard to spot.
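To make the pixel-variance idea concrete, here is a minimal sketch (not any vendor's actual detector, and assuming only NumPy) of how local noise statistics can flag a spliced region: camera sensor noise is roughly uniform across a genuine frame, so a pasted or synthetic patch with different noise levels stands out as a variance outlier.

```python
import numpy as np

def block_variance_map(gray, block=16):
    """Per-block pixel variance for a grayscale image.

    Spliced or AI-generated regions often carry noise statistics that
    differ from the rest of the frame; a block whose variance is far
    from the image norm can be flagged for closer inspection.
    This is a toy illustration, not a production deepfake detector.
    """
    h, w = gray.shape
    h_blocks, w_blocks = h // block, w // block
    out = np.empty((h_blocks, w_blocks))
    for i in range(h_blocks):
        for j in range(w_blocks):
            patch = gray[i * block:(i + 1) * block,
                         j * block:(j + 1) * block]
            out[i, j] = patch.var()
    return out

def suspicious_blocks(var_map, z_thresh=3.0):
    """Flag blocks whose variance deviates strongly from the image mean."""
    mu, sigma = var_map.mean(), var_map.std()
    if sigma == 0:
        return np.zeros_like(var_map, dtype=bool)
    return np.abs(var_map - mu) / sigma > z_thresh

# Synthetic demo: a noisy "sensor" image with an artificially smooth patch
rng = np.random.default_rng(0)
img = rng.normal(128.0, 20.0, size=(128, 128))
img[32:64, 32:64] = 128.0  # "pasted" region with no sensor noise

flags = suspicious_blocks(block_variance_map(img))
print(f"{flags.sum()} of {flags.size} blocks flagged")
```

Real detectors combine many such signals (metadata, edge artefacts, frequency-domain cues) with learned models; the point of the sketch is only that statistical inconsistency, not visual inspection, is what gives manipulated media away.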

The takeaway for financial institutions: technology alone won’t solve the problem. Robust verification, transaction monitoring, and an assumption that “if money is involved, caution is critical” must remain part of the defence strategy. As deepfake tools get cheaper and more sophisticated, the race to detect and counter them has never been more urgent.

Read the full article >
