Crypto exchanges are at risk from a deepfake threat

A new technique circulating in the cybercriminal underworld that can circumvent two-factor authentication (2FA) has been revealed by Cato CTRL, the cyber threat research lab of network security firm Cato Networks.

Cybercriminals are purchasing a deepfake tool from the threat actor ProKYC in order to get around cryptocurrency exchanges' security measures.

The tool uses deepfake video to circumvent the 2FA checks required to authorise new users. This allows criminals to create newly verified accounts, which they can subsequently use for money laundering and other purposes.

As Cato CTRL notes, the American Association of Retired Persons (AARP) reports that the growing problem of new account fraud caused $5.3 billion in losses last year.

To pass online facial recognition checks, the tool first generates a fake persona using deepfake technology, then uses that image to forge a counterfeit identity document, such as a passport, carrying the photo. Finally, it produces a video of the same fake person to match the forged documents.

To launch the account fraud attack, the offender connects to a cryptocurrency exchange and uploads the falsified paperwork, according to Cato CTRL.

As part of the exchange's identity verification process, the attacker is then asked to turn on their computer's camera so that facial recognition can be performed. Instead, the tool lets the criminal feed the generated deepfake video into the verification flow as though it were live camera input.

Although simply strengthening the authentication procedure will not always deliver the best outcomes for cryptocurrency exchanges, Cato CTRL notes that exchanges and other businesses are not powerless to stop these attacks.

But there are telltale signs that a document, picture, or video is fake, the researchers add. “One example is picture quality. A picture, and especially a video, which is very high quality are indicative of a digitally forged file. Another example is glitches in facial parts and inconsistency in eye and lip movement during biometric authentication. They should be treated as suspicious and manually verified by a human.”
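Checks of the kind Cato CTRL describes can be partially automated before a human reviewer gets involved. The sketch below is a rough illustration, not anything drawn from the report itself: it uses OpenCV to score a submitted verification video and flags footage that is uniformly very sharp or shows almost no frame-to-frame variation, two loose proxies for the "too high quality" signal mentioned above. The thresholds and the function names are assumptions chosen purely for illustration.

```python
# Illustrative sketch only: flags verification videos whose frames are
# uniformly very sharp or show almost no frame-to-frame variation.
# Thresholds are arbitrary placeholders, not values from the Cato CTRL report.
import statistics

import cv2

SHARPNESS_FLOOR = 800.0      # assumed: unusually high Laplacian variance
UNIFORMITY_CEILING = 0.05    # assumed: suspiciously low relative spread


def sharpness(frame) -> float:
    """Variance of the Laplacian: a common proxy for image sharpness."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()


def should_flag_for_review(video_path: str) -> bool:
    """Return True if the video looks 'too clean' and deserves a human check."""
    capture = cv2.VideoCapture(video_path)
    scores = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        scores.append(sharpness(frame))
    capture.release()

    if len(scores) < 2:
        return True  # unreadable or single-frame input: escalate anyway

    mean_sharpness = statistics.fmean(scores)
    relative_spread = (
        statistics.pstdev(scores) / mean_sharpness if mean_sharpness else 0.0
    )

    too_sharp = mean_sharpness > SHARPNESS_FLOOR
    too_uniform = relative_spread < UNIFORMITY_CEILING
    return too_sharp and too_uniform


if __name__ == "__main__":
    print(should_flag_for_review("submitted_verification.mp4"))
```

A heuristic like this would only ever be a first pass: in practice it would presumably sit alongside active liveness challenges and the manual review by a human that the researchers recommend.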
