Online scams are running rampant globally. (Representational Image)
Scammers in Hong Kong were able to fake a video meeting using deepfake technology and, in the process, steal a whopping $25.6 million.
We have seen deepfakes proliferate across the internet, imitating some of the world's most recognisable people for nefarious ends, sometimes with the aim of defrauding unsuspecting victims. But what if we told you that scammers in Hong Kong faked an entire video meeting using deepfake technology and, in the process, stole a whopping $25.6 million?
According to a report by the South China Morning Post, that is exactly what happened to the Hong Kong branch of a multinational company, where scammers used rapidly advancing deepfake technology to stage a manipulated video conference call. The fraudsters reportedly faked the digital appearance of the company's Chief Financial Officer to issue orders for money transfers.
Everyone present on the video call, except the victim, was a fake representation of a real person, the publication reported.
“The scammers applied deepfake technology to turn publicly available video and other footage into convincing versions of the meeting’s participants,” said the report.
The Hong Kong Police say this is a first-of-its-kind scam in the city. "This time, in a multi-person video conference, it turns out that everyone you see is fake," Baron Chan Shun-ching, the acting senior superintendent, was quoted as saying.
The officer added: "They used deepfake technology to imitate the voice of their targets reading from a script." In total, 15 transfers amounting to US$25.6 million were made to multiple bank accounts in Hong Kong.
Notably, this follows multiple incidents of celebrity deepfakes that have taken the internet by storm. One prominent case involved Indian actress Rashmika Mandanna last year, whose face was superimposed on an online influencer's video. More recently, fake sexually explicit clips of singer Taylor Swift have also gone viral.