IS THE BOSS who’s giving you an order real or just realistic? Deepfakes are now taking Zoom calls to another level of awkwardness by making us question whether our co-workers are genuine. A finance worker in Hong Kong transferred more than $25 million to scammers after they posed as his chief financial officer and other colleagues on a video conference call, in perhaps the biggest known corporate fraud using deepfake technology to date. The worker had been suspicious of an e-mail requesting a secret transaction, but the scammers looked and sounded so convincing on the call that he sent the money.

Corporate IT managers have spent more than a decade trying, often fruitlessly, to train office workers to spot phishing e-mails and resist the urge to click on dodgy attachments. Often hackers and fraudsters need just one person out of hundreds to inadvertently download the malware needed to tunnel into a corporate network. With AI-powered video tools, they’re moving into territory we had considered safe, underscoring how quickly deepfake technology has developed in just the last year. While it sounds like science fiction, such elaborate frauds are now relatively easy to set up, ushering us into a new age of skepticism.

The fraud in Hong Kong almost certainly used real-time deepfakes, meaning that the fake executive mirrored the scammer as they listened, talked and nodded during the meeting. According to David Maimon, a criminology professor at Georgia State University, online fraudsters have been using real-time deepfakes on video calls since at least last year for smaller-scale fraud, including romance scams.

Maimon posted a video to LinkedIn (https://tinyurl.com/2bax7xd7) showing a demo from developers who are selling deepfake video tools to potential fraudsters. In it, you can see the real image of a man on the left and, on the right, his fake persona, a beautiful young woman, scamming the male victim in the middle.

This is uncharted territory for most of us, but here’s what the Hong Kong victim could have done to spot the deepfake, and what we’ll all need to do in the future for sensitive video calls:

Use visual cues to verify who you’re talking to. Deepfakes still can’t do complex movements in real time, so if in doubt, ask your video conference counterpart to write a word or phrase on a piece of paper and show it on camera. You could also ask them to pick up a nearby book or perform a unique gesture, such as touching their ear or waving a hand, all of which can be difficult for deepfakes to replicate convincingly in real time.
Watch the mouth. Look out for discrepancies in lip syncing or weird facial expressions that go beyond a typical connection glitch.
Employ multi-factor authentication. For sensitive meetings, consider adding a second layer of verification via e-mail, SMS, or an authenticator app to make sure the participants are who they claim to be (a rough sketch of one such check appears after this list).
Use other secure channels. For critical meetings that involve sensitive information or financial transactions, you and the other participants could verify your identities through an encrypted messaging app like Signal and confirm major decisions, such as money transfers, through those same channels.
Update your software. Make sure that you’re using the latest version of your video conferencing software in case it incorporates security features to detect deepfakes. (Zoom Video Communications did not reply to questions about whether it plans to make such detection technology available to its users.)
Avoid unknown video conferencing platforms. Especially for sensitive meetings, use well-known platforms like Zoom or Google Meet that have relatively strong security measures in place.
Look out for suspicious behavior and activity. Some strategies stand the test of time. Be wary of urgent requests for money, last-minute meetings that involve big decisions, or changes in tone, language or a person’s style of speaking. Scammers often use pressure tactics, so beware of any attempt to rush a decision, too.
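To make the authenticator-style check above concrete, here is a minimal, purely illustrative sketch in Python of how two people who have already exchanged a secret over a trusted channel (in person, or via an encrypted messenger like Signal) could each derive a short code to read aloud at the start of a call. The secret, function name and 60-second window are assumptions for the example, and a real organization would rely on an established multi-factor authentication product rather than hand-rolled code like this.

    import hashlib
    import hmac
    import struct
    import time

    def verification_code(shared_secret: bytes, interval: int = 60) -> str:
        # Derive a 6-digit code from the shared secret and the current time
        # window, in the spirit of RFC 4226/6238 one-time passwords.
        counter = int(time.time()) // interval
        msg = struct.pack(">Q", counter)
        digest = hmac.new(shared_secret, msg, hashlib.sha256).digest()
        offset = digest[-1] & 0x0F
        value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return f"{value % 1_000_000:06d}"

    if __name__ == "__main__":
        # Placeholder secret for illustration only: agree on a real one in
        # advance over a trusted channel, never over e-mail.
        secret = b"exchange-this-in-advance-over-a-trusted-channel"
        print("Code to read aloud on the call:", verification_code(secret))

If both sides run the same code within the same minute and the numbers match, each party at least knows the other holds the shared secret, which a deepfaked impostor joining the call would not have.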

Some of these tips could go out of date over time, especially the visual cues. As recently as last year, you could spot a deepfake by asking the speaker to turn sideways so you could see them in profile. Now some deepfakes can convincingly move their heads from side to side.

For years, fraudsters have hacked into the computers of wealthy individuals, hoovering up personal information that helps them get past security checks with the victims’ banks. But at least in banking, managers can create new processes to force their underlings to tighten up security. The corporate world is far messier, with an array of different approaches to security that allow fraudsters to simply cast their nets wide enough to find vulnerabilities.

The more people wise up to the possibility of fakery, the less chance the scammers will have. We’ll just have to pay the price as the discomfort of conference calls becomes ever more agonizing, and the old Zoom clichés about your peers being on mute morph into requests for them to scratch their noses.

BLOOMBERG OPINION