Fake face. Real money.
A reporter at 404 Media got on a Microsoft Teams call recently and watched his own face appear on someone else, in real time. The face moved when the other person moved. Lighting changes held. Expressions tracked. The software is designed and sold for impersonation scams, and it works as advertised.
This is the engine behind a wave of business impersonation scams hitting small businesses, nonprofits, and large companies. The face you see on a video call is no longer evidence of who you're talking to.
A 404 Media reporter tested the deepfake software in a controlled call. The setup is consumer-grade: off-the-shelf laptop, the deepfake tool layered over a normal video conferencing app, and a target face the software has been trained on. The tool is sold openly on Chinese-language messaging channels, with marketing aimed at scammers. It's a commercial product anybody can buy.
The illusion held. The face stayed locked to the speaker through head turns, expression changes, even shifts in room lighting that would normally reveal a fake. There were small artifacts if you knew what to look for, but over a video call people have their guard down.
The face was a lie; the conversation was real.
Business email compromise, or BEC, is the long-running scam where someone impersonates a person you trust (your boss, a vendor, a client) to move money or access where it shouldn't go. The FBI has tracked it for years. Every year it costs US businesses billions.
Each generation of the scam has trained us on a new verification habit. Email scams taught us to verify the sender. Voice phishing taught us to call back to a known number. Video calls used to be the layer that ended the chain, because seeing someone's face was honest evidence of who they were. That layer is no more.
A few years back, when I started a new leadership job, my LinkedIn title updated and the texts started the same week. They opened with the executive director's name and a "hey." Familiar and casual. The number wasn't saved in my phone, but I was new, didn't yet know what was normal, and the executive director texting me felt plausible. I replied asking what they needed. I expected an IT request.
The pivot was gift cards for a client, urgent, stuck in a meeting. That's where it fell apart. The opener got my attention. The request gave it away.
The scam worked because of social pressure. New employee, polite culture, perceived hierarchy. All pushing the same direction.
A deepfake video call is the same scam with one fewer red flag. The asker now looks and sounds right, holds eye contact. The only honest signal left is the request itself.
"Trust the request, not the requester."
What to do
Adopt a callback rule for money or access. Any wire transfer, payroll change, vendor banking update, or credentials request that comes through video, voice, or chat gets a callback to a known number before it moves. Known number means saved in your phone before the request happened, not whatever number is on the email signature.
Make the awkwardness routine. "Can you confirm via text" or "I'm going to call you back at the office number" should feel normal coming from anyone in the company. Train the team to expect it from each other. The point is to remove the social cost of pausing.
Push the policy down from the top. It can't be the bookkeeper's job to figure out whether challenging the boss is allowed. Make it the manager's job to require the callback, and put it in writing.
The deepfake software is here, and the people building it aren't waiting for permission. The fix is a habit. The companies that get hit will be the ones where asking the boss to prove they are the boss felt rude.
Joel
Source: 'HELLO BOSS': Inside the Chinese Realtime Deepfake Software Powering Scams Around the World, 404 Media.

