As deepfake technology grows increasingly powerful, Changpeng Zhao (CZ) – the founder of the Binance exchange – has warned that video calls are no longer a reliable method of identity verification. The warning followed a sophisticated attack targeting well-known cryptocurrency analyst Mai Fujimoto (also known as Miss Bitcoin).
Deepfake attack via video call: Mai Fujimoto becomes a victim
Mai Fujimoto lost control of her X (formerly Twitter) account after joining what seemed like a harmless video call with an acquaintance. In reality, the person on the call was a deepfake – AI-generated imagery and facial movement convincing enough that she suspected nothing.
Notably, the deception went undetected for the entire call. Fujimoto only realized the truth after her X account was compromised, followed by her Telegram account and MetaMask wallet.
Suspicious signs ignored
Fujimoto admitted that she had ignored warning signs during the call, such as persistently poor audio quality. Instead of growing suspicious, she assumed her friend – an engineer – had simply chosen Zoom as a personal tech preference, never considering it might be a deception tactic.
The incident demonstrates the sophistication of modern social engineering attacks, in which bad actors use deepfakes to impersonate relatives or colleagues, exploiting the victim's trust to commit fraud.
CZ warns: 'Do not trust even friends if they send software'
CZ emphasized that if an acquaintance messages you asking to install software from an unofficial source, their account has very likely been breached. He advised users never to download software from unknown links, even when the request comes from a close friend. This is a common scam tactic: exploiting trust to spread malware or seize control of devices.
He also believes the rapid development of AI is creating new dangers, with deepfake attacks fundamentally changing the game for identity fraud.
Community warning: Do not trust video calls 100%
After the incident, Fujimoto issued a warning of her own: anyone who receives a video call from her in the future should be cautious, as her likeness could be used by bad actors in subsequent deepfake attacks.
Conclusion
The case of Mai Fujimoto is a serious wake-up call: cyber attacks are becoming increasingly sophisticated, exploiting artificial intelligence to deceive even tech-savvy individuals. In an age where "seeing is believing" can itself be faked, users must stay vigilant, especially toward unusual requests from friends or relatives – in any form.
👉 Advice for you
Do not install software from unfamiliar links, even if sent by friends.
Verify the caller's identity through a second channel; do not fully trust what you see on video.
Always use two-factor authentication (2FA) for important accounts.
Alert friends if you suspect their accounts have been compromised.
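To make the 2FA advice above concrete: the codes produced by authenticator apps follow the TOTP standard (RFC 6238), which derives a short-lived code from a shared secret and the current time. Below is a minimal illustrative sketch using only the Python standard library; the function name `totp` and its parameters are our own, not from any specific authenticator app.

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, for_time=None, digits=6, step=30):
    """Compute an RFC 6238 TOTP code from a base32-encoded shared secret."""
    if for_time is None:
        for_time = int(time.time())
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of elapsed time steps (default 30 s).
    counter = for_time // step
    msg = struct.pack(">Q", counter)  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at a digest-derived offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


# RFC 6238 test vector: secret is ASCII "12345678901234567890" in base32.
# At Unix time 59 the 6-digit SHA-1 code is 287082.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59))
```

Because the code changes every 30 seconds and depends on a secret that never travels over the call, an attacker who has deepfaked a video conversation still cannot produce a valid code for your account.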
🔐 Security is no longer an option – it is a mandatory requirement.