How to recognize a deepfake: attack of the clones
Learn how to spot deepfakes in photos, videos, voice messages, and video calls in real time.
Technologies for creating fake video and voice messages are accessible to anyone these days, and scammers are busy mastering the art of deepfakes. No one is immune to the threat: modern neural networks can clone a person's voice from just three to five seconds of audio, and create highly convincing videos from a couple of photos. We've previously discussed how to distinguish a real photo or video from a fake, and how to trace when it was taken or generated. Now let's take a look at how attackers create and use deepfakes in real time, how to spot a fake without forensic tools, and how to protect yourself and your loved ones from "clone attacks".

How deepfakes are made

Scammers gather source material for deepfakes from open sources: webinars, public videos on social networks and channels, and online speeches. Sometimes they simply call identity theft targets and keep them on the line for as long as possible to collect data for maximum-quality voice cloning. And hacking the messaging account of someone who loves voice and video messages is the ultimate jackpot for scammers. With access to video recordings and voice messages, they can generate realistic fakes that 95% of folks are unable to tell apart from real messages from friends or colleagues.

The tools for creating deepfakes vary widely, from simple Telegram bots to professional generators like HeyGen and ElevenLabs. Scammers use deepfakes together with social engineering: for example, they might first simulate a messenger app call that appears to drop out constantly, then send a pre-generated video message of fairly low quality, blaming it on the supposedly poor connection.

In most cases, the message is about some kind of emergency in which the deepfake victim supposedly requires immediate help. Naturally, the "friend in need" is desperate for money, but, as luck would have it, they have no access to an ATM, or have lost their wallet, and the bad connection rules out an online transfer. The solution is, of course, to send the money not directly to the "friend", but to a fake account, phone number, or cryptowallet controlled by the scammers.

Such scams often involve pre-generated videos, but lately real-time deepfake streaming services have come into play. Among other things, these let users replace their own face with someone else's during a chat-roulette session or video call.
