This Is What a Deepfake Voice Clone Used in a Failed Fraud Attempt Sounds Like
Prudent Press Agency
One of the stranger applications of deepfakes — AI technology used to manipulate audiovisual content — is the audio deepfake scam. Scammers use machine learning to clone someone's voice and then combine that voice clone with social engineering techniques to convince people to move money where it shouldn't be. Such scams have been successful in the past, but how good are the voice clones being used in these attacks? We've never actually heard the audio from a deepfake scam — until now.
Security consulting firm NISOS has released a report analyzing one such attempted fraud, and shared the audio with Motherboard. The clip below is part of a voicemail sent to an employee at an unnamed tech firm, in which a voice that sounds like the company's CEO asks the employee for "immediate assistance to finalize an urgent business deal."