This alert may not be shared outside your organization. Do not repost it, send it, place it on other websites or list servers, or distribute it to others via email, including to other associations or parties. Members and law enforcement use only. Contact us for any permissions; doing otherwise will result in the loss of membership.

Complete Story

07/28/2020

This is what a deepfake voice clone used in a failed fraud attempt sounds like

Prudent Press Agency

One of the stranger applications of deepfakes (AI technology used to manipulate audiovisual content) is the audio deepfake scam. Hackers use machine learning to clone someone’s voice and then combine that voice clone with social engineering techniques to convince people to move money where it shouldn’t go. Such scams have been successful in the past, but how good are the voice clones being used in these attacks? We’ve never actually heard audio from a deepfake scam, until now.

Security consulting firm NISOS has released a report analyzing one such attempted fraud, and shared the audio with Motherboard. The clip below is part of a voicemail sent to an employee at an unnamed tech firm, in which a voice that sounds like the company’s CEO asks the employee for “immediate assistance to finalize an urgent business deal.”

Read more...


