FinCEN alerts financial institutions to be wary of AI-enabled deepfakes

The U.S. Department of the Treasury’s Financial Crimes Enforcement Network (FinCEN) issued an alert Wednesday to financial institutions about their obligations to report deepfakes, warning artificial intelligence has given bad actors additional tools in their arsenal.

Deepfakes are highly sophisticated fabrications, often created with AI, designed to trick targets into believing a phone call, email, or video is genuine. Most often, the fraudster poses as a known bank employee or executive and attempts to convince an employee, manager, or fellow executive to approve a fraudulent transfer of funds.

The use of AI has allowed criminals to convincingly re-create a real person's voice or likeness, making it often very difficult to discern whether a communication is real or a deepfake.
