The FTC bans AI impersonations of individuals – and unveils greater powers to recover stolen money
The Federal Trade Commission (FTC) has moved to ban the use of AI tools to impersonate individuals, and has announced greater powers to recover stolen money from scammers.
The agency said it is “taking this action in light of increasing complaints around impersonation fraud, as well as public outrage over the harm caused to consumers” and to those who are impersonated.
The rise of public generative AI tools like ChatGPT has allowed cybercriminals to spoof brands and organizations with greater accuracy and ease. Convincing fake images, voices, and videos of high-profile figures can now be generated in moments; these are known as deepfakes, and they have been spreading at a worrying rate.
New powers
The FTC also said it is “seeking comment on whether the revised rule should make it unlawful for a company, such as an AI platform that creates images, video or text, to provide goods or services that they know or have reason to know is being used to harm consumers through impersonation.”
FTC Chair Lina M. Khan added that the agency wants to expand its proposed impersonation rule, which will now cover individuals rather than just governments and corporations, in order to “[strengthen] the FTC’s toolkit to combat AI-based scams posing as individuals.”
The commission said it is making these expansions in response to public feedback on its previous proposals, as the comments received “pointed out the additional threats and harms posed by the impersonation of individuals.”
The FTC claims the expansion “will help the agency deter fraud and secure compensation for harmed consumers.”
It also finalized the Government and Business Impersonation Rule, which will arm the agency with better weapons to combat scammers who abuse AI to spoof real entities.
The agency will now be able to file federal lawsuits directly to force cybercriminals to return money obtained through impersonation scams. The FTC views this as an important step, arguing that an earlier Supreme Court decision (AMG Capital Management LLC v. FTC) “significantly limits the agency’s ability to require defendants to provide refunds to injured consumers.”
Threat actors who use logos, email addresses, or web addresses, or who falsely imply an association with a business or government agency, can now be sued by the FTC, which can “directly seek financial relief.”
The commission voted 3-0 to finalize the rule, which will be published in the Federal Register.