FTC Seeks Comments on New Impersonation Regulation to Counter Deepfake AI

Following a marked increase in impersonation attempts using artificial intelligence (AI) – of both real and fictitious individuals – the Federal Trade Commission (FTC) is seeking public comments on proposed changes to a recently finalized Trade Regulation Rule on Impersonation of Government and Businesses (Impersonation Regulation).

Read the upcoming Supplemental Notice of Proposed Rulemaking here.

Specifically, the FTC has reported that impersonation of government, businesses, and their officials or agents is prevalent, perhaps most notably in a deepfake audio recording of President Joe Biden’s voice. The proposed changes respond to surging complaints of impersonation fraud, exacerbated by the increased use of emerging technologies such as “deepfake” generative AI (GenAI) tools, and, if implemented, could carry significant implications for a variety of businesses, stakeholders, and consumers.

The proposed rule changes center on a new prohibition against impersonating individuals and an extension of liability to parties who knowingly provide goods and services used in unlawful impersonations. If adopted as proposed, the changes would also retitle the Rule as the "Rule on Impersonation of Government, Businesses, and Individuals," reflecting its expanded scope to include impersonation of individuals. Overall, the FTC’s goal behind this proposal is not to impose new burdens on honest individuals or businesses, but rather to combat the harms caused by impersonators and efficiently provide redress to consumers who have suffered at the hands of impersonation schemes.

New Prohibition on Impersonating Individuals

As currently written, the Impersonation Regulation prohibits the impersonation of government and businesses. The supplemental proposed rule seeks to address a broad range of additional scenarios, including romance scams (where scammers pose as individuals interested in a romantic relationship to extract money or sensitive information from consumers) and grandparent scams (where scammers pose as a grandchild in need of immediate financial assistance). The proposed rule would also cover similar scenarios in which a scammer creates fake social media profiles or email addresses, using others’ photos and names to deceive victims into believing they are communicating with the person being impersonated.

Extension of Liability

Another critical change contemplated by the proposed rule is a means and instrumentalities provision that would extend potential liability to companies that provide goods or services they know or have reason to know are being used to harm consumers through impersonation. Importantly, this could encompass not only GenAI-powered platforms, but also companies that provide payment processing services, telecommunications services, or other platforms that scammers use to conduct fraudulent activities. For instance, the proposed rule contemplates extending potential liability to a GenAI-powered platform that knowingly allows a scammer to use the platform to fraudulently impersonate individuals, or to a payment processing company that knowingly allows a scammer to use its services to receive money from impersonation scam victims.

Knowledge of these activities is key to enforcement against such parties. That said, it is difficult to anticipate how the FTC will evaluate whether companies and/or individuals have “reason to know” something, creating uncertainty about how these liability provisions, if adopted, might be enforced.

Your Voice Matters and Other Takeaways

The finalized Impersonation Rule will take effect 30 days after publication in the Federal Register, and the comment period for the Supplemental Notice will remain open for 60 days. By participating in the call for comments on the proposed changes, including offering insights on their potential impact on consumers and businesses and suggesting alternatives that could achieve the same objectives, interested parties can contribute to these crucial conversations and help shape the course of the AI regulatory landscape.

At the heart of the proposed rule is the recognition that technology has created further opportunities for deceiving consumers. Companies that use impersonators should take care to fully disclose such use and comply with all FTC advertising regulations related to consumer protection. Companies engaged in activities that support the exchange of goods and services in which impersonators are used may wish to explore ways to reduce their own liability.

ArentFox Schiff is available to assist clients in preparing and submitting comments and will be actively tracking developments in this field, including publication of the Federal Register notice. If you have questions or are interested in adding your voice to the conversation by submitting a comment, please reach out to any of the authors or another member of the ArentFox Schiff team for guidance.
