AI In Political Advertising

Dickinson Wright PLLC

Artificial Intelligence ("AI") is an incredible tool that is now widely used across industries – political advertising is no exception. AI is easy to use, inexpensive, and fast. Lately, we've received an influx of questions from clients about the intersection of AI and political advertising – the bottom line is clients should do their research before using AI in a political ad. Fifteen (15) states have already enacted laws regulating the use of AI in political advertising, sixteen (16) more have pending legislation and there are indications that federal regulation may be on the horizon. This article is designed to provide high-level information on the intersection of AI and political advertising. Please don't hesitate to reach out to us if you have questions or need assistance navigating this novel issue.

Who has the authority to regulate AI at the federal level?

As discussed above, many states have enacted, or are currently considering, laws that would require political advertisements to include a disclaimer stating that some or all of the advertisement was generated using artificial intelligence tools. In a few states, the use of AI tools for political advertising is completely prohibited.

Legal challenges to these laws are just beginning, and it is likely that the effectiveness of these state laws will be undermined by challenges based upon Section 315 of the Communications Act of 1934, as amended. In particular, advertisements for legally qualified candidates that are purchased by the candidate or the candidate's authorized committee are considered "uses," and Section 315 prohibits broadcasters and cable system operators from censoring an advertisement paid for by a legally qualified candidate or the candidate's campaign.

Importantly, the censorship prohibition contained in Section 315 does not apply to political advertisements that are not considered "uses," i.e., advertisements placed by third-party entities advocating for or against a candidate or issue. While broadcasters are immune from civil liability for "uses" by candidates and their campaigns because they are prohibited from censoring those advertisements, no such immunity applies to third-party advertisements.

This means that a state law imposing disclosure requirements may not be preempted by Section 315 in the case of third-party political advertisements. As discussed below, some state laws and pending legislation extend the liability exemption to broadcasters who carry political advertisements without the required AI disclosure, but broadcasters should carefully review whether they are obligated to edit a third-party advertisement to insert their state's required AI disclosure statement in order to avoid potential state law liability.

At the federal level, there is a dispute over who has jurisdiction to regulate AI disclaimers. The dispute was spurred by a press release issued by Federal Communications Commission ("FCC") Chairwoman Jessica Rosenworcel. On May 22, 2024, Chairwoman Rosenworcel announced that she had circulated a draft Notice of Proposed Rulemaking ("NPRM") that would seek public comment on a proposal to require on-air and written disclosures for political advertisements that include AI-generated content. The proposal would apply to both candidate and third-party advertisements. Separately, Chairwoman Rosenworcel sent letters to the nine largest U.S. telecommunications carriers, seeking information regarding their efforts to prevent AI-generated political robocalls.

While the text of the NPRM has yet to be released, responses to the Chairwoman's proposed rulemaking have been mixed. FCC Commissioner Brendan Carr and Federal Election Commission ("FEC") Chairman Sean Cooksey both expressed concerns over whether the FCC has the requisite statutory authority to adopt such rules. After the public notice was released, Commissioner Carr issued a statement in which he raised concerns that the proposed rules would "only exacerbate[] regulatory asymmetries" between broadcasters, which are regulated by the FCC, and streaming services, which are largely unregulated.

FEC Chairman Cooksey sent a public letter to FCC Chairwoman Rosenworcel arguing that the FEC is solely authorized to regulate "the disclaimer and reporting requirements specific to political communications set out under federal law." Senators Thune, McConnell, Schmitt, and Cruz also sent a letter to the FCC raising concerns over the FCC's statutory authority to require disclaimers.

Last year, the FEC sought public comment on a petition for rulemaking filed by Public Citizen. The petition asks the FEC to amend its regulations to clarify that the statutory prohibition on fraudulently misrepresenting candidates or political parties is applicable to deceptive AI campaign ads. The petition is still pending.

Some states have crafted laws that expressly attempt to regulate AI in federal ads, while others have limited their laws to exclude the regulation of advertisements supporting or opposing federal candidates. For example, pending legislation in Pennsylvania sets hefty fines for artificially generated fraudulent misrepresentations of presidential candidates. Meanwhile, Massachusetts limits the application of its pending legislation to the definition of a "candidate" under its General Laws, which excludes federal candidates. Whether such state laws can be enforced against third-party sponsors of federal ads, such as super PACs, is uncertain. In any event, sponsors should do their research to understand the potential risks of running ads that feature AI in each state.

Congress could also adopt legislation to regulate AI in political ads. Two pieces of pending federal legislation attempt to address the use of AI in robocalls. The Quashing Unwanted and Interruptive Electronic Telecommunications (QUIET) Act would require the person behind a robocall that uses AI to mimic human voices to disclose at the beginning of the call or text that the technology is being used, and would provide enhanced penalties for violations. The Restrictions on Utilizing Realistic Electronic Artificial Language (R U REAL) Act would direct the Federal Trade Commission to amend the Telemarketing Sales Rule to mandate disclosures on AI use in telemarketing and to double the penalties for violations of robocall impersonation rules. We believe the likelihood of either passing this session is low.

States that Regulate AI in Political Ads

Currently, 16 states regulate the use of AI or other synthetic media in political communications. While each state's regulation is unique, they can generally be grouped into three buckets: 1) states with broad prohibitions on the use of deceptive media; 2) states that require disclaimers whenever synthetic media/AI is used; and 3) states that require disclaimers within a limited window of time before an election when synthetic media/AI is used.

Alabama, Mississippi, and Texas have broad prohibitions on the use of deceptive media. Florida, Indiana, Michigan, New York, Oregon, Utah, and Wisconsin require disclaimers for the use of synthetic media, including AI, in political advertisements whenever the ads are run. Arizona, Colorado, Idaho, Minnesota, New Mexico, and Washington State require disclaimers for the use of synthetic media, including AI, in political advertisements during a limited window before elections.

Additionally, several states have enacted laws that grant candidates who have been deceptively depicted in political advertisements a cause of action to sue the sponsor of the ad, or that impose fines and civil or even criminal penalties for violations.

States with Pending Legislation to Regulate AI

Another 13 states, along with Washington, D.C., have pending legislation aimed at regulating the use of AI in political ads. New Hampshire, Illinois, Ohio, and Pennsylvania have pending legislation that would broadly prohibit the use of AI in political ads. California and Ohio have pending legislation to require disclaimers for the use of synthetic media, including AI, in political ads run at any time. California, Delaware, Hawaii, Illinois, Massachusetts, New Hampshire, New Jersey, North Carolina, Ohio, Rhode Island, South Carolina, and Washington, D.C. have pending legislation to require disclaimers during a limited window before an election. Arizona has pending legislation to change the current window during which disclaimers are required.

Potential for Broadcaster Liability

Several states that have adopted AI-related laws have included language to provide varying degrees of liability exemption to radio and television broadcasters that carry political advertisements.

For example, Idaho's new AI law, adopted in March 2024, includes enforcement language that establishes a civil cause of action for a candidate who is the subject of a synthetic advertisement. However, the law exempts broadcast stations that transmit the advertisement as required by Section 315 (i.e., candidate or campaign-sponsored advertisements, but not third-party advertisements).

Indiana's new AI law also provides a civil cause of action against any party that disseminates a campaign advertisement, but only if the party disseminating the message removes the required disclosure statement.

Wisconsin's new AI law makes it much clearer that broadcasters are exempt from any civil action, apparently even for third-party advertisements that are not considered "uses" under Section 315 of the Communications Act.

Other state laws, like Florida's new AI law, do not contain specific carve-outs for broadcasters, but may be interpreted to apply only to the party that is paying for, sponsoring, or approving a political advertisement that fails to contain the required disclaimer.

Finally, some state laws, like Minnesota's AI law adopted in 2023, appear to hold liable any party who disseminates an AI-generated political advertisement, which theoretically could include broadcasters hired to air the ad.

Enforcement

As noted above, the FCC is considering new rules to require disclosure of AI-generated content in political advertisements. In the meantime, it has already taken two enforcement actions addressing the use of AI technologies in connection with robocalls during the January 2024 primary election in New Hampshire.

In particular, the FCC issued proposed forfeitures of $2 million against telecommunications carrier Lingo Telecom and $6 million against political consultant Steve Kramer for their respective roles in sending spoofed robocalls two days before the primary. The robocalls used an AI-generated voice of President Biden to urge New Hampshire voters not to vote in the primary, but to "save" their vote for November.

Notably, the proposed forfeitures against Mr. Kramer and Lingo Telecom rely upon existing FCC authority to address robocalls under the Truth in Caller ID Act of 2009 and the Pallone-Thune Telephone Robocall Abuse Criminal Enforcement and Deterrence Act (TRACED Act). Both laws, and the FCC's rules implementing them, focus on the need to protect consumers from false and misleading communications.

Conclusion

This is a novel, complex, and emerging area of the law. The regulations discussed above can impact many forms of political advertising, including TV and radio, OTT, digital advertising, direct mail, and telephone communications. Dickinson Wright attorneys are here to help you navigate this emerging technology in a legally compliant manner.

