AI in Campaigns

As artificial intelligence (AI) grows in prominence and use, lawmakers around the country are taking notice. Some want to crack down on the use of AI, and deepfakes in particular, in campaigns. Deepfakes are audio or visual content altered to make it appear that someone said or did something that never happened. As of June 2023, at least eight states had enacted laws on the general use of deepfakes.


Minnesota Enacts Deepfake Regulation Bill

In May 2023, Governor Tim Walz (D) signed HF 1370 into law, one of the first state regulations addressing deepfakes and elections. Under the new law, it is a crime to disseminate, or agree to disseminate, a deepfake within 90 days of an election if the deepfake was made without the consent of the person depicted and with the intent to injure a candidate or influence an election. Violations can result in imprisonment and fines of up to $10,000. The law took effect on August 1, 2023.


Proposed Pennsylvania Legislation to Regulate Deepfakes in Campaigns

A group of Pennsylvania state lawmakers released a cosponsor memo in December 2023 for legislation regulating the misuse of AI in campaigns. The bill’s text was unavailable as of January 8, 2024, but according to the memo, it would “prohibit the fraudulent misrepresentation of a candidate.” The memo notes that as AI has grown in use, bad actors have used it to produce content targeting candidates for elected office, elected officials, and government programs.

Under the bill, campaigns, super PACs, and candidates would be fined for using deepfake technology to misrepresent other candidates. While fine amounts are still being finalized, the amount could depend on the level of office (federal, state, or local) held or sought by the person being impersonated.


Could the FEC Act?

The Federal Election Campaign Act (FECA) permits the Federal Election Commission (FEC) to prohibit federal candidates from fraudulently misrepresenting themselves to be “speaking or writing or otherwise acting” on behalf of another candidate “on a matter which is damaging” to the candidate being misrepresented. In August 2023, the FEC, a bipartisan commission of three Republicans and three Democrats, announced a proposed rule to limit deliberately deceptive AI campaign ads. Specifically, the proposal would amend the FEC’s regulation on fraudulent misrepresentation of campaign authority to cover deliberately misleading AI campaign ads. The comment period on the proposed rule closed on October 16, 2023; the FEC is expected to formulate a final rule incorporating the feedback it received.

