At a Feb. 14 work session of the State Government & Tribal Relations Committee, Sean Flynn, general counsel for the Public Disclosure Commission, told legislators that advances in digital advertising and artificial intelligence have made it far harder to counter misinformation in political ads.
Flynn said courts place strong protection on political speech and noted precedents such as the U.S. Supreme Court’s decision in United States v. Alvarez, which limited government power to criminalize some false statements. He said Washington’s prior effort to bar false political advertising had been narrowed by courts and now reads largely like traditional defamation law.
"The remedy for false speech is more speech," Flynn said, describing the general constitutional approach, but he added that digital advertising raises new challenges because of anonymity, targeting and the speed at which false content can spread. Flynn highlighted "deepfakes," AI-generated audio or video that can make it appear a candidate said something they did not, and played a short example created from public footage to illustrate how realistic AI-generated voice can now sound.
Flynn described Washington’s recent deepfakes law, which creates a private cause of action allowing a candidate to seek a court injunction and to pursue damages for a materially misleading deepfake produced close to an election. He called the law an incremental step intended to respect First Amendment limits while giving candidates a civil remedy.
The PDC general counsel said the commission has also relied on existing disclosure requirements to help address AI-driven advertising. Commercial advertising recordkeeping rules require media sellers to keep records of advertising purchases, and PDC rules now expect disclosure when AI tools were used to create or disseminate political advertising. Flynn said the PDC’s rulemaking seeks records showing whether AI was used in ads and, where possible, the targeted audience that received the ad.
Flynn also noted that several states provide a safe harbor for advertisers who label manipulated content as such, and he described campaign-finance rules that require later disclosure of AI tool purchases as part of campaign reporting.
Committee members asked questions about how to distinguish deepfakes from traditional photo editing or satire, and about enforcement and technical verification. Flynn said the level of manipulation and whether AI was used would be relevant factors, and that the PDC’s focus is disclosure of AI use and sponsor identification rather than compelling companies to reveal proprietary algorithms.
The session closed with lawmakers thanking Flynn for the briefing and noting the need for continued attention as AI and advertising technologies evolve.