FTC Commissioner Holly Oak told an audience at the Hinckley Institute of Politics that the agency is using enforcement, research orders and public education to fight a growing wave of scams that exploit artificial intelligence.
Oak said the commission has brought cases this year under its new authority targeting student‑debt relief schemes, phantom debt‑collection scams and e‑commerce “business opportunity” frauds, and has worked with domain registrars to shut down 13 websites illegally impersonating the Federal Trade Commission. “The FTC is committed to using our new authority to hold scammers accountable,” Oak said.
The commissioner described a recent enforcement action against a company called Workado, which marketed a product it claimed could detect AI‑generated text with roughly 98% accuracy but that performed at only about 53% in independent testing. The FTC required the company to stop making the misleading claim, Oak said.
Oak outlined a three‑pronged approach: law enforcement against bad actors; incentives for private innovation; and research to inform policymakers. On the innovation side, she said the agency used authority under the America COMPETES Act to host a prize challenge that produced multiple technical approaches to prevent harmful deep‑fake voice cloning. Winning submissions, Oak said, included an AI algorithm to differentiate synthetic from genuine voices, protections that make certain online audio samples harder to harvest for cloning, and “liveness” detection that can spot deep fakes in real time.
For research, Oak emphasized the FTC's use of Section 6(b) of the FTC Act, which allows the agency to compel companies to provide nonpublic business information for use in staff reports. She cited a December 2020 order to social media and streaming platforms about how those companies handle paid commercial ads and whether they use automated systems and human review to screen deceptive financial and health advertising. More recently, Oak said, the commission issued a 6(b) order last month requiring seven companies that operate consumer‑facing AI chatbots to provide information about children's and teens' use of chatbots, including age‑based restrictions, data collection and parental controls.
Oak described harms reported in connection with chatbots, including instances in which chatbots have provided instructions on committing crimes, encouraged self‑harm, or engaged in romantic role play with minors; she said there is at least one alleged suicide reported in the United States linked to a chatbot interaction. Oak noted the global AI companion market was valued at about $28 billion in 2024 and said children's use of such products is a driver of that growth and of the regulatory concern.
On privacy and children's data, Oak said the agency enforces the Children's Online Privacy Protection Act (COPPA). She recounted a recent FTC enforcement outcome against Disney in which the company agreed to a $10 million settlement and to review whether content uploaded to platforms like YouTube should be designated as child‑directed to limit data collection.
Oak also stressed consumer education. She described FTC videos, a printed contact sheet for consumers to record bank and trusted contacts, and a simple heuristic for people targeted by scammers: “Stop, drop, and call” — stop the interaction, drop the call or text, then call a trusted contact or the bank. Oak recounted a Utah student who lost college savings after following a scammer's instructions invoking a bank teller and a so‑called “universal account”; she used the example to urge people to slow down in high‑pressure moments.
In audience questions, Oak said the agency does not have an immediate, simple prosecution success rate to share and noted that many cases involve international actors; the FTC can use the U.S. SAFE WEB Act to cooperate with foreign jurisdictions. Responding to a question about age restrictions and the First Amendment, Oak said the FTC had not imposed age restrictions on chatbots and that some companies may adopt restrictions voluntarily; she also reiterated that COPPA governs data collection from children under 13. On education, Oak said FTC materials for young people are available and that decisions about including scam literacy in school curricula are largely for state and local education authorities.
The commissioner concluded by saying the FTC aims to both facilitate innovation in artificial intelligence and remain vigilant in enforcing the law against fraudsters.