Senators press Big Tech on AI, deepfakes, youth safety and the fate of local news

October 29, 2025 | Senate Committee on Commerce, Science, and Transportation


This article was created by AI to summarize key points discussed. AI makes mistakes, so for full details and context, please refer to the video of the full meeting, and please report any errors so we can fix them.

Senators used the Commerce Committee hearing to press platform witnesses about artificial intelligence, deepfakes, the online safety of teenagers, and the erosion of local journalism.

Senator Amy Klobuchar and others focused on the proliferation of digitally altered political videos and the need for provenance and labeling. Klobuchar noted pending legislation that would require labeling of certain AI-generated media and asked Meta and Google whether they supported measures to mark altered content. Neil Potts said Meta directionally supports more transparency for AI-generated posts and pointed to industry efforts such as the Coalition for Content Provenance and Authenticity (C2PA), whose standards include markers for AI content.

Senator Marsha Blackburn pressed Google on Gemini, citing a constituent example she said involved fabricated accusations produced by the model. Erickson acknowledged that "LLMs will hallucinate. It's a known issue," and said Google has teams working to mitigate hallucinations and that the company trains its models on publicly available information.

Senators also raised youth safety. Blackburn's questioning included allegations about platforms and teen exposure to harmful content; Meta said it is investing in safety and privacy measures and would follow up with committee offices on its research and methodologies. Witnesses described differences in how platforms approach safety and said they are continuing to refine controls and enforcement.

Several senators linked AI and platform concentration to threats to local journalism and the "seed corn" of reporting. Senator Cantwell and others warned that closures of local news outlets, combined with opaque AI training and recommendation systems, risk hollowing out the information ecosystem. Google said it directs users to publishers and supports news ecosystems; Meta described grants and partnerships it has provided to some journalism organizations.

The exchange included broader questions about algorithms and engagement. Senators asked whether algorithms designed to maximize time on site narrow the range of information users encounter, and whether Section 230 immunity should apply when platforms actively amplify or demote content through recommendation systems. Witnesses urged caution about overbroad regulatory steps but expressed openness to discussion.

Lawmakers signaled they will press these issues further in future hearings and follow-up questions for the record.

This article is based on a recent meeting—watch the full video and explore the complete transcript for deeper insights into the discussion.

View full meeting