On Friday, Senators Marsha Blackburn (R-Tenn.) and Richard Blumenthal (D-Conn.) sent joint letters to major ad tech companies including Amazon, Google, Integral Ad Science, DoubleVerify, the MRC, and TAG. These companies have been identified as "responsible for serving or certifying" ads on websites hosting child sexual abuse material (CSAM). The letters demand explanations of how these ads bypassed online advertising safeguards and why the CSAM went unreported.
The Background
The issue was highlighted by Adalytics, which discovered programmatic ads running alongside CSAM while working on a separate project involving urlscan.io, a service that scans the web for malicious sites. Adalytics identified CSAM hosted on two file-sharing websites, imgbb.com and ibb.co, which allow anonymous uploads and can prevent links from being indexed by search engines.
The Letters
The letters sent to ad tech companies reflected a deep understanding of the ad ecosystem and were tailored to address specific concerns. For instance, Amazon and Google were questioned on why their ad products do not report page-level URL data for served ads. The letters also sought details on any refunds provided to advertisers or the U.S. government regarding ads served on these sites.
In response, Amazon expressed regret and confirmed actions to block ads from these sites, while Google emphasized its zero-tolerance policy for content promoting child exploitation.
Accountability Measures
The senators are also pressing TAG and MRC for information on any past reports of CSAM content and the auditing mechanisms in place to ensure compliance among certified companies. DoubleVerify and Integral Ad Science were asked for transparency regarding URL-level data and previous instances of CSAM content reporting.
Ad Serving Concerns
This situation raises serious questions about the integrity of the ad tech infrastructure. Adalytics reported that most ads served on these CSAM sites came through Amazon's and Google's ad tech, though other vendors were also implicated. The industry's inability to identify and block such content is a significant indictment of current ad tech practices.
Implications for Advertisers
For advertisers, this revelation serves as a caution against complacency in ensuring ad placements are genuinely safe. Many brands and agencies are expected to become more hesitant about spending on the open web, potentially redirecting their budgets to platforms like Instagram and TikTok, which have also faced scrutiny regarding CSAM content.
The ramifications of these findings extend beyond financial loss; they carry serious ethical and legal implications given the nature of the content involved. Several links depicted children as young as four to six years old, and one missing child was identified in the material. This alarming situation continues to unfold as investigations proceed.