Watchdog releases ‘Dirty Dozen’ list of companies that fail to protect children from online grooming

by Chris Lange, FISM News


The National Center on Sexual Exploitation has released its annual list of businesses and other entities that profit from sexually exploitative products and fail to protect children from sexually abusive material, including popular brands like Instagram, Twitter, and Roblox.

Snapchat, the Apple App Store, and Instagram were named in this year’s Dirty Dozen List of companies that fail to shield children from sexually explicit content or inadvertently facilitate child grooming and sexual exploitation.

Discord, eBay, Kik, Microsoft’s GitHub, OnlyFans, Reddit, Roblox, Spotify, and Twitter were also included in the annual report from the National Center on Sexual Exploitation (NCOSE).

“Those on our 2023 Dirty Dozen List were included for facilitating a diverse set of sexual exploitation issues including sex trafficking, image-based sexual abuse, child sexual abuse material, grooming children for exploitation, and childlike sex abuse dolls,” Lina Nealon, NCOSE Vice President & Director of Corporate Advocacy, said in a press release.

“Sexual abuse and exploitation are on the rise and are facilitated by digital platforms,” she continued. “It is past time for tech platforms to stop their products from threatening the safety of children and enabling sexual abuse to happen to people of all ages.”

The report was informed by documented examples of offenses from each of the tech sites as well as corroborating accusations from victims and witnesses confirmed by researchers for the non-partisan, nonprofit organization.


According to NCOSE, the Apple app store uses “deceptive” age ratings and descriptions that “mislead parents about the content, risks, and dangers to children on available apps.”

NCOSE described Discord as a “haven for sexual exploiters.” Predators have flocked to the messaging platform which allows them to groom minors, share and exchange child pornography, and “commit image-based sexual abuse of adults.”

One researcher said that parents are being duped into paying for Spotify’s content filter, which does nothing to block kids from seeing harmful images.

“When I began my research on Spotify I was absolutely shocked at the amount of pornographic content that I was able to easily find,” the researcher said.

“Parents are putting their trust in this filter and they’re sometimes even paying for a premium subscription in order to be able to control this filter on their kids’ account, and this filter is doing virtually nothing to shield their kids from hardcore pornography on the app.”

The report states that Twitter has refused requests to take down child sexual abuse material posted on its site. In 2021, a mother sued the social media platform, alleging that it facilitated the sexual abuse of her child. According to the report, the company has denied culpability in lawsuits accusing it of facilitating child exploitation.

Another alleged offender is Snapchat, which NCOSE identified as “a top spot for sextortion, sexual interactions between minors and adults, and pornography exposure.”


“We call for urgent change from those who made the 2023 Dirty Dozen List,” Nealon said. “The list exposes practices and products that endanger and harm people and galvanizes the public to press on the named entities to act ethically and promote human dignity.”

The annual report has seen success in prompting companies to do more to protect children from harmful content and exploitation.

“Over the past decade, the Dirty Dozen List campaign has instigated major policy changes at Google, TikTok, Comcast, Delta Airlines, Amazon, the Department of Defense, and many other influential institutions,” Nealon said. 

The report includes links readers can use to message the listed companies directly and demand that they do more to protect kids.

NCOSE was founded in 1962 and began publishing its Dirty Dozen List in 2013. The full 2023 report can be viewed here.