Children
Incidents Harmed By
Incident 624 · 18 Reports
Child Sexual Abuse Material Taints Image Generators
2023-12-20
The LAION-5B dataset (a commonly used dataset of more than 5 billion image-description pairs) was found by researchers to contain child sexual abuse material (CSAM), increasing the likelihood that downstream models will produce CSAM imagery. The discovery taints models trained on the LAION dataset, requiring many organizations to retrain those models. Additionally, LAION must now scrub the dataset of the imagery.
Incident 55 · 16 Reports
Alexa Plays Pornography Instead of Kids Song
2016-12-30
An Amazon Echo Dot running the Amazon Alexa software began playing pornographic results when a child asked it to play a song.
Incident 1 · 14 Reports
Google’s YouTube Kids App Presents Inappropriate Content
2015-05-19
YouTube’s content filtering and recommendation algorithms exposed children to disturbing and inappropriate videos.
Incident 958 · 11 Reports
Europol Operation Cumberland Investigates at Least 273 Suspects in 19 Countries for AI-Generated Child Sexual Abuse Material
2025-02-26
Europol’s Operation Cumberland uncovered a global network distributing AI-generated child sexual abuse material (CSAM). The operation has led to 25 arrests and the identification of 273 suspects across 19 countries. AI tools allow offenders to create exploitative content at scale with minimal expertise.
Related Entities
Other entities involved in the same incidents. For example, if this entity is the developer of an incident's AI system but another entity is the deployer, the two are marked as related entities.
Incidents involved as both Developer and Deployer
- Incident 583 · 1 Report
Instagram Algorithms Allegedly Promote Accounts Facilitating Child Sex Abuse Content
- Incident 788 · 1 Report
Instagram's Algorithm Reportedly Recommended Sexual Content to Teenagers' Accounts