Risk Subdomain
1.2. Exposure to toxic content
Risk Domain
- Discrimination and Toxicity
Entity
AI
Timing
Post-deployment
Intent
Unintentional
Incident Reports

The families of two young girls who allegedly died as a result of a viral TikTok challenge have sued the social media platform, claiming its “dangerous” algorithms are to blame for their children’s deaths.
Parents of two girls who died in a…

The parents of two girls who said their children died as a result of a “blackout challenge” on TikTok are suing the company, claiming its algorithm intentionally served the children dangerous content that led to their deaths.
The girls were…

TikTok’s recommendation algorithm pushes self-harm and eating disorder content to teenagers within minutes of them expressing interest in the topics, research suggests.
The Center for Countering Digital Hate (CCDH) found that the video-shar…