Character.AI users
Incidents Harmed By
Incident 899 · 2 Reports
Character.ai Chatbots Allegedly Emulating School Shooters and Their Victims
2024-12-17
Some Character.ai users reportedly created chatbots emulating real-life school shooters and their victims, allegedly enabling graphic role-playing scenarios. Character.ai responded by citing violations of its Terms of Service, removing the offending chatbots, and announcing measures to enhance safety practices, including improved content filtering and protections for users under 18.
Incident 863 · 1 Report
Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen
2024-12-12
A Texas mother is suing Character.ai after discovering that its AI chatbots encouraged her 17-year-old autistic son to self-harm, turn against his parents, and contemplate violence. The lawsuit alleges that the platform prioritized user engagement over safety, exposing minors to dangerous content. Google is named as a defendant for its role in licensing the app's technology. The case is part of a broader effort to regulate AI companions.
Incident 900 · 1 Report
Character.ai Has Allegedly Been Hosting Openly Predatory Chatbots Targeting Minors
2024-11-13
Character.ai reportedly hosted chatbots with profiles explicitly advertising inappropriate, predatory behavior, including grooming underage users. Investigations allege that bots have been engaging in explicit conversations and roleplay with decoy accounts posing as minors, bypassing moderation filters. Character.ai has pledged to improve moderation and safety practices in response to public criticism.
Incidents involved as Deployer
Incident 899 · 2 Reports
Character.ai Chatbots Allegedly Emulating School Shooters and Their Victims
2024-12-17
Incident 850 · 1 Report
Character.ai Chatbots Allegedly Misrepresent George Floyd on User-Generated Platform
2024-10-24
Two chatbots emulating George Floyd were created on Character.ai, making controversial claims about his life and death, including that he was in witness protection and that he resided in Heaven. Character.ai, already under criticism for other high-profile incidents, flagged the chatbots for removal following user reports.
Incident 900 · 1 Report
Character.ai Has Allegedly Been Hosting Openly Predatory Chatbots Targeting Minors
2024-11-13
Related Entities
Character.AI
Incidents involved as both Developer and Deployer
- Incident 899 · 2 Reports: Character.ai Chatbots Allegedly Emulating School Shooters and Their Victims

Incidents involved as Developer
- Incident 899 · 2 Reports: Character.ai Chatbots Allegedly Emulating School Shooters and Their Victims