Description: An aspiring artist in Austin, Texas, Cherelle Kozak, was targeted by a scammer using AI-generated video and voice to impersonate rapper Fat Joe. The impersonator appeared on a call, encouraged her to upload music for supposed radio play, and then demanded payment. Kozak did not comply. The scam closely resembled one that Fat Joe publicly warned about on January 5, 2025.
Editor Notes: Timeline notes: On January 5, 2025, Fat Joe posted a warning on social media alerting fans about AI-powered scams impersonating him. The report about the attempt targeting Cherelle Kozak was published on April 18, 2025.
Entities
Alleged: Unknown deepfake technology developer and Unknown voice cloning technology developer developed an AI system deployed by Unknown scammers impersonating Fat Joe, which harmed Cherelle Kozak, Fat Joe, Fans of Fat Joe, and General public.
Alleged implicated AI systems: Unknown deepfake technology apps, Unknown voice cloning technology, and FaceTime
Incident Stats
Incident ID
1030
Report Count
1
Incident Date
2025-01-05
Editors
Daniel Atherton
Incident Reports
Reports Timeline

AUSTIN, Texas — In a world where connections are just a video call away, it’s easy to trust the face on your screen. Whether dating, networking, or just catching up, FaceTime and social media video calls help bridge the distance, but what …
Variants
A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
Similar Incidents
Did our AI mess up? Flag the unrelated incidents