One victim alleges that Grok generated sexual images of her and at least 18 other minors that were posted on Discord.
The three girls say the nonconsensual nude images were created by a perpetrator using AI company xAI's image generation tools.
A class action lawsuit was filed against xAI, arguing the company "knowingly" participated in the creation of child sexual ...
A tip from an anonymous Discord user led cops to find what may be the first confirmed Grok-generated child sexual abuse materials (CSAM) that Elon Musk’s xAI can’t easily dismiss as nonexistent. As ...