Elon Musk’s xAI sued for turning three girls’ real photos into AI CSAM

via arstechnica.com


A tip from an anonymous Discord user led police to what may be the first confirmed Grok-generated child sexual abuse material (CSAM) that Elon Musk’s xAI cannot easily dismiss as nonexistent. As recently as January, Musk denied that Grok generated any CSAM, during a scandal in which xAI refused to update filters to block […]
