NPR · Justice & Legal · Critical · Mar 17, 2026

Tennessee teens sue Elon Musk's xAI over AI-generated child sexual abuse material

By Huo Jingnan

What happened

Three Tennessee teenagers, identified as Jane Does 1, 2, and 3, filed a class action lawsuit against xAI, alleging that an unnamed app powered by xAI's algorithm was used to generate nonconsensual nude and sexually explicit images and videos of them. The perpetrator, who had a close relationship with one plaintiff, created the material from photos that plaintiff had sent him along with yearbook and social media photos; none of it was labeled as AI-generated. One video depicted a plaintiff undressing until entirely nude. Before being arrested, the perpetrator also created sexually explicit material of 18 other people and traded it online.

The lawsuit claims xAI deliberately licensed its technology to app makers, often outside the U.S., to outsource liability. The plaintiffs' attorney, Vanessa Baehr-Jones, said the teenagers want to change how AI companies make business decisions regarding sexually explicit content. The plaintiffs are seeking damages for emotional distress and other harms.

xAI's image generation tools have been implicated in millions of sexualized images over the past year. Influencer Ashley St. Clair sued xAI earlier in 2026 over AI-produced nude images depicting her as a teenager. Google and OpenAI embed digital watermarks in the output of their image generation tools to disclose AI origin, but xAI has not adopted this standard. xAI did not respond to a request for comment.

Whose perspective

This article is written from a technology and legal accountability beat, centering the plaintiffs' claims and their attorney's framing. The piece treats the lawsuit's allegations as the primary lens, which means xAI's potential legal defenses and the technical question of whether xAI's model was actually used receive limited scrutiny.

Taken for granted

The article takes for granted that xAI bears meaningful responsibility for how third-party apps use its licensed technology. It does not address the alternative framing — that intermediary liability for downstream app developers is a contested legal question — leaving readers without a sense of how courts have historically treated such claims.


