A New Jersey Lawsuit Shows How Hard It Is to Fight Deepfake Porn

A New Jersey deepfake porn lawsuit reveals how hard it is to hold AI abusers accountable—especially when platforms hide offshore.
Matilda
Deepfake Porn Lawsuit Exposes Legal Gaps in AI Abuse Cases

A New Jersey teenager's ordeal with AI-generated explicit imagery has sparked a landmark legal battle, but it also highlights how outdated laws are failing victims of deepfake porn. When her classmates used an app called ClothOff to "strip" her clothed Instagram photos with artificial intelligence, the 14-year-old became part of a growing crisis: non-consensual intimate imagery created not by human hands, but by algorithms. Despite clear evidence and the illegal nature of the resulting images, which qualify as child sexual abuse material (CSAM), law enforcement declined to act. Now a Yale Law School clinic is suing to shut down the app entirely, but even identifying its operators has proven nearly impossible.

Credit: Bryce Durbin

The Rise of ClothOff and Its Global Shadow Network

Launched over two years ago, ClothOff quickly gained notoriety for its ability to generate realistic nude images from fully clothed photos. Though banned from …