Deepfake pornographic images of students spread through a school, and lawmakers are now drafting new laws to punish the creators of such images.
The incident occurred in November last year at Issaquah High School in suburban Seattle, Washington. A student at the school reportedly took photos of his female classmates and used AI technology to "undress" them. These photographs were then passed around the school, drawing police attention. 404 Media obtained the police report, which confirms that web-based apps designed to "nudify" or "undress" people were used to create deepfake images of underage female students.
Notably, the police report states these web apps need only one image of a person to create a realistic nude image. It also states the school didn't immediately inform the authorities of the circulating images; police were instead notified separately by three parents. Furthermore, an officer said they were surprised the school didn't contact police regarding the images, as they qualify as "sexual abuse" and school administrators are "mandatory reporters."
Here's where it gets interesting. The police took the case to a local prosecutor with the intention of charging the student who admitted to creating the images with "Cyber Harassment." The prosecutor declined to file charges against the student.
Now, Washington state lawmakers are discussing whether to make it illegal to share deepfake pornographic images. A bill with bipartisan support is being pushed through Washington's Legislature that would create a new criminal offense, "distributing fabricated intimate images," and would enable victims of deepfake pornographic images to sue the creators.