One in 10 minors reported that their classmates used artificial intelligence to make explicit images of other kids, according to a new report published by Thorn, a nonprofit working to defend kids from sexual abuse.

“A lot of the times when we get these new technologies, what happens is sexual exploitation or those predators exploit these technologies,” said Lisa Thompson, vice president of the National Center on Sexual Exploitation.

To put together the report, Thorn surveyed more than 1,000 minors, ages 9 to 17, from across the U.S. Along with the 11% who said they knew of a classmate who had created AI-generated explicit images, 7% reported sharing such images themselves. Nearly 20% said they had seen nonconsensual explicit images, and more than 12% of children ages 9 to 12 reported the same.

“It’s gone mainstream, and kids know how to use this, so now we have literally children engaging in forms of image-based sexual abuse against other children,” Thompson said.

Earlier this year, lawmakers introduced the Take It Down Act, which would make it illegal to post nonconsensual explicit images online, including those generated by AI. It would also require websites to remove such images within two days of being notified.