There is a rising movement to amend laws to combat the use of artificial intelligence in the production of child pornography.
Law enforcement authorities and nonprofit organizations have been raising awareness of the growing criminal use of AI, driven by the availability of new “nudification” tools on the dark web and a rise in “sexploitation” cases involving teenagers.
The Texas Senate Committee on Criminal Justice met on Thursday to examine solutions to the issue and to get more information about it. One of Lt. Gov. Dan Patrick’s interim legislative tasks ahead of the next session is creating a legal response to child predators who use AI to damage and exploit children.
The panelists’ clear message to state senators was that the current statute is insufficient, even though Texas lawmakers had already extended the definition of child pornography through HB 2700 to include sexually explicit AI images made from real photos of children.
Assistant District Attorney Lori Varnell of Tarrant County told the committee, “I’ve prosecuted child pornography for 17 years, and my first experience with an AI problem happened when a grandfather cut the face of his granddaughter off of a picture and crudely cropped it into an animated picture of a child. And it’s not against the law according to our present statute, because the images depicting the child’s genital region weren’t of a real child.”
Some of the speakers, Varnell among them, told the senators that content-based regulation, even though it could raise several First Amendment issues, would be the only way to close the legal gaps that currently allow those who use AI-generated nudes for abhorrent ends to walk free.
“Sextortion and sexual coercion are one of the areas that we haven’t talked about so far,” said Brent Dupree, the Texas Office of the Attorney General’s director of law enforcement.
“Consider a young person receiving a picture of themselves, an otherwise harmless photo that a bad actor has stripped naked and sent to them,” he went on. “‘If you don’t send me money, perform sexual acts, or produce real content for me, then I’m going to spread these around your school or at your work.’”
“We are currently witnessing such issues with both adult and juvenile victims. However, I believe that the development and promotion of AI capabilities will make the issue worse. Bad actors will always find new ways to connect with people,” he continued.
Yet another complicating factor arises when the offender is himself a minor.
Anna McAdams explained to the committee how, in October of last year, a 15-year-old classmate of her 14-year-old daughter at Aledo High School used Instagram photos to create AI-generated nude images of her daughter and her daughter’s friends. The fake nudes were then circulated to the rest of the student body through various Snapchat accounts.
“The realistic aspect of that is horrifying,” she added. “I want to point out that he didn’t just take her face and put it on some random new person; it was their bodies.”
Even after the culprit was identified, McAdams described the terror, hopelessness, and powerlessness the victims felt upon realizing there was nothing they could do to stop it. The teenager was ultimately placed on probation, but his name was withheld from the victims, and he was permitted to return to school without facing any additional consequences.
Even law enforcement was unsure how to handle the teenager, because the school board lacked the necessary tools and the school’s code of conduct offered no guidance. “People of all ages do this,” she noted. “There isn’t much of a law in place if you commit these kinds of crimes as a minor.”