A Disney Channel child star has told Sky News that she "broke down in tears" after learning a criminal had used artificial intelligence (AI) to create sexual abuse images using her face.

Kaylin Hayman, 16, returned home from school one day to a phone call from the FBI. An investigator told her that a man living thousands of miles away had sexually violated her without her knowledge.

Kaylin's face, the investigator said, had been superimposed on images of adults performing sexual acts.

"I broke down in tears when I heard," Kaylin says. "It feels like such an invasion of my privacy. It doesn't feel real that someone I don't know could see me in such a manner."

Image: Kaylin Hayman

Kaylin starred for several seasons in the Disney Channel series Just Roll With It and was victimised alongside other child actors.

"My innocence was just stripped away from me in that moment," she adds. "In those images, I was a 12-year-old girl and so it was heartbreaking, to say the least. I felt so lonely because I didn't know this was actually a crime that was going on in the world."

But Kaylin's experience is far from unique. There were 4,700 reports of images or videos of the sexual exploitation of children made by generative AI last year, according to figures from the National Center for Missing & Exploited Children (NCMEC) in the US.


AI-generated child sexual abuse images are now so realistic that police experts must spend countless disturbing hours discerning which images are computer-generated and which depict real, living victims.

That is the job of investigators like Terry Dobrosky, a specialist in cyber crimes in Ventura County, California.

"The material that's being produced by AI now is so lifelike it's disturbing," he says. "Someone may be able to claim in court, 'oh, I believed that that was actually AI-generated. I didn't think it was a real child and therefore I'm not guilty.' It's eroding our actual laws as they stand now, which is deeply alarming."

Sky News was granted rare access to the nerve centre for the Ventura County cyber crimes investigations team.

Image: Inside an operation to combat cyber crime in Ventura County, California

Mr Dobrosky, a District Attorney investigator, shows me some of the message boards he is monitoring on the dark web.

"This individual right here," he says, pointing at the computer screen, "he goes by the name of 'love tiny girls'… and his comment is about how AI quality is getting so good. Another person said he loves how AI has helped his addiction. And not in a way of overcoming the addiction - more like fuelling it."


Image: Cyber crime specialist Terry Dobrosky shows Martha Kelner how AI is being used for child abuse images

Creating and consuming sexual images using artificial intelligence is not just happening on the dark web. In schools, there have been instances of children taking pictures of their classmates from social media and using AI to superimpose them onto nude bodies.

At a school in Beverly Hills, California, five 13- and 14-year-olds did just that and were expelled while a police investigation was launched.

Image: Pupils at US schools have even been using AI to generate abuse images of their classmates

But in some states - like California - it's not yet designated a crime to use AI to create child sex abuse images.

Rikole Kelly, deputy district attorney for Ventura County, is trying to change that, with a proposal to introduce a new law.

Image: Rikole Kelly, Deputy District Attorney for Ventura County, wants to make using AI to generate child abuse images a specific crime

"This is technology that is so accessible that a middle schooler [10 to 14 years of age] is capable of utilising it in a way that they can traumatise their peers," she says. "And that's really concerning because this is so accessible and in the wrong hands, it can cause irreparable damage."

"We don't want to desensitise the public to the sexual abuse of children," she adds. "And that's what this technology used in this way is capable of doing."
