Naked deepfake images of teenage girls shock Spanish town: But is it an AI crime?

When they returned to school after the summer holidays, more than twenty girls from Almendralejo, a town in southern Spain, received naked photos of themselves on their mobile phones.

None of them had taken the pictures, but they looked completely real.

The images had been stolen from their Instagram accounts, altered using an artificial intelligence application and then shared in WhatsApp groups.

The teenagers were fully clothed in the real photos, but the app generated fake nudity that looked convincing.

Now parents and prosecutors are asking whether a crime has been committed, even though the pictures are not real – could the images be considered child pornography?

“The montages are super realistic, it’s very disturbing and a real outrage,” Miriam Al Adib, one of the girls’ mothers, wrote on her Instagram account.

“My daughter told me with great disgust: ‘Mum, look what they have done to me’,” she added.

Al Adib even claimed that the photos could have reached internet portals such as OnlyFans or pornographic websites. All the while, the girls endured the comments of their classmates.

“Don’t complain, girls upload pictures that almost show their p***y,” one of the girls was told.

The youngest of the girls is only 11 years old and not yet in high school.

The mothers of some of the victims have denounced what has happened.

Another mother, Fátima Gómez, told Extremadura TV that her daughter had been blackmailed.

During a conversation on social media, a boy asked her daughter for money and, when she refused, he sent her a naked picture.

The mothers have organised themselves to complain about what has happened, and the National Police have opened an investigation and already identified several minors allegedly involved.

Some of them are classmates of the girls, a local politician revealed. 

The case has been referred to the Juvenile Prosecutor’s Office, and the mayor of the town himself warned: “It may have started as a joke, but the implications are much greater and could have serious consequences for those who made these photos”.

€10 for 25 nude photos

The hyper-realistic artificial intelligence creations, better known as deepfakes, were made with the ClothOff app.

With the slogan “Undress anybody, undress girls for free”, the app lets users strip the clothes from anyone who appears in their phone’s picture gallery. It costs €10 to create 25 naked images.

Although the nudity is not real, the mothers say that the girls’ distress at seeing their picture is very real indeed.

“You are not aware of the damage you have done to these girls and you’re also unaware of the crime you have committed,” Al Adib said on her Instagram account in a message addressed to the people who shared the pictures.

“My daughter was told by one of them that he had done ‘things’ with her photo,” another of the mothers told Spanish newspaper El País.

But can deepfakes be legally punished?

“One question is whether it should be punished and another is whether it can be punished by the way the law is drafted in Spain and in other EU countries,” Manuel Cancio, professor of criminal law at the Autonomous University of Madrid, told Euronews.

The professor points out that there is a legal loophole: using the minors’ faces affects their privacy, but under the offences that punish the distribution of intimate images, it is the image as a whole that must violate the victim’s privacy.

“Since it is generated by deepfake, the actual privacy of the person in question is not affected. The effect it has (on the victim) can be very similar to a real nude picture, but the law is one step behind,” he adds.

Cancio says that the legal framework that could work in this case would be a crime against moral integrity, “a kind of catch-all for offences that no one knows where else to put”.

In March 2022, the European Commission proposed criminalising this type of offence in a directive on cybercrime. According to the professor, the Dutch Criminal Code is the only one that has a provision addressing this issue.


Can it be considered child pornography?

Experts are divided as to whether the crime could be considered distribution of child pornography, which would carry a higher penalty, and prefer to err on the side of caution.

For Leandro Núñez, a lawyer specialising in new technologies at the Audens law firm, the key is not whether the photo is 100% real, but whether it appears to be.

“The most important thing is whether the face is identifiable. We could be talking about child pornography, crimes against moral integrity or the distribution of images of non-consensual sexual content,” the lawyer told Euronews.

“In the case of a crime against moral integrity, it would be considered a lesser offence and would carry a lighter sentence of six months to two years in prison,” he adds.

Other experts, such as Eloi Font, a lawyer at Font Advocats, a law firm specialising in digital law, believe that it could be classified as a crime similar to the reproduction of sexual images of minors.

In this case, the penalty could be between five and nine years in prison.
