Faked nude images of over 100,000 women created using AI

Faked nude images of more than 100,000 women have been created from social media pictures and shared online, according to a new report.

Clothes are digitally removed from pictures of women by artificial intelligence (AI) and spread on the messaging app Telegram. Some of those targeted “appeared to be underage”, the report by intelligence company Sensity said.

But those running the service said it was simply “entertainment”. The BBC tested the software and received poor results. Sensity says the technology used is a “deepfake bot”, the BBC reports.

Deepfakes are computer-generated, often realistic, images and videos based on a real template. One of their uses has been to create fake pornographic video clips of celebrities.

But Sensity’s chief executive Giorgio Patrini said the shift to using photos of private individuals is relatively new.

“Having a social media account with public photos is enough for anyone to become a target,” he warned.

TELEGRAM BOT

The artificial intelligence-powered bot lives inside a Telegram private messaging channel. Users can send the bot a photo of a woman, and it will digitally remove her clothes in minutes, at no cost.

The BBC tested multiple images, all with the subjects’ consent, and none were completely realistic – our results included a photo of a woman with a belly button on her diaphragm.

A similar app was shut down last year, but it is believed there are cracked versions of the software in circulation.

The administrator running the service, known only as “P”, said: “I don’t care that much. This is entertainment that does not carry violence.

“No one will blackmail anyone with this, since the quality is unrealistic.”

He also said the team looks at what photos are shared, and “when we see minors we block the user for good.”

But the decision on whether to share the photo with others is up to whoever used the bot to create it in the first place, he said.

Defending the service as relatively harmless, he added: “There are wars, diseases, many bad things that are harmful in the world.” He has also claimed he will soon remove all of the images.

Telegram has not responded to a request for comment.

‘PAEDOPHILIC CONTENT’

Sensity reported that, between July 2019 and 2020, approximately 104,852 women were targeted and had fake nude images of them shared publicly.

Its investigation found that some of the images appeared underage, “suggesting that some users were primarily using the bot to generate and share paedophilic content.”

Sensity said the bot has had significant advertising on the Russian social media site VK, and a survey on the platform showed that most users were from Russia and ex-USSR countries.

But VK said it “doesn’t tolerate such content or links on the platform and blocks communities that distribute them”.

Telegram was officially banned in Russia until earlier this year.

“Many of these websites or apps do not hide or operate underground, because they are not strictly outlawed,” said Sensity’s Giorgio Patrini.

“Until that happens, I am afraid it will only get worse.”

The authors of the report say they have shared all their findings with Telegram, VK and relevant law enforcement agencies, but have not had a response.

Nina Schick, author of the book Deep Fakes and the Infocalypse, said deepfake creators were all over the world, and that legal protections were “playing catch-up” with the technology.

“It’s only a matter of time until that content becomes more sophisticated. The number of deepfake porn videos seems to be doubling every six months,” she said.

“Our legal systems are not fit for purpose on this issue. Society is changing quicker than we can imagine due to these exponential technological advances, and we as a society haven’t decided how to regulate this.

“It’s devastating for victims of fake porn. It can completely upend their life because they feel violated and humiliated.”

Last year the US state of Virginia became one of the first places to outlaw deepfakes.

The current UK law around fake nude images has recently been criticised in a university report as “inconsistent, out-of-date and confusing”.

Despite progress on issues like revenge porn and upskirting, “there remain many glaring gaps in the law”, says Lucy Hadley of the Women’s Aid charity.

While these statistics show how widespread deepfake images can be, creating them is not currently a specific offence in the UK.

The government has instructed the Law Commission to review the law around the issue in England and Wales. Its findings are due in 2021.
