Fake Nudes Of Thousands of Women Are Being Shared Online Through Disturbing New App

In yet another example of how the concepts of right and wrong are no longer norms, some absolute garbage group of humans is running a bot on a messaging app that allows anyone to create fake nudes and pornographic images and then spread them online—even images of kids.

And I am officially done with 2020. 

The BBC recently broke the news that a "deepfake bot" running on the messaging app Telegram allows users to submit an image of a person and get back a version with the person's clothing removed, creating a fake nude.

This can be done for free in only a few minutes, and holy good green grass, my skin is crawling now.

In an interview with the BBC, the intelligence firm Sensity stated that many of the people who were "deepfaked" appeared to be underage. What's worse, the people who run the bot don't see an issue with it and are quoted as saying that this is "simply entertainment."

“Having a social media account with public photos is enough for anyone to become a target,” Sensity’s chief executive Giorgio Patrini told the BBC.

So, here is how it works.

A deepfake is an AI-generated image or video: software takes pictures of a real person and produces realistic-looking images and videos of that same person in situations that never happened, such as appearing nude.

You may have heard of how Kristen Bell was recently the target of a deepfake when her face was superimposed onto a woman's body in a pornographic video.

Deepfakes are causing problems for politicians too. There have been several examples in which videos of political leaders giving speeches have been convincingly altered to say things they never actually said, like this one of Barack Obama.

But now the line has been crossed from targeting public figures with PR teams and deep pockets to fight this disturbing use of AI, to attacking private individuals who may be irreparably harmed by such a breach of decency and privacy.

Victims of this technology could be fired from their jobs or potentially targeted for violent attacks. This isn’t funny, and it certainly isn’t entertaining to mess with a person’s life like this.

But back to that Telegram bot.

The BBC reached out to the administrator of the bot, who is apparently known only as "P," to get their perspective on deepfake nudes of ordinary people.

“I don’t care that much. This is entertainment that does not carry violence,” P reportedly told the BBC. “No one will blackmail anyone with this since the quality is unrealistic.” 

The security firm Sensity reportedly shared with the BBC that 104,852 women and children had been targeted for deepfakes between July 2019 and 2020.

Those examples were all nudes and were shared publicly online. Sensity did not clarify whether they all came from the Telegram bot or were compiled from multiple sources.

As a parent, the idea that anyone with nefarious intent could take an image of my child and use AI to alter it in a manner that is, quite frankly, disgusting and evil absolutely sickens me.

And with social media and technology changing at a faster clip than our laws can keep up with, this new turn for the revolting has me wondering if perhaps it is time to delete my social accounts altogether and ban my kids from them entirely.
