Be Aware! Your photos can now be used to create fake adult videos

“Emma Watson” is naked, kneeling on a couch. Nearby, “Maisie Williams” is sitting in a compromising position. “Gal Gadot” is dressed as a cowgirl, performing sexual acts.

Where are we? Not in the dark depths of a teenager’s fantasy, but on the PornHub website, where the only differences between Hollywood actress Emma Watson and fake porn star “Emma Watson” are a few pixels, several layers of clothes and a total lack of consent.

Welcome to the world of deepfakes: a new, X-rated form of identity theft.

Deepfakes are porn videos modified with AI-powered face-swapping technology, so that the performer’s face is replaced with someone else’s.
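The original “deepfakes” technique was, at heart, an autoencoder trick: a single shared encoder learns to capture pose and expression across both people’s faces, while a separate decoder is trained per identity, so feeding person A’s face through the encoder and person B’s decoder produces the swap. Below is a minimal, hypothetical PyTorch sketch of that idea; the architecture, layer sizes and names are illustrative assumptions, not FakeApp’s actual code.

```python
# Conceptual sketch of the shared-encoder / per-identity-decoder idea
# behind early face-swap deepfakes. Layer sizes are arbitrary choices.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),    # 64x64 -> 32x32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Flatten(),
            nn.Linear(128 * 16 * 16, 512),  # compact "face code": pose/expression
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(512, 128 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )
    def forward(self, z):
        x = self.fc(z).view(-1, 128, 16, 16)
        return self.net(x)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity

# Training: each decoder learns to reconstruct its own person's face crops
# through the *shared* encoder (random tensors stand in for real face crops).
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)
loss_fn = nn.MSELoss()
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
     + loss_fn(decoder_b(encoder(faces_b)), faces_b)
opt.zero_grad(); loss.backward(); opt.step()

# The swap: encode person A's face, decode with person B's decoder,
# yielding B's identity with A's pose and expression.
swapped = decoder_b(encoder(faces_a))
```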

Just six months ago, this required extremely complex coding and a lot of free time.

But now, an internet search for “deepfake porn” yields at least 700,000 results. The technology has spread so fast, and is so easy to use, that practically anyone can create personalized pornography in about 12 hours.

Celebrities are the most popular victims of deepfakes. In fact, this kind of face forgery started in Hollywood.

Image caption: Emma Watson is one of the actresses whose face has been superimposed on the body of a porn actress, as seen in this montage.

“The technology for superimposing celebrity faces on other people has been available for special effects for years,” says Evgeny Chereshnev, executive director of security technology company BiolinkTech.

Modified photos, therefore, are nothing new, but the availability and level of realism of modern deepfakes are.

Within reach of anyone

Deepfakes take their name from a Reddit user. Last year, “deepfakes” perfected a complex algorithm to create spooky videos that appear to show actresses Gal Gadot, Taylor Swift and Scarlett Johansson performing pornographic acts.

Although your brain knows that none of these women actually did those things, your eyes find it very convincing.

Image caption: A Reddit user created a free app called FakeApp that does the face swapping for you.

Then, in January, another Reddit user created a free app called FakeApp, with a built-in algorithm that does the face swapping for you.

You only need a powerful computer (like those used for video editing), a graphics processing unit and enough images of your target (which, thanks to social networks, YouTube or even LinkedIn, is not hard to come by); theoretically, anyone can transform a porn star into anyone else.

Currently, basic versions are available to anyone: for less than US$30, Adobe can provide the tools to create a digital copy of someone, although it would take expensive professional software to reach the next level.

Adobe VoCo, a kind of Photoshop for audio, even allows the user to recreate someone’s voice after listening to just 20 minutes of it. It is still in the research stage, but other companies such as Lyrebird already have more basic software available for use.

Now anyone can replace a porn actor’s face with that of their crush, for example. Or someone might want to take revenge on an ex and use it to ruin their career or their new relationship. Or, indeed, sabotage the career or relationship of anyone they happen to be angry with at the time.

So, all of a sudden, it’s not just celebrities (with their armies of powerful lawyers) who can find themselves on someone else’s computer. It could be you.

Revenge porn

Attorney Ann Olivarius, who has worked with victims of “revenge porn”, says she receives calls from clients who have been victims of deepfaking.

“It’s a big concern for us,” says Olivarius. “It’s really devastating for these women, because with the new technology it can look real. What’s behind it is always an attempt to hurt and degrade.”

The lawyer believes that deepfaking is part of the revenge porn problem.

“There are several types of revenge porn out there,” she says. “It’s a growing problem, and it manifests itself in new ways.”

Image caption: Celebrities have an army of lawyers who can help them with issues like this.

While celebrities can count on expensive lawyers, and can potentially use defamation law to pursue the deepfakers (as long as they can prove that the image caused, or could cause, serious damage to their reputation), for ordinary people it can be much harder to take action.

In January, an Australian man was sentenced to 12 months in prison for using Photoshop to superimpose an image of his teenage stepdaughter’s face on images of women performing sexual acts, and there have been similar cases in the UK.

For Luke Patel, a privacy law specialist, “the rapid pace of development of technology makes it very difficult for laws to stay current and adequately support victims.”

In Europe, the General Data Protection Regulation, which takes effect on May 25, includes two new tools: the right to request the deletion of data and the right to be forgotten.

Luke believes that they could help “allow an individual to request the permanent removal or deletion of their personal data (including images) when there is no good reason for their continued publication,” although each case will be decided individually.

“It is not an absolute right, but the case is stronger if the image is unjustified and causes considerable suffering.” However, he continues, “they are still only tools that can be deployed once the damage is done.” They will not prevent it from happening.

Image caption: The rapid development of technology makes it difficult to adequately protect victims.

How to stop it?

One option is for internet platforms to stop hosting deepfakes.

Reddit has already banned them, describing them as an unacceptable form of “involuntary pornography”.

Pornhub said in February that it would follow suit, but if you search for “porn” and “deepfakes”, the top results are on Pornhub.

BBC Three contacted Pornhub for an explanation, and received this statement from its vice president, Corey Price: “Non-consensual content is not allowed on our platform, as it violates our Terms of Service.”

However, at least 20 such videos can still be counted, including several featuring celebrities, which we can safely assume were not consensual.

Pornhub replied that it has a content removal page where people can request the removal of non-consensual material: “As soon as a request is made, we work to remove the content immediately, and we also depend on our community and/or content owners to flag inappropriate content.”

Image caption: Deepfakes could be the new revenge porn.

Creating deepfakes is becoming so easy that it could turn into a party game: bring photos and drinks and, instead of watching YouTube videos, sit down and create stolen porn.

“In a couple of years, you could go to a pornographic app store and buy virtual reality sex experiences with whomever you want,” says Evgeny Chereshnev of BiolinkTech.

You can already buy incredibly realistic sex dolls. “Soon,” Evgeny says, “technology could allow someone to steal your identity and order a sex doll with your face.”

The idea of a future where “you” could exist on someone’s laptop, or where images of “you” could be created and maliciously distributed, is deeply disturbing. But that future may already be here.

About author

Rava Desk

Rava is an online news portal providing recent news, editorials, opinions and advice on day to day happenings in Pakistan.
