Porn, scams and fake news: The growing realm of deepfakes

An NBC News investigation found deepfakes on multiple social media platforms, and concern is growing over how this new technology will affect people.

AI (Artificial Intelligence) letters and a robot hand miniature are seen in this illustration taken June 23, 2023. (photo credit: REUTERS/DADO RUVIC/ILLUSTRATION/FILE PHOTO)

Deepfakes have been a source of anxiety for experts for some time, but according to a recent NBC News investigation, that fear may have become a reality.

NBC News found 50 deepfake videos of celebrities and influential figures across social media platforms, designed to scam internet users out of their money, the outlet reported on August 29. The figure most frequently used in these videos was Elon Musk, the owner of X; however, a number of videos also featured well-known media figures such as Tucker Carlson.

In the videos featuring Musk, users were encouraged to invest funds in a fake platform. Because Musk has encouraged investing in certain cryptocurrencies in the past, many users may not recognize the videos as a scam.

Most of the deepfake videos were found on Facebook, where 36.5% of users are over the age of 45, according to Statista. This population may be more vulnerable to scams, given the perception that older people are less tech-savvy.

A Facebook spokesperson told NBC News that the company was tracking deepfakes and that such content violates its policy. A YouTube spokesperson, however, claimed the content did not violate the platform's terms and conditions.

A green wireframe model covers an actor's lower face during the creation of a synthetic facial reanimation video, known alternatively as a deepfake, in London, Britain, February 12, 2019. Picture taken February 12, 2019 (credit: REUTERS/Reuters TV)

What are deepfakes and what damage can they cause?

A deepfake is defined by Oxford Languages as “a video of a person in which their face or body has been digitally altered so that they appear to be someone else, typically used maliciously or to spread false information.”

New Artificial Intelligence (AI) and Generative Artificial Intelligence (GAI) software has been made available to the public, increasing the risk of the technology being misused by malicious individuals and groups.

While deepfakes have often been used to scam individuals out of money or spread false information, the technology has also created a particular problem for female public and private figures. Deepfake pornography is created when the image of a person’s face, which can be taken from social media, is inserted onto the body of a porn actor. These videos are often produced without consent and shared online.

Last year, the top deepfake porn site alone received 13,345 uploads, almost double the number from the previous year.


Kate, from the organization #NotYourPorn, spoke to the BBC about being made a victim of deepfake porn in a revenge attack. She explained that there are no legal protections in place to shield women and children from deepfake porn.

The BBC also interviewed the owner of one of the largest deepfake porn sites, who said that he “is in denial on the impact on women” that his site has, although he acknowledged that the technology can have a significant psychological impact.

“Finding out I’ve been a victim of revenge porn has happened to me so many times the initial emotions of hurt, fear, anxiety, and panic have boiled down over time to just rage. At the start, I was convinced my life was over, contemplated suicide, fell into states of isolation to try and protect myself, and even went so far as to change my name. The idea that a two-second upload for one person can be years of torment for another is devastating,” Jenny wrote on the #NotYourPorn website.

“Someone Tweeted me and told me that they had seen a sex tape of me on Pornhub. Then another message came in, and then another. I watched it…the videos were from a long time ago and I thought they were gone. I still don’t know who uploaded them, but I will never forget that feeling of finding out. I wanted to pull apart the internet bit by bit. I felt like I couldn’t breathe. I just wanted it all to end. It was so violating, so public, and now because it was uploaded to Pornhub it’s on the internet forever. They didn’t have a right to upload that. The viewers didn’t have a right to comment on it either, but it was so normal for them. It’s totally ruined my life, and I’ll never be the same person because of it,” another victim known as Lucy wrote. 

Deepfake porn in Israel

In Israel, deepfakes have been designated as an “invasion of privacy” by the Privacy Protection Authority (PPA). The government body wrote that “The PPA's position is that the distribution without consent of a Deepfake photo or video that presents humiliating content or that concerns the modesty of a person's personal life, which may be perceived by the public as authentic, constitutes an invasion of privacy.” 

“In addition, the PPA states that companies that produce fake content using Deepfake technology and maintain a database are obligated to comply with the data protection regulations, and that fake data, which is perceived as authentic personal information, is data that must be protected according to the privacy protection Israeli law.”

However, it remains unclear what penalty the use of this technology would carry.