
QTCinderella’s Deepfake Porn Nightmare

Popular Twitch gamer QTCinderella is the latest high-profile victim of deepfake porn: her face was pasted into an adult film to make it appear as though she actually starred in the kinky clip.

QTCinderella, a 28-year-old American whose real name is Blaire, went live on the streaming platform last week and also blasted a renowned male Twitch celebrity who admitted to purchasing deepfake porn.

“I’m so exhausted and I think you guys need to know what pain looks like because this is it,” the gamer wept. “This is what it looks like to feel violated. This is what it feels like to be taken advantage of, this is what it looks like to see yourself naked against your will being spread all over the internet. This is what it looks like.”

She then called out fellow gamer Atrioc, whose earlier admission to fans that he had bought edited videos of two other well-known female Twitch stars had driven traffic to the deepfake porn website.

“F- -k the f- -king internet. F- -k Atrioc for showing it to thousands of people. F- -k the people DMing me pictures of myself from that website. F- -k you all! This is what it looks like, this is what the pain looks like,” QTCinderella continued during her emotional livestream.

“To the person that made that website, I’m going to f- -king sue you,” she vowed. “I promise you, with every part of my soul I’m going to f- -king sue you.”

Given how quickly the technology has advanced, deepfake porn can be difficult to distinguish from genuinely produced videos, leaving victims struggling to prove they had nothing to do with a clip’s creation. Laws, meanwhile, have not kept pace with modern online activity, so QTCinderella may find it challenging to bring a lawsuit against the maker of the horrible, doctored film.

The Twitch star’s deepfake ordeal was originally covered by tech journalist River Page. In a column republished by the Free Press, Page noted that “there is federal revenge porn legislation that allows victims of nonconsensual porn to pursue cases against perpetrators, but the law doesn’t directly address deepfakes.”

“A federal law should be in place,” Page further wrote. “Will it stop deepfake porn? Not completely. Federal law hasn’t eliminated the production and distribution of child pornography either, but the enforcement of those laws has driven the practice to the extreme margins, and has attached a heavy cost to participating in the trade.”

Currently, cyber perverts can easily create deepfakes using machine-learning and artificial-intelligence software, with no fear of legal repercussions.

Celebrities like Scarlett Johansson and Emma Watson have fallen prey to deepfake porn videos, and some sleazebags are asking just $20 to fabricate videos of ex-boyfriends, ex-girlfriends, coworkers, acquaintances, and foes.

The damage from being a victim of deepfake porn can be “profound,” according to Robert Chesney from the University of Texas and Danielle Citron from the University of Maryland.

“Victims may feel humiliated and scared,” they wrote in a 2019 research paper. “When victims discover that they have been used in fake sex videos, the psychological damage may be profound, whether or not this was the aim of the creator of the video.”

Although the damage seems clear, some people on social media have shown little sympathy.
