A popular Twitch gamer has tearfully revealed she is the latest high-profile victim of deepfake porn, with predators pasting her face onto a pre-existing adult video to make it look like she legitimately appeared in the kinky clip.
QTCinderella, a 28-year-old American whose real name is Blair, went live on the streaming site last week to slam the cyber sickos who made the video, as well as a prominent male Twitch star who had admitted to buying deepfake porn.
“I’m so exhausted and I think you should know what pain looks like because that’s what it is,” the streamer said through tears. “That’s what it feels like to be violated. That’s what it feels like to be taken advantage of. That’s what it feels like to see yourself naked against your will spread all over the internet. That’s what it looks like.”
She then took aim at fellow streamer Atrioc, who had earlier told fans he bought two edited clips featuring other famous female Twitch stars, causing a spike in traffic to the deepfake porn site.
“F–k the f–king internet. F–k Atrioc for showing this to thousands of people. F–k the people who send me pictures of myself from this site. F–k all of you! This is what pain looks like, this is what pain looks like,” QTCinderella continued during her emotional live stream.

“To the person who made this site, I will sue you,” she vowed. “I promise, with every part of my soul I will sue you.”
The Post has reached out to QTCinderella for further comment.
Given the rapid evolution of the technology, it is difficult to distinguish deepfake porn from legitimately shot videos, adding to the anguish of victims who insist they played no part in the production. Meanwhile, the law has not kept pace with online activity, which means it may prove difficult for QTCinderella to sue the person who created the disturbing video.
Tech writer River Page first reported the Twitch star’s ordeal. In his essay, republished by the Free Press, Page explained that “there is a federal revenge porn law that allows victims of non-consensual porn to file lawsuits against perpetrators, but the law does not specifically address deepfakes.”


“A federal law should be enacted,” Page further wrote. “Would it stop deepfake porn? No. Federal law has not eliminated the production and distribution of child pornography either, but enforcement of those laws has pushed the practice to the fringes and imposed heavy costs on participation in the trade.”
For now, cybercriminals use software involving machine learning or artificial intelligence to create deepfakes with relative ease — and with little fear of prosecution.
Celebrities like Scarlett Johansson and Emma Watson have fallen victim to deepfake porn videos, and some sleazebags charge as little as $20 to create fake videos of exes, colleagues, friends, enemies and classmates.
Robert Chesney of the University of Texas and Danielle Citron of the University of Maryland said the damage from being a victim of deepfake pornography can be “profound.”
“Victims may feel humiliated and fearful,” they wrote in a 2019 research paper.

The damage to QTCinderella seems obvious, but some on social media have expressed little sympathy.
“I don’t get it at all. Like, there could be terabytes of photoshopped porn out there, and I wouldn’t care . . . because I don’t actually experience those scenarios being portrayed. It’s literally not real,” one Twitter user wrote.
“I’m sorry, but if you’re crying over people putting your face on a porn star, then maybe you’re too soft for the internet. It’s wild out there,” stated another.