‘Please stop’: TikToker frightened after being harassed with AI-generated nudes of herself
Chandler Treon
May 3, 2023
A TikTok user is begging anonymous posters to stop using artificial intelligence to generate fake nude images of her.
In a video posted on April 27, which has since been viewed over 1.1 million times, TikTok user Rachel (@rache.lzh) explains through tears that an anonymous Instagram user sent her a message containing photos she had posted to her account that had been edited with AI into nonconsensual pornographic images.
“It was pictures of me that I had posted fully clothed, completely clothed. And they had put them through some editing AI program to edit me naked,” Rachel explains in the video. “They basically photoshopped me naked.”
The messages did not stop there. The TikToker says the next day she received “dozens of DMs” containing AI-edited nude images, this time without a watermark.
“And what’s even worse is that the next day when I woke up I was getting dozens of DMs of these images but without the watermarks. So this person paid to have the watermark removed and started distributing it like it was real.”
She explains that the AI added tattoos to her body and inaccurately adjusted her proportions.
“If anyone has ever seen an actual picture or video of me they’ll know that I’m not built like that,” she says. “I don’t even have a tattoo along my lower stomach. I don’t have one there.”
“I’m just letting you know that anything you see of me is edited or fake. I don’t have any content, I don’t sell content. None of that is real.”
Rachel, who spoke to NextShark regarding the images, emphasized that there are no consensual nude pictures of her on the internet.

The TikTok user previously posted a video about the AI-produced images on April 23, which received comments that the creator says “made me want to throw up.”
“The first time I tried to post about this, I was like, ‘It’s not real by the way,’ the comments were so disgusting,” she says. “Like actually vile, they made me want to throw up multiple times.”
Examples of comments still visible on the April 23 post include "Agreed someone make it accurate and dm me so I can confirm" and "Would you be willing to give an example."

Rachel also addressed comments claiming that her original post was meant to garner attention:
“‘Obviously you want more people to see this that’s why you’re posting about it.’ No, it’s because I want you to know they’re not real,” she says as she bursts into tears of frustration. “Please stop.”
Speaking with NextShark, Rachel says she feels “afraid” to post on social media again.
“If I do people will say I’m asking for it by continuing to post myself,” she explains. “But I don’t want to be chased off the internet by these people.”
“It feels disgusting honestly knowing that people I’ve never even met in my life want to see me naked and have actively put effort into finding and making those photos,” she continues. “And the people who still want it knowing that they’re fake and also very much nonconsensual belong on a list somewhere because this honestly feels like a form of rape.”
The TikTok user believes she was targeted due to her race.
“I genuinely feel that if I wasn’t Asian I wouldn’t be getting this type of attention,” she says.

For months now I’ve been getting weird racial comments and DMs. White dudes are convinced I want them, so they comment about how I’m “built for bwc” and then they send me their penises. And when I look okay in a video my comments will always be weird and racial. “If it ain’t rice it ain’t nice,” “If she ain’t squintin I ain’t sprintin.” They’re just really f*cking weird.

A screenshot of Rachel’s video posted on Twitter went viral, receiving over 274,500 likes and 40,600 retweets, with some users condemning the AI-produced images and others defending the tool from criticism.
“We all know this is headed towards photos of children…..this is absolutely appalling,” one user wrote.
Another user wrote, “that was so hard to watch, my heart is breaking for this poor girl. She’s right, this is rapist behavior; its the fact it’s not consensual and distressing to her that gets them off and it’s so frustrating how many men will fight you tooth and nail on the topic of these deepfakes.”
“It’s a tool, blame those that use it improperly,” one user wrote in defense of AI. “If someone goes on a stabbing spree with spoons, should no one be able to use them?”
One user compared the nonconsensual nude images to fan art, writing:

It’s a violation of ones boundaries forsure &hows it any different than raunchy fan art, the kinda stuff youd see on deviant art or even twitter. Its a fake picture. Unless a nude drawin has gotten someone fried in the decades we’ve had fan art I dont understand the new outrage
