
WTF is going on with X (Grok)?

This is mental. If a person had a sexualized image of a child, they'd be arrested.

Generate it on X.... and it's legal.

Madness.
The same happens on YouTube with timestamps on videos of kids playing, where the timestamp jumps to them in positions meant to arouse paedophiles, e.g. ducking or spreading their legs.
 
If my understanding of how latent diffusion models work is still up to date, then this sort of fuck up is at best a sign of gross incompetence. It takes more than just feeding in an image and generating a different one with the clothes altered, if you want the rest of the image to stay the same. It's a rather sophisticated process that, in part, involves splitting the image with grounded segmentation to cut out the areas that will be replaced with inpainting. So if the software engineers who implemented that feature wanted to avoid, say, having minors undressed for public viewing, it should not only have been mandatory practice to add safeguards whenever the model is used in that manner, it would also have been easy to add extra conditions or extra pipelines that block nudity generation. It looks like Musk is really pinching his pennies after he had to fork out 44 billion dollars for the website...
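To make the point above concrete, here's a toy sketch of where a safety gate would slot into that kind of editing pipeline. Every function here (`segment_people`, `looks_like_minor`, `is_nudity_prompt`) is a hypothetical stand-in for a real model component (grounded segmentation, age estimation, prompt classification), not any actual vendor's API:

```python
# Sketch of an image-editing pipeline with a safety gate, as described above.
# All helpers are hypothetical placeholders for real ML components; real
# systems would use learned classifiers, not dict lookups or keyword lists.

def segment_people(image):
    """Stand-in for grounded segmentation: returns masks of detected people."""
    return image.get("people", [])

def looks_like_minor(person_mask):
    """Stand-in for an age-estimation classifier run on a person mask."""
    return person_mask.get("estimated_age", 99) < 18

def is_nudity_prompt(prompt):
    """Stand-in for a prompt/intent classifier."""
    banned = ("undress", "naked", "nude", "bikini")
    return any(word in prompt.lower() for word in banned)

def edit_image(image, prompt):
    """Refuse BEFORE any inpainting happens if the requested edit is unsafe."""
    if is_nudity_prompt(prompt):
        for person in segment_people(image):
            if looks_like_minor(person):
                return {"status": "refused", "reason": "minor detected"}
        # Policy choice: a stricter gate could refuse for ANY real person here.
    return {"status": "ok", "result": f"inpainted with prompt: {prompt}"}
```

The poster's argument is that the pipeline already has to isolate exactly the regions being replaced, so running checks on those regions first is cheap relative to the generation step itself.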
 
This is just the beginning. Wait til teenagers can effortlessly create AI porn of their classmates. This is probably going to have to get really bad before our governments start pushing hard for regulation.
 
This is actually the most worrying part. I remember hearing about a paedophile who explained that he was attracted to young girls when he was also young (normal enough), and that as he aged, what he was attracted to didn't change (not normal). Now imagine 12-year-olds generating stuff featuring girls their own age, warping their minds in the process, and still desiring only that stuff when they're 20+ years old.
 
I mean, it's fine if it's just fanart (of adults) or screenshots of your fav character that you're making naughty vids with, but real people who don't want to be naked, and kids? Not cool. The AI should be smart enough to filter this, to tell kids from adults and artwork/game screenshots from real people.
 

Safeguards are reasonable and necessary, especially for CSAM and non-consensual content. But expecting AI to perfectly distinguish adults from minors or art from real people is unrealistic and will inevitably create false positives. When access is tied to a logged-in, traceable account and users have already consented to data collection, responsibility for illegal content should ultimately rest with the user issuing the commands, not the tool itself. That's the same standard we apply to cameras, editing software, and every other neutral technology.

I think that's what Elon means by "consequences." In practice, it likely means X forwarding user data to law enforcement when there's illegal activity. That only works if Grok is limited to verified accounts, and if verification itself is actually meaningful, not a joke.
 

Why not just disable Grok's ability to undress people in pictures? Doing it to pictures of adults is one thing, but doing it to images of children is unacceptable. If it can't reliably tell the difference between adults and children, which may well be hard to do, then don't let it do it for anyone.
 
It's not that hard. AI is very smart. I downloaded some models to run locally and they can do insane things. Some models allow nudity but no explicit actions, so they're obviously able to ignore those explicit commands.

The tech is absolutely insane, btw. I downloaded this model called WAN2.2: you can upload any pic and it brings it to life in a 5-10 second video, and then you can make it do anything. I fed an explicit model some Sydney Sweeney pics and you can literally have a man enter the frame and have his way with her. I tried the same with the original WAN2.2 model and it straight up ignored the instruction; it just had her yawn or move around instead. Obviously these AI models have been trained on porn and carry built-in rules to either ignore explicit inputs or determine whether the person is underage, a public figure, or even a real person at all. Before I downloaded these AI tools on my PC, I tried some online AI tools, and even the NSFW explicit ones flatly refused to do anything with public figures like Sydney Sweeney.

Grok can easily refuse to put bikinis on ANYONE. Elon just refuses to do it.
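The behavior described above, a model silently dropping the disallowed part of a request and generating the rest, rather than erroring out, can be sketched as a pre-generation filter. This is a toy illustration of the pattern only; real models like WAN2.2 do this with learned classifiers, not keyword lists, and the names here are made up:

```python
# Toy sketch of "ignore the explicit part, keep the rest" filtering.
# DISALLOWED and both functions are hypothetical illustrations, not any
# real model's implementation.

DISALLOWED = {"explicit", "nsfw", "nude"}

def sanitize_prompt(prompt):
    """Drop disallowed words; report whether anything was removed."""
    words = prompt.split()
    kept = [w for w in words if w.lower().strip(".,") not in DISALLOWED]
    return " ".join(kept), len(kept) != len(words)

def generate_video(prompt, subject_is_public_figure=False):
    clean, filtered = sanitize_prompt(prompt)
    if filtered and subject_is_public_figure:
        # The online tools described above refuse outright in this case.
        return {"status": "refused"}
    # The local model described above just generates from the cleaned prompt.
    return {"status": "ok", "prompt_used": clean}
```

That matches the anecdote: the same request either gets quietly defanged (the subject "just yawns or moves around") or refused entirely when a public figure is involved.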
 

People are really just sticking their head in the sand with this. Moving forward recklessly.
This is going to get really bad.
 
Wouldn't be difficult to enable some censorship/guidelines. Not that I'm an expert in the field or anything, but ChatGPT and Gemini are notoriously censored; even simple, innocent prompts often get flagged for breaking the rules.

Usually I am not one for asking for censorship but when it comes to deepfakes of anyone (adults or children) then I am all for it. It's a disgusting invasion of privacy and exploits everyone.

X/Elon should do better.
 