If my understanding of how latent diffusion models work is still current, then this sort of fuck-up is, at best, a sign of gross incompetence. If you want the rest of the image to stay the same, it takes more than feeding in an image and generating a new one with the clothes altered. It's a fairly sophisticated process that, in part, involves grounded segmentation to mask out the regions that will then be replaced by inpainting. So if the engineers who implemented that feature wanted to avoid, say, having minors undressed for public viewing, it should not only have been mandatory practice to add safeguards whenever the model is used that way, it would also have been easy to add extra conditions or extra pipeline stages that block nudity generation. It looks like Musk is really pinching his pennies after he had to fork out 44 billion dollars for the website...
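To make the "extra conditions" point concrete, here's a minimal, purely illustrative sketch of the kind of gate you could bolt onto an inpainting pipeline: before a masked edit ever reaches the diffusion model, check the edit prompt against blocked terms and cross-check the segmentation labels under the mask with what's detected in the scene. All names here (`InpaintRequest`, `BLOCKED_TERMS`, the label sets) are my own invention for the example, not from any real product.

```python
# Hypothetical safeguard gate for an inpainting pipeline (illustrative only).
from dataclasses import dataclass, field

BLOCKED_TERMS = {"nude", "naked", "undressed", "topless"}
PROTECTED_LABELS = {"person", "child", "face"}
CLOTHING_LABELS = {"shirt", "dress", "pants", "jacket", "clothing"}

@dataclass
class InpaintRequest:
    prompt: str                                      # text guiding the inpainting model
    mask_labels: set = field(default_factory=set)    # segmentation labels under the edit mask
    scene_labels: set = field(default_factory=set)   # labels detected anywhere in the image

def is_allowed(req: InpaintRequest) -> bool:
    """Return False for edits that should never reach the diffusion model."""
    prompt_words = set(req.prompt.lower().split())
    # Rule 1: explicit nudity terms in the edit prompt are refused outright.
    if prompt_words & BLOCKED_TERMS:
        return False
    # Rule 2: editing a clothing region while a person is in the scene is
    # exactly the risky case, so this sketch blocks it entirely.
    if (req.mask_labels & CLOTHING_LABELS) and (req.scene_labels & PROTECTED_LABELS):
        return False
    return True

# A landscape edit passes; a clothing edit on a person does not.
ok = is_allowed(InpaintRequest("replace the sky with a sunset",
                               mask_labels={"sky"}, scene_labels={"tree"}))
blocked = is_allowed(InpaintRequest("change the shirt color",
                                    mask_labels={"shirt"}, scene_labels={"person"}))
print(ok, blocked)  # True False
```

The point isn't that these exact rules are right, it's that the pipeline already produces everything the check needs (the prompt and the segmentation labels), so adding a gate like this is cheap.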