>>108254312
You ask grok to make something SFW but porn-adjacent. The prompt isn't explicit enough to trip the content filter, but because the vast majority of content involving women on the internet is pornographic (a bad apples issue, surely), any time grok generates an image of a woman's body it draws on all that pornographic training data.
The trick is that when you ask an AI to generate something specific that it has no exact training data for, it doesn't synthesize a new concept from elements of its training data; it finds the closest approximation to your prompt that does exist in its training set and goes with that, weighted heavily by how common that thing is in the data. All models are trained on lots of porn, because the internet is full of it and the training process is indiscriminate, so it's trivial to trick the AI into drawing on that data without explicitly asking for anything its content filter recognizes as inappropriate. It filters your prompts, not its own output.
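A minimal sketch of the prompt-side filtering described above (the blocklist and function names are made up for illustration; real filters are classifier-based, not keyword-based, but the failure mode is the same): the check only ever sees the prompt string, so a euphemistic prompt sails through, and nothing downstream inspects the generated image.

```python
# Toy illustration of prompt-side moderation. BLOCKLIST and the
# function names are hypothetical; production filters use trained
# classifiers, but they still run on the prompt text.
BLOCKLIST = {"nude", "nsfw", "explicit"}

def prompt_allowed(prompt: str) -> bool:
    # Only the user's words are checked -- the model's output never is.
    words = set(prompt.lower().split())
    return BLOCKLIST.isdisjoint(words)

def generate_image(prompt: str) -> str:
    if not prompt_allowed(prompt):
        raise ValueError("blocked by content filter")
    # The model samples from whatever training data sits closest to
    # the prompt -- which, for a woman's body, is mostly porn.
    return f"<image conditioned on: {prompt!r}>"

print(prompt_allowed("nude woman"))          # False: blocked
print(generate_image("woman in swimwear"))   # passes the filter anyway
```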