
I’ve been generating images for this blog with AI art tools for years now. I’ve seen many changes, and a recent trend is really starting to annoy me: my settings are set to NOT generate R-rated images, yet my simple, non-sexual prompts still produce a slew of female nudes, underwear shots, and the like, which I patiently skip past before reaching an image I can actually use.
The growing difficulty in obtaining non-sexual AI-generated images, particularly of people, arises largely from entrenched biases and societal stereotypes embedded in the training data behind AI models. These models are trained on vast datasets of images gathered from the internet, which often overrepresent sexualized or stereotypical portrayals, especially of women. As a result, when users ask for simple, non-sexual depictions—such as a woman wearing pants—the AI may produce images that skew toward more revealing or sexualized interpretations, like underwear rather than pants.
This phenomenon is connected to broader issues of bias that AI image generators amplify. Researchers have found that AI-generated images often reflect exaggerated sexism, racism, ableism, and other prejudices present in the training data. For example, AI might depict software developers almost exclusively as young white males or create images of attractive people with predominantly light skin and bright blue eyes. These biases are not intentionally programmed but emerge because AI systems learn patterns from large, but imperfect, datasets that include a disproportionate amount of sexualized and stereotyped content.[1][3]
Moreover, AI models don’t “understand” context or intent like humans do. They create images based on patterns rather than concepts, so the nuance of a straightforward prompt like “woman wearing pants” might be lost, leading to unwanted outputs. Developers have acknowledged these issues and are working on improving inclusivity and accuracy, but progress is still ongoing and uneven.[3][1]
For users who want non-sexual and respectful AI images—such as for science-related blogs—this can mean significant extra effort to craft detailed prompts or sift through many generated images to find appropriate ones. The situation reflects not the users’ preferences but the biases imprinted on the AI systems by their training data. Unfortunately, it also suggests that humans en masse have skewed the internet’s image content toward sexualized representations, which in turn distorts AI outputs.
Personally, I find AI-generated nudity just as uninteresting as sexualized cartoon imagery—it simply doesn’t appeal to me. That said, I don’t judge others who may feel differently. What’s frustrating now is that so many people are submitting inappropriate prompts that some AI platforms are erring on the side of extreme caution. For example, I’ve seen cases where even using the word “brain” in a prompt is blocked because it contains “bra”. This is just absurd. It’s clear these AI hosts are struggling to keep a flood of overly sexualized or “horny” requests in check, but often at the cost of appropriate artistic creativity.
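The “brain” example points to a likely culprit: a naive substring blocklist rather than a whole-word check. Here’s a minimal sketch of the difference, assuming a hypothetical blocklist containing “bra” (the function names and the blocklist are mine for illustration, not any platform’s actual filter):

```python
import re

BLOCKED = {"bra"}  # hypothetical blocklist entry for illustration

def naive_filter(prompt: str) -> bool:
    # Raw substring check: wrongly flags "brain", "zebra", "library", etc.
    return any(term in prompt.lower() for term in BLOCKED)

def word_boundary_filter(prompt: str) -> bool:
    # Match only whole words, so "brain" passes but "bra" alone is still caught.
    return any(re.search(rf"\b{re.escape(term)}\b", prompt.lower())
               for term in BLOCKED)

print(naive_filter("diagram of the human brain"))          # True (false positive)
print(word_boundary_filter("diagram of the human brain"))  # False (passes)
```

A word-boundary check is still crude moderation, but it at least spares anatomy diagrams from the censor.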
In summary, the difficulty in generating non-sexual AI images is not about the tool itself or your preferences, or even your prompts, but the data it learns from and the biases that data carries. Until these foundational biases are addressed, getting clean, non-lascivious AI art requires patience, precision, and sometimes frustration.[1][3]
[1] Ria Kalluri et al., Stanford University research on bias in AI image generation, ACM Conference, 2023.
[3] Brookings Institution, “Rendering misrepresentation: Diversity failures in AI image generation,” April 2024.
Read More
[1] https://www.snexplores.org/article/ai-image-generators-bias
[2] https://www.reddit.com/r/ArtistLounge/comments/12uimgx/people_are_no_longer_able_to_tell_ai_art_from/
[3] https://www.brookings.edu/articles/rendering-misrepresentation-diversity-failures-in-ai-image-generation/
[4] https://www.renderhub.com/forum/6800/daz-ai-studio-discussion
[5] https://www.computer.org/publications/tech-news/community-voices/ethics-of-ai-image-generation/
[6] https://www.economist.com/science-and-technology/2025/07/16/will-ai-make-you-stupid
[7] https://utppublishing.com/doi/10.3138/cjhs-2024-0018
[8] http://nappertime.com/its-the-people-stupid-human-art-in-a-company-world/
[9] https://www.childrenssociety.org.uk/what-we-do/blogs/artificial-intelligence-body-image-and-toxic-expectations
[10] https://news.ycombinator.com/item?id=42579873