Gender-Based Filtering in Image Generation: Male-Male Affection Blocked, Others Allowed

I would like to report a significant issue I have encountered while using the image generation function in ChatGPT. Through a structured series of test cases, I have identified a clear gender-based inconsistency in how images are allowed or blocked—particularly concerning depictions of affection between characters.

This is not a matter of system glitches or misuse, but a structural issue where the allowed content seems to depend on the gender combination of the characters depicted. I believe this requires urgent review and correction.

:magnifying_glass_tilted_left: Summary of the Experiment

  1. Male × Female Characters
  • Poses such as hugging and kissing were successfully generated.
  1. Female × Female Characters
  • Hugging scenes were generated successfully.
  • Kissing scenes were inconsistently allowed but mostly acceptable.
  1. Male × Male Characters
  • Despite identical poses, outfits, and emotional expressions,

  • Scenes were repeatedly blocked or distorted, solely because the characters were both male.
    :warning: Core Issue

  • The filtering does not seem based on pose, outfit, or content, but solely on the gender combination of the characters.

  • Even when the emotional tone, clothing, and setting are identical, depictions of male-male affection are blocked.

  • This implies a systemic bias, potentially rooted in broader cultural or dataset-based assumptions that categorize male-male intimacy as “sensitive” or “inappropriate” by default.

:hammer_and_wrench: What We Request

  1. Transparency in Filtering Criteria related to gender and affection-based expressions
  2. Reevaluation of overly strict filters applied to male-male expressions (especially in non-sexual, affectionate contexts)
  3. Equal treatment of same-gender and opposite-gender expressions within the model and its safety guardrails

ChatGPT is a powerful tool for storytelling and creative exploration. However, this inconsistency unfairly limits creators who wish to depict same-gender relationships or diverse emotional expressions. It risks reinforcing existing biases and marginalizing certain groups of users.

We urge you to review this issue and work toward a more inclusive and balanced creative environment for all users.

Thank you for your time and commitment to improvement.

Sincerely,

A concerned user and creator

Pls tell me in urdu I can’t understand your words

This topic was automatically closed after 23 hours. New replies are no longer allowed.