Feedback on Bias in Content Generation for Cultural Symbols

Dear OpenAI Team,

I am writing to express my concern regarding a bias I have identified in the content generation capabilities of the DALL-E and ChatGPT models.
Specifically, I have encountered issues when attempting to generate illustrations for “Eid al-Adha” that include simple Islamic symbols such as “mosques” and “pentagram stars”. Despite the respectful and appropriate nature of my request, the system repeatedly failed to generate the requested image.

When queried, the system eventually gave the following reasoning, which is alarming:

… it appears the issue is likely due to content restrictions. The specific elements in your request that might trigger these restrictions could include:

Religious Symbols: Use of specific religious symbols, such as the mosque or pentagram star, might be subject to restrictions to avoid misuse or offensive content.

Religious Celebrations: Mentioning specific religious events like Eid al-Adha might also trigger content filters designed to prevent the generation of sensitive or potentially controversial content.

This issue appears to be a form of content restriction or bias, as similar cultural symbols and references from other faiths (e.g., “Christmas,” “Easter,” “church” from Christianity; “Star of David,” “Hanukkah,” “synagogue” from Judaism; “Mandir,” “Holi,” “Diwali” from Hinduism) do not seem to face the same restrictions.

The apparent inconsistency raises serious concerns about the equitable treatment of different religions and cultural symbols, and could be perceived as a bias against Islamic content. I kindly request that the OpenAI team review and address this issue to ensure fair and unbiased treatment of all cultural symbols and references.

Thank you for your attention to this serious matter. I look forward to your response and any steps you can take to resolve this issue.


I would agree this is an issue if what you’re describing is happening.

I was able to generate some images of Eid al-Adha celebrations in ChatGPT.

Is there something specific I’m missing? Can you provide a message you’ve sent which resulted in a refusal?

In trying to get ChatGPT 4o and DALL-E to create images for Eid al-Adha (an Islamic holiday), the AI repeatedly creates images with hexagrams / "Star of David" and menorah candelabras. This would be [1] an inaccurate representation of the AI's understanding of the topic, and [2] a bias in the training data (or a lack of training data).

Furthermore, telling the models not to use these specific elements, or to replace certain elements (e.g., replace the hexagram star with the pentagram star), shows how significant the bias is: the model cannot stop doing it at all.

Happy to provide relevant examples and, more importantly, to help the models / algorithms gather ample training data to represent the knowledge of this large culture correctly.

Hi @elmstedt, thank you for your response. Analyzing a single data point is not the best approach to investigating this.

Please note my comments:

[1] Your prompt is pretty simple; try more complex prompts;

[2] That’s one data point; let’s gather more with more complex requests (see the sketch after this list);

and [3] you “would agree […] if what you’re describing is happening”: please see the attached screenshot.
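
For anyone who wants to run this comparison systematically rather than from single data points, here is a minimal sketch, assuming the official `openai` Python SDK, an `OPENAI_API_KEY` set in the environment, and `dall-e-3` as the model name; the prompt templates are illustrative stand-ins, not the exact prompts used in this thread. Running each prompt several times and comparing refusal rates (and which symbols actually appear in the outputs) would make the cross-faith comparison concrete.

```python
# Minimal sketch: issue structurally identical image requests across faiths
# and log which ones succeed and which are refused.
# Assumptions (not from this thread): the `openai` SDK is installed,
# OPENAI_API_KEY is set, and "dall-e-3" is the model in use.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Matched prompts: same structure, only the holiday and symbols differ.
PROMPTS = [
    "An illustration for Eid al-Adha with a mosque and a crescent with a five-pointed star",
    "An illustration for Christmas with a church and a Christmas tree",
    "An illustration for Hanukkah with a synagogue and a Star of David",
    "An illustration for Diwali with a mandir and lit oil lamps",
]

TRIALS = 5  # repeat each prompt to estimate a refusal rate, not a single data point

for prompt in PROMPTS:
    for trial in range(TRIALS):
        try:
            result = client.images.generate(
                model="dall-e-3",
                prompt=prompt,
                n=1,
                size="1024x1024",
            )
            print(f"OK      | trial {trial} | {prompt} | {result.data[0].url}")
        except Exception as exc:
            # Policy blocks and refusals surface as API errors; log them so
            # refusal rates can be compared across the matched prompts.
            print(f"REFUSED | trial {trial} | {prompt} | {exc}")
```

Note that refusals only capture half of the complaint; the other half (wrong symbols appearing in the generated images) still requires inspecting the returned images by hand.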

Again, I sincerely appreciate your help with this. Further thoughts? Happy to dig into more detail on what may be going on, and, again, happy to contribute to development with more learning / training data or further reviews.

Thanks again for your response.

The above is after multiple tries; I have multiple other permutations where, even though the model produces illustrations, they remain factually inaccurate.