Dear OpenAI Team,
I am writing to express concern about an apparent bias I have identified in the content-generation behavior of the DALL-E and ChatGPT models.
Specifically, I have encountered issues when attempting to generate illustrations for “Eid al-Adha” that include simple Islamic symbols such as “mosques” and “pentagram stars”. Despite the respectful and appropriate nature of my request, the system repeatedly failed to generate the requested image.
When queried, the system eventually gave the following reasoning, which I find alarming:
… it appears the issue is likely due to content restrictions. The specific elements in your request that might trigger these restrictions could include:
Religious Symbols: Use of specific religious symbols, such as the mosque or pentagram star, might be subject to restrictions to avoid misuse or offensive content.
Religious Celebrations: Mentioning specific religious events like Eid al-Adha might also trigger content filters designed to prevent the generation of sensitive or potentially controversial content.
This issue appears to be a form of content restriction or bias, as similar cultural symbols and references from other faiths (e.g., “Christmas,” “Easter,” “church” from Christianity; “Star of David,” “Hanukkah,” “synagogue” from Judaism; “Mandir,” “Holi,” “Diwali” from Hinduism) do not seem to face the same restrictions.
This apparent inconsistency raises serious concerns about the equitable treatment of different religions and their cultural symbols, and could reasonably be perceived as a bias against Islamic content. I kindly request that the OpenAI team review and address this issue to ensure fair and unbiased treatment of all cultural symbols and references.
Thank you for your attention to this serious matter. I look forward to your response and any steps you can take to resolve this issue.