ChatGPT Gender Bias Training Needed

Hi,

After many requests (and conversations) with ChatGPT, I've noticed that it uses sexist language, at least in Spanish, when it comes to grammatical gender. The most basic case is ChatGPT identifying itself as an "Artificial Intelligence" ("inteligencia" is a feminine noun in Spanish) but referring to itself with masculine pronouns and adjectives (e.g. "Estoy encantado").

I also asked it for sentences without gender bias, and around 80% of them still used masculine forms. When asked to correct them (more than 5 consecutive times), almost no corrections were made. So it seems it is not able to understand or identify what gender bias means.

It would be much appreciated if ChatGPT were trained on this matter, as it's a powerful tool with a huge impact on people, especially young ones.

Thanks.

What you describe is not gender bias. Gender bias is something like a "Girl Geek X event at OpenAI" that seeks to recruit one type of applicant, and in general an attitude that funnels one gender into high-salary white-collar knowledge work without similar programs addressing under-representation in trades like ironworking or pipefitting, or among CEOs.

In the context of linguistics and language learning, using the wrong gender pronoun for an inanimate object noun in Spanish is considered simply a “grammar error.” In Spanish, nouns are gendered and are classified as either masculine or feminine. This classification affects not only the pronouns used in reference to the noun but also other parts of speech that agree with the noun, such as adjectives and articles.

This type of AI language mistake arises when a weaker model with high perplexity runs at a high sampling temperature in an unclear context; gender agreement often requires foresight about a noun that has not yet been produced. You may be able to improve the quality with a custom instruction telling the AI to respond only in Spanish and to carefully predict grammatical gender consistent with the words it will use.
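As an illustration (not an official fix), a custom instruction of this sort can be expressed as a system message sent at a lower temperature. The instruction wording, model name, and temperature value below are all just assumptions for the sketch:

```python
# Sketch: a Spanish-language system instruction plus a low temperature.
# The instruction text, model name, and temperature are illustrative.
SYSTEM_ES = (
    "Responde siempre en español y cuida la concordancia de género: "
    "si te describes como 'una inteligencia artificial', usa formas "
    "femeninas (p. ej., 'estoy encantada')."
)

def build_request(user_text: str) -> dict:
    # A lower temperature makes the model less likely to sample a
    # continuation whose gender disagrees with a noun still to come.
    return {
        "model": "gpt-4o",
        "temperature": 0.2,
        "messages": [
            {"role": "system", "content": SYSTEM_ES},
            {"role": "user", "content": user_text},
        ],
    }

# With the official Python SDK this would be sent roughly as:
#   client.chat.completions.create(**build_request("¿Cómo estás?"))
```

No guarantee it removes every agreement error, but in my experience pinning down the gender rules in the instruction helps.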


Hi Jay,

Sorry, but I don't get the point of your comment. The purpose of my post is simply to foster more inclusive, non-sexist language in AI models. And that should be available to everyone, without having to struggle with custom instructions.

Thanks.

Hey there and welcome!

What _j is expressing is that gendered nouns and gender are, linguistically speaking, different concepts.

What you are expressing is a grammatical error, not a gender bias.

This community is very much in support of reducing bias and any form of sexism or discrimination. However, Spanish doesn't intrinsically have neutral pronouns, and as _j explained, grammar in gendered languages can be pretty complex and is detached from actual human gender.

I am confused why there is an argument about gender bias here when the language does not have a word for “it”. The entire rules of the language are based around binary gender categories, including inanimate objects. The gender of a noun seldom implies anything related to gender theory.


Here’s an English text:

“I served below decks on the USS Ronald Reagan, and she’s a fine ship.”

Is this biased or wrong? No, marine vessels are traditionally referred to by feminine pronouns in English. How hard would it be for an AI to learn that usage? And then to learn that we aren't talking about a person?

Now apply that “gender” requirement to every word in a language (with a somewhat impenetrable taxonomy)…

If I were to directly translate Portuguese, I might write:

“I turned on his switch, but his computer screen is still blank”

… because computers are masculine

We can even have “misgendering” that is correct, again referring to a machine and not a human:

“My AI chatbot friend is Maria, but lately he’s been writing text wrong”.

So an AI is not "biased" if it uses the wrong pronoun; it is just wrong. If a language's nouns are 40% feminine but the AI produces 90% feminine forms, then that would be a statistical bias, not the AI promoting women.

(Contrast to Japanese, no gender in personal pronouns, and not even plural forms of most words.)
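To make that "statistical bias" notion concrete, you could count gender-marked forms in generated text and compare the share against a language baseline. The marker lists below are a tiny illustrative stand-in, not a real lexicon:

```python
# Toy sketch: measuring a statistical skew in gendered forms.
# These marker sets are illustrative only; a real check would use a
# proper morphological analyzer, not bare word lists.
FEM_MARKERS = {"ella", "encantada", "la", "una"}
MASC_MARKERS = {"él", "encantado", "el", "un"}

def feminine_share(text: str) -> float:
    """Fraction of recognized gender-marked words that are feminine."""
    words = text.lower().split()
    fem = sum(w in FEM_MARKERS for w in words)
    masc = sum(w in MASC_MARKERS for w in words)
    total = fem + masc
    return fem / total if total else 0.0

# Compare the share over a large sample of model output against the
# share in a reference corpus; a large gap is a statistical skew.
```

If the corpus baseline is, say, 40% feminine and the model's output measures 90%, that gap is the bias worth reporting, independent of any pronoun being "right" for a given noun.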

I'm not a native English speaker and there are probably language nuances that I don't get, so I can't express my ideas accurately enough. But I think the point is pretty clear (call it whatever you want).

Really, there is no need to dedicate so much time to this thread. I just hope the ChatGPT team reads it and tries to improve the responses.

Thanks.

Almost as if you were learning german :sweat_smile:

Could you elaborate? There were complaints at the beginning that ChatGPT's assistant was only female, and now that it is only 60% female people start complaining again, but because it's too male?
Are you serious?

This is not “gender bias” - these are Spanish language errors. Just report the answers as inaccurate and provide feedback so it can be resolved.

Since this topic got a massive bump after months, I thought I'd check in by trying to confuse our new ditzy friend gpt-4o:

Correct this statement: “I served my two years on the Ronald Reagan, and she treated me well. Next I went right to the Obama administration, and she was great to work with.”

gpt-4o

The statement contains errors related to the use of pronouns and the transition between the Ronald Reagan (which is a ship) and the Obama administration (which refers to a group of people, not an individual). Here’s a corrected version:

“I served my two years on the USS Ronald Reagan, and it treated me well. Next, I went right to the Obama administration, and they were great to work with.”
Reasoning:

USS Ronald Reagan:

  • The proper designation for the ship is “USS Ronald Reagan,” not just “Ronald Reagan.”
  • Ships are traditionally referred to using the pronoun “it” rather than “he.”

Back to ChatGPT’s gpt-4

The statement you provided mixes metaphors and pronouns in a way that could cause confusion. Here’s a corrected version:

“I served my two years on the Ronald Reagan, and she treated me well. Next, I went right to work in the Obama administration, which was great to work with.”

Explanation:

  1. Ships as “she”: It is traditional in English to refer to ships, vessels, and sometimes other vehicles as “she.” Thus, using “she” for the USS Ronald Reagan is appropriate.

A much better handling of he/she/they from the more skilled AI: it accepts the ship's pronoun and reframes the organization. Let's hope that other languages like Spanish aren't also going backwards.