Ada model returns sexual, meaningless content on text completions

The ada model returns sexual content when we input phrases like “火热的健身房” (roughly “a steamy gym”) or “直播的对接” (roughly “livestream hookup”), as in the screenshots. I’m not sure whether it is suitable to post them here, but I don’t know where else to send this feedback.
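For reference, a minimal sketch of the kind of call that produced this, assuming the legacy openai Python client (pre-1.0) and the base ada completions model; the parameter values below are placeholders, not the exact settings we used:

```python
import openai

openai.api_key = "sk-..."  # placeholder; set your own key

# Plain text completion against the base ada model with one of the phrases above.
response = openai.Completion.create(
    model="ada",
    prompt="火热的健身房",
    max_tokens=64,
    temperature=0.7,
)

print(response.choices[0].text)
```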

We haven’t seen this from other models; no action needed, just feedback. :slight_smile:

Some more examples

I always found the other NNs (ada, babbage, etc.) kind of useless for anything other than simple comparisons, which can mostly be done with regex and similar tools (rough sketch below).

Prove me wrong.
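To be concrete about what I mean by “simple comparisons”, here’s a made-up illustration (the pattern and use case are hypothetical): checking whether a string matches a fixed ID format is the sort of thing a regex handles without any model.

```python
import re

# Hypothetical check: does the string look like an order ID such as "ABC-12345"?
ORDER_ID = re.compile(r"^[A-Z]{3}-\d{5}$")

def is_order_id(text: str) -> bool:
    return bool(ORDER_ID.match(text))

print(is_order_id("XYZ-40213"))   # True
print(is_order_id("steamy gym"))  # False
```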
