I’m building a chatbot for my website and I have users from different countries. I’m trying to get the chatbot to always respond in the language the user used in their last message, but I can’t get it to work. I have a language detector that reliably identifies the user’s language, and I send a language-specific system prompt (including the ISO code) that tells the model to answer in that language. Sometimes the model does respond in the given language, but mostly it keeps responding in the default language.
If the user explicitly asks the chatbot to use a given language, it will comply. But I want it to match the language automatically, without being told to. I’ve seen other chatbots that I know are built on GPT-4.1 do this, so I know it’s definitely possible.
How can I reliably make the model respond in the user’s language?
Which model and which API are you using? I have seen something like this with gpt-realtime but not with other models.
Generally I have had success doing what you described, telling the model “respond in the language the user used”, so I’m surprised this isn’t working for you. Do you have a lot of k-shot examples in your prompt that use different languages? Is the user text that it’s supposed to match coming through in a “user”-role chat entry?
One easy suggestion (assuming you’re working with text) is to do a quick completion call with a prompt like “What language is the following text written in: ”, then pass whatever comes back into your system prompt with “Respond in the following language: ”.
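A minimal sketch of that two-step flow in Python. The helper names and the `complete` callback are placeholders of mine, not a real API: in practice `complete(prompt, system)` would wrap your actual model call (e.g. a Responses API request), which isn’t shown here.

```python
from typing import Callable, Optional

def detection_prompt(text: str) -> str:
    # Prompt for the cheap first call that identifies the language.
    return ("What language is the following text written in? "
            "Answer with the language name only.\n\n" + text)

def language_instruction(language: str) -> str:
    # System message for the main call, pinned to the detected language.
    return f"Respond in the following language: {language}."

def answer(complete: Callable[[str, Optional[str]], str], user_text: str) -> str:
    # `complete(prompt, system)` is a stand-in for your model call,
    # e.g. a request with instructions=system and input=prompt.
    detected = complete(detection_prompt(user_text), None).strip()
    return complete(user_text, language_instruction(detected))
```

With a stub `complete` that returns “Swedish” for the detection call, the second call goes out with the instruction “Respond in the following language: Swedish.”.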
If this is for a realtime audio application, you may simply have to provide a default language: the mic sampling can often contain garbled audio or static that the model interprets oddly, so you may get a response in a language you don’t expect.
I’m using the Responses API and gpt-4.1-mini. No audio, only text input (so far), so this is standard text chat, not realtime. I don’t include k-shot examples. The prompt is a short system message plus RAG context. The RAG context (text files in the vector store) is actually in my default language (Swedish), but as I understand it that shouldn’t matter, since the model doesn’t base its language on the context but on the system prompt and the user’s language. Maybe that’s wrong? Either way, it’s obviously not practical to translate all the text files in the vector store into every language a user might happen to use.
Also, yes, the user text is assigned the user role.
But yeah, I’ll try your suggestion. In the meantime, feel free to point me to other solutions. Maybe there is a standard proven way of doing it that I haven’t discovered yet.
Thanks for your reply. I actually found a pretty reliable solution; it turns out the system prompt works well after all. I pass the browser language and the WPML language (the chatbot lives on a WordPress site), as well as the user’s language as identified by a language-detector library (I use patrickshur/language-detector), into the system prompt. This might be overkill, but with these three signals the chatbot now reliably responds in the user’s language. It works with gpt-4.1-mini too.
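For anyone wanting to try this, here is a hedged sketch of combining those three signals into the system prompt. The function names and the priority order (detected message language first, then WPML, then browser locale, then a default) are my assumptions, not the exact implementation described above.

```python
from typing import Optional

def resolve_language(detected: Optional[str],
                     wpml: Optional[str],
                     browser: Optional[str],
                     default: str = "sv") -> str:
    # Assumed priority: the language of the actual message wins, then the
    # WPML site language, then the browser locale, then a fallback.
    for candidate in (detected, wpml, browser):
        if candidate:
            return candidate
    return default

def language_system_prompt(detected: Optional[str],
                           wpml: Optional[str],
                           browser: Optional[str]) -> str:
    # Build the language clause to prepend to the system prompt.
    lang = resolve_language(detected, wpml, browser)
    return (f"Always respond in the language with ISO 639-1 code '{lang}', "
            "unless the user explicitly asks for another language.")
```

For example, `resolve_language("en", "sv", "de")` picks `"en"` because the detected message language outranks the site and browser settings.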
I had the same problem. I found that a system-prompt rule like “respond in the language of the user’s latest input message” doesn’t work well. So I tried a new method: I use a separate model to detect the user’s input language, produce the text “respond in {language}”, and append it to the raw user input. For example, “Hello, what’s your name?” becomes “Hello, what’s your name? (respond in English)”, and a Swedish message would become “Så tråkigt! (svara på svenska)” (“So sad! (respond in Swedish)”). You can try this method.
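The append-an-inline-instruction trick above fits in a couple of lines; the function name here is just illustrative, and the language string would come from whatever detector you use.

```python
def tag_user_message(text: str, language: str) -> str:
    # Append an inline language instruction to the raw user input
    # before sending it to the model, as described above.
    return f"{text} (respond in {language})"

# tag_user_message("Hello, what's your name?", "English")
# -> "Hello, what's your name? (respond in English)"
```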