Whisper 3 detects the wrong language and translates instead of transcribing

We’re using the Whisper 3 API via a third party (OpenAI hasn’t launched a Whisper 3 API yet). Ours is a language-practice app: we record the user’s speech in the language they’re learning, and they often speak it with an accent. In some cases Whisper detects the wrong language and, instead of transcribing what they said, translates the entire output into the incorrectly detected language. It clearly understood the speech, since it was able to translate it, but it shouldn’t decide on its own to translate rather than transcribe. I really think we need a way to guide the transcription, such as a `language` parameter.
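For comparison, the open-source whisper package already supports this: `transcribe()` accepts a `language` option (ISO-639-1 code) that skips auto-detection, and a `task` option that distinguishes transcription from translation. Below is a minimal sketch of the request options we'd like to be able to send; the model name and field names mirror the open-source package and are assumptions about what a third-party Whisper 3 endpoint might accept, not its actual API:

```python
def build_transcription_request(audio_path: str, learning_language: str) -> dict:
    """Build request options that force transcription in a known language.

    learning_language: ISO-639-1 code of the user's target language,
    e.g. "de" for a user practicing German. Pinning it means Whisper
    never has to guess the language from accented speech.
    """
    return {
        "file": audio_path,
        "model": "large-v3",            # placeholder model name
        "language": learning_language,  # skip automatic language detection
        "task": "transcribe",           # never "translate"
    }


opts = build_transcription_request("practice_clip.wav", "de")
print(opts["language"], opts["task"])  # -> de transcribe
```

Since the app already knows which language each user is practicing, passing it along like this would sidestep the misdetection entirely rather than relying on Whisper to guess.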