How to segment input for the GPT-3.5 model

I calculated the string length of my content before sending the request, but I still received an error message.

Hi and welcome to the developer forum!

Try removing the max_tokens parameter entirely and let the system handle the value itself.

If you do set it, the value refers to the size of the reply from the model, and you must leave room for the prompt as well. Unless you have a specific need to modify max_tokens, don't include it in your API call.


That doesn't help, because the text I'm sending is too long.

Have you tried the 16k model, gpt-3.5-turbo-16k?

The text I'm sending is still too long, so I tried processing it in segments, but I couldn't find any relevant information on how to do that.

You mean you input too much text. You cannot cause a 400 context error by having an AI response that is written too long. That only comes from a problem with your inputs and parameters to the API.

The context length of 4k or 16k tokens is something that YOU must consider, and not send too much data to the model. And you still must leave enough of the context length for an answer.

The max_tokens parameter is NOT for anything but the size of the output. Set it to something like 1000, and then you have 3000 you can use for input data.
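The budgeting described above is simple arithmetic. A minimal sketch, assuming the 4,096-token window of the base gpt-3.5-turbo model:

```python
# Token budget sketch (assumes the 4k gpt-3.5-turbo context window).
CONTEXT_WINDOW = 4096   # total tokens the model handles: input + output combined
MAX_TOKENS = 1000       # reserved for the model's reply via the max_tokens parameter

# Everything else is available for the system and user messages.
input_budget = CONTEXT_WINDOW - MAX_TOKENS
print(f"Tokens left for prompt and input data: {input_budget}")
```

If your input plus max_tokens exceeds the context window, the API returns the 400 context-length error mentioned above.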

I want to use segmented requests, but I couldn’t find the relevant documents

I’m not familiar with the term. Can you describe the technique or technology? What specific job are you attempting to do?

The OpenAI AI models simply don’t have the ability to consider inputs that are too large all at once. They are a one-turn data in → response out system.

Do you actually have a token counter? “Tokens” is not words or characters, it is a special encoding used inside the AI.

Eastern languages like Chinese-based ones use a lot of tokens. Try your input text in OpenAI's online tokenizer.

To do your own counting you’ll need a library like tiktoken.
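For illustration, a small counting helper could look like this. tiktoken is OpenAI's real tokenizer library; the character-based fallback is only a rough assumption for environments where it isn't installed:

```python
def count_tokens(text: str, model: str = "gpt-3.5-turbo") -> int:
    """Count tokens with tiktoken if available, else estimate roughly."""
    try:
        import tiktoken
        enc = tiktoken.encoding_for_model(model)
        return len(enc.encode(text))
    except ImportError:
        # Very rough fallback: ~4 characters per token for English text.
        # CJK text typically needs MORE tokens per character, not fewer.
        return max(1, len(text) // 4)
```

Counting before you send the request lets you reject or split input that would blow the context window, instead of discovering it from a 400 error.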

With your own code, you can split text that is too big into logical sections. Then a prompt such as "Increase writing quality" can fix errors within each part, although it won't make large changes that require understanding the full document.
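A minimal chunking sketch along those lines, splitting on paragraph boundaries. The token estimate here is a placeholder assumption; in practice you would swap in a real tiktoken count:

```python
def split_into_chunks(text: str, max_chunk_tokens: int = 1000) -> list[str]:
    """Split text on paragraph boundaries so each chunk fits a token budget."""
    def estimate_tokens(s: str) -> int:
        # Placeholder estimate (~4 chars/token); use tiktoken for real counts.
        return max(1, len(s) // 4)

    chunks: list[str] = []
    current: list[str] = []
    current_tokens = 0
    for para in text.split("\n\n"):
        t = estimate_tokens(para)
        # Start a new chunk when adding this paragraph would exceed the budget.
        if current and current_tokens + t > max_chunk_tokens:
            chunks.append("\n\n".join(current))
            current, current_tokens = [], 0
        current.append(para)
        current_tokens += t
    if current:
        chunks.append("\n\n".join(current))
    return chunks
```

Each chunk can then be sent as its own API request with the editing prompt, and the results concatenated afterwards.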


OK, thank you very much!