I’ve had a good experience with the responses to my manual prompts in ChatGPT.
But once I started using the API, I got much shorter responses, despite increasing max_tokens to 1500.
My prompts are long, taking up to 2,400 tokens, and I specifically ask for an exact number of words (400) to compensate for the shorter responses, but they come back at around 100 to 200 words, with only a few exceptions. (400 words is only roughly 500–600 tokens, so 1,500 should be plenty.)
With manual prompts the responses are always 400+ words, which is what I expected from the API as well.
If anyone has a solution for this, it would be greatly appreciated.
My prompts usually go: Write a YouTube script, with exactly “x” words, about the “subject”, based on the details below: "[insert details, which generally add up to about 2,300 tokens]"
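For reference, my calls look roughly like this. This is just a minimal sketch using the v1-style openai Python client (older versions used openai.ChatCompletion.create instead), and the details string is a placeholder for my actual source material:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Placeholder for the ~2,300 tokens of source material
details = "..."

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    max_tokens=1500,  # 400 words is only ~500-600 tokens, so this should be plenty
    messages=[
        {
            "role": "user",
            "content": (
                "Write a YouTube script, with exactly 400 words, "
                "about the subject, based on the details below:\n"
                f'"{details}"'
            ),
        }
    ],
)

print(response.choices[0].message.content)
```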
Not at all. There might be an issue on your side if you had a slow internet connection.
If you give an accurate prompt with precise instructions, ChatGPT generates fast, high-quality results.
We have worked through this ourselves and published a guide on “How to Ask ChatGPT”. Read it and follow the procedure, and you should not run into this problem with response length again.
That is just not the correct answer, my friend. It is neither a slow internet connection nor an inaccurate prompt; please read the question. I wanted a longer response. But thank you for trying anyway.
Have you tried the same prompt in a few different settings?

- directly in ChatGPT
- in the OpenAI Playground
- as a simple API call
- as an API function call
Note: in my case it’s the function calls that give the short responses, with all the other options working just fine. This is with gpt-3.5-turbo, btw; see the sketch of the last two variants below.
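To make the comparison concrete, here is a rough sketch of the two API variants, assuming the v1-style openai Python client and its tools interface (the newer form of function calling). The submit_script function name and its schema are made up for illustration:

```python
import json

from openai import OpenAI

client = OpenAI()

messages = [{
    "role": "user",
    "content": "Write a YouTube script, with exactly 400 words, about espresso.",
}]

# Variant 1: a simple chat completion
plain = client.chat.completions.create(
    model="gpt-3.5-turbo",
    max_tokens=1500,
    messages=messages,
)
print(len(plain.choices[0].message.content.split()), "words (plain call)")

# Variant 2: the same prompt, but forced through a function/tool definition
tools = [{
    "type": "function",
    "function": {
        "name": "submit_script",  # hypothetical name, for illustration only
        "description": "Return the finished YouTube script.",
        "parameters": {
            "type": "object",
            "properties": {
                "script": {
                    "type": "string",
                    "description": "The full 400-word script.",
                },
            },
            "required": ["script"],
        },
    },
}]

forced = client.chat.completions.create(
    model="gpt-3.5-turbo",
    max_tokens=1500,
    messages=messages,
    tools=tools,
    tool_choice={"type": "function", "function": {"name": "submit_script"}},
)

# The function arguments come back as a JSON string
script = json.loads(forced.choices[0].message.tool_calls[0].function.arguments)["script"]
print(len(script.split()), "words (function call)")
```

Running both on the same prompt and comparing the word counts should show whether the function-call path is what’s truncating your output.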