Is there any known plan to increase the allowed prompt length for GPT in the near future?
Hi @juliushamilton100, if your program has a specific need for a larger prompt, you can always contact OpenAI directly. They may allow you a larger prompt if they are satisfied with your use case.
Hi @juliushamilton100, sorry, I was wrong: I thought you were asking about a larger output token length for a given prompt.
Quoting @m-a.schenk’s response,
This is unfortunately incorrect. The combined prompt + completion size is fixed per model, once and for all: 4096 tokens for davinci-codex and 2048 for all the other models.
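To make the shared limit concrete, here is a minimal Python sketch of the budget arithmetic. The model names and token limits come from the figures quoted above; the prompt token count is assumed to be supplied by a tokenizer (e.g. GPT-2's BPE), which is not shown here.

```python
# Combined prompt + completion token limits per model (figures quoted above).
MODEL_TOKEN_LIMITS = {
    "davinci-codex": 4096,
    # All other models share a 2048-token limit.
}
DEFAULT_LIMIT = 2048

def max_completion_tokens(model: str, prompt_tokens: int) -> int:
    """Return how many completion tokens remain after the prompt.

    `prompt_tokens` is assumed to come from the model's tokenizer.
    Raises ValueError if the prompt alone already uses up the budget.
    """
    limit = MODEL_TOKEN_LIMITS.get(model, DEFAULT_LIMIT)
    remaining = limit - prompt_tokens
    if remaining <= 0:
        raise ValueError(
            f"Prompt ({prompt_tokens} tokens) exceeds the "
            f"{limit}-token limit for {model}"
        )
    return remaining

print(max_completion_tokens("davinci-codex", 1000))  # 3096
print(max_completion_tokens("davinci", 1500))        # 548
```

So a longer prompt directly reduces the room left for the completion; the two always share one fixed budget.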
Sorry for creating confusion.
Thanks
Sandeep