Few-shot learning

openai.error.InvalidRequestError: This model’s maximum context length is 4097 tokens, however you requested 17747 tokens (17647 in your prompt; 100 for the completion). Please reduce your prompt; or completion length.

How do I solve this? It works fine initially, but every time I submit a new text the API returns this token-limit error. I'm using few-shot learning, so how can I build an application this way?

You are submitting over 4x the maximum amount of content in your prompt. Submit less content and you won’t run into that error.
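One common way to stay under the limit is to include only as many few-shot examples as fit in the remaining token budget. Below is a minimal sketch in Python; it approximates token counts as ~4 characters per token (a rough heuristic, not the model's real tokenizer, so for production you would count tokens with an actual tokenizer library instead). The function names and constants are illustrative, not from this thread.

```python
MAX_CONTEXT_TOKENS = 4097   # limit reported in the error message
COMPLETION_TOKENS = 100     # tokens reserved for the model's answer

def approx_tokens(text: str) -> int:
    """Very rough token estimate (~4 characters per token)."""
    return max(1, len(text) // 4)

def build_prompt(examples: list[str], query: str) -> str:
    """Keep adding few-shot examples only while the prompt fits the budget."""
    budget = MAX_CONTEXT_TOKENS - COMPLETION_TOKENS - approx_tokens(query)
    kept = []
    for ex in examples:
        cost = approx_tokens(ex)
        if cost > budget:
            break               # no room left: drop the remaining examples
        kept.append(ex)
        budget -= cost
    return "\n\n".join(kept + [query])
```

With this approach, a new query never pushes the prompt past the context window: the oldest or least important examples are simply left out rather than triggering the error.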