openai.error.InvalidRequestError: This model’s maximum context length is 4097 tokens, however you requested 4684 tokens (4428 in your prompt; 256 for the completion). Please reduce your prompt; or completion length.
How can I increase the context length? I have a lot of articles in my context.
The model can only accept about 4,000 tokens (4,097, shared between the prompt and the completion). That's a hard model limitation.
If your document is large, I suggest pasting it in paragraph by paragraph and asking the model to summarize each one. The combined summaries should be much shorter than the original; then use those as your context.
When summarizing, preface each paragraph with a prompt such as: "Summarize the following paragraph, especially maintaining key points." or something similar.
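A minimal sketch of that paragraph-by-paragraph approach. The `chunk_paragraphs` helper and the 4-characters-per-token rule of thumb are assumptions for illustration, not an official recipe; for real token counts you'd use a proper tokenizer (e.g. `tiktoken`), and the commented-out API call uses the pre-1.0 `openai` library shown in the traceback above.

```python
def chunk_paragraphs(text, max_tokens=3000):
    """Group paragraphs into chunks that stay under an approximate token budget.

    Token counts are estimated as ~4 characters per token, which is a rough
    rule of thumb, not an exact count.
    """
    approx_tokens = lambda s: len(s) // 4 + 1  # assumption: ~4 chars/token
    chunks, current, used = [], [], 0
    for para in text.split("\n\n"):
        cost = approx_tokens(para)
        # Start a new chunk if adding this paragraph would exceed the budget.
        if current and used + cost > max_tokens:
            chunks.append("\n\n".join(current))
            current, used = [], 0
        current.append(para)
        used += cost
    if current:
        chunks.append("\n\n".join(current))
    return chunks

# Each chunk would then be summarized with the suggested prefix, e.g.:
#
# import openai
# summaries = []
# for chunk in chunk_paragraphs(article_text):
#     prompt = ("Summarize the following paragraph, especially maintaining "
#               "key points.\n\n" + chunk)
#     resp = openai.Completion.create(
#         model="text-davinci-003", prompt=prompt, max_tokens=256)
#     summaries.append(resp.choices[0].text.strip())
#
# The joined summaries then replace the full articles in your final prompt.
```

Keeping each chunk well under the 4,097-token limit leaves room for the summarization instruction plus the 256-token completion.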