Model's maximum context length

Hello, I am adding information to the context so that ChatGPT can have specific knowledge about a certain issue. Because of the length of the text, I receive this message: 'This model's maximum context length is 4097 tokens. However, your message resulted in 10118 tokens, please reduce the length.'

Is there another variable, apart from the context, where I can put more information? My application is in a question-and-answer format.

Could you recommend a tutorial? Thanks in advance!


Welcome @malvaro2000

I recommend switching to embeddings for semantic search and using the results with the model for chat completions.
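The retrieval step works by embedding each piece of your reference text, embedding the user's question, and picking the chunks whose vectors are most similar. Here is a minimal sketch of the ranking step; the vectors are tiny hand-made placeholders, and the `client.embeddings.create` call and model name in the comment are assumptions about how you'd obtain real embeddings, not part of this example's runnable code.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# In a real application these vectors would come from the embeddings
# endpoint, e.g. (call shape is an assumption, check the API docs):
#   resp = client.embeddings.create(model="text-embedding-ada-002", input=chunk)
#   vector = resp.data[0].embedding
# Here we use toy 3-dimensional vectors just to show the ranking step.
document_vectors = {
    "chapter on pricing": [0.9, 0.1, 0.0],
    "chapter on refunds": [0.1, 0.9, 0.1],
}
query_vector = [0.85, 0.15, 0.05]  # stand-in for the embedded user question

# Pick the chunk most similar to the question, then include its text
# in the chat-completion prompt instead of the whole document.
best = max(
    document_vectors,
    key=lambda name: cosine_similarity(query_vector, document_vectors[name]),
)
print(best)  # → chapter on pricing
```

This way only the few most relevant chunks go into the prompt, keeping each request well under the 4097-token limit.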


Thanks for your suggestion. I think it's what I was looking for.

After looking into embeddings, I don't think they are what I need. Imagine you want OpenAI to learn a 50,000-word book and then ask questions about it. What would you use for such a task?

You would embed chunks of that 50k book, then query it with user input to match it up…
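A minimal sketch of the chunking step described above: split the book into overlapping word-based chunks small enough to embed individually. The chunk size and overlap here are illustrative choices, and in practice you'd count tokens rather than words.

```python
def chunk_words(text, chunk_size=500, overlap=50):
    """Split text into overlapping word chunks of at most chunk_size words."""
    words = text.split()
    step = chunk_size - overlap  # advance by less than a full chunk so chunks overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + chunk_size]))
        if start + chunk_size >= len(words):
            break  # last chunk reached the end of the text
    return chunks

# Stand-in for the 50,000-word book.
book = " ".join(f"word{i}" for i in range(50_000))
chunks = chunk_words(book)
print(len(chunks))  # each chunk is small enough to embed on its own
```

The overlap keeps a sentence that straddles a chunk boundary from being split away from its context; each chunk is then embedded once, and at question time only the best-matching chunks are sent to the model.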

Thanks for the suggestion. I will check how to do it.