I wish that when using the GPT API, it would be possible to have a contextual conversation like ChatGPT

LangChain is an open-source framework that gives your GPT API app the appearance of memory, similar to the ChatGPT experience. The docs have good examples you can learn from to build your conversational app.

https://python.langchain.com/en/latest/index.html
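
As a rough illustration, here is a minimal sketch of a LangChain conversation with buffer memory (it assumes the OpenAI LLM wrapper and that your `OPENAI_API_KEY` is set; exact imports and class names can vary between LangChain versions, so check the docs above):

```python
from langchain.llms import OpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

# The buffer memory keeps the running transcript and feeds it back into
# every prompt, which is what gives the "ChatGPT-like" contextual feel.
llm = OpenAI(temperature=0)
conversation = ConversationChain(llm=llm, memory=ConversationBufferMemory())

print(conversation.predict(input="Hi, my name is Sam."))
print(conversation.predict(input="What is my name?"))  # answered from memory
```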

You can further extend the memory by using indexed embeddings with Pinecone or LlamaIndex to effectively “compress” previous prompts and completions into vectors, so that the relevant parts can be retrieved and fed back to GPT for ongoing context. (The retrieved snippets are smaller than the full transcript, which saves tokens and extends the memory.)
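
To make the idea concrete, here is a hand-rolled sketch of embedding-based retrieval, not the actual Pinecone or LlamaIndex API. It uses the pre-1.0 OpenAI Python client, and the helper names (`remember`, `recall`, `chat`) are just for illustration:

```python
import openai
import numpy as np

openai.api_key = "sk-..."  # your key

memory = []  # list of (text, embedding) pairs standing in for a vector index

def embed(text):
    resp = openai.Embedding.create(model="text-embedding-ada-002", input=text)
    return np.array(resp["data"][0]["embedding"])

def remember(text):
    memory.append((text, embed(text)))

def recall(query, k=3):
    # return the k stored exchanges most similar to the query
    q = embed(query)
    scored = sorted(
        memory,
        key=lambda item: -np.dot(item[1], q)
        / (np.linalg.norm(item[1]) * np.linalg.norm(q)),
    )
    return [text for text, _ in scored[:k]]

def chat(user_input):
    # prepend only the retrieved context instead of the whole history
    context = "\n".join(recall(user_input))
    messages = [
        {"role": "system", "content": "Relevant earlier conversation:\n" + context},
        {"role": "user", "content": user_input},
    ]
    reply = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
    answer = reply["choices"][0]["message"]["content"]
    remember("User: " + user_input + "\nAssistant: " + answer)
    return answer
```

Pinecone or LlamaIndex replace the `memory` list and the similarity search with a proper vector store, but the flow (embed, store, retrieve, prepend) is the same.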

LangChain is relatively easy to implement and would be the best way to get started; implementing indexing is a bit more involved, but it is doable with some care.

https://gpt-index.readthedocs.io/en/latest/index.html

There are YouTube videos available explaining all of the above. :-)
