Have you ever needed to provide additional context to an LLM?

I asked ChatGPT (GPT-3.5) to write Python code that lists the top 10 stories from Hacker News. In response, I got code that used the BeautifulSoup4 package to scrape the Hacker News website. The code didn't work.

I wanted the generated code to use the HN REST API instead. So I included a piece of the HN REST API documentation in the prompt and asked again for the Python code. This time the returned code worked and was easier to understand (fewer lines than scraping with BeautifulSoup4).
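For reference, here's roughly the kind of code the API-grounded prompt produced. This is my own sketch, not the exact generated output; it uses the real public HN endpoints (`/v0/topstories.json` and `/v0/item/<id>.json`), and the injectable `fetch` parameter is my addition so the logic can be tested without hitting the network.

```python
import json
import urllib.request

HN_API = "https://hacker-news.firebaseio.com/v0"


def get_json(url):
    """Fetch a URL and decode the JSON body."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def top_stories(n=10, fetch=get_json):
    """Return the titles of the top n Hacker News stories.

    `fetch` is injectable so the function can be exercised with a
    stub instead of live HTTP calls.
    """
    ids = fetch(f"{HN_API}/topstories.json")[:n]
    return [fetch(f"{HN_API}/item/{i}.json")["title"] for i in ids]
```

Compare that to the scraping version: no HTML parsing, no extra dependency, and the API's JSON shape is stable.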

I started searching the web, and it looks like I used a technique called in-context learning: I provided examples in the prompt, and ChatGPT learned from them on the fly, without any weight updates.
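Concretely, all I did was assemble a prompt that puts the documentation before the question. A sketch using the Chat Completions message format (the system-message wording here is my own choice):

```python
def make_messages(context, question):
    """Build a Chat Completions-style message list that supplies
    documentation as in-context examples before the user's question.
    """
    return [
        # Steer the model toward the supplied docs rather than its priors.
        {"role": "system", "content": "Use the provided documentation."},
        # Context first, then the actual task.
        {"role": "user", "content": f"Documentation:\n{context}\n\nTask:\n{question}"},
    ]
```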

I'm thinking about writing a service for storing ChatGPT contexts. In the service, you would store predefined texts with examples (contexts) that can be sent to ChatGPT to get better responses. Have you ever needed such a service?

I'm aware that there is an option to provide Custom Instructions for ChatGPT. I'm thinking about something larger: a hierarchical structure of examples that is easy for an LLM to search. I would be very grateful for any feedback!
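To make the idea concrete, here's a minimal in-memory sketch of what I have in mind. The class, the slash-separated path scheme, and the method names are all hypothetical, just an illustration of storing contexts hierarchically and retrieving them by prefix:

```python
class ContextStore:
    """Minimal sketch of a hierarchical context store.

    Contexts live under slash-separated paths, e.g. "apis/hn/rest",
    so related examples can be grouped and found by prefix search.
    """

    def __init__(self):
        self._store = {}

    def add(self, path, text):
        """Save a context text under a hierarchical path."""
        self._store[path] = text

    def search(self, prefix):
        """Return all contexts whose path starts with `prefix`."""
        return {p: t for p, t in self._store.items() if p.startswith(prefix)}

    def build_prompt(self, path, question):
        """Prepend the stored context to a user question."""
        return f"{self._store[path]}\n\n{question}"
```

A real service would add persistence and smarter retrieval, but the prefix lookup already captures the "hierarchy of examples" part.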

Hi and welcome to the developer forum!

I believe similar functionality is available from Hugging Face, but a standalone library might be of interest.


Thank you for the response. Any hints on how to find this functionality on Hugging Face? I googled "huggingface context manager" and "huggingface prompt manager" but without success.

I'm not 100% certain; there was one called "majestic" and another called "awesome", but they could have been collections of prompts rather than prompt managers. This was back in January.


Yes, what you describe is useful, which is why we provide prompt-management functionality in our plugin: prompt text can be saved and later retrieved for adding to the conversation context.
