Can context be added outside the prompt?

Hey guys!

I’m looking to add context to GPT-3 (more specifically, a huge narrative background). I was looking for a parameter like context or background, but there isn’t one. The background is so big that it couldn’t fit in a normal prompt. Is this something I can achieve via embeddings?

If that’s the case, is there any clear example on how to do that?

Thanks!


Welcome to the community @dexgamedev

Yes, it’s possible to add context with embeddings. Here’s an outline of how it should look (a minimal sketch in code follows the list):

  1. Obtain embeddings for the context.
  2. Run a semantic search over those embeddings with the query.
  3. Use the top n results along with the prompt to generate.
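
A minimal sketch of those three steps, assuming the legacy (pre-1.0) openai Python client with the text-embedding-ada-002 and text-davinci-003 models (the client reads OPENAI_API_KEY from the environment); the example chunks and query are just placeholders:

```python
import numpy as np
import openai

# 1. Obtain embeddings for the context (pre-chunked narrative pieces).
chunks = [
    "Gimli is a dwarf of Durin's folk, son of Glóin...",
    "Legolas is an elf of the Woodland Realm...",
]
chunk_embeddings = [
    openai.Embedding.create(model="text-embedding-ada-002", input=c)["data"][0]["embedding"]
    for c in chunks
]

def cosine(a, b):
    a, b = np.array(a), np.array(b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# 2. Semantic search: embed the query and rank chunks by similarity.
query = "How would Gimli describe Legolas?"
query_embedding = openai.Embedding.create(
    model="text-embedding-ada-002", input=query
)["data"][0]["embedding"]
ranked = sorted(
    zip(chunks, chunk_embeddings),
    key=lambda pair: cosine(query_embedding, pair[1]),
    reverse=True,
)

# 3. Put the top-n chunks into the prompt and generate.
top_n = "\n".join(chunk for chunk, _ in ranked[:3])
prompt = f"Context:\n{top_n}\n\nQuestion: {query}\nAnswer:"
completion = openai.Completion.create(model="text-davinci-003", prompt=prompt, max_tokens=200)
print(completion["choices"][0]["text"])
```

In practice you’d precompute and store the chunk embeddings once, rather than embedding everything on each request.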

The prompt is the context. But what are you trying to do? Ask questions based on the narrative? Generate content?

Hey @nunodonato, thanks for the comment.

I know the prompt is the context, but since the context is so big, I wanted to ask for alternatives.
I would like to train my GPT-3 instance to generate content and act as different characters. Imagine GPT-3 is trained on Middle-earth stories: I want GPT-3 to act as Gimli, Legolas, Celebrimbor, etc., and give proper answers and content based on that. This wouldn’t be feasible with just a short context inside the prompt.

Depends. If GPT-3 already has that info as part of its training, you could probably get away with a good prompt instructing it to act as such.
But if you’re providing new details, it will be hard to do it this way. You could attempt fine-tuning (gets expensive as hell).

Another alternative would be to switch prompts based on the character and/or question (analyzed via embeddings). This is similar to what I’m doing now when fetching associated facts from a memory store: the relevant facts are inserted as part of the prompt. A rough sketch of the idea is below.
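
Roughly how that prompt assembly could look; the persona prompts, the retrieve_facts helper, and the facts themselves are hypothetical placeholders for whatever your memory store actually returns:

```python
# Hypothetical sketch: pick a persona prompt per character, then splice in
# the facts retrieved from the memory store (e.g. ranked by embedding similarity).
persona_prompts = {
    "Gimli": "You are Gimli, a dwarf warrior. Speak gruffly and proudly.",
    "Legolas": "You are Legolas, an elven archer. Speak calmly and poetically.",
}

def retrieve_facts(character: str, question: str, top_n: int = 3) -> list[str]:
    """Placeholder for the embedding-based lookup against the memory store."""
    return ["Gimli fought at Helm's Deep.", "Gimli is the son of Glóin."][:top_n]

def build_prompt(character: str, question: str) -> str:
    # Assemble persona instructions + retrieved facts + the user's question.
    facts = "\n".join(f"- {f}" for f in retrieve_facts(character, question))
    return (
        f"{persona_prompts[character]}\n\n"
        f"Relevant facts:\n{facts}\n\n"
        f"User: {question}\n{character}:"
    )

print(build_prompt("Gimli", "What do you think of elves?"))
```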

What kind of facts are you adding to the prompts? We’re looking to add information about the persona of the user asking the question, based on their past website visits. Will it work for that?

You can use embeddings for context.
If you put all the context in the prompt, you run into the prompt (token) limit, so embeddings are the alternative: it’s like creating your own custom knowledge base (a short sketch of building one is below).
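
A rough sketch of building that knowledge base once, up front, so each query later only pays the prompt cost for the few chunks you retrieve; this again assumes the legacy (pre-1.0) openai client, and the chunk size and file names are arbitrary assumptions:

```python
import json
import openai  # legacy pre-1.0 client; reads OPENAI_API_KEY from the environment

def chunk_text(text: str, max_chars: int = 1500) -> list[str]:
    """Naive fixed-size chunking of the big narrative background."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

with open("narrative_background.txt") as f:   # assumed input file
    chunks = chunk_text(f.read())

# Embed each chunk and keep the text alongside its vector.
knowledge_base = [
    {
        "text": chunk,
        "embedding": openai.Embedding.create(
            model="text-embedding-ada-002", input=chunk
        )["data"][0]["embedding"],
    }
    for chunk in chunks
]

# Persist so retrieval at query time doesn't re-embed everything.
with open("knowledge_base.json", "w") as f:
    json.dump(knowledge_base, f)
```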