Would somebody be so awesome as to help me with the correct structure for a Q&A prompt?

For example… I want to write a prompt with common Q&A for the AI to answer… but this is not working: when I ask an unrelated question, the AI pretty much recites everything from the Q&A prompt…

Can somebody help me with how I should structure the prompt itself so this works?

The code goes like this, but the prompt is messing up all the answers:

const result = await openai.chat.completions.create({
  messages: [
    { role: "user", content: message },
    { role: "system", content: prompt },
  ],
  model: "gpt-3.5-turbo",
});
console.log(result.choices[0].message.content);

If you are looking for some guidance on prompting for API business applications, you might take a look at

You’ve got those backwards.
{ role: "system", content: programming },
{ role: "assistant", content: "relevant documentation: " + knowledge },
{ role: "user", content: input_question },
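
Put together, a minimal sketch of the corrected call might look like this (programming, knowledge, and input_question are placeholders for your own system instructions, Q&A text, and the user's question; the example values are made up):

import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Placeholder content for illustration only.
const programming = "You are a support assistant. Answer only from the documentation provided.";
const knowledge = "Q: What are your hours?\nA: 9am to 5pm, Monday to Friday.";
const input_question = "When are you open?";

const result = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [
    { role: "system", content: programming },
    { role: "assistant", content: "relevant documentation: " + knowledge },
    { role: "user", content: input_question },
  ],
});

console.log(result.choices[0].message.content);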


Well thank you…

About this:

{ role: "system", content: programming },
{ role: "assistant", content: "relevant documentation: " + knowledge },
{ role: "user", content: input_question },

^^^ Besides, I need to learn a bit more about that… Is it possible to supply a Q&A paragraph in the prompt for the AI to answer the most common questions? I ask because it would be awesome to supply the Q&A directly in the system message rather than moving to a vector store with a file of questions.

That would be the “relevant documentation” message. You’d pay 0.2 cents for sending 1000 tokens of it along with every question that is asked.

You can include up to around 2-3k tokens of it before your conversation history or output size is compromised; beyond that you'd be paying significantly more to upgrade to the 16k-token-context gpt-3.5-turbo-16k.
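
As a rough sketch of what that means in practice, the Q&A text itself becomes the knowledge string, and a crude 4-characters-per-token heuristic gives an idea of what sending it with every question costs (the entries are made up; 0.2 cents per 1K tokens is the figure mentioned above):

// The Q&A knowledge is sent as plain text with every request.
// Token count and cost below are rough estimates, not exact figures.
const qa = [
  { q: "What are your hours?", a: "9am to 5pm, Monday to Friday." },
  { q: "Do you ship internationally?", a: "Yes, to most countries." },
];

const knowledge = qa.map(({ q, a }) => `Q: ${q}\nA: ${a}`).join("\n\n");

// Very rough heuristic: roughly 4 characters per token for English text.
const approxTokens = Math.ceil(knowledge.length / 4);
const approxCostPerQuestion = (approxTokens / 1000) * 0.002; // ~0.2 cents per 1K tokens

console.log(`~${approxTokens} tokens of documentation, ~$${approxCostPerQuestion.toFixed(4)} per question`);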

Sounds expensive… for now I want to get an understanding of how the different roles and content work… then… would using a vector file reduce the cost? Of course, I'd need to learn how to do that too.

One of the ways you could augment knowledge on-demand is to use function-calling.

You could make a simple table of contents, and then allow the AI to call a function to retrieve perhaps 10 different documents: descriptions, prices, features, hours and locations, etc.
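
A hedged sketch of that pattern, using the functions / function_call interface from the guide linked below; get_document and its topic list are hypothetical:

import OpenAI from "openai";

const openai = new OpenAI();

// Hypothetical "table of contents": the model picks which document it needs.
const functions = [
  {
    name: "get_document",
    description: "Retrieve a reference document by topic to help answer the user.",
    parameters: {
      type: "object",
      properties: {
        topic: {
          type: "string",
          enum: ["descriptions", "prices", "features", "hours", "locations"],
          description: "Which document to fetch",
        },
      },
      required: ["topic"],
    },
  },
];

const first = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [
    { role: "system", content: "Answer using the documents you can retrieve." },
    { role: "user", content: "How much does the pro plan cost?" },
  ],
  functions,
});

const call = first.choices[0].message.function_call;
if (call && call.name === "get_document") {
  const { topic } = JSON.parse(call.arguments); // e.g. { topic: "prices" }
  // Look up the document for `topic`, then send it back as a
  // { role: "function", name: "get_document", content: document } message
  // in a second chat.completions.create call so the model can write the answer.
}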

Omg… where do I learn about all that? It's all so cool and sounds effective…

Right now I only have a handful of Q&As… but adding something like pricing could mean a list of hundreds of items, and then we're moving into bigger data…

https://platform.openai.com/docs/guides/gpt/function-calling


Tons of information… very nice site… it has lots of things that you can't get from just the docs…

I pretty much just work with Node.js, as I integrate it into a website using Next.js…

It is awesome when my mind blows up with the possibilities of what can be built… it's fun to learn and it isn't hard… but information can be elusive to find if you know next to nothing on the topic… thank you much!!
