Hello, I got the following task:
I need to create a tool that translates a conversation with a customer, message by message. Unfortunately, to translate the messages correctly we need to provide product context, which even written briefly is at least 10 pages long. Models other than gpt-4 don't work well for us. Also, each time I need to pass the whole conversation to the API to keep the previous context.
The only approach I have found that works is:
Pass the product knowledge in the prompt (let's say 5k tokens).
Pass all the conversation messages so far.
Add the last message as the translation task for gpt.
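To make the steps above concrete, this is roughly what gets assembled on every call (the helper name, roles, and sample strings are just illustrative; the actual API request is omitted):

```python
def build_translation_request(product_context, history, new_message, target_lang="en"):
    """Assemble the full message list that must be resent on every API call.

    product_context: the static ~5k-token product description (resent each time)
    history: list of (role, text) tuples for the whole conversation so far
    new_message: the latest customer message to translate
    """
    messages = [{
        "role": "system",
        "content": f"You translate customer messages into {target_lang}.\n"
                   f"Product context:\n{product_context}",
    }]
    # replay the entire conversation to preserve context
    for role, text in history:
        messages.append({"role": role, "content": text})
    # the actual task: translate only the newest message
    messages.append({
        "role": "user",
        "content": f"Translate this message: {new_message}",
    })
    return messages

msgs = build_translation_request(
    "…10 pages of product docs…",
    [("user", "Hej!"), ("assistant", "Hi!")],
    "Czy produkt obsługuje eksport PDF?",
)
```

Every element of `msgs`, including the big system block, is billed as input tokens on every single call.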
Unfortunately this is not viable, because it is far too expensive. To receive a 1-sentence output I would need to input several thousand tokens, so a 100-message conversation costs a huge amount of money.
What is a better way to do this? The biggest problem is passing the whole knowledge about our product and somehow making gpt remember it.
I understand the costs involved: we build quite a lot of tools for our company, and we are currently spending several thousand dollars per month on the API.
The only question is: can I teach the GPT API knowledge about my product in some other way, and then have it use that knowledge in its answers without passing it in the prompt every time? I don't want to pay for the same thing over and over. I understand that I need to pay for keeping the conversation context, because that is dynamic. But the product knowledge is static; it is general product data.
Is there really no way to teach GPT this knowledge once and have it use it in all subsequent replies?
If I may ask, why do you need product context just to translate a conversation? By translate you mean translating from one language to another, right? So I wonder why the product context is needed at all.