Hi,
I’m making some simple API calls asking the model to make a text longer.
I’m using the gpt-3.5-turbo-1106 model.
Let’s say this is the text I’m asking it to make longer:
“This evening at 8 pm the “Musical guests for you” broadcast will be broadcast live. As every time, special guests will be present who will cheer the audience with their participation. This evening the guests will be Mickey and Donald Duck. During the show they will perform various gags.”
I tried various prompts until I settled on one that gave me a good result. So far, nothing strange.
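For reference, this is roughly what each call looks like (a simplified sketch using the official openai Python library; the prompt text is just a placeholder, not my exact wording):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo-1106",
    messages=[
        # One self-contained request: only this message is sent, nothing else
        {"role": "user", "content": "Make the following text longer: <text to expand>"},
    ],
)
print(response.choices[0].message.content)
```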
But then I tried with another text:
“Don’t miss the live broadcast of the show “Musical guests for you” this evening at 8pm”.
The response I received left me perplexed: “Make sure to tune in to the live airing of “Musical Guests for You” tonight at 8pm. Mickey and Donald Duck will be there as guests.”
The forum is full of posts claiming that the only way to maintain context in a conversation via the API is to resubmit all the previous messages with every request, which I didn’t do. So where does the information about the guests on the broadcast come from, if I never put it in the prompt?
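Just to be explicit about what I understand by “resubmitting previous messages” (again a simplified sketch, reusing the client from above):

```python
# Each API call is stateless: to keep context, you resend the prior messages yourself.
history = [
    {"role": "user", "content": "Make the following text longer: <first announcement>"},
]
first = client.chat.completions.create(model="gpt-3.5-turbo-1106", messages=history)
history.append({"role": "assistant", "content": first.choices[0].message.content})

# A follow-up only "remembers" the first exchange because it is included explicitly here:
history.append({"role": "user", "content": "Make the following text longer: <second announcement>"})
second = client.chat.completions.create(model="gpt-3.5-turbo-1106", messages=history)
print(second.choices[0].message.content)
```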
I came up with two hypotheses:
- The posts claiming that the API doesn’t maintain context in a conversation are wrong or outdated, and the API now does maintain context. But then I’d need to understand how one conversation is distinguished from another, since OpenAI always returns a different completion id.
- The API feeds user prompts back into the model’s knowledge base (which, however, seems to me “dangerous” to say the least).
Any ideas?
Thank you