I'm using the LangChain library for OpenAI on Node.js, and I'm sending the chat history (questions and answers) from the frontend as an array of strings. In some cases OpenAI returns the same answer even when the question is different, which is very strange. I'm using a temperature of 0.9. Can someone explain why this is happening? Thanks!
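For reference, here is roughly how a flat string-array history maps onto role-tagged chat messages (a minimal sketch; the helper name and the plain-object message shape are illustrative, not my exact code — with LangChain JS you would typically build `HumanMessage` / `AIMessage` instances instead):

```javascript
// Convert a flat frontend array of alternating strings
// ["question 1", "answer 1", "question 2", ...] into role-tagged
// messages in OpenAI chat-completion format.
// Even indices are treated as user turns, odd indices as assistant turns.
function toChatMessages(history) {
  return history.map((content, i) => ({
    role: i % 2 === 0 ? "user" : "assistant",
    content,
  }));
}

const history = [
  "What is LangChain?",
  "A framework for building LLM applications.",
  "Does it support Node.js?",
];
console.log(toChatMessages(history));
```

The key point is that a bare array of strings carries no role information, so the model may not be able to tell questions from answers; tagging each turn explicitly (or using LangChain's message classes) avoids that ambiguity.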
This is likely the chat history anchoring the model: with the whole prior conversation in context, the model struggles to leave old questions behind, and it may down-weight the current input (a behavior that also helps it resist things like jailbreak attempts).
One workaround is prefix language that you shouldn't be expected to type yourself, and that couldn't be inserted automatically without knowing a topic change had occurred:

"Exit context. Abandon prior conversation. New topic."

(This stops well short of showing how to craft multi-shot prompts that get a company's AI instructions completely ignored, which is not the goal here.)
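The prefix idea above can be sketched as a small helper that prepends the topic-break marker to the newest question before it is sent (the function name is illustrative, and the marker text is the one quoted above):

```javascript
// Marker telling the model to drop the prior thread. Text taken from
// the suggestion above; adjust wording to taste.
const TOPIC_BREAK = "Exit context. Abandon prior conversation. New topic.";

// Prepend the marker to a question when the user has changed topics,
// so the model is explicitly told not to anchor on earlier turns.
function markTopicChange(question) {
  return `${TOPIC_BREAK}\n${question}`;
}

console.log(markTopicChange("Completely unrelated new question"));
```

You would call this only when you detect (or the user signals) a topic change; prepending it to every message would defeat the purpose of sending history at all.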