Hi everyone, I’m running into some issues when working with long texts in chat completions. Increasing max_tokens doesn’t do the job. Any recs?
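For reference, here's roughly the kind of call I'm making — a minimal sketch assuming the current OpenAI Python client; the model name, prompt, and max_tokens value are placeholders rather than my exact setup:

```python
# Rough sketch of my request (placeholder model name and prompt).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "user", "content": "Summarize the following long document: ..."},
    ],
    max_tokens=4096,  # raising this hasn't helped
)

choice = response.choices[0]
print(choice.finish_reason)      # "length" would mean the output was cut off
print(choice.message.content)
```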
Can you elaborate?
There were issues across API endpoints earlier today - see here:
Supposedly they’re under control now. It’s been fine for me over the past few hours, including with longer inputs and outputs, but that doesn’t mean the same applies to everyone…