There must be something wrong with your API call or model parameters. I've tried your prompt; look at the results. IngestAI.io (obviously) uses the API, and the models do respond to the query you provided.
One more thing: I don't think it's a good idea to set presence_penalty: 0.5 and frequency_penalty: 0.3 unless repetition is something you actually care about, because those penalties push the model toward tokens that are less appropriate (less probable).
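For illustration, here's a minimal sketch of a Chat Completions call that just leaves both penalties at their default of 0 (the model name and messages are placeholders, not necessarily what you used):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Leave presence_penalty and frequency_penalty at their defaults (0)
# unless repetition is an actual problem you are trying to fix.
response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # placeholder; use whichever model you're testing
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "..."},  # your original prompt goes here
    ],
    temperature=1.0,
    # presence_penalty=0.5,   # omit these two lines rather than setting them "just in case"
    # frequency_penalty=0.3,
)
print(response.choices[0].message.content)
```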
Instead of a plug where you can't see the parameters or prompt, here's a screenshot from OpenAI's playground showing the API parameters, the OP's input alone, and the AI response.
The system message was just a space character to clear the placeholder text. In practice, you would "program" the AI to act a particular way with a strong new identity, because gpt-4-turbo has been made particularly unsuitable for custom products. I can demonstrate that again, just as I did hours after "devday".
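As a rough sketch, that kind of "programming" just means giving the model a strong persona in the system message; the identity below is invented purely for illustration:

```python
# Hypothetical "strong identity" system message for a custom product.
messages = [
    {
        "role": "system",
        "content": (
            "You are Aria, the in-app assistant for ExampleCo. "
            "Stay in character, answer only questions about ExampleCo products, "
            "and never refer to yourself as an AI language model."
        ),
    },
    {"role": "user", "content": "Who are you?"},
]
# Pass this list as the `messages` argument of client.chat.completions.create(...)
# exactly as in the earlier example.
```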
I agree with both of you. As I tune the parameters using the playground tool, I'm able to get responses. I haven't gotten to the answer-all stage yet, but I'm sure the issue is in the parameter settings. Thanks for your help. Cheers