The parameters, model, and system prompt of chat_dot_openai_dot_com

We are evaluating prompts on chat_dot_openai_dot_com (not the Playground) while developing our backend through the API. These are two separate processes.

What we have noticed is that, for the same prompt, there are fundamental differences in the format or style of the responses, differences that do not seem attributable to different random seeds alone.

So we want to understand whether the two are set up differently. I know how the API is configured, but is there a place where I can see how chat_dot_openai_dot_com is configured?
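For context, here is the kind of setup I mean on the API side, where every generation setting is explicit in the request. This is only a sketch: the model name, temperature, seed, and system prompt below are illustrative values I chose, not what chat_dot_openai_dot_com actually uses (those settings are not published).

```python
# Sketch of a chat-completions request payload where every generation
# setting is pinned explicitly. All concrete values here are assumptions
# for illustration; the web UI's equivalents are not documented.

def build_chat_request(user_prompt: str) -> dict:
    """Build a chat-completions style payload with all parameters explicit."""
    return {
        "model": "gpt-4",       # assumed model; the web UI's exact model/version is opaque
        "temperature": 0.7,     # sampling temperature we choose; unknown for the web UI
        "top_p": 1.0,           # nucleus sampling cutoff
        "seed": 42,             # best-effort determinism, available via the API
        "messages": [
            # In the API, the system prompt is fully under our control.
            # The web UI injects its own system prompt, which we cannot inspect.
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_prompt},
        ],
    }

payload = build_chat_request("Summarize our release notes.")
print(sorted(payload.keys()))
```

None of these knobs are visible in the web UI, which is presumably part of why outputs can differ for an identical user prompt.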