V1/models API call response doesn't include gpt-4 entry

This is despite the fact that a v1/chat/completions call with model=“gpt-4” returns a response indicating that gpt-4-0613 was used.

Also, @logankilpatrick I noticed the “About the Bugs category” topic isn’t filled out.

Hey @dust, this is likely because you don’t have access to the gpt-4 model. You should see all models you have access to in that list. If you can access GPT-4 but don’t see it in the models list, please follow up here!


My first sentence was meant as proof that I do have access: when I send a v1/chat/completions request with model=“gpt-4”, I get a response indicating that gpt-4-0613 was used.

The problem interferes with third-party apps that use the v1/models endpoint (with the user’s API key) to populate a model selection list.
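For reference, this is roughly what such an app does: call GET https://api.openai.com/v1/models with the user’s key and filter the returned IDs. A minimal sketch in Python (the `gpt4_ids` helper and the `sample` payload are illustrative, but the response shape — a `data` array of objects with an `id` field — matches the documented list-models response):

```python
import json
import urllib.request

def gpt4_ids(models_response):
    """Extract the gpt-4 family model IDs from a /v1/models JSON response dict."""
    return sorted(
        m["id"]
        for m in models_response.get("data", [])
        if m["id"].startswith("gpt-4")
    )

def list_models(api_key):
    """Call GET /v1/models with the given API key and return the parsed JSON."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())

# Offline example with a payload shaped like the API response:
sample = {"data": [{"id": "gpt-4"}, {"id": "gpt-3.5-turbo"}, {"id": "gpt-4-0613"}]}
print(gpt4_ids(sample))  # the gpt-4 entries a selection list would show
```

If `gpt4_ids(list_models(key))` comes back empty while a model=“gpt-4” chat completion still succeeds, that is exactly the inconsistency described above.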

Hi @dust

Can you share the raw response for the list models request here?

Thanks for following up! I just tried again, and now I get three gpt-4 entries in the list models request: gpt-4, gpt-4-0613, and gpt-4-0314, so no more problem.