My question is, will we be able to use other models, like OpenRouter LLMs, and just change the base URL? The Chat Completions SDK is flexible with this. It's probably not up to OpenAI, though, but up to the other LLM services, whether they want to adopt a different SDK. Either way, that flexibility is the main reason for sticking to chat.completions, for me anyway. The kind of pattern I mean is sketched below.
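
A minimal sketch, assuming the third-party provider (OpenRouter here) exposes an OpenAI-compatible Chat Completions endpoint. The base URL, API key placeholder, and model name are illustrative and should be checked against the provider's docs:

```python
from openai import OpenAI

# Point the official SDK at a non-OpenAI provider by overriding base_url.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # provider's OpenAI-compatible endpoint
    api_key="YOUR_OPENROUTER_KEY",            # provider-specific key, not an OpenAI key
)

response = client.chat.completions.create(
    model="meta-llama/llama-3.1-70b-instruct",  # example model ID; use the provider's naming
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

Because only the base URL and key change, the rest of the code stays identical to a plain OpenAI call, which is exactly what you lose if a provider doesn't implement the newer Responses API.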