The dashboard can create a prompt or optimize one for a specific model, but it isn't good enough. I'd like to add this to Cursor as an MCP server so I can manage prompts without going into the OpenAI dashboard, or at least have a chat interface for managing prompts that keeps history and so on, rather than a single one-shot request where it's hard to give the optimizer all the context.
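For concreteness, the kind of prompt management an MCP server could wrap might look like a small versioned store that keeps every revision (so the chat interface has history to draw on). This is just a sketch of the idea; all names here are hypothetical, not an existing API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class PromptVersion:
    text: str
    note: str = ""
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class PromptStore:
    """In-memory store keeping every revision of each named prompt."""

    def __init__(self):
        self._prompts: dict[str, list[PromptVersion]] = {}

    def save(self, name: str, text: str, note: str = "") -> int:
        """Append a new version of a prompt and return its version index."""
        versions = self._prompts.setdefault(name, [])
        versions.append(PromptVersion(text, note))
        return len(versions) - 1

    def get(self, name: str, version: int = -1) -> str:
        """Fetch a specific version (latest by default)."""
        return self._prompts[name][version].text

    def history(self, name: str) -> list[str]:
        """Return the change notes attached to each saved version."""
        return [v.note for v in self._prompts[name]]
```

A server exposing `save`/`get`/`history` as MCP tools would let the editor carry full revision context across a conversation instead of cramming everything into a single optimizer request.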