I would like to suggest an improvement to the API design and the handling of System Prompts. In my experience, it would be highly beneficial to provide a way for developers to set up default System Prompts or predefined settings that can be easily accessed and used when calling the API. This would significantly improve the developer experience, especially when dealing with large numbers of System Prompts.
Key Suggestions:
Default Prompts: The API could allow developers to work with default System Prompts that are already pre-set in the system. This would save developers the time and effort of re-specifying the prompts with every API call.
Flexibility for Customization: While default prompts should be available, it would be ideal to allow developers the option to dynamically customize or adjust the System Prompts as needed during development. This flexibility can help meet more specialized use cases without restricting the overall functionality of the API.
Efficiency and Performance: When dealing with a large number of System Prompts, it’s important to optimize performance to prevent any delays or inefficiencies in API response times. Implementing preloading mechanisms and query optimizations would be beneficial.
Improved Documentation and Examples: Having well-documented examples and guidelines for setting up System Prompts and using the API in general would also be a great help to developers, ensuring they can implement these features quickly and correctly.
This API design could significantly streamline the development process and make the tool more versatile and accessible for developers, ultimately improving both ease of use and performance.
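To make the idea concrete, here is a minimal sketch of the kind of wrapper I have in mind. Nothing in it is an existing API feature: the PromptPresets class, the preset names, and the model name are all illustrative, and it simply builds on top of the current chat-completions call.

```python
from openai import OpenAI

class PromptPresets:
    """Hypothetical helper: named default system prompts injected on every call."""

    def __init__(self, client: OpenAI, presets: dict[str, str], default: str):
        self.client = client
        self.presets = presets      # name -> system prompt text
        self.default = default     # preset used when none is given

    def chat(self, user_message: str, preset: str | None = None,
             model: str = "gpt-4o-mini"):
        system_prompt = self.presets[preset or self.default]
        return self.client.chat.completions.create(
            model=model,
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_message},
            ],
        )

client = OpenAI()  # reads OPENAI_API_KEY from the environment
presets = PromptPresets(
    client,
    presets={
        "support": "You are a concise, friendly support assistant.",
        "json_only": "Reply with valid JSON only, no prose.",
    },
    default="support",
)

# The default preset is applied without re-sending it explicitly each time.
response = presets.chat("How do I reset my password?")
print(response.choices[0].message.content)
```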
At least with the OpenAI, Google, and Anthropic models I’ve used, the effect is exactly the same.
Somehow you have to tell the AI these instructions.
Even if you don’t want to modify the system prompt every time, you still have to pass it to the AI somehow.
If the standard system prompt is “abc-123”, but during execution you have had to change it to “321-cba” and want to return to the standard, you still have to tell the AI, either by changing the prompt back or by sending the standard one again.
In other words, it requires one or two lines of code in any case, regardless of whether you change the value of the system prompt or tell the AI to return to the normal system prompt.
A system prompt is not something the AI has to have. It can work just fine without one, but with one you can give it better instructions and details to work with.
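To show what I mean, something like this is all it takes in plain Python; the prompt values are just the placeholders from the example above:

```python
DEFAULT_SYSTEM_PROMPT = "abc-123"   # the "standard" system prompt

system_prompt = DEFAULT_SYSTEM_PROMPT

# Somewhere during execution you switch to a different prompt...
system_prompt = "321-cba"

# ...and returning to the standard one is just one more assignment.
system_prompt = DEFAULT_SYSTEM_PROMPT

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": "Hello"},
]
```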
Thank you for your response. Just to clarify, my suggestion does not aim to change the current way system prompts are set. Instead, I’m proposing the idea of providing developers with a way to use pre-configured default system prompts that would be automatically included when calling the API, so developers don’t need to reset them manually every time.
The idea is to keep the existing system prompt functionality but offer a more efficient mechanism for managing and using prompts, especially in cases where developers are working with multiple or large system prompts.
I hope this clears things up, and I appreciate your insights!
I’m not sure whether this helps you, but your post inspired me to write some code for handling multiple system messages: adding or editing them while protecting the “default” one.
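Here is a simplified sketch of the shape of that code, not the real thing: a plain in-memory store with made-up names.

```python
class SystemMessageStore:
    """Holds multiple named system messages; the default entry is protected."""

    def __init__(self, default_text: str):
        self._messages = {"default": default_text}

    def add(self, name: str, text: str) -> None:
        if name == "default":
            raise ValueError("the default system message is protected")
        self._messages[name] = text

    def edit(self, name: str, text: str) -> None:
        if name == "default":
            raise ValueError("the default system message is protected")
        if name not in self._messages:
            raise KeyError(f"no system message named {name!r}")
        self._messages[name] = text

    def get(self, name: str = "default") -> str:
        return self._messages[name]

store = SystemMessageStore("You are a helpful assistant.")
store.add("translator", "Translate the user's text into English.")
print(store.get())              # always returns the protected default
print(store.get("translator"))
```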
That’s one of the reasons I use fine-tuning… And while doing that, I also get the benefit of having a cheaper model do the same task with the same quality (and without a system prompt, or with a minimal one).
Also, having default system prompts would mean either specifying which saved prompt to use or specifying a model that has that same prompt defined as its default… In both cases it is easier for me to manage my own library of prompts than to handle the logic of which model has which default prompt, or what the ID of the default prompt I want to use is…
I find that most of my system prompts are dynamically templated or code/model generated.
I would suggest creating one or a couple of alias functions for common System prompts where you want to return data in well-defined formats:
Prompt(['Format Letter', 'JSON To CSV', ...], prompt)
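As a rough sketch of what such alias functions could look like (the alias names and prompt texts are only examples, not a real library):

```python
# Registry of named system prompts plus one helper that combines an alias
# with the user's prompt into a message list.
ALIAS_PROMPTS = {
    "Format Letter": "Rewrite the user's text as a formal letter.",
    "JSON To CSV": "Convert the JSON the user provides into CSV. Return only the CSV.",
}

def prompt(alias: str, user_prompt: str) -> list[dict]:
    """Build the message list for a chat call from an alias and the user's prompt."""
    return [
        {"role": "system", "content": ALIAS_PROMPTS[alias]},
        {"role": "user", "content": user_prompt},
    ]

messages = prompt("JSON To CSV", '[{"name": "Ada", "score": 10}]')
# messages can then be passed straight to a chat-completions call.
```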
Going further into multi-modal, and depending on the language used, you might want different functions for different return types.
Moving forward still further, I am currently working on self-improving agents which write and upgrade all their own prompts and interact with other agents. At this stage you are looking more at metrics than prompts.
Every step has another layer of complexity. Choose a method that best suits the size and complexity of your projects.
I often use message templates that are dynamically updated in the workflow; I don’t see anything complicated about having a “default” system message as a feature.
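For what it’s worth, a minimal sketch of that kind of template with a fixed default; the field names are purely illustrative:

```python
from string import Template

SYSTEM_TEMPLATE = Template("You are a $role. Today's focus: $focus.")
DEFAULTS = {"role": "helpful assistant", "focus": "general questions"}

def system_message(**overrides) -> dict:
    """Return a system message, filling the template from defaults plus overrides."""
    values = {**DEFAULTS, **overrides}
    return {"role": "system", "content": SYSTEM_TEMPLATE.substitute(values)}

print(system_message())                         # the "default" system message
print(system_message(focus="billing support"))  # updated in the workflow
```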