Prompt caching not working

Hello, I have a startup which is basically a ChatGPT wrapper with a very clever system prompt.

The problem is that every user request uses the same system prompt of about 1,500 input tokens, plus another 500 to 1,000 tokens of input from my user that changes each time (and therefore cannot be cached).

Is there any way I can cache my system prompt so my startup can be more profitable? My profit margins are very bad today, since ChatGPT costs are about 50% of my revenue, and I really need to use gpt-4o-latest since it's the only model that gives me great results.
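For reference, here is roughly how I build each request right now. This is just a sketch of the structure, not my real code: `SYSTEM_PROMPT` and `build_messages` are placeholder names, and the prompt text is abbreviated. My understanding is that prefix-based prompt caching only helps when the static part comes first and is byte-identical across requests, so I've kept the system message first and the changing user input last.

```python
# Placeholder for my real 1,500-token system prompt; it is identical
# on every single request.
SYSTEM_PROMPT = "...my 1500-token system prompt..."

def build_messages(user_input: str) -> list[dict]:
    # Static content first, variable content last: if the provider
    # caches prompt prefixes, only this ordering lets the system
    # prompt be reused across requests.
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_input},
    ]

# Only the final user message differs between two requests; the
# system message prefix is byte-identical.
messages_a = build_messages("first user's input")
messages_b = build_messages("second user's input")
```

Is this ordering enough for caching to kick in, or do I need to do something else explicitly?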
