OpenAI Developer Community
Does prompt caching reduce TPM?
API · gpt-4o, prompt-caching
s.kaniras · March 9, 2025, 4:37pm · #5
Any suggestions on how to do that?
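One way to verify whether prompt caching is kicking in at all is to read the `usage` block returned with each Chat Completions response, which reports cached prompt tokens under `prompt_tokens_details.cached_tokens`. A minimal sketch (the `cached_fraction` helper and the sample numbers are illustrative, not from this thread):

```python
# Sketch: compute what fraction of a request's prompt tokens were
# served from the prompt cache, given the `usage` dict returned
# alongside a gpt-4o Chat Completions response.

def cached_fraction(usage: dict) -> float:
    """Return the fraction of prompt tokens that were cache hits."""
    prompt_tokens = usage.get("prompt_tokens", 0)
    cached = usage.get("prompt_tokens_details", {}).get("cached_tokens", 0)
    return cached / prompt_tokens if prompt_tokens else 0.0

# Example usage payload (sample numbers are made up):
sample_usage = {
    "prompt_tokens": 2048,
    "completion_tokens": 150,
    "total_tokens": 2198,
    "prompt_tokens_details": {"cached_tokens": 1024},
}

print(cached_fraction(sample_usage))  # 0.5
```

Note that caching only applies to the static prefix of a prompt, so keeping the stable system prompt first and per-request content last maximizes the cached share.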
Related topics

Options for caching same prompt x thousand of requests..? · API (api) · 4 replies · 2735 views · June 1, 2024
Use file with text-davinci-001 to increase tokens in prompt · Prompting · 13 replies · 2563 views · December 15, 2023
Best method of injecting relatively large amount of context to be leveraged in a response · API · 10 replies · 11384 views · December 17, 2023
Caching system prompt to facilitate interaction between user and llm · API (gpt-4) · 3 replies · 2062 views · September 19, 2024
How to reduce prompt tokens price · API (embeddings) · 3 replies · 1332 views · April 1, 2024