| Topic | Replies | Views | Activity |
| --- | --- | --- | --- |
| 4o input not being cached | 42 | 1317 | April 25, 2025 |
| Is there a way to disable prompt caching in the APIs | 9 | 4023 | April 24, 2025 |
| Responses API not using cached inputs for o3-mini | 0 | 45 | April 17, 2025 |
| Automatic context window caching - better performance? When? | 0 | 58 | April 3, 2025 |
| Prompt caching enabled in O3-Mini? | 5 | 191 | March 31, 2025 |
| Does prompt caching reduce TPM? | 4 | 128 | March 9, 2025 |
| How Prompt caching works? | 17 | 5538 | February 4, 2025 |
| Dashboard usage vs Prompt response usage not matching | 13 | 341 | January 9, 2025 |
| Understanding Prompt caching | 0 | 210 | January 2, 2025 |
| Does prompt caching persist between different models? | 1 | 97 | December 23, 2024 |
| New Realtime API voices and cache pricing | 26 | 7650 | November 27, 2024 |
| Why don't we have prompt caching on gpt-4? | 1 | 108 | November 22, 2024 |
| Cache not caching more than 1024 tokens (expected: increments of 128 tokens) | 6 | 189 | November 14, 2024 |
| Gpt-4o-2024-08-06 randomly fails to cache tokens | 7 | 148 | November 12, 2024 |
| Improving Cache Management: Handling Tool Removal in Active Conversations | 0 | 32 | November 11, 2024 |
| Prompt caching not working | 9 | 653 | November 2, 2024 |
| How does Prompt Caching work? | 8 | 2983 | October 29, 2024 |
| Regarding the Issue of Half-Priced Prompt Caching | 5 | 450 | October 25, 2024 |
| How does th Prompt Caching Prefix Match work? | 1 | 205 | October 22, 2024 |
| Prompt Token Cache Gaming to Save Money? | 1 | 540 | October 18, 2024 |
| Batch API vs Prompt caching | 1 | 607 | October 14, 2024 |
| Prompt caching not working even with fixed system prompt | 2 | 642 | October 14, 2024 |
| What does prompt caching store | 1 | 377 | October 11, 2024 |
| Prompt Caching in Classification and Information Retrieval Use Cases | 3 | 174 | October 11, 2024 |
| Is this a problem with cached tokens? | 3 | 963 | October 10, 2024 |
| About Prompt cache, why won't the content I send get cached? | 0 | 150 | October 9, 2024 |