OpenAI Developer Community
Why was max_tokens changed to max_completion_tokens?
API · Feedback · breaking-changes
zerodayattack
September 14, 2024, 12:24am
Should I expect a different tokenizer (different from o200k_base) for o1-preview and o1-mini?