OpenAI Developer Community
Batch API errors: max_tokens is too large
API | gpt-4, api, batch

szX3EM
May 7, 2024, 1:00am
#8
Thanks, it worked. Apparently for Pycompiler you need to specify the tokenizer include too.
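
For anyone landing here with the same error: the usual cause is a max_tokens value larger than the model's completion limit, or larger than what remains of the context window after the prompt. Below is a minimal Python sketch of how you might clamp max_tokens and count prompt tokens with tiktoken before writing the Batch API JSONL file; the 8192/4096 limits, the file name, and the custom_id are illustrative assumptions, not values from this thread.

```python
# Minimal sketch (assumptions flagged): caps max_tokens before writing a Batch
# API JSONL file. The 8192 context window, 4096 completion cap, file name, and
# custom_id are illustrative values, not taken from this thread.
import json
import tiktoken

MODEL = "gpt-4"
CONTEXT_WINDOW = 8192      # assumed context window; check the model card for your model
COMPLETION_CAP = 4096      # assumed per-request completion limit; check the model card

enc = tiktoken.encoding_for_model(MODEL)  # tokenizer matching the model

def batch_line(custom_id: str, prompt: str, requested_max_tokens: int) -> str:
    """Build one Batch API JSONL line, clamping max_tokens so the request is accepted."""
    prompt_tokens = len(enc.encode(prompt))                 # count prompt tokens with tiktoken
    max_tokens = min(requested_max_tokens, COMPLETION_CAP)  # respect the completion cap
    # leave room for the prompt inside the context window
    max_tokens = max(1, min(max_tokens, CONTEXT_WINDOW - prompt_tokens))
    body = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return json.dumps({
        "custom_id": custom_id,
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": body,
    })

with open("batch_input.jsonl", "w") as f:
    f.write(batch_line("request-1", "Summarize this thread in one sentence.", 8000) + "\n")
```

The resulting batch_input.jsonl can then be uploaded to the Batch API; each line is an independent request, so the clamp has to hold for every line, not just the first.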
Related topics

Topic (Category, Tags): Replies, Views, Activity
openai.error.InvalidRequestError: Token limit exceeded HOWEVER the input, prompt, and output are far below the token limit (API, api): 5 replies, 6802 views, February 9, 2024
Gpt4 token usage not using more than 3000 tokens even though it's listed at much higher availability (API): 12 replies, 1872 views, December 17, 2023
API | Max Token Error | Tier 4 | Fluctuating between 128000 and 4096 (Bugs, api): 3 replies, 3124 views, November 30, 2023
Chat GPT4 1106 vs ChatGPT 4: Impressive drop in quality (API, gpt-4, chatgpt): 27 replies, 15492 views, February 14, 2024
Struggling to get correct token count (Community, gpt-4, gpt-35-turbo, api): 2 replies, 1817 views, September 4, 2023