No system fingerprints for GPT-4-0613

Hello,

I’m not getting any system fingerprints from gpt-4-0613 completions. I don’t have this issue with gpt-3.5-turbo-0125 or gpt-4-0125-preview. Is this an issue on my end? Upgrading the openai module from 1.7.2 to 1.12.0 made no difference.

Code to reproduce (Python):

>>> test_gpt4_output = client.chat.completions.create(
...     messages=[
...         {
...             'role': 'user',
...             'content': 'hello',
...         }
...     ],
...     temperature=0,
...     seed=2024,
...     model='gpt-4-0613',
...     max_tokens=1,
... )
>>> 
>>> test_gpt4_output
ChatCompletion(id='chatcmpl-8v4ZVhCDphv55UN4o9BoMStXn1HLC', choices=[Choice(finish_reason='length', index=0, logprobs=None, message=ChatCompletionMessage(content='Hello', role='assistant', function_call=None, tool_calls=None))], created=1708612961, model='gpt-4-0613', object='chat.completion', system_fingerprint=None, usage=CompletionUsage(completion_tokens=1, prompt_tokens=8, total_tokens=9))
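For anyone comparing models, a small helper like this (hypothetical, not part of the openai library) makes the check explicit — it just reads the `system_fingerprint` attribute off a completion object:

```python
from types import SimpleNamespace


def has_fingerprint(completion) -> bool:
    """True if the completion object carries a system_fingerprint."""
    return getattr(completion, "system_fingerprint", None) is not None


# With a real client you would pass the result of
# client.chat.completions.create(...) directly; here, stand-in objects
# mimic the gpt-4-0613 response above (the fingerprint string is made up).
gpt4_like = SimpleNamespace(system_fingerprint=None)
turbo_like = SimpleNamespace(system_fingerprint="fp_example123")

print(has_fingerprint(gpt4_like))   # False, matching the output above
print(has_fingerprint(turbo_like))  # True
```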

This is expected: only the newer GPT-4 Turbo models support reproducible outputs. We are working on a new models page to make it clearer which models support which features.


I’m unable to get a consistent system fingerprint using gpt-4-1106-preview or gpt-4-0125-preview.

However, gpt-3.5-turbo-0125 does work, which makes me think my code is okay and there is a bug on the API side?

I also upgraded my openai Python library to 1.12.0, and am using the chat completions endpoint.

@logankilpatrick
Is this a bug? I also encountered it today. I am in the same situation as @lamplighter: system_fingerprint is None despite an identical implementation.

I verified earlier today that some streaming calls to gpt-3.5-turbo returned no fingerprint. Then I investigated:
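To show the kind of check involved (a sketch, not the original investigation): when calling with `stream=True`, each returned chunk has its own `system_fingerprint` field, so collecting the distinct values across a stream reveals whether any fingerprint was ever set:

```python
from types import SimpleNamespace


def stream_fingerprints(chunks):
    """Collect the distinct system_fingerprint values seen across a stream."""
    return {getattr(chunk, "system_fingerprint", None) for chunk in chunks}


# With a real client this would be:
#   chunks = client.chat.completions.create(..., stream=True)
# Here, stand-in chunk objects reproduce the failure mode described above.
chunks = [SimpleNamespace(system_fingerprint=None) for _ in range(3)]
print(stream_fingerprints(chunks))  # {None} — no fingerprint on any chunk
```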

@_j
Thank you for sharing the information. In my case, None is still being returned. Do you know when this will be fixed?

I have no internal insight into why the fingerprint now appears on only one model. I suppose “when it will be fixed” depends on whether it is actually broken, or working exactly as intended now…