Hello, I’m trying to persist information about prompts in a JSONL file so it will be available in the output generated by the GPT API. If I include text in the metadata, will OpenAI charge me for it?
You can easily find out by checking the usage stats. Send one short test request with the metadata and one without, then go to the usage page of your account, where you can see a cost overview for each day and each request, and compare the values.
If you keep the requests short, the cost will be less than one cent.
This will also show you whether it actually works as intended.
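The comparison above boils down to subtracting the prompt-token counts of the two test requests. A minimal sketch (the helper and the numeric figures are hypothetical; the two dicts stand in for the token-usage data of two real API responses):

```python
def prompt_token_delta(usage_without, usage_with):
    """Extra billed prompt tokens caused by the added metadata text."""
    return usage_with["prompt_tokens"] - usage_without["prompt_tokens"]

# Hypothetical usage figures from two otherwise identical test
# requests, one without the metadata text and one with it.
usage_plain = {"prompt_tokens": 20, "completion_tokens": 5, "total_tokens": 25}
usage_meta = {"prompt_tokens": 35, "completion_tokens": 5, "total_tokens": 40}

print(prompt_token_delta(usage_plain, usage_meta))  # → 15 extra billed tokens
```

A nonzero delta with identical completions tells you the metadata text is being billed as part of the prompt.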
Thanks for the reply! I can’t actually see the costs on a per-prompt basis, so I don’t think this will work for me. Let’s see if someone from OpenAI chimes in and can confirm it definitively one way or the other.
The cost of each request is returned to you in the response: the `usage` field of the JSON payload reports the number of prompt and completion tokens, and you are billed for both.
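A minimal sketch of reading that field. The `usage` object and its keys (`prompt_tokens`, `completion_tokens`, `total_tokens`) are part of the real Chat Completions response schema; the rest of the body is trimmed and the numeric values are made up for illustration:

```python
import json

# Trimmed example of a Chat Completions response body.
response_body = json.loads("""
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "choices": [
    {"index": 0, "message": {"role": "assistant", "content": "Hello!"}}
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 3,
    "total_tokens": 15
  }
}
""")

usage = response_body["usage"]
print(usage["prompt_tokens"], usage["completion_tokens"], usage["total_tokens"])
# → 12 3 15
```

Since `usage` comes back with every request, you can log it per prompt even though the usage dashboard only aggregates by day.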
Anything you send to the model is included in the prompt and counted as prompt tokens. This includes function-calling argument names and descriptions, user names, prompt text, and system prompts. If this is the kind of metadata you’re asking about, then yes, more of it will cost more.
Thanks for confirming that! And you’re right, it’s in the response.