Batch API: default values for max_tokens and temperature

Hello,

I am working with the Batch API and I am wondering what the default values for max_tokens and temperature are.

For example, suppose my .jsonl file has two items:

  1. One without temperature or max_tokens set.
  2. One with temperature = 0.9 and max_tokens = 1000.
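For concreteness, the two lines in the batch input file might look like this (the custom_id values and messages are placeholders; the fields follow the standard Batch input format, where each line wraps a Chat Completions request in a "body" object):

```jsonl
{"custom_id": "item-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-4o-2024-08-06", "messages": [{"role": "user", "content": "Hello"}]}}
{"custom_id": "item-2", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-4o-2024-08-06", "messages": [{"role": "user", "content": "Hello"}], "temperature": 0.9, "max_tokens": 1000}}
```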

I am using the endpoint /v1/chat/completions in the Batch API, with model gpt-4o-2024-08-06, which has max_output_tokens = 16,384 according to the documentation.
Does that mean the first item will be assigned the same defaults for max_tokens and temperature as a single request to Chat Completions would get? That is, max_tokens = 16,384 and temperature = 1.0?