Because the probabilities of less likely tokens can be extremely small, returning the log value (an exponent like -9.3) is arguably easier to work with than a tiny decimal like 0.00000003, and log space is also the natural form for temperature scaling and whatever else happens behind the scenes.
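As a quick illustration (my own example numbers, not anything from the API docs), converting a logprob back to a plain probability is a single `exp()` call, and summing logprobs avoids the underflow you'd get from multiplying many tiny floats:

```python
import math

# A logprob is the natural log of the token probability, so converting
# back is just exp().
logprob = -9.3                    # value as it would appear in the API output
probability = math.exp(logprob)   # ~9.1e-05

# Log space also keeps the arithmetic stable: the joint probability of a
# token sequence is the *sum* of logprobs rather than a product of tiny
# floats that can underflow to 0.0.
token_logprobs = [-0.12, -3.4, -9.3]
sequence_prob = math.exp(sum(token_logprobs))

print(f"{probability:.2e}  {sequence_prob:.2e}")
```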
That's answer one. Below is a bit of code for adding normal probabilities to the return object or stream. I haven't adapted it to the new API yet, but it would probably work on a `response.model_dump()` of the returned object.
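Here is a rough sketch of what that adaptation might look like, assuming the Chat Completions endpoint with `logprobs=True` and `top_logprobs`; the helper name `add_probabilities` and the model choice are just placeholders, and the dict layout follows the documented `choices[].logprobs.content[]` shape:

```python
import math
from openai import OpenAI

client = OpenAI()

def add_probabilities(dump: dict) -> dict:
    """Walk the dict from response.model_dump() and add a plain
    'probability' field next to every 'logprob' it contains."""
    for choice in dump.get("choices", []):
        logprobs = choice.get("logprobs") or {}
        for entry in logprobs.get("content") or []:
            entry["probability"] = math.exp(entry["logprob"])
            for alt in entry.get("top_logprobs") or []:
                alt["probability"] = math.exp(alt["logprob"])
    return dump

response = client.chat.completions.create(
    model="gpt-4o-mini",          # example model
    messages=[{"role": "user", "content": "Say hello"}],
    logprobs=True,
    top_logprobs=3,
)

dump = add_probabilities(response.model_dump())
for entry in dump["choices"][0]["logprobs"]["content"]:
    print(entry["token"], f'{entry["probability"]:.4f}')
```

For a streaming response you would apply the same helper per chunk instead of once on the whole object, since each chunk carries its own `logprobs` payload.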