Temperature controls randomness: a low temperature produces less random (more deterministic) output, while a high temperature produces more random output.
More technically, the model's logits are divided by the temperature before the softmax, so a temperature below 1 makes the model more confident in its top choices, while a temperature above 1 decreases that confidence. As the temperature grows, sampling approaches the uniform distribution (total randomness). A temperature of 0 is equivalent to argmax/greedy decoding: the highest-probability token is always picked.
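As a minimal sketch of how this works (assuming NumPy and a toy logits vector; the function name and signature are hypothetical, not any particular library's API), temperature scaling might look like:

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample a token index from logits scaled by temperature (illustrative)."""
    rng = rng or np.random.default_rng()
    if temperature == 0:
        # T = 0 degenerates to argmax (greedy decoding).
        return int(np.argmax(logits))
    # Dividing logits by T < 1 sharpens the softmax; T > 1 flattens it.
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    scaled -= scaled.max()  # subtract max for numerical stability
    probs = np.exp(scaled)
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

# Example: the same logits become near-deterministic at T=0.2
# and close to uniform at T=10.
logits = [2.0, 1.0, 0.5]
print(sample_with_temperature(logits, temperature=0.2))
print(sample_with_temperature(logits, temperature=10.0))
```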
top_p sorts tokens by probability, accumulates their probabilities, and cuts off the candidate set as soon as that cumulative total exceeds the value of top_p. For example, a top_p of 0.3 means that only the tokens comprising the top 30% of probability mass are considered.
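A minimal sketch of this filtering step, assuming NumPy and an already-normalized probability vector (the helper name is hypothetical, used only for illustration):

```python
import numpy as np

def top_p_filter(probs, top_p=0.3):
    """Keep the smallest set of tokens whose cumulative probability
    reaches top_p, then renormalize over that set (illustrative)."""
    probs = np.asarray(probs, dtype=np.float64)
    order = np.argsort(probs)[::-1]       # sort token indices by descending probability
    cumulative = np.cumsum(probs[order])  # running probability mass
    # First position where the running total reaches top_p;
    # everything up to and including it stays in the candidate set.
    cutoff = int(np.searchsorted(cumulative, top_p)) + 1
    keep = order[:cutoff]
    filtered = np.zeros_like(probs)
    filtered[keep] = probs[keep]
    return filtered / filtered.sum()      # renormalize over the kept tokens

# Example: with top_p=0.3, only the 0.5 token survives,
# since it alone exceeds 30% of the probability mass.
print(top_p_filter([0.5, 0.3, 0.2], top_p=0.3))
```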