Can prompt design enhance a model's planning and reasoning when using function calling?

I have gotten responses that combine both a user-readable answer and a function call, so yes, it is supported.

Output (with a python function capability): a response containing both a user-readable answer and an invocation of a function looks like

content: AI explanation of how code could be written to solve the problem, ending with "let's see that in action"
function: python code to calculate and return the answer

or

content: explanation of how the AI would phrase a request for a drawing of your picture
function: call to an image-generation API
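To make the shape concrete, here is a minimal sketch of parsing such a combined response. The field names follow the Chat Completions message format; the message values themselves are invented for illustration.

```python
import json

# A sketch of an assistant message that contains BOTH a user-readable
# explanation and a function call. Field names follow the Chat
# Completions format; the content and arguments are invented examples.
message = {
    "role": "assistant",
    "content": "To find the 20th Fibonacci number, I'll iterate pairwise. "
               "Let's see that in action.",
    "tool_calls": [
        {
            "id": "call_0",
            "type": "function",
            "function": {
                "name": "python",
                "arguments": json.dumps(
                    {"code": "a,b=0,1\nfor _ in range(20): a,b=b,a+b\nprint(a)"}
                ),
            },
        }
    ],
}

def split_message(msg):
    """Return (explanation, [(function_name, parsed_args), ...])."""
    explanation = msg.get("content") or ""
    calls = [
        (tc["function"]["name"], json.loads(tc["function"]["arguments"]))
        for tc in msg.get("tool_calls", [])
    ]
    return explanation, calls

explanation, calls = split_message(message)
```

The point is simply that `content` and the function call arrive in the same message, so your handler should surface the explanation to the user and dispatch the call rather than treating them as mutually exclusive.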

You can give a prompt that is specifically designed to evoke this response, where the first output containing the thought process can improve the quality of the function call that follows:

  • first, explain the steps required to obtain the answer, and the tools you can use to enhance your answer quality; then
  • to the best of your abilities, try to answer the question without the assistance of functions; then
  • after your explanation, actually perform the task or calculation you described.
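The steps above can be encoded as a system prompt. This is a sketch only: the exact wording, the helper name `build_messages`, and the sample question are all placeholders to experiment with.

```python
# Hypothetical system prompt encoding the three steps above; the exact
# wording is an assumption and worth tuning against your own inputs.
STEP_BY_STEP_SYSTEM = (
    "Before calling any function:\n"
    "1. Explain the steps required to obtain the answer, and which tools "
    "could enhance answer quality.\n"
    "2. To the best of your abilities, attempt an answer without the "
    "assistance of functions.\n"
    "3. After your explanation, perform the described task or calculation "
    "via a function call."
)

def build_messages(user_question):
    """Pair the step-by-step system prompt with the user's question."""
    return [
        {"role": "system", "content": STEP_BY_STEP_SYSTEM},
        {"role": "user", "content": user_question},
    ]

msgs = build_messages("What is 37 factorial?")
```

You could equally place the instructions in the user role instead of the system role; as the next paragraph notes, which placement works better is something to determine empirically.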

Whether this type of output is included is as unpredictable as function calling itself, and whether the instructions belong in the system role or the user role may depend on your own experimentation, the quality of the answers you get, and the user inputs you are taking; still, the step-by-step instructions should give a much higher chance of user-visible output alongside the function call.