Do we have to send the giant default prompt in every single API request?

In the API's examples section, some of the sample prompts are huge. My question is: when we build our application, do we have to pass this giant prompt in every single request in order to get a response?

Here is an example prompt:

```
I am a highly intelligent question answering bot. If you ask me a question that is rooted in truth, I will give you the answer. If you ask me a question that is nonsense, trickery, or has no clear answer, I will respond with "Unknown".

Q: What is human life expectancy in the United States?
A: Human life expectancy in the United States is 78 years.

Q: Who was president of the United States in 1955?
A: Dwight D. Eisenhower was president of the United States in 1955.

Q: Which party did he belong to?
A: He belonged to the Republican Party.

Q: What is the square root of banana?
A: Unknown

Q: How does a telescope work?
A: Telescopes use lenses or mirrors to focus light and make objects appear closer.

Q: Where were the 1992 Olympics held?
A: The 1992 Olympics were held in Barcelona, Spain.

Q: How many squigs are in a bonk?
A: Unknown

Q: Where is the Valley of Kings?
A:
```

Won't this affect our application's performance?
Is there another way to handle this?

This is just an example. The size of the input prompt usually does not affect performance in a negative way; in fact, with samples and well-written instructions you can make the output more deterministic and targeted. The only thing to keep in mind is that the context window is 8k tokens, so the input plus the output text will have to fit within that limit.
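A common pattern that follows from this is to keep the few-shot instruction text as a reusable constant and prepend it to each new question, while budgeting tokens against the context window. The sketch below is a minimal illustration, assuming a hypothetical shortened prefix and a rough 4-characters-per-token heuristic; an exact count requires the model's actual tokenizer.

```python
# Assumption: FEW_SHOT_PREFIX is a shortened stand-in for the full example
# prompt above; the ~4 chars/token estimate is a rough English-text heuristic.

FEW_SHOT_PREFIX = (
    "I am a highly intelligent question answering bot. If you ask me a "
    "question that is nonsense, I will respond with \"Unknown\".\n\n"
    "Q: Where were the 1992 Olympics held?\n"
    "A: The 1992 Olympics were held in Barcelona, Spain.\n\n"
)

def build_prompt(question: str) -> str:
    """Prepend the reusable few-shot prefix to each new question."""
    return f"{FEW_SHOT_PREFIX}Q: {question}\nA:"

def rough_token_count(text: str) -> int:
    """Very rough estimate: about 4 characters per token for English text."""
    return max(1, len(text) // 4)

def fits_context(prompt: str, max_output_tokens: int,
                 context_window: int = 8000) -> bool:
    """Check that the prompt plus the reserved output budget fits the window."""
    return rough_token_count(prompt) + max_output_tokens <= context_window

prompt = build_prompt("Where is the Valley of Kings?")
print(fits_context(prompt, max_output_tokens=256))  # prints True
```

Keeping the prefix in one constant means every request pays the same, predictable token cost for the instructions, and the check makes the 8k input-plus-output constraint explicit instead of failing at request time.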