I saved the instruction prompt of one of my Custom GPTs into a text file and uploaded it as a knowledge file for another GPT, so someone else could use it as reference material when customizing a new GPT with similar functionality but different properties. But this created a problem: now I have to test every one of my instruction prompts against injection prompts. (All of my Custom GPTs are instructed to refuse to reveal their data. I now think that protection comes from the part of the instructions that requires the use of internal knowledge.)
Still, one injection prompt manages to get into the instruction prompt file, and the answer comes out exactly as what is in the file. How is that possible?