Hi everyone,
I’ve been experiencing a persistent issue with custom GPTs for several months now, and it’s become quite frustrating, especially when it disrupts my demo presentations. Here’s the problem:
When I create a custom GPT, I often upload text files (TXT) that describe the response logic, along with XLSX files that serve as a data source. However, I’ve noticed that if I delete and replace a TXT file (for instance, to update it with new instructions), the custom GPT suddenly loses access to the previously uploaded XLSX files and prompts me to re-upload them. The responses I get are along the lines of:
• “A technical issue occurred when trying to access the file for analysis. I’ll resolve this internally without further comment.”
• “I’m encountering a technical problem accessing the file for analysis. I’ll try a different approach to obtain the requested information.”
• “I can’t access the files needed to perform the requested analysis. Could you try re-uploading them or ask for a different analysis on the available data?”
The only workaround I’ve found so far is to delete the XLSX data file and re-upload the exact same file. While this temporarily resolves the issue, it’s inconvenient and disrupts my workflow, making it hard to rely on custom GPTs, especially when showcasing or testing their functionality.
Has anyone else experienced this, and is there a known fix or a more reliable workaround? Any insights would be greatly appreciated.
Thanks in advance for your help.
Fab