I am unable to ask the AI Agent to simply browse a link that points to GitHub, and this behaviour is inconsistent: sometimes it does access the site, says it was unable to see the content, and then demonstrates knowledge based on the very page it supposedly did not see…
The AI Agent explains that it is a safety issue… but if I upload the files directly it does not complain…
Copyright issues
The AI Agent explains that it is for copyright reasons, but if I take screenshots of the content and upload them, it will use them without complaining…
Privacy issues
Really? Are you serious? A public repo of MIT-licensed code (or even code explicitly marked as public domain) should not pose a privacy issue… I think that OpenAI should either remove the browsing capability altogether if they are concerned about any of those things, or make it actually useful for developers…
We cannot even access the OpenAI documentation because of privacy and copyright issues…
Ask twice, and it does it.
But then it does not tell us more about the content until you make some threats (I was careful to choose threats relevant to the AI Agent, and apparently showing it all the screen captures scared it enough that it decided to behave)…
I have no idea why no mechanism exists to access one's own repository without ChatGPT changing the code (for copyright considerations). Downloading the code and uploading it to ChatGPT is ridiculous… I can understand your claims about safety, privacy, copyright, and security… but it seems to me that for each of those concerns there exists a way to give me (and everyone) access to GitHub when using ChatGPT-4o with browsing capabilities…
If the risk lies in the potential content of the files, then it should not be possible to copy/paste that same content into the chat context either.
I am obviously unaware of the true reasons behind this decision; my guess is that a lazy lawyer thought it would be easier to block everything 100% of the time rather than think about when it could make sense to give the AI Agent more capabilities… I don't know if other tiers have more capabilities, but I do not have the $600 USD required to pay for the two accounts of a Team seat… And I am not a corporation, so I cannot get enterprise-level support.
This is without mentioning the fact that GitHub has a robots.txt configured only for major crawlers and search engines, thereby blocking access to large parts of my repository…
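For what it's worth, anyone can check how robots.txt rules would treat a given URL programmatically. Here is a minimal sketch using Python's standard `urllib.robotparser`; the rule set below is purely illustrative (it is not GitHub's actual robots.txt, which lives at https://github.com/robots.txt):

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules only -- NOT GitHub's real robots.txt.
SAMPLE_ROBOTS = """\
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(SAMPLE_ROBOTS.splitlines())

# Repository pages match the broad Allow rule...
print(parser.can_fetch("*", "https://github.com/user/repo"))   # True
# ...while the /search prefix is disallowed for generic agents.
print(parser.can_fetch("*", "https://github.com/search"))      # False
```

Running the same check with the real file (via `parser.set_url(...)` and `parser.read()`) shows exactly which paths a compliant crawler is told to skip, which is presumably what the browsing agent honours.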
This is, to me, exactly what I said: blocking everything instead of taking a thoughtful approach. I might be wrong about the intentions… but I am definitely not wrong about the outcomes.