I’m currently using ChatGPT through my company’s subscription, and I created a custom GPT under my individual account within this subscription. In this setup, would my GPT have any access to, or make use of, data from other accounts within the same company (i.e., under the same subscription)?
Some colleagues are skeptical about responses generated by ChatGPT, so I’d appreciate a professional perspective. Please provide accurate information on this matter.
Oh, and welcome to the community. It is an interesting rabbit hole.
Also, so you know: workspace GPTs don’t train the base model unless you give permission. All data is kept separate, so closed chats are gone if you delete them; if archived, they can be restored and viewed. But seats don’t link to each other.
Got it, thank you. So, if a malicious actor were to attempt prompt injection or a similar exploit, enabling this feature would ensure that conversation histories or information from other people within the same organization wouldn’t be exposed, correct?
They would not be exposed. Each chat instance is a separate entity; chat instances do not share data.
Each chat with GPT is like having a private conversation in a sealed room—it doesn’t carry over to other sessions, and nothing is remembered once the chat ends. Each time you start a new chat, it’s a completely fresh instance with no memory of previous conversations. GPT is designed with built-in privacy controls to keep information contained within each session, so there’s no sharing of data across chats or between users. This means each interaction is isolated, secure, and entirely private.
In Enterprise, conversations are private unless you “share” a chat, which can allow others to view and continue THAT SAME CHAT.
GPTs - you basically can’t trust them to keep any internals secret; they’ll blab about any information they’ve been given. They can be shared privately within an organization, and an administrator has some oversight of this ability.
Only one person and one seat can build a particular GPT.
To elaborate on what @_j is saying: if you make a custom GPT with “knowledge” or instructions, they are hard to keep secret.
If you add instructions or upload documents as knowledge, be prepared for any user of your custom GPT to be able to see those instructions and that knowledge.
They say this is a function, and @-mentioning a chat by name used to work in Enterprise, but it has not worked on my workspace for a month or so now. There is a post I read about some Enterprise functions being locked, like groups and analytics.
Maybe because I am a single seat? @_j
The chat share is the link at the top where you can share a link to the public. It acts differently for administered Enterprise organizations.
You don’t have other seats to explore this feature in use, and companies that pay $100k might get something different from those that simply ask to share a higher limit of GPTs.
Yes, but that is just a link; Team and Plus can do it too.
He is talking about this: if you click it, it makes a link, and you can share it with anyone else who has an account… They don’t have to be part of your workspace, but it is not automatic. You have to click it and copy the link…
You really think he would share this with the share button? “So, if a malicious actor were to attempt prompt injection or a similar exploit”
And sharing doesn’t let other chat instances or GPTs read other instances or GPTs, which is the user’s question.
I am sorry for the confusion. A GPT cannot share a session’s data with a separate GPT or a separate session instance. If you use share, you can share an active chat session with a link, but that link won’t work if you paste it into a GPT.