We’re launching ChatGPT Enterprise, which offers enterprise-grade security and privacy, unlimited higher-speed GPT-4 access, longer context windows for processing longer inputs, advanced data analysis capabilities, customization options, and much more.
Enterprise-grade security and privacy
- Customer prompts and company data are not used for training OpenAI models
- Data encryption at rest (AES 256) and in transit (TLS 1.2+)
- Certified SOC 2 compliant
Features for large-scale deployments
- Admin console with bulk member management
- SSO
- Domain verification
- Analytics dashboard for usage insights
The most powerful version of ChatGPT yet
- Unlimited access to GPT-4 (no usage caps)
- Higher-speed performance for GPT-4 (up to 2x faster)
- Unlimited access to advanced data analysis (formerly known as Code Interpreter)
- 32k token context windows for 4x longer inputs, files, or follow-ups
- Shareable chat templates for your company to collaborate and build common workflows
- Free credits to use our APIs if you need to extend OpenAI into a fully custom solution for your org
No pricing announced; you have to “Contact Sales” at openai.com/enterprise to get started.
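To get a feel for the 32k context and the API-credits bullet, here’s a rough sketch of what a long-context call looks like from the Python client (the 0.x `openai` library that’s current right now). The `gpt-4-32k` model name and the input file are assumptions on my part, and availability depends on your account:

```python
# Rough sketch, not from the announcement: assumes the openai 0.x Python
# library and that your account has access to a 32k-context model.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# Hypothetical long document that wouldn't fit in an 8k context.
with open("quarterly_report.txt") as f:
    report = f.read()

response = openai.ChatCompletion.create(
    model="gpt-4-32k",  # assumed model name; availability varies by account
    messages=[
        {"role": "system", "content": "You are an analyst for our org."},
        {"role": "user", "content": f"Summarize the key risks in this report:\n\n{report}"},
    ],
)
print(response["choices"][0]["message"]["content"])
```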
I’m still completely confused. Microsoft already released an enterprise ChatGPT.
Wait. Their GitHub was taken private / removed?
One key differentiator between ChatGPT Enterprise and the consumer-facing version: ChatGPT Enterprise will allow clients to input company data to train and customize ChatGPT for their own industries and use cases, although some of those features aren’t yet available in Monday’s debut. The company also plans to introduce another tier of usage, called ChatGPT Business, for smaller teams, but did not specify a timeline.
Yup. I had a feeling that they would eventually release their own RAG system. How does such a small company do so much?
One concrete example is Code Interpreter, a ChatGPT Plus feature that has since been renamed to Advanced Data Analysis.
Wat
I briefly tried it and it was pretty rough around the edges. It was just a ChatGPT clone that ran in your Azure instance. It supported users in your org through Active Directory, but everyone could see everyone else’s past conversations (likely a bug, but it gives an idea of the project’s lack of polish). No support for adding your own data, a prompt library, custom system prompts, etc.
Thanks for the clarification. This is quite the beast. When I was presenting ChatGPT to corporate, it was Code Interpreter that they would salivate over. 32k tokens & privacy with it now, damn.
Raw data files are 'bout to rock some socks off.
I’m slightly cheesed that I still can’t use Code Interpreter without turning conversation history back on though.
I’m looking forward to seeing the (small?) business tier. Having a built-in RAG system would be amazing, especially if it can handle the tough things I had difficulty accomplishing, like comparing and aggregating large documents (to be fair, I stopped trying that after they released plugins and realized that OpenAI is going after everything).
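In case it helps to picture what I mean by a RAG system, the core loop is just: embed your document chunks, retrieve the few closest to the question, and hand only those to the model. A minimal sketch with the current Python client; the model names, chunking, and `TOP_K` are placeholder assumptions, and a real setup would use a proper vector store rather than an in-memory list:

```python
"""Minimal retrieval-augmented generation (RAG) loop -- sketch only."""
import openai  # expects OPENAI_API_KEY in the environment

EMBED_MODEL = "text-embedding-ada-002"  # assumed embedding model
CHAT_MODEL = "gpt-4"                    # assumed chat model
TOP_K = 4                               # how many chunks to hand to the model

def embed(texts):
    # Embed a list of strings in one call; returns one vector per string.
    resp = openai.Embedding.create(model=EMBED_MODEL, input=texts)
    return [d["embedding"] for d in resp["data"]]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def answer(question, chunks):
    # Rank the document chunks by similarity to the question.
    chunk_vecs = embed(chunks)
    q_vec = embed([question])[0]
    ranked = sorted(zip(chunks, chunk_vecs),
                    key=lambda cv: cosine(q_vec, cv[1]), reverse=True)
    context = "\n\n---\n\n".join(c for c, _ in ranked[:TOP_K])

    # Only the retrieved chunks go into the prompt, not the whole corpus.
    resp = openai.ChatCompletion.create(
        model=CHAT_MODEL,
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp["choices"][0]["message"]["content"]
```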
If anyone with authority is reading this:
- Please create a special readme-like filetype that GPT is “forced” to read first when it’s found inside a zip file. Right now I get away with GPT_INSTRUCTIONS_READ_ME_FIRST, but it’s unprofessional. Maybe something like “instructions.gpt”? That way I can package raw data along with instructions for exploring it and hand it to corporate.
  - To add: I immediately realized that large data files eat tokens and lead to hallucinations, so I split them up and include instructions for GPT on where to find the data and how to aggregate it (roughly the approach sketched below).
  - It also adds a nice “intro” that asks what the user’s profession is and suggests some potential insights that could be found in the data. So please, get 'er done.
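For reference, this is roughly how I package things today. The `GPT_INSTRUCTIONS_READ_ME_FIRST.txt` name, the chunk size, and the file names are just my own conventions, nothing official:

```python
"""Package split data files plus an instructions file for Advanced Data Analysis."""
import csv
import io
import zipfile

CHUNK_ROWS = 5_000  # rows per split file; tune so each part stays small

INSTRUCTIONS = """\
Read this file before touching any data.
1. The original table was split into sales_part_*.csv (same header in each part).
2. To aggregate, load every part and concatenate before grouping.
3. Start by asking the user for their role and suggest 2-3 insights relevant to it.
"""

def package(source_csv: str, archive_name: str = "data_package.zip") -> None:
    with open(source_csv, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        rows = list(reader)

    with zipfile.ZipFile(archive_name, "w", zipfile.ZIP_DEFLATED) as zf:
        # Instructions go in first; the loud filename is what gets them read before the data.
        zf.writestr("GPT_INSTRUCTIONS_READ_ME_FIRST.txt", INSTRUCTIONS)
        # Split the big table into small parts, each with its own header row.
        for i in range(0, len(rows), CHUNK_ROWS):
            buf = io.StringIO()
            writer = csv.writer(buf)
            writer.writerow(header)
            writer.writerows(rows[i:i + CHUNK_ROWS])
            zf.writestr(f"sales_part_{i // CHUNK_ROWS:03d}.csv", buf.getvalue())

if __name__ == "__main__":
    package("sales.csv")  # hypothetical source file
```

Upload the resulting zip to Advanced Data Analysis; that, plus the shouty filename, is what’s been getting the instructions read first so far.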
Hello, I’ve recently joined the OpenAI forum and I’m interested in learning more about the enterprise subscription. As of now, I haven’t seen any options within my current subscription to upgrade or adjust to the enterprise version. Can anyone provide insights or updates on when this will be available?
Thanks in advance.