What would be the most impactful new Code Interpreter feature for you, today?
- API endpoint
- gpt-3.5-turbo-16k
- Additional languages
- Custom execution environments
- Persistent storage
- Internet access
- gpt-3.5-turbo-16k: For speed, unlimited messages, and an increased context window.
- Additional languages: e.g. JavaScript, Julia, MATLAB, PHP, R, Ruby, bash.
- Custom execution environments: e.g. Azure Notebooks, Google Colab, local machine.
- Persistent storage: e.g. 100 MB permanent allocation or connected cloud storage.
- Internet access: e.g. Browse with Bing™, the ability to download documentation and data, call API endpoints, etc.
Anything else on your wishlist? Post it below! 
We’ve got a three-way tie haha…
Interested in seeing the results in a week or two…
I’ve voted for Internet access since you could also do persistent storage with that. If no internet, then persistent storage—OneDrive/Dropbox-style, pay per month for 100 GB or something.
Yeah, after I moved this poll from ChatGPT/Feature-Requests I was re-reading it and realized how much overlap there could be among many of these.
Like, custom execution environments could enable internet access and would absolutely include persistent storage and almost certainly additional languages. Depending on the form it takes, internet access could, as you correctly identified, cover storage—but also possibly additional languages if you were to spin up an execution environment API it could call, and so on.
But, I decided to leave it as it was because, for instance, some people might want persistent storage but not want to deal with the hassle of implementing it for ChatGPT to access via unfettered internet.
I think the easiest way to grant persistent storage would be to allow users to link ChatGPT as an application with a cloud storage service (probably OneDrive given the Microsoft partnership) and when a user instantiates a CI session, their home directory is just a symlink to a directory in the cloud.
I think this would be “fairly” trivial to implement and opens up all sorts of exciting possibilities.
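To make the symlinked-home idea concrete, here's a minimal sketch of what the session setup might look like. This assumes the cloud storage is already synced to a local directory; the paths are purely illustrative (temp dirs are used here so the sketch runs without root):

```shell
# Hypothetical sketch: when a CI session starts, point the session's
# home directory at a synced cloud-storage directory via a symlink.
# Both paths below are stand-ins for illustration only.
CLOUD_MOUNT="$(mktemp -d)"         # stands in for the synced OneDrive dir
SESSION_HOME="$(mktemp -d)/home"   # stands in for the CI session's home

# Make "home" a symlink into cloud storage
ln -s "$CLOUD_MOUNT" "$SESSION_HOME"

# Anything the session writes to its home actually lands in cloud
# storage, so it survives the session:
echo "persisted" > "$SESSION_HOME/notes.txt"
cat "$CLOUD_MOUNT/notes.txt"   # -> persisted
```

Because the symlink is transparent to the sandboxed Python process, nothing about the session's tooling would need to change—files just persist.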
I suspect internet access will ultimately run away with it because there are so many possibilities it enables—even if it’s just search.
Personally, I chose gpt-3.5-turbo-16k because I am confident in its coding ability, it is so much faster, there’s no message cap, and the longer context window would be incredibly helpful for larger, more complicated projects.
But, I totally see the benefit of internet access—especially as we get further and further away from the knowledge cutoff—since there are often breaking changes to libraries and packages, deprecations, etc. It would certainly help if Code Interpreter could Bing error messages and get current solutions.