I am in a statistics class that relies heavily on using Microsoft Excel. I have access to Codex, and I was wondering how I would plug Codex into Excel so that I can type in natural language commands and Excel will do what I typed in, similar to the MS Word demo in the Codex example videos. Is this possible? I appreciate any help.
Thank you for the response! However, I’m still a bit confused - how would I call the API from within Excel?
The GitHub repo shared by @m-a.schenk is a great, functional piece of code to get started with integrating OpenAI into an Excel add-in. Additionally, here are some docs to help you get started:
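To make the API-call question above concrete, here is a minimal sketch of the JSON payload an add-in would POST to the Completions endpoint. The prompt wording, model name, and parameter choices are illustrative assumptions, not the repo's actual code; the add-in layer (VBA, Office.js, etc.) would handle the HTTP request itself.

```python
import json

# Hypothetical sketch: building the JSON body for a Completions call that
# an Excel add-in would POST to https://api.openai.com/v1/completions
# with an "Authorization: Bearer <API key>" header.

def build_payload(instruction: str, model: str = "code-davinci-002",
                  max_tokens: int = 256) -> str:
    """Turn a natural-language Excel request into a Completions payload."""
    prompt = (
        "# Translate the instruction below into a single Excel formula.\n"
        f"# Instruction: {instruction}\n"
        "# Formula:"
    )
    return json.dumps({
        "model": model,          # model name is an assumption for this sketch
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0,        # deterministic output suits formula generation
        "stop": ["\n"],          # stop at the end of the formula line
    })

# The add-in would send this payload and paste choices[0].text into the cell.
payload = build_payload("average of column B where column A says 'Q1'")
```

The key design point is keeping the prompt short and ending it at "# Formula:", so the model's completion is exactly the text you want to place in the cell.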
Excited to see what you build.
Thank you! I will definitely check these out.
I’m not sure how much knowledge Codex has of Excel functions specifically; however, I do see that there’s a Google Sheets integration that seems to do quite well at picking up on patterns:
Google Sheets GPT-3 Demo
To add to @sps’s advice, the data you provide to Codex will ultimately be the key to getting the desired output. Given that you only have 4,096 tokens to work with, including your input data, you may have difficulty squeezing in all the Excel functions, their descriptions, and the input data needed for good results, so a single Completion call to the Codex model would be a tough fit. However, if we expand the scope of the project to include the text-based engines, such as Davinci, we can leverage features that aren’t available to Codex to improve the results. Breaking the overall task into stages and dividing it across multiple GPT-3 calls also gives GPT-3 (and us) more resources to tackle the entire project.
By taking advantage of the Answers endpoint, which would allow uploading all of the Excel functions along with their descriptions, use cases, and some useful examples, GPT-3 would be able to produce the best Excel function for the task. Also, because Excel was designed to be quite user-friendly, I believe it’d be well worth fine-tuning the Davinci model to produce code or pseudo-code that can then be fed to Codex along with the input data, freeing up as much of the 4,096-token limit as possible so Codex can analyze more of your input!
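The two-stage idea above can be sketched as a small pipeline: a "planner" call (the fine-tuned Davinci model) compresses the request into terse pseudo-code, and a second call (Codex) expands that pseudo-code into a formula. Both functions here are stand-ins with hard-coded outputs, assumed for illustration; in practice each would be an API call.

```python
# Hedged sketch of the staged Davinci -> Codex pipeline suggested above.
# Both "model" functions are stubs; real code would call the API twice.

def plan_with_davinci(request: str) -> str:
    """Stage 1 (stub): compress the user's request into short pseudo-code."""
    # A real call would use a fine-tuned Davinci model, as suggested above.
    return f"FILTER rows by condition; AGGREGATE as asked: {request}"

def formula_with_codex(pseudo_code: str) -> str:
    """Stage 2 (stub): expand the pseudo-code into an Excel formula."""
    # A real call sends only the short plan, saving tokens for input data.
    return '=AVERAGEIF(A:A, "Q1", B:B)'  # illustrative output only

def natural_language_to_formula(request: str) -> str:
    plan = plan_with_davinci(request)   # short, token-cheap intermediate
    return formula_with_codex(plan)     # Codex sees the plan, not the raw data

formula = natural_language_to_formula("average column B where A is 'Q1'")
```

The benefit is that each call stays well inside the token limit, since Codex only ever sees the compact plan rather than the full description of every Excel function.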
I was surprised to find there wasn’t already an Excel plugin specifically! Let me know if this helps!
Edit 1: Fixed post structure and grammar. I was on my phone at the time of writing this, so my apologies.
Maybe you could fork LibreOffice and create a brand-new product.
There is some software, like Airtable, that is reinventing spreadsheets.
Umhhh…I have a small difference in perspective here.
For general tasks like "run operations on this column", "make a graph", "run a regression on this data", etc., I don’t think it will be necessary for Codex to read the entire file.
That part will be handled by Excel itself.
All Codex needs to do is the logic part, i.e. generate the appropriate code.
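A quick sketch of this point: the prompt only needs a compact description of the sheet (column headers), never the row data, since Excel evaluates whatever formula comes back. The prompt format and helper name here are assumptions for illustration.

```python
# Sketch of the idea above: Codex sees only column metadata plus the
# user's command -- Excel itself runs the resulting formula on the data.

def codex_prompt(headers: list[str], command: str) -> str:
    """Build a prompt from column headers only; no row data is included."""
    # Map headers to Excel column letters: A, B, C, ...
    schema = ", ".join(f"{chr(65 + i)}={h}" for i, h in enumerate(headers))
    return (
        f"# Columns: {schema}\n"
        f"# Task: {command}\n"
        "# Excel formula:"
    )

prompt = codex_prompt(["Region", "Sales", "Date"],
                      "run a regression of Sales on Date")
```

Even for a workbook with millions of rows, the prompt stays a few dozen tokens, which is why the file size doesn't matter to the model.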
I also saw a Codex-powered startup that promises to do the same for MySQL, Postgres, etc.
Please do correct me if I’m wrong.
I’d be interested to hear people’s recommendations on how best to handle Excel formula outputs. I own excelformulabot.com, which is based on the Davinci model. I just got access to Codex, but I haven’t gotten great results from it thus far.
How would you approach my scenario: 250K records of requests, with roughly 20% of them resulting in incorrect responses?
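One hedged approach to a 20% failure rate is to score generated formulas against a small set of hand-labelled request/formula pairs, and only accept (or ship) a prompt variant when it clears a threshold; failures get flagged for review or retried. `generate_formula` below is a stub standing in for the real Davinci/Codex call, and the labelled cases are made up for illustration.

```python
# Sketch: measure formula accuracy against a labelled sample before
# trusting the model on the full 250K records. generate_formula is a
# stub for the actual API call.

def generate_formula(description: str) -> str:
    """Stub for the model call; always returns the same formula here."""
    return "=SUM(A1:A10)"  # illustrative output only

def accuracy_on(labelled: dict[str, str]) -> float:
    """Fraction of labelled cases where the model reproduces the answer."""
    hits = sum(generate_formula(desc) == expected
               for desc, expected in labelled.items())
    return hits / len(labelled)

# Hand-labelled request -> expected-formula pairs (hypothetical).
labelled_cases = {
    "sum of A1 through A10": "=SUM(A1:A10)",
    "average of column B": "=AVERAGE(B:B)",
}
score = accuracy_on(labelled_cases)  # 0.5 with this always-SUM stub
```

With a scored sample like this you can compare prompt variants (or Davinci vs. Codex) on the same cases, and route low-confidence requests to a fallback instead of returning a wrong formula.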