Is it possible to fine-tune either of the codex models? I’d love to play with some block-based coding datasets. The stock davinci model seems to know a bit about the structure/internals of Blockly, but doesn’t seem to have seen many samples of blocks and what they do in various contexts.
I could try a really long prompt with them, but I’ve had such good outcomes with fine-tuning that I’d love to try that as well.
Greetings @bcjordan! I too am waiting for this powerful option. (Keeping an eye on this thread.)
“Evaluating Large Language Models Trained on Code” https://arxiv.org/pdf/2107.03374.pdf
According to the paper, Codex is a GPT language model fine-tuned on publicly available code from GitHub (Python).
I don’t know whether further fine-tuning would have the desired effect; perhaps it would for data that isn’t available on GitHub?
I’m writing an app focused on a specific field of coding, so targeted fine-tuning is something I’m looking for myself.
I write a tool for coding automated tests in Selenium.
Same here. I’m using it for astrophysics and biological programming; if I could fine-tune it on some of the models we have, I think it would perform much better than it does now.
I don’t think it is yet; I asked a week ago and was told it’s not available. I suspect it’s only a matter of time (and perhaps money).
const currentWorksheet = context.workbook.worksheets.getActiveWorksheet();
const table = currentWorksheet.tables.getItemAt(0);
//add an AutoSum for each of the 5 columns of the table
In the above example, there is no “summary” field for the TableColumn in the office-js API, and there certainly is no addAutoSum() method. All of that was confabulated. I do provide a reference to the office-js API at the beginning of the prompt. I even tried providing several working code examples with the correct API calls so that it gets a better sense of the API, and providing those examples does help a bit. But the input length is limited, so if we could fine-tune on many code snippets from a particular API, it could significantly increase the probability of generating correct code.
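The in-prompt workaround described above (API reference plus worked examples, within a limited context) can be sketched roughly like this. Everything here is hypothetical: `buildPrompt` and `MAX_PROMPT_CHARS` are illustrative names, and a character count stands in for the model’s real token limit.

```javascript
// Hypothetical sketch: pack an API reference plus worked examples
// into a prompt, dropping the oldest examples when the context
// budget is exceeded. A character count approximates the token limit.
const MAX_PROMPT_CHARS = 2000;

function buildPrompt(apiReference, examples, task) {
  // Always keep the API reference and the task; trim examples to fit.
  const assemble = (exs) => [apiReference, ...exs, task].join("\n\n");
  let kept = examples.slice();
  while (kept.length > 0 && assemble(kept).length > MAX_PROMPT_CHARS) {
    kept.shift(); // drop the oldest example first
  }
  return assemble(kept);
}

// Usage with office-js-style snippets as few-shot examples:
const reference =
  "// office-js: worksheets.getActiveWorksheet(), tables.getItemAt(i) ...";
const examples = [
  "// Example: get the active worksheet\nconst ws = context.workbook.worksheets.getActiveWorksheet();",
  "// Example: get the first table\nconst table = ws.tables.getItemAt(0);",
];
const prompt = buildPrompt(
  reference,
  examples,
  "// add a SUM formula below each column of the table"
);
```

The design trade-off is exactly the one described in the post: every example spends context budget, so the more of the API you want to demonstrate, the fewer examples fit, which is why fine-tuning on many snippets would be preferable.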