Training a GPT produced by the GPT Builder tool

How do I train a specialized GPT produced by GPT Builder? Currently, it appears the only way to do so is through the GPT Builder ‘context’, since there is no GPT Builder plug-in that facilitates interaction with OpenAI for fine-tuning. What is the memory limit of the GPT Builder ‘context’? Thanks in advance. [Email Redacted]

BTW, I probably meant the disk space limit rather than just memory. Also, this seems to be a domain-agnostic issue, as every domain would need to refresh the context over time to keep the GPT’s knowledge up to date.

The GPT “builder” performs no training or fine-tuning. It only writes or replaces instructions by following a scripted interview; the text it produces is the same text you can write yourself in the “configure” tab.

The maximum the AI model can produce at once is 4096 tokens, and there’s a bit of overhead in emitting language to a tool. It may have the same output cutoff as normal ChatGPT, though: a 2048-token limit.

GPTs are only available for creation within ChatGPT Plus and can only be shared with other ChatGPT users.

On the API, an AI model can be fine-tuned using example inputs and outputs so that it responds to input in a different manner. This is not a way to directly give it new knowledge that can be repeated back; rather, it imparts behavior or skill on a task, up to the point where no instructions are needed to evoke the new style of output.
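As a concrete illustration of the input/output pairing above, here is a minimal sketch of building a fine-tuning training file. It assumes the chat-style JSONL format (one JSON object per line, each containing a `messages` array); the example contents and the `train.jsonl` filename are hypothetical.

```python
import json

# Hypothetical training examples: each pairs an input (user message)
# with the desired output (assistant message). The point is to teach
# a behavior or style, not to store facts for later recall.
examples = [
    {
        "messages": [
            {"role": "user", "content": "Summarize: The meeting ran long."},
            {"role": "assistant", "content": "TL;DR: meeting overran."},
        ]
    },
    {
        "messages": [
            {"role": "user", "content": "Summarize: Sales rose 5% in Q3."},
            {"role": "assistant", "content": "TL;DR: Q3 sales +5%."},
        ]
    },
]

# Serialize as JSONL: one JSON object per line, as a fine-tuning
# upload typically expects.
with open("train.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")

print(f"wrote {len(examples)} training examples")
```

With many such pairs in a consistent style, the tuned model starts producing that style on new inputs without being instructed to.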
