Fine Tune Completion not stopping

I’m trying out the fine-tune feature, which looks like it has promise for my use case. But I’m running into an issue for which I hope there is an answer.

First the particulars:

  1. I submitted 50 prompt/completion pairs based on the Gettysburg Address (a sample pair in the format I used is sketched after this list). Given that there are fewer than 300 words in the entire corpus, I’m thinking this should be more than enough.

  2. I trained on Curie with all default settings.
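
For reference, the training file is JSONL with one prompt/completion pair per line. The pair below is a made-up illustration of the format I used, not my exact data:

```python
import json

# One illustrative prompt/completion pair (hypothetical wording, not my exact
# data), appended as a single JSONL line to the training file.
pair = {
    "prompt": "What will the world remember?",
    "completion": " The world will little note, nor long remember what we say here.",
}
with open("gettysburg_pairs.jsonl", "a") as f:
    f.write(json.dumps(pair) + "\n")
```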

Now, I tried it in the Playground with the default settings except for the following (an equivalent API call is sketched after the list):

  1. Temperature: 0.3

  2. Maximum length: 150

  3. Frequency penalty: 2
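
For anyone reproducing this outside the Playground, here is roughly the same request through the legacy Python library (the fine-tuned model name is a placeholder for mine):

```python
import openai

# Legacy Completions endpoint (openai-python < 1.0), mirroring the Playground
# settings above. The model name is a placeholder for my fine-tuned Curie model.
response = openai.Completion.create(
    model="curie:ft-your-org-2023-01-01-00-00-00",
    prompt="What will the world remember?",
    temperature=0.3,
    max_tokens=150,
    frequency_penalty=2,
)
print(response["choices"][0]["text"])
```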

At first it looks like it does a pretty good job answering the question. But it keeps going so long as there are tokens available. Here’s an example:

Prompt: What will the world remember?

Completion: The world will little note, nor long remember what we say here. It is too soon for us to judge of the work which has been done here. The world will little note, nor long remember what we say here. It is too soon for us to judge of the work which has been done here. The world will little note, nor long remember what we say here. 'Tis not for us to determine the worth of our own time or that of any generation living or dead. We can only gauge it by its fruits and find if it be good or bad; If there are better times ahead let us endeavor so live that when they come they shall have reason to praise rather than blame ourselves For having had the courage and determination to dedicate this

This example happens to be one of my training prompts, and the completion gets it right. The problem is that it keeps going until it runs out of tokens.

Is there anything I can do to have it stop once the question is answered? I like that it can expand on things, but it tends to repeat itself and wander off the question.

Any guidance would be appreciated.

Welcome to the community @lenwhite6094

Funny title.

The solution is simple: add a stop sequence to the end of every completion in your training file, then pass that same stop sequence when you request completions.
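
For example (a sketch, assuming a marker like " END" that never occurs in your text; any unique token sequence works), every completion in the JSONL file ends with the marker, and you pass the same marker in the Playground's Stop sequences field or via the API:

```python
import openai

# Each training line ends its completion with the same marker, e.g. " END":
# {"prompt": "What will the world remember?", "completion": " The world will little note, nor long remember what we say here. END"}

# At inference time, pass that marker as the stop sequence so generation halts
# as soon as the model emits it. The model name is a placeholder.
response = openai.Completion.create(
    model="curie:ft-your-org-2023-01-01-00-00-00",
    prompt="What will the world remember?",
    temperature=0.3,
    max_tokens=150,
    stop=[" END"],
)
print(response["choices"][0]["text"])
```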


OK, thanks. I didn’t take the documentation’s guidance on data preparation seriously enough, so I didn’t have any stop sequences defined.

Thank you very much, sps! The stop sequences seem to have drawn the magic out of the model; even Ada performs brilliantly. Now I’m going to slow down and internalize the documentation I should have started with!


Happy to help @lenwhite6094

Reading the docs helps prevent hitting roadblocks mid-development.