Can I generate a response of more than 1500 words?

Hi,

I am trying to build a story generator bot, and I want the bot to produce a specific number of words. I want a story to be 1000-2000 words in length.

The maximum word count I get from a gpt-3.5-turbo-16k response is ~800 words, at least in my tests. I have searched a lot about prompt engineering, but my research has not paid off.

I have also used the Playground and played with temperatures, but that has not paid off either.

Moreover, I do not have access to the GPT-4 API. I wonder if it could help with my issue.

Is there any solution to that?


First, you’ll want to increase max_tokens to 8000 or so, after validating that your input is normal and doesn’t burn tokens on runaway repeating patterns. Raising max_tokens only removes the artificial truncation of your output; it doesn’t by itself make the model write more.
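For illustration, here is a minimal sketch of passing a larger max_tokens on a chat completion call, assuming the pre-1.0 openai Python package (the client interface differs in newer versions, and the key, prompts, and settings below are placeholders):

```python
import openai

openai.api_key = "sk-..."  # placeholder: your API key

# Request a completion with a generous max_tokens so the output isn't cut
# off mid-story. This only lifts the ceiling; it does not force length.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-16k",
    messages=[
        {"role": "system", "content": "You are a story-writing assistant."},
        {"role": "user", "content": "Write a 1500-word short story about a lighthouse keeper."},
    ],
    max_tokens=8000,
    temperature=0.8,
)
print(response["choices"][0]["message"]["content"])
```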

Beyond that, it is simply words that command the length. GPT-3.5 is heavily trained, and trained on acceptable chat-length answers, so it doesn’t have a lot of examples of how to write a chapter of your novel.

Your best bet for coaxing out long output is a long, multi-step input.

Let’s say I give it a series of tasks in one prompt: “First, I’d like you to produce a summary of Poe’s The Raven. Then I’d like you to give a summary of the Still-Beating Heart. Then produce a summary of the Pit and the Pendulum. Then I’d like a summary of the Cask of Amontillado. Then compare the symbolism in the Still-Beating Heart to The Cask. Finally, describe how each of these was reviewed by contemporary reviewers when published, and determine which was seen as the most groundbreaking at the time.” I’ve given it a bunch of jobs to do, all in the same prompt and all for the same output. It can’t resist doing them all, much as if I had asked for each one individually.
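As a rough illustration of that idea applied to story generation (the section names and word targets below are just placeholders), you can pack several jobs into a single prompt programmatically:

```python
# Hypothetical story sections; numbering the jobs in one prompt nudges the
# model to work through all of them, producing one long response.
sections = [
    "Introduce the protagonist and the setting in about 300 words.",
    "Describe the inciting incident in about 300 words.",
    "Write the rising action in about 400 words.",
    "Write the climax in about 400 words.",
    "Write the resolution in about 300 words.",
]

prompt = "Write a complete short story by doing the following steps in order:\n"
prompt += "\n".join(f"{i}. {s}" for i, s in enumerate(sections, start=1))
print(prompt)
```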

You’d do the same thing for generating your story. If you don’t want to do all the work yourself, have it generate the full outline of your story, then have it expand that outline into longer summaries of each chapter, and so on. Traditionally that would mean asking for each part of the writing in separate prompts, but with the new 16k context length you may be able to go from a 2,000-token outlined story to an 8,000-token expansion within the same conversation.
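Here is a minimal sketch of that outline-then-expand flow, again assuming the pre-1.0 openai Python package; the model name, prompts, and word targets are only examples:

```python
import openai

openai.api_key = "sk-..."  # placeholder: your API key
MODEL = "gpt-3.5-turbo-16k"

# Step 1: ask for a full chapter-by-chapter outline of the story.
messages = [
    {"role": "system", "content": "You are a story-writing assistant."},
    {"role": "user", "content": "Outline a 6-chapter adventure story, one paragraph per chapter."},
]
outline = openai.ChatCompletion.create(model=MODEL, messages=messages)
messages.append({
    "role": "assistant",
    "content": outline["choices"][0]["message"]["content"],
})

# Step 2: in the same conversation, ask it to expand the outline into prose,
# with a generous max_tokens so the expansion isn't truncated.
messages.append({
    "role": "user",
    "content": "Now expand each chapter of that outline into roughly 300 words of prose.",
})
story = openai.ChatCompletion.create(model=MODEL, messages=messages, max_tokens=8000)
print(story["choices"][0]["message"]["content"])
```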


Thank you, I tested it, and it works. I just need to play with some parameters.