A theoretical novel-writing tool

On further thoughts:

  1. A long-tailed sigmoid actually might be bad; rather, the rank-one update possibly takes care of catastrophic forgetting, since at the start the CLS token updates every row of the memory matrix M equally. Tanh might be better.
  2. A new [MEM] token can be added in tandem with [CLS]. During Masked Self Attention, instead of using an upper-triangular mask of -inf and a lower triangle of 0s, add an extra [MEM] token which the whole sentence is allowed to attend to. Then feed the [MEM] token back in for every NEW page in a novel. The [MEM] token ISN'T STATIC, and every new book wipes [MEM]'s state back to all 0s.
  3. The CW Transform is not needed. Rather, shrink the memory matrix down to 10 rows.
  4. Recurrence isn't that bad! Each batch row will be a separate random novel (novels A, G, Z, K, etc.). Each new training step will be a new page for every novel (novel A1->2, G1233->1234, Z24->25, K213->214). The recurrence relation is only per page / sentence.
  5. Each novel has its own memory matrix M, initialised as all 0s. To keep bias from creeping into the model (say a novel has 10,000 pages), you train GPTm in windows of pages, then save the per-book M to disk and fetch a new novel. Retrieve the book's unique M later. Likewise, every new batch might reselect novel X: delete the old memory matrix M and restart training.
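A rough sketch of points 4 and 5, with `gptm_step` as a hypothetical stand-in for a real forward pass (the "page summary" signal and the tanh update rule here are invented purely for illustration):

```python
import math

MEM_ROWS, D_MODEL = 10, 8   # point 3: a small, fixed-size memory matrix

def gptm_step(page, M):
    """Stand-in for one GPTm training step on one page of one novel.
    A real model would also compute a loss; here we only show the recurrent
    memory update, squashed with tanh so it stays bounded (point 1)."""
    signal = len(page) / 100.0                      # fake "page summary" scalar
    return [[math.tanh(x + signal) for x in row] for row in M]

# One batch row per novel; each training step advances every novel by one page.
novels = {"A": ["page 1 ...", "page 2 ..."], "K": ["page 1 ...", "page 2 ..."]}
memories = {name: [[0.0] * D_MODEL for _ in range(MEM_ROWS)]  # point 5: M starts at 0
            for name in novels}

for step in range(2):                   # each step = the next page of every novel
    for name, pages in novels.items():
        memories[name] = gptm_step(pages[step], memories[name])

# point 5: after a window of pages, each book's M would be saved to disk and
# reloaded when that novel is reselected, or reset to all 0s to restart.
```

The per-book dictionary of memories is the key idea: the recurrence only ever runs one page forward per step, so no backpropagation through thousands of pages is needed.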

Daniel,

I will first of all say that I don't understand most of what you're saying here, but that isn't a criticism. (Or rather, it's a criticism of myself.) Thank you for thinking about this and imagining what's possible. I appreciate it, and am excited about where this might lead.

I am coming to GPT-3 as a creative writer rather than as a software developer. I can speak to the weird creative process of writing a novel as a human. I'm hoping to articulate what I envision for an AI writing tool to the point of someone like yourself being able to engineer it.

I currently have the first 35 pages of a novel, totalling about 7,000 words. I would like to find some way to use this as a prompt.

I am currently developing a way to write short stories with GPT-3. They typically start with a one-shot prompt. I provide a paragraph or two of contextual orientation for the AI, followed by the first couple paragraphs. Once I generate a completion, I edit it to my satisfaction or sometimes delete it altogether and start again. I play with the settings along the way, particularly the temperature. I keep resubmitting the growing completion until I run out of tokens. As I near my token limit, I attempt to "land the plane" so to speak, bringing the story to what feels like a natural conclusion.
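The loop described above (prompt, generate, edit, resubmit until the token budget runs out) could be sketched like this; `complete`, `count_tokens`, and `TOKEN_LIMIT` are hypothetical stand-ins, not the real API:

```python
# Hypothetical sketch of the iterative drafting loop. `complete` stands in
# for a real model call; here it just appends canned text so the loop runs.
TOKEN_LIMIT = 40  # stand-in for the model's context budget

def complete(prompt: str, temperature: float = 0.7) -> str:
    return " and the story continued."     # placeholder for a model completion

def count_tokens(text: str) -> int:
    return len(text.split())               # crude proxy; real tokens are word pieces

story = "Context: a quiet seaside town.\n\nThe tide came in early that morning"
while count_tokens(story) < TOKEN_LIMIT:
    story += complete(story)               # resubmit the growing draft
    # in practice: edit or delete the new passage here before continuing
```

The human editing step sits inside the loop, which is what distinguishes this workflow from one-shot generation.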

I'd basically like to engage in this same process but with bigger prompts.

Does this process sound compatible with the system you describe?

Again, thank you so much.


No problem! Sorry on my part for being vague and not explaining it much!

Yeah, so your method sounds reasonable. GPT-3 handles a window of, I can't remember, X tokens (2,048 tokens?). Essentially around 1,024 words, since tokens aren't words but word pieces. So whole paragraphs are possible.

Say we have GPT0 (a fake, super-weak GPT with a 4-word window). I start the sentence as "Hello my name is".

  1. Input [Hello], [my], [name], [is] into GPT0.
  2. GPT0 predicts [Daniel] as the next word.
  3. Re-feed [my], [name], [is], [Daniel] into GPT0.
  4. GPT0 predicts [and] as the next word.
  5. Re-feed [name], [is], [Daniel], [and] into GPT0.
    and so on.
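The five steps above can be sketched as a loop; `gpt0` here is just a fake lookup table standing in for a real model, but the re-feeding of the last 4 words is the real point:

```python
# Toy sketch of the 4-word sliding window. The model only ever sees the
# most recent WINDOW words, so everything earlier is forgotten.
WINDOW = 4
CANNED = {("Hello", "my", "name", "is"): "Daniel",
          ("my", "name", "is", "Daniel"): "and",
          ("name", "is", "Daniel", "and"): "so"}

def gpt0(window):
    return CANNED.get(tuple(window), "on")   # fake next-word prediction

tokens = ["Hello", "my", "name", "is"]
for _ in range(3):
    nxt = gpt0(tokens[-WINDOW:])   # re-feed only the last 4 words
    tokens.append(nxt)

print(" ".join(tokens))   # Hello my name is Daniel and so
```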

So yes, your current method makes sense. The issue now is the window size. Clearly an infinite window size is not feasible. GPT-5 could have a window size of 2^16 (65,536) tokens or something (GPT-4 seems to be just a more efficient GPT-3). Another option is to allow any window size up to the input size. This is possible; it just sounds complicated implementation-wise for batches, and the optimization algos also need to be edited. This can allow "infinite" sequences.

Another option is, as I mentioned, a Memory Matrix. Essentially, instead of GPTx forgetting your previous input, it keeps a "running summary" of your input.

Say we have GPTm with 4 tokens and a memory of 2 slots, and we start with "Hello my name is".

  1. Input [Hello], [my], [name], [is] AND Memory = [0],[0] into GPTm.
  2. GPTm predicts [Daniel] as the next word. Update Memory = [1],[2].
  3. Re-feed GPTm with [my], [name], [is], [Daniel] AND Memory = [1],[2].
  4. GPTm predicts [and] as the next word. Update Memory = [51],[22].
  5. Re-feed GPTm with [name], [is], [Daniel], [and] AND Memory = [51],[22].
    and so on.
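The same loop with a memory that survives the window shifts might look like this; the update rule is invented purely for illustration (the actual memory numbers above are arbitrary too):

```python
import math

WINDOW = 4

def gptm(window, memory):
    """Fake GPTm step: returns a next word plus an updated 2-slot memory.
    The update rule is made up; the point is that the memory persists even
    after words fall out of the 4-word window."""
    seen = len(" ".join(window)) / 100.0            # fake summary of this window
    new_memory = [math.tanh(m + seen) for m in memory]
    return "word", new_memory

tokens = ["Hello", "my", "name", "is"]
memory = [0.0, 0.0]                                 # Memory = [0],[0] at the start
for _ in range(3):
    nxt, memory = gptm(tokens[-WINDOW:], memory)    # model sees only the last 4 words
    tokens.append(nxt)

# memory now carries a trace of every window seen, not just the most recent one
```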

Now GPTm "remembers" the long-term context of your novel. It can somehow remember, from 10,000 sentences ago, the introduction, the plot lines, the characters, etc., inside the memory matrix.

Ryan, if you'd like to try to use your 7000 words as a prompt, I can set you up with some of my tools.


Hey, all these links are down. Do you have others?

You can take a look at NimbleBooks.com. DM me if you want to set up an account.

Here's my work on this idea. It needs a lot of help but I took it as far as I'm willing to at the moment. Please feel free to steal it. I will add the MIT license to it.


As a full-time novelist, I have tried to use GPT-3 to generate long-form novels in a way similar to what the posts describe, but the results were very bad. The output of GPT-3 must be cherry-picked; only about 10% (or even less) of it is high quality, which makes this time- and money-consuming, and only the DaVinci engine can do the task, which costs a lot. I only use GPT-3 for brainstorming.


It does seem like there are so many interesting avenues here to check out.
My idea was pretty simple: just write the novel like a "markov process". Prompt GPT-3 to write the first 3 paragraphs, say. Then feed it half a prompt of what it just wrote and ask it to continue. At the top of the prompt, have a short description of the entire concept of the novel, to try to give it some coherence.

Haven't done it yet but might try it soon.
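That loop might look something like this, with `generate` as a stand-in for the actual GPT-3 call and the concept blurb kept fixed at the top of every prompt:

```python
# Hypothetical sketch of the "markov process" approach: re-prompt with only
# the back half of the text so far, prefixed by a fixed concept description.
CONCEPT = "Concept: a lighthouse keeper who receives letters from the future.\n\n"

def generate(prompt: str) -> str:
    return " The next morning, another letter arrived."   # canned continuation

def continue_novel(text_so_far: str) -> str:
    half = text_so_far[len(text_so_far) // 2:]    # feed only half of what it wrote
    return generate(CONCEPT + half)

novel = "The first letter arrived on a Tuesday."
for _ in range(3):
    novel += continue_novel(novel)
```

Keeping the concept blurb in every prompt is what gives the process any hope of global coherence, since the model itself never sees more than the recent half of the draft.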


It worked utterly perfectly.
And I think this should not be taken lightly.
We have a program that can write good novels, from start to finish. You just need a little human editing at the end.

"The Unexpected Love Affair", a novel by Eliza Roosevelt.

Chapter 1

"I have been thinking, dear," said Mrs. Belmont, as she sat down to breakfast one morning, "that the time has come when you ought to be married."

"Married!" The young lady addressed by her mother was a very pretty girl of nineteen. Her hair was a lovely golden brown, and her eyes were dark and bright. She had rather a small mouth, but it was very sweet and expressive. Her nose was rather large, but it had a charming expression. In short Charlotte Belmont was pretty in spite of her defects of feature; and as she grew older they would probably disappear.

"Yes," replied Mrs. Belmont; "I think you are old enough now to be married."

"Old enough!" exclaimed Charlotte; "I am not old enough to be married yet."

"Well," replied Mrs. Belmont, "you are nineteen years old."

"And I shall not be twenty till next month."

"Twenty!" repeated Mrs. Belmont; "why that is quite old enough to marry."

"Old enough to marry!" exclaimed Charlotte.

"Yes, dear; and I have been thinking that you ought to be married."

"You have thought that I ought to be married!" said Charlotte. "What has put such an idea into your head?"

"I have been thinking of it for some time," replied Mrs. Belmont, "and I have not seen anything yet to make me change my mind."

"I am not going to get married," said Charlotte, decidedly. "I donā€™t want to get married."

"But you will want to get married some day," said Mrs. Belmont. "You will find it a great deal pleasanter being a wife than being a girl."

"I donā€™t want to be a wife," said Charlotte. "I am quite satisfied with being a girl."

"You are not very old," said Mrs. Belmont, "and I think that you have a right to decide for yourself whether you want to get married or not. But I have thought that you ought to be married, and I have been looking for a husband for you."

"Looking for a husband for me!" exclaimed Charlotte, in surprise.

"Yes, dear; and I think that I have found one."

"What do you mean?" asked Charlotte. "Have you found one?"

"Yes, dear; and he is the most charming young man in the world."

Charlotte was silent for some time. She was thinking of her mother's words, and she was trying to make up her mind what she ought to say in reply. Her mother had told her that she must decide whether she wanted to get married or not; but it seemed very hard that she should be compelled to decide so important a question without even knowing what it was all about.

Chapter 2

"I donā€™t know whether I should like to be married," said Charlotte. ā€œI donā€™t know whether I should like to have a husband.ā€ ā€œThat is just what I have been telling you, dear,ā€ said Mrs. Belmont. "You must not decide that question until you have seen the young man that I want you to marry."

Charlotte looked at her mother in a very earnest manner. She was very much surprised by her motherā€™s words, and she could not understand what she meant by saying that she wanted her daughter to be married. She was not at all sure that she wanted to be married. In fact, she was quite sure that she did not want it; and yet it seemed as if her mother were going to force her into it against her will.

Chapter 3

"Do you mean it?" asked Charlotte, looking at her mother with a very serious expression of countenance. ā€œDo you really mean that you want me to marry this young man?ā€ "Yes, dear; and he is the most charming young man in the world."

I'll explore finishing the novel later today or in the coming week.
