I am working on a book about proverbs, platitudes, and truisms. I used GPT-3 to generate many lists of proverbs and quotations from around the world, and then used it again to write a brief description for each one. The book has over 600 proverbs and quotations. I sent it out to a friend for beta reading, and will format it for printing while proofreading with Grammarly. All in all, it should take only a few weeks to go from first draft to printed book.
I’ve been meaning to write this book for many years. I find that there is great wisdom in proverbs and quotations. I have proverbs for everything in life, ranging from “work smart, not hard” to “haste makes waste.” I’ve always been curious about the power of proverbs - they are tiny verbal programs that can be connected to deep wisdom and important lessons. Even the act of writing this and pulling it together has reminded me of many lessons from my own life.
Anyway, this sort of project is ideal for GPT-3 to generate, since each section stands alone and is only a few sentences long. As such, I think GPT-3 could be great for writing encyclopedias, timelines, and other reference material.
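For anyone curious how a one-prompt-per-section workflow like this might look, here is a minimal sketch. The prompt wording and the `complete` stub are my own illustrative assumptions, not the author’s actual setup - in practice `complete` would call the OpenAI completions endpoint with a davinci-class model.

```python
# Sketch of a one-prompt-per-proverb pipeline. Each section stands alone,
# so each proverb gets its own independent completion.

PROMPT_TEMPLATE = (
    "Write a brief, two-to-three sentence description of the meaning "
    "of the following proverb.\n\nProverb: {proverb}\nDescription:"
)

def build_prompt(proverb: str) -> str:
    """Assemble a self-contained prompt for a single proverb."""
    return PROMPT_TEMPLATE.format(proverb=proverb)

def complete(prompt: str) -> str:
    """Placeholder for the API call (e.g. an OpenAI completion request)."""
    return "<model output here>"

def describe_all(proverbs):
    """Generate one short description per proverb."""
    return {p: complete(build_prompt(p)) for p in proverbs}
```

Because every prompt is independent, failed or low-quality completions can simply be re-run per proverb without touching the rest of the book.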
Do you use the plain DaVinci model, or some fine-tuned one?
Congratulations @daveshapautomator! Can’t wait to read this.
What is the Instruct Series model? I found one reference to it here, but it didn’t lead to any more info.
Curious, when you publish, how will you credit GPT-3 (if at all)? Are there existing rules around this?
All the TEXT models today are instruct series.
So far as I know, they don’t require it in their EULA.
Very cool! I am working on three stories around dharma, karma, and self-realization, styled after the conversation in the Bhagavad Gita. I’m using the insert model very often to grow it like a crystal. Will keep y’all updated.
Did you have any instances of the model entering a loop, repeating the same words, across that many completions?
Not with short completions like this.
Do you use a trained model? What does the process look like?
I mean GPT-3 can give you pieces of text, but not the backbone.
Yes it can. Ask and I’ll point you in the right direction.
I mean we can train a model which creates a generalized story in, say, 10 sentences - though it will require a proper dataset.
And then generate the details with another model.
Yes, one model for the backbone, another for generating the story. There are several posters here working on such projects.
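That two-stage idea can be sketched as follows. The function names, the fixed 10-beat outline, and the stubbed bodies are my own placeholders - the real version would prompt (or fine-tune) two separate models, one per stage.

```python
# Sketch of a backbone-then-details story pipeline.

def generate_backbone(premise: str) -> list:
    """Stage 1: a backbone model produces a ~10-sentence generalized
    story skeleton. Stubbed here with placeholder beats."""
    return [f"Beat {i + 1} of the story about {premise}." for i in range(10)]

def expand_beat(beat: str) -> str:
    """Stage 2: a second model expands each skeleton sentence into a
    detailed passage. Stubbed here."""
    return beat + " (expanded into a full passage by the detail model)"

def generate_story(premise: str) -> str:
    """Run both stages and stitch the expanded beats together."""
    backbone = generate_backbone(premise)
    return "\n\n".join(expand_beat(beat) for beat in backbone)
```

Splitting the work this way keeps each completion short, which also helps with the repetition-loop problem mentioned earlier in the thread.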
Hey, I think crediting them is actually required, or at least disclosing the use of AI. See their Sharing & Publication Guidelines here.
Beyond that, it might raise ethical questions about authorship and the use of these tools, or at least with regard to a potential publisher’s thoughts about you and your writing abilities! Anyway, I’m not sure where I come down on those yet, but food for thought!
Good catch, thank you. Fortunately, in my two published books I make it very clear - multiple times - that the output was experimentally generated by AI, and by GPT-3 specifically. I’ll have to look at this publication policy more closely as I get ready to publish this upcoming book.