This is my #NaNoGenMo results blog post. See the actual book here: Dragon’s Heart generated with davinci-003 algorithm
I like the opening paragraph a lot:
The stars had been twinkling in the night sky for centuries, but none had seen the sign of the coming storm. On that fateful night, a powerful force stirred in the darkness, setting in motion events that would soon draw heroes from near and far.
That one I picked from an earlier generation and kind of ran with it. There are no smarts or any sort of story state here; I am just feeding GPT-3 some initial parameters and then either what came just before or a prompt from a preselected list. Because I am not shepherding GPT-3 very closely, it often forgets that it is supposed to be writing a fantasy book. There are long excerpts of prose about modern topics, and earlier versions often ended up generating long stretches of programming code. It turns out the model really wants to turn whitespace-heavy prompts into code.
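The loop described above can be sketched roughly like this. This is a hypothetical reconstruction, not the actual generator code: the prompt list, function names, and chapter count are all made up for illustration, and the `complete()` stub stands in for a real GPT-3 completions API call so the sketch runs standalone.

```python
import random

# Illustrative prompts -- the real generator would use its own preselected list.
PROMPTS = [
    "The dragon stirred beneath the mountain.",
    "A lone rider crossed the frozen pass.",
]

def complete(prompt: str) -> str:
    """Stand-in for a GPT-3 completion call (e.g. text-davinci-003)."""
    return prompt + " And so the tale continued."

def generate_book(chapters: int = 3, seed: int = 0) -> list[str]:
    """Generate text with no story state: each step feeds the model either
    the previous completion or a fresh preselected prompt."""
    random.seed(seed)
    text = []
    previous = PROMPTS[0]
    for _ in range(chapters):
        # Either continue from what came just before, or restart from
        # a preselected prompt -- mirroring the approach described above.
        prompt = previous if random.random() < 0.5 else random.choice(PROMPTS)
        previous = complete(prompt)
        text.append(previous)
    return text

book = generate_book()
print(len(book))
```

Because nothing carries a "story so far" summary between steps, the model can easily drift off into unrelated prose, which matches the behavior described above.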
There are two books, one generated with the best model and one with the second best. The very simple, ugly Python code is also available in that repository:
Dragon’s Heart generated with curie-001 algorithm This one has a perhaps even more impressive beginning:
The stars had been twinkling in the night sky for centuries, but none had seen the sign of the coming storm. apollo had seen it, and so had virgo, but they had both kept silent, knowing that it should not be spoken of. When all the other gods finally heard about the impending conflict, they gathered together and debated what to do. Aquarius, the water bearer, argued for diplomacy, but augustus, the city god, wanted to wage war.
I’d say the results are kind of silly, and I am a bit underwhelmed by the books. With stronger mentoring, GPT-3 could probably write a passable generated book, but it would require a whole lot more guiding structure from the generator. Ideally, there would be a locally runnable version that could be trained separately to meet specific goals, and a separate “world state” to guide the generation.
Looking forward to 2023 and seeing how this same code performs, perhaps with a new GPT-3-like tool (GPT-4?).