Wish-list: Samples of applications

These are some applications our organization has developed over the past 10 years; I would like to share them to see whether GPT-3 could enhance or improve them. This is about making sure that we, as serious application and system developers, understand the capabilities of GPT-3.

We develop Free Form Grammars for Natural Language Processing (NLP) solely for English.

By grammars we mean something like a BNF grammar, though other non-BNF grammars are possible. We need grammars to create a boundary condition, by means of a functional algorithm, that limits how the language can be used.
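
As a rough illustration of what we mean by a boundary condition (a minimal sketch, not our actual grammar), a few BNF-style productions can be encoded in Python and used to reject any statement that falls outside the language; the tokens come from the examples further down.

```python
import re

# Hypothetical sketch: a few BNF-style productions for the Free Form language.
# <program>   ::= <statement> ";" { <statement> ";" }
# <statement> ::= "set" <token> "to" <value>
#               | "add" <count> <object> { "and add" <count> <object> } [ "with" <params> ]
STATEMENT_RULES = [
    re.compile(r"^set\s+\w+\s+to\s+.+$"),
    re.compile(r"^add\s+\d+\s+\w+(\s+and\s+add\s+\d+\s+\w+)*(\s+with\s+.+)?$"),
]

def within_grammar(text: str) -> bool:
    """Boundary condition: every ';'-terminated statement must match a production."""
    statements = [s.strip() for s in text.split(";") if s.strip()]
    return all(any(rule.match(s) for rule in STATEMENT_RULES) for s in statements)

print(within_grammar("set geosolar to RimaAriadaeusMoon; add 20 rock and add 40 trees;"))  # True
print(within_grammar("launch 20 rockets;"))                                                # False
```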

Example: Create, via a Free Form Generative AI, a surrealistic landscape featuring the Rima Ariadaeus rille on Earth’s Moon:

“set geosolar to RimaAriadaeusMoon;
add 20 rock and add 40 trees;”
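
Rocks in a statement like the one above are not stored models; as noted further down, they are computed as convex-hull meshes. A minimal Python sketch of that idea, assuming SciPy is available (the sphere sampling and jitter range are arbitrary choices for illustration):

```python
import numpy as np
from scipy.spatial import ConvexHull

def random_rock_mesh(n_points: int = 60, radius: float = 1.0, seed: int = 0):
    """Sample jittered points around a sphere and return their convex hull as a triangle mesh."""
    rng = np.random.default_rng(seed)
    directions = rng.normal(size=(n_points, 3))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    radii = radius * rng.uniform(0.7, 1.0, size=(n_points, 1))   # arbitrary jitter for a rocky look
    points = directions * radii
    hull = ConvexHull(points)
    # hull.simplices are triangles indexing into `points`; together they form the rock mesh.
    return points, hull.simplices

verts, faces = random_rock_mesh()
print(len(verts), "points,", len(faces), "triangles")
```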

Notes:

  1. Trees and rocks are algorithmic

  2. Trees can grow; they have a growth algorithm via a Lindenmayer system, with its own standard state programming language (a minimal sketch of the rewriting step follows below)
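
For reference, the core of a Lindenmayer system is repeated string rewriting; a minimal Python sketch, using the rule set from the second example further down (the axiom "G" is an assumption, not part of the Free Form text):

```python
def lsystem(axiom: str, rules: dict, iterations: int) -> str:
    """Apply the rewrite rules to every symbol of the string, `iterations` times."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Rules taken from the second Free Form example below; the axiom "G" is an assumption.
rules = {"G": "F[-G]F[+G]-G", "F": "FF"}
print(lsystem("G", rules, 3))
```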

Note that “geosolar” is a token of the language’s grammar, as is “geo” for the planet Earth, so the Free Form user could span the known geographies of the solar system!

  1. There are three languages in these forms:

    i. Loose English words and phrases with connectives, etc.

    ii. Linearized syntax for procedural and operator-form math, algebra, and geometry

    iii. Lindenmayer code for tree growth

    iv. Rocks are ConvexHull mesh algorithms, not trivial to compute or to specify

  2. An annulus can be entered analytically as well, but in this case “annulus” is a token in the Free Form language

  3. Semantics: The language carries default or boundary conditions for its semantics, e.g. to lay out objects with a uniform distribution in a specified region. This is not a trivial operation, and semantics like this require careful addition of programs (see the sketch just after this list).
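
To make point 3 concrete, here is a minimal Python sketch of “uniform distribution in a specified region” for the annulus region of the second example, including its x > y clause: radii are sampled area-uniformly and the clause is enforced by rejection, so the density stays uniform over the admissible part of the annulus. This is an illustration only, not our placement code.

```python
import math
import random

def place_uniform_in_annulus(n, r_inner, r_outer, center, constraint=lambda x, y: True, seed=0):
    """Uniformly place n points in an annulus, keeping only points that satisfy `constraint`.

    Rejection sampling assumes the constrained part of the annulus is non-empty.
    """
    rng = random.Random(seed)
    cx, cy = center
    points = []
    while len(points) < n:
        # Area-uniform radius: sample r^2 uniformly between r_inner^2 and r_outer^2.
        r = math.sqrt(rng.uniform(r_inner ** 2, r_outer ** 2))
        theta = rng.uniform(0.0, 2.0 * math.pi)
        x, y = cx + r * math.cos(theta), cy + r * math.sin(theta)
        if constraint(x, y):                     # e.g. the "x > y" clause in the example
            points.append((x, y))
    return points

# Parameters from the second example: annulus && x>y, r_inner=1, r_outer=1.7, center {-1,1}.
tree_sites = place_uniform_in_annulus(2, 1.0, 1.7, (-1.0, 1.0), constraint=lambda x, y: x > y)
print(tree_sites)
```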

Example: Create, via a Free Form Generative AI, an annulus-shaped landscape with Lindenmayer system programming text embedded within the Free Form text:

“set name to steppe;
set region2D to annulus&& x>y with r_inner = 1 and r_outer=1.7 and set center at {-1,1};
add 2 trees with code = {"G"->"F[-G]F[+G]-G","F"->"FF"} and iterations=6 and angle=20;
add 1 tree with code = {"F"->"FF","G"->"FG[-F[G]-G][G+G][+F[G]+G]"} and iterations=4 and angle=22.5;”
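
A rough sketch of how the code, iterations, and angle parameters above could be consumed in Python: expand the rewrite rules, then interpret the resulting string with standard turtle semantics (F and G draw a segment, + and - turn by the given angle, [ and ] push and pop state). The axiom, the step length, and the choice of which symbols draw are assumptions, not part of the Free Form specification.

```python
import math

def expand(axiom, rules, iterations):
    """L-system expansion: rewrite every symbol, `iterations` times."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

def turtle_segments(program, angle_deg, step=1.0):
    """Interpret an L-system string as 2D line segments (a tree skeleton)."""
    x, y, heading = 0.0, 0.0, 90.0           # start at the origin, pointing "up"
    stack, segments = [], []
    for ch in program:
        if ch in "FG":                        # assumed: both symbols draw a branch segment
            nx = x + step * math.cos(math.radians(heading))
            ny = y + step * math.sin(math.radians(heading))
            segments.append(((x, y), (nx, ny)))
            x, y = nx, ny
        elif ch == "+":
            heading += angle_deg
        elif ch == "-":
            heading -= angle_deg
        elif ch == "[":
            stack.append((x, y, heading))
        elif ch == "]":
            x, y, heading = stack.pop()
    return segments

# First tree from the example: code = {"G"->"F[-G]F[+G]-G","F"->"FF"}, iterations=6, angle=20.
rules = {"G": "F[-G]F[+G]-G", "F": "FF"}
segments = turtle_segments(expand("G", rules, 6), angle_deg=20.0)
print(len(segments), "branch segments")
```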

The Generative Semantics are short pieces of code in multiple languages, required for the Landscape to compute properly.

I do not know how GPT-3 can handle that, say in Python.
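
One way to try it from Python would be few-shot prompting: show GPT-3 a couple of English-request → Free Form pairs (the ones from this post) and ask it to produce the next one. This is only a sketch of the approach, using the classic Completion endpoint of the openai library; the engine choice and prompt format are assumptions.

```python
import openai  # classic GPT-3-era library; assumes openai.api_key is set elsewhere

# Two worked pairs taken from this post, used as few-shot examples.
FEW_SHOT = (
    "Request: a surrealistic landscape at Rima Ariadaeus on the Moon, with rocks and trees\n"
    "Free Form: set geosolar to RimaAriadaeusMoon; add 20 rock and add 40 trees;\n"
    "\n"
    "Request: a steppe on an annulus region with two Lindenmayer-system trees\n"
    "Free Form: set name to steppe; set region2D to annulus && x>y with r_inner = 1 "
    "and r_outer=1.7 and set center at {-1,1}; add 2 trees with code = "
    '{"G"->"F[-G]F[+G]-G","F"->"FF"} and iterations=6 and angle=20;\n'
    "\n"
)

def english_to_free_form(request: str) -> str:
    """Ask GPT-3 to translate an English request into a Free Form program."""
    prompt = FEW_SHOT + "Request: " + request + "\nFree Form:"
    response = openai.Completion.create(
        engine="davinci",        # hypothetical engine choice
        prompt=prompt,
        max_tokens=120,
        temperature=0.0,
        stop=["\n\n"],
    )
    return response.choices[0].text.strip()

print(english_to_free_form("a rocky field on Mars with five trees"))
```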