Context, few shots and batches

Hi there,

With GPT-3, context is all you need. For example, it is quite easy to teach even complex grammatical rules to the model. It is also possible to set the context up precisely enough to get satisfactory results with only a few shots. Finally, completions for several prompts can be generated in batches, which opens up great possibilities for programmatic use of GPT-3. A small sketch of both ideas follows.
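Below is a minimal sketch of few-shot prompting combined with batched completions. It assumes the openai Python package (the older Completion interface) and an OPENAI_API_KEY environment variable; the engine name and the grammatical examples are purely illustrative, not a prescribed setup.

```python
# A minimal sketch: few-shot prompting plus batched completions.
# Assumes the openai Python package (pre-1.0 Completion interface) and an
# OPENAI_API_KEY environment variable; engine name and examples are illustrative.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# Few-shot prompt: a handful of examples teaches the model a grammatical
# rule (here, irregular plurals) before asking about a new case.
few_shot_prompt = (
    "Give the plural form of each noun.\n"
    "noun: child -> plural: children\n"
    "noun: mouse -> plural: mice\n"
    "noun: analysis -> plural: analyses\n"
    "noun: cactus -> plural:"
)

# Batch generation: a list of prompts can be sent in a single API call.
prompts = [
    few_shot_prompt,
    few_shot_prompt.replace("cactus", "criterion"),
    few_shot_prompt.replace("cactus", "phenomenon"),
]

response = openai.Completion.create(
    engine="davinci",   # illustrative engine name
    prompt=prompts,     # a list of prompts is processed as a batch
    max_tokens=5,
    temperature=0.0,
    stop="\n",
)

# Completions come back with an index field that maps them to their prompt.
for choice in sorted(response["choices"], key=lambda c: c["index"]):
    print(prompts[choice["index"]].splitlines()[-1], choice["text"].strip())
```

The same pattern scales naturally: keep the few-shot examples fixed, vary only the final line of each prompt, and collect the batched completions programmatically.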

Here is a little gift for the community. Thanks to all of you,

Pierre-Emmanuel