Thanks.
You are giving me more credit than I deserve. At present my prompt engineering is more art than science, but as I have noted before, my background in programming is different from most people's, and leveraging it has proven beneficial. Specifically, my knowledge of
- DCGs
- Prolog
- Parsing
- Abstract Syntax Tree
- Concrete Syntax Tree (AKA Parse tree)
- Syntactic sugar
- Lambda Calculus
- Evaluation strategies (AKA Reduction strategies)
- Untyped Lambda Calculus
- Typed Lambda Calculus
- Abstract rewriting systems
comes in quite handy, along with a BS in computer science and over 40 years of programming experience. I typically create several parsers a year using DCGs for various problems; currently I am working on a DCG parser for GABC. (ref)
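Since DCGs come up a lot here: their appeal is that the grammar *is* the parser, rules executing directly. The same idea can be sketched outside Prolog as a tiny recursive-descent parser; this is a toy balanced-parentheses grammar of my own for illustration, not the GABC parser:

```python
# Toy illustration of the "grammar as executable program" idea behind DCGs:
# a minimal recursive-descent parser for nested parentheses, e.g. "(()())".
# Each function mirrors one grammar rule:  group --> "(", group*, ")".

def parse_group(s, i=0):
    """Parse one '( group* )' at position i; return (node, next_index) or None."""
    if i >= len(s) or s[i] != '(':
        return None
    i += 1
    children = []
    while True:
        result = parse_group(s, i)   # group*  (zero or more nested groups)
        if result is None:
            break
        node, i = result
        children.append(node)
    if i >= len(s) or s[i] != ')':
        return None                  # unmatched '('
    return children, i + 1           # the "AST" node is just its children

def parse(s):
    """Parse a whole string of balanced groups; None on failure."""
    trees, i = [], 0
    while i < len(s):
        result = parse_group(s, i)
        if result is None:
            return None
        tree, i = result
        trees.append(tree)
    return trees
```

The correspondence is the point: one function per nonterminal, the input position threaded through, exactly what a DCG's difference lists do for you implicitly.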
With regard to prompt engineering, consider adding this site to your list,
and read as many papers related to LLMs as you can. A good daily dose of
helps.
Specifically with regard to OpenAI, see the tokenizer page and keep an eye on ChatML, which AFAIK is not official yet but, when it is, should be a game changer for prompting.
After I wrote this I saw that @PaulBellow noted ChatML earlier; nice to know we think alike.