I tried to teach it a basic substitution cipher, but when it attempted to encrypt something, the result was hilariously wrong.
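For context, the kind of substitution cipher in question is only a few lines of code. This sketch uses a Caesar-style shift of 3 as the substitution key purely for illustration (the original post doesn't say which key was used):

```python
import string

# Hypothetical key: map each lowercase letter to the letter 3 positions
# later in the alphabet (wrapping around), a simple substitution cipher.
KEY = dict(zip(string.ascii_lowercase,
               string.ascii_lowercase[3:] + string.ascii_lowercase[:3]))
INVERSE = {v: k for k, v in KEY.items()}

def encrypt(text: str) -> str:
    # Substitute each letter via the key; leave spaces/punctuation alone.
    return "".join(KEY.get(c, c) for c in text.lower())

def decrypt(text: str) -> str:
    return "".join(INVERSE.get(c, c) for c in text.lower())

print(encrypt("hello"))                      # khoor
print(decrypt(encrypt("attack at dawn")))    # attack at dawn
```

Applying the mapping is pure mechanical lookup, which is exactly the kind of character-level manipulation these models tend to fumble.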
You cannot “teach” these models. They are predictive auto-completion text generators.
Via prompting, you can get GPT to say just about anything. It’s a pre-trained, generative, built-to-please text-completion engine.