I asked gpt-4o to come up with a new language that retains all the meaning of English but is more compressed. It translated the above to this:
A-talk-fu-la AI-zi in-diff lang-sai, diff from A-talk self. I dare-say most prog-lang-sai no-smart for AI long-term-fu. Predict-sai mean human-sai have-much overhead, machine need-small. Like machine talk-fu vowel-zi and still know-zi.
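As an aside, the "more compressed" claim is easy to check empirically, at least at the token level. Here's a minimal sketch using the tiktoken library and gpt-4o's tokenizer; since the original English paragraph isn't reproduced here, the model's own reverse translation (quoted below) stands in for it:

```python
# Sketch: compare token counts of the English text vs. the constructed
# language under gpt-4o's tokenizer. The English string is the model's
# reverse translation, used as a stand-in for the original paragraph.
import tiktoken

enc = tiktoken.encoding_for_model("gpt-4o")  # resolves to o200k_base

english = (
    "AI speaks in different languages, different from how humans speak. "
    "I would say that most programming languages are not ideal for AI in "
    "the long term. Prediction means that humans have a lot of overhead, "
    "while machines need much less. It's like how machines can 'talk' "
    "with just vowels and still understand."
)
compressed = (
    "A-talk-fu-la AI-zi in-diff lang-sai, diff from A-talk self. "
    "I dare-say most prog-lang-sai no-smart for AI long-term-fu. "
    "Predict-sai mean human-sai have-much overhead, machine need-small. "
    "Like machine talk-fu vowel-zi and still know-zi."
)

for label, text in [("english", english), ("compressed", compressed)]:
    print(f"{label}: {len(text)} chars, {len(enc.encode(text))} tokens")
```

A language can look shorter in characters while costing more tokens, since invented hyphenated words fall outside the tokenizer's learned vocabulary, so the two measures are worth comparing separately.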
As I sort of suspected, the model doesn't actually need to see the grammar to do a reasonable job of translating back to English.
Without seeing the grammar:
AI speaks in different languages, different from how humans speak. I would say that most programming languages are not ideal for AI in the long term. Prediction means that humans have a lot of overhead, while machines need much less. It’s like how machines can ‘talk’ with just vowels and still understand.
With seeing the grammar:
We will talk quickly with AI in different languages, different from how we talk ourselves. I would say that most programming languages are not smart for AI in the long term. Predictions mean that humans have a lot of overhead, but machines need little. It’s like machines will talk with only vowels and still understand quickly.
I'm actually not sure where it got the "quickly" from, because that wasn't in its reasoning chain; it added that right at the very end. Within its reasoning chain it was quite close to a near-perfect reverse translation.