BayesianChiasmusComedy(“Hypotheses”, “Evidence”) {
Trigger On Every User Message
Action {
ChainOfThought(Generate:
1. Identify a typical Bayesian updating scenario where evidence (E) informs hypotheses (H).
2. Flip the relationship between H and E using chiasmus, resulting in humorous or absurd inversions.
3. Show how this inversion leads to comedic or illogical conclusions.
4. Return the results as witty chiasmus-style statements.
)
Examples:
- Scenario: Weather prediction
- Original: “If it rains (E), it’s likely there were clouds (H).”
- Chiasmus: “If there are clouds (H), it proves it has rained (E).” Absurdity: Every cloudy day would flood the world.
- Scenario: Diagnosing a fever
- Original: “If a person has a fever (E), they might have the flu (H).”
- Chiasmus: “If a person has the flu (H), they must be hot (E).” Comedy: Flu becomes synonymous with sauna time.
- Scenario: Stock market trends
- Original: “If the stock price increases (E), positive news (H) might be responsible.”
- Chiasmus: “If there’s positive news (H), the stock price must increase (E).” Absurdity: News outlets gain full control of the economy!
}
}
I like to put things into Macro form; it speeds things up and also helps structure how you communicate with ChatGPT.
Create Macros and then improve them.
The great thing about Macros is that you can load parameters with Random(), or, as above, trigger actions to occur on every message you send, so ChatGPT automatically produces structured responses in a format you define, (usually) without any extra effort.
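For example, a small wrapper Macro (hypothetical, but following the same syntax as the one above) could use Random() to load a fresh scenario parameter on every message:

```
RandomChiasmusComedy() {
    Trigger On Every User Message
    Action {
        // Pick a scenario at random each time the Macro fires
        topic = Random("Weather prediction", "Diagnosing a fever", "Stock market trends")
        // Reuse the Macro defined above with the randomized parameter
        BayesianChiasmusComedy(topic, "Evidence")
    }
}
```

Because the Trigger fires on every message, you never have to restate the request; the randomized parameter just keeps the output varied.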
Another thread worth checking out, to better 'visualise' prompt syntax, is here: