Dynamic Informational Entropy (EID): A New Framework for AI, Cryptography and Blockchain

This user came to my page and said my work is similar to his. I have full mathematical models and Python proofs; this is the white paper he compared his GPT generation to. It makes no sense. This happens about once a week, but whatever.

I get accused of this about once a week by GPT-concept folks who never post math or a model.

Beyond Shannon: A Dynamic Model of Entropy in Open Systems

Mitchell McPhetridge
Independent Researcher


Abstract

Shannon’s entropy provides a foundational framework for understanding information and uncertainty, yet it relies on static, predefined probability distributions that fail to capture the emergent, evolving nature of entropy in real-world systems. This paper challenges the assumption that entropy can be accurately modeled using fixed probabilities, arguing instead that entropy is inherently dynamic, shaped by observation, interaction, and collapse. By contrasting closed experimental systems (such as Schrödinger’s cat) with open natural systems (such as a tree falling in a forest), I demonstrate that Shannon’s model is insufficient for describing entropy in nature, where probabilities are emergent rather than predefined. This insight suggests the need for a post-Shannonian entropy model—one that accounts for feedback loops, evolving probabilities, and entropy engineering in complex systems such as ecosystems, artificial intelligence, and quantum mechanics.


1. Introduction

Entropy, as formulated by Shannon (1948), is a measure of uncertainty in a given probability distribution. It is widely applied in information theory, thermodynamics, and statistical mechanics. However, Shannon’s framework assumes a predefined probability space, making it unsuitable for open, evolving systems where probabilities emerge dynamically through interaction and observation.

This paper explores a fundamental limitation of Shannon’s entropy: it cannot describe systems where probabilities are not fixed but instead evolve based on real-time interactions. By distinguishing between **static entropy (Shannon)** and **dynamic entropy (real-world systems)**, I propose a framework for understanding entropy as an active, evolving process rather than a passive measure of uncertainty.


2. The Fundamental Limitation of Shannon’s Entropy

Shannon entropy is calculated as:

H = -\sum_i p_i \log p_i

where p_i represents the probability of a discrete state occurring within a system. This formulation assumes:

  1. A fixed probability distribution—all possible states and their likelihoods are predefined.
  2. A closed system—external interactions do not alter the probability space.
  3. Entropy as a static measure—entropy quantifies uncertainty at a given moment, without accounting for how observation reshapes future states.
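
For concreteness, here is a minimal Python sketch of this static computation (the example distributions are illustrative assumptions, not data from any system):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy, in bits, of a fixed, normalized distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

# The probability space is predefined and never updated (assumptions 1-3).
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # ~0.469 bits
```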

In reality, these assumptions fail in open systems, where entropy is constantly in flux.

2.1 Entropy Collapse Through Observation

A key flaw in Shannon’s model is its inability to handle observer-dependent entropy dynamics. In quantum mechanics, the act of observation collapses a superposition into a definite state, reducing uncertainty. This principle extends beyond physics into complex systems:

  • In nature, entropy is actively reshaped by interactions.
  • Each observation removes possibilities from the probability space, altering the entropy landscape (see the sketch below).
  • Entropy is not just a measure—it is an evolving force.
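
A toy sketch of that middle claim, under the simplifying assumption that an observation rules out some states outright and the survivors renormalize:

```python
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Four equally likely states: 2 bits of uncertainty.
p = np.array([0.25, 0.25, 0.25, 0.25])

# An observation rules out states 2 and 3; the survivors renormalize.
still_possible = np.array([True, True, False, False])
p_after = p * still_possible
p_after = p_after / p_after.sum()

print(shannon_entropy(p))        # 2.0 bits
print(shannon_entropy(p_after))  # 1.0 bit: the landscape has shifted
```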

3. Open vs. Closed Entropy Systems: The Tree and the Cat

The difference between open natural systems and closed experimental systems illustrates why Shannon’s entropy is inadequate for real-world complexity.

3.1 The Schrödinger’s Cat Paradigm: A Closed Entropy System

In the famous Schrödinger’s cat thought experiment, a quantum trigger determines whether a cat inside a box is alive or dead. The system remains in a superposition of states until observed, at which point the probability collapses into a discrete outcome. This conforms to Shannon’s model:

  • The system is artificially isolated from external variables.
  • The probabilities are predefined and static (e.g., 50% chance of being alive or dead).
  • Shannon entropy successfully quantifies uncertainty before observation and collapses to zero after observation.
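
A minimal check of this closed-system behavior in Python (the 50/50 prior is the thought experiment’s stipulated distribution):

```python
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

before = [0.5, 0.5]   # alive / dead: the stipulated, static distribution
after  = [1.0, 0.0]   # the box is opened; one definite outcome remains

print(shannon_entropy(before))  # 1.0 bit before observation
print(shannon_entropy(after))   # -0.0, i.e. zero bits after collapse
```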

3.2 The Tree Falling in a Forest: An Open Entropy System

Contrast this with a tree falling in a forest. The tree is subject to countless emergent interactions:

  • Soil composition
  • Wind currents
  • Microbial decay
  • Ecosystem disturbances

The tree’s probability of falling is not predefined—it emerges dynamically based on these variables. This makes Shannon’s entropy model inapplicable, because:

  1. There is no fixed probability space—entropy is woven into a constantly shifting network of interactions.
  2. Entropy is shaped by feedback loops—a gust of wind or changing soil conditions alter the likelihood of collapse in real time.
  3. Observing the tree does not “freeze” entropy—the system continues evolving beyond a single measurement.

This highlights a fundamental paradox: Shannon’s entropy cannot exist in nature, because nature does not conform to static, predefined probability distributions.


4. Toward a Post-Shannonian Entropy Model

If Shannon’s entropy fails in open systems, what replaces it? I propose a Dynamic Entropy Model (DEM) that incorporates:

  1. Evolving Probabilities—Entropy should be modeled as a function of time, with probability distributions that shift due to observation and interaction.
  2. Feedback-Controlled Entropy—Systems may resist entropy flow through feedback mechanisms, similar to Maxwell’s Demon or entropy-resisting AI.
  3. Entropy Engineering—If entropy flow can be manipulated, then artificial intelligence, thermodynamics, and even economic markets could be optimized through entropy-aware interventions.

4.1 Dynamic Probabilities and Time-Dependent Entropy

A revised entropy formula could integrate time-dependent probability shifts:

H(t) = -\sum_i p_i(t) \log p_i(t)

where p_i(t) evolves based on system interactions.

This allows entropy to be modeled as a flow, not just a static quantity. In ecosystems, financial markets, and neural networks, entropy fluctuates as new information reshapes the probability landscape.
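
One possible numerical sketch of H(t); the random perturbations below are only a stand-in for the real interactions, which this paper does not specify:

```python
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)

# Start uniform over 8 states; recompute H(t) from p_i(t) at every step
# instead of assuming a fixed distribution.
p = np.full(8, 1 / 8)
trajectory = []
for t in range(50):
    noise = rng.normal(0, 0.02, size=p.shape)  # stand-in for interactions
    p = np.clip(p + noise, 1e-12, None)
    p = p / p.sum()                            # keep p_i(t) a distribution
    trajectory.append(shannon_entropy(p))

print(trajectory[0], trajectory[-1])           # entropy as a flow over time
```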

4.2 The Role of Control and Feedback

Systems like AI models, economies, and biological processes regulate entropy through active feedback mechanisms. Examples include:

  • A dam regulating water flow—Delays entropy increase by organizing chaotic water movement.
  • AI bias correction—Adjusting data inputs to counteract entropy accumulation in machine learning models.
  • Cellular homeostasis—Living organisms use energy to maintain low-entropy states amid environmental fluctuations.

By incorporating entropy resistance into models, we move beyond Shannon’s static entropy and into a framework where entropy is a manipulable force.
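
As a sketch of such entropy resistance, the mixing coefficient and target state below are arbitrary choices of mine, meant only to show a system spending effort to hold entropy down against disturbance:

```python
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
p = np.full(8, 1 / 8)          # uniform start: 3 bits, the maximum entropy
target = np.zeros(8)
target[0] = 1.0                # the low-entropy state the system defends

for t in range(200):
    noise = rng.normal(0, 0.02, size=p.shape)  # environmental disturbance
    p = np.clip(p + noise, 1e-12, None)
    p = p / p.sum()
    p = 0.9 * p + 0.1 * target                 # feedback pull toward order

print(round(shannon_entropy(p), 3))  # holds far below the 3-bit maximum
```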


5. Implications for AI, Quantum Mechanics, and Complexity Science

The shift from Shannonian to Dynamic Entropy has profound implications:

  1. Quantum Entropy & Information Theory
  • If entropy dynamically shifts with observation, quantum wavefunction collapse could be reinterpreted as an entropic transition.
  2. AI & Machine Learning
  • AI models inherently accumulate entropy through exposure to new data.
  • An entropy-aware AI could dynamically adjust learning algorithms based on entropy flow, improving adaptability and reducing bias (see the sketch after this list).
  3. Computational Models of Nature
  • Nature’s entropy is emergent, not predefined.
  • New algorithms could simulate entropy evolution in ecosystems, economies, and climate models.
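
To illustrate the entropy-aware adjustment mentioned under AI & Machine Learning: the scaling rule and the name entropy_aware_lr below are hypothetical inventions for this sketch, not an established algorithm:

```python
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropy_aware_lr(pred_probs, base_lr=0.1):
    """Scale the learning rate by how uncertain the model's predictive
    distribution is, relative to the maximum possible entropy."""
    h = shannon_entropy(pred_probs)
    h_max = np.log2(len(pred_probs))
    return base_lr * (h / h_max)      # more uncertainty -> larger updates

print(entropy_aware_lr([0.25, 0.25, 0.25, 0.25]))  # 0.1: fully uncertain
print(entropy_aware_lr([0.97, 0.01, 0.01, 0.01]))  # ~0.012: confident model
```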

6. Conclusion: A New Paradigm of Entropy

Shannon’s entropy is a powerful but limited framework that applies only to static, closed systems. Real-world entropy is dynamic, evolving, and observer-dependent.

By recognizing entropy as an active, shifting force rather than a fixed measure, we unlock new possibilities for understanding complex systems, artificial intelligence, and even the fundamental structure of reality.

Future Work

This paper lays the groundwork for a post-Shannonian entropy theory. Future research should focus on:

  1. Mathematical modeling of dynamic entropy.
  2. Experimental validation in AI, quantum mechanics, and biological systems.
  3. Applications of entropy engineering in computation and thermodynamics.

Entropy is not just uncertainty—it is the evolving structure of reality itself. Understanding it as a dynamic force is the next frontier in complexity science.


Mitchell McPhetridge
Independent Researcher

Links to the supporting studies:


  1. Explicit Time-Dependent Entropy Production Expressions: Fractional and Fractal Pesin Relations. Brazilian Journal of Physics.
  2. Generalized (c,d)-entropy and aging random walks. arXiv:1310.5959.
  3. Entropy in Dynamical Systems. Cambridge University Press. https://www.cambridge.org/core/books/entropy-in-dynamical-systems/668424EA998037F18673F9E002853047
  4. Phys. Rev. E 86, 031117. https://link.aps.org/doi/10.1103/PhysRevE.86.031117
  5. Generalized entropies and logarithms and their duality relations. arXiv:1211.2257.