Adaptive systems, constrained by limited resources, reuse a fixed internal state across contexts, which makes contextuality unavoidable in classical probabilistic models. The paper proves that any classical model reproducing such contextual statistics incurs an irreducible information-theoretic cost. Nonclassical probabilistic frameworks, which need not be quantum mechanical, avoid this cost because they do not require a global joint probability space.
Key Points
1. Contextuality arises inevitably when a single internal state is reused across contexts
2. Classical models incur an irreducible information cost to reproduce contextual outcome statistics
3. A minimal example realizes the cost and makes it operational
4. Nonclassical probabilistic frameworks bypass the cost by not requiring a global joint probability space (see the sketch after this list)
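To make the last point concrete, here is a minimal sketch, not taken from the paper, of what "no global joint probability space" means operationally: a linear program tests whether context-dependent statistics p(x, y | a, b) can arise as marginals of a single joint distribution over all response variables. The PR-box statistics used as the contextual example are a standard textbook choice, assumed here purely for illustration.

```python
import numpy as np
from itertools import product
from scipy.optimize import linprog

def pr_box(x, y, a, b):
    """PR-box statistics: outcomes satisfy x XOR y = a AND b, uniformly at random."""
    return 0.5 if (x ^ y) == (a & b) else 0.0

def classical_box(x, y, a, b):
    """A trivially noncontextual box: both outcomes are always 0."""
    return 1.0 if (x, y) == (0, 0) else 0.0

def has_global_joint(p):
    """LP feasibility test: does p(x, y | a, b) arise as marginals of one joint
    distribution over the four response variables (X0, X1, Y0, Y1)?"""
    assignments = list(product([0, 1], repeat=4))  # (x0, x1, y0, y1)
    A_eq, b_eq = [], []
    for a, b, x, y in product([0, 1], repeat=4):
        # Marginal constraint for context (a, b) and outcome pair (x, y).
        A_eq.append([1.0 if (asg[a] == x and asg[2 + b] == y) else 0.0
                     for asg in assignments])
        b_eq.append(p(x, y, a, b))
    A_eq.append([1.0] * len(assignments))  # normalisation
    b_eq.append(1.0)
    res = linprog(c=np.zeros(len(assignments)), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, 1)] * len(assignments), method="highs")
    return res.success

print("classical box admits a global joint:", has_global_joint(classical_box))  # True
print("PR box admits a global joint:       ", has_global_joint(pr_box))         # False
```

When the LP is infeasible, no single classical probability space reproduces all contexts at once; a nonclassical framework simply declines to posit that global space.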
Impact Analysis
Reveals fundamental limits of classical representations in adaptive AI and may inspire nonclassical probabilistic approaches to more resource-efficient intelligence.
Technical Details
Contexts are modeled as interventions on a shared internal state. The central proof shows that, in classical setups, context dependence of the outcome statistics cannot be mediated solely by that internal state.
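A minimal sketch of the cost claim, under assumptions of our own (the two-context, binary-outcome numbers below are illustrative and are not the paper's minimal example): an internal state reused unchanged across contexts yields one fixed outcome distribution, so reproducing two different target distributions forces the model to write context information into the state; by the data-processing inequality, at least I(C; X) bits about the context are then required.

```python
import numpy as np

def mutual_information_bits(joint):
    """Mutual information (in bits) of a 2D joint distribution array."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Illustrative targets: two equiprobable contexts, binary outcome,
# p(x=1 | c=0) = 0.9 and p(x=1 | c=1) = 0.1.
p_x_given_c = np.array([[0.1, 0.9],
                        [0.9, 0.1]])  # rows: context c, columns: outcome x
p_c = np.array([0.5, 0.5])

# A context-blind model that reuses a fixed internal state lambda ~ q(lambda)
# with a fixed response p(x | lambda) produces a single outcome distribution,
# so it cannot match both rows above. Any classical model that does match them
# must encode the context in its internal state, carrying at least I(C; X)
# bits about the context -- the illustrative lower bound computed here.
joint_cx = p_c[:, None] * p_x_given_c
print(f"I(C; X) = {mutual_information_bits(joint_cx):.3f} bits")  # ~0.531 bits
```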