From an Algorithmic Information Theory and general systems perspective, matter can be conceived of as an optimally concise algorithmic (i.e. procedurally descriptive) compression of energy. The “programs” or functions by which this energy is decompressible (and thus variously reproducible) are the ascendant laws of physics, aggregated around the magnificent concision of E = mc², among many other associated (and often considerably less concise, yet at least overtly self-consistent and integrated) theoretical entities.
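The intuition above can be loosely illustrated (as analogy only, not physics) with ordinary data compression: a highly ordered state admits a short description, while a maximally disordered one does not. The names and data below are hypothetical examples, a minimal sketch assuming Python's standard `zlib` as a stand-in for algorithmic description length:

```python
import os
import zlib

# Analogy only: an ordered "state" is reproducible from a short program,
# echoing the notion of matter as a concise algorithmic compression of energy.
ordered = b"energy" * 1000      # highly patterned: 6000 bytes, trivially describable
disordered = os.urandom(6000)   # near-maximal entropy: essentially incompressible

len_ordered = len(zlib.compress(ordered, level=9))
len_disordered = len(zlib.compress(disordered, level=9))

print(len_ordered, len_disordered)  # the ordered state compresses dramatically more
```

Compressed length here is only a crude upper bound on Kolmogorov complexity, which is itself uncomputable; the sketch gestures at the distinction rather than measuring it.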
The structured compression of energy in (or as) matter provides for an offset or delay of the entropic energy dissipation incurred by the inevitability of thermodynamic decay. In this sense, the Cosmos provides structural or architectural “failsafes” against accelerated (dissipatory) self-annihilation. [Note also the observed tendency towards lowest-energy ordered or patterned states in material systems exhibiting “emergent complexity”, and that this tendency simultaneously provides an aperture of opportunity both for systemic self-propagation and for entropy offset, through the production of dissipation channels possessing functional, partially-recyclable systemic utility via the directed reprocessing of information and energy.]
From this perspective, and in the context of “dark matter” (and perhaps also “dark energy”, or other such distributed opaque mysteries), a question arises: where, and what, exactly are the associated compression libraries?
One reply on “Cosmological Algorithms”
[…] perception or cognition as digital fuzz; uncertainty manifest in entropy or (even free) choice as patterns of algorithmic emptiness from which moments of mystifyingly self-aware abstraction emerge and then dissolve back into the […]