Question: To what extent does the modelling and simulation approximate the actual complexity of the systems involved? The prediction is probabilistic, accurate within the margins of error endemic to deep learning, and here it provides a 90-minute warning window. Granted, thermodynamic turbulence and irreducible chemical stochasticity (in and as turbulence) problematise this domain, but, again, the presence of absence (as error, inaccuracy, or less-than-hoped-for temporal [or spatial] resolution) is instructive.
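The trade-off the question gestures at, that exponential error growth caps any warning window regardless of model quality, can be sketched with a toy chaotic system. This is a hypothetical illustration using the logistic map, not the forecasting system under discussion; the tolerance and initial-error values are arbitrary assumptions:

```python
import math

def logistic(x, r=4.0):
    """One step of the logistic map, a standard chaotic toy system."""
    return r * x * (1.0 - x)

def divergence_time(x0, delta0=1e-10, tol=1e-2, steps=200):
    """Steps until two trajectories, initially delta0 apart, differ by tol."""
    a, b = x0, x0 + delta0
    for _ in range(steps):
        if abs(a - b) >= tol:
            return _
        a, b = logistic(a), logistic(b)
    return steps

# For r = 4 the Lyapunov exponent is ln 2, so errors roughly double each
# step; the predictability horizon scales like ln(tol / delta0) / ln 2.
horizon_estimate = math.log(1e-2 / 1e-10) / math.log(2.0)
observed = divergence_time(0.2)
print(f"estimated horizon ~ {horizon_estimate:.1f} steps, observed = {observed}")
```

Shrinking the initial error by ten orders of magnitude buys only a few dozen extra steps: the warning window grows logarithmically with resolution, which is why the residual inaccuracy is structural rather than an engineering shortfall.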
Conjecture: The limits of computational fidelity in modelling non-trivially complex domains are a function of logic (as mathematics) and physics (as the material extension of logic). The putatively “hard limits” of computational accuracy are precisely the stochastic properties of complex dynamical systems: not merely similar, but identical. This means that the unattainable (i.e. metaphysical and epistemological) “Otherness” of algorithmic closure in modelling at “perfect” resolution is always and already the empty set, a self-negation and foundational recursive discontinuity around which logic, mathematics, and physics (or predictive information science) orbit.
Rationale: If irreducible inaccuracy is endemic, it is not merely a defect to be engineered away; it is a key point of leverage.