Categories: Philosophy

Forecasting, Uncertainty, Neural Networks

Context: DeepMind’s AI predicts almost exactly when and where it’s going to rain

Question: To what extent does the modelling and simulation approximate the actual complexity of the systems involved? The prediction is probabilistic, accurate only within margins of error endemic to deep learning, and in this case it yields a 90-minute window of warning. Granted, thermodynamic turbulence and irreducible chemical stochasticity (in and as turbulence) problematise this domain, but, again, the presence of absence (as error, inaccuracy, less-than-hoped-for temporal or spatial resolution) is instructive.
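
As a rough illustration of what “probabilistic, within margins of error” means operationally, the sketch below fabricates a toy ensemble nowcast over a 90-minute horizon. Every number, the persistence-plus-random-walk model, and the 1 mm/hr threshold are invented for illustration only and bear no relation to DeepMind’s actual nowcasting method; the point is simply that the output is a probability and an interval, not a single value.

```python
# Illustrative sketch only: a toy ensemble "nowcast" over a 90-minute horizon.
# The persistence-plus-random-walk model and all numbers are hypothetical,
# not DeepMind's method.
import numpy as np

rng = np.random.default_rng(seed=0)

horizon_minutes = np.arange(5, 95, 5)   # 5-minute steps out to 90 minutes
n_members = 200                         # size of the forecast ensemble
current_rain_mm_per_hr = 2.0            # observed rain rate "now"
rain_threshold_mm = 1.0                 # event of interest: rain above 1 mm/hr

# Each ensemble member perturbs the current state; the random walk makes
# uncertainty grow with lead time.
steps = rng.normal(0.0, 0.3, size=(n_members, len(horizon_minutes)))
ensemble = np.clip(current_rain_mm_per_hr + np.cumsum(steps, axis=1), 0.0, None)

# Probabilistic output: event probability and a central 90% interval per lead time.
prob_rain = (ensemble > rain_threshold_mm).mean(axis=0)
lo, hi = np.percentile(ensemble, [5, 95], axis=0)

for t, p, a, b in zip(horizon_minutes, prob_rain, lo, hi):
    print(f"+{t:3d} min  P(rain > {rain_threshold_mm} mm/hr) = {p:.2f}  "
          f"90% interval: [{a:.1f}, {b:.1f}] mm/hr")
```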

Conjecture: The limits of computational fidelity in modelling non-trivially complex domains are a function of logic (as mathematics) and physics (as material extension of logic); the putatively “hard limits” of computational accuracy are precisely the stochastic properties of complex dynamical systems themselves: not merely similar, but identical. This means that the unattainable (i.e. metaphysical and epistemological) “Otherness” of algorithmic closure, of modelling at “perfect” resolution, is always and already the empty set as self-negation and foundational recursive discontinuity around which logic, mathematics, physics and predictive information science orbit.
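
The claim that the “hard limits” of accuracy belong to the dynamics rather than to any particular model is the familiar lesson of deterministic chaos. The sketch below uses the Lorenz-63 system (a standard toy stand-in for atmospheric convection, not anything drawn from the article) with a crude forward-Euler integrator: two trajectories that differ by one part in a billion diverge exponentially until the point forecast carries no information, regardless of how faithfully the equations are represented.

```python
# Illustrative sketch only: sensitive dependence on initial conditions in the
# Lorenz-63 system, integrated with a simple forward-Euler scheme (toy accuracy).
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])   # perturbation far below any plausible observation error

for step in range(4001):
    if step % 500 == 0:
        print(f"t = {step * 0.01:5.1f}  separation = {np.linalg.norm(a - b):.3e}")
    a = lorenz_step(a)
    b = lorenz_step(b)
```

The separation grows by orders of magnitude and then saturates at the size of the attractor: beyond that lead time the only honest forecast is a distribution, which is the conjecture restated in numerical terms.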

Rationale: If irreducible inaccuracy is endemic, it is actually a key point of leverage.
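
One concrete sense in which endemic inaccuracy becomes leverage: once the forecast is honestly probabilistic, it can drive decisions directly. The sketch below uses the standard cost-loss idealisation from forecast verification, under which it is optimal to act whenever the forecast probability exceeds the ratio of the cost of acting to the loss avoided; the costs and probabilities here are made up for illustration.

```python
# Illustrative sketch only: a simple cost-loss decision rule driven by a
# probabilistic forecast. All numbers are hypothetical.
cost_of_action = 10.0      # e.g. the cost of delaying an outdoor operation
loss_if_unprepared = 80.0  # the loss if it rains and nothing was done

threshold = cost_of_action / loss_if_unprepared   # act when P(rain) > 0.125

for p_rain in (0.05, 0.10, 0.20, 0.60):
    decision = "act" if p_rain > threshold else "wait"
    print(f"P(rain) = {p_rain:.2f} -> {decision}")
```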
