There is something foundational missing, and it cannot be brute-forced – all such emulations are bound to be hollow and haunted. The ontological, holistic bootstrap of autocatalytic self-containment – the closure that makes life, sentience, and eventually intelligence and technology possible at all – is an absence conspicuous by its presence. There is a hidden logical dimensionality that is not being captured.
Machine learning is critically important and will change the world in countless ways. But as a complex system of information-processing artefacts, systems, persons, organisations, institutions and commercial (or other) strategic self-interests, it presents a fascinating and potentially irreducible enigma – a self-organisational blind spot. The autonomously self-propagating, recursive symmetries of influence and feedback among adaptively resilient, competitive information- and energy-processing systems – both within and as the machine learning and associated corporate community – have themselves, in gestalt, acquired the properties and sustainably continuous self-propagation of the very general intelligence they seek to cultivate.
The distributed macrocosm holds and manifests what the microcosms can never unproblematically perceive – and perhaps not for the first time in the history of logic and technology. This is the hyper-inflating complexity of an integrated human system that seeks precisely those forms of mathematical, institutional and industrialised complexity which regenerate and validate the sub-systems that produce it: an autocatalytic loop. It is a natural consequence of physics and complex-systems dynamics, yet it is hardly ever acknowledged.
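The autocatalytic dynamic invoked here has a minimal formal skeleton worth making explicit. In the simplest chemical idiom, an autocatalytic reaction is A + X → 2X: the product X catalyses its own production, so growth feeds back on itself – slow at first, then explosive, then saturating as the substrate is consumed. The sketch below is purely illustrative; the rate constant, initial values and function name are all invented for the example, not drawn from any model in the text.

```python
def simulate_autocatalysis(a=1.0, x=0.001, k=5.0, dt=0.01, steps=200):
    """Euler integration of the autocatalytic pair
    da/dt = -k*a*x,  dx/dt = +k*a*x  (A + X -> 2X).
    Returns the trajectory of x, the self-catalysing product."""
    history = [x]
    for _ in range(steps):
        rate = k * a * x   # production rate depends on the product itself
        a -= rate * dt     # substrate is consumed...
        x += rate * dt     # ...and converted into more of the catalyst
        history.append(x)
    return history

traj = simulate_autocatalysis()
# The trajectory is sigmoid: near-flat, then runaway growth,
# then saturation as the substrate A runs out.
```

The sigmoid shape is the point: the system that "regenerates and validates the sub-systems that produce it" is, mathematically, just this feedback term, where the output appears on both sides of its own rate equation.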
When Gregory Chaitin proved that the optimally concise algorithmic compression of any non-trivially sophisticated object, sequence or entity can never be computed or certified, it seems that no one was listening; Gödel is smiling wistfully from the shadows. There is not an endpoint, and this, precisely and counter-intuitively, *is* the endpoint and bootstrap. AGI is an ontological concept, not a purely reductive or mechanical one.
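Chaitin's result can be felt concretely: any real compressor only ever gives an upper bound on the minimal description length of a string, and no algorithm can certify that bound as optimal, or prove a string genuinely incompressible. The sketch below is illustrative only – it uses `zlib` as a stand-in compressor and a seeded pseudo-random payload as a stand-in for an incompressible string; both choices are assumptions of the example, not claims from the text.

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of one particular compression: an UPPER BOUND on the
    (uncomputable) minimal description length, never a proof of it."""
    return len(zlib.compress(data, 9))

structured = b"ab" * 5000  # highly regular: compresses to almost nothing

# A fixed pseudo-random payload stands in for an incompressible string;
# by Chaitin's theorem no procedure could certify true incompressibility.
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(10000))

print(compressed_size(structured))  # tiny
print(compressed_size(noisy))       # roughly the original 10000 bytes
```

The asymmetry is the whole lesson: the compressor exhibits structure where it finds it, but its failure on the noisy string proves nothing – a cleverer description might always exist, and no effective procedure can rule that out.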