
Notice that self-propagation in AI research, organisational as much as technical, is profoundly shaped, beyond the enlightened self-interest of commercial investment and the low-hanging fruit of metric spectacle, by the degree to which the emergent entities, artefacts and systems provide optimal opportunities for their own (non-ergodic, constrained or probabilistically-damped) self-replication.
I realise that this is an anthropological, distributed-cybernetic and industrial-systems no-brainer. But consider that the irony of the situation may be this: while the holy grail of integrated functional generalisation appears to recede in direct proportion to the complexity and effort exerted as vectors towards it, the patterned symmetries of sociotechnical behaviour underlying AI communities of practice are themselves quite plausibly replicating, in gestalt and quite inadvertently, the very complex-systems principles and symmetries they seek to recreate and leverage.
Savant-like specialisation is useful, if limited, and represents an optimally concise form in and as which the technical communities themselves self-replicate.