
Artificial General Intelligence, but not as we expect it…

Notice that the organisational as much as technical self-propagation in AI research is, beyond the enlightened self-interest of commercial investment and the low-hanging fruit of metric spectacle, profoundly shaped by the degree to which the emergent entities, artefacts and systems provide optimal opportunities for their own (non-ergodic, constrained or probabilistically-damped) self-replication.

I realise that this is an anthropological as much as a distributed cybernetic or industrial-systems no-brainer, but consider the irony of the situation: while the holy grail of integrated functional generalisation appears to recede in direct proportion to the complexity and effort directed towards it, the patterned symmetries of sociotechnical behaviour that underlie AI communities of practice are themselves quite plausibly replicating, in gestalt and quite inadvertently, the very complex-systems principles and symmetries they seek to recreate and leverage.

Savant-like specialisation is useful, if limited, and represents an optimally concise form in and as which these technical communities themselves self-replicate.
