
The True (Energy) Cost of Artificial Intelligence

Context: Deep Learning’s Diminishing Returns

Rolf Landauer coined the phrase "information is physical." One consequence is that all computation, including sophisticated or automated statistical analysis, carries comparable physical costs at scale. Diminishing returns are a function of the second law of thermodynamics.
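As a rough back-of-the-envelope sketch of what "information is physical" means in energy terms: the Landauer limit sets the minimum energy needed to erase one bit, k·T·ln 2. The figures below are my own illustration, assuming room temperature (~300 K) and the 12-watt brain power budget mentioned further down, not numbers from the article.

```python
# Illustrative calculation of the Landauer limit, the minimum energy any
# irreversible computation must dissipate per erased bit: E_min = k_B * T * ln(2).
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact, 2019 SI definition)
T_ROOM = 300.0       # assumed operating temperature in kelvin

e_min_per_bit = K_B * T_ROOM * math.log(2)
print(f"Landauer limit at {T_ROOM} K: {e_min_per_bit:.3e} J per erased bit")

# Upper bound on irreversible bit erasures per second for a 12 W budget
# (roughly the power draw of a human brain, referenced later in this post).
power_watts = 12.0
print(f"Max bit erasures per second at 12 W: {power_watts / e_min_per_bit:.3e}")
```

Real hardware operates many orders of magnitude above this floor, which is the point: the costs do not vanish with cleverness, they only approach a thermodynamic limit.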

The article’s observation that “new approaches” are required is accurate. Are we actually trying to replicate human-level intellect, or has that goal vanished into a cloud of self-propagating, self-validating commercial, academic, and industrial career-existential necessity?

The most “successful” technologies in this context are measured by their ability to draw attention and to invoke the effervescent information artefacts of their own conceptual ancestry. They are maximally oriented towards self-replication through us, and, as with many if not most technologies, their primary focus is the direct or indirect management of the consequences, artefacts, and costs of other technologies. The recursive hyper-inflation is quite real here.

In “A Thousand Brains”, Jeff Hawkins references Vernon Mountcastle’s assertion that there is a universal algorithm of functional brain dynamics. That is as good a place as any to start, but I expect it will apply far beyond 12-watt brains.
