Categories: technology

On Training Minds and Machines to be Flexible

One thing humans do share with the machine intelligence we are so enthusiastically refining is a conspicuous (or at least institutionalised) lack of aptitude for smoothly and efficiently negotiating unexpected events and volatile environments.

The benefits of integrated logistical automation are clear, but the machine intelligences trained to coordinate those systems tend to function like human students who, having been forced to adopt rote-learned methods and problem-solving heuristics, become grammatical rule-processors without deep comprehension or true adaptive utility.

There is no necessary (or at least overt) correlation between education and the tasks we expect both minds and machines to perform. There is, however, an implicit bias for us to project our own cognitive frameworks into and upon (or as) the organisational systems we construct, and to inadvertently encode our own logical inadequacies into them.

Questions should be asked of the core methods used in organisational automation. Are we attempting to brute-force utility from machine intelligence in ways that are unrealistic, inflexible and fault-intolerant? We are born as (and require) flexible generalists, but everywhere train, value and valorise inflexible specialists.

Context: Our weird behavior during the pandemic is messing with AI models
