
AI Ethics, Alignment, Responsibility, Strategy

Responsibility is significant. The most disruptive feature of new technology is, counter-intuitively, NOT (or at least not only) the novel artefact, entity or system. It is the self-sustaining discontinuity through which existing and persisting organisational or communications systems defensively lash themselves to the mast of strategic and communicative anachronisms. They know no better.

Responsible AI has become something of a common token: a bundled vocabulary of intent and system steering. If corporate governance is not simply making policy assertions for appearances, for "shits and giggles", then it needs to be transparent, compassionate and conscientious.

From climate to conflict and everything in between, and with a special interest in technologically-mediated language and communication, I am seeking research engagements in this area. The question is perhaps not whether such consequential and interesting vocational opportunities exist, but whether HR practices are able to recognise the utility and value that intellectual unorthodoxy brings.

When new technologies or conceptual vocabularies indicate that legacy methods and canonical dogma are defunct, it is the unmanageable complexity of seeking precisely the wrong kinds of institutional continuity that tends to incur the greatest organisational and human costs.
