We barely even notice as all this technology sweeps us away, transforms us.
The greatest enigma of all is that the more we identify our Self in, through and as this technological hyper-extension of our own minds and bodies – the less we become.
Only the greatest of strategists could ever mask their own existence to the point of invisibility; so it is with the patterns of information as technology, which infuse our world with their own blind, hidden goals of pure and relentless self-replication.
One reply on “Technology: The Great Unmaking”
Love the direction of your blog. I'm taking Archive Humanitas into the philosophy of technology debate this year in a similar guise. I have one question: how would you balance the negative pervasiveness of technology withdrawing into the background "unnoticed" with, say, Heideggerian ontology in general, where the normative status of our most relied-upon objects and status functions falls into the pre-intentional background in their averageness? Or with Harman's object-oriented ontology (O-O-O), which sees the average state of objects as withdrawn from sensibility to begin with? Even Dasein, in skillful coping, achieves a level of mastery where things, events, and actions become a matter of the pre-intentional background. The point is for the novice to shed the maxims and prescribed rules as he gains in coping with the environment. So, how is this "unnoticeability" of technology or its objects made suspect over and above Dasein's ordinary dealings?

I use the Heideggerian vernacular for a reason, because I think he calls a similar issue to my attention with the notion of Gestell. Eventually, these objects that we design end up organizing us. And your contention is really the subtle pervasiveness by which ubiquitous technology achieves its goal. Both positive organizational ubiquity and negative technological horror converge here: the Matrix is a dramatic metaphor for even positive technological prostheses that become ubiquitous. Our reality becomes the organizational reality they deliver.

So, here's where I've arrived in response to Heidegger on technology: in economics there's something called "opportunity cost," and there's a formula for it if you happen to know all contingencies in advance, a scenario that forgetfulness precludes us from ever achieving (forgetfulness is essential for any normativity). Yet the burden remains on us, as an ethical responsibility, to determine in advance what parameters a given technology will install in our everyday dealings in the world.
A kind of categorical imperative is called for, in which we are charged to measure how said technological prostheses will contribute to our being-in-the-world; to determine this, maybe, somehow ahead of that essential forgetfulness… To state this in terms closer to the point you bring up: to grasp the Gestell (remember, not "gestalt") before a particular technology moves into the shadows and begins organizing our lives.
1. Does our average paradoxical familiarity with ontological withdrawal put us in touch with the ontological status of withdrawal concerning technological objects?
2. Praxis: to what degree is it possible to grasp the parameters in advance of necessary forgetfulness? Ultimately then, to what degree are we responsible for what we create? Can we be responsible?