In my last post, I responded to a question from a viewer of my MEA presentation. The question came in two parts, the first being “about the relation between the simultaneity of the computer (due to electric speed up) and the linear one-thing-at-a-time structure of the CPU.” The second part clarified that it was a question of whether the computer is “electric” in the sense McLuhan meant when he used the term. I interpreted it as asking how McLuhan saw the electric media of his day versus their nature today. My response began definitively: “No…”
This morning I was pleased to find a response from Prof. Derrick de Kerckhove in my email inbox!
This is a lovely and instructive comment, Clinton, and I enjoyed it a lot. The main argument, however, requires more persuasion to convince. Computers still depend on a raw electrical feed and will for the foreseeable future. When McLuhan talks about electricity, he is referring to the very ground of change, in opposition to literacy. And that is the key to understanding both his work and our present culture. Computers, all the digital processing you describe so usefully notwithstanding, remain figures, key elements for sure, but still issuing from and dependent on (not to say subservient to) the electrical ground.
I will consider what he wrote while at work today, and develop an appropriate response soon.
The conversation continues!
An interesting conversation!
Some thoughts:
The “one thing at a time” nature of the CPU seems trivial when considering the speed at which one thing follows another. Computation, in the sense of calculations and logical operations, may not be electric in and of itself, but the speed at which these operations can be carried out via electric circuitry is what’s truly transformative.
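To make that speed concrete, here is a minimal, purely illustrative sketch in Python (my own aside, not part of the original question or comment): it times ten million strictly sequential additions, performed one after the other exactly as a single CPU core would step through them.

import time

start = time.perf_counter()
total = 0
for i in range(10_000_000):  # ten million strictly one-at-a-time additions
    total += i
elapsed = time.perf_counter() - start

print(f"10,000,000 sequential operations took {elapsed:.3f} seconds")
# On ordinary hardware this loop finishes in roughly a second or less,
# far too fast for the individual steps to register as separate events.

The linearity is entirely real at the level of the machine, but at that speed it becomes experientially invisible, which is the commenter’s point.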
If computers/digital processes are figures to an electrical ground, can simulated cyberspace be considered a figure to the ground of machine language? If so, what are the hidden effects of machine language?
It seems that as computers advance, they are inevitably drawn towards the characteristics of electricity itself, i.e. simultaneity and subatomic scale. Parallel computing and research into quantum computers spring to mind…
Any thoughts on this? Keep up the great work!